Supporting Doc 2

CLI OMB Appendices F-O 5-22-06.doc

The Strategic Prevention Framework State Incentive Grant (SPF SIG) State-Level Interview Protocols

OMB: 0930-0279


APPENDIX F


SPF SIG National Evaluation Design

SPF-SIG NATIONAL EVALUATION DESIGN

(2/10/2006)


EXECUTIVE SUMMARY

  1. Overview

    1. Key Features of the Design

    2. Organization of this document

  2. Evaluation Questions

  3. Logic Model of SPF Impact

    1. Measured processes and outcomes

    2. Measured baseline status

    3. Measured post-baseline contextual change

    4. Unmeasured Factors

  4. Comparison Conditions

    1. Non-SPF States (for State-Level Comparisons)

    2. Non-Funded Communities Within SPF States (for Community-Level Comparisons)

  5. Mediators and Moderators

  6. Data Sources and Measures

    1. Overview

    2. State-Level (baseline status, process measures, and system-level outcomes)

    3. Community-Level (baseline status, process measures, and system-level outcomes)

    4. Cultural Competence

    5. Epidemiological Outcomes

    6. Next Steps

  7. Analysis Plan

    1. Descriptive/Normative Analyses

    2. Inferential (Cause and Effect) Analyses

  8. Implementation of Participatory Collaborative Model

    1. Instrument Development and Selection of Outcomes

    2. Evaluation Design

    3. Implementation

    4. Analysis and Dissemination

  9. Frequently Asked Questions

    1. Questions related to comparing funded and non-funded communities

    2. Questions related to availability of data

    3. Questions related to analysis limitations

References

Appendices


SPF-SIG NATIONAL EVALUATION DESIGN

EXECUTIVE SUMMARY


  1. Overview


The Strategic Prevention Framework State Incentive Grant (SPF SIG) program is one of SAMHSA’s Infrastructure Grant programs supporting an array of activities to help states and communities build a solid foundation for delivering and sustaining effective substance abuse and/or mental health services. The goals are to: (1) prevent the onset and reduce the progression of substance abuse, including childhood and underage drinking; (2) reduce substance abuse-related problems in communities; and (3) build prevention capacity and infrastructure at the state/territory and community levels (CSAP, 2004). Other notable characteristics of the initiative are an emphasis on epidemiologic data with a population-based perspective, an increased emphasis on cultural competence, and a focus on sustainability from the outset. Although the direct recipients of SPF SIG funds are states and territories, CSAP envisions the SPF SIGs being implemented through partnerships between the states/territories and communities. CSAP funded 21 states and territories in FY2004 for up to 5 years to implement the SPF, and 5 additional states/territories in FY2005.1


To assess CSAP’s SPF SIG goals effectively, the evaluation team is implementing a multilevel, multi-method quasi-experimental design. The scope of the evaluation encompasses national, state, and community levels. The design uses both quantitative and qualitative data, the latter providing process data and systems outcomes at the state and community levels, as well as context for analyzing the NOMs and other epidemiological outcomes. Key features of the methodology include:

  • A rigorous, yet practical approach to evaluating processes and outcomes at state and community levels, grounded in lessons learned from experience with the current SIG national evaluation;

  • Due consideration of program aspects critical to CSAP (including strategic and data-driven planning, state-level system change, environmental change at all levels, and underage drinking in addition to illicit drug use);

  • A vision of grantees as full partners in the design and implementation of the national evaluation, with continuing collaboration over the entire course of the evaluation;

  • Leveraging the relationships with the SPF states to yield data that mutually benefit the national and state-level evaluations;

  • Standardization of data collection at the state and community levels;

  • Use of natural variation and replications within and across states, in tandem with the non-SPF comparison states to explain effect estimates at the state and community levels;

  • Use of the states’ own SPF SIG evaluations to augment and aid interpretation of national evaluation data;

  • Accounting for pre-SPF SIG activities in estimating the effects of SPF SIG-initiated activities, recognizing that all states have prevention activities already under way, with many having completed or currently engaged in efforts related to the precursor SIG initiative;

  • Explicit consideration of program selection and implementation fidelity in interpreting state and community-level outcomes; and

  • Use of multilevel modeling and meta-analytic methods to explain cross-site variation in state- and community-level outcomes.


  2. Evaluation Questions


There are six impact questions, divided into three symmetrical pairs, to be collaboratively addressed by the state and national evaluators (see Table 1). For Questions 1, 2, and 3, the unit of analysis is the state, the community, and communities clustered within states, respectively. The national evaluators will have primary responsibility for addressing Question 1, the state evaluators will have primary responsibility for addressing Question 2, and the national evaluators working collaboratively with state evaluators will have joint responsibility for addressing Question 3.


Table 1. Strategic Prevention Framework Evaluation Questions

Question 1a. Did SPF funding improve statewide performance on NOMs and other outcomes?
  Basis of Comparison: SPF v. non-SPF states
  Unit of Analysis: States
  Primary Responsibility: National evaluators

Question 1b. What accounted for variation in NOMs and other outcomes performance across SPF states?
  Basis of Comparison: Natural variation among SPF states
  Unit of Analysis: States
  Primary Responsibility: National evaluators

Question 2a. Within states, did SPF funding lead to community-level improvement on NOMs and other outcomes?
  Basis of Comparison: Funded v. non-funded communities, within state
  Unit of Analysis: Communities, within states
  Primary Responsibility: State evaluators

Question 2b. Within SPF states, what accounted for variation in NOM and other outcomes performance across funded communities?
  Basis of Comparison: Natural variation among funded communities, within state
  Unit of Analysis: Communities, within states
  Primary Responsibility: State evaluators

Question 3a. Across states, did SPF funding lead to community-level improvement on NOMs and other outcomes?
  Basis of Comparison: Funded v. non-funded communities, across states
  Unit of Analysis: Communities, across states
  Primary Responsibility: Both national and state evaluators

Question 3b. Across SPF states, what accounted for variation in NOM and other outcomes performance across funded communities?
  Basis of Comparison: Natural variation among funded communities, across states
  Unit of Analysis: Communities, across states
  Primary Responsibility: Both national and state evaluators


  3. Logic Model of SPF Impact


To help guide the evaluation design process for addressing the six impact questions, the national evaluation team has developed a draft logic model of SPF SIG impact. The model depicts the chain of activities that logically links funding of SPF SIG states to community and statewide epidemiological outcomes, and articulates a broader theory of impact, not only of SPF SIG elements and the relationships among them, but also of non-SPF factors that potentially influence the same processes and outcomes as SPF. This is critical to identifying the design and data elements needed to address the impact questions. The logic model of impact represents the flow of state- and community-level activities that lead to systems change and epidemiological outcomes in the uncontrolled “open system” where prevention operates. The model is depicted in Figure 1.

As detailed in Chapter 3 of the Evaluation Design, the logic model dictates the need for measures of processes and outcomes (including all NOMs), baseline status, and post-baseline contextual change, with each of these assessed at both the state and community levels. There is also a need to accommodate unmeasured factors; hence the need for a comparison condition at both levels. States that did not receive SPF grants will serve as comparison states to address Question 1a. Within SPF states (for Question 2a) and across SPF states (for Question 3a), communities that do not receive intervention funds from the state will serve as comparison communities. Table 2 shows the relationship of the logic model to the study measures at both state and community levels. The “Available From All” column summarizes the data sources that will be available from both SPF and comparison states and communities, respectively, while the “Available From SPF-Funded Only” column summarizes the additional data sources that will be available from SPF states and funded communities within those states.


Table 2. Relationship of Logic Model to Measures

State Level

  Baseline Status
    Available From All: NOMs; other outcomes from SEDS, DCC data, Census data, and other federal (non-SA) data sources
    Available From SPF-Funded Only: SPF applications; Round 0 site visits

  Planning & Implementation / Systems Change / Post-Baseline Contextual Change
    Available From All: Annual updates from Census and other federal data sources
    Available From SPF-Funded Only: Strategic plan; quarterly reports; state evaluation reports; Round 1-3 phone interviews

Community Level

  Baseline Status
    Available From All: NOMs; other outcomes from SEDS, Census data, and other state epi data
    Available From SPF-Funded Only: Initial community survey

  Planning & Implementation / Systems Change / Post-Baseline Contextual Change
    Available From All: (none)
    Available From SPF-Funded Only: Follow-up community surveys (semi-annual); standardized implementation fidelity scales



  4. Data Sources and Measures


All data sources will be linked through an integrated system of hierarchical databases. The database design specifies the data sources, the databases to be created from them, the flow of data among the different actors, and the eventual relationships among the data sources.
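
To make the intended linkage concrete, the following is a minimal sketch of one way such hierarchical linkage could be represented relationally. All table and column names are illustrative assumptions, not the actual SPF SIG database schema.

```python
# Minimal illustrative sketch only: hierarchical keys linking state-, community-,
# and record-level data. Table and column names are hypothetical, not the actual
# SPF SIG database schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE state (
    state_id      TEXT PRIMARY KEY,            -- e.g., postal or FIPS code
    spf_funded    INTEGER                      -- 1 = SPF SIG grantee, 0 = comparison state
);
CREATE TABLE community (
    community_id  TEXT PRIMARY KEY,
    state_id      TEXT REFERENCES state(state_id),
    spf_funded    INTEGER                      -- 1 = received SPF funds from the state
);
CREATE TABLE cli_report (                      -- semi-annual Community Level Instrument data
    community_id  TEXT REFERENCES community(community_id),
    wave          INTEGER,
    capacity_score REAL
);
CREATE TABLE outcome (                         -- epidemiological outcomes (NOMs and others)
    unit_id       TEXT,                        -- a state_id or a community_id
    level         TEXT,                        -- 'state' or 'community'
    year          INTEGER,
    measure       TEXT,
    value         REAL
);
""")
# The shared keys allow community-level process data (CLI) to be joined to outcome
# trends for the same communities and rolled up to their states.
```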


Process and Systems-Level Measures


The process and systems-level data sources include:

  • State documents, including: grantee applications; quarterly reports; strategic plans; state evaluator reports; and SIG reports and SPAS reports (if applicable)

  • State-level surveys of: SPF implementation (the 5 steps) and infrastructure development, conducted annually via telephone

  • Community-level surveys of SPF implementation (including prevention interventions) and systems change (e.g., capacity building), conducted semi-annually via the web

  • Possible state site visits “for cause”


Key process components of the state-level evaluation focus on the characteristics of the statewide ATOD needs assessments, strategic plans, and the State Epidemiological Workgroups (SEWs) that are being formed as part of the SPF SIGs during Year 1 of the project. We also will be collecting and analyzing data on two crucial outcomes within the SPF SIG states. First, as an indicator of ATOD prevention capacity, we will examine changes in the substance abuse prevention system in each state. Second, we will be examining the process of ATOD prevention resource allocation and the characteristics of the prevention strategies funded by the states.


The underlying assumption of the Strategic Prevention Framework is that faithful implementation of the SPF, with added attention to cultural competence and sustainability, will build states’ and communities’ ATOD prevention capacity, and that this increased capacity will result in reductions in ATOD-related problems. One of the tasks of the national evaluation, therefore, was to identify the components of a strong ATOD prevention system in order to craft a State Prevention Systems Infrastructure Instrument that would measure each state’s improvement in its infrastructure and serve as one indicator of a state’s ATOD prevention capacity. We used an iterative approach to developing this instrument, combining findings from the empirical literature, CSAP documents, lessons learned from the SIG, and input from SPF-SIG grantee stakeholders solicited during interviews and via feedback on drafts. The instrument will be implemented via a telephone survey of key informants in each state, conducted annually.


Note that the State Prevention Systems Infrastructure Instrument will be used to assess ATOD prevention capacity within various domains at the state system level. It will capture infrastructure development activities that occur as a consequence of SPF as well as those that result from other causes. A second state-level instrument, more normative in character, will directly assess each state’s implementation of the five SPF steps and will be limited to actions that have occurred as a direct result of the SPF. The items will be grouped by SPF step and will contain graduated behavioral anchors to document progress. Implementation of the SEW process also will be captured by this instrument. For the final two steps (implementation and monitoring/evaluation), the questions and their anchors will be tailored to each state’s approved strategic plan.2 Consequently, each state’s SPF implementation will ultimately be evaluated relative to its own specific goals and milestones as articulated in its own plan. States will have the opportunity to review the tailored instrument prior to its use (or, if they wish, to collaborate in its development), to ensure accurate interpretation by the national team of the state’s goals and milestones.


The community-level evaluation will include structural as well as process and system-level measures. The structural measures will be used to assess a community’s capability to provide substance abuse prevention services through adequate and appropriate settings, instrumentalities and infrastructure, including staffing, facilities and equipment, financial resources, information systems, governance and administrative structures, and other features related to organizational context in which services are provided (Donabedian, 1988; Siedman, Steinwachs, & Rubin, 2003). Outcome measures will assess the communities’ progress in capacity expansion and implementation of specific programs. Contextual factors external to SPF that may influence capacity building will also be measured. The web-based Community Level Instrument (CLI) survey form will be the principal community-level data source for the national evaluation. It will collect standardized data from all funded communities across the 26 funded SPF-SIG states.


The CLI has three parts. Part I is the main instrument. There are also two sub-forms: one sub-form to report information about evidence-based participant interventions and one sub-form to report information about environmental interventions. The communities in each state will only need to complete the sub-forms if they are implementing that particular type of intervention. There is also a list of definitions. In the final web-based format, these definitions will be accessed using a hyperlink within the forms. Table 3 shows the contents of each form.

Table 3: Outline of Cross-site Community Level Instrument

MAIN INSTRUMENT
  • Sub-Grantee Organizational Information
  • Organization Type and Funding
  • Cultural Competence
  • Implementation Process
  • Strategic Prevention Framework
    • Needs Assessments
    • Capacity Building
      • Awareness & Openness
      • Relationship Building
      • Organizational and Community Resources
      • Work Force Development
    • Sustainability
    • Strategic Plan Development
    • Intervention Selection and Implementation
    • Project Level Outcome Evaluation
  • Contextual Factors

PARTICIPANT INTERVENTION FORM
  • Intervention Information
  • Logic Model
  • Participant Intervention Description
  • Dosage and Fidelity
  • Adaptations
  • Race, Ethnicity & Cultural Appropriateness
  • Intervention Outcomes

ENVIRONMENTAL INTERVENTION FORM
  • Environmental Intervention Implementation
  • Description of Environmental Intervention Activities
  • Environmental Intervention Outcomes


The CLI form will be accessible via a password-protected website.3 It will be completed either by the states for the communities or by the communities themselves.


Cultural Competence


One of the critical components of the SPF-SIG is the emphasis on cultural competence throughout the life of the grant, for all SPF activities, at both the state and community levels. The national cross-site evaluation team is committed both to assessing the degree to which states and communities demonstrate cultural competence in their SPF-SIG activities and to employing culturally competent measures and methods in our own work. For the purposes of the national evaluation, we will use the definition of cultural and linguistic competence included in the National Standards for Culturally and Linguistically Appropriate Services (CLAS) in Health Care issued by the U.S. Department of Health and Human Services' (HHS) Office of Minority Health (OMH).


Measurement is the key to assessing the impact of cultural competence on outcomes at both state and community levels. If cultural competence can be reliably and validly measured, its influence as a mediator of outcomes can be directly tested in the context of Questions 1b (accounting for state-level variation), 2b (accounting for community-level variation within states), and 3b (accounting for community-level variation across states).


  • At the state level, we will assess the degree to which SPF-SIG grantees employ culturally competent practices within three domains: data collection, data use, and building cultural competence capacity. We will assess grantees’ cultural competence in these domains via information collected on the SPF-SIG quarterly reports, state-level process evaluation reports, and the state telephone interviews. In addition, analyses of epidemiological data will reveal changes in ATOD-related health disparities over the life of the SPF-SIG.

  • At the community level, domains and strategies to assess cultural competence will mirror those at the state level, with an added emphasis on cultural competence of the ATOD prevention strategies selected and implemented by the communities. Our primary data source for assessing cultural competence at the community level for funded communities within SPF states will be the information on program content and fidelity from the web-based community survey (described above). We also will have information on considerations of cultural competence in community-level ATOD prevention funding allocation from the SPF quarterly reports and state-level evaluation reports. In addition, there will be some information about funding allocations and assessments of cultural competence from the State system survey.

Epidemiological Outcome Measures


As noted above, the first two goals of the SPF are to reduce substance abuse (also referred to as consumption) and substance abuse-related problems (also referred to as consequences). These are operationalized through epidemiological outcomes, i.e., population-based estimates of consumption and consequences at the state and community levels.4

The epidemiological outcomes of principal interest to CSAP are contained in the National Outcome Measures (NOMs). In collaboration with States and other stakeholders, SAMHSA has recently reviewed its discretionary and block grant programs, examining their ability to capture and assess performance data on treatment and prevention outcomes. The result has been the identification of domains of NOMs on which grantees are expected to report. For CSAP these include abstinence, education/employment, crime and criminal justice, access/capacity (number of persons served), cost-effectiveness, and use of evidence-based practices.

It is important to recognize the limits of the national indicators in regard to the SPF SIG. For example, the NSDUH will provide state-level outcome data on the abstinence domain, but attempting to use it for community-level estimates (desirable because most interventions will not be statewide) is more tenuous.5 A better source of community-level estimates may be State-administered data sources, because estimates are often available for individual communities within states (CSAP/NCAP, 2000). A number of states will likely have community-level outcome data for both intervention and non-intervention communities, as provided by student surveys, possibly other surveys (e.g., college student surveys), and archival indicators.


While the NOMs are mandated, CSAP has given the states discretion as to data sources. This decision was based on two considerations. First, the usefulness of federally funded survey data (e.g., NSDUH) will vary by state, particularly with respect to the availability of community-level estimates. Second, the states differ markedly regarding other data sources they can bring to bear on the problem.6

Some grantees have expressed concerns that the NOMs do not fully capture the consumption and consequences outcomes they hope to achieve through SPF. Therefore, we are leaving the design “open” to accommodate additional outcomes, to be determined collaboratively with the states. A working committee of interested state evaluators has been constituted for this purpose, facilitated by the national team. We anticipate that the committee will make recommendations on additional outcomes for the rest of the states to consider. If additional outcomes are adopted, the national team will provide follow-on assistance as needed to support the states in collecting them, using them for their state-level evaluations, and providing them to the national evaluation. In addition, CSAP is making epidemiological data available to States for purposes of substance use/abuse prevention needs assessment, planning, and monitoring through the State Epidemiological Data System (SEDS) website. This data is provided as a resource for State Epidemiology Workgroups (SEWs) in support of SPF. The data system provides a preliminary set of data elements that are critical for substance use/abuse prevention planning.

  5. Analysis Plan


Descriptive/normative analyses


Although the primary focus of the national evaluation is on assessing impact, many descriptive and normative analyses will occur first. We will use standard techniques for analyzing, displaying, and reporting descriptive and normative results as they become available throughout the evaluation period. These will include summary statistics (e.g., means and standard deviations), univariate and multivariate frequency distributions (including cross-classification displays), as well as appropriate charts and graphs. Subsequently, answers to various descriptive and normative questions, coded into numerical indicators and scales, will support the six impact questions as key predictors of systems-level and population-level outcomes.
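
As a simple illustration of the kinds of summaries described above (not a specification of the actual analysis files), a sketch using hypothetical data might look like this:

```python
# Illustrative sketch only: basic descriptive and cross-classification summaries.
# The data frame and variable names are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "state":  ["A", "A", "B", "B", "C", "C"],
    "funded": [1, 0, 1, 0, 1, 0],
    "past30_use": [22.1, 24.3, 18.7, 19.9, 25.4, 26.0],
})

# Means and standard deviations by funding status
print(df.groupby("funded")["past30_use"].agg(["mean", "std"]))

# Cross-classification (frequency) display
print(pd.crosstab(df["state"], df["funded"]))
```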


Inferential (cause and effect) Analyses


We will rely on multilevel models to sort out the cause-and-effect relationships of state and community characteristics on changes in population outcome trends, both separately and in combination (e.g., under what mix of state and community circumstances are positive effects most likely). By properly adjusting standard errors for within-state clustering of communities and serial correlation of longitudinal outcomes, multilevel models increase the confidence with which observed changes can be attributed to the implementation of SPF activities. This in turn will lead to better-grounded recommendations for improving effectiveness in the future.
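
As a rough illustration of this kind of model (variable names and the data file below are hypothetical assumptions, and the actual specifications appear in Chapter 7 of the Evaluation Plan), a multilevel growth model with communities nested in states might be sketched as:

```python
# Illustrative sketch only: communities nested within states, with repeated annual
# observations. Variable and file names are hypothetical; Chapter 7 of the
# Evaluation Plan details the actual models.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("community_outcomes.csv")  # one row per community per year (hypothetical)

model = smf.mixedlm(
    "past30_use ~ year * spf_funded + baseline_rate",  # funded-by-time term captures the SPF effect on trend
    data=df,
    groups=df["state"],                            # state random effects absorb within-state clustering
    re_formula="~year",                            # random time slope across states
    vc_formula={"community": "0 + C(community)"},  # community random intercepts for repeated observations
)
result = model.fit()
print(result.summary())
```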


Like other, more traditional analyses employing linear models, the approach can also reveal sites that, for whatever reason, are discrepant from others with similar characteristics on the variables included in the model (that is, “outliers” in the distributions of outcomes), through graphic display of estimates and residuals. Such discrepancies can then be the starting point for further analyses of their causes, which may involve other variables available in the data but not included in the multilevel model, or may point the way to more global characteristics highlighted only in narrative and qualitative data or in the expertise of state and community informants.


Chapter 7 of the Evaluation Plan describes our proposed use of propensity scoring to reduce potential bias from group nonequivalence at the state (SPF vs. non-SPF) and community (funded vs. non-funded) levels, respectively, and details the multilevel statistical models to be used for addressing each of the six evaluation questions.7
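
As a minimal sketch of the propensity-scoring step (the covariate and file names below are assumptions for illustration; Chapter 7 specifies the actual procedure), the scores could be estimated and converted to weights as follows:

```python
# Illustrative sketch only: estimate each state's probability of being an SPF grantee
# from baseline covariates, then form inverse-probability weights for outcome models.
# Covariate and file names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

states = pd.read_csv("state_baseline.csv")   # one row per state (hypothetical)
covars = ["baseline_use_rate", "pct_rural", "median_income", "prior_sig_grantee"]

ps_model = LogisticRegression(max_iter=1000).fit(states[covars], states["spf_funded"])
states["pscore"] = ps_model.predict_proba(states[covars])[:, 1]

# Inverse-probability-of-treatment weights
states["iptw"] = (states["spf_funded"] / states["pscore"]
                  + (1 - states["spf_funded"]) / (1 - states["pscore"]))
```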


  6. Implementation of Participatory Collaborative Model


The national evaluation team views the 26 SPF states as full partners in the design and implementation of the national evaluation, with continuing collaboration over the entire course of the study. This includes:

  • Instrument Development and Selection of Outcomes. States have had extensive input on the development, revisions, and piloting of the state- and community-level survey instruments. As noted above, states also will have the opportunity to nominate additional epidemiological outcomes beyond the NOMs that they may perceive as more sensitive indicators of SPF success. A working committee of interested state evaluators has been constituted for this purpose, facilitated by the national team.

  • Evaluation design. Prior to any detailed design work, we presented the anticipated general approach at the December 2004 new grantees meeting. The model was further discussed with states during the Round 0 site visits and with state SPF coordinators, evaluators, SEW chairs, and others in attendance at the May 2005 grantee meeting. Much of the feedback from state-level stakeholders has been incorporated into the current design, in particular the “Frequently Asked Questions” of Chapter 9 of the Evaluation Plan. The design document was distributed to states via the SPF list-serv for review and comment in advance of the September 2005 evaluators meeting, and additional feedback was sought and obtained at that meeting.

  • Implementation. Grantees can participate in the implementation of the national evaluation in two important ways. First, they can volunteer to chair or serve on one or more of the five task-oriented committees that were launched at the September 2005 grantee meeting. Second, they can adopt a comparative design for assessing the impact of SPF funding on their targeted communities, through which they can contribute to and benefit from the integrated state/cross-state design strategy proposed in this document.8

  • Analysis and Dissemination. While the national evaluation team has a contractual obligation to analyze and report on the cross-site data, the SPF cross-site data will be shared with all the participating states as promptly and fully as possible. Based on experience with prior cross-sites, for example, we anticipate an interest in substudies that pool data from an ad hoc subset of states with a common interest.9 Rights of authorship in presentations and publications that result from these analyses would of course go to the states that took the lead on the substudy. We also welcome the opportunity to collaborate with interested states in panel presentations and co-authored publications.


SPF-SIG NATIONAL EVALUATION DESIGN


  1. Overview


The Strategic Prevention Framework State Incentive Grant (SPF SIG) program, which builds upon the earlier SIG program, is one of SAMHSA’s Infrastructure Grant programs supporting an array of activities to help states and communities build a solid foundation for delivering and sustaining effective substance abuse and/or mental health services. CSAP has provided funding to states and territories to implement the SPF in order to accomplish the following: (1) prevent the onset and reduce the progression of substance abuse, including childhood and underage drinking; (2) reduce substance abuse-related problems in communities; and (3) build prevention capacity and infrastructure at the state/territory and community levels (CSAP, 2004). The SPF is built on a community-based risk and protective factors approach to prevention and a series of guiding principles that can be operationalized at the Federal, state/territory, and community levels. Other notable characteristics of the initiative are an emphasis on epidemiologic data with a population-based perspective, an increased emphasis on cultural competence, and a focus on sustainability from the outset. Although the direct recipients of SPF SIG funds are states and territories, CSAP envisions the SPF SIGs being implemented through partnerships between the states/territories and communities. CSAP funded 21 states and territories in FY2004 for up to 5 years to implement the SPF, and 5 additional states/territories in FY2005.10


The SPF consists of the following five steps, which each state must accomplish: (1) conduct a statewide needs assessment, including the establishment of a state Epidemiological Workgroup; (2) mobilize and build state and community capacity to address needs; (3) develop a statewide strategic plan for prevention; (4) implement evidence-based prevention practices to meet state and community needs; and (5) monitor/evaluate the implementation of the project. The SPF is a synthesis of a variety of empirically driven models, such as Getting to Outcomes (2004), that have emerged in the prevention field over the past decade. States must allocate a minimum of 85 percent of the total grant award to community-level organizations to carry out prevention programs, practices, and policies.


The SPF SIG model reflects the maturity of the prevention field and takes into account lessons learned from previous initiatives. In particular, the SPF SIG differs from its precursors in that it provides a complete model of state and community prevention systems change. The SPF SIG is the first CSAP initiative to embrace all the elements that are necessary to initiate and sustain systems change—needs assessment, capacity enhancement, data-driven planning, evidence-based practices, and continual monitoring—and to do so in partnership with states and communities. Other initiatives have included some of these elements, but not all. The SIG program, for example, did not require a thorough needs assessment, nor did it place the same priority on having states' prevention activities guided by epidemiological data rather than by a predefined target population and/or problems.


    1. Key features of the design


To address CSAP’s SPF SIG questions effectively, the evaluation team will implement a multilevel, multi-method quasi-experimental design. The scope of the evaluation will encompass national, state, and community levels. The design will use both quantitative and qualitative data, the latter providing process data and systems outcomes at the state and community levels, as well as context for analyzing the epidemiological outcomes. Key features of the proposed methodology include:

  • A rigorous, yet practical approach to evaluating processes and outcomes at state and community levels, grounded in lessons learned from experience with the current SIG national evaluation;

  • Due consideration of program aspects critical to CSAP (including strategic and data-driven planning, state-level system change, environmental change at all levels, and underage drinking in addition to illicit drug use);

  • A vision of grantees as full partners in the design and implementation of the national evaluation, with continuing collaboration over the entire course of the evaluation;

  • Leveraging the relationships with the SPF states to yield data that mutually benefit the national and state-level evaluations;

  • Standardization of data collection at the state and community levels;

  • Use of natural variation and replications within and across states, in tandem with the non-SPF comparison states to explain effect estimates at the state and community levels;

  • Use of the states’ own SPF SIG evaluations to augment and aid interpretation of national evaluation data;

  • Accounting for pre-SPF SIG activities in estimating the effects of SPF SIG-initiated activities, recognizing that all states have prevention activities already under way, with many having completed or currently engaged in efforts related to the precursor SIG initiative;

  • Explicit consideration of program selection and implementation fidelity in interpreting state and community-level outcomes; and

  • Use of multilevel modeling and meta-analytic methods to explain cross-site variation in state- and community-level outcomes.


The evaluation will make use of quantitative and qualitative data to evaluate both processes and outcomes at the state and community levels. It is important to emphasize that process data collection and evaluation will occur at each level, in order to describe and document the activities undertaken as part of the SPF SIGs and to support the evaluation of epidemiological outcomes (Sonnefeld et al., 1998). For example, to properly evaluate the effectiveness of prevention initiatives and strategies, it is first necessary to assess whether the strategies are fully implemented as intended, thus avoiding the “Type 3 error” of attributing a lack of effect to the strategy itself when the real cause was a failure to implement the strategy, or improper or incomplete implementation (Steckler, 1989). Process evaluation activities will enable us to assess program fidelity, that is, the degree to which policies and programs implemented in states or communities are “faithful” to the model upon which they are based (Orwin et al., 1998; Orwin, 2000). The process components of the SPF SIG evaluation will facilitate disentangling the effects of various project-related activities and strive to identify which program and policy elements are effective, under what conditions, and with which target populations.


In keeping with the vision of grantees as full partners in the national evaluation, we will use a participatory model (Greenwood & Levin, 1998) for the evaluation, which involves collaboration among key stakeholders in: developing measures; identifying data sources; facilitating data collection; reviewing drafts of data collection instruments; interpreting the evaluation findings; disseminating information from the evaluation; and using the findings for project revisions and strategic planning. Federal stakeholders include, but are not limited to: CSAP, NIDA, OAS, and DEA. State and sub-state stakeholders include, but are not limited to: Members of the SPF Advisory Councils, the SEWs, state project coordinators and evaluators, communities that are funded to implement interventions, and local prevention providers. Actions to date and plans for continuing engagement of stakeholders are described in Section 8.


    2. Organization of this document


Section 2 presents the Evaluation Questions that lay the foundation for the design. As CSAP’s program goals are clearly focused on the impact of SPF on state and community level outcomes, the focus of the evaluation questions is on impact as well. We propose six questions, divided into three symmetrical pairs, each addressing a different aspect of the impact of SPF on outcomes at the state and community levels. We propose that these questions be collaboratively addressed by the state and national evaluators.


To help guide the evaluation design process for addressing the six questions, Section 3 presents a Logic Model of SPF SIG Impact. The model depicts the chain of activities that logically links funding of SPF SIG states to community and statewide epidemiological outcomes, and articulates a broader theory of impact, not only of SPF SIG elements and the relationships among them, but also of non-SPF factors, both measurable and unmeasurable, that potentially influence the same processes and outcomes as SPF. This is critical to identifying the design and data elements needed to address the impact questions.


Addressing the impact questions will require an estimate of the counterfactual, i.e., what would have resulted in the absence of SPF, all else equal. Section 4 specifies the Comparison Conditions to be used for this purpose: non-SPF States (for state-level comparisons) and non-funded communities within SPF states (for community-level comparisons). Section 5 introduces the Mediators and Moderators that, in conjunction with the counterfactual estimates, will be used to address the six questions. It lays out, for each question, 1) the basis of comparison that establishes the counterfactual estimate, 2) the moderators and their associated data sources, and 3) the mediators and their associated data sources. Consequently, it links the design and measurement requirements of the logic model to the data sources needed to address each of the six evaluation questions.


Section 6 describes the details of the data sources themselves, i.e., the Data Sources and Measures that will address the informational requirements of the design. These include baseline status, process measures, and systems outcomes at both state and community levels, as well as longitudinal indicators of consumption and consequences for detecting population-based impacts (i.e., the epidemiological outcomes). Special attention is also given to measuring cultural competence at all levels of the evaluation. Section 7 describes the Analysis Plan, which primarily focuses on the statistical modeling that will be employed to address the six questions, and how the various parameter estimates map to inferences about SPF effects and the state and community factors that influence them.


Section 8 describes our Implementation of the Participatory Collaborative Model. Most importantly, it explains how the states are participating in each aspect of the national evaluation (instrument development, outcome selection, evaluation design and implementation, and analysis and dissemination).


Section 9 provides a compiled list of “Frequently Asked Questions” (FAQs) and our attempt to address them. Most were asked by state SPF coordinators and evaluators; others were asked by federal staff; still others we posed to ourselves. Categories include: 1) Questions related to comparing funded and non-funded communities, 2) Questions related to availability of data, and 3) Questions related to analysis limitations. (Note: The document does not have a separate Limitations section; instead, the limitations are discussed in response to the FAQs.)


  2. Evaluation Questions


Evaluation questions are frequently classified as one of three types: descriptive, normative, and impact (U. S. General Accounting Office, 1991). Descriptive questions provide, as the name implies, descriptive information about specific conditions or events—e.g., does a census of community assets and resources exist within a given state? At the cross-site level, descriptive questions usually focus on variation across sites (e.g., states or communities) on specific conditions or events—e.g., how do states vary in their knowledge of community assets and resources that exist within their states? Other descriptive questions for SPF of documented interest to CSAP include:

  • What changes in allocation of funds and other resources occurred at the State and community-levels for substance abuse prevention programs and other activities?

  • What programs and activities have been added, eliminated and maintained?

  • What State and community level mobilization activities have been implemented?

  • What State and community level capacity building activities have been implemented?

  • What key State and community leaders are involved in prevention decision-making?

  • How were State and community leaders recruited?


The answers to normative questions (which, unlike descriptive questions, focus on what should be rather than what is) compare an observed outcome to an expected level of performance. In a state, an expected performance outcome for a needs/assets assessment might include the development of a statewide census of community assets and resources. A comparison of the observed and expected outcome should readily reveal whether the expectation was met. Not surprisingly, at the cross-site level, the parallel question is: how did states vary in their ability to meet the objective of developing a statewide census of community assets and resources? Normative questions have a long history in evaluation, dating back at least to Provus’ (1971) introduction of the “discrepancy model,” an early treatment of the normative approach in evaluation. “Criterion-referenced” evaluation (Popham, 1975) is rooted in the discrepancy model, as is the performance-monitoring approach of Wholey (1979), and most contemporary models of implementation fidelity assessment in treatment (Orwin, 2000) and prevention (CSAP, 2001). Some normative questions for SPF of documented interest to CSAP include:

  • Do key State and community leaders represent the key opinion leaders?

  • Do SEWs perform according to RFA standards?

  • Do needs assessments conform to RFA standards?

  • Do strategic plans meet RFA standards?

  • To what degree was the selection and adoption of prevention programming specific to local level problems, needs, and resources based on data collected through the SEW?

  • Has cultural competence been integrated into prevention programs, policies, and practices in States?

  • When interventions are adapted, to what extent are CSAP cultural competence standards met without compromising intervention content?

  • Have data been continuously monitored and evaluated to ensure that selected programming continues to address the local level needs?

  • To what extent has the prevention infrastructure improved?

  • To what extent are selected programs evidence-based?

  • To what extent are selected programs implemented with fidelity?


As described later, the descriptive and normative question domains covered by the state- and community-level instruments being developed for the national process evaluation are quite comprehensive. Evolving questions will be addressed through the Quarterly Report Forms submitted to CSAP by the grantees.


The answers to impact (cause-and-effect) questions help reveal whether observed conditions or events can be attributed to programmatic interventions. At the state-level, the most summative expression of an impact question might be: Did SPF SIG have an impact on consumption and consequences within the state? At the cross-site level, the parallel question is: What accounted for variation across states in their impact on consumption and consequences? It is in addressing the latter question that the earlier descriptive and normative questions come back into play. For example, does the existence of a state-level census of community assets and resources ultimately predict outcomes (descriptive), or alternatively, did a state’s ability to meet its own expectations in developing such a census ultimately predict outcomes (normative), either directly or, more likely, through some other mediator?


The choice of question type is typically driven by the program goals. As noted above, CSAP’s goals for SPF are: (1) prevent the onset and reduce the progression of substance abuse, including childhood and underage drinking; (2) reduce substance abuse-related problems in communities; and (3) build prevention capacity and infrastructure at the state and community levels. All three stress impact: epidemiological consumption and consequence impacts for Goals 1 and 2, and state system-level impacts for Goal 3. The focus of the program goals on impact makes clear that the focus of the evaluation questions must be on impact as well. While descriptive and normative questions can be of interest and importance to various stakeholders in their own right, their primary role in the national evaluation will be to support the addressing of impact questions. In that capacity, however, their role will be critical. Answers to various descriptive and normative questions, subsequently coded into numerical indicators and scales, will be the principal means by which the national evaluation explains variation in whatever systems-level and population-level impacts are observed. Mark et al.’s (2000) theory of evaluation as “assisted sense-making” (p. vii) provides context. This approach sees the primary role of evaluation as enhancing and supplementing the natural sense-making efforts of democratic actors seeking social betterment. As such, it goes beyond traditional evaluation questions (e.g., Was the program properly implemented? Did it have an effect?) to explore, among other things, the underlying causes of program successes and failures.


We propose six impact questions, divided into three symmetrical pairs, to be collaboratively addressed by the state and national evaluators (see Table 1). For Questions 1, 2, and 3, the unit of analysis is the state, the community, and communities clustered within states, respectively. We propose that the national evaluators will have primary responsibility for addressing Question 1, the state evaluators will have primary responsibility for addressing Question 2, and the national evaluators working collaboratively with state evaluators will have joint responsibility for addressing Question 3.


Table 1. Strategic Prevention Framework Evaluation Questions

Question 1a. Did SPF funding improve statewide performance on NOMs and other outcomes?
  Unit of Analysis: States
  Primary Responsibility (proposed): National evaluators

Question 1b. What accounted for variation in NOMs and other outcomes performance across SPF states?
  Unit of Analysis: States
  Primary Responsibility (proposed): National evaluators

Question 2a. Within states, did SPF funding lead to community-level improvement on NOMs and other outcomes?
  Unit of Analysis: Communities, within states
  Primary Responsibility (proposed): State evaluators

Question 2b. Within SPF states, what accounted for variation in NOM and other outcomes performance across funded communities?
  Unit of Analysis: Communities, within states
  Primary Responsibility (proposed): State evaluators

Question 3a. Across states, did SPF funding lead to community-level improvement on NOMs and other outcomes?
  Unit of Analysis: Communities, across states
  Primary Responsibility (proposed): Both national and state evaluators

Question 3b. Across SPF states, what accounted for variation in NOM and other outcomes performance across funded communities?
  Unit of Analysis: Communities, across states
  Primary Responsibility (proposed): Both national and state evaluators


  3. Logic Model of SPF Impact


To help guide the evaluation design process for addressing the six impact questions, the national evaluation team has developed a draft logic model of SPF SIG impact. The model depicts the chain of activities that logically links funding of SPF SIG states to community and statewide epidemiological outcomes, and articulates a broader theory of impact, not only of SPF SIG elements and the relationships among them, but also of non-SPF factors that potentially influence the same processes and outcomes as SPF. This is critical to identifying the design and data elements needed to address the impact questions. In principle, it would be desirable to isolate all non-SPF influences on target outcomes, but in practice, that is not possible outside of controlled laboratory conditions. The logic model of impact represents the flow of state- and community-level activities that lead to systems change and epidemiological outcomes in the uncontrolled “open system” where prevention operates. The model is depicted in Figure 1.

    1. Measured processes and outcomes


State activities are represented in Figure 1 in rectangles, community activities in ovals (the multiple ovals represent multiple communities within states). As shown, SPF funding went to selected states and territories. Planning and implementation of the SPF grant by funded states (i.e., progress on the five steps) is expected to lead to systems change (e.g., infrastructure development) and to funding of selected communities to build capacity and/or implement prevention interventions. We expect that most states will elect to provide SPF funding and support to a small set of communities (10 to 25), usually (though not always) defined geographically. Because the majority of the SPF funds in those states will be allocated to these subsets of communities, it is reasonable to expect that SPF impacts on pre-defined outcome measures will be concentrated primarily in those subsets, though not exclusively. Non-funded communities also may benefit from state systems change.

The arrow connecting planning and implementation to systems change is bidirectional, denoting the iterative character of their relationship. In turn, planning and implementation by communities of evidence-based programs, practices, and policies with ample reach, strength, and fidelity, and with culturally competent adaptations, is expected to lead to local systems change and to positive epidemiological outcomes, including decreased substance use and substance-related behaviors at the community level. Population-based community-level estimates of substance abuse and related behaviors will be important in the SPF SIG because a sizeable number of SPF states will implement interventions designed to achieve community-level outcomes in the communities with which they form partnerships (rather than, or in addition to, outcomes at, say, the classroom level). Some states in the initial SIG took this approach (e.g., Vermont, Kansas, and Washington), and the SPF SIG RFA clearly encourages such interventions. For the national evaluation, this provides an opportunity to compare community-level outcomes on prevalence of substance use and various causal factors from a large number of funded communities across multiple states with outcomes from unfunded communities (where comparable data are available) and/or from state and national data (Question 3a). It also permits us to compare outcomes across a large number of funded communities, thus comparing different types of community approaches, target populations, levels of implementation and fidelity, etc. (Question 3b). The substantial number of communities available for such analysis across multiple states will provide a very rich data source for helping identify characteristics of communities that predict outcomes.


The endpoint in the model is the state-level change in epidemiological outcomes that results from aggregating or “rolling up” the various outcomes from funded and non-funded communities. The extent to which this occurs in a given state will likely depend on the percentage of the state population falling within the intervention catchment areas (i.e., coverage rate). The dashed arrows from community to state outcomes represent the aggregation.
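
As a purely hypothetical numerical illustration of this roll-up, the statewide change can be thought of as roughly a population-weighted average of the changes in funded and non-funded communities:

```python
# Hypothetical numbers for illustration only.
coverage = 0.30            # share of the state population in SPF-funded catchment areas
change_funded = -2.5       # change (percentage points) in funded communities
change_nonfunded = -0.5    # change in the rest of the state

statewide_change = coverage * change_funded + (1 - coverage) * change_nonfunded
print(statewide_change)    # -1.1 percentage points
```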


Note that the model also allows for state systems change from SPF to affect community-level systems and epidemiological outcomes over and above the effects produced through direct funding of communities. For example, state-level SPF activities might result in a law raising penalties for driving while intoxicated, or a prevention workforce initiative that increases the efficiency and effectiveness of prevention workers throughout the state. Or a state could use some of its 85% allocation on a statewide intervention, e.g., a media campaign. Each of these strategies could improve epidemiological outcomes in communities independent of direct funding, and of course do so in both funded and non-funded communities. This is recognized by including multiple paths of influence from state activities to community outcomes, not all of which go through SPF-funded communities.


Measuring what transpires at each link is necessary to explain cross-site variation in outcomes (Questions 1b and 2b), but it is not sufficient to establish that said variation was caused by SPF SIG funding (Questions 1a and 2a). For this purpose, we also need to assess the role played at all levels by influences outside of SPF. As described next, some of these influences are measurable, while others are not.

    2. Measured baseline status


Not all states are starting at the same place, nor do they face the same problems. For example, state-level baseline/history characteristics influenced a state’s decision to apply for an SPF grant, as well as its success in obtaining one. Clearly, there is a need to account for pre-SPF SIG activities in estimating effects of SPF SIG-initiated activities, recognizing that all states have prevention activities already under way, with many having completed or currently engaged in efforts related to the precursor SIG initiative. Such activities, and the achievements they produced, will directly affect the state’s success in planning and carrying out its SPF plans, and indirectly affect the achievement of systems change and reductions in substance abuse and substance-related behaviors in the population. Failure to account for the influence of baseline characteristics in outcome analyses could yield biased and misleading results. Fortunately, many of these state-level baseline/history characteristics are measurable.


Analogously, communities within a state differ at baseline, and a community’s baseline/history characteristics will directly influence its chances of receiving SPF funds from the state for capacity building and/or prevention programming. They will continue to influence its success in using those funds (planning and implementation), through which they indirectly affect systems change and epidemiological outcomes in that community. As with states, many of these community-level baseline/history characteristics are measurable (e.g., where census variables are available for both).


    3. Measured post-baseline contextual change

After the project begins, contextual change occurring outside of SPF and the prevention system also can influence SPF implementation and systems change at the state level, and capacity building, the delivery of prevention interventions, and epidemiological outcomes at the community level. While projects can incorporate awareness of these events into their planning and interventions, they typically cannot be changed through project activities. Classic examples in communities are environmental disasters (e.g., hurricanes) and major economic shocks (e.g., plant closings). Either can cause dramatic changes in consumption (particularly with respect to alcohol abuse) and consequences (e.g., DWI and domestic violence rates). More subtle developments can also affect processes and outcomes, such as changes in demographic composition (e.g., the influx of retirees into southwestern states), changes in social service structures (e.g., the change from AFDC to TANF), or reductions in income supports (e.g., elimination of state general assistance or federal SSI benefits). Like baseline status, much of the relevant post-baseline contextual change at both state and community levels can be measured.



    4. Unmeasured factors

When incorporated into the analysis of outcomes, measures of state- and community-level baseline status and post-baseline contextual change will reduce the likelihood of obtaining biased or misleading results. They will also provide insights into the importance of these factors in predicting outcomes relative to SPF activities, which is of interest in its own right. Even with an extensive measurement effort, however, some amount of systems change and epidemiological change will occur in states that would have occurred in the absence of SPF, yet cannot be explained by characteristics of funded states as measured. Such unmeasured factors are omnipresent, and influence both processes and outcomes targeted by SPF in unknown ways and to an unknown degree at both state and community levels. Note that unmeasured factors in the state environment can influence implementation, systems change, and epidemiological outcomes directly -- i.e., statewide in both funded and unfunded communities -- as well as indirectly through their impact on SPF-funded communities. In this sense they parallel the dual paths of the measured state-level influences (SPF implementation and systems change) described above.

For the purposes of addressing the impact questions, however, it is not necessary to know the effect of unmeasured factors at each phase of the SPF process; it is necessary, however, to know, or at least estimate, their cumulative effect on systems and epidemiological outcomes. Specifically, answering Questions 1a, 2a, and 3a requires an estimate of the counterfactual condition, i.e., what would have resulted in the absence of SPF, all else equal, which can only be obtained through data on systems and epidemiological outcomes from non-SPF states (for Question 1a), non-funded communities within each SPF state (for Question 2a), and non-funded communities across SPF states (for Question 3a), respectively. Hence the requirement for comparison states at the state level and comparison communities at the community level to establish counterfactual approximations. We recognize that the systems change and implementation data available from comparison states and non-funded communities will be less complete than the data available from their funded counterparts; the current design does not depend on these data.11 These data are not needed to address Questions 1a, 2a, and 3a, which can be viewed as “intent-to-treat” questions, analogous to hypothesis tests in clinical trials that simply assess whether two interventions yielded different results. Comprehensive implementation data will be needed to address Questions 1b, 2b, and 3b, which rely on implementation data as one potential source of variation in outcomes, but only from funded states and communities.



  4. Comparison Conditions


    4.1 Non-SPF states (for state-level comparisons)

As noted above, states that did not receive SPF grants will serve as comparison states to establish a counterfactual estimate for Question 1a. All 34 non-SPF states are potentially available as comparison states. It is not realistic to expect the same level of data access as in the SPF states, since non-SPF states are not receiving federal funds to participate in the evaluation, and cannot be obligated to assemble epidemiological datasets or respond to surveys about infrastructure development. Therefore, in addressing Question 1a we will restrict the set of baseline covariates and state-level longitudinal outcomes to those available from publicly available sources. The potential limitations and implications of this approach are discussed in Section 9.

While all 34 non-SPF states are potentially available for this purpose, it is possible they will not all be used as comparisons. For example, a superior counterfactual estimate might result from eliminating a small subset of states that are least like the 26 funded states on the basis of propensity scores formed from the baseline covariates. So long as the number of comparison states does not go below the number of SPF states, the loss in statistical power will be minimal (Cohen, 1988).
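One way such trimming could be carried out, sketched here with hypothetical file and variable names (an illustration of the propensity-score idea, not the evaluation's prescribed procedure), is to estimate each state's probability of SPF funding from the baseline covariates and retain the non-SPF states most similar to the funded states:

# Illustrative sketch only; file and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

states = pd.read_csv("state_baseline.csv")      # one row per state
covariates = ["baseline_past30_alcohol", "drug_arrest_rate", "median_income"]

# Propensity score: modeled probability of being an SPF grantee,
# estimated from the baseline covariates.
model = LogisticRegression(max_iter=1000)
model.fit(states[covariates], states["spf_funded"])
states["propensity"] = model.predict_proba(states[covariates])[:, 1]

# Keep the non-SPF states most like the funded states, never dropping below
# the number of SPF states so that statistical power is preserved.
n_funded = int(states["spf_funded"].sum())
comparison = states[states["spf_funded"] == 0].nlargest(n_funded, "propensity")
analysis_set = pd.concat([states[states["spf_funded"] == 1], comparison])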


    4.2 Non-funded communities within SPF states (for community-level comparisons)


Within SPF states (for Question 2a) and across SPF states (for Question 3a), communities that do not receive intervention funds from the state will serve as comparison communities to establish counterfactual estimates. To address Question 2a, we therefore encourage all SPF states that allocate SPF funds to selected communities to compare, as a component of their state’s SPF evaluation effort, outcome measures in these communities with outcomes in otherwise comparable communities within the state that do not receive SPF funding. This approach will provide useful information for the state evaluations of the SPF, and is consistent with the focus of the SPF on statewide and communitywide (i.e., population-level) impacts. Although some of the data needed for this effort may be available through SAMHSA’s State Epidemiological Data System (SEDS) data base, it is likely that many community-level outcome indicators will not be provided in SEDS, but rather will need to be obtained from state and local sources. The national evaluation team will provide assistance to state evaluators in identifying pertinent data elements and potential sources for their community-level outcome evaluations. At least some of these data will likely be collected and maintained by each state’s SEW.


States may choose any of a number of methods to use in analyzing and displaying the findings from their evaluations of community-level impacts of the SPF. The underlying strategy of any such analysis, however, would likely involve a comparison between intervention and comparison communities on changes in the values of outcome measures over time.12 Reductions over time in substance abuse and related problems in intervention communities, relative to no reductions (or smaller reductions) in the same measures in comparison communities, would be indicative of desirable intervention effects.
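As a purely illustrative sketch of this underlying strategy, the snippet below (with hypothetical file name, column names, and measurement years) computes the change in an outcome between baseline and follow-up for each community and contrasts the average change in funded communities with that in comparison communities:

# Illustrative sketch only; file, column names, and years are hypothetical.
import pandas as pd

df = pd.read_csv("community_outcomes.csv")   # community_id, funded, year, outcome
baseline_year, followup_year = 2005, 2008    # assumed measurement points

wide = df.pivot_table(index=["community_id", "funded"],
                      columns="year", values="outcome").reset_index()
wide["change"] = wide[followup_year] - wide[baseline_year]

# Larger reductions in funded communities, relative to comparison communities,
# would be indicative of desirable intervention effects.
mean_change = wide.groupby("funded")["change"].mean()
print(mean_change)
print("Funded-minus-comparison difference in change:",
      mean_change.loc[1] - mean_change.loc[0])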


The national evaluation will use these same community-level outcome data, as well as information on selected characteristics of communities, for the national evaluation effort (Questions 3a and 3b), and will therefore request that state evaluators provide those data to the national evaluation team. We view coordination between the national evaluation and each individual state evaluator as extremely important, as it will create efficiencies in the identification and collection of the data necessary for the assessment of community-level impacts of the SPF, both within individual states and across multiple states.


We recognize that the evaluation model of comparing funded and non-funded communities may not fit equally well with all states’ strategies for allocating intervention funds. For example, some states may propose a so-called “equity” model for allocating intervention funds, under which funds are provided to every county on a per capita basis. Our plans for handling this and other contingencies are discussed in Section 9, “Frequently Asked Questions.”


  5. Mediators and Moderators


In the causal analysis literature, intervening variables like implementation fidelity are sometimes referred to as mediators,13 while baseline status variables like pre-existing differences between states are sometimes referred to as moderators.14 We will adopt this convention here. Table 2 shows, for each of the six questions, 1) the basis of comparison that establishes the counterfactual estimate (e.g., SPF v. non-SPF states), 2) the moderators and their associated data sources, and 3) the mediators and their associated data sources. Consequently, Table 2 links the design and measurement requirements of the logic model to the data sources needed to address each of the six evaluation questions. The next section describes the details of the data sources themselves.

Table 2. Basis of Comparison, Moderators, Mediators, and Data Sources by Study Question

Question 1a. Did SPF funding improve statewide performance on NOMs and other outcomes?
  Comparison: SPF v. non-SPF states
  Moderators:
    • State-level baseline/history characteristics (subset available from all states). Data sources: SEDS; relevant DCC data; Census data; other federal data sources (non-substance abuse related).
    • State-level post-baseline contextual change (subset available from all states). Data sources: annual updates from Census and other federal data sources.
  Mediators:
    • None (no mediators available from non-SPF states). Data sources: N/A.

Question 1b. What accounted for variation in NOM and other outcomes performance across SPF states?
  Comparison: Natural variation among SPF states
  Moderators:
    • State-level baseline/history characteristics (full set available from SPF states). Data sources: same as Question 1a, plus SPF applications; Round 0 site visits.
    • State-level post-baseline contextual change (full set available from SPF states). Data sources: same as Question 1a, plus Round 1-3 phone interviews; state-level evaluations.
  Mediators:
    • Quantitative variation (performance) on state-level SPF requirements (the 5 steps). Data sources: strategic plan; quarterly reports; Round 1-3 phone interviews.
    • Qualitative variation on state-level implementation (i.e., how they chose to implement their SPF). Data sources: strategic plan; quarterly reports; Round 1-3 phone interviews; state-level evaluations.
    • Intervention strategy (e.g., how communities were selected, what they were, population coverage rate). Data sources: strategic plan; quarterly reports; Round 1-3 phone interviews; community surveys; state-level evaluations.
    • Aggregate score (state-level) on community-level implementation fidelity, cultural competence. Data sources: standardized implementation fidelity scales (aggregated); community surveys (aggregated).

Question 2a. Within SPF states, did SPF funding lead to community-level improvement on NOMs and other outcomes?
  Comparison: Funded v. non-funded communities, within state
  Moderators:
    • Community-level baseline/history characteristics (subset available from all communities). Data sources: SEDS; Census data; other state epi data.
    • Community-level post-baseline contextual change (subset available from all states). Data sources: Census data.
  Mediators:
    • None (no mediators available from non-funded communities). Data sources: N/A.

Question 2b. Within SPF states, what accounted for variation in NOM and other outcomes performance across funded communities?
  Comparison: Natural variation among funded communities, within state
  Moderators:
    • Community-level baseline/history characteristics (full set available from funded communities). Data sources: same as Question 2a, plus community surveys.
    • Community-level post-baseline contextual change (full set available from funded communities). Data sources: same as Question 2a, plus community surveys.
  Mediators:
    • Community-level implementation fidelity, cultural competence. Data sources: standardized implementation fidelity scales; community surveys.
    • Pre-intervention planning and choice of intervention(s) at community level. Data sources: community surveys.

Question 3a. Across states, did SPF funding lead to community-level improvement on NOMs and other outcomes?
  Comparison: Funded v. non-funded communities, across states
  Moderators:
    • State-level baseline/history characteristics (subset available from all states). Data sources: same as Question 1a.
    • State-level post-baseline contextual change (subset available from all states). Data sources: same as Question 1a.
    • Community-level baseline/history characteristics (subset available from all communities). Data sources: same as Question 2a.
    • Community-level post-baseline contextual change (subset available from all states). Data sources: same as Question 2a.
  Mediators:
    • None (no mediators available from non-SPF states or non-funded communities). Data sources: N/A.
    • Quantitative variation (performance) on state-level SPF requirements (the 5 steps). Data sources: same as Question 1b.

Question 3b. Across SPF states, what accounted for variation in NOM and other outcomes performance across funded communities?
  Comparison: Natural variation among funded communities, across states
  Moderators:
    • State-level baseline/history characteristics (full set available from SPF states). Data sources: same as Question 1b.
    • State-level post-baseline contextual change (full set available from SPF states). Data sources: same as Question 1b.
    • Community-level baseline/history characteristics (full set available from funded communities). Data sources: same as Question 2b.
    • Community-level post-baseline contextual change (full set available from funded communities). Data sources: same as Question 2b.
  Mediators:
    • Qualitative variation on state-level implementation (i.e., how they chose to implement their SPF). Data sources: same as Question 1b.
    • Intervention strategy (e.g., how communities were selected, what they were, population coverage rate). Data sources: same as Question 1b.
    • Aggregate score (state-level) on community-level implementation fidelity, cultural competence. Data sources: same as Question 1b.
    • Community-level implementation fidelity, cultural competence. Data sources: same as Question 2b.
    • Pre-intervention planning and choice of intervention(s) at community level. Data sources: same as Question 2b.


  6. Data Sources and Measures


The evaluation will make use of quantitative and qualitative data sources to evaluate both processes and outcomes at the state and community levels, following the multilevel structure of the logic model. Outcomes include systems change and population change (consumption and consequences), again at both state and community-level. In addition to the epidemiological outcome data being gathered by the SEWs and national survey instruments that will require some level of effort by states and communities to complete, a variety of data collection, extraction, and coding tools are being developed to make maximum use of existing data documentation (e.g., grant applications and strategic plans).


    6.1 Overview


The data sources may be divided into three types: key informant surveys, population (epidemiological) outcomes, and documents. They are summarized as follows:


Key informant surveys

  • Semi-annual web-based community-level survey

  • Annual telephone-based state-level survey

  • Possible site visits to subset of states and/or communities


Population data (epidemiological data)

  • State-level epidemiological data from national sources (e.g., SEDS and/or component data systems including NSDUH, FARS, BRFSS, etc.).

  • State-level epidemiological data not available from national sources, but rather provided by the state SEWs (e.g., hospital discharge data).

  • Community-level epidemiological data from national sources (e.g., SEDS and/or component data systems including FARS, UCR, NVSS).

  • Community-level epidemiological data not available from national sources, but rather provided by the state SEWs (e.g., hospital discharge data).


Documents

  • Grantee applications

  • Quarterly (Progress) reports

  • State strategic plans

  • SPF SIG state evaluation reports

  • SIG reports (if applicable)

  • State Prevention Advancement and Support (SPAS) reports (if applicable)


All data sources will be linked through an integrated system of hierarchical databases. The design of the databases includes the list of the data sources, the databases that will be created for them, the data flow among the different actors, and the eventual relationships among the data sources.15
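As a notional illustration of how such linkage might work (the table and field names below are assumptions for illustration only, not the actual database design), community-level records can roll up to the state level through shared keys:

# Illustrative sketch only; schema and names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE state (
    state_id   TEXT PRIMARY KEY,
    spf_funded INTEGER NOT NULL          -- 1 = SPF SIG grantee
);
CREATE TABLE community (
    community_id TEXT PRIMARY KEY,
    state_id     TEXT NOT NULL REFERENCES state(state_id),
    spf_funded   INTEGER NOT NULL        -- 1 = received SPF intervention funds
);
CREATE TABLE cli_submission (            -- semi-annual community-level instrument
    community_id TEXT NOT NULL REFERENCES community(community_id),
    wave         TEXT NOT NULL,          -- e.g., '2006-2'
    fidelity     REAL
);
""")

# Community-level measures aggregate to the state level through the shared keys.
state_fidelity = con.execute("""
    SELECT s.state_id, AVG(c2.fidelity)
    FROM state s
    JOIN community c       ON c.state_id = s.state_id
    JOIN cli_submission c2 ON c2.community_id = c.community_id
    GROUP BY s.state_id
""").fetchall()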


    6.2 State-level (baseline status, process measures, and system-level outcomes)


Key process components of the state-level evaluation will focus on the characteristics of the statewide ATOD needs assessments, strategic plans, and the State Epidemiological Workgroups (SEWs) that will be formed as part of the SPF SIGs during Year 1 of the project. We also will be collecting and analyzing data on two crucial outcomes within the SPF SIG states. First, as an indicator of ATOD prevention capacity, we will examine changes in the substance abuse prevention system in each state. Second, we will be examining the process of ATOD prevention resource allocation and the characteristics of the prevention strategies funded by the states.


      6.2.1 Instrument development: State systems


The underlying assumption of the Strategic Prevention Framework is that faithful implementation of the SPF, with added attention to cultural competence and sustainability, will build states’ and communities’ ATOD prevention capacity, and that this increased capacity will result in reductions in ATOD-related problems. One of the tasks of the national evaluation, therefore, was to identify the components of a strong ATOD prevention system in order to craft a “State ATOD Prevention System Instrument” that would measure each state’s improvement in its infrastructure and serve as one indicator of a state’s ATOD prevention capacity. We used an iterative approach to developing this instrument, combining findings from the empirical literature, CSAP documents, lessons learned from the SIG, and input from SPF-SIG grantee stakeholders solicited during interviews and via feedback on drafts. Steps in the instrument development process to date are as follows:


Step 1: Identifying draft domains of state ATOD prevention systems

As a starting point, we collected and reviewed information from pertinent literature, such as the “Characteristics of an Effective Substance Abuse Prevention System” developed by the South Carolina SIG, and the CDC’s National Public Health Performance Standards for state public health systems, and generated a list of seven critical elements (planning, resource development/management, state organizational structure, conceptual clarity, leadership, workforce development, evaluation and monitoring) for state ATOD prevention systems. For each element, we created a primary question, along with specific probes for that topic and additional questions designed to prompt discussion. This list was circulated among national evaluation team members for comments, was revised accordingly, and became the basis for the Round 0, Groups 1-2 interview protocol.


Step 2: Round 0, Group 1-2 Stakeholder Interviews

As part of the “Round 0” (i.e., instrument development and rapport-building) national evaluation team visits, teams of 3-4 people traveled to ten SPF-SIG grantee states from January 24 to March 4, 2005. During these visits, a pair of national evaluation team members conducted in-person semi-structured interviews, based on the protocol described above, with a variety of SPF-SIG stakeholders (n=74 total). Designated note takers recorded detailed information from the interviews, which lasted about an hour and were also audiotaped with the respondents’ consent.


Step 3: Content Analysis of Round 0, Group 1-2 Interview Data

Interview data from all 74 respondents were compiled electronically and common themes that emerged were coded. A codebook was created that organized the coded themes by domain.


Step 4: Development of “State Domains and Indicators” Draft Instrument

In late March, five members of the national evaluation team met to examine the data collected from SPF SIG stakeholders during our visits and identify common themes that emerged across states and stakeholders. Based on those themes, we revised the draft list of ATOD prevention system domains, which resulted in a list of 11 domains. We then created a series of indicators for each domain, with the indicators representing “benchmarks” for that domain. In order to get stakeholders’ perceptions concerning the domains and indicators, we created a draft instrument that listed all the domains and indicators and had space for the stakeholders to note: whether the indicators were appropriate as measures of a “quality” ATOD prevention system (yes/no); whether the indicators were culturally competent (yes/no); and how important the indicator was as a measure of ATOD prevention system quality (4-point scale).


Step 5: Written and Verbal Feedback on Draft Domains and Indicators from SPF-SIG Grantees

SPF-SIG stakeholders in six states and two territories were sent the draft Domains and Indicators and asked to rate various aspects of their appropriateness, as described above. National evaluation team members then solicited their feedback during in-person interviews conducted during the “Round 0, Group 3” visits from April 11 to May 2005.

In addition, SPF-SIG Project Directors and State Evaluators from states visited in Round 0, Groups 1-2 were emailed the draft Domains and Indicators and asked to rate various aspects of their appropriateness and include comments.


Step 6: Development of State System Instrument

Based on the feedback received via the Round 0, Group 3 interviews and written comments from SPF-SIG Project Directors and State Evaluators from states visited in Round 0, Groups 1-2, four members of the national evaluation team met and created a draft State Systems Instrument consisting of 10 domains, each with a series of indicators. For each indicator, we developed 1-3 measures in the form of declarative sentences, to which the respondent was instructed to answer “yes” or “no.” Internal review of the instrument suggested that adding ordinal response options would improve the instrument’s ability to capture variability across states. The instrument is currently under revision. The final instrument will be a telephone survey of key informants in each state, conducted annually.


      6.2.2 Instrument development: SPF process


Note that the state systems/infrastructure instrument will be used to assess ATOD prevention capacity within various domains at the state system level. It will capture infrastructure development activities that occur as a consequence of SPF but also those that result from other causes. For example, the state legislature could pass a law that increases penalties to liquor stores for selling alcohol to minors. This would be an important environmental change that potentially affects consumption and consequences outcomes, yet may have occurred regardless of SPF.


A second state-level instrument, more normative in character, will directly assess each state’s implementation of the five SPF steps and will be limited to actions that have occurred as a direct result of the SPF. The items will be grouped by SPF step and will contain graduated behavioral anchors to document progress. Implementation of the SEW process also will be captured by this instrument. Table 3 extracts the program expectations specified in the RFA; these expectations form the basis for developing items and anchors for the state process instrument.


The five steps are:

  1. Profile population needs, resources, and readiness to address the problems and gaps in service delivery

  2. Mobilize and/or build capacity to address needs

  3. Develop a Comprehensive Strategic Plan

  4. Implement evidence-based prevention programs and infrastructure development activities

  5. Monitor process, evaluate effectiveness, sustain effective programs/activities, and improve or replace those that fail

For the first three steps the normative anchors for the process questions will be derived from the original RFA. We recognize that states will vary in their interpretation of RFA requirements, and that “one size does not fit all.” For the final two steps, however, the questions and their anchors will be tailored to each state’s approved strategic plan.16 Consequently, each state’s SPF implementation will ultimately be evaluated relative to its own specific goals and milestones as articulated in its own plan. States will have the opportunity to review the tailored instrument prior to its use (or if they wish, to collaborate in its development), to ensure accurate interpretation by the national team of the state’s goals and milestones.


Table 3. SPF-SIG Program Expectations from CSAP RFA

THE SPF SIG GRANTEES WILL:

  • Complete a statewide needs assessment, using SEW data to determine:

    • Magnitude of substance abuse and related mental health disorders in the state

    • Levels of risk and protective factors associated with substance abuse and related mental health disorders

    • Community assets and resources

    • Gaps in services and capacity

    • Readiness to act.

  • Identify target communities to implement the SPF (i.e., geographic areas and target populations for which levels of substance abuse and related mental health disorders are most severe)

  • Develop an approved strategic plan that:

    • Specifies the priorities that will be targeted

    • Articulates a vision for activities to address needs

    • Describes infrastructure needed to select and implement evidence-based policies, programs and practices

    • Identifies/coordinates/allocates resources and sources of funding for the plan

    • Identifies appropriate funding mechanism(s) to allocate resources to targeted communities

    • Identifies training requirements

    • Specifies key policies and guidance for interrelationships among stakeholders

    • Involves public and private service systems in creating a seamless continuum of planning and services

    • Includes plans for sustaining the infrastructure and services that are implemented

    • Identifies key milestones and outcomes against which to gauge performance

    • Includes plans for making adjustments, based on on-going needs assessment activities

  • Provide the infrastructure and other necessary support for selection and implementation of policies, programs, and practices that are:

    • Proven to be effective in research settings and communities (evidence-based)

    • Adaptations, if necessary, are culturally competent and preserve core elements of the program

  • Provide training and technical assistance (to partners) to support SPF SIG

  • Conduct on-going monitoring and oversight of SPF SIG implementation in partner communities to:

    • Assess program effectiveness

    • Ensure service delivery quality

    • Identify successes

    • Encourage needed improvement

    • Promote sustainability of effective policies, programs, and practices.

    • Supervise the delivery of required performance data to SAMHSA

  • Conduct a state-level evaluation of the SPF SIG project

  • Engage stakeholders across the state to complement parallel engagement activities in partner communities


PARTNER COMMUNITIES WILL:

  • Assess levels of local substance abuse-related problems using state and local epidemiology data to determine

    • Magnitude of substance abuse and related mental health disorders in the state, and locations where the problems are most severe

    • Levels of risk and protective factors associated with substance abuse and related mental health disorders

    • Community assets and resources

    • Gaps in services and capacity

    • Readiness to act.

  • Engage stakeholders to sustain prevention activities (i.e., convene leaders and stakeholders and/or build coalitions; train community stakeholders, coalitions, and service providers; organize agency networks; leverage resources)

  • Develop a strategic plan that:

    • Articulates a vision for the prevention activities and strategies for organizing and implementing prevention efforts based on documented needs

    • Builds on identified resources/strengths

    • Sets measurable objectives

    • Specifies performance measures and baseline data against which progress will be monitored

    • Describes mechanisms for making adjustments using needs assessment and monitoring information

    • Describes a long-term strategy to sustain policies, programs and practices

  • Select and implement evidence–based policies, programs and practices proven to be effective in research settings and communities such as NREP programs, and ensure that adaptations are both culturally competent and preserve core program elements

  • Provide performance data to the states for monitoring, evaluating, sustaining, and improving activities

SEWs WILL:

  • Collect, organize, analyze, interpret, and promote the use of data on the causes and consequences of substance use at all stages of the implementation of the SPF

  • Represent key agencies and organizations (e.g., public health, criminal justice, education, behavioral health, and research and statistics)

  • Possess:

    • Ability to collect and analyze data

    • Technical expertise in geographically-defined data from multiple sources (e.g. GIS)

    • Extensive knowledge of the State context to enable interpretation of the data

    • Knowledge transfer skills to promote use of data by decision-makers

    • Experience in prevention planning and needs assessment activities

    • Access to critical State data on substance-related problems and prevention strategies

  • Perform the following tasks

    • Collect and analyze epidemiology data to produce a profile of population needs, resources and readiness for the Advisory Group that will serve as baseline data against which progress and outcomes will be measured

    • Assist the Advisory Group with collecting, analyzing, and interpreting capacity assessment data

    • Work with the Advisory Council to ensure that SPF SIG priorities are aligned with needs assessment findings to the greatest degree possible and play a significant role in establishing key milestones and outcomes

    • Assist the SPF SIG Advisory Council efforts to ensure that strategies align with established priorities (i.e., the use of evidence-based programs)

    • Ensure that needs assessment data can serve as reference points for monitoring/evaluation


      6.2.3 Data collection


We will collect state-level data on baseline status, process measures, and system-level outcomes from a variety of sources using a variety of methods. As noted earlier, our assessment of baseline status for SPF-SIG states will come from extracted archival information available for each state. In particular, we will rely heavily on data obtained from the states’ SPF-SIG applications, in which they were required to describe their current infrastructure capacity (i.e., baseline). We will include several types of process measures at the state level, including implementation of each of the 5 SPF steps, SEW implementation, and cultural competence. SPF implementation will be assessed via examination of data from archival sources, namely the quarterly progress reports, needs assessment document, strategic plan, and documentation of SPF resource allocation. During Years 2-5 of the project we will conduct telephone interviews with key informants in each state, during which we will collect additional information via group and individual discussions. We will use the RFA and other CSAP documents describing the SPF as the “gold standard” against which to gauge each state’s SPF implementation (including fidelity). SEW implementation will be assessed using the same archival data sources and telephone interviews. Site visits may also be conducted in selected cases.


In terms of systems-level outcomes, our focus is change in the states’ ATOD prevention capacity, as indicated by their scores on the State Systems Instrument. Given that there was little variability in stakeholders’ perceptions of the importance of the indicators retained in the instrument, and given that there is no empirical precedent for assigning weights to indicators of state ATOD prevention capacity, we plan to use the instrument as an index (i.e., the items will be weighted equally and summed to determine a score for each domain and an overall ATOD prevention capacity score). This quantitative format will allow us to use statistical analyses to assess the natural variation that occurs across the SPF SIG states in changes in their ATOD systems and prevention capacity. The small sample size of our pilot (n=9) precludes the internal consistency analyses typical of the instrument development phase; however, after the first administration of the telephone survey, we will conduct tests of internal consistency to determine the reliability of the index. A Cronbach’s alpha of at least .70 will be considered adequate.
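The scoring rule and the reliability check can be sketched as follows; the item matrix is invented solely to show the computation (rows are states, columns are yes/no items coded 1/0) and is not actual pilot data:

# Illustrative sketch only; the item responses below are invented example data.
import numpy as np

items = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 1, 1],
])

# Index scoring: items weighted equally and summed for each state.
capacity_score = items.sum(axis=1)

# Cronbach's alpha for the item set; at least .70 is treated as adequate.
k = items.shape[1]
alpha = (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                         / items.sum(axis=1).var(ddof=1))
print(capacity_score, round(alpha, 2))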


    6.3 Community-level (baseline status, process measures, and system-level outcomes)


The community-level evaluation will include structural as well as process and system-level outcome measures. The structural measures will be used to assess a community’s capability to provide substance abuse prevention services through adequate and appropriate settings, instrumentalities, and infrastructure, including staffing, facilities and equipment, financial resources, information systems, governance and administrative structures, and other features of the organizational context in which services are provided (Donabedian, 1988; Siedman, Steinwachs, & Rubin, 2003). Outcome measures will assess the communities’ progress in capacity expansion and implementation of specific programs. Contextual factors external to SPF that may influence capacity building will also be measured.


The web-based Community Level Instrument (CLI) survey form will be the principal community-level data source for the national evaluation. It will collect standardized data from all funded communities across the 26 funded SPF-SIG states.


      6.3.1 Instrument development


As indicated above, many communities will not simply be implementing prevention activities; they will also be building capacity and developing infrastructure. Consequently, the CLI will cover both types of effort, as well as the context in which they occur.


Capacity Building. Developing capacity and building infrastructure are necessary activities for states and communities conducting substance abuse prevention work. However, improving capacity involves extensive planning, resources, and coordination. In SAMHSA’s evaluation framework, the capacity expansion component considers the needs of the organization and community, the kinds of human and material resources that need to be factored into prevention planning, as well as the steps necessary to build organizational and community capacity (e.g., in a collaborative, task force, or coalition). Elements to be covered by the instrument include 1) organizational and community resources, 2) community awareness of and openness to prevention activities, 3) relationships and 4) sustainability.


While many evaluators are familiar with capacity building terminology, each community is likely to attach a different meaning to capacity building activities such as collaboration, sustainability, or leadership. To ensure that data captured at the state and community levels are consistent across sites, as is necessary for an effective national evaluation, we will assign a specific definition to each term so that there is a clear understanding of the concept being measured.


Many community capacity evaluation instruments are already publicly available. Some of the primary problems with existing instruments are that they are not designed to measure change over time; they are not designed to be used in a cross-site evaluation, but rather as an assessment instrument for individual communities; and most instruments do not include all of the components included in our community-level evaluation framework, but rather focus on one aspect of the community capacity building. Our proposed instrument will be designed to address all of these issues in order to facilitate an effective national evaluation and also to encourage communities to reflect on their activities and their relative effectiveness at accomplishing their capacity building goals.


Selection and Implementation of Prevention Interventions. This component focuses on activities directly related to developing and implementing prevention interventions, such as selecting, creating or modifying a curriculum, obtaining IRB approval, and recruiting participants (if applicable). In addition, this component evaluates the factors that have facilitated or limited the prevention intervention delivery. Our evaluation will place particular focus on whether communities are 1) selecting evidence-based programs, practices, and policies, 2) implementing them in appropriate settings, and 3) monitoring program fidelity to ensure implementation is adequate.


Contextual Factors. Demographic, cultural, and systemic factors can positively or negatively affect projects. Although these factors exist outside the scope of the project, they influence prevention intervention and capacity expansion activities. While the projects can incorporate these issues into their planning and interventions, typically they are not able to change these conditions through specific interventions. However, it is important to consider how these conditions influence the delivery of prevention services. Contextual demographic factors (as distinguished from individual) include geographic location, employment rates, and economic issues that may affect the outcome of the project. Cultural factors (conditions that result from cultural norms within the community) are often associated with race and ethnicity, but may also concern the culture of certain sub-populations such as gay, lesbian, bisexual, or transgender youth, or youth who consider themselves part of the “club” subculture. Systemic factors such as policies, laws, or regulations may also facilitate or limit prevention efforts. However, these contextual factors can also be the target of environmental interventions, so while it will be important to describe and assess policies, laws, and regulations as contextual factors at baseline, they will also be followed as measures of the implementation of the environmental interventions.


The CLI has three (3) parts. Part I is the main instrument. There are also two (2) sub-forms: one sub-form to report information about evidence-based participant interventions and one sub-form to report information about environmental interventions. The communities in each state will only need to complete the sub-forms if they are implementing that particular type of intervention. There is also a list of definitions, some of which are still in development. In the final web-based format, these definitions will be accessed using a hyperlink within the forms. Table 4 shows the contents of each form.

Table 4: Outline of Cross-site Community Level Instrument

MAIN INSTRUMENT

Sub-Grantee Organizational Information

Organization Type and Funding

Cultural Competence

Implementation Process

Strategic Prevention Framework

Needs Assessments

Capacity Building

Awareness & Openness

Relationship Building

Organizational and Community Resources

Work Force Development

Sustainability

Strategic Plan Development

Intervention Selection and Implementation

Project Level Outcome Evaluation

Contextual Factors

PARTICIPANT INTERVENTION FORM

Intervention Information

Logic Model

Participant Intervention Description

Dosage and Fidelity

Adaptations

Race, Ethnicity & Cultural Appropriateness

Intervention Outcomes

ENVIRONMENTAL INTERVENTION FORM

Environmental Intervention Implementation

Description of Environmental Intervention Activities

Environmental Intervention Outcomes


      6.3.2 Data collection


Although a few community-level site visits may be conducted, the web-based CLI form will serve as the principal community-level data source for the national evaluation. The form will be accessible via a password protected web-site. The advantages of a web-based instrument over a paper instrument include: a reduced respondent burden, the ability to build in skip patterns and quality checks, and direct downloads into an electronic database.


The CLI form will be completed either by the states for the communities or by the communities themselves. The current SIG evaluation relies on the state to provide information about the community-level activities, with each state being responsible for aggregating input from funded communities. We recommend the same approach be adopted here. The communities themselves will be able to provide a level of detail that the state is unlikely to be able to provide. In addition to providing a detailed administration guide, the national evaluation team will provide on-call and email technical assistance for those projects needing help completing the instruments. This type of technical assistance ensures that projects have the resources they need to complete the instruments in as thorough a manner as possible, thereby ensuring quality data with which to evaluate the community building and capacity expansion activities of the projects.


Community-level data collection will begin in mid-2006. Prior to that time, it is expected that there will be little implementation of programs in the communities.17 Once initiated, the community-level data will be collected every six months. In our experience collecting similar types of data, we have found that quarterly data collection imposes too much of a burden on grantees, while annual data collection is too infrequent for tracking community-level change.


    6.4 Cultural competence


One of the critical components of the SPF-SIG is the emphasis on cultural competence throughout the life of the grant, for all SPF activities, at both the state and community levels. The national cross-site evaluation team is committed both to assessing the degree to which states and communities demonstrate cultural competence in their SPF-SIG activities and to employing culturally competent measures and methods in our own work.


      6.4.1 Why measure cultural competence

Recent landmark reports have highlighted the broad range of disparities in the health status (including patterns of ATOD-related consumption and consequences) and access to health-related services among marginalized racial, ethnic and social populations in the United States. These disparities have been attributed to: institutional and organizational characteristics of service agencies; attitudes, beliefs and behaviors of service providers and stakeholders; biases in policies that result in differential access to preventive and treatment interventions; and the direct and indirect effects of racism, sexism, and other forms of discrimination. Eliminating health disparities is one of the overarching goals of Healthy People 2010, and the U.S. Department of Health and Human Services has recommended that SAMHSA “collect data to specifically identify racial and ethnic disparities in mental health and substance abuse epidemiology and services delivery.”

Effective substance abuse prevention and reducing ATOD-related disparities require understanding the cultural context in which ATOD-related consumption and consequences occur, including the differences in the patterns of consumption and consequences, risk and protective factors, and barriers and facilitating factors among the array of sub-populations within a state or community. Furthermore, ATOD prevention strategies are more effective when they are provided within a relevant and meaningful cultural, gender-sensitive, and age-appropriate context, and in the participants’ primary language. Thus, reliable racial, ethnic and socio-cultural (e.g. socioeconomic status, geographic, behavioral risk factors, education level, occupation, language proficiency, birthplace) data are needed to develop and implement effective prevention, intervention, treatment, and other programs, policies, and services.


Conversely, ignoring cultural context can reduce the effectiveness of ATOD prevention strategies and even lead to negative health and social consequences. For example, potential clients may elect not to participate in ATOD prevention services for fear of being misunderstood or disrespected, or because they do not understand or trust the provider. Environmental strategies such as field checkpoints and sobriety tests are inappropriate if the instructions and commands are issued in rapid-fire English rather than the respondent’s primary language.


      6.4.2 Defining cultural competence


For the purposes of the national evaluation, we will use the definition of cultural and linguistic competence included in the National Standards for Culturally and Linguistically Appropriate Services (CLAS) in Health Care issued by the U.S. Department of Health and Human Services' (HHS) Office of Minority Health (OMH): "Cultural and linguistic competence is a set of congruent behaviors, attitudes, and policies that come together in a system, agency, or among professionals that enables effective work in cross-cultural situations." By culture, we are referring to "thoughts, communications, actions, customs, beliefs, values and institutions of racial, ethnic, religious, or social groups." We recognize that race and ethnicity are social-political constructs, and will use the classifications of race and ethnicity included in the OMB’s Revisions to the Standards for the Classification of Federal Data on Race and Ethnicity. These standards were developed to provide a common language for uniformity and comparability in the collection and use of data on race and ethnicity by Federal agencies, and are not intended to serve as definitions of race and ethnicity. The standards have five categories for data on race: American Indian or Alaska Native, Asian, Black or African American, Native Hawaiian or Other Pacific Islander, and White; and two categories for data on ethnicity: "Hispanic or Latino" and "Not Hispanic or Latino."


      6.4.3 Measuring cultural competence at State-level


We will assess the degree to which SPF-SIG grantees employ culturally competent practices within the domains described below.

Data collection: At a minimum, all ATOD-related data collected and used by SPF grantees should include the OMB's revised standard racial and ethnic categories. However, in order to capture heterogeneity within sub-populations, states are strongly encouraged to collect information about sub-groups within populations, and to employ appropriate sampling techniques (e.g. oversampling, combining multiple years of data) to increase statistical power and facilitate analysis. In addition, states are strongly encouraged to collect and/or utilize existing sociocultural data, and these variables should be defined, collected and analyzed in a consistent manner so as to ensure comparability. In some cases, additional data collection using alternate methods will need to be used to include hard-to-reach populations (e.g. migrant workers, homeless people, and battered women).

Racial and ethnic self-identification is preferred, and when it is used, respondents who wish to identify their mixed racial heritage should be able to select more than one of the racial categories. States should not establish criteria or qualifications (such as blood quantum levels) for determining racial or ethnic classifications, and the term "nonwhite" is not acceptable for use in the presentation of data or in publications or reports. Whenever possible, data should be collected in the respondent’s primary language, and culturally and linguistically appropriate data collection techniques should be employed at all times. Representatives of the groups being studied should be consulted to learn about the relevant cultural factors and language requirements.

Data use: In order to enhance the utility of the data, the SEW and other ATOD prevention stakeholder groups involved with data analysis and interpretation should be multi-disciplinary and include members who are familiar with and sensitive to the relevant cultural factors and issues. To enhance data interpretation, representatives of states’ various communities should be involved in data collection plans, encouraging participation by their communities, and providing feedback. Data should not be used in ways that would stigmatize groups or individuals, must be presented in a manner that prohibits direct or indirect identification of individuals, and must be compliant with HIPAA regulations. Data should be used to identify and monitor ATOD-related disparities among racial, ethnic, and social groups.

Building cultural competence capacity: States should have a well-defined “Cultural Competence Plan” that includes a process for integrating cultural competence in all aspects of organizational strategic planning. In order to ensure staff and other service providers at both the state and local levels have the requisite attitudes, knowledge and skills for delivering culturally competent services, the state should provide training, technical assistance, and guidelines for selecting culturally appropriate prevention strategies, and cultural tailoring. In addition, states should facilitate the development of culturally appropriate ATOD outreach, prevention and intervention activities, including the adaptation (or “cultural tailoring”) of evidence-based strategies.

We will assess grantees’ cultural competence in the domains described above via: information collected on the SPF-SIG quarterly reports; state level process evaluation reports; and the state telephone interviews. In addition, analyses of epidemiological data will reveal changes in ATOD-related health disparities over the life of the SPF-SIG for both SPF and comparison states.

      6.4.4 Measuring cultural competence at community-level

At the community level, domains and strategies to assess cultural competence will mirror those at the state level, with an added emphasis on cultural competence of the ATOD prevention strategies selected and implemented by the communities. Communities should develop a process for determining their unique needs within their populations using existing agency databases, surveys, community forums, and key informants. Selected ATOD prevention strategies should be informed by external and internal stakeholders, employ culturally appropriate delivery techniques, and utilize linguistically and culturally appropriate service modalities and models. Communities will also need to identify and mobilize community leaders, cross-system coalitions, and natural support systems.

Our primary data source for assessing cultural competence at the community level for funded communities within SPF states will be the information on program content and fidelity from the web-based community survey (described above). We also will have information on considerations of cultural competence in community-level ATOD prevention funding allocation from the SPF quarterly reports and state-level evaluation reports. In addition, there will be some information about funding allocations and assessments of cultural competence from the State system survey for both the SPF and SEW-only states.

      6.4.5 Assessing the impact of cultural competence on outcomes


Measurement is also the key to assessing the impact of cultural competence on outcomes at both state and community levels. If cultural competence can be reliably and validly measured, its influence as a mediator of outcomes can be directly tested in the context of Questions 1b (accounting for state-level variation), 2b (accounting for community-level variation within states), and 3b (accounting for community-level variation across states). Section 7 provides details on how cultural competence and other mediators will be treated in the analysis.


    6.5 Epidemiological outcomes


As indicated in Section 1, the first two goals of the SPF are to reduce substance abuse (alternatively called consumption) and substance abuse-related problems (alternatively called consequences). These are operationalized through epidemiological outcomes, i.e., population-based estimates of consumption and consequences at the state and community levels. By population-based, we mean that all members of the target population or subpopulation, or at least a representative sample of the same, have the opportunity to contribute to the measures. Pre- and post-test measures obtained from participants in prevention programs are unlikely to be population-based, as they typically include small and non-random subsets of the target population.


Table 5 shows candidate national outcome measures, identified in our original proposal, that address substance abuse, substance abuse-related problems, or both. Only measures that produce estimates at least every two years are included, since a longer interval would be less useful for tracking and reporting change. The National Survey on Drug Use and Health (NSDUH) is a natural choice to consider first. Collected annually, it contains both ATOD use measures and several important risk/protective factor measures for age groups 12-17, 18-25, and 26 and over, and state-level estimates have been available since 1999. The Behavioral Risk Factor Surveillance System (BRFSS), Monitoring the Future (MTF), and Youth Risk Behavior Surveillance System (YRBSS) are also useful for tracking use, though each has its limitations. The BRFSS is collected annually and permits state-level estimates, but does not cover illicit drugs. The YRBSS permits state-level estimates and covers illicit drugs, but is collected every two years rather than annually. Finally, MTF is collected annually and covers illicit drugs, but produces only regional (rather than state) estimates. Other instruments in the table are potentially useful for tracking substance abuse-related problems, such as AOD-related ER admissions and deaths (DAWN), AOD-related vehicular deaths (FARS), and AOD-related arrests (UCR).


At the same time, we need to recognize the limits of the national indicators with regard to the SPF SIG. For example, the NSDUH will provide state-level outcome data, but attempting to use it for community-level estimates (desirable because most interventions will not be statewide) is more tenuous. First, there is the difficulty of determining the communities in which NSDUH respondents live. Second, we do not know how many respondents would actually fall in the intervention communities. Third, samples from communities are not designed to be representative of those communities. Fourth, there may be some difficulty obtaining the confidential geographic information necessary to identify the communities of the respondents.

Table 5. Candidate national outcome measures for monitoring changes in substance abuse and substance abuse-related problems

Behavioral Risk Factor Surveillance System (BRFSS). Agency: CDC. Substances monitored: A, T use. Data on use frequency: Yes. Data on problems: Yes. Other: demographics, health risks. Frequency of collection: annual. Level of estimates possible: L, S, R, N.

Consumer Expenditure Survey (CES). Agency: BLS. Substances monitored: A, T, OD (MJ only) use. Data on use frequency: Yes. Data on problems: No. Other: demographics, expenses. Frequency of collection: annual. Level of estimates possible: selected MSAs, R, N.

Drug Abuse Warning Network (DAWN) – ER. Agency: SAMHSA. Substances monitored: A, OD-related ER admissions. Data on use frequency: No. Data on problems: Yes. Other: demographics, disposition from ER. Frequency of collection: annual. Level of estimates possible: L (21 cities), N.

Drug Abuse Warning Network (DAWN) – ME. Agency: SAMHSA. Substances monitored: A, OD-related deaths. Data on use frequency: No. Data on problems: Yes. Other: demographics, cause and manner of death. Frequency of collection: annual. Level of estimates possible: L (38 cities), N.

Fatal Accident Reporting System (FARS). Agency: NHTSA. Substances monitored: vehicular deaths from A, OD, and/or T. Data on use frequency: No. Data on problems: Yes. Other: limited demographics (age and sex), information related to the accident. Frequency of collection: annual. Level of estimates possible: L, S, N.

Monitoring the Future (MTF). Agency: NIDA. Substances monitored: A, T, OD use. Data on use frequency: Yes. Data on problems: Yes. Other: demographics, R/P factors, attitudes, perceptions of risk, school experience. Frequency of collection: annual. Level of estimates possible: R, N.

Mortality, Multiple Cause-of-Death Data. Agency: NCHS. Substances monitored: deaths from A, T, OD. Data on use frequency: No. Data on problems: Yes. Other: demographics, cause and manner of death. Frequency of collection: annual. Level of estimates possible: L, S, N.

National Hospital Discharge Survey (NHDS). Agency: NCHS. Substances monitored: A, T, OD-related medical care utilization and costs. Data on use frequency: No. Data on problems: Yes. Other: demographics, information related to hospital stay and discharge. Frequency of collection: annual. Level of estimates possible: R, N.

National Survey on Drug Use and Health (NSDUH). Agency: SAMHSA. Substances monitored: A, T, OD use. Data on use frequency: Yes. Data on problems: Yes. Other: demographics, R/P factors, health, employment, and legal status. Frequency of collection: annual. Level of estimates possible: R, S, N*.

Uniform Crime Reporting Data (UCR). Agency: FBI. Substances monitored: A, OD-related arrests. Data on use frequency: No. Data on problems: Yes. Other: demographics, type of crime (e.g., possession, sales, DUI). Frequency of collection: monthly. Level of estimates possible: L, S, N.

Youth Risk Behavior Surveillance System (YRBSS). Agency: CDC. Substances monitored: A, T, OD use. Data on use frequency: Yes. Data on problems: Yes. Other: demographics, violence risks, health risks. Frequency of collection: bi-annual. Level of estimates possible: L (22 large cities), S (43 of 50), N.

*In the NSDUH, state-level estimates have been available since 1999. Because of improvements and modifications to the 2002 NSDUH, estimates from the 2002 survey should not be compared with estimates from the 2001 or earlier versions of the survey to examine changes over time. For the purpose of evaluating the SPF SIG, this will still leave 5 years of pre-implementation data (2002-2006) to establish baseline trends.

L = local, S = State, R = Regional, N = National

SOURCE: Larson et al. (1995), updated by Arieira (2004)


A better source for community-level estimates might be State-administered data sources, because estimates are often available for individual communities within states (CSAP/NCAP, 2000). A number of states will likely have community-level outcome data for both intervention and non-intervention communities, as provided by student surveys, possibly other surveys (e.g., college student surveys), and archival indicators.


      6.5.1 National Outcome Measures (NOMs)

The epidemiological outcomes of principal interest to CSAP are contained in the National Outcome Measures (NOMs). In collaboration with States and other stakeholders, SAMHSA has recently reviewed its discretionary and block grant programs, examining their ability to capture and assess performance data on treatment and prevention outcomes. The result has been the identification of domains of National Outcome Measures (NOMs) on which grantees are expected to report. Some of these are not relevant to prevention and will not be addressed by the SPF.

Those that are relevant are listed in Table 6, along with CSAP’s suggested operationalization of each. The first four are population-based epidemiological outcomes, although the fourth, Increased Social Supports/Social Connectedness, is classified as “developmental,” meaning that its operationalization (and requirement) is still in development. Two of the remaining three non-epidemiological measures, Cost Effectiveness and Use of Evidence-Based Practices, were added as a result of the Office of Management and Budget (OMB) Program Assessment Rating Tool (PART) review of SAMHSA’s block grants.


Table 6. SAMHSA National Outcome Measures for Prevention

SAMHSA Outcome: Suggested CSAP Operationalization(s)

  • Abstinence from Drug/Alcohol Abuse: No use in the prior 30 days; Perceived risk of use; Age of First Use; Perception of Disapproval

  • Increased/Retained Employment or Return to/Stay in School: ATOD suspensions/expulsions; School attendance over enrollment; Workplace AOD use

  • Decreased Criminal Justice Involvement: Drug-related crime; Alcohol-related car crashes; Alcohol-related injuries

  • Increased Social Supports/Social Connectedness: (Developmental)

  • Increased Access to Services (Service Capacity): Number of persons served by age, gender, race and ethnicity

  • Cost Effectiveness (Average Cost): Services provided within cost bands within universal, selected, and indicated programs

  • Use of Evidence-Based Practices: Total number of evidence-based programs and strategies


While the NOMs are mandated, CSAP has given the states discretion as to data sources. This decision was based on two considerations. First, the usefulness of federally funded survey data (e.g., the NSDUH) will vary by state, particularly with respect to the availability of community-level estimates. Second, the states differ markedly in the other data sources they can bring to bear on the problem. For example, some states have extensive longitudinal data on consumption and consequences among youth from school-based surveys, while others do not. Similarly, some states have well-developed systems for maintaining and linking administrative records on school enrollment, employment, arrests, etc., while others do not. The national evaluation team has recently completed compiling a matrix of available data sources by state.


      1. Other outcomes


Some grantees have expressed concerns that the NOMs do not fully capture the consumption and consequences outcomes they hope to achieve through SPF. Therefore, we are leaving the design “open” to accommodate additional outcomes, to be determined collaboratively with the states. A working committee of interested state evaluators will be constituted for this purpose, facilitated by the national team. We anticipate that the committee will make recommendations on additional outcomes for the rest of the states to consider. If additional outcomes are adopted, the national team will provide follow-on assistance as needed to support the states in collecting them, using them for their state-level evaluations, and providing them to the national evaluation. The SEDS website (described below) will soon contain a section with suggestions for measures and data sources not provided in SEDS but which states may be able to obtain from state and local agencies.


      1. Use of the State Epidemiological Data System (SEDS) for tracking epidemiological outcomes


CSAP is making epidemiological data available to States for purposes of substance use/abuse prevention needs assessment, planning, and monitoring through the State Epidemiological Data System (SEDS) website. These data are provided as a resource for State Epidemiology Workgroups (SEWs) in support of SPF. The data system provides a preliminary set of data elements that are critical for substance use/abuse prevention planning.


SEDS provides prevention-relevant data on both consumption and consequences:

  • Consumption: These data outline patterns of alcohol, tobacco, and illicit drug use/abuse, including initiation of use, regular or typical use, and high-risk use.

  • Consequences: These data include social, health, and safety consequences associated with alcohol, tobacco, or illicit drug use/abuse. Consequences include mortality, morbidity, and other undesired events for which alcohol, tobacco, and/or illicit drugs are clearly and consistently involved. While a specific substance may not be the single cause of the consequence, scientific evidence must support a link to alcohol, tobacco, or illicit drugs as a contributing factor to the consequence.


The criteria used for selecting the indicators included in SEDS make them good candidates for SPF. These are:

  • National source. The measure must be available from a centralized, national data source.

  • Availability at the State level. The measure must be available in disaggregated form at the State level (or lower geographic level).

  • Validity. The measure must meet basic criteria for validity, e.g., the data should accurately measure the specific construct.

  • Periodic collection over at least 3 to 5 past years. The measure should be available for the past 3 to 5 years, preferably on an annual or at least bi-annual basis.

  • Consistency. The measure must be consistent, i.e., the method or means of collecting and organizing data should be relatively unchanged over time.

  • Sensitivity. For monitoring, the measure must be sufficiently sensitive to detect change over time that might be associated with changes in alcohol, tobacco, or illicit drug use/abuse.


PIRE is providing technical assistance to SPF grantees regarding the use of the SEDS data system under a separate contract. Some members of the national evaluation team also serve on the technical assistance team, helping to ensure a consistent message to states.

    6.6. Next steps


Several steps remain with respect to development of the cross-site data sources and measures:


      6.6.1. Review, initial revisions, and piloting of state-level and community-level instruments


Over the course of summer 2005, the state- and community-level instruments are being reviewed by the ETAG and the Grantees. In addition to seeking individual feedback, group feedback will be sought at both the ETAG meeting (July 22) and the state evaluators meeting (September 27-28). All instruments will be revised accordingly and made ready for piloting. We will pilot test the SLI in no more than 9 states, and the CLI in no more than 9 communities, in order to get a sense of the clarity, ease of completion, and time burden. There will not yet be any SPF communities in which to pilot the CLI; however, at least two states have volunteered to make available their current SIG subrecipients to assist in piloting the instrument, and others are being recruited. Piloting of the SLI and CLI will begin shortly after the September evaluators meeting.


      6.6.2. Final revisions and OMB submission


Another round of revisions will be completed based on the pilot results. The final versions will be tested by members of the national evaluation team to ensure that the technical aspects of the instruments function properly. Next, all instruments and data collection protocols will be submitted to OMB for approval. Protocols will include procedures for obtaining epidemiological outcome data from states in addition to obtaining process and systems change data. We anticipate receiving approval by March 2006.


      6.6.3. Agreements with states on national evaluation use of state- and community-level epidemiological data


As noted above, states differ markedly regarding availability of and access to potential outcome data sources, and the national evaluation team is currently compiling a matrix of available data sources by state. This will be followed by discussions with states, collectively at the September evaluators meeting and individually thereafter, about the logistics of sharing the data with the national team and ultimately with other states who wish to do cross-state analyses (see Section 8.4), including any confidentiality issues or other potential barriers that may arise.


  7. Analysis Plan


    7.1. Descriptive/normative analyses


Although the primary focus of the national evaluation is on assessing impact, many descriptive and normative analyses will occur first. We will use standard techniques for analyzing, displaying, and reporting descriptive and normative results as they become available throughout the evaluation period. These will include summary statistics (e.g., means and standard deviations), univariate and multivariate frequency distributions (including cross-classification displays), as well as appropriate charts and graphs. Subsequently, answers to various descriptive and normative questions, coded into numerical indicators and scales, will support the six impact questions as key predictors of systems-level and population-level outcomes. These are discussed next.
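
For illustration only, a minimal sketch of such summaries in Python/pandas, with a hypothetical data frame df containing one row per state-year and hypothetical columns (spf, period, logit_prev); the actual descriptive reporting will follow the standard techniques described above.

    import pandas as pd

    # Summary statistics by condition (hypothetical columns).
    print(df.groupby("spf")["logit_prev"].agg(["mean", "std", "count"]))

    # A simple cross-classification display (SPF status by pre/post period).
    print(pd.crosstab(df["spf"], df["period"], margins=True))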


    7.2. Inferential (cause and effect) analyses


Multilevel models are well suited to sort out the cause-and-effect relationships of state and community characteristics on changes in population outcome trends, both separately and in combination (e.g., under what mix of state and community circumstances are positive effects most likely). By properly adjusting standard errors for within-state clustering of communities and serial correlation of longitudinal outcomes, they will also increase the confidence with which observed changes can be attributed to the implementation of SPF activities. This in turn will lead to better-grounded recommendations for improving effectiveness in the future.


As in other analyses of multisite evaluations that team members have conducted (e.g., Orwin et al., 1999; 2000; 2004), we will begin these analyses with careful examination of the distributional characteristics of the data, and assess the baseline differences among all the groups being compared. Only after thoroughly understanding the particulars of the datasets will we proceed to within-state and cross-state outcome analyses, using multilevel statistical modeling methods that take account of the “nested” character of the longitudinal epidemiological outcome data within communities and states, as well as nesting of communities within states (Murray, 1998). Because data from the selected surveys will be available from earlier years, it will be important to include the trends in repeated cross-sectional measurements in the statistical models.


Like other more traditional analyses employing linear models, the approach can also reveal sites that, for any reason, are discrepant from others with similar characteristics on the variables included in the model (that is, “outliers” in the distributions of outcomes) through graphic display of estimates and residuals. This step can be the basis for beginning further analyses of the reasons for such discrepancies, which may involve values of other variables available in the data but not included in the multilevel model, or point the way to other, more global characteristics highlighted only in more narrative and qualitative data or in the expertise of state and community informants.


First, we describe our proposed use of propensity scoring to reduce potential bias from group nonequivalence at the state (SPF vs. non-SPF) and community (funded vs. non-funded) levels, respectively. Next, we describe in some detail the statistical models to be used for addressing each of the six evaluation questions.


      7.2.1. Propensity scores


There will be too many variables in the pool of potential confounding moderators to remove the effects of each individually. Instead, we will summarize the information from the pool using propensity scoring. Propensity scoring is an efficient method for reducing bias in nonrandomized designs due to variables that may be confounded with treatment/comparison group membership. In our case the groups are SPF vs. non-SPF where the state is the unit of analysis (Question 1a) and funded vs. non-funded where the community is the unit of analysis (Questions 2a and 3a). Briefly, the technique involves fitting a logistic regression model to predict group membership (i.e., predict the probability of being in the treatment group) using a series of covariates thought to be related to group membership. Formally, for subject i (i = 1, …, N), the probability of assignment to the treatment group (Zi = 1) versus the comparison group (Zi = 0), given the vector of covariates xi, is e(xi) = Pr(Zi = 1 | Xi = xi), where it is assumed that, given the X's, the Zi are independent (D'Agostino, 1998). The method was introduced by Rosenbaum and Rubin (1984) and is widely used to analyze observational studies (Rosenbaum, 2002).


While not addressing every concern with respect to causal attribution (limitations are discussed in Section 9), propensity scoring methods bring tangible improvements over earlier methods, such as analysis of covariance/regression modeling. Specifically, they free the regression modeling process from its usual reliance on a small number of covariates and simplistic functional forms (e.g., linear main effects only). Rather, a complex model with interactions and higher-order terms can be fit at the propensity scoring stage without great concern about overparameterization or multicollinearity. When subsequently included in the regression model, the propensity score carries all the information from the complex covariate model in a single variable, consuming only one degree of freedom. In addition, propensity score technology can accommodate reasonable numbers of missing observations in the covariates, so fewer cases are lost in analytic procedures requiring complete cases for inclusion. However, the most important advance may be that propensity scoring allows for direct diagnosis of the success with which confounder influence was removed, which is not possible with traditional ANCOVA models.18 Because propensity scoring is designed to remove the effects of confounding variables from the association between outcomes and exposures, the counterfactual projections of population means for the confounding variables should be the same across conditions. This property is referred to as balance. Simulations, studies of actual data, and formal proofs have shown that subclassification of the propensity score into about five strata or "quintiles" is generally sufficient to assess the quality of the adjustment for all the covariates that went into its estimation, no matter how many there are (Rubin, 1997).


The procedure for generating propensity scores will begin with our complete list of potential moderators; that is, potential predictors of group membership will be drawn from various demographic and baseline status variables from the sources listed in Table 2. A logistic model will be fit to identify which of the admissible potential confounders are actually predictive of condition and then to estimate the vector of slope parameters for those predictors. All candidate variables that significantly discriminate between conditions at the univariate level will be entered into the model on a stepwise basis.
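
To make the sequence concrete, the sketch below illustrates the two steps in Python (statsmodels), with a hypothetical data frame df (one row per state or community), a hypothetical 0/1 group indicator named funded, and hypothetical candidate covariates; the stepwise entry described above is simplified here to a univariate screen followed by a single multivariable fit, and the sketch is not the evaluation's analytic software.

    import statsmodels.api as sm

    # Hypothetical data frame `df`: one row per state (or community), a 0/1 group
    # indicator `funded`, and candidate baseline covariates from the Table 2 sources.
    candidates = ["baseline_prevalence", "pct_rural", "median_income", "prior_sig_grant"]

    # Step 1: screen candidates, keeping covariates that discriminate between
    # conditions at the univariate level.
    keep = []
    for var in candidates:
        screen = sm.Logit(df["funded"], sm.add_constant(df[[var]])).fit(disp=0)
        if screen.pvalues[var] < 0.05:
            keep.append(var)

    # Step 2: fit the multivariable logistic model and store the predicted
    # probability of group membership as the propensity score.
    X = sm.add_constant(df[keep])
    propensity_fit = sm.Logit(df["funded"], X).fit(disp=0)
    df["propensity"] = propensity_fit.predict(X)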


To test covariates for balance, the propensity scores will be ordered into five approximately equal-sized groups, or quintiles. A covariate will be considered out of balance if the F test (or χ2 test for categorical variables) of association with group membership is significant at p<0.05 within one or more quintiles. Because lack of balance sometimes results from nonlinear relationships between predictors and condition, or alternatively, interactions between predictors, a recommended practice for improving balance is to add interactions, sample size permitting (D'Agostino, 1998; Rosenbaum and Rubin, 1984). Interaction and higher-order terms (e.g., quadratics) will be added as needed until balance is achieved or can no longer be improved. To properly adjust standard errors for within-state clustering of communities (for Question 3a), the testing will be done within WesVar. WesVar uses replicate weighting methods to calculate variances, thereby ensuring proper estimation of standard errors in clustered data (Westat, 2000).
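
Continuing the same hypothetical sketch, the quintile-based balance check might look like the following (continuous covariates only; a chi-square test would replace the F test for categorical covariates, and the production testing will be done in WesVar as noted above).

    import pandas as pd
    from scipy import stats

    # Subclassify the propensity score into five approximately equal-sized strata.
    df["quintile"] = pd.qcut(df["propensity"], 5, labels=False)

    # Within each quintile, test each covariate for association with group membership.
    out_of_balance = []
    for var in keep:
        for q in range(5):
            stratum = df[df["quintile"] == q]
            treated = stratum.loc[stratum["funded"] == 1, var]
            control = stratum.loc[stratum["funded"] == 0, var]
            if len(treated) > 1 and len(control) > 1:
                f_stat, p_value = stats.f_oneway(treated, control)
                if p_value < 0.05:
                    out_of_balance.append((var, q))

    # If any covariate is out of balance, interaction or quadratic terms would be
    # added to the propensity model and the scores re-estimated, as described above.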


      7.2.2. Statistical models


Modeling will be performed using Hierarchical Linear Modeling (HLM) Version 6 (Raudenbush et al., 2004). The program handles virtually every known variety of the 2- and 3-level mixed model. The coefficients estimated by the “standard” HLM model are applicable to a hierarchical data structure with three levels of random variation in which the errors of prediction at each level can be assumed to be approximately normally distributed. Some of our outcome variables may qualify or be made to qualify through algebraic transformations. For those that do not (e.g., prevalence rates), HLM lets the user specify a nonlinear analysis appropriate for the distributional characteristics of the dependent variable (dichotomous, ordinal, counts, nominal, etc.). It also accommodates sampling weights in both linear and nonlinear models.19 This is relevant to our analysis because 1) most of the NOMs and other outcomes will not meet normality assumptions and therefore require nonlinear models, and 2) states will contribute unequal numbers of communities and population sizes to the cross-site database. Therefore, inverse weighting by these inequalities at the appropriate level will increase the generalizability of the findings.


Applied to longitudinal data, the mixed-effects approach can allow for individual-varying intercepts and slopes across time and can estimate the degree to which these time-related terms vary. Model covariates can be either time-varying or time-invariant. Thus they will accommodate both fixed (e.g., propensity scores) and dynamic (e.g., infrastructure development levels) characteristics of states and communities.


QUESTION 1a: Did SPF funding improve statewide performance on NOMs and other outcomes?


The Question 1a analysis potentially includes all states (SPF and non-SPF), and outcomes are analyzed at state-level. For illustrative purposes, we will assume that the outcome is 30-day prevalence, measured annually. Where the level-1 outcome is a rate, a linear model is not appropriate. Instead, a binomial model will be used to model the log-odds of prevalence, consistent with recommended practice (Raudenbush and Bryk, 2002; Goldstein, 1995). Where level-1 outcomes are continuous normal, dichotomous, counts, ordinal, or multinomial, the level-1 model will be adjusted accordingly. The level-2 and (where applicable) level-3 models will remain the same.


The nonlinear link function for a binomial distribution is

ψts = log(φts/(1- φts))

where ψts is the log of the odds of prevalence for state s at time t. Thus if the prevalence rate, φts, is 0.5, the odds of prevalence is 1.0 and the log-odds or "logit" is zero. When the prevalence rate is less than 0.5, the odds are less than one and the logit is negative; when the probability is greater than 0.5, the odds are greater than unity and the logit is positive. Thus, while φts is constrained to range from 0 to 1, ψts can take on any real value.
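
As a quick numeric check of the link function just described (a minimal sketch; values rounded):

    import math

    def logit(phi):
        """Log-odds of a prevalence rate phi."""
        return math.log(phi / (1 - phi))

    print(round(logit(0.50), 3))   #  0.0    (odds = 1)
    print(round(logit(0.25), 3))   # -1.099  (odds = 1/3, logit negative)
    print(round(logit(0.75), 3))   #  1.099  (odds = 3, logit positive)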


The initial level-1 structural model is:

ψts = π0 + π1*(TIME) + π2*(PERIOD) + π3*(PERxTIME) + π4*(TIME²) + π5*(PERxTIME²)


The initial level-2 structural model is:

π0 = β00 + β01*(SPF) + β02*(PROPENSITYS) + ρ0

π1 = β10 + β11*(SPF) + β12*(PROPENSITYS) + ρ1

π2 = β20 + β21*(SPF) + β22*(PROPENSITYS) + ρ2

π3 = β30 + β31*(SPF) + β32*(PROPENSITYS) + ρ3

π4 = β40 + β41*(SPF) + β42*(PROPENSITYS) + ρ4

π5 = β50 + β51*(SPF) + β52*(PROPENSITYS) + ρ5

or alternatively,

πi = βi0 + βi1*(SPF) + βi2*(PROPENSITYS) + ρi

where i=0,1,…,5

(The indexed notation will be used from here on.)


In the π0 equation, β00 is the population intercept, i.e., the average log-odds of prevalence at baseline (the first observation year). β01 is the effect of group membership (SPF v. non-SPF) on the intercept, controlling for pre-existing differences with the state-level propensity score (β02), and ρ0 is a random effect representing residual variation in intercepts across states.

In the π1 equation, β10 is a slope parameter representing the average rate of annual change in the log-odds of prevalence over time across the entire observation period, and β11 is the effect of group membership on that slope, adjusted for pre-existing differences (β12). It could also be thought of as the variance in the rate of change accounted for by group membership. ρ1 is a random effect representing residual variation in rate of change across states. The terms in the π2 and π3 equations have analogous meanings for the period (pre-implementation=0, post-implementation=1) and time-by-period interaction slopes, respectively. The π4 and π5 equations allow for the possibility that the relationships of the outcomes with time and time within period may be curvilinear. Thus, the terms β21, β31, and β51 (group differences on period, period by time, and period by time squared) represent the net effects of SPF, controlling for pre-existing differences between groups on moderators (the propensity terms), and differences between groups on the outcome intercept (β01) and secular rate of change (β11 and β41). Specifically, β21 answers "Is there a difference?"; β31 answers "Is the difference increasing or decreasing over time?"; and β51 answers "Is the increase or decrease over time accelerating or decelerating?" In addition to tests for each term individually (Ho1: β21=0; Ho2: β31=0; Ho3: β51=0), a multivariate test assesses their joint significance (Ho: β21=β31=β51=0).20 If one or more individual terms are significant, but the joint test is nonsignificant, caution is warranted in interpreting the individual terms.
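
For orientation only, the sketch below shows how a Question 1a-style specification might be written in model-fitting syntax. The actual analysis will use HLM with the binomial (logit-link) level-1 model described above; as a simplified stand-in, this sketch fits a linear mixed model to a logit-transformed prevalence using Python's statsmodels, and all column names (logit_prev, time, period, spf, propensity, state) are hypothetical.

    import statsmodels.formula.api as smf

    # Hypothetical long-format data frame `df`: one row per state-year.
    df["time2"] = df["time"] ** 2

    # Growth terms (time, period, their interactions, and quadratics) crossed with
    # the level-2 predictors (SPF membership and the state propensity score), with
    # a random intercept and random time slope for each state.
    model = smf.mixedlm(
        "logit_prev ~ (time + period + period:time + time2 + period:time2)"
        " * (spf + propensity)",
        data=df,
        groups=df["state"],
        re_formula="~time",
    )
    result = model.fit()
    print(result.summary())
    # The coefficients on the spf-by-period, spf-by-period-by-time, and
    # spf-by-period-by-time-squared interactions play the roles of beta21,
    # beta31, and beta51 in the notation above.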


QUESTION 1b: What accounted for variation in NOM and other outcomes performance across SPF states?


The Question 1b analysis includes SPF states only, and outcomes are analyzed at state-level.

The level-1 structural model is the same as for Question 1a.


The level-2 structural model (without mediators) is

πi = βi0 + βi1*(MODS1) + βi2*(MODS2) + … + βiJ*(MODSJ) + ρi

where i=0,1,…,5


In the π0 equation, β00 is again the population intercept, β01 is the effect of the first state-level moderator (MODS1) on the intercept, β02 is the effect of the second moderator (MODS2) on the intercept, and so forth for each of the j moderators in the model. As with Question 1a, ρ0 is a random effect representing residual variation in intercepts across states.

In the π1 equation, β10 is again a slope parameter representing the average rate of annual change in the log-odds of prevalence over time across the entire observation period, and β11 is the effect of MODS1 on that slope, β12 the effect of MODS2, etc. (Collectively, β11 through β1J represent the variance in the rate of change accounted for by moderators.) ρ1 is a random effect representing residual variation in rate of change across SPF states after the fixed moderator effects are specified. The terms in the π2 and π3 equations have analogous meanings for the period (pre-implementation=0, post-implementation=1) and time-by-period interaction slopes, respectively, and the π4 and π5 equations allow for the possibility that the effect of the moderators on outcomes over time, and over time within period, may be curvilinear.


Note that in Question 1a, the moderators were collapsed into a propensity score because their sole purpose was to reduce bias stemming from initial nonequivalence between SPF and non-SPF states. They have a different purpose in Question 1b, one of providing substantive explanations for variation in outcomes across SPF states. This holds for the mediators as well. The mediators (MEDS1 through MEDSJ) will be added to the models after the moderators.21 In theory, interactions between moderators and mediators can be examined as well as main effects (e.g., did the quality of SPF implementation have a greater effect on reducing substance use in states that had previously built capacity through a SIG grant), though in practice, the number of main effect and interaction terms will be limited by level-2 sample size (N=26). It may also prove useful to model mediators that are measured at multiple time points (e.g., infrastructure development levels) as time-varying. This would increase the sensitivity to variation in infrastructure growth rates across states.
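
Building on the previous sketch, and with the same caveats, a Question 1b-style specification restricted to SPF states might look as follows; the moderator names (prior_sig_grant, baseline_capacity) and the time-varying mediator (infrastructure_score) are hypothetical placeholders.

    import statsmodels.formula.api as smf

    # Restrict the hypothetical data frame to SPF states only.
    spf_states = df[df["spf"] == 1].copy()
    spf_states["time2"] = spf_states["time"] ** 2

    # Moderators replace the SPF/propensity terms, and a time-varying mediator
    # (an annual infrastructure score) is added as a covariate.
    model_1b = smf.mixedlm(
        "logit_prev ~ (time + period + period:time + time2 + period:time2)"
        " * (prior_sig_grant + baseline_capacity)"
        " + infrastructure_score",
        data=spf_states,
        groups=spf_states["state"],
        re_formula="~time",
    )
    result_1b = model_1b.fit()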


QUESTION 2a: Within states, did SPF funding lead to community-level improvement on NOMs and other outcomes?


As noted above, direct comparisons of funded vs. non-funded communities will provide useful information for the state evaluations of the SPF, and are consistent with the focus of the SPF on statewide and communitywide (i.e., population-level) impacts. The national evaluation plans to use these same community-level outcome data, as well as information on selected characteristics of communities, for the national evaluation effort (Questions 3a and 3b).

The data include all funded and non-funded communities within a particular SPF state, and outcomes are analyzed at community-level. The level-1 structural model is the same as for Question 1a, except that ψts is now ψtc, the log of the odds of prevalence at community-level rather than state (i.e., for community c at time t).

ψtc = π0 + π1*(TIME) + π2*(PERIOD) + π3*(PERxTIME) + π4*(TIME²) + π5*(PERxTIME²)


The level-2 structural model (without mediators) also remains the same, with the level-2 predictors brought down to community-level.

πi = βi0 + βi1*(FUNDED) + βi2*(PROPENSITYC) + ρi

where i=0,1,…,5


In the π0 equation, β01 is now the effect of funding (funded community v. non-funded community) on the intercept, controlling for pre-existing differences between communities (β02), and ρ0 is a random effect representing residual variation in intercepts across communities. The π1 through π5 equations are also directly analogous to the Question 1a model, with communities substituted for states.


Thus, the terms β21, β31, and β51 (group differences on period, period by time, and period by time squared) represent the net effects of being a funded community, controlling for pre-existing differences between groups on moderators (the propensity terms), and differences between funded and non-funded communities on the outcome intercept (β01) and secular rate of change (β11 and β41). As with the SPF vs. non-SPF state-level comparison, β21 answers "Is there a difference?"; β31 answers "Is the difference increasing or decreasing over time?"; and β51 answers "Is the increase or decrease over time accelerating or decelerating?" Again, the multivariate test assesses their joint significance (Ho: β21=β31=β51=0).


QUESTION 2b: Within SPF states, what accounted for variation in NOM and other outcomes performance across funded communities?


The Question 2b analysis includes funded communities within SPF states only, and outcomes are analyzed at community-level. The level-1 structural model is the same as for Question 2a, with ψtc representing the log of the odds of prevalence for community c at time t.


The level-2 structural model (without mediators) is analogous to that of Question 1b:

πi = βi0 + βi1*(MODC1) + βi2*(MODC2) + … + βiK*(MODCK) + ρi

where i=0,1,…,5


In this case, however, the moderators are measured at community- rather than state-level. Some of the k moderators will be community-level versions of state moderators (e.g., where census variables are available for both), whereas others will be specific to communities (e.g., baseline levels of prior prevention programming gleaned from the cross-site community-level survey).


The community-level mediators (MEDC1 through MEDCK) will be added to the models after the moderators. As with states, interactions between moderators and mediators can be examined as well as main effects (e.g., did implementation of interventions with cultural competence have a greater effect on reducing substance use in communities with higher initial readiness to change), though again the total number of main effect and interaction terms will be limited in practice by level-2 sample size, i.e., the number of funded communities. The severity of the limitation will vary across states, whose funding strategies and definitions of community will differ. It may also prove useful to model mediators that are measured at multiple time points (e.g., cultural competence levels) as time-varying. This would increase the sensitivity of the analysis to variation in cultural competence change rates across communities.


Note that it is not necessary or even desirable that all states choose the same moderators and mediators to address Question 2b. We encourage states to collect considerable information regarding characteristics of the intervention communities, including the specific intervention activities they implement and various measures of implementation level (e.g., dosage and fidelity). The national evaluation will also collect data from the funded communities via the web-based community-level survey, which will be made available to the state coordinators and evaluators in analyzable form. These data, along with any state-specific data, will facilitate an exploration by individual states of the relationships between such characteristics and the outcomes achieved in their state. Prior community-based research has shown that communities may vary widely in their response to an intervention. Analysis of subgroups formed from the intervention communities can be useful in identifying community attributes, including measures of implementation that influenced the effectiveness of the intervention. Findings from such analyses can be helpful in shaping future community-based interventions in the state.


QUESTION 3a: Across states, did SPF funding lead to community-level improvement on NOMs and other outcomes?


As discussed above, it is likely that impacts on pre-defined outcome measures will, for most states, be concentrated primarily in the subset of communities that receive SPF funding. Therefore, in addition to comparing SPF and non-SPF states on statewide outcome measures (Question 1a), the national evaluation will also compare communities that either receive or do not receive SPF funding and support. Community-level data from both SPF and non-SPF communities across all of the SPF states will provide a very substantial number of communities upon which to base the analysis, thus providing a level of statistical power for assessing community-level impacts of the SPF that goes far beyond what individual state analyses can offer. It will also allow for extensive subgroup analysis among intervention communities in order to examine community characteristics that are associated with the level of outcomes achieved (see Question 3b below).


To address this question, the national evaluation will assemble community-level outcome data (as described above for Question 2a) from all SPF states. The same general design and analysis strategy that applies to within-state evaluations will also be used for the national evaluation, although with a much larger N of communities. To help ensure efficiency in data collection and comparability of the outcome measures available across states, the national evaluation team will work with state evaluators to identify common measures and data sources that will serve both the needs of the individual within-state evaluations and the needs of the national evaluation.


At a minimum, we expect that states will provide summary data at the community-level (i.e., means, percentages, rates, etc.) for as many outcome measures as data are available. The analysis will be more powerful – both statistically and inferentially – where multiple time points are available both before and after implementation, rather than a single pre- and post-test. Due to the anticipated large N of communities involved, and the added power of longitudinal data (where available), analysis of the community-level indicators should provide reasonable statistical power to detect nontrivial intervention effects. Where the underlying individual-level record data are also available (e.g., student survey data, alcohol-related traffic accident records), we will consider requesting those data from the states as well. Access to the individual-level records will provide greater flexibility to the national evaluation effort in the analysis phase (e.g., flexibility to define summary measures in slightly different ways or for different demographic subgroups). These data will also permit a wider array of statistical analysis options, including deeper multi-level analysis, and potentially greater statistical power than achieved through analysis of only the community-level indicators. However, there is also a potential downside to this approach. Depending on the condition these records are in when submitted, they could significantly increase the data processing and management burden on the national evaluation team. The feasibility of productively using individual-level records will be investigated further by the data management working group as more information about the state data sources becomes available.


The Question 3a analysis includes all funded and non-funded communities within all SPF states, and outcomes are analyzed at community-level. Because communities are nested within states, this question requires a 3-level model rather than a 2-level model (outcomes within communities within states). The level-1 structural model is the same as for Question 2a, with ψtcs representing the log of the odds of prevalence for community c within state s at time t.

ψtcs = π0 + π1*(TIME) + π2*(PERIOD) + π3*(PERxTIME) + π4*(TIME²) + π5*(PERxTIME²)


The level-2 structural model (without mediators) is also the same as for Question 2a, with the level-1 intercept and growth rates modeled at level-2 as functions of community funding (yes/no) and the propensity to be funded:

πi = βi0 + βi1*(FUNDED) + βi2*(PROPENSITYC) + ρi

where i=0,1,…,5


The terms β21, β31, and β51 – group differences on period, period by time, and period by time squared – again represent the net effects of being a funded community within a state, but this time averaged across all SPF states. It could also be thought of as the generalized effect of SPF funding on communities, over and above any statewide effects of SPF.


Each of the level-2 coefficients β00 through β52 defined in the level-2 model becomes an outcome variable in the level-3 model:

βik = γik0 + γik1*(MODS1) + γik2*(MODS2) + … + γikJ*(MODSJ) + μik

where i=0,1,…,5 and k=0,1,2


In the β00 equation, γ000 is the prevalence population intercept, γ001 is the effect of MODS1 on the intercept, γ002 is the effect of MODS2 on the intercept, etc. In the β01 equation, γ011 is the effect of MODS1 on the effect of FUNDED on the intercept, γ012 the effect of MODS2, etc. The β02 equation serves to adjust the β01 equation effects for PROPENSITY. In the β10 equation, γ100 is the average prevalence change rate over time, and γ101 through γ10J are the effects of MODS1 through MODSJ on that change rate. In the β11 equation, γ111 is the effect of MODS1 on the effect of FUNDED on the change rate over time, γ112 the effect of MODS2, and so forth. The other level-3 equations are similarly interpreted. While the focus of the level-2 equations was on estimating net effects, the focus of the level-3 equations is on estimating the influence of state-level moderators on those effects. The β21, β31, and β51 parameters, which represented the net effects of FUNDED in the level-2 within-state analysis (Question 2a), are further adjusted for pre-SPF differences among states (MODS1 through MODSJ) in the level-3 cross-state analysis. γ211 through γ21J are moderator influences on the "Is there a difference?" effect, γ311 through γ31J are moderator influences on the "Is the difference increasing or decreasing over time?" effect, and γ511 through γ51J are moderator influences on the "Is the increase or decrease over time accelerating or decelerating?" effect.
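
As a rough illustration of the nesting (again a linear stand-in rather than the HLM binomial model, with hypothetical column names), the three-level structure can be approximated in statsmodels by adding a community variance component within states; cross-level terms in which state moderators multiply the FUNDED-by-period effects would be added analogously.

    import statsmodels.formula.api as smf

    # Hypothetical long-format data frame `df_c`: one row per community-year, with
    # `state` and `community` identifiers, a funding indicator, a community-level
    # propensity score, and state-level moderators.
    df_c["time2"] = df_c["time"] ** 2

    model_3a = smf.mixedlm(
        "logit_prev ~ (time + period + period:time + time2 + period:time2)"
        " * (funded + propensity_c)"
        " + state_mod1 + state_mod2",                  # hypothetical state-level moderators
        data=df_c,
        groups=df_c["state"],                          # level-3 units (states)
        re_formula="~time",                            # random intercept and time slope by state
        vc_formula={"community": "0 + C(community)"},  # level-2 random intercepts (communities)
    )
    result_3a = model_3a.fit()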


QUESTION 3b: Across SPF states, what accounted for variation in NOM and other outcomes performance across funded communities?


The Question 3b analysis also includes funded communities within and across SPF states, with outcomes analyzed at community-level. The level-1 structural model is the same as for Question 3a, with ψtcs representing the log of the odds of prevalence for community c within state s at time t.


The level-2 structural model (without mediators) is the same as for Question 2b:

πi = βi0 + βi1*(MODC1) + βi2*(MODC2) + … + βiK*(MODCK) + ρi

where i=0,1,…,5


As in Question 3a, each of the level-2 coefficients defined in the level-2 model becomes an outcome variable in the level-3 model. In this case, however, the number of equations is indefinite, as the number of level-3 outcomes represented by β00 through β5K is not known in advance:

βik = γik0 + γik1*(MODS1) + γik2*(MODS2) + … + γikJ*(MODSJ) + μik

where i=0,1,…,5 and k=0,1,…,K


The level-3 model again represents state-level influences on community-level effects. In this case, however, the influences are on community-level moderators and mediators among funded communities, rather than on the net effect of funding vs. not funding.


As noted previously, Question 3b is the analog to Question 2b, but to be addressed across all SPF states by the national evaluators instead of individually within each state by the state evaluators. Of the six questions, Question 3b may best demonstrate the power of a cross-site evaluation to yield generalizable inferences about selecting and implementing community interventions under the SPF model. For example, suppose that based on their needs assessment and problem analysis, a community within a given state elects to implement Communities Mobilizing for Change on Alcohol (CMCA), a community-based SAMHSA model program designed to reduce adolescent (13 to 20 years old) access to alcohol by changing community policies and practices (SAMHSA, 2004).22 If the implementation of CMCA fails to change outcomes in the target community, potential reasons include measurement failure, implementation failure, and theory failure. Measurement failure can be ruled out if the instrument has previously demonstrated sensitivity to change (particularly differential change across interventions), and implementation failure can be ruled out if the implementation assessment shows the program to be implemented with fidelity and cultural competence. Theory failure, on the other hand, is near impossible to “unpack” from a single implementation. Was there something about the community that made reduction of underage drinking more difficult than anticipated? If so, the answer could lie in any number of demographic, cultural, and environmental factors. Alternatively, was there something about the local adaptation of CMCA that missed the mark, yet was not picked up by monitoring systems in place at the time? Clearly, the unpacking process is greatly facilitated by having multiple replications of the same program, policy, or practice both within and across states. The integrated multilevel analysis can bring out the common elements of successful replications, be they moderators, mediators, or both, at state-level, community-level, or both. This in turn sets up an outcomes-driven empirical basis for advancing best practices that is not possible with a single study or series of case studies.


  8. Implementation of Participatory Collaborative Model


As noted in Section 1, the national evaluation team views the 26 SPF states as full partners in the design and implementation of the national evaluation, with continuing collaboration over the entire course of the study. This section describes actions to date and plans for sustaining the partnership.


    8.1. Instrument Development and Selection of Outcomes

To date, most of the grantee participation in instrument development has revolved around the State-level infrastructure development survey. Input on the draft state-level SPF process survey and Community-level survey will be sought prior to and at the September Evaluators Meeting.


      8.1.1. State-level infrastructure development survey.


As part of the "Round 0" cross-site team visits, in-person semi-structured interviews were conducted with a variety of SPF-SIG stakeholders (n=74 total) from ten SPF-SIG grantee states between January 24 and March 4, 2005. After the national evaluation team content-analyzed the responses into themes within domains and created a series of indicators for each domain, stakeholders from the remaining sites were asked to rate the appropriateness, cultural competence, and importance of each as measures of a "quality" state prevention system.


In addition, SPF-SIG Project Directors and State Evaluators from the first ten states visited were emailed the draft Domains and Indicators and asked to rate them; consequently, all site-visited states had the opportunity for input into this part of the process. Based in large part on grantee responses, the instrument is currently being revised and will be distributed for further comment prior to the September Evaluators Meeting. It will then be revised again and piloted with selected states.


      8.1.2. State-level SPF process survey.


States will be asked to review and comment on the first draft of the State-level SPF process survey prior to the September Evaluators Meeting. Based on input prior to and at the meeting, the instrument will be revised on the same timetable as the State-level infrastructure development survey, and piloted on the same selected states.

      8.1.3. Community-level survey.


States will be asked to review and comment on the first draft of the Community-level survey prior to the September Evaluators Meeting. Based on input prior to and at the meeting, the instrument will be revised on the same timetable as the State-level infrastructure development and process surveys. There will not yet be any SPF-funded communities; however, some SPF states that still have active SIG grants have offered to pilot the revised instrument on one or more of their SIG subrecipients.

      8.1.4. Epidemiological outcomes


Collection and reporting of the NOMs are mandatory, but states will have the opportunity to nominate additional outcomes that they may perceive as more sensitive indicators of SPF success. Assuming there is sufficient state interest, a committee will be formed at the September Evaluators Meeting to pursue this issue and report back to the full group.


The national evaluation team will also work with state evaluators on identifying measures and sources, and will encourage the use of definitions that are common across states in order to facilitate the national evaluation effort.


    8.2. Evaluation design


Prior to any detailed design work, Dr. Orwin presented the anticipated general approach at the December 2004 new grantees meeting, stressing that this would not be a "go-it-alone" venture by the national evaluators; rather, it would rely on a collaborative, participatory model that encouraged active involvement by states. The model was further discussed with states during the orientation session of the Round 0 site visits. Drs. Diana and Orwin then presented an overview of the proposed national design to state SPF coordinators, evaluators, SEW chairs, and others in attendance at the May 2005 grantee meeting, followed by a lengthy Q&A session. Much of the feedback from state-level stakeholders has been incorporated into the present document, in particular the "Frequently Asked Questions" of Section 9. This first draft will be distributed to states via the SPF list-serv for review and comment in advance of the September 2005 evaluators meeting. Additional feedback will be sought in one or more sessions at the meeting. Based on the combined written and oral feedback, the design will be revised as appropriate and finalized.


    8.3. Implementation


Grantees can participate in the implementation of the national evaluation in two important ways. First, they can volunteer to chair or serve on one or more of the task-oriented committees that will be launched at the September 2005 grantee meeting. Second, they can adopt a comparative design for assessing the impact of SPF funding on their targeted communities, through which they can contribute to and benefit from the integrated state/cross-state design strategy proposed in this document. Some states planned to implement a comparative design from the outset, and others are considering it. As described in Section 9, there is tremendous flexibility in how a comparative design can be conceptualized (e.g., communities do not have to be geographically distinct entities), as long as some basis of comparison exists.

    8.4. Analysis and dissemination


While the national evaluation team has a contractual obligation to analyze and report on the cross-site data, it has been our position from the outset that the SPF cross-site data should be shared with all the participating states as promptly and fully as possible. Based on experience with prior cross-site evaluations, for example, we anticipate an interest in substudies that pool data from an ad hoc subset of states with a common interest. The commonality could be demographic (e.g., states with large Hispanic populations), problem based (e.g., states with a proliferation of methamphetamine labs), or programmatic (e.g., states investing in a particular environmental strategy). In principle, the national team could offer to do all these analyses, but we would not be able to guarantee them any priority status, so it may be more efficient for the group of interested states to do it themselves, with technical assistance from the national team if needed.


Rights of authorship in presentations and publications that result from these analyses would of course go to the states that took the lead on the substudy. We also welcome the opportunity to collaborate with interested states in panel presentations and co-authored publications. States will also have the opportunity to review any official reports that the national team produces for CSAP for factual accuracy and appropriate interpretation.


  9. Frequently Asked Questions


This section provides a compiled list of "Frequently Asked Questions" (FAQs) and our attempt to address them. Most were asked by state SPF coordinators and evaluators, others were asked by federal staff, and still others we posed to ourselves. Categories include: 1) questions related to comparing funded and non-funded communities, 2) questions related to availability of data, and 3) questions related to analysis limitations. This is an initial list; we expect that other questions will arise when the states review this document and the draft instruments. As noted previously, this document does not have a separate Limitations section; instead the limitations are discussed in the context of the FAQs.


    9.1. Questions related to comparing funded and non-funded communities


Q. Must all states define “community” the same way?

A. No. Each state, and potentially each grantee community, will need to address this based on its own context. In most states, we expect that target or “catchment” areas will be defined geographically, typically on the basis of place (i.e., town, township, or city), metropolitan area, or county. Geographically-defined communities could also be aggregations of several towns or other geographic units. SPF-supported prevention activities will be expected to at least potentially reach and impact all persons, or all persons of a selected target subpopulation, throughout the entire community as it is geographically defined.


Q. What if a state defines its communities non-geographically?

A. We recognize that states may use some basis other than geography to define and fund communities, such as targeting a high-priority subpopulation--e.g., methamphetamine users--that may reside in limited geographical areas, but may also reside throughout the state. In the case of the latter, the state might choose to implement the intervention strategy statewide. It would then still be possible to compare funded with non-funded “communities” by comparing outcomes for targeted vs. non-targeted subpopulations. If the intervention strategy is successful, methamphetamine use and related problems (e.g., burns from meth lab explosions) should show a greater reduction than other substances and their associated problems (e.g., binge drinking and alcohol-related car crashes). Alternatively, states might target an ethnic population due to its high problem severity (e.g., Native Americans). The comparative logic is the same—Native Americans should outperform their counterfactual projections to a greater degree than non-native populations. Non-geographical definitions of community will cause some complications for the cross-site analysis, but can still be accommodated.


Q. What if a state uses the “equity” model for allocating intervention funds?

A. We recognize the possibility that some states may choose to allocate SPF funds across the entire state, and therefore not provide an opportunity to compare SPF-funded communities with non-funded communities. An example would be a state that elects to provide the funds to all of its counties on a per capita basis. In this case, the state evaluator could fall back to a weaker but potentially still credible counterfactual estimate: the pre-intervention trend. There are two caveats. First, this works only for outcomes with pre-intervention data at multiple time points, to permit at least a rough projection of the counterfactual post-intervention trend. Second, there remains the threat of "history" (Cook & Campbell, 1979), i.e., other events occurring coincident with or after SPF funding that plausibly influence the same outcomes.23
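
As a simple illustration of this fallback, the pre-intervention trend can be projected forward and compared with observed post-intervention values (a sketch with hypothetical numbers):

    import numpy as np

    years_pre  = np.array([2002, 2003, 2004, 2005, 2006])
    prev_pre   = np.array([0.21, 0.20, 0.20, 0.19, 0.19])   # observed pre-SPF prevalence
    years_post = np.array([2007, 2008, 2009])
    prev_post  = np.array([0.17, 0.16, 0.16])                # observed post-SPF prevalence

    slope, intercept = np.polyfit(years_pre, prev_pre, 1)    # linear pre-intervention trend
    counterfactual = intercept + slope * years_post          # projected "no-SPF" trajectory
    effect = prev_post - counterfactual                      # negative values suggest improvement
    print(np.round(counterfactual, 3), np.round(effect, 3))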


Q. Won’t comparing funded and non-funded communities under-estimate state-level effects of SPF, because allocation of prevention dollars to communities is only one aspect of SPF, whereas state-level infrastructure development activities should increase the effectiveness of prevention services statewide?

A. It could, but the purpose of comparing funded and non-funded communities is not to estimate state-level effects. Rather, it is to estimate community-level effects from states funding communities under the SPF model, both within each SPF state (Question 2a) and across all SPF states (Question 3a). The overall effectiveness of the state's SPF program is more appropriately assessed from the national evaluation's summative comparison of SPF vs. non-SPF states (Question 1a). The availability of state-level SPF process data and infrastructure development outcomes will also help us assess the likelihood that SPF funding led to state-level changes in consumption and consequences over and above the effects of funding communities, as will the over-time patterns in the longitudinal outcomes.


    9.2. Questions related to availability of data


Q. Will SEDS provide all the data that states need for their state- and community-level outcomes?

A. No. Because there are significant gaps in the data available from SEDS, especially for community-level evaluation, states are encouraged to identify their own additional sources of state- and community-level data for outcome measures not available in SEDS. For example, there are substance use consequences (e.g., ER visits and hospitalizations for AOD-related causes) that are not systematically tracked through national surveillance systems and therefore not in SEDS, but we are hopeful that many states may be able to capture these through their own state systems at both the state and community levels. The SEDS website will soon contain a section with suggestions for measures and data sources not provided in SEDS but which states may be able to obtain from state and local agencies.


Q. What if, in a given state, community-level outcomes are unavailable from some non-funded communities?

A. We recognize that not all outcome measures that may be available for state-level analysis will necessarily be available at the community-level, particularly where communities are non-funded. However, the proposed design does not require the availability of community-level outcomes from all non-funded communities within a given state. If longitudinal outcomes are available from only a subset, that may be sufficient to indicate differences in post-SPF slope changes. If longitudinal outcomes are uniformly unavailable from non-funded communities, it should still be possible to estimate aggregate outcomes from non-funded communities by subtracting funded community outcomes from statewide estimates. This alternative does not permit any adjustment for initial nonequivalence on moderators, but does permit comparison of aggregate differences in post-SPF slope changes.
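
For example, with hypothetical figures, the aggregate non-funded rate can be recovered from population-weighted statewide and funded-community rates:

    # Back-of-envelope recovery of the non-funded aggregate from statewide and
    # funded-community figures (hypothetical numbers; population-weighted rates).
    state_pop, funded_pop = 1_000_000, 250_000
    statewide_prev, funded_prev = 0.18, 0.20

    nonfunded_pop = state_pop - funded_pop
    nonfunded_prev = (statewide_prev * state_pop - funded_prev * funded_pop) / nonfunded_pop
    print(round(nonfunded_prev, 4))  # ~0.1733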


Q. Many population-based outcome measures that might be used for communities, including those in the SEDS, provide data only down to the county level. What if a state’s targeted communities are at the sub-county level, yet outcomes are only available at the county-level?

A. It may still be helpful to analyze county-level data as long as the intervention communities include a considerable proportion (e.g., more than half) of the entire county's population. We would recommend that state evaluators incorporate the coverage fraction into the analysis (through weighting or covariates) to maintain apples-to-apples comparisons across counties. The national evaluation will do the same when analyzing county-level data across states.
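
A sketch of the weighting option, with a hypothetical county-level data frame (counties) containing a funding indicator, a propensity score, an outcome change score, and the coverage fraction:

    import statsmodels.api as sm

    # Weighted least squares with the coverage fraction as the weight, so counties
    # whose targeted communities cover less of the population count for less.
    X = sm.add_constant(counties[["funded", "propensity"]])
    wls_fit = sm.WLS(counties["outcome_change"], X,
                     weights=counties["coverage_fraction"]).fit()
    print(wls_fit.summary())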


Q. What if targeted outcomes are not available at multiple pre-intervention time points?

A. The availability of multiple pre-intervention time points is very powerful both statistically and inferentially, and as noted above, periodic collection over at least 3 to 5 past years was one of the criteria established for including an indicator in SEDS. Consequently, we recommend that wherever possible, states identify community-level outcomes measured at multiple time points prior to the initiation of SPF-funded prevention activities (e.g., annual measurements for each of three years preceding implementation) to establish a pre-intervention trend. At the same time, we recognize that not all states will have community-level measures for all outcomes of interest that satisfy this criterion. However, even one point in time prior to initiation is sufficient to provide a pre-intervention level (though not a trend). This "single-pretest" comparison-group design provides less statistical and inferential power than the "multiple-pretest" comparison-group design, but is still more powerful than having no pretest at all, particularly in the context of comparison communities that also provide pretest levels.


    9.3. Questions related to analysis limitations


Q. Can the proposed technique for nonequivalence adjustment (propensity scores) really adjust out self-selection effects at the state level (Question 1a), community level within each state (Question 2a), and community level across states (Question 3a)?

A. At the state level (Question 1a), if outcomes were limited to posttest-only, the answer is probably no. It is essential to recognize that no adjustment or matching solution can be relied on to circumvent the inherent selection bias that resulted from the initial funding of the SPF states. No matter how similar the non-SPF states may be on other criterion-relevant characteristics, it would be unwise to assume that, even with statistical adjustments, states that applied for and won SPF SIG grants had the same potential to succeed at SPF objectives as states that applied and were not funded, or that simply did not apply. That assumption is tantamount to assuming that 1) states were funded at random, or 2) all relevant differences between the states were measured without error and completely explained pre-SPF nonequivalence. Where the outcomes are not limited to posttest-only, but are longitudinal outcomes with multiple pretests, self-selection is a more manageable threat. As indicated in the description of statistical models in Section 7, isolating group effects (SPF v. non-SPF) on period (post v. pre), period by time (year or bi-year, depending on the outcome), and period by time squared (allowing for curvilinearity) is one way to represent the net effects of SPF, controlling for pre-existing differences between groups on moderators (the propensity terms), but also for differences between groups on initial outcome value and secular rate of change. This means that 1) if outcomes in SPF states are starting at systematically different levels and changing at systematically different rates than in non-SPF states, independent of any SPF effects, these differences are controlled for, and 2) to the extent that these differences are effects of pre-existing differences on covariates, those are controlled for as well, even before the propensity adjustment is applied. Consequently, a perfectly unbiased adjustment from the propensity scores is less critical with longitudinal data.


A second issue is that the maximum N will be 60 (26 SPF states, 34 non-SPF states). Consequently, significance tests of balance in the covariates (essential to diagnosing the adequacy of the propensity scores in equating the two conditions on measured variables) will be underpowered, and may fail to reject the null hypothesis when it is false. There is a potential work-around. In another multisite comparison study conducted at Westat (Claus et al., 2004), balance tests revealed that the distribution of cases in outer quintiles was highly skewed across conditions, substantially reducing the "effective" sample size (the harmonic mean) per condition, a key component of statistical power. Therefore, to assess balance, a small effect size difference was used as a standard, rather than a significance level. Results from the initial propensity model revealed that 9.7% of the contrasts (14 of 145) had greater than small effect size differences between the treatment conditions. Interaction terms were added to the model, selected by testing all possible interactions where any main effect was out of balance in any quintile. In the revised model, only 6.2% of contrasts (9 of 145) had greater than small effect size differences between conditions, i.e., the model was essentially in balance. The effect size alternative to significance testing may be particularly justified in the present case because the 60 states are a census, not a sample. Technically, there is no sampling error, so one might argue that statistical tests are irrelevant at this level.
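
A sketch of this effect-size standard, continuing the hypothetical balance-check example given earlier under the propensity scores discussion (|d| > 0.2 taken as a greater-than-small difference, per Cohen, 1988):

    import numpy as np

    def cohens_d(a, b):
        """Standardized mean difference with a pooled standard deviation."""
        pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
        return (a.mean() - b.mean()) / pooled_sd

    flagged = []
    for var in keep:                      # covariates from the propensity model
        for q in range(5):                # the five propensity quintiles
            stratum = df[df["quintile"] == q]
            treated = stratum.loc[stratum["funded"] == 1, var]
            control = stratum.loc[stratum["funded"] == 0, var]
            if len(treated) > 1 and len(control) > 1 and abs(cohens_d(treated, control)) > 0.2:
                flagged.append((var, q))

    print(f"{len(flagged)} covariate-by-quintile contrasts exceed a small effect size")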


At the community level within each state (Question 2a), the situation faced by most state-level evaluators will be analogous to that of the national evaluators with states. That is, there is an inherent selection bias that no adjustment or matching solution can circumvent because, by definition, resource allocation based on data-driven planning is neither random nor random-like. States can select communities for any number of reasons (e.g., high in need, high in capacity, high in readiness to change), but whatever the reason, funded communities are expected to be demonstrably different from non-funded communities in ways validated by data. However, unlike the national evaluators, states have the opportunity to avoid the inherent selection bias problem by enumerating a statewide pool of qualified communities and then randomly funding a subset, with the remainder serving as control communities.24 This effectively removes selection bias, and is ethically defensible wherever the needs exceed the funds to meet them (Boruch, 2005). Whether it is politically or administratively feasible will of course vary by state. Where states fund nonrandomly, the issues for addressing Question 2a parallel those of Question 1a, as do the potential solutions (e.g., emphasize longitudinal models that are less sensitive to selection bias).


At the community level across states (Question 3a), this concern diminishes. Most importantly, there will be 26 times the mean number of communities per state, dramatically increasing the ability to test for and improve balance. In addition, the inherent selection bias of the within-state community funding choices dissipates in the cross-site analysis due to variation across states in their rationales for selecting communities.

Q. Don’t propensity scores have other limitations as well?

A. Yes. Like traditional methods for removing group nonequivalence, propensity score methods can adjust only for confounding covariates that are observed and measured. This is always a limitation of nonrandomized studies compared with randomized studies, where the randomization tends to balance the distribution of all covariates, observed and unobserved. However, tests can be devised to determine the robustness of the conclusions to potential influences of unobserved covariates (Rosenbaum, 2004). Such sensitivity analyses suppose that a relevant but unobserved covariate has been left out of the propensity score model. By specifying how this hypothetical unmeasured covariate is related to treatment assignment and outcome, one can estimate how the adjusted treatment effect might change if such a covariate were available for adjustment. Moreover, propensity scores appear to be more robust to certain types of specification error than standard methods. In a simulation investigating the relative influence of specification error in propensity scores versus regression models, Drake (1993) found that propensity scores are as vulnerable as standard methods to bias from omitted variables, but less vulnerable to bias from variables that are included but in the wrong functional form (e.g., linear rather than quadratic).
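
The following sketch illustrates the logic of such a sensitivity analysis in simulation form (it is not Rosenbaum's formal bounds): an unmeasured covariate u with assumed links to selection and to the outcome is generated, and the adjusted treatment estimate is compared with and without it. All variable names and coefficients are hypothetical.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 60
x = rng.normal(size=n)                    # measured covariate
u = rng.normal(size=n)                    # hypothetical unmeasured covariate
treat = rng.binomial(1, 1 / (1 + np.exp(-(x + 0.8 * u))))  # selection depends on both
y = 1.0 * treat + 0.5 * x + 0.7 * u + rng.normal(size=n)   # outcome depends on both

def adjusted_effect(covariates):
    """OLS treatment coefficient after adjusting for the given covariates."""
    X = sm.add_constant(np.column_stack([treat] + covariates))
    return sm.OLS(y, X).fit().params[1]

print("adjusting for x only: ", round(adjusted_effect([x]), 2))
print("adjusting for x and u:", round(adjusted_effect([x, u]), 2))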


A second limitation of propensity score methods, namely that they require reasonably large samples to support the subclassification, will definitely be a factor where the state or a community within a single state is the unit of analysis (Questions 1a and 2a). It is less of a factor where the community across multiple states is the unit of analysis (Question 3a). Similarly, sample size will be adequate to detect lack of balance through traditional statistical tests for Question 3a, but not for Questions 1a and 2a. Consequently, a finding of balance may mean that balance was achieved, or it may simply mean that power was insufficient to detect imbalance, as described above.
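
For reference, the subclassification estimator that drives this sample-size requirement (Rosenbaum & Rubin, 1984) can be sketched as follows: the funded-versus-comparison difference is computed within each propensity quintile and averaged with weights proportional to stratum size. The large synthetic sample stands in for the pooled community-level analysis (Question 3a); all names and values are illustrative.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000                                               # pooled community-level case
x = rng.normal(size=(n, 3))
treat = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))
y = 0.5 * treat + x[:, 0] + rng.normal(size=n)

ps = LogisticRegression(max_iter=1000).fit(x, treat).predict_proba(x)[:, 1]
strata = pd.qcut(ps, 5, labels=False)                  # propensity quintiles

effects, weights = [], []
for q in range(5):
    in_q = strata == q
    y1, y0 = y[in_q & (treat == 1)], y[in_q & (treat == 0)]
    if len(y1) and len(y0):
        effects.append(y1.mean() - y0.mean())          # within-stratum difference
        weights.append(in_q.sum())                     # stratum size as weight
print("subclassification estimate:", np.average(effects, weights=weights))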


References


Baron, R.M., and Kenny, D.A. (1986). The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology 51 (6): 1173-82.

Boruch, R.F. (2005). Better Evaluation for Evidence-Based Policy: Place Randomized Trials in Education, Criminology, Welfare, and Health. In R.F. Boruch (Ed.) Place Randomized Trials: Experimental Tests of Public Policy. Annals of the American Academy of Political and Social Science, 599.

Claus, R. E., Orwin, R. G., Kissin, W., Krupski, A., & Campbell, K. (2004). Continuity of care for substance abusing women who enter specialized and standard residential treatment. Presentation at the Annual Meeting of the American Public Health Association (APHA), Washington, DC. (November).

Cohen, J. (1988). Statistical power analysis for the behavioral sciences. (2nd ed.) Hillsdale, NJ: Erlbaum.

Cook, T. D., and D. T. Campbell. (1979) Quasi-Experimentation: Design and Analysis Issues for Field Settings. Chicago: Rand McNally.

CSAP (2004). Strategic Prevention Framework State Incentive Grants (Short Title: SPF SIG) SP 04-002. (Initial Announcement)

D'Agostino, R. B., Jr. (1998). Tutorial in biostatistics: Propensity score methods for bias reduction in the comparison of a treatment to a non-randomized control group. Statistics in Medicine, 17, 2265-2281.

Department of Health and Human Services, Office of Civil Rights. (1998) Guidance Memorandum: Title VI Prohibition Against National Origin Discrimination-Persons with Limited-English Proficiency.

Donabedian, A. (1988). The quality of care: How can it be assessed? Journal of the American Medical Association, 260, 1743-1748.

Drake, C. (1993). Effects of misspecification of the propensity score on estimators of treatment effect. Biometrics, 49(4), 1231-1236.

Fulbright-Anderson, K., Kubisch, A. & Connell, J. (Eds.). (1998), New approaches to evaluating community initiatives: Theory, measurement, and analysis. Washington, DC: The Aspen Institute.

Greenwood, D.J. & Levin, M.  (1998).  Introduction to action research: Social research for social change.  Thousand Oaks, CA: Sage Publications.

HRSA. (2000). Lesbian, Gay, Bisexual, and Transgender Health: Findings and Concerns. Washington, DC: HRSA.

Institute of Medicine (2001), Unequal Treatment: Confronting Racial and Ethnic Disparities in Healthcare. National Academies Press.

Mark, M. M., Henry, G. T., & Julnes, G. (2000). Evaluation: An integrated framework for understanding, guiding, and improving policies and programs. San Francisco: Jossey Bass.

Mark, M. M., Hofmann, D. A., & Reichardt, C. S. (1992). Testing theories in theory-driven evaluations: (Tests of) moderation in all things. In H. T. Chen & P. H. Rossi (Eds.), Using theory to improve program and policy evaluations (pp. 71-84). Westport, CT: Greenwood Press. 

Murray, D. M. (1998). Design and analysis of group-randomized trials. New York: Oxford University Press.

Office of Management and Budget. Revisions to the Standards for the Classification of Federal Data on Race and Ethnicity. Federal Register Notice. October 30, 1997;62(210)

Orwin, R. G., Ellis, B., Williams, V., & Maranda, M. (2000). Relationships between treatment components, client-level factors, and positive treatment outcomes. Journal of Psychopathology and Behavioral Assessment, 22, 383-397.

Orwin, R. G., Iachan, R., Ellis, B., & Wolters, C. (1999). The National Treatment Improvement Evaluation Study: Multilevel reanalysis of treatment outcomes. Proceedings of the Survey Research Methods Section of the American Statistical Association, pp. 221-226.

Orwin, R. G., Kissin, W., & Claus, R. (2004). Specialized versus Standard Treatment for Women with Children: Attending to Heterogeneity in a Multisite Study. Presented at the College on Problems of Drug Dependence, San Juan, PR.

Orwin, R. G., Sonnefeld, L. J., Cordray, D. S., Pion, G. M., & Perl, H. I. (1998). Constructing quantitative implementation scales from categorical services data. Examples from a multisite evaluation. Evaluation Review, 22, 245-288.

Orwin, R.G. (2000). Assessing program fidelity in substance abuse health services research. Addiction, 95, 5309-5328.

Panel on DHHS Collection of Race and Ethnicity Data (2004). Eliminating Health Disparities: Measurement and Data Needs. Michele Ver Ploeg and Edward Perrin, Editors, Committee on National Statistics, National Research Council of the National Academies.

Popham, W. J. (1975). Educational Evaluation. Englewood Cliffs, NJ: Prentice-Hall.

Provus, M. M. (1971). Discrepancy evaluation for education program improvement and assessment. Berkeley, CA: McCutchan.

Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods, Second Edition. Newbury Park, CA: Sage.

Raudenbush, S. W., Congdon, C., & Bryk, A. S. (2004). HLM6: Hierarchical Linear and Nonlinear Modeling. Lincolnwood, IL: Scientific Software International.

Rosenbaum, P. R. (2002). Observational studies. New York: Springer.

Rosenbaum, P.R. (2004) Design sensitivity in observational studies. Biometrika, 91, 1, pp. 153–164.

Rosenbaum, P.R., & Rubin, D.B. (1984). Reducing bias in observational studies using subclassification on the propensity score. Journal of the American Statistical Association, 79, 516-524.

Rubin, D. B. (1997). Estimating causal effects from large data sets using propensity scores. Annals of Internal Medicine, 127, 757-63.

SAMHSA. (2004). Communities Mobilizing for Change on Alcohol (Fact Sheet) http://modelprograms.samhsa.gov/pdfs/FactSheets/Cmca.pdf

Siedman, J., Steinwachs, D., & Rubin, H. (2003). Conceptual framework for a new tool for evaluating the quality of diabetes consumer-information web sites. Journal of Medical Internet Research, 5(4), e29.

Sonnefeld, L.J., Orwin, R.G., et al. (1998). Integrating process and outcome evaluation: Developing project-level databases for multi-site evaluations and multi-level analyses. Cambridge, MA: Evaluation Center@HSRI.

Steckler A. (1989) The use of qualitative evaluation methods to test internal validity. Evaluation and the Health Professions. 12:115–33.

U. S. General Accounting Office (1991). Designing evaluations. PEMD-10.1.4, 5/91. Available online in .pdf format.

U.S. Department of Health and Human Services, Agency for Healthcare Research and Quality. (2004). National Healthcare Disparities Report. Rockville, MD.

Weiss, C. (1997). Theory-based evaluation: Past, present and future. In D. J. Rog and D. Fournier (Eds.), Progress and Future Directions in Evaluation: Perspectives on Theory, Practice and Methods. New Directions for Program Evaluation, no. 76. San Francisco: Jossey-Bass.

Westat (2000). WesVar® 4.2 User's Guide. Rockville, MD: Westat.

Wholey, J. S. (1979). Evaluation: Promise and performance. Washington, DC: Urban Institute.


















APPENDIX G


SPF SIG Community-level Instrument Crosswalk

SPF SIG Community-level Instrument Part I Items by Domain Index and Logic Model Component

Table columns (repeated below for each item): Item # | Item Text | Logic Model Component | Domain

1

What is the state procedure for monitoring the SPF process at the community level?
" _ Communities must submit formal results of a needs and resources assessment.
" _ Communities must submit formal strategic plans.
" _ Communities must obtain approval of their strategic plans.
" " _ The state monitors the communities’ intervention selection to ensure that interventions match the target outcomes and causal factors identified.
_ The state conducts community-level evaluation.
_ The community conducts its own evaluation and reports back to the state.
_ Don’t know

State Level: Planning & Implementation

State Questions

2

Did your state select a statewide substance abuse problem that all funded communities are targeting?
_ Yes
" _ No (If no, proceed to question 4.)

State Level: Planning & Implementation

State Questions

3

Describe the statewide substance abuse problem the communities are focusing on:

State Level: Planning & Implementation

State Questions

4

Is your state allowing the communities to proceed with the Strategic Prevention Framework without conducting a community needs and resources assessment?
" _ Yes
" _ No

State Level: Planning & Implementation

State Questions

5

Did you conduct an assessment of the training and technical assistance needs of the prevention workforce within your state, during this reporting period?
" _" Yes
" _ " No

State Level: Planning & Implementation

State Questions

6

Were there opportunities for skills development and/or continuing education for the prevention workforce within your state, during this reporting period?
" _" Yes
" _" No
" _" Don’t know

State Level: Planning & Implementation

State Questions

7

Indicate the month and year this community partner began receiving SPF SIG funds:
MM/YYYY ____/______

Community Level: SPF $ in Selected Communities

State Questions

8

Indicate the month and year SPF SIG funding for this community partner is scheduled to end for the overall project.
MM/YYYY ____/______

Community Level: SPF $ in Selected Communities

State Questions

9

Select the description of “community” being used by this community partner.
" _ A defined geographic area, such as a neighborhood, city, or county
" _ A specific statewide target population, such as high school students
" _ A specific target population within a defined geographic area
" _ Don’t know yet
" _ Other (Describe.)______________

Community Level: Planning & Implementation

State Questions

10

Are there specific workforce issues within this SPF SIG community that we should be aware of?
" _" Yes
" _" No (If no, proceed to question 12.)
" _" Don’t know (If marked, proceed to question 12.)

Community Level: Baseline Status

State Questions

11

Describe the community-level workforce issues.

Community Level: Baseline Status

State Questions

12

Name: _______________

n/a

Record Management

13

Title: ____________

n/a

Record Management

14

Name of organization: ______

n/a

Record Management

15

Telephone number: ________

n/a

Record Management

16

Email address: ___________

n/a

Record Management

17

Instrument submission date: ______________(Web programming note: this field will autofill based on the date the state provides approval of the instrument and submits it.)

n/a

Record Management

18

State: ___________________

n/a

Record Management

19

Create a Community Partner Grantee ID_______________

n/a

Record Management

20

Mark the timeframe for which you are reporting.
" October 1, 2004–March 31, 2005
" April 1, 2005–September 30, 2005
" October 1, 2005–March 31, 2006
" April 1 2006–September 30, 2006
" October 1, 2006–March 31, 2007
" April 1, 2007–September 30, 2007
" October 1, 2007–March 31, 2008
" April 1, 2008–September 30, 2008
" October 1, 2008–March 31, 2009
" April 1, 2009–September 30, 2009

n/a

Record Management

21

As a community partner, what type of organization would you say you are?
" _ Non youth-focused, local grassroots or community-based service and/or advocacy organization (e.g., substance abuse prevention organizations, HIV prevention organizations, YMCAs)
"" _ Faith-based organization
" _ Youth-focused local grassroots or community-based service and/or advocacy organization (e.g., local chapter of Students Against Destructive Decisions, local youth councils, Boy Scouts/Girl Scouts, Big Brothers/Big Sisters)
"" _ Other non-profit organization, not listed above
" _ School district
"" _ Law enforcement organization
" _ College/university
" _ Government agency
" _ Local healthcare facility, treatment or prevention provider/facility (e.g., local hospital, community mental health center, local substance abuse prevention agency)
" _ Other (Describe.)__________

Community Level: Planning & Implementation

Community Partner Organizational Information/Type and Funding

22

Are you partnering with a community coalition?
" _" Yes
" " _ No (If no, proceed to question 25.)

Community Level: Planning & Implementation

Community Partner Organizational Information/Type and Funding

23

What month and year was the coalition established?
MM/YYYY ____/______
" Don’t know

Community Level: Planning & Implementation

Community Partner Organizational Information/Type and Funding

24

Indicate the role of the coalition in changing community capacity, knowledge, norms and behaviors related to substance abuse prevention and program implementation.
" _" Collect and organize data
"" _ Conduct needs assessments
"" _ Train community members in substance abuse prevention
" _ Leverage funds from sources other than the SPF SIG
" _ Plan and/or implement interventions
" _ Ensure SPF SIG funded intervention(s) address issues related to cultural competence
" _ Plan and/or implement process or outcome evaluations of interventions
" _ Set substance abuse policy at the organizational, local, or state level
" _ Educate others about needed changes in substance abuse policy at the organizational, local, or state level
"" _ Other (Describe.)
" _" Don’t know

Community Level: Planning & Implementation

Community Partner Organizational Information/Type and Funding

25

Do you currently receive alcohol, tobacco or other drug prevention funding from sources other than the SPF SIG Initiative?
" _ Yes
"" _ No (
If no, proceed to question 27.)
" _ Don’t know (
If marked, proceed to question 27.)

Community Level: Planning & Implementation

Community Partner Organizational Information/Type and Funding

26

What other types of funding do you currently receive?
" _ State funds
" _ County or municipal funds
" _" Foundation funds
" " _ Private contributions from individuals
" _" Corporate contributions
" _" Weed and Seed
" _" Federal Substance Abuse Prevention and Treatment Block Grant funds
" _" Drug Free Communities funds
" _" Safe and Drug Free Schools funds
" _" SIG funds (this is funding that came from the first round of State Incentive Grants, and does not include current SPF SIG funding)
" _" SIG planning funds
" _" SIG enhancement funds
" " _ Community Anti-Drug Coalitions of America (CADCA)
" _" Department of Justice, Office of Juvenile Justice and Delinquency Prevention funds
" _" Medicaid, as provided by a managed care organization
" _" Other Federal funds (Describe.)
" _" Other (Describe.)
" _" Don’t know

Community Level: Planning & Implementation

Community Partner Organizational Information/Type and Funding

27

Indicate the areas in which you, as the community partner, have formal, written policies and practices in place to address cultural competence.
" _
Organizational administration (e.g., purchasing, contracting)
" _ Board representation (e.g., board recruitment, board leadership)
" _ Training and staff development
" _ Language and internal and external communication (e.g., availability of interpreters, documents avoid derogatory language)
" _ Service approach
" _ Evaluation design
" _ Data collection (qualitative and quantitative)
" _ Other (Describe.)
" _ We are aware that cultural competence is an issue but we have not developed formal, written policies yet or these policies are currently being developed. (If marked, proceed to question 30.)
" _ Don’t know (If marked, proceed to question 30.)
" _ Not applicable (If not applicable, proceed to question 30.)

Community Level: Baseline Status

Community Partner Organizational Information/Cultural Competence Policies and Practices

28

How is compliance with cultural competence policies and/or practices monitored within your organization, as the community partner?
_ Compliance is not monitored at all
_ Compliance is monitored once a year or less frequently by a director, executive, or administrator
_ Compliance is monitored twice a year or more often by a director, executive, or administrator
_ Compliance is monitored once a year or less frequently by someone other than a director, executive, or administrator
_ Compliance is monitored twice a year or more often by someone other than a director, executive, or administrator
_ Don’t know if compliance is monitored or don’t know how compliance is monitored

Community Level: Baseline Status

Community Partner Organizational Information/Cultural Competence Policies and Practices

29

If contract agencies are used, are they held to the same standards with regard to cultural competence (Web programming note: definition link)?
_ " Yes
" " _ " No
" _ " " Don’t know
" _ " " Not applicable

Community Level: Baseline Status

Community Partner Organizational Information/Cultural Competence Policies and Practices

30

Did you receive SPF SIG funded guidance, training or technical assistance with regard to cultural competence during this reporting period?
" _ " " Yes
" _ " " No (If no, proceed to
question 32.)

Community Level: Planning & Implementation

Community Partner Organizational Information/Cultural Competence Policies and Practices

31

How likely is it that you will use what you learned during the guidance, training or technical assistance on cultural competence in your SPF SIG activities?
" _ " " Very likely
" _ Somewhat likely
"_ Not likely

Community Level: Planning & Implementation

Community Partner Organizational Information/Cultural Competence Policies and Practices

32a

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

" Needs assessments

Community Level: Planning & Implementation

SPF

32b

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

Capacity building

Community Level: Planning & Implementation

SPF

32c

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

Strategic plan development

Community Level: Planning & Implementation

SPF

32d

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

Intervention implementation

Community Level: Planning & Implementation

SPF

32e

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

" Monitoring and evaluation

Community Level: Planning & Implementation

SPF

33a

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

Percentage of Time Spent This Reporting Period ______%

Community Level: Planning & Implementation

SPF

33b

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

Percentage of Time Spent This Reporting Period ______%

Community Level: Planning & Implementation

SPF

33c

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

Percentage of Time Spent This Reporting Period ______%

Community Level: Planning & Implementation

SPF

33d

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

Percentage of Time Spent This Reporting Period ______%

Community Level: Planning & Implementation

SPF

33e

For questions 32 and 33 below, indicate which of the five component(s) of the Strategic Prevention Framework you worked on during this reporting period and indicate the approximate percentage of time you spent on this component during this reporting period.

Percentage of Time Spent This Reporting Period ______%

Community Level: Planning & Implementation

SPF

34

Have you completed an organizational needs and resources assessment (Web programming note: definition link) during this reporting period?

_"" " Yes
_"" " No (If no, proceed to question 37.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

35

Indicate the types of organizational needs and resources you assessed.
_" Mission/vision
_" Leadership ability
_" Cultural competence
_" Human resources
_" Technical resources
" _ Infrastructure
_" Funding sources
_" Organizational experience
_" Up-to-date knowledge of substance abuse prevention
_" Other (Describe.) __________

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

36

For all items marked in question 35 (above), describe the specific organizational needs and resources that were identified.

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

37

Have you completed a community needs and resources assessment during this reporting period?
_ " Yes
" _ " No (If no, proceed to question 63.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

38

Indicate the types of community needs and resources that were assessed.
_" Data on populations not typically included in assessments (e.g., homeless, undocumented workers)
_" " Prevention resources (e.g., call centers and trained counselors)
_" " Cultural competence
_" " Partnerships within the community
_" " Substance use rates of the potential target population
_" " Substance use consequences in potential target populations, (e.g., alcohol-related mortality)
_" " Factors that might cause, lead to, or promote substance use
_" " Experience within the community of working with the potential target population (e.g., previous encounters with the target population perhaps in serving members with prevention services or in conducting outreach to this population).
_" " Community readiness (If selected, you must complete question 40 below.)
_" " Workforce training issues within the community (e.g., not enough slots in a community-college training program)
_" " Other (Describe.) __________

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

39

Describe the community needs and resources identified through the assessment. (Provide a written description in the space available.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

40

If you indicated in question 38 that you assessed community readiness, did you use a community readiness measure that has been tested and/or published?
_" " Yes
_" " No (If no, proceed to question 43.)
_" " Don’t know (If marked, proceed to question 43.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

41

If yes, what measure was used?

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

42

What were the results of the community readiness assessment?

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

43a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Student school survey data

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

43b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Student school survey data) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

44a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

School achievement data

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

44b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(School achievement data) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

45a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Community surveys

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

45b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Community surveys) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

46a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Public health statistics (e.g., mortality rates due to drug overdose)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

46b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Public health statistics ) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

47a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Census data

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

47b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Census data) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

48a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Interviews and/or focus groups

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

48b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Interviews and/or focus groups) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

49a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Public meetings or forums

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

49b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Public meetings or forums) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

50a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Law enforcement data (e.g., drug arrests or drug trafficking)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

50b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Law enforcement data) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

51a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Department of Justice data (e.g., outcomes of criminal cases)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

51b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Department of Justice data) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

52a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Public safety data (e.g., number of automobile accidents caused by drinking and driving)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

52b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Public safety data) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

53a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Social norms data

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

53b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Social norms data) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

54a

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

Other (Describe.)_________

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

54b

For questions 43 through 54 indicate below the types of data you used in conducting your needs and resources assessment and indicate if the data were provided to you by the State Epidemiology and Outcomes Workgroup (SEOW).

(Other) Provided by SEOW

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

55

Based on the needs and resources assessments described above, have you identified consumption patterns that you are going to target for substance abuse prevention?
_" " Yes
_" " No (If no, proceed to question 58.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

56

Indicate the consumption patterns you are targeting: (Select all that apply.)
_" " Underage use of alcohol
_" " Any use of alcohol
_" " Heavy use of alcohol, defined as consuming five or more drinks on five or more occasions in the past 30 days
_" " Binge drinking, defined as consuming five or more drinks in a row at one sitting for males and four or more in a row for females
_" " Any use of tobacco under age 18
_" " Any use of tobacco 18 years of age or older
_" " Any use of illegal drugs (If selected, complete question 57.) (If this is not selected, automatically skip question 59.)
_" " Other consumption pattern (Describe.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

57

If you indicated in question 56 that you are targeting the use of illegal drugs, indicate which drugs you are targeting:
_" " All illegal drugs
_" " Marijuana
_" " Ecstasy
_" " Cocaine
_" " Crack cocaine
_" " Methamphetamine/Crystal meth
_" " Other substances (Describe.) _________

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

58

Based on the needs and resources assessments described above, have you identified consequences (Web programming note: definition link) of substance use that you are targeting?
_" " Yes
_" " No (If no, proceed to question 60.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

59

Indicate the consequences you are targeting:
_" " Motor vehicle crashes
_" " Crime
_" " Dependence or abuse
_" " Alcohol-related mortality
_" " Tobacco-related mortality
_" " Drug-related mortality
_" " Other consequences (Describe.) ____

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

60

Based on the needs and resources assessments described above, have you identified specific populations that you will be targeting for SPF SIG funded substance abuse prevention?
" _" Yes
_" " No (If no, proceed to question 63.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

61

Indicate the populations you will be targeting for substance abuse prevention.
_" " All races/ethnicities
_" " Specific races/ethnicities
" _" A"frican American
" _" American Indian/Alaska Native
" _" Asian/Pacific Islander
" _" White
" _" Hispanic
_"" " Elementary school students
_"" " Middle school students
" _"" High school students
_"" " College students
_"" " Under 18
_"" " Under 21
_"" " Young adults age 18-25
_"" " Construction workers
_"" " Pregnant women
_"" " Gay/Lesbian/Bisexual/Transgender/Men who have sex with men
_"" " Other target population (Describe.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

62

If you are targeting specific consumption patterns or consequences with specific target populations, use the space below to describe those connections.
Consumption Pattern or Consequence
Target Population

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

63

Did you receive SPF SIG funded guidance, training or technical assistance with regard to conducting a needs and resources assessment during this reporting period?
_"" " Yes
_"" No (If no, proceed to question 65.)

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

64

How likely is it that you will use what you learned during the guidance, training or technical assistance on needs and resources assessment in your SPF SIG activities?
_"" " Very likely
_"" Somewhat likely
_"" " Not likely

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

65

If your project experienced any challenges with conducting needs and resources assessments (including coalition needs and resources) during this reporting period, please describe them here. Examples might include difficulty scheduling time with key individuals to determine need, challenges accessing data, or difficulty finding the resources (time and money) to conduct the needs assessment.

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

66

If your project experienced any specific successes with conducting needs and resource assessments (including coalition needs and resources) during this reporting period, please describe them here. Examples might include identifying appropriate data or being able to contact key individuals for their input into the assessment.

Community Level: Planning & Implementation

SPF/Needs and Resources Assessments

67a

Has this position been vacant at all during this reporting period?

Leader/director/manager
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67b

Has this position been vacant at all during this reporting period?

Coordinator
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67c

Has this position been vacant at all during this reporting period?

Evaluator
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67d

Has this position been vacant at all during this reporting period?

Curriculum/Intervention Developer
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67e

Has this position been vacant at all during this reporting period?

Curriculum/Intervention Facilitator
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67f

Has this position been vacant at all during this reporting period?

Curriculum/Intervention Aide
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67g

Has this position been vacant at all during this reporting period?

Volunteers/Interns (non-paid positions)
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67h

Has this position been vacant at all during this reporting period?

Other 1_________
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67i

Has this position been vacant at all during this reporting period?

Other 2 _________
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

67j

Has this position been vacant at all during this reporting period?

Other 3 __________
" _ Yes
" _ No
" _ NA

Community Level: Planning & Implementation

SPF/Capacity Building

68a

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Leader/director/manager

Community Level: Planning & Implementation

SPF/Capacity Building

68b

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Coordinator

Community Level: Planning & Implementation

SPF/Capacity Building

68c

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Evaluator

Community Level: Planning & Implementation

SPF/Capacity Building

68d

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Curriculum/Intervention Developer

Community Level: Planning & Implementation

SPF/Capacity Building

68e

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Curriculum/Intervention Facilitator

Community Level: Planning & Implementation

SPF/Capacity Building

68f

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Curriculum/Intervention Aide

Community Level: Planning & Implementation

SPF/Capacity Building

68g

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Volunteers/Interns (non-paid positions)

Community Level: Planning & Implementation

SPF/Capacity Building

68h

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Other 1_________

Community Level: Planning & Implementation

SPF/Capacity Building

68i

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Other 2_________

Community Level: Planning & Implementation

SPF/Capacity Building

68j

If the position was vacant, indicate how many weeks during this reporting period it was vacant.

Other 3_________

Community Level: Planning & Implementation

SPF/Capacity Building

69

Did you work on capacity building activities during this reporting period?
_"" Yes
_"" No (
If no, proceed to question 71.)

Community Level: Planning & Implementation

SPF/Capacity Building

70

Indicate the activities you conducted during this reporting period to improve organizational and/or coalition resources.
_" We did not conduct organizational/coalition capacity building activities during this reporting period.
_"" Wrote, reviewed or rewrote organizational or coalition mission/vision
_"" Identified key organizational or coalition activities and goals
_"" Hired staff
_"" Trained staff
_"" Identified coalition leader(s)
_"" Improved cultural competence
_"" Identified or secured physical space
_"" Coordinated or improved technical resources
_"" Coordinated data collection and/or management information systems (MIS) plans
_"" Other: (Describe.) __________

Community Level: Planning & Implementation

SPF/Capacity Building/Organizational Resources

71

Did you receive SPF SIG funded guidance, training or technical assistance with regard to staff, task force, and/or coalition member training during this reporting period?
_"" Yes
_"" No (If no, proceed to question 73.)

Community Level: Planning & Implementation

SPF/Capacity Building/Organizational Resources

72

How likely is it that you will use what you learned during the guidance, training, or technical assistance on staff, task force, and/or coalition member training in your SPF SIG activities?
_" " Very likely
_"" Somewhat likely
_" " Not likely

Community Level: Planning & Implementation

SPF/Capacity Building/Organizational Resources

73

If your project experienced any challenges with improving organizational resources (including coalition resources) during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Capacity Building/Organizational Resources

74

If your project experienced any specific successes with improving organizational resources (including coalition resources) during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Capacity Building/Organizational Resources

75

Did you work to raise awareness in the community of substance use or abuse problems during this reporting period?
_"" " Yes
_"" " No (
If no, proceed to question 81.)

Community Level: Planning & Implementation

SPF/Capacity Building/Community Awareness

76

Indicate the issues you are attempting to raise awareness of in the community.
_"" Substance use rates or trends
_"" Consequences related to substance use, such as crashes or arrests for drunk driving
_"" " Intervening variables associated with substance use and consequences
_"" " Coordination among agencies
_"" " Funding for substance abuse prevention
_"" " Other (Describe.)

Community Level: Planning & Implementation

SPF/Capacity Building/Community Awareness

77

Indicate which community members and/or groups you are focusing your awareness raising efforts on.
_"" " " The general public
_" Youth
_" Parents/family/caregiver groups
_" Business community
_" Media (e.g., radio and television stations, newspapers and magazines)
_" School(s)/school districts
_" Youth serving organization(s) other than schools (e.g., Big Brothers/Big Sisters, Boy Scouts/Girl Scouts)
_" Law enforcement agency/agencies
_" Local or state courts
_" Department of Justice
_" State and/or local jails and prisons
_" Faith-based organization(s) (e.g., churches or charitable organizations with religious affiliations such as Catholic Charities)
_" Civic or volunteer organization(s) (e.g., Kiwanis, Fraternal Order of Police, Women’s League, local sports or neighborhood associations)
_" Healthcare professionals
_" State, local, village or tribal government agencies
_" Other (Describe.)
_" Don’t know

Community Level: Planning & Implementation

SPF/Capacity Building/Community Awareness

78

Indicate the activities that are being conducted to raise awareness of the issue(s) marked in question 76 among the group(s) marked in question 77.
" Media activities such as television, radio, or newspaper advertisements or public service announcements
" Internet activities such as listservs, web sites, or mass e-mails to targeted populations
" Direct mailings
" Face-to-face outreach such as health fairs, classroom visits, other community events, etc.
" Other: (Describe.)

Community Level: Planning & Implementation

SPF/Capacity Building/Community Awareness

79

If your project experienced any challenges with raising community awareness (Web programming note: definition link) during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Capacity Building/Community Awareness

80

If your project experienced any specific successes with raising community awareness during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Capacity Building/Community Awareness

81

Have you identified key stakeholders, partners and partner organizations to participate in your SPF SIG intervention activities?
_" Yes
_" No (
If no, proceed to question 50.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

82

Think about your partners and stakeholders involved in intervention activities. Have you identified any stakeholders or partners who should be involved, but are not?
_ " Yes
_ " No (If no, proceed to question 84.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

83

Describe what you are doing to bring these stakeholders and partners to the table:

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

84

Do you feel it is important for you to partner with youth groups in order to meet the goals and objectives of your SPF SIG intervention?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

85

Have you partnered with youth groups (e.g., local youth councils, church youth groups, youth recreation leagues)?
_ " Yes
_ " No (If no, proceed to question 88.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

86

How many youth groups do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

87a

Indicate how many of the youth groups you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

87b

Indicate how many of the youth groups you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

87c

Indicate how many of the youth groups you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

88

Do you feel it is important for you to partner with parent/family/caregiver groups in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
_ " No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

89

Have you partnered with parent/family/caregiver groups?
_ " Yes
_ " No (If no, proceed to question 92.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

90

How many parent/family/caregiver groups do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

91a

Indicate how many of the parent/family/caregiver groups you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

91b

Indicate how many of the parent/family/caregiver groups you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

91c

Indicate how many of the parent/family/caregiver groups you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

92

Do you feel it is important for you to partner with the business community in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
_ " No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

93

Have you partnered with the business community?
_ " Yes
_ " No (If no, proceed to question 96.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

94

How many businesses or business groups do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

95a

Indicate how many of the members of the business community that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

95b

Indicate how many of the members of the business community that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

95c

Indicate how many of the members of the business community that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

96

Do you feel it is important for you to partner with the media (e.g., radio and television stations, newspapers and magazines) in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
_ " No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

97

Have you partnered with the media (e.g., radio and television stations, newspapers and magazines)?
_ " Yes
_ " No (If no, proceed to question 100.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

98

How many media organizations or groups do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

99a

Indicate how many of the media organizations or groups that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

99b

Indicate how many of the media organizations or groups that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

99c

Indicate how many of the media organizations or groups that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

100

Do you feel it is important for you to partner with schools in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
_ " No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

101

Have you partnered with schools or school districts?
" _ Yes
_ " No (If no, proceed to question 105.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

102

How many schools do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

103

How many school districts do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

104a

Indicate how many of the schools or school districts that you partner with fall into each of the categories below.

This (school) partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

104b

Indicate how many of the schools or school districts that you partner with fall into each of the categories below.

This (school) partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

104c

Indicate how many of the schools or school districts that you partner with fall into each of the categories below.

This (school) partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

104d

Indicate how many of the schools or school districts that you partner with fall into each of the categories below.

This (school district) partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

104e

Indicate how many of the schools or school districts that you partner with fall into each of the categories below.

This (school district) partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

104f

Indicate how many of the schools or school districts that you partner with fall into each of the categories below.

This (school district) partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

105

Do you feel it is important for you to partner with youth serving organizations (e.g., Big Brothers Big Sisters, Boy Scouts/Girl Scouts) in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
_ " No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

106

Have you partnered with youth serving organizations (e.g., Big Brothers Big Sisters, Boy Scouts/Girl Scouts)?
_ " Yes
_ " No (If no, proceed to question 109.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

107

How many youth serving organizations do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

108a

Indicate how many of the youth serving organizations that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

108b

Indicate how many of the youth serving organizations that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

108c

Indicate how many of the youth serving organizations that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

109

Do you feel it is important for you to partner with law enforcement agencies such as local and state police, FBI, and the Drug Enforcement Administration (DEA) in order to meet the goals and objectives of your SPF SIG intervention?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

110

Have you partnered with law enforcement agencies (e.g., local and state police, FBI, DEA)?
_ " Yes
_ " No (If no, proceed to question 113.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

111

How many law enforcement agencies do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

112a

Indicate how many of the law enforcement agencies that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

112b

Indicate how many of the law enforcement agencies that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

112c

Indicate how many of the law enforcement agencies that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

113

Do you feel it is important for you to partner with local or state courts in order to meet the goals and objectives of your SPF SIG intervention?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

114

Have you partnered with local or state courts?
_ " Yes
_ " No (If no, proceed to question 117.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

115

How many local or state courts do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

116a

Indicate how many local or state courts that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

116b

Indicate how many local or state courts that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

116c

Indicate how many local or state courts that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

117

Do you feel it is important for you to partner with the Federal Department of Justice in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
_ " No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

118

Have you partnered with the Federal Department of Justice?
_ " Yes
_ " No (If no, proceed to question 121.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

119

How many Federal Department of Justice units do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

120a

Indicate how many Federal Department of Justice units that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

120b

Indicate how many Federal Department of Justice units that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

120c

Indicate how many Federal Department of Justice units that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

121

Do you feel it is important for you to partner with local or state jails or prisons in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
" _ No

Mobilize and/or build capacity to address needs

SPF/Capacity Building/Relationship Building

122

Have you partnered with local or state jails or prisons?
_ " Yes
" _ No (If no, proceed to question 125.)

Mobilize and/or build capacity to address needs

SPF/Capacity Building/Relationship Building

123

How many local or state jails or prisons do you partner with?

Mobilize and/or build capacity to address needs

SPF/Capacity Building/Relationship Building

124a

Indicate how many of the local or state jails or prisons that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Mobilize and/or build capacity to address needs

SPF/Capacity Building/Relationship Building

124b

Indicate how many of the local or state jails or prisons that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Mobilize and/or build capacity to address needs

SPF/Capacity Building/Relationship Building

124c

Indicate how many of the local or state jails or prisons that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Mobilize and/or build capacity to address needs

SPF/Capacity Building/Relationship Building

125

Do you feel it is important for you to partner with faith-based organizations (e.g., churches or charitable organizations with religious affiliations such as Catholic Charities) in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
" _ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

126

Have you partnered with faith-based organizations (e.g., churches or charitable organizations with religious affiliations such as Catholic Charities)?
_ " Yes
" _ No (If no, proceed to question 129.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

127

How many faith-based organizations do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

128a

Indicate how many of the faith-based organizations that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

128b

Indicate how many of the faith-based organizations that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

128c

Indicate how many of the faith-based organizations that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

129

Do you feel it is important for you to partner with civic or volunteer organizations (e.g., Kiwanis, Fraternal Order of Police, Women’s League, local sports or neighborhood associations) in order to meet the goals and objectives of your SPF SIG intervention?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

130

Have you partnered with civic or volunteer organizations (e.g., Kiwanis, Fraternal Order of Police, Women’s League, local sports or neighborhood associations)?
_ " Yes
" _ No (If no, proceed to question 133.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

131

How many civic or volunteer organizations do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

132a

Indicate how many of the civic or volunteer organizations that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

132b

Indicate how many of the civic or volunteer organizations that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

132c

Indicate how many of the civic or volunteer organizations that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

133

Do you feel it is important for you to partner with healthcare professionals in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
" _ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

134

Have you partnered with healthcare professionals?
_ " Yes
" _ No (If no, proceed to question 137.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

135

How many healthcare professionals do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

136a

Indicate how many of the healthcare professionals that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

136b

Indicate how many of the healthcare professionals that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

136c

Indicate how many of the healthcare professionals that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

137

Do you feel it is important for you to partner with state government agencies (e.g., public health, public safety, social services) in order to meet the goals and objectives of your SPF SIG intervention?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

138

Have you partnered with state government agencies (e.g., public health, public safety, social services) that have expertise in substance abuse?
_ Yes
_ No (If no, proceed to question 141.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

139

How many state government agencies do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

140a

Indicate how many of the state government agencies that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

140b

Indicate how many of the state government agencies that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

140c

Indicate how many of the state government agencies that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

141

Do you feel it is important for you to partner with local, village or tribal agencies (e.g., Mayor’s Office, city councils, tribal councils), including those funded by the state, in order to meet the goals and objectives of your SPF SIG intervention?
_ " Yes
" _ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

142

Have you partnered with local, village or tribal agencies (e.g., Mayor’s Office, city councils, tribal councils) that have expertise in substance abuse?
_ Yes
_ No (If no, proceed to question 145.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

143

How many local, village or tribal agencies do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

144a

Indicate how many of the local, village or tribal agencies that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

144b

Indicate how many of the local, village or tribal agencies that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

144c

Indicate how many of the local, village or tribal agencies that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

145

Have you partnered with other groups/organizations?
_ " Yes
" _ No (If no, proceed to question 150.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

146

Describe the other type(s) of groups/organizations you worked with in 25 words or less.

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

147

Do you feel it is important for you to partner with these other groups/organizations in order to meet the goals and objectives of your SPF SIG intervention?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

148

How many of these other groups/organizations do you partner with?

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

149a

Indicate how many of the other groups/organizations that you partner with fall into each of the categories below.

This partner is a valuable and active participant in the partnership and contributes at a level above and beyond what is expected

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

149b

Indicate how many of the other groups/organizations that you partner with fall into each of the categories below.

This partner contributes at a level appropriate for its role in the partnership

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

149c

Indicate how many of the other groups/organizations that you partner with fall into each of the categories below.

This partner rarely or almost never participates

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

150

Did you receive SPF SIG funded guidance, training or technical assistance with regard to building relationships?
_ Yes
_ No (If no, proceed to question 152.)

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

151

How likely is it that you will use what you learned during the guidance, training or technical assistance on building relationships in your SPF SIG activities?
_ Very likely
_ Somewhat likely
_ Not likely

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

152

If your project experienced any challenges with relationship building during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

153

If your project experienced any specific successes with relationship building during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Capacity Building/Relationship Building

154

Have you worked during this reporting period to ensure that the intervention activities and outcomes continue when SPF SIG funding ends?
_ Yes
_ No (If no, proceed to question 156.)

Community Level: Planning & Implementation

SPF/Capacity Building/Sustainability

155

How have you worked to ensure that intervention activities and outcomes continue after SPF SIG funding has ended?
_ Leveraged other funding sources
_ Worked to ensure that intervention activities are incorporated into the missions/goals and activities of other organizations
_ Worked to implement local level laws, policies or regulations to guarantee the continuation of intervention activities
_ Worked on developing coalition structure to ensure sustainability
_ Other (Describe.)

Community Level: Planning & Implementation

SPF/Capacity Building/Sustainability

156

Did you receive SPF SIG funded guidance, training or technical assistance with regard to ensuring that intervention activities and outcomes continue after SPF SIG funding ends?
_ Yes
_ No (If no, proceed to question 158.)

Community Level: Planning & Implementation

SPF/Capacity Building/Sustainability

157

How likely is it that you will use what you learned during the guidance, training or technical assistance on ensuring that intervention activities and outcomes continue after SPF SIG funding ends in your SPF SIG activities?
_ Very likely
_ Somewhat likely
_ Not likely

Community Level: Planning & Implementation

SPF/Capacity Building/Sustainability

158

If your project experienced any challenges while working to ensure that intervention activities and outcomes continue after SPF SIG funding ends during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Capacity Building/Sustainability

159

If your project experienced any successes while working to ensure that intervention activities and outcomes continue after SPF SIG funding ends during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Capacity Building/Sustainability

160

Have you completed a strategic plan?
_ Yes
_ No (If no, proceed to question 173.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

161

Who worked on the strategic plan?
_" Youth
_" Parents/family/caregiver groups
_" Business community
_" Media (e.g., radio and television stations, newspapers and magazines)
_" Advocacy volunteers
_" School(s)/school districts
_" Youth serving organization(s) (other than school) (e.g., Big Brothers/Big Sisters, Boy Scouts/Girl Scouts)
_" Law enforcement agency/agencies
_" Faith-based organization(s) (e.g., churches or charitable organizations with religious affiliations such as Catholic Charities)
_" Civic or volunteer organization(s) (e.g., Kiwanis, Fraternal Order of Police, Women’s League, local sports or neighborhood associations)
_" Healthcare professionals
_" State, local, village or tribal government agencies (e.g., social services, public health, etc.)
_" Local evaluator
_" Other (Describe.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

162

Which of the following does the strategic plan address or include?
_" Data indicators on substance abuse
_" Data on factors causing, leading to, or promoting substance use
_" Underage drinking initiative
_" Cultural competence
_" Connection with state SPF SIG initiative
_" Current community resources/strengths
_" Identification of conditions outside the scope of the intervention (e.g., poverty rates, immigration trends, laws) that might affect it
_" Logic model
_" Necessary infrastructure development
_" Role of stakeholders
_" Appropriate interventions selected to match target outcomes or causal factors
_" Barriers to implementation
_" Measurable objectives
_" Identification of available data sources to measure objectives
_" Data collection plans
_" Data monitoring plans
_" Data analysis plans
_" Sustainability
_" Opportunity for adjustments based on initial outcomes

Community Level: Planning & Implementation

SPF/Strategic Plan Development

163

If you indicated in question 162 that your strategic plan includes a logic model, does the strategic plan also include a way to evaluate the relationships, activities and outcomes illustrated in the logic model?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Strategic Plan Development

164

Has your strategic plan been reviewed by the agency responsible for the SPF SIG initiative in your state?
_ Yes
_ No (If no, proceed to question 166.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

165

Have you received feedback on your strategic plan from the agency responsible for the SPF SIG initiative in your state?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Strategic Plan Development

166

Has your strategic plan been approved by the agency responsible for the SPF SIG initiative in your state?
_ Yes
_ No
_ Our state does not require or provide approval of the strategic plan.

Community Level: Planning & Implementation

SPF/Strategic Plan Development

167

Was the strategic plan revisited during this reporting period?
_ Yes
_ No (If no, proceed to question 171.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

168

If the strategic plan was revisited, were any changes made?
_ Yes
_ No (If no, proceed to question 171.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

169

If the strategic plan was changed, indicate why it was changed.
_" New data indicated new priority areas
_" Political considerations
_" New technology made additional surveillance or evaluation methods available
_" Funding changes increased or decreased the scope of intervention activities
_" Other (Describe.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

170

Indicate areas where changes were made to the strategic plan after revisiting the plan.
_ " Data indicators on substance abuse
_ " Data on factors causing, leading to, or promoting substance use
_" Underage drinking initiative
_" Cultural competence
_" Connection with state SPF SIG initiative
_" Current community resources/strengths
_" Identification of conditions outside the scope of the intervention (e.g., poverty rates, immigration trends, laws) that might affect it
_" Logic model
_" Necessary infrastructure development
_" Role of stakeholders
_" Appropriate interventions selected to match target outcomes or causal factors
_" Barriers to implementation
_" Measurable objectives
_" Identification of available data sources to measure objectives
_" Data collection plans
_" Data monitoring plans
_" Data analysis plans
_" Sustainability
_" Opportunity for adjustments based on initial outcomes

Community Level: Planning & Implementation

SPF/Strategic Plan Development

171

Was the logic model revised during this reporting period?
_" " Yes
_" " No (If no, proceed to question 173.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

172

Indicate why the logic model was revised.
_" New data indicated new priority areas
_" " Political considerations
_" " New technology made additional surveillance or evaluation methods available
_" " Funding changes increased or decreased the scope of intervention activities
_" " Other (Describe.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

173

Did you receive SPF SIG funded guidance, training or technical assistance with regard to developing a strategic plan during this reporting period?
_" " Yes
_" " No
(If no, proceed to question 175.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

174

How likely is it that you will use what you learned during the guidance, training or technical assistance on developing a strategic plan in your SPF SIG activities?
_" " Very likely
_" " Somewhat likely
_" " Not likely

Community Level: Planning & Implementation

SPF/Strategic Plan Development

175

Did you receive SPF SIG funded guidance, training or technical assistance with regard to selecting intervention strategies during this reporting period?
_" " " Yes
_" " " No (If no, proceed to question 177.)

Community Level: Planning & Implementation

SPF/Strategic Plan Development

176

How likely is it that you will use what you learned during the guidance, training or technical assistance on selecting intervention strategies in your SPF SIG activities?
_" " Very likely
_" " Somewhat likely
_" " Not likely

Community Level: Planning & Implementation

SPF/Strategic Plan Development

177

If your project experienced any challenges with developing the strategic plan during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Strategic Plan Development

178

If your project experienced any specific successes with developing the strategic plan during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Strategic Plan Development

179

Did you work on intervention implementation during this reporting period?
_ " Yes
_ " No (
If no, proceed to question 181.)

Community Level: Planning & Implementation

SPF/Intervention Implementation

180a

For question 180 below, name the intervention(s) you implemented/delivered during this reporting period.

Intervention (1) Name

Community Level: Planning & Implementation

SPF/Intervention Implementation

180b

For question 180 below, name the intervention(s) you implemented/delivered during this reporting period.

Intervention (2) Name

Community Level: Planning & Implementation

SPF/Intervention Implementation

180c

For question 180 below, name the intervention(s) you implemented/delivered during this reporting period.

Intervention (3) Name

Community Level: Planning & Implementation

SPF/Intervention Implementation

181

Did you receive SPF SIG funded guidance, training or technical assistance with regard to recruiting participants for interventions during this reporting period?
_ " Yes
_ " No (If no, proceed to question 183.)
_ " Not applicable (If not applicable, proceed to question 183.)

Community Level: Planning & Implementation

SPF/Intervention Implementation

182

How likely is it that you will use what you learned during the guidance, training or technical assistance on recruiting participants in your SPF SIG activities?
_ " Very likely
_" Somewhat likely
_ " Not likely

Community Level: Planning & Implementation

SPF/Intervention Implementation

183

If your project experienced any challenges with intervention implementation during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Intervention Implementation

184

If your project experienced any specific successes related to intervention implementation during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Intervention Implementation

185

Was intervention implementation monitored by the Single State Agency (SSA) or state agency in charge of the SPF SIG funding during this reporting period?
_ Yes
_ Implementation of some interventions was monitored, but not all interventions were monitored.
_ No
_ Don’t know

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

186

Did you work on intervention level evaluation activities during this reporting period?
_ Yes
_ No (If no, proceed to question 194.)

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

187

Have you developed an evaluation plan?
_ Yes
_ No (If no, proceed to question 190.)

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

188

Was the evaluation plan revised during this reporting period?
_ " Yes
_ " No (If no, proceed to question 190.)

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

189

Indicate how the evaluation plan was revised.
_ Immediate outcomes were changed
_ Intermediate outcomes were changed
_ Instruments or assessment tools were changed
_ Data collection points (intervals between pre- and post-test or follow-up) were changed
_ Analysis plans were changed
_ Plans for dissemination of evaluation results were changed
_ Other (Describe.)

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

190

Did you develop any evaluation reports during this reporting period?
_ Yes
_ No

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

191

Did you communicate any evaluation findings to key stakeholders/key informants during this reporting period?
_ Yes
_ No (If no, proceed to question 194.)

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

192

If so, how did you communicate the findings?
_ " Distributed written report to stakeholders
_ " Presented findings at a meeting of stakeholders
_ " " Presented findings to community members/participants
_ " " Written press release
_ " " Televised press conference
_ " " Other (Describe.)

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

193

Indicate how stakeholders used these evaluation findings.
_ To set policy
_ To change substance abuse priorities
_ To leverage additional funds
_ To recruit additional partners
_ To leverage additional prevention staff
_ To encourage coordination among organizations or agencies
_ To learn/increase knowledge
_ Other (Describe.)
_ Don’t know

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

194

Did you receive SPF SIG funded guidance, training or technical assistance with regard to evaluation activities during this reporting period?
_ Yes
_ No (If no, proceed to question 196.)

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

195

How likely is it that you will use what you learned during the guidance, training or technical assistance on evaluation activities in your SPF SIG activities?
_ Very likely
_ Somewhat likely
_ Not likely

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

196

If your project experienced any challenges with intervention evaluation during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

197

If your project experienced any specific successes with intervention evaluation during this reporting period, please describe them here.

Community Level: Planning & Implementation

SPF/Monitoring and Evaluation

198

Does your community have a specific plan or vision/mission statement about substance abuse prevention that guides the community substance abuse prevention planning process?
_ Yes
_ No (If no, proceed to question 200.)

Community Level: Systems Change

Systems Factors

199

If yes, describe the primary goals of the plan or vision/mission statement.

Community Level: Systems Change

Systems Factors

200

Does your community have a written, documented process for making substance abuse prevention-related decisions?
_ Yes
_ No (If no, proceed to question 203.)

Community Level: Systems Change

Systems Factors

201

Describe the basic steps in the process.

Community Level: Systems Change

Systems Factors

202

Who is involved in making substance abuse prevention-related decisions?

Community Level: Systems Change

Systems Factors

203

Do multiple organizations and agencies in your community work together to collect, manage and organize community ATOD data?
_ Yes (If yes, proceed to question 205.)
_ No

Community Level: Systems Change

Systems Factors

204

If you answered no to question 203, please describe why not.

Community Level: Systems Change

Systems Factors

205

If you answered yes to question 203, please describe the types of community data collected by these organizations.

Community Level: Systems Change

Systems Factors

206

Is there a primary organization or agency that has responsibility for management of the data?
_ Yes
_ No (If no, proceed to question 208.)

Community Level: Systems Change

Systems Factors

207

How was this organization selected to manage the data?

Community Level: Systems Change

Systems Factors

208

Do you have access to prevention data systems?
_ Yes
_ No (If no, proceed to question 210.)

Community Level: Systems Change

Systems Factors

209

Describe the types of data systems you have access to.

Community Level: Systems Change

Systems Factors

210

Describe here any demographic factors or issues that have had an impact or will have an impact on prevention activities in your community during this reporting period.

Community Level: Contextual Change & Unmeasured Factors

Contextual Factors

211

Describe here any cultural factors that have had an impact or will have an impact on prevention activities in your community during this reporting period.

Community Level: Contextual Change & Unmeasured Factors

Contextual Factors

212

Describe here any community factors that have had an impact or will have an impact on prevention activities in your community during this reporting period.

Community Level: Contextual Change & Unmeasured Factors

Contextual Factors

213

Describe here any environmental or systems factors that have had an impact or will have an impact on prevention activities in your community during this reporting period.

Community Level: Contextual Change & Unmeasured Factors

Contextual Factors

214

Do you have any additional comments about any aspects of the SPF SIG Initiative?

n/a

Closing Questions

215

Did following the steps of the Strategic Prevention Framework during this reporting period lead to specific successes within your community in dealing with substance abuse prevention?
_ Yes
_ No (If no, proceed to question 217.)
_ Too soon to determine (If marked, proceed to question 217, if applicable.)

n/a

Closing Questions

216

If yes, please describe how following the specific steps of the framework contributed to your success, and what you consider a success.

n/a

Closing Questions

217

Who is the lead agency for the community coalition (the agency responsible for making the primary decisions of the coalition and/or the agency controlling the money)?

Community Level: Planning & Implementation

Coalition Sub-Form

218

Does this agency have financial responsibility for the coalition?
_ Yes
_ No
_ Don’t know

Community Level: Planning & Implementation

Coalition Sub-Form

219

Does the community coalition have a funding source?
_ Yes
_ " No
_ " Don’t know

Community Level: Planning & Implementation

Coalition Sub-Form

220

Does the project director for the SPF SIG project work for the coalition’s lead agency?
_ Yes
_ " No
_ " Don’t know

Community Level: Planning & Implementation

Coalition Sub-Form

221

Does the community coalition have an identifiable leader (an individual, not an agency)?
_ Yes
_ " No
_ " Don’t know

Community Level: Planning & Implementation

Coalition Sub-Form

222

Is the leader of the coalition a paid position?
_ Yes
_ " No
_ " Don’t know

Community Level: Planning & Implementation

Coalition Sub-Form

223

The coalition has a clear vision and focus.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree"
_ Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

224

The community coalition has collaborative leadership.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree" Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

225

Responsibilities among coalition members are fairly and effectively delegated.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree" Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

226

The coalition has a broad-based, diverse membership that represents the various groups and organizations involved in substance abuse prevention.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree" Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

227

There is too much talking and not enough follow through with actions.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree" Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

228

The coalition has a process for tracking decisions.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree" Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

229

The coalition does not monitor whether or not there is follow through on decisions.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree" Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

230

The coalition needs more structure in order to be effective.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree" Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

231

Denial and apathy among community members toward local substance use issues is a major barrier to our coalition’s effectiveness.
_ " Strongly agree
" _ Agree
_ " Neither agree nor disagree" Disagree
" _ Strongly disagree

Community Level: Planning & Implementation

Coalition Sub-Form

SPF SIG Community-level Instrument Part II Items by Domain Index and Logic Model Component

Item #

Item Text

Logic Model Component

Domain

1

Name of the intervention

Community Level: Planning & Implementation

Intervention Form/Intervention Information

2

When did you begin funding this intervention?

Community Level: Planning & Implementation

Intervention Form/Intervention Information

3

When did you complete implementing this intervention?

Community Level: Planning & Implementation

Intervention Form/Intervention Information

4

What factors, beyond data driven planning, influenced your intervention selection?
_" Local capacity to deliver interventions
" _ Cost
" _ Experience implementing intervention prior to SPF SIG funding
" _ Political environment
" _ Requirements of partnering organizations
" _ Evidence-based literature on effectiveness
" _ Other information supporting the effectiveness of the intervention
" _ Demographics or cultural characteristics of local population
" _ Availability of technical assistance
" _ Recommendation by state funding agency
" _ Other (Describe.)

Community Level: Planning & Implementation

Intervention Form/Intervention Information

5

Is this an evidence-based program, policy or practice?
_ Yes
_ No (If no, proceed to question 7.)

Community Level: Planning & Implementation

Intervention Form/Intervention Information

6

How do you know this is an evidence-based program, policy or practice?
_ We did not use any specific criteria to determine that this was an evidence-based program, policy or practice
_ Evaluator recommendation
_ Listed in National Registry of Effective Programs and Practices (NREPP)
_ Listed on some other federal agency or national organization’s list of “effective programs”
_ Found to be effective in a peer-reviewed journal article
_ Based on a theory or conceptual model
_ Implemented in a similar community
_ CSAP recommendation
_ Center for the Application of Prevention Technologies (CAPT) Web site
_ Other (Describe.)

Community Level: Planning & Implementation

Intervention Form/Intervention Information

7

Is this a new intervention developed and tested by you, the community partner?

Community Level: Planning & Implementation

Intervention Form/Intervention Information

8

If the intervention you are developing is based on an evidence-based program, policy or practice, provide the name of that intervention.

Community Level: Planning & Implementation

Intervention Form/Intervention Information

9

Indicate why you decided to develop a new intervention rather than using a previously tested intervention.
_ Previously tested interventions did not address the need in our community
_ Previously tested interventions were not culturally appropriate
_ Previously tested interventions were too costly
_ Other (Describe.)

Community Level: Planning & Implementation

Intervention Form/Intervention Information

10

Which of the following best describes this intervention?
_ Not implemented in the community prior to SPF SIG funding (If marked, proceed to question 12.)
_ Continuation of an intervention with no change (If marked, proceed to question 12.)
_ Continuation of an intervention with changes or adaptations

Community Level: Planning & Implementation

Intervention Form/Intervention Information

11

If the intervention is the continuation of an intervention with changes or adaptations, describe the changes or adaptations and the reasons for the changes.

Community Level: Planning & Implementation

Intervention Form/Intervention Information

12

Is your definition of community based on something other than geography, such as a target population?
_ Yes (If yes, proceed to question 18.)
_ No

Community Level: Planning & Implementation

Intervention Form/Intervention Information

13

For questions 13 through 17 below, indicate the areas being served by this intervention and the estimated population of this area.

City/Town

Community Level: Planning & Implementation

Intervention Form/Intervention Information

14

For questions 13 through 17 below, indicate the areas being served by this intervention and the estimated population of this area.

County/Parish

Community Level: Planning & Implementation

Intervention Form/Intervention Information

15

For questions 13 through 17 below, indicate the areas being served by this intervention and the estimated population of this area.

Zip code(s)

Community Level: Planning & Implementation

Intervention Form/Intervention Information

16

For questions 13 through 17 below, indicate the areas being served by this intervention and the estimated population of this area.

Other geographic areas, including statewide

Community Level: Planning & Implementation

Intervention Form/Intervention Information

17

For questions 13 through 17 below, indicate the areas being served by this intervention and the estimated population of this area.

What is the estimated population for the area described?

Community Level: Planning & Implementation

Intervention Form/Intervention Information

18

Of the total SPF SIG funding you received, what was the amount spent on this entire intervention--including planning, developing, implementing and evaluating the intervention--during this reporting period?

Community Level: Planning & Implementation

Intervention Form/Intervention Information

19

Approximately what percentage of total funding for this intervention comes from SPF SIG funds?

Community Level: Planning & Implementation

Intervention Form/Intervention Information

20

Indicate the CSAP domain this intervention targets:
_ Individual domain
_ Family domain
_ Peer domain
_ School domain
_ Community domain
_ Society/Environmental domain

Community Level: Planning & Implementation

Intervention Form/Intervention Information

21

Indicate the component(s) that are included in this intervention. Interventions may employ several different components:
_ Prevention education
_ Alternative drug-free activities
_ Problem identification and referral
_ Community based processes
_ Environmental strategies
_ Information dissemination
_ Other activities or services delivered to individuals (Describe.)
_ Other activities or services not delivered to individuals (Describe.)

Community Level: Planning & Implementation

Intervention Form/Intervention Information

22

Does this intervention include a curriculum or manual?
_ Yes
_ No

Community Level: Planning & Implementation

Intervention Form/Intervention Information

23

Have any individual participants been served by this intervention during this reporting period, for example in classroom-based interventions or other direct service interventions?
_ Yes
_ No (If no, proceed to next section, Adaptations.)

Community Level: Planning & Implementation

Intervention Form/Intervention Information

24

How many new participants were served by this intervention during this reporting period?

Community Level: Planning & Implementation

Intervention Form/Intervention Information

25

Did you adapt the intervention in order to deliver it to a target population that was not indicated by the developer?
_ Yes
_ No (If no, proceed to question 27.)
_ Intervention developer makes no recommendations for target population (If marked, proceed to question 27.)
_ Not applicable (If not applicable, proceed to question 27.)

Community Level: Planning & Implementation

Intervention Form/Adaptations

26

Describe the adaptation you made in order to deliver the intervention to a target population that was not indicated by the developer. ______

Community Level: Planning & Implementation

Intervention Form/Adaptations

27

Did you make any adaptation to the curriculum or manual content of the intervention?
_ Yes
_ No (If no, proceed to question 29.)
_ Intervention developer makes no recommendations for curriculum or manual content (If marked, proceed to question 29.)
_ Not applicable (If not applicable, proceed to question 29.)

Community Level: Planning & Implementation

Intervention Form/Adaptations

28

Describe the adaptation made to the curriculum or manual content. ________

Community Level: Planning & Implementation

Intervention Form/Adaptations

29

Did you make any adaptations to address the cultural appropriateness of the intervention for a particular group?
_ Yes
_ No (If no, proceed to question 31.)
_ Intervention developer makes no recommendations regarding the cultural appropriateness of the intervention for different groups (If marked, proceed to question 31.)
_ Not applicable (If not applicable, proceed to question 31.)

Community Level: Planning & Implementation

Intervention Form/Adaptations

30

Describe the changes you made to improve the cultural appropriateness of the intervention and how the fit was improved for a particular group.

Community Level: Planning & Implementation

Intervention Form/Adaptations

31

Did you adapt the recommended dosage for this intervention, for example the number of sessions or number of public service announcements (PSAs) or other media spots?
_ Yes
_ No (If no, proceed to question 34.)
_ Intervention developer makes no recommendations for dosage (If marked, proceed to question 34.)
_ Not applicable (If not applicable, proceed to question 34.)

Community Level: Planning & Implementation

Intervention Form/Adaptations

32

Indicate the recommended dosage. ___________

Community Level: Planning & Implementation

Intervention Form/Adaptations

33

Indicate the dosage actually delivered. ______

Community Level: Planning & Implementation

Intervention Form/Adaptations

34

Did you adapt the recommended duration (e.g., days or hours) of this intervention?
_ Yes
_ No (If no, proceed to question 37.)
_ Intervention developer makes no recommendations for duration (If marked, proceed to question 37.)
_ Not applicable (If not applicable, proceed to question 37.)

Community Level: Planning & Implementation

Intervention Form/Adaptations

35

Indicate the recommended duration, in hours, of this intervention.

Community Level: Planning & Implementation

Intervention Form/Adaptations

36

Indicate the number of hours actually spent delivering the intervention.

Community Level: Planning & Implementation

Intervention Form/Adaptations

37

Did you make an adaptation to the setting of the intervention (e.g., classroom, worksite, etc.)?
_ Yes
_ No (If no, proceed to question 39.)
_ Intervention developer makes no recommendations for setting (If marked, proceed to question 39.)
_ Not applicable (If not applicable, proceed to question 39.)

Community Level: Planning & Implementation

Intervention Form/Adaptations

38

Describe the adaptation made to the setting of the intervention (e.g., classroom, worksite, etc.).

Community Level: Planning & Implementation

Intervention Form/Adaptations

39

Did you collect information regarding participant satisfaction with the cultural appropriateness of the intervention?
_ Yes
_ No (If no, proceed to next section, Intervention Outcomes.)

Community Level: Planning & Implementation

Intervention Form/Adaptations

40

What were the results of the assessment of participants’ satisfaction with the cultural appropriateness of the intervention?

Community Level: Planning & Implementation

Intervention Form/Adaptations

41

Were any outcome data collected during this reporting period?
_ Yes
_ No (If no, proceed to question 45.)

Community Level: Planning & Implementation

Intervention Form/Intervention Outcomes

42

If outcome data were collected, what was your sampling strategy?
_" The entire target population for the intervention
" _ Only the actual persons who directly participated in the intervention
" _ A specifically selected comparison group that did not receive the intervention
" _ Some other population or subgroup (Describe.)

Community Level: Planning & Implementation

Intervention Form/Intervention Outcomes

43

Indicate the CSAP National Outcome Measures (NOMs) that are being collected.
_ 30-day use
_ Perceived risk of use
_ Age of first use
_ Perception of disapproval
_ ATOD (Alcohol Tobacco and Other Drugs) suspensions/expulsions
_ School attendance divided by enrollment (defined as attendance as a percentage of enrollment)
_ Workplace ATOD (Alcohol Tobacco and Other Drugs) use
_ Drug-related crime
_ Alcohol-related car crashes and injuries
_ Number of persons served by age, gender, race and ethnicity
_ Total number of evidence-based interventions
_ Increased services provided within cost bands for universal, selective, and indicated programs.

Community Level: Planning & Implementation

Intervention Form/Intervention Outcomes

44

Was an analysis of outcome data completed during this reporting period?
_ Yes
_ No

Community Level: Planning & Implementation

Intervention Form/Intervention Outcomes

45

Provide any additional comments about your prevention intervention activities here.

Community Level: Planning & Implementation

Intervention Form/Closing Question

46

When did you first start serving participants with this Prevention Education component of the intervention, including all cycles?

Community Level: Planning & Implementation

Prevention Education Sub-Form

47

Is this a recurring intervention, in which the same group of people are served over multiple intervention sessions?
_ Yes
_ No

Community Level: Planning & Implementation

Prevention Education Sub-Form

48

Is the prevention education component of this intervention implemented in a series of cycles, in which a new group of participants is served on a regular schedule, such as a new school year?
_" Yes
"_ No (If no, proceed to question 50.)

Community Level: Planning & Implementation

Prevention Education Sub-Form

49

If the prevention education component of the intervention is implemented in cycles, what are the cycles based on?
_ The school calendar (quarters, semesters, school year)
_ The SPF SIG funding cycle
_ An organizational fiscal cycle
_ Other (Describe.)

Community Level: Planning & Implementation

Prevention Education Sub-Form

50

How many new groups of participants started the prevention education component of the intervention during this reporting period?

Community Level: Planning & Implementation

Prevention Education Sub-Form

51

How many new groups of participants completed the prevention education component of the intervention during this reporting period?

Community Level: Planning & Implementation

Prevention Education Sub-Form

52

What was the total number of sessions provided for each group of participants in the prevention education component of the intervention during this reporting period?

Community Level: Planning & Implementation

Prevention Education Sub-Form

53

What was the average length of the individual sessions, in hours, during this reporting period?

Community Level: Planning & Implementation

Prevention Education Sub-Form

54

What was the format of the prevention education component of the intervention during this reporting period?
_ Individual
_ Small group (2-9)
_ Large group (10-49)
_ Extra large group (50+)
_ Web-based
_ Other (Describe.)

Community Level: Planning & Implementation

Prevention Education Sub-Form

55

Indicate the types of participants served by the prevention education component of the intervention during this reporting period.
_ Children age 0 to 3
_ Children age 4 to 5
_ Children age 6 to 11
_ Youth age 12 to 17
_ Young adults age 18 to 20
_ Young adults age 21 to 24
_ Parents
_ Adults 18 and over, but not parents
_ Community leaders
_ Health care providers
_ Substance abuse prevention/treatment workers
_ Law enforcement
_ Other (Describe.)

Community Level: Planning & Implementation

Prevention Education Sub-Form

56

As delivered, how would you classify this Prevention Education component according to the Institute of Medicine categories?
" _ " " " Universal
" _ " " " Selective
" _ " " " Indicated

Community Level: Planning & Implementation

Prevention Education Sub-Form

57a

Percentage of participants served ____%

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Prevention Education Sub-Form

57b

Subgroups targeted, if applicable

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Prevention Education Sub-Form

58a

Percentage of participants served ____%

Racial Category/Asian

Community Level: Planning & Implementation

Prevention Education Sub-Form

58b

Subgroups targeted, if applicable

Racial Category/Asian

Community Level: Planning & Implementation

Prevention Education Sub-Form

59a

Percentage of participants served ____%

Racial Category/Black or African American

Community Level: Planning & Implementation

Prevention Education Sub-Form

59b

Subgroups targeted, if applicable

Racial Category/Black or African American

Community Level: Planning & Implementation

Prevention Education Sub-Form

60a

Percentage of participants served ____%

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Prevention Education Sub-Form

60b

Subgroups targeted, if applicable

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Prevention Education Sub-Form

61a

Percentage of participants served ____%

Racial Category/White

Community Level: Planning & Implementation

Prevention Education Sub-Form

61b

Subgroups targeted, if applicable

Racial Category/White

Community Level: Planning & Implementation

Prevention Education Sub-Form

62a

Percentage of participants served ____%

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Prevention Education Sub-Form

62b

Subgroups targeted, if applicable

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Prevention Education Sub-Form

63a

Percentage of participants served ____%

Racial Category/Other

Community Level: Planning & Implementation

Prevention Education Sub-Form

63b

Subgroups targeted, if applicable

Racial Category/Other

Community Level: Planning & Implementation

Prevention Education Sub-Form

64a

Percentage of participants served ____%

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Prevention Education Sub-Form

64b

Subgroups targeted, if applicable

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Prevention Education Sub-Form

65

When did you first start serving participants with this Alternative Drug-Free Activities component of the intervention, including all cycles?

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

66

Are these recurring activities, in which the same group of people is served over multiple intervention sessions?
☐ Yes
☐ No

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

67

Is the alternative drug-free activities component of this intervention implemented in a series of cycles, in which a new group of participants is served on a regular schedule, such as a new school year?
" _ Yes
" _ No (If no, proceed to question 69.)

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

68

If the alternative drug-free activities component of this intervention is implemented in cycles, what are the cycles based on?
_" The school calendar (quarters, semesters, school year)
" _ The SPF SIG funding cycle
" _ An organizational fiscal cycle
_" Other (Describe.)

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

69

How many new groups of participants started the alternative drug-free activities component of this intervention during this reporting period?

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

70

How many new groups of participants completed the alternative drug-free activities component of this intervention during this reporting period?

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

71

What was the total number of sessions provided for each group of participants in the alternative drug-free activities component of this intervention during this reporting period?

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

72

What was the average length of the individual sessions, in hours, during this reporting period?

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

73

What was the format of the alternative drug-free activities component of this intervention during this reporting period?
☐ Individual
☐ Small group (2-9)
☐ Large group (10-49)
☐ Extra large group (50+)
☐ Web-based
☐ Other (Describe.)

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

74

Indicate the types of participants served by the alternative drug-free activities component of this intervention during this reporting period.
_" Children age 0 to 3
" _ Children age 4 to 5
" _ Children age 6 to 11
" _ Youth age 12 to 17
" _ Young adults age 18 to 20
" _ Young adults age 21 to 24
" _ Parents
" _ Adults 18 and over, but not parents
" _ Community leaders
" _ Healthcare providers
" _ Substance abuse prevention/treatment workers
" _ Law enforcement
" _ Other (Describe.)

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

75

As delivered, how would you classify the alternative drug-free activities component of this intervention according to the Institute of Medicine categories?
" _ Universal
" _Selective
" _ Indicated

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

76

Did you conduct drug-free events (concerts, festivals/fairs, picnics, sporting events) during this reporting period that were not targeted to specific groups of participants?
" _ Yes
" _ No

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

77

How many drug-free events were conducted during this reporting period?

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

78

How many people were reached through the drug-free events during this reporting period?

Community Level: Planning & Implementation

Alternative Drug-Free Activities Sub-Form

79a

Percentage of participants served ____%

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

79b

Subgroups targeted, if applicable

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

80a

Percentage of participants served ____%

Racial Category/Asian

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

80b

Subgroups targeted, if applicable

Racial Category/Asian

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

81a

Percentage of participants served ____%

Racial Category/Black or African American

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

81b

Subgroups targeted, if applicable

Racial Category/Black or African American

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

82a

Percentage of participants served ____%

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

82b

Subgroups targeted, if applicable

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

83a

Percentage of participants served ____%

Racial Category/White

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

83b

Subgroups targeted, if applicable

Racial Category/White

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

84a

Percentage of participants served ____%

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

84b

Subgroups targeted, if applicable

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

85a

Percentage of participants served ____%

Racial Category/Other

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

85b

Subgroups targeted, if applicable

Racial Category/Other

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

86a

Percentage of participants served ____%

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

86b

Subgroups targeted, if applicable

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Alternative Drug-Free Sub-form

87

When did you first start serving participants with this Problem Identification and Referral component of the intervention, including all cycles?

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

88

Indicate the types of participants served by the Problem Identification and Referral component of this intervention during this reporting period.
_" Children age 0 to 3
" _ Children age 4 to 5
" _ Children age 6 to 11
" _ Youth age 12 to 17
" _ Young adults age 18 to 20
" _ Young adults age 21 to 24
" _ Parents
" _ Adults 18 and over, but not parents
" _ Community leaders
" _ Healthcare providers
" _ Substance abuse prevention/treatment workers
" _ Law enforcement
" _ Other (Describe.)

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

89

What was the total number of individuals for whom problem identification and referral services were provided during this reporting period?

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

90

Where did problem identification and referral activities take place?
_" School
_" " Health care facilities
_" " Jails or prisons
_" " Courts
_" " Other (Describe.)

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

91

What types of services were individuals referred to?
☐ Substance abuse treatment
☐ Mental health treatment
☐ Substance abuse prevention activities
☐ Housing services
☐ After-school activities
☐ Transportation
☐ Day care or adult care services
☐ Other (Describe.)

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

92a

Percentage of participants served ____%

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

92b

Subgroups targeted, if applicable

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

93a

Percentage of participants served ____%

Racial Category/Asian

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

93b

Subgroups targeted, if applicable

Racial Category/Asian

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

94a

Percentage of participants served ____%

Racial Category/Black or African American

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

94b

Subgroups targeted, if applicable

Racial Category/Black or African American

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

95a

Percentage of participants served ____%

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

95b

Subgroups targeted, if applicable

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

96a

Percentage of participants served ____%

Racial Category/White

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

96b

Subgroups targeted, if applicable

Racial Category/White

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

97a

Percentage of participants served ____%

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

97b

Subgroups targeted, if applicable

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

98a

Percentage of participants served ____%

Racial Category/Other

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

98b

Subgroups targeted, if applicable

Racial Category/Other

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

99a

Percentage of participants served ____%

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

99b

Subgroups targeted, if applicable

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Problem Identification and Referral Sub-Form

100

Indicate the number of task force/coalition members you recruited during this reporting period, if any:

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

101

Indicate the number of task force/coalition meetings you held during this reporting period, if any:

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

102

Indicate the number of task force/coalition members you trained during this reporting period, if any:

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

103

Indicate the number of other community members you trained during this reporting period, if any:

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

104

Did you coordinate funding with other organizations/projects during this reporting period?
" _Yes
" _No

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

105

Did you develop interagency coordination mechanisms during this reporting period?
" _Yes
" _No

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

106

Did you develop prevention or provider networks during this reporting period?
" _Yes
_No

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

107

Indicate how many community outreach/education sessions you hosted during this reporting period, if any.

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

108

Indicate the number of community organizations to which you provided funding or other in-kind donations during this reporting period, if any:
(If none, proceed to question 110.)

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

109

How much funding did you provide to community organizations during this reporting period?

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

110

Indicate the number of community organizations to which you provided technical assistance during this reporting period, if any:

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

111

Did you reorganize local agencies to promote efficiency in delivering substance abuse prevention during this reporting period?
☐ Yes
☐ No

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

112

Did you reallocate local funds for substance abuse prevention during this reporting period?
☐ Yes
☐ No

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

113

Did you formally change ways local organizations work together to address substance abuse prevention during this reporting period, for example by officially changing school curricula or by documenting specific policies or practices for working together?
☐ Yes
☐ No

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

114

Did you monitor regulatory or compliance changes by the state toward local or regional organizations during this reporting period?
☐ Yes
☐ No

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

115

Did you conduct other community activities during this reporting period?
☐ Yes (Describe.)
☐ No

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

116

How often did you conduct other community activities during this reporting period?

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

117a

Percentage of population targeted ____%

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

117b

Subgroups targeted, if applicable

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

118a

Percentage of population targeted ____%

Racial Category/Asian

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

118b

Subgroups targeted, if applicable

Racial Category/Asian

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

119a

Percentage of population targeted ____%

Racial Category/Black or African American

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

119b

Subgroups targeted, if applicable

Racial Category/Black or African American

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

120a

Percentage of population targeted ____%

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

120b

Subgroups targeted, if applicable

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

121a

Percentage of population targeted ____%

Racial Category/White

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

121b

Subgroups targeted, if applicable

Racial Category/White

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

122a

Percentage of population targeted ____%

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

122b

Subgroups targeted, if applicable

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

123a

Percentage of population targeted ____%

Racial Category/Other

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

123b

Subgroups targeted, if applicable

Racial Category/Other

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

124a

Percentage of population targeted ____%

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

124b

Subgroups targeted, if applicable

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Community-Based Processes Sub-Form

125

Did you work to enact open container laws prohibiting alcohol consumption in public places during this reporting period?
☐ Yes
☐ No (If no, proceed to question 127.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 127.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

126

Were you successful in your efforts to enact open container laws during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

127

Did you work to enact limits on the location, density, and hours of operation of liquor stores during this reporting period?
☐ Yes
☐ No (If no, proceed to question 129.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 129.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

128

Were you successful in your efforts to enact limits on the location, density, and hours of operation of liquor stores during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

129

Did you work to enact zoning ordinances to prohibit new alcohol outlets during this reporting period?
☐ Yes
☐ No (If no, proceed to question 131.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 131.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

130

Were you successful in your efforts to enact zoning ordinances during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

131

Did you work to enact limits on smoking in public places (e.g., movie theaters and restaurants) during this reporting period?
☐ Yes
☐ No (If no, proceed to question 133.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 133.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

132

Were you successful in your efforts to enact limits on smoking in public places during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

133

Did you work to enact limits on the use and placement of cigarette vending machines during this reporting period?
☐ Yes
☐ No (If no, proceed to question 135.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 135.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

134

Were you successful in your efforts to enact limits on the use and placement of cigarette vending machines during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

135

Did you work to enact regulations on alcohol or tobacco advertising and billboard placements in the community during this reporting period?
☐ Yes
☐ No (If no, proceed to question 137.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 137.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

136

Were you successful in your efforts to enact regulations on alcohol or tobacco advertising and billboard placements during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

137

Did you work to establish drug/alcohol/tobacco-free school zones and/or school use policies during this reporting period?
☐ Yes
☐ No (If no, proceed to question 139.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 139.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

138

Were you successful in your efforts to establish drug/alcohol/tobacco-free school zones and/or school use policies during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

139

Did you work to establish drug/alcohol/tobacco-free workplaces and/or workplace use policies during this reporting period?
☐ Yes
☐ No (If no, proceed to question 141.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 141.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

140

Were you successful in your efforts to establish drug/alcohol/tobacco-free workplaces and/or workplace use policies during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

141

Did you work to enact policies to reduce the problems/consequences associated with substance abuse (e.g., crime, driving under the influence, etc.) during this reporting period?
☐ Yes
☐ No (If no, proceed to question 143.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 143.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

142

Were you successful in your efforts to enact policies to reduce the problems/consequences associated with substance abuse during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

143

Did you work to implement organizational policies (e.g., within recreation leagues, summer camps, other non-governmental organizations) to reduce drug/alcohol/tobacco use among staff and youth during this reporting period?
☐ Yes
☐ No (If no, proceed to question 145.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 145.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

144

Were you successful in your efforts to implement organizational policies to reduce drug/alcohol/tobacco use among staff and youth during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

145

Did you work to implement keg registration during this reporting period?
☐ Yes
☐ No (If no, proceed to question 147.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 147.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

146

Were you successful in your efforts to implement keg registration during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

147

Did you conduct other policy interventions during this reporting period?
☐ Yes (Describe.)
☐ No (If no, proceed to question 149.)
☐ Not applicable. This type of policy was in place prior to receipt of SPF SIG funding. (If not applicable, proceed to question 149.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

148

Were you successful in your efforts to conduct other policy interventions during this reporting period?
☐ Yes
☐ We made some progress in this effort during this reporting period, but we still have some work to do.
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

149

Did you contact your representatives (e.g., to prohibit alcohol consumption and smoking in public places) during this reporting period?
_" Yes
" _ No (If no, proceed to question 152.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

150

How many representatives were contacted during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

151

How many issues did you contact your representatives about during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

152

Did you provide information to elected officials about policies to be enacted (e.g., to prohibit new alcohol outlets in the community) during this reporting period?
☐ Yes
☐ No (If no, proceed to question 155.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

153

How many elected officials were provided information during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

154

How many policies did you provide information on during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

155

Did you organize a ballot initiative during this reporting period?
☐ Yes
☐ No (If no, proceed to question 157.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

156

How many ballot initiatives were organized during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

157

Did you work with school administrators and teachers to implement a drug-free policy during this reporting period?
☐ Yes
☐ No (If no, proceed to question 159.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

158

How many schools did you engage in policy implementation during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

159

Did you work with businesses to implement a drug-free workplace during this reporting period?
☐ Yes
☐ No (If no, proceed to question 161.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

160

How many businesses did you engage in policy implementation during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

161

Did you conduct other policy activities during this reporting period?
☐ Yes
☐ No (If no, proceed to question 163.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

162

How often did you conduct other policy activities during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Policy

163

Did you conduct compliance checks that target merchants who sell alcohol and tobacco to minors during this reporting period?
☐ Yes
☐ No (If no, proceed to question 166.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

164

How many compliance checks were conducted during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

165

How many merchants were targeted during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

166

Did you establish sobriety checkpoints during this reporting period?
_" Yes
_" No (If no, proceed to question 169.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

167

How many sobriety checkpoints were established during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

168

Provide the frequency of checkpoints during this reporting period.

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

169

Did you set up surveillance of areas known for illegal drug sales during this reporting period?
_" Yes
_" No (If no, proceed to question 172.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

170

How many areas were targeted for surveillance during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

171

Provide the frequency of the surveillance during this reporting period.

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

172

Did you work to increase the number of building inspections conducted during this reporting period, relative to the number conducted prior to this reporting period?
☐ Yes
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

173

Did you work to ensure that policies to force landlords to improve or demolish run-down buildings were enforced during this reporting period?
_" Yes
_" No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

174

During this reporting period, did you make use of civil and criminal "nuisance abatement" statutes, which require landlords to evict tenants involved in narcotics-related activities or risk personal prosecution?
☐ Yes
☐ No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

175

Did you enforce policies to reduce the problems/consequences associated with substance abuse during this reporting period?
_" Yes
_" No

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

176

Did you conduct other enforcement activities during this reporting period?
_" Yes (Describe.) ____
_" No (If no, proceed to question 178.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

177

How often did you conduct other enforcement activities during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

178

Did you educate law enforcement during this reporting period?
_" Yes
_" No (If no, proceed to question 181.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

179

How many law enforcement education sessions were conducted during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

180

How many law enforcement officers were educated during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

181

Did you collaborate with law enforcement (e.g., work with law enforcement to familiarize them with high-risk areas of the community for sting operations, sobriety checkpoints, etc.) during this reporting period?
☐ Yes
☐ No (If no, proceed to question 183.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

182

How many law enforcement officers were engaged in collaboration during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

183

Did you conduct citizen patrols in neighborhoods known for illegal drug sales during this reporting period?
" _ Yes
" _ No (If no, proceed to question 187.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

184

How many citizen patrols were conducted during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

185

How many neighborhoods known for illegal drugs were patrolled during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

186

Did you collaborate with municipal officials and private landlords during this reporting period to improve, rebuild, or raze abandoned buildings used for drug use, adolescent alcohol use, and other illegal activities?
☐ Yes
☐ No (If no, proceed to question 189.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

187

How many municipal officials were engaged in collaboration during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

188

How many private landlords were engaged in collaboration during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

189

Did you conduct server training programs that work with bartenders and wait staff to reduce service to minors and intoxicated customers during this reporting period?
_" Yes
" _ No (If no, proceed to question 192.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

190

How many server training programs were offered during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

191

How many bartenders/wait staff were trained during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

192

Did you educate merchants about the laws and penalties for selling to underage customers during this reporting period?
_" Yes
" _ No (If no, proceed to question 195.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

193

How many merchant training programs were offered during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

194

How many merchants were educated about the laws and penalties for selling to underage customers during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

195

Did you conduct other enforcement activities during this reporting period?
_" Yes (Describe.)________
" _ No (If no, proceed to question 197.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

196

How often did you conduct other enforcement activities during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Enforcement

197

Did you engage in social marketing during this reporting period?
_" Yes
_" No (If no, proceed to question 208.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

198

How many social marketing campaigns were implemented during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

199

How many television ads were created during this reporting period, as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

200

How many television ads were aired during this reporting period, as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

201

How many radio ads were created during this reporting period, as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

202

How many radio ads were aired during this reporting period, as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

203

How many print ads were created during this reporting period, as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

204

How many print ads were published during this reporting period, as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

205

How many special events (e.g., drug-free concert, smoke-free sponsored softball tournament) were hosted during this reporting period, as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

206

How many other promotional activities (e.g., providing smoke-free pamphlets at a fair, distributing drug-free book covers at a school) were hosted during this reporting period as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

207

How many promotional items were distributed during this reporting period, as part of your social marketing campaigns?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

208

Did you engage in media literacy efforts during this reporting period?
_" Yes
" _ No (If no, proceed to question 210.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

209

How many media literacy building sessions were held during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

210

Did you conduct other communication interventions during this reporting period?
_" Yes (Describe.)___
" _ No (If no, proceed to question 212.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

211

How often did you conduct other communication activities during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

212

Did you present at community meetings (e.g., PTA meetings, town meetings, school assemblies) during this reporting period?
_" Yes
_" No (If no, proceed to question 215.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

213

At how many community meetings did you present during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

214

What was the total number of participants at all community meetings where you presented during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

215

Did you send letters to the editor of the local newspaper or community newsletters during this reporting period?
_" Yes
_" No (If no, proceed to question 218.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

216

How many letters were sent during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

217

How many letters were published during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

218

Did community members gather to show disapproval of upcoming alcohol-sponsored events during this reporting period?
_" Yes
_" No (If no, proceed to question 220.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

219

How many gatherings protesting alcohol-sponsored events were held during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

220

Did you develop substance abuse prevention public service announcements (PSAs) during this reporting period?
_" Yes
_" No (If no, proceed to question 222.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

221

How many PSAs were developed during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

222

Did you broadcast substance abuse prevention public service announcements (PSAs) during this reporting period?
_" Yes
_" No (If no, proceed to question 224.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

223

How often were the PSAs broadcast during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

224

Did you produce and/or distribute substance abuse prevention posters?
_" Yes
_" No (If no, proceed to question 227.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

225

How many posters were distributed?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

226

How many weeks are the posters scheduled to be displayed?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

227

Did you develop prevention-focused Web site(s) during this reporting period?
_" Yes
_" No (If no, proceed to question 229.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

228

How many hits did the Web site(s) receive during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

229

Did you conduct other communication activities during this reporting period?
_" Yes (Describe.)_______
_" No (If no, proceed to question 231.)

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

230

How often did you conduct other communication activities during this reporting period?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

231

Describe any other type(s) of environmental strategies you worked to implement or implemented during this reporting period that do not fall into the categories listed above.

Community Level: Planning & Implementation

Environmental Strategies Sub-Form/Communication

232

What individuals or organizations did you work with in planning or implementing environmental strategies during this reporting period?
" _ Youth
" _ Parents
" _ Business community
" _ Media (e.g., radio and television stations, newspapers and magazines)
" _ School(s)
" _ Youth serving organization(s) (other than schools) (e.g., Big Brothers Big Sisters, Boy Scouts/Girl Scouts)
" _ Law enforcement agency/agencies
" _ Religious or fraternal organization(s) (e.g., churches, Lions Club, Kiwanis)
" _ Civic or volunteer organization(s) (e.g., local sports associations, neighborhood associations)
" _ Healthcare
professionals
" _ State and/or local and/or tribal government agencies
" _ Other (Describe.)________

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

233

When did you first start conducting environmental strategies as part of this intervention?

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

234a

Percentage of population targeted ____%

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

234b

Subgroups targeted, if applicable

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

235a

Percentage of population targeted ____%

Racial Category/Asian

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

235b

Subgroups targeted, if applicable

Racial Category/Asian

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

236a

Percentage of population targeted ____%

Racial Category/Black or African American

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

236b

Subgroups targeted, if applicable

Racial Category/Black or African American

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

237a

Percentage of population targeted ____%

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

237b

Subgroups targeted, if applicable

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

238a

Percentage of population targeted ____%

Racial Category/White

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

238b

Subgroups targeted, if applicable

Racial Category/White

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

239a

Percentage of population targeted ____%

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

239b

Subgroups targeted, if applicable

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

240a

Percentage of population targeted ____%

Racial Category/Other

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

240b

Subgroups targeted, if applicable

Racial Category/Other

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

241a

Percentage of population targeted ____%

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

241b

Subgroups targeted, if applicable

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Environmental Strategies Sub-Form

242

What types of information did you disseminate?
_" Program information (e.g., contact information, meeting times, etc.)
_" " Substance abuse prevention information
_" " Surveillance and monitoring information, for example information about whom to contact if you suspect a meth lab is operating in your neighborhood.
_" " Drunk driving prevention information, such as free cab rides home on New Years Eve.
_" " Other (Describe.)___

Community Level: Planning & Implementation

Information Dissemination Sub-Form

243

What was the format of the information you disseminated?
☐ Brochures
☐ Flyers
☐ Magnets
☐ Other promotional items (Frisbees, balls, cups)
☐ Other (Describe.) ________

Community Level: Planning & Implementation

Information Dissemination Sub-Form

244

Describe the settings in which the information was disseminated.

Community Level: Planning & Implementation

Information Dissemination Sub-Form

245

Approximately how many individuals received the information disseminated?

Community Level: Planning & Implementation

Information Dissemination Sub-Form

246

What individuals or organizations did you work with in planning or implementing your information dissemination efforts during this reporting period?
_" Youth
_" Parents
_" Business community
_" Media (e.g., radio and television stations, newspapers and magazines)
_" School(s)
_" Youth serving organization(s) (other than schools) (e.g., Big Brothers Big Sisters, Boy Scouts/Girl Scouts)
_" Law enforcement agency/agencies
_" Religious or fraternal organization(s) (e.g., churches, Lions Club, Kiwanis)
_" Civic or volunteer organization(s) (e.g., local sports associations, neighborhood associations)
_" Healthcare professionals
_" State and/or local and/or tribal government agencies
_" Other (Describe.)_______

Community Level: Planning & Implementation

Information Dissemination Sub-Form

247

When did you first start conducting information dissemination activities as part of this intervention?

Community Level: Planning & Implementation

Information Dissemination Sub-Form

248a

Percentage of population targeted ____%

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Information Dissemination Sub-Form

248b

Subgroups targeted, if applicable

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Information Dissemination Sub-Form

249a

Percentage of population targeted ____%

Racial Category/Asian

Community Level: Planning & Implementation

Information Dissemination Sub-Form

249b

Subgroups targeted, if applicable

Racial Category/Asian

Community Level: Planning & Implementation

Information Dissemination Sub-Form

250a

Percentage of population targeted ____%

Racial Category/Black or African American

Community Level: Planning & Implementation

Information Dissemination Sub-Form

250b

Subgroups targeted, if applicable

Racial Category/Black or African American

Community Level: Planning & Implementation

Information Dissemination Sub-Form

251a

Percentage of population targeted ____%

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Information Dissemination Sub-Form

251b

Subgroups targeted, if applicable

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Information Dissemination Sub-Form

252a

Percentage of population targeted ____%

Racial Category/White

Community Level: Planning & Implementation

Information Dissemination Sub-Form

252b

Subgroups targeted, if applicable

Racial Category/White

Community Level: Planning & Implementation

Information Dissemination Sub-Form

253a

Percentage of population targeted ____%

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Information Dissemination Sub-Form

253b

Subgroups targeted, if applicable

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Information Dissemination Sub-Form

254a

Percentage of population targeted ____%

Racial Category/Other

Community Level: Planning & Implementation

Information Dissemination Sub-Form

254b

Subgroups targeted, if applicable

Racial Category/Other

Community Level: Planning & Implementation

Information Dissemination Sub-Form

255a

Percentage of population targeted ____%

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Information Dissemination Sub-Form

255b

Subgroups targeted, if applicable

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Information Dissemination Sub-Form

256

Describe any other component of the intervention that was delivered to individuals.

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

257

When did you first start conducting this component of the intervention?

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

258

What was the average duration of one session during this reporting period?

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

259

How many sessions did you conduct during this reporting period?

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

260

What was the format of this component during this reporting period?
☐ Individual
☐ Small group (2-9)
☐ Large group (10-49)
☐ Extra large group (50+)
☐ Web-based
☐ Other (Describe.)

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

261

Indicate the types of participants served by this component during this reporting period.
☐ Children age 0 to 3
☐ Children age 4 to 5
☐ Children age 6 to 11
☐ Youth age 12 to 17
☐ Young adults age 18 to 20
☐ Young adults age 21 to 24
☐ Parents
☐ Adults 18 and over, but not parents
☐ Community leaders
☐ Healthcare providers
☐ Substance abuse prevention/treatment workers
☐ Law enforcement
☐ Other (Describe.)

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

262

As delivered, how would you classify this other intervention component according to the Institute of Medicine categories?
☐ Universal
☐ Selective
☐ Indicated

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

263a

Percentage of participants served ____%

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

263b

Subgroups targeted, if applicable

Racial Category/American Indian/Alaska Native

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

264a

Percentage of participants served ____%

Racial Category/Asian

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

264b

Subgroups targeted, if applicable

Racial Category/Asian

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

265a

Percentage of participants served ____%

Racial Category/Black or African American

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

265b

Subgroups targeted, if applicable

Racial Category/Black or African American

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

266a

Percentage of participants served ____%

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

266b

Subgroups targeted, if applicable

Racial Category/Native Hawaiian or Other Pacific Islander

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

267a

Percentage of participants served ____%

Racial Category/White

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

267b

Subgroups targeted, if applicable

Racial Category/White

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

268a

Percentage of participants served ____%

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

268b

Subgroups targeted, if applicable

Racial Category/Participants who selected more than one race

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

269a

Percentage of participants served ____%

Racial Category/Other

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

269b

Subgroups targeted, if applicable

Racial Category/Other

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

270a

Percentage of participants served ____%

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

270b

Subgroups targeted, if applicable

Ethnic Category/Hispanic/Latino

Community Level: Planning & Implementation

Other Activities or Services Delivered to Individuals Sub-Form

271

Describe the activities or services you provided that were not delivered to individuals.

Community Level: Planning & Implementation

Other Activities or Services Not Delivered to Individuals Sub-Form

272

Describe the intended target population for these activities or services.

Community Level: Planning & Implementation

Other Activities or Services Not Delivered to Individuals Sub-Form

273

When did you first start conducting these other non-participant based activities, as a component of this intervention?

Community Level: Planning & Implementation

Other Activities or Services Not Delivered to Individuals Sub-Form






















APPENDIX H


Introductory Email to SPF SIG State Project Directors






To: State Project Directors


From: SPF SIG Cross-Site Evaluation Team


Re: Registration for Access to the Web-Based Community Level Instrument (CLI)



The state is responsible for several tasks regarding completion of the CLI by the community partners. These tasks include:

  • Setting up the username and password for each community partner;

  • Entering the contact information for each community partner; and

  • Reviewing and approving each community partner’s CLI.


Each state is asked to designate one person to serve as the state administrator. This state administrator will be responsible for each of the tasks described above. You may choose to recruit other state-level individuals to assist with these tasks, but only one username and password will be provided to each state. Please note that the state administrator will be responsible for submitting each of the CLIs following every reporting period.


Please complete the attached form, providing the contact information for the person identified as the state administrator. The form should be returned to Shelly Kowalczyk ([email protected]) by (insert date).


Once we have entered your contact information into the system and have created your user profile, you will be able to access the CLI. An email notification will be sent to you with your username and temporary password. You will be prompted to change this temporary password after you first log in. Also included will be instructions on how to register each of your funded communities.


If you have any questions, please contact Shelly Kowalczyk at 301-587-1600 ([email protected]).


Thank you.
















APPENDIX I


State-level Administrator Registration Form

Web-Based Community Level Instrument (CLI)

State Administrator

User Profile



Please provide the following information to allow the SPF SIG Cross-Site Evaluation Team to complete your user profile for access to the CLI.

FIRST NAME ___________________________________________________________


LAST NAME ___________________________________________________________


E-MAIL ADDRESS ______________________________________________________


PHONE NUMBER _______________________________________________________


ORGANIZATION NAME _________________________________________________


ADDRESS _____________________________________________________________


CITY, STATE, ZIP _______________________________________________________





Thank You.
















APPENDIX J


Introductory Email to State-level Administrators

To: State Administrators


From: SPF SIG Cross-Site Evaluation Team


Re: Access to the Community Level Instrument (CLI)


You will find below your username and temporary password for access to the web-based Community Level Instrument. Please go to http://westat.hmstech.com/ and enter the following information on the login screen:


USERNAME

PASSWORD


Once you have entered this information, you will be prompted to change your password.


We recommend that you review your user profile and confirm that the cross-site team has entered all of the information correctly. Once you have done so, you are ready to begin entering the profile information for each of your funded communities.


To enter a community partner profile, click on the state user tab. On that page, you will see an "add user" link, which will allow you to enter and submit the contact information for each community.


If you have any questions, please contact Shelly Kowalczyk at 301-587-1600 ([email protected]).


Thank you.

















APPENDIX K


Community Partner Registration Form

Web-Based Community Level Instrument (CLI)

Community Partner

User Profile



Please provide the following information to allow the State Administrator to complete your user profile for access to the CLI.

FIRST NAME ___________________________________________________________


LAST NAME ___________________________________________________________


E-MAIL ADDRESS ______________________________________________________


PHONE NUMBER _______________________________________________________


ORGANIZATION NAME _________________________________________________


ADDRESS _____________________________________________________________


CITY, STATE, ZIP _______________________________________________________





Thank You.


















APPENDIX L


Introductory Email to Community Partners

To: Community Partners


From: SPF SIG State Administrator


Re: Access to the Community Level Instrument (CLI)


You will find below your username and temporary password for access to the web-based Community Level Instrument. Please go to http://westat.hmstech.com/ and enter the following information on the login screen:


USERNAME

PASSWORD


Once you have entered this information, you will be prompted to change your password.


We recommend that you review your user profile and confirm that the state administrator has entered all of the information correctly. Once you have done so, you are ready to begin completing the CLI by clicking on the survey tab.


If you have any questions about your username or password, or if any of your profile information is incorrect, please contact (insert state administrator’s contact information).


If you have questions about the CLI or navigating through the Web site, please contact Shelly Kowalczyk at 301-587-1600 ([email protected]).


Thank you.

















APPENDIX M


Reminder Emails to Community Partners and State-level Administrators

Initial Reminder Email

Subject: CLI Submission Reminder


Today is the last day of the current SPF SIG semiannual reporting period. Please remember that the Community Level Instrument must be completed and submitted online by (insert date). You may access the instrument at http://westat.hmstech.com/.


Thank you.




2-Week Reminder Email

Subject: CLI Submission Due (insert date)


Just a reminder that there are only two weeks left to complete and submit your CLI.

If you have any questions, please contact your State Administrator or Shelly Kowalczyk at 301-587-1600 ([email protected]). Thank you.

















APPENDIX N


Follow-up Emails


Follow-up Email to State-level Administrators


Subject: CLI Submission Past Due


According to our records, (# of CLIs) CLIs from your state remain outstanding. If anything will prevent you from submitting the remaining CLIs by (insert date), please contact Shelly Kowalczyk at 301-587-1600 ([email protected]).


Thank you.








Follow-up Email to State-level Administrators and State Project Directors



Subject: Final Reminder Regarding CLI Submission


The SPF SIG Cross-Site Evaluation Team has not yet received (insert # of CLIs) of your state’s CLIs (insert CLI names). So that the Cross-Site Evaluation Team can analyze your communities’ data and return cleaned data to you in a timely manner, we need to receive the CLIs by the due date. Please contact Shelly Kowalczyk at (301) 587-1600 ([email protected]) to coordinate a process for submitting your remaining CLIs.


Thank you.


















APPENDIX O


Federal Register Notice


Federal Register / Vol. 71, No. 6 / Tuesday, January 10, 2006 / Notices


DEPARTMENT OF HEALTH AND HUMAN SERVICES

Substance Abuse and Mental Health Services Administration

Agency Information Collection Activities: Proposed Collection; Comment Request

In compliance with Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995 concerning opportunity for public comment on proposed collections of information, the Substance Abuse and Mental Health Services Administration will publish periodic summaries of proposed projects. To request more information on the proposed projects or to obtain a copy of the information collection plans, call the SAMHSA Reports Clearance Officer on (240) 276–1243.

Comments are invited on: (a) Whether the proposed collections of information are necessary for the proper performance of the functions of the agency, including whether the information shall have practical utility; (b) the accuracy of the agency’s estimate of the burden of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on respondents, including through the use of automated collection techniques or other forms of information technology.

Proposed Project: Strategic Prevention Framework State Incentive Grant (SPF SIG) Program—NEW

The Substance Abuse and Mental Health Services Administration’s (SAMHSA) Center for Substance Abuse Prevention (CSAP) is responsible for the Evaluation of the Strategic Prevention Framework State Incentive Grant (SPF SIG) Program. The program is a major national initiative designed to: (1) Prevent the onset and reduce the progression of substance abuse, including childhood and underage drinking; (2) reduce substance abuse-related problems in communities; and (3) build prevention capacity and infrastructure at the State/territory and community levels.

Five steps comprise the SPF:

Step 1: Profile population needs, resources, and readiness to address needs and gaps.

Step 2: Mobilize and/or build capacity to address needs.

Step 3: Develop a comprehensive strategic plan.

Step 4: Implement evidence-based prevention programs, policies, and practices.

Step 5: Monitor, evaluate, sustain, and improve or replace those that fail.

Under a contract with CSAP, an evaluation team will implement a multi-method quasi-experimental evaluation at national, State, and community levels. Evaluation data will be collected from 26 states receiving grants in 2004 and 2005 and as many as 32 non-grantee states that will serve as a comparison group. The primary evaluation objective is to determine the impact of SPF SIG on the SAMHSA National Outcome Measures (NOMs).

This notice invites comment on state-level and community-level data collection instruments. The instruments for assessing state-level change will be included in an OMB review package submitted immediately after the expiration of the comment period and are the main focus of this announcement. These instruments will be reviewed first by OMB to ensure that state-level data collection occurs as specified in the evaluation plan (on or before June 30, 2006). Because the states have not awarded community-level funding, the evaluators will not initiate community-level data collection until late in 2006. Thus, the community-level survey will be submitted as an addendum approximately one month after the comment period expires. However, the instrument is described in this notice and comments on the instrument are invited.

State-Level Data Collection

Two instruments were developed for assessing state-level effects. Both instruments are guides for telephone interviews that will be conducted by trained interviewers three to four times over the life of the SPF SIG award. The Strategic Prevention Framework Index will be used to assess the relationship between SPF implementation and change in the national outcome measures. The State Infrastructure Index will capture data to assess infrastructure change and to test the relationship of this change to outcomes. Prevention infrastructure refers to the organizational features of the system that delivers prevention services, including all procedures related to planning, data management systems, workforce development, intervention implementation, evaluation and monitoring, financial management, and sustainability. The estimated annual burden for state-level data collection is displayed below in the table.

STATE LEVEL BURDEN ESTIMATE

Interview guide | Content description | Number of respondents | Number of responses | Hourly burden per response | Total hourly burden

Year 1
SPF Implementation Index | SEW activities, indicators for each SPF step, including cultural competence throughout all five steps. | 26 | 1 | 3 | 78
State Infrastructure Index | Assessment of a state’s progress over time toward the implementation of these best practices. | 26 | 1 | 6 | 156
Total State Level Year 1 Burden | | | 2 | 9 | 234

Year 2
SPF Implementation Index | SEW activities, indicators for each SPF step, including cultural competence throughout all five steps. | 26 | 1 | 3 | 78
State Infrastructure Index | Assessment of a state’s progress over time toward the implementation of these best practices. | 26 | 1 | 6 | 156
Total State Level Year 2 Burden | | | 2 | 9 | 234

Year 3
SPF Implementation Index | SEW activities, indicators for each SPF step, including cultural competence throughout all five steps. | 26 | 1 | 3 | 78
State Infrastructure Index | Assessment of a state’s progress over time toward the implementation of these best practices. | 26 | 1 | 6 | 156
Total State Level Year 3 Burden | | | 2 | 9 | 234

Average Annual State Burden | | | 2 | 9 | 234



Community-level Data Collection

The Community Level Index is a two-part, web-based survey for capturing information about SPF SIG implementation at the community level. Part 1 of the survey focuses on the five SPF SIG steps and efforts to ensure cultural competency throughout the SPF SIG process. Part 2 will capture data on the specific intervention(s) implemented at the community level, including both individual-focused and environmental prevention strategies. Community partners receiving SPF SIG awards will be required to complete the survey every six months, using a secure password system. The survey data will be analyzed in conjunction with state and community outcome data to determine the relationship, if any, between the SPF process and substance use outcomes. This survey will be submitted as an addendum to the forthcoming OMB package approximately one month after the expiration of the comment period.

The estimated annual burden for community-level data collection is displayed below. Note that the total burden assumes an average of 15 community-level sub-grantees per state (a total of 390 respondents) and two survey administrations per year. Note also that some questions will be addressed only once and the responses will be used to pre-fill subsequent surveys. In addition, as community partners work through the SPF steps, they will report only on step-related activities. For example, needs assessment activities will likely precede monitoring and evaluation activities. Thus, respondents will answer questions related to needs assessment in the first few reports but will not need to address monitoring and evaluation items until later in the implementation process.

COMMUNITY LEVEL SURVEY BURDEN ESTIMATE

Survey Section | Content description | Number of respondents | Number of responses | Hourly burden per response | Total hourly burden

Year 1
Part I, 1–10 | Contact Information and Reporting Period. | 390 | 1 | 0.2 | 78
11–19 | Organization Type and Funding. | 390 | 1 | 0.2 | 78
20–26 | Cultural Competence, Sustainability and Framework Progress. | 390 | 2 | 0.1 | 78
27–47 | Needs and Resources Assessments | 390 | 1 | 0.5 | 195
48–137 | Capacity Building Activities | 390 | 2 | 1.7 | 1,326
138–155 | Strategic Plan Development | 390 | 1 | 1.0 | 390
172–178 | Contextual Factors and Closing Questions. | 390 | 2 | 1.0 | 780
Sub-form 179–191 | Coalition Organizational Information | 390 | 1 | 1.0 | 390
Part II 1–52 | Intervention Specific Information and Adaptations. | 390 | 3 | 2.0 | 2,340
Review of past responses | | 390 | 2 | 1.0 | 780
Total Community Level Year 1 Burden | | | 16 | 8.6 | 6,435

Year 2
Part I, 20–26 | Cultural Competence, Sustainability and Framework Progress. | 390 | 2 | 0.1 | 78
48–137 | Capacity Building Activities | 390 | 2 | 1.7 | 1,326
172–178 | Contextual Factors and Closing Questions. | 390 | 2 | 1.0 | 780
Part II 1–52 | Intervention Specific Information and Adaptations. | 390 | 3 | 2.0 | 2,340
53–60 | Intervention Outcomes | 390 | 6 | 1.0 | 2,340
Sub-forms | Intervention Component Information | 390 | 6 | 1.0 | 2,340
Review of past responses | | 390 | 2 | 1.0 | 780
Total Community Level Year 2 Burden | | | 23 | 7.8 | 9,984

Year 3
Part I, 20–26 | Cultural Competence, Sustainability and Framework Progress. | 390 | 2 | 0.1 | 78
48–137 | Capacity Building Activities | 390 | 1 | 1.7 | 1,326
156–160 | Intervention Implementation | 390 | 2 | 0.1 | 78
172–178 | Contextual Factors and Closing Questions. | 390 | 2 | 1.0 | 780
Part II 1–52 | Intervention Specific Information and Adaptations. | 390 | 3 | 2.0 | 2,340
53–60 | Intervention Outcomes | 390 | 6 | 1.0 | 2,340
Sub-forms | Intervention Component Information | 390 | 6 | 1.0 | 2,340
Review of past responses | | 390 | 2 | 1.0 | 780
Total Community Level Year 3 Burden | | | 24 | 7.9 | 10,062

Average Annual Community Burden | | | 21 | 8.1 | 8,827
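
For readers who want to verify the arithmetic behind these tables: each row’s total hourly burden is the number of respondents multiplied by the number of responses per year and by the hourly burden per response, the annual total sums the rows, and the average annual burden is the mean of the three annual totals. The short Python sketch below is illustrative only (it is not part of the data collection); the row labels and values are copied from the Year 1 community-level table above.

# Illustrative check of the community-level burden arithmetic (values copied
# from the Year 1 table above; hours = respondents x responses x hours per response).
year1_rows = {
    "Part I, 1-10": (390, 1, 0.2),
    "11-19": (390, 1, 0.2),
    "20-26": (390, 2, 0.1),
    "27-47": (390, 1, 0.5),
    "48-137": (390, 2, 1.7),
    "138-155": (390, 1, 1.0),
    "172-178": (390, 2, 1.0),
    "Sub-form 179-191": (390, 1, 1.0),
    "Part II 1-52": (390, 3, 2.0),
    "Review of past responses": (390, 2, 1.0),
}

year1_total = sum(r * n * h for (r, n, h) in year1_rows.values())
print(f"Year 1 community-level burden: {year1_total:,.0f} hours")  # 6,435

# The average annual burden is the mean of the three annual totals.
annual_totals = [6435, 9984, 10062]
print(f"Average annual burden: {sum(annual_totals) / len(annual_totals):,.0f} hours")  # 8,827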


Send comments to Summer King, SAMHSA Reports Clearance Officer, Room 71–1044, One Choke Cherry Road, Rockville, MD 20857. Written comments should be received within 60 days of this notice.

Dated: December 30, 2005.

Anna Marsh,

Director, Office of Program Services.

[FR Doc. E6–95 Filed 1–9–06; 8:45 am]

BILLING CODE 4162–20–P





1 For simplicity of presentation, from here on the term “state” will be used to connote both states and territories.

2 The State Strategic Plan must be approved by the SAMHSA/CSAP Government Project Officer before implementation activities can begin (CSAP, 2004).


3 The advantages of a web-based instrument over a paper instrument include: a reduced respondent burden, the ability to build in skip patterns and quality checks, and direct downloads into an electronic database.

4 By population-based, we mean that all members of the target population or subpopulation, or at least a representative sample of the same, have the opportunity to contribute to the measures. Pre- and post-test measures obtained from participants in prevention programs are unlikely to be population-based, as they typically include small and non-random subsets of the target population.

5 First, there is the difficulty of determining the communities in which NSDUH respondents live. Second, we don’t know how many respondents would actually fall in the intervention communities. Third, samples from communities are not designed to be representative of those communities. Fourth, there may be some difficulty getting the confidential geographic information necessary to identify the communities of the respondents.

6 For example, some states have extensive longitudinal data on consumption and consequences among youth from school-based surveys, while others do not. Similarly, some states have well-developed systems of maintaining and linking administrative records on school enrollment, employment, arrests, etc., others do not.

7 Of the six questions, Question 3b may best demonstrate the power of a cross-site evaluation to yield generalizable inferences about selecting and implementing community interventions under the SPF model. For example, suppose that based on their needs assessment and problem analysis, a community within a given state elects to implement Communities Mobilizing for Change on Alcohol (CMCA), a community-based SAMHSA model program designed to reduce adolescent (13 to 20 years old) access to alcohol by changing community policies and practices (SAMHSA, 2004). Initiated in 1991, CMCA has been shown to effectively reduce teen drinking by limiting access to alcohol to underage youth, as well as communicate a clear message to the community that underage drinking is inappropriate and unacceptable. http://modelprograms.samhsa.gov/pdfs/FactSheets/Cmca.pdf If the implementation of CMCA fails to change outcomes in the target community, potential reasons include measurement failure, implementation failure, and theory failure. Measurement failure can be ruled out if the instrument has previously demonstrated sensitivity to change (particularly differential change across interventions), and implementation failure can be ruled out if the implementation assessment shows the program to be implemented with fidelity and cultural competence. Theory failure, on the other hand, is near impossible to “unpack” from a single implementation. Was there something about the community that made reduction of underage drinking more difficult than anticipated? If so, the answer could lie in any number of demographic, cultural, and environmental factors. Alternatively, was there something about the local adaptation of CMCA that missed the mark, yet was not picked up by monitoring systems in place at the time? Clearly, the unpacking process is greatly facilitated by having multiple replications of the same program, policy, or practice both within and across states. The integrated multilevel analysis can bring out the common elements of successful replications, be they moderators, mediators, or both, at state-level, community-level, or both. This in turn sets up an outcomes-driven empirical basis for advancing best practices that is not possible with a single study or series of case studies.

8 Some states planned to implement a comparative design from the outset, and others are considering it.

9 The commonality could be demographic (e.g., states with a large Hispanic populations), problem based (e.g., states with a proliferation of methamphetamine labs), or programmatic (e.g., states investing in a particular environmental strategy).

10 For simplicity of presentation, from here on the term “state” will be used to connote both states and territories.

11 CSAP may obtain infrastructure development data from another contract. If such data are available, they will permit refining of effect estimates as well as enhance our understanding of causes.

12 That strategy could be mapped into the statistical models in Section 7, though alternative approaches could also be viable.

13 Baron and Kenny (1986) define a mediating variable as “the generative mechanism through which the focal independent variable is able to influence the dependent variable of interest” (p. 1173). In statistical terms, the mediator y is the consequence of the independent variable x but the antecedent of the dependent variable z. The attention to testing program theory in evaluation—particularly the underlying causal assumptions about why intervention should work—has emphasized the role of mediators in outcome studies (Weiss 1997).

14 In contrast to the causal relationship of the mediator to both the independent and dependent variable, moderators in an evaluation examine the interaction of the program variable with some other variable. In the treatment literature, moderator analyses are often described as the search for differential or subgroup effects. Statistically, the variable y is a moderator if the relationship between the independent variable x and the dependent variable z varies as a function of y, that is, the independent variable’s effects vary along levels of the moderator (Mark, Hofmann, and Reichardt, 1992).
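
As a concrete illustration of the distinction drawn in footnotes 13 and 14, the sketch below shows how a mediator and a moderator would typically be tested in an ordinary least squares framework. It uses simulated data only (no SPF SIG data), follows the x, y, z notation of the footnotes, and is meant as an illustration rather than the evaluation’s analysis code.

# Illustrative sketch of the mediator/moderator distinction (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)

# Mediation (footnote 13): x -> y -> z.
y = 0.8 * x + rng.normal(size=n)           # mediator is a consequence of x
z = 0.5 * y + rng.normal(size=n)           # outcome depends on x only through y
total = sm.OLS(z, sm.add_constant(x)).fit()                          # x -> z (total effect)
direct = sm.OLS(z, sm.add_constant(np.column_stack([x, y]))).fit()   # x -> z controlling for y
print("total effect of x:", round(total.params[1], 2))    # about 0.4
print("direct effect of x:", round(direct.params[1], 2))  # near 0: effect transmitted via y

# Moderation (footnote 14): the effect of x on z varies with levels of y.
y_mod = rng.binomial(1, 0.5, size=n)        # e.g., a subgroup indicator
z_mod = 0.2 * x + 0.6 * x * y_mod + rng.normal(size=n)
X = sm.add_constant(np.column_stack([x, y_mod, x * y_mod]))
inter = sm.OLS(z_mod, X).fit()
print("interaction (moderation) term:", round(inter.params[3], 2))   # about 0.6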


15 See SPF SIG Automatic Data Processing/IT Plan (Westat, 2005) for details.

16 The State Strategic Plan must be approved by the SAMHSA/CSAP Government Project Officer before implementation activities can begin (CSAP, 2004).


17 Communities will not be funded until the beginning of year 2 at the earliest (Oct ‘05 – March ‘06) and will not start planning, or data collection, until then.

18 With a single covariate (e.g., age), the same diagnosis could generally be made visually, but with many confounding covariates this is more difficult, and the issues of inadequate overlap and reliance on untrustworthy model-based extrapolations are more serious because small differences in many covariates can accumulate into a substantial overall difference.
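
One common way to make the diagnosis described in footnote 18 when there are many covariates is to collapse them into an estimated propensity score and compare its distribution across the funded and comparison groups; score ranges populated by only one group signal poor overlap and reliance on model-based extrapolation. The sketch below is purely illustrative: it uses simulated data and assumes a simple logistic model for group membership, not the evaluation’s actual covariates.

# Illustrative covariate-overlap diagnostic via a propensity score (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, k = 500, 10
covariates = rng.normal(size=(n, k))                      # k baseline covariates
# Group membership (1 = funded) mildly related to the first two covariates.
p_funded = 1 / (1 + np.exp(-(0.3 * covariates[:, 0] + 0.2 * covariates[:, 1])))
funded = rng.binomial(1, p_funded)

# Propensity score: probability of being in the funded group given the covariates.
ps_model = sm.Logit(funded, sm.add_constant(covariates)).fit(disp=0)
pscore = ps_model.predict(sm.add_constant(covariates))

# Compare the score distributions in the two groups.
for label, group in [("funded", pscore[funded == 1]), ("comparison", pscore[funded == 0])]:
    print(f"{label:>10}: min={group.min():.2f}  median={np.median(group):.2f}  max={group.max():.2f}")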

19 Prior to Version 6, this option was available for linear models only.

20 To maximize the reliability of our results, all reported estimates and significance tests will be based on HLM’s population-average model with robust standard errors (Raudenbush and Bryk, 2002).

21 The moderators are “causally prior” to the mediators; hence it is common practice to test them first.

22 Initiated in 1991, CMCA has been shown to effectively reduce teen drinking by limiting access to alcohol to underage youth, as well as communicate a clear message to the community that underage drinking is inappropriate and unacceptable. http://modelprograms.samhsa.gov/pdfs/FactSheets/Cmca.pdf


23 A variant of the threat called “local” history exists with a comparative design, but it presupposes a community-by-history interaction, which is generally less likely to occur than a history main effect, which the comparative design rules out.

24 Several states are currently considering this option.


