SUPPORTING STATEMENT FOR THE STRATEGIC PREVENTION FRAMEWORK STATE INCENTIVE GRANT (SPF SIG) COMMUNITY-LEVEL INSTRUMENT
JUSTIFICATION
A1. Circumstances of Information Collection
The Substance Abuse and Mental Health Services Administration (SAMHSA), Center for Substance Abuse Prevention (CSAP), requests OMB approval for a new two-part Community-level Instrument (see Appendix B). This two-part web-based survey is part of the Strategic Prevention Framework State Incentive Grant (SPF SIG) National Cross-site Evaluation. Part I of the instrument assesses the progress of communities as they implement the Strategic Prevention Framework (SPF); Part II gathers descriptive information about the specific interventions being implemented at the community level and the populations being served, including the gender, age, race, ethnicity, and number of individuals in target populations. Each SPF SIG-funded community will complete a separate Part II form for each intervention it implements. The evaluation of the SPF SIG project is authorized under Section 501(d)(4) of the Public Health Service Act (42 USC 290aa) (see Appendix A). This Community-level Instrument (parts I and II) request is an addendum to the SPF SIG State-level Interview supporting statement submitted earlier.
The SPF SIG Program
The SPF SIG is a major national SAMHSA Infrastructure Grant program that supports an array of activities to help states and communities build a solid foundation for delivering and sustaining effective substance abuse and/or mental health services. The SPF SIG is implemented by CSAP and is designed to: (1) prevent the onset and reduce the progression of substance abuse, including childhood and underage drinking; (2) reduce substance abuse-related problems in communities; and (3) build prevention capacity and infrastructure at the state/territory and community levels. CSAP provides funding to states and territories to implement the five steps of the Strategic Prevention Framework (SPF):
Step 1: Profile population needs, resources, and readiness to address needs and gaps
Step 2: Mobilize and/or build capacity to address needs
Step 3: Develop a comprehensive strategic plan
Step 4: Implement evidence-based prevention programs, policies, and practices
Step 5: Monitor, evaluate, sustain, and improve or replace interventions that fail.
CSAP funded 21 states and territories in FY2004 for up to 5 years to implement the SPF, and 5 additional states/territories in FY2005.
The National Evaluation
The National Institute on Drug Abuse (NIDA) is providing support to SAMHSA’s Center for Substance Abuse Prevention (CSAP) to evaluate the impact of the SPF SIG project. The national cross-site evaluation of the SPF-SIG project has received funding through September 2007. Community-level data collection, however, is expected to continue through September 2009.
The national cross-site evaluation of the SPF SIG program provides an important opportunity for the field of prevention. The SPF SIG is the first broad-based, data-driven effort that simultaneously attempts to influence both strategic planning and prevention systems at the state and community levels, as well as implement evidence-based prevention interventions in communities. This evaluation will help determine whether the SPF SIG has met these expectations and, if so, under what conditions.
The cross-site evaluation team will implement a multi-method quasi-experimental evaluation of the SPF SIG project at the national, state, and community levels. A major objective of the evaluation is to determine the impact of the SPF SIG on the SAMHSA National Outcome Measures (NOMs) and to assess the impact of the program as a whole. The evaluation will also measure: the effect of establishing and sustaining infrastructure at the state and community levels to allow for data-based decision-making; the implementation of the SPF; and environmental factors that affect substance abuse. The data from the Community-level Instrument (parts I and II) will be used to interpret the impact of the SPF SIG on all of the NOMs domains related to prevention (i.e., Abstinence, Education/Employment, Crime and Criminal Justice, Access/Capacity, Retention, Cost Effectiveness, and Use of Evidence-based Practices).
The national cross-site evaluation is based on data that will be collected through: (1) State Epidemiology and Outcome Workgroups (SEOW) and communities, (2) state-level evaluations, (3) existing national- and state-level population-based indicators, (4) standardized data collected by the evaluators on the implementation of the SPF, and (5) archival sources such as grant applications and State Prevention Advancement and Support Program (SPAS) reports. Because the evaluation begins concurrently with the funding of the programs, it will capture meaningful baseline data and allow observation of community-level accomplishments within SPF SIG states throughout the life-cycle of the program.
Both quantitative and qualitative data will be gathered as part of the SPF SIG national cross-site evaluation. These data will provide information about processes and systems outcomes at the state and community levels, as well as context for analyzing epidemiological outcomes at the national level. Data will be gathered from communities within the 26 states and territories receiving community partner grants in 2004 and 2005, and from as many as 32 non-grantee states and territories that will serve as a comparison group.
A2. Purpose and Use of Information
The SPF SIG is a major investment by the Federal Government to improve state substance abuse prevention systems, and enhance the quality of prevention programs, primarily through the implementation of the SPF. The goal of this initiative is to provide states and communities with the tools necessary to develop an effective prevention system with attention to the processes, directions, goals, expectations, and accountabilities necessary for functionality. SAMHSA/CSAP needs to collect information on an ongoing basis to monitor the progress of the SPF SIG initiative, particularly the implementation of evidence-based practices by communities. The agency will use the findings from the national cross-site evaluation to assess the implementation of the SPF, infrastructure development at the state and community level, and the outcomes achieved by this initiative. Without these data the impact of the SPF SIG will be unknown. Additionally, findings from this evaluation may assist CSAP policymakers and program developers as they design and implement future initiatives.
The national cross-site evaluation of the SPF SIG will focus on the relationship between the implementation of the SPF and changes in the NOMs. In particular, data from the Community-level Instrument (parts I and II) will be used to assess the relationship between SPF implementation and changes in the NOMs. Additionally, data from this instrument will be used to assess the types of interventions being implemented in communities that receive SPF funds and changes in prevention infrastructure at the community level. Prevention infrastructure refers to the organizational characteristics of the system that delivers prevention services, including all procedures related to planning, data management systems, workforce development, intervention implementation, evaluation and monitoring, financial management, and sustainability. All of the data from this instrument will be used to determine what accounts for any variation in the NOMs. Without these data, it would be impossible to determine how the SPF SIG initiative had an impact on changes in the NOMs or which components of the SPF process were responsible for the observed changes.
The Community-level Instrument (parts I and II) will be administered twice a year (every six months) over the course of the SPF SIG initiative. Thus, data from this instrument will also allow CSAP to assess the progress of communities in implementing both the SPF and the prevention-related interventions funded under the initiative. The data may also be used to identify obstacles to the implementation of the SPF and prevention-related interventions and to facilitate mid-course corrections for communities experiencing implementation difficulties.
A3. Use of Information Technology
The Community-level Instrument is a web-based survey; both Part I and Part II will be completed online. Web-based administration will increase the efficiency of data submission and improve data quality. It will also reduce the burden on communities: some items will be pre-filled based on information from the initial submission, and some items in Part II will be pre-filled with information from Part I. All respondents are expected to complete the instrument online (a 100 percent online participation rate). See Appendix C for screen shots of this web-based survey.
Technology is also being used to facilitate communication and provide updates to SPF SIG personnel. Through the SPF SIG web board, state evaluators, project directors, coordinators and other key staff have the opportunity to exchange valuable advice and receive announcements and clarifications from CSAP, other SPF SIG states, and the national cross-site evaluation team. This web board has been operational since December 2004 with more than 135 messages posted as of February 2, 2006. In addition to the web board, the national cross-site evaluation team also sends electronic copies of the guidance and resource materials via email and CD to SPF SIG states upon request. Each state’s data from the state-level interviews as well as the Community-level Instrument (parts I and II) will be made available to that state via the web for online analysis as well as downloading for offline analysis.
A4. Efforts to Identify Duplication
The proposed data collection is unique because the information is specific to the evaluation of the SPF SIG program and is not available elsewhere.
A5. Involvement of Small Entities
Data will not be collected from small business entities.
A6. Consequences If Information Collected Less Frequently
This request is for approval to collect data from state and community-level stakeholders using the SPF Community-level Instrument (parts I and II). The survey will be administered twice per year, over the course of three years, to each state and community that receives SPF funding. Experience from the SIG project, as well as discussions with state-level evaluators for that project, has shown that it is necessary to gather this information at least twice per year. Community-level activities change frequently within a year, and staff turnover at the community level is common. Thus, to ensure the collection of valid and reliable data, data collection needs to occur twice per year. In addition, data from multiple time periods within a year are essential for monitoring the progress of states and communities as they implement the SPF, and for identifying communities that are experiencing obstacles to implementation. Without data from multiple time periods during the program, it will be impossible to determine whether implementation progress is related to changes in NOMs outcomes.
A7. Consistency With Guidelines in 5 CFR 1320.5(d)(2)
This information collection fully complies with 5 CFR 1320.5(d)(2).
A8. Consultation Outside the Agency
The notice required in 5 CFR 1320.8(d) was published in the Federal Register on Tuesday January 10, 2006 (Volume 71, Number 6, pages 1545-1548). A copy of the published Federal Register Notice can be found in Appendix O. No comments were received.
The current evaluation design, data analysis plan, and Community-level Instrument (parts I and II) received several rounds of review. These reviews were the result of ongoing collaboration with two SPF SIG advisory groups and with state-level evaluators and program directors.
Consultation with Internal and External Advisory Groups
Members of the SPF SIG External Technical Advisory Group (ETAG) reviewed the national cross-site evaluation design, analysis plan, and Community-level Instrument (parts I and II). The ETAG includes SPF SIG project directors and evaluators; evaluation and prevention experts; a representative from the National Institute on Drug Abuse (NIDA); and three SAMHSA staff not directly involved in the evaluation. Each ETAG member was carefully selected to ensure representation from the following: federal and state government staff; local providers; representatives of the national prevention network system (CADCA); and members versed in specialized areas such as cultural competence, environmental strategies, fidelity and adaptation, and evaluation design and data analysis. Their feedback was incorporated into working and final drafts of the evaluation design, data analysis plan, and Community-level Instrument (parts I and II). These reviewers' names, titles, organizational affiliations, and current telephone numbers are provided in Appendix D.
The national cross-site evaluation team also seeks regular consultation with the SPF SIG Internal Workgroup. This group meets on a monthly basis at CSAP and consists primarily of CSAP and NIDA staff but also includes two SAMHSA staff outside of CSAP. As with the External Technical Advisory Group, the Internal Work Group provided feedback on the evaluation design and data analysis plan which was incorporated in working and final drafts. A list of the members of the Internal Work Group can be found in Appendix E.
Consultation with Respondents
The SPF SIG national cross-site evaluation team was responsible for developing and pilot testing the Community-level Instrument (parts I and II). The team consulted respondents frequently during the development, refinement, and pilot testing of the survey.
In the development of the Community-level Instrument (parts I and II), key prevention stakeholders, including state SPF SIG project directors, evaluators, and other key SPF SIG staff, were consulted. They provided feedback on the content and format of the survey's domains, indicators, and measures to ensure that these had face validity and were not too burdensome for respondents. In addition, all SPF SIG states were given the opportunity to review the instrument and provide comments and questions on its content and format.
The Community-level Instrument (parts I and II) was pilot tested in four states in January 2006. The individuals who participated in the pilot test represented the following types of organizations: mental health services, juvenile justice program services, substance abuse prevention services, youth-focused community organizations, and coalitions. Minor changes were made to the instrument as a result of the pilot testing; these are discussed in B4. Participants were also consulted on the amount of time required to complete the survey and the associated burden; these estimates are discussed in A12.
A9. Payment to Respondents
There is no payment to respondents.
A10. Assurance of Confidentiality
All information gathered through the administration of the Community-level Instrument (parts I and II) concerns organizational activities undertaken as part of the SPF SIG program rather than information about individuals. However, all respondents will be required to register with the online survey site where the survey is completed. As part of this registration, it will be necessary to obtain identifying information about these individuals (i.e., name, email address, organizational affiliation, and title/position). This information will be used to create a user profile, and every attempt will be made to keep it confidential. After participants have registered with the website, they will be provided with a UserID and temporary password to ensure that all of their survey responses remain confidential. Additionally, no survey responses will be attributed to a specific individual in any reports prepared from these data.
Community-level Instrument (parts I and II) participants will also be provided with the following information prior to completing the survey: the purpose of the survey; how the results will be used; the fact that participation is voluntary; that they may refuse to answer any question at any time or end the survey at any time; that responses will be kept confidential to the extent possible; that individual names and positions will not be connected with any responses in any reports prepared from the data; and that all individual responses will be combined with the responses of others in all reports prepared from the data.
A11. Questions of a Sensitive Nature
No questions of a sensitive nature will be collected.
A12. Estimates of Annualized Hour Burden
Annualized reporting burden for the Community-level Instrument (parts I and II) is shown in Table 1. These burden estimates are based on pilot respondents' feedback as well as the experience of the survey developers. Burden estimates are provided for each of the three years of data collection and for each section of the survey. The burden differs across years and by survey section because some survey sections will not need to be completed every year or reporting period. Additionally, an individual community's burden may be lower than shown in Table 1 because not every section of the Community-level Instrument (parts I and II) applies to every community.
It is estimated that 390 communities (approximately 15 communities per state) will receive SPF funds from their respective states. The directors of all community-based organizations that receive SPF funds will be required to complete both parts of this survey; their time is valued at an estimated $32 per hour. State project directors will also be required to review the responses of their community partners and to complete one section of the survey in year one; their time is valued at an estimated $42 per hour.
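Each row of Table 1 below follows the same arithmetic: total burden equals respondents times responses per respondent times burden per response, and total cost equals total burden times the hourly wage. A minimal Python sketch reproducing a few Year 1 rows; the tooling is illustrative only, not part of the evaluation's systems:

```python
# Recompute sample Year 1 rows of Table 1.
# (respondents, responses per respondent, hours per response, hourly wage)
rows = {
    "Part I, 1-11 State Responses":                  (26, 1, 0.08, 42.00),
    "Part I, 34-66 Needs and Resources Assessments": (390, 2, 0.50, 32.00),
    "Part II, 1-40; 45 Intervention Specific Info":  (390, 3, 1.00, 32.00),
}

for section, (n, k, hours, wage) in rows.items():
    total_hours = n * k * hours        # annualized burden hours
    total_cost = total_hours * wage    # annualized hour cost
    print(f"{section}: {total_hours:,.2f} hours, ${total_cost:,.2f}")
```

Running this reproduces the corresponding Table 1 entries (2.08 hours/$87.36, 390.00 hours/$12,480.00, and 1,170.00 hours/$37,440.00).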
A13. Estimates of Annualized Cost Burden to Respondents
There are no capital/startup costs or operational/maintenance of services costs associated with this project.
A14. Estimates of Annualized Cost to the Government
The costs of the national cross-site evaluation of the SPF SIG project, which is responsible for gathering, processing, analyzing, and reporting the data, serve as the basis for the estimated costs of these activities. The National Institute on Drug Abuse is providing the funding for all of these activities. The estimated annual cost of the national cross-site evaluation is $1,708,915. In addition, 50 percent of a GS-14 CSAP project officer's time is devoted to the project; at an annual salary of approximately $103,594, this amounts to $51,797 per year. Thus, the total annual cost associated with the evaluation is $1,760,712.
A15. Changes in Burden
This is a new project.
Table 1. Estimates of Annualized Hour Burden for Community-level Instrument

Year 1

| Community-level Instrument Section/Domain | Number of Respondents | Responses per Respondent | Burden per Response (hours) | Total Burden (hours) | Hourly Wage Cost | Total Hour Cost |
|---|---|---|---|---|---|---|
| Part I, 1-11 State Responses | 26 | 1 | 0.08 | 2.08 | $42.00 | $87.36 |
| Part I, 12-20 Contact Information and Reporting Period | 390 | 1 | 0.08 | 31.20 | $32.00 | $998.40 |
| Part I, 21-26 Organization Type and Funding | 390 | 1 | 0.08 | 31.20 | $32.00 | $998.40 |
| Part I, 27-33 Cultural Competence, Sustainability, and Framework Progress | 390 | 2 | 0.17 | 132.60 | $32.00 | $4,243.20 |
| Part I, 34-66 Needs and Resources Assessments | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Part I, 67-159 Capacity Building Activities | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Part I, 160-178 Strategic Plan Development | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Part I, 198-216 Systems and Contextual Factors and Closing Questions | 390 | 2 | 1.00 | 780.00 | $32.00 | $24,960.00 |
| Part I, subform 217-231 Coalition Organizational Information | 390 | 1 | 0.17 | 66.30 | $32.00 | $2,121.60 |
| Part II, 1-40; 45 Intervention Specific Information and Adaptations | 390 | 3 | 1.00 | 1,170.00 | $32.00 | $37,440.00 |
| Review of past responses | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Preparation and gathering of supporting materials | 390 | 2 | 2.00 | 1,560.00 | $32.00 | $49,920.00 |
| State Review of Community Responses | 26 | 2 | 1.00 | 52.00 | $42.00 | $2,184.00 |
| Total Year 1 Burden - State-level | | | | 54.08 | | $2,271.36 |
| Total Year 1 Burden - Community-level | | | | 5,331 | | $170,601.60 |

Year 2

| Community-level Instrument Section/Domain | Number of Respondents | Responses per Respondent | Burden per Response (hours) | Total Burden (hours) | Hourly Wage Cost | Total Hour Cost |
|---|---|---|---|---|---|---|
| Part I, 27-33 Cultural Competence, Sustainability, and Framework Progress | 390 | 2 | 0.17 | 132.60 | $32.00 | $4,243.20 |
| Part I, 67-159 Capacity Building Activities | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Part I, 160-178 Strategic Plan Development | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Part I, 179-184 Intervention Implementation | 390 | 2 | 0.17 | 132.60 | $32.00 | $4,243.20 |
| Part I, 198-216 Systems and Contextual Factors and Closing Questions | 390 | 2 | 1.00 | 780.00 | $32.00 | $24,960.00 |
| Part II, 1-40; 45 Intervention Specific Information and Adaptations | 390 | 3 | 1.00 | 1,170.00 | $32.00 | $37,440.00 |
| Part II, 41-44 Intervention Outcomes | 390 | 6 | 0.17 | 397.80 | $32.00 | $12,729.60 |
| Part II, subforms Intervention Component Information | 390 | 6 | 1.00 | 2,340.00 | $32.00 | $74,880.00 |
| Review of past responses | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Preparation and gathering of supporting materials | 390 | 2 | 2.00 | 1,560.00 | $32.00 | $49,920.00 |
| State Review of Community Responses | 26 | 2 | 1.00 | 52.00 | $42.00 | $2,184.00 |
| Total Year 2 Burden - State-level | | | | 52.00 | | $2,184.00 |
| Total Year 2 Burden - Community-level | | | | 7,683 | | $245,856.00 |

Year 3

| Community-level Instrument Section/Domain | Number of Respondents | Responses per Respondent | Burden per Response (hours) | Total Burden (hours) | Hourly Wage Cost | Total Hour Cost |
|---|---|---|---|---|---|---|
| Part I, 27-33 Cultural Competence, Sustainability, and Framework Progress | 390 | 2 | 0.17 | 132.60 | $32.00 | $4,243.20 |
| Part I, 67-159 Capacity Building Activities | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Part I, 179-184 Intervention Implementation | 390 | 2 | 0.17 | 132.60 | $32.00 | $4,243.20 |
| Part I, 185-197 Monitoring and Evaluation | 390 | 2 | 0.33 | 257.40 | $32.00 | $8,236.80 |
| Part I, 198-216 Systems and Contextual Factors and Closing Questions | 390 | 2 | 1.00 | 780.00 | $32.00 | $24,960.00 |
| Part II, 1-40; 45 Intervention Specific Information and Adaptations | 390 | 3 | 1.00 | 1,170.00 | $32.00 | $37,440.00 |
| Part II, 41-44 Intervention Outcomes | 390 | 6 | 0.17 | 397.80 | $32.00 | $12,729.60 |
| Part II, subforms Intervention Component Information | 390 | 6 | 1.00 | 2,340.00 | $32.00 | $74,880.00 |
| Review of past responses | 390 | 2 | 0.50 | 390.00 | $32.00 | $12,480.00 |
| Preparation and gathering of supporting materials | 390 | 2 | 2.00 | 1,560.00 | $32.00 | $49,920.00 |
| State Review of Community Responses | 26 | 2 | 1.00 | 52.00 | $42.00 | $2,184.00 |
| Total Year 3 Burden - State-level | | | | 52.00 | | $2,184.00 |
| Total Year 3 Burden - Community-level | | | | 7,550¹ | | $241,612.80 |
A16. Time Schedule, Publication, and Analysis Plan
Time Schedule
Table 2 shows the time schedule for the national cross-site evaluation of the SPF SIG initiative. As indicated in Table 2, data collection with the Community-level Instrument (parts I and II) is scheduled to begin in December 2006, following OMB approval, and end in July 2009. Both parts of the survey will be administered twice per year (every six months) over the course of three years. Thus, OMB clearance for these instruments is requested for three years.
Evaluation reports that include results of preliminary analyses conducted using data from these instruments will be produced every year in December. The first report is scheduled to be delivered in December 2006. A comprehensive final report for the SPF SIG will be delivered in December 2009.
Table 2. SPF SIG National Cross-site Evaluation Time Schedule

| Activity | Date |
|---|---|
| Obtain OMB approval for state-level interview instruments | September 2006 |
| Obtain OMB approval for Community-level Instrument (parts I and II) | December 2006 |
| Collect state-level data (annually): State Infrastructure interviews; SPF Implementation interviews | Ongoing for three years: September 2006 (following OMB approval) – July 2009 |
| Collect community partner survey data (semi-annually) | Ongoing for three years: December 2006 (following OMB approval) – July 2009 |
| Obtain epidemiological and outcome data | Ongoing for three years |
| Analyze evaluation data to assess relationship between interview/survey data and outcomes | Annual interim analyses (2006-2008); comprehensive final analyses (2009) |
| Create data files for secondary analysis | December 2006 – December 2008 |
| Produce bi-monthly reports | December 2006 – September 2009 |
| Produce annual evaluation reports | December 2006 – December 2008 |
| Produce final evaluation report | December 2009 |
Logic Model of SPF SIG Impact
The national cross-site evaluation team has developed a logic model of SPF SIG impact to help guide the evaluation design and requirements. This logic model depicts the flow of state- and community-level activities that lead to systems change, and epidemiological outcomes within the broader context where prevention programs operate. The model is depicted in Figure 1.
State activities are represented in Figure 1 in rectangles, and community activities are represented in ovals (the multiple ovals represent multiple communities within states). The logic model begins with SPF funding being received by selected states and territories. After receipt of funds, states and territories begin the planning and implementation of the SPF. The implementation of the SPF is expected to lead to both state-level systems change and funding of selected communities. Funding of selected communities is expected to lead to planning and implementation of the SPF at the community-level and community-level system change. Systems change at both the state and community levels is expected to lead to changes in epidemiological outcomes.
The arrow connecting planning and implementation (both at the state and community levels) to systems change is bidirectional, indicating that both influence each other. Planning and implementation lead to systems change, and systems change leads to further refinement and efficiency of planning and implementation.
To determine whether cross-site variation in outcomes is attributable to SPF SIG funding, the logic model also includes baseline status and contextual change and unmeasured factors for both states and communities. Baseline status refers to pre-SPF SIG activities and achievements in the areas the SPF SIG addresses. Contextual change and unmeasured factors refer to anything unrelated to the SPF SIG project that occurs in states and communities and potentially affects epidemiological outcomes.
The two-part Community-level Instrument, which is the focus of this request, will be used to gather data directly related to the highlighted ovals in Figure 1.
Research Questions
Six impact research questions will guide the SPF SIG outcome evaluation. A detailed description of the national cross-site evaluation, as well as a discussion of these questions, can be found in the National Cross-site Evaluation Design (Appendix F). These six impact research questions assess whether observed conditions/events can be attributed to SPF SIG programmatic interventions. The six questions are:
1a. Did SPF funding improve statewide performance on NOMs and other outcomes?
1b. What accounted for variation in NOMs and other outcomes performance across SPF states?
2a. Within states, did SPF funding lead to community-level improvement on NOMs and other outcomes?
2b. Within states, what accounted for variation in NOMs and other outcomes performance across funded communities?
3a. Across states, did SPF funding lead to community-level improvement on NOMs and other outcomes?
3b. Across states, what accounted for variation in NOMs and other outcomes performance across funded communities?
In addition to these six impact research questions, which are the central focus of the SPF SIG evaluation, the evaluation design also includes process-related research questions. These provide information necessary for interpreting the outcomes found in the evaluation and focus on: interpreting the effects of project-related activities; identifying effective program and policy elements (e.g., conditions necessary for effective programs, populations for whom programs are effective); and assessing contextual factors related to SPF SIG outcomes. Examples of process-related research questions included in the design are: What changes in the allocation of funds and other resources for substance abuse prevention occurred at the state and community levels? What state- and community-level mobilization and capacity-building activities have been implemented? Has cultural competence been integrated into prevention programs, policies, and practices in states? To what extent has the prevention infrastructure improved? To what extent are selected programs evidence-based? And to what extent are selected programs implemented with fidelity?
Analysis Plan
The two state-level instruments and the Community-level Instrument (parts I and II) will be used to gather data related to research questions 1b, 2b, and 3b, each of which addresses the impact of the SPF SIG initiative on NOMs and other outcome measures, both system- and population-level. Specifically, the three questions address the moderators and mediators of outcome variation across SPF-funded states, communities within funded states, and communities across funded states, respectively. Data from the Community-level Instrument (parts I and II) in particular will be used to identify similarities and differences in the way the SPF SIG is being implemented across communities and states. This in turn will support generalized inferences about the effects of different types of community approaches.
Data reduction, scoring and scaling
As described earlier, the Community-level Instrument (parts I and II) was developed using input from program staff in the states who are implementing the SPF initiative and policymakers who designed it. Our use of data from this instrument in outcome and process analyses will focus more on the scales and indexes that will be derived from each of the sections in the instrument than on a community’s or state’s responses to any individual item. We therefore refer to sections rather than individual items when indicating the relationships between evaluation questions and items in the Community-level Instrument (parts I and II). Appendix G provides a list of items associated with each survey section and logic model component. Table 3 shows the survey sections associated with each of the three impact research questions and logic model component.
The first phase of the analysis will consist of review, coding, scoring, and scaling of responses within each survey section, with the goal of reducing the data to a set of reliable scales for use in subsequent analyses. For each section, summary scores or indexes will be developed that go beyond the limited response codes contained in the instrument to encompass the range of responses. Further development of empirically based anchors for scales, and of additional summary scores for sections, will be based on analysis of the first wave of surveys using standard scale development procedures. Although considerable revision and winnowing of questions within sections has already taken place based on the pilot test, it is expected that some items in each section will yield more useful information for coding, and some may show insufficient variation to be retained in final versions of the summary scores. Attention will be given to developing reliable and valid measures of the constructs in each survey section, including assessment of inter-coder reliabilities and of the relationships both among items within potential summary scores and between sections.
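As one concrete instance of the standard scale development procedures referenced above, the internal consistency of a candidate summary score is often checked with Cronbach's alpha. A minimal sketch with hypothetical item responses (an illustration only; the evaluation does not prescribe this particular tool):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of a scale; items is (n_respondents, n_items)."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of summed scale
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical responses to three capacity-building items (0-4 codes)
responses = np.array([[3, 4, 3], [2, 2, 1], [4, 4, 4], [1, 2, 2], [3, 3, 4]])
print(round(cronbach_alpha(responses), 2))
```

Items whose removal raises alpha, or that show little variation across respondents, would be candidates for the winnowing described above.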
Descriptive/normative analyses
Although the primary focus of the national cross-site evaluation is on assessing impact, many descriptive and normative analyses will occur first. The scales and indexes from the state-level instruments and Community-level Instrument (parts I and II) will support these analyses, in tandem with coded data from archival sources such as grant applications, quarterly reports and strategic plans. We will use standard techniques for analyzing, displaying, and reporting descriptive and normative results as they become available throughout the evaluation period. These will include summary statistics (means, medians, ranges, and standard deviations) and univariate and multivariate frequency distributions (including cross-classification displays), as well as appropriate charts and graphs. Subsequently, the scales and indexes developed in the initial phases of analysis will also support the impact questions as key predictors of systems- and population-level outcomes.
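As a sketch of the kind of descriptive output described above, the following uses pandas to produce univariate summary statistics and a cross-classification display; the variable names and values are hypothetical:

```python
import pandas as pd

# Hypothetical community-level scale scores from the data-reduction phase
df = pd.DataFrame({
    "state": ["A", "A", "B", "B", "B", "C"],
    "capacity_index": [2.1, 3.4, 1.8, 2.9, 3.7, 2.5],
    "spf_step_reached": [3, 4, 2, 3, 5, 3],
})

# Univariate summary statistics (mean, median, range, standard deviation)
print(df["capacity_index"].describe())

# Cross-classification display: states by furthest SPF step reached
print(pd.crosstab(df["state"], df["spf_step_reached"]))
```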
Inferential (cause and effect) analyses
The data gathered will be used to conduct a variety of analyses related to the six impact evaluation questions as well as the process-related research questions. The state-level instruments and Community-level Instrument (parts I and II) will be used to address questions 1b, 2b, and 3b, as noted above. As part of these analyses, the distributional characteristics of the data, as well as baseline differences among the groups being compared, will be assessed. Then, within-state and cross-state outcome analyses will be conducted using multilevel statistical modeling methods that account for the "nested" nature of the data (the data are not independent; they are nested within communities and within states). To estimate the effects of the SPF, trends in repeated cross-sectional measurements of population outcomes at the state and community levels will be modeled. Additionally, propensity scores will be used to reduce potential bias from nonequivalence between funded and non-funded communities, or between groups of communities implementing different types of interventions. (See Chapter 7 of the Evaluation Plan, Appendix F, for details on the statistical models to be used for each of the six impact questions.)
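Propensity scores of the kind mentioned above are commonly estimated by modeling funding status as a function of baseline covariates. A minimal logistic-regression sketch; the covariates are hypothetical and this is not the evaluation's specified model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical baseline covariates per community:
# column 0 = baseline 30-day use rate, column 1 = poverty rate
X = np.array([[0.22, 0.12], [0.31, 0.09], [0.18, 0.15],
              [0.27, 0.08], [0.25, 0.11], [0.20, 0.14]])
funded = np.array([1, 1, 0, 0, 1, 0])  # 1 = SPF-funded community

model = LogisticRegression().fit(X, funded)
propensity = model.predict_proba(X)[:, 1]  # estimated Pr(funded | covariates)
print(propensity.round(2))
```

The estimated probabilities can then be used for matching, stratification, or weighting when comparing funded and non-funded communities.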
Table 3. SPF SIG Community-level Instrument Section Index by Research Question, Moderator/Mediator, and Logic Model Component

In the Survey Section column, "Part I sections" denotes: State Questions; SPF-Needs and Resource Assessment; SPF-Capacity Building; SPF-Strategic Plan Development; SPF-Intervention Implementation; SPF-Intervention Level Outcome Evaluation; Systems Factors; Contextual Factors; and Coalition Subform. "Part II sections" denotes: Intervention Strategies (Prevention Education; Alternative Drug-free Activities; Problem Identification and Referral; Community-based Processes; Environmental Strategies; Information Dissemination; Other Strategies).

| Research Question | Moderator/Mediator | Logic Model Component | Data Source | Survey Section |
|---|---|---|---|---|
| 1b. What accounted for variation in NOMs and other outcomes performance across SPF states? | Aggregate score (state-level) on community-level implementation fidelity, cultural competence | Community-level Planning and Implementation | Community-level Instrument (parts I and II) | Part I sections; Part II sections |
| | | Community-level Systems Change | Community-level Instrument (part I) | Part I sections |
| 2b. Within SPF states, what accounted for variation in NOMs and other outcomes performance across funded communities? | Community-level baseline characteristics | Community-level Baseline Status | Community-level Instrument (part I) | Part I sections |
| | Community-level implementation fidelity, cultural competence | Community-level Planning and Implementation | Community-level Instrument (parts I and II) | Part I sections; Part II sections |
| | | Community-level Systems Change | Community-level Instrument (part I) | Part I sections |
| | Community-level post-baseline contextual change | Community-level Contextual Change and Unmeasured Factors | Community-level Instrument (part I) | Part I sections |
| | Pre-intervention planning and choice of intervention(s) at the community level | Community-level Planning and Implementation | Community-level Instrument (parts I and II) | Part I sections; Part II sections; target population characteristics² |
| | | Community-level Systems Change | Community-level Instrument (part I) | Part I sections |
| 3b. Across SPF states, what accounted for variation in NOMs and other outcomes performance across funded communities? | Community-level baseline characteristics | Community-level Baseline Status | Community-level Instrument (parts I and II) | Part I sections; Part II sections; target population characteristics² |
| | Aggregate score (state-level) on community-level implementation fidelity, cultural competence | Community-level Planning and Implementation | Community-level Instrument (part I) | Part I sections |
| | | Community-level Systems Change | Community-level Instrument (part I) | Part I sections |
| | Community-level post-baseline contextual change | Community-level Contextual Change and Unmeasured Factors | Community-level Instrument (part I) | Part I sections |
| | Pre-intervention planning and choice of intervention(s) at the community level | Community-level Planning and Implementation | Community-level Instrument (parts I and II) | Part I sections; Part II sections; target population characteristics² |
| | | Community-level Systems Change | Community-level Instrument (part I) | Part I sections |
| | Community-level implementation fidelity, cultural competence | Community-level Planning and Implementation | Community-level Instrument (parts I and II) | Part I sections; Part II sections; target population characteristics² |
| | | Community-level Systems Change | Community-level Instrument (part I) | Part I sections |
One system-level outcome of interest will be changes in prevention infrastructure over time. Data from the state interviews and Community-level Instrument (part I) will be used to measure state systems infrastructure. This includes changes in planning capacity, training capacity, and support for the implementation of evidence-based practices. Thus, data from these instruments will serve as outcome data for state systems change and as mediators of changes in consumption and consequences population outcomes, including the NOMs. To support analyses that explain outcome variation among the SPF SIG states, a global index of state prevention infrastructure will be developed using data from the state interviews and Community-level Instrument (part I). This index will enable us to categorize the prevention infrastructure of states as “highly developed,” “moderately developed,” or “less well developed” over the course of SPF implementation. The state prevention infrastructure index will also be used in analyses to measure changes from year to year among the SPF SIG states.
The construct of prevention infrastructure is, however, too complex to be captured by a single summary statistic. In addition to the global index, therefore, indexes will also be developed based on specific infrastructure domains (planning, workforce development, etc). Analyses of these indexes will help show whether some domains appear more critical to outcomes than others. Other analyses will focus on the relationship between SPF implementation and observed variation in outcomes across states.
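To make the categorization described above concrete, here is a sketch of how a global index might be binned into the three report categories; the 0-1 scale and the cut points are hypothetical, chosen for illustration only:

```python
def infrastructure_category(index_score: float) -> str:
    """Bin a global prevention-infrastructure index into the three report
    categories. The 0-1 scale and the cut points are hypothetical."""
    if index_score >= 0.67:
        return "highly developed"
    if index_score >= 0.33:
        return "moderately developed"
    return "less well developed"

print(infrastructure_category(0.71))  # highly developed
print(infrastructure_category(0.40))  # moderately developed
```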
Tables 4 and 5 illustrate two sample table shells. Table 4, part of the descriptive analysis, would show the frequency distribution of communities' achieved implementation level for each of the five SPF steps. The achieved implementation levels will be derived primarily from the Community-level Instrument, supplemented by archival sources such as quarterly reports and strategic plans. Table 5, part of the inferential analysis, would show the association between the implementation level for one SPF step and selected outcomes, corrected for baseline differences and other potential confounders. The measure of association used is the gamma coefficient³. There are, of course, many other ways results could be presented, so these tables should be viewed only as examples.
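For reference, the gamma coefficient described in footnote 3 is computed from the number of concordant pairs C and discordant pairs D among all pairs of observations:

```latex
\gamma = \frac{C - D}{C + D}
```

A value of +1 indicates perfect positive ordinal association, -1 perfect negative association, and 0 no association.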
Table 4. Sample Table Shell: Cross-Community Frequency Distribution of Implementation Level by SPF Step

| SPF Step | Implementation Level 1 | Implementation Level 2 | Implementation Level 3 | Implementation Level 4 |
|---|---|---|---|---|
| Step 1: Conduct a needs assessment | N (%) | N (%) | N (%) | N (%) |
| Step 2: Mobilize and build community capacity to address needs | N (%) | N (%) | N (%) | N (%) |
| Step 3: Develop a strategic plan for prevention | N (%) | N (%) | N (%) | N (%) |
| Step 4: Implement evidence-based prevention practices to meet community needs | N (%) | N (%) | N (%) | N (%) |
| Step 5: Monitor and evaluate the implementation of the project | N (%) | N (%) | N (%) | N (%) |

Implementation levels: 1 = not or minimally implemented; 2 = partially implemented, significant shortcomings; 3 = partially implemented, minor shortcomings; 4 = fully implemented.
Table 5. Sample Table Shell: Relationship of SPF SIG Step 2 ("Mobilize and build state and community capacity to address needs") Implementation Level with Abstinence NOMs (Youth)*

| Abstinence NOM (youth) | Implementation Level 1 | Implementation Level 2 | Implementation Level 3 | Implementation Level 4 | Gamma (CI) |
|---|---|---|---|---|---|
| 30-day use | Mean (N) | Mean (N) | Mean (N) | Mean (N) | ±0.xx |
| Age of first use | Mean (N) | Mean (N) | Mean (N) | Mean (N) | ±0.xx |
| Perception of Disapproval/Attitude | Mean (N) | Mean (N) | Mean (N) | Mean (N) | ±0.xx |
| Perceived Risk/Harm of Use | Mean (N) | Mean (N) | Mean (N) | Mean (N) | ±0.xx |

* All associations adjusted for baseline differences between states.
Community-level analyses conducted with the data gathered from the Community-level Instrument (parts I and II) will aim to identify the characteristics of community-level interventions that are most effective in producing desired outcomes. These analyses will focus on: (1) comparisons of community-level outcomes from funded communities across multiple states with outcomes from unfunded communities where comparable data are available, or with state and national data; and (2) comparisons of outcomes across the funded communities, exploring the relationships among different types of community approaches, target populations, levels of implementation and fidelity, and aggregated outcomes. Systems-level outcomes to be included in these analyses include changes in the number and operation of coalitions as assessed by the Community-level Instrument (parts I and II). Population outcomes will focus on changes in consumption and consequences NOMs and other outcomes over time.
Statistical modeling will be performed using Hierarchical Linear Modeling (HLM) Version 6 (Raudenbush et al., 2004), which accommodates hierarchical data structures with up to three levels of random variation. In our case, the three levels will be state, community, and time (repeated measurements). HLM also accommodates sampling weights in both linear and nonlinear models. This is relevant to our analysis because (1) most of the NOMs and other outcomes will not meet normality assumptions and will therefore require nonlinear models, and (2) states will contribute unequal numbers of communities and population sizes to the cross-site database; inverse weighting for these inequalities at the appropriate level will increase the generalizability of the findings. Note that the state-level instruments will support analyses of variation at level 3, the Community-level Instrument will support analyses of variation at level 2, and both will support analyses of variation at level 1 through repeated administrations over time.
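The analyses themselves will be run in HLM 6. For readers who wish to prototype an analogous nested specification, the sketch below uses Python's statsmodels with random intercepts for states and for communities nested within states; the file and variable names are hypothetical, and this is not the evaluation's actual model:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format file: one row per community per reporting wave,
# with columns state, community_id, wave, spf_funded, nom_outcome
df = pd.read_csv("community_waves.csv")

model = smf.mixedlm(
    "nom_outcome ~ wave * spf_funded",   # fixed effects: time trend by funding
    data=df,
    groups="state",                      # level-3 units (states)
    re_formula="1",                      # random intercept for each state
    vc_formula={"community": "0 + C(community_id)"},  # communities nested in states
)
result = model.fit()
print(result.summary())
```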
Westat will provide CSAP with the reports necessary to determine, in consultation with the relevant SAMHSA and NIDA staff, whether the overall quality and quantity of the evaluation data are adequate for public release. Once it is determined that the data will be released, Westat will perform a disclosure analysis to detect both direct and indirect identifiers within the data, as well as the most likely sources of a possible breach of confidentiality. Based on the standards published by the Standing Review Committee for Disclosure Analysis at the Inter-University Consortium for Political and Social Research (ICPSR), Westat will recommend a plan for each detected identifier. Once the disclosure plan is approved by CSAP, Westat will produce a public use data file in compliance with ICPSR recommendations for public use data. Data will also be made available to the prevention community through the Data Coordination and Consolidation Center (DCCC).
A17. Display of Expiration Date
The expiration date for OMB approval will be displayed.
A18. Exceptions to Certification Statement
This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions. The certifications are included in this clearance package.
PART B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
B1. Respondent Universe and Sampling Methods
This request is for a new data gathering instrument, the Community-level Instrument (parts I and II). Completion of this survey will be required of all active community partners (communities that receive SPF SIG funds from their states), estimated to number 390, and of the 26 SPF SIG grantees (states that receive SPF SIG funds). The information gathered will be used by CSAP to monitor community partners and state grantees, and will serve as an important data source for the national cross-site evaluation. The estimated response rate is approximately 100 percent, as completion of the survey will be required of all active community partners and SPF SIG states.
B2. Information Collection Procedures
SPF SIG Project Directors in each state and territory will be contacted by email by the National Cross-Site Evaluation Team. This email will request that the Project Director provide the name of a state-level administrator who will be responsible for reviewing and approving completed Community-level Instruments (parts I and II). The text of this email can be found in Appendix H. A form to be completed by the state-level administrator will be attached to this email (see Appendix I). This form requests contact information and will be used to create a user profile in the web-based system. Once the form has been completed and submitted to the National Cross-Site Evaluation Team, the state-level administrator will be provided with a Username and Password and granted access to the Community-level Instrument (parts I and II) website. The text of the email that will be sent to state-level administrators can be found in Appendix J.
Once a state-level administrator has been granted access to the Community-level Instrument (parts I and II) website, he or she will be required to register the community agencies that have been awarded SPF funds in the state (i.e., community partners). All community partners that receive SPF funding from their state must be registered in the web-based system. Each state-level administrator will complete a separate form for each community partner (see Appendix K). Upon submission of the forms to the National Cross-Site Evaluation Team, each community partner will be given a user profile in the web-based system and granted access to the Community-level Instrument (parts I and II) website. See Appendix L for the text of the email that will notify community partners that they have been registered.
The Community-level Instrument (parts I and II) is to be completed every six months by all community partners. Additionally, state-level administrators will be required to review the information provided by the community partners and complete a brief set of nine questions. Community partners and state-level administrators must complete the instrument within 30 days after the end of a reporting period. Reminder emails requesting completion will be sent at the end of each reporting period and two weeks prior to the deadline (see Appendix M). Follow-up email reminders will be sent the day after the deadline to state-level administrators who have not submitted all of their community partners' surveys to the National Cross-Site Evaluation Team (see Appendix N). A second follow-up email will be sent two weeks after the deadline to state-level administrators and State Project Directors, notifying them of any outstanding surveys (see Appendix N).
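The reporting rules above imply a fixed set of dates per reporting period. A small sketch computing them, assuming the 30-day completion window described; the actual system's scheduling may differ:

```python
from datetime import date, timedelta

def reminder_schedule(period_end: date) -> dict:
    """Key dates for one reporting period under the 30-day completion window."""
    deadline = period_end + timedelta(days=30)
    return {
        "initial_reminder": period_end,                   # end of reporting period
        "pre_deadline_reminder": deadline - timedelta(days=14),
        "deadline": deadline,
        "first_follow_up": deadline + timedelta(days=1),  # unsubmitted surveys
        "second_follow_up": deadline + timedelta(days=14),
    }

print(reminder_schedule(date(2006, 12, 31)))
```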
B3. Methods to Maximize Response Rates
Because community partners will be required to complete the Community-level Instrument (Parts I and II) and states will be required to review and verify responses as a condition of award, the response rate should approach 100 percent. The follow-up procedures, described in the preceding section, further increase the likelihood that a very high percentage of community partners will respond. Given our experience with the State Incentive Grant cross-site evaluation that preceded this project, for which the sub-recipient response rate averaged 97 percent, we are confident that the response rate for this data collection will be between 95 and 98 percent.
B4. Tests of Procedures
Instrument Development
In the development of the Community-level Instruments (parts I and II), an extensive review of literature, program requirements, and evaluation frameworks was conducted to identify the appropriate concepts to measure. The following concepts were considered important to measure: community awareness of and openness to prevention efforts; relationship building, including coalition activities; organizational and community resources; sustainability; cultural competency; contextual factors; and systems and environmental factors.
State project directors, evaluators, and CSAP Federal Project Officers reviewed several versions of the Community-level Instrument (parts I and II). Their comments and suggestions on content and format were incorporated where appropriate. Additionally, the survey was rigorously tested to ensure an appropriate reading level and was pilot tested with community grantees.
Pilot Testing of Instruments
The Community-level Instrument (parts I and II) was pilot tested in January 2006. Nine volunteers from four states participated in the pilot test. Pilot test participants were recommended by the SPF SIG Project Director in their state or by their state's evaluator, and represented the following types of organizations: mental health services; juvenile justice program services; substance abuse prevention services; youth-focused community organizations; and coalitions.
Pilot test participants provided feedback on the amount of time required to complete each part of the Community-level Instrument, as well as comments on the content of the survey. Minor changes were made to both parts of the Community-level Instrument as a result of the pilot testers' feedback. These changes included: the addition of definitions for specific terms used throughout the survey; the inclusion of examples of concepts; clarification of who should answer specific questions (state-level administrator or community partner); the addition of response options; and the addition of instructions to the survey.
B5. Statistical Consultants
Several individuals from the External Technical Advisory Group provided consultation on the statistical aspects of the evaluation design, including:

Sandeep Kasat, Ph.D., Epidemiologist, Office of Substance Abuse, 11 State House Station, Augusta, ME 04333. Phone: 207-287-4372. Email: [email protected]

Wayne Harding, Ph.D., Social Science Research and Evaluation, Inc., 21-C Cambridge Street, Burlington, MA 01803. Phone: 781-270-6613. Email: [email protected]
The primary individuals responsible for the analytic tasks for the evaluation of the SPF SIG initiative are:
Robert Orwin, Ph.D., the National Cross-site Evaluation Project Director, Westat, (301) 251-2277.
Bob Flewelling, Ph.D., Pacific Institute for Research and Evaluation, (919) 265-2621.
Additionally, several national cross-site evaluation staff members have expertise in statistical approaches to data analysis and will also contribute to the analytic tasks, including:
Joseph Sonnefeld, M.A., Westat, (240) 214-2522.
Alan D. Stein-Seroussi, Ph.D., Pacific Institute for Research and Evaluation, (919) 967-8998.
¹ Total annualized burden for the Community-level Instrument is 6,908 hours.
² Target population characteristics include number, gender, age, race, and ethnicity.
³ Like the Pearson correlation coefficient, gamma ranges from -1 to +1, with zero indicating no relationship; unlike the Pearson correlation, it does not assume that either the independent or dependent variable is measured at the interval level. It is therefore appropriate for estimating associations between ordered variables.