BASE - Evaluation and Performance Mgmt Plan

Monitoring and Reporting for the Core State Violence and Injury Prevention Program Cooperative Agreement

Attachment D_063016_F


OMB: 0920-1120

Form Approved
OMB No. 0920-XXXX
Exp. Date xx/xx/20xx

CoreVIPP State Evaluation Plan Template

{FOCUS AREA} Evaluation Plan
{State PH Dept. Name}
{Injury & Violence Prevention Division/Unit Name}
{Date}

Public reporting burden of this collection of information is estimated to average 2 hours for awardees funded for the BASE component, 3 hours for awardees funded for 1 enhanced component, and 4 hours for awardees funded for 2 enhanced components. This estimate includes the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to CDC/ATSDR Information Collection Review Office, 1600 Clifton Road NE, MS D-74, Atlanta, Georgia 30333; ATTN: PRA (0920-XXXX).

INTRODUCTION
Evaluation Purpose and Goals
It is often helpful to start your evaluation plan with a summary of the overall purpose of your
evaluation as well as your evaluation goals. Specifying your evaluation goals helps to focus your
evaluation’s purpose and scope.

• What do you want to learn through your evaluation?
• How would you like to be able to use your evaluation findings?
• What impact do you hope your evaluation will have on your organization’s work?

Here are some examples of evaluation goals:

• Example 1 - Implementation/Process Evaluation Goal: Evaluate the implementation of our injury and violence prevention program’s [FOCUS AREA] activities for the purpose of continuously improving these activities.
• Example 2 - Outcome Evaluation Goal: Evaluate the impact of our injury and violence prevention program’s [FOCUS AREA] activities for the purpose of communicating the value of our work to our stakeholders and the public.

Evaluation Team
It is also important to determine up front who will be responsible for “doing the work.” It often
helps to think of those who will be working on the evaluation as an “evaluation team,” with one
person serving as the evaluation lead/coordinator. Evaluation team roles can include:
coordinating/leading the evaluation, developing the evaluation plan, collecting data, analyzing
data, interpreting findings, reporting findings back to stakeholders, etc.
o Who will be the Lead Evaluator for your evaluation? What will his or her responsibilities be?
o Who are the other team members who will be involved in the evaluation? What will their roles/responsibilities be in carrying out the evaluation?

NOTE: See Tables 2 & 4. You can list each evaluation team member’s role next to the
evaluation activity(ies) he or she is responsible for.
NOTE: For team members who may be new to evaluation, check out CDC’s EvaluAction tool (http://vetoviolence.cdc.gov/apps/evaluaction/), which provides an introduction to evaluation for injury and violence prevention practitioners.
Evaluation Standards
When planning and implementing your evaluation, it is important to keep sight of the
overarching standards and values that will guide your evaluation. These standards should be
infused into every stage of your evaluation and guide your evaluation activities. CDC’s Evaluation Framework specifies a number of standards which you may consider adopting/adapting to fit with the values of your own organization. CDC’s standards include making sure your evaluation:

• Serves the information needs of intended users
• Is realistic, prudent, diplomatic and frugal
• Is conducted legally, ethically and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results
• Reveals and conveys technically adequate information about the features that determine worth or merit of the program being evaluated


I. ENGAGE STAKEHOLDERS
Once you have determined your core evaluation team (those who will be doing the majority of
the “work” for the evaluation), it is important to engage stakeholders who will in some way have
an interest in or be impacted by your evaluation (e.g. state/local decision makers, leaders or
members of other divisions in your public health agency, leaders/staff from other state agencies,
leaders/staff from local health depts., staff who will be implementing the activities, members of
the public who will be participating in/served by the activities, etc.). Engaging stakeholders in
your evaluation can help build buy-in for your evaluation activities, ensure that your evaluation
questions and activities are in line with stakeholders’ needs, and increase the chances that the
evaluation process and findings will be useful and meaningful.
o Who are the key stakeholders for your topic area activities in your state?
o What role do they have in the evaluation?
o How do you plan to engage your stakeholders (e.g., evaluation team(s), coalition, advisory board)?
o When will you engage your stakeholders (e.g. evaluation planning stage, ongoing, during data collection, during data analysis, during dissemination of evaluation findings, etc.)?

Table 1. Stakeholder Assessment and Engagement Plan
(Columns: Stakeholder Organization | Stakeholder Category | Expertise/Perspective | Role in the Evaluation | How to Engage | When to Engage)

Example row 1:
- Stakeholder Organization: State DOT
- Stakeholder Category: State agency partner
- Expertise/Perspective: State Dept. of Transportation
- Role in the Evaluation: Help develop/review evaluation questions; help interpret evaluation findings; help disseminate evaluation findings
- How to Engage: Evaluation advisory board
- When to Engage: Early (evaluation planning stage), and ongoing

Example row 2:
- Stakeholder Organization: State HD Epidemiologist
- Stakeholder Category: Internal state health dept. partner
- Expertise/Perspective: Epidemiologist
- Role in the Evaluation: Help access and analyze morbidity and mortality outcome data
- How to Engage: Evaluation team
- When to Engage: Early (evaluation planning stage), and data collection/analysis stage

II. DESCRIBE THE PROGRAM
Need

• Why are your [FOCUS AREA] strategies needed (e.g. morbidity/mortality rates in your state)?

Population of Focus

• To whom and where will your [FOCUS AREA] activities be directed?

Logic Model
Logic models are a key part of an evaluation plan. They help to create a “path”
between your activities and the outcomes you are hoping to achieve. The sections
and questions below will help you get started in building your logic model. You can
also see some examples of logic models, templates for creating them, and further
guidance in developing them here:
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html

• Resources/Inputs
  o What resources are available for implementing your [FOCUS AREA] activities (e.g. staff, money, space, time, partnerships, etc.)?


• Activities
  o What activities are being undertaken (or planned) to achieve the outcomes you are looking for in [FOCUS AREA]?



• Outputs/Implementation Outcomes
  o How will you know that your activities are being successfully implemented?
  o What are some of the measurable or observable elements that can tell you about how well your [FOCUS AREA] activities are being implemented?
  o What products or direct achievements (i.e., materials, number of people served, number of staff trained, etc.) will be produced from the activities?

NOTE: In some cases, your proximal [FOCUS AREA] objectives from your Annual
Progress Report may be outputs/implementation outcomes.


• Intermediate/Proximal Outcomes
  o What are some of the more immediate outcomes you can expect from your [FOCUS AREA] activities that would begin to indicate success (e.g. passing of a graduated driver’s licensing law; increase in registered users of the state Prescription Drug Monitoring System)?
  o How will you measure these short term outcomes?
NOTE: In some cases, your proximal [FOCUS AREA] objectives from your Annual
Progress Report may be the intermediate outcomes of your activities.



• Long Term/Distal Outcomes
  o What impact do you hope to achieve with your [FOCUS AREA] activities?
  o How will you measure these long term outcomes?
NOTE: In many cases, your distal [FOCUS AREA] objectives from your Annual
Progress Report may be the long term outcomes of your activities.


III. FOCUS THE EVALUATION DESIGN
Stakeholder Needs
The first step in developing your evaluation design is determining your needs and those of your stakeholders. You may want to refer back to what you have written in the “Engage Stakeholders” section as you think through the answers to the questions below.

• Who will use the evaluation findings?
• How will the findings be used?
• What do you need to learn from the evaluation? What do your stakeholders need/want to learn?
• Who will be involved in developing evaluation questions? How will they be involved?

Evaluation Questions
Evaluation questions help to focus your evaluation on the key information you want to learn
about your [FOCUS AREA] activities. These are the questions that help you achieve your
evaluation goals. You may want to refer back to your evaluation purpose and goals as you
narrow down your specific evaluation questions.

• What do your evaluation team and stakeholders want to learn from the evaluation?
• What questions need to be asked in order to gather the information you want to know?

Here are some example evaluation questions:

• Example 1 - Process/Implementation Evaluation Question: Were our [FOCUS AREA] activities implemented successfully?
• Example 2 - Process/Implementation Evaluation Question: What can we do to improve the implementation of our [FOCUS AREA] activities?
• Example 3 - Outcome Evaluation Question: Did our [FOCUS AREA] activities result in desired behavior change (e.g. increased number of prescribers registered in the state Prescription Drug Monitoring Program)?
• Example 4 - Outcome Evaluation Question: Did our [FOCUS AREA] activities reduce injuries and death?

Resource Considerations

• What resources are available to conduct the evaluation?
• What data is available to you/are you already collecting?
• How will you prioritize your evaluation questions/activities given the resources you have available?


IV. GATHERING CREDIBLE EVIDENCE
Data Collection and Timeline
The kinds of data you will be gathering for your evaluation depend on your
evaluation questions, the type of evaluation you’ll be conducting (e.g.
process/implementation or outcome), and how you’ll be using your data (e.g. to
improve your program/activities; to demonstrate your activities’ impact).
Some examples of Process/Implementation data and data sources include:
• Number of trainings delivered (source: training logs)
• Number of sites implementing activities (source: MOUs)
• Number of individuals served (source: service logs)
• Staff attitudes/enthusiasm/confidence in their ability to deliver the activities (source: staff survey)
• Public perceptions/attitudes towards the activities (sources: participant survey; media/news coverage)
• Barriers/challenges encountered during implementation (source: Program Improvement Log; NOTE: see Table 3).

Some examples of Outcome data and sources include:
• Intermediate/Proximal Outcome: Percentage of teens using seatbelts (State Dept. of Transportation database)
• Intermediate/Proximal Outcome: Number of prescribers registered for the state Prescription Drug Monitoring System (State Prescription Drug Monitoring Program database)
• Intermediate/Proximal Outcome: Number of Question, Persuade, Refer suicide prevention program trainers trained (Training records from the Question, Persuade, Refer program)
• Long Term/Distal Outcome: Rate of emergency room visits due to prescription opioid poisoning (Emergency room data)
• Long Term/Distal Outcome: Rate of hospitalizations due to older adult falls (Hospital discharge data)
• Long Term/Distal Outcome: Rate of motor vehicle related deaths (Vital statistics)
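The distal outcome examples above are expressed as rates. As a minimal illustrative sketch of the underlying arithmetic (not part of the original template; the function name, event count, and population figure below are hypothetical), a crude rate per 100,000 population can be computed from an event count and the population at risk:

```python
# Minimal sketch of a crude rate calculation; all values are hypothetical placeholders.
def crude_rate_per_100k(event_count: int, population: int) -> float:
    """Crude rate = (events / population at risk) x 100,000."""
    return event_count / population * 100_000

# Hypothetical example: 450 emergency room visits for prescription opioid
# poisoning in a state with 3,000,000 residents.
print(f"{crude_rate_per_100k(450, 3_000_000):.1f} per 100,000")  # 15.0 per 100,000
```

Note that adjusted rates (for example, age-adjusted rates) require additional standardization beyond this simple calculation.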
Other considerations to keep in mind when preparing for data collection include:
• How often will data be collected?
• Who is responsible for collecting the data?
• How will you manage and store the data?

NOTE: You can use Table 2 to plan your data collection timeline and to organize your answers
to these questions.


Table 2. Data Collection and Analysis
(Columns: Evaluation Question | Data Source(s) | Data Collection, Analysis, and Reporting Activities | Position Responsible | Timeline/Due Date)

Example row:
- Evaluation Question: Were our [FOCUS AREA] activities implemented successfully?
- Data Source(s): Participation logs, service logs
- Data Collection, Analysis, and Reporting Activities:
  1. Data collection: get program participant logs from local partners (Position Responsible: Epidemiologist; Timeline/Due Date: April 2015)
  2. Data analysis: calculate total number of participants and compare to participation goals (Position Responsible: Epidemiologist; Timeline/Due Date: May 2015)
  3. Meet with local partners and share participation data; brainstorm ways to increase participation (Position Responsible: Epidemiologist; Timeline/Due Date: June 2015)

Collecting Data for Program Improvement
One of the main purposes of conducting an implementation or process evaluation is to gather data on how well (or not) your activities are being implemented. Gathering this kind of data enables you to become aware of potential issues and develop solutions to improve your activities. These kinds of “mid-course corrections” help increase the likelihood that your activities are being implemented well and will eventually lead to better outcomes and impact.
Planning Ahead
Before you start implementing your program, you can start the program improvement process early
and think pre-emptively about some of the potential challenges or barriers you may come across
when putting your activities into place. This will allow you to think of ways to avoid or mitigate these
issues before they arise.
• What from your agency/state’s broader context is influencing or has the potential to influence the successful implementation of your activities (e.g. new state policies or laws/lack of key state policies or laws, staff turnover, local tragedy related to your focus area, etc.)?
• How might these contextual factors influence your activities (e.g. changes/adaptations to the activities themselves, changes in timeline for implementation, changes in partners needed to successfully implement the activities)?
• How can you avoid or mitigate the impact of these factors on the implementation of your activities (e.g. coalition building/reaching out to new partners, gathering key stakeholder buy-in before implementation, involving stakeholders in program planning and evaluation, etc.)?

Tracking Challenges and Solutions
Even the most thoughtful planning processes cannot foresee all challenges and barriers that may
come up once you begin implementing your strategies. It is important to put a system in place to
track these challenges as they come up, communicate the data to key stakeholders, brainstorm
solutions, and make changes/improvements.

• What challenges or barriers have come up that have affected the implementation of your activities?
• Who will track these challenges when they arise?
• How will this information be communicated to key stakeholders? Who will be in charge of doing this?
• What solutions have been found/are proposed for addressing these barriers?
• Who will be responsible for implementing changes/improvements to the activities?
• Who will track/record these improvements?

NOTE: You can use Table 3 to track and record your implementation challenges and solutions and
plan how to use this data for program improvement.


Table 3. Program Improvement Log
(Columns: Barrier/Challenge | Potential Solutions | Action Items | Position Responsible | Timeline/Due Date)

Example row (one barrier with two potential solutions and their action items):
- Barrier/Challenge: Lack of participation from other state agencies in the State Traumatic Brain Injury Prevention Coalition
- Potential Solution: Have State Health Dept. leadership reach out to leadership in members’ agencies to encourage participation
  - Action Item: Set up a meeting with State Health Dept. leadership (Position Responsible: Epidemiologist; Timeline/Due Date: Nov 10, 2014)
  - Action Item: Draft letter/email from State Health Dept. leadership to send to other agency leads (Position Responsible: Epidemiologist; Timeline/Due Date: Nov 12, 2014)
- Potential Solution: Send an anonymous survey to members to determine reasons for low participation
  - Action Item: Develop and send survey (Position Responsible: Epidemiologist; Timeline/Due Date: Nov 15, 2014)
  - Action Item: Analyze survey results and report back to evaluation team (Position Responsible: Epidemiologist; Timeline/Due Date: Dec 1, 2014)

V. JUSTIFY CONCLUSIONS
Analysis

• How will you analyze your data?
• Who will be responsible for analyzing your data? Do members of your evaluation team have the time, expertise, and resources (e.g. software) to analyze your data?
• How often will you need to analyze your data? If your data is being used for program improvement, what is the best timeline for ensuring data can be used in a timely manner to improve your program?

NOTE: You can use Table 2 to plan your data analysis timeline and to organize your answers
to these questions.
Interpretation

• Who will you involve in drawing, interpreting and justifying conclusions?
• What are your plans to involve them in this process?

NOTE: Information from the “Engage Stakeholders” section above may be useful for thinking
through this.


VI. ENSURE USE AND SHARE LESSONS LEARNED
Dissemination
It is important to develop a plan for how you will share your evaluation findings with
your stakeholders, and how these findings will be used. Referring back to your
evaluation goals can help you focus your dissemination activities. Was your goal to
evaluate your program for the purpose of improving it? Or was the purpose of your
evaluation to show the impact of your activities on selected outcomes? The answers to
these questions, and the ones below, can help you organize your plan for disseminating
and using your evaluation findings.

• Who is your audience/audiences? (e.g. coalition members, state public health department, state decision makers, community members, potential funders, etc.)
• What medium(s) do you plan to use to disseminate the evaluation findings to your audience (e.g. presentation at a meeting, brief fact sheet/summary of findings, weekly program improvement meetings, overview of findings on a website, etc.)?
• Who will develop these dissemination materials and/or present your evaluation findings to key stakeholders?

Use

• What are your plans for using evaluation findings? (e.g. program improvement, generating stakeholder buy-in, demonstrating impact, etc.)
• How, where, and when will the findings be used?
• Who will use these findings?
• How will you monitor the use of these findings?

Table 4. Data Dissemination and Use
(Columns: Evaluation Finding/Result | Dissemination Medium/Method | Intended Audience | Goals/Intended Use | Position Responsible | Timeline/Due Date)

Example row 1:
- Evaluation Finding/Result: Bicycle helmet safety program serving 50% more youth than expected, and running out of helmets to give to youth who participate
- Dissemination Medium/Method: Health department leadership meeting
- Intended Audience: Health department leadership
- Goals/Intended Use: Use data to communicate a programmatic need (more helmets) to leaders/those who can access more resources
- Position Responsible: Program Coordinator
- Timeline/Due Date: Dec 5, 2014

Example row 2:
- Evaluation Finding/Result: Youth hospitalizations for sports concussions have decreased by 20% since implementing the State’s return to play law
- Dissemination Medium/Method: Evaluation brief
- Intended Audience: State legislators
- Goals/Intended Use: Use data to communicate program/policy success to state legislators
- Position Responsible: Program Coordinator
- Timeline/Due Date: Feb 2015

Example row 3:
- Evaluation Finding/Result: Youth hospitalizations for sports concussions have decreased by 20% since implementing the State’s return to play law
- Dissemination Medium/Method: Evaluation presentation
- Intended Audience: School superintendents
- Goals/Intended Use: Use data to communicate program/policy success and encourage continued implementation in schools
- Position Responsible: Program Coordinator
- Timeline/Due Date: Mar 2015

Appendix
Blank Tables
Table 1. Stakeholder Assessment and Engagement Plan
(Columns: Stakeholder Organization | Stakeholder Category | Expertise/Perspective | Role in the Evaluation | How to Engage | When to Engage)

Table 2. Data Collection and Analysis
(Columns: Evaluation Question | Data Source(s) | Data Collection, Analysis, and Reporting Activities | Position Responsible | Timeline/Due Date)

Table 3. Program Improvement Log
(Columns: Barrier/Challenge | Potential Solutions | Action Items | Position Responsible | Timeline/Due Date)

Table 4. Data Dissemination and Use
(Columns: Evaluation Finding/Result | Dissemination Medium/Method | Intended Audience | Goals/Intended Use | Position Responsible | Timeline/Due Date)

