Form approved:
OMB Control Number: 0920-1282
Expiration date: 06/30/2026
OD2A-S Performance Measures
Technical Guidance
Division of Overdose Prevention
State Program and Implementation Branch
August 2024
CDC estimates the average public reporting burden for this collection of information as 35 hours annually per
response from each recipient, including the time for reviewing instructions, searching existing data/information
sources, gathering and maintaining the data/information needed, and completing and reviewing the collection of
information. An agency may not conduct or sponsor, and a person is not required to respond to a collection of
information unless it displays a currently valid OMB control number. Send comments regarding this burden
estimate or any other aspect of this collection of information, including suggestions for reducing this burden to
CDC/ATSDR Information Collection Review Office, 1600 Clifton Road NE, MS D-74, Atlanta, Georgia 30333; ATTN:
PRA (0920-1282)
Acknowledgments
We want to acknowledge everyone who played a crucial role in the development of
performance measures for Overdose Data to Action in States (OD2A-S). This would not have
been possible without the dedication, expertise, and collaboration of a diverse group of
public health partners. Special thanks to the subject matter experts at the Centers for
Disease Control and Prevention (CDC), whose invaluable insights and domain knowledge
shaped the selection and prioritization of indicators. Thank you also to our recipients for their active engagement during the initial performance measures webinar; their feedback provided essential perspectives and ensured the relevance of our measures. Finally,
thank you to the evaluators from the Program Evaluation Team, who each played a critical
role in the development of this guide. This collective effort underscores the spirit of
collaboration and commitment that defines our work in OD2A-S.
Contents
Introduction
Purpose and Objectives
Data Quality
OD2A-S Performance Measures
Quick View
Key Reporting Fields
HE_Impact
HE_Activities
HR_Encounters
HR_Naloxone
LTC_Navigators
LTC_Referrals
HS_Training
HS_SUD_Protocols
Reporting
Reporting Process
Excel Reporting Tool
Reporting Timeline
Acronyms
Glossary
Introduction
This technical guidance is specifically developed to support recipients of Overdose Data to
Action in States (OD2A-S) in their reporting of performance measures, also referred to as
indicators. Performance measures will be reported by recipients during the period of funding
to track progress on key interventions and outcomes as outlined in the Notice of Funding
Opportunity (NOFO).
This Technical Guidance will support recipients in collecting and reporting on the outlined
performance measures. This document includes:
• Introduction
• Snapshot of performance measures
• Detailed descriptions of each performance measure
• Reporting timeline and guidance
• Appendices (acronyms and glossary)
Purpose and Objectives
The primary goal of performance measures in OD2A-S is to provide a common set of
indicators that will be used by recipients and their partners to monitor progress and identify
areas for improvement. Performance measures data can be used to help:
1) Recipients show and communicate progress to their health department leadership.
2) CDC and recipients inform future CDC programmatic investments.
3) CDC and recipients understand the contributions of OD2A-S across overdose prevention strategies and use data for programmatic improvement.
4) CDC communicate with Health and Human Services (HHS) and other federal policymakers about the progress made under OD2A-S.
These performance measures are not meant to compare jurisdictions to each other; rather, CDC uses them to monitor each recipient's progress over time and to examine OD2A-S as a program overall. By establishing and regularly monitoring performance measures, recipients
can identify areas of strength, pinpoint challenges, and align their efforts with intended
objectives, ultimately fostering accountability and continuous enhancement within their
programs.
Data Quality
We strive for high-quality data reported across performance measures. High-quality data
ensures that the information collected is accurate, consistent, and reflective of the true
impact of program activities. Addressing data quality requires a proactive approach that includes staff training, standardized data collection protocols, regular data quality assurance checks, and continuous monitoring and improvement processes. Investing in data quality enhances the credibility of performance measures, supporting evidence-based decision-making and ensuring the program's overall success. Consider the following:
• Accuracy – The information collected should clearly and adequately measure the indicator within a plausible range.
• Consistency – Written documentation of data collection and analysis methods can ensure the same procedures are followed each time.
• Timeliness – The information collected should be available to inform program management decisions, and it should represent the most current data available. Reporting the data soon after it is collected is a good practice and can help to reflect the true impact of program activities.
• Integrity – Safeguards should be established to minimize the risk of bias or errors in data transcription. This may be achieved by having more than one person conduct the data transcription. In addition, there should be independence in key data collection, management, and assessment procedures, and mechanisms to prevent unauthorized changes to the data.
We ask OD2A-S recipients to keep CDC informed of any data quality concerns or challenges in data collection or reporting processes that could affect data quality. Each performance measure includes data quality and contextual questions through which any data quality concerns should be shared with CDC. Ultimately, we want to ensure that the performance measure data we review and share account for any needed caveats regarding data quality.
OD2A-S Performance Measures
There are 8 performance measures: 7 quantitative and 1 qualitative. The labels and brief descriptions are listed here for quick reference. All quantitative data should be reported in the Excel reporting tool. All qualitative questions, including HE_Impact, contextual questions, and data quality questions, should be reported directly in the Partners Portal.
Quick View
HE_Impact – Impactful practices for improving access to care and treatment for PWUD who are historically underserved by overdose prevention programs
HE_Activities – Number of health equity focused overdose prevention activities implemented with OD2A funding
HR_Encounters – Number of harm reduction service encounters at OD2A funded or supported organizations
HR_Naloxone – Number of naloxone doses distributed by OD2A funded or supported organizations
LTC_Navigators – Number of navigators who link PWUD to care and harm reduction services via warm handoffs
LTC_Referrals – Number of referrals to care and harm reduction services
HS_Training – Number of clinicians who received training on implementing the "2022 CDC Clinical Practice Guideline for Prescribing Opioids for Pain"
HS_SUD_Protocols – Number of health settings implementing or improving protocols and/or policies for evidence-based substance use disorder (SUD) treatment or referrals
This guide uses a standard format to describe each performance measure. Each indicator reference sheet begins with an overview of the measure and its key reporting fields, followed by reporting specifications that explain exactly what needs to be reported for each performance measure. Each quantitative measure also includes required and optional disaggregates, contextual questions, and data quality questions. Contextual questions are required; they help recipients explain any nuances in the data and provide a fuller picture of the quantitative measures. Data quality questions ask you to provide information about the reported data to help explain representativeness, completeness, and other data quality considerations.
Key Reporting Fields
Label – Used to give a shorthand to each measure
Name – Descriptive name of performance measure
Unit of Measure – Quantitative value (e.g., count or percentage)
Numerator – Suggested numerator
Denominator – Suggested denominator (if applicable)
Disaggregates – The separation of indicators into smaller units to identify underlying trends and patterns. Allows for understanding of how subgroups are impacted differently. All disaggregates are required unless otherwise noted as optional.
Reporting Specifications – Descriptions that operationalize how to report each measure to CDC
Contextual Questions – Questions to improve CDC's understanding of numeric data. As a complement to the reported performance measures data, recipients are asked to provide qualitative contextual explanatory information.
Data Quality – Specific questions for which recipients should describe data quality and representativeness of the data, for example, issues or concerns with respect to data quality and completeness.
Indicator Reference Sheets for Each Performance Measure
HE_Impact
Impactful practices for improving access to care and treatment for PWUD who are historically underserved by overdose prevention programs
Key Reporting Fields
Primary Measure
This is a qualitative measure. It is a narrative description of the impactful practices you observe in your jurisdiction that improve access to care and treatment for PWUD. There is no quantitative reporting required for this performance measure. This may be reported in the Partners Portal.
Disaggregates
N/A
Reporting Specifications
The following format is recommended for reporting this qualitative
indicator:
1. Brief description of the implemented and/or tailored (adapted to
specific cultural, linguistic, environmental, or social needs of
populations) evidence-based intervention or innovative practice
(including setting and whether navigators were included if
applicable) and how these compare to previous efforts.
2. How access to care or treatment has been improved, and what
new/existing community assets were leveraged.
3. Which specific populations disproportionately affected by
overdose and underserved with care and treatment programs are
impacted by efforts (if tracked).
4. This is optional. Any other outcomes that were improved (provides recipients the option to expand beyond access to care and include other outcomes, for example, retention in care or decreased opioid use).
The narrative should be succinct, but each impactful practice* should have a descriptive paragraph if more than one is outlined.
*Note: If your jurisdiction or partners have not implemented any
impactful practices at the time of reporting, please note in the relevant
data submission field “no practices have been implemented to improve
access to care and treatment to date.”
Contextual Questions
1. What barriers prevent achieving equitable access to care and treatment for SUD?
2. What facilitators support achieving equitable access to care and treatment for SUD?

Data Quality
1. Describe any issues or concerns that impact the quality of the data shared (e.g., data completeness, data accuracy, facilitators/barriers for collection and reporting).
HE_Activities
Number of health equity focused overdose prevention activities implemented with OD2A funding
Key Reporting Fields
Primary Unit of Measure
Total count of activities

Disaggregates
Settings
• Health/Clinical (e.g., emergency department, hospitals, clinics, outpatient, inpatient, primary care, pharmacies)
• Harm reduction (e.g., syringe services programs)
• Public safety (e.g., criminal justice, EMS)
• Other

Reporting Specifications
Total_HE_Activities
• This is a formula field that will generate a total count of health equity focused overdose prevention activities that occurred in clinical, harm reduction, public safety, or other settings during the designated reporting period once the disaggregates below are entered into the appropriate fields.

HE_Clinical_Settings
• Enter a whole number for the health equity focused overdose prevention activities that occurred in a health/clinical setting.

HE_HR_Settings
• Enter a whole number that reflects the health equity focused overdose prevention activities that occurred in a harm reduction setting.

HE_Public_Safety_Settings
• Enter a whole number that reflects the health equity focused overdose prevention activities that occurred in a public safety setting.

HE_Other_Settings
• This disaggregate is optional. If chosen, enter a whole number that reflects the health equity focused overdose prevention activities that occurred in any setting outside of clinical, harm reduction, and public safety.
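A worked example (illustrative numbers only): if 3 activities occurred in health/clinical settings, 5 in harm reduction settings, 2 in public safety settings, and 1 in another setting, the entered disaggregates would be 3, 5, 2, and 1, and Total_HE_Activities would show 3 + 5 + 2 + 1 = 11.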
Contextual Question
1. Please describe the activities in this performance measure, for whom they were intended, and how the activities were implemented and/or tailored (e.g., linguistically, culturally) for racially, ethnically, and linguistically diverse populations.

Data Quality
1. Describe any issues or concerns that impact the quality of the data shared (e.g., data completeness, data accuracy, facilitators/barriers for collection and reporting).
HR_Encounters
Number of harm reduction service encounters at organizations funded or supported by OD2A
Key Reporting Fields
Primary Unit of Measure
Total count of service encounters

Disaggregates
Selected harm reduction services:
• Number of service encounters where in-person drug checking occurred, and result was provided back to participant (e.g., use of FTIR/mass spectrometer)
Locations where harm reduction services were provided:
• Zip code(s) where service is delivered. (Note: this is NOT the zip code of the participant residence)

Reporting Specifications
Total_HR_Encounters
• Enter a total count of harm reduction service encounters (e.g., in-person, mail, telephone, online) that occurred at an OD2A-S funded organization during the designated reporting period.

Encounters_with_Drug_Checking
• Enter a whole number for service encounters where drug checking occurred.
Reporting Specifications (Continued)
ZipCode_By_HR_Service_Site
• Enter the five-digit zip code for each site where harm reduction services (e.g., in-person, mail, telephone, online) were provided. For any service site where services are provided in person, use the zip code of the brick and mortar location. For services provided via phone or mail, use the zip code of the brick and mortar location from which the service was provided. For mobile-based outreach services, use the zip code of where the outreach encounter happened. For any service sites where zip codes are unknown, provide the total number of encounters that occurred across those locations in the adjacent cell designated for "unknown."

Encounters_with_Drug_Checking_by_ZipCode
• Enter a whole number for service encounters involving drug checking for each zip code provided. When the zip code is "unknown," total the remaining encounters with drug checking and enter a whole number.
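A worked example (illustrative zip codes and numbers only): if services were provided at one brick and mortar site in zip code 30303 and through mobile outreach in zip code 30310, both zip codes would be listed under ZipCode_By_HR_Service_Site; if 15 encounters at the 30303 site and 4 encounters at the 30310 locations included drug checking, you would enter 15 and 4 for those zip codes in Encounters_with_Drug_Checking_by_ZipCode, and any encounters at sites with unknown zip codes would be totaled in the "unknown" cell.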
Contextual Questions
1. What are the barriers for people accessing harm reduction
services in your jurisdiction?
2. What are the facilitators for people accessing harm reduction
services in your jurisdiction?
3. What types of services are included?
4. Please estimate the proportion of harm reduction service
encounters that occurred:
___ % at brick and mortar locations
___ % via mobile-based outreach services
___ % via mail-based delivery
___ % other (please specify)
Data Quality
1. Describe any issues or concerns that impact the quality of the data shared (e.g., data completeness, data accuracy, facilitators/barriers for collection and reporting).
2. How many OD2A-funded organizations are included in the data submitted?
HR_Naloxone
Number of naloxone doses distributed by OD2A funded or supported organizations
Key Reporting Fields
Primary Unit of Measure
Total count of pre-measured naloxone doses distributed

Disaggregates
• Type of funded organization (e.g., Syringe Service Programs, community-based organizations, senior care organizations, faith-based organizations, Emergency Department/Urgent Care, Other healthcare organizations, Police departments, Jails/Prisons, Colleges/Universities, Secondary education, Health Department)
• Number of all pre-measured naloxone doses distributed by organization.
• Zip code(s) where the organization distributed their doses (Note: if distributed at a brick-and-mortar location like an SSP, use the zip code of the SSP. This is NOT the zip code of the participant residence.)
• Number of all pre-measured naloxone doses distributed by zip code.

Reporting Specifications
Total_Naloxone_Distributed
• Enter a whole number for doses of naloxone distributed by an OD2A funded or supported organization during the designated reporting period.

Type_of_Organization
• This variable has been pre-selected. If data are not available for a particular type of organization, enter 0 for all variables in the adjacent row.

Num_Doses_Distributed
• Enter a whole number for the count of all pre-measured naloxone doses distributed for each type of organization.
Reporting Specifications (Continued)
ZipCode_By_Nal_Distribution_Site
• Enter the five-digit zip code where the funded organization distributed their doses of naloxone. For any distribution site where the zip code is unknown, provide the total in the adjacent cell.

Num_Doses_Distributed_ZipCode
• Enter a whole number for the count of pre-measured naloxone doses distributed for each zip code. When the zip code is "unknown," total the remaining doses distributed and enter a whole number.
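A worked example (illustrative numbers only): if a syringe services program distributed 250 doses and a police department distributed 50 doses during the reporting period, you would enter 250 and 50 in Num_Doses_Distributed for those organization types and 300 in Total_Naloxone_Distributed; the same doses would also be reported by zip code in Num_Doses_Distributed_ZipCode, with doses from sites with unknown zip codes totaled in the "unknown" cell.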
Contextual Questions
1. What are barriers to accessing or receiving naloxone?
2. What are facilitators to accessing or receiving naloxone?
3. How did you use OD2A funds to distribute naloxone (e.g., staffing to distribute, vending machines)?
4. This contextual question is optional. Describe mechanisms used to distribute naloxone (e.g., mail in, handoffs).

Data Quality
1. If you selected "other" type of organization in the reporting tool, please describe.
2. Describe any issues or concerns that impact the quality of the data shared (e.g., data completeness, data accuracy, facilitators/barriers for collection and reporting).
LTC_Navigators
Number of navigators who link PWUD to care and harm reduction services via warm handoffs
Key Reporting Fields
Primary Unit of Measure
Total count of unique navigators who link PWUD

Disaggregates
Entry points where navigators are primarily located:
• Health/Clinical (e.g., emergency department, hospitals, clinics/practices, outpatient, inpatient, treatment centers, primary care, pharmacies)
• Harm reduction (e.g., syringe services programs)
• Public safety (e.g., criminal justice, EMS)
• Other
This disaggregate is optional. Number of hours navigators spent on linkage efforts

Reporting Specifications
Total_Navigators
• This is a formula field that will generate a total count of unique navigators who link PWUD to care and/or harm reduction services via warm handoffs once the disaggregates below are entered into the appropriate fields.

Nav_Clinical
• Enter a whole number for the navigators located in a health/clinical setting.

Nav_HR
• Enter a whole number for the navigators located in a harm reduction setting.

Nav_Public_Safety
• Enter a whole number for the navigators located in a public safety setting.
Reporting Specifications (Continued)
Nav_Other
• Enter a whole number for the navigators in any other settings.

Navigator_Hours_Clinical
• This disaggregate is optional. If chosen, enter a whole number for the total hours navigators have spent on linkage to care or referral efforts in health/clinical settings.

Navigator_Hours_HR
• This disaggregate is optional. If chosen, enter a whole number for the total hours navigators have spent on linkage to care or referral efforts in harm reduction settings.

Navigator_Hours_Public_Safety
• This disaggregate is optional. If chosen, enter a whole number for the total hours navigators have spent on linkage to care or referral efforts in public safety settings.

Navigator_Hours_Other
• This disaggregate is optional. If chosen, enter a whole number for the total hours navigators have spent on linkage to care or referral efforts in any other settings.
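A worked example (illustrative numbers only): if 4 unique navigators are located in health/clinical settings, 3 in harm reduction settings, 1 in a public safety setting, and 0 in other settings, the entered disaggregates would be 4, 3, 1, and 0, and Total_Navigators would show 4 + 3 + 1 + 0 = 8.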
Contextual Questions
1. Describe what types of navigators are included in the data reported (e.g., certified peer recovery specialists, peer support specialists, case managers, patient navigators, community health workers, persons with lived experience, etc.).
2. Describe methods to support navigators, including average hourly pay, benefits, and additional supports (e.g., trauma, wellness, emotional/psychological support, infrastructure such as a phone) to help retain them.

Data Quality
1. Describe any issues or concerns that impact the quality of the data shared (e.g., data completeness, data accuracy, facilitators/barriers for collection and reporting).
LTC_Referrals
Number of referrals to care and harm reduction services
Key Reporting Fields
Primary Unit of Measure
Total count of unique referrals

Note: If you refer one individual to both MOUD and harm reduction services, you would account for 2 different referrals as you will report by each service. If you refer the same individual multiple times, they would be counted multiple times. This indicator is not counting unique individuals, but rather referral encounters.

Disaggregates
Types of care/service referrals:
• Number of referrals to medications for opioid use disorder (MOUD)
• Number of referrals to behavioral health treatment only (without MOUD)
• Number of referrals to harm reduction services
Demographics of people who are referred:
• Race and Ethnicity (American Indian or Alaska Native, Asian, Black or African American, Hispanic or Latino, Middle Eastern or North African, Native Hawaiian or Other Pacific Islander, White, Multiracial and/or Multiethnic, Unknown)
Reporting Specifications
Total_Referrals
• This is a formula field that will generate a total count of unique referrals to care and harm reduction services once the disaggregates below are entered in the appropriate fields.

Race_Ethnicity
• This variable has been pre-selected. If data are not available for a particular race and ethnicity, enter 0 for all variables in the adjacent row. Note: when the race_ethnicity is marked unknown, this also includes if an individual preferred not to answer.

Ref_MOUD
• Enter a whole number for all referrals to MOUD for each race/ethnicity with available data.

Ref_Behavioral_Trt
• Enter a whole number for all referrals to behavioral health treatment only (without MOUD) for each race/ethnicity with available data.

Ref_to_HR
• Enter a whole number for all referrals to harm reduction services for each race/ethnicity with available data.

Total_Ref_Race_Ethnicity
• This is a formula field that will generate a total count for all referrals to MOUD, behavioral treatment only (without MOUD), and harm reduction services by each race/ethnicity.
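A worked example (illustrative numbers only): if one individual who identifies as White is referred once to MOUD and once to harm reduction services during the reporting period, you would add 1 to Ref_MOUD and 1 to Ref_to_HR in the White row; Total_Ref_Race_Ethnicity for that row would then include 2 referral encounters, and Total_Referrals would sum the totals across all race/ethnicity rows.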
Contextual Questions
Types of Referrals
1. This contextual question is optional. If you have other OD2A funded or supported referrals beyond referrals to MOUD, behavioral treatment only (without MOUD), and harm reduction services, please describe the "other" types of referrals.

Reporting Partners
2. Approximately what % of healthcare facilities (e.g., hospitals, emergency departments, other clinical settings) reported data to your jurisdiction for this performance measure? (If % not available, report the total number of healthcare facilities that reported.)
3. Approximately what % of EMS agencies reported data to your jurisdiction for this performance measure? (If % not available, report the total number of EMS agencies that reported.)
4. Approximately what % of carceral settings (e.g., prisons and jails) reported data to your jurisdiction for this performance measure? (If % not available, report the total number of carceral settings that reported.)
5. Approximately what % of harm reduction settings (e.g., SSPs) reported data to your jurisdiction for this performance measure? (If % not available, report the total number of harm reduction settings that reported.)
Data Quality
1. Describe any issues or concerns that impact the quality of the data shared (e.g., data completeness, data accuracy, facilitators/barriers for collection and reporting).
HS_Training
Number of clinicians who received training on implementing the "2022 CDC Clinical Practice Guideline for Prescribing Opioids for Pain"
Key Reporting Fields
Primary Unit of Measure
Total count of OD2A-S clinicians trained

Numerator
Count of clinicians trained

Disaggregates
• This disaggregate is optional. Specialty (e.g., Primary care, Emergency medicine, Hospitalists, Surgeons, OB/GYNs, Neurologists, Dentists, Physical medicine and rehabilitation, Occupational medicine, Pharmacists)
• This disaggregate is optional. Number of unique clinicians trained
• This disaggregate is optional. Number of eligible clinicians
• This disaggregate is optional. Percentage of eligible clinicians trained

Reporting Specifications
Total_Trained
• Enter a whole number for the count of all unique clinicians trained on implementing the 2022 CDC Clinical Practice Guideline for Prescribing Opioids for Pain during the designated reporting period.

Specialty
• Optional disaggregate: If chosen, select a specialty from the dropdown list for the type of clinicians trained on the 2022 CDC Clinical Practice Guideline for Prescribing Opioids for Pain.

Num_Trained
• Optional disaggregate: If a specialty is chosen, enter a whole number for the unique clinicians by specialty who are trained on implementing the 2022 CDC Clinical Practice Guideline for Prescribing Opioids for Pain.
Reporting Specifications (Continued)
Num_Eligible
• Optional disaggregate: If a specialty is chosen, enter a whole number for all eligible clinicians who could be trained on implementing the 2022 CDC Clinical Practice Guideline for Prescribing Opioids for Pain.

Percent_Clinician_Trained
• This is a formula field that will generate a percentage of clinicians trained when the numerator (Num_Trained) and denominator (Num_Eligible) are entered into the appropriate fields.
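A worked example (illustrative numbers only): if 40 unique primary care clinicians were trained out of 200 eligible primary care clinicians, you would enter 40 in Num_Trained and 200 in Num_Eligible, and Percent_Clinician_Trained would calculate 40 / 200 = 20%.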
Contextual Questions
1. Describe the trainings including the title, number offered, length, who conducted them, and where the training occurred.
2. This contextual question is optional. What populations are served by the clinicians who were trained?
3. What are barriers to effectively training clinicians on the "2022 CDC Clinical Practice Guideline"?
4. What are facilitators to effectively training clinicians on the "2022 CDC Clinical Practice Guideline"?

Data Quality
1. Describe any issues or concerns that impact the quality of the data shared (e.g., data completeness, data accuracy, facilitators/barriers for collection and reporting).
HS_SUD_Protocols
Number of health/clinical settings implementing or improving protocols and/or policies for evidence-based SUD treatment or referrals
Key Reporting Fields
Primary Unit of Measure
Total count of health/clinical settings

Disaggregates
• Number of health/clinical settings where protocols or policies have been implemented/improved for evidence-based SUD treatment
• Number of health/clinical settings where protocols or policies have been implemented/improved for evidence-based SUD referrals

Reporting Specifications
Total_Health_Settings
• Enter the total count of health/clinical settings where protocols and/or policies have been implemented/improved for evidence-based SUD treatment and/or referrals. Note this will be the number of unique health settings, regardless of whether they have just one or both types of protocols/policies.

Num_Settings_SUD_Treatment
• Enter a whole number for the health/clinical settings where protocols or policies have been implemented/improved for evidence-based SUD treatment.

Num_Settings_SUD_Referrals
• Enter a whole number for the health/clinical settings where protocols or policies have been implemented/improved for evidence-based SUD referrals.
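A worked example (illustrative numbers only): if 4 settings implemented or improved protocols for evidence-based SUD treatment and 3 did so for SUD referrals, and 2 of those settings did both, you would enter 4 in Num_Settings_SUD_Treatment, 3 in Num_Settings_SUD_Referrals, and 5 (the number of unique settings) in Total_Health_Settings.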
Contextual Questions
1. Describe how access to MOUD in healthcare settings has changed since implementing policies or protocols.
2. Describe the partnerships for SUD referral with the health settings included in this indicator. What steps were taken to develop and build the partnerships for SUD referrals?
Data Quality
1. What types of health settings are included in the reported data?
2. Describe any issues or concerns that impact the quality of the data shared (e.g., data completeness, data accuracy, facilitators/barriers for collection and reporting).
Reporting
OD2A-S recipients are expected to report on all performance measures on an annual basis.
We have selected a short list of measures we believe are feasible for most recipients to report
on. This does not limit what individual health departments want to capture for their use, and
individual recipients can examine their capacities to collect, analyze, and disseminate
additional performance measure data.
Data collection may be ongoing in each individual health department with partners
reporting to health departments monthly or quarterly at minimum to allow for discussion
and potential course corrections early on. As part of the performance measures submission, DOP staff at CDC commit to reviewing the data, engaging with recipients in discussion of the data, and learning from health departments' experiences and expertise gathered through prior and ongoing efforts to collect data and justify overdose prevention programs. Once data quality is sufficient, CDC will share data reports back to individual recipients for use within their own health departments. CDC will use the data, along with work plans and APRs, to craft case studies and stories to share with CDC leadership, Health and Human Services, and other federal policymakers, as well as with recipients. CDC will seek opportunities for mutual learning, growth, and sharing of best practices so that we can all learn from each other.
Reporting Process
The current plan is to report performance measure data in the Partners Portal. The qualitative performance measure, contextual questions, and data quality questions will be submitted directly into the Partners Portal platform. Data for the 7 quantitative measures, along with their disaggregates, will be submitted using the Excel reporting tool we developed; the Excel tool will be submitted as an attachment within the Partners Portal. The Excel tool has a tab titled "Start Here." Please read the information on that tab before entering data.
Please note that CDC is requesting that jurisdictions enter all counts—please do not
suppress small numbers. All numbers will be available to the CDC OD2A-S Program
Evaluation Team, and small counts will not be shared with anyone outside the support team.
The CDC OD2A-S Program Evaluation Team will aggregate small counts before any data are
shared, and we will consult with recipients on plans to share data. If the count is zero, please
enter “0”—please do not leave these cells null or blank to ensure these cells are not
mischaracterized as missing data.
Excel Reporting Tool
Performance measures will be reported using the Partners Portal (see Reporting Process above). To aid in data collection with your partners and provide a clearer roadmap, including required and optional disaggregates, we have developed an Excel-based tool, the OD2A-S Performance Measures Reporting Tool.
Example of OD2A-S Performance Measures Reporting Tool