National Evaluation of the DP18-1815 Cooperative Agreement Program
CATEGORY A: DIABETES MANAGEMENT AND TYPE 2 DIABETES PREVENTION
PART B: STATISTICAL METHODS
April 22, 2020

Contact: Kimberly Farris
Telephone: 770-488-0543
E-mail: [email protected]
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
Atlanta, Georgia
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates and Deal with Nonresponse
4. Test of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Attachment 1. Centers for Disease Control and Prevention. National Center for Chronic Disease Prevention and Health Promotion. Notice of Funding Opportunity: Improving the Health of Americans through Prevention and Management of Diabetes and Heart Disease and Stroke – Financed in part by the 2018 Prevention and Public Health Funds. CDC-RFA-DP18-1815PPHF18
Attachment 2. Authorizing Legislation: Section 301(a) of the Public Health Service Act [42 U.S.C. 241]
Attachment 3. 1815 Awardees and Information Collection Plan
1815 List of Health Department Awardees
1815 Strategies for Preventing and Controlling Diabetes and Heart Disease and Stroke
1815 Logic Model
1815 Crosswalk of Evaluation Components and Data Collection Tools
1815 Summary of Annualized Respondents
1815 Evaluation Gantt Chart
Attachment 4. Category A Partner Site-level Rapid Evaluations
DSMES Partner Site-Level Rapid Evaluation Nomination Form (Word Version)
DSMES Partner Site-Level Rapid Evaluation Nomination Form (Screenshot Version)
National DPP Partner Site-Level Rapid Evaluation Nomination Form (Word Version)
National DPP Partner Site-Level Rapid Evaluation Nomination Form (Screenshot Version)
DSMES Partner Site-Level Rapid Evaluation Survey Questionnaire
DSMES Partner Site-Level Rapid Evaluation Program Coordinator Interview Guide
DSMES Partner Site-Level Rapid Evaluation Professional Team Member Interview Guide
DSMES Partner Site-Level Rapid Evaluation Paraprofessional Team Member Interview Guide
National DPP Partner Site-Level Rapid Evaluation Survey Questionnaire
National DPP Partner Site-Level Rapid Evaluation Program Coordinator Interview Guide
National DPP Partner Site-Level Rapid Evaluation Lifestyle Coach Interview Guide
Attachment 5. Category A Recipient-led Evaluations
Category A Evaluation and Performance Measurement Plan (EPMP) including evaluation questions
Category A Recipient-Led Evaluation Reporting Template
Attachment 6. 60-Day Federal Register Notice
Attachment 7.
Institutional Review Board Approval Notification or Exemption Determination Part A
Institutional Review Board Approval Notification or Exemption Determination Part B
Attachment 8. Introductory/Follow-Up Letters
Category A DSMES/National DPP Site-Level Rapid Evaluation Pre-Survey Email
Category A DSMES/National DPP Site-Level Rapid Evaluation Survey Invitation
Category A DSMES/National DPP Site-Level Rapid Evaluation Survey Reminder
Category A DSMES/National DPP Site-Level Rapid Evaluation Final Survey Reminder
This information collection request supports a National Evaluation of selected activities under Notice of Funding Opportunity (NOFO) CDC-RFA-DP18-1815, Improving the Health of Americans Through Prevention and Management of Diabetes and Heart Disease and Stroke (the “1815” cooperative agreement). The evaluation addresses activities under Category A, Diabetes Management and Type 2 Diabetes Prevention. There are two primary respondent groups:
Cooperative agreement awardees (N=51). Awardees are state/jurisdiction-level health departments in all 50 states and the District of Columbia (Attachment 3a), called “HD recipients” throughout this information collection request. Information will be collected from program directors, program staff, and evaluators. Under Category A funding, HD recipients are working with health systems and partner sites to implement and evaluate seven strategies (A1-A7, see below) for increasing enrollment and retention in evidence-based programs that help individuals manage their diabetes, or prevent or delay progression from prediabetes to type 2 diabetes.
Category A HD recipients are required to participate in the Recipient-led Evaluation component of the National Evaluation.
Affiliated staff members from 204 health system or other partner sites working with or otherwise collaborating with the HD recipients:
Diabetes Self-Management Education and Support (DSMES) sites (N=102). These sites deliver care that helps individuals diagnosed with diabetes to manage their condition. Each HD recipient will nominate 2 DSMES sites to participate in the Site-level Rapid Evaluation on a voluntary basis.
National Diabetes Prevention Program (National DPP) sites (N=102). These sites deliver lifestyle change programs that help individuals prevent or delay the progression of prediabetes to type 2 diabetes. Each HD recipient will nominate 2 sites to participate in the Site-level Rapid Evaluation on a voluntary basis.
To obtain information from a variety of roles and perspectives, multiple staff members at each site will be asked to complete a survey, and staff members at selected sites will be asked to participate in interviews.
The strategies being implemented and evaluated are summarized below.
Type of Strategy | Strategy Label | Strategy Description
Diabetes Management (for Individuals with Diabetes) | A1 | Improve access to and participation in American Diabetes Association (ADA)-recognized/Association of Diabetes Care & Education Specialists (ADCES)-accredited DSMES programs in underserved areas
Diabetes Management (for Individuals with Diabetes) | A2 | Expand or strengthen DSMES coverage policy among public or private insurers or employers, with emphasis on one or more of the following: Medicaid and employers
Diabetes Management (for Individuals with Diabetes) | A3 | Increase engagement of pharmacists in the provision of medication management or DSMES for people with diabetes
Diabetes Prevention (for Individuals with Prediabetes) | A4 | Assist health care organizations in implementing systems to identify people with prediabetes and refer them to CDC-recognized lifestyle change programs for type 2 diabetes prevention
Diabetes Prevention (for Individuals with Prediabetes) | A5 | Collaborate with payers and relevant public and private sector organizations within the state to expand availability of the National DPP as a covered benefit for one or more of the following groups: Medicaid beneficiaries; state/public employees; employees of private sector organizations
Diabetes Prevention (for Individuals with Prediabetes) | A6 | Implement strategies to increase enrollment in CDC-recognized lifestyle change programs
Diabetes Management and Prevention | A7 | Develop a statewide infrastructure to promote long-term sustainability/reimbursement for Community Health Workers (CHWs) as a means to establish or expand their use in a) CDC-recognized lifestyle change programs for type 2 diabetes prevention and/or b) ADA-recognized/ADCES-accredited DSMES programs for diabetes management
HD recipients will consider the following criteria in determining which sites to nominate for participation in the Site-level Rapid Evaluation:
the strategies they have selected for implementation;
a mix of different geographic locations and contexts;
the demographic characteristics of the populations served;
the varying levels of experience in implementing the strategies; and
the availability and willingness of HDs and partner sites to participate.
Respondent sampling methods for the data collection efforts included in this National Evaluation are detailed below. Note that all participants will be sampled with replacement for the respective data collection efforts.
CDC has contracted with Deloitte Consulting to design and implement the 1815 National Evaluation. Deloitte Consulting, together with the Division of Diabetes Translation (DDT) Performance Improvement and Evaluation (PIE) team, will be responsible for data collection and analysis activities. The Deloitte and PIE teams are referred to collectively as the National Evaluation Team in this document.
OMB approval is requested for three years. Data collection will occur in years 1-3 of OMB approval, corresponding to years 3-5 of the cooperative agreement. The criteria and procedures for site selection are described in more detail below.
Category A Evaluation Component 1: Partner Site-Level Rapid Evaluations
DSMES/National DPP Partner Site-Level Rapid Evaluation
Nomination Process (Att. 4a, 4aa, 4b, 4bb):
During Year 3 of the 1815 cooperative agreement, each HD recipient will nominate 2 National Diabetes Prevention Program (National DPP) and 2 Diabetes Self-Management Education and Support (DSMES) sites to participate in the partner site-level rapid evaluation, for a total of 204 sites (102 National DPP and 102 DSMES). The National Evaluation Team will provide HD recipients with a nomination form and criteria for nominating sites to ensure a mix of sites in terms of geographic reach, length of time in operation, program size, target population, and strategies selected for implementation. The nomination form will collect the nominated site's contact information, how long the HD has been supporting the site, the site's involvement in HD 1815 program activities, and the types of direct and indirect support the HD is currently providing the site through 1815 funds. The National Evaluation Team will host a webinar to provide recipients with instructions on how to complete the nomination forms online. The National Evaluation Team will convene an internal CDC Category A National Evaluation Panel (including the Category A project officers, evaluators, and DDT subject matter experts) to review the completed nomination forms.
After the panel completes its review of the nominations, the Deloitte National Evaluation Team will host an introductory phone call with the nominated sites to review the evaluation protocol and timeline. Sites will have two weeks to confirm their voluntary participation in the site-level rapid evaluation. Confirmed sites will be followed for the duration of the cooperative agreement. If a site chooses not to participate in the evaluation or withdraws from participating, the HD will be asked to nominate another site. If a site closes prior to the end of the cooperative agreement, the HD will be asked to select a new site for inclusion in the site-level rapid evaluation.
Starting in Year 3 of the cooperative agreement, the internal CDC 1815 Category A Panel will review site visit reports, annual progress reports, performance measure data, recipient-led evaluation reports, and other supporting information to determine the criteria for virtual site visits. Site visits will be conducted virtually to mitigate the risk of COVID-19 transmission. A total of 12-14 DSMES sites and 12-14 National DPP sites (from the full list of 204 sites) will be selected for virtual site visits annually. The site selection criteria will be updated each year as the cooperative agreement progresses and the focus of the site-level rapid evaluation changes based on input from key stakeholders and outcome results.
In Year 3 and Year 5 of the cooperative agreement, staff members from all 204 selected sites will be invited to complete a web-based survey. Four to five staff members per site will be identified using a purposive sampling approach to select those most familiar with implementation of the National DPP and DSMES programs, respectively. Selected staff members will include program coordinators, educators, evaluation staff, and paraprofessional staff such as community health workers or pharmacists who support program delivery.
Category A Evaluation Component 2: Evaluation and Performance Measurement Plan Template
Per the 1815 notice of funding opportunity, all 51 HD recipients are required to complete an evaluation and performance measurement plan (EPMP) in Year 2 of the cooperative agreement that demonstrates how the HD recipient will fulfill the state-level evaluation requirements described in the CDC Evaluation and Performance Measurement and Project Description sections of the cooperative agreement. DDT has developed a set of evaluation questions and associated indicators as a resource for HD recipients. Recipients will submit their 5-year plan to DDT in Year 2 of the cooperative agreement, and the plan can be updated annually, as necessary. The state-level evaluation requirement is outlined in the cooperative agreement and is not associated with the recipient's participation in national evaluation activities, which is voluntary. The National Evaluation Team may triangulate these plans with other data sources to get a comprehensive look at each recipient's progress toward outcomes.
Category A Recipient-Led Evaluation Reporting
Per the 1815 cooperative agreement, all 51 HD recipients are required to complete an annual evaluation report to update CDC on the progress of their recipient-led evaluations of Category A strategies, which are laid out in a 5-year plan submitted to DDT in Year 1 of the cooperative agreement. The plan can be updated annually, as necessary. The state-level evaluation requirement is outlined in the cooperative agreement and is not associated with the recipient's participation in national evaluation activities, which is voluntary. The National Evaluation Team may triangulate these reports with other data sources to get a comprehensive look at each recipient's progress toward outcomes.
Attachment 3f indicates the annualized number of entities covered by each proposed data collection effort.
Table B.1-B. Overview of the Data Collection Plan
This table provides an overview of the data collection plan, forms, respondents (by roles), and the schedule. OMB approval is requested for 3 years. Information collection will occur in years 3, 4, and 5 of the 5-year cooperative agreement (Years 1, 2, and 3 of the 3-year period of OMB approval).
Evaluation Component | Respondents | Form Name | Cooperative agreement data collection period (YR 3-YR 5) | Total No. of Collections, YR 3-YR 5 | No. of Respondents per Collection | Respondents per Collection (Detail)
Recipient-Led Evaluation | HD recipient staff (1) | Att. 5a: Category A EPMP Template | YR 2 | 1 | 51 | 50 SHD + Washington DC x 1 staff/SHD
Recipient-Led Evaluation | HD recipient staff (1) | Att. 5b: DDT Recipient-led Annual Evaluation Report Template | YR 3, 4, 5 | 3 | 51 | 50 SHD + Washington DC x 1 staff/SHD
Site-Level Rapid Evaluation – DSMES | HD recipient staff (2) | Att. 4a & 4aa: DSMES Rapid Evaluation Nomination Form | YR 3 | 1 | 51 | 50 SHD + Washington DC
Site-Level Rapid Evaluation – DSMES | DSMES partner site staff (2, 4, 5, 6) | Att. 4c: DSMES Rapid Evaluation Survey Questionnaire | YR 3, 5 | 2 | 510 | 102 sites x 5 staff/site
Site-Level Rapid Evaluation – DSMES | DSMES partner site staff (4) | Att. 4d: DSMES Rapid Evaluation Interview Guide - Program Coordinator | YR 3, 4, 5 | 3 | 14 | 14 sites x 1 staff/site
Site-Level Rapid Evaluation – DSMES | DSMES partner site staff (6) | Att. 4e: DSMES Rapid Evaluation Interview Guide - Professional | YR 3, 4, 5 | 3 | 28 | 14 sites x 2 staff/site
Site-Level Rapid Evaluation – DSMES | DSMES partner site staff (5) | Att. 4f: DSMES Rapid Evaluation Interview Guide - Paraprofessional | YR 3, 4, 5 | 3 | 28 | 14 sites x 2 staff/site
Site-Level Rapid Evaluation – National DPP | HD recipient staff (3) | Att. 4b & 4bb: National DPP Rapid Evaluation Nomination Form | YR 3 | 1 | 51 | 50 SHD + Washington DC
Site-Level Rapid Evaluation – National DPP | National DPP partner site staff (7, 8) | Att. 4g: National DPP Rapid Evaluation Survey Questionnaire | YR 3, 5 | 2 | 510 | 102 sites x 5 staff/site
Site-Level Rapid Evaluation – National DPP | National DPP partner site staff (7) | Att. 4h: National DPP Rapid Evaluation Interview Guide - Program Coordinator | YR 3, 4, 5 | 3 | 14 | 14 sites x 1 staff/site
Site-Level Rapid Evaluation – National DPP | National DPP partner site staff (8) | Att. 4i: National DPP Rapid Evaluation Interview Guide - Lifestyle Coach | YR 3, 4, 5 | 3 | 28 | 14 sites x 2 staff/site
Respondent role key: (1) DSMES Evaluator; (2) DSMES Program Director, Team Lead/Manager, Evaluator, Health Scientist; (3) National DPP Program Director, Team Lead/Manager, Evaluator, Health Scientist; (4) DSMES Program/Quality Coordinator (Healthcare Social Worker); (5) DSMES Paraprofessionals: CHWs, Medical Assistants, Diabetes and Pharmacy Technicians; (6) DSMES Professionals: Pharmacists, RNs, RDs; (7) National DPP Program Coordinator (Healthcare Social Worker); (8) National DPP Lifestyle Coach (Community Health Worker)
Information will be collected from HD recipients at most annually (Attachment 3f). Data collection procedures vary slightly for each component of the evaluation and are described below.
Category A Partner Site-Level Rapid Evaluation
DSMES/National DPP Partner Site-Level Nomination (Att. 4a, 4aa, 4b, 4bb):
During Year 3, each HD recipient will be asked to complete a web-based nomination form to indicate the 2 National DPP and 2 DSMES sites they are nominating to participate in the partner site-level rapid evaluations. The link to the online form will be sent to HD recipients via the 1815 communication listserv. Two reminder emails will be sent out to encourage completion of the nomination form.
DSMES/National DPP Rapid Evaluation Survey (Att. 4c, 4g):
An online survey will be developed in Qualtrics and administered during Years 3 and 5 of the cooperative agreement. Four to five staff members from each National DPP and DSMES site that agrees to participate in the rapid evaluation will be invited (Att. 8a, 8b) to complete the online survey. The National Evaluation Team will work with the HD recipients to develop a distribution list for sending out the survey links. The online survey will remain open for three weeks to allow enough time for response. The National Evaluation Team will send two email reminders (Att. 8c, 8d) to non-respondents to encourage a high response rate. The survey will use skip patterns to reduce response burden.
DSMES/National DPP Partner Site-Level Rapid Evaluation Program Interviews (Att. 4d, 4e, 4f, 4h, 4i):
During Years 3-5 of the cooperative agreement, the National Evaluation Team will conduct site visits to a total of 12-14 National DPP and 12-14 DSMES sites annually. Site visits will be conducted virtually to mitigate the risk of COVID-19 transmission; during each visit, the team will conduct stakeholder interviews. Prior to the site visit, the National Evaluation Team will communicate with selected sites to provide more information about the interviews and the overall evaluation, and to schedule and coordinate logistics for the virtual site visit.
Interviews will be led by the National Evaluation Team with one team member conducting the interview and another serving as a note-taker. All interviews will be recorded, with the consent of participants, to ensure accuracy of our notes. Recordings will be transcribed by an external transcription service prior to analysis.
As interviews are completed, participants will receive a follow-up email thanking them for their participation, sharing the anticipated timeline for data analysis and results, and letting them know whom to contact with further questions.
All audio files and any written notes will be stored in a secure environment, maintained by Deloitte and accessible only to the data collection/evaluation team. Qualitative analysis will be performed on all interview data collected. Content analysis will be conducted using NVivo. Once analyzed, findings will be shared with DDT evaluators, participating sites, and the nominating HD recipient.
Category A Evaluation and Performance Measurement Plan (Att. 5a)
Per the 1815 cooperative agreement, all HD recipients are required to submit an evaluation and performance measurement plan to address the evaluation questions specified by DDT. The DDT team will assist HD recipients in developing and implementing evaluation plans that are useful for state-level program improvement and for the overall evaluation of the program. The overarching EPMP is due in Year 2. HD recipients can revise the plans in subsequent years to reflect changes in program priorities or lessons learned from the previous year's evaluation.
Recipient-Led Evaluation Reporting Deliverables
Per the 1815 cooperative agreement, all HD recipients are required to complete an annual evaluation reporting deliverable to update CDC on the progress of their recipient-led evaluations of Category A strategies.
Category A Annual Evaluation Report (Att. 5b): Each year, HD recipients will submit an annual evaluation report for their Category A strategies based on the findings from the previous year’s evaluation. DDT will provide HD recipients with technical assistance and guidance to support completion of the evaluation reports. The reports will be due 90 days after the end of each program year.
While participation in all data collection for national evaluation activities is voluntary, the National Evaluation Team will make every effort to maximize the rate of participation. The HD recipient and partner site-level interview guides are tailored specifically to each stakeholder and are designed to gather the most relevant information within the designated length of time per instrument. The National Evaluation Team will also engage with select HD recipients to gather their input on the interview tools and process, thereby building buy-in for the evaluation process and encouraging full participation.
For potential interview and survey participants, the National Evaluation Team will first send invitation e-mails describing the purpose and length of interviews and surveys, what types of questions will be asked, and how findings will be used. For interviews, once individuals agree to participate, the National Evaluation Team will follow up by sending confirmation and reminder e-mails in advance of interviews to ensure participation. For surveys, the National Evaluation Team will follow up with non-respondents by sending two follow-up e-mails during the survey period to encourage their participation.
For Category A site-level rapid evaluations, DDT will offer stipends to maximize participation of National DPP and DSMES sites and to compensate them for their time spent participating in interviews, completing the survey, and providing relevant partner site-level documentation to the National Evaluation Team. Partner sites that participate in survey data collection will receive a total stipend of $125 ($25 per respondent x an estimated 5 respondents per site) after completion of each round of site survey data collection (Years 3 and 5 of the cooperative agreement). Partner sites that participate in virtual site visits will receive an additional one-time payment of $500.
Completion of the Category A Evaluation and Performance Measurement Plan (Attachment 5a) and the Category A Recipient-led Evaluation Reporting Deliverables (Attachment 5b) is a requirement of the 1815 cooperative agreement. These documents are included in the package to gain approval for their use, as the data could be triangulated with other sources to inform the national evaluation. The DDT and DHDSP teams will provide ongoing technical assistance to HD recipients, including hosting topic-specific webinars, to support preparation and submission of each deliverable.
DDT and the National Evaluation Team are convening voluntary external advisory groups, or Evaluation Planning Groups (EPGs), composed of HD recipients, who will provide input and feedback on data collection tools. Each data collection tool will be reviewed by no more than 9 individuals.
DSMES/National DPP Partner Site-Level Rapid Evaluation: The National Evaluation Team gathered input on the rapid evaluation interview guides and survey questionnaires from the DDT Performance Improvement and Evaluation (PIE) Team and the DDT Program Implementation Branch (PIB). In addition, we will pre-test the data collection instruments with 5-8 HDs, 5-8 National DPP site representatives, and 5-8 DSMES site representatives. Feedback from all groups will be used to refine questions as needed, avoid duplicative areas, clarify question wording, ensure accurate programming and skip patterns, and establish the estimated time required to complete the data collection instruments.
Category A Evaluation and Performance Measurement Plan (EPMP): A total of eight HD recipients provided feedback to refine the Category A EPMP. The DDT PIE team will provide ongoing technical assistance to HD recipients in developing and implementing evaluation plans that will readily address the evaluation questions specified in the EPMP.
The individuals responsible for design and management of the 1815 National Evaluation data collection tools and processes include:
Name | Contact Info | Organization
Gia Rutledge | | CDC, Division of Diabetes Translation, Performance Improvement and Evaluation Team
Kimberly Farris | | CDC, Division of Diabetes Translation, Performance Improvement and Evaluation Team
Yvonne Mensa-Wilmot | | CDC, Division of Diabetes Translation, Performance Improvement and Evaluation Team
Kai Stewart | | CDC, Division of Diabetes Translation, Performance Improvement and Evaluation Team
Jenica Reed | | Deloitte Consulting, 1815/1817 National Evaluation Team
Meklit Hailemeskal | | Deloitte Consulting, 1815/1817 National Evaluation Team
Nicolle Dally | | Deloitte Consulting, 1815/1817 National Evaluation Team
Celeste Chung | | Deloitte Consulting, 1815/1817 National Evaluation Team
Nina Granow | | Deloitte Consulting, 1815/1817 National Evaluation Team