OMB: 0920-1068


WISEWOMAN National Program Evaluation: Implementation Assessment


Supporting Statement


Part A: Justification


May 27, 2015









Contact: Marla Vaughan

Division of Heart Disease and Stroke Prevention

Centers for Disease Control and Prevention

Atlanta, Georgia

Phone number: 770-488-4826

Email address: [email protected]





CONTENTS

A. Justification 1

1. Circumstances Making the Collection of Information Necessary 1

2. Purpose and Use of the Information Collection 6

3. Use of Improved Information Technology and Burden Reduction 9

4. Efforts to Identify Duplication and Use of Similar Information 10

5. Impact on Small Businesses or Other Small Entities 10

6. Consequences of Collecting the Information Less Frequently 11

7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5 12

8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency 12

9. Explanation of Any Payment or Gift to Respondents 13

10. Assurance of Confidentiality Provided to Respondents 13

10.1 Privacy Impact Assessment Information 14

11. Justification for Sensitive Questions 15

12. Estimates of Annualized Burden Hours and Costs 16

13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers 18

14. Annualized Cost to the Federal Government 18

15. Explanation for Program Changes or Adjustments 18

16. Plans for Tabulation and Publication and Project Time Schedule 18

17. Reason(s) Display of OMB Expiration Date is Inappropriate 25

18. Exceptions to Certification for Paperwork Reduction Act Submissions 25

References 26



Attachment A: Authorizing Legislation

A1: Breast and Cervical Cancer Mortality Prevention Act of 1990

A2: Public Health Service Act

Attachment B: Proposed Outcomes, Measures, and Data Sources

Attachment C: Program Survey Instrument and Supplementary Documents

C1: Program Survey

C2: Program Survey Invitation Email

C3: Program Survey Reminder Email

Attachment D: Network Survey Instrument and Supplementary Documents

D1: Network Survey

D2: Network Survey Invitation Email

D3: Network Survey Reminder Email

Attachment E: Site Visit Data Collection Instruments and Supplementary Documents

E1: Discussion Guide - Group Interview with Key Administrative Staff

E2: Discussion Guide - Group Interview with Healthy Behavior Support Staff

E3: Discussion Guide - Group Interview with Staff at Partner Clinical Providers

E4: Discussion Guide - Community Partners

Attachment F1: Federal Register Notice

Attachment F2: Summary of Public Comments



TABLES

A.1 Summary of data collection methods under this OMB request 6

A.2 Data collection efforts and evaluation component 8

A.3 Pre-test awardee contact information 12

A.4 Estimated annualized burden hours 17

A.5 Estimated annualized burden costs 17

A.6 Analytic approaches to answering evaluation questions 19

A.7 Illustrative table shell - Average baseline outcomes and average change in outcomes among WISEWOMAN participants 21

A.8 Illustrative table shell – Marginal effects of the WISEWOMAN program on outcomes (multivariate regressions results) 22

A.9. Proposed project timeline 25







FIGURES

A.1 WISEWOMAN logic model 5








A. Justification

1. Circumstances Making the Collection of Information Necessary

Overview

The Centers for Disease Control and Prevention (CDC) requests Office of Management and Budget (OMB) approval for three years to conduct a new information collection supporting the Well-Integrated Screening and Evaluation for Women Across the Nation (WISEWOMAN) program. WISEWOMAN is overseen by CDC’s National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP), Division for Heart Disease and Stroke Prevention (DHDSP). The proposed program implementation assessment is one component of a broader program monitoring and evaluation plan. Quantitative information about the services provided to individual WISEWOMAN clients, and about changes in outcomes of interest, is currently monitored through the Minimum Data Elements (MDE) information collection (OMB No. 0920-0612, exp. 12/31/2016). The proposed new information collection will complement that quantitative outcome data collection by gathering process data to improve understanding of how WISEWOMAN awardees implement the program. The new data collection captures the perspectives of program staff and key partners through surveys and individual and group discussions; it does not include any client-level data collection.

CDC will use the qualitative information about program implementation processes:

  1. To gain insights into key implementation components, and facilitators and challenges for program implementation, as well as contextual factors that may affect referrals and program operations.

  2. To identify opportunities for process improvement within each program.

  3. To identify best practices for program implementation that can be replicated and/or scaled up.

  4. To improve understanding of how processes and program components may contribute to or influence key outcomes at the program level. Analyses will be based on the qualitative, contextual, and process-oriented information to be collected through this assessment, in conjunction with program-level outcome metrics (i.e., aggregate analysis of each program’s MDE data).


The current assessment is designed to identify factors that influence program implementation and trends in outcomes on a broad scale. Findings may be used to inform the development of additional evaluation projects. The current assessment is not designed to assess attribution of program activities to client-level outcomes. Client-level outcomes are influenced by multiple factors that are outside the scope of the current WISEWOMAN program implementation assessment.


Background

Disease Burden

Cardiovascular disease (CVD), which includes heart disease, myocardial infarction, and stroke, is the leading cause of death for women in the United States. It is a primary contributor to mortality, morbidity, and decreased quality of life, especially among older women. Addressing risk factors such as high blood pressure, elevated blood cholesterol, obesity, sedentary lifestyle, diabetes, and smoking can reduce CVD-related illness and death. In particular, women with lower socioeconomic status and without health insurance are at increased risk of CVD-related morbidity and mortality: they have limited access to health services and have been shown to be more likely to engage in less healthy behaviors, including cigarette smoking, physical inactivity, and poor eating habits (Vaid et al. 2011).

WISEWOMAN Program

The WISEWOMAN program is authorized under a legislative supplement to the Breast and Cervical Cancer Mortality Prevention Act of 1990 (Public Law 101-354, see Attachment A.1). CDC’s authority to collect information from WISEWOMAN program awardees is established by Section 301 of the Public Health Service Act [42 U.S.C. 241] (Attachment A.2). According to federal guidelines, the National Breast and Cervical Cancer Early Detection Program (NBCCEDP) must establish an eligibility baseline to direct services to uninsured and underinsured women at or below 250% of the federal poverty level: ages 21–64 for cervical cancer screening and ages 40–64 for breast cancer screening. As stated in the NBCCEDP Program Guidance Manual, funds cannot be used to pay for any service for which payment has been made or can be made by a state compensation program, under an insurance policy, under a federal or state health benefits program, or by an entity that provides health services on a prepaid basis. Women who are determined to be eligible for breast and cervical cancer screening services through NBCCEDP programs are then referred to WISEWOMAN, which provides screening for additional preventive services. The WISEWOMAN program focuses on reducing cardiovascular disease risk factors among high-risk women by: 1) assuring that cardiovascular screening is provided to women ages 40–64 who are participants in the NBCCEDP; 2) working with community-based organizations to provide evidence-based prevention services to those women in need of them (through agreements with organizations such as the YMCA, Weight Watchers, and those that provide Diabetes Primary Prevention Programs); 3) improving the management and control of hypertension by integrating innovative health system-based approaches and strengthening community-clinical linkages (such as team-based care and pharmacy medication management programs); and 4) gathering and reporting program-related data, including performance measures.

To its participants, the program provides a unique combination of cardiovascular and chronic disease risk screening, healthy lifestyle support programs, and linkages to community resources. In 2013, the CDC released its fourth funding opportunity announcement (FOA) (DP13-1302) for the current WISEWOMAN program, which resulted in four-year cooperative agreements with 22 state, territorial, and tribal health departments, including 5 new and 17 continuing awardees from the previous FOA. Approximately two-thirds of program funding is provided by CDC with the other one-third supplied by states, territories, or tribal organizations.

Program Implementation Assessment

The information collection effort to support the assessment of WISEWOMAN is of interest to DHDSP as the federal entity charged with promoting and improving cardiovascular health in the United States. DHDSP will use the results of the WISEWOMAN implementation assessment to improve interventions for disadvantaged women at risk for cardiovascular disease, contributing to reductions in morbidity and mortality in the nation. In addition, assessment of the WISEWOMAN program’s implementation and processes is consistent with CDC’s need to meet its Government Performance and Results Act requirements. The new information collection includes a program survey, a network survey, and site visits.

Need for assessment of the WISEWOMAN program

Although the program goals have remained the same since the program’s inception, the environment in which it operates has changed in several ways, including recommendations on detection and treatment of cardiovascular disease (science), the introduction of the Patient Protection and Affordable Care Act (ACA) (policy), and public health approaches emphasizing a systems and community focus (environment). Thus, for the most recent cooperative agreement beginning in 2013, the program was modified to respond to shifts in populations, systems, and community needs. The long-term objectives of the program reflect these updates; the program retains the original elements of providing screening, promoting healthy lifestyle behaviors, and linking participants to community resources, but places greater emphasis on supporting clinical systems of care to improve access and on leveraging existing resources in the community. Lifestyle interventions have also been reframed to include lifestyle programs (LSPs) and health coaching (HC) sessions, and minimum data elements (MDEs) have been updated to capture information about risk reduction counseling and participants’ readiness to change. The current cooperative agreement also stresses monitoring and performance evaluation as key program dimensions. In addition, the program aims to align with the national Million Hearts initiative, which seeks to reduce heart attack and stroke through integration of the efforts of communities, health systems, federal agencies, and private-sector partners.

Recent scientific innovations, legislation, and program modifications necessitate an assessment of the program. Additionally, more information is needed to augment that from previous evaluation efforts. Since the program’s inception, it has shifted from conducting research to focusing primarily on practice (Vaid et al. 2011). Thus, the goals for the current iteration of the program include continued assessment and documentation of program performance and progress of the 22 state and tribal organization awardees, as well as an assessment of program effect and development of evidence to support and inform current practice related to implementation of specific curricula and models of interventions.

Information from a strong assessment will contribute to the program’s continued performance by shaping key programmatic decisions, identifying potentially successful implementation strategies, and strengthening the evidence base for the program model. Although the goal of this program implementation assessment is not to evaluate the impact of the ACA, the information that we propose to collect about program implementation will help us understand the current environment that may affect WISEWOMAN enrollment, operations, and ultimately, client-level outcomes. When reporting findings, we will also note which funded states are Medicaid expansion states and which are not.

Underlying the evaluation of WISEWOMAN is the program logic model (Figure A.1). This framework was used to identify the data that are required in addition to the MDEs, which are collected on an ongoing basis separately from these proposed information collection activities and capture individual-level information about participants’ outcomes, risks, service receipt, and demographics. The outcomes that will be described in this assessment are specified in Attachment B.

Figure A.1. WISEWOMAN logic model

The logic model traces the program’s activities in four domains through short-term, intermediate, and long-term outcomes to the intended public health impact. Its content is reproduced in text form below.

ACTIVITIES

Domain 1. Epidemiology and Surveillance

  1. Use existing surveillance data to identify cardiovascular disease (CVD) risk factors, morbidity and mortality, and needs of the population

  2. Develop and implement the minimum data elements (MDEs) system and other data collection activities

  3. Establish/network with existing data collection systems to collect and report program-related data

Domain 2. Environmental Approaches

  1. Complete biennial community scan for available resources and gaps

  2. Work with partners to increase access to resources/services that support healthy behaviors

Domain 3. Health Systems Interventions

  1. Provide cardiovascular risk screening

  2. Ensure provision of risk reduction counseling and referrals

  3. Engage in efforts to address uncontrolled hypertension

  4. Ensure referral to evidence-based lifestyle programs and/or other healthy behavior support options (e.g., health coaching)

  5. Engage with health systems/providers to improve clinical systems of care for blood pressure (BP) control

Domain 4. Community Clinical Linkages

  1. Partner or contract with community groups that provide evidence-based lifestyle programs

  2. Identify other community resources and refer as needed

  3. Partner with appropriate organizations to increase program impact

SHORT-TERM OUTCOMES

Epidemiology and Surveillance

  • Collection and use of high quality data and information for program improvement, reporting, and evaluation

  • Increased detection of CVD risk factors

Environmental Approaches

  • Results from environmental scans used to identify and improve community resources

Health Systems Interventions

  • Increased systems and practices in place that support improved CVD risk factors, particularly control of high blood pressure (HBP)

  • Individual changes: increased awareness of HBP and other CVD risk factors; improved medication adherence for HBP; improved lifestyle changes to reduce CVD risks, with a focus on HBP control; increased self-monitoring of BP; improved quality of life

Community Clinical Linkages

  • Increased referrals to Quit Line or smoking cessation programs

  • Increased utilization of community resources, including evidence-based lifestyle programs

INTERMEDIATE OUTCOMES

Epidemiology and Surveillance

  • Effective use of quality improvement and performance management (QI/PM) cycles using evaluation and data monitoring results

Environmental Approaches

  • Environmental changes in communities that result in more places for physical activity, increased access to healthy food, smoking cessation, and more smoke-free public places

Health Systems Interventions

  • Maintain continuity of relationships with systems and practices that support improved risk factors, particularly control of HBP

  • Maximize the number of eligible women that are provided quality screening, risk reduction counseling, and all follow-up services as appropriate

  • Individual changes: maintenance of lifestyle changes to improve CVD risk; maintained quality of life; improved hypertension control

Community Clinical Linkages

  • Maintain access to and utilization of community resources, including evidence-based lifestyle programs

LONG-TERM OUTCOMES

  • Improved prevention of hypertension

  • Improved hypertension control

  • Improved cholesterol control

  • Improved tobacco control / reduction in smoking

PUBLIC HEALTH IMPACT

  • Reduced prevalence of heart disease and stroke

  • Decreased morbidity and mortality due to heart disease and stroke



Data collection activities under this OMB request for the WISEWOMAN Implementation Assessment

To support uniform data collection and a multi-component design, three types of data collection activities will be implemented: a program survey, a network survey, and site visits (summarized in Table A.1). These data collection activities will complement that provided through the MDEs, which will serve as the primary source of outcomes data.

Table A.1. Summary of data collection methods under this OMB request

Data collection method: Program Survey
Data collected: Program implementation and aggregate outcomes data
Respondent type: WISEWOMAN administrative and service staff across all awardees
Administration: Editable PDF survey; self-administered
Rounds of data collection: 2 rounds, in Program Years 2 and 4

Data collection method: Network Survey
Data collected: Organizational-level data
Respondent type: WISEWOMAN awardees and their partners across all awardee communities
Administration: Web-based survey; self-administered
Rounds of data collection: 2 rounds, in Program Years 2 and 4

Data collection method: Site Visits
Data collected: Qualitative program implementation information
Respondent type: WISEWOMAN administrative staff, service staff, providers, and other partners in 6 selected awardee communities per year
Administration: In-person interviews conducted by national evaluator staff
Rounds of data collection: 3 rounds, in Program Years 2, 3, and 4

Information collected through these three activities, along with MDE data and other publicly available secondary data, will be used together to assess implementation and characterize improvements in cardiovascular health among disadvantaged women. The mixed-mode data collection approach will capture both quantitative measures of program activities, outputs, and outcomes and qualitative impressions of program implementation, lessons learned, and emerging, promising, and best practices. This approach will generate results useful to policymakers and practitioners, informing them about the implementation and value of WISEWOMAN as a multifaceted intervention to promote cardiovascular health.

2. Purpose and Use of the Information Collection

The purposes of the evaluation are aligned with WISEWOMAN program needs and objectives for accountability, programmatic decision making, and ongoing quality improvement. The evaluation of the WISEWOMAN program is focused around the following goals:

  • To gain insights into key implementation components, and facilitators and challenges for program implementation, as well as contextual factors that may affect referrals and program operations.

  • To identify opportunities for process improvement within each program.

  • To identify emerging, promising, and best practices for implementation, continued program improvement, replication, and dissemination.

  • To improve understanding of how processes and program components may contribute to or influence key outcomes at the program level. Analyses will be based on the qualitative, contextual, and process-oriented information to be collected through this assessment, in conjunction with program-level outcome metrics (i.e., aggregate analysis of each program’s MDE data).

  • To strengthen the evidence base for community-clinical interventions to support cardiovascular health.

To reach these goals, the evaluation will consist of four components: an environmental scan, a process evaluation, an outcomes evaluation, and a summative evaluation. The evaluation components will address key evaluation questions:

  1. Environmental scan - What systems and external factors are in place in communities that could help or hinder awardees in improving outcomes?

  2. Process evaluation - What are the emerging, promising, and best practices for program implementation? What are the challenges to program implementation?

  3. Outcome evaluation - What are the changes, if any, in WISEWOMAN outcomes?

  4. Summative evaluation - What contextual factors may influence the contribution of WISEWOMAN components and pathways to outcomes?

These evaluation questions are further specified in Attachment B, which links each evaluation question to: (1) one of the four NCCDPHP domains, (2) the purpose and use of the information to be collected, (3) outcomes for measurement, and (4) anticipated data sources. Table A.2 summarizes each data collection method and the evaluation components into which it will feed.

Table A.2. Data collection efforts and evaluation component

Data collection requested under this OMB package:

Program survey
Respondents: All WISEWOMAN awardees
Evaluation components: Process evaluation; summative evaluation

Network survey
Respondents: All WISEWOMAN awardees and 10 partners per awardee
Evaluation components: Process evaluation; summative evaluation

Site visits
Respondents: Awardee staff, providers, and partners from a subset of the 22 WISEWOMAN awardees (18 of 22)
Evaluation components: Process evaluation; summative evaluation

Data from existing data sources:

Minimum data elements
Respondents: All participants
Evaluation components: Environmental scan; process evaluation; outcomes evaluation; summative evaluation

Awardee applications and reports, capacity assessment calls
Respondents: All WISEWOMAN awardees
Evaluation components: Environmental scan; process evaluation; outcomes evaluation; summative evaluation

Community scan
Respondents: All WISEWOMAN communities
Evaluation components: Environmental scan; process evaluation; outcomes evaluation; summative evaluation

Secondary survey data sources
Respondents: Sample of secondary survey respondents (will vary by data source)
Evaluation components: Outcomes evaluation; summative evaluation

Below, we discuss the specific use of the information collected under each method.

  • The Program Survey (Attachment C1) is designed to provide high quality information about the implementation of the WISEWOMAN program across its specified activities. These data will be used for process and summative evaluations to provide variables related to program components and intervention models that may provide context to or influence outcomes. The information will be used to assess services offered and provided, intervention models used by projects, and program achievements.

  • The Network Survey (Attachment D1) is designed to collect information about implementation of the WISEWOMAN program as it relates to the health networks that support cardiovascular health within the community. Information from this survey will primarily be used in the summative evaluation to quantify the relationships between organizations and agencies within the 22 WISEWOMAN communities. Variables from the survey will also be used to describe aspects of program implementation related to partnerships and resources in the community.

  • Site Visits (Attachments E1, E2, E3, and E4) will include key informant interviews that will cover several aspects of program activities, including staffing, services provided, populations reached and served, partnerships, networks, and reflections on challenges and successes. Qualitative information from the site visits will be used mainly to assess program implementation and identify and describe emerging, promising, and best practices. In addition, qualitative information about the nuances of program implementation may provide context to quantitative outcomes.

3. Use of Improved Information Technology and Burden Reduction

Program Survey and Network Survey. The program and network surveys will comply with the Government Paperwork Elimination Act (Public Law 105-277, Title XVII) by employing technology efficiently in an effort to reduce burden on respondents. CDC will use electronic and web-based modes of data collection to obtain information from respondents. The self-administered format allows respondents to complete the survey at a day and time that is most convenient for them, with the option of completing the questionnaire over multiple sessions, as needed. The instruments solicit only information that corresponds to the specific research items discussed in Section A.2, above. No superfluous or unnecessary information is being requested of respondents. The instruments will be programmed after completion of the pre-test efforts. Screen shots of the network survey can be made available as attachments to the ICR worksheet in the near future.

  • The program survey for 22 respondents will use an editable PDF format, while the network survey of up to 242 respondents will use a web-based format. The editable PDF allows respondents to easily change their responses.

  • The web-based network survey, which will involve a larger number of respondents, further minimizes respondent burden by automating skip patterns and by dynamically integrating applicable text from previous responses where applicable. The web-based application will include automated range checks and branching and will enforce consistency among critical questions to optimize resources and facilitate collection of high quality data. The programming will allow the collection of information specific to each respondent by skipping respondents out of questions not pertinent to them, thereby eliminating undue time burden; a minimal illustrative sketch of this kind of skip and range-check logic appears below. The link and password needed to access the web survey will be provided in the survey notification and reminder mailings, sent via email and postal mail.
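The sketch below illustrates, in Python, the general shape of the range checks and skip patterns described above. The question IDs, valid ranges, and skip rules are hypothetical placeholders rather than items from the actual network survey instrument, and the production survey will be programmed in the contractor’s web survey software rather than as standalone code.

```python
# Minimal sketch of range-check and skip-pattern logic for a web survey.
# Question IDs, ranges, and skip rules below are hypothetical illustrations only.

def check_range(question_id, value, valid_range):
    """Flag out-of-range numeric responses before they are saved."""
    low, high = valid_range
    if not (low <= value <= high):
        raise ValueError(f"{question_id}: response {value} is outside {low}-{high}")
    return value

def next_question(current_id, response):
    """Return the next question ID, skipping items that do not apply."""
    # Hypothetical rule: a respondent reporting no formal partnership skips
    # the follow-up items about partnership activities.
    skip_rules = {
        ("Q10_partnership", "No"): "Q14_resources",  # skip Q11-Q13
    }
    return skip_rules.get((current_id, response), None)  # None = default ordering

# Example: validate a numeric item, then route a "No" at Q10 past the skipped items.
check_range("Q5_staff_count", 12, (0, 500))
print(next_question("Q10_partnership", "No"))  # -> "Q14_resources"
```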

WISEWOMAN Site Visits. As these are qualitative data collection efforts, CDC will not use information technology to collect information from the total of 126 persons contacted in the site visits (staff, providers, and partners, comprising seven key informant interviews at each of six WISEWOMAN awardees per year in Program Years 2-4). Because the data collection is qualitative in nature and requires information from a relatively small number of individuals, it is not appropriate, practical, or cost-beneficial to build electronic instruments to collect the information. All information will be collected orally in person using discussion guides, supported by digital recordings. Site visit transcripts will be analyzed using Atlas.ti, a software system used for the qualitative analysis of large amounts of data collected in text format.

4. Efforts to Identify Duplication and Use of Similar Information

The WISEWOMAN program currently supports collection of MDE data from awardees on screening and assessment, lifestyle program, and health coaching activities, outputs, cardiovascular risk, and outcomes (OMB No. 0920-0612). It has also supported data collection efforts under previous cooperative agreements (OMB No. 0920-0864). While we plan to use the MDE data collected, we require additional information to assess program implementation in relation to outcomes, including the systematic data on awardee implementation that would be provided through the program survey and network survey. In addition, no existing data collection activities gather the kind of in-depth qualitative information about implementation that will be collected through the site visits. The information we are requesting to collect, as described in this OMB package, is not available elsewhere. We will use program data collected through other mechanisms, such as grant applications, whenever possible to supplement the requested data. To the extent that they are available, we will use data from secondary sources to provide contextual community and program information over the period of the cooperative agreement. However, data from existing sources are not sufficient to evaluate the program. We describe the efforts to identify duplication and use of similar information for each data collection effort below.

Program Survey. CDC sought to avoid duplication of effort in the design of the form by adapting questions from previous program surveys, such as the National Healthy Start Program Surveys (OMB #0915-0287 and OMB #0915-0338). Some questions were modified, and new questions were created to reflect the differences in subject matter and program models. The source for each question is indicated on the instrument in Attachment C.

Network Survey. As with the program survey, the network survey used questions from previously developed instruments whenever possible. For questions related to networks and community, existing survey instrument items were used from the Community Voices for Coverage Leadership Team Follow-Up Survey; the Wilder Collaboration Factors Inventory; the Living Cities TII Grantee-Partner Network Survey; the Survey of the Health of Adults, the Population and the Environment; and the Social Capital Assessment Tool Household and Community Surveys. A few questions were developed specifically to reflect the needs for assessing networks of the WISEWOMAN program. The source for each question is indicated on the form in Attachment D.

Site Visit Data Collection Instruments. CDC specifically designed the site visit data collection instruments for the evaluation of the WISEWOMAN program.

5. Impact on Small Businesses or Other Small Entities

Program Survey and Network Survey. The program survey will be conducted with 22 WISEWOMAN awardees, and the network survey will be conducted in the 22 awardee communities with awardees and representatives from up to 10 partner organizations per community. The program and network surveys will occur during the second and fourth program years. The WISEWOMAN awardees are state, territorial, and tribal health departments, and some of the partner organizations are small, nonprofit organizations. We minimize burden by designing the instruments to include only the minimum number of questions needed for the evaluation. The network survey will be administered electronically and will allow respondents to stop and return to the survey as their schedules require. The network survey’s web-based application will include programmed skip patterns and branching so that respondents do not answer any unnecessary questions.

Site Visits. This component of the evaluation was designed to minimize the burden on key informants/participants. In each of Program Years 2, 3, and 4, a small burden will be placed on six WISEWOMAN awardees, a few of whose staff and partner organization representatives will be invited to participate in the site visit. The method for selecting the six awardees each year is described in Supporting Statement Part B. During the site visits, the key informant interviews will be conducted in person. Burden will be minimized by restricting the interviews to 45 to 75 minutes and conducting them at a time and location that is convenient for the key informant.

6. Consequences of Collecting the Information Less Frequently

Table A.1 in Section A.1 summarizes the new data collection efforts, including the frequency of information collection. Below, we discuss the consequences of collecting the information less frequently for each data collection activity.

Program and network surveys. To obtain as complete a picture as possible of WISEWOMAN implementation and contribution to systems (or networks) over time, awardees will be asked to complete the program and network surveys twice: during Program Years 2 and 4. Awardees will respond to the same questions in both rounds of the survey to capture changes in implementation and systems between the beginning and end of the cooperative agreement. The information collected in the first round will be used in the process evaluation to assess program implementation. Information from both survey periods will be used in the summative evaluation to measure variation in implementation and systems progress over the course of the cooperative agreement, which CDC can use to identify gaps in implementation and approaches to improve it. Changes over time in implementation and systems will also be linked to changes in outcomes to identify factors associated with better outcomes. The findings from these analyses can be used to identify the best and promising practices associated with better outcomes for purposes of replication and scale-up. Collecting information from all awardees at a single point in time (one round of the program and network surveys) would allow implementation and systems measured at that point to be linked to changes in outcomes. However, limiting the program and network surveys to a single round would preclude CDC from examining how implementation progressed over the cooperative agreement and would prevent linking changes in implementation to changes in outcomes.

Site visits. No awardee will be asked to participate in more than one site visit. Six awardees per year will participate in site visits in Program Years 2 through 4, resulting in visits to 18 awardees in total; the remaining four awardees will not host site visits at any point during the evaluation. In addition, each informant/participant will respond one time only under these three data collection efforts. There will be no additional qualitative information collections under this OMB request.

Data collection at the site level will enable us to observe program implementation directly and provide opportunities to interact with a wide variety of program staff and partners to understand the program context at a deeper level. Information collected through the site visits can be used to identify emerging, promising, and best practices in the process evaluation. In addition, this information can be used to describe awardees’ characteristics in the outcomes and summative evaluations, which can help identify promising and best practices.

7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This request fully complies with 5 CFR 1320.5. There are no special circumstances.

8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

The notice required by 5 CFR 1320.8(d) was published in the Federal Register on August 25, 2014 (Volume 79, Number 164, Pages 50653–50654; see Attachment F1 for a copy of the Federal Register Notice). One public comment was received and addressed (see Attachment F2).

In an effort to consult with experts both inside and outside of the Department of Health and Human Services, CDC Division for Heart Disease and Stroke Prevention and WISEWOMAN program staff reviewed the surveys and provided feedback on the electronic versions of the instruments during several conference calls. CDC pre-tested the program survey with three awardees and the network survey with six organizations across three awardees. All pre-tests were conducted using a paper version of the survey instruments. The results of the pre-test and recommendations for finalizing the instruments are presented in Supporting Statement Part B. The pre-test allowed us to validate the length of the instruments and confirm the anticipated public burden associated with participation in the surveys (discussed in Section A.12 below). The pre-test also allowed us to debrief with participants and collect information that helped to inform refinements and clarifications to the wording of items, as well as survey instructions, where needed. Responses collected during the pre-test were not and will not be analyzed. Contact information for the four awardees that participated in the pre-testing is provided in Table A.3. In addition, the instruments were revised based on the results of the pre-test and feedback from CDC staff.

Table A.3. Pre-test awardee contact information

SEARHC WISEWOMAN
Pamela Sloper
Program Manager/ Coordinator
Haines Health Center
PO Box 1549, Haines, AK 99827
907-766-6367
[email protected]

Nebraska WISEWOMAN
Cathy Dillion
Program Manager
301 Centennial Mall So.
P.O. Box 94817, Lincoln, NE 68509-4817
402-471-1806
[email protected]

West Virginia WISEWOMAN
Sheryn Carey
Program Coordinator
WV Department of Health and Human Resources, Office of Maternal, Child and Family Health
350 Capitol Street, Room 427
Charleston, WV 25301-0351
304-356-4345
[email protected]

Colorado WISEWOMAN
Flora Kulwa Martinez
WISEWOMAN and Chronic Disease Program Coordinator
Colorado Department of Public Health and Environment
4300 Cherry Creek Drive South
Denver, CO 80246
303-691-4919
[email protected]

9. Explanation of Any Payment or Gift to Respondents

Respondents to both surveys will not receive a monetary token of appreciation for their participation. Likewise, participants in the in-depth interviews conducted during the site visits will not be provided with a monetary token of appreciation; the information will be collected as part of awardees’ participation in the program and is essential for providing, targeting, and improving services for the women WISEWOMAN serves. Participation in the survey data collection and the site visit interviews is part of respondents’ professional positions as members of grantee organizations or their partners.

10. Assurance of Confidentiality Provided to Respondents

CDC is contracting with two organizations to collect and analyze the evaluation data: SRA International and Mathematica Policy Research. Staff from SRA International provide overall administrative oversight of the evaluation contract, and staff from Mathematica will play a leading role in all data collection activities, oversee the implementation of the WISEWOMAN program and network surveys, and conduct the site visit interviews. The data will be delivered at the end of the study to CDC, which will ultimately own the de-identified data. During data collection, the contractors will have access to personally identifiable information used to contact potential participants to invite them to participate in the evaluation and to conduct non-response follow-up. However, respondents will be assigned a study ID for use on data collection instruments, and all data files shared with CDC will be stripped of identifying information to maintain the privacy of those who participated in the evaluation. All data collected from the surveys and site visits will be treated in a secure manner and will not be disclosed unless otherwise compelled by law.

CDC and its contractors have embedded protections for privacy in the study design. The information collection will fully comply with all aspects of the Privacy Act. Individuals and agencies will be assured of the privacy of their replies under Section 934(c) of the Public Health Service Act, 42 USC 299c-3(c). All respondents will be told during the consent process that the data they provide will be treated in a secure manner to the extent allowed by law. They also will be informed that participation is voluntary, that they may refuse to answer any question, and can stop at any time without risk to their receipt of services. In addition, names of participants in any component of the evaluation will not be provided to the federal government. Instead, a unique ID will be assigned to each participant along with the identifying information for the relevant awardee.

IRB approval

In addition to specific security procedures for the various data collection activities, two approaches cut across the entire study. First, all contractor employees will sign a pledge to protect the privacy of data and respondent identity, and breaking that pledge is grounds for immediate dismissal and possible legal action. Second, the contractor provided the Institutional Review Board (IRB) with an overview of all of the data collection activities supporting the evaluation. The IRB determined that the proposed project does not involve research with human subjects, and that IRB approval is not required.

10.1 Privacy Impact Assessment Information

Participation in data collection efforts will be voluntary for all awardees, their staff, and their partners identified as potential respondents. As part of establishing communication for the remaining data collection efforts, potential participants will be sent information about the study and what is required for participation. The elements of consent will be explained in these communications (see Attachments C, D, and E). No personally identifiable data will be collected through these data collection methods; the requested information is at the aggregate or organizational level. The initial program and network survey data files will include personally identifiable information, such as the respondent’s name, email address, and organization name. However, these identifiers will be delinked and ultimately removed from the final dataset, as unique identifiers will be assigned to each case. Survey and site visit interview data will be stored by Mathematica on secure servers.

Below is an overview of the steps taken to ensure the privacy of respondents for each of the three data collection efforts under this request for OMB clearance, including the mode of data collection and targeted respondents; identifiable information to be collected; parties responsible for data collection, transmission, and storage; and parties with access to the data and uses of the data.

  • The Program Survey (Attachment C1) is designed for self-administration through an editable PDF. Program managers may delegate completion of sections of the survey to other WISEWOMAN staff, but only one survey will be submitted per awardee in each survey round. No individually identifiable information about the respondents will be collected; only the identifying information for the awardee agencies will be included with the survey submission. Mathematica will assist CDC in administering the survey and develop the editable PDF survey. Respondents will be instructed on how to transmit the survey back using encrypted and password protected emails and data will be keyed in and stored securely by Mathematica.

  • The Network Survey (Attachment D1) is designed for self-administration through a web-based application. The program manager for each of the 22 awardees and one representative from each of 10 WISEWOMAN partner organizations in each of the 22 WISEWOMAN awardee communities will be asked to complete the survey. Partners include organizations and agencies in the community ranging from state and local government agencies and health care providers to community-based organizations that provide healthy behavior supports. No individually identifiable information will be collected; only the identifying information for the awardee or partner organization will be included in the survey submission. Mathematica will assist CDC in administering the survey and will develop a web-based survey that operates through secure servers; data will be stored securely by Mathematica.

  • Site Visits (Attachments E1, E2, E3, and E4) will include key informant discussions/interviews with four types of informants: WISEWOMAN program directors and administrative staff, WISEWOMAN healthy behavior support staff, health care providers, and partner organization representatives. Mathematica staff will conduct the site visits. The interviews will be recorded and transcribed (only respondents’ first names and the awardee agencies’ identifying information will be collected); all information will be transmitted and stored securely on Mathematica servers. Site visit transcriptions will be coded and uploaded into a qualitative database by Mathematica using software such as Atlas.ti. Key themes will be developed based on the qualitative data analysis. Identified themes and quotes may be included in reports; specific quotes will not be attributed to any single person in any report.

Only approved members of the project team at SRA International and Mathematica will have access to the data collected through the three data collection efforts, for the purposes of analysis and reporting. Data from the program and network surveys will be compiled into a SAS dataset for analysis. Data will be analyzed and presented in reports in aggregate tables and figures. Because the number of potential respondents to the program survey is small (N=22), care will be taken in reporting findings to minimize the potential of identifying any single respondent in any reports or publications associated with the evaluation. Activities of specific awardee agencies may be mentioned; however, individual respondents will not be identified in any materials. At the end of data collection and analysis, Mathematica will securely transmit survey data to CDC, and the data will be permanently destroyed on the contractor’s servers. In addition, original recordings and transcriptions from the site visits will not be shared with CDC, to protect key informant privacy, though the database of codes for each observation may be shared with CDC. Recordings and transcripts will be destroyed at the end of the project.
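As an illustration of the de-identification step described above, the sketch below assigns a study ID and drops direct identifiers before a file is written out. It is written in Python with the pandas library, and the column names and values are hypothetical placeholders; the evaluation’s actual files will be prepared by the contractors under their data security procedures and compiled into SAS datasets, as noted above.

```python
import pandas as pd

# Hypothetical survey extract: column names and values are illustrative placeholders,
# not the real file layout or any actual respondent information.
raw = pd.DataFrame({
    "respondent_name": ["Respondent One", "Respondent Two"],
    "email": ["respondent-one@example.org", "respondent-two@example.org"],
    "awardee": ["Awardee A", "Awardee B"],
    "q1_partner_count": [8, 5],
})

# Assign a unique study ID to each case, then drop direct identifiers so the file
# delivered for analysis contains no personally identifiable information.
raw["study_id"] = [f"WW{n:04d}" for n in range(1, len(raw) + 1)]
crosswalk = raw[["study_id", "respondent_name", "email"]]      # retained by the contractor only
deidentified = raw.drop(columns=["respondent_name", "email"])  # delivered for analysis

deidentified.to_csv("network_survey_deidentified.csv", index=False)
```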

11. Justification for Sensitive Questions

The program and network surveys will not contain any sensitive items. Likewise, the in-depth interviews conducted during the site visits will not contain any sensitive items. Although the Privacy Act does not apply to organizations, CDC acknowledges that information collection pertaining to organizational policies, performance data, or other practices may be viewed as sensitive if disclosure of such information could result in liability or competitive disadvantage to the organization. No such ramifications will exist for WISEWOMAN awardees. The information they provide will focus on program operations, challenges, and impacts on the populations they serve. These data will be used to identify areas for program improvement broadly, with no negative consequences for any single awardee or awardee partner. This information will be communicated in writing during the survey introductions and is part of the consent form signed by all persons engaging in site visit interviews.

12. Estimates of Annualized Burden Hours and Costs

In this section, we provide detailed information about the anticipated burden and cost estimates for each component of data collection in the WISEWOMAN evaluation. Tables A.4 and A.5 provide a summary of the annual burden hours and costs.

Program survey (Attachment C1). The burden estimate for this data collection effort is 60 minutes per respondent per survey year. The survey instrument is preceded by a survey invitation (Attachment C2) and may be followed by reminder emails (Attachment C3). The survey instrument will be completed twice, once in Program Year 2 and once in Program Year 4; we annualize the burden across the three data collection years in Tables A.4 and A.5. We anticipate that the survey will be completed by the awardee program manager most closely involved with WISEWOMAN implementation activities. The cost burden is estimated using an hourly wage of $46.36, the BLS median hourly wage for managerial positions (general and operational) as of 2013. The burden estimate for the program survey was confirmed through pre-testing activities conducted with three awardee respondents.

Network survey (Attachment D1). The burden estimate for this data collection effort is 30 minutes per respondent. The survey will be completed twice, in Program Years 2 and 4; we annualize the burden across the three data collection years in Tables A.4 and A.5. In addition to the 22 awardee program managers, we anticipate that the survey will be completed by the lead administrator of up to 10 partner organizations identified by each awardee. The target partner respondent is the staff member most closely involved with WISEWOMAN implementation activities at the partner organization. The cost burden is estimated using an hourly wage of $43.72, the BLS median hourly wage for managerial positions in medical or health services organizations as of 2013. The burden estimate for the network survey was confirmed through pre-testing activities conducted with six awardee and partner organization respondents.

Site visits. The site visits will occur at six awardee programs per year in Program Years 2, 3, and 4. They will include interviews with four types of staff (burden estimates are per respondent): 1 program administrator (Attachment E1; 75 minutes), 2 healthy behavior support staff (Attachment E2; 45 minutes each), 2 medical providers (Attachment E3; 45 minutes each), and 2 partner organization staff (Attachment E4; 45 minutes each).

The cost burden for program administrator staff (awardee and partner) is estimated using an hourly wage of $46.36, the BLS median hourly wage for all managerial positions as of 2013. For the healthy behavior support staff, the cost burden is estimated using an hourly wage of $24.44, the BLS median wage for health care social workers as of 2013. For the medical providers participating in site visits, the cost burden is estimated using an hourly wage of $84.87, the BLS median hourly wage for family and general practitioners as of 2013.

No pre-testing is planned for the site visit interview guides. During the development and implementation process, careful attention will be paid to the amount of content covered within the time allocated. Staff conducting the interviews will reduce the number of items covered, as needed, during the course of the interview to adhere to the burden estimates described above.

Table A.4. Estimated annualized burden hours

WISEWOMAN Awardee Administrators
Program Survey: 15 respondents; 1 response per respondent; 1 hour average burden per response; 15 total burden hours
Network Survey: 15 respondents; 1 response per respondent; 30/60 hour average burden per response; 8 total burden hours
Site Visit Discussion Guide: 6 respondents; 1 response per respondent; 75/60 hour average burden per response; 8 total burden hours

Awardee Partners
Network Survey: 147 respondents; 1 response per respondent; 30/60 hour average burden per response; 74 total burden hours
Site Visit Discussion Guide: 12 respondents; 1 response per respondent; 45/60 hour average burden per response; 9 total burden hours

Healthy Behavior Support Staff
Site Visit Discussion Guide: 12 respondents; 1 response per respondent; 45/60 hour average burden per response; 9 total burden hours

Clinical Providers
Site Visit Discussion Guide: 12 respondents; 1 response per respondent; 45/60 hour average burden per response; 9 total burden hours

Total: 132 burden hours



For the awardee partners, Healthy Behavior support staff, and clinical providers, we estimate that approximately 60% of respondents will be from the state/local/tribal government sector, and 40% of respondents will be from the private sector.



The total estimated annualized cost to respondents is $6,049.

Table A.5. Estimated annualized burden costs

WISEWOMAN Awardee Administrators
Program Survey: 15 respondents; 15 total burden hours; $46.36 hourly wage rate; $695 total cost
Network Survey: 15 respondents; 8 total burden hours; $46.36 hourly wage rate; $371 total cost
Site Visit Discussion Guide: 6 respondents; 8 total burden hours; $46.36 hourly wage rate; $371 total cost

Awardee Partners
Network Survey: 147 respondents; 74 total burden hours; $43.72 hourly wage rate; $3,235 total cost
Site Visit Discussion Guide: 12 respondents; 9 total burden hours; $43.72 hourly wage rate; $393 total cost

Healthy Behavior Support Staff
Site Visit Discussion Guide: 12 respondents; 9 total burden hours; $24.44 hourly wage rate; $220 total cost

Clinical Providers
Site Visit Discussion Guide: 12 respondents; 9 total burden hours; $84.87 hourly wage rate; $764 total cost

Total: $6,049
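The figures in Tables A.4 and A.5 follow directly from the number of respondents, responses per respondent, hours per response, and hourly wage rates shown above. The short Python sketch below simply reproduces that arithmetic as a check on the totals; it uses only the values already reported in the tables.

```python
# Reproduce the annualized burden and cost estimates in Tables A.4 and A.5.
# Each entry: (respondents, responses per respondent, hours per response, hourly wage).
rows = {
    "Awardee administrators - program survey":      (15, 1, 60/60, 46.36),
    "Awardee administrators - network survey":      (15, 1, 30/60, 46.36),
    "Awardee administrators - site visit":          (6,  1, 75/60, 46.36),
    "Awardee partners - network survey":            (147, 1, 30/60, 43.72),
    "Awardee partners - site visit":                (12, 1, 45/60, 43.72),
    "Healthy behavior support staff - site visit":  (12, 1, 45/60, 24.44),
    "Clinical providers - site visit":              (12, 1, 45/60, 84.87),
}

# The surveys are fielded twice over the three-year clearance, so 22 awardees x 2 rounds
# / 3 years rounds to 15 annualized awardee respondents, and 220 partner respondents
# (10 partners x 22 communities) x 2 rounds / 3 years rounds to 147.
print(round(22 * 2 / 3), round(220 * 2 / 3))  # -> 15 147

total_hours = total_cost = 0
for label, (n, responses, hrs, wage) in rows.items():
    burden_hours = round(n * responses * hrs)   # rounded as in Table A.4
    cost = round(burden_hours * wage)           # rounded as in Table A.5
    total_hours += burden_hours
    total_cost += cost
    print(f"{label}: {burden_hours} hours, ${cost:,}")

print(f"Total: {total_hours} hours, ${total_cost:,}")  # -> 132 hours, $6,049
```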

13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

There are no capital or start-up costs to respondents associated with this data collection.

14. Annualized Cost to the Federal Government

The evaluation will take place over a 3-year period. The total cost of the evaluation to the government is $549,308.30, which includes the amount awarded via contract to SRA International and Mathematica Policy Research ($450,000) and CDC staff time and resources ($99,308.30). The total evaluation cost was based on the budget developed by SRA International and Mathematica, incorporating wages and hours for all staff, all web survey costs, mailing costs, telephone charges, travel, and other overhead costs per contract year, along with the government staff costs. The annualized contract cost is $150,000 per year, determined by dividing the total funded contract amount by three years.

15. Explanation for Program Changes or Adjustments

This is a new information collection.

16. Plans for Tabulation and Publication and Project Time Schedule

Analysis plan

The overarching design is a mixed-methods approach that will provide a comprehensive assessment of the WISEWOMAN program. Each component of the design builds on the previous components and informs the subsequent components. The evaluation also considers the multiple levels at which the program operates to improve outcomes (participant, awardee, and community levels) and the increased program emphasis on systems.

Each proposed evaluation component and its corresponding analytic approaches are intended to answer one of the four evaluation questions, and each proposed information collection activity supports one or more of the four evaluation components. Table A.6 lists the four evaluation questions linked to the evaluation components, data collection activities, and analytic approaches that will provide the evidence to help answer them. The outputs and outcomes described are those shown in the logic model (Figure A.1 in Section A.1); potential measures of the outcomes are shown in Attachment B.

Table A.6. Analytic approaches to answering evaluation questions

Evaluation question 1 (Environmental scan): What systems and external factors are in place in communities that could help or hinder awardees in improving outcomes?

Data Source(s)a: Existing data sources (potentially supplemented by new information collection)

Analytic Approaches: Abstraction of qualitative and quantitative information from secondary data sources about the community-, systems-, and policy-level factors, and changes in these factors, that could hinder or facilitate program implementation and outcomes.

Information from the scan will be organized and analyzed across sources to provide an overview of the key policy, system, or community factors; their implementation; and the factors’ potential influences on cardiovascular health and WISEWOMAN activities.

Quantitative data will be assessed for baseline levels of and changes in demographics, health status, and risk over time to understand potential shifts that might contribute to or influence awardee implementation and observed outcomes.

Evaluation question 2 (Process evaluation): What are the emerging, promising, and best practices for program implementation? What are the challenges to program implementation?

Data Source(s)a: Program survey, network survey, site visits, and existing data sources

Analytic Approaches: Qualitative assessment to examine the processes and procedures awardees use to recruit and enroll participants; conduct health risk assessments; provide services; link to community resources; and track participation, service receipt, and outcomes.

The qualitative assessment relies on coding of documents and notes from site visits and focus groups (as well as various existing data sources). The purpose of the coding is to triangulate on key themes within the qualitative data collected and to organize the data in a manner that permits comparisons across sources.

Quantitative assessment to describe program participation and receipt of services and referrals.

The quantitative descriptive assessment includes the development of metrics (primarily from the program and network surveys) to evaluate implementation and performance, such as progress toward program performance goals, establishment of service delivery networks, and enumeration of services provided by participants’ characteristics and risk level (an illustrative tabulation sketch follows this table).

Both assessments can be used to examine differences across stages of implementation to assess facilitators of and barriers to implementation.

Evaluation question 3 (Outcome evaluation): What are the changes, if any, in WISEWOMAN outcomes?

Data Source(s)a: Existing MDE data

Analytic Approaches: Longitudinal analysis of changes in outcomes among WISEWOMAN participants over time: descriptive and bivariate analyses and an analysis of disparities. The primary data source for the outcomes is the MDEs collected for all participants over time by CDC.

The descriptive analysis includes summaries of the mean values for outcomes at the first available time period and the mean values for changes in outcomes at each subsequent time period (see Table A.7 for an example of how outcome results can be presented; an illustrative computation sketch follows Table A.7).

One approach to the disparities analysis is to examine how outcomes changed for subgroups within the WISEWOMAN population over time (for example, by race or income). If there is not enough variation in race and income, an alternate approach is to assess whether WISEWOMAN participants’ outcomes have moved closer to national benchmarks over time (using available secondary data sources). Statistical significance will be assessed with t-tests for comparisons of means of binary and continuous variables and with chi-square tests for comparisons of distributions of categorical variables. The analysis will incorporate sample weights for any secondary data sources.

Evaluation question 4 (Summative evaluation): What contextual factors may influence the contribution of WISEWOMAN components and pathways to outcomes?

Data Source(s)a: MDEs, other existing data sources, program survey, network survey, site visits

Analytic Approaches: Combines and synthesizes the information collected through, and the findings of, the environmental scan, process evaluation, and outcomes evaluation.

The objective is to identify community, awardee, network, and participant characteristics that provide context for the WISEWOMAN program and may influence outcomes. The process evaluation provides contextual information, themes, and focus areas for the multivariate analysis. The multivariate analysis adds key participant-, awardee-, and community-level variables to the bivariate analysis of changes in outcomes over time. The explanatory variables of interest will be taken from the MDEs and may be enhanced with elements from the site visits and program survey. Methods include ordinary least squares and logistic regression frameworks with an indicator variable for the time period to capture the change over time (see Table A.8 for an example of how results can be presented; an illustrative regression sketch follows Table A.8).

While the analysis will not demonstrate attribution of outcomes or changes in outcomes to the WISEWOMAN program or its components, because other factors may also contribute, we will analyze the contextual factors and highlight elements that may influence outcomes.


a In addition to new data collection, existing data sources are available for the evaluation, including the MDEs, awardee applications, annual performance reports, capacity assessment calls, a community scan, and existing survey data with outcomes similar to those collected through the MDEs.
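To illustrate the type of service-enumeration metric described for the process evaluation (Evaluation question 2), the following is a minimal, purely illustrative sketch and is not part of the approved information collection. It assumes an MDE-style participant-level extract saved as a CSV file with hypothetical column names (race_ethnicity, risk_level, received_lifestyle_program); the actual WISEWOMAN data elements and file formats may differ.

    import pandas as pd

    # Hypothetical MDE-style participant-level extract (column names are illustrative only).
    mde = pd.read_csv("mde_extract.csv")

    # Counts of participants receiving a service, by characteristics and risk level.
    services_by_group = pd.crosstab(
        index=[mde["race_ethnicity"], mde["risk_level"]],
        columns=mde["received_lifestyle_program"],
        margins=True,  # add row and column totals
    )

    # The same tabulation expressed as within-group percentages for reporting.
    services_pct = pd.crosstab(
        index=[mde["race_ethnicity"], mde["risk_level"]],
        columns=mde["received_lifestyle_program"],
        normalize="index",
    ) * 100

    print(services_by_group)
    print(services_pct.round(1))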


Table A.7. Illustrative table shell – Longitudinal analysis of average baseline outcomes and average change in outcomes among WISEWOMAN participants



Outcomes domainsa | Average baseline level | Average change (program year 2 – baseline) | Average change (program year 3 – program year 2) | Average change (program year 4 – program year 3)
Risk reduction counseling | | | |
Readiness to change | | | |
Hypertension/blood pressure control | | | |
Cholesterol | | | |
Diabetes | | | |
Medication adherence | | | |
Cardiovascular risk factors | | | |
Diet | | | |
Exercise | | | |
Tobacco use | | | |
BMI | | | |
Quality of Life | | | |
Alert values | | | |
Referrals | | | |
Completed referrals | | | |

a For a detailed list of the outcomes and measures to be examined in the outcomes evaluation, see Section III of Table B.1 (Attachment B).

BMI = body mass index
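As a purely illustrative sketch of how the cells of Table A.7 could be populated and tested, the following assumes a long-format MDE extract with hypothetical columns (participant_id, program_year, systolic_bp, smoker) and assumes the baseline is program year 1; it is a sketch under those assumptions, not the evaluation's specified implementation.

    import pandas as pd
    from scipy import stats

    # Hypothetical long-format MDE extract: one row per participant per program year.
    mde = pd.read_csv("mde_longitudinal.csv")

    # Average baseline level (baseline assumed here to be program year 1).
    baseline = mde[mde["program_year"] == 1]
    print("Average baseline systolic blood pressure:", round(baseline["systolic_bp"].mean(), 1))

    # Average change between consecutive program years, using participants observed in both,
    # with a paired t-test for the statistical significance of the change in means.
    wide = mde.pivot(index="participant_id", columns="program_year", values="systolic_bp")
    for prev_year, next_year in [(1, 2), (2, 3), (3, 4)]:
        paired = wide[[prev_year, next_year]].dropna()
        change = paired[next_year] - paired[prev_year]
        t_stat, p_value = stats.ttest_rel(paired[next_year], paired[prev_year])
        print(f"Average change (year {next_year} - year {prev_year}): {change.mean():.2f} (p = {p_value:.3f})")

    # Chi-square test comparing the distribution of a categorical outcome (smoking status)
    # between baseline (year 1) and program year 2.
    years_1_2 = mde[mde["program_year"].isin([1, 2])]
    smoking_table = pd.crosstab(years_1_2["program_year"], years_1_2["smoker"])
    chi2, p_value, dof, expected = stats.chi2_contingency(smoking_table)
    print("Chi-square p-value for change in smoking distribution:", round(p_value, 3))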


Table A.8. Illustrative table shell – Association between WISEWOMAN program components and outcomes (multivariate regression results)



Outcomes domainsa | Coefficientb (Marginal Effect) | Standard Error
Risk reduction counseling | |
    Program trait 1 | |
    Program trait 2 | |
Readiness to change | |
    Program trait 1 | |
    Program trait 2 | |
Hypertension/blood pressure control | |
    Program trait 1 | |
    Program trait 2 | |
Cholesterol | |
    Program trait 1 | |
    Program trait 2 | |
Diabetes | |
    Program trait 1 | |
    Program trait 2 | |



a For a detailed list of the outcomes and measures to be examined in the outcomes evaluation, see Table B.1 (Attachment B).

b In the longitudinal analysis, the coefficient is the estimate of the change in outcomes from one period to the next. This can be estimated between any two periods for which data are available and between multiple time periods in a single model (for example, between baseline and Program Year 1 and Program Years 1 and 2).

BMI = body mass index

* 10% significance level.

** 5% significance level.

*** 1% significance level.
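As a purely illustrative sketch of the regression frameworks described for the summative evaluation (ordinary least squares and logistic regression with an indicator variable for the time period), the following uses hypothetical variable names (systolic_bp, bp_controlled, post, trait_coaching_intensity, uninsured_rate, awardee_id) in an assumed participant-by-period analysis file; the evaluation's actual model specifications, covariates, and estimation choices may differ.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical analysis file combining MDE outcomes with awardee- and community-level
    # context variables (names are illustrative only).
    df = pd.read_csv("analysis_file.csv")
    # Indicator for time periods after the baseline year (baseline assumed to be program year 1).
    df["post"] = (df["program_year"] > 1).astype(int)

    # Ordinary least squares for a continuous outcome: the coefficient on `post` estimates the
    # average change in the outcome between periods; program-trait and community covariates
    # provide context rather than causal attribution. Standard errors are clustered by awardee.
    ols_results = smf.ols(
        "systolic_bp ~ post + trait_coaching_intensity + uninsured_rate",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["awardee_id"]})
    print(ols_results.summary())

    # Logistic regression for a binary outcome (for example, blood pressure controlled yes/no);
    # average marginal effects put the estimates on the probability scale, as in Table A.8.
    logit_results = smf.logit(
        "bp_controlled ~ post + trait_coaching_intensity + uninsured_rate",
        data=df,
    ).fit()
    print(logit_results.get_margeff().summary())

Clustering the standard errors by awardee is shown here only because participants are nested within awardees; the evaluation may handle clustering, weighting, and covariate selection differently.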

Reports

Results from the evaluations will be summarized in three reports—one for each of the process, outcomes, and summative evaluations. Each report plays an important role, and taken together the findings provide the most complete picture of the performance of the WISEWOMAN program. In addition to the annual evaluation reports, opportunities will be identified to present preliminary findings throughout the evaluation period (for example, sharing results tables during calls with awardees or briefings to CDC staff). These preliminary products will not present findings beyond those ultimately included in the three evaluation reports; they are chiefly an opportunity to share findings before the full reports are released. Additional publications may include peer-reviewed journal articles and issue briefs to disseminate results to the broader community of policymakers and practitioners involved in the prevention and study of cardiovascular disease.

All three reports will include a description of the relevant evaluation methodology, data collection instruments, and data analysis procedures; a summary of results from the quantitative and qualitative analyses; and conclusions on program performance, highlighting implications for program planning. The reports will be tailored to stakeholder needs, recognizing that they may be used for a variety of purposes. A brief summary of the timing and content of each evaluation product follows:

  • Process evaluation report. The report will provide a detailed description of and findings from the process evaluation conducted in Program Year 2, including baseline information gathered in the environmental scan. The report will synthesize the information collected in the first rounds of the program survey, network survey, and the initial site visits conducted in Program Year 2 regarding how the WISEWOMAN program is being implemented (with a particular focus on facilitators and barriers to successful implementation).

  • Outcomes evaluation report. The second report will detail the results from the outcomes evaluation conducted in Program Year 3. The report will focus on estimating changes in outcomes among WISEWOMAN participants (measured using the MDEs). Environmental scan and site visit data will provide additional context for outcomes through in-depth case studies of a subset of awardees.

  • Summative evaluation report. The third report will detail the findings from the summative evaluation conducted in Program Year 4. The report will synthesize the findings from the environmental scan, process evaluation, and outcomes evaluation, along with additional information from Program Year 4: another year of MDE data and the final rounds of the program survey, network survey, and site visits. The report will provide the most comprehensive picture of how outcomes have changed over the cooperative agreement period and of the community, awardee, and participant factors (including changes in these factors) that may provide a contextual understanding of those changes.

  • Issue briefs and webinars. Shorter documents such as issue briefs may be developed, or webinars may be conducted for the broader community about topics of interest, highlights from evaluation results, and best practices.

Analysis plan for pre-test

CDC pre-tested the program survey with three awardees and the network survey with six awardee/partner organizations across three awardees. All pre-tests were conducted using a paper version of the instruments. The results of the pre-test and recommendations for finalizing the instruments are presented in Supporting Statement Part B. The pre-test allowed us to validate the length of the instruments and confirm the anticipated public burden associated with participation in the surveys (Tables A.4 and A.5). It also allowed us to debrief participants and collect feedback that informed refinements and clarifications to item wording and survey instructions where needed. The instruments were revised based on the results of the pre-test and feedback from CDC staff.

Timeline

The evaluation timeline reflects the need for evidence throughout the five-year project period and schedules data collection so that information is gathered at appropriate points in time to support the analyses under each of the four complementary evaluation components. The estimated schedule for key data collection, analysis, and reporting tasks relevant to this request for OMB approval is presented in Table A.9. The timeline is organized by program year (Program Years 2 through 4).

Key milestones after Program Year 1 are listed in relation to the estimated date of OMB clearance (the beginning of Program Year 2). New data collection begins in Program Year 2 with the first rounds of the program and network surveys, and six site visits will be conducted in each year beginning in Program Year 2. In Program Year 4, the final rounds of the program and network surveys and the final site visits will be conducted. The maximum of three years of clearance is requested, with the expectation that data collection will commence at the beginning of Calendar Year 2015 and close early in Calendar Year 2017.

In each of Program Years 2 through 4, the evaluation design component for that year will be refined, including prioritizing the evaluation questions and further specifying the design approach; this process will result in an updated evaluation plan for the year. Analysis and development of the evaluation report will then be conducted after that year's data collection is complete.



Table A.9. Proposed project timeline

Activity | Anticipated timeline
Program Year 2: Process evaluation |
  Data collection |
    Develop data collection systems | August 2014 to 1 month after OMB approval
    Field program survey | 3-4 months after OMB approval
    Field network survey | 3-5 months after OMB approval
    Conduct site visits | 2-5 months after OMB approval
  Develop and submit evaluation plan and report |
    Updated evaluation plan | 1 month after OMB approval
    Analyze and synthesize data | 6-9 months after OMB approval
    Final evaluation report | 10 months after OMB approval
Program Year 3: Outcomes evaluation |
  Data collection |
    Conduct site visits | 14-17 months after OMB approval
  Develop and submit evaluation plan and report |
    Updated evaluation plan | 12 months after OMB approval
    Analyze and synthesize data | 13-21 months after OMB approval
    Final evaluation report | 22 months after OMB approval
Program Year 4: Summative evaluation |
  Data collection |
    Field program survey | 27-28 months after OMB approval
    Field network survey | 27-29 months after OMB approval
    Conduct site visits | 26-29 months after OMB approval
  Develop and submit evaluation plan and report |
    Updated evaluation plan | 24 months after OMB approval
    Analyze and synthesize data | 28-33 months after OMB approval
    Final evaluation report | 34 months after OMB approval

17. Reason(s) Display of OMB Expiration Date is Inappropriate

The display of the OMB expiration date is not inappropriate; the expiration date will be displayed. To continue data collection in the last two years of the grant, a reapplication for OMB clearance will be submitted.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.

References

Vaid, Isam, Charles Wigington, Deborah Borbely, Patricia Ferry, and Diane Manheim. “WISEWOMAN: Addressing the Needs of Women at High Risk for Cardiovascular Disease.” Journal of Women's Health, vol. 20, no. 7, July 2011, pp. 977-982.

1 Best practices in this case are those shown to be effective across organizations based on research. In contrast, emerging and promising practices are those shown to be effective in a particular situation or under a specific circumstance and that hold promise for adoption by other organizations.

2 Systems refer to collections of organizations working in the community to improve cardiovascular health, together with their activities and interactions with one another to achieve these improvements. Networks, which are collections of the partnerships formed by awardees and other organizations, are prominent features of systems.

3 Components refer to the activities conducted by WISEWOMAN awardees, and pathways are the ways in which the components or activities are translated into better outcomes. For example, health coaching sessions are an activity, and improved health knowledge and behaviors would be a pathway from health coaching to better outcomes, such as lower cardiovascular risk.

4 Evaluation questions will be refined and prioritized as the evaluation design progresses. CDC will consider WISEWOMAN priorities for information and the feasibility of answering each question given the time and budget of the project in the final determination of questions to include in the evaluation each year.

5 Source: BLS Website, as of May 30, 2014. [http://www.bls.gov/oes/current/oes111021.htm]

6 Source: BLS Website, as of May 30, 2014. [http://www.bls.gov/oes/current/oes119111.htm]

7 Source: BLS Website, as of May 30, 2014. [http://www.bls.gov/oes/current/oes111021.htm]

8 Source: BLS Website, as of May 30, 2014. [http://www.bls.gov/oes/current/oes211022.htm]

9 Source: BLS Website, as of May 30, 2014. [http://www.bls.gov/oes/current/oes291062.htm]
