
Evaluating the Implementation and Outcomes of Policy and Environmental Cancer Control Interventions


New


Supporting Statement – Part A




January 16, 2014












Primary Contact:


Angela Moore

Technical Monitor

National Center on Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention (CDC)

4770 Buford Highway NE, MS K-55

Atlanta, GA 30341-3724


Telephone: (770) 488-3094

Fax: (770) 488-4335

Email: [email protected]






List of Attachments

Attachment A: Authorizing Legislation

Attachment B.1: Federal Register Notice

Attachment B.2: Summary of Public Comments

Attachment C.1a: Program Director Web Survey Questionnaire (.DOC version)

Attachment C.1b: Program Director Web Survey Questionnaire (screen shots)

Attachment C.2: Program Director Advance Letter

Attachment C.3: Program Director Advance E-Mail

Attachment C.4: Program Director Reminder E-Mail

Attachment C.5: Program Director Thank You E-Mail

Attachment D.1: Key Informant Interview Guide

Attachment D.2: Initial Site Contact E-Mail

Attachment D.3: Key Informant Identification and Selection

Attachment D.4: Key Informant Recruitment and Scheduling

Attachment D.5: Informed Consent for Key Informant Interview

Attachment E.1: TA Providers Focus Group Guide

Attachment E.2: TA Providers Focus Group Invitation

Attachment F.1: Coalition Survey

Attachment F.2: Coalition Survey Advance Email

Attachment F.3: Coalition Survey Recruitment Email

Attachment F.4: Coalition Survey Reminder Email

Attachment F.5: Coalition Survey Thank-you Email

Attachment G: Battelle IRB Approval Letter


Supporting Statement Part A. Justification

Overview

CDC plans to collect information about the experiences and outcomes of state-based comprehensive cancer control (CCC) programs that are implementing policy, system, and environmental (PSE) change strategies to address cancer control and prevention efforts. Program experiences will be compared for 65 CCC programs that are funded through the cooperative agreement that supports the National Comprehensive Cancer Control Program (NCCCP) and for a subset of 13 CCC programs that receive additional funding through a cooperative agreement to enhance their PSE change strategies. The study design includes a survey of all 65 funded CCC programs, administered at two points in time; a longitudinal case study of 6 of the 13 programs with enhanced PSE change strategies; focus groups with staff who provide technical assistance; and a one-time survey of coalition members and strategic partners who are collaborating in the development and implementation of enhanced PSE change strategies. Results will be used to improve program guidance provided to CCC programs and direct future investments in the program.

A.1. Circumstances of Information Collection

Background

This is a new Information Collection Request authorized by Section 301, "Research and Investigation," of the Public Health Service Act (42 U.S.C. 241; see Attachment A). OMB approval is requested for three years.


Through the National Comprehensive Cancer Control Program (NCCCP), CDC supports cancer prevention and control programs to reduce cancer-related morbidity, mortality, and health disparities. NCCCP awardees include 65 state, territorial, and tribal organizations whose current cooperative agreements are authorized under Funding Opportunity Announcement (FOA) DP12-1205, “Cancer Prevention and Control Program for State, Territorial, and Tribal Organizations.” The current cooperative agreements maintain core comprehensive cancer control (CCC) activities and build on policy, system, and environmental (PSE) change strategies that many NCCCP programs have begun to incorporate into their program plans and initiatives.


In 2010, 13 of the 65 NCCCP awardees received additional cooperative agreement funding to increase their focus on PSE change strategies and accelerate their implementation efforts. The additional funding (provided under FOA DP10-1017) aims to increase each program’s capacity to inform policy and system change strategies and to increase collaboration with both traditional and nontraditional partners. The 13 funded pilot programs (hereafter referred to as 1017 programs) include: Cherokee Nation, Colorado, Florida, Indiana, Kentucky, Louisiana, Massachusetts, Michigan, Minnesota, New York, Oregon, Utah, and Wisconsin. Each awardee has convened a task force or workgroup to facilitate and coordinate work involving multiple public and private sector stakeholders. With additional resources and structure, CDC hopes that 1017 programs will achieve greater health impact through increased skills and capacity, enhanced interactions with partners, and greater focus on expanding skills to inform evidence-based PSE changes. In alignment with Frieden’s pyramid that places a focus on activities with the potential for high impact relative to resource expenditure (Frieden, 2010), the goal of the 1017 pilot is to examine what a modest investment and more structure can yield, building on the successes that programs have already enjoyed.


CDC is committed to conducting utilization-focused evaluations for cancer control and prevention programs that are supported through cooperative agreements. The NCCCP evaluation, Development of an Evaluation Plan to Evaluate Grantee Attainment of Selected Activities of Comprehensive Cancer Priorities (OMB No. 0920-0971, exp. 5/31/2015), evaluates the extent to which grantees are implementing the following NCCCP priorities: 1) emphasize primary prevention of cancer; 2) support early detection and treatment activities; 3) address public health needs of cancer survivors; 4) implement policies, systems, and environmental change strategies to guide sustainable cancer control; 5) promote health equity as it relates to cancer control; and 6) demonstrate outcomes through evaluation. In order to assess whether the 1017 pilot is meeting its goals and to inform CDC’s future support of this program and similar efforts, CDC is conducting a mixed-method evaluation of the 1017 program. The purpose of this evaluation is to understand (1) the processes by which these grantees develop and implement policy, system, and environmental changes; (2) the short-term program results and outcomes achieved by the grantees during the 1017 funding period; and (3) the potential intermediate and long-term impacts of those changes on cancer burden. While data are available through existing program monitoring systems, these systems alone are insufficient to support a thorough evaluation. New information collection is proposed to supplement and complement the data collections that are already in place. The data collection activities included in this request are essential for a comprehensive mixed-method evaluation of the 1017 program that will answer questions about how 1017 programs are implementing their programs, the role of partners in program implementation, the barriers and facilitators to program success, and the capacity of programs to achieve desired results. The 1017 evaluation is designed to use multiple data sources collected at different time points across the program period to capture information about those processes, results, outcomes, and potential impacts.


Privacy Impact Assessment

Overview of the Data Collection System

The data collection systems will be developed and implemented by CDC’s contractor (Battelle).


Program Director Survey. The program director survey is an on-line self-administered Web survey of all comprehensive cancer control (CCC) programs funded by CDC. The Web survey questionnaire consists largely of close-ended structured items, with a few open-ended items included (Attachment C.1).


Multiple-site Case Study. Six of the 1017 grantees will be selected as cases for in-depth study to understand how differences in programmatic characteristics and context influence the overall processes and outcomes expected as a result of 1017 funding and support. Through the case studies we will use multiple data sources to tell detailed stories of how the selected grantees developed and implemented their PSE change strategies, and how relevant contextual features influenced the development, implementation, and short-term outcomes of PSE change initiatives. During site visits to the 6 selected 1017 programs, interviews will be conducted with staff, partners, and other key informants (Attachment D.1). Interview data will be supplemented with documentary evidence and program monitoring data already collected by local program staff and by CDC.


Coalition Survey. A coalition survey of the 13 1017 programs will be part of the evaluation. The survey will be completed by individuals connected to the program through formal affiliation (staff or coalition members) or through informal association (strategic partners). This survey will not be implemented until 2015 (Attachment F.1). The attached instrument is modeled after existing partnership survey tools but is likely to require some modification based on findings from the initial program director survey and case study interviews. When the content of the Coalition Survey is finalized, CDC will submit a Change Request to OMB outlining any changes and provide screenshots of the instrument before it is fielded.


TA Providers Focus Groups. Focus groups with technical assistance (TA) providers will provide information about the types of TA requested and received from 1017 programs. They will also provide insights into program challenges from the perspective of those who have worked directly with the programs. Because the technical assistance providers are not co-located, the focus groups will be conducted by telephone (Attachment E.1).


Items of Information to Be Collected

Program Director Survey. The program director survey (Attachment C.1) collects information about:

  • Competencies and skills of the CCC program staff related to PSE change;

  • Competencies and skills of the CCC coalition or PSE Workgroup related to PSE change;

  • Training, networking and tools used by the CCC coalition or PSE Workgroup to increase capacity for PSE change work;

  • Descriptive information for two selected PSE change strategies currently underway;

  • The attitudes of decision makers, stakeholders and influencers regarding the two PSE change strategies;

  • The methods used by the CCC coalition or PSE Workgroup to inform decision makers about the two PSE change strategies;

  • The strategies used by the CCC coalition or PSE Workgroup to increase stakeholder support for the two selected PSE change strategies; and

  • Key events that influenced the specific PSE change strategies.

Multiple-site Case Study. Interviews with staff, partners, and other key informants (Attachment D.1) will collect information on:

  • Program infrastructure to support the 1017 program;

  • Planning and assessment activities and products;

  • Implementation of the PSE agenda and media plan;

  • Evaluation activities and use;

  • Key PSE outcomes achieved;

  • Coalition functioning; and

  • Program lessons learned.


Coalition Survey. A coalition survey of the 13 1017 programs will be a part of the evaluation. The survey will be completed by individuals connected to the program through formal affiliation (staff or PSE workgroup members) or through informal association (strategic partners). The survey will collect information about:

  • New relationships;

  • Group functioning;

  • Satisfaction with group membership;

  • Contributions of partners to strategic objectives; and

  • Key PSE outcomes achieved.


TA Providers Focus Groups. Focus groups with technical assistance (TA) providers (Attachment E.1) will provide information about:

  • TA and training needs;

  • TA and training provision;

  • Variation in TA needs across programs; and

  • Perceptions of best practices.



Identification of Website(s) and Website Content Directed at Children Under 13 Years of Age


Not applicable.


A.2. Purpose and Use of Information

The purpose of this evaluation is to understand (1) the processes by which the 1017 grantees develop and implement policy, system, and environmental (PSE) change strategies; (2) the short-term program results and outcomes achieved by the grantees during the 1017 funding period; and (3) the potential intermediate and long-term impacts of those changes on cancer burden. To address this purpose, the 1017 evaluation is designed to use multiple data sources collected at different time points across the 5-year program period to capture information about those processes, results, outcomes, and potential impacts. This information will be used by CDC to improve the guidance provided to current grantees and to direct future investments in the program.


The overarching question that the 1017 evaluation is designed to answer is:

  • Did 1017 cooperative agreement funding, training, and TA:

      • enhance the ability of grantees to implement PSE change as part of comprehensive cancer control?

      • facilitate a shift towards primary prevention?

Related to this overarching question are additional, more specific, evaluation questions that address elements of the 1017 models, policy development framework, and specific topics across the program life cycle: 1) planning and assessment, 2) implementation, 3) outcomes, 4) coalitions, and 5) scaling up.


Because this is a pilot program (only 13 of the 65 CCC programs received funding), it is important that CDC obtain systematic evaluation data about the program to decide whether and how to expand this program to additional CCC grantees. The evaluation will utilize data collected from all 65 CCC programs to allow comparisons between the 13 programs with 1017 funding and the 52 programs that did not receive 1017 funding (Program Director Survey and program monitoring data). More in-depth information will be collected from the 1017 grantees to provide a richer understanding of how the program has been implemented and what has been achieved (Multiple-site Case Study and Coalition Survey), with an emphasis on lessons learned that could inform the expansion of the 1017 program. Additional insights about the challenges faced by grantees will be obtained through the technical assistance providers (Focus Groups with TA Providers).


On behalf of CDC, the contractor (Battelle) developed the study protocols and all data collection instruments and supplementary materials (see Attachments C-E).


Privacy Impact Assessment Information

  1. Why the information is being collected:

The purpose of the proposed data collection is to evaluate the 1017 pilot program that provided funding to 13 of the 65 CDC-funded CCC programs.


  2. Intended use of the Information:

The evaluation results will help CDC’s National Comprehensive Cancer Control Program improve the support provided to the programs receiving 1017 funding and direct future investments in the program by the agency.


  3. Impact on Privacy to Respondents:

CDC staff who are part of the 1017 evaluation team will receive identifying information. To protect against accidental disclosure of information beyond the evaluation team, all data provided to CDC will be stripped of identifying information by replacing names with unique identifiers. The key to the unique identifiers will be stored separately from the names and any other identifying information.
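A minimal sketch of this safeguard may be helpful. The example below is hypothetical (the file names, field names, and helper function are illustrative, not the project's actual tooling): names are replaced with unique identifiers, and the key linking identifiers to names is written to a separate file.

```python
# Hypothetical sketch of the de-identification step described above:
# names are replaced with unique identifiers, and the key linking
# identifiers to names is stored separately from the data file.
import csv
import uuid

def pseudonymize(records, data_path, key_path):
    """Write de-identified records to data_path; write the name-to-ID
    key to key_path, which would be stored separately from the data."""
    key = {}  # name -> unique identifier
    with open(data_path, "w", newline="") as data_file:
        writer = csv.writer(data_file)
        writer.writerow(["respondent_id", "response"])
        for name, response in records:
            # The same respondent always receives the same identifier.
            respondent_id = key.setdefault(name, uuid.uuid4().hex)
            writer.writerow([respondent_id, response])
    with open(key_path, "w", newline="") as key_file:
        writer = csv.writer(key_file)
        writer.writerow(["name", "respondent_id"])
        writer.writerows(key.items())

# Example: two responses from the same respondent share one identifier.
pseudonymize([("Jane Doe", "Agree"), ("Jane Doe", "Disagree")],
             "deidentified_responses.csv", "id_key.csv")
```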


A.3. Use of Information Technology and Burden Reduction

We will utilize the Web to collect survey data from both program directors and coalition members. The Program Director Survey will be collected at two points in time. At the second administration, items from the first administration will be pre-populated to reduce the burden of recall.


The interviews will collect data that are not amenable to collection through a survey format. To reduce burden on the respondent, data collection staff will travel to the site to meet with interviewees and/or schedule a telephone interview at a time that is convenient for the respondent.


A.4. Efforts to Identify Duplication and Use of Similar Information

There are no similar data available that meet the needs of this proposed evaluation. Although the NCCCP evaluation, Development of an Evaluation Plan to Evaluate Grantee Attainment of Selected Activities of Comprehensive Cancer Priorities (OMB No. 0920-0971, exp. 5/31/2015), assesses the extent to which NCCCP grantees are implementing the NCCCP Priorities including the implementation of PSE change strategies, the data collection activities conducted will not provide sufficient detail to fully explain the processes and outcomes intrinsic to selected PSE change strategies. Further activities were conducted to ensure that the information collected in this study was not duplicative of other data collection methods. A comprehensive environmental scan was conducted prior to designing the evaluation and each of the data collection instruments included in this request. The scan included a literature review, a review of existing program monitoring data, and key informant interviews with program staff and individuals involved in similar evaluation efforts. In total, Battelle reviewed 450 published articles identified using predefined search criteria, conducted 11 key informant telephone interviews with CDC staff members and national partners, and reviewed routine program monitoring data collected by CDC via the Management Information System for Comprehensive Cancer Control Programs (OMB No. 0920-0841, exp. 3/31/2016). Through the environmental scan CDC determined that the existing information collections do not provide sufficient detail about program context and implementation experience or provide the variety of perspectives needed (program staff, coalition members and community partners) to inform CDC guidance. In addition, the scan included a review of 1017 documents (Exhibit 1) and other secondary sources (Exhibit 2).


Exhibit 1. CCC 1017 Program Documents Reviewed

| Document Name | Document Description | Document Type |
| 1017 Training Plan Framework_finalDRAFT_6092011 | Plan for providing training to 1017 grantees | Word |
| 1017 PSE StrategiesOct11 | PSE strategies planned by 1017 programs | Excel |
| 1017 Performance Measures MIS crosswalk | Performance measures crosswalked with MIS categories | Word |
| AEA policy evaluation panel | AEA presentation slides, November 3, 2011: "Identifying Facilitators and Barriers to Implementing Policy, Environmental, and System Changes: Lessons Learned from Comprehensive Cancer Control Policy Taskforces" | PPT PDF |
| AEA2011Panel Slides_Nov_1_final | AEA presentation slides, November 4, 2011: "Approaches to Assuring Evaluation Use: Valuing Stakeholders, Context, and Program Priorities in Cancer Control" | PPT PDF |
| cdc-rfa-dp10-1017 amendment final 06 30 10 | RFA to which funded 1017 programs applied | Word |
| CDC Webinar August 2011 | Kelley Daniel training on policy agenda development | PPT PDF |
| Chronic MIS User Guide | NCCDPHP Chronic Management Information System (MIS) user guide | Word PDF |
| Comprehensive cancer control: progress and accomplishments | Rochester, Townsend, et al., Cancer Causes Control (2010) 21:1967–1977 | Article |
| DCPC_CCCB Org Cht Sept 28,2011 | CCC Branch organization chart | PDF |
| DP10-1017Program Directors RSV Meeting Agenda – condensed-FINALWORKINGAGENDA (2) | Agenda for reverse site visit, March 30, 2011 | Word |
| DP10-1017PilotPerformanceMeasuresFinal | Performance measures revised based on PD feedback, October 2011 | Word |
| Final Policy Analysis Sep10 | Policy categories implemented by 67 CCC programs | Excel |
| Policy FOA Business Requirements 1.2 | NCCDPHP Chronic MIS – CCC Policy FOA Business Requirements, Version 1, updated 8-19-2011 | Word |
| Policy_Agenda_Training_72811UPLOADFINAL | Training slides to help programs develop a PSE agenda, July 28, 2011 | PPTX |
| Public policy action and CCC implementation: benefits and hurdles | Steiger, Daniel, et al., Cancer Causes Control (2010) 21:2041–2048 | Article |



Exhibit 2. Other Secondary Documents Reviewed

| Document Name | Document Source | Document Type |
| GAO 2012 report designing evaluations | Designing Evaluations, 2012 Revision, GAO-12-208G | Word PDF |
| A User’s Guide to Advocacy and Evaluation Planning | Harvard Family Research Project, 2009 | Word PDF |
| A Guide to Measuring Advocacy and Policy | Annie E. Casey Foundation, 2007 | Word PDF |
| Key Outcome Indicators for Evaluating Comprehensive Tobacco Control Programs | Office on Smoking and Health, CDC, 2005 | Word PDF |
| Evaluation Toolkit for Smoke Free Policies | Office on Smoking and Health, CDC, 2008 | Word PDF |
| Introduction to Process Evaluation in Tobacco Use Prevention and Control | Office on Smoking and Health, CDC, 2008 | Word PDF |
| Evaluation of State Nutrition, Physical Activity and Obesity Plans | Division of Nutrition, Physical Activity, and Obesity, CDC, 2010 | Word PDF |
| Pathfinder: A Practical Guide to Advocacy Evaluation | Innovation Network, 2009 | Word PDF |
| Partnership Evaluation: Guidebook and Resources | Division of Nutrition, Physical Activity, and Obesity, CDC, 2010 | Word PDF |
| Issue Topic: Advocacy and Policy Change | The Evaluation Exchange, Harvard Family Research Project, Volume XIII, Number 1, Spring 2007 | Word PDF |
| Advocacy Evaluation: Review and Opportunities | Justin Whelan, The Change Agency, http://www.thechangeagency.org/01_cms/details.asp?ID=69 | Word PDF |


Each of the data collection instruments included in this request was designed to collect information that is not available through any of the aforementioned sources. Ongoing program monitoring data collected by CDC will be utilized in the evaluation to supplement the primary data collection activities proposed. While the environmental scan did not uncover any studies that duplicate the aims of the current data collection, it did provide some useful insights into evaluation frameworks, lines of questioning and even specific questions that could be used in developing the data collection instruments.


A.5. Impact on Small Businesses or Other Small Entities

No small businesses will be involved in this data collection.



A.6. Consequences of Collecting the Information Less Frequently

The consequence of not collecting the information would be to limit CDC’s efforts to assess the results of their investment in this pilot project and make future programmatic investment decisions. Without the data collection activities proposed, CDC would not have essential information necessary to provide support to current programs and make critical decisions about the future support of this program.


This request is for a series of data collection activities over the next 3 years of the pilot program.


There are no legal obstacles to reduce the burden.



A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This request fully complies with regulation 5 CFR 1320.5.



A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A copy of the agency’s 60-day Federal Register Notice is attached (see Attachment B.1). The Notice, as required by 5 CFR 1320.8(d), was published on March 8, 2013 (vol. 78, no. 67, pp. 20921-20923). One public comment was received and acknowledged (Attachment B.2).



A.9. Explanation of Any Payment or Gift to Respondents

Respondents will not receive any monetary payment or incentive for participating in the study. However, all participating programs will receive a CDC-approved summary of the evaluation findings at the conclusion of the study. This summary can be used to articulate program outcomes to key stakeholders and to support quality improvement efforts.



A.10. Assurance of Confidentiality Provided to Respondents

The proposed data collection will have no effect on the respondent’s privacy.


  1. Privacy Act Determination. CDC’s Information Collection Review Office (ICRO) has reviewed this submission and determined that the Privacy Act does not apply. Although some respondents may be identifiable, the information provided to CDC and CDC’s data collection contractor concerns organizational activities, and is not personal in nature.

  2. Safeguards.

Program Director Survey and Coalition Survey. Battelle will have direct access to the data collected through both online surveys (Attachments C.1 and F.1). Battelle and CDC will safeguard the responses and will not release any identifying information. All completed surveys, as well as the electronic data files containing the survey data, will be identified only by study identification number. Neither the Internet surveys nor the electronic files of the survey data will contain names, addresses or telephone numbers of respondents. All project files will be password protected and access to files will be limited to authorized study staff.


Multiple Site Case Study Key Informant Interviews. An assurance of anonymity is not possible or practical for the key informant interviews conducted as part of the proposed case studies. CDC staff may participate in one or more site visits and thus may have first-hand knowledge of the interview responses. In addition, CDC staff involved in the 1017 program evaluation will have access to the interview data both during and after the project for the purpose of further data analyses and reporting. Furthermore, the qualitative data generated by the key informant interviews will be of most use to CDC if the rich contextual and programmatic detail, including identifying information, is retained. Therefore, we promise a limited level of privacy to participants: their identities will not be revealed in any reports or published articles (prepared by CDC and/or Battelle staff).


TA Providers Focus Groups. All handwritten notes, typed notes, and audio recordings from focus groups will be maintained in a secure manner. Hardcopies of these materials will be stored in a locking filing cabinet. The interview notes, or any other materials produced from the focus groups, that include identifiers will be handled or viewed only by Battelle and CDC staff who are directly responsible for data collection and analysis. Digital audio recordings will be stored on password-protected file servers and deleted upon study completion.


Reports and Manuscripts. Project reports and manuscripts will contain aggregated data only; results will not be associated with any individual respondent.


  3. Consent. All survey and interview respondents and focus group participants will be informed of their rights before data are collected. Consent for the internet surveys is located in Attachments C.1-3 and F.2. Consent for the interviews is located in Attachment D.5. Consent for the focus groups is located in Attachment E.2. The data collection plan was reviewed by Battelle’s IRB; the approval letter is included in Attachment G.

  4. Nature of Response. Respondents will be informed through the instructions and consent statements of the purpose of the study, what their participation will involve, and the steps taken to maintain their responses in a secure manner. They will be reminded that their participation is voluntary and that they may choose not to answer any question or may withdraw from the study at any time without penalty to themselves or their NCCCP-funded program.


A.11. Justification for Sensitive Questions

Topics typically considered to be of a sensitive nature include sexual practices, alcohol or drug use, religious beliefs or affiliations, immigration status, and employment history. No questions regarding these topics or any other topic of a sensitive nature will be asked in these data collection activities.


A.12. Estimates of Annualized Burden Hours and Costs

A.12-A. Estimated Annualized Burden to Respondents


The proposed study consists of two cycles of data collection that will be conducted over a three-year period; the only exception is the TA Provider Focus Groups, which will be conducted annually over the three-year period. The total estimated annualized response burden is 161 hours. The estimated annualized burden hours are presented in Table 1.


Program Director Survey: This is an on-line self-administered Web survey that will be distributed to all Comprehensive Cancer Control (CCC) Program Directors (see Attachments C.1a and C.1b). Approximately 43 respondents will participate on an annualized basis. The estimated burden per response is 30 minutes. Program Directors will be invited to participate in the survey and will receive notifications in the form of an advance letter, reminder email, and thank you email (see Attachments C.2 – C.5).


Key Informant Interviews at Case Study Sites: A designated CCC staff person at each case study site will help identify and coordinate key informants for individual interviews. On an annualized basis, we will work with 2 CCC staff members on the coordination process. The designated staff member will complete a key informant selection form (see Attachment D.3) that has an average burden of 8 hours per response. The key informant selection form will identify both CCC staff and CCC partners who are potential interview participants.


We plan to conduct interviews at each case study site with two types of respondents: CCC staff and CCC partners. The annualized number of CCC staff respondents is 12 and the annualized number of CCC partner respondents is 48. Prior to the interview, each selected respondent will be contacted for a 5-minute recruitment and interview scheduling process (see Attachment D.4). We will use the same key informant interview guide (see Attachment D.1) to guide all interviews; however, the amount of time devoted to the interview will vary by respondent type. The average burden per response for CCC staff is 90 minutes. The average burden per response for CCC partners is 60 minutes.


Coalition Survey (see Attachment F.1): This survey will be administered to CCC partners who are members of coalitions in each of the 13 1017 program sites. Approximately 87 respondents will participate on an annualized basis. The estimated burden per response is 20 minutes. Coalition members will be invited to participate in the survey and will receive notifications in the form of an advance email, recruitment email, reminder email, and thank you email (see Attachments F.2 – F.5).


TA Providers Focus Groups (see Attachment E.1): This information collection involves focus groups conducted via telephone with CCC partners who provide technical assistance (TA) to CCC programs. On average, 15 respondents will participate in a focus group each year. TA providers will be invited to participate (Attachment E.2) in focus groups that we estimate will last 90 minutes. The annualized burden is 23 hours.
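The annualized figures above follow directly from the respondent counts and per-response times. The sketch below (illustrative only, not part of the study protocol) reproduces the arithmetic behind Table 1, rounding each form's burden up to the next whole hour.

```python
# Illustrative arithmetic behind Table 1 (not part of the study protocol).
# Each row: (form, annualized respondents, minutes per response);
# each respondent completes one response.
import math

rows = [
    ("Program Director Web Survey", 43, 30),
    ("Key Informant Selection (staff)", 2, 480),       # 8 hours
    ("Recruitment/Scheduling (staff)", 12, 5),
    ("Key Informant Interview (staff)", 12, 90),
    ("Recruitment/Scheduling (partners)", 48, 5),
    ("Key Informant Interview (partners)", 48, 60),
    ("Coalition Survey", 87, 20),
    ("TA Provider Focus Group", 15, 90),
]

total = 0
for form, respondents, minutes in rows:
    hours = math.ceil(respondents * minutes / 60)  # round up to whole hours
    print(f"{form}: {hours} hours")
    total += hours
print(f"Total annualized burden: {total} hours")  # 161
```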


Additional information on study design is presented in Section B.1.




Table 1. Estimated Annualized Response Burden in Hours

| Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Response Burden (in hours) |
| CCC Program Directors | Program Director Web Survey Questionnaire | 43 | 1 | 30/60 | 22 |
| CCC Staff | Key Informant Selection | 2 | 1 | 8 | 16 |
| CCC Staff | Key Informant Recruitment/Scheduling | 12 | 1 | 5/60 | 1 |
| CCC Staff | Key Informant Interview Guide | 12 | 1 | 90/60 | 18 |
| CCC Partners | Key Informant Recruitment/Scheduling | 48 | 1 | 5/60 | 4 |
| CCC Partners | Key Informant Interview Guide | 48 | 1 | 1 | 48 |
| CCC Partners | Coalition Survey | 87 | 1 | 20/60 | 29 |
| CCC Partners | TA Provider Focus Group Guide | 15 | 1 | 90/60 | 23 |
| Total | -- | -- | -- | -- | 161 |


A.12-B. Estimated Annualized Cost to Respondents


The annualized cost to respondents is $5,065, as summarized in Table 2.


Table 2. Estimated Annualized Burden Costs

| Type of Respondent | Form Name | Number of Respondents | Total Annualized Burden Hours | Average Hourly Wage Rate ($/hour)* | Total Respondent Cost |
| CCC Program Directors | Program Director Web Survey Questionnaire | 43 | 22 | $25 | $550 |
| CCC Staff | Key Informant Selection | 2 | 16 | $25 | $400 |
| CCC Staff | Key Informant Recruitment and Scheduling | 12 | 1 | $25 | $25 |
| CCC Staff | Key Informant Interview Guide | 12 | 18 | $25 | $450 |
| CCC Partners | Key Informant Recruitment and Scheduling | 48 | 4 | $35 | $140 |
| CCC Partners | Key Informant Interview Guide | 48 | 48 | $35 | $1,680 |
| CCC Partners | Coalition Survey | 87 | 29 | $35 | $1,015 |
| CCC Partners | TA Provider Focus Group Guide | 15 | 23 | $35 | $805 |
| Total | -- | -- | -- | -- | $5,065 |

* Wage rate data were obtained from the U.S. Department of Labor, Bureau of Labor Statistics, May 2011 National Occupational Employment and Wage Estimates, http://www.bls.gov/oes/current/oes_nat.htm.
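As a quick check on Table 2, each cost figure is simply the annualized burden hours multiplied by the applicable wage rate; the short sketch below (illustrative only) reproduces the total.

```python
# Illustrative check of Table 2: cost = annualized burden hours x wage rate.
burden = [  # (form, annualized hours, hourly wage in dollars)
    ("Program Director Web Survey", 22, 25),
    ("Key Informant Selection (staff)", 16, 25),
    ("Recruitment and Scheduling (staff)", 1, 25),
    ("Key Informant Interview (staff)", 18, 25),
    ("Recruitment and Scheduling (partners)", 4, 35),
    ("Key Informant Interview (partners)", 48, 35),
    ("Coalition Survey", 29, 35),
    ("TA Provider Focus Group", 23, 35),
]
total = sum(hours * wage for _, hours, wage in burden)
print(f"Total annualized respondent cost: ${total:,}")  # $5,065
```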



A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers

There are no costs to respondents associated with either capital and startup efforts or operation and maintenance of services for this project.


A.14. Annualized Cost to the Government

The average annualized cost to the Government includes the costs of CDC personnel and the project implementation contractor. CDC is responsible for project oversight, review and approval of project materials and deliverables, and oversight of publications and other dissemination activities. The average annualized cost of CDC personnel is $33,082. The contractor is responsible for development of all project materials and deliverables, data collection, management, and analysis, and report and publication writing. Actual contractor costs will be distributed over a five-year project period that includes project planning and follow-up activities such as analysis and report writing. When contractor costs are annualized over the three-year OMB approval period, the average annualized cost of the contractor is $322,388. The total annualized cost of the project is $355,470.


Table 3. Annualized Cost to the Federal Government

| Cost Category | Annualized Cost |
| Federal Government personnel (salaries: 5% time for one FTE @ GS-14; 5% time for one FTE @ GS-12; 2.5% time each for three FTEs @ GS-13; 2.5% time for one FTE @ GS-15) | $33,082 |
| Contractor Costs | $322,388 |
| Total | $355,470 |
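The relationship between the five-year contract and the three-year annualization can be made explicit. The sketch below is illustrative arithmetic only; the implied total contractor budget is derived from the annualized figure in Table 3, not taken from a budget document.

```python
# Illustrative arithmetic for Section A.14 (derived, not from a budget doc).
APPROVAL_YEARS = 3                      # OMB approval period
annualized_personnel = 33_082           # CDC personnel, from Table 3
annualized_contractor = 322_388         # contractor, from Table 3

# Annualizing over the approval period implies this total contractor cost,
# even though actual spending is distributed over a 5-year project period.
implied_contractor_total = annualized_contractor * APPROVAL_YEARS  # $967,164

total_annualized = annualized_personnel + annualized_contractor
print(f"${total_annualized:,}")  # $355,470
```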



A.15. Explanation for Program Changes or Adjustments

This is a new data collection.


A.16. Plans for Tabulation and Publication and Project Time Schedule

Plan for Tabulation

Survey. Tabulation and analysis of the survey results will be conducted by the contractor (Battelle) on behalf of CDC. The statistical analyses of the data from the two Web surveys will be primarily descriptive, employing univariate statistical methods with tables and graphs for display and summarization. In addition, we will compare the responses of 1017 Program Directors with the responses of Program Directors from CCC programs that did not receive 1017 funding.
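By way of illustration, a comparison of that kind could be tabulated as below. The sketch is hypothetical (the column names and values are illustrative, not drawn from the survey instrument) and simply shows a univariate summary split by 1017 funding status.

```python
# Hypothetical sketch of the planned descriptive comparison; column names
# and values are illustrative, not drawn from the actual survey data.
import pandas as pd

df = pd.DataFrame({
    "program_id": [1, 2, 3, 4, 5, 6],
    "received_1017_funding": [True, True, True, False, False, False],
    "pse_skill_rating": [4, 5, 3, 2, 3, 3],  # e.g., a 1-5 self-rating item
})

# Univariate descriptive summary, split by 1017 funding status.
summary = df.groupby("received_1017_funding")["pse_skill_rating"].describe()
print(summary[["count", "mean", "std", "min", "max"]])
```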


Multiple-site Case Study. NVivo® (version 9.2), a qualitative data analysis software program, will be used for systematic management and analysis of all relevant case study data based on the evaluation questions and themes identified in the analysis process. All interview transcripts, as well as documentary data to the extent possible, will be imported into a case-specific NVivo project file and analyzed as a complete data set for each selected 1017 program. As single case analyses are completed, the case-specific NVivo files can be merged into multiple case files to facilitate cross-case analyses. The NVivo files will be stored in password-protected locations either on the Battelle analysts’ hard drives or on our secure intranet.

The site visit teams will carry out the coding of the collected site visit data within NVivo using a preliminary coding scheme shared across cases. Each data source (key informant interviews and documents) will be read carefully, and relevant text segments will be assigned to one or more codes. As new themes and codes are identified and defined, they will first be incorporated into the case-specific codebooks. As the single-case analyses progress, the case study leader will convene analysis meetings to review the developing codebooks, identify similar themes or codes between cases, and facilitate the development of a common codebook for use in the multiple case analysis. NVivo will allow us to flexibly manage the developing coding schemes and retrieve coded data for subsequent analysis steps.

We will use the same methods to analyze Wave 2 site visit data. The codebooks developed in Wave 1 will continue to be used, but will be expanded to capture data on program outcomes and impacts of funding that were not explored in depth in Wave 1 data collection. To the extent possible, the same team members will code all data from a given case at both Waves 1 and 2 to ensure consistency over time.


The qualitative coding will allow us to employ other analytic techniques to draw conclusions from the data about single cases and ultimately across the multiple cases selected for this study. Data displays are an essential analysis tool and can take a variety of forms with qualitative data, depending on study purpose and questions. Using NVivo and other software tools, we anticipate using the following data display techniques as part of the analysis process:

  • Grouping and sorting coded text segments from interviews and collected documents by themes, interview questions, and evaluation questions to develop written summaries at multiple levels of abstraction.

  • Building analysis matrices that array data summaries by theme/interview question/evaluation question and by data sources (e.g. key informants) to facilitate within-case comparisons on key topics.

  • Developing case-specific versions of the logic model and stages of the policy development framework that represent how individual 1017 programs compare to the ideal versions of programmatic stages, processes, and outcomes.

  • Building analysis matrices that array data summaries by theme/interview question/evaluation question and by cases (e.g., selected 1017 programs) to facilitate cross-case comparisons on key topics.


TA Providers Focus Groups. The data from the telephone focus groups will be transcribed and analyzed using qualitative analysis procedures similar to those described above for the key informant interviews. However, the analysis will be simpler because there are no cases to compare.


Plan for Publication and Dissemination


The plan for publication and dissemination of the evaluation results will include three main components: (1) interim products from Wave 1 data collection (individual case study reports, summary of survey results, summary of TA focus groups); (2) interim products from Wave 2 data collection (individual case study reports, summary of survey results, summary of TA focus groups); and (3) final products from the evaluation (final report, 2 or more manuscripts for publication in peer reviewed scientific journals). The contractor (Battelle) will provide support for these dissemination efforts. Other dissemination activities may also be pursued given time and resources.


Case Study Reports


After Wave 1, a brief report for each selected program will be prepared, highlighting the results of the analyses that reflect the program’s progress in carrying out 1017-related activities and achieving outcomes. Each report will follow a generic outline, but the content and topics covered will be allowed to vary depending on the strategies and progress of each 1017 program. Draft case reports will be provided to the respective Program Director for review, inviting comment and clarification about matters of fact and interpretation. This type of “member checking” is an important part of case study research that helps to ensure the validity of the findings; it can often yield new insights and findings because it provides an opportunity for local participants to add or amplify observations that occurred to them after the site visit team’s departure (Stake, 2006).


After Wave 2, full case reports for each program will be prepared containing complete results from both Wave 1 and Wave 2 site visit data. These reports will follow the same generic outline used for the Wave 1 reports. As with Wave 1, the draft case reports will be provided to the respective Program Directors for review, inviting comment and clarification about matters of fact and interpretation. The results of the case studies will be featured prominently in the final evaluation report.


After Wave 2 site visits, analyses, and individual case reports are finished, a final multiple case report will be prepared that includes the complete results of the cross-case analyses of Wave 1 and 2 site visit data. The multiple case report will focus on the similarities and differences among the selected 1017 programs, and how the programs’ experiences inform an understanding of the potential impacts of the 1017 program as a whole. The multiple case report will also permit a reexamination of the conceptual models and frameworks guiding this evaluation, and will serve as the basis for any manuscripts intended for journal publication. The multiple case report will include a review of the evaluation purpose and questions, a summary of the methods used to conduct the case studies, a summary of the findings (organized by the evaluation questions), and discussion of the findings and the implications for the 1017 programs as a whole.


Survey Summaries


Brief summaries of the methods and results of the Program Director Survey will be prepared after each wave of data collection. These summaries will be shared internally with CDC staff and key stakeholders, including TA providers and CCC program staff. The results of the surveys will also be included in the final evaluation report.


Focus Group Summaries


Brief summaries of each TA Provider Focus Group will be prepared after each administration. These summaries will be shared internally with CDC staff and key stakeholders, including TA providers. Insights from the focus groups will also be included in the final evaluation report.


Final Evaluation Report


Battelle will develop a final report that describes all project activities and outputs, including (1) an executive summary; (2) assessment and review of prior literature; (3) evaluation design and methods; (4) IRB and OMB approvals; (5) results of all data collection activities; and (6) a summary of recommendations. The final report will include appendices for all study materials such as protocols and instruments.


Manuscripts


Battelle will develop 2 or more manuscripts ready for submission to relevant peer-reviewed journals. The manuscripts will be part of the overall dissemination plan for the project, and Battelle will work closely with CDC in specifying the topics, purpose, content, and target journals of the manuscripts so that they meet CDC’s communications needs. Topics of the manuscripts might include: (1) conceptual frameworks for evaluating PSE change efforts in chronic disease programs, (2) evaluation methods, and (3) results of the evaluation. Relevant journals could include those where similar or related articles have been published, including (1) Cancer Causes and Control; (2) Preventing Chronic Disease; and (3) the Journal of Public Health Management and Practice.


Other potential dissemination activities

An important part of the dissemination plan is sharing results directly with CCC program directors and their partners who are in a position to apply the results in their cancer control programs. Battelle and CDC will jointly plan additional dissemination of both interim and final results through such venues as (1) program directors meetings, (2) program Webinars, (3) fact sheets, and (4) written success stories. Battelle and CDC staff may also submit abstracts to meetings relevant to CCC programs and other public health practitioners and evaluators such as world cancer summits, chronic disease meetings, the American Public Health Association annual meeting, and the annual meeting of the American Evaluation Association.


Project Time Schedule

The contractor (Battelle) will begin Wave 1 data collection in the first month following OMB clearance. Wave 1 data collection will be completed within 6 months of clearance. Data cleaning and analysis will be completed within 1 year of OMB clearance. Wave 2 data collection will begin approximately 18 months after OMB clearance. Final reports and manuscripts will be prepared at the end of Wave 2 data collection. Table 4 provides a summary of the study activities and the months following OMB clearance during which they will be performed.


Table 4. Project Time Schedule

| Activity | Time Period (Months after OMB Clearance) |
| Wave 1 | |
| Conduct Program Director Survey | 1-3 |
| Conduct Site Visits | 2-6 |
| Conduct Focus Group with TA Providers | 1-3 |
| Analyze Program Director Survey Data | 3-5 |
| Analyze Site Visit Data | 5-10 |
| Analyze TA Focus Groups | 3-4 |
| Report Preparation | 10-18 |
| Wave 2 | |
| Conduct Program Director Survey | 18-24 |
| Conduct Site Visits | 24-28 |
| Conduct Focus Groups with TA Providers | 18-20; 34-36 |
| Analyze Program Director Survey Data | 24-26 |
| Analyze Site Visit Data | 28-33 |
| Analyze TA Focus Groups | 20; 36 |
| Conduct Coalition Survey | 24-28 |
| Analyze Coalition Survey Data | 28-33 |
| Final Report/Manuscripts | 30-36 |


A.17. Reason(s) Display of OMB Expiration Date Is Inappropriate

Not Applicable.


A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.








References

Dye, J. F., Schatz, I. M., Rosenberg, B. A., & Coleman, S. T. (1998). Constant comparison method: A kaleidoscope of data. The Qualitative Report, 4(1/2). Retrieved from http://www.nova.edu/ssss/QR/QR4-1/dye.html

Frieden, T. R. (2010). A framework for public health action: The health impact pyramid. American Journal of Public Health, 100(4), 590-595. doi:10.2105/AJPH.2009.185652

Morgan, D. L. (1993). Qualitative content analysis: A guide to paths not taken. Qualitative Health Research, 3(1), 112-121.

Patton, M. Q. (1990). Qualitative evaluation and research methods (2nd ed.). Newbury Park, CA: Sage Publications.

Stake, R. E. (2006). Multiple case study analysis. New York: The Guilford Press.


