




Impact Evaluation of CDC’s Colorectal Cancer Control Program (CRCCP)


OMB Supporting Statement


Part A: Justification







Contact:

Amy DeGroff, PhD

Division of Cancer Prevention and Control

National Center for Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention

Atlanta, Georgia

Telephone: 770-488-2415





July 18, 2013














LIST OF ATTACHMENTS

  1. Authorizing Legislation, Section 301(a) of the Public Health Service Act
  2. FRN and Comments on FRN
    2A. Federal Register Notice
    2B. Comments on Federal Register Notice
  3. Screener for the Colorectal Cancer Population Survey
  4. Colorectal Cancer Population Survey
    4A. Colorectal Cancer Population Survey
    4B. Screen Shots of the CATI Administration of the Colorectal Cancer Population Survey
  5. Colorectal Cancer Screening Practices Survey of Primary Care Providers
    5A. Colorectal Cancer Screening Practices Survey of Primary Care Providers
    5B. Invitation/Cover Letter
    5C. Advance Fax
    5D. Reminder Fax
    5E. Second Mailing Cover Letter
    5F. Third Mailing Cover Letter
  6. Site Visit Planning
    6A. Site Visits Suggested Interviewee Form
    6B. Site Visit Instruction Template
    6C. Grantee Site Visit Introductory Letter/Email
    6D. Non-Grantee Site Visit Introductory Letter/Email
    6E. Site Visit Confirmation Email
  7. CRCCP Grantee (Intervention) Case Studies
    7A. Grantee Program Staff Interview Guide
    7B. Grantee Evaluator Interview Guide
    7C. Grantee Partner Interview Guide
    7D. Informed Consent Form
  8. Non-Grantee (Control) Case Studies
    8A. Non-Grantee Program Staff Interview Guide
    8B. Non-Grantee Evaluator Interview Guide
    8C. Non-Grantee Partner Interview Guide
    8D. Informed Consent Form
  9. Document Review Form
  10. Observational Assessment Guide
  11. Non-Disclosure Agreement for Data Collection Contractors
  12. ICF Macro IRB Approval











Justification

A-1. Circumstances Making Collection of Information Necessary

Background

Colorectal cancer (CRC) is the second leading cause of cancer deaths in the U.S., killing more non-smokers than any other cancer.1 In 2006, more than 139,000 people were diagnosed with CRC and over 53,000 died from the disease.2 Screening can effectively reduce CRC incidence and mortality in two ways: first, unlike most cancers, screening offers the opportunity to prevent cancer by removing premalignant polyps; second, screening can detect CRC early when treatment is more effective.3,4 If diagnosed at early stages, the five-year survival rate for CRC is over 88%.5 In a modeling study to assess deaths prevented through increased utilization of clinical preventive services, Farley and colleagues estimated that 1900 deaths could be prevented for every 10% increase in CRC screening with colonoscopy – a result that exceeded similar calculations for breast or cervical cancer screening.6


CRC screening rates must be improved at a population level. In July 2009, CDC's Division of Cancer Prevention and Control (DCPC) funded the Colorectal Cancer Control Program (CRCCP) for a five-year program period. Through a competitive application process, 22 states and 4 tribal organizations received CRCCP cooperative agreement awards totaling nearly $22.5 million. In July 2010, CDC funded three additional states, bringing the total number of grantees to 29. Figure 1 highlights the CRCCP grantees. CDC’s stated goal for the CRCCP is to increase colorectal cancer screening rates to 80% in funded states and tribal areas and, subsequently, to reduce colorectal cancer incidence and mortality. The CRCCP builds on the work of CDC’s Colorectal Cancer Screening Demonstration Project, which was funded from 2005 to 2009 and included five sites.7


Figure 1. CRCCP Grantees


CDC has adopted the social ecological model (SEM) as a framework for the CRCCP (Figure 2). The SEM is a systems model with the individual at the core and multiple spheres of influence around the individual.8 CRCCP grantees implement activities at multiple levels in order to maximize synergies across the varied levels of intervention and promote program sustainability. As an example, a grantee may work with advocacy organizations to affect state policy (e.g., eliminate insurance copayments for colonoscopy), collaborate with professional organizations in its state (e.g., a state association of gastroenterologists) to advance quality standards for endoscopy, contract with a media consultant to implement a mass media campaign promoting colorectal cancer screening, and fund patient navigators to help patients of federally qualified health centers access screening.





Figure 2. CRCCP Social Ecological Model




A CRCCP program logic model has also been developed to clarify the relationships between resources, program activities, and expected outcomes (Figure 3). Both the SEM and logic model have been used to guide program and evaluation planning. As represented in the logic model, the CRCCP includes two program components: 1) screening provision, supporting clinical service delivery for low-income, underinsured persons, and 2) screening promotion, involving activities to encourage broad, population-level screening. CRCCP grantees must collaborate with their state or tribal comprehensive cancer control programs to plan and implement their programs.



For the new CRCCP, grantees are required to establish evidence-based colorectal cancer screening delivery programs for persons 50-64 years of age, focusing on asymptomatic persons at average risk for CRC who have low incomes and inadequate or no health insurance coverage for CRC screening. Approximately 33% of each grantee award may be used to fund this component of the program, that is, to pay for the provision of screening and diagnostic tests. Grantees typically establish contracts with health care providers (e.g., primary care providers, endoscopists) to deliver colorectal cancer screening services for the eligible population. Additional program activities such as patient recruitment, patient navigation, provider education, quality assurance, and data management are also supported under this component of the program.


For the screening promotion component, grantees must plan and implement program activities that promote colorectal cancer screening among all adults aged 50-75 in their states/tribal areas. Grantees are encouraged to implement evidence-based activities aimed at increasing population-level screening rates. In particular, CDC has directed grantees to implement policy-, systems-, and community-level interventions (see Figure 2, SEM) that leverage greater change than activities implemented at the individual or interpersonal level. For instance, grantees are encouraged to work with health care systems, health insurers, worksites, and existing community programs to implement evidence-based strategies identified in The Guide to Community Preventive Services that reduce structural barriers to screening (e.g., insurance co-pays, time off work for screening, patient navigation) and facilitate screening (e.g., patient tracking and reminder systems, provider reminder systems that support provider recommendations for screening).9 Consistent with the SEM and the health impact pyramid, CDC promotes the implementation of these strategies at the organizational, community, and policy levels, where greater impact is expected than if implemented at the individual or interpersonal level, while also recognizing the importance of health education strategies at the individual and interpersonal levels.10

The DCPC is requesting a three-year approval for clearance to conduct an impact evaluation to determine whether increases in colorectal cancer screening rates and other proximal outcomes identified in the program logic model (e.g., population and provider knowledge, attitudes, and behaviors regarding CRC screening) can be attributed to the CRCCP. DCPC plans to complete two cycles of information collection over a three-year period. The first information collection will be initiated in Fall 2013 and the second in Fall 2015.


Researchers have conducted efficacy studies of specific interventions (e.g., provider reminders, small media) to increase CRC screening among unique populations. However, we are not aware of any evaluation studies that have assessed the impact of a comprehensive program like the CRCCP on screening rates at the population level for a state. In addition, the CRCCP is the first cancer prevention and control program funded by CDC emphasizing both direct screening service provision for underserved populations and screening promotion for the broader population. Consequently, the CRCCP offers a unique and important opportunity to evaluate the efficacy of this new public health model.


Specifically, the impact evaluation will test the following two hypotheses:

Hypothesis 1: The observed increases in population-level CRC screening rates and other proximal outcomes among grantees are attributable to the CRCCP.

Hypothesis 2: The proposed theory of change for the CRCCP, as described in the program logic model, accurately reflects causal pathways between program activities, outputs, and expected outcomes.

To test these hypotheses, CDC will conduct an impact evaluation using a quasi-experimental, control group design with pre- and post-tests.11 A total of six states will participate: three CRCCP grantee states (Alabama, Nebraska, Washington) will represent the “intervention” sites and three non-CRCCP states (Tennessee, Oklahoma, Wisconsin) will serve as “control” sites. By including carefully selected comparison states (control sites) that are “matched” to the intervention states, causal inferences may be made based on evidence of exposure to the intervention and observed differences in proximal outcomes (e.g., population and provider knowledge and attitudes about colorectal cancer screening) and in our main outcome of interest, the state-level colorectal cancer screening rate. The measurement of proximal outcomes is essential in order to adequately evaluate both hypotheses.

How Study Design Addresses Hypotheses and Meets Objectives

Given that the CRCCP is a “real-world” program implementation, randomization was not possible. Consequently, other study designs were considered that maximize evaluators’ abilities to evaluate the hypotheses, meet study objectives, and ensure validity. A non-randomized control group design with pre- and post-tests was selected given our circumstances (see Table 1). The inclusion of “matched” control states provides the counterfactual needed to make claims of efficacy. Resource constraints limit the total number of states to six, and external validity will be limited given this small sample size.




Table 1. Quasi-Experimental, Non-Random Control Group Design with Pre- and Post-tests

NR O1 X O2

----------------------------------

NR O1 O2
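In the notation of Table 1, NR indicates non-random assignment of states to the two groups, O1 and O2 are the pre- and post-test observations, and X is the CRCCP intervention. One way to express the contrast implied by this design, offered here only as an illustration and not as the evaluators' specified analytic model, is a difference-in-differences estimate:

\[
\hat{\Delta} = \left( \bar{O}_{2}^{\,\text{intervention}} - \bar{O}_{1}^{\,\text{intervention}} \right) - \left( \bar{O}_{2}^{\,\text{control}} - \bar{O}_{1}^{\,\text{control}} \right)
\]

A positive \(\hat{\Delta}\) for CRC screening rates and for the proximal outcomes, combined with documented implementation activity, would support attributing the observed change to the CRCCP.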



To make a case for causation, the evaluators have set three criteria:12

  1. Observed increases in CRC screening rates are greater in the intervention states than in the control states.

  2. Observed changes in identified proximal outcomes are greater in the intervention states than in the control states.

  3. Implementation activities that would plausibly affect the measured outcomes are documented.


While secondary data are available to assess the primary outcome of interest, CRC screening rates, data to measure the proximal outcomes are lacking, as are data about program implementation. To gather the needed data, we will use a mixed-methods approach, employing both qualitative and quantitative methods.


To address Hypothesis 1, four sources of data are needed. First, to measure changes in state-level CRC screening rates, we will use secondary data collected through the Behavioral Risk Factor Surveillance System (BRFSS), a CDC-funded population-level survey. These data allow us to address criterion #1 above (i.e., observed increases in CRC screening rates are greater in the intervention than control states). To address the second criterion, we must assess proximal measures (e.g., provider knowledge, population attitudes) that we propose will be improved by the CRCCP intervention and are causally associated with changes in CRC screening rates. Consequently, we will field two new surveys: 1) a general population survey (Attachment 4A) and 2) a survey of primary care physicians (Attachment 5B). The CRCCP intervention includes strategies aimed at both the general population and health care providers. Finally, data about program implementation are needed to address the third criterion noted above, that is, to document that program activities were, in fact, implemented that would plausibly affect the proximal outcomes of interest. To evaluate program implementation, we will conduct case studies in all six states. Conducting case studies in the control states is important so that we can document any CRC prevention activities that may be in place but not funded by CDC’s CRCCP.


To address Hypothesis 2, all four data sources will again be used. Quantitative data sources will allow us to assess some relationships between program activities, proximal outcomes, and our outcome of interest, screening rates. The case studies will provide information about the program activities implemented, the population and provider surveys will provide data on proximal measures, and the BRFSS will provide data on CRC screening rates.


Although the national CRCCP is being implemented in 25 states and 4 tribal organizations, budget constraints restrict the impact evaluation to three intervention and three control states, for a total of six states. As a result, trade-offs were made between internal and external validity. Ideally, we would include all states in the evaluation to ensure the findings were generalizable to the overall program; however, that was not feasible.


During the sample selection process, care was taken to choose three intervention states that provided as much variety as possible, thus reflecting the diversity of the states in the overall program, in order to increase the generalizability of the findings. Our sample was selected using a staged process. In the first stage, cluster analysis was employed to define four strata of states that were most similar in terms of CRC screening, CRC incidence and mortality, unemployment, and insurance coverage. In the next stage, secondary variables such as racial/ethnic diversity, the percent of the population over age 55, the number of primary care physicians, and experience with state CRC screening and control efforts (as evidenced by past participation in Dialogue for Action roundtables sponsored by the Prevent Cancer Foundation, state and other non-federal funding for CRC, and CRC-related legislative policies) were assessed to identify grantee and control states that are similar within each of the four strata identified in stage one. We then selected one intervention state matched with a control state from three of the four strata to increase the representativeness of the sample, thereby strengthening the study’s validity.
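A minimal sketch of the stage-one clustering step is shown below. It is purely illustrative: the state names and indicator values are hypothetical placeholders rather than the actual data used, and the evaluators' analysis may have relied on a different clustering algorithm, software, or number of strata.

# Illustrative sketch of the stage-one cluster analysis described above.
# The state names and indicator values below are hypothetical placeholders,
# not the data used in the actual selection process.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

states = ["State A", "State B", "State C", "State D", "State E", "State F"]
# Columns: CRC screening rate, CRC incidence, CRC mortality,
# unemployment rate, percent uninsured (hypothetical values).
indicators = np.array([
    [70.0, 42.1, 15.2, 8.0, 8.8],
    [67.1, 44.0, 16.0, 8.5, 8.5],
    [61.4, 51.3, 19.8, 10.1, 13.6],
    [62.3, 50.2, 19.1, 10.5, 14.0],
    [60.1, 46.5, 17.0, 4.6, 12.8],
    [55.5, 49.9, 18.7, 6.4, 18.4],
])

# Standardize so each indicator contributes comparably, then form strata.
X = StandardScaler().fit_transform(indicators)
strata = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for state, stratum in zip(states, strata):
    print(f"{state}: stratum {stratum}")

Within each resulting stratum, the secondary variables described above would then be compared to pair an intervention state with a control state.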


Table 2 below summarizes data for each state on racial/ethnic composition, the percentage of the population over age 55, the unemployment rate and the uninsured rate (both of which are likely to negatively impact access to CRC screening), and the number of primary care physicians, which is also an indicator of access. Unfortunately, standardized state-level data on the proximal outcomes of interest (provider and population knowledge, attitudes, and behaviors with respect to CRC screening) are not available and could not be used to refine the selection of states.



Table 2: Demographic Characteristics of States Selected for the Case Studies

State | Intervention vs. Control | CRC Screening Rate | Racial/Ethnic Diversity (% minority) | Percent Age Over 55 Years | # of Non-Federal Primary Care Physicians1 | Unemployment Rate (2009) | Uninsured Rate (2007)

Pair 1
Minnesota | Intervention | 70.0 | 9.44 | 21.95 | 7,198 | 8.0 | 8.8
Wisconsin | Control | 67.1 | 11.69 | 23.23 | 7,044 | 8.5 | 8.5

Pair 2
Alabama | Intervention | 61.4 | 29.40 | 23.97 | 4,607 | 10.1 | 13.6
Tennessee | Control | 62.3 | 20.82 | 23.56 | 7,247 | 10.5 | 14.0

Pair 3
Nebraska | Intervention | 60.1 | 12.95 | 22.89 | 2,172 | 4.6 | 12.8
Oklahoma | Control | 55.5 | 16.01 | 23.05 | 3,614 | 6.4 | 18.4




This proposal is submitted with the intent of addressing an important gap in public health literature and practice: evaluating the impact of a large-scale public health program, based in a social ecological framework, in improving CRC screening rates at the state level.



The proposed data collection is authorized by Section 301 of the Public Health Service Act (Attachment 1).

Privacy Impact Assessment

In accordance with the privacy impact assessment, the following items are described below: 1) an overview of the data collection strategies; 2) a delineation of the items of information to be collected; and 3) an indication of whether the evaluation will involve hosting a website.



Overview of the Data Collection Strategies

The CRCCP impact evaluation consists of three primary data collection and analysis efforts. Primary data collection will be conducted at two time periods, early intervention and post-intervention, and will include a general population survey, a provider survey, and case studies. Each is described below.

Population Survey: The population survey will be administered by telephone to a state-based, representative, cross-sectional, random sample of adults aged 50–75 at both the pre- and post-intervention periods in each of the six states. The survey will be conducted by a professional survey center managed by the contract vendor, ICF Macro. The population survey will provide data on proximal outcomes of interest (e.g., individual-level knowledge and attitudes about CRC screening).



Provider Survey: The provider survey will be a mail-back, paper-based survey of a state-based, representative sample of primary care providers. A longitudinal design will be used, with the survey administered at the pre- and post-intervention periods in each of the six states. The providers will be randomly sampled from a list of primary care providers purchased from the American Medical Association (AMA). The provider survey will provide data on proximal outcomes of interest (e.g., provider knowledge and practices regarding CRC screening).


Case Studies: Qualitative case studies will be conducted in each state to assess context, document the implementation of the CRCCP (in intervention states), and monitor implementation of CRC activities (in control states). We will identify what CRC-related activities are conducted in each state and how they are implemented. Case studies will include multiple forms of data collection: document review, field observations, and participant interviews. Site visits will be conducted at two time periods (early intervention and post-intervention) to conduct interviews with 10-12 public health staff.


The contract vendor, ICF Macro, will administer the surveys and collect, secure, store, and analyze all data (population survey data, provider survey data, and case study data). All electronic data, such as MP3 or .wav files of in-person interviews, will be stored in secured electronic files on secure contractor computers. Physical files containing respondent information such as interview notes and completed provider surveys will be kept in locked file cabinets. Both electronic and physical files will be retained for the minimum amount of time necessary to comply with records retention requirements.



Items of information to be collected

For each data collection strategy, the items of information to be collected, as well as the general overview of the procedures for data collection, are described in more detail below.


Population Survey. A state-based, representative, cross-sectional sample of adults aged 50–75 will be surveyed to assess proximal outcomes of interest (e.g., knowledge, attitudes, intentions, and behavior) around CRC screening. Individuals surveyed in intervention states will be asked additional questions to assess their exposure to CRCCP activities implemented in their state. Based on pilot testing, the telephone survey is estimated to take approximately 23 minutes to complete. The survey will be fielded at two time points, pre- and post-intervention (Fall 2013 and Fall 2015). Participants will not receive an incentive for completing the survey. Analysis will examine differences between intervention and control states in the changes observed on proximal measures from the pre- to the post-test period.


Provider Survey. A state-based, representative, longitudinal sample of primary care providers will be surveyed to assess provider knowledge, attitudes, and behaviors with respect to CRC screening. The survey will be administered in paper format and, based on pilot testing, will take no more than 15 minutes to complete. The survey will be delivered to sampled primary care providers via Priority Mail, and the mailing will include an addressed, prepaid return envelope and a $25 incentive to complete the survey. The survey will be fielded at two time points, pre- and post-intervention (Fall 2013 and Fall 2015). Analysis will examine differences between intervention and control states in the changes observed on proximal measures from the pre- to the post-test period.

Case Studies. Case studies will be conducted in each of the six states to assess implementation of the CRCCP in intervention states and track implementation of CRC-related activities (non-CDC funded) in control states. Each state will serve as a unique case. The case studies will include multiple forms of data collection: document review, field observations, and participant interviews. Site visits will be conducted at two time periods, early intervention and post-intervention (Fall 2013 and Fall 2015), to conduct interviews with 10-12 public health staff and stakeholders.


Identification of Website(s) and Website Content Directed at Children Under 13 Years of Age

Neither the population survey nor the provider survey will be administered using a web-based data collection tool. The interviews for the case studies will be conducted in person or via telephone. The data collection activities proposed as part of this clearance will not target children under 13 years of age.



A-2. Purpose and Use of Information Collection

The purpose of the proposed data collection is to support a rigorous impact evaluation of the CRCCP, a new public health model intended to increase population-level CRC screening rates. The primary users of data collected through this clearance are DCPC/CDC and the CRCCP grantees and partners.


Data will be used to assess the impact of the CRCCP in improving proximal outcomes (e.g., provider knowledge, population attitudes) and in increasing population-level CRC screening rates. Results will inform future public health planning efforts at the state and federal levels. Evaluation results will be disseminated to CRCCP grantees and partners. Publications and presentations will be prepared for academic and non-academic audiences, with the intent of informing future program planning efforts in other public health areas.


Privacy Impact Assessment Information

The purpose of these data collection activities is to assess whether changes in state-level CRC screening rates (and more proximal measures) can be attributed to the CRCCP. The information will be used to inform future DCPC and grantee efforts aimed at increasing population-level colorectal and other cancer screening rates. The Privacy Act applies to the population and provider surveys because personally identifiable information will be collected. The applicable System of Records Notice is 09-20-0136, “Epidemiologic Studies and Surveillance of Disease Problems.” Respondents who participate in this study will be subject to assurances and safeguards as provided by the Privacy Act of 1974 (5 USC 552a), which requires the safeguarding of individuals against invasion of privacy. The Privacy Act also provides for the secure treatment of records maintained by a Federal agency according to either the individual’s name or some other identifier. Steps have been incorporated into the data collection procedures to ensure this secure treatment.


With permission, identifiable information will be collected from respondents of the longitudinal provider survey and the case studies. For the provider survey, identifiable information is necessary in order to preserve the sample for a subsequent administration of the survey. The information collected from the two administrations of the provider survey will be used to assess changes over time in providers’ knowledge, attitudes and behaviors regarding CRC screening. Participants’ identifiable information and participant responses will be securely kept in separate databases in order to minimize any chance of disclosure.


Although identifiable information will be collected to schedule and conduct the case studies, the Privacy Act does not apply because respondents are representing their respective organizations and are not providing personal information. For the case studies, no interview data will ever be reported by name without express permission from the respondent. Each informed consent form or informed consent statement (for the three types of data collection) will clearly identify who the data will be collected for, describe whether information will be collected in an identifiable form, and if so, how the data will be used, reported and/or shared.


Below is a brief description of how the data for each data collection activity will be collected and secured. ICF Macro will retain all data, including databases, audiotapes, and hard copies of interview notes, until 3 years after the expiration or termination of the contract.


Population Survey: Each participant in the telephone survey will be assigned a random digit identification number. The identification number will be used to link participant information to survey responses for internal data tracking purposes. Separate databases will be used to house participants' telephone numbers and participants’ survey responses; each will be stored in a separate secure file on a secure network server. This step is taken to reduce the chance of inadvertently revealing identifying information. Only ICF Macro project staff will have access to these data. Only aggregate responses will be used in the report of study results. A de-identified data file will be created to share with CDC. In addition, all telephone surveyors will be trained on the project’s specific security requirements and will sign an agreement to keep the data secure.


Provider Survey: Each participant will be assigned a random digit identification number that will be used to link participant information to survey responses for internal data tracking purposes. The identification number will also be used to link responses from the first administration of the survey to responses from the second administration. Separate databases will be used to house participants' business mailing addresses and fax numbers and participants’ survey responses; each database will be stored in a separate secure file on a secure network server. The completed surveys will be stored in a secure, locked file cabinet. Only ICF Macro project staff will have access to these data. Only aggregate responses will be used in the report of study results. A de-identified data file will be created to share with CDC.


Case Studies: All documents collected before, during, and after the case study site visits associated with a program (e.g., audio files, interview notes, field notes, documents) will be collected and stored in a password-protected electronic file accessible only by the ICF Macro project team or in a locked cabinet accessible to only ICF Macro project team members. During data collection in the field, site visitors will maintain data collection materials (such as audio files and notes) in their possession or in secured storage at all times. Site visitors will be instructed on data security procedures.


Program and interviewee names will not be removed from these materials, as they are necessary for generation of site-specific summaries. However, the interviewees’ names will not be associated with specific quotes or comments without written permission from the interviewee for each instance of usage. Audiotapes will be transcribed by a trained professional transcriptionist who will sign a non-disclosure agreement provided by the contractor. In the transcriptions, pseudonyms will be used in place of respondent names. Electronic copies of the de-identified transcripts will be provided to CDC. These notes will be stored in a locked file cabinet in the research office. Case study summaries will be restricted to project team members (ICF Macro staff and CDC staff). Data in these and any other subsequent reports will be presented in aggregate and no interviewees will be identified by name without first getting permission in writing for each usage.



A-3. Use of Improved Information Technology and Burden

Population Survey: Response burden for the population survey will be minimized by using Computer-Assisted Telephone Interviewing (CATI) technology. ICF Macro staff will identify households using the Genesys-ID system. This system contains information on area code-exchange combinations that have been assigned, as well as Census-based demographic information for individuals and households in geographic areas defined by ZIP codes and Census tracts. The system will be used to quickly and economically generate a productive and statistically valid Random Digit Dial (RDD) sample, while removing much of the burden of telephone sample generation that typically results from dialing business, non-working, and other unproductive telephone numbers. The generated sampling frame will then be used to randomly select a sample large enough to produce the desired number of interviews, using Genesys estimates of the proportion of Working Residential Numbers (WRNs). Once a household is contacted by telephone, we will conduct a household enumeration to determine how many age-eligible adults reside in the household. If there is more than one age-eligible adult in the household, one adult will be randomly selected and asked to complete the survey. This within-household sampling approach is used to minimize the bias toward female respondents that frequently occurs in telephone-based survey studies. All skip patterns in the survey (that is, questions that are only appropriate for a proportion of respondents) will be automatically programmed into the CATI survey, further minimizing the burden on respondents in terms of their time. Attachment 4B provides sample screenshots of the survey in the CATI format.
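The within-household selection step can be summarized with a simple sketch. The helper function and household values below are hypothetical illustrations, not the actual Genesys/CATI software logic.

# Illustrative sketch of within-household random selection among
# age-eligible adults (hypothetical helper, not the CATI system itself).
import random

def select_respondent(age_eligible_adults, rng=random):
    """Randomly select one age-eligible adult (50-75) from an enumerated household."""
    if not age_eligible_adults:
        return None  # no age-eligible adults; household is not eligible
    return rng.choice(age_eligible_adults)

# Example: a household enumeration finds two age-eligible adults.
household = ["adult_female_62", "adult_male_58"]
print(select_respondent(household))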


Provider Survey: Information technology will not be used to administer the provider survey. This survey will be priority mailed to provider offices and a self-addressed, stamped return envelope will be provided. Providers will also have the option of faxing back their completed responses.


Case Studies: Information technology will not be used to conduct the interviews for the case studies. Relevant program documents will be reviewed over the course of the study and interviews along with field observations will be conducted in-person at two points in time.



A-4. Efforts to Identify Duplication and Use of Similar Information

The purpose of these data collection activities is to assess whether changes in state-level CRC screening rates (and more proximal measures) can be attributed to the CRCCP. The information will be used to inform future DCPC and grantee efforts aimed at increasing population-level colorectal and other cancer screening rates.


The data to be collected are specific to understanding the population-level impact of the CRCCP. Our study design requires that we collect data from a representative sample of the population and of health care providers in each of the six states; these data are not available elsewhere. For the population and provider surveys, questions have been tailored specifically to assess the proposed theory of change for the CRCCP in order to help us understand what impact the CRCCP is having on population-level CRC screening. Whenever possible, questions for each of the surveys were taken from existing national studies such as the BRFSS, the National Health Interview Survey (NHIS), and smaller community or site-specific survey efforts published in the literature. However, the data available from these national and smaller surveys cannot be examined or extrapolated to the six states involved in this impact evaluation. Currently, in-depth, comparable qualitative data describing the implementation of CRC screening activities in each of the six states are not being collected. The approval of this data collection effort will allow DCPC to collect, in a standardized fashion, the information needed to assess the impact of the CRCCP.



A-5. Impact on Small Businesses or Other Small Entities

Health care providers affiliated with medical offices that are considered small businesses may be asked to complete and return the provider survey. The expected response is limited to completion of the survey during both waves of data collection, in which providers will be asked to describe their current practices. Completion of the provider survey will not require new reporting or recordkeeping requirements; neither the provider nor the business and/or medical office will be asked to adopt new recordkeeping or reporting requirements, so there will be no impact on small businesses or other small entities. Skip patterns, which allow providers to skip any questions that do not pertain to their practice or use of CRC screening, have been built into the provider survey in order to minimize the administration time. The provider survey has also been pilot tested for both length and clarity. Providers will be given a self-addressed, stamped envelope, as well as a fax-back option, to facilitate the return of the completed survey. The data collection process has been designed to minimize the amount of time needed from providers as well as any intrusion into their normal workflow.


A-6. Consequences of Collecting the Information Less Frequently

The contract vendor, ICF Macro, will conduct two rounds of data collection for CDC with each round consisting of an administration of the telephone–based population survey, mailed provider survey, and in-person case studies of six state programs. Because each data collection activity is designed for a different audience, it is highly unlikely that one type of respondent will be asked to participate in more than one data collection activity in each round. However, for the longitudinal provider survey, the same respondents will be asked to complete the provider survey during both rounds of data collection.


Our study design requires that we collect data at two time points in order to 1) assess changes over time and thus the effects of the CRCCP on the outcomes of interest and 2) assess the accuracy of the proposed theory of change.


There are no legal obstacles to reduce the burden.

A-7. Special Circumstances Relating to Guidelines of 5 CFR 1320.5

There are no special circumstances with this information collection package. This request fully complies with the guidelines of 5 CFR 1320.5.

A-8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

The 60-day Federal Register Notice (Attachment 2A) for the proposed data collection was posted in the Federal Register on August 22, 2012 (Volume 77, Number 163, pages 50696-50697). One comment was received from the American Society for Gastrointestinal Endoscopy (ASGE), which expressed support for the study and recommended that CDC include questions about whether primary care providers are counseling patients on potential cost-sharing obligations related to colonoscopy screening. These costs may pose an important barrier to screening. In response, CDC staff contacted ASGE and enlisted their assistance in drafting questions related to this issue. These questions have been added to the provider survey (Attachment 5B). A formal written response was also sent to the organization (Attachment 2B).


A second comment was received from the organization Fight Colorectal Cancer requesting that CDC provide a presentation of interim evaluation results to colorectal cancer stakeholder groups. CDC responded in writing to the group (Attachment 2B) and committed to share findings that may help them to refine current outreach and educational activities. As these data become available, CDC will contact the organization to schedule a meeting.

Table A-8.1 Individuals Who Have Provided Consultation on the Project

Consultant | Title | Affiliation | Email | Phone | Year of Consultation
Tom Chapel | Program Evaluator | CDC | [email protected] | 404-639-2116 | 2010
Huey Chen | Program Evaluator | Montclair University | [email protected] | N/A | 2010
Mark Lipsey | Evaluator | Director, Peabody Research Institute, Vanderbilt University | [email protected] | 615-343-2696 | 2010
Faye Wong | Branch Chief | CDC | [email protected] | 770-488-6427 | Ongoing
Janet Royalty | Data Manager | CDC | [email protected] | 770-488-3085 | Ongoing
Djenaba Joseph | Medical Officer | CDC | [email protected] | 770-488-3157 | Ongoing
Rebecca Kudon-Glover | Program Evaluator | CDC | [email protected] | 770-488-2081 | Ongoing
Amy DeGroff | Program Evaluator | CDC | [email protected] | 770-488-2415 | Ongoing
Marcus Plescia | Division Director | CDC | [email protected] | 770-488-3055 | Ongoing
Susan Zaro | Senior Vice President | ICF Macro | [email protected] | 404-321-3211 | Ongoing
Michelle Revels | Principal | ICF Macro | [email protected] | 404-321-3211 | Ongoing
Marnie House | Senior Technical Specialist | ICF Macro | [email protected] | 404-321-3211 | Ongoing
Anna Krivelyova | Economist | ICF Macro | [email protected] | 404-321-3211 | Ongoing
Ronaldo Iachan | Technical Director | ICF Macro | [email protected] | 301-572-0538 | Ongoing
Naomi Freedner-Maguire | Principal | ICF Macro | [email protected] | 802-863-8974 | Ongoing



A-9. Explanation of Any Payment or Gift to Respondents

Primary care providers who receive the mailed provider survey will receive an initial monetary incentive of $25 to encourage survey completion. This monetary incentive will be included as a personal check in the initial survey package. Alternative methods for providing the incentive (e.g., cash, money order, gift certificate) were considered; however, checks made payable to the provider were chosen because a) a personalized check increases the likelihood that the incentive will reach the intended provider, and b) checks offer a cost benefit to the project. If a provider chooses not to complete the survey and does not cash the enclosed check, that money will not be expended; however, if the incentive were provided by cash, money order, or gift card and enclosed with the survey, those funds would be permanently lost to the project. In addition to these practical reasons, research comparing the use of check incentives and gift cards found that, for physicians, the provision of checks resulted in a higher response rate (Hogan and La Force, 2008).


For providers who do not respond to the initial mailing, a reminder fax and second and third mailings of the survey package will be sent. A second $25 monetary incentive will be included in the third mailing of the survey package to physicians who have not returned a survey in response to the first two requests. Primary care providers are extremely busy professionals with many competing demands on their time and attention. Providing a monetary incentive, delivering a mailed survey by Priority Mail, and using multiple reminders have been shown to increase completion rates (VanGeest et al., 2001; Kasprzyk, 2001).


Respondents for the population survey and case studies will not receive incentives.



A-10. Assurance of Confidentiality Provided to Respondents


This data collection will conform to ethical practices for conducting case studies and administering surveys, and researchers will implement procedures to protect the privacy of respondents as appropriate. Several methods will be used to gather data, including a telephone survey of the general population, a mailed survey of providers, and site visits for the case studies. Respondent contact information used to solicit participation will be kept separate from participant responses. Only for the longitudinal provider survey will participants’ responses be linked with their identifying information, in order to assess change over time.


All data will be treated in a secure manner and will not be disclosed unless otherwise required by law. All respondents will be informed that their responses will be treated in a secure manner unless otherwise required by law. Only aggregate numbers, summary statistics, or de-identified quotes will be included in evaluation reports or manuscripts. As explained in A-2, with the permission of the respondent, identifiable information will be collected for the provider survey. The informed consent form or statement for each data collection activity will describe whether information will be collected in an identifiable form and, if so, how the data will be secured, used, and reported. Additional procedures designed to protect participant privacy for the surveys and case studies are described below.



Population survey: Surveyors will explain to participants in the population survey that their participation in the project is voluntary. Verbal informed consent will be obtained from all participants prior to conducting the survey. The surveyor will inform each participant that he/she may choose not to respond to any question or to discontinue the survey interview at any time. Surveyors will provide participants with specific contact information for the study director, should participants have any questions once the survey is over. To protect the security of participant information, each participant will be assigned a random digit identification number. The identification number will be used to link participant information to survey responses for internal data tracking purposes. Separate databases will be used to house participants' identifying information (telephone number) and participants’ survey responses; each will be stored in a separate secure file on a secure network server. Only ICF project staff will have access to these data. Only aggregate responses will be used in the report of study results or in manuscripts. A de-identified data file will be created for CDC. In addition, all surveyors will be trained on the project’s specific confidentiality and security requirements and will sign a confidentiality agreement.


Provider Survey: An informed consent statement will be included in the mailed survey package. That statement explains to participants that the survey data are secure and that contractor project staff will retain the hard copies of the completed surveys in a locked file cabinet in the contractor’s office until 3 years after the expiration or termination of the contract. To protect the security of participant information, each participant will be assigned a random digit identification number. The identification number will be used to link participant information to survey responses. Separate databases will be used to house participants' identifying information (business mailing address and fax number) and participants’ survey responses; each will be stored in a separate secure file on a secure network server. Participants will also be notified that neither they nor their responses will be identified by name in any reports of the survey results or manuscripts. Participants will be given contact information for the study director, should they have any questions about the study. A de-identified data file will be created for CDC.



Case Study: All case study participants will receive information about the risks and benefits of their participation prior to their interview. Participants will be told that their participation in the project is voluntary and written, informed consent will be obtained from participants. The interviewer will inform participants that they may choose not to respond to any question or discontinue the interview at any time. Interviewers will provide participants with specific contact information for the study director, should participants have any questions once the interview is over. Participants will receive a copy of the consent statement.


In addition, all interviewers will be trained on the project’s specific data security requirements. All interviewers will sign a confidentiality agreement. During data collection in the field, interviewers will maintain data collection materials (interview tapes and notes) in their possession and in secured storage at all times until the data are returned to the office. The tapes will be subsequently transcribed by a trained professional transcriptionist who will sign a confidentiality agreement provided by the contractor. In the transcript notes, pseudonyms will be used in place of respondent names. Copies of these de-identified transcripts will be provided to CDC. The project staff will retain the audiotapes and hard copies of the interview notes until 3 years after the expiration or termination of the contract, and these notes will be stored in a locked file cabinet in the research office.


The case study results will be reported by case, for a total of six case study reports for each wave of data collection. In these reports, responses will not be attributed to any individual; instead, they will be reported in aggregate by state.



IRB Approval

The protocol was reviewed and approved by the ICF Macro IRB for the duration of the study. The protocol is reviewed by the ICF Macro IRB every twelve months to determine the continuation of approval (Attachment 12).

Privacy Impact Assessment Information

  1. The population and provider surveys and case studies will require the collection of identifiable information. The Privacy Act applies. The applicable System of Records Notice is 09-20-0136, “Epidemiologic Studies and Surveillance of Disease Problems.” For the administration of the population survey, the identifying data will be kept in secure electronic files separate from the survey results. Each participant will be assigned a code as the primary identifier. This is also true for the longitudinal provider survey, although the code will be used to link the results of the two administrations of the survey to assess change over time. For the case studies, participants will be asked about their thoughts and experiences with respect to implementing a CRC screening program or participating in their state’s CRC screening efforts. Because of the relatively low number of participants per health department, and despite all precautions, the information an individual provides may serve as an identifier. All precautions will be taken to minimize this possibility, including having ICF staff conduct the site visits and taking steps to de-identify respondent data (e.g., through use of pseudonyms). The implementation of these safeguards will allow respondents to provide valuable information that will deepen our understanding of state-level implementation and of how, and to what extent, the CRCCP has contributed to increases in population-level CRC screening rates.



  2. All electronic data, such as survey results and MP3 or .wav files of in-person interviews, will be stored in secured electronic files on secured contractor computers. Similarly, physical files containing respondent information, such as audiotapes or transcriptions, will be kept in locked file cabinets. Both electronic and physical files will be kept for the minimum 3 years required to comply with records retention requirements, at which time all files will be destroyed.



  3. Participation in all data collection activities for this impact evaluation is voluntary. The informed consent procedures for each data collection activity are described in detail below:



Population Survey: Potential respondents will be contacted via telephone and asked if they are willing to participate. Individuals who agree will receive information about the risks and benefits of participation during the introduction to the actual survey. The surveyor will read an informed consent statement included in the introductory language of the survey and request verbal consent prior to beginning the administration of the survey. The informed consent statement describes the purpose of the study, how the information will be used, and the steps that will be taken to protect participant confidentiality. (See Attachment 3). Surveyors will provide participants with specific contact information for the study director, should participants have any questions once the survey is over.

Participants will be informed that the survey is voluntary and that they may choose to discontinue the survey at any time, for any reason. If a respondent chooses to stop participating in the survey, the surveyor will ask the respondent whether they wish to withdraw all of the data (their responses) that they have already contributed. If they choose to stop the survey, but allow the data already collected to be used, the surveyor will thank them for their participation and follow data security and handling procedures as for a completed survey. If they choose to withdraw all data, the surveyor will thank them for their time, end the survey, and as soon as possible, dispose of all responses.



Provider Survey: Primary care providers who participate in the provider survey via mail will receive information about the risks and benefits of their participation via an informed consent statement included on the survey (see Attachment 5B). The informed consent statement describes the purpose of the study, how the information will be used, and the steps that will be taken to protect participant confidentiality. Participants will also be informed that the survey is voluntary and that they may choose to answer only the questions they wish, discontinue the survey at any time for any reason, or abstain from participating entirely. The statement will also inform participants that by returning a completed survey to the study investigators, they are consenting to participate in the study. The cover letter and the survey will provide participants with the name, telephone number, and address of the survey project director, should participants have any questions about the survey.


Case Studies: All individuals who participate in the in-person interviews (including program directors, partners, program staff, evaluators, and other stakeholders) will receive information about the risks and benefits of participation. Prior to beginning the interview, interviewers will read an informed consent statement (see Attachments 7D and 8D). Participants will be asked to sign the consent statement and will be given a copy of the consent document for their records. The consent form describes the purpose of the study, how the information will be used, and the steps that will be taken to protect participant confidentiality. Participants will also be informed of the following:


  • The interview is voluntary, and they may choose to discontinue the interview at any time for any reason.

  • The interviewer will take notes to capture what is covered in the interview.

  • The interview will be audiotaped.

  • If a respondent chooses to stop the interview, the respondent has the option of withdrawing all of the data (their responses) that they have already contributed.

    • If the respondent chooses to stop the interview but allows the data already collected to be used, the interviewer will thank him/her for participating and follow the data security and handling procedures used for a completed interview.

    • If the respondent chooses to withdraw all data, the interviewer will thank them for their time, end the interview, and, as soon as possible, shred the handwritten notes without typing or sharing those responses.

Contact information for the study director will be provided should participants have any concerns once the interview is over.


A-11. Justification for Sensitive Questions

Questions posed to respondents of the population survey address respondents’ knowledge, attitudes, and beliefs about CRC screening as well as family history of CRC and personal experience with CRC screening. This information is important to understanding the personal, social, and other contextual factors that may influence whether a person obtains CRC screening. The data collected reflect those needed to assess intermediate outcomes, as outlined in the theory of change, in order to determine program impact. The security of responses will be preserved by following the procedures outlined in Section A-10.


Questions for the provider survey and case studies inquire about everyday practice and program performance. There are no sensitive questions posed as part of these data collection activities.

A-12. Estimates of Annualized Burden Hours and Costs

  A. Estimated Annualized Burden Hours

The proposed study consists of two cycles of data collection that will be conducted over a three-year period. The total estimated annualized response burden is 2,425 hours. The estimated annualized burden hours are presented in Table A-12.1.


  • The Colorectal Cancer Population Survey (see Attachments 4A and 4B) will be administered to a randomly selected, state-based, representative, cross-sectional sample of individuals 50-75 years of age from each of the six states. Approximately 3,200 respondents will participate on an annualized basis. The estimated burden per response is 23 minutes. Approximately 9,600 individuals will be screened (see Attachment 3) to identify and recruit the targeted number of respondents.

  • The Colorectal Cancer Screening Practices: Survey of Primary Care Providers (Attachment 5A) will be administered to a state-based, representative, longitudinal sample of primary care providers in each of the six states. Approximately 3,200 primary care physicians will participate on an annualized basis. The estimated burden per response is 12 minutes.

  • Information collection will also include implementation case studies involving interviews with state health department program staff, affiliated partners, and other stakeholders from each of the six states. Interviews will last 60 minutes for all participants except program directors (the state CRCCP program director for each of the intervention states and a program director for cancer programs more generally in the control states), who will participate in 2-hour interviews. The average burden per response for program staff thus ranges from 1 to 2 hours, with an average of 1 hour and 15 minutes. Program directors will also complete two forms: a list of suggested interviewees (see Attachment 6A; average burden per response of one hour) and a form to schedule the interviews for the site visit (see Attachment 6B; average burden per response of five hours).


The data collection instruments were piloted with fewer than 10 respondents to determine burden estimates. Additional information on study design is presented in Section B.1.



Table A-12.1. Estimated Annualized Burden Hours

Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hrs) | Total Burden (in hrs)
General Population | Screener for the Colorectal Cancer Population Survey | 9,600 | 1 | 5/60 | 800
General Population: Eligible Individuals Ages 50-75 Years | Colorectal Cancer Population Survey | 3,200 | 1 | 23/60 | 1,227
Participating Primary Care Providers | Colorectal Cancer Screening Practices: Survey of Primary Care Providers | 1,600 | 1 | 12/60 | 320
CRCCP and Non-Grantee Program Directors | Suggested Interviewees Form | 4 | 1 | 1 | 4
CRCCP and Non-Grantee Program Directors | Site Visit Instructions Template | 4 | 1 | 5 | 20
CRCCP Grantee Program Staff | Interview Guide: Grantee Program Staff | 12 | 1 | 75/60 | 15
CRCCP Grantee Evaluators | Interview Guide: Grantee Program Evaluator | 4 | 1 | 1 | 4
CRCCP State and Local Sector Partners | Interview Guide: Grantee Partner | 4 | 1 | 1 | 4
CRCCP Private Sector Partners | Interview Guide: Grantee Partner | 4 | 1 | 1 | 4
Non-Grantee Program Staff | Interview Guide: Non-Grantee Program Staff | 12 | 1 | 75/60 | 15
Non-Grantee Evaluators | Interview Guide: Non-Grantee Program Evaluator | 4 | 1 | 1 | 4
Non-Grantee State and Local Partners | Interview Guide: Non-Grantee Partner | 4 | 1 | 1 | 4
Non-Grantee Private Sector Partners | Interview Guide: Non-Grantee Partner | 4 | 1 | 1 | 4
Total |  |  |  |  | 2,425







  B. Estimated Annualized Burden Costs

The annualized burden cost was calculated using hourly wage rates for the appropriate wage categories from the May 2010 National Occupational Employment and Wage Estimates published by the Bureau of Labor Statistics, U.S. Department of Labor. The annualized cost to respondents is estimated to be $66,044. There will be no direct costs to respondents other than their time to participate in their respective data collection activities.

The estimated hourly wage rate for each of the respondent audiences was calculated as follows:

  • Population Survey: Because households will be randomly selected, the initial screening questions may be asked of individuals who either (a) do not meet the eligibility criteria or (b) refuse to participate in the survey. These individuals will come from a diverse set of wage categories, which will also vary by state, age, gender, and race and ethnicity. As a result, it was deemed more accurate to use a wage rate reflecting the average wage across the six states participating in the impact evaluation. Specifically, an estimated hourly wage of $17.83 is assumed for all respondents to the screener, based on the 2010 National Occupational Employment and Wage Estimates referenced above. The same hourly wage rate of $17.83 was used for the individuals aged 50-75 who complete the survey in full. This rate may overestimate costs for individuals 65 years and older, who may be retired; however, because it is not possible to accurately estimate the number of respondents in each age category, this bias is acceptable.

  • Provider Survey: According to 2010 national-level data from the Bureau of Labor Statistics, the average salary of general and family practitioners ($173,860, divided by 2,080 hours) was used to calculate the hourly wage of $83.59 for primary care providers in the six states.

  • Case Studies: The average salary of $113,100 for general and operations managers was used to calculate the hourly wage of $54.38 for program directors and partners. For program staff, the average salary of $50,270 for health educators and healthcare support staff was used to calculate the hourly wage of $24.17. For program data managers/evaluators, the average salary of $64,948 for database managers was used to calculate the hourly wage of $31.23. All average salaries are from the 2010 Bureau of Labor Statistics, U.S. Department of Labor, estimates. The wage-rate arithmetic is illustrated immediately after this list.
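As an illustrative check of the wage-rate arithmetic, based on the standard 2,080-hour work year implied by the salary figures above:

\begin{align*}
\$173{,}860 \div 2{,}080 &\approx \$83.59\ \text{per hour (primary care providers)}\\
\$113{,}100 \div 2{,}080 &\approx \$54.38\ \text{per hour (program directors and partners)}\\
\$50{,}270 \div 2{,}080 &\approx \$24.17\ \text{per hour (program staff)}\\
\$64{,}948 \div 2{,}080 &\approx \$31.23\ \text{per hour (evaluators/data managers)}
\end{align*}

Each row of Table A.12.B.1 is then the product of annualized burden hours and the applicable hourly rate; for example, the provider survey cost is 320 hours × $83.59 ≈ $26,749.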



Table A.12.B.1. Estimated Annualized Burden Costs


| Type of Respondent | Instrument | Annualized Burden Hours | Hourly Wage Rate | Total Respondent Costs |
|---|---|---|---|---|
| General Population | Screener for the Colorectal Cancer Population Survey | 800 | $17.83 | $14,264 |
| General Population: Eligible Individuals Ages 50-75 | Colorectal Cancer Population Survey | 1,227 | $17.83 | $21,877 |
| Participating Primary Care Providers | Colorectal Cancer Screening Practices: Survey of Primary Care Providers | 320 | $83.59 | $26,749 |
| CRCCP and Non-Grantee Program Directors | Site Visit Suggested Interviewee Form | 4 | $54.38 | $218 |
| CRCCP and Non-Grantee Program Directors | Site Visit Instructions Template | 20 | $54.38 | $1,088 |
| CRCCP Grantee Program Staff | Interview Guide: Grantee Program Staff | 15 | $24.17 | $363 |
| CRCCP Grantee Evaluators | Interview Guide: Grantee Program Evaluator | 4 | $31.23 | $125 |
| CRCCP State and Local Sector Partners | Interview Guide: Grantee Partner | 4 | $54.38 | $218 |
| CRCCP Private Sector Partners | Interview Guide: Grantee Partner | 4 | $54.38 | $218 |
| Non-Grantee Program Staff | Interview Guide: Non-Grantee Program Staff | 15 | $24.17 | $363 |
| Non-Grantee Evaluators | Interview Guide: Non-Grantee Program Evaluator | 4 | $31.23 | $125 |
| Non-Grantee State and Local Partners | Interview Guide: Non-Grantee Partner | 4 | $54.38 | $218 |
| Non-Grantee Private Sector Partners | Interview Guide: Non-Grantee Partner | 4 | $54.38 | $218 |
| Total | | | | $66,044 |



A-13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers

The ICF project team will collect the information required for this impact evaluation. There are no additional capital or start-up costs for the six state health departments to participate in the impact evaluation. There will be some additional burden on program staff to provide lists of potential respondents for the case study site visits; however, these costs will be kept to a minimum. Other costs related to this effort are costs to the Federal government incurred through the evaluator's contract for the multisite evaluation. Therefore, the sites will not need to expend additional funds or assume any costs as a result of participating in this evaluation.



A-14. Annualized Cost to the Government

The evaluation will be supervised by the CRCCP Evaluation Team Lead, a federal employee. The Team Lead, in close consultation with ICF Macro, will provide oversight of all evaluation activities and ensure that data collection is conducted in accordance with OMB requirements.

The annualized cost to the government is estimated at $983,831.


This project is fully funded by CDC. The annualized project costs of the study are shown in Table A-14.1. The costs include (1) contract costs for ICF for data collection and analysis and (2) the cost of CDC staff involved in oversight and analysis. The total contract cost for carrying out the project is $2,686,560 over the remaining project period; this has been annualized over the remaining three years in the table below. The CDC costs comprise the personnel costs of federal employees involved in oversight and analysis, estimated at $88,311 per year (35% of one FTE at GS-14 step 7; 10% of one FTE at GS-13 step 10; 5% of two FTEs at GS-13 step 1; 25% was added to salaries for benefits). Thus, the total cost to the government over the remaining project period, including total remaining contractual costs and annualized costs for CDC oversight, is $2,774,871.
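As an illustrative summary of the arithmetic above (the salary symbols stand for the applicable annual GS salaries, which are not reproduced in this statement):

\begin{align*}
\text{Annualized contract cost:}\quad & \$2{,}686{,}560 \div 3 = \$895{,}520\\
\text{Annualized CDC personnel cost:}\quad & 1.25 \times \bigl(0.35\,S_{\text{GS-14/7}} + 0.10\,S_{\text{GS-13/10}} + 2 \times 0.05\,S_{\text{GS-13/1}}\bigr) \approx \$88{,}311\\
\text{Annualized cost to the government:}\quad & \$895{,}520 + \$88{,}311 = \$983{,}831
\end{align*}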



Table A-14.1 Estimated Annualized Cost to the Federal Government



| Remaining Project Costs, Including Data Collection | Annualized Amount |
|---|---|
| ICF Contract Costs | $895,520 |
| CDC Costs | $88,311 |
| Cost to Federal Government | $983,831 |



A-15. Explanation for Program Changes or Adjustments



This is a new data collection project.



A-16. Plans for Tabulation and Publication and Project Time Schedule

Project Time Schedule

The estimated timeline for data collection and analysis is provided below:


Table A-16.1 Data Collection and Analysis Time Schedule


| Activity | Timeframe |
|---|---|
| Wave 1 Data Collection Activities | |
| Population Survey | Oct 2013 - Dec 2013 |
| Provider Survey | Oct 2013 - Dec 2013 |
| Case Studies | Oct 2013 - Dec 2013 |
| Data Analysis | Jan 2014 - Mar 2014 |
| Case Study Report Writing | Ongoing, Oct 2013 - Mar 2014 |
| Report of Findings from Wave 1 | April 2014 |
| Wave 2 Data Collection Activities | |
| Population Survey | Oct 2015 - Dec 2015 |
| Provider Survey | Oct 2015 - Dec 2015 |
| Case Studies | Oct 2015 - Dec 2015 |
| Data Analysis | Jan 2016 - Mar 2016 |
| Case Study Report Writing | Ongoing, Oct 2015 - Mar 2016 |
| Report of Findings from Wave 2 | April 2016 |
| Manuscript Development | June 2016 |


Tabulation/Data Analysis

Both qualitative and quantitative data will be collected under this request. During each wave of data collection for the provider and population surveys, available data on non-responders will be analyzed to assess and minimize systematic bias. If non-response rates are high or the bias appears to be systematic, we will assess the causes and modify the approach to address them; for example, calls could be placed at specific times or on specific days to increase the likelihood of participation by eligible respondents. The data analysis plan for each data collection activity is described in more detail below.
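To make the non-response assessment concrete, the following is a minimal sketch (not part of the approved protocol) of how responders and non-responders could be compared on a sampling-frame variable; it assumes the frame supplies a variable such as state for every sampled case, and all file and column names are hypothetical.

```python
# Minimal non-response bias check (illustrative only; names are hypothetical).
import pandas as pd
from scipy.stats import chi2_contingency

frame = pd.read_csv("population_survey_frame.csv")  # one row per sampled case
# 'responded' = 1 if a completed interview was obtained, 0 otherwise;
# 'state' is known for every case from the sampling frame.
table = pd.crosstab(frame["responded"], frame["state"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
# A small p-value suggests response propensity differs by state, which would
# prompt adjustments such as varying call times or days in the affected states.
```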


Population Survey: Tracking changes in key population-level, proximal variables will be accomplished by administering a repeated, cross-sectional survey in the intervention and control states. The data analysis will examine between-group variation (intervention vs. control sites) and changes over time. Within each data collection wave, we will perform simple bivariate tests (t-tests and chi-squared tests) comparing means and proportions between the two populations (intervention and control). These tests will focus on the CRC screening rate (BRFSS data) as well as the population-level variables described earlier (e.g., knowledge, attitudes, intentions, screening history). In addition to the bivariate tests, we will develop logistic regression models for the probability of screening, and for each type of screening test, with independent variables including the covariates described earlier (e.g., attitudes, knowledge, receipt of a provider recommendation, perceived susceptibility). We will test whether there are significant state-level effects and intervention vs. control effects.
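One plausible form of such a model, shown here only as an illustration (the final specification will be determined during analysis), is

\[
\operatorname{logit}\,\Pr(\text{screened}_i = 1) \;=\; \beta_0 \;+\; \beta_1\,\text{intervention}_{s(i)} \;+\; \boldsymbol{\beta}_2^{\top}\mathbf{x}_i ,
\]

where \(\text{screened}_i\) indicates whether respondent \(i\) reports being up to date with CRC screening, \(\text{intervention}_{s(i)}\) indicates residence in a CRCCP-funded state, and \(\mathbf{x}_i\) is the vector of covariates listed above. State-level effects can be examined by adding state indicators, noting that states are nested within the intervention and control groups.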


To assess change in key variables over time, the data will be combined across the two survey waves and analyzed. For the repeated data, we will examine changes between time 1 and time 2, again comparing intervention and control states. The statistical tests will assess the differences (intervention vs. control) in differences (change over time) in order to determine how much of the change is attributable to the CRCCP.
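Expressed as an equation, the difference-in-differences quantity described here is

\[
\widehat{\text{DiD}} \;=\; \bigl(\bar{y}^{\,I}_{2} - \bar{y}^{\,I}_{1}\bigr) \;-\; \bigl(\bar{y}^{\,C}_{2} - \bar{y}^{\,C}_{1}\bigr),
\]

where \(\bar{y}^{\,I}_{t}\) and \(\bar{y}^{\,C}_{t}\) denote the mean (or proportion) of an outcome in the intervention and control states, respectively, at wave \(t\) (\(t = 1, 2\)).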



Provider Survey: Data from the repeated cross-sectional provider surveys will be used to assess changes in provider-level proximal outcomes in the intervention and control sites. The data analysis will examine between-group variation (intervention vs. control sites) and changes over time. Within each data collection wave, we will perform simple bivariate tests (t-tests and chi-squared tests) comparing means and proportions between the two populations (intervention and control). These tests will focus on changes in the provider-level variables described earlier (e.g., provider knowledge, attitudes, beliefs, and practices, including screening recommendations, types of tests used, and methods for ensuring compliance and completion).


In addition to the bivariate tests, we will develop logistic regression models for the probability of screening, with independent variables including the covariates described earlier (e.g., provider practice type, frequency of provider recommendations, attitudes toward the importance of screening). We will test whether there are significant state effects and intervention vs. control effects.


To assess change in key variables over time, the data will be combined across the two survey waves and analyzed. For the repeated data, we will examine changes between time 1 and time 2, again comparing intervention and control states. The statistical tests will assess the differences (intervention vs. control) in differences (change over time) in order to determine how much of the change is attributable to the CRCCP.
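As a concrete sketch of how the pooled difference-in-differences models for either survey might be estimated (illustrative only; the variable and file names are hypothetical and the final specification will be set during analysis):

```python
# Pooled two-wave difference-in-differences logistic model (illustrative).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pooled_survey_waves.csv")  # hypothetical pooled wave 1 + wave 2 file
# outcome:      1 if the respondent/provider reports the behavior of interest
# intervention: 1 for CRCCP-funded states, 0 for comparison states
# wave2:        1 for the second data collection wave, 0 for the first
model = smf.logit("outcome ~ intervention + wave2 + intervention:wave2", data=df)
results = model.fit()
print(results.summary())
# The coefficient on intervention:wave2 is the difference-in-differences term:
# the change over time in intervention states beyond the change in control states.
```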


Case Studies: The interview data will be analyzed by site and used to write detailed reports describing the implementation of CRC screening efforts in each state. These descriptive reports will be used for hypothesis testing. Hypothesis 1 (that the observed increases in the CRC screening rate and other more proximal outcomes among intervention states are attributable to the CRCCP) will be tested via analyses of the data from the population and provider surveys. Hypothesis 2 (that the proposed theory of change for the CRCCP, as described in the program logic model, accurately reflects the pathways between resources, program activities, and outcomes) will be tested via the qualitative case study data.

If warranted, themes across cases will be analyzed using ATLAS.ti, along with templates recommended by Stake (2006), to facilitate the systematic identification and documentation of themes and differences across the cases, together with supporting evidence. All analyses (population survey, provider survey, case study) will be conducted by ICF staff trained in the appropriate qualitative and/or quantitative research methods.


Publication Plans



At intervals throughout the study, DCPC will present and discuss the findings from all three data collection activities with each state participating in the evaluation. Participating states will be given the opportunity to review draft case study reports for clarity and accuracy. Population and provider survey data, although purely descriptive, may be used in manuscript development after the first wave of data collection.


At the close of the evaluation, the evaluation findings will be presented to participating states as well as the other states and tribes in the national program via public meetings. We will conduct presentations on the evaluation at professional conferences and prepare articles for submission to peer-reviewed journals, such as Preventing Chronic Disease and the American Journal of Public Health. We will make the results of this study available to the general public by publishing them on the CDC website for the Colorectal Cancer Control Program.



A-17. Reason(s) Display of OMB Expiration Date Is Inappropriate

Exemption is not being sought. All data collection instruments will display the expiration date of OMB approval.

A-18. Exceptions to the Certification for Paperwork Reduction Act Submission



This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.





References



1. CDC. Vital Signs: colorectal cancer screening among adults aged 50-75 years-United States, 2008. MMWR Morb Mortal Wkly Rep 2010; 59:1-5.


2. U.S. Cancer Statistics Working Group. United States Cancer Statistics: 1999-2006 Incidence and Mortality Web-based Report. Atlanta: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention and National Cancer Institute; 2010. Available at: www.cdc.gov/uscs. Accessed October 27, 2010.


3. Mandel JS, Bond JH, Church TR, Snover DC, Bradley GM, Schuman LM, & Ederer F, (1993). Reducing mortality from colorectal cancer by screening for fecal occult blood. New England Journal of Medicine, 328(19), 1365-1371.


4. Winawer, SJ, Zauber, AG, Ho, MN, O'Brien, MJ, Gottlieb, LS, Sternberg, SS, et al. (1993). Prevention of colorectal cancer by colonoscopic polypectomy. New England Journal of Medicine, 329(27), 1977-1981.


5. Altekruse SF, Kosary CL, Krapcho M, et al., eds. SEER Cancer Statistics Review, 1975-2007. Bethesda, MD: National Cancer Institute. Available at: http://seer.cancer.gov/crs/1975_2007/, based on the November 2009 SEER data submission, posted to the SEER website. Accessed October 27, 2010.


6. Farley TA, Dalal MA, Mostashari F, & Frieden TR. (2010). Deaths preventable in the U.S. by improvements in use of clinical preventive services. American journal of preventive medicine, 38(6), 600-609.


7. Seeff, LS, Seeff LC, DeGroff A, Tangka F, Wanliss E, Major A, Nadel M, et al. Development of a federally funded demonstration colorectal cancer screening program. Preventing Chronic Disease. 2008;5(2). http://www.cdc.gov/pcd/issues/2008/apr/07_0206.htm


8. Klein, K. J., Tosi, H., & Cannella, A. A. (1999). Multilevel theory building: Benefits, barriers, and new developments. Academy of Management Review, 24, 243-248.

9. Task Force on Community Preventive Services (2008). Recommendations for client- and provider-directed interventions to increase breast, cervical, and colorectal cancer screening. American Journal of Preventive Medicine, 35(1), S21-S25.


10. Frieden, T. R. (2010). A Framework for Public Health Action: The Health Impact Pyramid. Am J Public Health, 100(4), 590-595.


11. Shadish WR, Cook TD, & Campbell DT. (2002). Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston: Houghton-Mifflin.


12. Weitzman BC, Silver D, Dillman KC. Integrating a comparison group design into a theory of change evaluation: The case of the Urban Health Initiative. American Journal of Evaluation. 2002;23(4):371-385.


13. Centers for Disease Control and Prevention (CDC). Behavioral Risk Factor Surveillance System Survey Data. Atlanta, Georgia: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention.


14. Hogan SO & La Force M. (2008). Incentives in physician surveys: An experiment using gift cards and checks. Proceedings of the Survey Research Methods Section, American Association for Public Opinion Research. https://www.amstat.org/sections/srms/Proceedings/y2008/Files/hogan.pdf

1 This information was obtained from http://www.statehealthfacts.org/comparemaptable.jsp?typ=1&ind=432&cat=8&sub=100&sortc=1&o=a. The data are for December 2008. U.S. total includes territories and persons from the Pacific Islands
