
February 14, 2007



Colorectal Cancer Screening

Demonstration Program






Application for OMB Clearance









Submitted by Laura C. Seeff, MD

Centers for Disease Control and Prevention

National Center for Chronic Disease Prevention and Health Promotion

Division of Cancer Prevention and Control

4770 Buford Highway NE, Mail Stop K-55

Atlanta, GA 30341-3724

(770) 488-3223

FAX (770) 488-4639






TABLE OF CONTENTS

A. Justification
A.1 Circumstances Making the Collection of Information Necessary
A.2 Purpose and Use of the Information Collection
A.3 Use of Improved Information Technology and Burden Reduction
A.4 Efforts to Identify Duplication and Use of Similar Information
A.5 Impact on Small Businesses or Other Small Entities
A.6 Consequences of Collecting the Information Less Frequently
A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.9 Explanation of Any Payment or Gift to Respondents
A.10 Assurance of Confidentiality Provided to Respondents
A.11 Justification for Sensitive Questions
A.12 Estimates of Annualized Burden Hours and Costs
A.13 Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
A.14 Annualized Cost to the Federal Government
A.15 Explanation for Program Changes or Adjustments
A.16 Plans for Tabulation and Publication and Project Time Schedule
A.17 Reason(s) Display of OMB Expiration Date is Inappropriate
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions
B. Collections of Information Employing Statistical Methods
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.3 Methods to Maximize Response Rates and Deal with Non-response
B.4 Tests of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
References


Exhibits

Table A12.A Number of Respondents and Estimated Burden Hours
Table A12.B Estimated Annualized Costs to Respondents
Table A14 Estimated Annualized Federal Government Cost Distribution
Table A16A Time Schedule for Data Reporting, Analysis and Publication
Table B2 Cutoff Dates for Complete Data Reporting by Awardees





LIST OF ATTACHMENTS


Attachment 1. Section 301 of the Public Health Service Act [42 U.S.C. 241]

Attachment 2. Clinical Data Collection forms

a. Colorectal Cancer Clinical Data Elements (CCDE)

b. Data User’s Manual for CCDEs

Attachment 3. Sample Feedback Reports

a. Error Summary/Edit Report

b. Data Quality Indicator Guide Report

c. Service Quality Indicator Guide Report

Attachment 4. Medical Complications Reporting Form

Attachment 5. Annual Aggregate Data on Medically Ineligible Clients Form

Attachment 6. Cost Data Forms

a. Allowable Procedures and Relevant CPT, HCPCS and APC Codes

b. Reimbursement Data Reporting Form

Attachment 7. 60-day Federal Register Notice

Attachment 8. List of Experts Providing Outside Consultation on Data Related Issues

Attachment 9. Selected Sites for the CRC Screening Demonstration Program, 2005

Attachment 10. Data Collection and Processing Flowchart

Attachment 11. Individuals Consulted on Statistical Aspects, Collecting and/or Analyzing Data

A. Justification

A.1 Circumstances Making the Collection of Information Necessary

The Centers for Disease Control and Prevention (CDC) requests approval from the Office of Management and Budget (OMB) to collect individual patient-level screening, diagnostic, and treatment data to be used to evaluate a new colorectal cancer (CRC) screening demonstration program located in five sites in the U.S. This program was initiated to explore the feasibility of establishing a colorectal cancer screening program for the underserved U.S. population. No national federally funded colorectal cancer screening program currently exists in the U.S. As of 2004, only 57% of the U.S. population had been screened for colorectal cancer as recommended (1), and most of the colorectal cancer screening currently performed in the U.S. is opportunistic, with limited screening occurring in self-contained screening programs across the country. Before considering a larger national effort, CDC decided to establish a three-year colorectal cancer screening demonstration, or pilot, program (CRCSDP) in five separate sites, to better understand which settings and program models may be most viable and cost-effective in reaching this population. In August 2005, CDC awarded five new 3-year cooperative agreements to implement CRC screening demonstration programs designed for low-income persons 50 years and older who are inadequately insured for colorectal cancer screening services. This program addresses the "Healthy People 2010" cancer focus area, specifically the objective to increase the proportion of adults who receive a colorectal cancer screening examination.


Colorectal cancer (CRC) is the second leading cause of cancer-related deaths in the United States, following lung cancer (2). Ninety-one percent of new cases and 94% of deaths from CRC occur in persons over 50 years of age (2). Although strong scientific evidence has shown that regular screening can prevent colorectal cancer and is effective in reducing CRC incidence and mortality (3-9), screening rates remain low (1, 10). Findings from the National Health Interview Survey (NHIS), administered by CDC, indicate that in 2000, barriers to screening included limited or no insurance coverage for CRC screening, lack of a regular health care provider, lack of organized systems in which screening and follow-up may be conducted, and no doctor's visit within the preceding year (10). In the face of these low rates of use of colorectal cancer screening tests, CDC and other federal and non-federal organizations are actively working to increase screening for colorectal cancer.


Regular CRC screening is now recommended for average-risk persons, using one or a combination of the following tests: fecal occult blood testing (FOBT), flexible sigmoidoscopy, colonoscopy, and/or double-contrast barium enema (DCBE) (11-13). Fecal immunochemical testing (FIT) is considered an acceptable alternative to FOBT. These tests vary in their costs, availability, and associated risks, and current evidence does not clearly demonstrate which of these tests is most effective. The programs that applied to be part of the CRC screening demonstration program were given the choice of which screening test(s) to offer from the above list of recommended tests.


In order to monitor the quality, effectiveness, appropriateness, cost, and cost-effectiveness of the CRC screening and diagnostic services delivered by these demonstration programs, and to compare any differences among the five unique program sites, CDC proposes to receive standardized, individual, patient-level data related to the CRC screening, diagnostic follow-up, and treatment services provided in these programs, which the five sites will collect and submit to CDC. The data will be entered into each program site's electronic database and formatted so that they can be exported to CDC on a quarterly basis (March 1, June 1, September 1, and December 1). The first data submission is planned for the next quarterly period following OMB approval. The data collected from each demonstration site will be used to provide immediate feedback to the programs for quality improvement and to inform current and future organized CRC screening efforts.


This demonstration program is authorized by Section 301 of the PHS Act (42 U.S.C. 241). A copy of the legislation is included as Attachment 1.



A.2 Purpose and Use of Information Collection


In order to assess the quality, effectiveness, and appropriateness of the services delivered by the cooperative agreement recipients of the colorectal cancer screening demonstration program and to monitor program effectiveness, programs will submit to CDC standardized data related to the CRC screening, diagnostic follow-up, and treatment services that are part of this demonstration project. CDC and the awardees worked together to define key data elements, known as the Colorectal Cancer Clinical Data Elements (CCDE), which are included in a codebook to be used by the programs and CDC (see Attachment 2a).


Awardees will also receive a Data User's Manual (Attachment 2b), developed by CDC and the data contractor IMS, that provides complete written instruction regarding CCDE data submission requirements, data variables, data field descriptions, report descriptions, and related topics. This document will support consistent submissions across awardee programs. The manual is available in bound hard copy and through a web site, maintained by the data contractor, for CRC screening demonstration program Data Managers and Program Directors.


Individual patient-level data will be collected on a continuous basis, but prepared for quarterly submission to CDC. Please see section B2 for a detailed description of data collection procedures. Briefly, data will be entered into the program site's electronic database and exported to CDC on a quarterly basis.


CDC has retained a contractor, IMS, to assist CDC and programs with collection, management and analysis of clinical data. All data received from programs will be submitted first to the data contractor, who will work with the program sites to identify and address data questions, provide data checks, and review the data for completeness, and if necessary, request clarification from awardees.


An analysis file will be created from the data, and will be used to produce three types of feedback reports: 1) an Error Summary Report, 2) a Data Quality Indicator Guide Report and 3) a Service Quality Indicator Guide Report (Attachments 3a, b, and c). Program site-specific Error Summary Reports will contain counts and associated percentages for blank field errors, inter-field relationship errors and inter-record relationship errors in each data set. Program site-specific Data Quality Indicator Guide Reports will be used to provide feedback to awardees about the quality, completeness and timeliness of their data within the most recent 18 months of data reporting, using tables and record audits. Finally, IMS will create an aggregated analysis file to generate standardized, CRC screening demonstration program surveillance reports (Service Quality Indicator Guide Reports) and any additional special CDC requests. Plots or graphs will be generated to provide fiscal year data for age and race demographics, counts of persons served, procedures performed and cancer incidence. Once the feedback reports are distributed to CDC, they will be shared with awardees and used for quality improvement of the delivery of services. These data and reports will also be used as part of the overall evaluation of the 3-year demonstration screening program, which will include the development of evaluation reports. The feedback reports will be used for quality assurance in the short term, and as program evaluation tools in the long term.
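To make the edit-check logic behind the Error Summary Report concrete, the sketch below counts blank-field and inter-field relationship errors for a batch of submitted records. It is a minimal illustration only; the field names and the two rules shown are hypothetical stand-ins, not the actual CCDE edit specifications applied by IMS.

```python
# Illustrative sketch of edit checks feeding an Error Summary Report.
# Field names and rules are hypothetical examples, not the actual CCDE edits.
from collections import Counter

REQUIRED_FIELDS = ["patient_id", "date_of_birth", "test_type", "test_date"]


def check_record(record: dict) -> list:
    """Return the list of error labels found in a single record."""
    errors = []
    # Blank-field errors: a required field is missing or empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            errors.append("blank:" + field)
    # Inter-field relationship error (example rule): a result without a test date.
    if record.get("test_result") and not record.get("test_date"):
        errors.append("inter-field:result_without_test_date")
    return errors


def error_summary(records: list) -> dict:
    """Tally counts and percentages of each error type across a submission."""
    counts = Counter()
    for record in records:
        counts.update(check_record(record))
    total = len(records) or 1
    return {error: {"count": n, "percent": round(100.0 * n / total, 1)}
            for error, n in counts.items()}


if __name__ == "__main__":
    sample = [{"patient_id": "000123", "date_of_birth": "19500401",
               "test_type": "FOBT", "test_date": "", "test_result": "positive"}]
    print(error_summary(sample))
```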


The CCDEs include a yes/no variable indicating whether any medical complications occurred within 30 days of any of the procedures offered through these programs. If any complications do occur, awardee program sites are asked to submit additional information regarding the complication, beyond what is captured in the CCDE submissions, on a Medical Complications Reporting Form (see Attachment 4). As described in Attachment 4, for clients whose complications do not result in hospitalization, CDC requests quarterly updates of any complications over the previous quarter. For clients whose complications do result in hospitalization, CDC would like to be notified within 72 hours of the medical complication, in advance of the quarterly report. While complications in colorectal cancer screening are rare, they can occur, and for quality control and evaluation purposes CDC feels it is very important to be apprised of any complications that may occur as a result of service delivery through the program, and of the measures taken to resolve them (14).


We are asking program sites to monitor and report to CDC an aggregate number of clients who attempt to enroll at each program but are not medically eligible (Attachment 5). The categories of clients that would be outside the scope of these programs, and therefore medically ineligible, include: 1) persons with a diagnosis of Inflammatory Bowel Disease, 2) persons with gastrointestinal symptoms for whom screening would be inappropriate, 3) persons with a history of a genetic cause for colorectal cancer (Hereditary Non-Polyposis Colorectal Cancer and Familial Adenomatous Polyposis), or 4) persons in whom genetic testing for these conditions is recommended. Additionally, awardee programs were given the choice of not including persons with a history of colorectal cancer or polyps in the program. Finally, clients who may have received a screening test outside of the program and are attempting to enroll in the program for a follow-up diagnostic test will be ineligible for the program. This information will be used as part of our overall evaluation and projection of future administrative need, in the event that this colorectal cancer screening program grows. CDC will be able to provide managerial planning data to any additional program site that might be added in the future, by providing an estimated total number of clients that might try to enroll in a CRC screening program, including those who may not be eligible for this program but who will need to be referred for care elsewhere.


Additionally, we are undertaking an economic analysis of this program. We plan to collect cost (reimbursement) data for the screening and diagnostic procedures funded through the program. The reimbursed or payment amounts will be used to estimate the clinical cost incurred. This is a fundamental part of our program evaluation. We are proposing to collect Current Procedural Terminology (CPT), Ambulatory Payment Classification (APC), and Healthcare Common Procedure Coding System (HCPCS) codes for individual patients to understand the types of procedures performed and the cost (reimbursement) of these procedures, by provider type (see Attachment 6a for the list of allowable codes). Based on strong preference by the program site awardees, these data will be collected separately from the CCDEs. This information is routinely collected and readily available to the awardees in their billing data systems. The program reimbursement data will include the following data elements: client ID, billing codes (CPT, HCPCS, or APC), an indicator of whether the payment was for bowel preparation, procedure date, charge amount, and payment/reimbursement amount. Each awardee program site will prepare one ASCII flat file containing the following data elements: patient ID (the same ID used for the CCDEs), billing codes (CPT, APC, HCPCS), date of procedure, provider type, and reimbursement amount (see Attachment 6b for the file layout), to be submitted to CDC annually. The sites will provide annual program reimbursement data by exporting the ASCII flat files from their existing databases for each of the three years of the demonstration. The data received from the awardees will be analyzed to calculate the costs associated with screening and diagnosis at the individual patient level, by provider type. In addition, these data will be linked with the clinical information gathered in the CCDEs to estimate costs associated with high-risk versus average-risk patients. Assessing these costs is critical for CDC to understand the true clinical costs and cost-effectiveness of a CRC screening program and will inform any future federal planning for colorectal cancer screening.
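To illustrate the kind of annual reimbursement export described above, the sketch below writes one ASCII record per reimbursed procedure using the five data elements named in the paragraph. The field order, the pipe delimiter, and the example billing code are hypothetical choices for illustration; the authoritative file layout is the one specified in Attachment 6b.

```python
# Illustrative sketch of assembling the annual reimbursement flat file.
# Field order, delimiter, and example values are hypothetical; Attachment 6b
# defines the actual file layout required by the program.
import csv
from datetime import date

FIELDS = ["patient_id", "billing_code", "procedure_date",
          "provider_type", "reimbursement_amount"]


def write_reimbursement_file(billing_records, path):
    """Export one delimited ASCII row per reimbursed procedure for the year."""
    with open(path, "w", newline="", encoding="ascii") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS, delimiter="|")
        for rec in billing_records:
            writer.writerow({
                "patient_id": rec["patient_id"],        # same ID used in the CCDEs
                "billing_code": rec["code"],            # CPT, HCPCS, or APC code
                "procedure_date": rec["procedure_date"].isoformat(),
                "provider_type": rec["provider_type"],
                "reimbursement_amount": "%.2f" % rec["paid_amount"],
            })


if __name__ == "__main__":
    write_reimbursement_file(
        [{"patient_id": "000123", "code": "45378",  # example colonoscopy CPT code
          "procedure_date": date(2006, 10, 15),
          "provider_type": "endoscopy facility", "paid_amount": 425.00}],
        "reimbursement_2006.txt")
```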


These CCDE data, medical complications data, aggregate enrollment data and cost data will be used by CDC for the following:

  • to assess the appropriate use of colorectal cancer screening and diagnostic tests, specifically under appropriate conditions, in the appropriate sequence, and within the appropriate time intervals,

  • to assess awardee’s ability to ensure timely follow-up diagnostic tests and treatment when required,

  • to estimate the number of clients seen at each awardee program site who are clinically ineligible,

  • to document complication rates associated with screening tests to compare outcomes across different screening programs and tests,

  • to estimate costs and cost-effectiveness associated with the different program designs and selected screening tests, where possible given the different program designs, and

  • to compare outcomes across the different awardee program sites and tests.


Measuring the quality of the delivery of these services is critically important, since data published in the peer-reviewed literature have shown that colorectal cancer screening services are not consistently delivered appropriately (15, 16). These data will be used for program monitoring and evaluation, including immediate quality improvement; to make an overall evaluation of the three-year demonstration program and an assessment of the feasibility of a systematic approach to increasing population-based CRC screening in this priority population; and to describe successes and barriers to establishing CRC screening programs in a community setting. Collecting and monitoring these data will also inform any future organized CRC screening efforts.


If these data are not collected and monitored, CDC would be unable to appropriately monitor and evaluate program implementation, effectiveness, and efficiencies, including program quality, screening outcomes, and costs. In addition, CDC would be unable to achieve the overall aim of assessing the feasibility of a systematic approach to increasing population-based CRC screening in this priority population. Based on what is learned from the collection and analysis of these data, CDC may recommend expanding these demonstration, or pilot, sites into a larger national effort. While CDC has not previously collected these data for CRC screening, since these demonstration CRC screening programs are a new CDC undertaking, we have been using a similar data collection and evaluation strategy for over 10 years as part of the administration of the National Breast and Cervical Cancer Early Detection Program (OMB No. 0920-0571, Minimum Data Elements (MDEs) for the National Breast and Cervical Cancer Early Detection Program). Those data have been critical in monitoring program quality and effectiveness and in responding to inquiries regarding the use of federal funds for that program. The data we propose to collect for this demonstration colorectal cancer screening program will be used in much the same way.


A.3 Use of Improved Information Technology and Burden Reduction

CDC will require awardees to electronically report a standardized set of screening and follow-up data elements, the Colorectal Cancer Clinical Data Elements (CCDE) (Attachment 2a). Program sites can choose to collect additional data elements beyond what is included in the CCDEs, but we will only be requiring that they submit the elements included in the CCDEs. The CCDE data elements were selected because they will provide the minimum amount of information necessary to accomplish project evaluation.


Clinical data elements to be collected in the CCDEs will be entered into electronic data systems. Each awardee site used funds awarded for the demonstration to develop or modify pre-existing electronic data systems to manage the enrollment of patients into the program and track clinical care and follow-up.  Each electronic data system is uniquely designed to meet site-specific needs and integrate with existing resources and program operations.  Systems range from PC-based to web-based and all include a function to export a standardized data report to CDC.   


Each site modified existing data systems that had been used to collect breast, cervical or colorectal cancer screening data to create a system to facilitate data entry, editing and reporting of the CCDEs. Awardees will report the data set as an electronic, fixed-length text file. The data definitions and record layouts for this file were designed by the Division of Cancer Prevention and Control (DCPC) at CDC in conjunction with IMS and the awardee program sites (see Attachment 2a).
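As a sketch of what a fixed-length export step might look like, the code below pads each value to a fixed column width and writes one record per line. The field names and widths are hypothetical placeholders; the actual record layout and data definitions are those in Attachment 2a.

```python
# Illustrative sketch of writing a fixed-length CCDE export record.
# The field names and column widths below are hypothetical placeholders;
# the authoritative record layout is defined in Attachment 2a.

LAYOUT = [                  # (field name, width in characters) -- illustration only
    ("patient_id", 10),
    ("date_of_birth", 8),   # YYYYMMDD
    ("test_type", 2),
    ("test_date", 8),       # YYYYMMDD
    ("test_result", 2),
]


def format_record(values: dict) -> str:
    """Return one fixed-length line, left-justified and blank-padded per LAYOUT."""
    return "".join(str(values.get(name, "")).ljust(width)[:width]
                   for name, width in LAYOUT)


def export_file(records, path):
    """Write all records as a fixed-length ASCII text file, one line per record."""
    with open(path, "w", encoding="ascii") as f:
        for rec in records:
            f.write(format_record(rec) + "\n")


if __name__ == "__main__":
    print(repr(format_record({"patient_id": "000123", "date_of_birth": "19500401",
                              "test_type": "01", "test_date": "20061015",
                              "test_result": "02"})))
```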


In the future, a web-based reporting system may be developed to further facilitate the reporting of screening and follow-up data sets by awardees.


A.4 Efforts to Identify Duplication and Use of Similar Information

Since no federally funded program currently exists which offers CRC screening in community settings to this identified population, there are no existing, comparable data sources available for the collection of this information. The data submitted to CDC by the awardee programs will provide information about persons specifically enrolled and screened in the CRC screening demonstration programs.


The consistent reporting by the awardee program sites to CDC of screening, final diagnosis, and treatment initiation data will help ensure that awardee programs provide appropriate and timely clinical services to persons who utilize the CRC screening demonstration program, a critical requirement of the program. These evaluation data will be used to improve patient care by helping to increase the percentage of abnormal tests that receive an appropriate follow-up test, the timeliness of the receipt of the appropriate follow-up test, and the timeliness of the initiation of treatment services, as outlined in Attachment 3c. This data collection is not designed to produce national estimates, but rather to evaluate five separate demonstration (pilot) programs, which will be used to guide how a national program might be designed.


The National Program of Cancer Registries (NPCR) collects data on all persons diagnosed with cancer. However, NPCR data do not include screening and tracking information nor do they allow for assurances that persons receive appropriate and timely care prior to and following final diagnosis. Additionally, they are collected and verified through medical record confirmation many months after a final diagnosis is made. Because it is imperative that CDC monitor the appropriate delivery of diagnostic and treatment services in a timely fashion, these data would not be sufficient for evaluating the CRC screening demonstration program activities.


A.5 Impact on Small Businesses or Other Small Entities

There will be no impact on small businesses.



A.6 Consequences of Collecting the Information Less Frequently

CDC will receive screening, diagnostic follow-up, and treatment data from awardee programs quarterly. This will allow CDC to regularly evaluate the overall performance of the CRC screening demonstration programs, to make adjustments toward improved effectiveness, and to identify new goals as part of ongoing planning efforts. Through quarterly review of the screening and follow-up data, CDC can identify any problems in a timely fashion. Since this is a new CDC undertaking, and organized colorectal cancer screening in general is new, we feel that the collection and evaluation of these data less frequently than quarterly would compromise CDC's ability to appropriately monitor program progress and to provide technical assistance as needed. Aggregated data on ineligible clients will be submitted annually, along with the December 1 CCDE data submission, to reduce burden on the awardee programs.


The cost data will be received annually, since these data are less time-sensitive. Reporting of medical complications incurred through services delivered in the CRC screening demonstration program will be performed quarterly. As described in Attachment 4, for clients whose complications result in hospitalization, CDC would like to be notified within 72 hours of the medical complication, in advance of the quarterly report. It is very important that we are kept apprised of any medical complications at the time they occur. While CDC will not be directly involved in any clinical steps taken to resolve the complications, CDC wants to be informed of what those steps are and how the complication has come to resolution. This information will be used to support and enhance quality assurance efforts at the five program sites.


A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


These data are collected in a manner consistent with the guidelines in 5 CFR 1320.5. There are no special circumstances contained within this application.


A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


As required by 5 CFR 1320.8(d), a notice of this data collection was published in the Federal Register on May 24, 2005 (Vol. 70, No. 99, pp. 29747-29759) and again on June 26, 2006 (Vol. 71, No. 122, pp. 36344-36345). A copy of the most recent 60-day Federal Register notice is included as Attachment 7. No comments were received in response to this notice.


The Division of Cancer Prevention and Control (DCPC) has employed several methods of consultation with individuals outside of the agency regarding the proposed data collection. In August 2004, DCPC convened a meeting of domestic and international stakeholders currently engaged in planning or implementing cancer screening programs. Attendees at this meeting were both federal and non-federal, and included clinicians, health planners, and representatives from various health organizations, all experts in CRC screening. The objective of the meeting was to gain stakeholder input on the need for, and utility of, establishing a colorectal cancer screening demonstration program and on the key data elements to be collected to track, monitor, and evaluate the demonstration program. There was strong consensus among all meeting attendees that the demonstration program was a critical next step in advancing colorectal cancer screening and that the collection of patient-level data elements would be required to perform a comprehensive evaluation of the demonstration project. A list of these meeting participants is provided in Attachment 8.


Additionally, once the CCDEs were in draft, CDC consulted with two clinicians, a gastroenterologist who is a leader in CRC screening, and a pathologist from the National Cancer Institute who focuses on liver and GI pathology, about the variables and data definitions included in the CCDEs. Revisions and improvements were made to the CCDEs based on input from these two clinicians. Their names and contact information are also included in Attachment 8.


As the CCDEs were developed and revised, the work group also received input from each of the awardee programs, so that we could be sure that the data we proposed to collect are actually available and were deemed the appropriate variables by the awardee programs as well as by CDC. The names of the persons consulted on the CCDEs from the awardee programs are also included in Attachment 8.


CDC also maintains an internal CRC Screening Demonstration Program Policies/Data/Evaluation working group that meets weekly to review and discuss policy and data issues related to this project. This work group includes CDC physicians, epidemiologists, program staff, senior statisticians, and social scientists. Upon OMB approval, once screening begins and data are received, the work group will continue to meet regularly to review the data and make recommendations for data changes, data analysis, and other program improvements, and may occasionally include outside experts.


A.9 Explanation of Any Payment or Gift to Respondents

No payments or gifts will be provided to program participants.


A.10 Assurance of Confidentiality Provided to Respondents

The CDC Privacy Act Officer has reviewed this OMB application and has determined that the Privacy Act is not applicable. Although respondents (demonstration screening site awardees) will review and collect identifiable patient-level data, they will transmit only de-identified data to CDC and the data processing contractor, IMS. CDC, IMS, and the evaluation contractor, RTI, will never receive sensitive, identifiable information as a part of this program.


Respondents are clinical care sites and state and local health departments that have routine access to identifiable medical information for conducting patient care and public health activities. In order to provide clinical and screening services, respondents will collect personally identifying information on each patient served by the CRC screening demonstration program (e.g., name, address, social security number, age, race/ethnicity), as well as information about the patient's screening history, the screening and diagnostic procedures provided, the results of those procedures, and, if cancer is diagnosed, information about treatment initiation and stage of disease. The respondent will assign a unique, sequential patient identification code to each patient in the CRC database (i.e., patient #1, patient #2, etc.), and the respondent will remove personal identifiers from CRC data prior to their transmission to IMS or CDC. The patient-level demographic data provided to IMS and CDC will include only the unique patient ID code, county of residence, state of residence, race, date of birth, and ethnicity. Respondents will not provide additional, potentially identifying variables or demographic information to IMS or CDC. These procedures allow CDC to anonymously track each patient served throughout his or her involvement with the CRC screening demonstration program, and to conduct the planned evaluations of the CRC screening program.

Each respondent will maintain a secure, encrypted data file linking its assigned CRC ID codes to patient identifiers such as name and SSN. The encryption scheme will not be provided to IMS, CDC, or any other entity. These provisions allow the respondent, and only the respondent, to re-link its CRC data with patient identifiers. The respondents are thus capable of following up on provider-initiated requests for information about specific patients, and to queries from CDC, as needed. All awardee sites will also maintain physical security measures. Hard copy data will be stored in locked file cabinets. All electronic data files will be password protected and access to the files will be limited to authorized project staff.
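A simplified sketch of the de-identification step described above follows: the site assigns a sequential program ID, keeps the linkage between that ID and the patient's identifiers in a local crosswalk file that only the site retains (and would encrypt and password-protect under its own procedures), and prepares export records containing only the de-identified fields. The function and field names are hypothetical.

```python
# Illustrative sketch of site-side de-identification before transmission.
# The crosswalk linking program IDs to identifiers stays at the awardee site;
# in practice the site would store it encrypted and password-protected.
# Function and field names are hypothetical placeholders.
import json
from pathlib import Path

EXPORT_FIELDS = ["county", "state", "race", "ethnicity", "date_of_birth"]


def assign_ids(patients, crosswalk_path):
    """Assign sequential program IDs and return de-identified export records."""
    crosswalk = {}        # program_id -> identifying information (kept locally only)
    export_records = []
    for n, patient in enumerate(patients, start=1):
        program_id = "%06d" % n            # patient #1, #2, ... as a zero-padded code
        crosswalk[program_id] = {"name": patient["name"], "ssn": patient["ssn"]}
        record = {field: patient.get(field) for field in EXPORT_FIELDS}
        record["patient_id"] = program_id  # only this code leaves the site
        export_records.append(record)
    # Local, access-controlled file; never transmitted to IMS or CDC.
    Path(crosswalk_path).write_text(json.dumps(crosswalk))
    return export_records
```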


Formal reports will be developed for publication biennially, with additional reports produced periodically. These reports will present data in anonymized form only. Demonstration site awardees understand and have agreed to comply with CDC's data sharing and release policy, which was written into the funding agreements with the awardees. Additionally, the original Request for Application (RFA) stated that CDC requirements for Release and Sharing of Data would apply to these projects (page 29755 of the RFA published May 24, 2005 in the Federal Register). The reports will be disseminated to the public through the CDC Web site, peer-reviewed journals, and other publications.


A confidentiality agreement exists between CDC and both IMS and RTI. This data collection is a program evaluation activity, not research. IRB approval is not required.


A.11 Justification for Sensitive Questions

Questions about cancer diagnosis can be considered sensitive, since at least a portion of patients would view a cancer diagnosis as a sensitive issue (it relates to health, potential social stigmatization, and insurability). However, these questions about cancer are fundamental to the core purposes of the project. The specific goals of this project cannot be accomplished without this information. In addition, race/ethnicity information, which may be considered sensitive, is collected for purposes of data analysis and in conformance with HHS policy.


Awardees may consider the information about medical complications to be sensitive, because it may be perceived as relevant to quality of care or practice issues. However, it is necessary to collect these data for quality control and program evaluation purposes, which are two critical project objectives. While complications in colorectal cancer screening are rare, they can occur, and CDC feels it is very important to be apprised of any complications that may occur as a result of service delivery through the program, and of the measures taken to resolve them. CDC will not receive any personal physician or patient identifiers associated with any of the data received, including the medical complications data.

Because the project includes both sensitive information and patient-level identifiers (although no personal identifiers), the project team has devoted particular attention to data security, data de-identification procedures, and the protection of sensitive information.


A.12 Estimates of Annualized Burden Hours and Costs

A. The total estimated annual respondent burden across all 5 demonstration program site awardees is 1270 hours for four types of data collection. Data collection for these four types of data [patient-level clinical data (CCDEs), medical complications, annual aggregate data on ineligible patients, and reimbursement data] will continue over the 3 years of the demonstration programs. Table A12A summarizes the number of respondents and estimated burden hours. Estimates are rounded to the nearest hour.


The respondent time burden of data collection for the National Breast and Cervical Cancer Early Detection Program was used to help estimate the time burden for the new CRC awardee programs' data collection, since similar types of data from similar data sources will be used. The burden for this proposed data collection will vary by the type of test performed; two of our five sites perform screening colonoscopy and three perform screening fecal occult blood testing (FOBT). We estimate one hour to enter CCDEs for colonoscopy programs and 15 minutes to enter CCDEs for FOBT programs. We estimate one hour of entry for each of the other data collection forms (medical complications, annual aggregate data on ineligible patients, and reimbursement data). We anticipate that if complications occur, they will occur at a rate of no more than one or two per quarter per program. The burden of data reporting will be reduced by a consistent quarterly reporting schedule. Attachment 6a, Allowable Procedures and Relevant CPT, HCPCS and APC Codes, is included for reference purposes and supports completion of the reimbursement data; burden for Attachment 6a is thus included in the cost data estimate.


Table A12.A. Number of Respondents and Estimated Burden Hours

Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent (1) | Average Burden per Response (in hours) | Total Burden (in hours)
Demonstration Program Site | CCDE (2) – for Colonoscopy programs | 2 | 240 | 1 | 480
Demonstration Program Site | CCDE (2) – for FOBT programs | 3 | 1000 | 15/60 | 750
Demonstration Program Site | Medical Complications Reporting Form (2) | 5 | 6 | 1 | 30
Demonstration Program Site | Annual Aggregate Data on Medically Ineligible Clients Form (3) | 5 | 1 | 1 | 5
Demonstration Program Site | Reimbursement Data Reporting Form (3) | 5 | 1 | 1 | 5
Total |  |  |  |  | 1270

(1) The number of responses per respondent is calculated by multiplying (estimated number of forms per report) by (number of reports per year).
(2) The CCDE form and the Medical Complications Reporting Form are submitted to CDC on a quarterly basis (4 times per year).
(3) Submitted to CDC on an annual basis.


B. The estimated annualized cost to respondents for the hour burden of reporting patient-level and cost data is based primarily on a mean hourly wage of $50.00 for awardee Data Managers. We anticipate that the Medical Complications Reporting Form will be completed jointly by a data manager ($50/hr) and a physician ($70/hr) (17); the estimated hourly wage for completion of this form ($60/hr) is the average of their wages.

As indicated in Table A12A, the estimated annualized hour burden for all program sites to report patient-level, medical complications, ineligibility, and cost data is 1270 hours. Therefore, the annualized cost for all five Demonstration Program Sites to report data, as provided in Table A12B, is estimated at $63,800. For colonoscopy programs, the estimated annualized cost of data collection is $12,460 per program site, and for FOBT programs, it is $12,960 per program site.


Table A12.B. Estimated Annualized Costs to Respondents

Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent (1) | Average Burden per Response (in hours) | Average Hourly Wage (in dollars) | Total Cost (in dollars)
Demonstration Program Site | CCDE – Colonoscopy | 2 | 240 | 1 | $50 | $24,000
Demonstration Program Site | CCDE – FOBT | 3 | 1000 | 15/60 | $50 | $37,500
Demonstration Program Site | Medical Complications Reporting Form | 5 | 6 | 1 | $60 | $1,800
Demonstration Program Site | Annual Aggregate Data on Medically Ineligible Clients Form | 5 | 1 | 1 | $50 | $250
Demonstration Program Site | Reimbursement Data Reporting Form | 5 | 1 | 1 | $50 | $250
Total |  |  |  |  |  | $63,800

(1) See note 1 to Table A12.A.
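As a cross-check, the totals in Tables A12A and A12B follow directly from multiplying respondents, responses per respondent, burden per response, and hourly wage; a minimal sketch of that arithmetic:

```python
# Cross-check of the annualized burden and cost totals in Tables A12A and A12B.
# Tuples: (respondents, responses per respondent, hours per response, hourly wage).
FORMS = {
    "CCDE - colonoscopy programs":           (2, 240,  1.00, 50),
    "CCDE - FOBT programs":                  (3, 1000, 0.25, 50),
    "Medical complications (quarterly)":     (5, 6,    1.00, 60),  # $60 = average of $50 and $70
    "Medically ineligible clients (annual)": (5, 1,    1.00, 50),
    "Reimbursement data (annual)":           (5, 1,    1.00, 50),
}

total_hours = sum(r * n * h for r, n, h, _ in FORMS.values())
total_cost = sum(r * n * h * w for r, n, h, w in FORMS.values())
print(total_hours, total_cost)   # 1270.0 hours and 63800.0 dollars, matching the tables

# Per-site annualized cost, matching the $12,460 and $12,960 figures cited above.
per_colonoscopy_site = 240 * 1.00 * 50 + 6 * 1.00 * 60 + 1 * 50 + 1 * 50   # 12460.0
per_fobt_site = 1000 * 0.25 * 50 + 6 * 1.00 * 60 + 1 * 50 + 1 * 50         # 12960.0
print(per_colonoscopy_site, per_fobt_site)
```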






A.13 Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers

This proposed data collection entails no additional cost to respondents or record keepers, since each awardee site used federal funds awarded for the demonstration program to develop or modify pre-existing software systems and to hire staff to collect data. 


A.14 Annualized Cost to the Federal Government

Total operation and maintenance costs include work performed by the data contractor, Information Management Services, Inc. (IMS), the evaluation contractor, Research Triangle Institute (RTI), and CDC personnel. IMS is funded at an annual cost of $126,617.00, for a three-year total of $379,851.00, for data management activities including data processing, analysis, systems development, and provision of technical support. RTI is funded at an annual cost of $149,000.00, for a three-year total of $447,000.00 over the project period, for assistance with the overall demonstration program evaluation and cost assessment. CDC personnel costs are estimated at $228,898.75 annually for 0.2 FTE of data manager time, 0.6 FTE of medical officer time, 0.6 FTE of epidemiologist time, 0.2 FTE of health economist time, and 0.4 FTE of public health analyst time. CDC personnel costs are estimated at $686,696.25 over three years. The following table summarizes the estimated Federal Government cost distribution.



Table A14. Estimated Annualized Federal Government Cost Distribution

Category | Annualized Cost
CDC Personnel | $228,899.00
Data Contractor (IMS) | $126,617.00
Evaluation Contractor (RTI) | $149,000.00
Total | $504,516.00


A.15 Explanation for Program Changes or Adjustments

This is a new data collection.


A.16 Plans for Tabulation and Publication and Project Time Schedule

Time Schedule

CDC requests a 3-year clearance for this recurring data collection. During the 3-year project period, patient-level data will be reported by awardees on a quarterly basis (March 1, June 1, September 1, and December 1), with the first data submission occurring on the first quarterly reporting date following OMB approval. The data files include cumulative data from the beginning date of each awardee's funded screening services up to the current reporting date. The data are formatted and analyzed within 40 working days of reporting, and analysis reports are developed within 60 working days of the reporting date. The following table summarizes the time schedule for data reporting, analysis, and publication.


Table A16A. Time Schedule for Data Reporting, Analysis and Publication

Task | Schedule
Patient-level data reported | Mar 1, Jun 1, Sep 1, and Dec 1 of each year
  • Raw data reviewed | 30 working days after data submission
  • Data analysis file created | 40 working days after data submission
  • Standardized surveillance reports generated | 60 working days after data submission
Cost data collection | Annually
Aggregate Data on Medically Ineligible Clients | Annually
Primary Statistical Reports | Produced biennially for publication
Planned Publications | Produced every 18 months during project
Special Research Projects | Produced periodically for publication


Publication Plan

CDC plans to use the patient-level data reported by awardees to produce three categories of publications: Primary Statistical Reports, Planned Publications, and Special Research Projects. The Primary Statistical Reports will be standardized, quarterly reports that include anonymized basic statistics and outcome variables by race and age. These will include formal, anonymized reports for use by CDC staff as well as Internet publications posted to the CDC Web site for dissemination to the public.


Planned Publications will be formal reports that include multivariate analyses of the minimum data set and an examination of test characteristics. These anonymized reports will be reserved for inclusion in publications such as the Morbidity and Mortality Weekly Report (MMWR) and for presentations at conferences. These publications will also be posted to the CDC Web site and submitted to peer-reviewed journals. CDC expects these publications to be produced every 18 months during, and immediately following the completion of, the demonstration program.


Special Research Projects may be developed as reports on topics of interest to CDC researchers that are for publication in peer reviewed journals. The CDC expects these projects to be developed periodically.


A.17 Reason(s) Display of OMB Expiration Date is Inappropriate

There is no request for an exemption from displaying the expiration date for OMB approval.


A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are requested.


B. Collections of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

The respondents are the five program sites that have received CDC cooperative agreement funds to implement and maintain colorectal cancer screening demonstration programs (see Attachment 9). They will be providing information on all clients served in each program. CDC requires data submissions as a stipulation of the CRC screening demonstration program Request for Application and the cooperative agreement notices of award.


Sampling methods are not employed. Screening and follow-up data collection is performed at the awardee level on every client enrolled in the CRC screening demonstration program and is reported to the CDC quarterly. Awardees will report cumulative data sets dating back each year to the establishment of their original cooperative agreement with the CRC screening demonstration program. The planned data collection is a program evaluation activity designed to improve specific programs at specific sites, and help inform future activities.


B.2 Procedures for the Collection of Information

See Attachment 10 for an illustration of the flow of data in the overall demonstration program. Individual patient-level data will be collected on a continuous basis, but prepared for quarterly submission to CDC. Data will be entered into each program site's database; each program site modified existing databases it had been using to collect colorectal, breast, or cervical cancer data so that it can collect the data from this demonstration program. Data will be edit-checked before submission, and all submitted data will be stripped of any personal identifiers. The data will then be formatted by each program site so that they can be exported according to the CCDE data definitions provided in Attachment 2a, and the file will be forwarded electronically to IMS, CDC's data contractor for these demonstration programs, on the quarterly reporting dates (March 1, June 1, September 1, and December 1). The first quarterly data submission will occur following OMB approval. The reporting schedule will remain consistent each year.


CDC acknowledges the potential delay between screening services and data entry. Thus, awardees will be expected to report complete demographic and screening data for all records with a procedure date more than 3.5 months prior to the reporting date and they will be expected to report complete final diagnosis and treatment initiation data for all records with a procedure date more than 9.5 months prior to the reporting date. The following table provides examples of the cutoff dates for complete data reporting.


Table B2. Data Reporting Schedule and Cutoff Dates for Complete Data Reporting by Awardees

CCDE Reporting

CCDE data are submitted quarterly to IMS as described in the CCDE Data Definition Table. Each submission includes cumulative records reported since program inception through the submission cutoff date. Submission cutoff dates are based on Date of First Test Provided (CCDE 6.1.02), allowing for a 3-month reporting lag. CCDE cycles that require additional tests/procedures may not be complete at the time of cutoff, but these records should be included in the submission.

Submission Due Date | Submission Cutoff Date
9/1/2006 | Cumulative through 5/31/2006
12/1/2006 | Cumulative through 8/31/2006
3/1/2007 | Cumulative through 11/30/2006
6/1/2007 | Cumulative through 2/28/2007
(continued quarterly)


Aggregate Reporting of Medically Ineligible Clients

Aggregate data are reported to IMS annually on December 1, using the prescribed form. Each aggregate data submission includes one year of data, from September 1 through August 31, allowing for a 3-month reporting lag.

Submission Due Date | Submission Period
12/1/2006 | 9/1/2005 - 8/31/2006
12/1/2007 | 9/1/2006 - 8/31/2007
(continued annually)


Medical Complications Reporting

Medical complications are reported to the CRC Technical Assistance Team following a reportable occurrence, using the prescribed form to document the details of the occurrence and its resolution.

Submission Due Date | Submission Includes
Clients requiring hospitalization: forms are reported within 5 days of occurrence, with monthly updates through resolution. | One Medical Complications Reporting Form per patient for any new or updated occurrence.
Clients not requiring hospitalization: forms are reported quarterly on Sep 1, Dec 1, Mar 1, and Jun 1. | One Medical Complications Reporting Form per patient for any new or updated occurrence.
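The reporting-lag rules in Section B.2 can also be expressed programmatically. The sketch below computes, for a given quarterly reporting date, the cutoff before which demographic and screening data are expected to be complete (3.5-month lag) and the cutoff for final diagnosis and treatment data (9.5-month lag). Interpreting the half month as 15 days is an assumption made for illustration; the authoritative cutoffs are those in Table B2.

```python
# Illustrative computation of the data-completeness cutoffs described above.
# Demographic/screening data: complete for procedure dates more than 3.5 months
# before the reporting date; final diagnosis/treatment data: more than 9.5 months.
# The extra half month is approximated here as 15 days (an assumption).
from datetime import date, timedelta


def months_back(d, months):
    """Step back a whole number of calendar months, clamping to a valid day."""
    month_index = (d.year * 12 + d.month - 1) - months
    year, month = divmod(month_index, 12)
    for day in (d.day, 30, 29, 28):
        try:
            return date(year, month + 1, day)
        except ValueError:
            continue


def completeness_cutoffs(reporting_date):
    """Return the procedure-date cutoffs implied by the 3.5- and 9.5-month lags."""
    return {
        "demographic_and_screening": months_back(reporting_date, 3) - timedelta(days=15),
        "final_diagnosis_and_treatment": months_back(reporting_date, 9) - timedelta(days=15),
    }


if __name__ == "__main__":
    print(completeness_cutoffs(date(2006, 12, 1)))
    # -> screening cutoff 2006-08-17, diagnosis/treatment cutoff 2006-02-14
```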




Once the data are reported to the data contractor, they will be logged and archived. The data sets will be reviewed for completeness and any necessary clarification will be requested from awardees. A file will be created and used to generate awardee-specific Error Summary Reports that will contain counts and associated percentages for blank field errors, inter-field relationship errors and inter-record relationship errors in each data set.


IMS will then create an aggregated analysis file for generating standardized, CRC screening demonstration program surveillance reports and special CDC requests. The analysis file will also be used to generate awardee-specific reports for feedback to awardees about the quality, completeness and timeliness of their data.


Feedback reports will provide feedback on data quality and completeness within the most recent 18 months of data reporting using tables and record audits. Plots or graphs will be generated to provide fiscal year data for age and race demographics, counts of persons served, procedures performed and cancer incidence. See Attachment 3a, b, and c for examples of error summary/edit reports, data quality indicator guide reports and service quality indicator guide reports. Once awardee programs receive the feedback reports, they will be given the opportunity to discuss the reports and their methods of data management with the CDC and the data contractor.



B.3 Methods to Maximize Response Rates and Deal with Non-response

Since receipt and analysis of data for evaluation purposes will be a critical component of this new CRC screening demonstration program, CDC expects that all awardees will report data in a timely manner. In addition, CDC required the proposed data submissions as a stipulation of the Request for Application and the cooperative agreement notice of grant award. Respondents that have any difficulty with a data submission will be provided technical assistance by the CDC Project Officer and/or the data contractor. The schedule for data reporting will remain consistent each year. The awardees should have little difficulty with the logistics of the request for reporting screening and follow-up data, since the data will be transmitted electronically as an ASCII text file, which is a common format for data interchange. The data definitions for the text file are provided in Attachment 2a.


Professional training in the use of the data reporting system will be available for awardee Program Directors and Data Managers at their annual reverse site visits, and periodic training and technical assistance will be available from CDC and the data contractor through monthly conference calls, or more frequently as needed. Awardees will receive an independent and detailed assessment of their data quality and completeness from the data contractor, providing an additional incentive for reporting the requested data.


Awardees will also receive a Data User’s Manual (Attachment 2b), developed by CDC and IMS, that provides complete written instruction regarding data submission requirements, data variables, data field descriptions, report descriptions, etc. This document will support consistent submissions across awardee programs.


B.4 Tests of Procedures or Methods to be Undertaken

Since this demonstration program comprises only a few sites (n=5), no formal pilot testing will be conducted, although the demonstration program itself can be considered a pilot. Lessons learned from this demonstration would inform a future and potentially larger program.


B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The data collection was designed by the Division of Cancer Prevention and Control, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, 4770 Buford Highway NE, Mail Stop K-52, Atlanta, GA 30341-3717, with assistance from staff at IMS and RTI, including, from CDC: Laura Seeff MD, Marion Nadel PhD, Blythe Ryerson MPH, Jean Shapiro PhD, Lisa Richardson MD, Faruque Ahmed MD PhD, James Gardner, and Florence Tangka PhD; from IMS: Bill Helsel, Bill Kammerer, Cindy Mattingly, and Steve Marroulis; and from RTI: Sujha Subramanian PhD and Debbie Holden PhD. The role of each individual regarding the design, collection, and analysis of the data is identified in Attachment 11. All CDC staff listed were involved in the design of the data collection and will be involved in the analysis of the data for evaluation purposes.


The CDC Project Officer for the IMS data management contract is Janet Royalty, MS (770-488-3085), Data Manager at the Program Services Branch, Division of Cancer Prevention and Control, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, 4770 Buford Highway NE, Mail Stop K-57, Atlanta, GA 30341-3717.


The CDC Project Officer for the RTI evaluation contract is Amy Degroff, MPH (770-488-2415), Health Education Specialist at the Program Services Branch, Division of Cancer Prevention and Control, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, 4770 Buford Highway NE, Mail Stop K-57, Atlanta, GA 30341-3717. Evaluation activities performed by RTI are conducted under the direction of Debra J. Holden, PhD (919-541-6491), Community Health Psychologist and Director, Community and Health Education Research, RTI International, 3040 Cornwallis Road, Research Triangle Park, NC 27709.

References


  1. CDC. Increased Use of Colorectal Cancer Tests --- United States, 2002 and 2004. MMWR 2006; 55 (11): 308-311.

  2. US Cancer Statistics Working Group. United States cancer statistics: 1999–2002 incidence and mortality. Atlanta, GA: US Department of Health and Human Services, CDC, National Cancer Institute; 2005. Available at http://www.cdc.gov/cancer/npcr/uscs/index.htm.

  3. Mandel JS, Bond JH, Church TR, et al. Reducing mortality from colorectal cancer by screening for fecal occult blood. N Engl J Med 1993; 328(19): 1365-1371.

  4. Selby JV, Friedman GD, Quesenberry CP Jr, Weiss NS. A case-control study of screening sigmoidoscopy and mortality from colorectal cancer. N Engl J Med 1992; 326:653-657.

  5. Newcomb PA, Norfleet RF, Storer BE, Surawicz T, Marcus PM. Screening sigmoidoscopy and colorectal cancer mortality. J Natl Cancer Inst 1992; 84:1572-1575.

  6. Hardcastle JD, Chamberlain JO, Robinson MH, Moss SM, Amar SS, Balfour TW, et al. Randomised controlled trial of faecal-occult-blood screening for colorectal cancer. Lancet 1996; 348:1472-1477.

  7. Kronborg O, Fenger C, Olsen J, Jorgensen OD, Sondergaard O. Randomised study of screening for colorectal cancer with faecal-occult-blood test. Lancet 1996; 348:1467-1471.

  8. Mandel JS, Church TR, Ederer F, Bond JH. Colorectal Cancer Mortality: Effectiveness of Biennial Screening for Fecal Occult Blood. J Natl Cancer Inst 1999;91 (5):434-437.

  9. Mandel JS, Church TR, Bond JH, et al. The effect of fecal occult-blood screening on the incidence of colorectal cancer. N Engl J Med 2000;343:1603-1607.

  10. Seeff LC, Nadel MR, Klabunde C, Thompson T, Shapiro JA, Vernon SW, Coates RJ. Patterns and Predictors of Colorectal Cancer Test Use in the Adult US Population. Cancer 2004;100:2093–103.

  11. US Preventive Services Task Force. Screening for colorectal cancer: recommendations and rationale. Rockville, MD: Agency for Healthcare Research and Quality; July 2002. Available at http://www.ahrq.gov/clinic/3rduspstf/colorectal/colorr.htm.

  12. Winawer S, Fletcher R, Rex D, et al. Colorectal cancer screening and surveillance: clinical guidelines and rationale. Update based on new evidence. Gastroenterology 2003;124:544-560.

  13. Smith RA, von Eschenbach AC, Wender R, et al. American Cancer Society guidelines for the early detection of cancer: update of early detection guidelines for prostate, colorectal, and endometrial cancers. CA Cancer J Clin 2001;51:38-75.

  14. Levin TR, Zhao W, Conell C, Seeff LC, Manninen DL, Shapiro JA, Schulman J. Complications of colonoscopy in an integrated health care delivery system. Ann Intern Med 2006;145(12):880-886.

  15. Nadel MR, Shapiro JA, Klabunde CN, Seeff LC, Uhler R, Smith RA, Ransohoff DF. National survey of primary care physicians' methods for screening for fecal occult blood. Ann Intern Med 2005;142(2):86-94.

  16. Mysliwiec PA, Brown ML, Klabunde CN, Ransohoff DF. Are physicians doing too much colonoscopy? A national survey of colorectal surveillance after polypectomy. Ann Intern Med 2004;141:264-271.

  17. Bureau of Labor Statistics. National Compensation Survey: Occupational Wages in the United States, June 2005. U.S. Department of Labor, Bureau of Labor Statistics; August 2006. Bulletin 2581. Available at http://www.bls.gov/ncs/ocs/sp/ncbl0832.pdf.



