OMB Clearance Package
OMB Clearance Application: Evaluation of Second Phase of Oncology Demonstration
December 12, 2007
TABLE OF CONTENTS
A. Justification
1. Circumstances of Information Collection
2. Purpose and Use of Information
3. Use of Improved Information Technology
4. Efforts to Identify Duplication
5. Efforts to Reduce Burden on Small Entities
6. Consequences of Collecting Data Less Frequently or Not At All
7. Special Circumstances
8. Federal Register Notice/Consultation Outside the Agency
9. Payments/Gifts to Respondents
10. Assurance of Confidentiality
11. Questions of a Sensitive Nature
12. Estimates of Annualized Hour Burden to Respondents
13. Estimates of Annualized Cost Burden to Respondents
14. Estimates of Annualized Cost to the Government
15. Changes to Burden
16. Time Schedule, Publication and Analysis Plan
Purpose and Main Research Questions
Tabulations and Statistical Analysis
Time Schedule and Publication Plan
17. Exemption for Display of Expiration Date
18. Exceptions to the Certification Statement
B. Collections of Information Employing Statistical Methods
1. Respondent Universe and Selection Methods
2. Information Collection Procedures
3. Methods to Maximize Response
4. Tests of Procedures
5. Individuals Consulted
A. Justification

1. Circumstances of Information Collection

The Centers for Medicare & Medicaid Services (CMS) of the Department of Health and Human Services (HHS) is requesting Office of Management and Budget (OMB) approval to survey physicians, specifically hematologists and oncologists, about the 2006 Medicare Oncology Demonstration Program. CMS, in conjunction with the National Cancer Institute (NCI), has contracted with L&M Policy Research, LLC (L&M) and the National Opinion Research Center (NORC) to conduct this assessment. The study will help HHS better understand the impact of the Demonstration on physician practices, as well as physicians’ overall experience with the Demonstration.
In January of 2005, CMS launched the Demonstration of Improved Quality of Care for Cancer Patients Undergoing Chemotherapy. This demonstration project was designed to provide incentives for oncologists to measure patient outcomes in three areas: 1) controlling pain, 2) minimizing nausea and vomiting, and 3) reducing fatigue. These three areas are often cited as concerns by patients undergoing outpatient chemotherapy. New billing codes, or G-codes, were added to the claims form that corresponded to four patient assessment levels for each of the patient symptom areas. Oncologists self-enrolled in the Oncology Demonstration program by billing the designated G-codes. Participation was open to all Medicare providers who were oncologists in the United States, and oncologists were paid $130.00 per visit for submitting this information with the billing data.
This Oncology Demonstration Program was redesigned in January of 2006 so that hematologists as well as oncologists treating certain types of cancer patients could submit data about cancer patients in their care in conjunction with evaluation and management (E&M) visits. The 2006 Oncology Demonstration Program used evidence-based practice guidelines to encourage quality care for patients with a primary diagnosis of cancer in one of 13 major diagnostic categories. In contrast to the 2005 G-codes, the 2006 G-codes gathered information regarding patients’ treatments, the spectrum of care they received from their physicians, and whether or not the care represented best practice.
The 2006 Oncology Demonstration Program aimed to: 1) focus oncology payments increasingly on patient-centered care, rather than chemotherapy administration; 2) learn to what extent Medicare beneficiaries are being treated in a manner that yields the best outcomes; 3) understand clinical cancer scenarios where there is no clinical consensus among physicians on the relevance of specific evidence-based practice guidelines; and 4) ensure that due emphasis is placed on a multi-disciplinary, comprehensive approach to palliation and end-of-life care. In addition, CMS hoped to reduce the potential that unnecessary services and tests are being performed, thereby lowering program costs while yielding better quality of life for Medicare beneficiaries with cancer.
As the premier institution for cancer research at the National Institutes of Health, as authorized in 42 USC Section 285, the National Cancer Institute (NCI) is responsible for the National Cancer Program. That program includes a cancer control and population science program that designs intervention and demonstration research studies aimed at translating recommended practices into cancer care delivery settings and at examining the barriers and facilitators to delivering such therapies when they are medically indicated and preferred by patients. The 2006 Oncology Demonstration Program presents a unique opportunity to evaluate how quality reporting initiatives via G-codes can inform our understanding of the quality of care of cancer patients in the Medicare program and how collecting and reporting these data influence physician performance.
To provide a comprehensive assessment, this evaluation will involve a short voluntary survey of physicians who participated in the 2006 Oncology Demonstration Program. The survey topics include physician characteristics, practice characteristics, Medicare patient composition, types of cancer patients, and impact of Demonstration on the practice (e.g., with respect to time management, patient outcomes, financial status of practice). The following subsections of this document provide a detailed justification for the collection of these data, in accordance with OMB requirements.
2. Purpose and Use of Information

There has been growing focus on cancer care in the health policy arena due to anticipated increases in cancer incidence, the refinement of evidence-based guidelines for managing quality cancer care for major diagnoses, and continual pressure to control medical costs while improving adherence to clinical care guidelines. In 2006, the American Cancer Society (ACS) projected approximately 1.4 million new cancer diagnoses and 557,264 cancer deaths, making cancer the second leading cause of death in the United States. Furthermore, as cancer care improves and survival rates increase, there is an accompanying rise in the costs of helping survivors cope with the long-term side effects (some as yet unknown) of the often intense and toxic care provided during cancer treatment. These trends are of particular concern for the Medicare program: older age and cancer risk are highly correlated, resulting in a heavy financial burden from the associated health care expenditures.
CMS has clear and compelling reasons to focus on encouraging appropriate and high quality of care for the beneficiaries it serves, in addition to fostering efficiencies across all facets of cancer treatment. The physician survey will provide first-hand information on how CMS payment systems encourage quality of care by physicians. Additionally, the survey will solicit feedback on how hematologists and oncologists have developed and gained experience with infrastructure and reporting of data. Finally, this survey will help CMS to understand lessons learned for future demonstration projects involving oncologists and other specialists.
3. Use of Improved Information Technology

The voluntary survey will be sent to physicians as a self-administered questionnaire (SAQ). To maximize response, NORC will also conduct telephone prompting to remind physicians to complete and return (by mail, fax, or email) the SAQ, or to allow them to complete the questionnaire over the telephone. All data will be keypunched into a programmable data entry system to minimize data entry errors. In addition, 10 percent of the data will be double entered and adjudicated to ensure accuracy and completeness.
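For illustration, a minimal sketch of the double-entry verification logic described above; the subject identifiers, field names, and file layout are hypothetical, not those of NORC's actual data entry system:

```python
# Illustrative double-entry verification: compare two independent keying
# passes of the same questionnaires and flag disagreements for adjudication.
# Field names and file layout are hypothetical.
import csv

def find_discrepancies(first_pass_path, second_pass_path, key="subject_id"):
    """Return (subject, field, first value, second value) for every field
    whose two keyed values disagree."""
    with open(first_pass_path, newline="") as f:
        first = {row[key]: row for row in csv.DictReader(f)}
    with open(second_pass_path, newline="") as f:
        second = {row[key]: row for row in csv.DictReader(f)}

    discrepancies = []
    for subject in first.keys() & second.keys():
        for field, value in first[subject].items():
            if second[subject].get(field) != value:
                discrepancies.append(
                    (subject, field, value, second[subject].get(field)))
    return discrepancies
```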
4. Efforts to Identify Duplication

In August 2006, the Office of the Inspector General (OIG) published a report that assessed the cost and overall performance of the 2005 Oncology Demonstration Program. For their study, OIG used demonstration claims data received through the end of 2005; conducted interviews with cancer researchers, CMS staff, and oncology physician practices; and reviewed CMS work papers, such as internal e-mails and meeting notes. In addition, as part of a larger project to assess the effects of changes in Medicare payments for physician-administered drugs, the Medicare Payment Advisory Commission (MedPAC) has conducted a study to learn more about physicians’ participation in the Oncology Demonstration. However, the MedPAC study was not intended to evaluate this demonstration and did not address the questions needed for a complete evaluation.
This is the first systematic data collection of its kind to take place regarding the 2006 Oncology Demonstration Program. No other projects assessing physicians’ experience of the 2006 Demonstration are known to be funded by either the federal government or private entities prior to the development of this study. No other survey of this type has been identified.
5. Efforts to Reduce Burden on Small Entities

The survey will have minimal impact on small entities. Completion of the survey will require minimal time (no more than 10 minutes) out of a respondent’s work day. In addition, NORC has pre-tested the survey instrument to identify problem areas prior to implementation. This pre-test focused on the main study concepts, question reading and order, and question clarity, helping to ensure that the survey is as easy to complete as possible.
6. Consequences of Collecting Data Less Frequently or Not At All

This survey will be conducted only once. The questionnaire data will provide CMS with critical information on how oncologists and hematologists adapted their practices in response to the demonstration and the extent to which the demonstration changed the way care is provided in physicians’ practices. Without these data, CMS would be unable to make an informed decision about the impact of using evidence-based clinical practice guidelines to improve cancer care.
Further, the aggregate data will provide CMS with a deeper understanding of why physicians chose to participate in this demonstration and the challenges to participation. These findings can be applied to future demonstrations involving other specialists.
7. Special Circumstances

This request is consistent with the general information collection guidelines of 5 CFR 1320.5(d)(2). There are no special circumstances associated with this project.
8. Federal Register Notice/Consultation Outside the Agency

CMS did not receive any comments in response to the April 20, 2007 60-day Federal Register notice. Per OMB, no comments were received in response to the June 29, 2007 30-day Federal Register notice either.
CMS has consulted with subject matter and survey design experts at L&M Policy Research, NORC, The Lewin Group, and American Institutes for Research (AIR) in designing the survey instrument and methodology. The names of the individuals consulted from these organizations can be found in Section B.5.
As noted earlier, CMS is collaborating with NCI in all aspects of the project. Contacts at the agencies include:
Pauline Karikari-Martin, MPH, MSN, APRN, BC, PAHM, Office of Research, Development and Information, Research and Evaluation Group, CMS, (410) 786-1040
Steven Clauser, Ph.D., Chief, Outcomes Research Branch, Division of Cancer Control and Population Sciences, Applied Research Program, NCI, (301) 451-4402.
9. Payments/Gifts to Respondents

Physicians will be compensated for their participation in the study. A pre-paid incentive of $25 will be included in the initial questionnaire mailing to physicians. NORC telephone prompters will be instructed to discuss the payment, and if a physician indicates that the payment was not received, another incentive will be mailed. Payment for participating in an interview or survey is standard practice when seeking the participation of professionals such as physicians. The incentive payment is an effective method of drawing physicians’ attention to the study and gaining their cooperation in completing the survey. It is not intended to be a payment for their time.
Experiments to study the effect of incentives both in the general population1 as well as on physician surveys2,3 have conclusively shown that incentives are effective. CMS and NCI believe this investment will have a strong impact on resulting response rates.
10. Assurance of Confidentiality

The privacy of all study participants will be protected. Personal identification information (i.e., physician names) and practice identification information will not be collected in the surveys. Instead, NORC will assign a subject identification number which will be used in place of the participant’s name on the questionnaire. Data files and reports delivered to L&M and CMS will contain subject identification numbers only and not the personal or practice identification information. Individual participants will not be identified in any report, publication, or presentation of this study or its results.
NORC will not store the participants’ names or other personal identifiers in the same computer file as their questionnaire data. Any paper copies of questionnaires will be stored in a cabinet/storage area separate from the study administration materials. Electronic data will be stored in a password protected data file and only authorized project staff will have access to the data. At the conclusion of the study, all hard copy materials will be destroyed and electronic files will be deleted as requested or archived in password protected files.
11. Questions of a Sensitive Nature

The survey does not include questions of a sensitive or personal nature. Respondents will be asked to answer from the perspective of their practice about implementing the demonstration, as well as the respondents’ opinions of the 2006 Oncology Demonstration Program. The questions are designed to solicit information solely regarding the demonstration in a professional/worksite setting.
The cover letter inviting physicians to participate in the study clearly states that the study is voluntary and that respondents may elect to skip any questions. We have noted that answering the survey does not affect their participation in the demonstration. In addition, the materials include a telephone number that participants may call in order to address any questions regarding their rights as a research subject.
12. Estimates of Annualized Hour Burden to Respondents

In Exhibit 1, we provide an estimate of the collection burden on participants for this effort. Study participants will take part in data collection one time only.
Exhibit 1. Estimate of Hour and Cost Burden to Respondents

| Item | Number of Respondents | Responses per Respondent | Average Respondent Minutes per Survey | Estimated Total Hour Burden | Median Hourly Wage Rate* | Total Hour Cost |
| Oncologists/Hematologists | 600 | 1 | 10 | 100 | $68.98 | $6,898.00 |

Total burden (hours): 100
Total imputed costs: $6,898.00

*Based on hourly wage for general physicians and surgeons, “May 2005 Occupational Employment and Wage Estimates,” U.S. Department of Labor, Bureau of Labor Statistics. Extracted November 25, 2006 from http://www.bls.gov/oes/oes_dl.htm
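The totals in Exhibit 1 follow directly from the table’s inputs:

```latex
\begin{align*}
\text{Total hour burden} &= 600 \text{ respondents} \times 1 \text{ response} \times \tfrac{10}{60}\text{ hour} = 100 \text{ hours}\\
\text{Total imputed cost} &= 100 \text{ hours} \times \$68.98/\text{hour} = \$6{,}898.00
\end{align*}
```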
13. Estimates of Annualized Cost Burden to Respondents

Data collection for this study will not result in any additional capital, start-up, maintenance, or purchase costs to respondents or record keepers. Therefore, there is no burden to respondents other than that discussed in the previous section (A.12).
14. Estimates of Annualized Cost to the Government

The Department of Health and Human Services has contracted with L&M Policy Research, LLC (L&M) to complete all components within the task order. L&M has subcontracted with NORC and will be paying NORC a total of $272,348 over a 24-month period to conduct the survey and contribute to the final report.
15. Changes to Burden

This is a new collection of data.
16. Time Schedule, Publication and Analysis Plan

This section contains an analysis plan for this survey, including: a) a review of the survey purpose and research questions, b) a review of the data sources, c) a discussion of the statistical analyses planned for each research question, and d) the time schedule for completing the project.
Purpose and Main Research Questions

The 2006 Medicare Oncology Demonstration Program is a national program that uses evidence-based practice guidelines to encourage quality care for patients with a primary diagnosis of cancer in one of 13 major diagnostic categories. The demonstration uses G-codes to gather information regarding patients’ treatments, the spectrum of care they receive from their physicians, the frequency with which physicians use clinical practice guidelines, and whether or not the care represents best practice. The goals of this evaluation are to determine how oncologists and hematologists adapted their practice in response to the CMS payment incentive, to understand the impact of using evidence-based guidelines to deliver care, and to uncover lessons learned for future demonstration projects involving all specialists.
The primary objectives of the survey will be to profile demonstration participants, collect the process experiences associated with participation, and assess physician attitudes about the demonstration and, more broadly, about evidence-based clinical guidelines.
The project will use a self-administered mail survey to collect this information. The survey will be mailed to demonstration participants. The main research questions for this survey are outlined in Exhibit 2.
Exhibit 2. Key Research Questions
1. What is the profile of the demonstration participants?
2. What processes are associated with participation in the demonstration?
3. What are physicians’ attitudes about the demonstration and, more broadly, about clinical guidelines?
The findings of this study will assist CMS in refining the activities related to future demonstrations for all specialists.
This section provides an overview of the survey instrument and details the data collection methodology. The project team will utilize a mixed-mode approach (mail, telephone, fax and e-mail) to conduct a survey of participating oncologists and hematologists that will assess their experience with and attitudes toward the demonstration.
The survey instrument has been developed and pre-tested prior to submitting this OMB clearance package. The questionnaire consists of the general domains outlined in Exhibit 3.
Exhibit 3. Physician Survey Domains
| Question Domain | Overview |
| Eligibility Screener | Verify eligibility for the survey. |
| Demonstration Background and Awareness | Capture information on knowledge of the demonstration, how physicians heard about it, and who made the decision to participate. |
| Demonstration Implementation | Capture practice process factors and the impact of the demonstration on the practice. |
| Physician Perceptions and Attitudes | Capture physician attitudes and perceptions regarding the demonstration and clinical guidelines. |
| Physician and Practice Characteristics | Capture demographic data (such as age, gender, specialty, practice size, and percent Medicare patients) to be used to analyze survey data. |
The participant survey is included as Attachment A.
Details of the data collection processes are described below.
Step 1: Processing of Sample File. Prior to mailings, all address information will first be processed through the Smartmailer4 program to ensure that each address is acceptable to the U.S. Postal Service and will not be returned as undeliverable. All mailings will include the postal instruction, “Do Not Forward, Return to Sender with Address Correction Information,” and all outgoing mail will be pre-sorted so that it is processed more quickly and efficiently by the United States Postal Service (USPS). In addition, we will process all provider lists through the Accurint database. Provider lists are processed in batches, and the database provides the most up-to-date address and telephone number information for known addresses.
Step 2: Mail Pre-Notification Letters. A physician’s decision whether or not to participate in a study is influenced by the content and appearance of the materials he/she receives. As the first introduction to the survey, the pre-notification letter is of critical importance and serves the essential function of alerting respondents to an upcoming survey. Its text and appearance address the nature and legitimacy of the study and emphasize why participation is so important. The pre-notification letter will be signed by an official at CMS. A copy of this letter is included as Attachment B.
The letter will be mailed to physicians approximately 1 week before they receive the mail survey. The letter will reference a toll-free number established and monitored by NORC. Supervisory staff responsible for answering the toll-free lines will have immediate access to the case management system. This will allow a respondent to confirm that his/her questionnaire was received, request an additional copy of the questionnaire, or complete the interview over the telephone at that time. After-hours callers will reach a scripted message and all calls will be returned within one business day. A project-specific fax line will also be established, allowing respondents to return completed questionnaires or receive additional copies if requested.
Mail returned as undeliverable from the USPS will be receipted and processed. Updated address information will be noted, entered into the case management system and directed to our interviewing staff for contacting. Returned mail with no updated information will be sent to NORC’s locating staff for review and examination.
Step 3: Initial Questionnaire Mailing. Approximately 7 to 10 days following the pre-notification letter, NORC will mail the initial questionnaire to all sampled respondents. This mailing will utilize all updated address information resulting from returned pre-notification letters. Integrated into each questionnaire will be a cover letter from CMS (included as Attachment C) and a letter endorsing the survey from the NCI (included as Attachment D). The personalized cover letter will describe the purpose of the study and request participation. To encourage cooperation, we will provide respondents with the most convenient means available to respond: by mail, by a secure, dedicated fax line, or by calling a toll-free telephone line to request a telephone interview. Furthermore, we will offer physicians the option of receiving the questionnaire in PDF format via e-mail so they can print out the questionnaire, complete it, and return it by mail or fax.
The questionnaire will be professionally printed. A postage-paid business reply envelope (BRE) addressed to NORC will also be included, providing physicians an easy, no-cost way to return completed questionnaires. In addition, a pre-paid incentive of $25 will be included in the initial mailing. The $25 serves both as a token of appreciation for physicians’ contribution to the study and as an effective method to maximize physician response (see A.9).
Step 4: Second Questionnaire Mailing. Follow-up mailings will be sent to respondents whose questionnaires have not been received by a date agreed upon with the Project Officer. To minimize the number of cases where a completed instrument is in transit to NORC but not yet receipted, NORC will enter all completed questionnaires into our case management system the same day they are delivered by the USPS. Once an updated list of non-respondents is complete, new questionnaires will be mailed. This second mailing will be identical to the initial mailing, with the exception of the cover letter (included in Attachment D), which will be revised to acknowledge the earlier mailing and express gratitude to those who have already responded. No additional incentive is proposed at this time. To further prevent any duplication, just prior to mailing, project staff will identify any newly returned questionnaires and remove those cases from the second mailing.
Step 5: Telephone Prompting. If, after a date approved by CMS, we have not received a completed survey, we will begin the telephone prompting effort. These calls will serve to boost the response rate achieved from the original and second mailings. Telephone interviewers will be responsible for conducting the following activities:
Telephone prompting of providers who have not yet returned their completed surveys despite receiving the initial and follow-up packages via U.S. mail.
Gaining cooperation and offering options of re-mailing the questionnaire, faxing, conducting a telephone interview, or receiving an e-mailed PDF file.
Managing resistance from gatekeepers, such as office managers, to achieve contact with sampled physicians.
Fielding incoming calls from providers who choose to reply to the survey via the toll-free line.
Research has shown that refusal responses are often a function of interviewer behavior.5 Therefore, NORC interviewers are trained not only in project-specific details but also in the larger goals of social science research. Trainings cover the fundamentals of data collection, including implementation of sample designs, approach to respondents, administration of questionnaires (neutral probing techniques, accurate recording of responses, following skip patterns, etc.), and protection of respondent confidentiality. In addition to receiving basic training on the fundamentals of administering surveys, NORC interviewers take part in project-specific training tailored to meet the needs of each study’s data collection method. Interviewers receive a detailed written manual of instructions describing the background and purpose of the study, the procedures, and the meaning and intent of individual questionnaire items; this manual is supplemented by one or more additional training devices, such as instructional videotapes and role-playing exercises. To inform ongoing training efforts, NORC supervisory staff regularly monitor all telephone interviewing activities throughout the data collection period. Furthermore, project and contact center staff continuously review production data and project-specific reports to gauge progress and quality and to identify any issues related to individual interviewers or production center staff.
Tabulations and Statistical Analysis

This section details the tabulations and statistical analyses that will be conducted for the survey of the 2006 Medicare Oncology Demonstration Program. The study will use both univariate and, where possible, multivariate techniques to analyze the data. Data analysis will focus on identifying results of the core research questions. Both descriptive and inferential statistics, such as the standard t-test, chi-square test, and multiple comparison procedures, will be utilized in the analysis. Nonsampling errors arising from unit and item nonresponse will be analyzed as discussed in item B.3. The remainder of this section presents specific analyses that will be conducted to answer each research question.
Research Question 1. What is the profile of the demonstration participants?
This analysis will explore characteristics of participants to determine the extent to which certain characteristics of both the physician and the practice appear to be related to both participation in and awareness of the demonstration. Descriptive statistics will be run on these characteristics and chi-square tests of association between groups will be conducted.
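As an illustration of these tests, a minimal sketch in Python, assuming the survey responses have been assembled into an analysis file (the file and column names are hypothetical):

```python
# Sketch of a chi-square test of association between a practice
# characteristic and awareness of the demonstration.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("survey_analysis_file.csv")  # hypothetical analysis file

# Cross-tabulate practice size against awareness of the demonstration.
table = pd.crosstab(df["practice_size_category"], df["aware_of_demonstration"])
chi2, p_value, dof, _expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
```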
Research Question 2. What processes are associated with participation in the demonstration?
There are several practice-level processes that affect physician implementation of the demonstration. This survey will provide an opportunity to document, in a uniform manner, the implementation factors associated with participation. Several survey questions gather information about how the physician and the practice implemented the demonstration. Descriptive statistics will be used to describe the implementation processes, and chi-square tests will be used to determine whether the implementation process varies across physician and practice characteristics. We will also use logistic regression, where appropriate, to determine if practice characteristics are associated with the likelihood of participating in the demonstration, as sketched below.
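A sketch of the logistic regression described above, again with hypothetical variable names:

```python
# Sketch: logistic regression of demonstration participation (0/1) on
# practice characteristics. Variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_analysis_file.csv")  # hypothetical analysis file

model = smf.logit(
    "participated ~ C(specialty) + C(practice_size_category) + pct_medicare",
    data=df,
).fit()
print(model.summary())  # coefficients are log-odds of participation
```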
Research Question 3. What are physicians’ attitudes about the demonstration, and more broadly, about clinical guidelines?
CMS is interested in learning which aspects of the demonstration are considered useful by physicians. An assessment of physician attitudes regarding aspects of the demonstration and an overall rating of the 2005 and 2006 demonstrations will help direct resources and effort. In addition, a series of questions will ask physicians for their perceptions of the demonstration’s impact, of evidence-based clinical guidelines, and of the quality of care provided to cancer patients.
Simple descriptive statistics will be used to identify the elements considered most important and useful. Chi-square tests of association between these perceptions and organizational characteristics will also be conducted. Analyses will also assess physician perceptions of the 2006 demonstration and clinical guidelines: these questions ask physicians to rate, on a scale of 1 to 5, how strongly they agree with key statements about the demonstration and clinical guidelines. Mean scores will be computed and compared across practice characteristics using a t-test, which assumes normally distributed data; the Wilcoxon signed rank test will be used as a non-parametric alternative with no accompanying normality assumption. Analyses will also examine possible correlations between overall opinion of the program and utilization of the program.
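A sketch of these comparisons for a single rating item, with hypothetical column names. Note that for comparisons between independent practice groups the rank-sum form of the Wilcoxon test applies; the signed-rank form would apply to paired ratings:

```python
# Sketch: compare mean agreement ratings (1-5 scale) between two practice
# groups with a t-test, plus a nonparametric alternative.
import pandas as pd
from scipy.stats import ttest_ind, mannwhitneyu

df = pd.read_csv("survey_analysis_file.csv")  # hypothetical analysis file

solo = df.loc[df["practice_size_category"] == "solo", "demo_rating"].dropna()
group = df.loc[df["practice_size_category"] == "group", "demo_rating"].dropna()

t_stat, t_p = ttest_ind(solo, group)     # assumes normally distributed data
u_stat, u_p = mannwhitneyu(solo, group)  # no normality assumption
print(f"t-test p = {t_p:.4f}; Wilcoxon rank-sum p = {u_p:.4f}")
```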
In addition to the closed-ended questions included in the survey, a final open-ended question has been included to allow physicians maximum flexibility in making suggestions and providing feedback about the demonstration. Responses to this question will be reviewed and common responses will be grouped and categorized for assessment.
Time Schedule and Publication Plan

Below is an anticipated timeline for implementing the survey. Pending OMB Clearance, we assume a June 2007 start date.
Exhibit 4. Timetable for Data Collection, Analysis, and Publication
| Activity | Length of Activity | Estimated Duration |
| Data Collection Preparations | 9 Weeks | May 7 – July 6, 2007 |
| Mail Advance Letter | Week 1 | July 9 – July 13, 2007 |
| Mail Questionnaire #1 and Cover Letter | Week 2 | July 16 – July 20, 2007 |
| Receipt of Returned Questionnaires | Weeks 3–5 | July 23 – Aug 10, 2007 |
| Mail Questionnaire #2 and Follow-up Letter | Week 6 | Aug 13 – Aug 17, 2007 |
| Receipt of Returned Questionnaires | Weeks 7–10 | Aug 20 – Sept 14, 2007 |
| Phone Prompting, Receipting, and Data Entry | Weeks 11–16 | Sept 17 – Oct 26, 2007 |
| Data Delivery | 4 Weeks | Oct 29 – Nov 23, 2007 |
17. Exemption for Display of Expiration Date

No exemption is being requested.
18. Exceptions to the Certification Statement

There are no exceptions to the certification statement.
B. Collections of Information Employing Statistical Methods

1. Respondent Universe and Selection Methods

The potential respondent universe comprises office-based physicians who provide evaluation and management (E&M) services of levels 2, 3, 4, and 5 to established patients with a primary diagnosis of cancer belonging to one of 13 major categories. Physicians, or cases, in the respondent universe are from four medical specialties: (1) medical oncology, (2) hematology, (3) hematology/oncology, and (4) gynecological oncology. The respondent universe contains a subset of physicians in the UPIN file supplied by CMS. The UPIN file will be cross-walked to the physician supplier file to determine site of service, in order to ensure that only office-based physicians are included in the universe.
We will draw a systematic random sample without replacement of cases from the respondent universe. This sample will be self-weighting; therefore, our sampling methodology introduces no design effect. We will determine the sampling interval (k) by dividing the number of cases in the universe by 1,600. We will then randomly choose a number between 1 and k and, starting with that integer, sample every kth unit.
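A minimal sketch of this selection procedure (the universe size shown is hypothetical):

```python
# Systematic selection: interval k = universe size / 1,600, a random start
# between 1 and k, then every kth case thereafter.
import random

def systematic_sample(frame, target_size):
    k = len(frame) // target_size        # sampling interval
    start = random.randint(1, k)         # random start between 1 and k
    return frame[start - 1::k]           # every kth unit from the start

universe = list(range(1, 16001))         # hypothetical universe of 16,000 cases
sample = systematic_sample(universe, 1600)
print(len(sample))                       # 1,600 cases
```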
We plan to divide the sample into replicates of 50 cases. Because the eligibility rate is unknown at this time, we may release a few replicates early and estimate the eligibility rate from their returns. This will help determine the number of sampled cases we need to release to achieve 600 completed surveys at a 60 percent response rate. Exhibit 5 below shows an example for clarification.
Using a released sample of 1,500 cases as an example (assuming we release all 1,500), the table shows that approximately 33 percent of cases would be out of scope/ineligible (deceased, non-office-based, or wrong specialty). An additional 27 percent of the released sample would be non-interviews, which count against the response rate; these are cases that do not return the questionnaire, break appointments, and/or cannot be reached. To calculate the final response rate, we remove the out-of-scope cases from the denominator, yielding a 60 percent response rate.
Exhibit 5.

| Call Disposition | Number of Cases | Percent | Percent Non-OOS |
| Complete | 600 | 40.00 | 60.00 |
| Non-Interviews | 400 | 26.67 | 40.00 |
| Out of Scope (OOS) | 500 | 33.33 | — |
| Sample Size Released | 1,500 | 100 | 100 |
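The 60 percent figure in Exhibit 5 follows from removing the out-of-scope cases from the denominator:

```latex
\text{Response rate} \;=\; \frac{\text{Completes}}{\text{Released} - \text{Out of scope}}
\;=\; \frac{600}{1{,}500 - 500} \;=\; 60\%
```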
Power Calculation
The sampling strata and cell sizes will include the following:
Demonstration participation status
Eligible claims volume
The proposed sample will consist of 50 physicians within each combination of the four specialties and three levels of claims volume, for a total of 600 respondents, as shown in the table below. The outcome of greatest interest is participation in the 2006 Oncology Demonstration Program, and we are interested in differences along either the dimension of claims volume or specialty, taken one at a time.
| Stratum | Low Volume Claims | Medium Volume Claims | High Volume Claims |
| Medical Oncology | 50 | 50 | 50 |
| Hematology | 50 | 50 | 50 |
| Hematology/Oncology | 50 | 50 | 50 |
| Gynecological Oncology | 50 | 50 | 50 |
In our calculations, the null hypothesis of no difference in demonstration participation between categories is tested (using a two-sided test) against the alternative that one or more categories do differ from the rest. The calculations express the results in terms of the “effect size”, essentially the ratio of the variance between categories under the alternative to the variance under the null. Given the desired power and significance level, this depends only on the number of groups and the number within each group. However, both of the terms in the effect size vary with the event rates within categories, and there is no unique set of event rates corresponding to a given effect size. We show several examples of rates within categories that would provide the effect in question.
A) Comparisons by claims volume:
3 groups, 200 subjects each, alpha = .05, power = .90: Effect size = .02109
Examples: If the average participation rate is 10%, the rates within groups could be 4.6%, 10%, and 15.4%, or 6.9%, 6.9%, and 16.2%, or 4%, 12%, and 14%. Larger differences between groups would improve the power.
B) Comparisons between specialties:
4 groups, 150 subjects each, alpha = .05, power = .90: Effect size = .02362
Examples: If the average participation rate is 10%, the rates within groups could be 6%, 5%, 15%, and 14%, or 7%, 7%, 7.5%, and 18.5%, or 3.7%, 9%, 10%, and 17.3%.
If the average participation rate is 25%, the rates within groups might be: 17%, 22%, 26%, and 35%, or 20%, 22%, 37%, and 21%, or 18.5%, 18%, 31%, and 32.5%.
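These figures can be checked with standard power routines. The sketch below assumes the stated effect sizes are squared Cohen's w values for a chi-square test across the groups, which is one reading of the variance-ratio definition above:

```python
# Check of the stated power calculations, assuming "effect size" means
# Cohen's w squared for a chi-square test across groups.
from math import sqrt
from statsmodels.stats.power import GofChisquarePower

power_calc = GofChisquarePower()

# (A) Claims volume: 3 groups of 200 (600 total), alpha = .05
power_a = power_calc.solve_power(effect_size=sqrt(0.02109), nobs=600,
                                 alpha=0.05, n_bins=3)
# (B) Specialty: 4 groups of 150 (600 total), alpha = .05
power_b = power_calc.solve_power(effect_size=sqrt(0.02362), nobs=600,
                                 alpha=0.05, n_bins=4)
print(f"power (A) = {power_a:.3f}, power (B) = {power_b:.3f}")  # both near 0.90
```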
2. Information Collection Procedures

A pre-notification letter alerting respondents to an upcoming survey will be mailed to all physicians in the sample. Fielding of the survey will entail mailing surveys, along with a cover letter and a pre-paid incentive of $25, to the physicians. A postage-paid envelope addressed to NORC will also be included, providing an easy and no-cost way of returning the completed questionnaires. Follow-up mailings with another copy of the questionnaire will be sent to respondents whose questionnaires have not been received by an agreed-upon date. If a completed survey is still not received, we will begin the telephone prompting effort. Telephone interviewers will be responsible for gaining physicians’ cooperation, offering alternatives such as a telephone interview or faxing the completed questionnaire, and managing resistance from office managers to achieve contact with sampled physicians. These calls will serve to boost the response rate achieved from the original mailing.
NORC will use a receipt control system using subject identification numbers to track the initial questionnaire mailing, address updates, re-mailing of questionnaires, and complete and incomplete questionnaire returns. Reports from the system will identify the sample members who require telephone prompting to complete the survey. After receipt, completed questionnaires will be entered into a system designed to enforce proper range edits, skip patterns, and missing data flags, all of which serve to prevent errors and promote data integrity. Ten percent of all questionnaires will be re-entered for verification and quality control purposes.
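For illustration, a minimal sketch of the kinds of edit checks such a system enforces; the items and codes are hypothetical, not the actual questionnaire content:

```python
# Illustrative edit checks at data entry: range edits, skip patterns,
# and missing-data flags. Item names and codes are hypothetical.
MISSING = -9  # missing-data flag

def edit_check(record):
    """Return a list of edit failures for one keyed questionnaire record."""
    problems = []
    # Range edit: agreement ratings must fall on the 1-5 scale.
    if record.get("demo_rating") not in (MISSING, 1, 2, 3, 4, 5):
        problems.append("demo_rating out of range")
    # Skip pattern: implementation items apply only to participants.
    if record.get("participated") == 0 and record.get("impl_burden") != MISSING:
        problems.append("impl_burden keyed despite skip pattern")
    # Missing-data flag: the eligibility screener may not be left blank.
    if record.get("eligible") is None:
        problems.append("eligibility screener not keyed")
    return problems

print(edit_check({"demo_rating": 7, "participated": 0,
                  "impl_burden": 3, "eligible": 1}))
# ['demo_rating out of range', 'impl_burden keyed despite skip pattern']
```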
3. Methods to Maximize Response

Declining response rates have been reported by all survey organizations and statistical agencies, regardless of respondent type and mode of data collection. In addition, physician surveys present unique challenges to response rates. While the obstacles to obtaining high response rates on physician surveys are formidable, good design decisions and proper implementation of all aspects of survey operations can greatly increase the likelihood of success. These decisions include judicious use of incentives and designing an instrument that keeps respondent burden as low as possible. In a recent survey conducted for MedPAC examining physicians’ use of imaging equipment, NORC obtained a response rate in excess of 70 percent despite a short field period.
The data collection procedures for the 2006 Oncology Demonstration Survey will follow the standards of the well-established and proven Total Design Method (TDM).6 In brief, the theory underlying the TDM is one of social exchange, in which the rate of response is a function of both the effort required of a respondent and the benefit the respondent receives for participating. The basic components of the TDM are as follows:
Minimization of respondent burden through the design of high-quality instruments that are attractive and easy to complete;
Persuasive communications which provide information about the study; and
A series of follow-up techniques that vary by mode, such as additional mailings and telephone calls.
Specifically, our plans for enhancing response rates for the 2006 Oncology Demonstration Survey are highlighted below.
Quality of Study Materials and Interviewers. A mail and telephone study can be characterized by the degree and quality of contact with both the study materials (questionnaire) and the telephone interviewer. In numerous examples, quality print materials and a team of well-trained, efficient, and effective interviewers have generated higher response rates than other modes. All print materials for the physician survey will be professionally designed and printed. In addition, NORC interviewers are highly trained, experienced, and skilled at achieving high response rates. Their skills and experience will be strengthened through thorough training.
Overcoming Barriers to Participation. Improving response from physicians begins with an in-depth understanding of the barriers to their participation in the survey. Our approach to gaining cooperation must be tailored to the needs of the physician community, paying particular attention to the amount of time and effort physicians need to complete the survey. The survey is user-friendly, with a maximum of 38 questions. The questionnaire was pilot tested with 6 physicians, and survey questions were amended to reflect suggested improvements from these respondents. Based on the pilot test, it is estimated that the survey will take no more than 10 minutes to complete.
Response will also be enhanced through the use of telephone prompting. NORC interviewers are skilled at adapting to the varied situations presented in each physician’s office. An array of options will be offered to maximize the study’s response rate. Specifically, physicians will have the option to complete the questionnaire over the phone, return it by fax or email, or return it by mail. In addition, interviewers will vary their call times and work with the physicians to determine a time convenient for the physician’s participation.
Salience of Survey Content. The survey was designed to be not only brief, but of high salience to the sampled physicians. Surveys that are salient to the respondent group coupled with timely advance materials (such as pre-notification letters) will maximize response for the survey. Furthermore, NORC will extend this expectation of salience to the telephone interviewers, making sure they are knowledgeable about the survey and its importance, thereby allowing them to be persuasive in their efforts to maximize response rates.
In summary, although no response rate can be guaranteed, we are confident that we have proposed a level of effort that should attain a survey response rate of at least 60%.
In addition to maximizing participation in the survey, efforts will also be made to ensure the survey data that are collected are representative of the study population of interest. While some level of nonresponse is acceptable, the impact of nonresponse on data quality is complex. Previous analysis of data from a CMS physician survey suggests that unit nonresponse (whether the survey itself was completed) may be negatively associated with item nonresponse (whether a critical question was answered). It is important that the effect of both types of nonresponse be understood when trying to assess the quality of data.7 Increasing response rates does not, by itself, ensure lower nonresponse bias.8,9
To examine nonresponse as a potential source of bias, L&M and its subcontractors plan to conduct a nonresponse analysis. The analysis will focus on those known factors in the sample file, including, but not limited to:
Variation in participation in the 2006 demonstration among survey responders and nonresponders.
Variation in practice characteristics including the practice structure, location, specialties, and practice size, among survey responders and nonresponders.
Variation in physician experience with Medicare among survey responders and nonresponders.
Variation in physician characteristics, including physician age, gender, and length of time in practice.
Upon completing the analysis, L&M will report its findings and recommendations to CMS. Our recommendations will include the possibility of weighting to adjust for nonresponse.
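One common form such an adjustment could take is a weighting-class adjustment, sketched below with hypothetical class and column names:

```python
# Sketch of a weighting-class nonresponse adjustment: respondents carry the
# inverse of their class response rate, standing in for nonrespondents with
# similar characteristics. Column names are hypothetical.
import pandas as pd

sample = pd.read_csv("sample_file.csv")  # frame with a 0/1 'responded' flag

# Response rate within each weighting class (here, specialty).
class_rates = sample.groupby("specialty")["responded"].mean()

sample["nr_weight"] = sample["specialty"].map(1.0 / class_rates)
respondents = sample.loc[sample["responded"] == 1]
print(respondents["nr_weight"].describe())
```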
4. Tests of Procedures

The questionnaire was pilot tested with 6 physicians. The physicians were asked to complete the questionnaire and then respond to debriefing questions regarding question comprehension, recall burden, and difficulties with response options. The questions were revised based on the respondents’ feedback.
5. Individuals Consulted

The project team chosen to conduct this study includes the following individuals:
L&M Policy Research
Lisa Green
240-476-6663
Myra Tanamor
202-230-9029
Julia Doherty
202-291-2518
NORC
Marc Berk
301-951-5087
Karen Cheung
202-887-2331
Angela Jaszczak
312-759-4236
Katie Lundeen
312-759-4221
The Lewin Group
Joan DaVanzo
703-269-5724
David Jack Kenny
703-269-5697
American Institutes for Research
Steven Garfinkel
919-918-2306
1 Berk M, Mathiowetz N, Ward P, and White L. 1987. “The Effect of Prepaid and Promised Incentives: Results of a Controlled Experiment,” Journal of Official Statistics 3.
2 Berk M, Edwards W, and Gay N. 1993. “The Use of a Prepaid Incentive to Convert Nonresponders on a Survey of Physicians,” Evaluation and the Health Professions 16, 2.
3 Berry SH and Kanouse DE. 1987. “Physician Response to a Mailed Survey: An Experiment of Timing and Payment,” Public Opinion Quarterly, 51, 102-116.
4 A product of the U.S. Postal Service, Smartmailer contains and regularly updates a database of all deliverable U.S. addresses. The Smartmailer software compares input addresses to the database to determine their absence or presence in the standard database; it also performs address corrections and standardization, such as zip code correction.
5 Smit, J.H. and W. Dijkstra. 1991. Persuasion Strategies for Reducing Refusal Rates in Telephone Surveys. Bulletin de Methodologie Sociologique 33: 3-19.
6 Dillman, Don A. 1978. Mail and Telephone Surveys: The Total Design Method. New York: Wiley.
7 Berk, M., Mueller, C., and Thran, S. “Can Survey Data be Used to Estimate Physician Practice Costs?” Evaluation and the Health Professions 19, 1 (1996).
8 Schoenman, J., Berk, M., Feldman, J., and Singer, A. “Impact of Differential Response Rates on the Quality of Data Collected in the CTS Physician Survey,” Journal of Evaluation and the Health Professions, 26:23-42, 2003.
9 Berk, M. “Interviewing Physicians: The Effect of Improved Response Rate," American Journal of Public Health (November 1985).