Supporting Statement A for
The National Physician Survey of Precision Medicine in Cancer Treatment (NCI)
Date: May 12, 2016
Janet S. de Moor, PhD, MPH
Health Assessment Research Branch, Healthcare Delivery Research Program
Division of Cancer Control and Population Sciences
National Cancer Institute/National Institutes of Health
9609 Medical Center Drive, Room 3E438
Rockville, MD 20850
Telephone: 240-276-6806
Fax: 240-276-7909
Email: [email protected]
Check off which applies:
X New
Revision
Reinstatement with Change
Reinstatement without Change
Extension
Emergency
Existing
Table of Contents

A. JUSTIFICATION
A.1 Circumstances Making the Collection of Information Necessary
A.2 Purpose and Use of the Information Collection
A.3 Use of Information Technology and Burden Reduction
A.4 Efforts to Identify Duplication and Use of Similar Information
A.5 Impact on Small Businesses or Other Small Entities
A.6 Consequences of Collecting the Information Less Frequently
A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agency
A.9 Explanation of Any Payment or Gift to Respondents
A.10 Assurance of Confidentiality Provided to Respondents
A.11 Justification for Sensitive Questions
A.12 Estimates of Hour Burden Including Annualized Hourly Costs
A.13 Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers
A.14 Annualized Cost to the Federal Government
A.15 Explanation for Program Changes or Adjustments
A.16 Plans for Tabulation and Publication and Project Time Schedule
A.17 Reason(s) Display of OMB Expiration Date is Inappropriate
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions
Attachments
Attachment A Questionnaire
Attachment B Telephone Screener
Attachment C Survey Cover Letters and Emails
Attachment D Telephone Reminder Script
Attachment E Nonresponse Follow-Back Survey
Attachment F Consultants
Attachment G References
Attachment H OHSRP Exemption
Attachment I Privacy Act Memo
Attachment J Screen Shots from Web Survey
Attachment K Privacy Impact Assessment
Attachment L Thank You Letter for a Contingent Incentive
A. Justification
This is a new information collection seeking approval for two years. The purpose of this study is to investigate the current practice of precision medicine in cancer treatment among medical oncologists in the U.S. The project will conduct a nationally representative survey designed to assess oncologists' current and potential use of genomic testing and to inform the development of interventions that facilitate optimal use of genomic testing and improve patient-physician discussions of the risks, possible benefits, and uncertainties surrounding the use of these tests. Current knowledge is limited because no nationally representative studies of this topic have been conducted to date. The survey will be administered by mail and web to oncology physicians across the U.S. Nonrespondents will be invited to complete a follow-back survey to share their reasons for not participating. The study findings will inform NCI of relevant issues and concerns relating to the application of precision medicine to current and future cancer treatment patterns and practice. This information will also inform the development of new funding initiatives to optimize the use of precision medicine in cancer treatment. Additionally, information collected as part of this survey will be used to develop physician educational materials to address barriers to precision medicine in cancer care delivery.
A.1 Circumstances Making the Collection of Information Necessary
The present study aims to address the limitations of earlier studies by conducting a large, nationally representative survey of the current practice of precision medicine in cancer treatment. Specifically, the survey is designed to assess oncologists' current and potential use of somatic genomic testing in their practice, as well as to inform the development of procedures that facilitate optimal use of genomic testing and improve patient-physician discussions of the risks, possible benefits, and uncertainties surrounding these tests. The objectives of the survey are to assess medical oncologists' experiences, attitudes, and recommendations concerning genomic tests; to determine the prevalence of genomic testing in oncology; and to identify facilitators of and barriers to the integration of genomic testing into oncology practice.
We aim to conduct a nationally representative study with an analytic sample of approximately 1,200 medical oncologists who have treated cancer patients in the past 12 months (approximately 10% of the target population registered as hematologists or oncologists in the American Medical Association Masterfile). An analytic sample size of 1,200 was determined to be the minimum required to meet the following analytic goals:
provide a nationally representative sample of medical oncologists, consisting of the following subspecialties: hematologists, hematologist/oncologists, and oncologists.
allow for selection of a stratified random sample by subspecialty, metropolitan (metro) area, geographic region, sex, and age. Subspecialty and metro area will serve as the primary sampling strata; within strata, providers will be selected proportional to size by region, sex, and age so that the sample reflects the population.
provide an analytic sample size sufficient to detect a 10-15 percentage point difference in proportions of physicians by practice setting (e.g., academic versus non-academic, metro versus non-metro) with 90% power at a 0.05 significance level, using a two-tailed test.
meet the criterion for precision (margin of error not exceeding ±5% around point estimates) set by the study team at NCI (see the sketch following this list).
assume a 50-60% response rate and an ineligibility rate of 15-17% (McLeod, Klabunde, & Willis, 2013). These estimates are based on a review of 117 large provider surveys described in the literature and published between 2000 and 2010. The proportion of surveys with a response rate above 60% has declined over time; however, this survey incorporates design features associated with higher response rates (i.e., incentives, multimode data collection, and government sponsorship).
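As an illustrative check of the precision criterion above, the following minimal sketch computes the margin of error implied by the 1,200-person analytic sample. The 95% confidence level, the worst-case proportion of 0.5, and the assumption of simple random sampling with no design-effect adjustment are illustrative assumptions, not part of the study design.

```python
from math import sqrt
from scipy.stats import norm

n = 1200              # target analytic sample of completed surveys
p = 0.5               # worst-case proportion for a binary item (assumption)
z = norm.ppf(0.975)   # 95% confidence level (assumption), two-sided

# Margin of error under simple random sampling, ignoring any design effect
moe = z * sqrt(p * (1 - p) / n)
print(f"Margin of error at n={n}: +/- {moe:.1%}")  # about +/- 2.8%, within the +/- 5% criterion
```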
The surveys will be completed either on paper or on the web (see Attachments A and J). An initial 350 surveys will be administered as part of a pilot data collection, and the main study will then be fielded to yield approximately 1,200 completed surveys. The pilot will be used to verify our assumptions about the response rate and to refine the survey instrument. Because minor changes to the survey are anticipated in response to the pilot, the pilot data will not be combined with the final sample. Participants will be asked to answer questions about their experiences with genomic testing in oncology and their perceptions of the barriers to and facilitators of genomic testing within their practice. Demographic information and practice characteristics will also be collected. The survey is expected to take approximately 20 minutes. This will be a one-time (rather than annual) information collection.
To achieve approximately 1,200 completed surveys for the main study, we will draw a sample of 2,882 oncologists from the American Medical Association (AMA) Masterfile, which covers all physicians in the U.S. Because this study is designed to make inferences about the population of medical oncologists practicing in the United States, we opted to sample directly from the AMA Masterfile rather than supplement a federal survey such as the National Ambulatory Medical Care Survey (NAMCS), which includes a diverse sample of providers; further, the core NAMCS survey is designed to generalize to medical visits rather than to providers.
Prior to sending physicians a copy of the survey (Attachment A), the contractor will call the sampled physicians' offices using the screener (Attachment B) to verify that the sample member is eligible for the survey (i.e., a practicing oncologist under age 75) and that the mailing address is correct. For any sampled physicians who have moved, the contractor will attempt to obtain new address information from the office contacted or through various tracing procedures and internet searches. We estimate that approximately 15% of the initial sample will be ineligible or unable to be located. If needed, additional sample members can be selected and verified to ensure that the survey yields the analytic sample required for the study.
Eligible, sampled physicians will then be mailed a survey invitation. The initial survey package will be sent by U.S.P.S. first-class mail and will include a cover letter, the survey, and a postage-paid return envelope. A review of 117 large provider surveys found that 53% of provider surveys offering an incentive used a prepaid rather than a contingent incentive (McLeod, Klabunde, & Willis, 2013). The Division of Cancer Control and Population Sciences (DCCPS) has a long-standing program of provider surveys, and a prepaid incentive has been used successfully in the past (e.g., OMB No. 0925-0468, OMB No. 0925-0583, OMB No. 0925-0595, OMB No. 0925-0562, OMB No. 0925-0469). In our experience, a small number of providers do not complete the survey but cash the incentive check. For example, in the National Physician Survey of Practices on Diet, Physical Activity, and Weight Control (OMB No. 0925-0583), 4.9% of the sample (n=148) did not complete the survey but still cashed the check; this was largely offset by the 4.6% of the sample (n=138) who completed the survey and did not cash the check.
To empirically test whether a prepaid incentive is associated with a higher response rate than a contingent incentive, we propose to embed an experiment in the main study to test whether the response rate differs based on the timing of the incentive (prepaid versus contingent). Eligible providers will be randomly assigned to receive the incentive either before completing the survey (prepaid) or after the survey has been completed (contingent). Starting with a sample of 2,882 physicians and assuming an 85% eligibility rate, a 98% contact rate, and a 50% response rate, we have 2,400 potential respondents (1,200 per group) and 1,200 expected respondents (600 per group). We assume a two-sample test of independent percentages, a response rate of 50% in the contingent (post-incentive) group, half of the potential respondents assigned to each experimental group, and a two-sided test with a significance level (alpha) of 0.10. At 80% power with a two-sided test and alpha equal to 0.10, we can detect a 5.07 percentage point difference in the response rate (55.07 percent minus 50.00 percent). This information would be instructive for future provider surveys conducted by DCCPS. For the contingent incentive group, the check will be sent with the thank-you letter in Attachment L.
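A minimal sketch of the detectable-difference calculation for the incentive experiment described above. The group size (1,200 potential respondents per arm), reference response rate (50%), alpha (0.10, two-sided), and power (80%) come from the text; the closed-form normal approximation for comparing two independent proportions is an assumption about how the 5.07-point figure was obtained.

```python
from math import sqrt
from scipy.stats import norm

n_per_arm = 1200    # potential respondents per incentive arm
p0 = 0.50           # assumed response rate in the contingent (reference) arm
alpha = 0.10        # two-sided significance level
power = 0.80

z_alpha = norm.ppf(1 - alpha / 2)   # ~1.645
z_beta = norm.ppf(power)            # ~0.842

# Smallest detectable difference in response rates (normal approximation,
# variance evaluated at the reference proportion)
delta = (z_alpha + z_beta) * sqrt(2 * p0 * (1 - p0) / n_per_arm)
print(f"Detectable difference: {delta:.3f}")  # about 0.051, i.e., roughly 5 percentage points
```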
One and a half weeks after the follow-up mailing, we will send an email reminder to nonresponders with a link to complete the survey via the web. Screen shots of the web survey are provided in Attachment J; the web survey is identical in content to the paper survey. Within one and a half weeks of sending the email reminder, the contractor's telephone interviewers will begin making follow-up calls to nonresponders. The interviewers will remind sample members about the study, obtain new mailing or email addresses, and send replacement questionnaires or emails as needed. Two weeks after the email reminder, we will send the final follow-up mailing to nonresponders via FedEx; the package will include a new cover letter, a replacement survey, and a postage-paid envelope. One and a half weeks after the FedEx mailing, we will send a final email reminder to all nonresponding physicians with a link to the web survey. Reminder phone calls to nonresponding physicians will continue until the end of data collection, approximately 12 weeks after the initial survey invitations are sent. Copies of the cover letters and emails are provided in Attachment C, and the telephone reminder script is provided in Attachment D.
For this project, a brief (one-page) nonresponse survey (Attachment E) will be conducted with all providers who do not complete the pilot or the main survey. Because the sample for the nonresponse survey consists of providers who did not complete the pilot or main survey, we expect only 20% of this group to return it, translating to approximately 35 providers for the pilot and 250 providers for the main study.
A.2 Purpose and Use of the Information Collection
The findings from The National Physician Survey of Precision Medicine in Cancer Treatment will inform NCI of relevant issues and concerns relating to the application of precision medicine to current and future cancer treatment patterns and practice. This information directly supports the recently established Precision Medicine Initiative and will also inform the development of new funding initiatives to optimize the use of precision medicine in cancer treatment. Additionally, information collected as part of this survey will be used to develop physician educational materials to address barriers to precision medicine in cancer care delivery.
Study aims include:
Obtaining estimates of the proportion of oncologists who use specific genomic tests for individual gene mutations (KRAS, BRAF, HER2, MPL, etc.) and multi-marker tumor panels (DecisionDX, FoundationOne, Oncotype DX Colon)
Determining how tests are used (e.g., diagnostically or to guide treatment decisions)
Understanding the reasons tests are and are not used by oncologists
Identifying key barriers to and facilitators of the use of genomic tests
Determining differences in use, knowledge, and attitudes toward genomic tests between academic and community physicians
A.3 Use of Information Technology and Burden Reduction
Information technology will be used in the collection of information for this study. First, all selected sample cases will be screened via telephone to determine eligibility and to verify contact information. Eligibility status and contact information will be stored in a database maintained by the contractor on its secure server. Only contractor staff authorized to work on the project will have access to the project folder on the server. A Privacy Impact Assessment (Attachment K) was completed for this project and submitted to the NIH Privacy Office.
For eligible sample cases, physicians will be mailed a paper version of the survey to complete. Up to two follow-up mailings with replacement paper surveys will be sent to nonresponders. Sample members who do not respond to the mailed survey requests, and who have an email address, will be sent up to two email survey invitations with a link to a web survey that they can complete via computer or mobile device.
The web survey (attachment J) is programmed on Survey Gizmo. Survey Gizmo allows surveys to be accessed by users via secure (https) share links, which keep responses secure. It also has a Project Data Encryption feature that allows projects to encrypt all survey data that are received, so those data cannot be accessed without a password key.
Towards the end of the data collection period, reminder phone calls will be placed to nonresponding physicians. However, interviews will not be conducted over the telephone. Telephone interviewers can request that a survey be mailed or emailed to the sample member as needed.
In addition to its use in data collection, automated technology will be used in data editing and analysis. Burden will be reduced by recording data on a one-time basis for each respondent, and by keeping the survey to approximately 20 minutes.
A.4 Efforts to Identify Duplication and Use of Similar Information
Increased understanding of the mechanisms that drive carcinogenesis has resulted in an explosion of molecular and genomic tests to characterize individual tumors. These tests have placed oncology at the forefront of precision medicine, but no national data exist on the use of these tests in oncology or the barriers to and facilitators of their use. A previous NCI survey (OMB No. 0925-0469) characterized the use of testing for germline mutations associated with risk for breast, ovarian, or colon cancer. However, this survey did not include somatic testing for genetic alterations, through which the molecular characteristics of a patient's tumor are used to guide treatment decisions. Two non-federal studies have examined physicians' knowledge and attitudes regarding somatic genetic and genomic testing (Miller et al., 2009; Gray et al., 2014). However, the generalizability of these results is limited, and the findings are unlikely to apply to the majority of cancer care delivery systems in the United States.
Although we have some current information regarding physicians' knowledge of and attitudes toward genomic testing, we do not have nationally representative data on the perspectives of U.S. medical oncologists regarding the growing use of multi-marker tumor panels in cancer treatment. In particular, it is critical to capture timely data on the use of emerging genomic tests, which are relatively new and for which there are few guidelines on use or indicators of clinical utility or validity.
In summary, there is adequate background literature, but no national studies duplicate the efforts proposed in this statement.
A.5 Impact on Small Businesses or Other Small Entities
No small businesses will be involved in this data collection.
A.6 Consequences of Collecting the Information Less Frequently
The proposed data collection is one-time only. Although there are no plans for successive data collections, a portion of nonresponders will be selected to receive a follow-back survey to determine their reason for nonresponse.
A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
This collection of information fully complies with 5 CFR 1320.5. There are no special circumstances.
A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agency
A 60-day Federal Register Notice was published in the Federal Register on 11/18/2015 (Vol. 80, No. 222, p. 72077). No comments were received.
The National Survey of Precision Medicine questionnaire (see Attachment A) was developed with the consultation of several physicians and subject matter experts during the development period of the instrument. The names, addresses and affiliations of the consultants are listed in Attachment F.
A.9 Explanation of Any Payment or Gift to Respondents
Due to the historically low response rates of physician surveys, numerous studies have been conducted to identify methods for improving response rates. The general consensus among researchers in non-profit, government, and academic settings is that monetary gifts help ensure acceptable response rates, and at a provider survey workshop held at the National Cancer Institute in 2010, the consensus was that the use of monetary gifts for providers is common practice (Klabunde, 2012). Monetary gifts differ in type and amount; the most effective for mailed physician surveys is a noncontingent (i.e., sent before the survey is completed) token honorarium ranging from $50 to $75. This finding is supported by Dykema, Stevenson, Day, Sellers, & Bonham (2011), who conducted a gift experiment in a survey of physicians selected from the American Medical Association's Physician Masterfile. Physicians were randomly assigned to one of four treatment groups: no gift, $200 lottery, $50 gift, or $100 gift; response rates were highest in the $100 gift group. Another survey of medical oncologists (Oratz et al., 2011) provided a gift of $150 for completing a survey on genomic testing.
Given this information, we will offer a monetary gift of $50. The optimal timing of the $50 gift will be tested through the prepaid versus contingent incentive experiment described in Section A.1.
References are provided in Attachment G.
A.10 Assurance of Confidentiality Provided to Respondents
Personally identifiable information will be collected by NCI. This information is in the form of name and email. Each survey will have a unique ID number associated with it, which will be used as the record identifier during data entry and analysis. All information that can identify individual respondents and link respondent contact information with the unique ID will be maintained in a form that is separate from the data provided to NCI.
All information will be kept private to the extent allowable by law. The privacy of information submitted is protected from disclosure under the Freedom of Information Act (FOIA) under sections 552(a) and (b) (5 U.S.C. 552(a) and (b)), and by part 20 of the agency's regulations (21 CFR part 20). This information collection has been deemed exempt from human subjects protections because it is not human subjects research (Attachment H) and therefore does not require review by the Institutional Review Board (Research Involving Human Subjects Committee, RIHSC) prior to collecting any information.
All electronic data will be maintained in a manner consistent with the Department of Health and Human Services' ADP Systems Security Policy as described in the DHHS ADP Systems Manual, Part 6, chapters 6-30 and 6-35. As determined by the NCI Privacy Act Memo (Attachment I), the Privacy Act is applicable because PII is being collected and stored in a system where records are retrieved by personal identifiers. This collection is thus covered by NIH Privacy Act System of Records 09-25-0156, "Records of Participants in Programs and Respondents in Surveys Used to Evaluate Programs of the Public Health Service, HHS/PHS/NIH/OD."
A.11 Justification for Sensitive Questions
This data collection will not include sensitive questions.
A.12 Estimates of Hour Burden Including Annualized Hourly Costs
A.12.1 Estimated Annualized Burden Hours
The total annual estimated burden is 359 hours for this initial collection, from a total of 1,375 individuals. This includes a burden estimate for the office manager or other administrative staff person who answers the telephone at the sampled oncology physician's office (referred to as receptionists in Table A.12-1) and for the oncology physicians completing the pilot study (175 respondents), the main study (600 respondents), and the nonresponse follow-back survey (143 respondents). The initial telephone screener call is estimated to take 3 minutes. The pilot and full studies are assumed to take 20 minutes on average to complete. We estimate that the one-page follow-back survey (Attachment E) will take 5 minutes to complete, and the telephone reminder (Attachment D) approximately 5 minutes. The annualized burden hours are summarized in Table A.12-1. This study will take place over two years.
Table A.12-1 Estimated Annualized Burden Hours

Form Name | Type of Respondent | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Annual Burden Hours
Telephone Screener (Attachment B) | Receptionists | 775 | 1 | 3/60 | 39
Precision Medicine Survey – Pilot Study (Attachment A or J) | Oncology Physicians | 175 | 1 | 20/60 | 58
Precision Medicine Survey – Full Study (Attachment A or J) | Oncology Physicians | 600 | 1 | 20/60 | 200
Non-response Follow-back Survey (Attachment E) | Oncology Physicians | 143 | 1 | 5/60 | 12
Telephone Reminder Script (Attachment D) | Receptionists | 600 | 1 | 5/60 | 50
Total | | 1,375 | 2,293 | | 359
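As a quick arithmetic check, the following minimal sketch recomputes the annual burden hours from the figures in Table A.12-1 (all inputs are taken directly from the table):

```python
# (form, number of respondents, responses per respondent, minutes per response)
rows = [
    ("Telephone Screener",                     775, 1,  3),
    ("Precision Medicine Survey - Pilot",      175, 1, 20),
    ("Precision Medicine Survey - Full Study", 600, 1, 20),
    ("Non-response Follow-back Survey",        143, 1,  5),
    ("Telephone Reminder Script",              600, 1,  5),
]

total_minutes = 0
for name, n, responses, minutes in rows:
    total_minutes += n * responses * minutes
    print(f"{name}: {round(n * responses * minutes / 60)} hours")

print(f"Total annual burden: {round(total_minutes / 60)} hours")  # 359 hours, matching the table
```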
A.12.2 Annualized Cost to Respondents
The total annual estimated cost to respondents is $26,058. Table A.12-2 shows the annualized costs broken down by type of respondent. For receptionists, the wage rate of $15.50 is the mean hourly wage from the Bureau of Labor Statistics for Occupation Code 43-6013, Medical Secretaries (http://www.bls.gov/oes/current/oes436013.htm). For oncology physicians, the wage rate of $91.23 is the mean hourly wage from the Bureau of Labor Statistics for Occupation Code 29-1069, Physicians and Surgeons, All Other (http://www.bls.gov/oes/current/oes291069.htm).
Table A.12-2 Annualized Cost to Respondents

Type of Respondent | Total Annual Burden Hours | Hourly Wage Rate | Respondent Cost
Receptionists (telephone screener) | 39 | $15.50 | $605
Oncology Physicians (Pilot) | 58 | $91.23 | $5,337
Oncology Physicians (Full study) | 200 | $91.23 | $18,246
Oncology Physicians (Follow-back) | 12 | $91.23 | $1,095
Receptionists (telephone reminder) | 50 | $15.50 | $775
Total | 359 | | $26,058
A.13 Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers
There are no costs to respondents other than their time.
A.14 Annualized Cost to the Federal Government
The annualized cost to the Federal Government for the collection of data is $368,520. This includes the costs paid to the contractor to obtain and select the survey sample; collect the data and conduct nonresponse follow-up; pay incentives to respondents; create a database of the results; and perform data analysis, report writing, and study management. The cost also includes NCI staff time to design and manage the study, analyze the resultant data, and disseminate results. Itemized costs are shown in Table A.14. Federal staff include a Program Director/Behavioral Scientist and a Branch Chief who co-lead the project, as well as a Program Director serving as the Contracting Officer's Representative and a Fellow. Staff provided by RTI include a Project Director, Statistician, Survey Methodologist, Research Services/Data Collection Specialist, and Programmer Analyst. Additional costs include travel expenses for an in-person meeting between NCI and RTI, a subcontract for trained telephone screeners, and participant gifts.
Table A.14 Annualized Cost to the Federal Government

Staff | Grade/Step | Salary | % of Effort | Fringe (if applicable) | Total Cost to Gov't
Federal Oversight | | | | |
Program Director/Behavioral Scientist | 14/5 | $120,429 | 15% | | $18,064
Branch Chief | 15/7 | $149,993 | 15% | | $22,499
Program Director | 13/3 | $95,919 | 10% | | $9,592
Fellow | NA | $38,000 | 10% | | $3,800
Contractor Cost | | | | |
Project Director | NA | $128,939 | 16% | $8,076 | $29,326
Statistician | NA | $93,944 | 19% | $6,961 | $25,280
Survey Methodologist | NA | $69,975 | 33% | $8,775 | $31,867
Research Services/Data Collection Specialist | NA | $76,205 | 13% | $3,910 | $14,197
Programmer Analyst | NA | $106,300 | 13% | $5,252 | $19,071
Travel | | | | | $3,660
Materials and Supplies | | | | | $30,088
Subcontracts | | | | | $33,223
Participant gifts | | | | | $39,600
Indirect costs, G&A, Fee | | | | | $88,253
Total | | | | | $368,520
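The following minimal sketch tallies the line items in Table A.14 (values copied directly from the table) to confirm the stated total:

```python
# Total Cost to Gov't line items from Table A.14
line_items = {
    "Program Director/Behavioral Scientist": 18_064,
    "Branch Chief": 22_499,
    "Program Director": 9_592,
    "Fellow": 3_800,
    "Project Director": 29_326,
    "Statistician": 25_280,
    "Survey Methodologist": 31_867,
    "Research Services/Data Collection Specialist": 14_197,
    "Programmer Analyst": 19_071,
    "Travel": 3_660,
    "Materials and Supplies": 30_088,
    "Subcontracts": 33_223,
    "Participant gifts": 39_600,
    "Indirect costs, G&A, Fee": 88_253,
}
print(f"Total annualized cost: ${sum(line_items.values()):,}")  # $368,520, matching the table
```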
A.15 Explanation for Program Changes or Adjustments
This is a new collection of information.
A.16 Plans for Tabulation and Publication and Project Time Schedule
Conventional statistical techniques for survey data, including descriptive statistics such as proportions and percentages, will be used to describe the data. Additionally, analysis of variance and regression models will be used to compare academic and community physicians (an illustrative sketch follows).
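A minimal, hypothetical sketch of the kind of regression comparison planned. The data file name, the variable names (uses_panel, academic, age, sex), and the choice of a logistic model are illustrative assumptions rather than the final analysis plan, and survey design weights are omitted for brevity.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analytic file produced after data entry and cleaning
df = pd.read_csv("precision_medicine_survey.csv")  # assumed file name

# Example: use of multi-marker tumor panels by practice setting,
# adjusting for physician age and sex (variable names are illustrative)
model = smf.logit("uses_panel ~ academic + age + C(sex)", data=df).fit()
print(model.summary())
```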
The project timeline is summarized in Table A.16. NCI anticipates disseminating the results of the study after the final analyses of the data are completed, reviewed, and cleared. The exact timing and nature of any such dissemination have not been determined but may include presentations at trade and academic conferences, publications, articles, and Internet postings.
Table A.16 Project Timeline

Activity | Time Schedule
Conduct pilot test | Completed 4 months after OMB approval
Analyze results of pilot test | Completed 6 months after OMB approval
Screener calls made to verify eligibility of sample cases | Completed 7 months after OMB approval
Initial survey package sent via USPS first-class mail | Completed 8 months after OMB approval
Reminder letter and replacement survey sent via FedEx | Completed 9 months after OMB approval
First invitation with link to web survey sent via email | Completed 9.5 months after OMB approval
Phone contacts begin to encourage response | Completed 9.5 months after OMB approval
Second/final email reminders sent with link to web survey | Completed 10 months after OMB approval
Final surveys accepted | Completed 11 months after OMB approval
Data entry and data check completed | Completed 11.5 months after OMB approval
Final survey non-response follow-up package sent | Completed 12 months after OMB approval
Initial analyses completed by RTI | Completed 14 months after OMB approval
Initial papers submitted for publication | Completed 18 months after OMB approval
Additional papers submitted for publication | Completed 22 months after OMB approval
A.17 Reason(s) Display of OMB Expiration Date is Inappropriate
No exemption is requested. The expiration date for OMB approval will be displayed on the survey forms.
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions
None.