Evaluation of an Online Prostate Cancer Screening Decision Aid - New
[OMB No. 0920-xxxx] [OMB expiration date]
Program Official/Contact
David Siegel, MD, MPH
Medical Officer
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
P: [Contact phone]
F: [Contact fax]
6/17/2024
TABLE OF CONTENTS
A1. Circumstances Making the Collection of Information Necessary
A2. Purpose and Use of the Information Collection
A3. Use of Improved Information Technology and Burden Reduction
A4. Efforts to Identify Duplication and Use of Similar Information
A5. Impact on Small Businesses or Other Small Entities
A6. Consequences of Collecting the Information Less Frequently
A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A8. Comments in Response to the FRN and Efforts to Consult Outside the Agency
A9. Explanation of Any Payment or Gift to Respondents
A10. Protection of the Privacy and Confidentiality of Information Provided by Respondents
A11. Institutional Review Board (IRB) and Justification for Sensitive Questions
A12. Estimates of Annualized Burden Hours and Costs
A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
A14. Annualized Cost to the Federal Government
A15. Explanation for Program Changes or Adjustments
A16. Plans for Tabulation and Publication and Project Time Schedule
A17. Reason(s) Display of OMB Expiration Date is Inappropriate
A18. Exceptions to Certification for Paperwork Reduction Act Submission
1. Authorizing Legislation
2. Study Design Graphic
3. Evaluation Questions and Indicators
4a. Provider Survey: Introductory Email to All Primary Care Providers, English
4b. Provider Survey: Consent Statement, English
4c. Provider Survey: Paper, English
4d. Provider Survey: Online, English
4e. Provider Survey: Reminder Email, English
4f. Provider Survey: Thank You Email, English
5a. Patient Recruitment: Introductory Email to Providers of Patient Sample, English
5b. Patient Recruitment: Introductory Email to Patients, English
5c. Patient Recruitment: Introductory Email to Patients, Spanish
5d. Patient Recruitment: Email with Link to Eligibility Screener, English
5e. Patient Recruitment: Email with Link to Eligibility Screener, Spanish
5f. Patient Recruitment: Reminder Email for Eligibility Screener, English
5g. Patient Recruitment: Reminder Email for Eligibility Screener, Spanish
6a. Pre-Exposure Survey: Eligibility Screener, English
6b. Pre-Exposure Survey: Eligibility Screener, Spanish
6c. Pre-Exposure Survey: Consent Statement, English
6d. Pre-Exposure Survey: Consent Statement, Spanish
6e. Pre-Exposure Survey: Paper, English
6f. Pre-Exposure Survey: Online, English
6g. Pre-Exposure Survey: Online, Spanish
6h. Pre-Exposure Survey: Reminder Email, English
6i. Pre-Exposure Survey: Reminder Email, Spanish
7a. Post-Exposure Survey: Assigned Materials and Survey Link, English
7b. Post-Exposure Survey: Assigned Materials and Survey Link, Spanish
7c. Post-Exposure Survey: Consent Statement, English
7d. Post-Exposure Survey: Consent Statement, Spanish
7e. Post-Exposure Survey: Paper, English
7f. Post-Exposure Survey: Online, English
7g. Post-Exposure Survey: Online, Spanish
7h. Post-Exposure Survey: Reminder Email, English
7i. Post-Exposure Survey: Reminder Email, Spanish
8a. Usability Survey: Pre-Notification Email, English
8b. Usability Survey: Pre-Notification Email, Spanish
8c. Usability Survey: Email with Survey Link, English
8d. Usability Survey: Email with Survey Link, Spanish
8e. Usability Survey: Consent Statement, English
8f. Usability Survey: Consent Statement, Spanish
8g. Usability Survey: Paper, English
8h. Usability Survey: Online, English
8i. Usability Survey: Online, Spanish
8j. Usability Survey: Reminder Email, English
8k. Usability Survey: Reminder Email, Spanish
9a. User Experience Interviews: Recruitment Email, English
9b. User Experience Interviews: Recruitment Email, Spanish
9c. User Experience Interviews: Consent Statement, English
9d. User Experience Interviews: Consent Statement, Spanish
9e. User Experience Interviews: Interview Guide, English
9f. User Experience Interviews: Interview Guide, Spanish
10a. Post-Clinic Visit Survey: Pre-Notification Email, English
10b. Post-Clinic Visit Survey: Pre-Notification Email, Spanish
10c. Post-Clinic Visit Survey: Email with Survey Link, English
10d. Post-Clinic Visit Survey: Email with Survey Link, Spanish
10e. Post-Clinic Visit Survey: Consent Statement, English
10f. Post-Clinic Visit Survey: Consent Statement, Spanish
10g. Post-Clinic Visit Survey: Paper, English
10h. Post-Clinic Visit Survey: Online, English
10i. Post-Clinic Visit Survey: Online, Spanish
10j. Post-Clinic Visit Survey: Reminder Email, English
10k. Post-Clinic Visit Survey: Reminder Email, Spanish
10l. Post-Clinic Visit Survey: Thank You Email, English
10m. Post-Clinic Visit Survey: Thank You Email, Spanish
11a. Clinic Coordinator Interviews: Consent Statement, English
11b. Clinic Coordinator Interviews: Interview Guide, English
12a. Published 60-Day Federal Register Notice (FRN)
12b. 60-Day FRN Public Comments
12c. CDC Response to Public Comments
13. Approved Privacy Narrative
14. Institutional Review Board Approval
JUSTIFICATION SUMMARY
Goal of the project: To conduct a randomized controlled trial (RCT) to determine if CDC’s online, simulated human decision aid module, Talk to Nathan About Prostate Cancer Screening (treatment arm), is effective in improving knowledge, overcoming health literacy barriers, and resolving decisional conflict compared to a standard decision aid (control arm 1) and standard education materials (control arm 2); and to identify barriers and best practices for incorporating Talk to Nathan About Prostate Cancer Screening into the flow of primary care practice.

Intended use of the resulting data: To measure evaluation outcomes; to understand how to help men make decisions about the harms and benefits of prostate cancer screening that are in line with their individual values and preferences; and to make recommendations for improving the Talk to Nathan About Prostate Cancer Screening decision aid and incorporating it into primary care practice.

Methods to be used to collect data: The RCT is a three-group parallel design with one treatment arm and two control arms. Data will be collected from all arms using a pre-exposure survey, a post-exposure survey, and a post-clinic visit survey. The treatment arm will also complete a usability survey, and a subset of the treatment arm will be invited to participate in user experience interviews. Health care providers at the four participating clinics will complete a short survey prior to execution of the three-arm study, and interviews will be conducted at the close of the study with study coordinators from the four participating clinics.

The subpopulation to be studied: For the pre- and post-exposure surveys, the usability survey, and the user experience interviews, the subpopulation is men aged 55-69 years. For the provider survey, it is primary care providers who practice within the four clinics participating in the study. For the clinic coordinator interviews, it is the study coordinators from the four participating clinics.

How data will be analyzed: For quantitative survey data: intention-to-treat analysis; repeated measures analysis of variance across assessment time points; ordinary least squares regression; and a complier-average causal effect (CACE) approach to calculate the treatment effect, estimated with maximum likelihood and Bayesian inferential methods. For qualitative data from the surveys and interviews, we will identify and analyze themes, patterns, and interrelationships relevant to the evaluation questions for this study.
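The CACE approach named above adjusts the intention-to-treat (ITT) estimate for incomplete uptake of the assigned materials. As a point of reference, the simplest such estimator (a sketch only; the study's maximum likelihood and Bayesian methods are model-based refinements of the same idea) divides the ITT effect by the compliance rate, under standard instrumental-variable assumptions. All numbers in the example are hypothetical.

```python
# Illustrative complier-average causal effect (CACE) calculation under
# standard assumptions (randomization as instrument, no defiers,
# exclusion restriction). All numbers below are hypothetical.

def cace(itt_effect: float, compliance_rate: float) -> float:
    """CACE = intention-to-treat effect / proportion of compliers."""
    if not 0 < compliance_rate <= 1:
        raise ValueError("compliance rate must be in (0, 1]")
    return itt_effect / compliance_rate

# Hypothetical example: the mean decisional-conflict score drops 4 points
# more in the treatment arm (ITT), but only 80% of assigned men actually
# completed the module (compliers).
itt = -4.0          # hypothetical ITT effect on decisional conflict
compliance = 0.80   # hypothetical proportion of the arm that used the aid
print(cace(itt, compliance))  # effect among compliers: -5.0
```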
JUSTIFICATION
The Centers for Disease Control and Prevention (CDC), Division of Cancer Prevention and Control (DCPC) is requesting a new, three-year OMB approval to conduct a three-arm, randomized controlled trial (RCT) to evaluate the impact of a virtual human decision aid to help improve the quality of prostate cancer screening and treatment decisions. The information collection for which approval is sought is in accordance with the CDC’s mission to conduct, support, and promote efforts to prevent cancer and to increase early detection of cancer, authorized by Section 301 of the Public Health Service Act [42 USC 241] (Attachment 1).
Talk to Nathan About Prostate Cancer Screening (hereafter referred to as Nathan) is DCPC’s online, interactive, human simulation decision aid designed to help men learn and make informed decisions about prostate cancer screening. A small, preliminary evaluation of Nathan conducted by Kognito (the health care simulation company that developed the Nathan module) showed promise in increasing men’s knowledge about prostate cancer and likelihood of engaging in shared decision-making about prostate cancer screening with their health care providers.[i] At this time, a larger, more systematic evaluation is needed to understand whether Nathan is effective in areas such as improving knowledge, overcoming health literacy barriers, and resolving decisional conflict, especially among priority populations who are most likely to be affected by prostate cancer and least likely to be screened. Further, as some experts consider the digital divide to be the newest social determinant of health,[ii] it is important to explore how, where, and for which populations there may be disparities in accessing and using Nathan. CDC will implement this RCT with the help of its contractor, ICF.
Broadly, the purpose of this information collection is to 1) assess whether Nathan is more effective at helping men make decisions about prostate cancer screening than an established decision aid or standard educational materials; 2) determine if changes or improvements to Nathan are warranted; and 3) identify ways to incorporate Nathan into primary care.
We will select four primary care clinics to participate in this study based on the following criteria:
Population Served
Data Sharing
Perspective on Shared Decision-Making
Resources
The design of the RCT is depicted in the graphic in Attachment 2. It is a three-group parallel design with one treatment arm and two control arms to test the effectiveness of Nathan for men aged 55-69. We will recruit 900 men aged 55-69 who have an upcoming general health exam at one of the four primary care clinics and randomize them, using a permuted block design, to one of three arms: (1) Nathan (intervention=300 men), (2) the Massachusetts Department of Public Health’s (MDPH’s) Patient Decision Aid, Get the Latest Facts about Screening for Prostate Cancer (control 1=300 men), and (3) standard educational materials from the National Cancer Institute (NCI), Prostate Cancer Screening (PDQ®)–Patient Version (control 2=300 men).
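Permuted block randomization guarantees the 300/300/300 allocation by shuffling assignments within small fixed-size blocks rather than across the full sample. A minimal sketch, assuming an illustrative block size of six and no stratification (the study's actual block sizes and any stratification variables are not specified here):

```python
# Minimal sketch of permuted block randomization for a three-arm trial.
# Block size of 6 and the fixed seed are illustrative assumptions only.
import random

def permuted_block_assignments(n_participants: int, block_size: int = 6,
                               arms=("Nathan", "MDPH decision aid", "NCI PDQ"),
                               seed: int = 42):
    """Assign participants to arms in shuffled blocks so arm sizes stay balanced."""
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)        # permute assignments within each block
        assignments.extend(block)
    return assignments[:n_participants]

arms = permuted_block_assignments(900)
# 900 is a multiple of the block size, so allocation is exactly 300 per arm
print({a: arms.count(a) for a in sorted(set(arms))})
```

Because every completed block contains each arm exactly twice, arm sizes can never drift more than a few participants apart at any point during enrollment.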
Our information collections are informed by the evaluation questions and indicators included in Attachment 3. Eight forms of information collection will be implemented to answer our evaluation questions. These include a provider survey; a patient eligibility screener; patient pre-exposure, post-exposure, and post-clinic visit surveys; a patient usability survey; patient user experience interviews; and clinic coordinator interviews. Each instrument will be administered once per respondent throughout the course of the study.
Provider Survey
Each of the four clinics will provide a sample of all primary care providers within their clinic. Prior to executing the three-arm study, we will administer a web-based survey to these providers. This survey will assess providers’ prostate cancer screening practices and attitudes towards prostate cancer screening and will also collect demographic data from the providers. The provider survey will be administered in English. Prior to beginning the survey, providers will receive an informed consent statement. See Attachment 4b.
Patient Eligibility Screener
Each of the four clinics will provide a sample of men who meet the criteria for the study based on clinics’ review of electronic health records (EHRs). Primary care providers of men in the sample will receive an email informing them of the study and its purpose and letting them know that some of their patients may be enrolled in the study. Men in the sample will receive an introductory email informing them of the study and its purpose and letting them know that someone from ICF will be contacting them. Once contacted, men will be screened to confirm their eligibility in the study. The screener will be administered in English or Spanish. See Attachments 6a and 6b.
Pre-Exposure Survey
Following completion of the eligibility screener, eligible men will be administered a pre-exposure survey that will provide baseline measures of the primary outcome (decisional conflict) and of secondary outcomes, including prostate cancer knowledge and autonomous decision-making. The pre-exposure survey will also collect participants’ demographics, digital literacy, health literacy, previous exposure to informational materials about prostate cancer screening, and prostate cancer experience. It will be administered in English or Spanish. Prior to beginning the pre-exposure survey, men will receive an informed consent statement. See Attachments 6c and 6d.
Post-Exposure Survey
Following completion of the pre-exposure survey, men will be randomized to one of the three arms of the study and sent an email with their assigned materials (Nathan, MDPH decision aid, or NCI PDQ) for review and a link to the post-exposure survey. Men in all three arms will complete the post-exposure survey immediately after exposure to their assigned materials. This survey will measure exposure to the assigned material, decisional conflict, autonomous decision making, decision self-efficacy, preparation for decision-making, prostate cancer knowledge, help needed to review assigned materials, and contamination from other informational materials about prostate cancer screening. It will be administered in English or Spanish. Prior to beginning the post-exposure survey, men will be provided with an informed consent statement. See Attachments 7c and 7d.
Usability Survey
The 300 men in the Nathan arm will be administered a usability survey within 2 weeks of completing the post-exposure survey. The usability survey will focus on understanding the acceptability, perceived fit, and usability of the decision aid, as well as technology acceptance. It will also assess Nathan dosage (i.e., pathways completed and time spent on Nathan by the patient; also confirmed by use data gathered through the Nathan platform), help needed to review Nathan, and COVID-19 impact and telemedicine, and will gather recommendations for improving Nathan’s content and functionality. The usability survey will be administered in English or Spanish. Prior to beginning the usability survey, men will receive an informed consent statement. See Attachments 8e and 8f.
User Experience Interviews
A subset (n=30) of men who complete the usability survey will be administered an in-depth interview to gather a deeper understanding of Nathan’s acceptability, perceived fit, and usability; participants’ digital literacy; barriers and facilitators to use; and recommendations for improving Nathan’s content and functionality. These interviews will be conducted in English or Spanish. Prior to beginning the user experience interview, men will be read an informed consent statement. See Attachments 9c and 9d.
Post-Clinic Visit Survey
Men in all three arms will be administered a post-clinic visit survey following the scheduled general health exam with their provider. This survey will measure decisional conflict, autonomous decision-making, prostate cancer knowledge, screening behavioral intent, screening behavior (also confirmed by electronic health record [EHR] review), shared decision-making, time spent with provider discussing the PSA test, and informational materials used in making a screening decision. It will be administered in English or Spanish. Prior to beginning the post-clinic visit survey, men will receive an informed consent statement. See Attachments 10e and 10f.
Clinic Coordinator Interviews
We will conduct interviews with study coordinators from the four participating clinics to get their perspective on barriers, facilitators, and best practices to incorporating Nathan into the clinic workflow. These interviews will be conducted in English. Prior to beginning the interview, clinic coordinators will be read an informed consent statement. See Attachment 11a.
An RCT design was selected to ensure high internal validity (i.e., the extent to which the observed results represent the truth in the study population). Generalizability to the target population also informed the design: strict adherence to our carefully crafted inclusion and exclusion criteria for study participants maximizes the extent to which findings will apply to that population.
The provider survey will be administered via an online instrument to minimize burden to respondents (Attachment 4d; a Microsoft Word version of this survey is provided in Attachment 4c for ease of review). It will be programmed in SurveyMonkey, which provides secure storage of data until the team is ready to export to appropriate data analysis software (e.g., SAS, SPSS). SurveyMonkey also conducts basic descriptive analyses and creates visualizations, which can be incorporated into the monthly data report that ICF is required to submit to CDC during each month of the study period.
The pre-exposure, post-exposure, usability, and post-clinic visit surveys also will be administered via online instruments to minimize burden to respondents (Attachments 6f, 6g, 7f, 7g, 8h, 8i, 10h, and 10i; Microsoft Word versions of these surveys are provided in Attachments 6e, 7e, 8g, and 10g for ease of review). These surveys will be programmed in Voxco, an easy-to-use data collection, management, and analysis system hosted and maintained by ICF. Men who do not respond to the web-based pre-exposure, post-exposure, usability, and post-clinic visit surveys will be contacted to complete these surveys via computer-assisted telephone interviewing (CATI). We anticipate that 50% of these surveys will be completed via web administration and 50% will be completed via CATI administration. Men will first be provided a unique web link to take the survey online. The unique web link for each of the four surveys allows for the survey responses to be linked together on the backend within the Voxco platform. Non-respondents will receive a follow-up call to gauge interest, and if interested, we will administer the survey using CATI. All skip patterns within the surveys (i.e., questions that are only appropriate for a portion of respondents) will be automatically programmed into the web and CATI surveys, further minimizing the burden on respondents in terms of their time.
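The programmed skip patterns described above route respondents past questions that do not apply to them, so no one sees (or spends time on) an irrelevant item. A minimal sketch of the routing logic; the gating question and wording below are hypothetical, not taken from the study instruments:

```python
# Minimal illustration of a programmed survey skip pattern: a follow-up
# question is shown only when the gating answer makes it applicable.
# The question names and text here are hypothetical examples.

def next_question(answers: dict) -> str:
    """Return the next item to display given the answers so far."""
    if answers.get("ever_discussed_psa") == "yes":
        # Follow-up applies only to men who have discussed the PSA test.
        return "When did you last discuss the PSA test with a provider?"
    return "next_section"  # skip the follow-up entirely

print(next_question({"ever_discussed_psa": "yes"}))  # shows the follow-up
print(next_question({"ever_discussed_psa": "no"}))   # routes past it
```

Both the web (Voxco) and CATI versions would encode the same branching rules, which is what keeps the two administration modes equivalent in content and burden.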
The user experience and clinic coordinator interviews will be conducted using virtual technology (e.g., Zoom; see Attachments 9e, 9f, and 11b). This will minimize response burden by allowing for more flexibility in scheduling the interviews as the participating men and clinic coordinators can join from their place of employment or from home and removes the need to travel to an office or alternate location to participate in these interviews.
ICF is responsible for data management and analysis of all information collections, which includes generating monthly progress reports, a final evaluation report, and a manuscript for CDC. Results will also be made available on DCPC’s website.
The information to be collected from providers, men aged 55-69 years, and study coordinators across the four participating clinics is specific to the CDC-developed Nathan decision aid and, therefore, not duplicative of any other efforts. This information collection will allow DCPC to better understand the population-level impact, effectiveness, and use of Nathan versus an established decision aid or standard education materials. These data are not available elsewhere.
It is possible that one or more of the four primary care clinics with whom we partner may be considered a small business. As part of the partnership, we will work mainly with their clinic coordinators to identify potentially eligible men for the study, confirm PSA screening behavior, complete a short interview with the study team to discuss best practices and recommendations for incorporating Nathan into the clinic workflow, and engage clinic providers to complete a short survey assessing their prostate cancer screening practices and attitudes towards prostate cancer screening. Participation in this study will not require these clinics to adopt new EHR practices or reminders; however, we will expect the clinics to work with our team to pull reports of already-existing data from their EHR. We will provide training and clear guidance to the clinic coordinators on their role and will be available to answer questions and provide technical assistance throughout the study period.
This is a one-time study. Each of the study audiences (i.e., providers, men ages 55-69, and clinic study coordinators) will complete their respective information collections one time. Reducing the respondent burden below the estimated levels (that is, reducing the type or number of study participants or the number of instruments) would diminish the robustness of the information collection and the utility of the study.
OMB approval is requested for three years. There are no legal obstacles to reduce the burden.
This request fully complies with the regulation 5 CFR 1320.5. There are no special circumstances with this information collection request. Participation in the study is voluntary.
Part A: PUBLIC NOTICE
A 60-day Federal Register Notice was published in the Federal Register on January 26, 2024, Vol. 89, No. 18, pp. 5237-5239 (see Attachment 12a). CDC received two substantive comments (see Attachment 12b) and provided responses accordingly (see Attachment 12c).
Part B: CONSULTATION
CDC received expert external consultation from ICF as well as from a well-known decision scientist. Both aided in the development of the study design and data collection instruments. CDC also engaged multiple staff in internal consultation. A list of those individuals is included in Tables A8B1 and A8B2 below.
Table A8B1. External Consultations
Name | Title | Affiliation | Phone | Role
OUTSIDE CONTRACTORS
Danielle Nielsen, MPH | Director | ICF | (678) 488-3365 | Project Manager
Bhuvana Sukumar, PhD | Vice President | ICF | (404) 592-2122 | Senior Technical Advisor
Robert Stephens, PhD | Senior Research Methodologist | ICF | (404) 320-4494 | Evaluation Team Lead
Helen Coelho, MPH | Senior Data Analyst | ICF | (404) 592-2127 | Recruitment Lead
Elizabeth Douglas, MPH | Senior Manager | ICF | (404) 592-2175 | Cognitive Testing Lead
Sara Perrins, PhD | Behavioral Health Research Scientist | ICF | (404) 321-3211 | Data Analyst
Bryce McGowan, MPH | Research Scientist | ICF | (404) 464-3617 | Project Support
OUTSIDE CONSULTANT
Robert Volk, PhD | Professor | University of Texas MD Anderson Cancer Center | (713) 563-2239 | Scientific and Technical Consultation
Table A8B2. Consultations within CDC
Name | Title | Affiliation | Phone | Role
David Siegel, MD, MPH | Medical Officer | DCPC | 770-488-4426 | Technical Monitor
Thomas Richards, MD | Medical Officer | DCPC | 404-634-9915 | Contracting Officer’s Representative
Cheryll Thomas | Associate Director for Science | DCPC | 770-488-3254 | Scientific Advisor
Nita Patel | Health Scientist | DCPC | 404-639-8706 | Scientific Advisor
Providers will not receive an incentive for completing the provider survey. At the close of the study, providers will receive a thank you email for their participation with a link to CDC’s Explore Talking to Patients about Prostate Cancer, a version of Nathan for providers (Attachment 4f).
Men ages 55-69 will receive a noncash incentive (i.e., a $25 Amazon gift code) after completing each of the data collection activities in which they participate (i.e., the pre-exposure, post-exposure, usability, and post-clinic visit surveys, as well as the user experience interview). These incentives are offered to increase the likelihood of participation and to thank a respondent for their time and input to the study.[iii] At the close of the study, men will receive a thank you email with the materials assigned to each of the three arms of the study (Attachments 10l and 10m).
Clinic study coordinators will not receive an incentive for completing the clinic coordinator interview.
These information collections will be conducted by ICF on CDC’s behalf and will conform to the ethical practices for administering surveys and conducting interviews. We will implement procedures to protect the privacy of all respondents as appropriate. Several methods will be used to gather data, including web and CATI survey administration and telephone interviews. Respondent contact information used to solicit participation will be kept separate from participant responses. All respondents will be informed that the responses they provide will be treated in a secure manner and will be used only for the purpose of this study, unless otherwise compelled by law. Only aggregate numbers, summary statistics, or de-identified quotes will be included in the final report and manuscript. The informed consent statement for each information collection describes how personally identifiable information will be secured, used, and reported. Additional procedures designed to protect participant privacy for the information collections are described below.
Privacy Act Determination: NCCDPHP’s Information Systems Security Officer has reviewed this submission and has determined that the Privacy Act does not apply. The completed privacy narrative is attached (Attachment 13).
Response data will not be stored or retrieved by name. Identifiers used for recruitment and scheduling will not be linked to response data at any time. Unique identifiers will be assigned to each case in the data files as data are collected, and participants will be removed from contact lists when their participation is complete. Survey and interview data will be stored by ICF on secure servers. All respondents will be told during the consent process that the data they provide will be treated in a secure manner to the extent allowed by law. (Informed consent statements for each information collection are included in Attachments 6c, 6d, 7c, 7d, 8e, 8f, 9c, 9d, 10e, 10f, and 11a.) They will also be informed that participation is voluntary, that they may refuse to answer any question, and that they can stop at any time without risk. In addition, names of participants in any information collection will not be provided to the Federal government. Instead, a unique ID will be assigned to each participant.
Provider Survey
Each of the four clinics will provide a sample of all primary care providers within their clinic. A randomly generated, numeric participant ID will be assigned to each provider in the sample. Provider and clinic name in the sample will allow us to analyze and describe provider survey responses by clinic. Personally identifiable information (e.g., name, clinic) will not be shared outside of the sample database. We will develop and maintain the provider survey and its data in SurveyMonkey until the survey closes. Data will be exported to Excel and saved in a secure folder in Microsoft Teams. The provider survey is in Attachment 4d; a Microsoft Word version of this survey is provided in Attachment 4c for ease of review.
Pre-Exposure, Post-Exposure, Usability, and Post-Clinic Surveys
Clinic coordinators will review patient EHRs to identify potentially eligible men who meet the inclusion criteria for the study. To ensure privacy, a randomly generated, numeric ID will be assigned to each patient prior to completing the eligibility screener and will serve as the identifier to match information from each survey with the respondent’s EHR record. (Surveys are in Attachments 6f, 6g, 7f, 7g, 8h, 8i, 10h, and 10i; Microsoft Word versions of these surveys are provided in Attachments 6e, 7e, 8g, and 10g for ease of review).
Personally identifiable information will not be collected in the surveys. After completion of all study data collection activities, only the respondent ID and other non-Protected Health Information categorical variables necessary for analysis will be retained. All data linking the respondent ID to a patient’s personally identifiable information will be destroyed within three months of the end of the study period. No personally identifiable information (names, addresses, or telephone numbers) will be in the database delivered to CDC, and CDC will not have access to personally identifiable information.
User Experience Interviews
To ensure privacy, personally identifiable information will not be collected during the user experience interviews (Attachment 9e and 9f). The unique ID assigned for the surveys will be used for the interviews and interview responses will not be stored with any identifying information. Personally identifiable information such as email addresses will be stored within the Voxco database and will not be shared.
Clinic Coordinator Interviews
To ensure privacy, personally identifiable information will not be collected during the clinic coordinator interviews (Attachment 11b). Each respondent will be assigned an ID and their interview responses will not be stored with their identifying information. Personally identifiable information such as email addresses will be stored in an Excel file saved in a secure folder in Microsoft Teams and will not be shared.
The study protocol has been reviewed and approved by ICF’s IRB. A copy of the approval letter is provided in Attachment 14.
Provider Survey
The provider survey captures provider prostate cancer screening practices and attitudes towards prostate cancer screening which may be considered sensitive. Providers will receive information about the risks and benefits of their participation via an informed consent statement (Attachment 4b). The informed consent statement describes the purpose of the study, how the information will be used, and the steps that will be taken to protect participant confidentiality. Participants will also be informed that the survey is voluntary. The consent statement will include names and email addresses of the ICF project manager and IRB representative should participants have any questions about the survey or their rights as a participant in the study.
Pre-Exposure, Post-Exposure, and Post-Clinic Visit Surveys
The pre-exposure, post-exposure, and post-clinic visit surveys ask respondents about decisional conflict, knowledge, attitudes, beliefs, intentions, and behaviors related to prostate cancer screening, as well as family history of prostate cancer and personal experience with prostate cancer screening. This information is important to understanding the personal, social, and other contextual factors that may influence whether a person speaks with their provider about, and ultimately obtains, prostate cancer screening. In addition, race/ethnicity, income, and education data are collected per OMB standards, and sexual orientation and gender identity are collected according to Behavioral Risk Factor Surveillance System standards. The data collected are those needed to assess the study’s outcomes in order to determine the effectiveness and accessibility of Nathan. Men who participate in these surveys will receive information about the risks and benefits of their participation via an informed consent statement (Attachments 6c, 6d, 7c, 7d, 10e, and 10f). The informed consent statement describes the purpose of the study, how the information will be used, the length of the survey, and the steps that will be taken to protect participant confidentiality. Participants will also be informed that the survey is voluntary. The consent statement will include names and email addresses of the ICF project manager and IRB representative should participants have any questions about the survey or their rights as a participant in the study. For men who participate in CATI administration of these surveys, the consent statement will be read to them and consent obtained prior to beginning CATI administration.
Usability Survey
The usability survey does not contain sensitive questions. Men who participate in this survey will receive an informed consent statement that is included on the survey (Attachments 8e and 8f). For men who participate in CATI administration of these surveys, the consent statement will be read to them and consent obtained prior to beginning CATI administration.
User Experience Interviews
The user experience interview guide does not contain sensitive questions. Men who participate in this interview will receive an informed consent statement (Attachments 9c and 9d) prior to the interview. The ICF interviewer will also read the consent statement and obtain consent prior to beginning the interview.
Clinic Coordinator Interviews
The clinic coordinator interview guide does not contain sensitive questions. Clinic coordinators who participate in this interview will receive an informed consent statement (Attachment 11a) prior to the interview. The ICF interviewer will also read the consent statement and obtain consent prior to beginning the interview.
The security of all information collection responses will be preserved by following the procedures outlined in section A-10.
The estimated annualized burden hours are presented in Table A12A. The proposed study consists of eight one-time data collection activities conducted over a six-month period. Average burden per response is based on pilot testing of each information collection conducted with fewer than 10 respondents. The total response burden for the six-month period is estimated to be 1,129 hours.
Table A12A: Estimated Annualized Burden (Hours)
Type of Respondents | Form Name | No. of Respondents | No. of Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours
Primary care providers | Provider survey | 40 | 1 | 10/60 | 7
Men ages 55-69 | Patient eligibility screener | 900 | 1 | 8/60 | 120
Men ages 55-69 | Pre-exposure survey | 900 | 1 | 20/60 | 300
Men ages 55-69 | Post-exposure survey | 900 | 1 | 20/60 | 300
Men ages 55-69 | Usability survey | 300 | 1 | 18/60 | 90
Men ages 55-69 | User experience interview | 30 | 1 | 20/60 | 10
Men ages 55-69 | Post-clinic survey | 900 | 1 | 20/60 | 300
Clinic coordinators | Clinic coordinator interview | 4 | 1 | 30/60 | 2
Total | | | | | 1,129
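The row-by-row arithmetic behind Table A12A can be verified with a short sketch; all values below are taken directly from the table, and the only assumption is that fractional hours are rounded to the nearest whole hour (e.g., 40 × 10/60 = 6.67, reported as 7).

```python
# Verification sketch for Table A12A: burden hours per row and in total.
rows = [
    # (form name, respondents, responses each, minutes per response)
    ("Provider survey",              40,  1, 10),
    ("Patient eligibility screener", 900, 1, 8),
    ("Pre-exposure survey",          900, 1, 20),
    ("Post-exposure survey",         900, 1, 20),
    ("Usability survey",             300, 1, 18),
    ("User experience interview",    30,  1, 20),
    ("Post-clinic survey",           900, 1, 20),
    ("Clinic coordinator interview", 4,   1, 30),
]

def burden_hours(respondents, responses_each, minutes):
    # Round to the nearest whole hour, matching the table's reported values.
    return round(respondents * responses_each * minutes / 60)

total = sum(burden_hours(n, per, m) for _, n, per, m in rows)
print(total)  # 1129
```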
Estimates of the average hourly wage for respondents were calculated based on hourly wage rates for the appropriate occupational categories from the Bureau of Labor Statistics May 2021 National Occupational Employment and Wage Estimatesiv and from the U.S. Department of Labor federal minimum wage standardsv. The annualized cost is estimated to be $9,254.20, as summarized below in Table A12B. There will be no direct costs to respondents other than their time to participate in their respective data collection activities.
Table A12B: Estimated Annualized Burden Costs
Type of Respondents | Form Name | Total Annual Burden Hours | Average Hourly Wage Rate | Total Respondent Labor Cost
Primary care providers | Provider survey | 7 | $116.44 | $815.08
Men ages 55-69 | Patient eligibility screener | 120 | $7.50 | $900.00
Men ages 55-69 | Pre-exposure survey | 300 | $7.50 | $2,250.00
Men ages 55-69 | Post-exposure survey | 300 | $7.50 | $2,250.00
Men ages 55-69 | Usability survey | 90 | $7.50 | $675.00
Men ages 55-69 | User experience interview | 10 | $7.50 | $75.00
Men ages 55-69 | Post-clinic survey | 300 | $7.50 | $2,250.00
Clinic coordinators | Clinic coordinator interview | 2 | $19.56 | $39.12
Total | | | | $9,254.20
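The cost arithmetic behind Table A12B follows the same pattern (burden hours × hourly wage rate, summed across rows); the values below are taken directly from the table.

```python
# Verification sketch for Table A12B: labor cost per row and in total.
rows = [
    # (form name, burden hours, hourly wage rate in dollars)
    ("Provider survey",              7,   116.44),
    ("Patient eligibility screener", 120, 7.50),
    ("Pre-exposure survey",          300, 7.50),
    ("Post-exposure survey",         300, 7.50),
    ("Usability survey",             90,  7.50),
    ("User experience interview",    10,  7.50),
    ("Post-clinic survey",           300, 7.50),
    ("Clinic coordinator interview", 2,   19.56),
]
total = sum(hours * rate for _, hours, rate in rows)
print(f"${total:,.2f}")  # $9,254.20
```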
Provider and patient respondents will incur no capital or maintenance costs to complete this data collection. Clinic coordinators will experience some additional burden in generating the patient samples for the study.
Other costs related to this effort are costs to the Federal government as part of ICF’s contract to collect all information required for this evaluation.
Total operations and maintenance costs include work performed by both CDC and contractor (ICF) personnel. Salary costs for CDC staff include 2 FTEs (GS-14 and GS-13) who prepare and review OMB documents and oversee all information collection activities, including data collection, management, analysis, and report preparation. An estimated 200 hours of staff time per FTE annually was allocated for this information collection. ICF's cost represents an estimated $507,518 annually, allocated to data collection, data management, data analysis, and reporting activities as part of this evaluation. Table A14A describes how the cost estimate was calculated.
Table A14A. Estimated Annualized Federal Government Cost Distribution
Staff (FTE) | Average Hours per Collection | Average Hourly Rate | Average Cost
Medical Officer (GS-14): preparation and review of OMB package; overall coordination; consultation on information collection, analysis, and report preparation | 100 | $90 | $9,000
Medical Officer (GS-13): preparation and review of OMB package; overall coordination; consultation on information collection, analysis, and report preparation | 100 | $70 | $7,000
Contractor Costs | | |
Annualized cost of contract with ICF: responsible for OMB package preparation; site and participant recruitment; development of web-based data collection and CATI tools; data collection, management, coding, and entry; quality control; analysis; and report preparation | | | $507,518
Estimated Total Cost of Information Collection | | | $523,518
The majority of data collection and management will be the responsibility of the CDC contractor and will not require additional operational or maintenance costs to the Federal government. Each of the four clinics will receive a stipend of $15,000 for their contributions to this study; the total cost of these stipends is included in the Annualized Cost of Contract with ICF, noted above. CDC personnel will oversee the project and provide leadership and coordination, which will not require additional costs beyond individual employees' salaries. Therefore, there are no additional operational or maintenance costs associated with this information collection. Table A14B provides the total cost to the Federal government.
Table A14B. Total Cost to the Federal Government
Operational and Maintenance Costs | Estimated Annualized Federal Government Costs | Total Cost
$0.00 | $523,518 | $523,518
This is a new information collection request.
Tabulation
Our analysis plan for assessing the efficacy of Nathan includes a tabular analysis to examine baseline variables across the three study arms to determine whether random assignment resulted in participant characteristics being equally distributed across study arms. Specifically, we will examine differences in demographic variables (e.g., age, race/ethnicity) and our primary outcome variable (decisional conflict) across the intervention and two control groups using regression analyses. Any observed differences across experimental groups will be statistically controlled for in the subsequent outcome analysis.
We will compare the efficacy of Nathan to the MDPH decision aid and the NCI PDQ® using a repeated measures analysis of variance across the three assessment time points. Ordinary least squares regression will be used to understand the relationships between the dependent variables (outcomes) and independent variables (covariates). To examine Nathan's effect on change in decisional conflict after adjusting for confounding variables, ordinary least squares regression analysis will estimate treatment effects following the intent-to-treat (ITT) framework. Generalized estimating equations will be used to obtain standard errors that are robust to arbitrary patterns of correlation between repeated observations. While ITT analysis provides an unbiased estimate of the effect of treatment assignment on the outcome, it can introduce other forms of bias, such as biasing the estimate of the effect of actually receiving treatment relative to the control condition. If attrition is substantial (10% or more), we will use a complier-average causal effect (CACE) approach to estimate the treatment effect among participants who would comply with their treatment assignment. Based on previous researchvi, we propose to use maximum likelihood and Bayesian inferential methods for CACE, which make explicit assumptions for causal inference in the presence of noncompliance and are more efficient than standard instrumental variable methods.
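To make the ITT-versus-CACE logic concrete, the sketch below shows the simplest CACE calculation, the instrumental-variable (Wald) estimator, which scales the ITT effect by the compliance rate. The study itself proposes the more efficient maximum likelihood and Bayesian CACE methods of Imbens and Rubin (1997); this simpler estimator and its example numbers are illustrative only and are not drawn from the study.

```python
# Illustrative sketch of a complier-average causal effect (CACE):
# under standard instrumental-variable assumptions,
# CACE = ITT effect / compliance rate (the Wald estimator).
def cace(itt_effect, compliance_rate):
    """Scale the intent-to-treat effect by the share of participants
    assigned to treatment who actually received it."""
    if not 0 < compliance_rate <= 1:
        raise ValueError("compliance rate must be in (0, 1]")
    return itt_effect / compliance_rate

# Hypothetical example: a 4-point ITT reduction in decisional-conflict
# score with 80% compliance implies a 5-point reduction among compliers.
print(cace(-4.0, 0.8))  # -5.0
```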
For the qualitative data from the surveys and interviews, we will identify and analyze themes, patterns, and inter-relationships relevant to the evaluation questions for this study. Textual data from interview transcripts and surveys will be entered into a qualitative database software program, MAXQDA, for analysis. We will develop an initial list of deductive codes aligned with the study questions and systematically code the data to identify relevant themes in preparation for unique and common thematic analyses. As the ICF team begins to generate conclusions about the data during coding, they will verify these more general analyses and validate them by cross-checking and revisiting the data. Thus, the coding scheme will be elaborated upon and refined based on themes that emerge from the data. The descriptive themes that emerge from the synthesis of findings will be discussed at length with the coders and the larger evaluation team to ensure the validity of conclusions.
Publication
At least one manuscript will be prepared for submission to a peer-reviewed journal. DCPC will make the results of this study available to the public by publishing them on the prostate cancer web page of CDC’s website.
Project Time Schedule
Data collection will occur over a period of 6 months, beginning immediately after OMB approval. Analyses will be carried out from 6 to 12 months after OMB approval. The final data set, report, and manuscript will be submitted 24 months after OMB approval. A summary timeline is provided below.
Table A.16. Estimated Time Schedule for Project Activities
Activity | Timeline
Provider Data Collection |
Introductory email sent to all primary care providers at four clinics | 1 month after OMB approval
Provider information collection (provider survey) | 2-3 months after OMB approval
Patient Data Collection |
Introductory email sent to providers of patient sample | 2-3 months after OMB approval
Introductory email sent to patients | 2-3 months after OMB approval
Patient information collection (eligibility screener, pre-exposure survey, post-exposure survey, usability survey, user experience interview, post-clinic visit survey) | 3-6 months after OMB approval
Clinic Coordinator Data Collection |
Clinic coordinator information collection (clinic coordinator interview) | 6 months after OMB approval
Data Validation and Analysis |
Validation | 6-8 months after OMB approval
Analysis | 8-12 months after OMB approval
Reporting and Dissemination |
Report preparation | 12-16 months after OMB approval
Manuscript development | 16-24 months after OMB approval
Exemption is not being sought. All data collection instruments will display the expiration date of OMB approval.
This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submission.
i Schoenthaler, A. (2021). Evaluation of Kognito patient education and shared decision-making
modules. Appendix A, RFQ 75D301-22-Q-75441.
ii Heath, S. (2021, March 10). Is the digital divide the newest social determinant of health? Patient Engagement HIT. https://patientengagementhit.com/news/is-the-digital-divide-the-newest-social-determinant-of-health.
iii Kammerer, K., Falk K., Herzog A. & Fuchs J. (2019). How to reach ‘hard-to-reach’ older people for research: The TIBaR model of recruitment. Survey Methods: Insights from the Field. Retrieved from https://surveyinsights.org/?p=11822.
iv U.S. Bureau of Labor Statistics. May 2021 National Occupational Employment and Wage Estimates. (2021) https://www.bls.gov/oes/current/oes_nat.htm.
v U.S. Department of Labor. Minimum Wage. https://www.dol.gov/general/topic/wages/minimumwage#:~:text=The%20federal%20minimum%20wage%20for,of%20the%20two%20minimum%20wages.
vi Imbens, G.W., & Rubin, D.B. (1997). Bayesian inference for causal effects in randomized experiments with noncompliance. Annals of Statistics, 25(1), 305–27. https://doi.org/10.1214/aos/1034276631.