Supporting Statement B For:
Surveys and Interviews to Support an Evaluation of the Innovative Molecular Analysis Technologies (IMAT) Program (NCI)
Feb 17, 2015
Tony Dickherber
Center for Strategic Scientific Initiatives
National Cancer Institute
31 Center Dr., Rm 10A33
Bethesda, MD 20892
Telephone: 301.547.9980
Fax: 301.480.2889
E-mail: [email protected]
Table of Contents
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.3 Methods to Maximize Response Rates and Deal with Nonresponse
B.4 Test of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Attachment 1: IMAT Awardee Interview Guide
Attachment 2: IMAT Awardees and Other NIH Awardee Web-based Survey
Attachment 3: IMAT Technology End User Interview Guide
Attachment 4: Background and Rationale
Attachment 5: Trans-NIH Evaluation Advisory Committee (EAC)
Attachment 6: Privacy Act Memo
Attachment 7: Office of Human Subjects Research Protection (OHSRP) Exemption
Attachment 8: Invitation Letters to IMAT Awardees, Technology End Users, and IMAT Awardees and Other NIH Awardees (Comparison Group)
B. STATISTICAL METHODS
The non-federal respondents to the proposed surveys are awardees of funding announcements associated with the Innovative Molecular Analysis Technologies (IMAT) program and a comparison group of awardees from other NIH-funded technology development grants. Over its lifetime, the IMAT program has issued 70 independent funding opportunity announcements, received more than 4,500 applications, and made 673 awards supporting roughly 500 unique technology platforms. The evaluation strategy also requires reaching out to investigators who employed any of these technologies but were not involved during the IMAT-supported periods of development.
Surveying all awardees and applicants would be cost-prohibitive and unlikely to yield substantially new information beyond a minimum number of respondents. It was determined that interview responses from at least 100 IMAT awardees would provide a 95% level of confidence within a +/- 5% margin of error. Given the substantially greater expense of collecting information by telephone rather than through a web-based survey protocol (30 minutes per individual), the strategy for this evaluation is to issue a web-based survey to IMAT awardees and a comparison group of NIH technology-focused grantees, with the goal of obtaining as many as 900 responses in addition to the interviews. Because interviews are required to understand the complex outcomes of supported projects, a phone-based interview protocol will be used to obtain a deeper understanding from up to 100 selected successful NIH applicants (60 minutes per interview). To corroborate the findings from both survey respondent groups, additional interviews will be pursued with up to 50 technology end users (30 minutes per interview).
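For reference, the precision target above follows from standard survey-sampling formulas. The sketch below is illustrative only: it assumes simple random sampling with a finite population correction, and the population size and proportion used are assumptions for demonstration rather than the figures used in the program's determination.

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Approximate margin of error for n completed responses drawn by simple
    random sampling from a finite population of size N, for an estimated
    proportion p, at roughly 95% confidence (z = 1.96)."""
    se = math.sqrt(p * (1 - p) / n)        # standard error, infinite population
    fpc = math.sqrt((N - n) / (N - 1))     # finite population correction
    return z * se * fpc

# Hypothetical inputs for illustration only: 100 completed interviews drawn
# from a frame of 673 awards.  The achieved precision depends strongly on the
# proportion p assumed for the estimate of interest.
print(round(margin_of_error(n=100, N=673, p=0.5), 3))   # most conservative assumption
print(round(margin_of_error(n=100, N=673, p=0.1), 3))   # a less conservative assumption
```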
Individuals targeted for interviews will be selected after archival data collection is sufficiently complete to yield a case profile of all awardees. Based on the archival data analysis, and after deliberation with members of the Trans-NIH Evaluation Advisory Committee (Attachment 5), the NCI IMAT program team, and members of the contracted organization conducting the evaluation study, an intentional sample will be chosen for the respondent group.
The evaluation approach centers on tracking all supported technologies since 1998. The history of each technology will be traced so that it can be described at each stage of its development, including the period prior to its conceptualization for IMAT funding. Interviews will be conducted with individuals involved in the development of each technology to gather information about its current state and its potential for affecting progress in cancer research and treatment. Awardee responses will provide information on the earliest stages of development (i.e., at the time of the grant award) as well as details on subsequent developmental stages (i.e., progress and current status).
Responses from indirectly related technology end users, together with additional information from awardees, will inform the current state of each technology and its future potential. In terms of impact, potential users will also be asked to provide information on the use of the technology in improving cancer research and treatment. Most important for the purposes of this request, the ability to gather specific information through standardized interview protocols (Attachments 1 and 3) from the researchers who developed each technology, as well as from research scientists in the field who may have employed these technologies to pursue new research opportunities, is considered critical to the success of the proposed evaluation strategy.
To maximize response rates, respondents will initially be contacted by email and informed of plans to conduct an evaluation of the IMAT Program (Attachment 8). Program staff will monitor all emails that bounce back and identify other methods of contacting respondents whose email addresses are invalid. For the web-based surveys, respondents will click on a link in the invitation letter that brings them to the online survey. For interviews, respondents will receive a phone call and the interview protocol will be administered by telephone. Respondents who prefer an alternative time for the interview may have one arranged at their request. If a respondent is not initially reachable by telephone, up to three follow-up phone calls will be made and a telephone message will be left.
Beginning with study initiation and continuing through all phases of information collection and analysis, NCI will take steps to ensure that the data collected are of the highest possible quality. Program staff will be briefed on the purpose, sponsorship, background, objectives, and importance of the project, as well as on their specific roles and activities in the study.
To maximize response rates, interview respondents will be informed of the evaluation by email, sent a copy of the interview protocol in advance, and contacted with up to three follow-up attempts if they do not respond. Survey respondents will likewise be informed of the survey prior to its intended start date. Communications with respondents will be personalized and concise. Response rates will be measured and recorded, and once data collection has been completed, a non-response analysis will be conducted. Based on the results, the survey data may be weighted to adjust for non-response bias.
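Should weighting be needed, one common approach is a weighting-class adjustment, in which respondents' base weights are inflated by the inverse of the observed response rate within their class. The sketch below is a minimal illustration under that assumption; the grouping variable (award mechanism) and column names are hypothetical and not drawn from the evaluation's actual data files.

```python
import pandas as pd

# Illustrative frame of sampled awardees with a weighting-class variable and a
# flag indicating whether each person responded (hypothetical data).
frame = pd.DataFrame({
    "award_mechanism": ["R21", "R21", "R33", "R33", "R33", "SBIR"],
    "responded":       [True,  False, True,  True,  False, True],
    "base_weight":     [1.0,   1.0,   1.0,   1.0,   1.0,   1.0],
})

# Observed response rate within each weighting class.
rates = frame.groupby("award_mechanism")["responded"].mean()

# Respondents' weights are inflated by the inverse of their class response rate,
# so respondents stand in for non-respondents in the same class.
respondents = frame[frame["responded"]].copy()
respondents["adj_weight"] = (
    respondents["base_weight"] / respondents["award_mechanism"].map(rates)
)
print(respondents[["award_mechanism", "adj_weight"]])
```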
Several pilot evaluations were conducted in 2008, 2010, and 2013. As indicated above, the principal findings were largely positive regarding the outcomes of the program, suggesting that it was serving the purposes for which it was designed. However, none of the pilot evaluations came close to serving as a comprehensive evaluation of the IMAT program as uniquely responsible for serving its stated mission.
In 2007, Macro International Inc. was contracted by NCI to perform a feasibility study for an outcome evaluation of the IMAT program. In addition to verifying the feasibility of such an evaluation, the organization also produced an evaluation design, which forms the basis of the proposed evaluation design. The proposed design includes a research approach, a list of research objectives/questions, a conceptual framework, proposed data sources, a data collection strategy, and a costing/staffing estimate (circa 2007). The evaluation objectives are to:
1) identify all IMAT and associated technologies;
2) identify the development path(s) for IMAT technologies;
3) assess the dissemination of all IMAT technologies; and
4) determine the outcomes or impacts of each IMAT technology.
The approach described involves tracking both successful applications and some unsuccessful applications to the program (as a comparison group), with a focus on short-term and intermediate outcomes. The recommendation was that the most reliable information would be collected through interviews, supplemented by additional background and supporting data.
Minor modifications were made to the survey content and format as guided by a trans-NIH advisory committee. Experienced survey operations staff formatted the survey questionnaire for ease of online completion, as well as to facilitate coding and data entry. Additionally, the web-based survey was pilot tested with seven individuals to ensure the survey design and web-based settings were working as intended. This pilot test also served as an opportunity to assess whether the questions were clear and to inform burden estimates for the time to complete (i.e., 30 minutes). Based on pilot findings, the evaluation team made minor modifications to formatting and question wording (to clarify meaning) and deleted 13 questions to reduce respondent burden. The interview protocol was also pilot tested using a mock interview, and minor modifications were made to make the interview questions clearer.
Individuals who have consulted on statistical aspects and/or in analyzing the information are listed in Attachment 5.