
Supporting Statement A For:



Surveys and Interviews to Support an Evaluation of the Innovative Molecular Analysis Technologies (IMAT) Program (NCI)

OMB #0925-0720, Expiration Date: May 31, 2016

May 16, 2016



Yellow highlights represent changes from the 2015 submission.



Tony Dickherber

Center for Strategic Scientific Initiatives

National Cancer Institute


31 Center Dr., Rm 10A33

Bethesda, MD 20892


Telephone: 301.547.9980

Fax: 301.480.2889

E-mail: [email protected]



Check off which applies:

  • New

  • Revision

  • Reinstatement with Change

  • Reinstatement without Change

X Extension

  • Emergency

  • Existing

Table of Contents

A. Justification
A.1 Circumstances Making the Collection of Information Necessary
A.2 Purpose and Use of the Information Collection
A.3 Use of Improved Information Technology and Burden Reduction
A.4 Efforts to Identify Duplication and Use of Similar Information
A.5 Impact on Small Businesses or Other Small Entities
A.6 Consequences of Collecting the Information Less Frequently
A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.9 Explanation of Any Payment or Gift to Respondents
A.10 Assurance of Confidentiality Provided to Respondents
A.11 Justification for Sensitive Questions
A.12 Estimates of Annualized Burden Hours and Costs
A.13 Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
A.14 Annualized Cost to the Federal Government
A.15 Explanation for Program Changes or Adjustments
A.16 Plans for Tabulation and Publication and Project Time Schedule
A.17 Reason(s) Display of OMB Expiration Date is Inappropriate
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions




List of Attachments


Attachment 1: Background and Rationale


Attachment 2: IMAT Awardee Interview Guide


Attachment 3: IMAT Awardees and Other NIH Awardees Evaluation Web-based Survey


Attachment 4: IMAT-Supported Technology End-User Interview Guide


Attachment 5: Privacy Impact Assessment (PIA)


Attachment 6: Trans-NIH Evaluation Advisory Committee (EAC)


Attachment 7: Privacy Act memo


Attachment 8: Office of Human Subjects Research Protection (OHSRP) Exemption


Attachment 9: Invitation Letters to IMAT Awardees and IMAT-Supported Technology End-Users



ABSTRACT


This is a request for a one-year extension of OMB #0925-0720, titled “Surveys and Interviews to Support an Evaluation of the Innovative Molecular Analysis Technologies (IMAT) Program”. The National Cancer Institute (NCI) IMAT program was launched in 1998 to support the development of highly innovative technologies to advance cancer research and clinical care capabilities. NCI is pursuing a comprehensive and robust evaluation to assess the process and outcomes of the IMAT program and is also seeking opportunities to improve the program’s utility for the broad continuum of cancer researchers, clinicians, and ultimately patients. The focus of the evaluation has been on understanding the successes of supported technologies, and it targets three respondent groups: IMAT awardees, a sample of awardees from NIH-funded technology development programs beyond the IMAT program, and technology end-users. The evaluation approach is centered on tracking all supported technologies since 1998. To date, the IMAT program has issued 603 R21 and R33 awards and 183 SBIR and STTR awards, supporting over 500 unique technology platforms. The majority of the interviews (82%) and surveys (58%) with past IMAT awardees and the comparison group of other NIH awardees have been completed. This extension is requested to complete the remaining interviews and surveys. Interviews with the cohort of IMAT-supported technology end-users have not started but will begin shortly.



A. Justification


A.1 Circumstances Making the Collection of Information Necessary

The National Cancer Institute (NCI) Innovative Molecular Analysis Technologies (IMAT) program was launched in FY1998. The mission of the IMAT program is “to support the development, maturation, and dissemination of novel and potentially transformative next-generation technologies through an approach of balanced but targeted innovation in support of clinical, laboratory, and epidemiological research on cancer”. Given the trans-divisional nature of the program, it has always been managed from the NCI Office of the Director. This collection of information is consistent with the purpose of NCI and is authorized under Section 410 of the Public Health Service Act (42 USC § 285).

While the structure of the IMAT Program has evolved over periodic reformulations of the associated funding opportunities, its goal remains largely unchanged since its inception: to support highly innovative ideas focused on early-stage technology development that are unlikely to win support through traditional NIH funding mechanisms. The IMAT program generally includes a rolling portfolio of 80 to 120 active projects, with total award outlays (including new and continuing awards) of approximately $20M to $30M per year. The program also requires participation in an annual meeting by all funded investigators (Attachment 1).

In 2015, IMAT and other NIH awardees completed 521 web surveys and 82 phone interviews. Due to a substantial delay in pursuing contract tasks during this project, the task of interviewing an end-user cohort had to be included in a subsequent contract with the same team, substantially delaying the execution of this component. A one-year term of OMB approval is requested to complete the interviews and surveys with the three respondent groups.


A.2 Purpose and Use of the Information Collection

NCI is pursuing a comprehensive process and outcome assessment of the 15-year-old IMAT program. While the program consistently offers promising indicators of success, the full program has not been evaluated since 2008, and never in as comprehensive a manner as has been formulated in the current evaluation plan. An outcome evaluation of the long-standing NCI IMAT program presents a rich and unique opportunity likely to serve institutes across the NIH, and perhaps other federal agencies, considering the costs and benefits of directing resources towards supporting technology development. An award through the NIH Evaluation Set-Aside program to support this evaluation, for which NIH-wide relevance is a principal element in determining merit for support, is a testament to this. The evaluation serves as an opportunity to gauge the impact of investments in technology development and also to assess the strengths and weaknesses of phased innovation award mechanisms.

The focus of the evaluation is specifically on understanding the successes of supported technologies. The program's longevity provides a large data set with the potential to produce statistically significant findings and also comes with a variety of highly leverageable resources, including: currently active program staff with extensive institutional memory; a robust feasibility study with a proposed outcome evaluation design; and partial evaluation studies of the program that reduce the scope of work for the proposed evaluation.

The evaluation approach is centered on tracking all supported technologies since 1998. The history of each technology is traced so that it can be described at each stage of its development, even prior to its conceptualization for IMAT funding. Interviews and surveys will be conducted with individuals who have different associations with the technology in order to gather information about the current state of the technology and its potential for affecting progress in cancer research and treatment.

The interviews and surveys target three different respondent groups: IMAT awardees, a comparison group of NIH awardees, and technology end-users, possibly including patient advocacy communities. The majority of the interviews and surveys with both the IMAT awardees and the comparison group of NIH awardees have been completed, and only interviews with technology end-users remain. Technology end-users are respondents who employed any technologies that arose from IMAT funding but were not personally involved during the IMAT-supported periods of development1. All three groups of respondents consist primarily of scientists, engineers, and clinicians.


The interview protocol for the technology end-user group (Attachment 4) will focus on collecting the following information:

  1. Development path(s) for IMAT technologies:

    a. How was the technology developed after IMAT funding ended and/or after the original PI's disassociation from the project?

  2. Dissemination of all IMAT technologies:

    a. How were the details of the technology spread to scientific audiences?

    b. To what extent is the technology or methodology being used?

  3. Outcomes or impacts of successfully developed IMAT technology:

    a. What are the short-term and intermediate-term impacts?



The web-based survey protocol (Attachment 3) complements the interview protocols (Attachments 2 and 4). Issuing the web-based survey to NIH-supported investigators who are similarly developing novel technologies provides an important comparison group for the evaluation study. Responses to the surveys and interviews from IMAT awardees, the researchers who developed the technologies, allow for the identification of research scientists in the field who might have employed these technologies (the technology end-users). The following questions were assessed during the interviews and surveys of the IMAT awardees and the comparison group (Attachments 2 and 3):

  1. Assessment of IMAT Program Design:

    a. How did the application process, solicitation, and IMAT funding structure (mechanisms) impact the development of your technology?

  2. Assessment of IMAT Program Implementation:

    a. How did interactions with NIH, NCI, or other organizations impact the development of your technology?

    b. How did the research environment (e.g., institutional support; other related research activities) impact the development of your technology?

  3. Interactions and collaborations with NCI program staff and other research organizations.

  4. Experiences with IMAT during the application and award selection process.

  5. Experiences with the funding mechanisms and the grants related to IMAT.


A.3 Use of Improved Information Technology and Burden Reduction

Web-based surveys and telephone interviews have been employed to reduce the burden on respondents. A database was developed that contains respondent names, contact information, and the dates of all email and telephone contacts, in order to track the most up-to-date contact information. All quantitative data have been standardized and compiled into a searchable database. The qualitative data will be compiled into Word documents, and personally identifying information about the respondents, which resides in the database, will be kept separate. If deemed necessary and worthwhile by the evaluation lead, more sophisticated data management and qualitative analysis software tools (e.g., NVivo) may be employed.

The NCI Privacy Coordinator reviewed and approved the Privacy Impact Assessment (PIA) for this study. Additional reviews at NIH and HHS will be forthcoming (Attachment 5).


A.4 Efforts to Identify Duplication and Use of Similar Information

NCI conducted three previous pilot evaluations2 of this program, in 2008, 2010, and 2013, to assess outcomes similar to the aims of the proposed evaluation; they were based, in part, on the same professionally developed evaluation design, originally proposed in 2007, that guides the current evaluation strategy. Outcomes from these pilot evaluations were largely positive, suggesting significant anecdotal evidence that the IMAT program was achieving its proposed goals, but they were unsatisfactory in that they represented a narrow investigation of the full portfolio and were highly susceptible to selection bias. An archival data analysis was conducted to select particular respondents based on their case profiles; however, no subjective information exists about their experiences with the IMAT program or about individuals who are using technologies that arose from IMAT funding. This is the first comprehensive and robust evaluation conducted since the IMAT Program’s inception in 1998. This request extends the period allowed for conducting the remaining interviews and surveys with the IMAT awardees and technology end-users.



A.5 Impact on Small Businesses or Other Small Entities

SBIR and STTR grantees account for about 25% (160) of the IMAT awards made. Approximately 25 of the awardees selected to be interviewed are investigators involved in these projects. As with the other interviews, telephone interviews will be employed to limit the burden on the interviewee. It is possible that some of the end-users sought for interviews will be part of small business entities; however, this is not expected to account for more than 3-4 individuals (~15% of the group).



A.6 Consequences of Collecting the Information Less Frequently

This is a one-time collection.


A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This study complies fully with the guidelines of 5 CFR 1320.5. No exceptions to the guidelines are required.


A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A 60-day Federal Register Notice was published on March 23, 2016 (Vol. 81, Page 15541), allowing 60 days for public comment and review. No public comments were received.

A trans-NIH Evaluation Advisory Committee (EAC) was recruited to make early recommendations on the evaluation plan, consider ongoing progress of the evaluation, and to comment on the findings and final report from the proposed evaluation (Attachment 6).


A.9 Explanation of Any Payment or Gift to Respondents

No payment or gift will be made to the respondents.




A.10 Assurance of Confidentiality Provided to Respondents

Respondents have the option to skip any question they would prefer not to answer and to quit the survey at any time. Unless express permission is provided by the respondent, all data will be de-identified and reported in the aggregate. Respondents are not asked to complete a consent form. Each respondent’s willingness to schedule time for an interview is interpreted as evidence of implied consent.

To protect the security of respondents’ information, all project files are password protected and access to the files is limited to authorized project staff. Interview information is stored on a secure server protected with a Secure Sockets Layer (SSL) certificate and 128-bit encryption, the strongest online data encryption protection available. The tracking database with individual contact information is stored separately from the data. The database contains IDs only. The tracking database that links IDs to individual information will be destroyed at the end of the project. Project reports will not identify individuals who completed the survey. No names, university names, or personal identifying information will be used in any published reports of this study unless given express permission from the respondent. Survey reports will present all findings in aggregate so individual responses cannot be identified.

The NIH Privacy Act Officer has reviewed this data collection and determined that the Privacy Act is applicable and that the collection is covered by NIH Privacy Act Systems of Record Notice (SORN) #09-25-0156, “Records of Participants in Programs and Respondents in Surveys Used to Evaluate Programs of the Public Health Service, HHS/PHS/NIH/OD” (Attachment 7). This SORN was published in the Federal Register on 9/26/2002, Vol. 67, p. 60743.

Additionally, the Office of Human Subjects Research Protection (OHSRP) has reviewed this project and deemed that Federal regulations for the protection of human subjects do not apply to this information collection, and thus it has been excluded from Institutional Review Board review (Attachment 8).



A.11 Justification for Sensitive Questions

The questions being asked do not constitute sensitive questions.



A.12 Estimates of Annualized Burden Hours and Costs

The estimated burden hours for this extension are 233 hours for 447 respondents (Table A.12-1). Of the 900 originally approved respondents for the web-based survey (Attachment 3), 521 surveys have been completed and 379 still need to be completed, which constitutes 190 burden hours from 379 respondents. Of the 100 phone interviews originally approved with the IMAT awardees (Attachment 2), 82 have been completed and 18 still need to be completed, which constitutes 18 burden hours from 18 respondents. Additionally, none of the originally approved technology end-user interviews (Attachment 4) have been completed; all 50 still need to be completed, constituting 25 burden hours from 50 respondents.

Additionally, Federal employees will be interviewed and asked similar questions, but because this is part of their job duties, the burden is not calculated into the table below.

Table A.12-1 Estimate of Annual Burden Hours

Form Name | Type of Respondents | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Annual Burden (in hours)
Interview - IMAT Grantee | IMAT Awardees | 18 | 1 | 1 | 18
Web-based Survey - Technology Grantees | IMAT Awardees; Other NIH Awardees representing comparison group | 379 | 1 | 30/60 | 190
Interview - Tech End-Users | Technology End-Users | 50 | 1 | 30/60 | 25
Totals | | 447 | 447 | | 233

Respondents are primarily scientists, engineers, and clinicians. The mean hourly wage rate of $44.23 per hour3 was calculated by averaging the wage rates for the three occupational categories corresponding to these respondent types4.


The total and annualized cost to respondents is estimated to be $10,306 annually (Table A.12-2).





Table A.12-2 Annualized Cost to Respondents

Type of Respondents | Total Annual Burden Hours | Hourly Wage Rate | Respondent Cost
IMAT Grantees | 18 | $44.23 | $796.14
IMAT Grantees and Other NIH Grantees | 190 | $44.23 | $8,403.70
Technology End-Users | 25 | $44.23 | $1,105.75
Total | 233 | | $10,305.59
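As a cross-check, the burden hours and respondent costs in Tables A.12-1 and A.12-2 follow directly from the respondent counts, the per-response burdens, and the $44.23 mean hourly wage. The short Python sketch below reproduces that arithmetic; it is illustrative only, and it assumes each row's burden is rounded up to the nearest whole hour, as the tables appear to do.

```python
import math

# Illustrative cross-check of Tables A.12-1 and A.12-2; figures are taken from this section.
HOURLY_WAGE = 44.23  # mean hourly wage rate used in Table A.12-2

# form name: (respondents, responses per respondent, hours per response)
collections = {
    "Interview - IMAT Grantee": (18, 1, 1.0),
    "Web-based Survey - Technology Grantees": (379, 1, 30 / 60),
    "Interview - Tech End-Users": (50, 1, 30 / 60),
}

total_hours = 0
for name, (respondents, responses, hours_each) in collections.items():
    burden = math.ceil(respondents * responses * hours_each)  # 18, 190, 25 hours per row
    total_hours += burden
    print(f"{name}: {burden} hours, ${burden * HOURLY_WAGE:,.2f}")

print(f"Total: {total_hours} hours, ${total_hours * HOURLY_WAGE:,.2f}")  # 233 hours, $10,305.59
```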


A.13 Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

There are no direct costs to respondents other than their time to participate in the study.



A.14 Annualized Cost to the Federal Government

The total and annualized cost to the Federal Government is approximately $7,837 for this information collection. The majority of the expenses are contractor costs, which cover non-federal personnel costs (salary and benefits) for a project director, a senior research analyst, and a research analyst. Federal personnel costs include several hours of participation by a program officer. The total costs are shown in Table A.14-1.

Table A.14-1 Estimate of Total Cost to the Government

Staff | Grade/Step | Salary | % of Effort | Fringe (if applicable) | Total Cost to Gov't
Federal Oversight | | | | | $640.41
Program Officer | 15/1 | $128,082 | 0.5% | | $640.41
Contractor Cost | | | | | $7,196.80
Project Director | | | 100% | | $2,603.52
Senior Research Analyst | | | 100% | | $1,937.20
Research Analyst | | | 100% | | $2,656.08
Travel | | | | | $0
Other Cost | | | | | $0
Total | | | | | $7,837.21
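The federal oversight line in Table A.14-1 is the program officer's salary prorated by the stated level of effort, and the total is that figure plus the contractor line items. A minimal Python sketch of the arithmetic, assuming straight proration with no fringe applied (as the table indicates):

```python
# Illustrative cross-check of Table A.14-1 (no fringe applied, per the table).
program_officer_salary = 128_082   # GS-15/1 annual salary from the table
level_of_effort = 0.005            # 0.5% of effort

federal_oversight = program_officer_salary * level_of_effort
contractor_costs = [2_603.52, 1_937.20, 2_656.08]  # project director, senior research analyst, research analyst

print(f"Federal oversight: ${federal_oversight:,.2f}")                           # $640.41
print(f"Contractor cost:   ${sum(contractor_costs):,.2f}")                       # $7,196.80
print(f"Total:             ${federal_oversight + sum(contractor_costs):,.2f}")   # $7,837.21
```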


A.15 Explanation for Program Changes or Adjustments

This is an extension to continue and complete the information collection for the approved submission (OMB #0925-0720). In 2015, the previous submission approved 575 burden hours to complete 900 web surveys and 150 interviews (100 with IMAT Grantees and 50 with Tech End Users). Most of the data collection has been completed, including 521 of the web surveys (58%) and 82 of the phone interviews (82%) with investigators supported by the IMAT program. Interviews with the cohort of IMAT-supported technology end-users have not started but will begin shortly. This extension is to collect information from the remaining interviews and surveys. There have been no changes to the data collection plan.

This extension is being requested to allow one additional year to finish collecting the information (233 burden hours), which includes:

  • 18 Interviews with IMAT Grantees (18 burden hours),

  • 379 Surveys with IMAT and other NIH Grantees (190 burden hours),

  • 50 Interviews with Tech End Users (25 burden hours).

Accordingly, this request represents a decrease of 342 burden hours from the previously approved submission.


A.16 Plans for Tabulation and Publication and Project Time Schedule

Data Analysis

Data quality control and quality assurance procedures have been developed and implemented by senior evaluation professionals and applied to all collected data. This includes procedures to ensure accuracy and consistency in data entry, data manipulation, and calculation. For quantitative data, internal validity is checked as necessary for analysis. Descriptive and summary statistics will be calculated. If warranted, data from multiple sources may be cross-tabulated to address the study questions. Analytical statistics (e.g., linear regression, especially for publication/bibliometric analysis) and network statistics (e.g., degree centrality, betweenness, closeness, and eigenvector centrality, especially for multi-disciplinary analyses) may be used to assess differences between IMAT awardees and the comparison group, should comparable information be obtained from the comparators.
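If the network statistics named above are pursued, they can be computed with standard off-the-shelf tooling. The sketch below is a minimal illustration using the Python networkx library on a hypothetical collaboration network (the edge list is invented for illustration); a real analysis would build the graph from collaboration or co-publication data gathered during the evaluation.

```python
# Minimal sketch: the four centrality measures named above, computed with networkx.
# The edge list is hypothetical; it stands in for a collaboration/co-publication network.
import networkx as nx

edges = [
    ("Awardee A", "Awardee B"),
    ("Awardee A", "End-User C"),
    ("Awardee B", "End-User C"),
    ("Awardee B", "Awardee D"),
]
G = nx.Graph(edges)

centrality = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G),
}

for measure, scores in centrality.items():
    print(measure, {node: round(score, 2) for node, score in scores.items()})
```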

For qualitative data, more sophisticated data management and qualitative analysis software tools (e.g., NVivo) may be employed, although it is not expected that the volume of qualitative data will require this. Qualitative data will be coded and analyzed using standard qualitative methods.

Plans for Publication

No plans for publication.

Project Time Schedule

The evaluation is overseen by a trans-NIH evaluation advisory committee (EAC – Attachment 6), and it is estimated that the entire project will be completed by May 2017. The remaining surveys and interviews will be collected within 12 months of OMB approval.


A.17 Reason(s) Display of OMB Expiration Date is Inappropriate

No exemption from the display of the OMB Expiration Date is being requested.


A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to certification being requested.

1 Though the technology end-users are a separate respondent group, it is conceivable that some individuals in that group could also be IMAT awardees.

2 The pilot evaluations involved interviewing 9 individuals for each of the years (2008, 2010, and 2013) and thus did not need OMB clearance.

3 The Hourly Wage Rate for the previous submission was $44.34, calculated from the May 2013 Wage estimates.

4 May 2014 National Occupational Employment and Wage Estimates - United States.


