
National Evaluation of the Clinical and Translational Science Awards (CTSA) Initiative (NCRR)

OMB: 0925-0629







Supporting Statement A for

National Evaluation of the
Clinical and Translational Science Awards (CTSA) Initiative (NCRR)





Revised March 30, 2011









Patricia Newman

Program Analyst

Office of Science Policy

National Center for Research Resources

6701 Democracy Boulevard

MSC 4874

Bethesda, Maryland 20892-4874

Telephone: 301-435-0864

Fax: 301-480-3654

Email: [email protected]


Table of contents


Supporting Statement A. Justification
Introduction
Overview of the CTSA Initiative
A.1. Circumstances Making the Collection of Information Necessary
A.2. Purpose and Use of the Information Collection
Overview of the Study Design
A.3. Use of Information Technology and Burden Reduction
A.4. Efforts to Identify Duplication and Use of Similar Information
A.5. Impact on Small Businesses and Other Small Entities
A.6. Consequences of Collecting the Information Less Frequently
A.7. Special Circumstances Relating to the Guidelines in 5 CFR 1320.5
A.8. Comments in Response to Federal Register Notice and Efforts to Consult Outside Agency
A.9. Explanation of Any Payment or Gift to Respondents
A.10. Assurance that Data Will Be Kept Private to the Extent Permitted by Law
A.11. Justification of Sensitive Questions
A.12. Estimates of Hour Burden Including Annualized Hourly Costs
A.12.1. Number of Respondents, Frequency of Response, and Annual Hour Burden
A.12.2. Hour Burden Estimates by Each Module and Aggregate Hour Burdens
A.12.3. Estimates of Cost to Respondents for the Hour Burdens
A.13. Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers
A.14. Annualized Cost to the Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
A.17. Reason(s) Display of OMB Expiration Date is Inappropriate
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

Exhibit A1. Summary of Cognitive Testing Informants
Exhibit A2. Calculations used to estimate annualized response burden for the National Evaluation of the CTSA Initiative
Exhibit A3. Calculations used to estimate annualized cost burden for the National Evaluation of the CTSA Initiative
Exhibit A4. Estimated annualized cost to the Federal government for the National Evaluation of the CTSA Initiative




Request for OMB Clearance

National Evaluation of the
Clinical and Translational Science Awards (CTSA) Initiative (NCRR)


Supporting Statement for Paperwork Reduction Act Submission


Supporting Statement A. Justification


Introduction

This request for Office of Management and Budget (OMB) review asks for clearance for four surveys to be used in the National Evaluation of the Clinical and Translational Science Awards (CTSA) Initiative (see Attachment 1). Through a national consortium of academic medical centers, the CTSA Initiative is directed at transforming the way biomedical research is conducted nationwide. The CTSA Initiative embodies the vision to reduce the time it takes for basic science or laboratory discoveries to become treatments for patients, and for those treatments in turn to be incorporated and disseminated throughout community practice. Scientific innovation is critical to the CTSA mission, including employing novel concepts, approaches, methodologies, tools, and technologies; integrating clinical, basic, and other related disciplines; and training the next generation of clinical and translational researchers. The Initiative is administered by the National Center for Research Resources (NCRR) at the National Institutes of Health (NIH) as part of the NIH Roadmap for Medical Research initiative launched under the direction of former NIH Director Elias Zerhouni. The NIH, an operating division of the U.S. Department of Health and Human Services, is the primary Federal agency for conducting and supporting medical research. NCRR provides laboratory scientists and clinical researchers with tools and training to understand, detect, treat, and prevent a wide range of diseases. Evaluation of the CTSA Initiative is expected by Congress according to the NIH Reform Act of 2006, the Fiscal Year 2007 Appropriation Report (Senate Report 109-287), and the Fiscal Year 2008 Appropriation Report (House Report 110-231). 42 USC 242b and 42 USC 282b authorize the Secretary of Health and Human Services and the Director of the NIH, respectively, to conduct evaluations in health and health care. The evaluation findings will be used to assess the large annual NIH investment in this Initiative.



Overview of the CTSA Initiative

The CTSA Initiative is administered by the Division of Clinical Research Resources (DCRR) within NCRR. The Initiative was designed to spur the transformation of the biomedical research enterprise in the United States so that new treatments can be developed more efficiently and delivered more quickly to patients. The Initiative enables institutions to create an integrated academic home for clinical and translational science that has the resources to train and advance multi- and interdisciplinary investigators and research teams. CTSAs engage basic, translational, and clinical investigators, community clinicians, clinical practices, networks, professional societies, and industry to develop new professional interactions, programs, and research projects. The CTSA Initiative is designed to fundamentally change the organization and operating paradigm of research in major academic medical centers by providing infrastructure resources to support clinical and translational science activities and training for the next generation of scientists. Establishing new partnerships and new collaborations is considered essential to reaching the Initiative’s goals. While many of the new linkages will involve direct collaborations among people or groups, new informatics tools are also expected to play an important role in connecting information, resources, and groups. The impact of this program on the organizational research culture will help deconstruct communication barriers that are endemic in the traditional academic research model, where basic scientists work separately from clinical researchers and clinical researchers do not have strong connections to clinical practitioners in the community. Additionally, through innovative clinical and translational training, CTSAs will develop a new generation of interdisciplinary scientists who can continue to transform the way biomedical research is conducted.


When fully implemented by 2012, there will be a total of 60 CTSA awardees, with an expected annual budget of approximately $500 million. Given the size of the Federal investment and the importance Congress places on improving health care for the American people through clinical patient services and other aspects of interdisciplinary clinical and translational research, rigorous evaluation of the implementation and impacts of this Initiative is essential. Indeed, evaluating the work of the CTSA institutions and the consortium, described below, is expected by Congress.1 This clearance request to conduct four surveys to inform the progress of the Initiative pertains to the 46 awardees participating in the first four (2006, 2007, 2008, and 2009) cohorts of CTSA awards.2


Each CTSA participates in a national consortium with NIH, a distinctive feature essential to transforming the impact of research on human health. While each CTSA is unique in its approach to addressing the key “function areas” outlined in the Request for Applications (RFA)/Funding Opportunity Announcement (FOA), interchange and collaboration among the CTSA awardees is supported by the CTSA consortium in an effort to create synergies and efficiencies among the institutions and NIH. When established in 2006, the consortium formed committees that aligned with the key function areas outlined in the initial RFA.3 In October 2008, consortium members realigned the consortium committee structure to support activities related to five newly articulated strategic goals, which include the enhancement of 1) national clinical and translational research capability, 2) training and career development of clinical and translational scientists, 3) consortium-wide collaborations, 4) the health of communities and the nation, and 5) T1 translational research. The consortium formulated new Strategic Goal Committees (SGCs) that draw on the expertise of the Key Function Committee (KFC) members and that focus on identifying and accomplishing specific deliverables more deliberately than in the past. A Steering Committee, Child Health Oversight Committee, and Executive Committee were also included, and, more recently, a Consortium Management Group was added to provide consortium oversight. As the Initiative moves forward, the consortium structure is expected to continue to evolve.





A.1. Circumstances Making the Collection of Information Necessary

NCRR is requesting an evaluation of the first four cohorts of CTSA awardee institutions in order to comply with the congressional expectation for an evaluation that is commensurate with the level of NIH investment and the potential impact on the biomedical enterprise and on public health. As indicated earlier, congressional reports and the NIH Reform Act of 2006 indicate an expectation for an external evaluation with a requirement to evaluate pediatric training. This evaluation is also intended to support continuous program improvement and effective funding decisions made by NIH. This information is collected under 42 USC 242b and 42 USC 282b.




A.2. Purpose and Use of the Information Collection

The primary purpose of this data collection is to provide information about the process and early outcomes associated with 46 awardees participating in the first four cohorts of CTSA awards, in order to fulfill the congressional expectations described above. NIH will use the results to understand the extent to which the CTSA Initiative is bringing about transformational changes in clinical and translational science among academic medical centers and their research partners, increasing the efficiency of the research process, and enhancing the capacity of the field to conduct clinical and translational research. All information collected will be used to provide analytical and policy support to NCRR, assisting NIH in making decisions about current CTSA programming, future funding, and other initiatives to improve clinical and translational science. It may also provide information for NIH’s Government Performance and Results Act (GPRA) report.


A secondary purpose of this data collection is to provide the foundation for future evaluations that will track the progress of the CTSA Initiative. Since baseline data (data gathered regarding the status of variables before the initiation of the CTSA program) are not available for many of the areas being examined, this data collection will serve as an initial benchmark against which future progress will be measured.



Overview of the Study Design

NCRR will collect information about the process and early outcomes associated with CTSA awards through several component studies, two of which—the biennial utilization study and the biennial education and training study—are the subject of this clearance request.4 The full range of data collection activities is described below:


  • Individualized stakeholder interviews to determine short-term outcomes believed to be important;

  • Two biennial surveys of researchers (CTSA users and nonusers) at academic medical centers to assess utilization of CTSA resources, referred to as the “utilization study”;

  • Two biennial surveys of researchers (CTSA clinical and translational trainees/scholars and their mentors) to assess the efficacy of the education and training component, referred to as the “education and training study”;

  • Analyses of secondary publications and patent data to provide an indicator of CTSA-related scientific advancements;

  • Other secondary analyses of data submitted by awardees in Non-Competing Continuation Progress Reports (PHS 2590);

  • Expert panel review of research and tools nominated as potential scientific breakthroughs; and

  • Field visits to a selected sample of CTSA institutions to more closely examine how the program is being implemented and what it is accomplishing, using interviews tailored to the specific CTSA and each individual’s unique project role.

Additional detail on the studies and surveys that are the subject of this clearance are presented below.



Utilization Study


The utilization study will collect information from two groups of researchers at academic medical centers:


  • Researchers, including Principal Investigators (PIs) and Co-Principal Investigators (Co-PIs), whose clinical or translational research was aided by the resources of a CTSA, as reported in the Non-Competing Continuation Progress Reports (PHS 2590) submitted by the 46 awardee institutions. These researchers are respondents to the users survey.

  • Researchers who are the PI or Co-PI on an NIH grant at a participating CTSA institution, not reported to have used any CTSA resources. These researchers are respondents to the nonusers survey.


The users survey is designed to obtain information about the resources being used and the researchers who use them. It will be administered biennially, through a web-based survey sent by email to the last known email address of a sample of 700 users, stratified by cohort, who were reported in Non-Competing Continuation Progress Reports (PHS 2590). The web link to the survey will be accompanied by an introductory email that explains the purpose of the evaluation, describes the nature of the information being requested, indicates that response is voluntary, assures the respondents that their answers will be kept private to the extent permitted by law, provides a respondent-specific password, and requests that the survey be completed on the web within two weeks of receipt. For each biennial administration, we are expecting approximately 500 complete responses (i.e., responses from about 71 percent of users sampled) based on experience with similar surveys and consultation with CTSA consortium evaluation experts.


The users survey has two main sections:


Section I asks respondents about their interactions with a CTSA, including how they learned about it, resources they used or are considering using, the research for which they used the resources, and their satisfaction with the resources.

Section II asks about users’ primary employment and educational background.


Copies of the cover email and a paper-based version of the users survey are provided in Attachment 1, Part A.


The nonusers survey is designed to obtain information about familiarity with CTSA resources and attitudes about them from researchers who have not been reported in Non-Competing Continuation Progress Reports (PHS 2590) to have used them. This survey will be administered biennially, through a web-based survey sent by email to the last known email address of a sample of 2,000 researchers. These researchers will be ones who were listed (using the NIH Research Portfolio Online Reporting Tool (RePORTER)) as the PI or Co-PI on an NIH grant at an institution that is participating in the CTSA Initiative and were not reported to have used any CTSA resources. The web link to the survey will be accompanied by an introductory email that explains the purpose of the evaluation, describes the nature of the information being requested, indicates that response is voluntary, assures the respondents that their answers will be kept private to the extent permitted by law, and requests that the survey be completed on the web within two weeks of receipt. For each biennial administration, we are expecting approximately 500 complete responses (i.e., about 25 percent of nonusers sampled) based on experience with similar surveys and consultation with CTSA consortium evaluation experts.


The nonusers survey has two main sections:


Section I asks about their familiarity with a CTSA and whether they have considered or are considering using CTSA resources in their research and, if relevant, why they decided not to use the resources.

Section II asks about nonusers’ primary employment and educational background.

Copies of the cover email and a paper-based version of the nonusers survey are provided in Attachment 1, Part B.


Education and Training Study

The education and training study will collect information from trainees/scholars and mentors, more specifically:


  • Clinical and translational trainees/scholars reported by a CTSA to have received training awards through CTSAs. These researchers are respondents to the trainees/scholars survey.

  • Researchers reported by a CTSA to have served as mentors for clinical and translational researchers. These researchers are respondents to the mentors survey.


The trainees/scholars survey is designed to obtain information about the support being provided to 1,516 individuals who have received education/training funding through the CTSAs. It will be administered biennially, through a web-based survey sent by email to the last known email address of all individuals who were reported—in CTSA Non-Competing Continuation Progress Reports (PHS 2590) and supplemented by current information provided by a local CTSA contact—to have been provided with education/training support. Each local CTSA contact will be asked to provide updated contact information for education/training funding recipients who have left the institution and supplementary information for recipients who have joined the institution since the last Non-Competing Continuation Progress Reports (PHS 2590) were submitted in spring 2010. The web link to the survey will be accompanied by an introductory email that explains the purpose of the evaluation, describes the nature of the information being requested, indicates that response is voluntary, assures the respondents that their answers will be kept private to the extent permitted by law, provides a respondent-specific password, and requests that the survey be completed on the web within two weeks of receipt. For each biennial administration, we are expecting approximately 1,213 complete responses (i.e., responses from about 80 percent of trainees/scholars sampled) based on experience with similar surveys and consultation with NCRR and CTSA consortium evaluation experts.

The trainees/scholars survey has seven main sections:


Section I asks them to assess themselves on three types of translational research and their overall interest in a career in the field.

Section II asks them to assess the career preparation provided by their CTSA-funded education/training program.

Section III asks them to identify their career intentions, their research interests, and their views about a career in research.

Section IV asks about various components of their CTSA-funded education/training program, their exposure to team-based research, how long they were in the program and whether they have completed it, and their opinions about the program.

Section V asks about mentoring provided by the CTSA-funded education/training program, including the research backgrounds of the mentors and the help they provided.

Section VI asks about applications for research funding for which they may have applied and journal articles published related to their CTSA-funded education/training.

Section VII asks about trainees/scholars’ demographics, educational background, and employment.



Copies of the cover email and a paper-based version of the trainees/scholars survey are provided in Attachment 1, Part C.



Mentor Survey


The mentors survey is designed to obtain information about the support being provided by 1,688 mentors to individuals who have received CTSA education/training funding in the two most recent years. It will be administered biennially to mentors from the two most recent years. The administration will be through a web-based survey sent by email to the last known email address of all individuals who were reported to have served as mentors for trainees and scholars in CTSA Non-Competing Continuation Progress Reports (PHS 2590), supplemented by current information provided by a local CTSA contact (e.g., program administrator). Each local CTSA contact will be asked to provide supplementary information for mentors who have joined the institution since the last Non-Competing Continuation Progress Reports (PHS 2590) were submitted in spring 2010. The web link to the survey will be accompanied by an introductory email that explains the purpose of the evaluation, describes the nature of the information being requested, assures the respondents that their answers will be kept private to the extent permitted by law, provides a respondent-specific password, and requests that the survey be completed on the web within two weeks of receipt. For each biennial administration, we are expecting approximately 1,350 complete responses (i.e., responses from about 80 percent of mentors sampled) based on experience with similar surveys and consultation with CTSA consortium evaluation experts.

The mentor survey has four main sections:


Section I asks them about their training and preparation for mentoring associated with the CTSA-funded education/training program.

Section II asks them about their mentoring experiences, their views on the usefulness of the CTSA in mentoring trainees and scholars, and the numbers of individuals they have mentored.

Section III asks them to identify, for up to two mentees, the type of award, the duration of the mentoring relationship, the frequency and content of their meetings, the support they provide as mentors, the professional activities in which the mentee has engaged, and views about the mentee’s preparedness to conduct research.

Section IV asks about mentors’ demographics and primary employment, and asks them to classify their translational research.



Copies of the cover email and a paper-based version of the mentors survey are provided in Attachment 1, Part D.


In the remainder of this section we provide an overview of the questions the study will examine, the methodologies that will be used to address each question, and the specific information to be contributed to each question by the two study components whose surveys are submitted for clearance under this request. In addition, we provide illustrations of the types of conclusions that might be drawn based on the measures that result. The evaluation addresses the following questions:


Question 1: Are CTSAs increasing the quality of clinical and translational research (e.g., knowledge, tools, and methodologies) and creating more streamlined and cost-efficient processes?




  • Methodologies used to Address the Question: Utilization Study (User and Non-User Surveys), Document Review, Publications Analyses, Field Visits, and Scientific Breakthrough Nominations.

  • Information gathered through the utilization study:

  • The users and non-users surveys that comprise the utilization study will collect information from two groups of researchers at academic medical centers (see above):

  • Content of the survey questions—Section I of the users survey (questions 1 to 19) asks investigators about their interactions with a CTSA, including how they learned about it, resources they used or are considering using, the research for which they used the resources, their satisfaction with the resources, products, publications, and grants that resulted, and any new collaborations that emerged. A comparable section in the non-users survey (questions 3 to 7) asks about investigators’ awareness of the CTSAs’ resources and whether the investigators have considered using any CTSA resources or have plans to do so in the future.

Section II of the users survey (questions 1 to 5) and Section II of the non-users survey (questions 1-8) ask investigators about their background characteristics—type of employer, position, degrees, etc.

  • Illustrative conclusions that might be drawn regarding question 1 from the measures that result

  • Data from the users survey on use of services and their ratings (questions 2A to 8) along with data from the non-users survey (questions 3 to 7) will provide an initial picture of the perceived value of the CTSA resources to investigators. A first step toward improved quality is providing evidence that the resources afforded through the CTSAs are valued and are perceived to be contributing positively to the research enterprise. Usage patterns and reasons for non-usage will inform Congress and program staff about the value to investigators of resource infrastructure investments being made in the CTSAs. Information on early indicators of the quality of the research work being conducted will also be made available to Congress and the program staff through traditional indicators such as publications, patents, and follow-on funding.

Comparisons of data over time will provide information on the extent to which the CTSAs are able to maintain or improve the value of the services they offer, meet the needs of more investigators, and leverage the CTSA investments.

  • Data from the users and non-users surveys on investigator characteristics will provide information to Congress and the program staff about differences between the user and non-user populations. These data will provide important information on the extent to which resources are being accessed by the target populations that the CTSAs are intended to serve, where gaps may exist, and whether there is a need to reach out to individuals from some fields or in some positions.

Comparison of data over time will provide information on the extent to which any important differences persist (or newly emerge) in the characteristics of the user and non-user populations and whether or not the program is adequately addressing any gaps that may have been noted.


Question 2: Are CTSAs enhancing collaborations among institutions, disciplines, and researchers along the clinical and translational research continuum?


  • Methodologies used to Address the Question: Utilization Study (User and Non-User Surveys), Education and Training Study (Scholar/Trainee and Mentor Surveys), Document Review, Publications Analyses, and Field Visits.

  • Information gathered through the utilization study:

  • The users and non-users surveys that comprise the Utilization Study will collect information from two groups of researchers at academic medical centers (see above):

  • Content of questions—Questions 15 to 19 in section I of the users survey ask investigators to provide information on whether or not new collaborations have been fostered through the use of CTSA resources, the nature of the collaborations, and whether they are likely to continue.

Section II of the users survey (Questions 1 to 5) asks investigators about their background characteristics—type of employer, position, degrees, etc.

  • Illustrative conclusions that might be drawn regarding question 2 based on the measures that result

  • Data from the users survey on collaborations that occur in association with the use of CTSA resources will provide an initial picture of the extent to which new collaborations are being fostered by the CTSA program. These data combined with information about the investigators, such as the place of their work along the clinical and translational continuum, will provide a picture of the extent to which investigators with different background characteristics report engaging in such collaborations and whether or not systematic differences exist among groups with different backgrounds.

Comparisons of data over time will provide information on the extent to which the impact on collaborations is maintained, changes or decreases.

  • Information gathered through the education and training study

  • The education and training study will collect information from trainees/scholars and mentors (see above)

  • Content of questions—Question 8 in Section III of the mentor survey asks about whether or not the mentor has collaborated with mentees on publications, presentations, and grant applications.

Question 2 in Section V of the trainee/scholar survey asks whether the mentor helps the trainee/scholar make connections with people in and outside their field and helps with research projects and obtaining research funding.

  • Illustrative conclusions that might be drawn regarding question 2 based on the measures that result

  • Data from the mentor survey regarding collaborations will inform Congress and the program staff of the extent to which the education and training component is promoting collaboration between new and more experienced faculty. Such data provide an early indicator of the extent to which the program, as implemented, is modeling the kind of research collaborations that the CTSA is expected to promote.

  • Comparisons of data over time will provide information on the extent to which the impact on collaborations is maintained, changes or decreases.

  • Data from the trainee/scholar survey on extent to which the mentors assist them in making connections, working together on research and obtaining research funding will provide additional information to Congress and the program staff on the contribution of the CTSAs to creating a collaborative research enterprise and establishing the connections needed for successful clinical and translational research.

Comparisons of data over time will provide information on the extent to which the impact on collaborations is maintained, changes or decreases.










Question 3: Are CTSAs enhancing the capacity to conduct clinical and translational research, including supporting a diverse workforce that is distributed along the clinical and translational research continuum?



  • Information gathered through the education and training study

  • The education and training study will collect information from trainees/scholars and mentors (see above):

  • Content of Questions—trainee/scholar survey

  1. Section I asks them to assess themselves on three types of translational research and their overall interest in a career in the field.

  2. Section II asks them to assess the career preparation provided by their CTSA-funded education/training program.

  3. Section III asks them to identify their career intentions, their research interests, and their views about a career in research.

  4. Section IV asks about various components of their CTSA-funded education/training program, their exposure to team-based research, how long they were in the program and whether they have completed it, and their opinions about the program.

  5. Section V asks about mentoring provided by the CTSA-funded education/training program, including the research backgrounds of the mentors, the help they provided, and their effectiveness.

  6. Section VI asks about applications for research funding for which they may have applied and journal articles published related to their CTSA-funded education/training.

  7. Section VII asks about trainees/scholars’ demographics, educational background, and employment.



  • Content of the questions—mentor survey

  1. Section I asks them about their training and preparation for mentoring associated with the CTSA-funded education/training program.

  2. Section II asks them about their mentoring experiences, their views on the usefulness of the CTSA in mentoring trainees and scholars, and the numbers of individuals they have mentored.

  3. Section III asks them to identify, for up to two mentees, the type of award, the duration of the mentoring relationship, the frequency and content of their meetings, the support they provide as mentors, the professional activities in which the mentee has engaged, and views about the mentee’s preparedness to conduct research.

  4. Section IV asks about mentors’ demographics and primary employment, and asks them to classify their translational research.

  • Illustrative conclusions that might be drawn regarding question 3 based on the measures that result

  • Data from questions in section II on the trainee/scholar survey will provide Congress and the program staff with information on the extent to which new faculty and the potential next generation of faculty perceive that they are acquiring new skills and knowledge from the program. Data from questions in section IV will allow Congress to determine whether or not the education and training component of the CTSA program is reaching a diverse workforce.

Comparisons of data over time will inform the extent to which trainee/scholar perceptions of the benefits of the program are maintained, increase or decrease over time. Comparisons by demographic characteristics will inform Congress and the program of the progress toward the goal of increasing the diversity of the clinical and translational workforce.

  • Data from questions in section II of the mentor survey will provide Congress with information on the extent to which more senior faculty perceive that the education component of the CTSA is increasing their own knowledge related to clinical and translational research.

Comparisons of data over time will inform the extent to which mentors’ perceptions of the benefits of the program are maintained, increase or decrease over time.


Question 4: Are the CTSAs impacting the health of communities and the practices of community clinicians through each of the forms of contributions described above?



  • Methodologies used to Address the Question: Document Review and Field Visits.


Question 5: Is the CTSA consortium providing added value to the program? How does the CTSA consortium influence institutional activities, and how do the institutional activities influence the consortium?


  • Methodology used to Address the Question: Field Visits.





A.3. Use of Information Technology and Burden Reduction

All four surveys for which clearance is being requested will use a modular, Internet-based approach to data collection and quality control to minimize the response burden and maximize accuracy. The selected sample will be contacted by a targeted email and provided with a password-protected web link to the survey. In addition, skip patterns will be incorporated in each survey to minimize response burden. Built-in quality control checks programmed into the system will alert respondents to skipped items and/or inconsistent responses. NIH submitted a Privacy Impact Assessment (PIA) for the system that stores, processes, and transmits all information related to the study including surveys for which clearance is requested (see Attachment 2).
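As a purely illustrative sketch (the actual survey system is not described in this statement, and the question identifiers and rules below are invented for the example), the following Python fragment shows the kind of skip-pattern and quality control logic referred to above: a follow-up question is suppressed when a gating answer makes it irrelevant, and a simple check flags skipped items and inconsistent responses before submission.

    # Hypothetical skip-pattern and consistency checks, for illustration only.
    def applicable_questions(responses):
        """Return the question IDs a respondent should see, given answers so far."""
        questions = ["Q1_used_ctsa_resources", "Q2_which_resources", "Q3_satisfaction"]
        # Skip pattern: if the respondent reports no CTSA resource use,
        # the follow-up questions about resources and satisfaction are skipped.
        if responses.get("Q1_used_ctsa_resources") == "no":
            return ["Q1_used_ctsa_resources"]
        return questions

    def quality_control_alerts(responses):
        """Flag skipped items and inconsistent answers before submission."""
        alerts = []
        for qid in applicable_questions(responses):
            if responses.get(qid) in ("", None, []):
                alerts.append(f"{qid}: response is missing")
        # Consistency check: a satisfaction rating without any resources selected.
        if responses.get("Q3_satisfaction") and not responses.get("Q2_which_resources"):
            alerts.append("Q3_satisfaction answered but Q2_which_resources is empty")
        return alerts

    # Example: a respondent who skipped the satisfaction item is alerted.
    print(quality_control_alerts({"Q1_used_ctsa_resources": "yes",
                                  "Q2_which_resources": ["biostatistics"]}))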



A.4. Efforts to Identify Duplication and Use of Similar Information

Selected respondents may have been surveyed on matters related to clinical and translational research programs by the NIH, by their own CTSA, or by a CTSA consortium committee. However, there is no comprehensive and consistent collection of these data across the CTSAs. Secondary data available through the Non-Competing Continuation Progress Reports (PHS 2590) will be used, for example, to identify research specialties of CTSA users, thereby minimizing response burden and unnecessary duplication of effort.



A.5. Impact on Small Businesses and Other Small Entities

No information is to be collected from small businesses.



A.6. Consequences of Collecting the Information Less Frequently

If the data for this evaluation are not collected, the NIH will be unable to assess the degree to which the early CTSA Initiative is making progress toward meeting its goals to transform the clinical and translational research enterprise. In the absence of the data that will be collected biennially through the surveys described herein, the NIH will fail to fulfill the congressional expectations included in funding appropriations and the NIH Reform Act of 2006. More specifically, without the data described in this request for clearance, NCRR will be unable to document the processes and early outcomes for the first four cohorts of CTSA award recipients with respect to the utilization of resources and the success of the education and training component of the CTSA Initiative. In addition, without collecting this information, consistent, early data for assessing the progress of the Initiative over time will not be available to evaluate the CTSA Initiative in future years.



A.7. Special Circumstances Relating to the Guidelines in 5 CFR 1320.5

The data collection is in compliance with 5 CFR 1320.5.


A.8. Comments in Response to Federal Register Notice and Efforts to Consult Outside Agency

In developing the four survey instruments included in the National Evaluation of the CTSA Initiative, experts in evaluation at NCRR and the National Cancer Institute, CTSA program officials, and evaluators at CTSA institutions were consulted. Their names and contact information are provided in Supporting Statement B, Section B.5.



To further refine the surveys and minimize response burden, cognitive testing5 (a form of pilot testing) was undertaken. Two experts in cognitive testing were consulted and a total of 23 informants at CTSA institutions (detailed in Exhibit 1) were interviewed by telephone. Informants were identified by CTSA site evaluators and agreed to participate.


Exhibit A1. Summary of Cognitive Testing Informants


Respondent type | Round 1 | Round 2
User | 5 | 2
Nonuser | 2 | 3
Trainee/scholar | 7 | –
Mentor | 4 | –

As shown in Exhibit 1, we conducted two rounds of cognitive testing on the users and nonusers surveys and one round of cognitive interviews on the trainees/scholars and mentors surveys. For all four surveys, review of the instruments by cognitive testing experts was conducted prior to testing to enhance the clarity of language used (e.g., eliminating double-barreled questions), support consistency across surveys, and reduce response burden. After Round 1, revisions were made to the users and nonusers surveys and to content common to the trainees/scholars and mentors surveys. Revisions to all four surveys have been made to reflect informants’ specific suggestions during cognitive interviews and analysis of the results across all of the interviews conducted for each group. Contact information for the cognitive testing experts is provided in Supporting Statement B, Section B.5.


This proposed information collection was published at 75 Federal Register 198, pp. 62543–62544, on October 12, 2010 and allowed 60 days for public comment. No comments were received.


A.9. Explanation of Any Payment or Gift to Respondents

No payments or gifts will be offered.



A.10. Assurance that Data Will Be Kept Private to the Extent Permitted by Law

The primary data collected in all four surveys will be kept private to the extent permitted by law and maintained in accordance with the requirements of the Privacy Act of 1974. A Privacy Impact Assessment (PIA) was submitted for the CTSA Study Management System that stores, processes, and transmits all information related to the study, including surveys for which clearance is requested (see Attachment 2). Data containing personally identifiable information (PII) will be available only to three designated project staff members who have signed a pledge to keep data private to the extent permitted by law, possess up-to-date certificates attesting to their having completed Westat’s Human Subjects and Data Security trainings, have received background checks and clearance from the NIH, and possess up-to-date certificates attesting to their having completed NIH Human Subjects and Data Security trainings.


In the cover emails that will accompany the surveys, respondents will be informed about the procedures used to keep data private to the extent permitted by law and secure. They will also be told that their participation is voluntary and that they may choose not to provide responses to individual questions and still complete the survey. For all surveys, data maintained in electronic form will be stored on a password-protected server accessible only to designated project staff. Alphanumeric identifiers will be used in place of names to protect respondents from being identified. Evaluation results will be reported in aggregated form and will include no information that will enable identification of specific individuals. We will also make every effort to conceal the specific institutional affiliations associated with all responses. Westat’s Institutional Review Board (IRB) has deemed this to be a program evaluation posing minimal risk and therefore exempt from review pursuant to 45 CFR 46.101(b)(2) (see Attachment 3). The IRB reviews all studies involving research on human subjects to ensure that appropriate procedures are in place to maintain the integrity of the data and address possible disclosures outside of the research context.


The only personally identifying information (PII) requested on the users and nonusers surveys is information about employment and educational background. This information is generally publicly available.


Some additional PII are requested from trainees/scholars, namely demographic data and grant/award applications, as well as current employment and educational background. Mentors are asked to provide demographic information in addition to current employment and educational background. Security provisions have been made to keep all data private to the extent permitted by law, and the risk of accidental release of these data is minimal.



A.11. Justification of Sensitive Questions

All four surveys include items requesting data that may be construed as moderately sensitive, namely employment and educational background and, in the case of trainees/scholars and mentors, demographic data. This information is crucial to effectively evaluate the outcomes of the CTSA Initiative. As noted in Section A.10, alphanumeric identifiers will be used in place of names to protect respondents from being identified and eliminate the danger that any unintentional disclosure may enable personally identifiable information (PII) to be connected with sensitive information. In addition, all the safeguards discussed in Section A.10 will apply to these data.



A.12. Estimates of Hour Burden Including Annualized Hourly Costs

In keeping with the NIH’s program goals, the survey instruments will be administered using methods designed to collect essential evaluation data with the least possible burden to respondents. As detailed below, estimated response time for each respondent is 15 minutes for the users survey, 5 minutes for the nonusers survey, 20 minutes for the trainees/scholars survey, and 15 minutes for the mentors survey. The estimated total annualized response burden per administration is 451.5 person-hours, and the estimated annualized total cost to respondents is $14,056. These estimates are based on responses from cognitive testing informants, discussions with NCRR program officers, experience in implementing similar surveys, and information found in the Bureau of Labor Statistics, U.S. Department of Labor, Occupational Outlook Handbook, 2008-09 Edition.


A.12.1. Number of Respondents, Frequency of Response, and Annual Hour Burden

The estimated total annualized response burden is 451.5 person-hours. Burden hours are based on estimates made by cognitive testing informants, who were asked explicitly during cognitive testing how long it took them to read the instructions and respond to the survey in preparation for the telephone interview. We also considered information obtained through discussions with NCRR program officers and experience in implementing similar surveys.



A.12.2. Hour Burden Estimates by Each Module and Aggregate Hour Burdens

Exhibit 2 presents the calculations used to determine the overall response burden per administration, the estimated number of respondents, and the annualized response burden. As described in the Study Overview and in Supporting Statement B, for each biennial administration we are expecting a total of 3,563 respondents: 500 respondents for the users survey, 500 for the nonusers survey, 1,213 for the trainees/scholars survey, and 1,350 for the mentors survey. Please note that no individual will receive more than one survey. While a person may appear in the population for more than one survey, we will assign them to only one potential respondent pool. That is, if an individual appears on the investigator list and the mentor list, or the investigator list and the trainee/scholar list, s/he will be removed from the list of investigators. This decision is based on the fact that the investigator list is substantially larger than the lists of trainees/scholars and mentors.


In calculating the frequency of response we have assumed that the surveys will be administered every other year. Thus, the annualized frequency of response is .5.


Exhibit A2. Calculations used to estimate annualized response burden for the National Evaluation of the CTSA Initiative

Respondent type | Estimated number of respondents | Estimated number of hours per respondent | Frequency of response | Estimated total annual burden hours requested
Users survey | 500 | .25 | .5 | 62.5
Nonusers survey | 500 | .08 | .5 | 20
Trainees/scholars survey | 1,213 | .33 | .5 | 200
Mentors survey | 1,350 | .25 | .5 | 169
Total | | | | 451.5
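As a hedged illustration (not part of the formal burden computation), the short Python sketch below reproduces the arithmetic in Exhibit A2: the number of respondents multiplied by the hours per response and by the annualized frequency of .5. Small differences from the exhibit reflect rounding.

    # Illustrative check of the Exhibit A2 arithmetic (values copied from the exhibit).
    surveys = [
        ("Users survey", 500, 0.25),
        ("Nonusers survey", 500, 0.08),
        ("Trainees/scholars survey", 1213, 0.33),
        ("Mentors survey", 1350, 0.25),
    ]
    FREQUENCY = 0.5  # biennial administration, annualized

    total = 0.0
    for name, respondents, hours in surveys:
        burden = respondents * hours * FREQUENCY
        total += burden
        print(f"{name}: {burden:.1f} annual burden hours")
    print(f"Total: {total:.1f} annual burden hours")  # approximately 451.5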



A.12.3. Estimates of Cost to Respondents for the Hour Burdens

The total annualized cost to respondents, per administration, is estimated to be $14,056. The hourly wage rates are based on information found in Bureau of Labor Statistics, U.S. Department of Labor, Occupational Outlook Handbook, 2008-09 Edition, Medical Scientists, retrieved from http://www.bls.gov/oco/ocos309.htm. There are four classes of respondents: users, nonusers, trainees/scholars, and mentors. Users, nonusers, and mentors are assumed to be medical scientists at general medical and surgical hospitals. Trainees/scholars are assumed to be medical scientists at colleges/universities/professional schools. Because data for postdoctoral medical scientists are not available, we believe this to be an overestimate for this class of respondents. Calculations are shown in Exhibit 3.


Exhibit A3. Calculations used to estimate annualized cost burden for the National Evaluation of the CTSA Initiative


Respondent type | Calculation (# of respondents x hourly wage x hours to complete x frequency of response) | Cost burden
Users survey | 500 x $35.69 ($74,230/year, medical scientist at general medical and surgical hospitals) x .25 hours x .5 | $2,231
Nonusers survey | 500 x $35.69 ($74,230/year, medical scientist at general medical and surgical hospitals) x .08 hours x .5 | $714
Trainees/scholars survey | 1,213 x $25.42 ($52,880/year, medical scientist at colleges/universities/professional schools) x .33 hours x .5 | $5,088
Mentors survey | 1,350 x $35.69 ($74,230/year, medical scientist at general medical and surgical hospitals) x .25 hours x .5 | $6,023
Total | | $14,056
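Similarly, the Python sketch below (illustrative only, with figures copied from Exhibit A3) reproduces the annualized cost arithmetic: respondents multiplied by the hourly wage, the hours per response, and the annualized frequency of .5. The total differs from the exhibit only by rounding.

    # Illustrative check of the Exhibit A3 arithmetic (values copied from the exhibit).
    rows = [
        ("Users survey", 500, 35.69, 0.25),
        ("Nonusers survey", 500, 35.69, 0.08),
        ("Trainees/scholars survey", 1213, 25.42, 0.33),
        ("Mentors survey", 1350, 35.69, 0.25),
    ]
    FREQUENCY = 0.5  # biennial administration, annualized

    total = 0.0
    for name, respondents, wage, hours in rows:
        cost = respondents * wage * hours * FREQUENCY
        total += cost
        print(f"{name}: ${cost:,.0f}")
    print(f"Total: ${total:,.0f}")  # approximately $14,056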



A.13. Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers

There are no capital or start-up costs, and no maintenance or service cost components to report.



A.14. Annualized Cost to the Federal Government

The estimated annualized cost to the government for data collection, analysis, and reporting activities is approximately $364,480, as shown in Exhibit 4. These estimates reflect actual expenditures in 2009 and 2010 for instrument development and the contractor’s anticipated costs for data collection, analysis, and reporting based on past experience with similar surveys. This information collection is part of a three-year project with a total cost to the government of $1,881,067. As mentioned above, in calculating the frequency of collection we have assumed that the surveys will be administered every other year; thus, the annualized frequency is .5.



Exhibit A4: Estimated annualized cost to the Federal government for the National Evaluation of the CTSA Initiative


Expenditure | Users survey | Nonusers survey | Trainees/scholars survey | Mentors survey | Total per information collection | Frequency of collection | Annualized cost of information collection
Federal agency personnel | $20,000 | $20,000 | $20,000 | $20,000 | $80,000 | .5 | $40,000.00
Contractor personnel | $174,418 | $182,397 | $164,449 | $160,951 | $682,215 | .5 | $341,107.50
Incentives | $0 | $0 | $0 | $0 | $0 | .5 | $0.00
Computing | $11,241 | $11,648 | $11,445 | $11,236 | $45,570 | .5 | $22,785.00
Copying, postage, telephone | $250 | $500 | $250 | $175 | $1,175 | .5 | $587.50
Total costs | $185,909 | $194,545 | $176,144 | $172,362 | $728,960 | .5 | $364,480.00


A.15. Explanation for Program Changes or Adjustments

This is a new collection of information.



A.16. Plans for Tabulation and Publication and Project Time Schedule

The data are being collected for evaluation purposes. We anticipate that the surveys will have their first administration between February and April 2011. Initial analyses will be completed and included in an interim report by August 2011. The final evaluation report, incorporating analyses of these data and results, will be completed in February 2012. At least one journal article based on these findings will be developed and submitted for publication. A similar timeline and uses are expected for the 2013 administration of the surveys.


Westat is conducting this external evaluation of the CTSA Initiative on behalf of the NIH. After the products are delivered, NIH determines whether the quality of the products merits publication by NIH (i.e., NIH is the exclusive publisher of the information being gathered). Often it is only after seeing the quality of the information delivered by the study that the NIH decides the format (raw or analytical) and manner in which to publish.



A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

Not applicable.



A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions apply.


1 See, for example, the NIH Reform Act of 2006 (Public Law 109–482, 120 Stat. 3677 (2007)), the 2007 Senate Appropriations Committee Report 109-287 (pp. 152–153), and the 2008 House Appropriations Committee Report 110-231 (pp. 164–165).

2The nine institutions in the recently awarded 2010 cohort will not be the subject of this evaluation.

3 Administration; Biostatistics/Epidemiology/Research Design; Clinical Research Ethics; Clinical Research Innovation; Communications; Community Engagement; Education and Career Development; Evaluation; Informatics; Public-Private Partnerships; and Translational Research.

4 Each of the four surveys included in this request for clearance will be administered twice—once in 2011 and once in 2013.

5 Cognitive testing is the one-on-one testing of a survey instrument with respondents to identify any problems that may exist with the survey questions or the survey structure.


