Supporting Statement A revised 2-12-15


Evaluation of the NIH Academic Research Enhancement Award (NIH/OD)

OMB: 0925-0710






Supporting Statement A for:


Evaluation of the NIH Academic Research Enhancement Award (NIH OD)


October 2014







Michelle M. Timmerman, Ph.D.

Office of the Director

National Institutes of Health


6705 Rockledge Drive

Bethesda, MD 20892


Telephone: 301.402.0672

Fax: 301.402.0672

Email: [email protected]















Table of Contents

A. JUSTIFICATION

A.1 Circumstances Making the Collection of Information Necessary

A.2 Purpose and Use of the Information Collection

A.3 Use of Information Technology and Burden Reduction

A.4 Efforts to Identify Duplication and Use of Similar Information

A.5 Impact on Small Businesses or Other Small Entities

A.6 Consequences of Collecting the Information Less Frequently

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agency

A.9 Explanation of Any Payment or Gift to Respondents

A.10 Assurance of Confidentiality Provided to Respondents

A.11 Justification for Sensitive Questions

A.12 Estimates of Hour Burden Including Annualized Hourly Costs

A.13 Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers

A.14 Annualized Cost to the Federal Government

A.15 Explanation for Program Changes or Adjustments

A.16 Plans for Tabulation and Publication and Project Time Schedule

A.17 Reason(s) Display of OMB Expiration Date is Inappropriate

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions




LIST OF ATTACHMENTS:


1. Survey of Awardees with Screenshots

2. Survey of Applicants with Screenshots

3. Survey of Students with Screenshots

4. Semi-structured Interview Guide

5. Privacy Impact Statement

6. Privacy Act Memo

7. IRB Approval Letters

8. Survey Introductory Email

8A. Awardees

8B. Applicants



ABSTRACT


This is a request for OMB to approve, for one year, the new submission titled "Evaluation of the NIH Academic Research Enhancement Award." The AREA program's mission is to support meritorious research, strengthen the research environment of the grantee institution, and expose students to research. Since the program's congressional mandate in 1985, there has been only one limited-scope evaluation, conducted in 1996 (the record of the OMB approval for that evaluation is not available). As a result of that evaluation, significant changes were made to the program; this will be the first evaluation of the AREA program since then. In the past three years alone, the federal government has awarded approximately $78 million annually in AREA grants. The evaluation will allow NIH and Congress to assess the extent to which the AREA program is meeting its goals and to make recommendations so that this significant investment of public funds may be used as effectively as possible.



A. JUSTIFICATION

A.1 Circumstances Making the Collection of Information Necessary

The proposed data collection is the first full-scale evaluation of the Congressionally mandated Academic Research Enhancement Award (AREA) program, which was established by Congress in 1985. The AREA award plays a critical role in advancing NIH's goals of developing and maintaining the scientific workforce: it supports scientists at public and private colleges and universities that historically have received only small amounts of NIH funding, in order to provide high-quality research opportunities to undergraduate and graduate students.

As a result of an initial small-scale evaluation, NIH made significant changes to the AREA program in 1997. AREA grants were no longer treated as stepping stones from which investigators would move on to the larger grant mechanisms used at major research institutions. Instead, AREA grants were reconceived as a means of conducting meritorious small-scale research that may be renewed competitively throughout the course of a scientist's career. Also at this time, the three goals of the program were redefined: 1) to support meritorious research, 2) to strengthen the research environment of the institution, and 3) to expose students to research. The extent to which the program is meeting these goals will be examined through both process and outcome measures.

A feasibility study completed in 2011 helped develop a design for a full program evaluation, using both structured and unstructured data sources to determine the extent to which the program is fulfilling the three goals stated above. This evaluation will provide an invaluable opportunity for National Institutes of Health (NIH) Office of the Director (OD) and Institute and Center (IC) staff to understand the impact of the AREA program, identify gaps, and develop strategies for closing those gaps, enhancing the ability of the program to meet its goals and ensuring that the approximately $78 million spent annually on the program is used as effectively as possible. The evaluation will also provide NIH with much-needed empirical information to share with Congress about its activities.

The Public Health Service Act (42 USC § 241) authorizes the collection of this information. The Public Health Service Act states, “The Secretary shall conduct in the Service, and encourage, cooperate with, and render assistance to other appropriate public authorities, scientific institutions, and scientists in the conduct of, and promote the coordination of, research, investigations, experiments, demonstrations, and studies relating to the causes, diagnosis, treatment, control, and prevention of physical and mental diseases and impairments of man…”

A.2 Purpose and Use of the Information Collection

In addition to providing funding to carry out worthy research projects, the AREA program aims to enhance the overall research environment of the institutions receiving the money and to contribute to the United States' biomedical scientist pipeline by developing students' interest in, and aptitude for, biomedical research. This evaluation will indicate whether the content of the current R15 application materials needs to be revised to provide more concrete guidelines for how applicants might meet the program's goals of providing research opportunities for students and strengthening their institutions' research environments. A prior feasibility study, conducted by Discovery Logic in 2011, recommended a full-scale external evaluation to inform decision making, identify opportunities for improvement, and provide feedback on the performance of the program to key stakeholders.

Existing data sources that will be utilized to evaluate the AREA program include: National Institutes of Health (NIH) IMPAC II (NIH grants applied for and received, Progress Reports and grant Supplements containing names of AREA awardees’ former students) and SPIRES (publications funded by NIH grants), ScienceWire (USDA, NSF, DoD grants received), PubMed/Medline (publications, field of research), Web of Knowledge (publications, citations, impact factors), and IPEDS (characteristics of institutions of higher education including number of faculty per discipline, student population, graduation rate, student disciplinary concentrations).

The existing data sets listed above will yield minimal information on how the grants were implemented at the host institutions and on the contribution of the AREA grants to the career outcomes of students trained in science as a result of these grants. Therefore, we propose to request additional data from a sample of 600 awardees, 600 unsuccessful applicants, and 601 students. To collect information about the extent to which receipt of the grant contributed to promoting the scientific workforce pipeline and enhancing the research environment at the institution where the grant was held, we are utilizing three newly developed web-based surveys: Survey of Awardees (Attachment 1), Survey of Applicants (Attachment 2), and Survey of Students (Attachment 3). A semi-structured interview (Attachment 4) will be carried out with 50 awardees to gather information on how the AREA grant was implemented at the host institutions. The intention of these survey and interview tools is to collect quantitative outcome data and qualitative data on implementation not included in any of the existing databases.

A.2.1 Research Questions


The full-scale evaluation – including the archival data analysis and the new data collection components – will address the following questions:

  1. Is meritorious research being funded?

     a. Do awardees who have received AREA grants have more publications per project and more subsequently funded grants in comparison to a control group of never-successful AREA applicants?

     b. Has awardee productivity increased as a result of the programmatic changes instituted in 1997?

     c. Does the R15 program fund meritorious research projects based on scores?

     d. Do funding trends of the R15 program differ from those of other funding mechanisms within each Institute or Center?

     e. What happens to the research agendas of applicants who are unsuccessful in securing AREA funding? Are they able to obtain research funding from other sources? Do they postpone their research plans indefinitely?

  2. Is the research environment of AREA-funded institutions being strengthened?

     a. Has AREA-funded institutions' research activity increased after receiving an AREA award, as measured by increased publications, number of grant applications and awards, and increased NIH funding?

     b. Does the number of subsequent applications from an AREA-funded institution increase after receiving an award?

     c. Have R15 awards contributed to the increased availability of laboratory experiences for students at the home institution?

  3. Are students benefiting from exposure to research?

     a. How many students have participated in R15 projects over the history of the program?

     b. Are students making meaningful scientific contributions to the AREA-funded research (posters and presentations at conferences, publications)?

     c. Do students undertake post-baccalaureate scientific education and research careers in science?

     d. Are students at AREA-eligible institutions more likely to participate in R15-funded research than in research funded through other means?

     e. Is commitment to student research consistent across types of AREA-eligible institutions (liberal arts colleges, universities, professional schools)?

  4. What has NIH been doing to advance the goals of the AREA program?

     a. Does the review and award process fit the goals of the program?

     b. What was the effect of decentralizing funding in 1997, moving it from an Office of the Director set-aside to allowing individual ICs to fund R15s according to their own policies?

     c. Since 1997, have the success rate, number of grants, and total funds (as a fraction of the total budget) changed, and is the pattern uniform across ICs?

     d. Have the expanded eligibility of institutions (to institutions receiving up to $6 million per year, instead of $3 million) and the raised budget cap ($300K for three years, up from $150K) altered the outcomes for awardees, institutions, and students?

     e. What has NIH done to highlight the availability of AREA funds, and how effective has outreach been in terms of the number and diversity of participating institutions and awardees?

  5. How can NIH ensure that it achieves the goals of the AREA program?

     a. What obstacles stand in the way of the production of high-quality research by awardees at AREA-eligible institutions (where research resources are comparatively modest and faculty have heavy teaching responsibilities)?

     b. How appropriate are R15 awards to the research and mentorship goals of successful and unsuccessful applicants to the AREA program?

     c. What have been the most successful strategies for encouraging meaningful student participation in AREA research?

     d. What have been the most successful strategies for cultivating students' research skills and nurturing student enthusiasm for scientific careers?

The structured web survey topics include the following:

  • AREA application process

  • Research productivity

  • Collaboration with colleagues

  • Subsequent research and funding

  • Students’ contributions to publications, presentations, posters, other dissemination products

  • Roles of students in research project

  • Student education and career outcomes

  • Student perceptions of research experience and benefits

  • Suggestions for program improvement


The semi-structured telephone interview topics include the following:

  • Student involvement in AREA project

  • Mentorship of students

  • Student educational and career outcomes

  • Impact of AREA grants on awardees’ careers

  • Impact of AREA grants on research environment of host institution

  • Suggestions for program improvement


A.2.2 Audiences for Data and Results

There are three main target audiences for this program evaluation: NIH leadership, NIH AREA program administrators, and current and former AREA applicants and awardees. In addition, non-NIH program directors and the research and evaluation communities will benefit from this work. Since most of NIH's Institutes and Centers (ICs) participate in the AREA program, this evaluation has relevance to the entirety of NIH's programs.

NIH Leadership: For the purpose of determining the effectiveness of the AREA program across all ICs and making possible program adjustments, NIH leadership will have access to the final report, aggregate findings, and recommendations. This will include aggregated data and findings from awardees, applicants, and students. Having this information will enable administrators to understand the impact of the program, identify gaps and develop strategies to enhance the ability of the program to fulfill its goals. NIH leadership will share selected data with Congress in reports about the AREA program.

Since every participating IC gives few R15 awards relative to the funding level of its overall extramural grant program, it can be difficult to see within- and across-IC trends in the community of recipient institutions and awardees. Analysis of the R15 program, aggregated across ICs and all years, will provide NIH leadership with the information they need to understand the needs of AREA-eligible awardees, students, and their academic communities, and to develop strategies to address them.

NIH AREA Program Administrators: Several OD AREA Program Administrators and IC AREA Program Administrators are overseeing the evaluation as Project Officers and members of the evaluation committee. The committee of IC AREA Program administrators will receive access to data tables for all comparison groups for which data is available, in addition to the final report and slide presentation. NIH AREA program administrators will participate in the process of considering recommendations about AREA policy and future data collection.

Current and Former AREA Applicants and Awardees: Former awardees and applicants will have access to the final report with aggregate findings and recommendations. For AREA awardees, many of whom are isolated from a larger academic community, it will be invaluable to learn of strategies for research and mentorship success. Individual-level data or detailed data tables will not be provided.

Non-NIH Program Directors: The NIH AREA program administrators have developed a list of names and contact information for individuals who oversee similar programs at other agencies and organizations, for example, the National Science Foundation's Research in Undergraduate Institutions (RUI) program. Findings and lessons from the evaluation will eventually be made public and/or disseminated to such agencies and organizations.

Research and Evaluation Communities: While some members of these communities may be interested in the methodologies used, others may be interested in the outcomes and findings of the evaluation. However, individual-level data will not be provided to members of these communities. Dissemination may take the form of a study report, conference presentations, and publications of study methods and findings in journals.

A.3 Use of Information Technology and Burden Reduction

The evaluation study will utilize two different modes of data collection: web-based surveys and telephone interviews.

In the web-based survey approach, respondents provide data directly, without an interviewer, using a computerized collection system. The advantages of this format are: (1) data are collected uniformly for all respondents, (2) respondents can complete the survey at their convenience, (3) the system navigates through the survey for the respondent, (4) respondents are relieved of the burden of having to mail a survey, (5) disseminating the survey to participants is easy and low-cost, (6) follow-up reminders are also easy and inexpensive, and (7) the resulting data are automatically available in an electronic database upon completion of the administration, with no wait time before analysis can begin. In addition, information may be obtained at any point during the administration on the number of respondents who initiated the survey but did not complete and submit it. Challenges include obtaining an email address for each respondent, as well as a potential lack of familiarity with online surveys among some respondents (expected to be very rare in this highly educated and internet-savvy population).

Westat will use an internally maintained Commercial Off-the-Shelf (COTS) web survey platform. Integrated into Westat systems, the platform supports the collection of web responses and the creation of reports on completed and outstanding responses. Data on web responses (including bounce-backs, cooperation rates, and survey initiation rates) will feed activity reports that help project staff monitor fielding and perform respondent contact-information 'tracing' to increase response rates.
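The activity-report metrics mentioned above (bounce-backs, cooperation rates, survey initiation rates) reduce to simple ratios. The sketch below is illustrative only; the function name, field names, rate definitions, and counts are assumptions for exposition, not the platform's actual API.

```python
def activity_report(invited, bounced, initiated, completed):
    """Compute simple fielding metrics of the kind an activity report
    might contain. All definitions here are illustrative assumptions."""
    delivered = invited - bounced  # invitations that did not bounce back
    return {
        "delivered": delivered,
        "initiation_rate": initiated / delivered,   # started the survey
        "cooperation_rate": completed / delivered,  # finished the survey
        "breakoff_rate": (initiated - completed) / initiated,
    }

# Hypothetical fielding counts for one survey wave
report = activity_report(invited=600, bounced=40, initiated=500, completed=480)
```

A report like this, refreshed during fielding, would let project staff decide when to send reminder emails or begin tracing out-of-date contact information.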

The telephone interviews will be conducted by trained interviewers using a Semi-Structured Interview Guide (Attachment 4). Each interview will be audio-recorded to a digital file labeled with the respondent's ID number; each audio file will have a standardized unique file name, "AREA-##", where ## is the unique ID number for the interview respondent (e.g., AREA-99). All audio files will be sent via secure File Transfer Protocol (FTP) to a transcription service, Casting Words, for transcription. Interview transcripts will be stored as Word documents using the same "AREA-##" naming convention; the file name extension will distinguish the audio file (.wav) from the Word document (.docx). Management and analysis of the interview data, including monitoring activity and response rates, will be performed in Excel.
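The "AREA-##" naming convention above can be sketched as a small helper. This is illustrative only; the two-digit zero-padding is an assumption based on the "##" placeholder, not a documented project rule.

```python
def interview_file_names(respondent_id):
    """Return the (audio, transcript) file names for an interview respondent,
    following the "AREA-##" convention described in the text. Two-digit
    zero-padding is an assumption based on the "##" placeholder."""
    stem = f"AREA-{respondent_id:02d}"
    return f"{stem}.wav", f"{stem}.docx"

# Respondent 99 yields matching audio and transcript names
audio, transcript = interview_file_names(99)  # "AREA-99.wav", "AREA-99.docx"
```

Keeping the stem identical across the .wav and .docx files is what lets the audio recording and its transcript be paired without any personally identifying information in the file name.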

The advantage of conducting interviews in any data collection is that they allow for the collection of in-depth qualitative data that often helps fill in the gaps and provide explanations not possible to collect via any structured data collection method – such as web surveys. The advantage of using telephone interviews is mainly that they allow for cost-effective data collection from respondents who reside across the United States and also allow respondents to participate in the study at a time convenient for them.

A Privacy Impact Assessment (PIA) has been drafted and is under review at the Department of Health and Human Services (Attachment 5).

A.4 Efforts to Identify Duplication and Use of Similar Information

The AREA program was last evaluated in 1996. At that time, the main objectives of the evaluation were to decide whether NIH should increase the awards from $75,000, permit renewals of grants, allow for an extension of the length of awards, and make changes to the application process. Researchers conducted a brief survey of AREA grantees (49 PIs from 30 institutions) and examined extant data, including publications of the grantees and annual expenditures of AREA and other comparable research programs. Findings from this evaluation resulted in significant philosophical and procedural changes to the program, as well as the shift in program objectives. Since those changes, no evaluation of any kind has been conducted of the AREA program; this will be the first full-scale evaluation. A comprehensive evaluation feasibility study of the AREA program, conducted by Discovery Logic in 2011, indicated the need for this program evaluation and confirmed that it would not duplicate existing information or previous evaluations; no data at all have been collected from grantees since 1996, and even then data collection was quite limited (49 PIs from 30 institutions).

A.5 Impact on Small Businesses or Other Small Entities

No small entities will be involved in this evaluation. All respondents will be individuals who participate voluntarily.

A.6 Consequences of Collecting the Information Less Frequently

This information collection is a one-time collection. There will be follow-up activities to those respondents who do not complete the survey within two weeks, but the survey will be implemented only once. Similarly, the semi-structured interviews will be conducted once with each respondent selected to participate.


A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This evaluation is consistent with the information collection guidelines in 5 CFR 1320.5.

A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agency

The 60-Day Federal Register notice soliciting comments on this study prior to initial submission to OMB was published on July 9, 2014 (Vol. 79, p. 38921). No public comments were received.

The web-based survey instruments and the semi-structured interview guide were developed through NIH's collaboration with Westat Incorporated, based on the foundation provided by Discovery Logic's feasibility study, which helped inform the full-scale evaluation. In 2011, NIH's Office of Extramural Programs carried out a feasibility study, drawing on expertise from a steering committee of program staff from several of NIH's 27 Institutes and Centers (ICs) and leveraging Discovery Logic's knowledge of data sources for evaluation of research and training programs. After completing the feasibility study, and after consultation with AREA R15 program staff and the feasibility study steering committee, Discovery Logic recommended that NIH carry out a full-scale evaluation of the AREA R15 program using both structured and unstructured data sources to assess the program goals and impacts in the four study group areas (i.e., Meritorious Research - PI level, Strengthen Research at Institution/Administrative Unit, Student, Program).

NIH and Westat distilled the key themes and questions that NIH was interested in asking in the full-scale study, and Westat developed the instruments.

The NIH project team has consulted with:

  • Rebekah Rasooly, Ph.D., NIDDK Program Director

[email protected], 301-594-6007


  • Erica Brown, Ph.D., Extramural Policy Coordination Officer

[email protected], 301-402-1081


  • Sherry Mills, MD, MPH, Director, OEP

[email protected], 301-435-2729


  • Jean Chin, Ph.D., NIGMS Program Director

[email protected], 301-594-0828


  • Christopher Platt, Ph.D., NIDCD Program Director

[email protected], 301 496 1804


  • Robin Wagner, Ph.D., Chief, Division of Information Services

[email protected], 301-443-5234


A.9 Explanation of Any Payment or Gift to Respondents

This information collection does not involve payment or gifts to respondents.


A.10 Assurance of Confidentiality Provided to Respondents

Potential participants will receive email notifications, invitations, and reminders announcing the evaluation, explaining its purpose, detailing the topics the survey or interview will cover, describing the voluntary nature of participation, and stating that the data will be kept private to the extent provided by law. The procedures for contacting potential participants are designed to ensure compliance and maximize response rates.

All proposed emails include language stating that all collected information will be kept private and not disclosed in any identifiable form to anyone but the researcher conducting the study, except as otherwise required by law. Individuals who choose to participate will be providing implicit consent by their participation. This information collection is covered by the NIH Privacy Act Systems of Record 09-25-0156, “Records of Participants in Programs and Respondents in Surveys Used to Evaluate Programs of the Public Health Service, HHS/PHS/NIH/OD.” All study personnel will adhere to the provisions stipulated within that announcement (Attachment 6).

As the target population is not vulnerable and the questions are not personal or intrusive, risks to participants are expected to be minimal. Westat IRB approval was received on January 24, 2014 and updated on August 28, 2014 (Attachment 7). Because the researchers are all Westat staff, submission to the NIH IRB is not necessary.

Study personnel have obtained proper security clearances and are required to adhere to strict professional survey standards and have signed a non-disclosure agreement as a condition of their employment. Web-based, computer-based and any hard copy data collection forms will be maintained in a secure area for receipt and processing. All data files on multi-user systems will be under the control of a database manager and will be subject to controlled access only by authorized personnel. Personally identifiable information (PII) will be maintained separately from completed data collection forms, and from computerized data files used for analysis. Final reports will be based on aggregate data in which individuals are not identified.

After the data collection is completed, all hard copy collected information and study materials will be stored in a locked, secure facility for two years, and then will be shredded. Electronic data will be password protected and stored by the data management contractor, and will also be destroyed after two years.

A.11 Justification for Sensitive Questions

Personally identifiable information (PII) is collected in the form of the participant's name, email address, and phone number, which are needed to contact potential participants. This information will be obtained from archival sources when possible; in some cases it will need to be updated by study staff using internet searches. Additionally, sensitive information will be collected in the form of gender, race, and ethnicity as part of the survey data collection.

The survey of AREA awardees asks respondents to provide the name, email address, and last known employer or higher education institution of the students who worked on their AREA-funded research. This information is critical to the evaluation because part of the evaluation is surveying former students to determine whether participation in the AREA-funded project had an impact on their career path. Since Congress has mandated that R15s be used to train students in biomedical and behavioral science research in order to increase the biomedical and behavioral scientist workforce, it is essential to find out whether participation in AREA-sponsored research is having an impact on students' careers. Therefore, we must collect PII from awardees about students who participated in their AREA-funded research. To the extent possible, and to minimize burden on participants, we will also obtain student names from past AREA grant Final Progress Reports (FPRs). However, FPRs are available for less than 24% of the awards made from 1985 through FY 2010, and even among the students named in the available FPRs, we anticipate failing to trace a very large percentage – over 50%.

Participation in the study is voluntary and participants have the right not to answer any questions without consequences. Section A.10 discusses the steps taken to safeguard this information.

A.12 Estimates of Hour Burden Including Annualized Hourly Costs

Data collection activities differ by type of respondent. Awardees will complete a web-based survey as well as the semi-structured interviews; AREA students and AREA applicants will each complete a web-based survey. All data collection will be completed within one year. The estimated completion time is 30 minutes for the awardee web survey, 20 minutes for the student web survey, and 20 minutes for the applicant web survey. The estimated time for completing the semi-structured interview with awardees is 45 minutes. We also estimate contacting approximately 13 awardees who will decline the interview; for these individuals, we estimate the burden to be about 3 minutes. The estimated annualized burden, a total of 459 hours, is summarized in Table A.12-1. Please note that this table lists the number of expected respondents, not the total sample of each type of respondent.

Table A.12-1. Estimates of Annualized Burden Hours

Type of Respondents            | No. of Respondents | No. of Responses per Respondent | Average Burden per Response (in hours) | Total Annual Burden Hours
Web Survey – Awardees          | 480   | 1 | 30/60 | 240
Interviews – Awardees          | 50*   | 1 | 45/60 | 38
Declined Interviews – Awardees | 13    | 1 | 3/60  | 1
Web Survey – Students          | 301   | 1 | 20/60 | 100
Web Survey – Applicants        | 240   | 1 | 20/60 | 80
Totals                         | 1,021 |   |       | 459

*50 awardees to take part in semi-structured interviews will be chosen using convenience sampling from among the 480 awardees we expect to respond to the web survey.
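The totals in Table A.12-1 follow directly from the respondent counts and per-response times stated above. A minimal arithmetic check, assuming each group's hours are rounded to the nearest whole hour as the table appears to do:

```python
# (expected respondents, minutes per response) for each group in Table A.12-1
groups = {
    "web_awardees":   (480, 30),
    "interviews":     (50, 45),
    "declines":       (13, 3),
    "web_students":   (301, 20),
    "web_applicants": (240, 20),
}

# Convert each group's burden to hours and round to a whole hour
burden_hours = {name: round(n * minutes / 60) for name, (n, minutes) in groups.items()}

# 240 + 38 + 1 + 100 + 80 = 459 annualized burden hours
total_hours = sum(burden_hours.values())
```

The 38-hour figure for interviews comes from rounding 37.5 hours (50 x 45 minutes) up, and the 1-hour figure for declines from rounding 0.65 hours (13 x 3 minutes) up.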


The cost burden to respondents is the time required to read the instructions and complete the survey and, in the case of interviewed awardees, to speak with the interviewer on the phone. The total annualized cost to respondents is estimated to be $19,698.58, calculated at $43.01 per hour: the average of the hourly wage rates for doctoral-level biochemists and biophysicists ($44.06), medical scientists ($42.98), and post-secondary physical science teachers ($41.99) (U.S. Department of Labor, Bureau of Labor Statistics, 2013, http://www.bls.gov/oes/current/oes_nat.htm#19-0000). The costs are summarized in Table A.12-2.

Table A.12-2. Estimated Annualized Cost to Respondents

Type of Respondents     | Number of Respondents | Total Annual Burden Hours | Hourly Wage Rate | Total Respondent Cost
Web Survey – Awardees   | 480 | 240 | $43.01 | $10,322.40
Interviews – Awardees   | 50  | 38  | $43.01 | $1,634.38
Web Survey – Students   | 301 | 100 | $43.01 | $4,301.00
Web Survey – Applicants | 240 | 80  | $43.01 | $3,440.80
Total                   |     |     |        | $19,698.58
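The blended hourly rate and the per-group costs are simple averages and products. A minimal check, using the BLS wage rates cited in the text:

```python
# BLS 2013 hourly wages cited above: biochemists/biophysicists,
# medical scientists, post-secondary physical science teachers
wages = [44.06, 42.98, 41.99]
rate = round(sum(wages) / len(wages), 2)  # blended rate used in the cost table

# Cost for a respondent group = total annual burden hours x blended rate
awardee_survey_cost = round(240 * rate, 2)  # 480 respondents x 0.5 hours each
student_survey_cost = round(100 * rate, 2)  # 301 respondents x ~20 minutes each
```

The same hours-times-rate product applies to each row of the cost table.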


A.13 Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers

There are no direct costs to respondents other than their time to participate in the study.

A.14 Annualized Cost to the Federal Government

The largest cost to the federal government is sub-contractor fees of $275,333 to conduct the study and deliver the analysis. Contractor fees for one year total $40,224. OD and other NIH staff costs are based entirely on labor. It is estimated that the study will require about 0.4 FTE total per year, spread mostly over 3 NIH Health Scientist Administrators at the GS-14 level or above. These expenses are for directing contractors and sub-contractors, overseeing and solving problems as they arise, developing materials, and supervising data collection. The estimated annualized cost to the Federal Government is $362,557, summarized in Table A.14-1.


Table A.14-1. Annual Cost to the Federal Government

Item                                                                Annual Cost
Sub-contractor costs                                                $274,511
Contractor costs                                                    $20,224
OD personnel subtotal                                               $8,556
  Health Scientist Administrator, OD, GS-14/5 (6% of $120,429)      $7,223
  Health Scientist Administrator, OD, GS-15/3 (1% of $133,328)      $1,333
Other NIH personnel subtotal: Health Scientist Administrator,
  NIDDK, GS-15/10 (6% of $157,100; detailee to OD)                  $9,426
GRAND TOTAL                                                         $321,273
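Each personnel line above is an FTE fraction applied to an annual salary; as a sketch, using the NIDDK detailee figures from the table:

```python
# Personnel cost = FTE fraction x annual salary, per the Table A.14-1 line items.
salary = 157_100     # Health Scientist Administrator, NIDDK, GS-15/10
fte_fraction = 0.06  # 6% of one FTE devoted to the study per year
cost = round(salary * fte_fraction)
print(cost)  # 9426
```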


A.15 Explanation for Program Changes or Adjustments

This is a new information collection.



A.16 Plans for Tabulation and Publication and Project Time Schedule

The web-based survey and interview information collection will begin within two months of obtaining OMB approval. The contract period will include fielding, analyzing, and disseminating findings from these studies. Westat, Inc. will be responsible for preparing the analytic databases resulting from the study. The timetable for the data collection is shown below, in Table A.16-1.

Table A.16-1. Project Time Table

Activity                                              Timeline
Introductory e-mail sent to survey respondents        1-2 months after OMB approval
Invitation e-mail sent to survey respondents          1 week following introductory e-mail
Field web surveys                                     2-3 months after OMB approval
Introductory e-mail sent to potential interviewees    3-4 months after OMB approval
Confirmation phone call to interviewees               Within 36 hours of interviewee response
Conduct semi-structured interviews                    3-6 months after OMB approval
Complete field work                                   5-6 months after OMB approval
Validation                                            7-8 months after OMB approval
Analyses                                              8-9 months after OMB approval
Publication                                           11 months after OMB approval


A.16.1 Analysis of the Study Data

Many of the web survey analyses will consist of descriptive statistics (e.g., percentages, means, medians, and standard deviations, as appropriate), cross-tabulations, and graphical summaries. Tests of significance will be conducted using chi-squared tests, t-tests, or analysis of variance (ANOVA) to examine group differences. The analysis will be performed in SAS using the SURVEY procedures (e.g., PROC SURVEYFREQ, PROC SURVEYMEANS) with the non-response-adjusted weight. See Table A.16-2 for a list of research objectives and associated measures.
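The production analysis will run in SAS; purely to illustrate the kind of weighted cross-tabulation comparison involved, the sketch below computes a design-naive Pearson chi-squared statistic on a hypothetical 2x2 table of weighted counts (all numbers invented for illustration; the SAS survey procedures additionally account for the sampling design when estimating variances):

```python
# Illustrative only: design-naive Pearson chi-squared on hypothetical
# non-response-weighted counts (awardees vs. applicants by a yes/no outcome).
observed = [
    [30.0, 10.0],  # awardees:   weighted yes / no counts (hypothetical)
    [20.0, 20.0],  # applicants: weighted yes / no counts (hypothetical)
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        # Expected count under independence of row and column factors.
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - expected) ** 2 / expected

print(round(chi2, 3))  # 5.333 (1 degree of freedom for a 2x2 table)
```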

Table A.16-2. Research Objectives and Measures for the AREA Evaluation

Research objective: Evaluate the extent to which AREA Program objectives are met
Measures: Compare AREA awardees and unsuccessful applicants regarding:
  • Quantity and quality of student work on projects
  • Extent of within-institution collaboration
  • Record of subsequent funding

Research objective: Evaluate existing AREA Program outreach efforts
Measures: Unsuccessful applicants' experiences with the application process

Research objective: Understand how program objectives are implemented by awardees (successes and obstacles)
Measures: Awardee semi-structured interviews about the research and mentorship experience at their institution

Research objective: Determine the impact of the AREA Program on student career paths
Measures: Student degrees earned, employment type, engagement in a scientific career, subjective value of research experience


A.16.2 Products of the Study

Products of the evaluation will include a full final report detailing the methodology, key findings, conclusions, and recommendations. Data in the report will be presented in user-friendly graphs and tables accompanied by textual analysis. An executive summary will be developed to concisely convey the salient findings to a general audience. Additionally, a PowerPoint slide deck highlighting the key findings, with graphs, tables, and other relevant information, will be prepared for presentations.

A.16.3 Dissemination of Results

Intended audiences for the results of the evaluation include NIH OD staff, AREA representatives in each participating IC, and AREA awardees, applicants, and future applicants. The findings of this study will be disseminated to the larger scholarly community and presented at education- and research policy-oriented scientific conferences. The results will also be shared with the broader planning and evaluation community. A copy of the final report will be provided to the NIH Evaluation Office.

A.16.4 Use of Results

The results will be used to determine the extent to which the AREA program is meeting its three main objectives: 1) to support meritorious research, 2) to strengthen the research environment of the institution, and 3) to expose students to research. Results from this program evaluation will provide NIH with the opportunity to understand the impact of the program, identify any gaps, and develop strategies for closing those gaps in order to enhance the ability of the program to meet its goals and to further enhance its role in promoting the scientific workforce pipeline in the U.S.

A.17 Reason(s) Display of OMB Expiration Date is Inappropriate

The web-based surveys and semi-structured interview guide will not require an exemption from displaying the OMB approval expiration date. Any reproduction of the data collection instruments will prominently display the OMB approval number and expiration date. Awardees interviewed by phone will receive a copy of the OMB expiration date in the same packet as the consent form.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

The AREA Program Evaluation Study does not require any exceptions to the Certification for Paperwork Reduction Act Submissions (5 CFR 1320.9).





File type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Martha Palan
File created: 2021-01-26
