Evaluation of the Enhancing Diversity of the NIH-funded Workforce Program for the National Institute of General Medical Sciences (NIGMS)

OMB: 0925-0747





Supporting Statement A for



National Institutes of Health

Evaluation of the Enhancing Diversity of the NIH-funded Workforce Program for the National Institute of General Medical Sciences




May 31, 2016














Michael Sesma, Ph.D.

Chief, Postdoctoral Training Branch

Division of Training, Workforce Development, and Diversity

Diversity Program Consortium

National Institute of General Medical Sciences

45 Center Drive, 2AS43H

Bethesda, MD 20892

Phone: (301) 594-2772

Email: [email protected]




List of Attachments


Attachment 1: RFA-RM-13-016 (Building Infrastructure Leading to Diversity (BUILD))


Attachment 2: RFA-RM-13-017 (National Research Mentoring Network (NRMN))


Attachment 3: RFA-RM-13-015 (Coordination & Evaluation Center)


Attachment 4: Hallmarks of Success


Attachment 5: BUILD Interventions Summary Table


Attachment 6: BUILD Logic Models


Attachment 7: NRMN Logic Model


Attachment 8a: HERI Freshman Survey (On-line Version)


Attachment 8b: HERI Freshman Survey (Paper Version)


Attachment 9: HERI Transfer Student Survey


Attachment 10: HERI Your First College Year


Attachment 11: HERI College Senior Survey


Attachment 12: BUILD Student Annual Follow-up Survey


Attachment 13: HERI Faculty Survey


Attachment 14: BUILD Faculty Annual Follow-up Survey


Attachment 15: Mentee Mentor Assessment


Attachment 16: NRMN Data Warehouse Baseline Data


Attachment 17: NRMN Faculty/Mentor Core Follow-up Survey


Attachment 18: NRMN Student/Mentee Core Follow-up Survey


Attachment 19: NRMN Mentor Skills Module


Attachment 20: NRMN Research & Grant Writing Module


Attachment 21: NRMN Coaching Training Module


Attachment 22: NRMN Institutional Context Module


Attachment 23: BUILD Site Visit & Case Studies Protocol


Attachment 24: NRMN Site Visit & Case Studies Protocol


Attachment 25: BUILD Institutional Records & Program Data Requests


Attachment 26: BUILD Implementation Reports


Attachment 27: Coordination & Evaluation Center (CEC) Tracker Security Overview


Attachment 28: Advisory Committee to the Director – Working Group on Diversity in the Biomedical Research Workforce Membership


Attachment 29: NIH Privacy Act Officer Correspondence


Supporting Statement for the

Paperwork Reduction Act Submission



National Institutes of Health (NIH)

Evaluation of the Enhancing Diversity of the NIH-funded Workforce Program


A: Justification


This is a NEW OMB application request for a three-year clearance to conduct an evaluation of the NIH Common Fund’s “Enhancing the Diversity of the NIH-funded Workforce Program.” This program is a national consortium composed of three integrated initiatives:

  1. Building Infrastructure Leading to Diversity (BUILD) Initiative (RFA-RM-13-016; see Attachment 1),

  2. National Research Mentoring Network (NRMN) (RFA-RM-13-017; see Attachment 2) and

  3. Coordination and Evaluation Center (CEC) for Enhancing the Diversity of the NIH-Funded Workforce Program (RFA-RM-13-015; see Attachment 3).


This OMB application requests a 3-year clearance for data collection efforts that are needed in order to meet the requirement for formal evaluation of the BUILD and NRMN initiatives under the NIH-funded cooperative agreement awarded to the Coordination and Evaluation Center at the University of California, Los Angeles (UCLA; RFA-RM-13-015). To complete the required evaluation, individual-level data are needed from faculty, students (undergraduate and graduate), and post-doctoral scientists participating in activities implemented by (a) the 10 institutions receiving Building Infrastructure Leading to Diversity (BUILD) awards and their partners, and (b) the National Research Mentoring Network (NRMN). Clearance is also requested to collect institutional-level data from institutions participating in the BUILD initiative and program-level data on the BUILD programs and the NRMN program.


Evaluation of BUILD and NRMN will include assessment of agreed-upon consortium-wide hallmarks of success at the student/mentee, faculty/mentor, and institutional levels (see Attachment 4: Hallmarks of Success). Hallmarks represent key factors at each stage of a biomedical research career that contribute to the development of a successful biomedical research career and to success as an NIH-funded (or similar peer-reviewed) investigator. The proposed data collection is designed to support evaluation of the overall effectiveness of BUILD and NRMN efforts. Lessons learned will be disseminated to the broader biomedical research community.


A.1 Circumstances Making the Collection of Information Necessary


The data to be collected are responsive to helping fulfill the requirements of several relevant authorizing mandates:

  • Executive Order 12862, “Setting Customer Service Standards,” which directs Agencies to continually reform their management practices and operations to provide service to the public that matches or exceeds the best service available in the private sector;

  • The March 3, 1998 White House Memorandum, “Conducting Conversations with America to Further Improve Customer Service,” which directs Agencies to determine the kind and quality of service its customers want as well as their level of satisfaction with existing services;

  • Executive Order 13450 - Improving Government Program Performance, which calls for evaluation as outlined at: https://www.whitehouse.gov/sites/default/files/omb/assets/performance_pdfs/eo13450.pdf

  • 42 U.S. Code § 284n, “Certain demonstration projects,” which was enacted as part of the National Institutes of Health Reform Act of 2006 to allocate funds for NIH institutes and centers to make grants for the purpose of improving public health through demonstration projects for biomedical research at the interface between the biological, behavioral, and social sciences and the physical, chemical, mathematical, and computational sciences [specifically, the authority for the NIH Office of the Director that makes the collection of information necessary as part of this initiative is based on the following: U.S. Code Title 42, Chapter 6A, Subchapter III, Part B, § 284n, “Certain demonstration projects” (https://www.law.cornell.edu/uscode/text/42/284n and http://www.law.cornell.edu/uscode/html/uscode42/usc_sup_01_42_10_6A_20_III_30_B.html)]; and

  • The NIH Privacy Act Systems of Record Notice 09-25-0156, which provides authority for the NIH to conduct and fund research, to provide training assistance, and to maintain records in connection with these and its other functions (42 U.S.C. 203, 241, 289l-1 and 44 U.S.C. 3101), and Sections 301 and 493 of the Public Health Service (PHS) Act, including records of participants in programs and respondents in surveys used to evaluate PHS programs (http://oma.od.nih.gov/public/MS/privacy/PAfiles/0156.htm).


The Enhancing the Diversity of the NIH-funded Workforce Program (also called the Diversity Program) was established in FY 2014 in response to a set of recommendations provided by the NIH Advisory Committee to the Director (ACD) Working Group on Diversity in the Biomedical Research Workforce. The ACD Working Group reviewed a study by Ginther et al. (2011), which indicated that significant disparities by race/ethnicity occurred in the funding success of NIH R01 awards (and in the number of R01 applications submitted and revised/resubmitted). In response, the ACD Working Group provided a set of 13 recommendations in four broad areas: data collection/evaluation; mentoring/career preparation and retention; institutional support; and research and intervention testing. These recommendations were the genesis of the Funding Opportunity Announcements for the NIH Building Infrastructure Leading to Diversity (BUILD) Initiative (RFA-RM-13-016), the National Research Mentoring Network (NRMN) (RFA-RM-13-017), and the Coordination and Evaluation Center for Enhancing the Diversity of the NIH-Funded Workforce Program (RFA-RM-13-015).


The Diversity Program is a five-year funding commitment; the program distributed $31.3M in FY2014, with similar levels of funding anticipated each year. The Diversity Program is one of the NIH Common Fund programs and is funded by the Office of Strategic Coordination (OSC), located in the Division of Program Coordination, Planning and Strategic Initiatives (DPCPSI) at the Office of the Director (OD), and managed by the National Institute of General Medical Sciences (NIGMS). This initiative is a critical investment in the student development, faculty training and mentoring, and infrastructure development necessary to achieve the NIH’s and the nation’s goal of increased diversity in the biomedical and health professional workforce as a key strategy for improving the health of the nation.


The Diversity Program supports two complementary efforts to achieve its overarching goal of enhancing the diversity of the NIH-funded biomedical workforce. The 10 BUILD awards were granted to a diverse set of institutions nationwide with high levels of low-income students and limited previous NIH funding (including minority-serving institutions (MSIs) and Hispanic-serving institutions (HSIs)). The BUILD initiative supports the design and implementation of innovative programs, strategies, and approaches to transform undergraduate research training and mentorship. The BUILD program elements at funded institutions include infrastructure development, partnerships with pipeline and research campuses, faculty development, student development, and biomedical research training and mentorship [see Attachment 5: BUILD Interventions Summary Table]. Institutions funded through the BUILD initiative are encouraged to partner with other academic institutions as well as industry to provide a wealth of diverse training opportunities for their students. The support given to BUILD institutions is expected to directly impact undergraduate students and faculty as well as strengthen institutional research infrastructure (e.g., student recruitment, research training opportunities, faculty tenure and promotion, curriculum development) in biomedical disciplines. The programs aim to produce knowledge that promotes culture change in the fields of biomedical education and training.


Complementing the BUILD initiative’s focus on undergraduate training, the NRMN initiative supports a nationwide consortium of individuals, institutions, and societies that provides mentoring and career development opportunities, enhancing the Diversity Consortium’s efforts to train and develop individuals from diverse backgrounds at all career levels, from undergraduate students through junior faculty, who are pursuing biomedical, behavioral, clinical, and social science research careers (collectively termed biomedical research careers).


The Coordination and Evaluation Center (CEC) is the cornerstone of the consortium, and is responsible for bringing together the BUILD and NRMN awardees to work collaboratively as a consortium along with the NIH to achieve the stated goals of the Diversity Program Consortium. The BUILD, NRMN and CEC Principal Investigators are expected to form a network to share experiences and determine best practices. The CEC is responsible for the development and implementation of the required Consortium-wide evaluation of the BUILD and NRMN initiatives to document the outcomes of the various interventions employed by BUILD and NRMN. The transformative potential of this program will be realized as successful models of training, mentoring, and institutional development are disseminated and adopted across NIH and nationwide.


The CEC, along with NIH programmatic involvement and funding, proposes to conduct an outcome evaluation for ten or more years (FY2015-FY2024) and requests OMB clearance for 3 of the first five presently funded years (FY2014-FY2019); OMB clearance for the final 2 years of the initial funding period will be sought in future years. The evaluation will include data collection from students, faculty, and institutional representatives participating in activities funded through the BUILD and NRMN initiatives from FY2014 through FY2019. The BUILD and NRMN outcome evaluations will support the assessment of the Diversity Program Consortium’s overall effectiveness, produce knowledge that can be widely disseminated to transform research training and mentoring nationwide, and inform NIH’s training and diversity efforts. Data and knowledge developed from this evaluation will be directly responsive to the recommendations set forth by the ACD Working Group to the NIH Director.


Evaluation of the BUILD program will include assessment of the following overarching outcomes (see Attachment 6: BUILD Logic Models):

  1. Student outcomes such as –

      1. engagement in research and satisfaction with faculty mentorship,

      2. enhanced science identity and self-efficacy,

      3. intent to pursue a biomedical research career,

      4. participation in academic and professional student organizations,

      5. enhanced social integration/perceived fit within the university setting,

      6. pursuit, persistence/retention, and success in biomedical science disciplines,

      7. evidence of scholarly productivity (e.g., science conference presentations, authorship on papers),

      8. completion of an undergraduate degree in a biomedical science discipline,

      9. application to graduate programs in biomedical science disciplines,

      10. entrance into a graduate program in a biomedical science discipline;

  2. Faculty/mentor outcomes such as –

      1. change/increase in self-efficacy as an instructor, mentor, and/or researcher,

      2. increased participation in professional development activities (e.g., training in NIH grant applications, technical training for conducting research, workshops on cultural assets/stereotype threat),

      3. increased participation in BUILD mentorship programs,

      4. increased research productivity in grant submissions and awards,

      5. increase in the number of trainees mentored in BUILD programs,

      6. increase in the quality of mentoring;

  3. Institutional outcomes such as –

      1. improved undergraduate retention rates of students in programs relevant to BUILD,

      2. increased student and faculty participation in mentoring in BUILD activities,

      3. increased enrollment and retention of disadvantaged/underrepresented students in BUILD-related programs,

      4. increase in the number of student research training opportunities in BUILD programs,

      5. increased inter-institutional collaborations to achieve BUILD outcomes related to research, mentorship, and faculty development (e.g., linkages with community colleges, collaborations with NRMN).


Evaluation of the NRMN program will include assessment of the following outcomes (see Attachment 7 – NRMN Logic Model):

  1. Increase in the number and quality of trained mentors and mentees;

  2. Increase in the number and quality of culturally responsive, active mentor/mentee pairs;

  3. Increase in the number and quality of master trainers and coaches;

  4. Increase in satisfaction with mentoring (for mentors and mentees);

  5. Increase in the quantity, quality, and use of NRMN portal resources;

  6. Increase in research productivity skills;

  7. Increase in motivation to pursue a biomedical career path;

  8. Increase in career productivity of NRMN participants;

  9. Increase in NRMN participant progression in biomedical careers; and

  10. Increase in institutional commitment to training diverse students in biomedical fields.


In essence, the outcome evaluations of BUILD and NRMN will be focused on the extent to which a sustainable pipeline for underrepresented students and faculty in biomedical research has been developed and strengthened.


A.2 Purpose and Use of the Information


Below, we outline and describe the various sources of data needed for the proposed BUILD and NRMN evaluations. These sources include on-line and/or paper surveys administered to students/mentees and faculty/mentors, requests for institutional/administrative data from BUILD sites, as well as more qualitative data to be collected during proposed site visits and case studies. In addition to these primary data sources, the CEC will also request secondary data available from BUILD and NRMN. As discussed further below, the latter reflect data that the BUILD institutions and the NRMN initiative are collecting for their own purposes and that the CEC can request for inclusion in the overall evaluation plans. Leveraging the data obtained from the individual awardees to evaluate the Consortium as a whole provides cost savings and reduces participant burden.


SURVEYS


For BUILD, the purpose of the surveys of undergraduate students and faculty is to collect baseline data and track subsequent outcome data to permit evaluation of primary programmatic objectives and outcomes outlined in A.1. above (e.g., increases in the academic, non-academic/psychosocial, and professional skills of underrepresented students, increases in opportunities to participate in diverse biomedical research opportunities, application to and acceptance into graduate school in a biomedical discipline, completion of undergraduate degrees in biomedical science).


For NRMN, CEC-specific data collection will be implemented through on-line surveys of NRMN participants along with site visits/case studies, which will be based on semi-structured interviews. Baseline data for NRMN participants will be obtained from existing sources: the NRMN’s program registration, website, and pre/post-tests conducted by NRMN investigators during their trainings. Follow-up CEC-administered surveys are designed to provide information on longer-term programmatic objectives and outcomes of the NRMN training and mentor activities, including mentor competency, perceptions of mentor self-efficacy, and increased scientific productivity and successful submission of NIH grants.


It is important to note that BUILD and NRMN intervention activities were initiated in October 2014, when the NIH awards were made. For the BUILD programs, this means that the first cohort of students engaged in BUILD activities was recruited into the program as of Spring/Summer 2015. NRMN also began faculty training activities in February 2015. Since OMB clearance for CEC data collection was not in place in Spring/Summer 2015, the CEC will rely on existing secondary data available from the programs for baseline data on the incoming 2015 BUILD participants in evaluation analyses. Below, as we outline the various types of data the CEC proposes to collect (and for which OMB clearance is being sought), we also note the types of secondary data we expect to have available from BUILD and/or NRMN.


Baseline data will be collected for all BUILD and NRMN students and faculty who enter these programs (starting Spring/Summer 2015 for BUILD- and NRMN-initiated data collection and starting Spring/Summer 2016 for CEC data collection that requires OMB clearance). Data collection will continue through Fall 2019. Follow-up data will be collected annually from these individuals thereafter (through the end of the current funding period and further if subsequent funding becomes available). BUILD students and faculty will enter each year in the Spring/Summer prior to the beginning of that academic year; NRMN participants can enter at any time.


In the following sections, we outline the data to be collected for (a) BUILD (students, faculty, and institutional data) and (b) NRMN participants (students, postdoctoral scientists, junior faculty, senior researchers, and program leaders).


BUILD Institution Undergraduates


Entering Freshman/Transfer Student Surveys – A survey will be administered to a sample of 500 students entering each BUILD institution annually, including freshmen as well as transfer students who enter at higher levels. The sample of 500 will include all students admitted to the BUILD program itself as well as a sample of similar students not in the BUILD program. As noted in B.1.1, sampling will include oversampling on the basis of declared major (80% biomedical vs. 20% other) and race/ethnicity (oversampling African Americans and Hispanic/Native Americans to account for their known lower survey response rates). The initial survey will have as its core the “Cooperative Institutional Research Program (CIRP) Freshman Survey” items (or their equivalents for the “Transfer Student” version of the survey) administered by the UCLA Higher Education Research Institute (HERI) (see Attachments 8a and 8b – the HERI Freshman Survey (TFS) on-line and paper versions, and Attachment 9 – HERI Transfer Student Survey). NOTE: As discussed further in sections A.3 and B.2.1, for all proposed surveys the planned primary mode of administration is on-line, but paper versions will be available as needed. We illustrate the parallel on-line and paper formats in Attachments 8a and 8b; for all other surveys we provide only the on-line version, as each paper version will be formatted similarly to Attachment 8b and will include content identical to that shown in the on-line version for the respective survey.
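
To make the student sampling plan concrete, the sketch below illustrates one way the selection of the 500 entering students at a single BUILD institution could be carried out. It is a minimal sketch only: the 80%/20% split by declared major comes from the description above, while the function and field names (e.g., draw_entering_sample, major_group) are illustrative assumptions rather than the CEC’s actual sampling code, and the race/ethnicity oversampling noted in B.1.1 would add further strata in the same manner.

```python
import random

def draw_entering_sample(students, n_total=500, biomed_share=0.80, seed=2016):
    """Illustrative draw of the entering-student sample for one BUILD institution.

    `students` is a list of dicts with (assumed) keys:
      'in_build_program' (bool) and 'major_group' ('biomedical' or 'other').
    All students admitted to the BUILD program are included first; the
    remaining slots are filled from comparable non-BUILD students,
    oversampled roughly 80/20 by declared major group.
    """
    rng = random.Random(seed)

    # 1. Every student admitted to the BUILD program itself is included.
    sample = [s for s in students if s['in_build_program']]

    # 2. Fill remaining slots from non-BUILD students, ~80% biomedical majors.
    remaining = n_total - len(sample)
    pool = [s for s in students if not s['in_build_program']]
    biomed_pool = [s for s in pool if s['major_group'] == 'biomedical']
    other_pool = [s for s in pool if s['major_group'] != 'biomedical']

    n_biomed = min(round(remaining * biomed_share), len(biomed_pool))
    n_other = min(remaining - n_biomed, len(other_pool))

    sample += rng.sample(biomed_pool, n_biomed)
    sample += rng.sample(other_pool, n_other)
    return sample
```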


The HERI “entering student” survey is an existing data collection that has been fielded for 50 years by hundreds of two-year colleges, four-year colleges, and universities. Because not all BUILD grantees are current users of the survey, we are including it in our OMB application as a primary data collection instrument so that we can work with all BUILD institutions to ensure that these surveys are completed, at minimum, by the needed sample of 500 students. In cases where a BUILD institution already administers these surveys to all of its incoming students, we will sample our needed 500 as outlined above and collect follow-up data on those 500 over time as described below. Use of the HERI survey as the core of the proposed baseline for undergraduates will allow for institutional and planned national comparisons of data obtained from BUILD students with data from students at similar institutions that do not have the BUILD program but do administer the HERI TFS to incoming freshmen. These surveys will provide a baseline, or pre-intervention level, for each student. Items on the survey include student demographic characteristics, academic preparedness, expectations of college, interactions with peers and faculty, student values and goals, scientific identity, research self-efficacy, and student concerns about financing college. Versions of these items worded appropriately for transfer students are included in the Transfer Student Survey (Attachment 9).


Follow-up HERI Surveys – Two follow-up HERI surveys, “Your First College Year” and the “College Senior Survey,” will be administered to allow for comparative analyses of undergraduates at BUILD institutions with undergraduates who complete these same HERI surveys as part of independent data collection at non-BUILD institutions.


HERI Your First College Year Survey – Administered at the end of students’ first year at the institution (as a freshman or as a transfer student), this survey measures perceptions regarding academic skills (problem-solving, critical thinking, research skills) and involvement in various campus activities, including contacts with faculty and involvement in research (see Attachment 10).


HERI College Senior Survey – Administered to graduating seniors, this survey asks students to reflect on the undergraduate experience and degree attainment, and assesses future career and graduate education plans (see Attachment 11). Several items can be compared with the entering student survey described above.


BUILD Student Annual Follow-up Survey – BUILD students will be contacted annually after their baseline enrollment and asked to complete an on-line survey. This survey will seek to update their academic progress and obtain information regarding Consortium hallmarks. Topics include students’ research and mentoring experiences, self-assessments of science identity and research self-efficacy, and participation in professional conferences and publications (see Attachment 12).


BUILD Institution Faculty


HERI Faculty Survey – Baseline data for a sample of 50 faculty in biomedical disciplines at each BUILD institution (total N=500) will be collected through an on-line survey administered in 2016. All faculty involved in BUILD program activities will be included unless there are more than 25, in which case a subsample of 25 will be randomly drawn so that BUILD-involved faculty represent no more than 50% of the total faculty sample of 50. Faculty who have not participated in BUILD activities will be randomly sampled to complete the total desired sample size of 50 at each institution. The core of the survey will be the HERI Faculty Survey, in order to allow for comparisons with faculty from non-BUILD institutions for whom secondary data will be available from the 2014 and 2017 national data collection waves. The survey asks about scholarly productivity, teaching load, patterns of interaction with students, perceptions of institutional support for research, and perceptions of diversity at the institution (see Attachment 13).
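
As a concrete reading of the faculty selection rule described above (all BUILD-involved faculty are included unless they exceed 25, in which case 25 are drawn at random so that BUILD faculty make up no more than half of the 50-person sample, with non-BUILD faculty filling the remainder), a minimal sketch follows. The function name select_faculty_sample and the list inputs are assumptions for illustration, not the CEC’s actual procedure.

```python
import random

def select_faculty_sample(build_faculty, other_faculty, n_total=50, build_cap=25, seed=2016):
    """Illustrative selection of the 50-faculty sample at one BUILD institution."""
    rng = random.Random(seed)

    # Include all BUILD-involved faculty, capped at 25 (50% of the sample of 50).
    if len(build_faculty) > build_cap:
        build_part = rng.sample(build_faculty, build_cap)
    else:
        build_part = list(build_faculty)

    # Randomly sample non-BUILD faculty to fill the remaining slots.
    n_other = n_total - len(build_part)
    other_part = rng.sample(other_faculty, min(n_other, len(other_faculty)))

    return build_part + other_part
```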


BUILD Faculty Annual Follow-up Survey – Participants who completed the baseline HERI Faculty Survey will be asked to provide updated information annually about their mentoring and research activities as well as their scholarly productivity (see Attachment 14).


Faculty Mentee Surveys – Faculty who report being a mentor will be asked to provide contact information for up to 2 of their mentees, who will then be asked to complete a survey about the mentor’s mentoring skills (see Attachment 15: Mentee Mentor Assessment). Mentee privacy will be protected by having mentors submit names and emails for selected mentees through a system that forwards the mentee an email asking them to complete a brief survey about the mentor. Names and email information for mentees selected by the mentor will only become available to the CEC if that mentee elects to complete the survey; information for those who choose not to respond is destroyed if they have not responded within 3 weeks.


NRMN Participants


Needed baseline data on NRMN participants (i.e., undergraduate/graduate students, post-doctoral scientists, and junior/senior faculty participating in mentoring and/or professional development activities) will be available as secondary data from the NRMN data warehouse, as NRMN is collecting profile and program participation data (see Attachment 16 for a list of data to be downloaded from the NRMN Data Warehouse).


Follow-up surveys for NRMN participants will include annual core surveys for faculty/mentors and students/mentees to document the academic output and career progress of trainees (see Attachment 17 (faculty/mentor) and Attachment 18 (student/mentee) CORE follow-up surveys). Additional questions (survey modules) relating to the specific NRMN training or professional development program(s) in which the individual participated will be included in their follow-up surveys, using the same scales used by NRMN in its process evaluation of each component (e.g., mentor training, coaching training, grant-writing, and other professional skills; see Attachments 15, 19-22). Follow-up data will support the CEC outcome evaluation with respect to outcomes of the training and professional development programs as well as outcomes relating to the general academic output and career progress of NRMN participants.



SITE VISITS AND CASE STUDIES


The BUILD sites are using funds to enhance the capacity of campuses to attract, serve, and promote the success of under-served populations in biomedical research. This work requires analysis and understanding of how campuses build and successfully implement the “systems,” “structures,” and in some cases facilities needed to achieve this goal. As part of the BUILD mixed-methods evaluation design, we propose to conduct annual site visits to the 10 sites in order to understand the context and conditions in which BUILD funds enhance the institutional development of the sites, and ultimately to explain the overall “story” of the BUILD initiative.


Site visits will largely focus on describing the activities BUILD sites implement to promote and support underrepresented groups in biomedical research training at each site. Using the site level BUILD program logic model as a guiding framework, site visits will offer the CEC the opportunity to provide a narrative description of the relationships among each BUILD site’s inputs, activities and outputs and some, but not all, short-term program outcomes.


Site visits are an occasion for sites to showcase the defining features of their programs as well as to discuss any challenges related to program implementation and evaluation. Each site visit is a three-way exchange of information among the BUILD site, the CEC, and NIH that will allow critical face-to-face learning to transpire. Site visits are also an opportunity to develop trusting relationships that help to promote knowledge exchange and learning from the evaluation.


In addition to the site visits, we propose to conduct in-depth case studies of the BUILD sites. The purpose of the case studies is to provide a holistic, in-depth description of the BUILD program at the consortium level. Case studies will provide a clearer understanding of each BUILD site’s successes and challenges by offering explanations and descriptions that will allow us to generalize at the consortium level. They will also provide the opportunity to leverage the findings of the site-level BUILD evaluations and the survey data collected for the consortium-level evaluation, and to better understand the relationship between the BUILD and NRMN programs. The case studies will promote learning across sites and cultures, which will enhance the validity and use of the overall consortium evaluation.


The primary focus of the BUILD case studies is to describe how the BUILD program is enhancing the capacity of campuses to attract, serve, and promote the success of underrepresented groups in biomedical research. The BUILD case studies will focus on the processes and procedures that build capacity and infrastructure to advance biomedical research training. This work requires an understanding and analysis of how to build and successfully implement the “systems” and “structures” at the institutional (in some cases including building or enhancing facilities), faculty, and student levels needed to achieve this goal. The primary theoretical/conceptual framework that will guide the analysis of the case studies is the consortium-level logic model developed by the CEC (see Attachment 23 for the complete BUILD Site Visit and Case Studies Protocols).


As part of the NRMN mixed-methods evaluation design, we propose to conduct semi-annual site visits by attending NRMN all-program meetings to better understand NRMN design and implementation, context, structures, processes, and achievement of outcomes. During site visits, the development of mentoring infrastructure and of programs to develop professional skills will be explored. We will also conduct a case study. NRMN case study participants will be interviewed once per year, and the sample will include the principal investigator (PI), co-principal investigators (Co-PIs), and program management staff (one from each NRMN project/core), as well as participant mentors/coaches and mentees. Semi-structured observations will be conducted of the NRMN PI, Core Co-PIs, and project management staff at regularly scheduled NRMN meetings (see Attachment 24 for the NRMN Site Visit and Case Studies Protocol).



ADMINISTRATIVE RECORDS AND NATIONAL DATABASES


BUILD Institutional & Program Data – Additional student and faculty data will be requested from BUILD institutions, including course loads, grades, and majors (see Attachment 25 for BUILD Institutional Records & Program Data Requests).


BUILD Implementation Data – Each BUILD program will be asked to complete an implementation report annually, providing information on the BUILD activities that were implemented and the numbers of participants (see Attachment 26: BUILD Implementation Reports).


Use of the Information Collected:


Within the Consortium, we will use the information collected to provide feedback to our BUILD and NRMN partners on how the programs are being implemented, where the programs are doing well, and how the programs might be augmented to ensure successful program outcomes. Our formative, interim results will be communicated to key BUILD and NRMN investigators and NIH program staff on an on-going basis.

Information collected for the Diversity Program outcome evaluation will be used more broadly to assess and communicate the BUILD and NRMN program impacts, and ultimately improve the programs for dissemination to other institutions. First, the information will be used to assess the desired impacts (goals) of each of the individual programs. Second, the results of the evaluation will inform the NIH about the outcomes of the Diversity Program, and how the consortium is meeting the programmatic goals and objectives outlined in the FOAs and originating from the ACD report. Third, the results of the outcome evaluation will be distributed to the wider research community and disseminated across biomedical training programs and the NIH to promote adoption of program components found to enhance successful recruitment and retention of diverse populations in the biomedical research career pipeline (from the undergraduate stage through to post-doctoral and junior faculty positions). Without the detailed and comprehensive data collection proposed above to evaluate the effectiveness of the BUILD and NRMN initiatives, sites will lack the opportunity for ongoing iteration and improvement, and the consortium will lack information about the impact of the various activities and interventions implemented by NRMN and BUILD. Ultimately, without the data collection, NIH will not have information necessary to evaluate the effectiveness of the funding investment and to produce knowledge necessary to advance the goals of enhancing diversity in the biomedical workforce.

To disseminate the evaluation findings to the evaluation and the scientific communities, efforts also will be made to publish the results of the outcome evaluation in professional journals and to present the findings at conferences. To enhance and facilitate broader dissemination, we will create simple and compelling products tailored to specific groups and interests that promote a clearer understanding of BUILD and NRMN to program stakeholders, policy-makers, educators, and others.

We will work with NIH, BUILD, and NRMN (together referred to as “Diversity Program” and “Consortium”) to identify key audiences, venues, and topics to be included in all formal dissemination efforts. All study reports and briefs will be posted on the CEC site, both in draft and final forms. When in draft form, BUILD and NRMN community members will be asked to comment on and question report content. The bullets below outline the multiple venues that will be used for disseminating our study findings. We plan to produce:

  • Briefs – small topical reports that highlight key outcomes and how they can be used to better inform diversity programs, specifically the impact of BUILD and NRMN. These will be of particular interest to college campus faculty and administrators, NIH program staff, and institutional, local, and federal education- and research-focused policy makers.

  • Presentations at national conferences that are chosen carefully to maximize the value and impact of the findings by reaching specific academic and policy audiences.

  • Journal articles on specific topics of interest, submitted to key journals, that provide the evidence for implementing BUILD and NRMN activities and for designing future interventions by institutions as well as the NIH.

  • Releases and brief notes to share findings with, and potentially gain the attention of, broader audiences through outlets such as the Chronicle of Higher Education and local and national lay news outlets.


A.3 Use of Information Technology and Burden Reduction


For BUILD, the surveys for all students and faculty will be provided online as feasible to facilitate completion by respondents at times and locations convenient for them, thus reducing burden. The surveys for BUILD undergraduate students include: baseline (as entering Freshmen or Transfer students), end of first college year, senior year, and annual follow-up surveys. Those for BUILD faculty include a baseline survey and annual follow-up surveys. Where possible, we are also relying on collection of existing annual institutional data to further reduce the time needed to collect data directly from BUILD participants.


With respect to NRMN, as noted earlier, baseline data for NRMN participants will be downloaded from the existing NRMN database (reflecting data collected by the NRMN project) so as to minimize the additional data collection burden for NRMN participants. Follow-up surveys hosted by the CEC to collect data from NRMN participants (students, postdoctoral scientists, junior faculty, senior researchers, and program leaders) will include annual queries about scientific productivity and career progress for the outcome analysis, along with biannual questions about mentoring, coaching, and research skills as appropriate, depending on the NRMN activities used (through the current funding period).


For all on-line surveys, participants will receive a link to the online survey and a user code to access it. As appropriate, the online surveys will use skip-patterns so that each respondent is only presented with questions relevant to his or her specific situation. In addition, most questions on the surveys are multiple choice or closed-ended to reduce burden on respondents. Informed consent procedures will be incorporated into the survey administration to allow respondents to complete both activities during a single session.


The Diversity Program Coordination and Evaluation Center will develop a data system that is compliant with UCLA policies (e.g., Policy 404 -http://www.csg.ucla.edu/documents/2009/Policy404DeanVCfinal.pdf) on the protection of personally identifying information. In consultation with the Chief Information Officer of the UCLA David Geffen School of Medicine (DGSOM), the CEC follows internal Risk Assessment procedures for all systems (see Attachment 27: Coordination & Evaluation Center (CEC) Tracker Security Overview for details).


A.4 Efforts to Identify Duplication and Use of Similar Information


The data to be collected from BUILD undergraduate students and faculty, from NRMN participants, and from institutional sources do not duplicate other data collection efforts, beyond the secondary institutional data described above. The Diversity Program is a new initiative, established to implement and evaluate innovative approaches to undergraduate research training as well as new approaches to networking and mentoring of students and junior faculty in biomedical research careers. The NIH has convened numerous discussions both across NIH (e.g., the Office of Extramural Programs, the Training and Advisory Council, and the Office for Scientific Workforce Diversity) and with individuals outside of the NIH (e.g., the ACD Working Group to the Director) to discuss the needs and objectives of the Diversity Program and to ensure that the questions to be addressed could not be answered using existing data collection efforts. No similar information has been collected across a multi-site, multi-year program aiming to develop training, mentoring, and institutional development to enhance diversity in the biomedical workforce. As noted above in some detail, where possible we have sought to use existing secondary data from national surveys, such as those administered by HERI, or data collected by the NRMN or BUILD initiatives or the BUILD institutions themselves, in order to leverage activity already underway at many institutions. Through these efforts, we have minimized additional survey collection by the CEC for the evaluation of the Diversity Program.


A.5 Impact on Small Businesses or Other Small Entities


Small businesses are not involved in this study.


A.6 Consequences of Collecting the Information Less Frequently


Proposed data collection schedules are designed to provide the information needed for a rigorous longitudinal program evaluation. Efforts have been made to reduce the burden on participants by collecting data annually rather than on a more frequent basis. Programs are designed to enroll students annually and to assess factors before and after exposure to the intervention activities, in order to obtain consistent data for each student cohort. In order to assess outcomes, it is necessary to collect data for at least 5 years to see long-term changes (possibly even longer if additional funding for 10- or 15-year evaluation and tracking is supported by the NIH). It is also necessary to maintain contact with participants after they exit their institutions/programs in order to track their biomedical research career outcomes (which are a key outcome of the Diversity Program). The timeline for data collection allows for adequate tracking of changes in the desired outcomes at the individual student, faculty, and institutional levels in the short-to-medium term. Data collected from 2016 to 2019 are intended to begin establishing the extent to which the BUILD and NRMN programs are on the road to success.


If the BUILD and NRMN surveys for students and faculty are not conducted, or are conducted less frequently, NIH will not be able to determine whether the Diversity Program produced the desired programmatic outcomes and whether any changes in outcomes could be attributed to the program interventions. Also, if the surveys are not conducted, the program activities that produce the desired outcomes could not be identified and disseminated to the biomedical research training community. If information is not collected from BUILD institutions, NIH will not be able to assess how the interventions are being implemented, the impact of the interventions, or the progress made in achieving programmatic goals and in achieving sustainability and institutionalization of program interventions and activities. Further, NIH will not be able to take steps to improve implementation, sustainability, and institutionalization throughout the Diversity Program.


A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


This project fully complies with all guidelines of 5 CFR 1320.5.


A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agency


The 60-day Federal Register notice was published on September 28, 2015 (Federal Register document 80 FR 58270, pages 58270-58271): https://www.federalregister.gov/articles/2015/09/28/2015-24594/proposed-collection-60-day-comment-request-evaluation-of-the-enhancing-diversity-of-the-nih-funded


One public comment was received.


Efforts to Consult Outside Agency: To address underrepresentation in biomedical and behavioral research, NIH Director Dr. Francis Collins charged the Advisory Committee to the NIH Director (ACD) to form a Working Group on Diversity in the Biomedical Research Workforce (WGDBRW) to examine the findings and implications of the Ginther et al. (2011) study results. The vast majority of WGDBRW members were external to NIH. Dr. Collins charged the WGDBRW with providing concrete recommendations toward improving the recruitment and retention of underrepresented minorities (URM), people with disabilities, and people from disadvantaged backgrounds across the lifespan of a biomedical research career, from graduate study to acquisition of tenure in an academic position or the equivalent in a non-academic setting. The WGDBRW met 13 times, in person at the NIH’s Bethesda campus or by telephone, and used a variety of means to gather information based on existing data and efforts, leading to a report and set of recommendations to the NIH Director. These recommendations led to the creation of the Diversity Program Consortium (see Attachment 28: Advisory Committee to the Director – Working Group on Diversity in the Biomedical Research Workforce Membership).


In preparing this application, CEC faculty consulted extensively with other members of the Diversity Consortium external to NIH, both generally and through the Diversity Program Consortium Executive Steering Committee (ESC), as well as with NIH scientists involved in the Consortium, to determine (a) exactly what information was needed to implement the required consortium-wide evaluation and (b) what measures and data sources were recommended for collection of each required piece of information. Decisions regarding the data required from BUILD and NRMN participants and comparison groups, the required frequency of that data collection, and the evaluation of potential sources of data (i.e., what required primary data collection vs. what could be obtained from existing data sources) were importantly informed by Consortium members’ extensive experience in the implementation of educational interventions for university students and faculty and in the methodologies and the type and frequency of data sources required for successful evaluation of those efforts.


Specific consultations included: (1) group discussions with BUILD and NRMN investigators at the 2014 Consortium Kickoff meeting; (2) CEC orientation visits to each BUILD site and attendance at 2 NRMN leadership meetings, where evaluation and data collection were discussed with site investigators, faculty, and institutional representatives; (3) working group conference calls with BUILD and NRMN representatives regarding evaluation plans and data collection (bi-monthly for 6-8 months); (4) a CEC 1-day planning retreat with the NIH leadership team; and (5) ESC meetings with representatives from each BUILD and NRMN site and the NIH to develop, discuss, and approve the proposed Consortium Hallmarks.


Representatives from NIH involved in the Diversity Program evaluation consultations included: Drs. Alison Scott (Diversity Program Leader, NIGMS/Common Fund), Michael Sesma (CEC Program Officer and NIGMS Program Director), Mercedes Rubio (NRMN Program Officer, NIGMS), Darren Sledjeski (CEC Project Scientist, NIGMS), Pamela Thornton (BUILD Project Scientist, NIGMS), and L. Michelle Bennett (Deputy Scientific Director, NHLBI). The table below provides the names and affiliations of the BUILD PIs and NRMN leadership involved in Consortium consultations.


BUILD & NRMN leadership consulted in Grant Year 01


Name | Affiliation | Organization | Email

1. Laura Kingsford | BUILD | Cal State University Long Beach (CSULB) | [email protected]
2. Crist Khachikian | BUILD | Cal State University Northridge | [email protected]
3. Gary Kuleck | BUILD | University of Detroit Mercy | [email protected]
4. Farin Kamangar | BUILD | Morgan State University | [email protected]
5. Carlos Crespo | BUILD | Portland State University | [email protected]
6. Leticia Marquez-Magana | BUILD | San Francisco State University (SFSU) | [email protected]
7. Karsten Hueffer | BUILD | University of Alaska Fairbanks (UAF) | [email protected]
8. Philip Rous | BUILD | University of Maryland Baltimore County (UMBC) | [email protected]
9. Lourdes Echegoyen | BUILD | University of Texas El Paso (UTEP) | [email protected]
10. Gene D’Amour | BUILD | Xavier University | [email protected]
11. David Burgess | NRMN | Boston College | [email protected]
12. Chris Pfund | NRMN | University of Wisconsin, Madison | [email protected]



A.9 Explanation of Any Payment or Gift to Respondents


In order to be responsive to the Coordination and Evaluation Center Funding Opportunity Announcement (RFA-RM-13-015), and collect comprehensive, longitudinal data to assess the impact of interventions across the consortium, it is critical that we achieve two central goals. First, we must successfully recruit students and faculty from groups with varying degrees of involvement with the interventions that comprise this Consortium – ranging from heavily involved to those with no involvement. The latter in particular are likely to be quite difficult to recruit as they have no direct connection to the Consortium’s activities. We need to secure good recruitment for all of these groups in order to have data that will allow us to draw unbiased conclusions from the program evaluation. Second, once individuals are recruited into the project, we must retain participants over the longitudinal follow-up. This is critical to our ability to derive valid conclusions regarding the impact of the interventions on primary outcomes, including rates of graduation and progression to graduate school for BUILD students and career progress with respect to research, publications and general career advancement for BUILD and NRMN faculty and trainees. Failure to achieve good response rates at baseline and/or failure to retain participants over the next several years of contact has the potential to result in non-representative samples and associated bias in conclusions that would be drawn from those data.


Two specific aspects of this project lead us to propose offering incentives. First, our target student and faculty populations include hard-to-reach underrepresented groups within student and faculty populations. These are groups that have traditionally participated in research studies such as this at lower rates (Sharkness, 2012; Porter & Whitcomb, 2005). It is essential that we achieve good representation for these groups. Second, our project depends critically on our ability to obtain longitudinal data for both students and faculty, and thus on participants’ willingness to commit to annual requests that they take time from their busy schedules to complete our surveys. Though the burden to participants for the individual surveys is fairly minimal, it is essential that we maximize the willingness of our participants to respond repeatedly over time to our requests that they complete an annual survey.


Though our recruitment/retention strategies will include non-monetary approaches known to improve response rates (e.g., providing respondents information about the contribution they will be making to the understanding of the important issues on which the project focuses, as a means of enhancing their intrinsic motivation to participate; Singer & Ye, 2013), we believe that successful recruitment/retention efforts will also require that we offer a monetary incentive to participants. Our original submission to OMB requested approval to provide each participant with an incentive of $15 for each survey that they completed. Since that time, it has become clear that budget constraints are now such that available funds would only allow us to offer an incentive of under $3 per participant if we provided an incentive to each participant for each survey. Prior experience (ours and that of others) suggests that this level of incentive would not be effective in securing participation. Thus, we have revised our proposed approach based on available funds. We now propose to offer incentives through raffles. Incentives will include 15 $100 gift cards raffled to each set of 50 participating faculty at each of the 10 BUILD institutions; 75 $20 gift cards raffled to each set of 500 participating students at each BUILD institution; and 200 $100 gift cards raffled to the overall pool of NRMN participants each year. The larger incentives for faculty and NRMN participants are based on further investigation as to what prior studies have found to be effective for more senior academics vs. undergraduate students.
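
For context on the scale of the proposed incentives, the short calculation below works out the annual gift-card totals implied by the raffle structure just described; it assumes, for illustration only, that all 10 BUILD institutions field their full faculty and student samples each year.

```python
# Illustrative annual incentive totals implied by the raffle structure above.
N_BUILD_INSTITUTIONS = 10

faculty_raffle = 15 * 100 * N_BUILD_INSTITUTIONS   # 15 x $100 cards per institution  = $15,000
student_raffle = 75 * 20 * N_BUILD_INSTITUTIONS    # 75 x $20 cards per institution   = $15,000
nrmn_raffle    = 200 * 100                         # 200 x $100 cards consortium-wide = $20,000

total_per_year = faculty_raffle + student_raffle + nrmn_raffle   # $50,000 per year
print(faculty_raffle, student_raffle, nrmn_raffle, total_per_year)
```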


Without such an incentive, prior evidence (both from the literature and from reports from our BUILD program colleagues within the Diversity Consortium) suggests that we will have a difficult time recruiting a representative group, especially among some of the underrepresented groups of students and faculty who are key elements of our target populations (Estrada, Woodcock, & Schultz, 2014). There is strong and consistent evidence that provision of a monetary incentive to all participants as part of the survey request is the most effective strategy for ensuring better response rates (LaRose & Tsai, 2014; Singer & Ye, 2013). Indeed, among the current BUILD programs within the Diversity Consortium, those that have included incentives in their own data collection efforts report significantly better response rates from their student populations. Also, once individuals have been recruited, incentives will be critical to our ability to retain a representative sample over the longitudinal follow-up in order to track primary outcomes for the required program evaluation (Estrada, Woodcock, & Schultz, 2014).



A.10 Assurance of Confidentiality Provided to Respondents


The UCLA Office for Protection of Research Subjects will review the evaluation and its associated data collection protocols, along with the corresponding human subjects review committees of the 11 lead intervention institutions (10 BUILD + NRMN), as noted below.


The NIH Office of Human Subjects Research (OHSR) was consulted; because each site has IRB approval and because the NIH will not be collecting or analyzing the data, there is no need for that office to review the package. The NIH Privacy Act Officer has reviewed the package; the approval letter is provided as Attachment 29.


Participants in the Consortium-wide evaluation - the BUILD students and faculty and the NRMN mentors, mentees, and trainees - will be informed that their responses to the data collection efforts are only to be disclosed to authorized users for analysis and reporting. The Coordination and Evaluation Center (CEC) at UCLA will be responsible for ensuring the security of the data. Authorized users include: (1) the CEC implementing the evaluation, (2) authorized staff at the NIH, and (3) PIs and staff for the BUILD institutions and NRMN network that are involved in the local evaluation. The data will be shared with institutions to avoid duplication of efforts and to reduce burden on students and faculty, so they can avoid completing multiple surveys with similar questions. Given the nature of the study, participants will be given the assurance that their information will be protected and secured to the extent permitted by law. To that end, data may only be disseminated in aggregate to the public, in order to inform the research community of the results of the study while protecting the identity of individual respondents.


The CEC conducting the study will be required to adhere to the following safeguarding procedures:


  • The safeguarding protections offered to all participants will be included in the invitations and introduction to the survey instruments after review by the appropriate institutional human protections office (for the CEC, the UCLA Institutional Review Board; for other institutions as required). Respondents will be informed that their participation is voluntary and that no consequences will be associated with either responding or not responding. For example, the online surveys will have the following statements: “Your participation in the evaluation of the NIH Diversity program is voluntary. You are free to withdraw your consent and discontinue participation without penalty at any time. You are not waiving any legal claims, rights or remedies because of participating in the NIH Diversity program.”, and “All information you provide for this program is private. You will be assigned a code number, and all surveys will use this number. All the information you provide will remain locked in a filing cabinet or protected on a secure server to prevent disclosure. Information provided by you during participation in the surveys for the evaluation of the NIH Diversity program will only be disclosed to the Coordinating Center staff, the NIH Diversity program staff, and the Diversity Program staff at your institution”.

  • Data will be stored in a manner such that restricted information (e.g., name, address, contact information) is kept in a different system from study data such as survey responses. The restricted information will be stored in a system that sits behind the CEC firewall and operates on a private IP range. Only local users can access these IP addresses, and the number of authorized users will be strictly limited. The study data will be maintained in a separate system that requires authorized user access and uses encryption.

  • Any paper files used for data collection (such as handwritten interview notes) shall be stored in locked cabinets with access limited and controlled as with electronic data.

  • For purposes of analysis, restricted information will not be included in analytic datasets; analyses will use only coded study data.

  • Publications shall only report the data in aggregate and will not contain any identifying information.

  • The UCLA CEC will follow the methods detailed in NIH Privacy Act System of Records Notice 09-25-0156, which covers records maintained under the NIH's authority to conduct and fund research, to provide training assistance, and to maintain records in connection with these and its other functions (42 U.S.C. 203, 241, 289l-1 and 44 U.S.C. 3101), and under Sections 301 and 493 of the Public Health Service Act, as they relate to records of participants in programs and respondents in surveys used to evaluate programs of the Public Health Service.

  • The data collected from the surveys and interviews will be stored on a secure server. A password-protected directory will be created, and only authorized users will have access to the data. All computer-based systems will comply with the Privacy Act.
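As an aid to reviewers, the following is a minimal sketch of the storage separation described above. It is illustrative only; the file, table, and field names are assumptions, not the CEC's actual system design.

```python
import sqlite3

# Illustrative sketch: restricted identifiers and study data live in two
# physically separate stores, linked solely by the assigned code number.
# In the CEC design, the identifier store would sit behind the firewall on the
# private IP range; the study-data store would be the separate, encrypted system.
id_store = sqlite3.connect("restricted_identifiers.db")   # hypothetical file name
data_store = sqlite3.connect("study_data.db")             # hypothetical file name

id_store.execute("""
    CREATE TABLE IF NOT EXISTS participant_ids (
        code_number TEXT PRIMARY KEY,  -- code assigned to each participant
        name        TEXT,
        email       TEXT,
        address     TEXT
    )""")

data_store.execute("""
    CREATE TABLE IF NOT EXISTS survey_responses (
        code_number TEXT,              -- the only field shared with the ID store
        survey      TEXT,              -- e.g., 'BUILD Student Annual Follow-up'
        item        TEXT,
        response    TEXT
    )""")

# Only the code number accompanies survey data; identifiers stay in the ID store.
id_store.execute("INSERT OR REPLACE INTO participant_ids VALUES (?, ?, ?, ?)",
                 ("B-00123", "Jane Doe", "jdoe@example.edu", "123 Main St"))
data_store.execute("INSERT INTO survey_responses VALUES (?, ?, ?, ?)",
                   ("B-00123", "BUILD Student Annual Follow-up", "Q1", "Agree"))
id_store.commit()
data_store.commit()
```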


A.11 Justification for Sensitive Questions


The proposed data collection involves few, if any, sensitive questions in the surveys, site visits, and case studies. The one exception may be income. It is important to collect income data (along with other personal characteristics) in order to document the diversity of the participants in this initiative, which is specifically designed to enhance the diversity of the NIH-funded workforce, and to understand the contextual factors and variables that contribute to success in any intervention.


The student and faculty surveys contain questions regarding respondents' race, ethnicity, sex, and income. We note that the items regarding race included in the HERI surveys (see Attachments 8-11 and 13) are similar but not identical to the definitions of racial categories in OMB Directive 15. The HERI surveys also ask only that single item and do not include a separate item on ethnicity as outlined in OMB Directive 15. We are unable to modify the items in the HERI surveys because HERI seeks to maintain comparability of its data over time, having administered those same items over the past 50 years. However, for federal reporting we can collapse those responses into the 4 racial categories (American Indian or Alaska Native, Asian or Pacific Islander, Black, White) and 2 ethnic categories (Hispanic origin, Not of Hispanic origin) defined in OMB Directive 15. In addition, in our annual follow-up surveys for both students and faculty, we ask the standard ethnicity item (Hispanic/not Hispanic) and a separate race item (e.g., see the last 2 items in Attachment 12). The race item is more detailed than the categories listed in OMB Directive 15 because several of the participating institutions want more detailed race information; the resulting data will still allow us to classify participants according to the race/ethnicity classifications in OMB Directive 15. Similarly, for sexual orientation/identity, the HERI survey uses slightly different descriptors than the most recent National Center for Health Statistics (NCHS) recommendation. While the NCHS suggests the terms (i) Lesbian or gay, (ii) Straight, that is, not gay, (iii) Bisexual, (iv) Something Else, or (v) Don't Know, the HERI survey uses “Other” instead of “Something Else” (these are equivalent), offers the option “Queer” (which NCHS does not include), and does not include the NCHS “Don't Know” option. Since HERI is a historical survey and we cannot change the wording, we propose to merge “Queer” into “Something Else” to be as consistent as possible with the current NCHS recommended reporting categories.
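The collapsing described above is a simple recode. The following is a minimal sketch, assuming illustrative response labels; the actual HERI and annual follow-up survey wording differs but would be mapped in the same way.

```python
# Hypothetical recode: detailed race responses (labels are illustrative, not the
# exact survey wording) collapsed into the four OMB Directive 15 race categories,
# plus the two-level ethnicity item.
OMB_RACE_MAP = {
    "American Indian": "American Indian or Alaska Native",
    "Alaska Native": "American Indian or Alaska Native",
    "Chinese": "Asian or Pacific Islander",
    "Filipino": "Asian or Pacific Islander",
    "Native Hawaiian": "Asian or Pacific Islander",
    "Other Pacific Islander": "Asian or Pacific Islander",
    "African American/Black": "Black",
    "White/Caucasian": "White",
}

def collapse_race(detailed_response: str) -> str:
    """Map a detailed race response to its OMB Directive 15 category."""
    return OMB_RACE_MAP.get(detailed_response, "Unknown/Not reported")

def collapse_ethnicity(hispanic_item: str) -> str:
    """Map the standard two-level ethnicity item to OMB Directive 15 wording."""
    return "Hispanic origin" if hispanic_item == "Hispanic" else "Not of Hispanic origin"

print(collapse_race("Native Hawaiian"))   # Asian or Pacific Islander
print(collapse_ethnicity("Hispanic"))     # Hispanic origin
```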


The exit and follow-up surveys for BUILD students and faculty and NRMN mentees and mentors will contain questions regarding work-related information (type of employer organization and career field), career status (student/faculty level: graduate, post-graduate, Assistant/Associate/Full Professor), number of NIH applications, and number of publications. This information will allow NIH to analyze the survey data by subgroups and will support NIH's long-standing efforts to strengthen the diversity of those it serves.


Survey participants may skip any or all of the Personally Identifiable Information (PII) questions that they do not wish to answer. To allay concerns about disclosure of sensitive information, participants will be assured that their responses will be kept private and will be reported only as aggregate numbers or summary statistics. Those who choose to provide these demographic data will do so on a strictly voluntary basis.


Institutional level data will not contain any highly sensitive questions.


As requested by the NIH Privacy Act Officer, Privacy Act System of Records Notice 09-25-0156, “Records of Participants in Programs and Respondents in Surveys Used to Evaluate Programs of the Public Health Service, HHS/PHS/NIH/OD,” will be distributed to staff responsible for handling any PII. The UCLA CEC will conduct the evaluation and will develop online surveys that comply with the Privacy Act.


A.12 Estimates of Hour Burden Including Annualized Hourly Costs


A.12.1 Estimates of Hours Burden


Every effort will be made to minimize the burden on respondents. All surveys will be administered online (with a paper version available on request). Estimates of hour burden are provided only for the surveys that will be administered during the funded period.


BUILD Students – The entrance survey for the BUILD undergraduate students will take on average 45 minutes to complete. Similarly, the surveys at the end of the freshman and senior years will also take approximately 45 minutes each. The annual follow-up survey will take on average 25 minutes. If this current request for clearance is approved, the same surveys will be used to gather data from applicants for the duration of the clearance (2016-2019).


BUILD Faculty – The initial survey for BUILD faculty will take on average 45 minutes to complete. The annual follow-up survey will take on average 15 minutes to complete.


BUILD Site Visits – A team of 3-5 CEC team members will conduct a 1- to 2-day site visit to each BUILD site in 2016, 2017, and 2018 (but not in 2019, to allow for analysis of data during that final year of funding). The site visit protocol will involve standard presentations by the site followed by group discussion. We estimate that 12-15 individuals from a given BUILD site will participate in this portion of the site visit. We have estimated 24 hours per person, based on 8 hours of preparation and 16 hours for a 2-day site visit, including the semi-structured group interview.


BUILD Case Studies – We have estimated that 40 hours (5 days × 8 hours/day = 40 hours) will be needed from someone at each of the case study sites to assist in pulling together needed materials and in scheduling the requested individual and group interviews. Semi-structured interviews will be conducted in 2017 and 2018 at each BUILD site. Individual interviews (1.5 hours each) will be conducted with the Principal Investigator(s), Program Manager(s), and Partner Institution Directors (up to a maximum of 4 such individuals). Group interviews (1.5 hours each) will be conducted with: BUILD Faculty Leads (5 participants), faculty participants (including mentors) in site BUILD activities (5 participants), undergraduate students (2 groups of 6 active BUILD student participants and one group of 4 BUILD drop-outs), and one group of 6 graduate/post-doctoral students. Two additional case studies will be conducted at 2 matched non-BUILD institutions; these will include an interview (1.5 hours) with one Institutional Director/Program Manager and group interviews with 5 faculty, 5 undergraduate students, and 5 graduate/post-doctoral students.


NRMN Trainees – We estimate that completing the annual follow-up survey will take an average of 25 minutes, with variability depending on the academic productivity of the respondent. The biannual program-specific components, which consist entirely of fixed-choice questions, add about five minutes per program component.


NRMN Case Study Visits – The case studies will involve annual one-hour interviews with NRMN leadership (37 individuals representing the PIs and core leads), as well as one-hour interviews with 10 mentors in each of the 3 years and one-hour interviews with 10 mentees (5 graduate students; 5 faculty-level) in each of the 3 years. No preparation time is estimated because scheduling of these interviews will be done by the CEC team.


Further reductions in the online surveys and phone interviews would jeopardize accurate assessment of the program. Table A.12.1 displays the annualized estimate of hour burden; for each survey/data collection instrument included in Table A.12.1, the relevant Attachment illustrating that instrument is also listed. The expected total burden for this study is 61,950 hours.
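For reference, the hour-burden figures in Table A.12.1 are the product of the number of respondents, the frequency of response, and the average time per response. A brief check of three rows (a sketch, using values taken directly from the table):

```python
def burden_hours(respondents: int, responses_per_respondent: int, minutes_per_response: float) -> float:
    """Hour burden = respondents x responses per respondent x (minutes / 60)."""
    return respondents * responses_per_respondent * minutes_per_response / 60

# BUILD Student Entrance Survey: 15,000 respondents x 1 response x 45 minutes
print(round(burden_hours(15_000, 1, 45)))    # 11250
# BUILD Student Annual Follow-up, 2015 cohort: 5,000 x 3 x 25 minutes
print(round(burden_hours(5_000, 3, 25)))     # 6250
# BUILD Site Visits: 120 participants x 3 visits x 24 hours (i.e., 1,440 minutes)
print(round(burden_hours(120, 3, 24 * 60)))  # 8640
```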


Table A.12.1: Annualized Estimate of Hour Burden

| Type of Respondents | Number of Respondents | Frequency of Response | Average Time per Response (in hours) | Annual Hour Burden |
| --- | --- | --- | --- | --- |
| BUILD Student Entrance Survey (Version A: HERI Freshman Survey [Attachment 8a]; Version B: HERI Freshman Survey for Non-Freshman Transfers [Attachment 8b]) | 15,000 | 1 | 45/60 | 11,250 |
| BUILD Student Follow-up Survey at the end of the first attendance year (HERI Your First College Year; Attachment 10) | 15,000 | 1 | 45/60 | 11,250 |
| BUILD Student Follow-up Survey at graduation (HERI College Senior Survey; Attachment 11) | 15,000 | 1 | 45/60 | 11,250 |
| BUILD Student Annual Follow-up Survey (Attachment 12) | | | | |
| a. 2015 Cohort | 5,000 | 3 | 25/60 | 6,250 |
| b. 2016 Cohort | 5,000 | 2 | 25/60 | 4,167 |
| c. 2017 Cohort | 5,000 | 1 | 25/60 | 2,083 |
| BUILD Faculty Survey (HERI Faculty Survey; Attachment 13) | 500 | 1 | 45/60 | 375 |
| BUILD Faculty Annual Follow-up Survey (Attachment 14) | 500 | 2 | 25/60 | 417 |
| BUILD Mentee Mentor Assessment (Attachment 15) | 1,000 | 3 | 10/60 | 500 |
| BUILD Institutional Research & Program Data Requests (Attachment 25) | 10 | 3 | 16 | 480 |
| BUILD Implementation Reports (Attachment 26) | 10 | 3 | 16 | 480 |
| BUILD Site Visits (Attachment 23) | 120 | 3 | 24 | 8,640 |
| BUILD Case Studies Preparation (Attachment 23) | 24 | 1 | 40 | 960 |
| BUILD Case Study Interviews (Attachment 23) | | | | |
| a. Undergraduate students | 170 | 1 | 90/60 | 255 |
| b. Graduate/post-doctoral students | 70 | 1 | 90/60 | 105 |
| c. PIs, Program Managers/Directors, & Faculty | 162 | 1 | 90/60 | 243 |
| NRMN Mentee Annual Follow-up Surveys (Attachment 18) | | | | |
| a. 2016 student cohort | 375 | 3 | 25/60 | 469 |
| b. 2016 faculty cohort | 100 | 3 | 25/60 | 125 |
| c. 2017 student cohort | 375 | 2 | 25/60 | 313 |
| d. 2017 faculty cohort | 100 | 2 | 25/60 | 83 |
| e. 2018 student cohort | 375 | 1 | 25/60 | 156 |
| f. 2018 faculty cohort | 100 | 1 | 25/60 | 42 |
| NRMN Mentor Annual Follow-up Surveys (Attachment 17) | | | | |
| a. 2016 Cohort | 375 | 3 | 25/60 | 469 |
| b. 2017 Cohort | 375 | 2 | 25/60 | 313 |
| c. 2018 Cohort | 375 | 1 | 25/60 | 156 |
| NRMN Mentees – Program-specific modules for tracking survey: Mentee Assessment of Mentor (Attachment 15), Research & Grant Writing (Attachment 20), Institutional Context (Attachment 22) | | | | |
| a. 2016 student cohort | 375 | 3 | 10/60 | 188 |
| b. 2016 faculty cohort | 100 | 3 | 10/60 | 50 |
| c. 2017 student cohort | 375 | 2 | 10/60 | 125 |
| d. 2017 faculty cohort | 100 | 2 | 10/60 | 33 |
| e. 2018 student cohort | 375 | 1 | 10/60 | 63 |
| f. 2018 faculty cohort | 100 | 1 | 10/60 | 17 |
| NRMN Mentors – Program-specific modules for tracking survey: Mentor Skills (Attachment 19), Coaching Training (Attachment 21), Institutional Context (Attachment 22) | | | | |
| a. 2016 Cohort | 375 | 3 | 10/60 | 188 |
| b. 2017 Cohort | 375 | 2 | 10/60 | 125 |
| c. 2018 Cohort | 375 | 1 | 10/60 | 63 |
| NRMN Site Visits (Attachment 24) | 1 | 6 | 16 | 96 |
| NRMN Case Study Interviews (Attachment 24) | | | | |
| a. Investigators | 37 | 3 | 1 | 111 |
| b. Mentors | 30 | 1 | 1 | 30 |
| c. Student mentees | 15 | 1 | 1 | 15 |
| d. Faculty mentees | 15 | 1 | 1 | 15 |
| Total | 67,764 | 90,723 | | 61,950 |



A.12.2 Annualized Cost to Respondents


An hourly earning rate for undergraduate students ($8.73) was estimated by averaging the state minimum wages for the locations of the BUILD institutions.


The average hourly earnings are estimated at $17.50 for graduate students and $31.15 for post-doctoral students.


An hourly earning rate for faculty was estimated using the American Association of University Professors' Annual Report on the Economic Status of the Profession (http://www.aaup.org/report/heres-news-annual-report-economic-status-profession-2014-15). The hourly rate was based on the salary of full professors with doctorates; the resulting average hourly earnings for faculty are $64.34.


The average hourly earnings rate for Institutional Research staff is estimated to be the same as that for faculty.


The annual cost for all respondents to participate in the Diversity Consortium evaluation would be approximately $1,304,459.
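Each row of Table A.12.2 multiplies the annual hour burden by the applicable hourly rate. A brief check of two rows (a sketch using the rates described above; the reported totals are rounded to the nearest dollar):

```python
def respondent_cost(burden_hours: float, hourly_rate: float) -> float:
    """Cost to respondents = hour burden x approximate hourly wage rate."""
    return burden_hours * hourly_rate

# BUILD Student Entrance Survey: 11,250 hours at the $8.73 undergraduate rate
print(f"${respondent_cost(11_250, 8.73):,.2f}")  # $98,212.50 (reported as $98,213)
# BUILD Faculty Survey: 375 hours at the $64.34 faculty rate
print(f"${respondent_cost(375, 64.34):,.2f}")    # $24,127.50 (reported as $24,128)
```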


Table A.12.2: Annualized Cost to Respondents

| Type of Respondents | Annual Hour Burden | Approx. Hourly Wage Rate ($) | Total Cost ($) |
| --- | --- | --- | --- |
| BUILD Student Entrance Survey (Version A: HERI Freshman Survey; Version B: HERI Freshman Survey for Non-Freshman Transfers) | 11,250 | $8.73 | $98,213 |
| BUILD Student Follow-up Survey at the end of the first attendance year (HERI Your First College Year) | 11,250 | $8.73 | $98,213 |
| BUILD Student Follow-up Survey at graduation (HERI College Senior Survey) | 11,250 | $8.73 | $98,213 |
| BUILD Student Annual Follow-up Survey | | | |
| a. 2015 Cohort | 6,250 | $8.73 | $54,563 |
| b. 2016 Cohort | 4,167 | $8.73 | $36,375 |
| c. 2017 Cohort | 2,083 | $8.73 | $18,188 |
| BUILD Faculty Survey (HERI Faculty Survey) | 375 | $64.34 | $24,128 |
| BUILD Faculty Annual Follow-up Survey | 417 | $64.34 | $26,808 |
| BUILD Mentee Mentor Assessment | 500 | $8.73 | $4,365 |
| BUILD Institutional Research & Program Data Requests | 480 | $64.34 | $30,883 |
| BUILD Implementation Reports | 480 | $64.34 | $30,883 |
| BUILD Site Visits | 8,640 | $64.34 | $555,898 |
| BUILD Case Studies Preparation | 960 | $64.34 | $61,766 |
| BUILD Case Study Interviews | | | |
| a. Undergraduate students | 255 | $8.73 | $2,226 |
| b. Graduate/post-doctoral students | 105 | $17.40 | $1,827 |
| c. PIs, Program Managers/Directors, & Faculty | 243 | $64.34 | $15,635 |
| NRMN Mentee Annual Follow-up Surveys | | | |
| a. 2016 student cohort | 469 | $17.40 | $8,156 |
| b. 2016 faculty cohort | 125 | $64.34 | $8,043 |
| c. 2017 student cohort | 313 | $17.40 | $5,438 |
| d. 2017 faculty cohort | 83 | $64.34 | $5,362 |
| e. 2018 student cohort | 156 | $17.40 | $2,719 |
| f. 2018 faculty cohort | 42 | $64.34 | $2,681 |
| NRMN Mentor Annual Follow-up Surveys | | | |
| a. 2016 Cohort | 469 | $64.34 | $30,159 |
| b. 2017 Cohort | 313 | $64.34 | $20,106 |
| c. 2018 Cohort | 156 | $64.34 | $10,053 |
| NRMN Mentees – Program-specific modules for tracking survey (Mentee Assessment of Mentor, Research & Grant Writing, Institutional Context) | | | |
| a. 2016 student cohort | 188 | $17.40 | $3,263 |
| b. 2016 faculty cohort | 50 | $64.34 | $3,217 |
| c. 2017 student cohort | 125 | $17.40 | $2,175 |
| d. 2017 faculty cohort | 33 | $64.34 | $2,145 |
| e. 2018 student cohort | 63 | $17.40 | $1,088 |
| f. 2018 faculty cohort | 17 | $64.34 | $1,072 |
| NRMN Mentors – Program-specific modules for tracking survey (Mentor Skills, Coaching Training, Institutional Context) | | | |
| a. 2016 Cohort | 188 | $64.34 | $12,064 |
| b. 2017 Cohort | 125 | $64.34 | $8,043 |
| c. 2018 Cohort | 63 | $64.34 | $4,021 |
| NRMN Site Visits | 96 | $64.34 | $6,177 |
| NRMN Case Study Interviews | | | |
| a. Investigators | 111 | $64.34 | $7,142 |
| b. Mentors | 30 | $64.34 | $1,930 |
| c. Student mentees | 15 | $17.40 | $261 |
| d. Faculty mentees | 15 | $64.34 | $965 |
| Total | 61,950 | | $1,304,459 |


A.13 Estimate of Other Total Annual Cost Burden to Respondents or Record-keepers


There are no capital, maintenance or operating costs to respondents.


A.14 Annualized Cost to the Federal Government


The annualized cost to the federal government will be $5,269.92 in the first year, based on the hourly rates (from the Office of Personnel Management [OPM] salary tables) for the four NIH staff and the contractor participating in the development, review, and monitoring of the evaluation and data collection. Four NIH staff from the National Institute of General Medical Sciences will be involved at the GS-15 and GS-14 levels, including the Program Officers and Project Scientist, in addition to the Program Leader (contractor) responsible for oversight of the Coordination and Evaluation Center. The first-year cost assumes two GS-15 staff with annual salaries of $157,971 working 10 hours each; one GS-15 with an annual salary of $128,082 working 10 hours; one GS-14 with an annual salary of $127,036 working 8 hours; and one Program Leader (contractor) working 38 hours. The total cost for the four years of the project is estimated to be $21,079.69, assuming each of the four NIH staff and the contractor devote the same number of hours each year. Salaries are based on the January 2016 General Schedule for the Washington, DC metropolitan area (https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2016/DCB.pdf; https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2016/DCB_h.pdf).
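A brief check of the first-year figure (a sketch; the hourly rates shown are the approximate OPM 2016 DC-area hourly equivalents of the salaries above, roughly annual salary divided by 2,087 hours):

```python
# Illustrative check of the first-year federal staff costs, with the contractor
# cost from the table added at the end.
staff = [
    ("GS-15/8", 75.69, 10),  # two staff at this grade/step and annual hours
    ("GS-15/8", 75.69, 10),
    ("GS-15/1", 61.37, 10),
    ("GS-14/6", 60.87, 8),
]
federal_first_year = sum(rate * hours for _, rate, hours in staff)
print(f"${federal_first_year:,.2f}")             # $2,614.46 (NIH staff only)
print(f"${federal_first_year + 2_655.46:,.2f}")  # $5,269.92 including the contractor
```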


| Staff | Grade/Step | Annual Salary | Hours per Year (Total Hours) | Fringe (if applicable) | Total Cost to Gov't |
| --- | --- | --- | --- | --- | --- |
| Federal Oversight | GS-15/8 | $157,971 | 10 (40) | | $756.90 ($3,027.60) |
| | GS-15/8 | $157,971 | 10 (40) | | $756.90 ($3,027.60) |
| | GS-15/1 | $128,082 | 10 (40) | | $613.70 ($2,454.80) |
| | GS-14/6 | $127,036 | 8 (32) | | $486.96 ($1,947.84) |
| Contractor Cost | | ---- | 38 (152) | | $2,655.46 ($10,621.85) |
| Travel | | | | | |
| Other Cost | | | | | |
| Totals | | $571,060 | | | $5,269.92 ($21,079.69) |



A.15 Explanation for Program Changes or Adjustments


This is a new collection of information.


A.16 Plans for Tabulation and Publication and Project Time Schedule


The data collection and analysis will be conducted by the UCLA CEC with oversight from the NIH Project Officer. The CEC will provide a report that describes the study and findings.


Table A.16: Estimated Annual Project Time Schedule

| Activity | Time Schedule |
| --- | --- |
| Administer online Entrance Survey to all incoming BUILD undergraduate students | Summer/Fall 2016 and annually thereafter |
| Administer BUILD “Your First College Year” and “College Senior” surveys | Fall 2016 and annually thereafter |
| Administer BUILD student annual follow-up survey | Spring 2017 and annually thereafter |
| Administer BUILD faculty survey | Fall 2016 |
| Administer annual BUILD faculty follow-up survey | Spring 2017 and annually thereafter |
| Collect BUILD Institutional Records & Program data | Fall 2016 and annually thereafter |
| BUILD Implementation Reports | Fall 2016 and annually thereafter |
| Coordinate to receive download of data from NRMN Data Warehouse | Bi-annually |
| Collect NRMN Mentor/Mentee & Trainee follow-up data | Annually 2016-2019 after OMB approval |
| BUILD Site Visits | Fall-Spring 2016, 2017, 2018 |
| BUILD Case Studies | 2017 & 2018 |
| NRMN Semi-structured interviews with NRMN leadership | Bi-annually 2016-2019 |
| NRMN Case Studies | Annually 2016-2019 after OMB approval |


A.17 Reasons Why Display of OMB Expiration Date is Inappropriate


No exceptions are sought; data collection instruments will display the OMB Expiration Date.


A.18 Exceptions to Certification for Paperwork Reduction Act Submissions


No exceptions are sought from the Paperwork Reduction Act.




