SF-83-1 SUPPORTING STATEMENT
for
Survey of Doctorate Recipients
for 2015 SDR Survey Cycle

TABLE OF CONTENTS
A. JUSTIFICATION .................................................................................................................... 1 
A.1 Necessity for Information Collection................................................................................... 2 
A.2 Uses of Information ............................................................................................................. 2 
A.3 Consideration of Using Improved Technology.................................................................... 6 
A.4 Efforts to Identify Duplication ............................................................................................. 6 
A.5 Efforts to Minimize Burden on Small Business .................................................................. 7 
A.6 Consequences of Less Frequent Data Collection................................................................. 7 
A.7 Special Circumstances ......................................................................................................... 7 
A.8 Federal Register Announcement and Consultations Outside the Agency ........................... 7 
A.9 Payment or Gifts to Respondents ....................................................................................... 11 
A.10 Assurance of Confidentiality ........................................................................................... 13 
A.11 Justification for Sensitive Questions ................................................................................ 14 
A.12 Estimate of Respondent Burden....................................................................................... 14 
A.13 Cost Burden to Respondents ............................................................................................ 14 
A.14 Cost Burden to the Federal Government ......................................................................... 14 
A.15 Reason for Change in Burden .......................................................................................... 14 
A.16 Schedule for Information Collection and Publication ..................................................... 15 
A.17 Display of OMB Expiration Date .................................................................................... 15 
A.18 Exception to the Certification Statement ......................................................................... 15 
B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS ...... 16 
B.1 Respondent Universe and Sampling Methods ................................................................... 16 
B.2 Statistical Procedures ......................................................................................................... 18 
B.3 Methods to Maximize Response ........................................................................................ 20 
B.4 Testing of Procedures ......................................................................................................... 27 
B.5 Responsive Design and Nonresponse Error Assessment ................................................... 29 
B.6 Contacts for Statistical Aspects of Data Collection ........................................................... 30 

LIST OF ATTACHMENTS
Attachment A – NSF Act of 1950; America COMPETES Reauthorization Act of 2010 .......... A-1
Attachment B – First Federal Register Announcement ...............................................................B-1
Attachment C – Draft 2015 SDR Questionnaire..........................................................................C-1
Attachment D – Draft 2015 SDR Survey Mailing Materials ...................................................... D-1
Attachment E – 2013 Survey of Doctorate Recipients: Sample Design and
Implementation Report .................................................................................................... E-1
Attachment F – 2015 SDR Sample Allocation and Selection Tables .......................................... F-1
Attachment G – 2015 SDR Contacting Protocol Experiments Results ...................................... G-1

A. JUSTIFICATION
This request is for a three-year reinstatement of the previously approved OMB clearance for the Survey of
Doctorate Recipients (SDR). The SDR was last conducted in 2013 and the OMB clearance for the
2013 SDR expired November 30, 2014 (OMB No. 3145-0020). While the data collection instruments for
the 2015 SDR are largely unchanged from the prior round, the sample has been greatly enlarged from
47,000 to 120,000 individuals to support new and expanded analytical objectives. Additionally, data
collection procedures have been modified to accommodate the change in sample composition.

SDR Background
The SDR provides information on scientists and engineers who were awarded doctoral degrees from U.S.
institutions. The 2015 SDR comprises three components: 1) a longitudinal panel that tracks doctorate
recipients throughout their careers until age 76, 2) a new sample of doctorate recipients awarded their
degrees from 1959 to 2011, and 3) a new cohort component that adds new doctorate recipients after they
receive their degree. The panel portion of the SDR provides information on the experienced stock of
doctorate recipients. The new sample of graduates from 1959 to 2011 represents an expansion to the
SDR sample to allow for estimation at a finer degree level. The new cohort from the two most recent
doctorate award years provides important data on the early career experiences of new doctorate recipients
with science, engineering, and health (SEH) degrees entering the labor force.
The SDR contributes to the National Center for Science and Engineering Statistics’ (NCSES) Scientists
and Engineers Statistical Data System (SESTAT). The purpose of SESTAT is to provide information on
the entire U.S. population of scientists and engineers with at least a bachelor’s degree. SESTAT is
produced by combining data from the SDR with data from NCSES’s National Survey of College
Graduates (NSCG). The NSCG represents all individuals in the U.S. with a bachelor’s or higher degree in
an SEH or related field, or those with a bachelor’s or higher degree in another field, but in an SEH or
related occupation. The NSCG includes individuals who received degrees from foreign institutions. The
integrated database derived from these surveys contains data on the demographic, educational, and
employment characteristics of college-educated scientists and engineers in the United States. These
surveys are usually conducted every two years.
Beginning in 2003 and continuing through the 2006, 2008, 2010, and 2013 SDR, NCSES tested and reaffirmed the
feasibility of developing an international panel study of U.S.-trained doctorate recipients. Initially, this
sub-sample consisted primarily of non-U.S. citizens who emigrated after degree award.
For 2015, the SDR will no longer treat these cases as a separate sub-sample; instead, the sample will include
members predicted to reside either inside or outside the U.S. Currently, 36% of U.S. SEH doctorates are awarded to temporary
visa holders, and nearly 24% of them plan to leave the U.S. upon graduation. The 2015 SDR will yield
information about the educational and demographic characteristics of U.S.-trained SEH doctorate
recipients both living and working in the U.S. and abroad on the reference date, 1 February 2015.
The 2015 SDR introduces a major sample size expansion to support employment outcome estimates by
fine field of degree (FFOD). The expansion increases the sample from approximately 47,000 to 120,000
sample members. The objective of the new sample design is to meet both the traditional (historic)
domain-level estimation goals and the new fine field estimation goals. The SDR was originally designed to
produce estimates by various analytical domains defined by aggregated field of degree, gender, race,
ethnicity, citizenship at birth, and disability status. The new sample approach stratifies by FFOD,
featuring a combination of equal and proportional sample allocation to strata, and systematic probability
proportional to size (PPS) sampling within strata. See Section B.1 for details.
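
As context for the design summary above, the sketch below shows, in Python, one standard way to carry out systematic PPS selection within a single stratum. The stratum, case identifiers, and measures of size are hypothetical placeholders rather than the actual 2015 SDR frame or allocation; Section B.1 describes the design itself.

```python
import random

def systematic_pps_sample(units, sizes, n, seed=None):
    """Draw n units from one stratum with probability proportional to size (PPS),
    using systematic selection on the cumulated measures of size."""
    rng = random.Random(seed)
    total = float(sum(sizes))
    interval = total / n                 # selection interval on the size scale
    point = rng.uniform(0, interval)     # random start within the first interval

    selected, cum = [], 0.0
    for unit, size in zip(units, sizes):
        cum += size
        # Take every selection point that falls within this unit's cumulative range.
        # A unit whose size exceeds the interval is selected with certainty
        # (and could be hit more than once).
        while point < cum and len(selected) < n:
            selected.append(unit)
            point += interval
    return selected

# Hypothetical stratum: 500 frame cases in a single fine field of degree (FFOD),
# each carrying a placeholder measure of size (e.g., an oversampling factor).
demo_rng = random.Random(1)
case_ids = [f"case_{i:03d}" for i in range(1, 501)]
measures = [demo_rng.uniform(1.0, 5.0) for _ in case_ids]

sample = systematic_pps_sample(case_ids, measures, n=25, seed=2015)
print(len(sample), sample[:5])
```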


A.1 Necessity for Information Collection
The National Science Foundation Act of 1950, as amended and codified at Title 42, United States Code, Section 1862,
requires the Foundation to:
…“provide a central clearinghouse for the collection, interpretation, and analysis of data
on scientific and engineering resources and to provide a source of information for policy
formulation by other agencies of the Federal Government...” (See Attachment A – NSF
Act of 1950 and America COMPETES Reauthorization Act of 2010.)
In meeting its responsibilities under the NSF Act, the Foundation relied on the National Register of
Scientific and Technical Personnel from 1954 through 1970 to provide names, locations, and
characteristics of U.S. scientists and engineers. Acting in response to a Fiscal Year 1970 request of the
House of Representatives Committee on Science and Astronautics (see U.S. Congress, House of
Representatives, 91st Congress, 1st Session, Report No. 91-288), the Foundation, in cooperation with the
Office of Management and Budget and eight other agencies, undertook a study of alternative methods of
acquiring personnel data on individual scientists and engineers.
The President’s budget for Fiscal Year 1972, as submitted to the Congress, recommended the
“discontinuation of the National Register of Scientific and Technical Personnel in its present form” and
that funds be appropriated “to allow for the development of alternative mechanisms for obtaining required
information on scientists and engineers.” The House of Representatives Committee on Science and
Astronautics, in its report on Authorizations for Fiscal Year 1972, stated that “...it has no objection to this
recommendation...” (see U.S. Congress, House of Representatives, 92nd Congress, 1st Session, Report
No. 92-204).
Subsequently, NSF established and continues to maintain the SESTAT system, the successor to the
Scientific and Technical Personnel Data System of the 1980s, which was itself the successor to the National
Register. The Science and Technology Equal Opportunities Act of 1980 directs NSF to provide to
Congress and the Executive Branch an “accounting and comparison by sex, race, and ethnic group and by
discipline, of the participation of women and men in scientific and engineering positions.”
The America COMPETES Reauthorization Act of 2010 established within NSF a National Center for
Science and Engineering Statistics, and reaffirmed that it serve as “…a central Federal clearinghouse for the
collection, interpretation, analysis, and dissemination of objective data on science, engineering, technology,
and research and development.” The SDR provides information on the training, career, and educational
development of the nation’s U.S.-trained doctorate recipients with SEH degrees, an important component
of the U.S. science and engineering workforce. These data enable NSF to fulfill the legislative
requirement to act as a clearinghouse for current information on the SEH workforce.

A.2 Uses of Information
SDR data are used in assessing the quality and supply of the nation’s SEH personnel resources for
educational institutions, private industry, and professional organizations, as well as federal, state, and
local governments. NSF uses the information to prepare congressionally mandated biennial reports, such
as Women, Minorities, and Persons with Disabilities in Science and Engineering and Science and
Engineering Indicators.


The SDR data have been used extensively in the policy and planning activities of NSF and the National
Institutes of Health. Other federal agencies, such as the Departments of Commerce, Agriculture, Energy,
and the National Aeronautics and Space Administration, request and make use of the SDR data for a
variety of informational purposes.
Educational institutions use SDR data in establishing and modifying scientific and technical curricula,
while various industries use the information to develop recruitment and remuneration policies.
Researchers, policymakers, and others use information from the SDR to answer questions about the
doctoral SEH workforce. SDR data are used to address topics such as: the role of foreign-born scientists
and engineers; the transition from higher education to the workforce; the role and importance of
postdoctoral appointments; diversity in education and employment; and the implications of an aging
cohort of scientists and engineers as baby boomers reach retirement age. The SDR data on those living
outside the U.S. allow economists and policy analysts to better understand the migration patterns,
productivity, and employment concerns of the most highly trained individuals potentially able to return to
the U.S. workforce.
Findings from the 2015 SDR will enable NCSES to continue reporting employment patterns of recent
SEH doctorate recipients, as well as more experienced doctorate recipients in the labor market. The
expanded sample size will allow NCSES for the first time to produce reliable estimates of employment
outcomes by the fine field of degree taxonomy used in the Survey of Earned Doctorates (SED). The SED
gathers information yearly from all new research doctorates awarded by U.S. institutions. Detailed
information about the SED can be found at http://www.nsf.gov/statistics/srvydoctorates/.
The National Science Board reports SDR data on the state of SEH doctorates in Science and Engineering
Indicators. NSF’s Education and Human Resources Directorate uses SDR data in the evaluation and
development of programs, and other NSF research directorates use SDR to analyze SEH employment
pathways.
Without these data, NSF staff, as well as researchers and policymakers, would be less well informed in
carrying out their responsibilities. The SDR data are made available through published reports,
the SESTAT online data system, public-use files, and restricted-use licenses.
The Committee for Equal Opportunity in Science and Engineering (CEOSE), an advisory committee to
NSF and other government agencies, established under 42 U.S.C. §1885c, has been charged by the U.S.
Congress with advising NSF in assuring that all individuals are empowered and enabled to participate
fully in science, mathematics, engineering and technology. Every two years CEOSE prepares a
congressionally mandated report that makes extensive use of the SESTAT data to highlight key areas of
concern relating to students, educators, and technical professionals. Similarly, ad hoc committees
convened by the National Research Council of the National Academies (advisors to the nation on Science,
Engineering, and Medicine) have used SDR and SESTAT data in Committee reports such as the
Committee on Gender Differences in Careers of Science, Engineering, and Mathematics Faculty’s 2009
report “Gender Differences at Critical Transitions in the Careers of Science, Engineering, and
Mathematics Faculty.”
Information from the SDR was presented at the Organisation for Economic Co-operation and
Development conference in December 2012, “Understanding and improving the contribution of doctoral
graduates to innovation and the economy: Developing the statistical evidence.”
(http://www.oecd.org/sti/inno/CDH%20final%20conference%20report.pdf)


NSF publications using SDR data (all NSF publications can be accessed on the NCSES website at
http://www.nsf.gov/statistics) include:
Congressionally mandated reports –
Science & Engineering Indicators 2014
Women, Minorities, and Persons with Disabilities in Science and Engineering 2015
Other NCSES publications –
Biennial report series: Characteristics of Doctoral Scientists and Engineers in the United States
Annual report series: Science and Engineering State Profiles
Unemployment among Doctoral Scientists and Engineers Increased but Remained Below the
National Average (April 2014)
Employment and Educational Characteristics of Scientists and Engineers (January 2013)
International Mobility and Employment Characteristics among Recent Recipients of U.S.
Doctorates (October 2012)
Racial and Ethnic Diversity among U.S.-Educated Science, Engineering, and Health Doctorate
Recipients: Methods of Reporting Diversity (January 2012)
Academic Institutions of Minority Faculty with Science, Engineering, and Health Doctorates
(October 2011)
The End of Mandatory Retirement for Doctoral Scientists and Engineers in Postsecondary
Institutions: Retirement Patterns 10 Years Later (December 2010)
A.2.1 Data Dissemination and Access
Since 1993, the SDR data have been incorporated into SESTAT. The data are available as separate standalone public-use files, as a component of the SESTAT public-use data files, and as restricted use files
licensed by NCSES. The SESTAT data tool allows users to create customized data tabulations in subject
areas of their interest. The SESTAT Home Page can be accessed at http://www.nsf.gov/statistics/sestat.
SDR and SESTAT data are presented at conferences and professional meetings, such as the annual
meetings of the Association for Institutional Research, the American Association for Public Opinion
Research, and the American Educational Research Association.
Since 2007, NCSES has distributed more than 2,000 copies of SDR public-use files (2003, 2006, 2008,
2010, and 2013 survey cycles), as well as over 4,700 copies of the SESTAT public-use files (1993-2010
survey cycles). There are currently 50 restricted-use licenses active for the SDR. Additional licensing
requests for the SDR are pending review and approval by NCSES.
Recent examples of use of the SDR data include the following:
Selected Presentations:
Balancing Timeliness, Data Quality and Cost – by Optimizing Data Collection Strategies, Joint
Statistical Meetings, August 2014.


Belt and Suspenders: Evaluating the Efficacy of Sending Initial Contacts via Email Only vs.
Letter-Plus-Email to Online Responders in the Survey of Doctorate Recipients, American
Association for Public Opinion Research, May 2014.
A “Green” Appeal: Efficacy Evaluation of Assigning Sample Members that Prefer the USPS Mail
Mode to the Online Mode in the 2013 Survey of Doctorate Recipients, American Association
for Public Opinion Research, May 2014.
Preparing Graduate Students for Non-Academic Careers, American Association of Physics
Teachers Meeting, January 2014.
OECD/UNESCO Institute for Statistics/Eurostat Careers of Doctorate Holders (CDH) Project,
The Organisation for Economic Co-operation and Development, December 2012
Integration of the National and International 2008 SDR: Bridging Effects and Expected
Improvements to the Time Series Data, Joint Statistical Meetings, August 2012.
Development of the Sample Design for the International Survey of Doctorate Recipients, Joint
Statistical Meetings, August 2012.
Migration Patterns of U.S. Trained Doctorate Holders (A Longitudinal Study), Joint Statistical
Meetings, August 2012.
Utilizing a Logistic Regression Approach for Weighting Adjustment in a Longitudinal Dataset,
Joint Statistical Meetings, August 2012.
Coping with Missing Data: Assessing Methods for Logically Assigning Race and Ethnicity,
American Association for Public Opinion Research, May 2012.
Science and Engineering Doctorate Recipients as Adjunct Faculty: New Findings from the Survey
of Doctorate Recipients, American Educational Research Association, April 2012.
An Investment in Goodwill or Encouraging Delays? Examining the Effects of Incentives in a
Longitudinal Study, Federal Committee on Statistical Methodology Annual Meeting, January
2012.
Selected Citations of SDR data in other sources:
Interdisciplinary Research and the Early Career: The Effect of Interdisciplinary Dissertation
Research on Career Placement and Publication Productivity of Doctoral Graduates in the
Sciences, Research Policy 42(5):1152-1164, June 2013.
Comparing Research Productivity across Disciplines and Career Stages, Journal of Comparative
Policy Analysis 15(2):141-163, April 2013.
Increasing the Visibility of Women of Color in Academic Science and Engineering: Professional
Society Data. New Directions for Higher Education, 2013(163):7-21, 2013.
Contributions of Foreign-Born Faculty to Doctoral Education and Research. New Directions for
Higher Education, 2013(163):89-98, 2013.
Beyond Anecdotes: A Quantitative Examination of Black Women in Academe. The Review of
Black Political Economy, July 2012.
Disparities in Publication Patterns by Gender, Race and Ethnicity Based on a Survey of a
Random Sample of Authors. Scientometrics, 2012 (November):1-20.
Education and Career Outcomes for Women of Color in Academia, National Academies’
Conference Seeking Solutions: Maximizing American Talent by Advancing Women of Color
in Academia, 2012.


A.3 Consideration of Using Improved Technology
The 2015 SDR will collect data using three modes:

• Self-administered online surveys via the Internet (Web or online);
• Paper self-administered questionnaires (mail); and
• Computer-assisted telephone interviews (CATI).

Prior to the 2003 survey cycle, SDR data were collected by first mailing paper questionnaires to sample
members, then following up the nonrespondents by telephone. In the 2003 SDR, the tri-mode data
collection effort including mail, CATI, and Web was tested and has been fully implemented in all of the
rounds since (2006, 2008, 2010, and 2013). The 2015 survey cycle will continue this protocol.
Since 2003, there has been a steady increase in participation via the Web: in 2008, over 57 percent of
sample members completed the survey online; in 2010, that figure rose to 63 percent; and in 2013, it rose
again to 75 percent. Of the respondents who answered the 2013 survey mode preference question and
selected a specific mode, 80 percent indicated a preference for the online survey in future cycles.
Analysis indicates that the online mode results in higher response rates, as well as more complete survey
and contacting data, than the mail mode.
For returning sample members, the 2015 SDR will honor mode preferences reported in the 2013 SDR but
also emphasize the efficiency of completing the survey via the Web. The majority of cases new to the SDR, which
make up a large proportion of the sample due to the sample redesign, will be started in the online survey
mode. Eighty percent or more of the 2015 survey responses are expected to be in the online mode. The
2015 online instrument will also be configured for use on mobile devices (e.g., smartphones and tablets)
to ensure that the respondent experience is optimized regardless of the screen size or browser used to
access the survey.
The 2015 data collection effort will also use a comprehensive computerized case management system to
track data capture across the three modes (Web, mail, CATI). Optical scanning will be used to capture
digital images of the mail questionnaires after keying. The images will be stored in a database for archival
purposes.

A.4 Efforts to Identify Duplication
Some overlap exists between the SDR and the target population and content of NCSES’ Early Career Doctorates
Survey (ECDS) (OMB Control No. 3145-0235). The ECDS builds its sample by obtaining employee lists from U.S.
academic institutions, Federally Funded Research and Development Centers, and NIH Intramural
Research Programs, and includes individuals who received their first doctorate in the U.S. or abroad
within the last ten years. In contrast, the SDR includes sample members up to age 76; SDR sample
members may have received their doctorate degrees as many as 50 years earlier. The SDR is a probability sample
and surveys sample members regardless of where they currently reside or work, including residing or
working outside of the U.S. The SDR surveys individuals working full or part time at any type of
employer, and individuals not working due to retirement or other reasons.
Overlap exists in the target populations for the NSCG and the SDR. It is estimated, based on the 2013
overlap, that as many as 600 individuals may be selected for sample in both the 2015 NSCG and the
2015 SDR. Given recent changes to the NSCG questionnaire content, there are notable differences in

2015 SDR OMB Supporting Statement

Page 6

the information collected on the NSCG and SDR. Examples of topics planned for collection on the
2015 NSCG, but not on the 2015 SDR include attainment of certifications and licenses, financial
support for education, and community college enrollment. Due to the content differences between the
surveys, the relatively small number of expected duplicates, and the operational challenges of the
deduplication process, NCSES will not deduplicate individuals selected for sample in both the NSCG
and SDR in the 2015 survey cycle.
Data from the Census Bureau’s Current Population Survey and the American Community Survey
(ACS) are intended to provide occupational estimates, and they provide estimates of degree field earned
only at the bachelor’s level. There is no similar information available on the doctorate-holding population
that may be used, modified, or made comparable to the SDR.

A.5 Efforts to Minimize Burden on Small Business
Not applicable. The SDR collects information from individuals only.

A.6 Consequences of Less Frequent Data Collection
Conducting the SDR on a less frequent basis would prohibit NSF from meeting its congressional
mandate to produce a report that contains an accurate accounting and comparison, by sex, race, and
ethnic group and by discipline, of the participation of women and men in scientific and engineering
positions. The SDR data are central to the analysis presented in the congressionally mandated report,
Women, Minorities, and Persons with Disabilities in Science and Engineering. SDR data are used
extensively in the National Science Board report, Science and Engineering Indicators. Both of these
reports are published on a biennial schedule, and rely on the availability of updated data on the science
and engineering workforce every two years. In addition to not having recent data for these reports,
government, business, industry, and universities would also have less recent data to use as a basis for
formulating the nation’s science and engineering policies.

A.7 Special Circumstances
Not applicable. This data collection does not involve any of the reporting requirements listed.

A.8 Federal Register Announcement and Consultations Outside the Agency
A.8.1 Federal Register Announcement
The Federal Register Notice for the SDR appeared on August 11, 2014 (See Attachment B). No public
comments were received in response to the announcement by the closing date of October 10, 2014.
A.8.2 Consultations Outside the Agency
The Human Resources Experts Panel (HREP) serves as a subcommittee of the NSF Directorate for Social,
Behavioral, and Economic Sciences Advisory Committee. HREP advises NCSES on priorities and
strategies for ongoing activities to improve the relevance of current and future statistics produced by
NCSES’ Human Resources Statistics (HRS) program. The standing HREP consists of 15 rotating
members who serve a 3-year term and are broadly representative of stakeholders with an interest in S&E
human resources, such as:
• Current data users, including NCSES restricted-use data licensees
• Potential data users
• Policy makers from various levels of government
• Professional organizations and foundations, such as the American Institute of Physics (AIP), Council of Graduate Schools (CGS), and the American Association for the Advancement of Science (AAAS)
• Research organizations that use human resources data, such as the National Bureau of Economic Research (NBER) and the National Academy of Sciences (NAS)
• Current respondents to the surveys/projects conducted by HRS
• Large and small institutions of higher education, including both public and private institutions
• Industry
• Human resources professionals

HREP accomplishes its mission by: 1) suggesting methods to publicize and promote the data; 2)
providing advice on efforts to improve the timeliness and accuracy of SEH labor force data; 3) providing
a mechanism for obtaining ongoing input from both researchers and policy analysts interested in SEH
personnel data; 4) providing perspectives on the data needs of decision makers; 5) identifying issues and
trends that are important for maintaining the relevance of the data; 6) identifying ways in which SEH
personnel data could be more useful and relevant for analyses; and 7) proposing ways to enhance the
content of the NCSES human resources surveys. HREP has met 7 times since it was convened in 2007.

A.8.3 Meetings and Workshops on Redesign Activities
A series of meetings and workshops on various issues related to a SESTAT redesign and survey
methodology has been held since 2013.
For the 2015 survey round:


• Two HREP meetings were held in August 2013 and January 2014 with the following goals:
   o To enrich the HRS understanding of how the education and careers of the S&E workforce are evolving;
   o To identify salient characteristics of the evolving S&E education/career pathways that can be incorporated into HRS surveys;
   o HREP Members attending the August 2013 and January 2014 Workshops were as follows:


Nathan Bell
Associate Director, Education Research & Policy
American Educational Research Association

Brian Hartz
Vice President of Client Services
TORQworks

Roman Czujko
Director, Statistical Research Center
American Institute of Physics

Beverly Karplus Hartline
Vice Chancellor for Research and Graduate Studies
Montana Tech

Ronni Denes
President and Executive Director
New Jersey SEEDS

Cheryl Leggon
Associate Professor, School of Public Policy
Georgia Institute of Technology

Catherine Didion
Senior Program Officer
National Academy of Engineering
Director, Committee on Women in S&E
National Academies

Sharon Levin
Professor of Economics
University of Missouri, St. Louis

Earnestine Psalmonds Easter
Program Director, Division of Graduate Education
National Science Foundation

Cary Funk
Senior Researcher
Pew Research Center

Donna Ginther
Professor of Economics
University of Kansas

Duncan McBride
Program Director, Division of Undergrad Ed.
National Science Foundation

Catherine Millett
Research Scientist
Educational Testing Service

Cathee Johnson Phillips
Executive Director
National Postdoctoral Association

George Wimberly
Director, Professional Development/Social Justice
American Educational Research Association

• A third HREP meeting was held in June 2014. The objectives of this meeting were:
   o To become better informed about:
      - Research questions and policy issues concerning job mobility, occupational change, and career pathways that currently engage researchers and policymakers, particularly as these questions and issues relate to the S&E workforce;
      - How survey data are used to study the research questions and policy issues, and the limitations of these data;
      - Best practices for collecting occupational history data in the context of different longitudinal study designs;
   o To identify other important characteristics of occupational history that can be incorporated into HRS surveys.
   o HREP Members attending the June 2014 Workshop were as follows:

Jake Bartolone
Senior Research Scientist
National Opinion Research Center

Albert Sumell
Associate Professor of Economics
Youngstown State University

Kirk Doran
Assistant Professor of Economics
University of Notre Dame

Omari Swinton
Assistant Professor of Economics
Howard University

Donna Ginther
Professor of Economics
University of Kansas

John Bound
Professor of Economics
University of Michigan

Shulamit Kahn
Associate Professor of Public Policy & Law
Boston University

Charlie Brown
Professor of Economics
University of Michigan

Morris Kleiner
Professor of Public Affairs/Industrial Relations
University of Minnesota

Pamela Herd
Professor of Public Affairs and Sociology
University of Wisconsin-Madison

Iourii Manovskii
Associate Professor of Economics
University of Pennsylvania

Sheila Kirby
Senior Fellow
National Opinion Research Center

Erika McEntarfer
Supervisory Economist
U.S. Census Bureau

Cheryl Leggon
Associate Professor, School of Public Policy
Georgia Institute of Technology

Donna Rothstein
Research Economist
Bureau of Labor Statistics

Audrey Light
Professor of Economics
Ohio State University

Hal Salzman
Professor of Planning and Public Policy
Rutgers, The State University of New Jersey

Mike Pergamit
Senior Fellow
Urban Institute

Marc Scott
Associate Professor of Applied Statistics
New York University

Jeff Strohl
Senior Research Fellow
Georgetown University

John Skrentny
Professor of Sociology
University of California at San Diego

Josh Trapani
Director of Policy Analysis
Association of American Universities

• An Expert Panel of Sampling Statisticians was held in December 2014. The objectives of this meeting were:
   o To discuss sample redesign options for the 2015 SDR.
   o To determine which design approach to implement.
   o The Statistical Experts attending the December 2014 meeting were as follows:

Rachel Harter
Senior Research Statistician
RTI

Frauke Kreuter
Professor in the Joint Program in Survey Methodology
The University of Maryland, USA, and
Professor of Statistics
Ludwig-Maximilians-Universität, Germany

Michael Larsen
Associate Professor in the Department of Statistics and Biostatistics Center
George Washington University

Jill Montaquila
Associate Research Professor in Joint Program in Survey Methodology (JPSM)
The University of Maryland, and
Associate Director of the Statistical Staff and a Senior Statistician
Westat

A.8.4 Consultations for Outreach and Dissemination
To maintain the relevance of the SESTAT surveys and to obtain ongoing input from the public and
researchers, NCSES engaged in the following activities.
For the 2010 and 2013 survey rounds:
1. NCSES convened an HREP to help improve data collection on the SEH labor force through
review and renewal of the SEH personnel surveys and to promote use of the data for research and
policy analysis purposes.
2. ASA/AAPOR invited an NCSES analyst to present a webinar on science and technology human
resources surveys, data and indicators; the SESTAT data are the source for all of the major
indicators and trends on this workforce.

A.9 Payment or Gifts to Respondents
Incentives were initially introduced into the SDR data collection protocol during the 2003 cycle, and
have been incorporated into the data collection plan for all subsequent cycles. Described below are the
proposals to offer both early and late stage incentives. During the early phases of data collection,
incentives will be offered to a selected set of the sample described below. During the later phase of data
collection, an incentive plan will be implemented similar to the ones used in the 2008, 2010, and 2013
SDR.
A.9.1 Proposed Plan for the 2015 SDR
Early-Stage Incentive. The early-stage incentive will target three types of sample members: 1) those
who have only responded after being incentivized in prior rounds, 2) new cohort sample members who
are recent graduates (earning their degree in 2012 and 2013), and 3) sample members who are
underrepresented minorities (URM) earning their degree in 2011 and earlier, including expansion
sample members.
Early incentives will not be offered to all new sample cases; they will be offered to all members of the
“new cohort” who do not respond to the initial request to participate in the SDR. The new cohort, however,
does not include all of the sample cases new to the SDR; it is one of three primary sample components in 2015:
(1) Panel: Individuals included in the 2013 SDR sample and selected for the 2015 SDR.
(2) New cohort: Individuals who received their doctorate in the academic years 2012 and 2013
(new-to-the-SDR).
(3) Expansion sample: Individuals who were not in the 2013 SDR sample and who received their
doctorate in the academic years 1961-2011 (new-to-the-SDR).


Early incentives will be offered to each sample component as described below:
1. Panel: Two subgroups of the panel will be eligible for an early incentive offer:
   a. Those who only participated in the prior survey rounds after receiving an incentive will receive an incentive with their initial request to participate.
   b. Underrepresented minorities (URM) who do not respond to the initial request to participate in the survey will receive an incentive offer with the second request.
2. New cohort: Any new cohort sample member who does not respond to the initial request will receive an incentive offer with the second request.
3. Expansion component: URM who do not respond to the initial request will receive an incentive offer with the second request.
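
The eligibility rules above can also be summarized in a short sketch. This is an illustration only: the case dictionary, field names, and helper function are hypothetical, and the offers at the second contact apply only when the case has not responded to the initial request.

```python
def early_incentive_offer(case):
    """Return the contact number (1 = first, 2 = second) at which an early-stage
    incentive would be offered, or None if the case is not eligible. `case` is a
    hypothetical dict with keys 'component' ('panel', 'new_cohort', 'expansion'),
    'urm' (bool), and 'incentive_only_responder' (bool; prior-round history,
    relevant to panel cases only)."""
    component = case["component"]
    if component == "panel":
        if case.get("incentive_only_responder"):
            return 1          # incentive included with the initial request
        if case.get("urm"):
            return 2          # offered with the second request if no response
        return None
    if component == "new_cohort":
        return 2              # any nonrespondent is offered at the second contact
    if component == "expansion":
        return 2 if case.get("urm") else None
    return None

print(early_incentive_offer({"component": "panel", "urm": False,
                             "incentive_only_responder": True}))    # -> 1
print(early_incentive_offer({"component": "expansion", "urm": False}))  # -> None
```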

Sample members who have historically only responded with an incentive will be offered a monetary
incentive in the first contact to encourage a faster response and to reduce the costs associated with
follow-up contacts. The rationale for this approach is based on the 2013 SDR data collection
experience. Among sample members who had consistently participated only after receiving an incentive
in past survey cycles, 69.7 percent completed the 2013 survey after receiving a late-stage request for
survey participation with an incentive offer, while only 37.6 percent completed the 2013 survey after
receiving a late-stage request without an incentive.
The incentive experiments conducted in 2006 and 2008¹ indicated that offering a prepaid incentive in
the second contact was a cost-effective way of encouraging survey response. Further analysis in 2010²
indicated that the monetary incentive had a positive conditioning effect on response propensity in the
subsequent round. Therefore, the NSF proposes to offer a monetary incentive to all URM sample
members who are in the panel or new-to-the-SDR; this incentive will be included in the second contact.

¹ “2008 Survey of Doctorate Recipients New Cohort Incentive Experiment” issued to NSF by Karen Grigorian and Shana Brown, NORC, May 28, 2010.
² “2010 Survey of Doctorate Recipients Late-Stage Incentive Program Results” issued to NSF by Karen Grigorian et al., NORC, January 4, 2013.

Late-Stage Incentive. The overall strategy for the late-stage incentive is to ensure that all sample
members who have been subject to the standard survey data collection protocols and still remain as
survey nonrespondents will have a probability of receiving a monetary incentive. In the plan used for
the 2008, 2010, and 2013 SDR, and again proposed here, a greater probability of selection for the
incentive will be given to cases in those sampling cells with relatively lower response rates, in order to
improve the accuracy of survey estimates (given that the sampling cells are aligned with the domains of
interest for analysis). This is consistent with an adaptive design data collection strategy.
To allocate its available limited resources for the monetary incentive to late-stage survey
nonrespondents most effectively, there will be an analysis of the characteristics of the remaining
nonrespondents using a logistic regression model to determine which types of sample members should
receive additional inducement to mitigate response bias; the cases with lowest response propensity will
be selected for the incentive with certainty. Approximately a third of late-stage nonresponse cases will
be incentivized with certainty. From the remaining nonresponding cases (both found and in locating),
15 percent will be selected to receive the incentive. In this way, all late-stage nonresponding sample
members will have a chance of receiving the incentive, while resources are strategically targeted to
reduce bias according to an adaptive design strategy.
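
As a rough illustration of the selection logic described above, and not the contractor's production system, the sketch below fits a response-propensity model on hypothetical covariates, incentivizes the lowest-propensity third of remaining nonrespondents with certainty, and then selects 15 percent of the rest at random.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2015)

# Hypothetical frame covariates (e.g., cohort, demographic group, prior-round
# response history) and response status to date for the full sample.
n_sample = 5_000
X = rng.normal(size=(n_sample, 4))        # placeholder covariates only
y = rng.binomial(1, 0.6, size=n_sample)   # 1 = has responded so far

# Fit a response-propensity model on the cases worked to date.
model = LogisticRegression().fit(X, y)

# Score the remaining nonrespondents and rank them from lowest to highest propensity.
nonresp = np.where(y == 0)[0]
propensity = model.predict_proba(X[nonresp])[:, 1]
order = np.argsort(propensity)

# Lowest-propensity third receives the incentive offer with certainty...
n_certain = len(order) // 3
certain = nonresp[order[:n_certain]]

# ...and 15 percent of the remaining nonrespondents are selected at random.
remainder = nonresp[order[n_certain:]]
random_pick = rng.choice(remainder, size=int(round(0.15 * len(remainder))), replace=False)

incentive_cases = np.concatenate([certain, random_pick])
print(f"{len(incentive_cases)} of {len(nonresp)} nonrespondents offered the late-stage incentive")
```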
After the 2015 SDR, an analysis of the effectiveness of incentives, particularly on the expansion cohort,
will be conducted. This analysis will be the basis for determining whether to keep the incentives in future
rounds, modify the incentive plan, or eliminate incentives from the SDR.

A.9.2 Incentive Costs
According to this plan, a $30 prepaid incentive would be offered for the 2015 SDR, as was done for the
2008, 2010, and 2013 SDR. The total cost of incentives in the 2013 SDR was $90,000; in 2015, the total is
expected to cost $210,000. The complete incentive plan for 2015 is in section B.3.4.

A.10 Assurance of Confidentiality
NCSES and its contractors are fully committed to protecting the confidentiality of all survey
respondents. SDR data will be collected under the authority of the America COMPETES Reauthorization
Act of 2010 and the Confidential Information Protection and Statistical Efficiency Act (CIPSEA) of
2002. Cover letters and survey questionnaires to each selected respondent will advise them that the
information they provide is confidential (see Attachment D – Draft 2015 SDR Survey Mailing
Materials and Attachment C – Draft 2015 SDR Questionnaire). The same notice of confidentiality will
be used in the introduction to the CATI interview and will be displayed prior to the start of the survey
in the online instrument.
Standard data collection procedures incorporate numerous safeguards for the data and must conform to a
detailed security plan approved by NCSES. While collecting SDR data, the information that could
identify a particular sample member is separated from data about that person. Each sample member is
assigned a unique identifier, and this identifier is used to store identifying information (such as name,
address, etc.) in a separate, secure database apart from the survey response database. SDR contractors and
NCSES staff receive annual CIPSEA training to reinforce their legal obligations to protect the privacy
and confidentiality of the SDR data; staff must sign data use agreements annually to acknowledge this
legal obligation.
Completed SDR hard copy questionnaires and other contact materials will be housed in a secure storage
room at the contractor’s production facility. Only authorized staff – and only when necessary for data
collection activities – will have access to hard copy materials from the SDR file room. The contractor’s
electronic systems will be on a secure local area network (LAN), and all contractor systems for storage of
electronic survey data will be secure by design and will be protected by passwords available only to
authorized study staff.
The contractor will implement systems to make certain that data collected via the online questionnaire are
secure. First, access to the online instrument will be allowed only with a valid Personal Identification
Number (PIN) and password correctly entered in combination. Second, data will be transmitted by the
Secure Sockets Layer (SSL) protocol that employs powerful encryption during transmission through the
Internet. If a respondent keeps an online survey open without any activity, the online server will close the


connection after a short period of inactivity, both preserving the data up to the break-off point and
preventing unauthorized persons from completing the questionnaire. The online survey system will place
authentication information and response data on physically separate servers, a strategy that provides an
extra layer of security to protect response data. Both development and production servers will be backed
up nightly as required by the contractor’s disaster recovery plan.
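
The following minimal sketch illustrates the session behavior described above (timeout after inactivity, preservation of partial data, and separation of authentication data from response data); the class, timeout value, and field names are hypothetical and are not the contractor's actual system.

```python
import time

SESSION_TIMEOUT_SECONDS = 20 * 60   # hypothetical inactivity window

class SurveySession:
    """Illustration only: partial answers are preserved at break-off and the
    session can no longer be used once it has timed out."""

    def __init__(self, pin, password, credential_store, response_store):
        # Credentials and response data live in separate stores, mirroring the
        # separate-server arrangement described in the text.
        if credential_store.get(pin) != password:
            raise PermissionError("invalid PIN/password combination")
        self.pin = pin
        self.response_store = response_store
        self.answers = {}
        self.last_activity = time.monotonic()
        self.closed = False

    def record_answer(self, question_id, value):
        if self.closed or self._expired():
            self._close()
            raise TimeoutError("session closed after inactivity; partial data saved")
        self.answers[question_id] = value
        self.last_activity = time.monotonic()

    def _expired(self):
        return time.monotonic() - self.last_activity > SESSION_TIMEOUT_SECONDS

    def _close(self):
        if not self.closed:
            # Preserve everything entered up to the break-off point.
            self.response_store[self.pin] = dict(self.answers)
            self.closed = True

# Hypothetical usage:
creds = {"A1B2C3": "s3cret"}   # PIN/password pairs
saved = {}                     # partial or complete responses keyed by PIN
session = SurveySession("A1B2C3", "s3cret", creds, saved)
session.record_answer("A1", "Yes")
```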
NCSES and its contractors will analyze and make available SDR data only in aggregate form and will
take all measures to assure that the identity of individuals or organizations will not be disclosed.

A.11 Justification for Sensitive Questions
No questions of a sensitive nature are asked in this data collection.

A.12 Estimate of Respondent Burden
A statistical sample of approximately 120,000 persons, identified as having a doctorate in an SEH field
from a U.S. academic institution, will be selected for the 2015 SDR. This sample will include
approximately 106,000 individuals residing in the U.S. and 14,000 residing abroad. The amount of time to
complete the questionnaire may vary depending on an individual’s circumstances; however, on average it
will take approximately 25 minutes to complete the survey. Assuming a 70 percent response rate (84,000
respondents), the total burden for the 2015 SDR is estimated to be 35,000 hours.
The total cost to respondents for the 35,000 burden hours is estimated to be $1,493,978. This is based on
an estimated median annual salary of $88,785 per full-time employed SDR respondent from the 2013
SDR data. Assuming a 40-hour workweek over 52 weeks of employment, this annual salary corresponds
to an hourly rate of $42.69.
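
The burden and cost estimates above follow from simple arithmetic; the short sketch below reproduces the cited figures, with the final dollar amount reflecting use of the unrounded hourly rate.

```python
# Figures from Section A.12, reproduced for arithmetic verification.
sample_size = 120_000
response_rate = 0.70
minutes_per_survey = 25

respondents = round(sample_size * response_rate)          # 84,000 respondents
burden_hours = respondents * minutes_per_survey / 60      # 35,000 hours

median_salary = 88_785                                    # 2013 SDR, full-time employed
hourly_rate = median_salary / (40 * 52)                   # about $42.69 per hour
respondent_cost = burden_hours * hourly_rate              # about $1,493,978

print(respondents, burden_hours, round(hourly_rate, 2), round(respondent_cost))
```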

A.13 Cost Burden to Respondents
Not applicable. This survey will not require respondents to purchase equipment or software, nor to
contract out services.

A.14 Cost Burden to the Federal Government
The total estimated cost to the Government for the 2015 SDR is $17.7 million for survey cycle costs and
for staff costs to provide oversight and coordination with the other SESTAT survey. The cost estimate for
the survey cycle is $17.1 million, which is based on sample size; length of questionnaire; CATI and online
data collection technology; administrative, overhead, design, printing, mail and telephone data collection
costs; incentive payments; critical items data retrieval; data keying and editing; data quality control;
imputation for missing item responses; weighting and estimating sampling error; file preparation and
delivery; preparation of documentation and final reports; and analysis and tabulations. NCSES staff costs are
estimated at $562,500 ($150,000 annual salary of 1.5 FTE for 2.5 years of the 2015 SDR survey cycle).

A.15 Reason for Change in Burden
The 2015 SDR will include a significantly larger sample size (from 47,078 in 2013 to 120,000 in 2015) to
accommodate analyses by SED fine field. The change in burden hours from the 2013 SDR reflects the
increase in the total SDR sample size.


A.16 Schedule for Information Collection and Publication
There are no plans to use any complex analytical techniques in NCSES publications using these data.
Normally, SDR data are presented as cross-tabulations in reports and other data releases. The
time schedule for 2015 data collection and publication is currently estimated as follows:
Data Collection (Mail, CATI, online): September 2015 – March 2016
Coding and Data Editing: September 2015 – July 2016
Final Edited/Weighted/Imputed Data File: August 2016
SDR InfoBrief: Spring 2017
SDR Detailed Statistical Tables: Spring 2017
SDR Public Use File: Spring 2017

A.17 Display of OMB Expiration Date
The OMB Expiration Date will be displayed on the 2015 SDR questionnaire. In the online survey, it will be
included on the informed consent page and will be available in a help screen accessible at any point in the
survey; in the telephone interview, it will be read to sample members during the introductory informed
consent.

A.18 Exception to the Certification Statement
Not Applicable.
