National Longitudinal Survey of Youth 1979
1220-0109
Expiration Date: 9/30/2020
Information Collection Request for
The National Longitudinal Survey of Youth 1979
OMB # 1220-0109
Summary and Part A
Submitted by the Bureau of Labor Statistics
TABLE OF CONTENTS
Summary
Supporting Statement
A. Justification
1. Necessity for the Data Collection
2. Purpose of Survey and Data-Collection Procedures
3. Improved Information Technology to Reduce Burden
4. Efforts to Identify Duplication
5. Involvement of Small Organizations
6. Consequences of Less Frequent Data Collection
7. Special Circumstances
8. Federal Register Notice and Consultations
9. Payment to Respondents
10. Confidentiality of Data
11. Sensitive Questions
12. Estimation of Information Collection Burden
13. Cost Burden to Respondents or Record Keepers
14. Estimate of Cost to the Federal Government
15. Change in Burden
16. Plans and Time Schedule for Information Collection, Tabulation, and Publication
17. Reasons Not to Display OMB Expiration Date
18. Exceptions to “Certificate for Paperwork Reduction Act Submissions”
This is a 2-year revision clearance request for Round 29 of the National Longitudinal Survey of Youth 1979 cohort (NLSY79). The sample includes 9,964 persons who will be 55 to 62 years old on December 31, 2019. Approximately 10 percent of the sample members are deceased. The NLSY79 is a representative national sample of adults who were born in the years 1957 to 1964 and lived in the U.S. when the survey began in 1979. The sample contains an overrepresentation of black and Hispanic respondents to ensure a sample size sufficient to permit racial and ethnic analytical comparisons. Appropriate weights have been developed so that the sample components can be combined to aggregate to the overall U.S. population of the same ages, excluding those who have immigrated since 1978.
This package also seeks clearance to interview children born to female NLSY79 respondents. The Children of the NLSY79 have been assessed since 1986, when the National Institutes of Health (NIH) National Institute of Child Health and Human Development (NICHD) began sponsoring a set of supplemental surveys to gather a large amount of information about the lives of these children. A battery of child cognitive, socio-emotional, and physiological assessments has been administered biennially since 1986 to NLSY79 mothers and their children. Starting in 1994, children who had reached age 15 by December 31 of the survey year (the Young Adults) were interviewed about their work experiences, training, schooling, health, fertility, self-esteem, and other topics. By 2016, the sample included very few children age 14 and under and the separate child survey was discontinued; at that point children age 12 and older joined the Young Adults. The Young Adult group will include 989 respondents ages 12-24 and 4,703 respondents age 25 and older in Round 29.
The main NLSY79 is funded primarily by the Bureau of Labor Statistics (BLS). Funding for the NLSY79 Child and Young Adult surveys is provided by NICHD through an interagency agreement with the BLS and through a grant awarded to researchers at the Ohio State University Center for Human Resource Research (CHRR). The interagency agreement funds data collection for children and young adults up to age 24; the grant funds data collection for young adults age 25 or older. The BLS has overall responsibility for the project. The BLS contracts with CHRR and the National Opinion Research Center (NORC) at the University of Chicago to conduct the surveys. NORC handles the interviewing, initial data preparation, and weighting. Questionnaire design, additional data cleanup and preparation, development of documentation, and preparation of data files are handled by CHRR.
The data collected in this survey are part of a larger effort that involves repeated interviews administered to a number of cohorts in the U.S. Many of the questions are identical or very similar to questions previously approved by OMB that have been asked in other cohorts of the National Longitudinal Surveys. Many of the questions in the NLSY79 have been designed to reflect the changing nature of institutions and the different problems facing these groups of people.
National Longitudinal Survey of Youth 1979
A Survey of Persons who were Ages 14 to 21 on December 31, 1978
Rationale, Objectives, and Analysis of Content
This statement covers main fielding of Round 29 of the National Longitudinal Survey of Youth 1979 (NLSY79). The NLSY79 is a nationally representative sample of persons who were born in the years 1957 to 1964 and lived in the U.S. when the survey began in 1979. The longitudinal focus of the survey requires the collection of identical information for the same individuals over the life cycle and the occasional introduction of new data elements to meet the ongoing analytical needs of policymakers and researchers. Most of the information to be collected this round has been approved by OMB in previous rounds of the NLSY79. See attachment 5 for details on changes for this round.
The mission of the Department of Labor (DOL) is, among other things, to promote the development of the U.S. labor force and the efficiency of the U.S. labor market. The BLS contributes to this mission by gathering information about the labor force and labor market and disseminating it to policymakers and to the public so that participants in those markets can make more informed and, thus, more efficient choices. The charge to the BLS to collect data related to the labor force is extremely broad, as reflected in Title 29 USC Section 1:
“The general design and duties of the Bureau of Labor Statistics shall be to acquire and diffuse among the people of the United States useful information on subjects connected with labor, in the most general and comprehensive sense of that word, and especially upon its relation to capital, the hours of labor, the earnings of laboring men and women, and the means of promoting their material, social, intellectual, and moral prosperity.”
The collection of these data contributes to the BLS mission by aiding in the understanding of labor market outcomes faced by individuals in the early stages of career and family development. See attachment 1 for Title 29 USC Sections 1 and 2.
Through 1984, the NLSY79 consisted of annual interviews with a national sample of 12,686 young men and women who were ages 14 to 21 as of December 31, 1978, with overrepresentation of blacks, Hispanics, and economically disadvantaged nonblacks/non-Hispanics. The sample also included 1,280 persons serving in the military in 1978. Starting in 1985, the military sample was reduced to 201 due to a cessation of funding from the Department of Defense. Starting in 1991, interviews were discontinued with the 742 male and 901 female members of the economically disadvantaged nonblack/non-Hispanic sample. This reduced the eligible pool of sample members to 9,964. The NLSY79 was conducted annually from 1979 to 1994 and has been conducted every two years since 1994.
In addition to the regular interviews, several supplementary data-collection efforts completed during the early survey years greatly enhance the value of the survey to policymakers and researchers. The full Armed Services Vocational Aptitude Battery (ASVAB) was administered to 94 percent of sample members. Also, for a large proportion of the sample, information has been collected about the characteristics of the last high school each respondent attended, as well as the courses taken, grades, and some other personal characteristics about the respondents while attending high school.
These data-collection efforts have enabled researchers to complete studies of the relationship between background environment, employment behavior, vocational aptitudes, and high school quality. The data have helped the Departments of Labor, Defense, Education, and Health and Human Services and many congressional committees to make more knowledgeable decisions when evaluating the efficacy of programs in the areas of civilian and military employment, training, and health.
The NLSY79 is a general-purpose study designed to serve a variety of policy-related research interests. Its longitudinal design and conceptual framework serve the needs of policymakers in a way that cross-sectional surveys cannot. In addition, the NLSY79 allows a broad spectrum of social scientists concerned with the labor market problems of young baby boomers to pursue their research interests. Support for the survey by other government agencies is encouraged, as the increasingly omnibus nature of the survey makes it an efficient, low-cost data set. As noted, the survey has incorporated items needed for program and policy purposes by agencies other than the Department of Labor. In this survey round, we anticipate funding from the NICHD, the National Institute on Drug Abuse, and the National Institute on Aging.
In this survey round, information once again will be collected about the biological children of the female NLSY79 respondents. For the most part, this collection of data about the children repeats surveys administered to these children biennially from 1986 to 2018. These unique data permit medical and social science researchers to consider a large number of basic research issues relating to the effects of family background, program activities, and infant and maternal health on outcomes from early childhood through adolescence and into early adulthood. While the principal focus of the survey remains the collection of data for labor force analysis, the questionnaires administered to these children and older youths include items needed by other agencies that are not always directly related to employment and training studies. As these children reach adolescence, the focus of the surveys of these young adults returns to the school-to-work transition.
Sample sizes and the expected number of Round 29 respondents for each group are listed in table 1.
Table 1. NLSY79 Sample Size and Expected Response in 2020 (Round 29)
Cohort | Approximate sample size | Expected number of respondents
NLSY79 main | 8,931 | 6,750
Young Adults ages 12-24 | 989 | 791
Young Adults age 25 and older | 4,703¹ | 3,762¹

¹ Members of the Young Adult grant sample are contacted for interviews every other round once they reach age 31. These numbers do not include Young Adult sample members age 31 and older who will not be contacted for interviews in 2020.
The specific objectives of the NLSY79 fall into several major categories that will be further explained below:
to explore the labor market activity and family formation of individuals in this age group
to explore in greater depth than previously has been possible the complex economic, social, and psychological factors responsible for variation in the labor market experience (including retirement) of this cohort
to explore how labor market experiences explain the evolution of careers, wealth, and the preparation of this cohort for the further education of their children and for their own retirement
to analyze the impact of a changing socio-economic environment on the labor market experiences of this cohort by comparing data from the present study with those yielded by the surveys of the earlier NLS cohorts of young men (which began in 1966 and ended in 1981) and young women (which began in 1968 and ended in 2003), as well as the more recent NLSY97 cohort of young men and women born in the years 1980-84 and interviewed for the first time in 1997
to consider how intergenerational links between mothers and their children, including the employment-related activities of women, affect the subsequent cognitive and emotional development of their children, and how the development of the children affects the activities of the mother
to meet the data-collection and research needs of various government agencies with an interest in the relationships among child and maternal health, drug and alcohol use, juvenile deviant behavior, and child outcomes such as education, employment, and family experiences, as well as an interest in the relationships between health, cognition, and retirement.
The NLSY79 has several characteristics that distinguish it from other data sources and make it uniquely capable of meeting the major purposes described above. The first of these is the breadth and depth of the types of information that are being collected. It has become increasingly evident in recent years that a comprehensive analysis of the dynamics of labor force activity requires an eclectic theoretical framework that draws on several disciplines, particularly economics, sociology, and psychology. For example, the exploration of the determinants and consequences of the labor force behavior and experience of this cohort requires information about (1) the individual’s family background and ongoing demographic experiences; (2) the character of all aspects of the environment with which the individual interacts; (3) human capital inputs such as formal schooling and training; (4) a complete record of the individual’s work experiences; (5) the behaviors, attitudes, and experiences of closely related family members, including spouses and children; and (6) a variety of social-psychological measures, including attitudes toward specific and general work situations, personal feelings about the future, personality characteristics, and perceptions of how much control one has over one’s environment.
A second major advantage of the NLSY79 is its longitudinal design, which permits investigation of labor market dynamics that would not be possible with one-time surveys and allows directions of causation to be established with much greater confidence than cross-sectional analyses. Also, the considerable geographic and environmental information available for each respondent for each survey year permits a more careful examination of the impact that area employment and unemployment considerations have for altering the employment, education, and family experiences of these cohort members and their families.
Third, the oversampling of blacks and Hispanics, together with the other two advantages mentioned above, makes possible more sophisticated examinations of human capital creation programs than previously have been possible. Post-program experiences of “treatment” groups can be compared with those of groups matched not only for preprogram experience and such conventional measures as educational attainment, but also for psychological characteristics that have rarely been available in previous studies.
As has been indicated above, the study has several general research and policy-related objectives.
Attachment 3- Survey Applications
We elaborate on these basic purposes by setting forth a series of specific research themes.
Attachment 4- Analysis of Content of Interview Schedules
The detailed content of the interview schedule is then related to the themes from Attachment 3.
Attachment 5- New Questions and Lines of Inquiry
Summarizes the new questions and lines of inquiry in the proposed questionnaire.
Attachment 6- Respondent Advance Letters
Provides the advance letters that will be sent to respondents prior to data collection and the questions and answers about uses of the data, confidentiality, and burden that will appear on the back of each advance letter.
The NLSY79 is part of a broader group of surveys that are known as the BLS National Longitudinal Surveys program. In 1966, the first interviews were administered to persons representing two cohorts, Older Men ages 45-59 in 1966 and Young Men ages 14-24 in 1966. The Mature Women, ages 30-44 in 1967, were first interviewed in 1967. The last of the original four cohorts was the Young Women, who were ages 14-24 when first interviewed in 1968. The survey of Young Men was discontinued after the 1981 interview, and the last survey of the Older Men was conducted in 1990. The Young and Mature Women surveys were discontinued after the 2003 interviews. The most recent cohort added to the NLS program is the National Longitudinal Survey of Youth 1997 (NLSY97), which includes persons who were ages 12-16 on December 31, 1996.
The National Longitudinal Surveys are used by BLS and other government agencies to examine a wide range of labor market issues. The most recent BLS news release that examines NLSY79 data was published on August 22, 2019, and is available online at http://www.bls.gov/news.release/nlsoy.nr0.htm. In addition to BLS publications, analyses have been conducted in recent years by other agencies of the Executive Branch, the Government Accountability Office, and the Congressional Budget Office. The surveys also are used extensively by researchers in a variety of academic fields. A comprehensive bibliography of journal articles, dissertations, and other research that have examined data from all National Longitudinal Surveys cohorts is available at http://www.nlsbibliography.org/.
The field staff of NORC makes every effort to ensure that the information is collected in as expeditious a manner as possible, with minimal interference in the lives of the respondents. Our success in this regard is suggested by the very high continuing response rate and low item refusal rates that have been attained. More recent rounds also have introduced technologies that lower respondent burden.
During Round 11 (1989) of the NLSY79, about 300 cases were collected using a Computer Assisted Personal Interview (CAPI). During Round 12 (1990) CAPI was again used, this time for about 2,400 cases using a longer and more complex questionnaire. The CAPI efforts in 1989 and 1990 were designed to assess the feasibility of the method and the effect of CAPI methods on data quality. Since 1993, the NLSY79 has been conducted using CAPI for all cases. Note that some cases have been completed over the telephone in each round. In recent rounds, the percentage of cases completed by phone is around 96%. The system has proved to be stable and reliable and is well received by interviewers, respondents, and researchers.
An analysis of the Round 12 (1990) experimental data revealed that the quality of the data, as measured by missing or inconsistent responses, was greatly improved by CAPI. The effects on the pattern of responses were minor, although some answer patterns were affected because the CAPI technology in certain cases changed the way the questions were asked. Production data from Rounds 15–21, based on over 80,000 completed interviews, confirmed these expectations of high data quality.
In 1994 we began to administer the Child Survey assessments using CAPI. CAPI simplified the administration of these assessments and reduced interview time and respondent burden. We saw the average interview time fall throughout the field period, and quality control measures revealed the data to be in excellent condition. The Young Adult survey, begun in 1994, originally had a CAPI interview with an additional paper and pencil self-report booklet. In 2000, this booklet was converted to CAPI. In the 2014 fielding, the entire Young Adult questionnaire was conducted in CAPI format.
CAPI surveying will continue to be used in Round 29. We estimate the average interview would be 10-15 percent longer if paper and pencil were used instead of CAPI.
In Round 29, we will continue to use computer interviewing to reduce the number of validation reinterviews needed to ensure that field interviewing is accurate. The survey includes a number of questions for which interviewers should not be able to fabricate answers, such as the respondent’s date of birth, height, and weight. Entries for these questions that are inconsistent with information we already have could signal that an interviewer falsified an interview. We will then conduct reinterviews only in cases where an inconsistency suggests that a check is needed. This approach, first used in Round 20, allowed us to reduce the number of validation reinterviews from about 1,250 in Round 19 to about 200 in Round 20. In rounds 25, 27, and 28, no reinterviews were needed. In round 26, 12 validation reinterviews were conducted.
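To illustrate the kind of check involved, the sketch below flags a case for validation when key reported values disagree with data already on file. It is only an illustrative sketch under assumed field names (dob, height_in, weight_lb) and tolerances; it is not the project's actual production rule.

```python
# Illustrative sketch of the consistency check described above, not the project's
# production system. Field names, record layout, and tolerances are assumptions;
# the idea is simply to flag cases whose reported date of birth, height, or weight
# disagrees with values already on file, so that validation reinterviews are
# limited to those cases.
from dataclasses import dataclass

@dataclass
class InterviewRecord:
    case_id: str
    dob: str            # ISO date string, e.g., "1960-04-17"
    height_in: float    # height in inches
    weight_lb: float    # weight in pounds

def flag_for_reinterview(reported: InterviewRecord, on_file: InterviewRecord,
                         height_tol: float = 2.0, weight_tol: float = 25.0) -> bool:
    """Return True if the new report is inconsistent enough to warrant a validation check."""
    if reported.dob != on_file.dob:
        return True
    if abs(reported.height_in - on_file.height_in) > height_tol:
        return True
    return abs(reported.weight_lb - on_file.weight_lb) > weight_tol
```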
In Round 29, as in Round 28, we also plan to continue using an additional computerized check to ensure that an interview has taken place. Our laptops are equipped with an internal microphone and recording capability. With respondent consent, we will record the interview in its entirety. If we have any suspicions about a given interview, we can listen to the sound files and ensure that an interview is taking place. In addition to identifying any interviews that are completely falsified, the recordings can reveal when an interviewer is taking shortcuts or not reading questions as instructed or when respondents are having trouble understanding questions. We will use these recordings only to evaluate interviewer performance and validate suspicious interviews.
We will include in the introduction to the questionnaire a statement that the interview “will be recorded for quality control purposes. This will not compromise the strict confidentiality of your responses” (“CONSENT-1200” in the main NLSY79 questionnaire and “INTRO” in the young adult questionnaire). If the respondent does not want to be recorded, the laptop recording capability will be turned off immediately for interviews conducted during the regular field period.
A 1982 study entitled “National Social Data Series: A Compendium of Brief Descriptions” by Richard C. Taeuber and Richard C. Rockwell includes an exhaustive list with descriptions of the national data sets available at that time. A careful examination of all the data sets described in their listing indicates that no other data set would permit the comprehensive analyses of youth and young adult employment that can be conducted using the National Longitudinal Surveys. Indeed, the absence of alternative data sources was the deciding factor in the Department of Labor’s determination (in 1977) to sponsor the NLSY79. The longitudinal nature of the survey and the rich background information collected mean that no survey subsequently released can provide data to replace the NLSY79. The expansion in the mid-1980s of the NLSY79 to incorporate information on child outcomes represents a unique intergenerational data-collection effort.
Survey staff have continued to confirm that no comparable data set exists. An investigation into data sets related to wealth by F. Thomas Juster and Kathleen A. Kuester describes the data available in the wealth domain, showing the unique nature of the data available in the NLS.
The 1993 volume The Future of the Survey of Income and Program Participation points out a number of contrasts between the Survey of Income and Program Participation (SIPP) and the NLS and other major longitudinal studies (see especially pages 77, 107, and 265–7). This book was written primarily to review the SIPP but helps put the major longitudinal surveys in perspective.
BLS convened a conference in fall 1998 to look at the design of the NLSY79 for 2002 and beyond. In the process, external reviewers assessed the coverage, quality, and duplication of the survey. In its central areas of coverage—event histories on labor supply, major demographic events, and child assessments—this conference concluded that the NLSY79 was well-designed and continued to be a unique resource, unduplicated in the national statistical inventory.
An article by Michael Pergamit et al. (2001) discusses the strengths of the NLSY79 and prominent research that has been done using NLSY79 data. Many of these studies could not have been done otherwise, because the NLSY79 has such a breadth of information, along with a history of all jobs ever held and a cognitive test score. A later paper by Aughinbaugh et al. (2015) examines the unique strengths of the NLSY79 and NLSY97 surveys and the resulting important research.
In 2016, BLS convened a panel of aging, health, and retirement experts to give advice on the content and direction the survey needs to take in the next 10-15 years. The Retirement Working Group, led by Professor Kathleen McGarry, provided advice on survey content, linkages to administrative data, and related topics, which the NLSY79 has already begun implementing. The NLSY79 provides advantages for studying aging and retirement over other longitudinal studies, which tend to begin when respondents are much older. The NLSY79 contains a long history of information collected as events occurred. In addition, over half of the sample members have a sibling in the sample, the NLSY79 Child/Young Adult data contain assessments and interviews with children of the female respondents, and the main survey contains data on respondents’ cognition and health over the years.
Past work shows that there is no other longitudinal data set available that can effectively address the many research topics highlighted in attachment 3. This data set focuses specifically and in great detail on the employment, educational, demographic, and social-psychological characteristics of a nationally representative sample of younger baby boomers and their families and measures changes in these characteristics over long time periods. The survey gathers this information for both men and women, and for relatively large samples of black and Hispanic adults. The repeated availability of this information permits consideration of employment, education, and family issues in ways not possible with any other available data set. The combination of (1) longitudinal data covering the time from adolescence; (2) national representation; (3) large black and Hispanic samples; and (4) detailed availability of education, employment and training, demographic, health, child outcome, and social-psychological variables makes this data set, and its utility for social science policy-related research, unique.
In addition to the content of the interviews, the survey is also distinctive because of its coverage of the respondents’ lives for more than 35 years and the linkage between concurrently collected data on mothers and their children. These aspects attract the thousands of users who rely on this survey for their studies of American workers, their careers, and their families.
References:
Aughinbaugh, Alison, Charles R. Pierret, and Donna S. Rothstein. “The National Longitudinal Surveys of Youth: Research Highlights.” Monthly Labor Review (September 2015). https://doi.org/10.21916/mlr.2015.34.
Citro, Constance C. and Kalton, Graham, eds. The Future of the Survey of Income and Program Participation. Washington, DC: National Academy Press, 1993.
Juster, F. Thomas and Kuester, Kathleen A. “Differences in the Measurement of Wealth, Wealth Inequality and Wealth Composition Obtained from Alternative U.S. Wealth Surveys.” Review of Income and Wealth Series 37, Number 1 (March 1991): 33-62.
Pergamit, Michael R., Charles R. Pierret, Donna S. Rothstein, and Jonathan R. Veum. “Data Watch: The National Longitudinal Surveys.” Journal of Economic Perspectives 15, 2 (Spring 2001): 239-253.
Taeuber, Richard C. and Rockwell, Richard C. “National Social Data Series: A Compendium of Brief Descriptions.” Review of Public Data Use 10, 1-2 (May 1982): 23-111.
The NLSY79 is a survey of individuals in household and family units and therefore does not involve small organizations.
The core of the National Longitudinal Surveys is the focus on labor force behavior. It is very difficult to reconstruct labor force behavior retrospectively and still maintain sufficient precision and data quality. This is the single most important reason we strive to maintain regular interviews with these respondents, who on average have frequent transitions in employment, income and earnings, and family and household structure. Historic dates relating to these transitions are difficult to reconstruct when one focuses on events earlier than the recent past. For those who are employed, retrospective information on wages, detailed occupations, job satisfaction, or other employment-related characteristics cannot be easily recalled.
As with employment-related information, data about a respondent’s education and training history are also difficult to recall retrospectively. Completion dates of training and education programs are subject to severe memory biases. Thus, causal analyses that require a sequencing of education, training, and work experiences cannot be easily or accurately accomplished with historical data. Not only are completion dates of educational and training experiences frequently difficult to recall, but there is also evidence that misreporting of program completion is not unusual.
The precise timing and dating of demographic, socio-economic, and employment events, so crucial to most labor force analysis, is in most instances impossible to reconstruct accurately through retrospective data collection that extends very far into the past. For example, we have evidence from the NLS that dates of events of fundamental importance, such as marriage and birth histories, are subject to considerable error at the disaggregated level when collected retrospectively. Respondents have difficulty recalling when their marriages began or ended. Also, accurate information about household structure, how it changes over time, and how this relates to changes in family income and labor force dynamics is difficult to reconstruct retrospectively, as is the information on the health and related behaviors of the respondents, their spouses, and their children.
Finally, it is important to emphasize that information of a subjective nature can only be accurately reported and collected on a contemporaneous basis. Recollection of attitudes may be colored by subsequent experiences or reflect a rationalization of subsequent successes or failures. Attitudes as widely diverse as one’s ideas about women’s roles or how one perceives one’s health as of an earlier period can be recollected inaccurately, even when respondents are trying to be as honest as they can. In addition, the older the events that one tries to recall, either objective or subjective in nature, the greater the likelihood of faulty recall. The recall of events or attitudes is often biased either by a tendency to associate the event with major life-cycle changes (that may or may not be in temporal proximity to what one is trying to recall) or to move the event into the more recent past. The cognitive and socio-emotional information collected for the children of the NLSY79 respondents is, of course, sensitive to the age and life-cycle stage through which the particular children are progressing and, in many instances, changes over time. This is the reason why we need to repeat some of the assessments, in order to measure the extent to which the results are related to the age of the child as well as intervening family and non-family activities.
While more frequent interviewing is desirable, financial limitations prompted the NLSY79 to move to a biennial interview cycle beginning in 1994. The data loss due to reduced frequency is somewhat ameliorated by the fact that the cohort is more established, having negotiated the school-to-work transition with varying degrees of success. The NLSY79 uses bounded interviewing techniques and is designed so that when respondents miss an interview, information not collected in the missed interview is gathered in the next completed interview. In this way, the event history on work experience is very complete.
A study was conducted to assess the impact of the longer recall period by using an experimental design in the 1994 interview. About 10 percent of the respondents who were interviewed in 1993 were given a modified instrument that was worded as if the respondents were last interviewed in 1992. Using this experimental structure, we examined the respondents’ reports on experiences between the 1992 and 1993 interviews using information from their 1993 and 1994 reports on that same reference period. As expected, recall was degraded by a lower interview frequency. Events were misdated and some short duration jobs were not reported when the reference period was moved back in time. Based on this evidence, it is clear that less frequent data collection adversely affects longitudinal surveys.
A second potential problem caused by the move to a biennial interview is a decline in our ability to locate respondents who move. We have been able to compensate for this so far, but a change to less frequent interviewing would likely have a more negative impact.
None of the listed special circumstances apply.
No public comments were received as a result of the Federal Register notice published in 84 FR 71475, on December 27, 2019.
There have been numerous consultations regarding the NLSY79. Preceding the first round of the NLSY79, the Social Science Research Council (SSRC) sponsored a conference at which academic researchers from a broad spectrum of fields were invited to present their views regarding the value of initiating a longitudinal youth survey and what the content of the survey should include. The initial survey development drew heavily on the suggestions made at this conference, which were published in a proceeding under the auspices of the SSRC.
In 1988, the National Science Foundation sponsored a conference to consider the future of the NLS. This conference consisted of representatives from a variety of academic, government, and nonprofit research and policy organizations. There was enthusiastic support for the proposition that the NLS should be continued in the current format, and that the needs for longitudinal data would continue over the long run. The success of the NLS, which was the first general-purpose, longitudinal labor survey, has helped reorient survey work in the United States toward longitudinal data collection and away from simple cross sections.
Also, on a continuing basis, BLS and its contractors encourage NLS data users to suggest ways in which the quality of the data can be improved and to suggest additional data elements that should be considered for inclusion in subsequent rounds. We encourage this feedback through the public information office of each organization and through the ‘Suggested Questions for Future NLSY Surveys’ (available online at https://www.nlsinfo.org/nlsy-user-initiated-questions).
Individuals from other Federal agencies who were consulted regarding the content of the 2020 survey include:
Regina Bures
Demographic and Behavioral Sciences Branch
Center for Population Research
National Institute of Child Health and Human Development
Marsha Lopez
Chief, Epidemiology Research Branch
Division of Epidemiology, Services, and Prevention Research
National Institute on Drug Abuse
John Phillips
Chief, Population and Social Processes Branch
Division of Behavioral and Social Research
National Institute on Aging
The NLS program has a Technical Review Committee that advises BLS and its contractors on questionnaire content and long-term objectives. The committee typically meets twice a year. Table 2 below shows the current members of the committee.
Table 2. National Longitudinal Surveys Technical Review Committee (December 2019)
Jennie Brand, Department of Sociology and Department of Statistics, University of California, Los Angeles
Sarah Burgard, Department of Sociology and Department of Epidemiology, University of Michigan
Shawn Bushway, Department of Public Administration and Policy
Judith Hellerstein (Chair), Department of Economics, University of Maryland, College Park
David Johnson, Research Professor, Survey Research Center, University of Michigan
Michael Lovenheim, Department of Policy Analysis and Management, Cornell University
Nicole Maestas, Harvard Medical School
Melissa McInerney, Department of Economics, Tufts University
Kristen Olson, Department of Sociology, University of Nebraska-Lincoln
John Phillips, Division of Behavioral and Social Research, National Institute on Aging/NIH
Rebecca Ryan, Department of Psychology, Georgetown University
Jeffrey Smith, Department of Economics, University of Wisconsin
The NLS Technical Review Committee convened a conference in 1998 to review the current and future design of the NLSY79. This conference indicated that the central design of the NLSY79 remained strong, although changes in the nation’s welfare program required changes in the program recipiency section of the survey. Many of these changes were implemented in the 2000 and 2002 interviews. Some health section modifications were introduced in 2006 (a cognitive functioning module), and the 2008 and 2018 surveys included new health modules for respondents who had reached ages 50 and 60, respectively (mirroring the age 40 module).
In addition to the Technical Review Committee, the decisions concerning which child outcome measures to include in the child assessment sections of the NLSY79 were carefully considered from a number of perspectives. NICHD has collaborated with BLS on the NLSY79 for many years, with NICHD providing funds for topical modules that are added to the core interview. This collaboration reduces the total cost of data collection for the government. NICHD staff consults with experts outside NICHD to determine priorities for the modules the institute funds. NICHD staff and nationally recognized panels of experts jointly made the decisions about NICHD-sponsored survey topics from medicine and the social sciences.
The NICHD also has convened groups of outside experts to review the progress and content of the data collected for NICHD within the NLSY79 program. A brief description of outside experts consulted with respect to the child instruments and their affiliations may be found in table 3 below.
Table 3. Advisors and Experts Contributing to NLSY79 Child and Young Adult Surveys
Because this is a long-term study requiring the subjects to be reinterviewed regularly, respondents are offered financial incentives to secure their cooperation. Respondent payments are appropriate given the continuing nature of the survey; 2020 will be the 29th round and the 41st year since the survey began. Throughout the course of the NLSY79, we have also implemented a variety of experiments to inform our incentive design, and we continue to track response to incentives to ensure that the design remains relevant.
Respondent incentives represent only a fraction of the total field costs, and higher incentives can be a cost-effective means of increasing response while constraining the overall budget. We face a growing pool of sample members who are reluctant to cooperate. Therefore, our overall data collection strategy includes a set of measures to encourage cooperation beyond just offering higher respondent incentives.
We request clearance for the following integrated conversion strategy:
Table 1: Main NLSY79 incentive categories
Incentive Type | Completed Round 28 | Missed Round 28
Base | $70 | $70
Early Bird | $100 | $100
Missed Rounds | n/a | $20 to $40
Final Push – Standard | $30 | $30
Final Push – Enhanced | $50 | $50
In-kind | Up to $12 | Up to $12

Min | $70 | $90
Max | $120 | $160
Typical | $70 | $90
Main NLSY79 Incentives
Base Incentive
We plan to once again offer a base respondent incentive of $70, the amount that was first introduced in Round 27.
Early Bird
As OMB requested in past rounds, we will make an Early Bird offer to all main NLSY79 respondents. In Round 27, the Early Bird offer was $100. In Round 29, we request permission to maintain that amount. Because our approach is to tie higher incentives to cooperative behavior, bringing more respondents under the Early Bird rubric will reduce our overall cost structure; thus, extending higher incentives benefits both the project and respondents. We will provide the base incentive to respondents who wait to receive outbound calls to complete their interview. BLS will allocate the respondent incentives according to the guidelines described in the OMB package and approved by OMB.
Missed Round Bonus
We plan to repeat an additional incentive payment for respondents who were not interviewed in the previous survey round(s). This is appropriate because these respondents tend to have slightly longer interviews as they catch up on reporting information from missed rounds. Specifically, we would offer respondents $20 for the first missed round and $10 for each consecutive additional round missed, up to $40 (3 rounds). We would be careful to inform respondents that this is a one-time additional incentive in appreciation for the additional time and data needed to catch up in this interview round and that they will not receive this additional amount next round. Based on our experience in the NLSY79 and NLSY97, we anticipate that respondents will appreciate this non-interview premium and will understand the distinction between the base amount and the additional incentive.
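The bonus schedule can be summarized in a few lines. The sketch below is illustrative only (the function name is ours) and matches the amounts shown later in Table 4: $20, $30, and $40 for one, two, and three consecutive missed rounds.

```python
# Minimal sketch of the missed-round bonus schedule described above: $20 for the
# first missed round plus $10 for each additional consecutive missed round,
# capped at $40.
def missed_round_bonus(consecutive_missed_rounds: int) -> int:
    if consecutive_missed_rounds <= 0:
        return 0
    return min(20 + 10 * (consecutive_missed_rounds - 1), 40)

# One missed round -> $20; two -> $30; three or more -> $40.
print([missed_round_bonus(k) for k in range(5)])  # [0, 20, 30, 40, 40]
```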
Gatekeepers
Some “gatekeepers” are particularly helpful in tracking down reluctant sample members. For example, a parent or sibling not in the survey may provide locating information or encourage a sample member to participate. (Note that we never reveal the name of the survey to these gatekeepers; it is described simply as “a national survey.”) We often return to the same gatekeepers round after round. To maintain their goodwill, we would like to be able to offer gatekeepers who are particularly helpful a small gift worth about $5. NLS has offered a similar incentive in Rounds 26-28 of the NLSY79 and Rounds 15-17 of the NLSY97. It is used very infrequently, but has proved very effective in finding respondents and getting them to complete interviews.
Final Push
Starting after the first 12 weeks of the round, cases that have not yet refused and have had at least 6 contact attempts will be eligible for a final push incentive of up to $30 (bringing the total incentive to $100). This represents an increase of $10 over the final push amount of $20 used in prior rounds. Cases eligible for the final push are our most costly to complete and lead increasingly to elongated field periods. We propose the increase in the final push amount in response to interviewer reports that respondents find the $90 amount ($70 base + $20 final push) an odd amount, and because the additional $10 for the several hundred cases receiving the incentive would be a worthwhile trade-off if we could shorten the length of the field period. If a case has refused during initial outreach attempts, the case would be eligible for the final push incentive after 12 weeks regardless of the number of contact attempts made. To facilitate a timely close to the fielding, all cases will become eligible for this incentive starting 6 months after the start of fielding, after BLS and the contractors have discussed the exact timing of implementation.
In the past, it has been difficult to evaluate the effectiveness of the final push because all cases become eligible at the same time and because cases naturally are harder to complete as the fielding period progresses. Because of this, we propose an experiment in Round 29 to evaluate the effectiveness of the final push. This experiment will be conducted by randomly assigning interviewers into two separate groups. The first group will have their cases become eligible for the final push and enhanced final push at the traditional time. The second group will have their cases become eligible for the final push and enhanced final push 6 weeks later. This staggered implementation will allow us to evaluate the effectiveness of the final push by comparing the trajectories of completion rates and effort for cases eligible for the final push to randomly assigned cases that have not yet become eligible. Based on Round 28 production, we expect an average of roughly 80 cases a week to be completed over this 6-week period, and we anticipate that this will provide a sufficient sample size to detect effects on the order of 3-5 percentage points in completion rates on a base of 10 percent with 80 percent power. The proposed design achieves the necessary power without the more complex implementation challenges of a case-level randomization.
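As a rough check on the stated power target, the sketch below applies a standard two-sample proportion power approximation. It is illustrative only: it assumes a two-sided test at the 5 percent level and treats the cases in each arm as independent observations, ignoring the interviewer-level clustering of the actual randomization, so it is a benchmark rather than the project's formal power analysis.

```python
# Illustrative power approximation for the staggered final-push experiment.
# Assumes a two-sided, two-sample test of proportions with a normal
# approximation; it ignores interviewer-level clustering.
from statistics import NormalDist

def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate cases needed in each arm to detect a shift from p1 to p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2) + 1

# Base completion rate of 10 percent during the comparison window,
# with hypothesized effects of 3 and 5 percentage points.
for effect in (0.03, 0.05):
    print(f"+{effect:.0%}: roughly {n_per_group(0.10, 0.10 + effect)} cases per arm")
```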
Enhanced Final Push
In order to support sample representativeness, we request the use of an “Enhanced Final Push” incentive that offers a $50 final push amount, instead of the standard $30, for subgroups of cases with low response rates. Once this offer is extended, it will remain in place for the remainder of the field period (as does the traditional final push incentive).
The subgroups we propose to evaluate for inclusion in the enhanced final push will be chosen to represent groups that directly concern labor market activity: educational attainment, AFQT score, weeks worked (defined in 4 categories), presence or absence of a health condition that limits work activity, and each of these stratified by gender or race/ethnicity. As the oldest members of the NLSY79 cohort are now in their early 60s, it is vital to collect an accurate picture of this cohort’s employment over the lifecycle so that retirement decisions can then be studied. For each possible subgroup, we will calculate the overall response rate for the subgroup when the entire sample has completed at a rate of 60 percent. Any subgroup having a response rate at least 7 percentage points below the sample average (i.e., less than 53 percent) will be offered an enhanced $50 final push amount instead of the $30 final push amount that is approved for all remaining cases.
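The selection rule is mechanical and can be expressed compactly. The sketch below is illustrative: the subgroup names and response rates are hypothetical, and the only logic taken from this document is the 53 percent cutoff (7 percentage points below the 60 percent benchmark).

```python
# Illustrative sketch of the enhanced-final-push eligibility rule described above.
# Subgroup names and rates are hypothetical; the rule flags any subgroup whose
# response rate, measured when the full sample reaches 60 percent completion,
# is at least 7 percentage points below that benchmark (i.e., below 53 percent).
SAMPLE_BENCHMARK = 0.60
CUTOFF = SAMPLE_BENCHMARK - 0.07  # 53 percent

def eligible_for_enhanced_push(subgroup_rates: dict) -> list:
    """Return the subgroups offered the $50 enhanced final push instead of the standard $30."""
    return [name for name, rate in subgroup_rates.items() if rate < CUTOFF]

# Hypothetical response rates at the 60 percent checkpoint.
rates = {
    "no high school diploma": 0.48,
    "bachelor's degree or more": 0.66,
    "work-limiting health condition": 0.51,
}
print(eligible_for_enhanced_push(rates))
# ['no high school diploma', 'work-limiting health condition']
```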
This enhanced final push expands eligibility compared to Round 28 (from subgroups with a response rate at least 8 percentage points below the sample average to those at least 7 percentage points below). Based on results from Round 28, we anticipate that this will make roughly 400 more respondents eligible for the enhanced final push (roughly 1,200 in total).
We propose this expansion due to the results of the successful enhanced final push in Round 28. Table 2 presents results of how the enhanced final push group performed relative to the same group in Round 27 (when there was no enhanced final push). Here, the group designated “Eligible for Final Push” includes all individuals who would have been eligible for the enhanced final push if they had not completed as of the start of the final push. This includes individuals with missing AFQT, weeks worked, or educational attainment, as well as white respondents with unknown health conditions. The first two sets of columns present completion rates at the 60% mark and at the start of the final push, respectively. The third set of columns presents the cumulative completion rate of these groups since the final push went into effect.
Prior to the final push going into effect, the targeted groups of respondents were completing at significantly lower rates than the average respondent. At the start of the final push, only 44.87% of respondents in the enhanced final push eligibility group had completed an interview, compared to 68.01% of other respondents. After the final push went into effect, however, these two groups completed at similar rates: 7.92% of enhanced final push respondents completed after the start of the final push, compared to 7.33% of other respondents, suggesting that the enhanced final push was successful.
In order to provide a comparison for these results, Panel B shows comparable completion statistics for these groups of respondents in Round 27. The two rows represent the exact same respondents in the previous rounds, and we display completion status for them at various points throughout the round. To make the comparisons as close as possible, we do not use dates to designate the completion periods but instead categorize them based on the fraction of NLSY Youth cases that had completed.
At the 60% completion mark, the results look relatively similar in Round 27 and Round 28: the group of respondents eligible for the enhanced final push in Round 28 was completing at lower rates compared to other respondents. This continues to be the case when the final push went into effect. The third set of columns displays how the Round 28 completion rates in the first 8 weeks of the final push compare to the same period in Round 27. Here, we see that during this time frame, eligible respondents completed at a 1.43 (7.66-6.23) percentage point lower rate in Round 27, compared to a 0.59 (7.92-7.33) percentage point higher rate in Round 28. Moving to the last columns with cumulative end-of-round statistics, we see that there is only a 22.68 percentage point difference in final completion rates between the two groups in Round 28, compared to a 27.56 percentage point difference in Round 27.
These differential patterns in Round 28 compared to Round 27 are suggestive that the enhanced final push may improve response rates. While there is an uptick in response rates for these low-responding groups later in the round in both Round 27 and Round 28, the uptick is larger in Round 28 with the enhanced final push.
In addition to these suggestive results, note that the enhanced final push is paid only to respondents in the selected group who complete by the end of the round. Therefore, it is a very efficient expenditure of funds for additional completes, costing about $4,000 ($20 enhancement of the final push amount * 200 estimated completed interviews). In addition, the cases targeted by the enhanced final push are the most difficult, and therefore the most costly, cases to complete. Given this low cost, even a small reduction in additional outreach yields a savings in overall cost; a savings of two visits or several calls per respondent would likely recover the cost of this incentive supplement. Therefore, we feel this incentive represents a cost-effective way to increase completion rates for our most under-represented respondents.
Table 2: Enhanced Final Push Results
Panel A: Round 28 Results

Group | N | At 60% Completion (2/20/2019) | At Start of Final Push (3/3/2019) | After Final Push (as of 4/29/2019) | Cumulative at End of Round
Subgroups Not Eligible for Enhanced Final Push | 7522 | 4926 complete (65.49%) | 5116 complete (68.01%) | 551 complete (7.33%) | 6030 complete (80.16%)
Subgroups Eligible for Enhanced Final Push | 1364 | 581 complete (42.60%) | 612 complete (44.87%) | 108 complete (7.92%) | 784 complete (57.48%)

Panel B: Round 27 Comparison

Group | N | At 60% Completion | At Start of Final Push | After Final Push | Cumulative at End of Round
Subgroups Not Eligible for R28 Enhanced Final Push | 7522 | 4955 complete (65.87%) | 5150 complete (68.47%) | 576 complete (7.66%) | 6170 complete (82.03%)
Subgroups Eligible for R28 Enhanced Final Push | 1364 | 545 complete (39.96%) | 575 complete (42.16%) | 85 complete (6.23%) | 743 complete (54.47%)

Notes: Round 27 comparisons are based on percentage complete at a given point in the round, and not the date.
In-Kind Incentives
While the value of the in-kind offerings that we are proposing is small, they are useful in inducing respondents to engage and in supporting interviewer outreach. Our goal is to use these small tokens broadly across the sample. Because the value of the proposed items is small, we will not prevent a respondent from receiving more than one offer, but no respondent will receive all of them. We are asking to use in-kind gifts for as many as 2,400 respondents, and we propose to use them in mailings or in door hangers. We would spend no more than $12 on any respondent and an average of only $10. We would like to have the ability to include a token in a mailing that either makes the envelope heavier or comes in a box, and thus may induce cooperative behavior from the sample member, such as opening the mail, reading the message, and engaging with the project through calls to the 800 number, setting a web appointment, or responding when the interviewer calls or stops by. Decisions about who receives these items will be made by the NORC central office; interviewers will not have discretion over which respondents receive this incentive.
NLSY79 Young Adult Incentives
Table 3: NLSY79 Young Adult incentive categories
Incentive Type | Age 12-13 | Age 14-17 | Age 18+, completed R28 | Age 18+, missed R28
Base | $50 | $70 | $70 | $70
Early Bird | n/a | n/a | $100 | $100
Missed Rounds | n/a | n/a | n/a | $20 to $40¹
Fertility Section | n/a | $10 per child | $10 per child | $10 per child
Final Push | n/a | $30 only if older sibling also gets offer | $30 | $30
In-kind | n/a | Up to $10 | Up to $10 | Up to $10

Min | $50 | $70 | $70 | $90
Max | $50 | $100+$20² | $100+$70² | $140+$70²
Typical | $50 | $70 | $70² | $100²

¹ Young Adult grant respondents are eligible for interview only every 4 years once they reach age 30. The missed round payment will be made to these respondents only for rounds for which they were eligible but did not participate; it will not take into account rounds in which they were not included in the eligible sample.
² $100 or $140 is the maximum regular incentive, and $70 is the maximum amount anticipated for the additional per-child fertility section fee for Young Adults age 18 and older. For YAs 14-17, we will offer $10 per co-resident child; in the past two rounds this has affected only 3 respondents total and only one had more than one eligible child.
Base Incentive
We plan to offer $70 as the base incentive for young adult respondents age 14 and older. This is the same amount as in Round 28 and mirrors the main youth incentive. We plan to offer $50 as the base incentive for young adult respondents ages 12-13. We feel that this lower amount is appropriate for the very youngest respondents.
Early Bird
BLS will offer an Early Bird incentive to all respondents age 18 and over who call into the toll-free line and complete an interview or set an appointment within 3 weeks of receiving the initial mailing. Those contacting the project during the 3-week window will receive a $100 incentive for interview completion. We will provide the base incentive to youth and young adults who wait to receive outbound calls to complete their interview. BLS will allocate the respondent incentives according to the guidelines described in the OMB package and approved by OMB.
Missed Round Bonus
Missed round bonuses for young adult sample members 18 years of age or older will be the same as for the main NLSY79 sample.
Fertility
Once again, we plan to offer an additional $10 per child to young adult respondents who go through the lengthy fertility section, similar to the additional incentive offered in past rounds to main youth mothers for the mother supplement. This additional fertility section, and hence the incentive, applies only to young adults age 14 and older (in Round 27, only three respondents under age 18 had a co-resident child, and none had an eligible child in Round 28). The section is asked about each co-resident child age 14 and younger.
Final Push
We propose to offer the same standard final push premium of up to $30 as proposed for the main NLSY79. This offer would be made following the same rules described in the main youth section above. This offer will be available only to respondents ages 18 and older, and to 14-17 year olds with a sibling getting the offer. We will not offer the enhanced final push incentive for the young adult sample. We will use the same timing for the young adult as laid out above for the main NLSY79 sample.
Small In-Kind Incentives
We propose the same in-kind incentives for the YA sample as are laid out above for the main NLSY79 sample. This includes in-kind gifts for as many as 2,400 respondents, to be used in mailings or door hangers.
Incentives among Family Members
To avoid problems with unequal treatment within families, we will continue to offer all relatives the best deal to which any family member is entitled, except for the additional non-interview and final push premiums. Because our approach is to tie higher incentives to cooperative behavior, bringing more respondents under the Early Bird rubric will reduce our overall cost structure; thus, extending higher incentives benefits both the project and respondents.
Incentive Costs
A table listing the total cost of incentives by respondent pool is provided below.
Table 4. Round 29 Incentive Costs for Main NLSY79 Sample
Incentive Type | R28 Completers | Missed R28 | Missed R27-R28 | Missed R26-R28
Sample size* | 6872 | 289 | 228 | 1379
Expected completion rate | 94.52% | 40.25% | 21.12% | 9.44%
Base | $70 | $70 | $70 | $70
Missed Round(s) | n/a | $20 | $30 | $40
Total from Base and Missed Round | $454,679 | $10,469 | $4,815 | $14,320
Total from Above Row (all groups) | $484,283

Other Incentives
Early Bird | $30 for an estimated 3,000 respondents = $90,000 total
Final Push | $30 for an estimated 1,000 respondents = $30,000 total
Enhanced Final Push | $20 for an estimated 200 respondents = $4,000 total
In-Kind | $10 for an estimated 2,400 respondents = $24,000 total
Total from Other Incentives | $148,000

Grand Total | $632,283
* Note: Sample sizes exclude deceased and blocked cases. Expected completes based on completion rates of similar groups in R28 as of November 18, 2019.
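As a reading aid, the sketch below (Python; the variable names are ours) reproduces the arithmetic behind Table 4: expected completes equal the sample size times the expected completion rate, each complete earns the $70 base plus any missed-round bonus, and the flat totals for the other incentives are added to reach the grand total.

    # Illustrative reconstruction of the Table 4 arithmetic, using the figures
    # shown in the table itself.
    groups = {
        # group: (sample size, expected completion rate, missed-round bonus)
        "R28 completers": (6872, 0.9452, 0),
        "Missed R28":     (289,  0.4025, 20),
        "Missed R27-R28": (228,  0.2112, 30),
        "Missed R26-R28": (1379, 0.0944, 40),
    }
    BASE = 70

    base_and_missed = round(sum(
        size * rate * (BASE + bonus) for size, rate, bonus in groups.values()
    ))
    other = 30 * 3000 + 30 * 1000 + 20 * 200 + 10 * 2400   # Early Bird, Final Push,
                                                           # Enhanced Final Push, In-Kind
    print(base_and_missed, other, base_and_missed + other)
    # Matches the table totals: 484,283; 148,000; and a grand total of 632,283.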
Electronic Payment of Incentives
In Rounds 27 and 28, we offered respondents the opportunity to receive their incentive payment via electronic methods (specifically, PayPal in Round 27). During the NLSY79 Round 29 main field period, we will continue offering this option to telephone interview respondents. Electronic payment is attractive for both respondents and the NLS program. For respondents, electronic payment generally involves less effort to receive funds than a paper check or money order. For the NLS program, electronic payments save administration and mailing costs, and the cost is much lower when a payment needs to be tracked or a respondent does not recall having received it. Electronic transaction options (for example, PayPal, Google Wallet, Apple Pay, and Zelle) are expanding markedly, and we can expect the options available during NLSY79 Round 29 main fielding to be even greater than those available today.
Summary
In our experience, there is no single strategy for securing respondent cooperation. The primary need is for flexibility in how we approach sample members and how we follow up with reluctant sample members. Overall, there are about 2,500 sample members who are hard to interview, either because of busy schedules or a mindset that ranges from indifference to hostility. Our Early Bird efforts attempt to reduce our total costs (incentive plus fielding costs) for cooperative sample members so that we can devote the resources necessary for difficult cases.
We reiterate that incentive payments are only part of our approach to encouraging response. An equally important part of the effort is designing an effective marketing campaign, including conversion materials that interviewers can use to respond to the variety of reasons sample members give for not completing the interview. This portfolio of respondent materials backs up the interviewer by providing a variety of approaches to converting refusals. We also encourage national teamwork among the interviewers, including periodic calls in which interviewers share the successful “tricks of the trade” that turn reluctant sample members into completed cases. Conversion materials and the ability to employ flexible respondent incentives also have important effects on interviewer morale. With a combination of a marketing campaign to “sell” the survey and the ability to personalize their approach to each respondent, interviewers will not feel they are alone, forced to handle a difficult task of persuasion without the tools to do the job.
Locating is also a key aspect of obtaining respondent cooperation. Over the past several rounds, we have developed continuous locating strategies to keep track of sample members between interviews. For example, we monitor area code changes, which have become more frequent and play havoc with the accuracy of the phone numbers we have for sample members. We review the record of calls to identify subsets of sample members whose particular concerns can be addressed by a tailored advance conversion letter. We also will continue to include questions in the survey that solicit the views and opinions of respondents. Respondents often find such questions interesting and engaging, and the questions help build credibility with respondents that the survey is a serious scientific and policy-related endeavor.
Our primary goal must be to remain in the good graces of the respondents. When we feel respondents are under heavy stress and suspect that additional contacts will be unproductive, we will set the case aside and try again in two years. Angering sample members is not an option, given their ability to screen and reject our calls. Our incentive and contacting approach will continue to motivate sample members, assuage their concerns, and convey our interest in them as individuals, not numbers.
The information that NLSY79 respondents provide is protected by the Privacy Act of 1974 (DOL/BLS–13, National Longitudinal Survey of Youth 1979 (NLSY79) Database (81 FR 25788)) and the Confidential Information Protection and Statistical Efficiency Act (CIPSEA).
CIPSEA safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality for exclusively statistical purposes by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of the BLS.
Based on this law, the BLS provides respondents with the following confidentiality pledge/informed consent statement:
“We want to reassure you that your confidentiality is protected by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act, the Privacy Act, and other applicable Federal laws, the Bureau of Labor Statistics, its employees and agents, will, to the full extent permitted by law, use the information you provide for statistical purposes only, will hold your responses in confidence, and will not disclose them in identifiable form without your informed consent. All the employees who work on the survey at the Bureau of Labor Statistics and its contractors must sign a document agreeing to protect the confidentiality of your information. In fact, only a few people have access to information about your identity because they need that information to carry out their job duties.
Some of your answers will be made available to researchers at the Bureau of Labor Statistics and other government agencies, universities, and private research organizations through publicly available data files. These publicly available files contain no personal identifiers, such as names, addresses, Social Security numbers, and places of work, and exclude any information about the States, counties, metropolitan areas, and other, more detailed geographic locations in which survey participants live, making it much more difficult to figure out the identities of participants. Some researchers are granted special access to data files that include geographic information, but only after those researchers go through a thorough application process at the Bureau of Labor Statistics. Those authorized researchers must sign a written agreement making them official agents of the Bureau of Labor Statistics and requiring them to protect the confidentiality of survey participants. Those researchers are never provided with the personal identities of participants. The National Archives and Records Administration and the General Services Administration may receive copies of survey data and materials because those agencies are responsible for storing the Nation’s historical documents.”
BLS policy on the confidential nature of respondent identifiable information (RII) states that “RII acquired or maintained by the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that ensures the information will be used only for statistical purposes and will be accessible only to authorized individuals with a need-to-know.”
By signing a BLS Agent Agreement, all authorized agents employed by the BLS, contractors at CHRR and their subcontractors pledge to comply with the Privacy Act, CIPSEA, other applicable federal laws, and the BLS confidentiality policy. No interviewer or other staff member is allowed to see any case data until the BLS Agent Agreement, Department of Labor Rules of Conduct, BLS Confidentiality Training certification, and Department of Labor Information Systems Security Awareness training certification are on file. Respondents will be provided a copy of the questions and answers shown in Attachment 6 about uses of the data, confidentiality, and burden. These questions and answers appear on the back of the letter that respondents will receive in advance of the Round 29 interviews.
CHRR and NORC have safeguards to provide for the security of NLS data and the protection of the privacy of individuals in the sampled cohorts. These measures are used for the NLSY79 as well as the other NLS cohorts. Safeguards for the security of data include:
1. Storage of printed survey documents in locked space at NORC.
2. Protection of computer files at CHRR and NORC against access by unauthorized individuals and groups. Procedures include using passwords, high-level “handshakes” across the network, data encryption, and fragmentation of data resources. As an example of fragmentation, should someone intercept data files over the network and defeat the encryption of these files, the meaning of the data files cannot be extracted except by referencing certain cross-walk tables that are neither transmitted nor stored on the interviewers’ laptops. Not only are questionnaire response data encrypted, but the entire contents of interviewers’ laptops are now encrypted. Interview data are periodically removed from laptops in the field so that only information that may be needed by the interviewer is retained. If a laptop is lost or stolen it can be wiped clean remotely.
3. Protection of computer files at CHRR and at NORC against access by unauthorized persons and groups. Especially sensitive files are secured through a series of passwords to restricted users. Access to files is strictly on a need-to-know basis. Passwords change every 90 days.
4. To ensure that CHRR and its subcontractor NORC comply with the Federal Information Security Modernization Act of 2014 (FISMA) and adequately monitor all cybersecurity risks to NLS assets and data, the NLS system undergoes, in addition to regular self-assessments, a full NIST 800-53 audit every 3 years by an outside, independent cybersecurity auditing firm.
Protection of the privacy of individuals is accomplished through the following steps:
1. Oral permission for the interview is obtained from all respondents, after the interviewer ensures that the respondent has been provided with a copy of the appropriate BLS confidentiality information and understands that participation is voluntary.
2. Information identifying respondents is separated from the questionnaire and placed into a non-public database. Respondents are then linked to data through identification numbers.
3. The public-use version of the data, available on the Internet, masks or removes data that are of sufficient specificity that individuals could theoretically be identified through some set of unique characteristics.
4. Other data files, which include variables on respondents’ State, county, metropolitan statistical area, zip code, and census tract of residence and certain other characteristics, are available only to researchers who undergo a review process established by BLS and sign an agreement with BLS that establishes specific requirements to protect respondent confidentiality. These agreements require that any results or information obtained as a result of research using the NLS data will be published only in summary or statistical form so that individuals who participated in the study cannot be identified. These confidential data are not released to researchers without express written permission from NLS and are not available on the public use internet site.
5. In Round 29 we will continue several training and procedural changes to increase protection of respondent confidentiality. These include an enhanced focus on confidentiality in training materials, clearer instructions in the Field Interviewer Manual on what field interviewers may or may not do when working cases, and formal separation procedures when interviewers complete their project assignments. Online and telephone respondent locating activities have been moved from NORC’s geographically dispersed field managers to locating staff in NORC’s central offices. Respondent social security numbers were removed from NORC and CHRR records during Round 23.
Several sets of questions in the NLSY79 and Children of the NLSY79 might be considered sensitive. This section describes these questions and explains why they are a crucial part of the data collection. All of these topics have been addressed in previous rounds of the surveys, and respondents generally have been willing to answer the questions. Respondents are always free to refuse to answer any question that makes them feel uncomfortable.
Income, Assets, and Program Participation. One major set of sensitive questions collects information about respondents’ income and assets. The interviews record information about the sources and amounts of income received during the past calendar year by the respondent, his or her spouse or partner, or other family members. Income sources identified include the respondents’ and their spouses’ or partners’ wages and salaries, income from military service, profits from a farm or business, Social Security, pensions and annuities. These questions, or variants, have appeared in the NLSY79 since 1979. The interviews collect information about what assets and debts the respondent and his or her spouse or partner hold, as well as the values of these assets and debts. Variants of these questions have appeared in the NLSY79 since 1979. While some respondents refuse to answer these questions, our item nonresponse rate is lower than for most surveys. The survey also asks about income received by the respondent and spouse or partner from unemployment compensation, Temporary Assistance for Needy Families (TANF), food stamps, Supplemental Security Income, and other public assistance programs. While questions on program participation have changed over the years, they have been in the survey for the past 30 years.
Income and assets questions are central to the usefulness of the NLSY79 data collection. Most analyses based on the survey include some measure of the financial resources available to the respondent, whether as an input variable, an outcome variable, or a control. It is very difficult to conceive of a replacement measure that would function in the same way in research about the returns to education, participation in the labor market, child development, and so on. The public assistance questions additionally permit research on the effects of the welfare reforms enacted in 1996, providing important information to public officials charged with overseeing the country’s public assistance programs.
In Round 29, we plan to continue the use of follow-up questions for respondents who answer “don’t know” or “refuse” to income and asset questions. As described in a report submitted to OMB with the Round 22 clearance package, we have tested these follow-up questions to ensure that we are getting the best information possible. Based on this analysis, we implemented a hybrid approach beginning in Round 22 and plan to continue using this approach in Round 29. Briefly, respondents who answer an income or asset question with don’t know or refuse are first asked to provide a self-generated range. Respondents who are unable to answer the self-generated range question are then asked unfolding bracket questions: whether the amount is more or less than a specified amount, and then more or less than a second specified amount. These unfolding brackets are a common follow-up approach in major surveys, including the Health and Retirement Study. To limit any negative effects from using a hybrid follow-up, potentially uncooperative respondents (for example, those who have already refused at least one question in the income and asset section) are skipped past the hybrid follow-up on subsequent refusals.
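The sequence just described can be summarized in a brief sketch (Python). The prompts, bracket amounts, and function names below are placeholders of our own, not the wording or thresholds used in the instrument, and the skip for previously refusing respondents is omitted.

    # Minimal sketch of the hybrid don't-know/refuse follow-up described above.
    # The thresholds and prompts are placeholders, not NLSY79 instrument values.
    def income_follow_up(ask, first_threshold=50_000, low=25_000, high=100_000):
        """ask(prompt) returns the respondent's answer, or None for a
        don't know / refusal."""
        amount = ask("What was the total amount?")
        if amount is not None:
            return {"exact": amount}
        # Step 1: invite the respondent to supply his or her own range.
        own_range = ask("Could you give me your best estimate as a range?")
        if own_range is not None:
            return {"self_generated_range": own_range}
        # Step 2: unfolding brackets -- more or less than one amount,
        # then more or less than a second amount that depends on the first answer.
        first = ask(f"Was it more or less than ${first_threshold:,}?")
        if first is None:
            return {"bracket": None}    # item remains missing
        second_threshold = high if first == "more" else low
        second = ask(f"Was it more or less than ${second_threshold:,}?")
        return {"bracket": (first_threshold, first, second_threshold, second)}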
Menopause. The NLSY79 will ask a brief set of questions to date the onset of menopause. These questions are useful in research on fertility. Similar questions were asked of the NLS Young Women cohort.
Cigarette and Alcohol Use. Questions on smoking ask whether the respondent has smoked more than 100 cigarettes in his or her life, the age the respondent began smoking daily, whether the respondent now smokes daily, the age when the respondent quit smoking, and the number of cigarettes smoked per day. These questions have been asked in several previous rounds of the survey with very low nonresponse rates.
The survey also will include a series of four questions asking whether the respondent drank alcohol in the past 30 days, the number of days on which alcohol was consumed, the average number of drinks per day, and the number of days on which the respondent consumed 6 or more drinks (an indication of alcohol abuse). These questions are important for both economic and social research. Economists are interested in the impact that alcohol use and abuse may have on employment and earnings (for example, Dooley and Prause 1998; Kenkel and Wang 1998). Sociologists and public health researchers can use alcohol data, along with the other information collected, to examine the social and psychological impact of alcohol use and abuse.
The set of alcohol questions has been asked previously in identical or similar form (the time reference period was different in the early surveys) in 1982–85, 1988, 1989, 1992, 1994, 2002, 2006, 2008, 2010, 2012, 2014, and 2016. In the earlier surveys, the questions were generally part of a longer series on alcohol use and abuse. No difficulties were encountered with these longer series in past rounds, and nonresponse is very low. For example, for the set of four questions being included this round, the largest number of respondents who refused or answered “don’t know” in 2010 was 18. No problems were experienced in Rounds 22-28 with the shorter and less intrusive set of questions, and none are expected in Round 29.
Income. Young adults (those age 14 and older) are asked a series of income questions similar to, but somewhat less detailed than, those asked of their mothers in the main interview. As described above, income data are crucial to many kinds of analysis in a variety of research fields. The young adult data additionally allow researchers to examine similarities or differences in the income sources of mothers and their children, providing information about the transmission of the ability to earn income across generations.
Smoking, Drug and Alcohol Use, and Criminal Activity. Young adults age 12 and older are asked about smoking, drug use, and alcohol use. These questions record whether the respondent has ever used a number of substances, including alcohol, cigarettes, marijuana, cocaine, and other drugs, and ask about the extent of the respondent’s use in the past 30 days. Additional delinquent and criminal behavior questions record whether the young adult has run away from home or been convicted of criminal activities such as selling drugs, possessing drugs, theft, assault, and so on. If the respondent reports convictions, he or she is asked to describe the type of crime committed and the punishment received.
Questions about substance use and criminal behavior are crucial in understanding the education and employment outcomes of this group of young adults. To quote a report based on data from the 1990 Youth Risk Behavior Surveillance System (U.S. Department of Health and Human Services), “Patterns of tobacco, alcohol and other drug use usually are established during youth, often persist into adulthood, contribute substantially to the leading causes of mortality and morbidity, and are associated with lower educational achievement and school dropout.” One concern with long-term drug and alcohol use is the gateway effect that can occur, leading to use of heavier drugs and an increase in other risky behaviors (for example, sexual activity or criminal acts). For examples of such research, see Pacula (1997); Desimone (1998); and Parker, Harford, and Rosenstock (1994). The negative relationship between drug and alcohol use and educational attainment has been investigated by authors such as Yamada, Kendix, and Yamada (1996). Finally, as mentioned above, substance use may have a negative effect on the probability of employment and the wages and benefits received.
These sensitive questions have been asked in a nearly identical form since 1994 without difficulty. Refusals and don’t knows often have been less than 1 percent. Prior to computer-assisted administration, some respondents did not fill out the self-report booklets correctly or completely. Because these instruments are now computer-administered, starting in 2000 for young adults 15 and older and in 2002 for those under age 15, this problem has been ameliorated.
School Safety. Reflecting the growing national concern with weapons in schools, questions on this topic have been administered to adolescents in several other national surveys. Questions on whether a respondent has carried a weapon or been threatened by a weapon have been directed toward adolescents ranging in age from 12-18 in the following surveys:
The National Youth Study (Tulane, 1998)
Welfare, Children, and Families: A Three-City Study (Johns Hopkins, 2001)
NLSY97 (BLS, 1997-2004)
Youth Risk Behavior Surveys (CDC)
Other surveys have asked questions about whether young respondents carry a weapon to school for protection (Josephson Institute, 1999).
For many previous rounds, the NLSY79 Child and Young Adult surveys have included questions about the child’s attitudes and opinions regarding school, including whether the child feels safe at school. In 2002, we added two related questions that ask whether a respondent has ever seen a student with a gun, knife, or other weapon on school property and, if so, the number of times in the last year that the respondent has seen a student with a gun, knife, or other weapon on school property. These questions will continue to be asked of respondents aged 12–18 who are attending school. These questions will aid researchers in investigating the presence of weapons in schools as it relates to school characteristics, neighborhood environment, child behavior, child success in school, subsequent criminal behavior, and so on.
Sexual Activity. Young adults aged 14 and older are also asked about sexual intercourse. Because puberty and the initiation of sexual activity occur during the teenage years for many youths, and because this information may not be recalled accurately if collected retrospectively, it is important to ask these questions of respondents in this age range in each survey round. Results from a number of different surveys, including early rounds of the NLSY97, indicate that a significant proportion of adolescents report that they are sexually active. It is vital that we continue to trace the progression of sexual activity in relation to the realization of educational and occupational goals and with respect to the promotion of good health practices. The level of sexual activity and contraceptive use are important indicators of how serious young people are about reaching higher levels of educational and occupational attainment, and there should be significant congruence between anticipated life goals, sexual activity, and its associated outcomes. These questions also provide information important for analyses of programs and policies related to adolescent health and well-being.
Further, age at first intercourse is important to understanding labor market behavior because of the central role that adolescent fertility plays in the future life course of women. Early childbearing not only retards the education of the mother and hence is deleterious to her labor market opportunities but also tends to play a powerful role in the intergenerational transmission of poverty. AIDS and other sexually transmitted diseases also make sexual behavior a significant public health issue. For these reasons, this line of questioning is important to the central point of the survey.
Suicidal Ideation. In Round 25, we added two questions on suicidal ideation, which will continue in Round 29 for respondents aged 14 and older. We believe this is an important topic to address and one that can be captured relatively simply. As diagnoses of depression have risen in recent years among adolescents and young adults, the use of anti-depressants also has risen. A reported side effect of these medications is an increase in suicidal ideation. No large-scale surveys currently exist, however, that enable one to estimate the extent of suicidal ideation in the general population across a range of youthful ages. The NLSY79 Young Adults, with repeated CES-D scores, represent an ideal sample to use. These data also will provide researchers with an opportunity to answer important questions about socioeconomic determinants of suicidal risk. Both questions are taken from the CDC's Youth Risk Behavior Survey. If a respondent answers that he or she has seriously considered suicide during the last 12 months, interviewers will provide the respondent the National Suicide Prevention Lifeline number, which is staffed through the U.S. Department of Health and Human Services’ Substance Abuse and Mental Health Services Administration (SAMHSA). This number receives all calls and routes them to an available crisis center.
Gender Identity and Sexual Orientation. With growing openness about and awareness of these issues and how they might affect the lives of our respondents across multiple domains, and in direct response to requests from researchers, we added items on this topic beginning in 2018. Sexual and gender minorities, particularly those transitioning into adulthood, are at significantly increased risk of mortality and morbidity (e.g., substance use, mental health conditions), so understanding the key determinants of health among sexual and gender minorities is a fundamental research aim for ameliorating health disparities at large. Although gender identity and sexual orientation have sometimes been combined into a single survey question, after reviewing question wording in several large national studies and in published research (Betts, 2009; GenIUSS, 2014; Miller and Ryan, 2011; Redford and Van Wagen, 2012), we plan to ask about gender identity separately from sexual orientation. Several large studies, such as the National Survey of Family Growth (NSFG), the National Health and Nutrition Examination Survey (NHANES), and the General Social Survey (GSS), have included sexual orientation and/or gender identity questions. The Centers for Disease Control and Prevention (CDC) has done cognitive testing on gender identity and sexual orientation questions for inclusion in the National Health Interview Survey (NHIS) and other National Center for Health Statistics (NCHS) surveys.
The sensitive questions on substance use, criminal behavior, and sexual activity are only asked with the consent of a parent or guardian of a young adult under age 18. We inform parents about the questions we ask, and both parents and teenagers are free to refuse to answer these questions. Our experience has been that participants recognize the importance of these questions and rarely refuse to answer. To further protect respondents and encourage honest reporting, these questions are contained in a self-administered section of the interview for any young adults interviewed in person. Because most young adults will be interviewed on the telephone, the sensitive questions have been written in such a way that the respondent can answer the questions without revealing personal information to anyone (such as a parent) who might overhear the conversation. Although we now ask about the age of the respondent’s most recent sexual partner and his or her relationship with the respondent, no identifying information is collected about sexual partners.
Informed Consent. At OMB’s request, we conducted cognitive testing before Round 20 to determine whether children and young adults understand the informed consent statement. A report summarizing this research was submitted with the Round 20 OMB clearance package. We will continue to use the consent statement developed as a result of that research and used for the first time in Round 20.
References
Betts, Peter. “Developing survey questions on sexual identity: Cognitive/in-depth interviews.” Office for National Statistics, July 2009.
Desimone, Jeffrey. “Is Marijuana a Gateway Drug?” Eastern Economic Journal 24,2 (Spring 1998): 149-163.
Dooley, David and Prause, Joann. “Underemployment and Alcohol Misuse in the National Longitudinal Survey of Youth.” Journal of Studies on Alcohol 59,6 (November 1998): 669-80.
The GenIUSS Group. “Best Practices for Asking Questions to Identify Transgender and Other Gender Minority Respondents on Population-Based Surveys.” J.L. Herman (Ed.). Los Angeles, CA: The Williams Institute, 2014.
Harford, Thomas C. and Muthen, Bengt O. “Adolescent and Young Adult Antisocial Behavior and Adult Alcohol Use Disorders: A Fourteen-Year Prospective Follow-Up in a National Survey.” Journal of Studies on Alcohol 61,4 (July 2000): 524-528.
Kenkel, Donald S. and Wang, Ping. “Are Alcoholics in Bad Jobs?” NBER Working Paper No. 6401, National Bureau of Economic Research, March 1998.
Miller, Kristen and J. Michael Ryan. “Design, Development and Testing of the NHIS Sexual Identity Question.” National Center for Health Statistics, October 2011.
Pacula, Rosalie Liccardo. “Adolescent Alcohol and Marijuana Consumption: Is There Really a Gateway Effect?” NBER Working Paper No. 6348, National Bureau of Economic Research, January 1997.
Parker, Douglas A.; Harford, Thomas C.; and Rosenstock, Irwin M. “Alcohol, Other Drugs, and Sexual Risk-Taking among Young Adults.” Journal of Substance Abuse 6,1 (1994): 87-93.
Redford, Jeremy and Aimee Van Wagen. “Measuring Sexual Orientation Identity and Gender Identity in a Self-Administered Survey: Results from Cognitive Research with Older Adults.” Presented: San Francisco, CA, Population Association of America Meetings, May 2012.
Yamada, Tetsuji; Kendix, Michael; and Yamada, Tadashi. “The Impact of Alcohol Consumption and Marijuana Use on High School Graduation.” Health Economics 5,1 (January-February 1996): 77-92.
The main NLSY79 interview will be administered to approximately 6,750 respondents, and the average response time is about 74 minutes per respondent.
The Young Adult survey will be administered to approximately 4,550 respondents ages 12 and older. These respondents will be contacted for an interview regardless of whether they reside with their mothers. The expected average response time for respondents without children is 54.5 minutes per respondent in the 12-13 age group, 70.5 minutes in the 14-18 age group, 67.5 minutes in the 19-22 age group, 64.5 minutes in the 23-28 age group, and 74.5 minutes for respondents age 29 and older (the longer interview time is due to the age 29 health module and the longer gap between interviews starting at age 31). Respondents in all age groups who have children in their household take an average of 20 additional minutes of survey time; total time including this additional child section is reflected in the table below. Members of the Young Adult grant sample are contacted for interviews every other round once they reach age 31.
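The expected interview lengths above reduce to a simple lookup by age group plus the roughly 20-minute child section; the sketch below (Python, illustrative only) encodes them for reference.

    # Expected Young Adult interview lengths in minutes, as listed above.
    AGE_GROUP_MINUTES = [
        (12, 13, 54.5),
        (14, 18, 70.5),
        (19, 22, 67.5),
        (23, 28, 64.5),
        (29, None, 74.5),   # age 29 and older
    ]
    CHILD_SECTION_MINUTES = 20   # additional time when children live in the household

    def expected_minutes(age, has_coresident_children):
        for low, high, minutes in AGE_GROUP_MINUTES:
            if age >= low and (high is None or age <= high):
                return minutes + (CHILD_SECTION_MINUTES if has_coresident_children else 0)
        raise ValueError("Young Adult respondents are age 12 or older")

    # Example: a 30-year-old with co-resident children is expected to take about
    # 94.5 minutes, which matches the burden table below.
    assert expected_minutes(30, True) == 94.5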
The time required to finish an interview varies in the sample. While women with young children take longer to answer the fertility and childcare questions, men are asked more questions about employment because they tend to hold more jobs. The data show the standard deviation of interview time for the main survey is around 25 minutes.
During the field period, fewer than 100 interviews may be validated to ascertain whether the interview took place as the interviewer reported and whether the interview was conducted in a polite and professional manner. These reinterviews average about 6 minutes each. As mentioned earlier, we employ a number of analytical techniques to limit the number of validation interviews required and to target these interviews to the respondents most likely to have had a problematic experience.
Table 5 below summarizes the estimated respondent burden for Round 29 of the NLSY79.
Table 5. Estimated Annualized Respondent Cost and Hour Burdens, Round 29
Instrument | No. of Respondents | No. of Responses per Respondent | Total No. of Responses | Average Burden per Response (minutes/60) | Total Burden Hours | Average Hourly Wage Rate | Total Burden Costs
NLSY79 Round 29 Main Survey | 6,750 | 1 | 6,750 | 74/60 | 8,325 | $7.25 | $60,356.25
Round 28 Validation Interviews | 100 | 1 | 100 | 6/60 | 10 | $7.25 | $72.50
Young Adult Survey (Ages 12 to 13) | 6 | 1 | 6 | 54.5/60 | 5 | $7.25 | $36.25
Young Adult Survey (Ages 14 to 18, no children) | 95 | 1 | 95 | 70.5/60 | 112 | $7.25 | $812.00
Young Adult Survey (Ages 14 to 18, children) | 4 | 1 | 4 | 90.5/60 | 6 | $7.25 | $43.50
Young Adult Survey (Ages 19 to 22, no children) | 329 | 1 | 329 | 67.5/60 | 370 | $7.25 | $2,682.50
Young Adult Survey (Ages 19 to 22, children) | 25 | 1 | 25 | 87.5/60 | 36 | $7.25 | $261.00
Young Adult Survey, Grant component (Ages 23 to 28, no children) | 964 | 1 | 964 | 64.5/60 | 1,036 | $7.25 | $7,511.00
Young Adult Survey, Grant component (Ages 23 to 28, children) | 338 | 1 | 338 | 84.5/60 | 476 | $7.25 | $3,451.00
Young Adult Survey, Grant component (Age 29 and older, no children) | 1,258 | 1 | 1,258 | 74.5/60 | 1,562 | $7.25 | $11,324.50
Young Adult Survey, Grant component (Age 29 and older, children) | 1,536 | 1 | 1,536 | 94.5/60 | 2,419 | $7.25 | $17,537.75
TOTALS1 | 11,305 | -- | 11,405 | -- | 14,357 | -- | $104,088.25
1 The total of 11,305 respondents across all survey instruments is a mutually exclusive count that excludes the 100 validation re-interview respondents, who are already counted among the main and Young Adult survey respondents.
The total response burden for the survey is 14,357 hours. The total annualized cost to respondents, based on burden hours and the Federal minimum wage of $7.25 per hour, is $104,088.25.
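For reference, these figures follow directly from the table rows: responses times average minutes per response, divided by 60 and rounded to hours, then multiplied by the $7.25 minimum wage. The sketch below (Python, illustrative only) repeats that arithmetic.

    # Illustrative check of the burden arithmetic in the table above.
    # Each entry: (total responses, average burden per response in minutes).
    rows = [
        (6750, 74), (100, 6), (6, 54.5), (95, 70.5), (4, 90.5), (329, 67.5),
        (25, 87.5), (964, 64.5), (338, 84.5), (1258, 74.5), (1536, 94.5),
    ]
    MINIMUM_WAGE = 7.25

    hours = sum(round(n * minutes / 60) for n, minutes in rows)
    cost = sum(round(n * minutes / 60) * MINIMUM_WAGE for n, minutes in rows)
    print(hours, cost)   # 14357 hours and 104088.25 dollars, matching the totals above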
Respondents for this survey will not incur any capital and start-up costs; respondents also will not incur any operation, maintenance, or purchase-of-service costs. There are no additional costs to respondents other than their time.
The estimated annual cost for the main NLSY79 and the associated Young Adult surveys is $10 million. This cost includes survey management, questionnaire design, instrument development, pretest and main data collection including incentive payments, cleaning and preparation of data files for users, and services to users of the data files.
The BLS bears approximately $7 million of this cost. The National Institute of Child Health and Human Development is expected to provide $3 million in funding annually through interagency agreements with the BLS and through a grant awarded to researchers at the Ohio State University Center for Human Resource Research (CHRR). The interagency agreement funds interviews with children and young adults up to age 24. The grant funds interviews of young adults age 25 and older.
The estimated total respondent burden of 14,357 hours for Round 29 is a decrease of 1,158 hours from Round 28. We estimate that the main questionnaire will take 74 minutes this round, down from 80 minutes in Round 28. We made several changes to the questionnaire to decrease the interview length, while adding new questions about the effects of the coronavirus outbreak on the employment, health, and retirement expectations of this cohort. In addition, we expect fewer interviews in Round 29 than in Round 28: we anticipate 250 fewer Main Youth interviews, reflecting 150 respondents who are expected to be newly deceased between Rounds 28 and 29 and another 100 who are expected to attrite from the survey. See Attachment 5 for a description of questionnaire changes.
Following transmittal of the survey data from NORC to CHRR, staff at CHRR spend approximately 14 months cleaning the data and preparing a main NLSY79 public-use data file. A month later, the Child and Young Adult data files are released and reports are written for the National Institute of Child Health and Human Development. The expected timing of these events is as follows:
NLSY79 Young Adult Interviews | September 2020–October 2021
Main NLSY79 Interviews | September 2020–October 2021
Data Cleaning and Coding | October 2021–January 2023
Public-Use Data File Preparation | February–November 2023
Release of Main NLSY79 Public-Use Data File | December 2023
Release of Young Adult File | January 2024
The OMB number and expiration date will be provided in the advance letter.
We do not have any exceptions to the “Certificate for Paperwork Reduction Act Submissions” statement.