This package requests clearance for the 23rd wave (2008) questionnaire of the National Longitudinal Survey of Youth 1979 cohort (NLSY79). The sample eligible for interview includes 9,964 respondents, who will be 43 to 50 years of age on December 31, 2007 (two subsamples of the original 12,686 respondents were dropped for budgetary reasons). Approximately 4.8 percent of the respondents are deceased, and in recent waves about 50–60 respondents (0.5–0.6 percent) have died since the previous round. There is no evidence of sample attrition bias at this time. The NLSY79 is a nationally representative sample of adults who were born in the years 1957 to 1964 and lived in the U.S. in 1978. The sample overrepresents black and Hispanic respondents born in those years and living in the United States when the survey began, so that there are sufficient sample cases to permit racial and ethnic analytical comparisons. Appropriate weights have been developed so that the sample components can be combined to aggregate to the overall U.S. population of the same ages, excluding those who have immigrated since 1978.
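The role of the weights can be illustrated with a small, purely hypothetical computation: each sample component receives a weight inversely related to its selection probability, so that weighted estimates aggregate to the population. The component names, values, and weights below are illustrative only, not actual NLSY79 weights.

```python
# Illustrative sketch only: hypothetical weights showing how an
# oversampled component is combined with a cross-sectional component
# into a population-representative estimate. These are NOT actual
# NLSY79 weights or values.

# (respondent value, sampling weight) pairs for two hypothetical components
cross_section = [(1, 3.0), (0, 3.0), (0, 3.0)]  # sampled near the population rate
oversample    = [(1, 1.0), (1, 1.0), (0, 1.0)]  # oversampled group, smaller weight

def weighted_mean(pairs):
    """Weighted mean: sum(w_i * y_i) / sum(w_i)."""
    num = sum(w * y for y, w in pairs)
    den = sum(w for _, w in pairs)
    return num / den

# Unweighted, the oversampled group is overrepresented (mean 0.5);
# weighting corrects the estimate toward the population composition.
all_pairs = cross_section + oversample
print(round(weighted_mean(all_pairs), 3))  # → 0.417
```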
This submission also seeks clearance for assessments and interviews of the Children of the NLSY79. The Children of the NLSY79 have been assessed since 1986, when the National Institute of Child Health and Human Development (NICHD) began sponsoring a set of supplemental surveys to gather extensive information about the lives of these children. A battery of child cognitive, socio-emotional, and physiological assessments has been administered biennially since 1986 to NLSY79 mothers and their children. Starting in 1994, children who had reached age 15 by December 31 of the survey year (the Young Adults) were interviewed about their work experiences, training, schooling, health, fertility, and self-esteem, as well as sensitive topics addressed in a supplemental self-administered questionnaire. By 2008, the Children of the NLSY79 will include 670 children under age 10; 1,110 children ages 10–14; 2,546 Young Adults ages 15–20; and 5,080 Young Adults age 21 and older.
The main NLSY79 is funded by the Department of Labor, with additional funding for the Children of the NLSY79 anticipated from an interagency agreement with the National Institute of Child Health and Human Development (NICHD). The Young Adult 21-and-over sample is funded entirely by a grant from NICHD. The Bureau of Labor Statistics has overall programmatic responsibility for the project, except for the Young Adult 21-and-over sample which is the responsibility of NICHD. Direction for the conduct of the main survey comes from the National Opinion Research Center (NORC), which is affiliated with the University of Chicago. NORC is also responsible for interviewing and reporting on the survey to BLS. Data processing, the development of final documentation, and the preparation of a public-use data set are completed by the Center for Human Resource Research (CHRR) of the Ohio State University. Direction for the conduct of the Young Adult 21-and-over component, also referred to as the Young Adult grant, is provided by CHRR.
The data collected in this survey are a continuation of an ongoing data-collection effort that previously has been approved by the Office of Management and Budget (OMB). The longitudinal focus of the survey requires the collection of identical information for the same individuals, as well as the occasional introduction of new data elements, to meet the ongoing data and analysis needs of various government agencies. Almost all of the information to be collected in this survey round has already been justified in earlier clearance documents submitted to OMB. Those data elements of a particularly sensitive nature are justified in this document.
National Longitudinal Survey of Youth 1979, 14–21 Years of Age on December 31, 1978
23rd Round (2008 Survey) Rationale, Objectives, and Analysis of Content
This survey represents the 23rd wave of data collection of the National Longitudinal Survey of Youth 1979 cohort (NLSY79). The data collected in this survey are thus a continuation of an ongoing data-collection effort that previously has been approved by the Office of Management and Budget (OMB). The longitudinal focus of the survey requires the collection of identical information for the same individuals, as well as the occasional introduction of new data elements, to meet the ongoing data and analysis needs of various government agencies and to reflect the changing life-cycle stages of the respondents. Most of the information to be collected in this survey round has already been justified in earlier clearance documents submitted to OMB.
Among the objectives of the Department of Labor (DOL) are to promote the development of the U.S. labor force and the efficiency of the U.S. labor market. The Bureau of Labor Statistics (BLS) contributes to these objectives by gathering information about the labor force and labor market and disseminating it to policy makers and to the public so that participants in those markets can make more informed and, thus, more efficient choices. The charge to BLS to collect data related to the labor force is extremely broad, as reflected in Title 29 USC Section 1:
“The general design and duties of the Bureau of Labor Statistics shall be to acquire and diffuse among the people of the United States useful information on subjects connected with labor, in the most general and comprehensive sense of that word, and especially upon its relation to capital, the hours of labor, the earnings of laboring men and women, and the means of promoting their material, social, intellectual, and moral prosperity.”
The collection of these data aids in the understanding of labor market outcomes through mid-career experienced by individuals who have been followed since the early stages of career and family development. NLS data represent an important means of fulfilling BLS responsibilities. See Attachment 1 for Title 29 USC Section 2, “Collection, collation, and reports of labor statistics.”
Through 1984, the NLSY79 consisted of annual interviews with a national sample of 12,686 young men and women who were ages 14 to 21 as of December 31, 1978, with overrepresentation of blacks, Hispanics, and economically disadvantaged non-black/non-Hispanics. The sample also included 1,280 persons serving in the military in 1978. The oversampled groups tend to experience above-average labor market difficulties and are disproportionately represented in federally financed employment and training programs. Starting in 1985, the military sample was reduced to 201 due to a cessation of funding from the Department of Defense. Again due to budget limits, starting in 1991 no attempt was made to interview the 742 male and 901 female economically disadvantaged non-black/non-Hispanic respondents. This reduced the eligible pool of respondents to 9,964. The most recent change to the survey occurred after the 1994 round of interviews, when the NLSY79 switched from an annual to a biennial interview schedule.
In addition to the regular interviews, several supplementary data-collection efforts completed during the early survey years greatly enhance the overall value of the survey to government agencies and academic researchers. The full Armed Services Vocational Aptitude Battery (ASVAB) was administered to 94 percent of the sample respondents. This was done pursuant to a Congressional mandate to “renorm” the ASVAB. Also, for a very large proportion of the total sample, information has been collected about the characteristics of the last high school each respondent attended, as well as some personal characteristics of the respondents while attending high school (including courses taken and grades).
These supplementary data-collection efforts have enabled researchers to complete careful studies of the relationship between a youth’s background environment, employment behaviors, vocational aptitudes, and high school quality. They have helped the Departments of Labor, Defense, Education, and Health and Human Services and many congressional committees to make more knowledgeable decisions when evaluating the efficacy of programs in the areas of military and civilian employment, training, and health.
The NLSY79 is a general-purpose study designed to serve a variety of policy-related research interests. Its longitudinal design and conceptual framework serve the needs of program and policy makers in a way that cross-sectional surveys cannot. In addition, the NLSY79 allows a broad spectrum of social scientists concerned with the labor market problems of young baby boomers to pursue their research interests. Participation in the survey by other government agencies is encouraged, as the increasingly omnibus nature of the survey makes it an efficient, low-cost data set. As noted, the survey has incorporated items needed for program and policy purposes by agencies other than the Department of Labor. In this survey round, we anticipate funding from the National Institute of Child Health and Human Development and the National Institute on Drug Abuse.
In this survey round, information once again will be collected about the biological children of the female respondents to the main NLSY79. For the most part, this collection of data about the children repeats surveys already administered to these children biennially from 1986 through 2006. These unique data permit medical and social science researchers to consider a large number of basic research issues relating to the effects of family background, federal program activities, and infant and maternal health on outcomes from early childhood through adolescence and into early adulthood. This will be elaborated at length in a subsequent section. Thus, while the principal focus of the survey remains the collection of data for labor force analysis, the questionnaires administered to these children and older youth include items needed by other agencies that are not always directly related to employment and training studies. As these children reach adolescence, the focus of the surveys of these “young adults” returns to the school-to-work transition.
Sample sizes and the expected number of interviews for each group are listed in table 1.
Table 1. NLSY79 Sample Size and Expected Response in 2008 (Round 23)
Cohort | Approximate sample size | Expected number of interviews
NLSY79 pretest | 1301 | 1001
NLSY79 main youth | 9,424 | 7,550
Children ages 0–9 | 670 | 550
Children ages 10–14 | 1,110 | 900
Young Adults ages 15–20 | 2,546 | 2,195
Young Adults ages 21 and older | 5,080 | 4,165
The specific objectives of the NLSY79 fall into several major categories that will be further explained below:
to explore the labor market activity and family formation of individuals in this age group
to explore in greater depth than previously has been possible the complex economic, social, and psychological factors responsible for variation in the labor market experience of this cohort
to explore how labor market experiences shape the evolution of careers and wealth, as well as this cohort’s preparation for their children’s further education and for their own retirement
to analyze the impact of a changing socio-economic environment on the educational and labor market experiences of this cohort by comparing data from the present study with those yielded by the surveys of the earlier NLS cohorts of young men (which began in 1966 and ended in 1981) and young women (which began in 1968 and ended in 2003), as well as the more recent NLS cohort of young men and women born in the years 1980-84 and interviewed for the first time in 1997
to consider how the employment-related activities of women affect the subsequent cognitive and emotional development of their children, and how the development of the children affects the activities of the mother
to meet the data-collection and research needs of various government agencies that have been interested in the relationships between child and maternal health, drug and alcohol use, and juvenile deviant behavior and child outcomes such as education, employment, and family experiences
The NLSY79 has several characteristics that distinguish it from other data sources and make it uniquely capable of meeting the major purposes described above. The first of these is the breadth and depth of the types of information that are being collected. It has become increasingly evident in recent years that a comprehensive analysis of the dynamics of labor force activity requires an eclectic theoretical framework that draws on several disciplines, particularly economics, sociology, and psychology. For example, the exploration of the determinants and consequences of the labor force behavior and experience of this cohort requires information about (1) the individual’s family background and ongoing demographic experiences; (2) the character of all aspects of the environment with which the individual interacts; (3) human capital inputs such as formal schooling and training; (4) a complete record of the individual’s work experiences; (5) the behaviors, attitudes, and experiences of closely related family members, including spouses and children; and (6) a variety of social-psychological measures, including attitudes toward specific and general work situations, personal feelings about the future, and perceptions of how much control one has over one’s environment.
A second major advantage of the NLSY79 is its longitudinal design, which permits investigation of labor market dynamics that would not be possible with one-time surveys and allows directions of causation to be established with much greater confidence than cross-sectional analyses. Also, the considerable geographic and environmental information available for each respondent for each survey year permits a more careful examination of the impact that area employment and unemployment considerations have for altering the employment, education, and family experiences of these cohort members and their families.
Third, the oversampling of blacks and Hispanics, together with the other two advantages mentioned above, makes possible more sophisticated examinations of human capital creation programs than previously have been possible. Post-program experiences of “treatment” groups can be compared with those of groups matched not only for preprogram experience and such conventional measures as educational attainment, but also for psychological characteristics that have rarely been available in previous studies.
As has been indicated above, the study has several general research and policy-related objectives. In Attachment 4, we elaborate on these basic purposes by setting forth a series of specific research themes. The detailed content of the interview schedule is then related to these themes. Attachment 5 tabulates the relationships of the recent modules of the survey to these objectives. Attachment 6 summarizes the new questions and lines of inquiry in the proposed questionnaire. A rationale for the various child assessments is contained in Attachment 7. Attachment 8 includes a copy of a BLS news release issued on August 25, 2006, that highlights findings from previous rounds of the NLSY79. Attachment 9 provides the advance letter and Privacy Act statement that will be sent to respondents prior to data collection. Attachment 10 provides justification for a specific set of political participation questions. Attachment 11 lists the data-collection instruments that will be used in round 23; these instruments are submitted as separate electronic files. In reviewing the questionnaire and the research themes, it should be noted that, because of the longitudinal nature of the NLSY79, there has been no need to recollect much of the background data gathered in the early rounds of the survey.
As the uses of this survey are described, the reader should take note of the other cohorts included in the National Longitudinal Surveys. In 1966, the first interviews were administered to members of two cohorts: Older Men ages 45–59 and Young Men ages 14–24. In 1967, the sample of Mature Women ages 30–44 was first interviewed. The last of the four original cohorts was the Young Women, who were ages 14–24 when first interviewed in 1968. The survey of the Young Men was discontinued after the 1981 interview, and the last survey of the Older Men was conducted in 1990. The Young and Mature Women were discontinued after the 2003 interviews. The most recent cohort added to the NLS program is the NLSY97, youths ages 12–16 by the end of 1996. This cohort was interviewed for the first time in 1997, interviewed again beginning in the fall of 1998, and continues to be interviewed annually. A fuller description of the National Longitudinal Surveys program can be found on the NLS program website, www.bls.gov/nls. The various cohorts are briefly summarized in the table below.
Table 2. The NLS: Survey groups, sample sizes, interview years, and survey status
Survey group | Age cohort | Birth year cohort | Original sample | Initial year/latest year | Number of surveys | Number at last interview | Status
Older men | 45–59 | 4/1/06–3/31/21 | 5,020 | 1966/1990 | 13 | 2,092 (1) | Ended
Mature women | 30–44 | 4/1/22–3/31/37 | 5,083 | 1967/2003 | 21 | 2,237 | Ended
Young men | 14–24 | 4/1/41–3/31/52 | 5,225 | 1966/1981 | 12 | 3,398 | Ended
Young women | 14–24 | 1943–1953 | 5,159 | 1968/2003 | 22 | 2,859 | Ended
NLSY79 | 14–21 | 1957–1964 | 12,686 (2) | 1979/2006 | 21 | 7,661 (3) | Continuing
NLSY79 children | birth–14 | — | — (4) | 1986/2006 | 10 | 2,514 (3) | Continuing
NLSY79 young adults | 15 and older (5) | — | — (4) | 1994/2006 | 6 | 5,024 (3) | Continuing
NLSY97 | 12–16 | 1980–1984 | 8,984 | 1997/2006 | 10 | 7,338 (6) | Continuing
(1) Interviews in 1990 also were conducted with 2,206 widows or other family members of deceased respondents.
(2) After dropping the military (in 1985) and economically disadvantaged non-black/non-Hispanic oversamples (in 1991), the sample contains 9,964 respondents eligible for interview.
(3) The latest sample size available is from the 2004 survey.
(4) The size of the NLSY79 child sample depends on the number of children born to female NLSY79 respondents, attrition over time, and the gradual aging of the children into the young adult sample. The size of the young adult sample depends on the number of children who reach age 15 in each survey year. Information about the number interviewed in each survey is available in chapter 4 of the NLS Handbook.
(5) In 1998 only, the young adults eligible for interview were limited to those ages 15–20.
(6) The latest sample size available is from round 9.
The NLSY79 is used by BLS and other government agencies to examine a wide range of labor market issues. In addition to BLS publications, in recent years analyses have been conducted for the Secretary of Labor, other agencies of the Executive Branch, the Government Accountability Office (formerly the General Accounting Office), and the Congressional Budget Office.
The field staff of the National Opinion Research Center (NORC) makes every effort to ensure that the information is collected as expeditiously as possible, with minimal interference in the lives of the respondents. Our success in this regard is suggested by the very high continuing response rates and low item refusal rates that have been attained. More recently, we also have adopted technologies that lower respondent burden.
During round 11 (1989) of the NLSY79, about 300 cases were collected using Computer Assisted Personal Interviewing (CAPI). During round 12 (1990), CAPI was used again, this time for about 2,400 cases with a longer and more complex questionnaire. The CAPI efforts in 1989 and 1990 were designed to assess the feasibility of the method and its effect on data quality. Since 1993, the NLSY79 has been conducted using CAPI for all cases. (At least some cases are completed over the telephone in each round; in recent rounds, about 85 percent of main youth cases have been completed by phone. Computer Assisted Telephone Interviewing is ordinarily abbreviated CATI, but because our system uses the same instrument regardless of interview mode, throughout this submission we use CAPI as a generic term for all computer-assisted NLSY79 interviews.) The system has proved stable and reliable, and is well received by interviewers, respondents, and researchers.
An analysis of the round 12 (1990) experimental data revealed that the quality of the data, as measured by missing or inconsistent responses, was greatly improved by CAPI. The effects on response patterns were minor, although some answer patterns were affected because the CAPI technology changed the way certain questions were asked. Production data from rounds 15–21, based on more than 80,000 completed instruments, confirmed that CAPI yields high-quality data.
In 1994 we also began to administer the Child Survey assessments using CAPI, which simplified their administration and reduced interview time and respondent burden. Average interview time fell throughout the field period, and quality control measures showed the data to be in excellent condition. The Young Adult survey, begun in 1994, originally paired a CAPI interview with a paper-and-pencil self-report booklet. In 2000, this booklet was converted to CAPI, and in the 2008 fielding the entire Young Adult questionnaire will be in CAPI format.
CAPI surveying will continue to be used in round 23. We estimate the average interview would be 10–15 percent longer if paper and pencil were used instead of CAPI.
In round 23, we will continue to exploit computer interviewing to reduce the number of validation reinterviews needed to ensure that field interviewing is accurate. The survey includes a number of questions for which interviewers should not be able to fabricate plausible answers, such as the respondent’s date of birth, height, and weight. Entries for these questions that are inconsistent with information we already have could signal that an interviewer falsified an interview. We will then conduct reinterviews only in cases where an inconsistency suggests that a check is needed. This approach, first used in round 20, allowed us to reduce the number of validation reinterviews from about 1,250 in round 19 to about 200 in round 20.
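The kind of consistency check described above can be sketched as follows. The field names, tolerances, and records are hypothetical illustrations, not the actual NLSY79 verification rules.

```python
# Hypothetical sketch of flagging interviews whose hard-to-fabricate
# answers conflict with information already on file. Field names,
# tolerances, and records are illustrative, not actual NLSY79 rules.

def flag_for_reinterview(prior, current,
                         height_tol_inches=1.0, weight_tol_pounds=15.0):
    """Return the fields inconsistent with the prior record; a nonempty
    list signals that a validation reinterview may be needed."""
    problems = []
    if current["date_of_birth"] != prior["date_of_birth"]:
        problems.append("date_of_birth")
    if abs(current["height_inches"] - prior["height_inches"]) > height_tol_inches:
        problems.append("height_inches")
    if abs(current["weight_pounds"] - prior["weight_pounds"]) > weight_tol_pounds:
        problems.append("weight_pounds")
    return problems

prior   = {"date_of_birth": "1960-05-14", "height_inches": 67.0, "weight_pounds": 150.0}
current = {"date_of_birth": "1960-05-14", "height_inches": 72.0, "weight_pounds": 152.0}
print(flag_for_reinterview(prior, current))  # height differs by 5 inches
```

Only flagged cases would be routed to a validation reinterview; fully consistent cases pass without one.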
In round 23, we also plan to introduce an additional computerized check to ensure that an interview has taken place. Our laptops will be equipped with an internal microphone and recording capability. We will record short segments of the interview (that is, 15 seconds or less) several times throughout the interview at points unknown to the interviewer. If we have any suspicions about a given interview, we can then listen to the sound files and ensure that the interviewer is reading the appropriate question aloud. For round 23, we plan to use these sound files only for verification purposes and not for data quality review or any other research purpose. We plan to focus our recordings on questions where the interviewer will be talking (for example, a long introduction screen explaining different types of jobs) and we expect to minimize recordings of any respondent voices. However, we cannot rule out the possibility that a respondent may be speaking when a recording occurs. We will include in the introduction to the questionnaire a statement that “Parts of this interview may be recorded for quality control purposes. This will not compromise the confidentiality of your responses” (“CONSENT-1200” in the main youth questionnaire and “INTRO” in the young adult questionnaire).
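The random-snippet scheduling might work roughly as sketched below; the segment count, interview length, and seeding shown are hypothetical, chosen only to illustrate selecting recording points unknown to the interviewer.

```python
# Hypothetical sketch of scheduling short recording segments at random
# points during an interview. Parameters are illustrative, not the
# actual NLSY79 recording design.
import random

def schedule_recordings(interview_minutes, n_segments=4, max_seconds=15, seed=None):
    """Pick n distinct random start times (in seconds) within the
    interview, each paired with a segment length of at most max_seconds."""
    rng = random.Random(seed)
    total_seconds = interview_minutes * 60
    starts = sorted(rng.sample(range(0, total_seconds - max_seconds), n_segments))
    return [(start, rng.randint(5, max_seconds)) for start in starts]

schedule = schedule_recordings(interview_minutes=70, n_segments=4, seed=42)
for start, length in schedule:
    print(f"record {length:2d}s starting at {start}s")
```

Because the schedule is generated on the laptop at interview time, the interviewer cannot anticipate which questions will be captured.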
A study entitled “National Social Data Series: A Compendium of Brief Descriptions” by Richard C. Taeuber and Richard C. Rockwell includes an exhaustive list with descriptions of the national data sets available at that time. A careful examination of all the data sets in their comprehensive listing indicates clearly that no other data set would permit the comprehensive analyses of youth and young adult employment that can be conducted using the National Longitudinal Surveys. Indeed, it was the absence of alternative data sources that was the deciding factor in the Department of Labor’s determination (in 1977) to sponsor this comprehensive youth survey. The longitudinal nature of the survey and the rich background information collected mean that no survey subsequently released can provide data to replace the NLSY79. The expansion in the mid-1980s of the NLSY79 data set to incorporate the child outcome data represents a unique data-collection effort.
Survey staff have continued to confirm that no comparable data set exists. An investigation into data sets related to wealth by F. Thomas Juster and Kathleen A. Kuester describes the data available in the wealth domain, showing the unique nature of the data available in the NLS.
The volume The Future of the Survey of Income and Program Participation points out a number of contrasts between the Survey of Income and Program Participation (SIPP) and the NLS and other major longitudinal studies (see especially pages 77, 107, and 265–7). This book was written primarily to review the SIPP, but helps put the major longitudinal surveys in perspective.
As we will describe more fully below, BLS convened a conference in fall 1998 to look at the design of the NLSY79 for 2002 and beyond, and in the process external reviewers assessed the coverage, quality, and duplication of the survey. In its central areas of coverage—event histories on labor supply, major demographic events, and child assessments—this conference concluded that the NLSY79 was well-designed and continued to be a unique resource, unduplicated in the national statistical inventory.
As these studies show, there is no other longitudinal data set available that can be utilized to address effectively the many research topics highlighted in Attachment 4. This data set focuses specifically and in great detail on the employment, educational, demographic, and social-psychological characteristics of a national sample of young baby boomers and their families and measures changes in these characteristics over long time periods. It gathers this information for both men and women, as well as for relatively large samples of non-black/non-Hispanic, black, and Hispanic adults. The repeated availability of this information permits consideration of employment, education, and family issues in ways not possible with any other available data set. The combination of (1) longitudinal data covering the time from adolescence; (2) national representation; (3) large minority samples; and (4) detailed availability of education, employment and training, demographic, health, child outcome, and social-psychological variables makes this data set, and its utility for social science policy-related research, unique.
In addition to the unique content of the interviews, the survey is also distinctive because of its coverage of the respondents’ lives for more than 25 years and the linkage between concurrently collected data on mothers and their children. It is these aspects that attract the thousands of users who rely on this survey for their studies of American workers, their careers, and their families.
References:
Citro, Constance C. and Kalton, Graham, eds. The Future of the Survey of Income and Program Participation. Washington, DC: National Academy Press, 1993.
Juster, F. Thomas and Kuester, Kathleen A. “Differences in the Measurement of Wealth, Wealth Inequality and Wealth Composition Obtained from Alternative U.S. Wealth Surveys.” Review of Income and Wealth Series 37, Number 1 (March 1991): 33-62.
Taeuber, Richard C. and Rockwell, Richard C. “National Social Data Series: A Compendium of Brief Descriptions.” Review of Public Data Use 10,1-2 (May 1982): 23-111.
Not applicable as the NLSY79 is a survey of individuals in household and family units.
The core of the National Longitudinal Surveys is the focus on labor force behavior. It is very difficult to reconstruct labor force behavior retrospectively and still maintain sufficient precision and data quality. This is the single most important reason we strive to maintain regular interviews with these respondents, who on average experience relatively frequent transitions in employment, income and earnings, and family and household structure. The dates of these transitions are difficult to reconstruct when one focuses on events earlier than the recent past. For those who are employed, retrospective information on wages, detailed occupations, job satisfaction, and other employment-related characteristics cannot be easily recalled.
As with employment-related information, data about a respondent’s education and training history are also extremely difficult to recall retrospectively. Completion dates of training and education programs are subject to severe memory biases. Thus, causal analyses that require a sequencing of education, training, and work experiences cannot be easily or accurately accomplished with historical data. Not only are completion dates of educational and training experiences frequently difficult to recall, but there is also evidence that misreporting of program completion is not unusual.
The precise timing and dating of demographic, socio-economic, and employment events, so crucial to most labor force analysis, is in most instances impossible to reconstruct accurately through retrospective data collection that extends very far into the past. For example, we have evidence from the NLS that dates of events of fundamental importance, such as marriage and birth histories, are subject to considerable error at the disaggregated level when collected retrospectively. Respondents have difficulty recalling when their marriages began or ended. Also, accurate information about household structure, how it changes over time, and how this relates to changes in family income and labor force dynamics is difficult to reconstruct retrospectively, as is the information on the health and related behaviors of the respondents, their spouses, and their children.
Finally, it is important to emphasize that information of a subjective nature can only be accurately reported and collected on a current, continuing basis. Recollection of attitudes may be colored by subsequent experiences or reflect a rationalization of subsequent successes or failures. Attitudes as widely diverse as one’s ideas about women’s roles or how one perceives one’s health as of an earlier period can be recollected inaccurately, even when respondents are trying to be as honest as they can. In addition, the older the events that one tries to recall, either objective or subjective in nature, the greater the likelihood of faulty recall. The recall of events or attitudes is often biased either by a tendency to associate the event with major life-cycle changes (that may or may not be in temporal proximity to what one is trying to recall) or to move the event into the more recent past. The cognitive and socio-emotional information collected for the children of the NLSY79 respondents is, of course, sensitive to the age and life-cycle stage through which the particular children are progressing and, in many instances, changes over time. This is the reason why we need to repeat some of the assessments, in order to measure the extent to which the results are related to the age of the child as well as intervening family and non-family activities.
Because of budget cuts at BLS over the past 15 years, the funding for the NLSY79 has been significantly reduced. While more frequent interviewing is desirable, financial limitations prompted the NLSY79 to move to a biennial interview cycle beginning in 1994. The data loss due to reduced frequency is somewhat ameliorated by the fact that the cohort is more established, having negotiated the school-to-work transition with varying degrees of success. The NLSY79 uses bounded interviewing techniques and is designed so that when respondents miss an interview, information not collected in the missed interview is gathered in the next completed interview. In this way, the event history on work experience is very complete.
A study was conducted to assess the impact of the longer recall period by using an experimental design in the 1994 interview. About 10 percent of the respondents who were interviewed in 1993 were given a modified instrument that was worded as if the respondents were last interviewed in 1992. Using this experimental structure, we examined the respondents’ reports on experiences between the 1992 and 1993 interviews using information from their 1993 and 1994 reports on that same reference period. As expected, recall was degraded by a lower interview frequency. Events were misdated and some short duration jobs were not reported when the reference period was moved back in time. Based on this evidence, it is clear that less frequent data collection adversely affects longitudinal surveys.
A second potential problem caused by the move to a biennial interview is a decline in our ability to locate respondents who move. We have been able to compensate for this so far, but a change to less frequent interviewing would likely have a more negative impact.
None of the listed special circumstances apply.
No comments were received as a result of the Federal Register notice published in Volume 72, No. 176 on September 12, 2007.
It should be noted that several comments were received in response to the Federal Register notice published in Volume 72, No. 54 on March 21, 2007. Three sets of comments were received, all favoring the inclusion of the proposed political participation module. The first letter of comment indicated that the inclusion of a political participation module represented a "low cost, low risk opportunity" to expand the NLS user base. The second letter of comment discussed how the module "assesses important aspects of America's involvement in our system of government" and noted that the questions "are the subject of substantial scholarly research interest." The commentator also noted that the questions have been rigorously tested and are based on many years of asking Americans about their orientation toward politics. The third letter of comment recounted the Irish Central Statistics Office's (CSO) experience with including a political participation module in its Quarterly National Household Survey (QNHS). The commentator noted that the module is extensively used by social scientists and is considered an innovation and a success for the CSO.
There have been numerous consultations regarding the NLSY79. Preceding the first round of the NLSY79, the Social Science Research Council sponsored a conference at which academics from a broad spectrum of the social sciences were invited to present their views regarding (1) the value of initiating a longitudinal youth survey and (2) what the content of the surveys should be. The initial survey development drew heavily on the suggestions made at this conference, which were published in a proceedings volume under the auspices of the SSRC.
In 1988, the National Science Foundation sponsored a conference to consider the future of the NLS. This conference consisted of representatives from a variety of academic, government, and non-profit research and policy organizations. There was enthusiastic support for the proposition that the NLS should be continued in the current format, and that the needs for longitudinal data would continue over the long run. The success of the NLS, which was the first general-purpose, longitudinal labor survey, has helped reorient survey work in the United States toward longitudinal data collection and away from simple cross sections.
BLS has consulted with its Business Research Advisory Council and Labor Research Advisory Council for their input into questionnaire content.
Also, on a continuing basis, BLS and its contractor, the Center for Human Resource Research, encourage NLS data users to (1) suggest ways in which the quality of the public-use data can be improved and (2) suggest additional data elements that should be considered for inclusion in subsequent data rounds. We encourage this feedback through the public information offices of each organization and through the quarterly NLS Newsletter.
Individuals from other federal agencies who were consulted regarding the content of the 2006 survey include:
V. Jeffrey Evans
Director of Intergenerational Research
National Institute of Child Health and Human Development
6100 Executive Boulevard, Room 8B07
Bethesda, MD 20892-7510
The NLS program has a Technical Review Committee that advises BLS on interview content and long-term objectives. That group has met twice a year for the past decade. Table 3 below shows the current members of that committee.
Table 3. Technical Review Committee for the NLS (2007)
David Autor Massachusetts Institute of Technology Department of Economics 50 Memorial Drive, E52-371 Cambridge, MA 02142 Email: [email protected] Phone: 617-258-7698 Fax: 617-253-1330
Janet Currie Professor, Department of Economics Columbia University Room 1038 IAB 420 West 118th Street New York, NY 10027 Email: [email protected] Phone: 212-854-4520 Fax: 212-854-8059
Paula England Department of Sociology Building 120, Serra Mall Stanford University Stanford, CA 94305-2047 Email: [email protected] Phone: 650-723-4912 Fax: 650-725-6471
Jeff Grogger Harris School of Public Policy University of Chicago Suite 139 1155 E. 60th Street Chicago, IL 60637 E-mail: [email protected] Phone: 773-834-0973
Arie Kapteyn Senior Economist RAND 1776 Main Street Santa Monica, CA 90401 Email: [email protected] Phone: 310-393-0411 x7973 Fax: 310-393-4818
Annamaria Lusardi Dartmouth College Economics Department 301 Rockefeller Hall Hanover, NH 03755 Email: [email protected] Phone: 603-646-2099 Fax: 603-646-2122
Derek Neal Professor and Chair Department of Economics University of Chicago 1126 E. 59th Street Chicago, IL 60637 Email: [email protected] Phone: 773-702-8166 Fax: 773-702-8490
Seth Sanders Department of Economics University of Maryland College Park, MD 20742 Email: [email protected] Phone: 301-405-3497
Chris Taber Professor of Economics Northwestern University 302 Arthur Andersen Hall 2001 Sheridan Road Evanston, IL 60208-2600 Email: [email protected] Phone: 847-491-8229
Bruce Western Professor, Department of Sociology Wallace Hall Princeton University Princeton NJ 08544 Email: [email protected] Phone: (609) 258-2445 Fax: (609) 258-2180
The NLS Technical Review Committee convened a conference in 1998 to review the current and future design of the NLSY79. This conference indicated that the central design of the NLSY79 remained strong, although changes in the nation's welfare program required changes in the program recipiency section of the survey. Many of these changes were implemented in the 2000 and 2002 interviews. Some health section modifications were introduced in 2006 (a cognitive functioning module), and the 2008 survey will include a new health module for respondents who have reached age 50 (mirroring the age 40 module).
In addition to the Technical Review Committee, the decisions concerning which child outcome measures to include in the child assessment sections of the NLSY79 were carefully considered from a number of perspectives. The National Institute of Child Health and Human Development (NICHD) has collaborated with BLS on the NLSY79 for many years, with NICHD providing funds for topical modules that are added on to the core interview. This collaboration reduces the total cost of data collection for the government. NICHD staff consult with experts outside NICHD to determine priorities for the modules it funds. NICHD staff, CHRR personnel, and nationally recognized panels of experts jointly made the decisions about NICHD-sponsored survey topics from medicine and the social sciences. The individuals on these panels were all highly respected social scientists with national reputations, and each had specialized areas of interest central to this study.
The NICHD has also convened groups of outside experts to review the progress and content of the data collected for NICHD within the NLSY79 program. A brief description of outside experts consulted with respect to the child instruments and their affiliations may be found in table 4 below.
Table 4. Advisors and Experts Contributing to NLSY79 Child and Young Adult Surveys
Children of the NLSY79: Ann L. Brown, Joseph Campione, Joseph Campos, Lindsay Chase-Lansdale, William E. Cross, Jr., Robert Emery, Rochel Gelman, Willard H. Hartup, Lois Hoffman, Jerome Kagan, Luis M. Laosa, Robert Michael, Marian Radke-Yarrow, Henry Ricciuti, Joseph Rodgers, Barbara Starfield (M.D.), Linda Waite, Kenneth Wolpin, Michael Yogman (M.D.), Nicholas Zill

Young Adults: Kenneth Wolpin, Elizabeth Menaghan, Kristi Williams, James R. Walker, David Blau, Joe Rodgers, Sandra L. Hofferth, Freya L. Sonenstein, Guang Guo
Because this is a long-term study requiring the subjects to be reinterviewed regularly, respondents are offered compensation for completing the interview as a means of securing their long-term cooperation. Respondent payments are appropriate given the long-term nature of the survey – 2008 will be the 23rd round, and the 29th year since the survey began. We conducted an experiment during Round 19 (2000) of the survey to determine the effect of higher response fees on reluctant respondents toward the end of the field period. Then, during Round 20 (2002) we conducted an experiment that focused on cooperative respondents to determine whether respondent fees can motivate them to cooperate to a degree that significantly reduces our field costs. We documented the results of the first experiment in our clearance statement for the Round 20 field effort and the second (the Early Bird experiment) in our clearance statement for Round 21. A brief summary of those findings is as follows:
Respondent fees (round 19) - Offering a higher respondent fee to reluctant respondents toward the end of the field period is most cost-effective when the respondent refused the interview the previous round. When the respondent had cooperated the previous round and the choice was between a $40 fee and an $80 fee, normal conversion efforts plus the $40 fee converted a sufficiently high fraction of reluctant respondents that the cost in respondent fees per incremental case was about $270. When the respondent had refused in the previous round, comparing the yield from a $40 fee with that from an $80 fee, an additional case cost about $133 in additional fees. These costs per incremental case are calculated by dividing the change in total fees paid by the change in cases interviewed for the $40 versus $80 fee treatments (see the round 20 clearance package for a more detailed explanation).
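The cost-per-incremental-case figure described above is a simple ratio of fee and completion differences between treatments. A minimal sketch follows; the treatment totals in the example are hypothetical illustrations, not the actual round 19 figures:

```python
def cost_per_incremental_case(fees_a, cases_a, fees_b, cases_b):
    """Change in total fees paid divided by the change in completed
    cases between two fee treatments (e.g., $40 vs. $80)."""
    return (fees_b - fees_a) / (cases_b - cases_a)

# Hypothetical example: the $40 treatment yields 200 completions for
# $8,000 in fees; the $80 treatment yields 230 completions for $18,400.
print(round(cost_per_incremental_case(8000, 200, 18400, 230), 2))  # 346.67
```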
Early Bird Experiment (round 20) - Offering a higher respondent fee if the respondent will call the interviewer rather than waiting for the interviewer to call him/her is very effective in reducing field costs. This experiment allowed us to attain several months of production with fewer hours per case than we have observed in the best week in the past fifteen years. We tried both $60 and $80 treatments in this experiment. While the $60 fee was more cost-effective, for both treatments field costs were so low that either incentive fee in return for cooperative behavior is highly favorable to the project. This experiment was conducted in twelve replicates fielded sequentially. We were able to evaluate the experiment for each of the randomly selected replicates within a few weeks of mailing out the materials.
To provide perspective, toward the end of Round 20 in 2002, the weekly field costs per case were over $500. Respondent incentives represent only a fraction of the total field costs, and higher incentives can be a cost-effective means of increasing response while constraining the overall budget. We face a growing pool of respondents who are reluctant to cooperate. In this section we propose a set of measures to encourage cooperation that involve more than just higher respondent fees.
Marketing - We have put a great deal of effort into respondent materials that describe the importance of the study. These materials are colorful, easy to read, and produced by experienced marketers. For example, in round 20 one brochure used jellybeans as a visual theme illustrating the importance of each respondent. In round 21, we extended this marketing approach to include the use of small in-kind gifts, which were tailored to the respondent’s needs or coordinated with the respondent materials. We plan to continue use of such small gifts in Round 23. For example, a flyer with the theme “You’re a Lifesaver” might be accompanied by a bag of Lifesaver candies.
Pulling these three strands together (respondent fees, Early Bird promotion, and the marketing campaign), we request clearance for the following, integrated conversion strategy:
Main Youth Survey:
We request permission to increase the base main youth respondent fee to $50, an increase of $10 over the amount given in 2002–2006 to respondents not in one of the incentive experiment groups. We feel that an increase is necessary at this time because the respondent fee has been level for several rounds, and our experience with the various NLS cohorts indicates that respondents react positively to regular increases in the fee amount. Because the pretest sample is not offered the Early Bird, these respondents would receive $50 as well.
As OMB requested last round, we will make an Early Bird offer to all main youth respondents. We request permission to increase the Early Bird fee to $70 for those offered $60 last round. This small $10 increase reflects the need to raise incentives over time to keep pace with inflation and the cost of living. We would also like to maintain the $80 fee for those who received that amount last round. As in 2004, all main survey members who were in the same family in 1978 will be offered the same fee amount.
For respondents in the $80 treatment cells in the round 19 experiment, we plan to continue offering them the same incentive amount for Round 23.
We request permission to institute an additional incentive for respondents who were not interviewed in the previous survey round(s); this is appropriate because these respondents tend to have slightly longer interviews as they catch up on previous rounds’ information. This incentive would be structured in exactly the same way as the successful non-interview premium that has been used for several rounds in the NLSY97. Specifically, we would offer respondents $10 per consecutive round missed, up to $30 (3 rounds). We would be careful to inform respondents that this is a one-time additional incentive in appreciation of the additional time needed to catch up in this interview round, and that they will not receive this additional amount next round. Based on our experience in the NLSY97, we anticipate that respondents will appreciate this non-interview premium and will understand the distinction between the base amount and the additional incentive.
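The proposed non-interview premium is a simple capped rule. As a sketch, using the dollar amounts described above:

```python
def nir_premium(consecutive_rounds_missed):
    """One-time premium for returning respondents:
    $10 per consecutive round missed, capped at $30 (3 rounds)."""
    return min(consecutive_rounds_missed, 3) * 10

# Premiums for 0 through 4 consecutive missed rounds:
print([nir_premium(n) for n in range(5)])  # [0, 10, 20, 30, 30]
```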
To avoid problems with unequal treatment within families, we will continue to offer all relatives the best deal to which any family member is entitled, except for the additional NIR premium. Because our approach is to tie higher incentives to cooperative behavior, bringing more respondents under the Early Bird rubric will reduce our overall cost structure, hence extending higher incentives is a win-win offer for the project and respondents.
Young Adult Survey:
We request permission to offer $50 as the base incentive for young adult respondents. This represents a $10 increase over the $40 amount given for the last 2 rounds.
Child Survey:
In past rounds, we have provided the mother with an incentive payment upon completion of the entire suite of child survey instruments (mother supplement, child interview/assessment, and child self-administered supplement). Through 2004, these instruments were typically administered within a span of a few days, if not on the same day, so this approach posed no problems. In 2006, we incorporated the mother supplement assessments into the main youth interview. However, we maintained the traditional incentive approach of providing the incentive after completion of the entire suite of instruments (mothers received $10 for completing the mother supplement and child assessment, and an additional $10 if a child age 10-14 completed the child self-administered supplement). This led to complications in the administration of the incentive payments. For children under the age of 4, the only instrument to be completed is the mother supplement, so the incentive was provided for these children following the completion of the main youth interview. For children ages 4-14, we attempted to complete the child interview/assessment and child self-administered supplement (if appropriate); depending on the timing of the main youth interview, this may have occurred several months later. Assuming that the child completed the child interview/assessment, the mother received $10 at that time (plus an additional $10 for the child self-administered supplement). If the mother completed the mother supplement questions at the time of the main youth interview but the child interview/assessment was not completed (as of February 19, 2007, about a week of interviewing remains in round 22), we plan to mail the mother's incentive payment for the mother supplement at the end of the interview round.
Clearly, this approach is unnecessarily complicated and has led to significant administrative burden, and it has also led to some respondent dissatisfaction and confusion about the timing of incentive payments for various children. Therefore, we propose to simplify the approach for round 23 by separating the incentive payments for the various child instruments:
We request permission to offer mothers $10 for the completion of each child interview/assessment and to give each child age 14 and younger a small toy or gift card worth not more than $10. We have found that, while it is appropriate to give the cash incentive to the mother, a toy or gift given directly to the child encourages and excites the child about completing the survey.
We request permission to offer mothers $10 for the completion of each set of mother supplement assessments within the main youth interview. This payment would be made in conjunction with the incentive payment for the main youth interview, although we will make it clear to the respondent which portion of the money is related to the main youth interview and which is for the mother supplement(s).
We request permission to offer mothers an additional $10 for the completion of each child self-administered supplement. This is appropriate because the large additional questionnaire section represents an additional burden on children ages 10-14.
Assuming that the proposed child incentives are approved, the effect would be an increase of $10 as compared to 2006 for a child case where all relevant instruments are completed. In addition, we will simplify fee administration and generate respondent goodwill by providing the $10 mother supplement incentive in close proximity to the time the mother supplement is actually completed.
Cross-Sample In-Kind:
We request permission to spend no more than $5 per respondent on average, with an upper limit of $20 per case, on personalized gifts that convey the message that each respondent is special to the survey and has a unique situation that our program acknowledges and respects. For example, we may provide a small gift related to the marketing materials, or the interviewer may bring a pizza to the respondent’s house for the family dinner. Bringing dinner can be particularly effective for mothers with children in the child sample, as these interviews result in a somewhat larger time burden for the respondents. This continues our standard practice and spending limits from past rounds.
Gatekeepers:
Some “gatekeepers” are particularly helpful in tracking down recalcitrant respondents. For example, a parent or sibling not in the survey may provide locating information or encourage a respondent to participate. (Note that we never reveal the name of the survey to these gatekeepers; it is described simply as “a national survey.”) We often return to the same gatekeepers round after round. To maintain their goodwill, we would like to be able to offer gatekeepers who are particularly helpful a small gift worth about $5. This gift would most likely be a box of chocolates or a small plant.
In our experience, there is no such thing as a single strategy for securing respondent cooperation. The primary need is for flexibility in how we approach respondents and how we follow up with reluctant respondents. Overall, there are about 2,500 respondents who are hard to interview, either because of busy schedules or a mindset that ranges from indifference to hostility. Our Early Bird efforts attempt to reduce our costs for cooperative respondents so that we can devote the resources necessary for difficult cases.
We reiterate that fees are only part of our approach to encouraging response. An equally important part of the effort is designing an effective marketing campaign, including conversion materials that interviewers can use to address the variety of reasons respondents give for declining the interview. This portfolio of respondent materials backs up the interviewer, providing a variety of approaches to converting refusals. We also encourage national teamwork among the interviewers, including periodic calls in which interviewers share the "tricks of the trade" that turn reluctant respondents into completed cases. Conversion materials and the ability to employ flexible respondent incentives also have important effects on interviewer morale. With a marketing campaign to "sell" the survey and the ability to personalize their approach to each respondent, interviewers will not feel they are alone, forced to handle a difficult task of persuasion without the tools to do the job.
In addition to these measures, we plan to step up our marketing campaign with birthday cards and other greetings that ask nothing of the respondents and simply tell them they are in our thoughts. We also plan to step up our locating effort so that it continues at a low level year-round, year in and year out. We will monitor area code changes, which have become more frequent and play havoc with the accuracy of our respondents' phone numbers. We plan to review the record of calls to identify subsets of respondents for whom a particular style of advance conversion letter will address their particular concerns. We will also continue our existing marketing efforts, including survey questions that solicit respondents' views and opinions, trying our best to secure their goodwill so they look forward to an interesting, engaging interview that has face value as a serious scientific and policy-related endeavor.
Our primary goal must be to continue in the good graces of the respondents. When we feel respondents are under heavy stress and suspect additional contacts will be unproductive, we will set the case aside and try again in two years. Angering the respondent is not an option in the face of their ability to screen and reject our calls. Our incentive efforts and contacting approach will continue our efforts to motivate respondents, assuage their concerns, and convey our interest in them as individuals, not numbers.
The Commissioner's Order 01-06, "Confidential Nature of BLS Records," explains the Bureau's policy on confidentiality. The order states in part: "In conformance with existing law and Departmental regulations, it is the policy of the BLS that Respondent identifiable information collected or maintained by, or under the auspices of, the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that will ensure that the information will be used only for statistical purposes and will be accessible only to authorized persons" (this Commissioner's Order is provided in full in Attachment 2). By signing a BLS Agent Agreement, all authorized persons employed by the BLS contractors at the Ohio State University Center for Human Resource Research and at the National Opinion Research Center have pledged to comply with the Bureau's confidentiality policy.
NLS data are also covered by the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA). Survey staff must follow all of the provisions of CIPSEA in the collection and dissemination of NLS data. See Attachment 3 for a copy of CIPSEA. Respondents are provided with detailed privacy and confidentiality information on the back of the letter that they will receive a few weeks prior to the round 23 interview. This information is reproduced in Attachment 9.
The generally available public-use version of the NLSY79 data set masks all data that are of sufficient specificity that respondents theoretically could be identified through some set of unique characteristics. A second data set, called the NLSY79 geocode file, identifies the respondents’ state, county, and Metropolitan Statistical Area of residence. Access to this file is only available through a licensing system established by BLS. Under this licensing system, legitimate researchers at universities and other research organizations in the United States can use NLSY79 geocode data at their own facilities, provided that the research project and physical and electronic security measures that the researchers describe in their application are approved by BLS. Once BLS approves a project, a dean or other high-ranking official of the researchers’ institution is required to sign a letter of agreement that obligates the institution to adhere to BLS security requirements designed to protect the confidentiality of respondents. This agreement states that any results or information obtained as a result of research using the NLS data will be published only in summary or statistical form so that individuals who participated in the study cannot be identified. The individual researchers participating in each project also are required to read and sign the BLS Agent Agreement and return it to BLS before receiving the NLSY79 geocode CD.
CHRR and NORC have established safeguards for the NLSY79 data to provide for the confidentiality of data and the protection of the privacy of individuals in the sampled cohorts. Safeguards for the data include:
1. Storage of survey documents in locked space at NORC until the data are cleaned and sent to CHRR.
2. Protection of computer files at NORC and at CHRR against access by unauthorized persons and groups. Especially sensitive files are secured in locked offices on secure floors.
3. Storage of documents at Ohio State in locked space after data processing is completed.
Protection of the privacy of individuals is accomplished through the following steps:
1. Permission for the interview is obtained from the parents/guardians of all minors.
2. Information identifying respondents is detached from the questionnaire and will be kept in locked storage. Respondents will be linked to data through identification numbers.
3. After the final interview wave, respondent identifiers will be destroyed by shredding paper documents and deleting electronic files.
At OMB’s request, in 2000 BLS and CHRR investigated whether it was possible to identify respondents using information on the public-use data file. As a result of this investigation, we moved several variables from the public-use data file to the geocode CD in round 20. These variables include the respondent’s day of birth (month and year remain public), day of birth of family members, state of birth, and field of study in which an advanced degree was obtained. Further details were provided in an attachment to the round 20 clearance package.
Several sets of questions in the NLSY79 and Children of the NLSY79 might be considered sensitive. This section describes these questions and explains why they are a crucial part of the data collection. All of these topics have been addressed in previous rounds of the surveys, and respondents generally have been willing to answer the questions. Respondents are always free to refuse to answer any question that makes them feel uncomfortable.
Income, Assets, and Program Participation. One major set of sensitive questions collects information about respondents’ income and assets. The interviews record information about the sources and amounts of income received during the past calendar year by the respondent, his/her spouse or partner, or other family members. Income sources identified include the respondents’ and their spouses’ or partners’ wages and salaries, income from military service, profits from a farm or business, Social Security, pensions and annuities, and alimony/child support. These questions, or variants, have appeared in the NLSY79 since 1979. While some respondents refuse to answer these questions, our item nonresponse rate is lower than for most surveys. The survey also asks about income received by the respondent and spouse or partner from unemployment compensation, Aid to Families with Dependent Children and Temporary Assistance for Needy Families (AFDC/TANF), food stamps, Supplemental Security Income, and other public assistance programs. While questions on program participation have changed over the years, they have been in the survey for the past 20 years. Finally, the survey includes a regular collection of information about the value of respondents’ assets. Although the assets section was not included in round 22, these questions have been asked in 18 of the 22 rounds to date. In consultation with the TRC, we have decided that it will only be necessary to ask the assets series in every other round from this point forward. Asset accumulation is a slow process, and it seems reasonable to ask these questions less often, leaving time for questions on other topics in non-asset rounds. The asset questions planned for round 23 are the same as the questions asked in round 21.
Although some respondents refuse to answer these questions or say that they don’t know the amount of various assets, as with income we are able to collect assets information with a low rate of nonresponse and relatively little respondent resistance.
Income and assets questions are central to the usefulness of the NLSY79 data collection. Most economic and many other analyses based on these data include some measure of the financial resources available to the respondent, either as an input variable, outcome variable, or control. It is very difficult to conceive of a replacement measure that would function in the same way in research about the returns to education, participation in the labor market, child development, and so on. The public assistance questions additionally permit research on the effects of the welfare reforms enacted in 1996, providing important information to public officials charged with overseeing the country’s public assistance programs. In addition to providing information about the financial resources currently available to respondents, as respondents age the assets series will permit examination of issues relating to retirement and respondents’ accumulation of wealth in preparation for labor force separation.
As part of the rotating assets module, we include a brief series of questions on personal finance. These questions, which explore issues closely related to income and assets, ask respondents to report problems that may affect their ability to obtain credit. Issues explored include missed payments on bills, credit cards on which the respondent carries the maximum balance, whether and when the respondent has ever declared bankruptcy, and whether the respondent has been turned down for credit in the past 5 years. These questions were asked in round 21 of the NLSY79, and similar questions are asked in the Survey of Consumer Finances. We did not experience nonresponse problems in round 21.
In round 23, we plan to continue the use of follow-up questions for respondents who answer “don’t know” or “refuse” to income and assets questions. As described in a report submitted to OMB with the round 22 clearance package, we have tested these follow-up questions to ensure that we are getting the best information possible. Based on this analysis, we implemented a hybrid approach for round 22 and plan to continue using this approach in round 23. Briefly, respondents who answer an income or assets question with don’t know or refuse are first asked to provide a self-generated range. Respondents who are unable to answer the self-generated intervals question (i.e., who give a don’t know or refuse answer) then get unfolding bracket questions, in which they are asked whether the amount is more or less than a specified amount, and then more or less than a second specified amount. These unfolding brackets are a common follow-up approach in major surveys, including the Health and Retirement Study (HRS). To limit any negative effects from using hybrid follow-up, we skip potentially uncooperative respondents (e.g., those giving at least one refusal in the income section) past the hybrid follow-up for subsequent refusals.
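The hybrid follow-up flow described above can be sketched in code. This is an illustrative sketch only, not the actual instrument logic: the dollar thresholds, function names, and simulated respondent behavior are all hypothetical.

```python
# Hypothetical sketch of the hybrid follow-up flow for don't-know/refuse
# income answers. Bracket thresholds and callbacks are illustrations only.

def hybrid_follow_up(answer, ask_range, ask_bracket, brackets=(25_000, 50_000)):
    """Route a don't-know/refuse income answer through the hybrid follow-up."""
    if answer not in ("DON'T KNOW", "REFUSE"):
        return ("exact", answer)          # exact amount reported; no follow-up
    rng = ask_range()                     # step 1: self-generated range
    if rng is not None:
        return ("range", rng)
    # step 2: unfolding brackets ("is it more or less than $X?")
    return ("brackets", [ask_bracket(amount) for amount in brackets])

# Simulated respondent who refuses the range but answers the brackets:
result = hybrid_follow_up(
    "REFUSE",
    ask_range=lambda: None,
    ask_bracket=lambda amount: "more" if amount < 40_000 else "less",
)
print(result)   # ('brackets', ['more', 'less'])
```

The design point the sketch captures is that the bracket questions are asked only after the less burdensome self-generated range fails, which is what keeps respondent resistance low.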
Contraception. The NLSY79 includes three brief questions about contraception. Respondents are asked to report whether they have used birth control in the past month, what types they have used, and what percentage of the time they use birth control. These questions are useful in research about fertility expectations and in public health research related to unprotected sexual contact. These questions have been included in the survey for a number of years, and respondents are generally cooperative. Item nonresponse on contraception is lower than for income questions.
Cigarette and Alcohol Use. Another set of potentially sensitive survey questions is the brief series on cigarette and alcohol use. First, the round 23 interview will include questions on smoking last asked in round 18. These questions are whether the respondent has smoked more than 100 cigarettes in his/her life, the age the respondent began smoking daily, whether the respondent now smokes daily, the age when the respondent quit smoking, and the number of cigarettes smoked per day. These questions have been asked in several previous rounds of the survey with very low nonresponse rates.
The round 23 survey will also include a series of four questions asking whether the respondent drank alcohol in the past 30 days, the number of days on which alcohol was consumed, the average number of drinks per day, and the number of days on which the respondent consumed 6 or more drinks (an indication of alcohol abuse). These questions are important for both economic and social research. Economists are interested in the impact that alcohol use and abuse may have on employment and earnings (for example, Dooley and Prause 1998; Kenkel and Wang 1998). Sociologists and public health researchers can use alcohol data, along with the other information collected, to examine the social and psychological impact of alcohol use and abuse.
The set of alcohol questions included in the round 22 interviews has been asked previously in identical or similar form (the time reference period was different in the early surveys) in 1982–85, 1988, 1989, 1992, 1994, 2002, and 2006. In these years, the questions were generally part of a longer and more intrusive series on alcohol use and abuse. No difficulties were encountered with these longer series in past rounds, and nonresponse is very low. For example, for the set of four questions being included this year, the largest number of respondents who refused or answered “don’t know” in 1994 was 11. No problems were experienced in round 22 with the shorter and less intrusive set of questions, and none are expected in round 23.
Political Participation. For the 2008 survey, we are proposing the introduction of a short series of questions on the respondent’s participation in the political process. These questions are described and justified in detail in Attachment 10.
Income. Young adults (those ages 15 and older) are asked a series of income questions similar to, but somewhat less detailed than, those asked of their mothers in the main interview. As described above, income data are crucial to many kinds of analysis in a variety of research fields. The young adult data additionally allow researchers to examine similarities or differences in the income sources of mothers and their children, providing information about the transmission of the ability to earn income across generations.
Smoking, Drug and Alcohol Use, and Criminal Activity. Children age 10 and older (including young adults) are asked about smoking, drug use, and alcohol use. These questions record whether the respondent has ever used a number of substances, including alcohol, cigarettes, marijuana, cocaine, and other drugs, and ask about the extent of the respondent’s use in the past 30 days. For young adults (ages 15 and older), additional delinquent and criminal behavior questions record whether the young adult has run away from home or been convicted of criminal activities such as selling drugs, possessing drugs, theft, assault, and so on. If the respondent reports convictions, he or she is asked to describe the type of crime committed and the punishment received.
Questions about substance use and criminal behavior are crucial in understanding the education and employment outcomes of this group of young adults. To quote a report based on data from the 1990 Youth Risk Behavior Surveillance System (U.S. Department of Health and Human Services), “Patterns of tobacco, alcohol and other drug use usually are established during youth, often persist into adulthood, contribute substantially to the leading causes of mortality and morbidity, and are associated with lower educational achievement and school dropout.” One concern with long-term drug and alcohol use is the gateway effect that can occur, leading to use of heavier drugs and an increase in other risky behaviors (for example, sexual activity or criminal acts). For examples of such research, see Pacula (1997); Desimone (1998); and Parker, Harford, and Rosenstock (1994). The negative relationship between drug and alcohol use and educational attainment has been investigated by authors such as Yamada, Kendix, and Yamada (1996). Finally, as mentioned above, substance use may have a negative effect on the probability of employment and the wages and benefits received.
These questions will be asked of about 7,260 of the 7,801 children and young adults who will participate in round 23. These sensitive questions have been asked in a nearly identical form since 1994 without difficulty. Refusals and don’t knows have been quite low (often less than 1 percent). Prior to computer-assisted administration, some respondents did not fill out the self-report booklets correctly or completely. Because these instruments are now computer-administered, starting in 2000 for the young adults and in 2002 for children under age 15, this problem has been ameliorated.
Sexual Activity. Young adults (6,360 respondents ages 15 and older) are also asked about the onset of sexual intercourse. Because puberty and the initiation of sexual activity occur during the teenage years for many youths, and because this information may not be recalled accurately if collected retrospectively, it is important to ask these questions of respondents in this age range in each survey round. Results from a number of different surveys, including early rounds of the NLSY97, indicate that a significant proportion of adolescents report that they are sexually active. It is vital that we continue to trace the progression of sexual activity in relation to the realization of educational and occupational goals and with respect to the promotion of good health practices. The level of sexual activity and contraceptive use are important indicators of how serious young people are about reaching higher levels of educational and occupational attainment, and there should be significant congruence between anticipated life goals, sexual activity, and its associated outcomes. These questions also provide information important for analyses of programs and policies related to adolescent health and well-being.
Further, age at first intercourse is important to understanding labor market behavior because of the central role that adolescent fertility plays in affecting the future life course of women. Early childbearing not only retards the education of the mother and hence is deleterious to her labor market opportunities, but also tends to play a powerful role in the intergenerational transmission of poverty. AIDS and other sexually transmitted diseases also make sexual behavior a significant public health issue. For these reasons, this line of questioning is important to the central point of the survey.
The sensitive questions on substance use, criminal behavior, and sexual activity are only asked with the consent of a parent or guardian of a child. We inform parents about the questions we ask, and the parents and teenagers are free to refuse to answer these questions. Our experience has been that participants recognize the importance of these questions and only very rarely refuse to answer. To further protect respondents and encourage honest reporting, these questions are contained in a self-administered section of the interview for children ages 10–14 and for any young adults interviewed in person. Because most young adults will be interviewed on the telephone, the sensitive questions have been written in such a way that the respondent can answer the questions without revealing personal information to anyone (such as a parent) who might overhear the conversation. Although we now ask about the age of the respondent’s most recent sexual partner and his or her relationship with the respondent, no identifying information is collected about sexual partners.
School Safety. Reflecting the growing national concern with weapons in schools, questions on this topic have been administered to adolescents in several other national surveys. Questions on whether a respondent has carried a weapon or been threatened by a weapon have been directed toward adolescents ages 12 to 18 in the following surveys:
The National Youth Study (Tulane, 1998)
Welfare, Children, and Families: A Three-City Study (Johns Hopkins, 2001)
NLSY97 (BLS, 1997-2004)
Youth Risk Behavior Surveys (CDC)
Other surveys have asked questions about whether young respondents carry a weapon to school for protection (Josephson Institute, 1999).
For many previous rounds, the NLSY79 Child and Young Adult surveys have included questions about the child’s attitudes and opinions regarding school, including whether the child feels safe at school. In 2002, we added two related questions that ask whether a respondent has ever seen a student with a gun, knife, or other weapon on school property and, if so, the number of times in the last year. These questions will continue to be asked of respondents ages 10–14 in the Child Survey and respondents attending school in the Young Adult Survey. These questions will aid researchers in investigating the presence of weapons in schools as it relates to school characteristics, neighborhood environment, child behavior, child success in school, subsequent criminal behavior, and so on.
Political Participation. For the 2008 survey, we are proposing the introduction of a short series of questions on the respondent’s participation in the political process. These questions are described and justified in detail in Attachment 10.
Child Assessments. Attachment 7 includes a discussion of the child assessment data to be included in this survey round, although for the most part these assessments are not sensitive, are well validated, and have been asked already without difficulty or any significant respondent resistance since 1986.
Informed Consent. At OMB’s request, we conducted cognitive testing before round 20 to determine whether children and young adults understand the informed consent statement. A report summarizing this research was submitted with the round 20 OMB clearance package. We will continue to use the consent statement developed as a result of that research and used for the first time in round 20.
References
Desimone, Jeffrey. “Is Marijuana a Gateway Drug?” Eastern Economic Journal 24,2 (Spring 1998): 149-163.
Dooley, David and Prause, Joann. “Underemployment and Alcohol Misuse in the National Longitudinal Survey of Youth.” Journal of Studies on Alcohol 59,6 (November 1998): 669-80.
Harford, Thomas C. and Muthen, Bengt O. “Adolescent and Young Adult Antisocial Behavior and Adult Alcohol Use Disorders: A Fourteen-Year Prospective Follow-Up in a National Survey.” Journal of Studies on Alcohol 61,4 (July 2000): 524-528.
Kenkel, Donald S. and Wang, Ping. “Are Alcoholics in Bad Jobs?” NBER Working Paper No. 6401, National Bureau of Economic Research, March 1998.
Pacula, Rosalie Liccardo. “Adolescent Alcohol and Marijuana Consumption: Is There Really a Gateway Effect?” NBER Working Paper No. 6348, National Bureau of Economic Research, January 1997.
Parker, Douglas A.; Harford, Thomas C.; and Rosenstock, Irwin M. “Alcohol, Other Drugs, and Sexual Risk-Taking among Young Adults.” Journal of Substance Abuse 6,1 (1994): 87-93.
Yamada, Tetsuji; Kendix, Michael; and Yamada, Tadashi. “The Impact of Alcohol Consumption and Marijuana Use on High School Graduation.” Health Economics 5,1 (January-February 1996): 77-92.
The NLSY79 pretest interview will be administered to approximately 100 respondents, and the average response time is about 60 minutes per respondent. The main NLSY79 interview will be administered to approximately 7,550 respondents, and the average response time is also about 60 minutes per respondent.
The time estimate for the NLSY79 Child Survey involves three components:
The Mother Supplement assessments are administered to female NLSY79 respondents who live with biological children under age 15. This section will be administered to about 1,300 mothers, who will be asked a series of questions about each child under age 15. On average, these women each have 1.26 children under age 15, for a total of approximately 1,638 children. The average response time is 20 minutes for each child or, stated alternatively, 26 minutes for each mother (20 minutes per child times 1.26 children per mother).
The Child Supplement involves testing the achievement and aptitude of about 1,450 children ages 4-14. The average response time for this aptitude testing is 31 minutes per child.
The Child Self-Administered Questionnaire (SAQ). The Child SAQ is administered to about 900 children ages 10–14, and the average response time is 30 minutes per child.
The Young Adult Survey will be administered to approximately 2,195 youths ages 15 to 20. These youths will be contacted for an interview regardless of whether they reside with their mothers. The average response time for the Young Adult Survey is 45 minutes per respondent. The Young Adult Survey (grant component) will also be administered to approximately 4,165 youths age 21 and older. These interviews average 53 minutes per respondent. Grant interviews are longer than age 15-20 interviews for two reasons: respondents answer a longer set of political participation questions, and grant respondents are older so on average they are more likely to have a job, a spouse/partner, and children, all of which increase the number of questions to be answered.
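The per-component arithmetic above can be reproduced in a short sketch. This is not part of the survey materials; the counts and per-response times are taken directly from the text, and the rounding convention (to whole hours) is an assumption.

```python
# Arithmetic sketch reproducing the burden components described above.
# Counts and minutes-per-response come from the text; rounding is assumed.

mothers = 1_300
child_responses = round(mothers * 1.26)               # ≈ 1,638 children assessed
mother_supplement_hours = child_responses * 20 / 60   # 20 minutes per child

young_adult_hours = round(2_195 * 45 / 60)            # ages 15-20, 45 minutes each
grant_hours = round(4_165 * 53 / 60)                  # age 21 and older, 53 minutes each

print(child_responses)            # 1638
print(mother_supplement_hours)    # 546.0
print(young_adult_hours)          # 1646
print(grant_hours)                # 3679
```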
Reviewers should keep in mind that substantial portions of the child supplement material are only asked of small numbers of children (for example, children ages 10–14 or young adults ages 15–20). Thus, the average time in each household is not as extensive as the attached stack of survey questionnaires might suggest. We are sensitive to the fact that the interviews in households with several children can theoretically pose interviewing problems, but our experience with this material in previous rounds has clearly indicated that this is not an issue. Respondents are very interested in the child assessments and have been quite cooperative.
The projected timing for the interview is based upon data on interview length in 2006. Accurate timings are available on a section-by-section basis for each of our respondents in 1996–2006. The time required to finish an interview varies in the sample. While women with children take longer to answer the fertility and childcare questions, men are asked more questions about employment because they tend to hold more jobs. The data show the standard deviation of interview time for the main survey is around 25 minutes. The variability of the Child Survey components will be chiefly due to differences in the number and ages of children in the family.
During the field period, about 200 interviews are validated to ascertain whether the interview took place as the interviewer reported and whether the interview was done in a polite and professional manner. These reinterviews average about 6 minutes each.
Table 5 below summarizes the estimated respondent burden for round 23 of the NLSY79.
Table 5. Number of respondents and average response time by survey questionnaire, Round 23
Instrument | Total Respondents | Total Responses | Average Time per Response | Estimated Total Burden
NLSY79 Round 23 Pretest | 100¹ | 100 | 60 minutes | 100 hours
NLSY79 Round 23 Main Survey | 7,550 | 7,550 | 60 minutes | 7,550 hours
Round 23 Validation Interviews | 200 | 200 | 6 minutes | 20 hours
Mother Supplement (Mothers of children under age 15) | 1,300² | 1,638 | 20 minutes | 546 hours
Child Supplement (Children under age 15) | 1,450 | 1,450 | 31 minutes | 750 hours
Child Self-Administered Questionnaire (Children ages 10 to 14) | 900 | 900 | 30 minutes | 450 hours
Young Adult Survey (Youths ages 15 to 20) | 2,195 | 2,195 | 45 minutes | 1,646 hours
Young Adult Survey, Grant component (Youths age 21 and older) | 4,165 | 4,165 | 53 minutes | 3,679 hours
TOTALS³ | 15,460 | 18,198 | — | 14,741 hours
¹ This assumes that our separate proposal for augmentation of the pretest sample is approved and implemented.
² The number of respondents for the Mother Supplement (1,300) is less than the number of responses (1,638) because mothers are asked to provide separate responses for each of the biological children with whom they reside. The total number of responses for the Mother Supplement (1,638) is more than the number for the Child Supplement (1,450) because the number of children completing the Child Supplement is lower due to age restrictions and nonresponse.
³ The total number of 15,460 respondents across all the survey instruments is a mutually exclusive count that does not include: (1) the 200 reinterview respondents, who were previously counted among the 7,550 main survey respondents; (2) the 1,300 Mother Supplement respondents, who were previously counted among the 7,550 main survey respondents; and (3) the 900 Child SAQ respondents, who were previously counted among the 1,450 Child Supplement respondents.
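A quick cross-check (not part of the survey materials) confirms that Table 5's totals follow from its rows. The per-row figures are copied from the table as printed; the mutually exclusive respondent count follows footnote 3.

```python
# Cross-check of Table 5 totals; per-row figures are copied from the table.

responses    = [100, 7_550, 200, 1_638, 1_450, 900, 2_195, 4_165]
burden_hours = [100, 7_550,  20,   546,   750, 450, 1_646, 3_679]

# Mutually exclusive respondents (per footnote 3): pretest, main survey,
# Child Supplement, and the two Young Adult groups.
unique_respondents = 100 + 7_550 + 1_450 + 2_195 + 4_165

print(sum(responses))       # 18198
print(sum(burden_hours))    # 14741
print(unique_respondents)   # 15460
```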
Respondents for this survey will not incur any capital and start-up costs; respondents will also not incur any operation, maintenance, or purchase-of-service costs.
The total estimated cost of the round 23 (2008) NLSY79 is about $16,200,000. This figure is based on extrapolations from the costs of previous survey rounds, adjusted for the estimated cost savings resulting from survey automation. This cost includes funding for the development of survey instruments for the main NLSY79 and the associated Child and Young Adult surveys; data collection for the surveys; cleaning and preparation of the data file; limited analysis of the survey data; and services to users of the public-use data files.
The survey costs are borne largely by BLS. The National Institute of Child Health and Human Development will provide an anticipated $3,700,000 in funding to BLS through an interagency agreement for the Child survey and for interviews of the Young Adults ages 15-20. NICHD also will provide an anticipated $2,200,000 as a grant for interviewing Young Adults age 21 and older. This shared effort reduces the cost to the government, as compared with the costs that would be incurred if each agency were to conduct the surveys independently.
The estimated total respondent burden of 14,741 hours for the main fielding of round 23 is higher than the estimated burden of 11,044 hours that was previously approved by OMB. The increase of 3,697 hours is due to the inclusion of the grant component (3,679 hours), a rounding adjustment in the Mother’s Supplement (-4 hours), and a calculation correction in the Young Adult Ages 15-20 supplement (22 hours).
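The burden increase reconciles exactly, as a short arithmetic sketch (not part of the survey materials) shows; the component figures are taken from the text.

```python
# Reconciliation of the burden increase components listed above.

previous_burden = 11_044
increase = 3_679 - 4 + 22   # grant component + rounding adjustment + correction

print(increase)                    # 3697
print(previous_burden + increase)  # 14741
```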
Following receipt of final data from NORC, approximately 9 months are spent cleaning the data and preparing a main NLSY79 public-use data file. Subsequently, the child/young adult file is prepared and reports are written for the Department of Labor and the National Institute of Child Health and Human Development. The timing of these events is as follows:
Pretest Interviews | October 2007
Interviews | January 1, 2008–January 31, 2009
Data Reduction and Coding | February 1–April 30, 2009
Public-Use Data File Preparation | May 1, 2009–February 28, 2010
Release of Main NLSY79 Public-Use Data File | March 2010
Report Writing for NICHD, Release of Child/Young Adult File | Summer 2010
Does not apply.
We do not have any exceptions in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB form 83-I.