Supporting Statement (1220-0109)


National Longitudinal Survey of Youth 1979

OMB: 1220-0109















Information Collection Request for

The National Longitudinal Survey of Youth 1979

OMB # 1220-0109


Submitted by the Bureau of Labor Statistics

TABLE OF CONTENTS

Summary

Supporting Statement

A. Justification

1. Necessity for the Data Collection

2. Purpose of Survey and Data-Collection Procedures

3. Improved Information Technology to Reduce Burden

4. Efforts to Identify Duplication

5. Involvement of Small Organizations

6. Consequences of Less Frequent Data Collection

7. Special Circumstances

8. Federal Register Notice and Consultations

9. Payment to Respondents

10. Confidentiality of Data

11. Sensitive Questions

12. Estimation of Information Collection Burden

13. Cost Burden to Respondents or Record Keepers

14. Estimate of Cost to the Federal Government

15. Change in Burden

16. Plans and Time Schedule for Information Collection, Tabulation, and Publication

17. Reasons Not to Display OMB Expiration Date

18. Exceptions to “Certification for Paperwork Reduction Act Submissions,” OMB Form 83-I

B. Collections of Information Employing Statistical Methods

1. Respondent Universe and Respondent Selection Method

2. Design and Procedures for the Information Collection

3. Maximizing Response Rates

4. Testing of Questionnaire Items

5. Statistical Consultant

Attachment 1—Title 29 USC Sections 1 & 2

Attachment 2—Commissioner's Order No. 1-06

Attachment 3—Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA)

Attachment 4—Survey Applications

A. Use of the NLSY79 for Diffusion of Useful Information on Labor

B. Use of the NLSY79 for Examination of Department of Labor Employment and Training Programs

C. Use of the NLSY79 in Understanding Labor Markets

1. Orientation toward the Labor Market

2. Factors in Educational Progress

3. Transition from School to Work

4. The Work Environment

5. Racial, Sex, and Cultural Differences in Employment and Earnings

6. The Relationships between Economic and Social Factors and Family Transitions and Well-Being

7. The Geographic Mobility of Young Baby Boomers

8. The Measurement and Analysis of Gross Changes in Labor Market Status

D. Use of the NLSY79 for Social Indicators Analysis

1. Delinquent Behavior, Arrest Records, and School Discipline

2. Drug and Alcohol Use

E. Use of the NLSY79 to Measure Maternal and Child Inputs and Outcomes

1. Research Issues Linking Employment, Income, and Child Outcomes

2. Other Research Issues Relating Family Structure and Child Outcomes

Attachment 5—Analysis of Content of Interview Schedules

Attachment 6—New Questions and Lines of Inquiry

Attachment 7—Child Assessment Measures

A. The Child Assessment Measures

B. Summary of Child Aptitude Measures To Be Used

1. Home Observation for Measurement of the Environment (HOME)

2. Wechsler Intelligence Scale for Children–Revised: Digit Span Subscale

3. Peabody Picture Vocabulary Test–Revised (PPVT-R)

4. Peabody Individual Achievement Test (PIAT)

5. Temperament Scales

6. Perceived Competence Scale for Children/Self-Perception Profile

7. Behavior Problems Index

Attachment 8—August 25, 2006, BLS News Release

Attachment 9—Respondent Advance Letters and Privacy Act Statement

Attachment 10—Justification for Political Participation Questions

Attachment 11—23rd Wave (2008) Proposed Interview Schedules


Summary


This package requests clearance for the 23rd wave (2008) questionnaire of the National Longitudinal Survey of Youth 1979 cohort (NLSY79). The eligible sample includes 9,964 respondents, who will be 43 to 50 years of age on December 31, 2007; two subsamples were dropped from the original sample for budgetary reasons. Approximately 4.8 percent of the respondents are deceased, and in recent waves we find that about 50–60 respondents (0.5–0.6 percent) have died since the previous round. There is no evidence of sample attrition bias at this time. The NLSY79 is a nationally representative sample of adults who were born in the years 1957 to 1964 and lived in the United States in 1978. The sample contains an overrepresentation of black and Hispanic respondents born in those years and living in the United States when the survey began, so as to include sufficient sample cases to permit racial and ethnic analytical comparisons. Appropriate weights have been developed so that the sample components can be combined in a manner that aggregates to the overall U.S. population of the same ages, excluding those who have immigrated since 1978.
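The weighting described above can be illustrated with a minimal sketch. This is not the official NLSY79 weighting program; the function and the numbers below are hypothetical and show only the general idea that each respondent carries a weight equal to the approximate number of people he or she represents, so oversampled groups (which carry smaller weights) do not distort population estimates.

```python
# Illustrative sketch of weighted aggregation from an oversampled design.
# Each weight is the approximate number of population members represented
# by that respondent; a weighted statistic then reflects the population.

def weighted_mean(values, weights):
    """Population estimate of a mean from a weighted sample."""
    total_weight = sum(weights)
    return sum(v * w for v, w in zip(values, weights)) / total_weight

# Hypothetical example: group B was oversampled relative to group A,
# so each B respondent carries a smaller weight than each A respondent.
wages = [20.0, 20.0, 14.0, 14.0, 14.0, 14.0]  # two A cases, four B cases
weights = [3.0, 3.0, 0.5, 0.5, 0.5, 0.5]      # A underweighted in sample

print(weighted_mean(wages, weights))  # 18.5
```

An unweighted mean of these six cases would be pulled toward the oversampled group; the weights restore each group to its population share.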


This submission also seeks clearance for assessments and interviews of the Children of the NLSY79. The Children of the NLSY79 have been assessed since 1986, when the National Institute of Child Health and Human Development (NICHD) began sponsoring a set of supplemental surveys to gather a large amount of information about the lives of these children. A battery of child cognitive, socio-emotional, and physiological assessments has been administered biennially since 1986 to NLSY79 mothers and their children. Starting in 1994, children who had reached age 15 by December 31 of the survey year (the Young Adults) were interviewed about their work experiences, training, schooling, health, fertility, and self-esteem, as well as sensitive topics addressed in a supplemental self-administered questionnaire. By 2008, the Children of the NLSY79 will include 670 children under age 10; 1,110 children ages 10–14; and 2,546 Young Adults ages 15–20.


The main NLSY79 is funded by the Department of Labor, with additional funding for the Children of the NLSY79 anticipated from an interagency agreement with the National Institute of Child Health and Human Development. The Bureau of Labor Statistics has overall programmatic responsibility for the project. The National Opinion Research Center (NORC), which is affiliated with the University of Chicago, directs the conduct of the survey and is responsible for interviewing and for reporting on the survey to BLS. Data processing, the development of final documentation, and the preparation of a public-use data set are completed by the Center for Human Resource Research (CHRR) of the Ohio State University.


The data collected in this survey are a continuation of an ongoing data-collection effort that previously has been approved by the Office of Management and Budget (OMB). The longitudinal focus of the survey requires the collection of identical information for the same individuals, as well as the occasional introduction of new data elements, to meet the ongoing data and analysis needs of various government agencies. Almost all of the information to be collected in this survey round has already been justified in earlier clearance documents submitted to OMB. Those data elements of a particularly sensitive nature are justified in this document.


Supporting Statement

National Longitudinal Survey of Youth 1979, 14–21 Years of Age on December 31, 1978

23rd Round (2008 Survey) Rationale, Objectives, and Analysis of Content


A. Justification


1. Necessity for the Data Collection

This survey represents the 23rd wave of data collection of the National Longitudinal Survey of Youth 1979 cohort (NLSY79). The data collected in this survey are thus a continuation of an ongoing data-collection effort that previously has been approved by the Office of Management and Budget (OMB). The longitudinal focus of the survey requires the collection of identical information for the same individuals, as well as the occasional introduction of new data elements, to meet the ongoing data and analysis needs of various government agencies and to reflect the changing life-cycle stages of the respondents. Most of the information to be collected in this survey round has already been justified in earlier clearance documents submitted to OMB.


Among the objectives of the Department of Labor (DOL) are to promote the development of the U.S. labor force and the efficiency of the U.S. labor market. The Bureau of Labor Statistics (BLS) contributes to these objectives by gathering information about the labor force and labor market and disseminating it to policy makers and to the public so that participants in those markets can make more informed and, thus, more efficient choices. The charge to BLS to collect data related to the labor force is extremely broad, as reflected in Title 29 USC Section 1:

“The general design and duties of the Bureau of Labor Statistics shall be to acquire and diffuse among the people of the United States useful information on subjects connected with labor, in the most general and comprehensive sense of that word, and especially upon its relation to capital, the hours of labor, the earnings of laboring men and women, and the means of promoting their material, social, intellectual, and moral prosperity.”


The collection of these data aids in the understanding of labor market outcomes through mid-career experienced by individuals who have been followed since the early stages of career and family development. NLS data represent an important means of fulfilling BLS responsibilities. See Attachment 1 for Title 29 USC Section 2, “Collection, collation, and reports of labor statistics.”


2. Purpose of Survey and Data-Collection Procedures

Through 1984, the NLSY79 consisted of annual interviews with a national sample of 12,686 young men and women who were ages 14 to 21 as of December 31, 1978, with overrepresentation of blacks, Hispanics, and economically disadvantaged non-black/non-Hispanics. The sample also included 1,280 persons serving in the military in 1978. The oversampled groups tend to experience above-average labor market difficulties and are disproportionately represented in federally financed employment and training programs. Starting in 1985, the military sample was reduced to 201 due to a cessation of funding from the Department of Defense. Again due to budget limits, starting in 1991 no attempt was made to interview the 742 male and 901 female economically disadvantaged non-black/non-Hispanic respondents. This reduced the eligible pool of respondents to 9,964. The most recent change to the survey occurred after the 1994 round of interviews, when the NLSY79 switched from an annual to a biennial interview schedule.
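The sample reductions described above can be checked arithmetically; the sketch below simply verifies, using the figures in the text, that the two budget-driven cuts bring the original 12,686 respondents down to the 9,964 eligible for interview.

```python
# Arithmetic check of the sample reductions described in the text:
# the military subsample was cut from 1,280 to 201 in 1985, and the
# 742 male and 901 female economically disadvantaged non-black/
# non-Hispanic respondents were dropped in 1991.

original = 12_686
after_military_cut = original - (1_280 - 201)  # 201 military cases retained
eligible = after_military_cut - (742 + 901)    # oversample dropped in 1991

print(eligible)  # 9964
```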


In addition to the regular interviews, several supplementary data-collection efforts completed during the early survey years greatly enhance the overall value of the survey to government agencies and academic researchers. The full Armed Services Vocational Aptitude Battery (ASVAB) was administered to 94 percent of the sample respondents. This was done pursuant to a Congressional mandate to “renorm” the ASVAB. Also, for a very large proportion of the total sample, information has been collected about the characteristics of the last high school each respondent attended, as well as some personal characteristics of the respondents while attending high school (including courses taken and grades).


These supplementary data-collection efforts have enabled researchers to complete careful studies of the relationship between a youth’s background environment, employment behaviors, vocational aptitudes, and high school quality. They have helped the Departments of Labor, Defense, Education, and Health and Human Services and many congressional committees to make more knowledgeable decisions when evaluating the efficacy of programs in the areas of military and civilian employment, training, and health.


The NLSY79 is a general-purpose study designed to serve a variety of policy-related research interests. Its longitudinal design and conceptual framework serve the needs of program and policy makers in a way that cross-sectional surveys cannot. In addition, the NLSY79 allows a broad spectrum of social scientists concerned with the labor market problems of young baby boomers to pursue their research interests. Participation in the survey by other government agencies is encouraged, as the increasingly omnibus nature of the survey makes it an efficient, low-cost data set. As noted, the survey has incorporated items needed for program and policy purposes by agencies other than the Department of Labor. In this survey round, we anticipate funding from the National Institute of Child Health and Human Development and the National Institute on Drug Abuse.


In this survey round, information once again will be collected about the biological children of the female respondents to the main NLSY79. For the most part, this collection of data about the children repeats surveys already administered to these children biennially from 1986–2006. These unique data permit medical and social science researchers to consider a large number of basic research issues relating to the effects of family background, federal program activities, and infant and maternal health on outcomes from early childhood through adolescence and into early adulthood. This will be elaborated at length in a subsequent section. Thus, while the principal focus of the survey remains the collection of data for labor force analysis, the questionnaires administered to these children and older youth include items needed by other agencies that are not always directly related to employment and training studies. As these children reach adolescence, the focus of the surveys of these “young adults” returns to the school-to-work transition.


Sample sizes and the expected number of interviews for each group are listed in table 1.


Table 1. NLSY79 Sample Size and Expected Response in 2008 (Round 23)

Cohort                     Approximate sample size                  Expected number of interviews
NLSY79 pretest             130¹                                     100¹
NLSY79 main youth          9,424 (9,964 – about 540 deceased)       7,550
Children ages 0–9          670                                      550
Children ages 10–14        1,110                                    900
Young Adults ages 15–20    2,546                                    2,195

¹ These numbers assume that the proposed pretest refresher sample (submitted separately) is approved by OMB and that the refresher fielding yields about 100 respondents. Along with the remaining 30 in-scope, living pretest respondents from the original 1979 pretest sample, this would yield a pretest sample of 130.


The specific objectives of the NLSY79 fall into several major categories that will be further explained below:

  1. to explore the labor market activity and family formation of individuals in this age group

  2. to explore in greater depth than previously has been possible the complex economic, social, and psychological factors responsible for variation in the labor market experience of this cohort

  3. to explore how labor market experiences explain the evolution of careers, wealth, and the preparation of this cohort for the further education of their children and retirement

  4. to analyze the impact of a changing socio-economic environment on the educational and labor market experiences of this cohort by comparing data from the present study with those yielded by the surveys of the earlier NLS cohorts of young men (which began in 1966 and ended in 1981) and young women (which began in 1968 and ended in 2003), as well as the more recent NLS cohort of young men and women born in the years 1980-84 and interviewed for the first time in 1997

  5. to consider how the employment-related activities of women affect the subsequent cognitive and emotional development of their children, and how the development of the children affects the activities of the mother

  6. to meet the data-collection and research needs of various government agencies that have been interested in the relationships between child and maternal health, drug and alcohol use, and juvenile deviant behavior and child outcomes such as education, employment, and family experiences


The NLSY79 has several characteristics that distinguish it from other data sources and make it uniquely capable of meeting the major purposes described above. The first of these is the breadth and depth of the types of information that are being collected. It has become increasingly evident in recent years that a comprehensive analysis of the dynamics of labor force activity requires an eclectic theoretical framework that draws on several disciplines, particularly economics, sociology, and psychology. For example, the exploration of the determinants and consequences of the labor force behavior and experience of this cohort requires information about (1) the individual’s family background and ongoing demographic experiences; (2) the character of all aspects of the environment with which the individual interacts; (3) human capital inputs such as formal schooling and training; (4) a complete record of the individual’s work experiences; (5) the behaviors, attitudes, and experiences of closely related family members, including spouses and children; and (6) a variety of social-psychological measures, including attitudes toward specific and general work situations, personal feelings about the future, and perceptions of how much control one has over one’s environment.


A second major advantage of the NLSY79 is its longitudinal design, which permits investigation of labor market dynamics that would not be possible with one-time surveys and allows directions of causation to be established with much greater confidence than cross-sectional analyses. Also, the considerable geographic and environmental information available for each respondent for each survey year permits a more careful examination of the impact that area employment and unemployment considerations have for altering the employment, education, and family experiences of these cohort members and their families.


Third, the oversampling of blacks and Hispanics, together with the other two advantages mentioned above, makes possible more sophisticated examinations of human capital creation programs than previously have been possible. Post-program experiences of “treatment” groups can be compared with those of groups matched not only for preprogram experience and such conventional measures as educational attainment, but also for psychological characteristics that have rarely been available in previous studies.


As has been indicated above, the study has several general research and policy-related objectives. In Attachment 4, we elaborate on these basic purposes by setting forth a series of specific research themes. The detailed content of the interview schedule is then related to these themes. Attachment 5 tabulates the relationships of the recent modules of the survey to these objectives. Attachment 6 summarizes the new questions and lines of inquiry in the proposed questionnaire. A rationale for the various child assessments is contained in Attachment 7. Attachment 8 includes a copy of a BLS news release issued on August 25, 2006, that highlights findings from previous rounds of the NLSY79. Attachment 9 provides the advance letter and Privacy Act statement that will be sent to respondents prior to data collection. Attachment 10 provides justification for a specific set of political participation questions. Attachment 11 provides justification for proposed additions to the Young Adult questionnaire for Young Adults under age 21. Attachment 12 lists the data-collection instruments that will be used in round 23; these instruments are submitted as separate electronic files. In reviewing the questionnaire and the research themes, it should be noted that, because of the longitudinal nature of the NLSY79, there has been no need to recollect much of the background data gathered in the early rounds of the survey.


As the uses of this survey are described, the reader should take note of the other cohorts included in the National Longitudinal Surveys. In 1966, the first interviews were administered to members of two cohorts: Older Men ages 45–59 and Young Men ages 14–24. In 1967, the sample of Mature Women ages 30–44 was first interviewed. The last of the four original cohorts was the Young Women, who were ages 14–24 when first interviewed in 1968. The survey of the Young Men was discontinued after the 1981 interview, and the last survey of the Older Men was conducted in 1990. The Young and Mature Women were discontinued after the 2003 interviews. The most recent cohort added to the NLS program is the NLSY97, youths ages 12–16 by the end of 1996. This cohort was interviewed for the first time in 1997, reinterviewed beginning in the fall of 1998, and continues to be interviewed each year. A fuller description of the National Longitudinal Surveys program can be found on the NLS program website, www.bls.gov/nls. The various cohorts are briefly summarized in the table below.

Table 2. The NLS: Survey groups, sample sizes, interview years, and survey status

Survey group          Age cohort      Birth year cohort   Original sample   Initial year/latest year   Number of surveys   Number at last interview   Status
Older men             45–59           4/1/06–3/31/21      5,020             1966/1990                  13                  2,092¹                     Ended
Mature women          30–44           4/1/22–3/31/37      5,083             1967/2003                  21                  2,237                      Ended
Young men             14–24           4/1/41–3/31/52      5,225             1966/1981                  12                  3,398                      Ended
Young women           14–24           1943–1953           5,159             1968/2003                  22                  2,859                      Ended
NLSY79                14–21           1957–1964           12,686²           1979/2006                  21                  7,661³                     Continuing
NLSY79 children       birth–14                            ⁴                 1986/2006                  10                  2,514³                     Continuing
NLSY79 young adults   15 and older⁵                       ⁴                 1994/2006                  6                   5,024³                     Continuing
NLSY97                12–16           1980–1984           8,984             1997/2006                  10                  7,338⁶                     Continuing

¹ Interviews in 1990 also were conducted with 2,206 widows or other family members of deceased respondents.

² After dropping the military (in 1985) and economically disadvantaged non-black/non-Hispanic oversamples (in 1991), the sample contains 9,964 respondents eligible for interview.

³ The latest sample size available is from the 2004 survey.

⁴ The size of the NLSY79 child sample depends on the number of children born to female NLSY79 respondents, attrition over time, and the gradual aging of the children into the young adult sample. The size of the young adult sample depends on the number of children who reach age 15 in each survey year. Information about the number interviewed in each survey is available in chapter 4 of the NLS Handbook.

⁵ In 1998 only, the young adults eligible for interview were limited to those aged 15–20.

⁶ The latest sample size available is from round 9.



The NLSY79 is used by BLS and other government agencies to examine a wide range of labor market issues. In addition to BLS publications, in recent years analyses have been conducted for the Secretary of Labor, other agencies of the Executive Branch, the General Accounting Office, and the Congressional Budget Office.


3. Improved Information Technology to Reduce Burden

The field staff of the National Opinion Research Center (NORC) makes every effort to ensure that the information is collected in as expeditious a manner as possible, with minimal interference in the lives of the respondents. Our success in this regard is suggested by the very high continuing response rate and low item refusal rates that have been attained. More recent efforts also have advanced technologies that lower respondent burden.


During round 11 (1989) of the NLSY79, about 300 cases were collected using Computer Assisted Personal Interviewing (CAPI). During round 12 (1990), CAPI was again used, this time for about 2,400 cases with a longer and more complex questionnaire. The CAPI efforts in 1989 and 1990 were designed to assess the feasibility of the method and the effect of CAPI methods on data quality. Since 1993, the NLSY79 has been conducted using CAPI for all cases. (At least some cases are completed over the telephone in each round; in recent rounds, about 85 percent of main youth cases have been completed by phone. Strictly speaking, a computer-assisted interview conducted by telephone is a Computer Assisted Telephone Interview, or CATI. However, because our system uses the same instrument regardless of interview mode, throughout this submission we use CAPI as a generic term for all computer-assisted NLSY79 interviews.) The system has proved to be stable and reliable, and it is well received by interviewers, respondents, and researchers.


An analysis of the round 12 (1990) experimental data revealed that the quality of the data, as measured by missing or inconsistent responses, was greatly improved by CAPI. The effects on the pattern of responses were minor, although some answer patterns were affected because the CAPI technology in certain cases changed the way the questions were asked. Production data from rounds 15–21, based on more than 80,000 completed instruments, have confirmed these expectations of high data quality.


In 1994, we also began to administer the Child Survey assessments using CAPI. CAPI simplified the administration of these assessments and reduced interview time and respondent burden. The average interview time fell throughout the field period, and quality-control measures showed the data to be in excellent condition. The Young Adult survey, begun in 1994, originally consisted of a CAPI interview with an additional paper-and-pencil self-report booklet. In 2000, this booklet was converted to CAPI, and in the 2008 fielding the entire Young Adult questionnaire will again be in CAPI format.


CAPI surveying will continue to be used in round 23. We estimate that the average interview would be 10–15 percent longer if paper-and-pencil interviewing were used instead of CAPI.


In round 23, we will continue to exploit computer interviewing to reduce the number of validation reinterviews needed to ensure that field interviewing is accurate. The survey includes a number of questions for which interviewers should not be able to fabricate plausible answers, such as the respondent’s date of birth, height, and weight. Entries for these questions that are inconsistent with information we already have could signal that an interviewer falsified an interview. We then conduct reinterviews only in cases where an inconsistency suggests that a check is needed. This approach, first used in round 20, allowed us to reduce the number of validation reinterviews from about 1,250 in round 19 to about 200 in round 20.
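The consistency screening described above can be sketched as follows. This is not the actual NLSY79 processing system; the field names and tolerances below are hypothetical and illustrate only the general logic of flagging for reinterview those cases whose hard-to-fabricate answers disagree with information already on file.

```python
# Illustrative sketch: flag cases for a validation reinterview when
# answers to hard-to-fabricate questions (date of birth, height, weight)
# are inconsistent with information already on file. Tolerances allow
# for ordinary reporting variation and real change between rounds.

HEIGHT_TOLERANCE_IN = 2    # small differences in reported height are common
WEIGHT_TOLERANCE_LB = 25   # weight can genuinely change between rounds

def needs_validation(prior, current):
    """Return True if the case should be queued for a validation reinterview."""
    if current["birth_date"] != prior["birth_date"]:
        return True
    if abs(current["height_in"] - prior["height_in"]) > HEIGHT_TOLERANCE_IN:
        return True
    if abs(current["weight_lb"] - prior["weight_lb"]) > WEIGHT_TOLERANCE_LB:
        return True
    return False

on_file = {"birth_date": "1960-05-14", "height_in": 68, "weight_lb": 170}
reported = {"birth_date": "1960-05-14", "height_in": 67, "weight_lb": 180}
print(needs_validation(on_file, reported))  # False: consistent, no reinterview
```

Only the small share of cases returning True would receive a validation reinterview, which is how the round 20 reduction from about 1,250 to about 200 reinterviews was achieved.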


In round 23, we also plan to introduce an additional computerized check to ensure that an interview has taken place. Our laptops will be equipped with an internal microphone and recording capability. We will record short segments of the interview (that is, 15 seconds or less) several times throughout the interview at points unknown to the interviewer. If we have any suspicions about a given interview, we can then listen to the sound files and ensure that the interviewer is reading the appropriate question aloud. For round 23, we plan to use these sound files only for verification purposes and not for data quality review or any other research purpose. We plan to focus our recordings on questions where the interviewer will be talking (for example, a long introduction screen explaining different types of jobs) and we expect to minimize recordings of any respondent voices. However, we cannot rule out the possibility that a respondent may be speaking when a recording occurs. We will include in the introduction to the questionnaire a statement that “Parts of this interview may be recorded for quality control purposes. This will not compromise the strict confidentiality of your responses” (“CONSENT-1200” in the main youth questionnaire and “INTRO” in the young adult questionnaire).
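The selection of recording points described above can be sketched as follows. This is not the production interviewing software; the function name and parameters are hypothetical and show only how recording points unknown to the interviewer might be chosen at random from the instrument's questions.

```python
# Illustrative sketch: randomly choose a few questions at which a short
# audio segment (15 seconds or less) will be recorded for verification
# that the interviewer read the question aloud. The selection is made
# in advance and is unknown to the interviewer.
import random

def pick_recording_points(question_ids, n_segments=5, seed=None):
    """Randomly select the questions at which short segments are recorded."""
    rng = random.Random(seed)  # a per-case seed makes the choice reproducible
    n = min(n_segments, len(question_ids))
    return sorted(rng.sample(question_ids, n))

# Hypothetical instrument with 200 question screens.
questions = [f"Q{i}" for i in range(1, 201)]
print(pick_recording_points(questions, n_segments=5, seed=42))
```

Seeding per case means the same recording points can be reconstructed later when a suspicious interview needs to be checked against its sound files.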


4. Efforts to Identify Duplication

A study entitled “National Social Data Series: A Compendium of Brief Descriptions” by Richard C. Taeuber and Richard C. Rockwell includes an exhaustive annotated list of the national data sets available at the time it was written. A careful examination of the descriptions of all the data sets in their comprehensive listing indicates clearly that no other data set would permit the comprehensive analyses of youth and young adult employment that can be conducted using the National Longitudinal Surveys. Indeed, it was the absence of alternative data sources that was the deciding factor in the Department of Labor’s determination in 1977 to sponsor this comprehensive youth survey. The longitudinal nature of the survey and the rich background information collected mean that no subsequently released survey can provide data to replace the NLSY79. The expansion of the NLSY79 in the mid-1980s to incorporate the child outcome data represents a unique data-collection effort.


Survey staff have continued to confirm that no comparable data set exists. An investigation into data sets related to wealth by F. Thomas Juster and Kathleen A. Kuester describes the data available in the wealth domain, showing the unique nature of the data available in the NLS.


The volume The Future of the Survey of Income and Program Participation points out a number of contrasts between the Survey of Income and Program Participation (SIPP) and the NLS and other major longitudinal studies (see especially pages 77, 107, and 265–7). This book was written primarily to review the SIPP, but helps put the major longitudinal surveys in perspective.


As we will describe more fully below, BLS convened a conference in fall 1998 to look at the design of the NLSY79 for 2002 and beyond, and in the process external reviewers assessed the coverage, quality, and duplication of the survey. In its central areas of coverage—event histories on labor supply, major demographic events, and child assessments—this conference concluded that the NLSY79 was well-designed and continued to be a unique resource, unduplicated in the national statistical inventory.


As these studies show, no other available longitudinal data set can be used to address effectively the many research topics highlighted in Attachment 4. This data set focuses specifically and in great detail on the employment, educational, demographic, and social-psychological characteristics of a national sample of young baby boomers and their families and measures changes in these characteristics over long time periods. It gathers this information for both men and women, as well as for relatively large samples of non-black/non-Hispanic, black, and Hispanic adults. The repeated availability of this information permits consideration of employment, education, and family issues in ways not possible with any other available data set. The combination of (1) longitudinal data covering the time from adolescence; (2) national representation; (3) large minority samples; and (4) detailed availability of education, employment and training, demographic, health, child outcome, and social-psychological variables makes this data set, and its utility for policy-related social science research, unique.


In addition to the unique content of the interviews, the survey is also distinctive because of its coverage of the respondents’ lives for more than 25 years and the linkage between concurrently collected data on mothers and their children. It is these aspects that attract the thousands of users who rely on this survey for their studies of American workers, their careers, and their families.


References:

Citro, Constance C. and Kalton, Graham, eds. The Future of the Survey of Income and Program Participation. Washington, DC: National Academy Press, 1993.

Juster, F. Thomas and Kuester, Kathleen A. “Differences in the Measurement of Wealth, Wealth Inequality and Wealth Composition Obtained from Alternative U.S. Wealth Surveys.” Review of Income and Wealth Series 37, Number 1 (March 1991): 33–62.

Taeuber, Richard C. and Rockwell, Richard C. “National Social Data Series: A Compendium of Brief Descriptions.” Review of Public Data Use 10, 1–2 (May 1982): 23–111.


5. Involvement of Small Organizations

Not applicable as the NLSY79 is a survey of individuals in household and family units.


6. Consequences of Less Frequent Data Collection

The core of the National Longitudinal Surveys is their focus on labor force behavior. It is very difficult to reconstruct labor force behavior retrospectively while maintaining sufficient precision and data quality. This is the single most important reason we strive to maintain regular interviews with these respondents, who on average experience frequent transitions in employment, income and earnings, and family and household structure. The dates of these transitions are difficult to reconstruct when one focuses on events earlier than the recent past. For those who are employed, retrospective information on wages, detailed occupations, job satisfaction, and other employment-related characteristics cannot be easily recalled.


As with employment-related information, data about a respondent’s education and training history are also extremely difficult to recall retrospectively. Completion dates of training and education programs are subject to severe memory biases. Thus, causal analyses that require a sequencing of education, training, and work experiences cannot be easily or accurately accomplished with historical data. Not only are completion dates of educational and training experiences frequently difficult to recall, but there is also evidence that misreporting of program completion is not unusual.


The precise timing and dating of demographic, socio-economic, and employment events, so crucial to most labor force analysis, is in most instances impossible to reconstruct accurately through retrospective data collection that extends very far into the past. For example, we have evidence from the NLS that dates of events of fundamental importance, such as marriage and birth histories, are subject to considerable error at the disaggregated level when collected retrospectively. Respondents have difficulty recalling when their marriages began or ended. Also, accurate information about household structure, how it changes over time, and how this relates to changes in family income and labor force dynamics is difficult to reconstruct retrospectively, as is the information on the health and related behaviors of the respondents, their spouses, and their children.


Finally, it is important to emphasize that information of a subjective nature can only be accurately reported and collected on a current, continuing basis. Recollection of attitudes may be colored by subsequent experiences or reflect a rationalization of subsequent successes or failures. Attitudes as widely diverse as one’s ideas about women’s roles or how one perceives one’s health as of an earlier period can be recollected inaccurately, even when respondents are trying to be as honest as they can. In addition, the older the events that one tries to recall, either objective or subjective in nature, the greater the likelihood of faulty recall. The recall of events or attitudes is often biased either by a tendency to associate the event with major life-cycle changes (that may or may not be in temporal proximity to what one is trying to recall) or to move the event into the more recent past. The cognitive and socio-emotional information collected for the children of the NLSY79 respondents is, of course, sensitive to the age and life-cycle stage through which the particular children are progressing and, in many instances, changes over time. This is the reason why we need to repeat some of the assessments, in order to measure the extent to which the results are related to the age of the child as well as intervening family and non-family activities.


Because of budget cuts at BLS over the past 15 years, the funding for the NLSY79 has been significantly reduced. While more frequent interviewing is desirable, financial limitations prompted the NLSY79 to move to a biennial interview cycle beginning in 1994. The data loss due to reduced frequency is somewhat ameliorated by the fact that the cohort is more established, having negotiated the school-to-work transition with varying degrees of success. The NLSY79 uses bounded interviewing techniques and is designed so that when respondents miss an interview, information not collected in the missed interview is gathered in the next completed interview. In this way, the event history on work experience is very complete.


A study was conducted to assess the impact of the longer recall period by using an experimental design in the 1994 interview. About 10 percent of the respondents who were interviewed in 1993 were given a modified instrument that was worded as if the respondents were last interviewed in 1992. Using this experimental structure, we examined the respondents’ reports on experiences between the 1992 and 1993 interviews using information from their 1993 and 1994 reports on that same reference period. As expected, recall was degraded by a lower interview frequency. Events were misdated and some short duration jobs were not reported when the reference period was moved back in time. Based on this evidence, it is clear that less frequent data collection adversely affects longitudinal surveys.


A second potential problem caused by the move to a biennial interview is a decline in our ability to locate respondents who move. We have been able to compensate for this so far, but a change to less frequent interviewing would likely have a more negative impact.


7. Special Circumstances

None of the listed special circumstances apply.


8. Federal Register Notice and Consultations


Three comments were received in response to the Federal Register notice published in Volume 72, No. 54 on March 21, 2007. All three sets of comments favored the inclusion of the proposed political participation module. The first letter of comment indicated that the inclusion of a political participation module represented a ‘low-cost, low-risk opportunity’ to expand the NLS user base. The second letter of comment discussed how the module ‘assesses important aspects of America’s involvement in our system of government’ and noted that the questions ‘are the subject of substantial scholarly research interest.’ The commentator also noted that the questions have been rigorously tested and are based on many years of asking Americans about their orientation toward politics. The third letter of comment recounted the Irish Central Statistics Office’s (CSO) experience with including a political participation module in its Quarterly National Household Survey (QNHS). The commentator noted that the module is extensively used by social scientists and is considered an innovation and a success for the CSO.


There have been numerous consultations regarding the NLSY79. Preceding the first round of the NLSY79, the Social Science Research Council sponsored a conference at which academics from a broad spectrum within the social sciences were invited to present their views regarding (1) the value of initiating a longitudinal youth survey and (2) what the content of the surveys should be. The initial survey development drew heavily on the suggestions made at this conference, which were published in a proceedings volume under the auspices of the SSRC.


In 1988, the National Science Foundation sponsored a conference to consider the future of the NLS. This conference consisted of representatives from a variety of academic, government, and non-profit research and policy organizations. There was enthusiastic support for the proposition that the NLS should be continued in the current format, and that the needs for longitudinal data would continue over the long run. The success of the NLS, which was the first general-purpose, longitudinal labor survey, has helped reorient survey work in the United States toward longitudinal data collection and away from simple cross sections.


BLS has consulted with its Business Research Advisory Council and Labor Research Advisory Council for their input into questionnaire content.


Also, on a continuing basis, BLS and its contractor, the Center for Human Resource Research, encourage NLS data users to (1) suggest ways in which the quality of the public-use data can be improved and (2) suggest additional data elements that should be considered for inclusion in subsequent data rounds. We encourage this feedback through the public information offices of each organization and through the quarterly NLS Newsletter.


Individuals from other federal agencies who were consulted regarding the content of the 2006 survey include:


V. Jeffrey Evans

Director of Intergenerational Research

National Institute of Child Health and Human Development

6100 Executive Boulevard, Room 8B07

Bethesda, MD  20892-7510


The NLS program has a Technical Review Committee that advises BLS on interview content and long-term objectives. That group has met twice a year for the past decade. Table 3 below shows the current members of that committee.



Table 3. Technical Review Committee for the NLS (2007)


David Autor

Massachusetts Institute of Technology

Department of Economics

50 Memorial Drive, E52-371

Cambridge, MA 02142

Email: [email protected]

Phone: 617-258-7698

Fax: 617-253-1330


Janet Currie

Professor, Department of Economics

Columbia University

Room 1038 IAB

420 West 118th Street

New York, NY 10027

Email: [email protected]

Phone: 212-854-4520

Fax: 212-854-8059


Paula England

Department of Sociology

Building 120, Serra Mall

Stanford University

Stanford, CA 94305-2047

Email: [email protected]

Phone: 650-723-4912

Fax: 650-725-6471


Jeff Grogger

Harris School of Public Policy

University of Chicago

Suite 139

1155 E. 60th Street

Chicago, IL 60637

E-mail: [email protected]

Phone: 773-834-0973


Arie Kapteyn

Senior Economist

RAND

1776 Main Street

P.O. Box 2138

Santa Monica, CA 90407

Email: [email protected]

Phone: 310-393-0411 x7973

Fax: 310-393-4818


Annamaria Lusardi

Dartmouth College

Economics Department

301 Rockefeller Hall

Hanover, NH 03755

Email: [email protected]

Phone: 603-646-2099

Fax: 603-646-2122


Derek Neal

Professor and Chair

Department of Economics

University of Chicago

1126 E. 59th Street

Chicago, IL 60637

Email: [email protected]

Phone: 773-702-8166

Fax: 773-702-8490


Seth Sanders

Department of Economics

University of Maryland

College Park, MD 20742

Email: [email protected]

Phone: 301-405-3497


Chris Taber

Professor of Economics

Northwestern University

302 Arthur Andersen Hall

2001 Sheridan Road

Evanston, IL 60208-2600

Email: [email protected]

Phone: 847-491-8229


Bruce Western

Professor, Department of Sociology

Wallace Hall

Princeton University

Princeton, NJ 08544

Email: [email protected]

Phone: (609) 258-2445

Fax: (609) 258-2180


The NLS Technical Review Committee convened a conference in 1998 to review the current and future design of the NLSY79. This conference indicated that the central design of the NLSY79 remained strong, although changes in the nation’s welfare program required changes in the program recipiency section of the survey. Many of these changes were implemented in the 2000 and 2002 interviews. Some health section modifications were introduced in 2006 (cognitive functioning module), and the 2008 survey will include a new health module for respondents who have reached age 50 (mirroring the age 40 module).


In addition to the Technical Review Committee, the decisions concerning which child outcome measures to include in the child assessment sections of the NLSY79 were carefully considered from a number of perspectives. The National Institute of Child Health and Human Development (NICHD) has collaborated with BLS on the NLSY79 for many years, with NICHD providing funds for topical modules that are added on to the core interview. This collaboration reduces the total cost of data collection for the government. NICHD staff consult with experts outside NICHD to determine priorities for the modules it funds. NICHD staff, CHRR personnel, and nationally recognized panels of experts jointly made the decisions about NICHD-sponsored survey topics from medicine and the social sciences. The individuals on these panels were all highly respected social scientists with national reputations, and each had specialized areas of interest central to this study.


The NICHD has also convened groups of outside experts to review the progress and content of the data collected for NICHD within the NLSY79 program. A brief description of outside experts consulted with respect to the child instruments and their affiliations may be found in table 4 below.

Table 4. Advisors and Experts Contributing to NLSY79 Child and Young Adult Surveys


Children of the NLSY79

Ann L. Brown
Department of Psychology
University of Illinois

Joseph Campione
Department of Psychology
University of Illinois

Joseph Campos
Department of Psychology
University of Denver

Lindsay Chase-Lansdale
Chapin Hall Center for Children
University of Chicago

William E. Cross, Jr.
Department of Psychology
Cornell University

Robert Emery
Department of Psychology
University of Virginia

Rochel Gelman
Department of Psychology
University of Pennsylvania

Willard H. Hartup
Institute of Child Development
University of Minnesota

Lois Hoffman
Department of Psychology
University of Michigan

Jerome Kagan
Department of Psychology and Social Relations
Harvard University

Luis M. Laosa
Educational Testing Service
Princeton, New Jersey

Robert Michael
Graduate School of Public Policy Studies
University of Chicago

Marian Radke-Yarrow
Laboratory of Developmental Psychology
National Institute of Mental Health

Henry Ricciuti
Dept of Human Development and Family Studies
Cornell University

Joseph Rodgers
Department of Psychology
University of Oklahoma

Barbara Starfield (M.D.)
Dept of Health Care Organizations
The Johns Hopkins University

Linda Waite
Population Research Center
NORC, University of Chicago

Kenneth Wolpin
Department of Economics
University of Pennsylvania

Michael Yogman (M.D.)
Infant Health and Development Program
Children’s Hospital, Boston, MA

Nicholas Zill
Child Trends, Inc.
Washington, D.C.

Young Adults

Kenneth Wolpin
Department of Economics
University of Pennsylvania

Elizabeth Menaghan
Department of Sociology
Ohio State University

Kristi Williams
Department of Sociology
Ohio State University

James R. Walker
Department of Economics
University of Wisconsin

David Blau
Department of Economics
University of North Carolina

Joe Rodgers
Department of Psychology
University of Oklahoma

Sandra L. Hofferth
Department of Family Studies
University of Maryland

Freya L. Sonenstein
Professor & Director
Center for Adolescent Health
Johns Hopkins Bloomberg School of Public Health

Guang Guo
Department of Sociology
University of North Carolina



9. Payment to Respondents

Because this is a long-term study requiring the subjects to be reinterviewed regularly, respondents are offered compensation for completing the interview as a means of securing their long-term cooperation. Respondent payments are appropriate given the long-term nature of the survey: 2008 will be the 23rd round, 29 years after the survey began. We conducted an experiment during Round 19 (2000) of the survey to determine the effect of higher response fees on reluctant respondents toward the end of the field period. Then, during Round 20 (2002), we conducted an experiment that focused on cooperative respondents to determine whether respondent fees can motivate them to cooperate to a degree that significantly reduces our field costs. We documented the results of the first experiment in our clearance statement for the Round 20 field effort and the second (the Early Bird experiment) in our clearance statement for Round 21. A brief summary of those findings is as follows:


  1. Respondent fees (round 19) - Offering a higher respondent fee to reluctant respondents toward the end of the field period is cost-effective when the respondent refused the interview in the previous round. When the respondent had cooperated in the previous round, comparing a $40 fee with an $80 fee, normal conversion efforts and the $40 fee converted enough of the reluctant respondents that the cost in respondent fees was about $270 per incremental case. When the respondent had refused in the previous round, comparing the yield from a $40 fee with that from an $80 fee, an additional case cost about $133 in additional fees. These costs per incremental case are calculated by dividing the change in total fees paid by the change in cases interviewed for the $40 versus $80 fee treatments (see the round 20 clearance package for a more detailed explanation).

  2. Early Bird Experiment (round 20) - Offering a higher respondent fee if the respondent will call the interviewer rather than waiting for the interviewer to call him/her is very effective in reducing field costs. This experiment allowed us to attain several months of production with fewer hours per case than we have observed in the best week in the past fifteen years. We tried both $60 and $80 treatments in this experiment. While the $60 fee was more cost-effective, for both treatments field costs were so low that either incentive fee in return for cooperative behavior is highly favorable to the project. This experiment was conducted in twelve replicates fielded sequentially. We were able to evaluate the experiment for each of the randomly selected replicates within a few weeks of mailing out the materials.


To provide perspective, toward the end of Round 20 in 2002, the weekly field costs per case were over $500. Respondent incentives represent only a fraction of the total field costs, and higher incentives can be a cost-effective means of increasing response while constraining the overall budget. We face a growing pool of respondents who are reluctant to cooperate. In this section we propose a set of measures to encourage cooperation that involve more than just higher respondent fees.


Marketing - We have put a great deal of effort into respondent materials that describe the importance of the study. These materials are colorful, easy to read, and produced by experienced marketers. For example, in round 20 one brochure used jellybeans as a visual theme illustrating the importance of each respondent. In round 21, we extended this marketing approach to include the use of small in-kind gifts, which were tailored to the respondent’s needs or coordinated with the respondent materials. We plan to continue use of such small gifts in Round 23. For example, a flyer with the theme “You’re a Lifesaver” might be accompanied by a bag of Lifesaver candies.


Pulling these three strands together (respondent fees, Early Bird promotion, and the marketing campaign), we request clearance for the following, integrated conversion strategy:


Main Youth Survey:

  • We request permission to increase the base main youth respondent fee to $50, an increase of $10 over the amount given in 2002–2006 to respondents not in one of the incentive experiment groups. We feel that an increase is necessary at this time because the respondent fee has been level for several rounds. Our experience with the various NLS cohorts indicates that respondents react positively to regular increases in the fee amount. Because the pretest sample is not offered the Early Bird, these respondents would receive $50 as well.

  • As OMB requested last round, we will make an Early Bird offer to all main youth respondents. We request permission to increase the Early Bird fee to $70 for those offered $60 last round. This small $10 increase reflects the need to raise incentives over time in line with inflation and the cost of living. We would also like to maintain the $80 fee for those who received that amount last round. As in 2004, all main survey members who were in the same family in 1978 will be offered the same fee amount.

  • For respondents in the $80 treatment cells in the round 19 experiment, we plan to continue offering them the same incentive amount for Round 23.

  • We request permission to institute an additional incentive for respondents who were not interviewed in the previous survey round(s); this is appropriate because these respondents tend to have slightly longer interviews as they catch up on previous rounds’ information. This incentive would be structured in exactly the same way as the successful non-interview premium that has been used for several rounds in the NLSY97. Specifically, we would offer respondents $10 per consecutive round missed, up to $30 (3 rounds). We would be careful to inform respondents that this is a one-time additional incentive in appreciation of the additional time needed to catch up in this interview round, and that they will not receive this additional amount next round. Based on our experience in the NLSY97, we anticipate that respondents will appreciate this non-interview premium and will understand the distinction between the base amount and the additional incentive.
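The proposed non-interview premium in the last bullet follows a simple capped schedule: $10 for each consecutive round missed, up to $30 (three rounds). A minimal sketch, with an illustrative function name:

```python
def non_interview_premium(rounds_missed):
    """Additional one-time incentive for respondents who missed the
    previous round(s): $10 per consecutive round missed, capped at $30."""
    return min(10 * rounds_missed, 30)

# Premium for 0 through 4 consecutive missed rounds.
print([non_interview_premium(r) for r in range(5)])  # [0, 10, 20, 30, 30]
```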


To avoid problems with unequal treatment within families, we will continue to offer all relatives the best deal to which any family member is entitled, except for the additional NIR premium. Because our approach is to tie higher incentives to cooperative behavior, bringing more respondents under the Early Bird rubric will reduce our overall cost structure, hence extending higher incentives is a win-win offer for the project and respondents.


Young Adult Survey:

  • We request permission to offer $50 as the base incentive for young adult respondents. This represents a $10 increase over the $40 amount given for the last 2 rounds.


Child Survey:

In past rounds, we have provided the mother with an incentive payment upon completion of the entire suite of child survey instruments (mother supplement, child interview/assessment, and child self-administered supplement). Through 2004, these instruments were typically administered within a span of a few days, if not on the same day, so this approach posed no problems. In 2006, we incorporated the mother supplement assessments into the main youth interview but maintained the traditional approach of providing the incentive after completion of the entire suite of instruments (mothers received $10 for completing the mother supplement and child assessment, and an additional $10 if a child age 10-14 completed the child self-administered supplement). This led to complications in administering the incentive payments. For children under the age of 4, the only instrument to be completed is the mother supplement, so the incentive was provided for these children following the completion of the main youth interview. For children ages 4-14, we attempted to complete the child interview/assessment and child self-administered supplement (if appropriate); depending on the timing of the main youth interview, this may have occurred several months later. Assuming that the child completed the child interview/assessment, the mother received $10 at that time (plus an additional $10 for the child self-administered supplement). If the mother completed the mother supplement questions at the time of the main youth interview but the child interview/assessment was not completed (as of February 19, 2007, we still have about a week of interviewing left in round 22), we plan to mail the mother’s incentive payment for the mother supplement at the end of the interview round.
Clearly, this approach is unnecessarily complicated and has led to significant administrative burden, and it has also led to some respondent dissatisfaction and confusion about the timing of incentive payments for various children. Therefore, we propose to simplify the approach for round 23 by separating the incentive payments for the various child instruments:

  • We request permission to offer mothers $10 for the completion of each child interview/assessment and to give each child age 14 and younger a small toy or gift card worth not more than $10. We have found that, while it is appropriate to give the cash incentive to the mother, a toy or gift provided directly to the child provides the child with encouragement and excitement to complete the survey.

  • We request permission to offer mothers $10 for the completion of each set of mother supplement assessments within the main youth interview. This payment would be made in conjunction with the incentive payment for the main youth interview, although we will make it clear to the respondent which portion of the money is related to the main youth interview and which is for the mother supplement(s).

  • We request permission to offer mothers an additional $10 for the completion of each child self-administered supplement. This is appropriate because the large additional questionnaire section represents an additional burden on children ages 10-14.

Assuming that the proposed child incentives are approved, the effect would be an increase of $10 as compared to 2006 for a child case where all relevant instruments are completed. In addition, we will simplify fee administration and generate respondent goodwill by providing the $10 mother supplement incentive in close proximity to the time the mother supplement is actually completed.
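The proposed round 23 child incentive schedule above can be expressed as a simple per-instrument sum. The function name and structure are illustrative only, not part of the survey's actual systems.

```python
def child_incentive(n_assessments, n_mother_supplements, n_self_admin):
    """Cash paid to the mother under the proposed schedule: $10 per
    completed child interview/assessment, $10 per set of mother
    supplement assessments, and $10 per child self-administered
    supplement (ages 10-14)."""
    return 10 * (n_assessments + n_mother_supplements + n_self_admin)

# One child age 10-14 completing all three instruments: $30 to the mother
# (plus a toy or gift card worth up to $10 given directly to the child).
print(child_incentive(1, 1, 1))  # 30
```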


Cross-Sample In-Kind:

  • We request permission to spend no more than $5 per respondent on average, with an upper limit of $20 per case, on personalized gifts that convey the message that each respondent is special to the survey and has a unique situation that our program acknowledges and respects. For example, we may provide a small gift related to the marketing materials, or the interviewer may bring a pizza to the respondent’s house for the family dinner. Bringing dinner can be particularly effective for mothers with children in the child sample, as these interviews result in a somewhat larger time burden for the respondents. This continues our standard practice and spending limits from past rounds.


Gatekeepers:

  • Some “gatekeepers” are particularly helpful in tracking down recalcitrant respondents. For example, a parent or sibling not in the survey may provide locating information or encourage a respondent to participate. (Note that we never reveal the name of the survey to these gatekeepers; it is described simply as “a national survey.”) We often return to the same gatekeepers round after round. To maintain their goodwill, we would like to be able to offer gatekeepers who are particularly helpful a small gift worth about $5. This gift would most likely be a box of chocolates or a small plant.


In our experience, there is no such thing as a single strategy for securing respondent cooperation. The primary need is for flexibility in how we approach respondents and how we follow up with reluctant respondents. Overall, there are about 2,500 respondents who are hard to interview, either because of busy schedules or a mindset that ranges from indifference to hostility. Our Early Bird efforts attempt to reduce our costs for cooperative respondents so that we can devote the resources necessary for difficult cases.


We reiterate that fees are only part of our approach to encouraging response. An equally important part of the effort is designing an effective marketing campaign, including conversion materials that interviewers can use to address the variety of reasons respondents give for declining the interview. This portfolio of respondent materials backs up the interviewer, providing a variety of approaches to converting refusals. We also encourage national teamwork among the interviewers, including periodic calls in which interviewers share the “tricks of the trade” that turn reluctant respondents into completed cases. Conversion materials and the ability to employ flexible respondent incentives also have important effects on interviewer morale. With a combination of a marketing campaign to “sell” the survey and the ability to personalize their approach to each respondent, interviewers will not feel they are alone, forced to handle a difficult task of persuasion without the tools to do the job.


In addition to these measures, we plan to step up our marketing campaign with birthday cards and other greetings that ask nothing of the respondents and simply tell them they are in our thoughts. We also plan to step up our locating effort so that it continues at a low level year-round, year in and year out. We also plan to monitor area code changes, which have become more frequent and play havoc with the accuracy of our respondents’ phone numbers. We plan to review the record of calls to identify subsets of respondents for whom a particular style of advance conversion letter will address their particular concerns. We will also continue our existing marketing efforts, including survey questions that solicit respondents’ views and opinions, trying our best to secure their goodwill so they look forward to an interesting, engaging interview that has face value as a serious scientific and policy-related endeavor.


Our primary goal must be to continue in the good graces of the respondents. When we feel respondents are under heavy stress and suspect additional contacts will be unproductive, we will set the case aside and try again in two years. Angering the respondent is not an option in the face of their ability to screen and reject our calls. Our incentive efforts and contacting approach will continue our efforts to motivate respondents, assuage their concerns, and convey our interest in them as individuals, not numbers.


10. Confidentiality of Data


a. BLS Confidentiality Policy

Commissioner’s Order No. 1-06, “Confidential Nature of BLS Records,” explains the Bureau’s policy on confidentiality. The order states in part: “In conformance with existing law and Departmental regulations, it is the policy of the BLS that Respondent identifiable information collected or maintained by, or under the auspices of, the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that will ensure that the information will be used only for statistical purposes and will be accessible only to authorized persons.” (This order is provided in full in Attachment 2.) By signing a BLS Nondisclosure Affidavit, all authorized persons employed by the BLS contractors at the Ohio State University Center for Human Resource Research and at the National Opinion Research Center have pledged to comply with the Bureau’s confidentiality policy.


NLS data are also covered by the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA). Survey staff must follow all of the provisions of CIPSEA in the collection and dissemination of NLS data. See Attachment 3 for a copy of CIPSEA. Respondents are provided with detailed privacy and confidentiality information on the back of the letter that they will receive a few weeks prior to the round 20 interview. This information is reproduced in Attachment 9.


The generally available public-use version of the NLSY79 data set masks all data that are of sufficient specificity that respondents theoretically could be identified through some set of unique characteristics. A second data set, called the NLSY79 geocode file, identifies the respondents’ state, county, and Metropolitan Statistical Area of residence. Access to this file is only available through a licensing system established by BLS. Under this licensing system, legitimate researchers at universities and other research organizations in the United States can use NLSY79 geocode data at their own facilities, provided that the research project and physical and electronic security measures that the researchers describe in their application are approved by BLS. Once BLS approves a project, a dean or other high-ranking official of the researchers’ institution is required to sign a letter of agreement that obligates the institution to adhere to BLS security requirements designed to protect the confidentiality of respondents. This agreement states that any results or information obtained as a result of research using the NLS data will be published only in summary or statistical form so that individuals who participated in the study cannot be identified. The individual researchers participating in each project also are required to read and sign the BLS Nondisclosure Affidavit and return it to BLS before receiving the NLSY79 geocode CD.


b. CHRR and NORC Confidentiality Safeguards

CHRR and NORC have established safeguards for the NLSY79 data to provide for the confidentiality of data and the protection of the privacy of individuals in the sampled cohorts. Safeguards for the data include:


1. Storage of survey documents in locked space at NORC until the data are cleaned and sent to CHRR.

2. Protection of computer files at NORC and at CHRR against access by unauthorized persons and groups. Especially sensitive files are secured in locked offices on secure floors.

3. Storage of documents at Ohio State in locked space after data processing is completed.


Protection of the privacy of individuals is accomplished through the following steps:


1. Permission for the interview is obtained from the parents/guardians of all minors.

2. Information identifying respondents is detached from the questionnaire and will be kept in locked storage. Respondents will be linked to data through identification numbers.

3. After the final interview wave, respondent identifiers will be destroyed by shredding paper documents and deleting electronic files.


c. Reidentification

At OMB’s request, in 2000 BLS and CHRR investigated whether it was possible to identify respondents using information on the public-use data file. As a result of this investigation, we moved several variables from the public-use data file to the geocode CD in round 20. These variables include the respondent’s day of birth (month and year remain public), day of birth of family members, state of birth, and field of study in which an advanced degree was obtained. Further details were provided in an attachment to the round 20 clearance package.


11. Sensitive Questions

Several sets of questions in the NLSY79 and Children of the NLSY79 might be considered sensitive. This section describes these questions and explains why they are a crucial part of the data collection. All of these topics have been addressed in previous rounds of the surveys, and respondents generally have been willing to answer the questions. Respondents are always free to refuse to answer any question that makes them feel uncomfortable.


NLSY79 main survey

Income, Assets, and Program Participation. One major set of sensitive questions collects information about respondents’ income and assets. The interviews record information about the sources and amounts of income received during the past calendar year by the respondent, his/her spouse or partner, or other family members. Income sources identified include the respondents’ and their spouses’ or partners’ wages and salaries, income from military service, profits from a farm or business, Social Security, pensions and annuities, and alimony/child support. These questions, or variants of them, have appeared in the NLSY79 since 1979. While some respondents refuse to answer these questions, our item nonresponse rate is lower than that of most surveys. The survey also asks about income received by the respondent and spouse or partner from unemployment compensation, Aid to Families with Dependent Children and Temporary Assistance for Needy Families (AFDC/TANF), food stamps, Supplemental Security Income, and other public assistance programs. While questions on program participation have changed over the years, they have been in the survey for the past 20 years. Finally, the survey includes a regular collection of information about the value of respondents’ assets. Although the assets section was not included in round 22, these questions have been asked in 18 of the 22 rounds to date. In consultation with the TRC, we have decided that it will only be necessary to ask the assets series in every other round from this point forward. Asset accumulation is a slow process, and it seems reasonable to ask these questions less often, leaving time for questions on other topics in non-asset rounds. The asset questions planned for round 23 are the same as the questions asked in round 21. Although some respondents refuse to answer these questions or say that they don’t know the amount of various assets, as with income we are able to collect assets information with a low rate of nonresponse and relatively little respondent resistance.


Income and assets questions are central to the usefulness of the NLSY79 data collection. Most economic and many other analyses based on these data include some measure of the financial resources available to the respondent, whether as an input variable, outcome variable, or control. It is very difficult to conceive of a replacement measure that would function in the same way in research about the returns to education, participation in the labor market, child development, and so on. The public assistance questions additionally permit research on the effects of the welfare reforms enacted in 1996, providing important information to public officials charged with overseeing the country’s public assistance programs. In addition to providing information about the financial resources currently available to respondents, as respondents age the assets series will permit examination of issues relating to retirement and respondents’ accumulation of wealth in preparation for labor force separation.


As part of the rotating assets module, we include a brief series of questions on personal finance. These questions, which explore issues closely related to income and assets, ask respondents to report problems that may affect their ability to obtain credit. Issues explored include missed payments on bills, credit cards on which the respondent carries the maximum balance, whether and when the respondent has ever declared bankruptcy, and whether the respondent has been turned down for credit in the past 5 years. These questions were asked in round 21 of the NLSY79, and similar questions are asked in the Survey of Consumer Finances. We did not experience nonresponse problems in round 21.


In round 23, we plan to continue the use of follow-up questions for respondents who answer “don’t know” or “refuse” to income and assets questions. As described in a report submitted to OMB with the round 22 clearance package, we have tested these follow-up questions to ensure that we are getting the best information possible. Based on this analysis, we implemented a hybrid approach for round 22 and plan to continue using it in round 23. Briefly, respondents who answer an income or assets question with don’t know or refuse are first asked to provide a self-generated range. Respondents who are unable to answer the self-generated interval question (i.e., who again answer don’t know or refuse) then receive unfolding bracket questions, in which they are asked whether the amount is more or less than a specified amount, and then more or less than a second specified amount. Unfolding brackets are a common follow-up approach in major surveys, including the Health and Retirement Study (HRS). To limit any negative effects of the hybrid follow-up, potentially uncooperative respondents (e.g., those giving at least one refusal in the income section) are skipped past the hybrid follow-up on subsequent refusals.
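The branching logic of the hybrid follow-up described above can be sketched as follows. This is an illustrative sketch only; the function names, argument names, and answer codes are hypothetical and are not the survey's actual instrument code.

```python
# Illustrative sketch (hypothetical names, not the NLSY79 instrument code) of
# the hybrid follow-up for income and assets questions.

def hybrid_followup(answer, ask_self_range, ask_unfolding, prior_refusals=0):
    """Resolve an income/assets item to an exact amount, a range, a bracket, or None.

    answer          -- initial response: a number, "DK" (don't know), or "REF" (refuse)
    ask_self_range  -- callable asking for a self-generated range; returns a
                       (low, high) tuple, or "DK"/"REF"
    ask_unfolding   -- callable running the two-step unfolding brackets
                       ("more or less than $X?", then a second threshold);
                       returns a bracket label
    prior_refusals  -- refusals already given earlier in the income section
    """
    if isinstance(answer, (int, float)):
        return answer                      # exact amount reported; no follow-up needed
    if answer == "REF" and prior_refusals >= 1:
        return None                        # potentially uncooperative: skip follow-up
    ranged = ask_self_range()              # step 1: self-generated range
    if isinstance(ranged, tuple):
        return ranged
    return ask_unfolding()                 # step 2: unfolding brackets

# Example: a respondent answers "don't know" but then supplies a range
result = hybrid_followup("DK", lambda: (40000, 60000), lambda: ">50K")
```

The sketch mirrors the order described in the text: exact answer first, then the self-generated range, then unfolding brackets, with the skip applied only to respondents who have already refused.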


Contraception. The NLSY79 includes three brief questions about contraception. Respondents are asked to report whether they have used birth control in the past month, what types they have used, and what percentage of the time they use birth control. These questions are useful in research about fertility expectations and in public health research related to unprotected sexual contact. These questions have been included in the survey for a number of years, and respondents are generally cooperative. Item nonresponse on contraception is lower than for income questions.


Cigarette and Alcohol Use. Another set of potentially sensitive survey questions is the brief series on cigarette and alcohol use. First, the round 23 interview will include questions on smoking last asked in round 18. These questions ask whether the respondent has smoked more than 100 cigarettes in his/her life, the age at which the respondent began smoking daily, whether the respondent now smokes daily, the age at which the respondent quit smoking, and the number of cigarettes smoked per day. These questions have been asked in several previous rounds of the survey with very low nonresponse rates.


The round 23 survey will also include a series of four questions asking whether the respondent drank alcohol in the past 30 days, the number of days on which alcohol was consumed, the average number of drinks per day, and the number of days on which the respondent consumed 6 or more drinks (an indication of alcohol abuse).  These questions are important for both economic and social research.  Economists are interested in the impact that alcohol use and abuse may have on employment and earnings (for example, Dooley and Prause 1998; Kenkel and Wang 1998).  Sociologists and public health researchers can use alcohol data, along with the other information collected, to examine the social and psychological impact of alcohol use and abuse.


The set of alcohol questions included in the round 22 interviews has been asked previously in identical or similar form (the time reference period was different in the early surveys) in 1982–85, 1988, 1989, 1992, 1994, 2002, and 2006.  In these years, the questions were generally part of a longer and more intrusive series on alcohol use and abuse.  No difficulties were encountered with these longer series in past rounds, and nonresponse is very low.  For example, for the set of four questions being included this year, the largest number of respondents who refused or answered “don’t know” in 1994 was 11.  No problems were experienced in round 22 with the shorter and less intrusive set of questions, and none are expected in round 23.


Political Participation. For the 2008 survey, we are proposing the introduction of a short series of questions on the respondent’s participation in the political process. These questions are described and justified in detail in Attachment 10.


NLSY79 children/young adults

Income. Young adults (those ages 15–20) are asked a series of income questions similar to, but somewhat less detailed than, those asked of their mothers in the main interview. As described above, income data are crucial to many kinds of analysis in a variety of research fields. The young adult data additionally allow researchers to examine similarities or differences in the income sources of mothers and their children, providing information about the transmission of the ability to earn income across generations.


Smoking, Drug and Alcohol Use, and Criminal Activity. Children age 10 and older (including young adults) are asked about smoking, drug use, and alcohol use. These questions record whether the respondent has ever used a number of substances, including alcohol, cigarettes, marijuana, cocaine, and other drugs, and ask about the extent of the respondent’s use in the past 30 days. For young adults (ages 15–20), additional delinquent and criminal behavior questions record whether the young adult has run away from home or been convicted of criminal activities such as selling drugs, possessing drugs, theft, assault, and so on. If the respondent reports convictions, he or she is asked to describe the type of crime committed and the punishment received.


Questions about substance use and criminal behavior are crucial in understanding the education and employment outcomes of this group of young adults. To quote a report based on data from the 1990 Youth Risk Behavior Surveillance System (U.S. Department of Health and Human Services), “Patterns of tobacco, alcohol and other drug use usually are established during youth, often persist into adulthood, contribute substantially to the leading causes of mortality and morbidity, and are associated with lower educational achievement and school dropout.” One concern with long-term drug and alcohol use is the gateway effect that can occur, leading to use of heavier drugs and an increase in other risky behaviors (for example, sexual activity or criminal acts). For examples of such research, see Pacula (1997); Desimone (1998); and Parker, Harford, and Rosenstock (1994). The negative relationship between drug and alcohol use and educational attainment has been investigated by authors such as Yamada, Kendix, and Yamada (1996). Finally, as mentioned above, substance use may have a negative effect on the probability of employment and the wages and benefits received.


These questions will be asked of about 3,065 of the 3,615 children and young adults (under 21) who will participate in round 23. These sensitive questions have been asked in nearly identical form since 1994 without difficulty. Refusals and don’t knows have been quite low (often less than 1 percent). Prior to computer-assisted administration, some respondents did not fill out the self-report booklets correctly or completely. Because these instruments are now computer-administered, starting in 2000 for the young adults and in 2002 for children under age 15, this problem has been ameliorated.


School Safety. Reflecting the growing national concern with weapons in schools, questions on this topic have been administered to adolescents in several other national surveys. Questions on whether a respondent has carried a weapon or been threatened with a weapon have been directed toward adolescents ranging in age from 12 to 18 in the following surveys:

  • The National Youth Study (Tulane, 1998)

  • Welfare, Children, and Families: A Three-City Study (Johns Hopkins, 2001)

  • NLSY97 (BLS, 1997-2004)

  • Youth Risk Behavior Surveys (CDC)


Other surveys have asked questions about whether young respondents carry a weapon to school for protection (Josephson Institute, 1999).


For many previous rounds, the NLSY79 Child and Young Adult surveys have included questions about the child’s attitudes and opinions regarding school, including whether the child feels safe at school. In 2002, we added two related questions that ask whether a respondent has ever seen a student with a gun, knife, or other weapon on school property and, if so, the number of times in the last year that this occurred. These questions will continue to be asked of respondents ages 10–14 in the Child Survey and of respondents attending school in the Young Adult Survey. They will aid researchers in investigating the presence of weapons in schools as it relates to school characteristics, neighborhood environment, child behavior, child success in school, subsequent criminal behavior, and so on.


Sexual Activity. Young adults (about 2,500 respondents ages 15–20) are also asked about the onset of sexual intercourse. Because puberty and the initiation of sexual activity occur during the teenage years for many youths, and because this information may not be recalled accurately if collected retrospectively, it is important to ask these questions of respondents in this age range in each survey round. Results from a number of different surveys, including early rounds of the NLSY97, indicate that a significant proportion of adolescents report that they are sexually active. It is vital that we continue to trace the progression of sexual activity in relation to the realization of educational and occupational goals and with respect to the promotion of good health practices. The level of sexual activity and contraceptive use are important indicators of how serious young people are about reaching higher levels of educational and occupational attainment, and there should be significant congruence between anticipated life goals, sexual activity, and its associated outcomes. These questions also provide information important for analyses of programs and policies related to adolescent health and well-being.


Further, age at first intercourse is important to understanding labor market behavior because of the central role that adolescent fertility plays in affecting the future life course of women. Early childbearing not only retards the education of the mother and hence is deleterious to her labor market opportunities, but also tends to play a powerful role in the intergenerational transmission of poverty. AIDS and other sexually transmitted diseases also make sexual behavior a significant public health issue. For these reasons, this line of questioning is important to the central point of the survey.


The sensitive questions on substance use, criminal behavior, and sexual activity are only asked with the consent of a parent or guardian of a child. We inform parents about the questions we ask, and the parents and teenagers are free to refuse to answer these questions. Our experience has been that participants recognize the importance of these questions and only very rarely refuse to answer. To further protect respondents and encourage honest reporting, these questions are contained in a self-administered section of the interview for children ages 10–14 and for any young adults interviewed in person. Because most young adults will be interviewed on the telephone, the sensitive questions have been written in such a way that the respondent can answer the questions without revealing personal information to anyone (such as a parent) who might overhear the conversation. Although we now ask about the age of the respondent’s most recent sexual partner and his or her relationship with the respondent, no identifying information is collected about sexual partners.


Political Participation. For the 2008 survey, we are proposing the introduction of a short series of questions on the respondent’s participation in the political process. These questions are described and justified in detail in Attachment 10.


Child Assessments. Attachment 7 includes a discussion of the child assessment data to be included in this survey round. For the most part, these assessments are not sensitive; they are well validated and have been administered since 1986 without difficulty or any significant respondent resistance.


Informed Consent. At OMB’s request, we conducted cognitive testing before round 20 to determine whether children and young adults understand the informed consent statement. A report summarizing this research was submitted with the round 20 OMB clearance package. We will continue to use the consent statement developed as a result of that research and used for the first time in round 20.


References

Desimone, Jeffrey. “Is Marijuana a Gateway Drug?” Eastern Economic Journal 24,2 (Spring 1998): 149-163.

Dooley, David and Prause, Joann. “Underemployment and Alcohol Misuse in the National Longitudinal Survey of Youth.” Journal of Studies on Alcohol 59,6 (November 1998): 669-80.

Harford, Thomas C. and Muthen, Bengt O. “Adolescent and Young Adult Antisocial Behavior and Adult Alcohol Use Disorders: A Fourteen-Year Prospective Follow-Up in a National Survey.” Journal of Studies on Alcohol 61,4 (July 2000): 524-528.

Kenkel, Donald S. and Wang, Ping. “Are Alcoholics in Bad Jobs?” NBER Working Paper No. 6401, National Bureau of Economic Research, March 1998.

Pacula, Rosalie Liccardo. “Adolescent Alcohol and Marijuana Consumption: Is There Really a Gateway Effect?” NBER Working Paper No. 6348, National Bureau of Economic Research, January 1997.

Parker, Douglas A.; Harford, Thomas C.; and Rosenstock, Irwin M. “Alcohol, Other Drugs, and Sexual Risk-Taking among Young Adults.” Journal of Substance Abuse 6,1 (1994): 87-93.

Yamada, Tetsuji; Kendix, Michael; and Yamada, Tadashi. “The Impact of Alcohol Consumption and Marijuana Use on High School Graduation.” Health Economics 5,1 (January-February 1996): 77-92.


12. Estimation of Information Collection Burden

The NLSY79 pretest interview will be administered to approximately 100 respondents (assuming that our separate proposal to augment the pretest sample is approved), and the average response time is about 60 minutes per respondent. The main NLSY79 interview will be administered to approximately 7,550 respondents, and the average response time is about 60 minutes per respondent.


The time estimate for the NLSY79 Child Survey involves three components:

  • The Mother Supplement assessments are administered to female NLSY79 respondents who live with biological children under age 15. This section will be administered to about 1,300 mothers, who will be asked a series of questions about each child under age 15. On average, these women each have 1.26 children under age 15, for a total number of approximately 1,650 children. The average response time is 20 minutes for each child or, stated alternatively, 26 minutes for each mother (20 minutes per child times 1.26 children per mother).

  • The Child Supplement involves testing the achievement and aptitude of about 1,450 children ages 4-14. The average response time for this aptitude testing is 31 minutes per child.

  • The Child Self-Administered Questionnaire (SAQ). The Child SAQ is administered to about 900 children ages 10–14, and the average response time is 30 minutes per child.


The Young Adult Survey will be administered to approximately 2,165 youths ages 15 to 20. These youths will be contacted for an interview regardless of whether they reside with their mothers. The average response time for the Young Adult Survey is 45 minutes per respondent.


Reviewers should keep in mind that substantial portions of the child supplement material are only asked of small numbers of children (for example, children ages 10–14 or young adults ages 15–20). Thus, the average time in each household is not as extensive as the attached stack of survey questionnaires might suggest. We are sensitive to the fact that the interviews in households with several children can theoretically pose interviewing problems, but our experience with this material in previous rounds has clearly indicated that this is not an issue. Respondents are very interested in the child assessments and have been quite cooperative.


The projected timing for the interview is based upon data on interview length in 2006. Accurate timings are available on a section-by-section basis for each of our respondents in 1996–2006. The time required to finish an interview varies in the sample. While women with children take longer to answer the fertility and childcare questions, men are asked more questions about employment because they tend to hold more jobs. The data show the standard deviation of interview time for the main survey is around 25 minutes. The variability of the Child Survey components will be chiefly due to differences in the number and ages of children in the family.


During the field period, about 200 interviews are validated to ascertain whether the interview took place as the interviewer reported and whether the interview was done in a polite and professional manner. These reinterviews average about 6 minutes each.


Table 5 below summarizes the estimated respondent burden for round 23 of the NLSY79.


Table 5. Number of respondents and average response time by survey questionnaire, Round 23

Instrument | Total Respondents | Total Responses | Average Time per Response | Estimated Total Burden
NLSY79 Round 23 Pretest | 100¹ | 100 | 60 minutes | 100 hours
NLSY79 Round 23 Main Survey | 7,550 | 7,550 | 60 minutes | 7,550 hours
Round 23 Validation Interviews | 200 | 200 | 6 minutes | 20 hours
Mother Supplement (Mothers of children under age 15) | 1,300² | 1,650 | 20 minutes | 550 hours
Child Supplement (Children under age 15) | 1,450 | 1,450 | 31 minutes | 750 hours
Child Self-Administered Questionnaire (Children ages 10 to 14) | 900 | 900 | 30 minutes | 450 hours
Young Adult Survey (Youths ages 15 to 20) | 2,165 | 2,165 | 45 minutes | 1,624 hours
TOTALS³ | 11,265 | 14,015 | – | 11,044 hours

¹This assumes that our separate proposal for augmentation of the pretest sample is approved and implemented.

²The number of respondents for the Mother Supplement (1,300) is less than the number of responses (1,650) because mothers are asked to provide separate responses for each of the biological children with whom they reside. The total number of responses for the Mother Supplement (1,650) is more than the number for the Child Supplement (1,450) because the number of children completing the Child Supplement is lower due to age restrictions and nonresponse.

³The total number of 11,265 respondents across all the survey instruments is a mutually exclusive count that does not include: (1) the 200 reinterview respondents, who were previously counted among the 7,550 main survey respondents; (2) the 1,300 Mother Supplement respondents, who were previously counted among the 7,550 main survey respondents; and (3) the 900 Child SAQ respondents, who were previously counted among the 1,450 Child Supplement respondents.
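The burden column in Table 5 follows directly from multiplying total responses by average minutes per response. The totals can be checked with a short calculation; the assumption here is that partial hours are rounded up to the next whole hour, which reproduces the table's figures.

```python
import math

# Recompute the Table 5 burden estimates from total responses x average minutes,
# assuming partial hours are rounded up to the next whole hour.
rows = {
    "Pretest":            (100,  60),
    "Main Survey":        (7550, 60),
    "Validation":         (200,   6),
    "Mother Supplement":  (1650, 20),
    "Child Supplement":   (1450, 31),
    "Child SAQ":          (900,  30),
    "Young Adult Survey": (2165, 45),
}
burden_hours = {name: math.ceil(n * mins / 60) for name, (n, mins) in rows.items()}
total_responses = sum(n for n, _ in rows.values())   # 14,015, matching the table
total_hours = sum(burden_hours.values())             # 11,044 hours, matching the table
```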


13. Cost Burden to Respondents or Record Keepers

Respondents for this survey will not incur any capital and start-up costs; respondents will also not incur any operation, maintenance, or purchase-of-service costs.


14. Estimate of Cost to the Federal Government

The total estimated cost of the round 23 (2008) NLSY79 is about $14,000,000. This figure is based on extrapolations from the costs of previous survey rounds, adjusted for the estimated cost savings resulting from survey automation. This cost includes funding for the development of survey instruments for the main NLSY79 and the associated Child and Young Adult surveys; data collection for the surveys; cleaning and preparation of the data file; limited analysis of the survey data; and services to users of the public-use data files.


The survey costs are borne largely by BLS, with the National Institute of Child Health and Human Development providing an anticipated $3,700,000 in funding to BLS for the Child and Young Adult surveys. This shared effort reduces the cost to the government, as compared with the costs that would be incurred if each agency were to conduct the surveys independently.


15. Change in Burden

The estimated total respondent burden of 11,044 hours for round 23 is higher than the 1,100 hours estimated for the pretest sample replenishment, the most recently approved collection. A more appropriate comparison is with the round 22 estimate of 12,172 hours. The decrease from round 22 can be attributed to attrition, to the aging of children from the longer child survey into the shorter young adult instrument, and to the aging of young adults into the grant-funded sample.


16. Plans and Time Schedule for Information Collection, Tabulation, and Publication

Following receipt of final data from NORC, approximately 9 months are spent cleaning the data and preparing the main NLSY79 public-use data file. Subsequently, the child/young adult file is prepared and reports are written for the Department of Labor and the National Institute of Child Health and Human Development. The timing of these events is as follows:


Pretest Interviews: October 2007
Interviews: January 1, 2008–January 31, 2009
Data Reduction and Coding: February 1–April 30, 2009
Public-Use Data File Preparation: May 1, 2009–February 28, 2010
Release of Main NLSY79 Public-Use Data File: March 2010
Report Writing for NICHD, Release of Child/Young Adult File: Summer 2010


17. Reasons Not to Display OMB Expiration Date

Does not apply.


18. Exceptions to “Certification for Paperwork Reduction Act Submissions,” OMB Form 83-I

We do not have any exceptions in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB form 83-I.


B. Collections of Information Employing Statistical Methods


1. Respondent Universe and Respondent Selection Method

The initial NLSY79 sample was selected to represent (after appropriate weighting) the total U.S. civilian and military population of 33,570,000 persons who were ages 14 to 21 as of December 31, 1978. The sample selection procedure included an overrepresentation of Hispanic and black youths so as to include sufficient sample cases to permit racial and ethnic analytical comparisons. Another group that was oversampled was economically disadvantaged non-black/non-Hispanic youths. The NLSY79 also originally included a supplemental sample of youths in the military. In 1985, the military supplemental sample was discontinued, and in 1991, the economically disadvantaged non-black/non-Hispanic oversample was discontinued. Appropriate weights have been developed so that the sample components can be combined in a manner to aggregate to the overall U.S. population born in the years 1957-64 and living in the United States when the sample was selected in 1978. The number of sample cases in 1979, excluding the discontinued military and non-black/non-Hispanic samples, was 9,964. A breakdown by sex and race is depicted in table 6 below. We anticipate a response rate in round 23 that is similar to the round 22 experience.


Table 6. Civilian Sample Interviews Completed in 1979 and 2006 (Preliminary) by Race and Sex

1979 interviews:

Race and Hispanic origin | Men | Women | Total
Non-black, non-Hispanic | 2,518 | 2,484 | 5,002
Hispanic | 981 | 980 | 1,961
Black | 1,524 | 1,477 | 3,001
Total | 5,023 | 4,941 | 9,964

2006 interviews (preliminary):

Race and Hispanic origin | Men | Retention rate, men | Women | Retention rate, women | Total | Total retention rate
Non-black, non-Hispanic | 1,827 | 72.56% | 1,956 | 78.74% | 3,783 | 75.63%
Hispanic | 737 | 75.13% | 753 | 76.84% | 1,490 | 75.98%
Black | 1,174 | 77.03% | 1,207 | 81.72% | 2,381 | 79.34%
Total | 3,738 | 74.42% | 3,916 | 79.26% | 7,654 | 76.82%



Retention rates for the NLSY79 are significantly affected by attrition due to death. As of the 2006 survey, approximately 4.8 percent of the 9,964 NLSY79 respondents still eligible for interviewing were deceased; we are investigating ways to determine whether some respondents we cannot locate may also be deceased. Table 7 provides information about retention rates (percent of all eligible respondents interviewed) and response rates (percent of living eligible respondents interviewed) for each year of the NLSY79.


Table 7. NLSY79 retention and response rates by sample type

Year | Number Interviewed | Retention Rate¹ | Number of Deceased Respondents | Response Rate¹
1979 | 12,686 | – | – | –
1980 | 12,141 | 95.7 | 9 | 95.8
1981 | 12,195 | 96.1 | 29 | 96.3
1982 | 12,123 | 95.6 | 44 | 95.9
1983 | 12,221 | 96.3 | 57 | 96.8
1984 | 12,069 | 95.1 | 67 | 95.6
1985 | 10,894² | 93.9 | 79 | 94.5
1986 | 10,655 | 91.8 | 95 | 92.6
1987 | 10,485 | 90.3 | 110 | 91.2
1988 | 10,465 | 90.2 | 127 | 91.2
1989 | 10,605 | 91.4 | 141 | 92.5
1990 | 10,436 | 89.9 | 152 | 91.1
1991 | 9,018³ | 90.5 | 145 | 91.8
1992 | 9,016 | 90.5 | 156 | 91.9
1993 | 9,011 | 90.4 | 177 | 92.1
1994 | 8,891 | 89.2 | 204 | 91.1
1996 | 8,636 | 86.7 | 243 | 88.8
1998 | 8,399 | 84.3 | 275 | 86.7
2000 | 8,033 | 80.6 | 313 | 83.2
2002 | 7,724 | 77.5 | 368 | 80.3
2004 | 7,661 | 76.9 | 421 | 80.3
2006 | 7,654 | 76.8 | 477 | 80.7

¹Retention rate is defined as the percentage of base-year respondents remaining eligible who were interviewed in a given survey year; deceased respondents are included in the calculations. Response rate is defined as the percentage of base-year respondents remaining eligible and not known to be deceased who were interviewed in a given survey year.

²A total of 201 military respondents were retained from the original sample of 1,280; 186 of the 201 participated in the 1985 interview. The total number of NLSY79 civilian and military respondents eligible for interview beginning in 1985 was 11,607.

³The 1,643 economically disadvantaged nonblack/non-Hispanic male and female members of the supplemental subsample were not eligible for interview as of the 1991 survey year. The total number of NLSY79 civilian and military respondents eligible for interview beginning in 1991 was 9,964.
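The retention and response rate definitions in the footnotes above can be illustrated with the 2006 figures (9,964 eligible respondents, 477 deceased, 7,654 interviewed):

```python
# Retention and response rates as defined in the Table 7 footnotes,
# computed for the 2006 round.
eligible = 9964      # base-year respondents remaining eligible
deceased = 477       # eligible respondents known to be deceased
interviewed = 7654   # interviews completed in 2006

retention_rate = 100 * interviewed / eligible               # deceased included
response_rate = 100 * interviewed / (eligible - deceased)   # living respondents only
# retention_rate is about 76.8 and response_rate about 80.7,
# matching the final row of Table 7
```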


2. Design and Procedures for the Information Collection

The survey includes personal-visit or telephone interviews with all respondents, regardless of their place of residence. At each interview, detailed information is gathered about relatives and friends who could assist in locating the respondent if he or she cannot be readily located in the subsequent survey round. Interviews in round 23 will be carried out between January and December 2008, with the field period extending into January 2009 if necessary. Every effort is made to locate respondents, and no arbitrary limit is placed on the number of callbacks. The success of NORC interviewers in this regard is indicated by a very low rate of attrition over the first 22 rounds of the survey: over 80 percent of the living, in-scope original respondents were surveyed in 2006.


Before data collection begins, NORC interviewers are carefully trained, with particular emphasis placed on handling sensitive issues. Most NORC interviewers have lengthy field experience from earlier NLSY79 interview rounds as well as from other NORC surveys. All new recruits receive one day of in-person training on general interviewing techniques, followed by three days of in-person training on the questionnaire and field procedures. Experienced interviewers complete more than 8 hours of self-study with specially designed materials covering the questionnaire, question-by-question specifications, and field procedures, with exercises on new or difficult sections. All interviewers must successfully complete a practice interview with their supervisor before they are permitted to begin field work.


Efforts to assure data quality in the field are undertaken at several points. The first 100 completed cases are reviewed, answer by answer, to determine whether there are any problems with the instrument. After that, every case identified by the interviewer as having a problem during the interview is reviewed in detail. Throughout the field period, individual cases are checked for problems, and rapid feedback is given to interviewers so they can improve their interviewing methods.


We will reduce burden by employing targeted validation. Cases with unusual patterns, such as atypical interview length or time of day, break-offs, an incorrect entry for the respondent's date of birth, or height and weight entries inconsistent with previous rounds, will be validated. If a validation interview turns up anything suspicious, the interviewer's entire caseload will be validated.
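The targeted-validation screen can be sketched as a simple flagging rule. This is an illustrative sketch only: the field names and thresholds below are assumptions for demonstration, not NORC's actual validation criteria.

```python
# Illustrative sketch of targeted validation: flag completed cases that
# show unusual patterns. Field names and thresholds are hypothetical.
def flag_for_validation(case, prior_round):
    """Return the list of reasons a completed case should be validated."""
    reasons = []
    if case["minutes"] < 30 or case["minutes"] > 120:       # unusual length
        reasons.append("length")
    if case["start_hour"] < 8 or case["start_hour"] > 21:   # unusual time of day
        reasons.append("time_of_day")
    if case["break_offs"] > 0:                              # interview break-offs
        reasons.append("break_off")
    if case["dob"] != prior_round["dob"]:                   # date-of-birth mismatch
        reasons.append("dob")
    if abs(case["height_in"] - prior_round["height_in"]) > 2:  # inconsistent height
        reasons.append("height")
    return reasons

case = {"minutes": 12, "start_hour": 23, "break_offs": 0,
        "dob": "1961-05-04", "height_in": 70}
prior = {"dob": "1961-05-04", "height_in": 70}
print(flag_for_validation(case, prior))  # flags length and time of day
```

A case returning a non-empty list would receive a validation interview; a confirmed problem would then trigger validation of the interviewer's full caseload.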


NORC’s Field Managers (FMs) supervise the field interviewers. For each round, NORC divides the continental U.S. into approximately 15 regions (depending on the locations of respondents), each supervised by a Field Manager who is responsible for staffing and for the quality of fieldwork in the region. A ratio of 1 supervisor to 15 interviewers is the standard arrangement. FMs are, in turn, supervised by one of three Divisional Field Managers.


The interview schedules are prepared by professional staff at the Center for Human Resource Research (CHRR) at The Ohio State University under contract with the U.S. Department of Labor. When new materials are incorporated into the schedule, special assistance is generally sought from appropriate experts in the specific substantive area. The technical expertise of staff at NORC is also used in this regard.


Because sample selection took place in 1978 in preparation for the 1979 baseline interview, sample composition has remained unchanged except for the discontinuation of some of the oversamples as previously mentioned. A more detailed discussion of sampling methodology is available from NLS User Services at CHRR (phone: 614-442-7366, e-mail: [email protected]).


In an effort to reduce respondent burden while still providing a broad spectrum of variables for researchers and policy makers to use, certain topical modules are cycled in and out of the survey from one round to the next. Although the data from these modules are important, it is not necessary to collect data on all topics in every round. An example of such a topical module is the series of questions on promotions and job hierarchies that was asked for the first time in 1990. A redesigned series of questions on promotions was asked again in 1996 and 1998 but was not included in subsequent surveys.


3. Maximizing Response Rates

A number of the procedures used to maximize the response rate have already been described in items 1 and 2 above. Their success is demonstrated by the low attrition rates shown in tables 6 and 7. Attrition has been slightly higher among Hispanics than among blacks and non-black/non-Hispanics, and higher among men than among women. Thanks to the efforts of the high-quality NORC interviewers, there is no evidence of selective response problems that could bias analytical results. To the best of our knowledge, the NLSY79 has the best retention rate of any longitudinal survey in the U.S. We note, however, that interviewing becomes a little more difficult each round.


The other component of missing data is item nonresponse. The rate of item nonresponse due to the refusal of a respondent to answer a particular question in the NLSY79 is under 0.2 percent per question. The highest nonresponse rates occur for income and asset items.


One natural issue for longitudinal surveys is whether the sample still represents its portion of the U.S. population. The NLSY79 originally was weighted to represent the 1978 population of 14-21 year-olds and closely matches the official statistics for that year. Sampling weights are prepared after each round to adjust the remaining sample to representative proportions. These sampling weights are released with the other data on the public-use data file.
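The round-by-round reweighting can be illustrated with a minimal post-stratification sketch. The cell counts, benchmark shares, and population total below are made-up numbers for illustration, not the NLSY79's actual weighting inputs or methodology.

```python
# Minimal post-stratification sketch: scale base weights so that the
# surviving sample reproduces known population shares by demographic cell.
# All numbers below are illustrative, not official NLSY79 values.
def poststratify(counts, pop_shares, population_total):
    """counts: respondents interviewed per cell; returns the per-respondent
    weight for each cell so weighted cell totals match the benchmarks."""
    return {cell: population_total * pop_shares[cell] / n
            for cell, n in counts.items()}

counts = {"men": 3900, "women": 4100}        # interviewed this round
pop_shares = {"men": 0.498, "women": 0.502}  # external population benchmark
weights = poststratify(counts, pop_shares, 36_000_000)

# Weighted cell totals now reproduce the benchmark shares exactly.
total = sum(weights[c] * counts[c] for c in counts)
print(round(weights["men"] * counts["men"] / total, 3))  # 0.498
```

In practice the NLSY79 weighting also adjusts for the complex sample design and differential attrition across many cells, but the basic mechanism is the same: weights rise for cells that have lost disproportionately many respondents.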


To investigate the issue of continued sample representation, table 8 compares numbers from the 2000 decennial Census with NLSY79 population estimates. Census data are taken from the FactFinder.Census.Gov website, which is the official website for disseminating Census 2000 data.1 The FactFinder website provides the number of people living in the U.S. who were ages 35 to 43 on April 1, 2000, which is the same age group that the NLSY79 sample represents. NLSY79 population estimates are from the weighted results of the round 19 (year 2000) survey.


Table 8 shows the percentage of the NLSY79 sample and U.S. population by sex and race. Overall, the table has two significant features. First, the 2000 NLSY79 sample slightly overrepresents men: the percentage of men is larger in the NLSY79 sample than in the U.S. population. For comparison, the original NLSY79 sample in 1979 was composed of 50.8 percent men and 49.2 percent women, while the Census Bureau reported a 1979 population for the same age group that was 50.6 percent men and 49.4 percent women. (Source: U.S. Bureau of the Census, publication P25-917, Preliminary Estimates of the Population of the United States, by Age, Sex, and Race: 1970 to 1981; and 1979 NLSY79 data.) Hence, the current male-female composition of the NLSY79 does not suggest any gender-biased attrition; rather, it is quite similar to the sex ratio of the original panel. The comparison with the recent Census Bureau estimates may, however, suggest that the NLSY79 sample is experiencing lower male mortality than the overall U.S. population of the same age.


Second, the NLSY79 sample underrepresents the current U.S. population of Hispanics. The NLSY79 sample does not include persons who entered the United States after 1978, and the rate of immigration among Hispanics has been very high since the NLSY79 sample was selected. (The differences between the NLSY79 sample and the recent U.S. population estimates in the percentages of non-black/non-Hispanics and black non-Hispanics can be explained largely by the shortfall of Hispanics in the NLSY79 sample.) Comparing the NLSY79 sample with the U.S. population estimates for 1978, the NLSY79 sample correctly represents the Hispanic population on a weighted basis, and as described earlier in this document, the NLSY79 sample intentionally overrepresents the 1978 Hispanic population on an unweighted basis.


Overall, table 8 shows that except for the Hispanic population the NLSY79 sample is still similar to the U.S. population estimates for the same age group. If one accounts for the large amount of Hispanic immigration since the survey began, the remaining differences are not large. Moreover, the weights that are produced after each round compensate for the modestly different rates of attrition and mortality across demographic groups.


Table 8. NLSY79 Weighted Sample Composition in 2000 versus U.S. Census Data for Persons Ages 35 to 43 as of April 1, 2000

                              2000 NLSY79    Census Data
  Total                          100.0%         100.0%
    Men                           50.9%          49.8%
    Women                         49.1%          50.2%
  Non-black, non-Hispanic         79.3%          76.4%
    Men                           40.2%          38.1%
    Women                         39.1%          38.3%
  Black, non-Hispanic             14.2%          12.1%
    Men                            7.3%           5.7%
    Women                          6.9%           6.4%
  Hispanic                         6.5%          11.5%
    Men                            3.4%           6.0%
    Women                          3.1%           5.6%



4. Testing of Questionnaire Items

A comprehensive pretest of the main NLSY79 questionnaire is carried out approximately 3 months before each round of the regular survey. Assuming that our separate proposal to augment the pretest is approved, this pretest will include about 100 respondents from different racial, ethnic, geographic, and socio-economic backgrounds. On the basis of this pretest, the various questionnaire items, particularly those being asked for the first time, are evaluated with respect to question sensitivity and validity. When necessary, problem items are deleted from the final survey questionnaire.


5. Statistical Consultant

Dr. Kirk Wolter

NORC

1155 East 60th Street

Chicago, IL 60637

(312) 753-7500


A detailed statement of the sampling plan, prepared several years ago by Dr. Martin Frankel, is available from NLS User Services at CHRR. The sample design and the interviewing fieldwork are being carried out by NORC, 1155 East 60th Street, Chicago, IL, 60637.


As indicated earlier, analyses of the data collected in this survey round will be prepared for the contracting agencies by the Center for Human Resource Research, The Ohio State University, 921 Chatham Lane, Suite 100, Columbus, OH, 43221.

Attachment 1—Title 29 USC Sections 1 & 2


The Code of the Laws
of the United States of America

Title 29 - Labor



bureau of labor statistics


§1. Design and duties of bureau generally


The general design and duties of the Bureau of Labor Statistics shall be to acquire and diffuse among the people of the United States useful information on subjects connected with labor, in the most general and comprehensive sense of that word, and especially upon its relation to capital, the hours of labor, the earnings of laboring men and women, and the means of promoting their material, social, intellectual, and moral prosperity. (June 13, 1888, ch. 389, § 1, 25 Stat. 182.)


§2. Collection, collation, and reports of labor statistics


The Bureau of Labor Statistics, under the direction of the Secretary of Labor, shall collect, collate, and report at least once each year, or oftener if necessary, full and complete statistics of the conditions of labor and the products and distribution of the products of the same, and to this end said Secretary shall have power to employ any or either of the bureaus provided for his department and to rearrange such statistical work, and to distribute or consolidate the same as may be deemed desirable in the public interests; and said Secretary shall also have authority to call upon other departments of the Government for statistical data and results obtained by them; and said Secretary of Labor may collate, arrange, and publish such statistical information so obtained in such manner as to him may seem wise.


The Bureau of Labor Statistics shall also collect, collate, report, and publish at least once each month full and complete statistics of the volume of and changes in employment, as indicated by the number of persons employed, the total wages paid, and the total hours of employment, in the service of the Federal Government, the States and political subdivisions thereof, and in the following industries and their principal branches: (1) Manufacturing; (2) mining, quarrying, and crude petroleum production; (3) building construction; (4) agriculture and lumbering; (5) transportation, communication, and other public utilities; (6) the retail and wholesale trades; and such other industries as the Secretary of Labor may deem it in the public interest to include. Such statistics shall be reported for all such industries and their principal branches throughout the United States and also by States and/or Federal reserve districts and by such smaller geographical subdivisions as the said Secretary may from time to time prescribe. The said Secretary is authorized to arrange with any Federal, State, or municipal bureau or other governmental agency for the collection of such statistics in such manner as he may deem satisfactory, and may assign special agents of the Department of Labor to any such bureau or agency to assist in such collection.


history; ancillary laws and directives


Explanatory notes:

The bracketed words are substituted for “There shall be at the seat of government a Department of Labor, the general design and duties of which shall be . . .” Act Feb. 14, 1903, c. 552, § 4, 32 Stat. 826, placed the Department of Labor in the Department of Commerce and Labor. Act Mar. 18, 1904, c. 716, 33 Stat. 136, changed the name of the Department of Labor to the Bureau of Labor. Act Mar. 4, 1913, c. 141, § 3, 37 Stat. 737, transferred the Bureau of Labor from the Department of Commerce and Labor to the Department of Labor and redesignated the Bureau as the Bureau of Labor Statistics.


Transfer of functions:


1950 reorganization plan no. 6

department of labor

Section 1. Transfer of functions to the Secretary. (a) Except as otherwise provided in subsection (b) of this section, there are hereby transferred to the Secretary of Labor all functions of all other officers of the Department of Labor and all functions of all agencies and employees of such Department. (b) This section shall not apply to the functions vested by the Administrative Procedure Act (60 Stat. 237) [see 5 USCS §§ 551 et seq., 701 et seq., 3105, 3344, 5362, 7521] in hearing examiners employed by the Department of Labor.


Sec. 2. Performance of functions of Secretary. The Secretary of Labor may from time to time make such provisions as he shall deem appropriate authorizing the performance by any other officer, or by any agency or employee, of the Department of Labor of any function of the Secretary, including any function transferred to the Secretary by the provisions of this reorganization plan.


Sec. 3. Administrative Assistant Secretary. There shall be in the Department of Labor an Administrative Assistant Secretary of Labor, who shall be appointed, with the approval of the President, by the Secretary of Labor under the classified civil service, and who shall perform such duties as the Secretary of Labor shall prescribe.


Sec. 4. Incidental transfers. The Secretary of Labor may from time to time effect such transfers within the Department of Labor of any of the records, property, personnel, and unexpended balances (available or to be made available) of appropriations, allocations, and other funds of such Department as he may deem necessary in order to carry out the provisions of this reorganization plan.

Attachment 2— Commissioner's Order No. 1-06

Confidential Nature of BLS Statistics Data

  1. Purpose. The purpose of this Order is to state the Bureau of Labor Statistics (BLS) policy concerning the confidential nature of BLS statistical data.

  2. Reference Office. Office of Administration, Division of Management Systems.

  3. Authority. Secretary's Order 39-72, "Control of Data and Information Collected by the Bureau of Labor Statistics," assigns the Commissioner of Labor Statistics responsibility for confidentiality policy and procedures related to the protection of BLS data and for deciding on all requests for public disclosure of data collected by the BLS. The Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA), Title V of Public Law 107-347, establishes statutory provisions protecting the confidentiality of data collected by Federal Executive Branch agencies for exclusively statistical purposes under a pledge of confidentiality. The Workforce Investment Act of 1998, Public Law 105-220, section 309(a)(2), establishes statutory provisions protecting the confidentiality of data collected through the Federal/State Labor Market Information programs. The Federal Statistical Confidentiality Order issued by the Office of Management and Budget, 62 Federal Register 35043 (June 27, 1997), establishes a consistent government policy protecting the confidentiality interests of respondents who provide information for Federal statistical programs.

  4. Directives Affected. Commissioner's Order 3-04, "Confidential Nature of BLS Records," is replaced by this Order. In all cases where Commissioner's Order 3-04 is cited as the BLS policy, this Order is henceforth the applicable document.

  5. References. Administrative Procedure 2-05, "Responsibility for Safeguarding Confidential Information," Administrative Procedure 2-06, "Informed Consent Procedures," Commissioner's Order 3-00, "Contracts and Agreements Involving BLS Confidential Data or Privacy Act Data," Commissioner's Order 4-00, "Advance Release of Embargoed News and Data Releases," Commissioner's Order 1-05, "Authorizing Advance Access to or Publication of Non-Embargoed News and Data Releases," and Administrative Procedure 2-99, "Requests for Records Under the Freedom of Information Act" provide additional information on the BLS confidentiality policy.

  6. Definitions. For purposes of this Order:

    1. Confidential information includes:

      1. Respondent identifiable information. Any representation of information that permits the identity of the respondent to whom the information applies to be reasonably inferred by either direct or indirect means.

      2. Pre-release economic data. Statistics and analyses that have not yet officially been released to the public, whether or not there is a set date and time of release before which they must not be divulged.

        1. Embargoed data. Pre-release economic data for the Principal Federal Economic Indicators produced by the BLS. Currently, the following BLS data series have been designated by OMB as Principal Federal Economic Indicators: the Consumer Price Index, Employment Situation, Employment Cost Index, Producer Price Indexes, Productivity and Costs, Real Earnings, and U.S. Import and Export Price Indexes.

        2. Non-embargoed data. Non-embargoed data include all economic data produced by the BLS that are not designated as Principal Federal Economic Indicators. This includes statistics and analyses that have not yet officially been released to the public, whether or not there is a set date and time of release before which they must not be divulged.

    2. Respondent. A person who, or organization that, is requested or required to supply information to the BLS, is the subject of information requested or required to be supplied to the BLS, or provides that information to the BLS. A person or organization is not required to actually have provided information to BLS, or have had information provided to BLS from another source, to be considered a respondent.

    3. Statistical purposes. The description, estimation, or analysis of the characteristics of groups without identifying the individuals or organizations that comprise such groups, and the development, implementation, or maintenance of methods, procedures, or information resources that support such purposes. This definition does not include any use of respondent identifiable information for administrative, regulatory, law enforcement, adjudicatory, disclosure under the Freedom of Information Act, or other similar purposes that affect the rights, privileges, or benefits of a particular respondent.

    4. Statistical activities. The collection, compilation, processing, or analysis of data for the purpose of describing or making estimates or tabulations concerning the whole, or relevant groups or components within the economy, society, or the natural environment. Statistical activities include the development of methods or resources that support those activities, such as measurement methods, models, statistical classifications, or sampling frames.

    5. Authorized persons. Officers, employees, and agents of the BLS who are responsible for collecting, processing, or using confidential information in furtherance of statistical purposes or for the other stated purposes for which the data were collected. Authorized persons are authorized access only to confidential information that is integral to the program or project on which they work, and only to the extent required to perform their duties.

    6. Agents. Individuals who meet the definition of agent as set forth by CIPSEA and who have been designated by the BLS to perform exclusively statistical activities through an Agent Agreement.

    7. Disclose or Disclosure. The release of confidential information to anyone other than authorized persons or the respondent who provided or is the subject of the data.

    8. Advance Release. Providing a BLS news or data release (or any part or derivative of a release) to a person or organization outside the BLS prior to its official date and time of public release.

  7. Policy. In conformance with existing law and Departmental regulations, it is the policy of the BLS that:

    1. Respondent identifiable information collected or maintained by, or under the auspices of, the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that will ensure that the information will be used only for statistical purposes and will be accessible only to authorized persons.

    2. Pre-release economic data, including embargoed data, prepared for release to the public will not be disclosed or used in an unauthorized manner before they officially have been released, and will be accessible only to authorized persons.

  8. Designation of Authorized Persons. The following categories of individuals are authorized persons:

    1. BLS officers and employees who take the oath of office and who sign the BLS Employee Acknowledgment Letter when they enter on duty.

    2. Individuals designated as agents who fall within one of the following categories:

      1. State agency employees who are directly involved in the BLS/State cooperative programs, who are subject to the provisions of the BLS/State cooperative agreement, and who have signed a BLS Agent Agreement.

      2. BLS contract employees whose contract under which they are working contains provisions that includes the BLS confidentiality policy and who have signed a BLS Agent Agreement.

      3. Individuals working under the authority of a separate government entity with which the BLS has entered into a contract or other agreement that includes the BLS confidentiality policy and who have signed a BLS Agent Agreement.

      4. Researchers who are affiliated with an organization with which the BLS has entered into a contract or other agreement that includes the BLS confidentiality policy, who are working on a temporary basis on a statistical project of interest to the BLS, and who have signed a BLS Agent Agreement.

      5. Any other individuals who are affiliated with an organization with which the BLS has entered into a contract or other agreement that includes the BLS confidentiality policy. Such individuals must meet the definition of an agent under CIPSEA, and must sign a BLS Agent Agreement.

  9. Delegation of Authority for Designating Agents.

    1. The authority for designating agents for access to the confidential National Longitudinal Survey of Youth Geocode Files for statistical research is hereby delegated to the Senior Research Economist for Employment Research and Program Development.

    2. The authority for designating agents for access to the confidential Census of Fatal Occupational Injuries Research File for statistical research is hereby delegated to the Assistant Commissioner for Safety, Health, and Working Conditions.

    3. The authority for designating agents for access to all other BLS confidential information is delegated to the Associate Commissioner for the office in which the confidential information is maintained.

    4. The authority for designating agents for administrative statistical activities that involve access to confidential information is delegated to the Associate Commissioner for Administration.

    5. The authority for designating agents for the provision of contracted services to the BLS that involve access to confidential information is delegated to the Contracting Officer and the Contracting Officer's Technical Representatives assigned to oversee work on individual contracts.

    6. The authority for designating agents for the purposes of carrying out statistical activities with State agencies with which the BLS has written agreements is delegated to the Associate Commissioner for Field Operations and the Regional Commissioners.

    7. The authority for designating agents for access to BLS confidential information for authorized fellowship programs is delegated to the Associate Commissioner for Survey Methods Research.

  10. Implementation. In the execution of this general policy concerning confidential BLS records, the following requirements shall be in effect:

    1. Data collected in cooperation with another Federal or State agency for exclusively statistical purposes under a pledge of confidentiality are covered by the policy of this Order and by applicable Federal laws governing the handling of confidential information.

    2. Files maintained by another Federal or State agency that are commingled with confidential information collected by BLS for exclusively statistical purposes under a pledge of confidentiality are covered by the policy of this Order and by applicable Federal laws governing the handling of confidential information. Further, any data, including publicly available data, that are commingled with confidential information covered by this Order are to be treated as confidential and handled in accordance with this policy.

    3. Universe lists derived from data provided to the BLS for exclusively statistical purposes under a pledge of confidentiality shall be kept confidential.

    4. The survey sample composition, lists of reporters, names of respondents, and brand names shall be kept confidential, regardless of the source of such lists or names.

    5. Publications shall be prepared in such a way that they will not reveal the identity of any specific respondent and, to the knowledge of the preparer, will not allow information concerning the respondent to be reasonably inferred by either direct or indirect means.

    6. Frequency count data of establishments tabulated by the Quarterly Census of Employment and Wages (QCEW) are not considered confidential since general information about an establishment, particularly information on the establishment location and line of business (or industry) that would be used in a frequency count table, is publicly available. All other information maintained by BLS in the QCEW file, including the employment and wages of establishments, is considered confidential and must be handled in accordance with this policy and applicable Federal law.

    7. Graphical representations of data, including maps, may be disclosed to the public only if the table underlying the graphical representation meets BLS disclosure criteria.

    8. All individuals or organizations, government or private, who enter into a contract or other agreement with the BLS for the collection, processing, maintenance, or storage of data shall conform to CIPSEA and other applicable Federal laws, to the BLS confidentiality policy, to Commissioner's Order 3-00, "Contracts and Agreements Involving BLS Confidential Data or Privacy Act Data," and to all specific procedures published pursuant to this Order.

    9. Each BLS/State cooperative agreement shall designate a State official to serve as a State Cooperating Representative. The State Cooperating Representative shall act as the BLS representative for ensuring that all provisions of the BLS confidentiality policy are understood and complied with in the cooperating State agency. The State Cooperating Representative and all other State agency personnel who receive access to BLS confidential information must be designated agents of the BLS in accordance with Section 8, "Designation of Authorized Persons."

    10. Any restrictions placed by international sources upon the use of data obtained from those sources shall be observed. Also, any limitations placed by the Department of State or other agency upon the use, dissemination, or handling of data obtained through Foreign Service channels shall be observed wherever applicable.

    11. BLS officers, employees, and agents who are responsible for collecting data shall not sign any confidentiality agreements required by respondents. Such agreements may be forwarded to the Division of Management Systems for consideration. Signing of building entrance logs, which sometimes may contain confidentiality language, is allowed.

    12. Programs are responsible for complying with Disclosure Review Board (DRB) policies established under BLS Statistical Policy Directives. In addition, when specific disclosure limitation issues arise, programs are responsible for consulting with the DRB prior to disseminating potentially confidential information.

    13. In order for data obtained solely from a publicly available source to be covered under this Order, a pledge of confidentiality must be provided to the person or organization that is the subject of the information.

    14. Programs may provide data to other BLS programs, with management approval, for the statistical purposes of data reconciliation.

    15. Under limited circumstances, advance release of pre-release economic data is permitted with the authorization of the Commissioner. Advance release of embargoed data is permitted only under the conditions set out in Commissioner's Order 4-00, "Advance Release of Embargoed News and Data Releases." Advance release of non-embargoed data is permitted only under the conditions set out in Commissioner's Order 1-05, "Authorizing Advance Access to or Publication of Non-Embargoed News and Data Releases."

  11. Exceptions Under Conditions of Informed Consent. Exceptions to the general policy relating to the disclosure of confidential information set forth in Section 7, "Policy," or to the provisions listed in Section 10, "Implementation," shall be granted only under the conditions of informed consent. Proposed informed consent arrangements shall be developed in consultation with the Division of Management Systems and must be authorized by the Commissioner prior to implementation in accordance with Administrative Procedure 2-06, "Informed Consent Procedures."

  12. Assignment of Responsibility.

    1. The Commissioner of Labor Statistics approves all confidentiality policies and procedures related to the protection of BLS confidential information and decides all requests for public disclosure of data collected by the BLS.

    2. The Associate Commissioner for Administration is assigned responsibility for the following:

      1. Developing and overseeing all BLS-wide policies and procedures for the safe handling of BLS confidential information.

      2. Ensuring BLS-wide compliance with confidentiality laws, policies, and procedures.

      3. Overseeing the development and implementation of regular confidentiality training for all BLS employees and agents.

      4. Serving as a BLS Disclosure Officer deciding on requests for public disclosure of BLS confidential information under the Freedom of Information Act (FOIA) and for establishing BLS-wide procedures for the handling of requests for records under FOIA.

    3. All Associate Commissioners are responsible for ensuring full compliance with all confidentiality laws, policies, and procedures within their organization.

  13. Disciplinary Actions. It is the policy of the BLS to enforce the provisions of this Order to the full extent of its authority. Any unauthorized disclosure or use of confidential information by a BLS officer or employee may constitute cause for the BLS to take disciplinary action against that officer or employee including, but not limited to, reprimand, suspension, demotion, or removal. Any unauthorized disclosure or use of confidential information by a BLS contractor or other agent may constitute cause for removal from further work under the contract or other agreement through which access to confidential information is authorized or termination of the contract or other agreement. Furthermore, a knowing and willful disclosure by a BLS officer, employee, or agent of respondent identifiable information collected for exclusively statistical purposes under a pledge of confidentiality would be a violation of CIPSEA and potentially other applicable Federal laws that carry criminal fines and penalties.

  14. Effective Date. This Order is effective immediately.

PHILIP L. RONES
Acting Commissioner of Labor Statistics


Attachment 3—Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA)


Sec. 501. Short Title.

This title may be cited as the "Confidential Information Protection and Statistical Efficiency Act of 2002".

Sec. 502. Definitions.

As used in this title:

  1. The term "agency" means any entity that falls within the definition of the term "executive agency" as defined in section 102 of title 31, United States Code, or "agency", as defined in section 3502 of title 44, United States Code.

  2. The term "agent" means an individual—

    1.  

      1. who is an employee of a private organization or a researcher affiliated with an institution of higher learning (including a person granted special sworn status by the Bureau of the Census under section 23(c) of title 13, United States Code), and with whom a contract or other agreement is executed, on a temporary basis, by an executive agency to perform exclusively statistical activities under the control and supervision of an officer or employee of that agency;

      2. who is working under the authority of a government entity with which a contract or other agreement is executed by an executive agency to perform exclusively statistical activities under the control of an officer or employee of that agency;

      3. who is a self-employed researcher, a consultant, a contractor, or an employee of a contractor, and with whom a contract or other agreement is executed by an executive agency to perform a statistical activity under the control of an officer or employee of that agency; or

      4. who is a contractor or an employee of a contractor, and who is engaged by the agency to design or maintain the systems for handling or storage of data received under this title; and

    2. who agrees in writing to comply with all provisions of law that affect information acquired by that agency.

  3. The term "business data" means operating and financial data and information about businesses, tax-exempt organizations, and government entities.

  4. The term "identifiable form" means any representation of information that permits the identity of the respondent to whom the information applies to be reasonably inferred by either direct or indirect means.

  5. The term "nonstatistical purpose"

    1. means the use of data in identifiable form for any purpose that is not a statistical purpose, including any administrative, regulatory, law enforcement, adjudicatory, or other purpose that affects the rights, privileges, or benefits of a particular identifiable respondent; and

    2. includes the disclosure under section 552 of title 5, United States Code (popularly known as the Freedom of Information Act) of data that are acquired for exclusively statistical purposes under a pledge of confidentiality.

  6. The term "respondent" means a person who, or organization that, is requested or required to supply information to an agency, is the subject of information requested or required to be supplied to an agency, or provides that information to an agency.

  7. The term "statistical activities"

    1. means the collection, compilation, processing, or analysis of data for the purpose of describing or making estimates concerning the whole, or relevant groups or components within, the economy, society, or the natural environment; and

    2. includes the development of methods or resources that support those activities, such as measurement methods, models, statistical classifications, or sampling frames.

  8. The term "statistical agency or unit" means an agency or organizational unit of the executive branch whose activities are predominantly the collection, compilation, processing, or analysis of information for statistical purposes.

  9. The term "statistical purpose"

    1. means the description, estimation, or analysis of the characteristics of groups, without identifying the individuals or organizations that comprise such groups; and

    2. includes the development, implementation, or maintenance of methods, technical or administrative procedures, or information resources that support the purposes described in subparagraph (A).

Sec. 503. Coordination and Oversight of Policies.

  1. In General.—The Director of the Office of Management and Budget shall coordinate and oversee the confidentiality and disclosure policies established by this title. The Director may promulgate rules or provide other guidance to ensure consistent interpretation of this title by the affected agencies.

  2. Agency Rules.—Subject to subsection (c), agencies may promulgate rules to implement this title. Rules governing disclosures of information that are authorized by this title shall be promulgated by the agency that originally collected the information.

  3. Review and Approval of Rules.—The Director shall review any rules proposed by an agency pursuant to this title for consistency with the provisions of this title and chapter 35 of title 44, United States Code, and such rules shall be subject to the approval of the Director.

  4. Reports.—

    1. The head of each agency shall provide to the Director of the Office of Management and Budget such reports and other information as the Director requests.

    2. Each Designated Statistical Agency referred to in section 522 shall report annually to the Director of the Office of Management and Budget, the Committee on Government Reform of the House of Representatives, and the Committee on Governmental Affairs of the Senate on the actions it has taken to implement sections 523 and 524. The report shall include copies of each written agreement entered into pursuant to section 524(a) for the applicable year.

    3. The Director of the Office of Management and Budget shall include a summary of reports submitted to the Director under paragraph (2) and actions taken by the Director to advance the purposes of this title in the annual report to the Congress on statistical programs prepared under section 3504(e)(2) of title 44, United States Code.

Sec. 504. Effect on Other Laws.

  1. Title 44, United States Code.—This title, including amendments made by this title, does not diminish the authority under section 3510 of title 44, United States Code, of the Director of the Office of Management and Budget to direct, and of an agency to make, disclosures that are not inconsistent with any applicable law.

  2. Title 13 and Title 44, United States Code.—This title, including amendments made by this title, does not diminish the authority of the Bureau of the Census to provide information in accordance with sections 8, 16, 301, and 401 of title 13, United States Code, and section 2108 of title 44, United States Code.

  3. Title 13, United States Code.—This title, including amendments made by this title, shall not be construed as authorizing the disclosure for nonstatistical purposes of demographic data or information collected by the Census Bureau pursuant to section 9 of title 13, United States Code.

  4. Various Energy Statutes.—Data or information acquired by the Energy Information Administration under a pledge of confidentiality and designated by the Energy Information Administration to be used for exclusively statistical purposes shall not be disclosed in identifiable form for nonstatistical purposes under—

    1. section 12, 20, or 59 of the Federal Energy Administration Act of 1974 (15 U.S.C. 771, 779, 790h);

    2. section 11 of the Energy Supply and Environmental Coordination Act of 1974 (15 U.S.C. 796); or

    3. section 205 or 407 of the Department of the Energy Organization Act of 1977 (42 U.S.C. 7135, 7177).

  5. Section 201 of Congressional Budget Act of 1974.—This title, including amendments made by this title, shall not be construed to limit any authorities of the Congressional Budget Office to work (consistent with laws governing the confidentiality of information the disclosure of which would be a violation of law) with databases of Designated Statistical Agencies (as defined in section 522), either separately or, for data that may be shared pursuant to section 524 of this title or other authority, jointly in order to improve the general utility of these databases for the statistical purpose of analyzing pension and health care financing issues.

  6. Preemption of State Law.—Nothing in this title shall preempt applicable State law regarding the confidentiality of data collected by the States.

  7. Statutes Regarding False Statements.—Notwithstanding section 512, information collected by an agency for exclusively statistical purposes under a pledge of confidentiality may be provided by the collecting agency to a law enforcement agency for the prosecution of submissions to the collecting agency of false statistical information under statutes that authorize criminal penalties (such as section 221 of title 13, United States Code) or civil penalties for the provision of false statistical information, unless such disclosure or use would otherwise be prohibited under Federal law.

  8. Construction.—Nothing in this title shall be construed as restricting or diminishing any confidentiality protections or penalties for unauthorized disclosure that otherwise apply to data or information collected for statistical purposes or nonstatistical purposes, including, but not limited to, section 6103 of the Internal Revenue Code of 1986 (26 U.S.C. 6103).

  9. Authority of Congress.—Nothing in this title shall be construed to affect the authority of the Congress, including its committees, members, or agents, to obtain data or information for a statistical purpose, including for oversight of an agency's statistical activities.

Subtitle A—Confidential Information Protection

Sec. 511. Findings and Purposes.

  1. Findings.—The Congress finds the following:

    1. Individuals, businesses, and other organizations have varying degrees of legal protection when providing information to the agencies for strictly statistical purposes.

    2. Pledges of confidentiality by agencies provide assurances to the public that information about individuals or organizations or provided by individuals or organizations for exclusively statistical purposes will be held in confidence and will not be used against such individuals or organizations in any agency action.

    3. Protecting the confidentiality interests of individuals or organizations who provide information under a pledge of confidentiality for Federal statistical programs serves both the interests of the public and the needs of society.

    4. Declining trust of the public in the protection of information provided under a pledge of confidentiality to the agencies adversely affects both the accuracy and completeness of statistical analyses.

    5. Ensuring that information provided under a pledge of confidentiality for statistical purposes receives protection is essential in continuing public cooperation in statistical programs.

  2. Purposes.—The purposes of this subtitle are the following:

    1. To ensure that information supplied by individuals or organizations to an agency for statistical purposes under a pledge of confidentiality is used exclusively for statistical purposes.

    2. To ensure that individuals or organizations who supply information under a pledge of confidentiality to agencies for statistical purposes will neither have that information disclosed in identifiable form to anyone not authorized by this title nor have that information used for any purpose other than a statistical purpose.

    3. To safeguard the confidentiality of individually identifiable information acquired under a pledge of confidentiality for statistical purposes by controlling access to, and uses made of, such information.

Sec. 512. Limitations on Use and Disclosure of Data and Information.

  1. Use of Statistical Data or Information.—Data or information acquired by an agency under a pledge of confidentiality and for exclusively statistical purposes shall be used by officers, employees, or agents of the agency exclusively for statistical purposes.

  2. Disclosure of Statistical Data or Information.—

    1. Data or information acquired by an agency under a pledge of confidentiality for exclusively statistical purposes shall not be disclosed by an agency in identifiable form, for any use other than an exclusively statistical purpose, except with the informed consent of the respondent.

    2. A disclosure pursuant to paragraph (1) is authorized only when the head of the agency approves such disclosure and the disclosure is not prohibited by any other law.

    3. This section does not restrict or diminish any confidentiality protections in law that otherwise apply to data or information acquired by an agency under a pledge of confidentiality for exclusively statistical purposes.

  3. Rule for Use of Data or Information for Nonstatistical Purposes.—A statistical agency or unit shall clearly distinguish any data or information it collects for nonstatistical purposes (as authorized by law) and provide notice to the public, before the data or information is collected, that the data or information could be used for nonstatistical purposes.

  4. Designation of Agents.—A statistical agency or unit may designate agents, by contract or by entering into a special agreement containing the provisions required under section 502(2) for treatment as an agent under that section, who may perform exclusively statistical activities, subject to the limitations and penalties described in this title.

Sec. 513. Fines and Penalties.

Whoever, being an officer, employee, or agent of an agency acquiring information for exclusively statistical purposes, having taken and subscribed the oath of office, or having sworn to observe the limitations imposed by section 512, comes into possession of such information by reason of his or her being an officer, employee, or agent and, knowing that the disclosure of the specific information is prohibited under the provisions of this title, willfully discloses the information in any manner to a person or agency not entitled to receive it, shall be guilty of a class E felony and imprisoned for not more than 5 years, or fined not more than $250,000, or both.

Subtitle B—Statistical Efficiency

Sec. 521. Findings and Purposes.

  1. Findings.—The Congress finds the following:

    1. Federal statistics are an important source of information for public and private decision-makers such as policymakers, consumers, businesses, investors, and workers.

    2. Federal statistical agencies should continuously seek to improve their efficiency. Statutory constraints limit the ability of these agencies to share data and thus to achieve higher efficiency for Federal statistical programs.

    3. The quality of Federal statistics depends on the willingness of businesses to respond to statistical surveys. Reducing reporting burdens will increase response rates, and therefore lead to more accurate characterizations of the economy.

    4. Enhanced sharing of business data among the Bureau of the Census, the Bureau of Economic Analysis, and the Bureau of Labor Statistics for exclusively statistical purposes will improve their ability to track more accurately the large and rapidly changing nature of United States business. In particular, the statistical agencies will be able to better ensure that businesses are consistently classified in appropriate industries, resolve data anomalies, produce statistical samples that are consistently adjusted for the entry and exit of new businesses in a timely manner, and correct faulty reporting errors quickly and efficiently.

    5. The Congress enacted the International Investment and Trade in Services Act of 1990 that allowed the Bureau of the Census, the Bureau of Economic Analysis, and the Bureau of Labor Statistics to share data on foreign-owned companies. The Act not only expanded detailed industry coverage from 135 industries to over 800 industries with no increase in the data collected from respondents but also demonstrated how data sharing can result in the creation of valuable data products.

    6. With subtitle A of this title, the sharing of business data among the Bureau of the Census, the Bureau of Economic Analysis, and the Bureau of Labor Statistics continues to ensure the highest level of confidentiality for respondents to statistical surveys.

  2. Purposes.—The purposes of this subtitle are the following:

    1. To authorize the sharing of business data among the Bureau of the Census, the Bureau of Economic Analysis, and the Bureau of Labor Statistics for exclusively statistical purposes.

    2. To reduce the paperwork burdens imposed on businesses that provide requested information to the Federal Government.

    3. To improve the comparability and accuracy of Federal economic statistics by allowing the Bureau of the Census, the Bureau of Economic Analysis, and the Bureau of Labor Statistics to update sample frames, develop consistent classifications of establishments and companies into industries, improve coverage, and reconcile significant differences in data produced by the three agencies.

    4. To increase understanding of the United States economy, especially for key industry and regional statistics, to develop more accurate measures of the impact of technology on productivity growth, and to enhance the reliability of the Nation's most important economic indicators, such as the National Income and Product Accounts.

Sec. 522. Designation of Statistical Agencies.

For purposes of this subtitle, the term "Designated Statistical Agency" means each of the following:

  1. The Bureau of the Census of the Department of Commerce.

  2. The Bureau of Economic Analysis of the Department of Commerce.

  3. The Bureau of Labor Statistics of the Department of Labor.

Sec. 523. Responsibilities of Designated Statistical Agencies.

The head of each of the Designated Statistical Agencies shall—

  1. identify opportunities to eliminate duplication and otherwise reduce reporting burden and cost imposed on the public in providing information for statistical purposes;

  2. enter into joint statistical projects to improve the quality and reduce the cost of statistical programs; and

  3. protect the confidentiality of individually identifiable information acquired for statistical purposes by adhering to safeguard principles, including—

    1. emphasizing to their officers, employees, and agents the importance of protecting the confidentiality of information in cases where the identity of individual respondents can reasonably be inferred by either direct or indirect means;

    2. training their officers, employees, and agents in their legal obligations to protect the confidentiality of individually identifiable information and in the procedures that must be followed to provide access to such information;

    3. implementing appropriate measures to assure the physical and electronic security of confidential data;

    4. establishing a system of records that identifies individuals accessing confidential data and the project for which the data were required; and

    5. being prepared to document their compliance with safeguard principles to other agencies authorized by law to monitor such compliance.

Sec. 524. Sharing of Business Data Among Designated Statistical Agencies.

  1. In General.—A Designated Statistical Agency may provide business data in an identifiable form to another Designated Statistical Agency under the terms of a written agreement among the agencies sharing the business data that specifies—

    1. the business data to be shared;

    2. the statistical purposes for which the business data are to be used;

    3. the officers, employees, and agents authorized to examine the business data to be shared; and

    4. appropriate security procedures to safeguard the confidentiality of the business data.

  2. Responsibilities of Agencies Under Other Laws.—The provision of business data by an agency to a Designated Statistical Agency under this subtitle shall in no way alter the responsibility of the agency providing the data under other statutes (including section 552 of title 5, United States Code (popularly known as the Freedom of Information Act), and section 552b of title 5, United States Code (popularly known as the Privacy Act of 1974)) with respect to the provision or withholding of such information by the agency providing the data.

  3. Responsibilities of Officers, Employees, and Agents.—Examination of business data in identifiable form shall be limited to the officers, employees, and agents authorized to examine the individual reports in accordance with written agreements pursuant to this section. Officers, employees, and agents of a Designated Statistical Agency who receive data pursuant to this subtitle shall be subject to all provisions of law, including penalties, that relate—

    1. to the unlawful provision of the business data that would apply to the officers, employees, and agents of the agency that originally obtained the information; and

    2. to the unlawful disclosure of the business data that would apply to officers, employees, and agents of the agency that originally obtained the information.

  4. Notice.—Whenever a written agreement concerns data that respondents were required by law to report and the respondents were not informed that the data could be shared among the Designated Statistical Agencies, for exclusively statistical purposes, the terms of such agreement shall be described in a public notice issued by the agency that intends to provide the data. Such notice shall allow a minimum of 60 days for public comment.

Sec. 525. Limitations on Use of Business Data Provided by Designated Statistical Agencies.

  1. Use, Generally.—Business data provided by a Designated Statistical Agency pursuant to this subtitle shall be used exclusively for statistical purposes.

  2. Publication.—Publication of business data acquired by a Designated Statistical Agency shall occur in a manner whereby the data furnished by any particular respondent are not in identifiable form.

Sec. 526. Conforming Amendments.

  1. Department of Commerce.—Section 1 of the Act of January 27, 1938 (15 U.S.C. 176a) is amended by striking "The" and inserting "Except as provided in the Confidential Information Protection and Statistical Efficiency Act of 2002, the".

  2. Title 13.—Chapter 10 of title 13, United States Code, is amended—

    1. by adding after section 401 the following:

      "Sec. 402. Providing business data to Designated Statistical Agencies"

      "The Bureau of the Census may provide business data to the Bureau of Economic Analysis and the Bureau of Labor Statistics ('Designated Statistical Agencies') if such information is required for an authorized statistical purpose and the provision is the subject of a written agreement with that Designated Statistical Agency, or their successors, as defined in the Confidential Information Protection and Statistical Efficiency Act of 2002."; and

    2. in the table of sections for the chapter by adding after the item relating to section 401 the following:

      "402. Providing business data to Designated Statistical Agencies.".

Attachment 4—Survey Applications


A. Use of the NLSY79 for Diffusion of Useful Information on Labor


The NLSY79 is widely used by state, federal, and local government agencies; universities; news media and foundations; and other private organizations. The broad, omnibus nature of this data set reflects the charge to the Bureau of Labor Statistics to “...acquire and diffuse among the people of the United States useful information on subjects connected with labor, in the most general and comprehensive sense of that word, and especially upon its relation to capital, the hours of labor, the earnings of laboring men and women, and the means of promoting their material, social, intellectual, and moral prosperity” (Title 29 USC, Section 1).


Data from the NLS are also used in congressional testimony. For example, in 1991, testimony was given before a Senate Budget subcommittee on the problems of poverty, delinquency, and cognitive development among children from disadvantaged families.


B. Use of the NLSY79 for Examination of Department of Labor Employment and Training Programs


The Youth Employment and Demonstration Projects Act of 1977 and the 1978 amendments to CETA added several new programs to those designed to upgrade the skills of unemployed, underemployed, and economically disadvantaged youth. Beginning in FY 1984, the Job Training Partnership Act (JTPA) introduced further programmatic changes. Overall, 14.5 percent of the NLSY79 population had been involved in one of these programs by the 1983 survey date. Because the NLSY79 oversamples minorities and the economically disadvantaged, the surveys contain substantial numbers of persons who participated in these programs, as well as many others who were eligible to participate but did not enroll. The data from the NLSY79 allow estimates of the proportion of eligible youth who participated in various types of programs. Further, the longitudinal nature of the NLSY79 permits comparison over time of the experiences of participants and eligible nonparticipants. Studies are under way to measure the long-term consequences of training programs. The NLSY79 sample has also been used as a control group in evaluations of other Department of Labor programs.


With respect to the gains from participation over time, a wide variety of outcomes have been examined, including earnings, welfare and other transfer payment receipt, weeks of employment and unemployment, aspirations, job satisfaction, quality of working conditions, length of schooling, on-the-job training received, and job search activities. A variety of individual characteristics can be used to statistically “match” the participants and the comparison group. These include school status, prior labor force experience, family characteristics, marital status, race, sex, ethnic background, and several social-psychological variables, to mention only a few. Data will continue to be collected on local labor market conditions and type of employment and training services received. This information will help ascertain how these variables influence the impact of program participation.
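The matching approach described above can be sketched in miniature as follows. This is an illustrative example only, not NLSY79 production code; the covariate names, records, and values are hypothetical, and a real analysis would use many more characteristics and a formal matching estimator.

```python
# Illustrative sketch (hypothetical data): pair each program participant with
# the most similar eligible nonparticipant on observed characteristics, so
# that outcomes can be compared across matched pairs.

def nearest_match(participant, pool, covariates):
    """Return the record in `pool` closest to `participant` in squared
    Euclidean distance over the listed covariates."""
    return min(
        pool,
        key=lambda person: sum(
            (participant[c] - person[c]) ** 2 for c in covariates
        ),
    )

# Hypothetical records: years of schooling, weeks of prior work experience,
# and an hourly-wage outcome.
participants = [{"school": 11, "exper": 20, "wage": 9.50}]
nonparticipants = [
    {"school": 11, "exper": 24, "wage": 8.75},
    {"school": 16, "exper": 90, "wage": 15.00},
]

# Match on schooling and experience; the matched-pair wage difference is a
# crude stand-in for the gain associated with participation.
match = nearest_match(participants[0], nonparticipants, ["school", "exper"])
gain = participants[0]["wage"] - match["wage"]
```

In practice, matching on many covariates at once is usually done through a propensity score or similar summary measure rather than raw distances, but the pairing logic is the same.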


In addition, participants have been asked their reactions to the program in which they participated, how it helped them in the labor market, and what facets of the program could have been improved. Responses can be related to individuals’ characteristics and backgrounds and to the specific services they received in an effort to discover why people choose to participate in employment and training programs, what services are being provided, how well they are received, who drops out of the programs, and what might be done to improve the programs.


The longitudinal nature of the NLSY79 offers at least three important advantages for the analysis of employment and training programs. First, and most important, the surveys can provide a long-term assessment of the effects of participation in government employment and training programs on a variety of outcomes, including employment, education and high school completion or equivalence, and reliance on welfare and other government support programs as required by JTPA. Second, the analysis is facilitated by being able to aggregate participants over time. Such aggregation results in a larger pool of participants and permits analysis of the effectiveness of programs for relatively small subgroups. Third, the service mix of employment programs has changed in recent years, with more emphasis on training and less on subsidized employment. The impact of these changes can ultimately be measured.


C. Use of the NLSY79 in Understanding Labor Markets


1. Orientation toward the Labor Market

In the NLSY79, we have repeatedly asked respondents about their education, training, and labor force behavior. In earlier years we also collected data on aspirations and expectations for the future. As our respondents have advanced in their careers, we can examine the degree to which their aspirations have been met.


By virtue of the extensive event history on work behavior that has been collected, we can examine some of the most important (but hardest to answer) questions about the evolution of careers in the United States. We have data on temporary, consultant, and contract work status. In recent years this mode of work has expanded rapidly and many have voiced concern about the impact of these tenuous employer relationships on the careers of workers. We can trace these relationships and measure the extent to which these tenuous relationships do or do not lead to more stable work arrangements. Is this another form of job search with employers following a conservative strategy to ascertain a job match? In 2002, we added a new employer supplement with questions tailored to these nontraditional workers. Because respondents will answer questions more appropriate to their situation instead of the questions on regular employment, we hope to obtain better data and facilitate further investigation of this topic. Respondents gave us positive feedback on this new section in 2002, and we retained the questions for nontraditional workers in 2004 and beyond.


With the reports of layoffs and downsizing that appear in the popular press, many are wondering about the impact of a layoff on workers’ careers. If there is a large return to employer-employee specific matches, then layoffs represent a significant degradation of the stock of match-specific capital in the workplace. On the other hand, if rewards to experience are not employer specific, then layoffs will have a less serious impact on the earnings of persons being downsized. The issue becomes one of the rate of return to tenure versus the rate of return to experience, and the NLSY79 is the best data set available to examine such issues since it tracks mobility among employers over a long period of time in great detail.


As many of the military respondents in this cohort approach retirement, we will be able to examine the extent to which military skills translate to civilian labor market earnings.


2. Factors in Educational Progress

The NLSY79 continues to yield detailed information on the progress of respondents in GED programs, college, and graduate school that is being used to provide answers to a number of policy-related questions concerning both the causes and consequences of premature school termination and the effects of post-secondary education. While the sample will be 43 to 50 years of age at the beginning of 2008, it will still include adults who have delayed educational completion or who are returning to school to supplement earlier education and employment experiences. NLSY79 data, both historical and current, help researchers to address the following questions:


(a) What are the long-term consequences for high school students who withdraw without obtaining a diploma? What is the relative importance of such factors as differences in ability, differences in motivation, and differences in the economic status of the young adults and their families? Research completed with these data suggests that, for both male and female youth, dissatisfaction with schooling is a more important reason for leaving school than employment- or income-related reasons. After controlling for socio-economic differences, minority groups continue to have above-average high school non-completion rates. Linkages between these factors and child-specific characteristics can also now be considered using the recently collected cognitive, emotional, and physiological data about the children of the female respondents.


(b) Are high school dropouts at a disadvantage compared with high school graduates in terms of earnings and occupational status as of mid-career? Do these differences narrow and/or disappear over time, or do they persist? In general, have declining labor market opportunities for semi-skilled and unskilled workers affected the relative wages for these groups? Cross-cohort comparisons of young men and women in the original NLS cohorts (interviewed in the late 1960s) with young adults in this cohort can directly address this important issue. In addition, the high school graduates and dropouts now have sufficient post-school employment records to clarify these critical issues. Comparisons with NLSY97 respondents will reveal how the effect of dropping out of high school may have changed in the last two decades.


(c) The availability of high school transcript records and Armed Services Vocational Aptitude Battery scores for these youth greatly expands the utility of the interview data for measuring and tracking qualitative differences in the patterns of regular schooling of these youth in relation to outcomes in mid-career. High school diplomas encompass a highly variable range of academic standards, course requirements, and learning achievements. Given that the respondents attended over 3,000 high schools (public, private, inner city, suburban, and rural), detailed evaluation of the impact of different high school curricula and programs of study is possible.


The presence of data on these factors makes possible more sophisticated rates-of-return models than those that simply rely on years of schooling completed. For example, by including both the AFQT score and high school diploma receipt in the same wage equation, one can more accurately separate the two effects of high school completion on labor market success: diploma certification and human capital development.
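An earnings equation of the following illustrative form (the notation is ours, not a specification taken from the survey documentation) shows how the two channels can be separated:

```latex
\ln w_i = \alpha + \beta_1 S_i + \beta_2 D_i + \beta_3 \mathrm{AFQT}_i
        + \mathbf{x}_i'\boldsymbol{\gamma} + u_i
```

Here \(S_i\) denotes years of schooling completed, \(D_i\) is an indicator for high school diploma receipt, \(\mathrm{AFQT}_i\) is the Armed Forces Qualification Test score, and \(\mathbf{x}_i\) contains other controls. Conditional on measured skill, \(\beta_2\) captures the certification ("sheepskin") effect of the diploma, while \(\beta_3\) captures the return to the underlying human capital.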


(d) What is the long-term impact of out-of-school job-related training? Does it tend to widen or narrow the differences between graduates and dropouts at the time of the first job? To what extent is work experience while attending high school complementary to post-schooling jobs? More specifically, how successful are the work-study cooperative programs both for preparing the student for jobs after graduation and for keeping him/her in school? What evidence, if any, is there that work outside school hours affects retention rates in school? What are the factors that affect the match between post-school jobs and field of study while in college? Is there a greater mismatch between school training and out-of-school employment in a loose rather than in a tight labor market, and does this have an effect on dropout rates? In early 2008, this cohort will be 43 to 50 years of age. As all members of this sample are well beyond high school age, we can examine the relationship between training, educational attainment, and employment success for fully representative samples of this cohort who completed high school during different phases of the business cycle.


3. Transition from School to Work

A critical area of research relates to the processes of early accommodation of youth to the labor market. This involves studying the nature of the bridge between formal education and training and the establishment of relatively stable attachments to given types of work, including experience with temporary part-time jobs while in school and the early exploration of alternatives after leaving school. The continuing high unemployment rates among youth during schooling and in the several years following its termination, as well as high rates of job mobility during the early post-school years, suggest potentially serious social and economic problems for our society. However, little is known about the actual magnitude of this problem and its long-run implications for the individual. Issues that can be researched include the following:


(a) To what extent are variations in the extent and character of youth’s employment while in high school explained by characteristics of the school and of the local labor market, parental financial status, and social-psychological characteristics of the youth (including their attitudes toward school)?


(b) Are individuals with certain socio-economic characteristics likely to enter and be trapped in low-paying jobs in which traditional human capital variables appear to be irrelevant, or does the labor market operate so as to sort individuals out among jobs equitably in terms of their productive capabilities? What are the processes of mobility, post-school training, work experience, and modification of goals that result in youth settling into long-term career jobs?


(c) Does unsatisfactory experience (e.g., extensive unemployment) in the immediate post-school years leave “scars” that affect later labor market behavior and experience, or are these problems essentially transitory, with no lasting effects? Research on the potential “scarring” effects of unemployment has been completed for the original NLS cohort of Young Men. It is now possible to undertake comparable analyses for this cohort who attained adulthood in the 1980s and to compare the experiences of NLSY79 respondents with those in the newer NLSY97 cohort.


(d) What relationship, if any, is there between the high school experience, including work activity, and the post-school labor market activity of non-college-bound youth? Possible research topics focus on whether labor market success depends on the extent or type of work experience and whether the smoothness of the school-to-work transition affects later labor market outcomes.


(e) How does the curriculum of students—whether vocational, college preparatory, or general—affect their later labor market success? Early analyses of the NLSY79 indicated that vocational and academic curricula have similar payoffs in the immediate post-high school period. A longer perspective, possible only with a longitudinal data set, is necessary, however. This cohort has a number of years of post-school employment experience that supports meaningful research on this critical question.


4. The Work Environment

The earlier National Longitudinal Surveys relied heavily on traditional economic variables to characterize the types of jobs held by respondents, e.g., earnings, hours worked, occupation, and industry. The NLSY79 also captures other dimensions of employment, including transportation time to work, perceived hazards of the work site, the respondent’s perception of relationships with supervisors and co-workers, and his/her assessment of the long-term possibilities of the job.


These kinds of information make possible a much more penetrating analysis of the character of work experience. For example, the following kinds of questions can be addressed.


(a) To what extent may increased responsibility and improvement in job content occur without being reflected in a change in the job title or a change in employer, the conventional indicators of “job change”? In addition, what is the character of the extensive job changes made by adults? Does “job hopping” result in progressively better jobs along all of the dimensions described above, or does it simply represent a string of equally poor employment opportunities? How does the answer to this question vary depending on the sex, race, ethnicity, and socio-economic and psychological characteristics of the respondent? What is the effect of variation in the economic environment? What is the role of job-specific experience relative to general labor market experience in explaining earnings growth?


(b) What kinds of jobs do people consider desirable, and at what wage rates? What is the extent of variation in this regard by sex, race, ethnicity, and socio-economic status? What are the causes of job satisfaction or dissatisfaction? Does job dissatisfaction lead to job mobility? Research already completed shows that there are systematic differences in the desired characteristics of jobs at entry versus those at mid-life, at least among the non-college population. Young adults entering the labor market are thought to be more concerned with job security, perhaps because of their tendency to be in unstable positions, while mid-life workers are more concerned with promotions. This sample is well suited for studying this evolution of attitudes. Gender differences include a greater emphasis by women than men on job significance and good interpersonal relationships.


5. Racial, Sex, and Cultural Differences in Employment and Earnings

One of the principal purposes of the NLSY79 study is to examine racial, sex, and ethnic differences in employment and earnings. At a descriptive level, gross differences in employment and earnings among various race, sex, and ethnic groups have been identified. In addition, multivariate techniques are currently being used to ascertain the underlying factors responsible for these gross differences. For example, human capital theory suggests that an individual’s earning power in the labor market will reflect the effects of various types of human capital investments. Consequently, earnings should be significantly related to educational attainment, total work experience in the labor market, and tenure on the current job. In addition, various studies have identified a number of other factors that appear to be significantly related to earnings, e.g., ability (IQ), class of worker, health status, size of place of residence, and region of residence.


The kinds of analyses described above for the NLSY79 cohort have in many cases already been done for the original NLS cohorts. Consequently, several areas of considerable interest can be examined by comparing the new and old cohorts. Further, the questions asked of the NLSY97 respondents will permit comparisons with an even younger cohort in the next several years. For example, changes in the returns to various kinds of human capital investments for different groups can be measured. The extent of labor force attachment (and related labor market outcomes) among young women in the NLSY79 cohort compared with that of their counterparts in the predecessor cohorts has already been investigated. Additional research has considered changes reflecting demographics (e.g., changes in cohort sizes due to the baby boom), social change (e.g., the impact of the women’s liberation movement), and the state of the economy. The answers to these questions are important for helping to guide public policy.


6. The Relationships between Economic and Social Factors and Family Transitions and Well-Being

In recent years, nearly 20 percent of all births were to teenage mothers. Nearly 40 percent of these births were premarital, and nearly two-thirds of the mothers had not completed high school. Past research suggests that these women have much poorer prospects than those who have children later: teenage mothers receive less education, have more children, and have a higher risk of divorce and of becoming dependent on public assistance. For young men as well as young women, early parenthood may curtail the amount of education they receive and reduce their earnings potential. What are the implications of these early behaviors for the long-term development of adults?


The NLSY79—particularly as it has been enhanced by the detailed pregnancy histories incorporated into the fourth and subsequent waves—makes it possible to study a variety of issues relating to these problems:


(a) What are the cultural, familial, attitudinal, and economic factors that increase the chances of early childbearing, early marriage, and separation or divorce? How have these causal relationships changed over time, as indicated by comparisons of the 1979 youth cohort, the 1966 and 1968 youth cohorts, and the newer NLSY97 cohort? Research with the NLSY79 data has already documented important changes in the relationship between early childbirth and early school leaving and how early pregnancy is associated with a variety of family and outside influences. This research has documented the importance of alternate education programs, such as the GED, in helping mothers attain secondary school credentials.


(b) What are the long-term social and economic consequences of early childbearing, marriage, and divorce? How do these effects vary according to sex, race, ethnicity, and socio-economic status? As NLSY79 respondents reach mid-career, the availability of nearly 25 years of data permits significant exploration of these long-term effects in later life.


(c) For individuals who assume the responsibilities of marriage and child rearing at early ages, and for young mothers whose marriages dissolve, what kinds of public interventions have been effective in promoting economic independence? What are the potential roles of the provision of childcare, counseling, access to continuing education, and job training? Increasing numbers of researchers are utilizing the NLSY79 data set to explore these important policy-relevant questions.


(d) What are the implications of marital turbulence for mid-life outcomes? Do the effects of divorce depend upon when the divorce comes and the length of the marriage it terminated? What are the implications of marital status and especially divorce for measures of income equality?


(e) The updating of the fertility histories in conjunction with the supplementary data on infant nutrition and maternal and child health included in the 1983 and subsequent survey rounds has permitted a careful causal examination of the longitudinal dimensions of childbearing, infant health and care, and the employment and employability of mothers. From an employment perspective, the data collected in the sixth (1984) and subsequent waves permit an examination of the associations between having had a recent birth, the infant’s health, and the ability of the mother to enter the labor force or maintain employment if she is already working. Her pattern of employment can be related to her education and training background and prior marital and fertility experiences. The longer-term research horizons made possible by data extending through the forthcoming 2008 survey, including its supplement on child outcomes, are highlighted below.


7. The Geographic Mobility of Young Baby Boomers

The NLSY79 is being used to examine in detail the associations between geographic mobility, local and national levels of economic activity, and social, economic, and demographic characteristics of this cohort and their families. The longitudinal survey design, in conjunction with the interviewing of respondents regardless of where they move, enables researchers to model the determinants and consequences of geographic movement. In particular, the rich attitudinal content of the survey permits inferences with respect to the relative strength of economic motives in migration. This research is, however, only now becoming feasible, as it requires the accumulation of residence records over a number of years. As of the 2008 wave, the respondents will be 43-50 years of age, and nearly all will have left their parental households, completed their education, and formed their own households. Preliminary research has already documented the levels of mobility for this cohort and their linkages with leaving school and early family and employment transitions. This exploratory research highlights a more complete mobility research agenda that can be accomplished with this data set. Research examining the economic consequences of geographic moves, particularly how those moves relate to employment and unemployment experiences, is becoming increasingly feasible. Because respondents are interviewed even if they leave the United States, studies of emigration (or return migration to the country of origin) are also possible.


8. The Measurement and Analysis of Gross Changes in Labor Market Status

The NLSY79 permits quantification of gross changes in many aspects of the labor force status of young baby boomers. The oversampling of blacks and Hispanics permits comparative analyses of labor force transition patterns for male and female adults. A wide variety of background information also permits a careful examination of the extent to which variations in labor force behavior reflect differences in backgrounds, ethnic characteristics, and differential access to schooling. Patterns of labor force continuity and discontinuity for the various groups can be examined in great detail, and the social and economic costs of the variations in work attachment can be analyzed.


From a descriptive perspective, a variety of types of mobility of adults can be quantified: movement into and out of the labor force and between employment and unemployment, movements between jobs, and movement between full- and part-time employment. Moreover, the relationship between these changes and changes in school enrollment status, demographic events, and work attitudes can be analyzed. Examination of changes in labor force and employment status in relationship to changing levels of national and local unemployment permits the testing of the “discouraged worker” and the “additional worker” hypotheses and an analysis of a variety of dimensions of frictional and “disguised” unemployment. By 2008, the adults in this sample will have been followed through a variety of economic climates, permitting a more careful examination of the extent to which these gross flows are sensitive to cyclical and regional variations in economic conditions.


Also, by contrasting the patterns of labor force dynamics of the original NLS samples of young men and women with the patterns of the NLSY79 cohort, the question of whether or not the relationships between these transitions and levels of economic activity have changed over the past decade can be considered. Finally, one is able to examine whether or not demographic and socio-economic factors such as marriage, childbearing, and changes in family income levels show the same association with gross labor force movement as was true in earlier decades.


D. Use of the NLSY79 for Social Indicators Analysis


Data derived from the NLSY79 used in conjunction with data from the Young Men (1966) and Young Women (1968)—the “original” youth cohorts—and from the NLSY97 represent a unique means of measuring certain dimensions of social change among young American adults. The NLSY79 cohort can be matched with comparable nationally representative cross-sections of male and female young adults in the 1960s from the original NLS cohorts and with today’s young adults in the NLSY97.


We are now able to measure trends in school attrition, labor force entries and exits, and family transitions. A variety of attitudinal measures, toward work, school, and home, are available for intertemporal comparative purposes. From a more purely economic perspective, patterns of labor force behavior and experience of the cohorts can be compared. This kind of analysis permits insights into such questions as (a) the extent to which the draft and the Vietnam War conditioned labor market experiences of young men during the late 1960s, (b) the extent to which the labor market experience of the earlier cohort reflected the impact of their large numbers relative to the total labor force, and (c) the degree to which changing attitudes about the appropriate role of women have influenced the educational and labor market experience of the current group of women.


The 1960s youth cohorts and NLSY79 already have been used in this manner to compare early fertility patterns, work attitudes, and working propensities of youth in the late 1960s and late 1970s. Comparison of this cohort with the earlier NLS cohorts of young people has shown dramatic changes in work expectation over the intervening decade. In particular, the proportion of young women who anticipate being out of the labor force at age 35 has been reduced by over half. Young women still anticipate traditionally female occupations, but there has been a marked shift in aspirations from clerical to professional careers. At the same time, more young men are aspiring to jobs in the skilled trades than was the case in the 1960s. As NLSY97 respondents navigate the school-to-work transition, similar comparisons with that cohort will be possible.


1. Delinquent Behavior, Arrest Records, and School Discipline

The inclusion of self-reported delinquent behavior, school discipline, and arrest records in the 1980 NLSY79 interview has permitted examination of the effects of these deviant behaviors on adolescent employment activity. Combined with subsequent data on employment and education, these data permit examination of the extent to which (1) a sustained pattern of delinquent activities through adolescence and early adulthood is related to employment difficulties and (2) early deviant behaviors may be causally associated with a disposition toward excessive alcohol usage in later adolescence and adulthood. Several specific areas can be explored:


(a) What are the long-term effects of delinquency on adult employment? How many adults with prior arrest records are in the labor force? Is prior official contact with the law in itself a barrier to employment, over and above the effects of factors leading to delinquent behavior? Are there differences in the employment implications of adolescent misbehavior for adults from different social strata or different ethnic groups? Are particular patterns of delinquent behavior associated with different patterns of employment? In this regard, recently completed research suggests that the relationship between illegal activity and employment does vary according to the type of crime involved. Among youth out of school, young men who engage in violent activities have trouble getting and keeping jobs, resulting in less time employed and more time unemployed than their more peaceable counterparts.


(b) How do the factors associated with deviant behavior affect the performance and outcomes of subsequent government education and training programs? To what degree have such programs reached youth with police records? What implications does delinquency have for the accumulation of skills and/or education? Do youth with school discipline problems face special difficulties in acquiring employment-related skills? What effect does a criminal record have on school completion? High school dropouts have relatively high levels both of self-reported illegal behavior and of criminal records. To what extent does delinquency or criminal records contribute to the employment problems of dropouts? For young women, in particular, how does a delinquency record interact with early school leaving and early pregnancy and motherhood?


2. Drug and Alcohol Use

Funding from NIAAA has provided for collection of eight waves (1982–85, 1988–89, 1992, 1994) of alcohol use data, and funds from NIDA permitted collection of drug use information in 1984, 1988, 1992, 1994, and 1998. The 2002 and 2006 surveys included a short series of questions on current alcohol use, which we plan to include on a rotating basis in future rounds including the 2008 survey. The pregnancy history section collects data on substance abuse during pregnancy. Together, these sections profile the substance use patterns of these adults, a particularly important population. Evidence from drug abuse agencies indicates an increasing frequency of polydrug abuse, but the dynamics of such abuse in the general population are unknown. The children and young adult offspring of the NLSY79 mothers represent a particularly important group, since substance abuse during these formative years may impede successful transitions into adult roles. Having substance use information on the NLSY79 permits research into a number of important areas:


(a) What are the patterns of drug and alcohol use among this population? Information on drug use can be used to look at persistence and change in drug use patterns over time. What are the correlates of drug use? How do drug use patterns vary across ethnic groups and social classes? Which young people are most likely to persist in drug use? Of particular interest is research on use of various combinations of drugs and alcohol, and the relationships between these combinations and successful life cycle transitions.


Research already completed indicates that there are sharp differences in the levels of alcohol use between men and women and between black men and other men. Women and black men report much lower levels of alcohol use, and especially much lower frequency of heavy drinking, than do white or Hispanic men. The data set will permit examination of the demographics of changes in alcohol use patterns over time and the impact of marriage, school, and parenthood transitions on drinking.


(b) Do labor market conditions, particularly high unemployment rates, affect the incidence and prevalence of drug use? Can we predict which unemployed will turn to drugs, based on their background characteristics, and conversely how drug use may lead to unemployment? The repeated waves of information on alcohol use will allow causal inferences to be made, controlling for levels of alcohol use preceding spells of unemployment.


(c) Use of alcohol and some drugs is an integral part of social life among wide segments of society, yet both are known contributors to major social problems. The data set may permit researchers to distinguish between socially acceptable and socially destructive patterns and combinations of drug use.


E. Use of the NLSY79 to Measure Maternal and Child Inputs and Outcomes


For many years, NICHD has provided funds for the collection of detailed fertility histories for the respondents as well as for a variety of supplemental materials on maternal and infant health. These data include (but are not limited to) a comprehensive longitudinal battery on childcare, infant feeding practices, maternal health care during pregnancy (including the use of cigarettes or alcohol during the pregnancy), the availability and use of maternity leave, employment before and after a birth, and maternal and infant health care during the first year of life. In summary, we have available a data set that links economic, social, and health-related behaviors and attitudes in a continuing longitudinal context, in a manner superior to that of any other existing data set. The availability of these data elements over time, across generations, within and across families, and for minorities is unique, and they offer opportunities for basic as well as policy-oriented research relevant to the needs and interests of several agencies and many academics. The juxtaposition of all the dimensions specified above permits analyses of issues of critical concern to our society and its government. The data permit a careful examination of the ways in which family structure, parents’ employment, childcare, economic well-being, and family background intersect to affect the well-being of young children and their families.


Several previous survey rounds gathered information on a number of outcome measures relating to the children of the female respondents. These data greatly enhance the utility of the overall data set for examining a variety of issues relating to the impact that family background and parents’ employment, earnings, and other personal characteristics and behaviors have on critical dimensions of child and adolescent development. Many of these cognitive and socio-emotional assessments will be repeated in 2008.


This sociometric data collection enormously enhances the utility of the overall data set for researchers and practitioners in a number of disciplines, including medicine, economics, psychology, and sociology. The availability of these child outcome measures permits researchers to address in detail the following critically important research agenda. For the most part, other data sets do not permit researchers to comprehensively address these issues in an appropriate analytical manner.


1. Research Issues Linking Employment, Income, and Child Outcomes

The following abbreviated list indicates some of the more important employment and income-related issues that can be addressed using these child outcome measures.


(a) Linkages between the extensive employment histories, other maternal behaviors, and child outcomes permit researchers to carefully and uniquely consider a variety of policy-relevant issues. First, and most directly, what impact does the patterning, extensiveness, and type of a woman’s employment during pregnancy and in the period after a birth have in the short and long run on an infant’s or child’s physical, emotional, and intellectual well-being? The NLSY79 includes great detail not only on the extensiveness of a woman’s employment before and after a birth, but also on the woman’s satisfaction with that work and the physical demands of the job. Paralleling this direct employment information, we have available detailed information about the income, earnings, and assets situation of the family unit as well as whether the woman has access to maternity leave. From a health perspective, we know about the extent to which the woman used prenatal health care services, and, to some degree, her own health status and health care (e.g., cigarette and alcohol use, general health status, weight gain). Finally, we have some knowledge about post-birth experiences, the infant’s birth weight and length of gestation, the mother’s infant-feeding practices, childcare practices, and medical care during the first year of life. Thus, with the information about the child’s cognitive, physical, and emotional well-being, many researchers are considering questions like the following: What is the effect of a mother’s employment on a child’s well-being? How is this association mediated by the myriad of social, economic, and family factors typically associated with a woman’s employment? This is an issue of fundamental importance in contemporary American society.


(b) Several issues that are intimately interwoven with the general association between employment and child well-being can also be carefully evaluated. First, to what extent does infant feeding inhibit a woman’s ability to work in the immediate post-birth period or, conversely, to what extent does female employment inhibit nursing during a child’s first year of life? There have been substantial increases in breastfeeding among American mothers during the past two decades, and issues relating to the relationship between feeding practices, employment behavior, and child health outcomes are of great contemporary social and political concern. In addition, these associations vary between mothers and children from different socio-economic, racial, and ethnic backgrounds.


(c) Since 1986, the NLSY79 has included a wealth of information on the childcare practices of young mothers. Contrary to popular opinion, most childcare arrangements are relatively informal, not in highly structured public or private day care centers. The NLSY79 includes information not only on the nature of the arrangement (including location, type, and costs) but also great detail about the actual family structure where more casual within-family arrangements exist. Thus, incorporation of child outcomes into the survey permits a much more comprehensive examination of the effects of various childcare arrangements (including care by the child’s mother) on child outcomes for a full cross-section of American mothers, independent of all the related social and economic factors that normally confound such analyses. The children in the survey cover a full spectrum of family and childcare environments. That is, for family situations with either one or two parents present, we can define relatively large samples of children where (1) the mother is at home, (2) the mother is absent but the child is watched by a relative or non-relative in the home, (3) the child is cared for outside of the home by relatives or non-relatives in household situations, or (4) the child is cared for in more formal public or private day care environments. It is possible to contrast the development (and, for those of school age, early school success) of children who followed different family/childcare arrangement paths, controlling for other relevant social and economic background factors. Such analysis also clarifies the effect of a mother’s employment on her children and suggests more nearly optimal strategies for enhancing a child’s well-being in an environment where mothers with young children need or want to work.


(d) It has been estimated that as many as 50 percent of contemporary marriages will end in separation or divorce. Reflecting this phenomenon, a substantial number of children will spend at least part of their childhood in one-parent or blended households. This is particularly true for the children of women who begin marriage and childbearing at a young age. The NLSY79 child data set permits careful analysis not only of the effect on children of early marriage and childbearing, but also of the implications of single parenthood for the emotional, physiological, and cognitive development of the child. The wealth of economic and educational background information permits researchers to separate the direct and indirect effects of single parenthood per se on child development from the effects of differences between single and married parents in economic well-being, including the employment status of the parent and other family members. From a non-economic perspective, the longitudinal dimensions of the data make it possible to clarify whether younger mothers and/or single parents are intrinsically different in their child-raising patterns or whether differences between these mothers and their later-marrying or currently married counterparts reflect their family circumstances. Research on this topic can follow the NLSY79 children into adolescence and early adulthood, examining the long-term effects of family disruption. From a policy perspective, it is possible to measure the extent to which AFDC/TANF, other transfer payments, and federally sponsored education and training programs ultimately translate into improved outcomes for the children of single parents. An important related area of research is the extent to which repeat childbearing by these mothers further complicates child raising, as reflected in less satisfactory child outcomes.


(e) A major objective of the NLSY79 child data-collection effort has been to increase our knowledge about the technology of child development. Adopting the notion of a production function from economics, one can think of the determinants of cognitive achievement and socio-affective traits as inputs that, when combined in particular proportions, produce a child development output. These inputs are in part subject to parental choice (e.g., the amount of parental time spent with the child) and are in part endowed (e.g., innate talents). The inputs that can be varied are chosen by parents according to some objective, and are subject to whatever financial and social constraints they face. The unique feature of this survey is that it contains by far the most comprehensive set of inputs available for the study of child development.


These data have enabled researchers to estimate the relative contribution of different inputs to child development outcomes. An important implication of the behavioral model is that input choices depend on the endowment level; these endowments may be observed to some extent by the parents, but they are not observed by the researcher. Thus, if families with highly endowed children behave differently, for example, spend less time with their children, the estimated effects of those inputs will be contaminated by their relationship to endowments; spending time with children will appear to have a smaller (presumably positive) impact than it actually does. Because we survey all children in the same family, the difficulty that arises from variation in family-specific endowments can be handled statistically: we can look at the relationship between within-family variation in inputs and within-family variation in child development outcomes. In other words, we can see how differences in the treatment (inputs) of different children in the same family are related to differences in their outcomes. If families know the endowments of their individual children, and allocate resources differentially based on those child-specific endowments, then it is necessary to observe the same child over time in order to obtain correct estimates of the technology. Extensive research using this perspective has been completed.
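The production-function logic described above can be sketched formally. In a minimal illustration (the notation is ours, not the survey’s), let y denote a development outcome for child i in family f, x the vector of observed inputs, and the family-level endowment a term observed at least partly by parents but not by the researcher:

```latex
% Child development "production function" with an unobserved family endowment
y_{if} = \beta' x_{if} + \mu_f + \varepsilon_{if}
% If parents choose inputs x_{if} partly in response to \mu_f, regressing
% y on x in levels yields biased estimates of \beta. Differencing two
% siblings (i = 1, 2) in the same family removes the shared endowment:
y_{1f} - y_{2f} = \beta'\,(x_{1f} - x_{2f}) + (\varepsilon_{1f} - \varepsilon_{2f})
```

As the text notes, this within-family contrast handles only family-specific endowments; if parents also allocate inputs in response to child-specific endowments, repeated observations of the same child over time are required to identify the technology.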


In this view, the role of economic variables such as income, wage rates, and prices is to alter the level of inputs chosen. Thus, women with high wage rates will work more and be likely to allocate somewhat less time to children while possibly substituting other inputs like educational toys or specialized childcare services. These behavioral, as opposed to technological, relationships can also be explored with these data. Moreover, although the methodology is not well developed at this point, the data can in principle be used to understand the effect of unanticipated (by the parents) child development outcomes on behavior. There are no other data sets that can be used to study these complex interactions.


(f) Having a full range of psychological inputs/outcomes permits researchers to model contemporary female labor supply in a more comprehensive manner than has ever been possible. Since the physical, cognitive, and social development patterns of the children are measured as well as the parental inputs to this development process, it will be feasible to measure the extent to which the relative “success” (i.e., successful development) of children affects mothers’ hours and patterns of work as well as use of childcare. Many of the dimensions of childcare “quality” that are central to female labor supply analyses, but for which only very crude proxies are usually available, are measured with great precision in the NLSY79. Thus, trade-offs between hours of female employment and hours spent in “quality” childcare can be more appropriately defined.


Given that we have “developed ability” measures for both mothers and children, it is possible to address more directly and comprehensively issues related to the quality versus quantity of childcare, in particular the effects of and reasons for substituting quality (own-time input) for quantity. Further, the selection of this particular trade-off may be associated with specific characteristics of women, as measured by “developed ability,” education, family structure, or other factors.


(g) An important but still limited literature in social psychology strongly suggests that the actual characteristics of a woman’s job, and the extent to which she is satisfied with her employment, may have a greater effect on a child’s development than whether or not the mother is employed or frequently absent from the home. The more important dimension may be the quality of the mother-child interaction rather than the quantity of time spent together. First, a mother who is satisfied with her employment (or non-employment, for that matter) will probably have a better relationship with her children, which should translate into more positive social and perhaps intellectual traits in the child.


Second, the nature and characteristics of the job per se, such as whether the mother has supervisory responsibilities, holds a job requiring extensive thinking, or works rigidly controlled hours, affect the values that a mother will transmit to her children. Such values can have a major effect on a child’s social and emotional traits and ultimately on the child’s educational and vocational development. These issues are becoming increasingly important as more mothers work, yet our knowledge about the psychological impact of a mother’s employment on the mother-child interaction process and on child outcomes remains slight. The NLSY79 child data set is being used extensively to address these important issues.


(h) From a social as well as a cost-benefit perspective, it is possible to examine to what extent social intervention programs for aiding the poor have been effective in helping mothers work or learn skills and helping to narrow child outcome differentials between more advantaged and less economically advantaged children. In addition to the voluminous education and training program data available, the NLSY79 collects information on sick and well care received by infants, and on whether or not this care was received in a publicly funded facility. Thus, it will be possible to sort out which kinds of children (in terms of economic, racial, and geographic backgrounds) have received assistance, and subsequently, how this assistance has affected development as well as early progress in school.


The rich body of data on transfer payments that has been collected since the inception of the survey makes it possible to examine whether or not a variety of presumably health-related inputs have affected the health of the mother and her children in the household. For example, information on the receipt of AFDC/TANF payments, food stamps, and other welfare payments is available. It is possible to measure whether or not these federal- and state-sponsored inputs to family well-being have any direct or indirect effects on the health of the mother or on the physical or cognitive development of her children.


(i) A research area of considerable interest, which has been handicapped by the inadequacy of available data, concerns the extent to which intact family units meet childcare needs by staggering the employment hours of the father and mother, as well as how different patterns of employment may affect child outcomes. At one extreme, what is the effect of having a father care for a child full-time while the mother works; at the other, what is the effect of the more traditional pattern in which the mother remains at home while the father works? Between these two extremes lies a continuum of parental employment combinations, in conjunction with the potential availability of childcare from other family members who may be present. How might these different patterns affect the psychological development of the child?


(j) The assessment material collected in a single year permits researchers to consider the level of a child’s intellectual and socio-emotional development in relation to the full range of background information available. Properly measuring how child development is linked with other factors, however, requires repeating the developmental outcome measures at more than one point in time. That is, at the disaggregated micro level of analysis, it is essential to measure how changes in intellectual or socio-emotional development are linked to other attributes and to changes in those attributes. For this reason, the 2008 survey round will repeat many of the child cognitive achievement and socio-emotional measures. It is less important to repeat the assessments that are generally considered to have a large aptitude (as opposed to achievement) component, since normed scores on these assessments should remain relatively stable over time. For this reason, all children in the survey will repeat the assessments elaborated on in the section entitled “Summary of Child Aptitude Measures to Be Used” of Attachment 7 for which they are age-eligible in 2008. These assessments will be given in the same manner as in previous years; the protocols are indicated in the mother and child supplements found in Attachment 11. See Table 9 below for the age groups eligible to take each test.


Table 9. Child Assessment Tests and Eligible Age Groups

Test                                     Eligible Age Group

Child Supplement
    Digit Span                           7–11 years [1]
    PIAT, Math and Reading               5–14 years
    PPVT                                 4–5 and 10–11 years [1]

Mother Supplement
    The HOME                             0–14 years
    How My Child Acts (Temperament)      0–6 years
    Motor & Social Development           0–3 years
    Behavior Problems                    4–14 years

[1] If not completed in a prior round. However, all 10- and 11-year-olds will repeat the PPVT and Digit Span assessments in 2008.



2. Other Research Issues Relating Family Structure and Child Outcomes

The above research issues are meant to be suggestive of some of the important employment-related questions that can be resolved using the NLSY79 and NLSY79 child data. Equally important is the large number of medically and social-psychologically based research issues that can be comprehensively addressed. Many of the following issues are of fundamental importance in helping program and policy makers in the social and health fields make better informed judgments about the most appropriate distribution of federal funds. The following are several important areas of research in the social-medical sphere that can be effectively addressed with the addition of the child attribute outcomes:


(a) At a most basic level of analysis, it is possible to carefully examine differentials in child development between black, white, and Hispanic children, controlling for the many social and economic factors known to differ between the racial and ethnic groups. For example, to what extent do differential child outcomes reflect economic differences (including mother’s employment differentials) between black, white, and Hispanic families?


(b) At a second level of analysis, it is possible to measure the extent to which these differences in child outcomes are associated with differentials in other critical intervening factors. These include differences in maternal health care during pregnancy; the mother’s use of cigarettes, drugs, or alcohol; employment during pregnancy; and infant health care (including the propensity to nurse) during the first year of life. Other important intervening variables that can, of course, be affected by many of these health-related factors are the birth weight of the child, the length of gestation, and the mother’s weight change during pregnancy as well as her weight at the start of the pregnancy. To our knowledge, the NLSY79 is the only data set that includes all of these critical inputs and outcomes for representative samples of younger American women of all races and ethnic groups.


(c) Another research area focuses on the extent to which the various child outcomes are correlated with one another and the extent to which they, separately or jointly, affect childhood educational outcomes. Our ability to answer these questions is enhanced because we have gathered child developmental outcome measures and school outcome information for a number of years. Such data permit researchers to untangle the direction of causality between the various psychological components (cognitive, socio-affective, and physical) and school outcomes. These factors can obviously reinforce each other over time, as success or lack of success in school can affect social and cognitive development, which can in turn affect school success. These phenomena can be explored more thoroughly within a longitudinal context.


Also, research can consider the extent to which the effect of social or cognitive development on school outcomes is conditioned by other factors in the child’s contemporaneous situation or earlier background. For example, we have already considered how a mother’s employment may affect a child’s cognitive or social development. The issue being raised here is whether a given cognitive state (e.g., level of intelligence) translates into different school outcomes depending on various dimensions of the mother’s employment. Similarly, how do a family’s economic well-being, its structure (including the number of siblings and the presence of a spouse), the educational level of the parents, and so on affect the relationship between measured intelligence and school outcomes? From a program perspective, the answers to these questions bear on whether program funds and social interventions should be aimed at the stage of the child’s life cycle where particular cognitive conditions are being enhanced or at the point where those aptitudes are presumably being translated into school success or failure.


(d) We have already addressed early childbearing and marital status and their effects on children from an economic perspective. From a social-psychological perspective, the data set permits a careful examination of the effect that early childbearing can have on later child outcomes. At this time, we can examine long-term child outcomes into later adolescence and early adulthood. Information on whether the child was “wanted” by the mother at the point of conception is available, as are a myriad of intervening health care variables. Indeed, completed research with the NLSY79 suggests that young mothers who did not want a particular pregnancy at the time they conceived were less likely to begin prenatal care early in pregnancy and, on average, had slightly lower birth weight babies, a factor known to be strongly associated with above-average levels of infant health problems and infant mortality. Comprehensive modeling of the temporal progression from early childbearing through maternal and infant health to later child outcomes, incorporating relevant economic and family factors, permits researchers to resolve a number of important questions. For instance, they can address how much of an independent effect early childbearing has on child outcomes per se, and how much of the effect reflects the possibility that the characteristics of early childbearers and their environments are less conducive to satisfactory child development.


Related to this issue, there is considerable interest in the health care and social demography community about the joint effects of early childbearing and early marriage on child development. Because many of the children who were born to young mothers are now adolescents or young adults, we can examine this issue using a long-term perspective.


(e) The NLSY79 sample includes a large number of sister pairs where both sisters have been interviewed. This unique sampling element permits researchers to carefully examine the extent to which common origin effects and intervening economic, social, and health behaviors by young women separately and jointly impact on child outcomes. For example, it is possible to follow sister pairs who come out of the same environment and examine the extent to which they follow similar or different behavior paths (e.g., early employment, smoking, drinking, general health care), as well as witness how these factors translate into different or similar child outcomes. This unique analysis permits far stronger statements to be made regarding the relative influence of innate traits versus the influence of environment than is usually possible. It also permits researchers to pinpoint the influence of specific health care practices as determinants of early child development.


(f) One of the most common social problems in contemporary American society is adolescent childbearing and its linkage with premature school termination, as well as with a host of other social problems of adolescence. While the dimensions of this problem have been well described, information on its causes is inadequate. For example, the extent to which early pregnancy and early school leaving may be intimately linked with prior attitudes and behaviors formed inside and outside of the home has not been well defined. Clearly, many children raised in difficult environments have a relatively problem-free adolescence, whereas others encounter many difficulties. Extensive research on this topic is underway. The NLSY79 represents a unique data-collection vehicle for gathering information that can provide important insights into this adolescent development process.


The NLSY79 data set already includes a vast battery of information about the children’s maternal and family background, as well as cognitive, emotional, and physiological assessment information. As already noted, the 2008 interview round is planned to update the achievement and emotional assessments administered in 2006 and previous rounds. The 2008 survey round also will repeat a battery of questions (addressed both to children age 10 and over and to their mothers) about a variety of issues related to the child’s interaction with parents and peers, his or her school success, and his or her evolving sexuality. This information provides important insights into how prior family and maternal behaviors are linked with a variety of pre-adolescent and adolescent outcomes, and how those outcomes are linked with long- and short-term changes in cognitive and emotional development. The continued collection of these data about children’s patterns of interaction with parents and peers provides important information about, and insights into, the processes associated with the transition to adolescence, and how social, intellectual, and physiological factors may impede or contribute to early sexual activity or premature school termination.


The above list of major research themes is not intended to be exhaustive; rather, it is meant to suggest how the child outcome measures augment the value of the data set for policy-relevant research of interest to many government agencies. The comprehensive longitudinal database on employment, training, income, and family background for a large national sample, together with the heavy overrepresentation of minority respondents, makes the NLSY79 a unique data set for considering these issues. The child aptitude tests broaden immensely the scope of the labor force research that has traditionally been the focus of the survey. Many previously unanswerable questions about the effects of women’s employment on family life are now answerable. Equally important, the data provide significant clarification of the extent to which women’s employment is affected by family, specifically by the presence of children.


Attachment 5—Analysis of Content of Interview Schedules


The purpose of this attachment is to indicate the relevance of the interview schedule to one or more of the specific research objectives of the study. We do this by means of a matrix in which the interview schedule sections are listed in the stub and the research objectives that have been described in Attachment 4 are set forth in the box heads. A blank cell in a particular column means that the subject matter of that section may not be immediately relevant to that particular research question. An “X” indicates that items in the section can be used as either an independent or dependent variable in the analysis.

MATRIX: Rationale for Questionnaire Subject Matter


Questionnaire Sections

1. Examination of Department of Labor Employment and Training Programs

2. Aspirations and Expectations

(a) Examination of employment and training program impact

(b) Estimate of employment and training program participation

(c) Reactions of participants in employment and training programs

(a) Realism of aspirations

(b) Changes in aspirations over time

1. Household Interview






2. Family Background






A. Migration






B. Religion






3. Marital History

X



X

X

4. Schooling

X

X

X

X

X

5. Military




X

X

6., 7. On Jobs, Employer Supplement

X

X


X

X

8. Gaps in Employment

X

X


X

X

9. Training

X

X

X



10. Spouse/Partner Employment






11. Fertility






A. Fertility History/Child Residence

X

X


X

X

B. Maternal/Infant Health Update




X

X

12. Childcare


X


X


13. Health






A. Health of R/Family

X


X

X

X

B. Alcohol Use



X

X

X

14. Income and Assets

X

X

X

X

X

15. Retirement Expectations




X

X

Mother/Child Supp. (Child Assessments)




X

X

Child Self-Admin. Supp. (ages 10–14)

X


X

X

X

Young Adult Interview (ages 15–20)

X

X

X

X

X

MATRIX: Rationale for Questionnaire Subject Matter


Questionnaire Sections

3. Factors in Educational Progress

4. Transition from School to Work

(a) Retention rates

(b) Post-school job entry

(c) Effects of in-school experiences

(a) Curriculum choice and its effects

(b) Extent of employment in high school

(c) Effects of employment in high school

1. Household Interview







2. Family Background







A. Migration







B. Religion







3. Marital History

X

X

X

X

X

X

4. Schooling

X

X

X

X

X

X

5. Military


X

X

X



6., 7. On Jobs, Employer Supplement




X

X

X

8. Gaps in Employment


X

X

X

X

X

9. Training


X

X

X

X

X

10. Spouse/Partner Employment







11. Fertility







A. Fertility History/Child Residence

X

X

X


X

X

B. Maternal/Infant Health Update

X

X





12. Childcare

X

X




X

13. Health







A. Health of R/Family

X

X

X

X

X

X

B. Alcohol Use

X

X

X

X

X

X

14. Income and Assets

X

X

X

X

X

X

15. Retirement Expectations




X



Mother/Child Supp. (Child Assess.)



X


X

X

Child Self-Admin. Supp. (ages 10–14)

X

X

X

X



Young Adult Interview (ages 15–20)

X

X

X

X

X

X

MATRIX: Rationale for Questionnaire Subject Matter


Questionnaire Sections

5. Work Environment

6. Racial, Sex, and Cultural Differences in Employment and Earnings

(a) Job hopping among youths and young adults

(b) Occupational mobility and job characteristics

(c) Job satisfaction among youths and young adults

1. Household Interview





2. Family Background





A. Migration

X

X

X

X

B. Religion





3. Marital History

X

X

X

X

4. Schooling

X

X

X

X

5. Military


X



6., 7. On Jobs, Employer Supplement

X

X

X

X

8. Gaps in Employment

X

X


X

9. Training

X

X

X

X

10. Spouse/Partner Employment





11. Fertility





A. Fertility History/Child Residence

X

X

X

X

B. Maternal/Infant Health Update



X


12. Childcare

X

X

X


13. Health





A. Health of R/Family

X

X

X

X

B. Alcohol Use

X

X

X

X

14. Income and Assets

X

X

X

X

15. Retirement Expectations

X

X

X

X

Mother/Child Supp. (Child Assessments)


X

X

X

Child Self-Admin. Supp. (ages 10–14)



X


Young Adult Interview (ages 15–20)

X

X

X

X

MATRIX: Rationale for Questionnaire Subject Matter


Questionnaire Sections

7. Relationships between Economic and Social Factors
and Family Transitions and Well-being

8. Geographic Mobility

(a) Causes of “undesirable” behavior

(b) Consequences of “undesirable” behavior

(c) Effects of public programs

(d) Health determinants and consequences

(a) Causes

(b) Consequences

1. Household Interview

X

X

X

X



2. Family Background







A. Migration





X

X

B. Religion







3. Marital History

X

X

X

X

X

X

4. Schooling

X

X

X

X

X

X

5. Military





X

X

6., 7. On Jobs, Employer Supplement



X




8. Gaps in Employment


X


X

X

X

9. Training



X

X

X


10. Spouse/Partner Employment







11. Fertility







A. Fertility History/Child Residence

X

X


X

X

X

B. Maternal/Infant Health Update


X

X

X

X

X

12. Childcare


X





13. Health







A. Health of R/Family


X

X

X

X


B. Alcohol Use

X

X


X


X

14. Income and Assets

X

X

X

X

X

X

15. Retirement Expectations





X

X

Mother/Child Supp. (Child Assess.)

X

X

X

X


X

Child Self-Admin. Supp. (ages 10–14)

X

X


X

X


Young Adult Interview (ages 15–20)

X

X

X

X

X

X

MATRIX: Rationale for Questionnaire Subject Matter


Questionnaire Sections

9. Gross Changes in Labor Market Status

10. Social Indicators

11. Effects of Military Service

(a) Effect of military on civilian labor market

(b) Value of military service in civilian labor market

(c) Military wage and employment policies

1. Household Interview


X




2. Family Background






A. Migration

X


X



B. Religion


X




3. Marital History

X

X

X


X

4. Schooling

X

X

X

X

X

5. Military



X

X

X

6., 7. On Jobs, Employer Supplement



X

X


8. Gaps in Employment

X

X

X

X

X

9. Training

X

X

X

X


10. Spouse/Partner Employment

X





11. Fertility






A. Fertility History/Child Residence

X

X

X

X

X

B. Maternal/Infant Health Update

X





12. Childcare


X




13. Health






A. Health of R/Family

X

X

X



B. Alcohol Use

X

X

X



14. Income and Assets

X

X

X

X

X

15. Retirement Expectations

X


X


X

Mother/Child Supp. (Child Assessments)


X



X

Child Self-Admin. Supp. (ages 10–14)






Young Adult Interview (ages 15–20)

X

X

X

X


MATRIX: Rationale for Questionnaire Subject Matter


Questionnaire Sections

12. Delinquent Behavior, Arrest Records,
and School Discipline

13. Drug and Alcohol Use

14. Maternal and Child Inputs and Outcomes

(a) Effects of problem behavior on employment and family

(b) Effects of problem behavior on education and training

(a) Pattern of drug use

(b) Effect of labor market conditions

(c) Occupational patterns of drug users

(a) Employment related

(b) Non-employment related

1. Household Interview

X





X

X

2. Family Background








A. Migration








B. Religion








3. Marital History

X

X

X

X

X

X

X

4. Schooling

X

X

X

X

X

X

X

5. Military

X

X

X

X

X

X

X

6., 7. On Jobs, Employer Supplement

X


X

X

X

X

X

8. Gaps in Employment

X

X

X

X

X

X

X

9. Training

X

X

X

X

X

X


10. Spouse/Partner Employment

X


X

X

X

X

X

11. Fertility








A. Fertility History/Child Residence

X

X

X

X


X

X

B. Maternal/Infant Health Update

X

X

X


X

X

X

12. Childcare

X

X

X

X


X

X

13. Health








A. Health of R/Family

X

X

X

X

X

X

X

B. Alcohol Use

X

X

X

X

X

X

X

14. Income and Assets

X

X

X

X

X

X

X

15. Retirement Expectations








Mother/Child Supp. (Child Assess.)

X

X

X

X

X

X

X

Child Self-Admin. Supp. (ages 10–14)

X

X

X

X


X

X

Young Adult Interview (ages 15–20)

X

X

X

X

X

X

X

Attachment 6—New Questions and Lines of Inquiry


As mentioned earlier in this document, BLS has undertaken a continuing redesign effort to examine the current content of the NLSY79 and provide direction for changes that may be appropriate as the respondents enter middle age. Based on the 1998 redesign conference and subsequent discussions, as well as our experiences in 2000–2006, the 2008 instrument reflects a number of changes recommended by experts in various social science fields and by our own internal review of the survey’s content. The major changes are described in this attachment. Additions to the questionnaire have been balanced by deletions of previous questions so that the overall time required to complete the survey should remain about the same.


Additions/Modifications


Assets. After round 19, we determined that it was not necessary to ask an extended series of assets questions in every survey round. The assets module was excluded for rounds 20 and 22 and included in round 21. It will again be rotated into the survey in round 23, using the same questions as round 21. These questions, which appear after income in the survey, are numbered SC.1-SC.12b, FA.1-FA.11a, Q13.116-Q13.116A, NFA.0-NFA.2c, Q13.127A-Q13.132D, NFA.4-NFA.7d, DBT.1-DBT.4, Q13.141-Q13.142, and PS.1-PS.6.


Retirement Expectations. As our respondents move closer to retirement, we must begin to gather information about how they are planning for retirement and, indeed, how they define retirement. As we examined the Health and Retirement Study and other sources of information on retirement, it became apparent that this is a complex issue. It is no longer sufficient to ask what a respondent will do upon reaching age 65 or 67; rather, we must accommodate retirement plans ranging from a reduction in hours worked, to a change in occupation (to a less demanding job), to a complete cessation of employment. In round 22, we included a retirement expectations experiment, with questions addressed to approximately 1,000 respondents, to help us develop a module for use with the entire sample.


Most questions in the experiment worked well, and respondents were able to answer them without confusion. Therefore, most of the questions will be included in round 23 for the whole sample without significant changes from round 22. The questions that remain the same are RETIRE.EXP.P2.2-RETIRE.EXP.P2.6. The only change in this series is an expansion of the response categories in question RETIRE.EXP.P2.2, based on round 22 responses that could not be coded into the original list of categories.


The experiment indicated that, before entering the retirement module, we needed to add a few questions investigating the employment status of respondents who are not currently employed. Although the cohort is still fairly young, a few respondents are already retired (particularly those in the military), and some respondents are out of the labor market and do not expect to work for pay in the future (e.g., homemakers and respondents who are disabled). These initial screening questions will permit us to ask these respondents only the questions that make sense for their situation.


Finally, the retirement expectations experiment included an open-ended question, RETIRE.EXP.P2.1, intended to help us determine the response categories necessary for RETIRE.EXP.P2.2. Because the experiment is now complete, this question is no longer necessary and will be dropped.


Spouse/Partner Race and Ethnicity. Although we have collected a great deal of information about respondents’ spouses and partners, we have never asked the respondent to report the race or Hispanic ethnicity of spouses or partners. This is considered to be vital information by sociologists, demographers, and many other social scientists, and race/ethnicity is frequently a key control variable in a wide variety of analyses. Beginning in round 23, we plan to add this information to the data collected about spouses and partners. Race and Hispanic ethnicity will be collected for all spouses and partners reported at each interview using the standard OMB format in SPARRACE.1-2. In round 23 only, we will also ask retrospective questions about spouses and partners reported by the respondent in past rounds, to fill in this important gap in our demographic data.


Highest Degree Received. In each round, we update the highest degree received information for any respondents who report attending school since the last interview. However, it has been many rounds since we independently recorded this information for the entire sample. Asking a sample-wide highest degree received question provides a valuable double-check on the accuracy of this vital piece of information, at very low cost in terms of time or respondent burden. The questions for highest degree ever received, and date of degree, are Q3.10D and Q3.10E.


Health. The most extensive changes in the survey occurred in the health section. In the past few rounds, the survey has included a “40+ health module,” a series of questions about health status and behaviors addressed to respondents in the first interview after they turned 40. All respondents have now aged out of this module, so it will be dropped. In its place, we have developed a “50+ health module,” with the goal of updating data from the 40+ module and incorporating new questions representing additional health concerns of aging. This module was developed after extensive consultation with medical and public health experts about the types of data needed for health research, especially as it relates to the longitudinal nature of the survey and the rich store of background information on respondents collected over the past few decades.


Many of the questions in the 50+ module are simple repetitions of the 40+ module. These include the CES-D (depression) scale, the health status of the respondent’s biological parents, the general state of health and health problems in the past 4 weeks, and whether the respondent suffers from a variety of specific health conditions (for example, high blood pressure, cancer, or lung disease). These repeated questions are the Q11.H50CESD series, the Q11.H50BPAR series, the Q11.H50SF12 series, and all questions in the Q11.H50CHRC.1-Q11.H50CHRC.9B range not specified below as new.


Based on feedback from our panel of experts, we also incorporated new questions in response to new areas of research in public health and the aging of the respondents. New questions about specific health conditions include skin cancer (Q11.H50CHRC.2B), asthma (Q11.H50CHRC.3E), depression (Q11.H50CHRC.7B-7D), and osteoporosis (Q11.H50CHRC.9C). The 50+ health module also includes new questions on functional limitations in performing various activities (Q11.H50FL.1-2). Q11.H50FL.1 is taken from the National Health Interview Survey (NHIS), and the FL.2 series is based on questions in the Health and Retirement Study (HRS) and the NLSW. These questions are frequently used in public health research and will permit comparison of the NLSY79 with other large national samples. Another new series (Q11.H50SLP.1-5) asks about respondents’ usual sleep patterns and the frequency of sleep problems; these questions are drawn from the National Survey of Midlife Development in the United States (MIDUS II) and the HRS. Again, these questions address an area that recent research suggests may have a significant impact on health, and they will permit comparison with other national surveys. Finally, we have added an open-ended question (Q11.H50OPEN.1) that gives respondents the opportunity to tell us anything else about their health that they feel is important.


On the advice of the experts, we also deleted some questions from the 40+ module. These include whether the respondent suffered from a broken bone or head injury in the past 10 years, the respondent’s use of corrective lenses and hearing aids, and various items from a lengthy list of minor health concerns (Q11.H40CHRC.10a-ii in the round 22 survey).


As our panel of experts collaborated on the 50+ health module, they also made recommendations for revising the general health questions addressed to all respondents. In general, the changes to this section were made with the intention of addressing emerging concerns in public health research while removing unnecessary questions to reduce time burden. The new questions include:

  • Q11.CARE.1-4B, a series of questions on whether the respondent regularly cares for a household or family member because of that person’s health concerns. These questions are important because caregiving can directly affect labor supply, and as NLSY79 respondents age they are more likely to spend time caring for aging parents. This series was taken from the NLSW questionnaire, so cross-cohort comparisons will be feasible.

  • Q11.GENHLTH.PRV1-2, which ask whether the respondent has a regular health care provider. These questions will provide important information about the accessibility and type of health care available to the respondent.

  • Q11.GENHLTH.4C.M through Q11.GENHLTH.4E.F, a series of questions focused on preventive health care. Respondents are asked to report whether they have had screening tests such as cholesterol checks, diabetes tests, colonoscopies, blood pressure checks, and Pap smears. Female respondents also report whether they are taking estrogen or other hormone replacement medications. These preventive health measures are widely recognized as an important aspect of an individual’s long-term health.

  • Q11.GENHLTH.5A.1-3, which ask about the respondent’s dental health. Recent research indicates that dental health and preventive care can have a significant impact on other aspects of an individual’s health. These questions are drawn from the British National Diet and Nutrition Survey.

  • Q11.GENHLTH.7C.1-7F.2, a series of questions about the respondent’s eating habits. These questions are drawn from the National Longitudinal Survey of Adolescent Health (wave III).


The series of questions on the source of the respondent’s health insurance was modified in an attempt to simplify these questions so that respondents can answer them more quickly and easily. The new questions are based on the NHIS 2005 Family Health Module, with some additional wording from the PSID and MIDUS. These questions collect the same basic information about health insurance coverage as the previous series, but we anticipate that respondents will perceive them as easier. Further, the new questions will permit easier comparison of the NLSY79 with other national surveys. These questions are Q11.HLTHPLN.INTRO through Q11.80B, Q11.83-84B, and Q11.87-88B.


Deleted health questions include whether the respondent has missed any days of work due to a child’s asthma (Q11.ASTHMA.10-10B in round 22) and health care topics discussed at the respondent’s last general exam (Q11.GENHLTH.4B in round 22).


Deletions


Current Population Survey. Round 22 included questions on current labor force status based on the Current Population Survey. After the 1998 redesign conference, it was determined that these questions could be asked every third or fourth round rather than in every round. Round 22 represented their regular rotation, so they will be excluded in round 23. The “CPS” questions were those prefixed with “Q5” (Q5.1-Q5.93).


Attitudes. The round 22 survey included a series of questions adapted from the Rosenberg Self-Esteem Scale. This scale is included in the survey only periodically (1980, 1987, and 2006), so it is not necessary to re-ask the scale in 2008.


Consumption, Impatience, and Risk. With regard to consumption, questions were added in round 22 that gather information on monthly amounts spent on groceries, non-food items, eating out, telephone service, internet service, and utilities. The questions were CONSUMPTN_1 through CONSUMPTN_8A. These questions are intended to permit researchers to explore important aspects of wealth accrual in middle age. However, we feel that it is neither feasible nor necessary to ask these questions in every survey round; we may consider rotating them back into the survey at some point in the future.


Round 22 also included a series on risk and impatience. Numbered RISK_1 through RISK_4_SR000001 and IMPATIENCE_1 through IMPATIENCE_2, these questions were intended to help researchers to identify attitudes that may affect a variety of choices respondents may make with respect to investing, job changes, retirement planning, and the like. Existing research on risk and impatience indicates that these attitudes change very slowly, so it is not necessary to ask these questions in every round.


Finally, several recent surveys, including round 22, contained a set of questions on job risk. Although they were added to the survey before the risk and impatience series mentioned above, these questions complement that series. Because the job risk questions (JOB_RISK-1 through JOB_RISK-3) similarly record attitudes that change slowly over time, it is not necessary to ask them in every round, and they are being rotated out of the survey.


Volunteerism. Round 22 included a series of questions about volunteerism, such as whether respondents had any unpaid volunteer work, the number of weeks, number of hours worked per week, types of organizations, and the types of organizations through which the respondents volunteered the most. The questions were ACP_16A through ACP_20A. Although these questions are important and provide an opportunity for cross-generational analysis with other NLS cohorts, we do not view them as core questions and so we intend to ask them only on a periodic basis.





Attachment 7—Child Assessment Measures


A. The Child Assessment Measures

All of the other major subject areas, and indeed almost all of the questions in this survey, have been asked in prior survey rounds. There has been little respondent resistance to any of the items, as evidenced by the very high response rate, continued respondent cooperation, and high quality of the data collected. The child assessment questions were all asked in the 1986, 1988, 1990, 1992, and 1994–2006 survey rounds. Discussions with NORC field staff indicate that these materials are greeted enthusiastically by the respondents and their children. The respondents view the assessment component of the survey as a positive experience. Response rates have been very high on these assessments, generally over 90 percent, and analysis suggests that the quality of the data is very good. We include here the information that describes the assessments in detail, including estimates of their established reliability and validity.


The decisions about which child aptitude tests to ask of the sample members were carefully considered from a number of perspectives. These decisions were made jointly by staff from the National Institute of Child Health and Human Development (NICHD), the Ohio State University Center for Human Resource Research (CHRR), and nationally recognized panels of experts from medicine and the social sciences. Each of these experts has specialized areas of interest central to this study. A list of these individuals and their affiliations is shown on page 12 in table 4.


For a test to be considered for possible inclusion in this study, it needed to meet a number of specific criteria. The most obvious criterion is that the test be essential as either an input or outcome measure (or both) for analyses of interest to NICHD, which provides funding to the Bureau of Labor Statistics for the Child Survey. All of the aptitude tests chosen met the following criteria.


(a) They are “tried and true” tests that have been extensively used by data collectors in a variety of social, economic, and cultural situations. They have been used in household settings similar to the interviewing procedures used with the NLSY79, and they have been administered by non-technical interviewing personnel to a full cross-section of American youth, middle class and economically disadvantaged non-black/non-Hispanics as well as black and Hispanic youth.


(b) All of the tests are recognized by the social science community as well-established and well-normed. Reliability and validity statistics indicate that they are all highly reliable and valid. High reliability means that if the same individual is repeatedly given the same test, that person will repeatedly obtain similar scores. Validity means that the test measures what it purports to measure; it is generally determined by comparing results on the given test with results for the same individual on a different test whose validity has already been well established. All of the tests are rated highly in Tests in Print (Vol. 3, Mitchell, 1983), a highly regarded testing manual that rates all of the major aptitude tests.


(c) All of the tests chosen for use are inexpensive to administer, require very little equipment (important for tests to be administered in a home setting), and are relatively short.


(d) The utility and appropriateness of all of the tests have been considered from both longitudinal and cross-sectional perspectives. First, the participants in the questionnaire development process have carefully ensured that tests are included that cover all the critical cognitive, personality, and physical health dimensions for children at all maturational stages. Second, every effort has been made to assure that the tests complement each other analytically from a longitudinal perspective. Finally, to the maximum extent feasible, every effort has been made to include basic cognitive and personality scales that can, in a cross-sectional context, be compared across age groups at one point in time.


(e) None of the tests involve any physical or psychological risk to either the child or any other family member. In all instances, the mother, who is the sample respondent, will be intimately involved in the testing procedures and indeed, for a large proportion of the tests, the questions are addressed directly to the mother. If at any time there is any reticence regarding a procedure by either a child or the respondent, testing will immediately cease. We will in no instance jeopardize the quality of the material being collected for the Department of Labor or the basic integrity of the longitudinal design.


B. Summary of Child Aptitude Measures To Be Used

This section presents summary descriptions of the tests and measures to be used in the 2008 survey round. In addition to briefly describing the measure, we will include summary documentation about its validity and reliability, its utility for evaluating aptitudes of minority respondents, and the estimated time required to give the test.


1. Home Observation for Measurement of the Environment (HOME)

This set of scales measures the nature and quality of the child’s developmental environment. Some of the items are maternal self-reports, while others are interviewer observations. There are three versions: for infant (0–2 years), preschool (3–5 years), and elementary-aged (6 years and older) children. The instrument yields a total score reflecting the quality of the home environment and mother-child relationship, two general indicators of emotional support and cognitive stimulation, and subscales measuring several processes of the home environment. The HOME Inventory’s total score, factor scores, subscale scores, and item scores have been used by previous researchers. The scales have been previously used across a full range of ethnic and SES subgroups.


The infant version consists of six categories: maternal emotional and verbal responsiveness, maternal acceptance of and involvement with her child, materials for play, organization of the environment, and variety of stimulation. The preschool version consists of eight categories: maternal warmth and acceptance; organization of the environment; variety of stimulation; modeling of maturity; learning, language, and academic environmental processes; and avoidance of harsh discipline. The elementary version also consists of eight categories: maternal responsiveness, emotional climate, organization of the environment, modeling of maturity, family participation in growth experiences, paternal involvement, opportunities for growth, and provision for active stimulation.


This widely used battery was created by Dr. Bettye Caldwell and Dr. Robert Bradley of the University of Arkansas at Little Rock. For each HOME subscale, three items for the NLSY79 child survey were selected based on previous factor analysis (Bradley and Caldwell, 1979) and Dr. Bradley’s consultation.


The HOME is considered very reliable. Bradley and Caldwell (1981) report inter-rater reliabilities from six studies in the high .80s to low .90s. Bradley, Caldwell, and Elardo (1979) found that six-month test-retest subscale correlations ranged from .45 to .87. Studying children from 6 to 42 months of age, Yeates et al. (1983) found twelve-month test-retest reliabilities from .43 to .68, and two-year test-retest reliabilities of .38 to .56. Ramey et al. (1984) reported two-year test-retest reliabilities of .56 and .57. Van Doornick et al. (1981) reported high total score stability (r = .86) among siblings tested at least ten months apart. In the NLSY79, Mott and Quinlan (1991) found that reliability for the subscales of the HOME used in the NLSY79 was 0.59 for children under three, rising to 0.69 for children three to six, 0.73 for children six to nine, and 0.68 for children ten and over. Two-year test-retest reliabilities were 0.52 for children three to five and 0.59 for children six and over, similar to levels found in other studies.


Prior longitudinal research indicates that the HOME predicts later cognitive, social, and physical development. Yeates et al. (1983) longitudinally compared the predictive ability of the HOME relative to the predictive ability of maternal intelligence for child intellectual development at two, three, and four years of age, finding that maternal intelligence was initially more predictive, but by age four the quality of the home environment was most predictive of cognitive development. The HOME is more predictive of subsequent cognitive development than is concurrently measured cognition (Elardo, Bradley, and Caldwell, 1975). When administered as early as two months of age, the HOME has correlated from .34 to .72 with intelligence tests subsequently administered as late as four-and-a-half years of age. The HOME at one and two years correlated (.33 to .65) with academic achievement in the first through fourth grades of school (Bee et al., 1982; Bradley and Caldwell, 1976, 1980, 1984; Elardo, Bradley, and Caldwell, 1975; Van Doornick et al., 1981).


Besides these strong predictive correlations with subsequent mental development, the HOME has also indicated a variety of developmental risks and delays such as clinical malnutrition, lead burden, failure-to-thrive, socio-cultural retardation, language delay, developmental delay, and poor academic achievement (Elardo and Bradley, 1981). The HOME is moderately related to SES and parental education (r = .2 to .6, Elardo and Bradley, 1981). A meta-analysis of the correlation between SES and intelligence found that measures of the home environment accounted for four to eleven times as much of the variation in academic achievement and intelligence (median r = .55) as did standard measures of SES. The homes of divorced working mothers provided less cognitive stimulation and emotional support according to the HOME Inventory than did the homes of married (working or nonworking) mothers. Six studies found relationships between temperamentally difficult and unsociable infants, and decreased cognitive stimulation and emotional support available in their homes. In general, this has been the most widely used of all the NLSY79 child assessments. As in previous rounds, the HOME will be administered for the full sample of children.
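The “four to eleven times as much of the variation” comparison above is a statement about squared correlations (variance accounted for equals r squared). A brief back-of-the-envelope sketch in Python; the implied SES correlations are our own back-calculation for illustration, not figures reported in the cited meta-analysis:

```python
# Illustrative arithmetic only: the share of variance a predictor accounts
# for is r**2. Given the median home-environment correlation of r = .55,
# a four-to-elevenfold advantage over SES implies SES correlations of
# roughly .17 to .28 (a hypothetical back-calculation, not source figures).
home_r = 0.55
home_var = home_r ** 2  # ~= 0.30 of the variance in achievement/intelligence

for ratio in (4, 11):
    ses_var = home_var / ratio   # implied variance share for SES measures
    ses_r = ses_var ** 0.5       # implied SES correlation
    print(f"ratio {ratio}: implied SES r = {ses_r:.2f}")
```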


References

Bee, H.; Bernard, K.; Eyres, S.; Grey, C.; Hammond, M.; Spietz, A.; Snyder, C.; and Clark, B. “Prediction of IQ and Language Skill from Perinatal Status, Child Performance, Family Characteristics, and Mother-Infant Interaction.” Child Development 53 (1982): 1134-1156.

Bradley, Robert H. and Caldwell, Bettye M. “Home Observation for Measurement of the Environment: A Revision of the Pre-School Scale.” American Journal of Mental Deficiency 84 (1979): 235-244.

Bradley, Robert H. and Caldwell, Bettye M. “The Relation of Infants’ Home Environments to Mental Test Performance at Fifty-Four Months: A Follow-Up Study.” Child Development 47 (1976): 1172-1174.

Bradley, Robert H. and Caldwell, Bettye M. “The HOME Inventory: A Validation of the Pre-School Scale for Black Children.” Child Development 52 (1981): 708-710.

Bradley, Robert H. and Caldwell, Bettye M. “The Relation of Home Environment, Cognitive Competence, and IQ among Males and Females.” Child Development 51 (1980): 1140-1148.

Bradley, Robert H. and Caldwell, Bettye M. “The Relation of Infants’ Home Environment to Achievement Test Performance in First Grade: A Follow-Up Study.” Child Development 55 (1984): 803-809.

Bradley, Robert H.; Caldwell, Bettye M.; and Elardo, R. “Home Environment and Cognitive Development in the First Two Years: A Cross-Lagged Panel Analysis.” Developmental Psychology 15 (1979): 246-250.

Caldwell, Bettye M. and Bradley, Robert H. Home Observation for Measurement of the Environment. Little Rock, AR, 1984.

Elardo, Richard D. and Bradley, Robert H. “The Home Observation for Measurement of the Environment (HOME) Scale: A Review of Research.” Developmental Review 1 (1981): 113-145.

Elardo, Richard D.; Bradley, Robert H.; and Caldwell, Bettye M. “The Relation of an Infant’s Home Environment to Mental Test Performance from Six to Thirty-Six Months.” Child Development 46 (1975): 71-76.

Mott, Frank L. and Quinlan, Steven V. Children of the NLSY79: 1988 Tabulations and Summary Discussion. Columbus, OH: Center for Human Resource Research, March 1991.

Ramey, C.; Yeates, K.; and Short, E. “The Plasticity of Intellectual Development: Insights from Preventive Intervention.” Child Development 55 (1984): 1913-1925.

Van Doornick, W.; Caldwell, Bettye M.; Wright, C.; and Frankenburg, W. “The Relationship between Twelve-Month Home Stimulation and School Achievement.” Child Development 52 (1981): 1080-1083.

Yeates, K.; MacPhee, D.; Campbell, F.; and Ramey, C. “Maternal IQ and Home Environment as Determinants of Early Childhood Intellectual Competence: A Developmental Analysis.” Developmental Psychology 19 (1983): 731-739.


2. Wechsler Intelligence Scale for Children–Revised: Digit Span Subscale

There are two parts to this measure of short-term memory for children age 7 and older. First, the child listens to and repeats a sequence of numbers said by the interviewer. In the second part, the child listens to a sequence of numbers and repeats it in reverse order. In both parts, the length of the sequence of numbers increases as the child responds correctly.


This subscale is from the revised Wechsler Intelligence Scale for Children (WISC-R) published by the Psychological Corporation. The WISC-R is one of the best-normed and most highly respected measures of child intelligence. The Digit Span score is a good measure of short-term memory and attentiveness for children 7 and older. It correlates (r = .45) with the PIAT’s reading recognition. Its parallel form reliability is about 0.53. It is administered to NLSY79 children ages 7–11.


References

Jackson, N. and Myers, M. “Letter Naming Time, Digit Span, and Precocious Reading Achievement.” Intelligence 6 (1982): 311-329.

Wechsler, D. Wechsler Intelligence Scales for Children–Revised. New York: The Psychological Corporation, 1974.


3. Peabody Picture Vocabulary Test–Revised (PPVT-R)

The PPVT-R measures receptive vocabulary knowledge of orally presented words by directing the child to nonverbally select the picture (out of four) that best describes the word’s meaning. The PPVT-R is among the most highly recognized and established indicators of verbal intelligence and scholastic aptitude across childhood (from age 3 onward). Since 1978, it has been the fourth most frequently cited test in Mitchell’s Tests in Print (1983). It is currently administered to NLSY79 children ages 4–5 and 10–11.


Numerous studies have replicated the reliability estimates from the PPVT-R’s standardization sample (4,200 children ages 2 years and 6 months to 18 years and 11 months): a median split-half reliability of .80 (ranging from .67 to .88), a median parallel form reliability of .70 (ranging from .50 to .87), and a median 9- to 31-day test-retest reliability of .78 (.52 to .90; Dunn and Dunn, 1981). Goldstein et al. (1970) reported a 21-month test-retest reliability of .61 among 160 disadvantaged three- to seven-year-olds, and Costello and Ali (1971) found a two-week retest reliability of .77 among thirty-six black preschoolers.


The PPVT-R demonstrates high construct validity with a variety of intelligence tests. Its median correlation with other vocabulary tests was .71 (based on 55 criterion validity coefficients ranging from .20 to .89); with other individual intelligence tests, correlations ranged from .38 to .72 (based on 291 correlations ranging from -.16 to .92). Its correlations were higher with the Binet and Wechsler tests than with less reputable tests, and higher with verbal intelligence (.66 to .71) than with performance (.46 to .65; Dunn and Dunn, 1981).


Because it demonstrates high predictive validity with a variety of achievement measures, the PPVT, combined with other information, is an extremely important predictor of early and middle school outcomes. Its median correlation with math achievement was .50 (based on 16 correlations ranging from .27 to .77 with the Wide Range Achievement Test (WRAT), California Achievement Test (CAT), and PIAT); with language achievement, .44 (16 correlations from .02 to .66 with the WRAT, PIAT, CAT, and Metropolitan Achievement Test (MAT)); with reading comprehension, .63 (seven correlations from .42 to .70 with the CAT and PIAT); and with reading recognition, .38 (WRAT) and .52 (PIAT) (14 correlations ranging from .01 to .71; Dunn and Dunn, 1981).


Zigler, Abelson, and Seitz (1973) found an inverse relationship (r = -.53) between the magnitude of the IQ increase on retest and the initial IQ estimate. This indicates that a disadvantaged preschooler’s measured intelligence is influenced by anxiety and sociability during assessment, and that these emotional patterns are distinct from cognitive deficits; related work has documented relationships between sociability and cognitive performance (Lamb, Garn, and Keating, 1981).


References

Costello, J. and Ali, F. “Reliability and Validity of PPVT Scores of Disadvantaged Preschool Children.” Psychological Reports 28 (1971): 755-760.

Dunn, L. and Dunn, L. PPVT-R Manual. Circle Pines, MN, 1981.

Goldstein, L.; Collier, A.; Dill, J.; and Tilis, H. “The Effect of a Special Curriculum for Disadvantaged Children on Test-Retest Reliabilities of Three Standardized Instruments.” Journal of Educational Measurement 1 (1970): 171-174.

Lamb, M.; Garn, S.; and Keating, M. “Correlations between Sociability and Cognitive Performance among Eight-Month-Olds.” Child Development 52 (1981): 711-713.

Plant, W.T. and Southern, M.L. “The Intellectual and Achievement Effects of Pre-School Stimulation of Poverty Mexican-American Children.” Genetic Psychology Monographs 86 (1972): 141-173.

Radin, N. “Maternal Warmth, Achievement Motivation, and Cognitive Functioning in Lower-Class Pre-School Children.” Child Development 42 (1971): 1560-1565.

Robertson, G. and Eisenberg, J. PPVT-R Technical Supplement. Circle Pines, MN: American Guidance Service, 1981.

Zigler, W.E.; Abelson, W.; and Seitz, V. “Motivational Factors in the Performance of Economically Disadvantaged Children on the PPVT.” Child Development 44 (1973): 294-303.


4. Peabody Individual Achievement Test (PIAT)

The PIAT is a wide-range measure of academic achievement for children aged five and over that is widely known and used in research. It is perhaps the most widely used brief assessment of academic achievement with demonstrably high test-retest reliability and concurrent validity. From its national standardization sample, the PIAT’s median one-month test-retest reliability is .74 for Math, .89 for Reading Recognition, and .64 for Reading Comprehension (Dunn and Markwardt, 1970). The concurrent validity coefficients for the three PIAT subscales were reported in the discussion of the PPVT-R. Of all psychological tests, the PIAT had the forty-second largest number of citations since 1978 in Mitchell’s Tests in Print (1983). In addition to the raw and standard scores, percentiles, age equivalents, and grade equivalents are also available. Some of the PIAT’s many correlations with measures of memory, the home environment, and intelligence have been previously mentioned. The PIAT subscales are administered to all eligible children ages 5-14. In this context, “eligible” means that a sufficient score was achieved on the previous assessment to proceed to the next assessment. In particular, some children may not advance to the reading comprehension subscale.
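Standard scores and percentiles of the kind mentioned above are derived from a norming sample’s mean and standard deviation. A hypothetical sketch of that conversion; the norm mean and standard deviation below are invented placeholders, not the PIAT’s published norms:

```python
# Hypothetical illustration of converting a score to a standard (z) score
# and percentile. The norm mean/SD are invented placeholders, not the
# PIAT's published norms; the percentile step assumes normally
# distributed norms.
from statistics import NormalDist

norm_mean, norm_sd = 100.0, 15.0  # assumed norming-sample parameters
observed_score = 112.0            # hypothetical child's score

z = (observed_score - norm_mean) / norm_sd  # standard (z) score
percentile = NormalDist().cdf(z) * 100      # share of norm group scoring lower
print(f"z = {z:.2f}, percentile = {percentile:.0f}")
```

Age and grade equivalents are produced analogously, by locating the raw score in age- or grade-specific norm tables rather than in a single distribution.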


(a) PIAT Math Subscale

This subscale measures ability in mathematics as taught in mainstream education. It consists of 84 multiple-choice items (each with four options) that increase in difficulty. It begins with such early skills as recognizing numerals and progresses to measuring advanced concepts in geometry and trigonometry. The child looks at the problem, then points to the answer.


(b) Peabody Reading Recognition Subscale

This subscale measures word recognition and pronunciation ability as indicators of reading achievement. Children read a word silently, then say it aloud. The subscale contains 84 items (each with four options) that increase in difficulty from preschool to high school levels. Skills assessed include matching letters, naming letters, and reading single words aloud.


(c) Peabody Reading Comprehension Subscale

This subscale measures the ability to derive meaning from sentences that are read silently. For each of the 66 items of increasing difficulty, the child silently reads the sentence once and then selects one of four pictures that best portrays the meaning of the sentence.


While the recognition and comprehension subscales cannot measure all dimensions of reading comprehension (our “ideal” objective would require at least a half-hour of assessment time), they do measure two critical components: word recognition and comprehension of the meaning of sentences. Hammill and McNutt’s (1981) meta-analysis of reading correlates (8,239 coefficients from 322 studies) reports median concurrent correlations of .72 between recognition and composite reading, .72 between comprehension and composite reading, .85 between composite reading and general academic achievement, .74 between recognition and comprehension, and .62 between composite reading and math.


References

Dunn, L. and Markwardt, F. PIAT Manual. Circle Pines, MN: American Guidance Service, 1970.

Hammill, D. and McNutt, G. The Correlates of Reading. Austin, TX: PRO-ED, 1981.

Naglieri, J. and Harrison, P. “McCarthy Scales, McCarthy Screening Test, and Kaufman’s McCarthy Short Form Correlations of the PIAT.” Psychology in the Schools 19: 149-155.

Robertson, G. and Eisenberg, J. PPVT-R Technical Supplement. Circle Pines, MN: American Guidance Service, 1981.


5. Temperament Scales

Because the child’s temperament is partially a parental perception (Bates, 1980), the behavioral style of children is measured by a set of maternal-report items (for children younger than 7 years) and interviewer ratings (for children directly assessed). Because the review by Hubert et al. (1982) found no single instrument to be satisfactory, our scale is based on items from Rothbart’s IBQ, Campos and Kagan’s compliance scale, and other items from Dr. Joseph Campos. The maternal scale “How My Infant Usually Acts” addresses the activity, predictability, fearfulness, positive affect, and irritability of the 0–11-month-old child. “How My Toddler Usually Acts” addresses the fearfulness, positive affect, and irritability of the 12–23-month-old child. “How My Child Usually Acts” measures the compliance, affect, attachment, and sleep problems of children aged 2 years to 6 years and 11 months. The interviewer rates the child’s shyness at first meeting and at the end of the session and, during the assessment, records the child’s cooperation, interest and motivation, energy, persistence, and attitude toward and rapport with the interviewer.


Temperament is important to child development, personality development, the child’s impact on family members, and the development of behavioral problems (Bates, 1980). These scales include dimensions such as sociability, mood, adaptability, and compliance, all factors that are components of Thomas’s easy/difficult temperament construct and are precursors to personality development and social adjustment (areas measured by the Behavior Problems Index that is discussed below), social relations, and performance on tests such as the Motor and Social Development Scale and PPVT-R (e.g., Lamb, 1982).


As with adult personality measures, reviews of temperament (Bates, 1980; Campos et al., 1983; Hubert et al., 1982) contend that the perceiver plays a significant role and that mild to moderate inter-rater agreement is the rule: median parent-observer correlations of .2 to .4 in infancy increase to .3 to .6 by age two, and median between-parent correlations are .4 to .6. They also report moderate internal consistency (.2 to .8) and retest reliability (-.1 to .9), along with fair validity coefficients (.3 to .6) against a wide variety of criteria. Hubert et al. (1982) state that the most consistent and substantial relationship is between temperamental difficulty and infant distress/fussiness with people. Published correlates include levels of neurotransmitters associated with stress, spectrographic analysis of cries, respiratory distress, and postmature birth syndromes; maternal anxiety, sociability, responsivity, and stress; family moves, employment changes, paternal childcare, and the birth of siblings; and sensitivity to change and adversity, social communication, subsequent behavior disorders (i.e., delinquency and emotional disturbance), and cognitive and motor development. This assessment will be given to children age 0–6 in 2008.


References

Baydar, Nazli. “Reliability and Validity of Temperament Scales of the NLSY Child Assessments.” Journal of Applied Developmental Psychology 16,3 (July-September 1995): 339-370.

Bates, J. “The Concept of Difficult Temperament.” Merrill-Palmer Quarterly 26 (1980): 299-319.

Campos, Joseph J.; Barrett, Karen C.; Lamb, Michael E.; Goldsmith, H.; and Stenberg, Craig. “Socioemotional Development.” In P. Mussen (ed.), Handbook of Child Psychology, Vol. 2, 4th Ed. New York: Wiley, 1983.

Hubert, N.C.; Wachs, T.D.; Peters-Martin, P.; and Gandour, M.J. “The Study of Early Temperament: Measurement and Conceptual Issues.” Child Development 53 (1982): 571-600.

Lamb, Michael E.; Garn, S.; and Keating, M. “Correlations between Sociability and Motor Performance Scores in Eight-Month-Olds.” Infant Behavior and Development 5 (1982): 97-101.

McDevitt, S.C. and Carey, W.B. “Stability of Ratings vs. Perceptions of Temperament from Early Infancy to 1-3 Years.” American Journal of Orthopsychiatry 51 (1981): 342-345.

Menaghan, Elizabeth G. and Parcel, Toby L. “Measuring Temperament in a Large Cross Sectional Survey: Reliability and Validity for Children of the NLS Youth.” Working Paper, Department of Sociology, The Ohio State University, 1988.

Peters-Martin, P. and Wachs, T.D. “A Longitudinal Study of Temperament and Its Correlates in the First 12 Months.” Infant Behavior and Development 7 (1984): 285-298.


6. Perceived Competence Scale for Children/Self-Perception Profile

This self-report magnitude estimation scale measures the child’s perceived competence in the academic skill domain and the child’s sense of general self-worth. Children age 12–14 answer the questions in these two domains by first selecting one of two alternatives describing how they usually act, and then indicating how true that alternative is of themselves.


Many studies have documented the importance of this scale as a predictor of important child outcomes and behaviors. For example, it correlates highly with teacher ratings of children and with a child’s achievement motivation. It has high internal reliability (r = .73 to r = .86) and high nine-month test-retest reliability (r = .8). Mott and Quinlan (1991) found reliability scores of 0.68 in the NLSY79. Prior uses of the instrument suggest no apparent cultural bias.


References

Harter, Susan. “The Perceived Competence Scale for Children.” Child Development 53 (1982): 87-97.

Harter, Susan. “Developmental Perspectives on the Self-Esteem.” In M. Hetherington (ed.), Social Development: Carmichael’s Manual of Child Psychology. New York: Wiley, 1984.

Harter, Susan. “Effectance Motivation Reconsidered: Toward a Developmental Model.” Human Development 21 (1978): 34-64.

Mott, Frank L. and Quinlan, Steven V. Children of the NLSY79: 1988 Tabulations and Summary Discussion. Columbus, OH: Center for Human Resource Research, March 1991.


7. Behavior Problems Index

This widely used scale was created by Dr. Nicholas Zill and James Peterson of Child Trends to measure the frequency, range, and type of childhood behavior problems. The items are drawn from Achenbach’s (1978) Child Behavior Checklist, developed at the NIMH Laboratory of Developmental Psychology and one of the most thoroughly researched and widely used parental-report measures of behavior problems in childhood. Some of the items have been used in earlier national surveys (Cycle II of the Health Examination Surveys, 1963–1965, and the Foundation for Child Development’s National Survey of Children in 1976). The present set of 28 items was selected for inclusion in the 1981 Child Health Supplement to the National Health Interview Survey based on its ability to distinguish children referred for psychological treatment from typical children in a large sample (1,300 children in each group). In addition to their discriminant validity, the items were selected to measure six behavioral syndromes (antisocial, anxious/depressed, hyperactive, stubborn/parent conflict, social withdrawal/peer conflicts, and immature dependency) suitable for boys and girls age 4 and over.


The 1981 Child Health Supplement data analysis found the internal consistency reliability of the Behavior Problems Index to be .89 (.91 for children 12 and over). Internal consistency reliabilities for the subscales ranged from .54 to .76. Mott and Quinlan (1991) found internal consistency measures in the same range, suggesting that the NLSY79 Child Survey is comparable. The two-week test-retest reliability of the hyperactivity subscale was .68, suggesting that the retest reliability of the total scale is in the low .90s. Because the items have been employed in prior national surveys, children from the entire range of social, economic, and ethnic backgrounds may be accurately assessed.
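The step from a subscale retest reliability of .68 to a total-scale estimate in the low .90s is consistent with the Spearman-Brown prophecy formula, which projects the reliability of a lengthened test. The sketch below is our own illustration, not a method stated in the source; it assumes six comparable subscales corresponding to the six behavioral syndromes.

```python
# Illustrative only: the source does not state how the low-.90s figure was
# derived. Spearman-Brown with k = 6 comparable subscales is one consistent way.
def spearman_brown(r, k):
    """Projected reliability of a test lengthened by a factor of k."""
    return k * r / (1 + (k - 1) * r)

total_scale = spearman_brown(0.68, 6)  # .68 = hyperactivity subscale retest reliability
print(round(total_scale, 2))  # prints 0.93, i.e., in the low .90s
```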


References

Achenbach, T.S. “The Child Behavior Profile: Boys Aged 6-11.” Journal of Consulting and Clinical Psychology 46 (1978): 478-488.

Achenbach, T.S. and Edelbrock, C.S. “Behavioral Problems and Competencies Reported by Parents of Normal and Disturbed Children Aged Four through Sixteen.” Monographs of the Society for Research in Child Development, Serial No. 188, Vol. 46, Whole No. 1, 82 pages, 1981.

Mott, Frank L. and Quinlan, Steven V. Children of the NLSY79: 1988 Tabulations and Summary Discussion. Columbus, OH: Center for Human Resource Research, March 1991.

National Center for Health Statistics. “Current Estimates from the National Health Interview Survey: United States, 1981.” Vital and Health Statistics, Series 10, No. 141, DHHS Publ. No. (PHS) 83-1569. Public Health Service. Washington, D.C.: U.S. Government Printing Office, October 1982.

Peterson, J.L. and Zill, Nicholas. “Marital Disruption, Parent-Child Relationships, and Behavioral Problems in Children.” Journal of Marriage and the Family 48,2 (May 1986).

Rutter, M.; Tizard, J.; and Whitmore, K. Education, Health, and Behaviour. London: Longman, 1970.

Schreiner, I. “Analysis of the Reinterview Data from the 1981 Child Health Supplement to the National Health Interview Survey.” Statistical Methods Division Memorandum. Washington, D.C.: U.S. Bureau of the Census, February 17, 1983.


The mother and child questionnaire supplements included in Attachment 11 describe the assessment battery more fully. With the exception of the copyrighted testing materials (for example, the standard PPVT-R and PIAT examining materials and scoring sheets), these attachments fully describe the child survey.


Attachment 8—August 25, 2006, BLS News Release


Attachment 9—Respondent Advance Letters and Privacy Act Statement


Because of the Early Bird experiment, the survey uses two different versions of the main youth advance letter. In addition, there are two versions of the Young Adult advance letter: one for respondents who have previously been interviewed as Young Adults and one for those new to the sample. Each letter includes information about the Privacy Act and OMB approval of the survey. This attachment presents the following advance letters and information from the round 22 survey:


1. NLSY79 Early Bird advance letter

2. NLSY79 regular advance letter

3. Previously interviewed Young Adult advance letter

4. New Young Adult advance letter

5. Privacy Act statement and OMB Disclosure Notice, included on the reverse side of all letters


For round 23, we will use advance letters that are very similar in form and content. We anticipate altering the opening paragraph slightly to make a fresh appeal to respondents and updating the contact information as appropriate. The letters will remain the same in all other respects.


«DATE»


«FNAME» «LNAME»

«ADDR1» «APT»

«ADDR2»«CITY», «STATE» «ZIP»


Dear «FNAME»,


You are part of something truly great: the National Longitudinal Survey of Youth 1979. The NLSY79 is invaluable in helping us understand who we are and where our country is headed. We invite you to take part in our Early Bird program for this round of the National Longitudinal Survey of Youth 1979 (NLSY79). You will be mailed $«AMT» in thanks for your prompt participation.


By calling our Early Bird hotline number, you save us the time and effort we would spend contacting you. We want to share the savings with you! To get your extra cash and participate in the Early Bird program, you must contact us within 4 weeks of receiving this letter. If you choose not to participate in this program, we still want you to take part in the NLSY79 as you have in the past, and you will receive the standard payment in appreciation for your time. As a small thank you, we have included a calendar magnet to help you keep track of important dates.

All it takes is 3 easy steps!


  1. Call our Early Bird toll free number at 1-800-675-4440.


  2. Leave us a message with your:

    • Name

    • Early Bird Number «EB ID»

    • Phone Number

    • Good time to call you back


  3. An interviewer will return your call, ready to do your survey or to schedule an appointment time that’s convenient for you.



The questions and answers shown on the back of this letter provide further information about this survey and your confidentiality. If you have any additional questions about the study or the Early Bird program, please call us toll free at 1-800-675-4440 or send us an e-mail at [email protected]. We look forward to talking with you soon! And again, thank you!


Sincerely,

Dr. Charles R. Pierret

Program Director

National Longitudinal Surveys






«DATE»


«FNAME» «LNAME»

«ADDR1» «APT»

«ADDR2»«CITY», «STATE» «ZIP»


Dear «FNAME»,


You are part of something truly great: the National Longitudinal Survey of Youth 1979.


The NLSY79 is invaluable in helping us understand who we are and where our country is headed. There are no other data sources like the NLSY79. As NLSY79 researcher Jay Zagorsky said, “The ability to follow people for most of their lives is key for answering lots of important research questions. And by using a longitudinal survey like the NLSY79 we can make better predictions about the future.”


We are grateful that you share your time with us for this priceless study. Without your participation in the NLSY79, hundreds of people go unrepresented. And, without you, changes in so many lives will never be known.


One of our interviewers from NORC at the University of Chicago will be contacting you in the coming weeks to set up a convenient appointment for your interview. In the meantime, if you have any questions about the study, please feel free to call us toll free at 1-877-853-5908. The questions and answers shown on the enclosed Confidentiality Statement card provide further information about this survey and your confidentiality.


As a small thank you, we have included a calendar magnet to help you keep track of important dates.


We look forward to talking with you soon! And again, thank you!



Sincerely,

Dr. Charles R. Pierret

Program Director

National Longitudinal Surveys


«DATE»


«FNAME» «LNAME»

«ADDR1» «APT»

«ADDR2»

«CITY», «STATE» «ZIP»



Dear «FNAME»,


You are part of something truly great: the National Longitudinal Survey of Youth 1979. And once again we’d like to talk with you about your education, job, family, and future plans. Few people have the opportunity to make such a lasting contribution.


There are many ways you can make a difference in your community and country, such as by voting, volunteering, or donating to your favorite charities.  Participating in the NLSY79 is another great way for you to make a positive impact on so many people’s lives.


As always, the information you provide is protected by law and the answers you give in the interview are completely confidential and cannot be traced to you in any way. The U.S. Department of Labor and the National Institutes of Health sponsor this study.


We are grateful for your continuing participation in this study. One of our interviewers from NORC at the University of Chicago will be contacting you in the coming weeks. You will receive $40 for the interview, which will take about an hour over the phone. If it’s easier for you to meet with us face to face, we will set up an appointment with you.


In the meantime, if you have any questions about the study, please feel free to call us toll free at 1-877-853-5908 or send us an email at [email protected]. The questions and answers shown on the back of this letter provide further information about this study and your confidentiality.


We look forward to talking with you soon! And again, thank you!


Sincerely,





Dr. Charles R. Pierret

Program Director

National Longitudinal Surveys


«DATE»


«FNAME» «LNAME»

«ADDR1» «APT»

«ADDR2»

«CITY», «STATE» «ZIP»



Dear «FNAME»,


You are part of something truly great: the National Longitudinal Survey of Youth 1979 (NLSY79). Now that you are approaching adulthood, we would like you to continue your involvement with the NLSY79 family as a young adult.


Few Americans have the opportunity to share their experiences with so many for such an important cause and to make such a lasting contribution. Your participation in the NLSY79 will help government leaders build programs and pass laws based partly on your experiences—programs and laws that may benefit you in the future.


There are many ways you can make a difference in your community and country, such as by voting, volunteering, or donating to your favorite charities.  Participating in the NLSY79 is another great way for you to make a positive impact on so many people’s lives.


As a young adult, you’ll be asked questions similar to the ones we’ve asked your mother—questions about school, work, your family, and your future plans. As always, the information you provide is protected by law and the answers you give in the interview are completely confidential and cannot be traced to you in any way. The U.S. Department of Labor and the National Institutes of Health sponsor this study.


One of our interviewers from NORC at the University of Chicago will be contacting you in the coming weeks. You will receive $40 for the interview, which will take about an hour over the phone. If it’s easier for you to meet with us face to face, we will set up an appointment with you. In the meantime, if you have any questions about the study, please feel free to call us toll free at 1-877-853-5908 or send us an email at [email protected]. The questions and answers shown on the back of this letter provide further information about this study and your confidentiality.


We look forward to talking with you soon! And again, thank you!


Sincerely,





Dr. Charles R. Pierret

Program Director

National Longitudinal Surveys

WHY IS THIS STUDY IMPORTANT?

Thanks to your help, policymakers and researchers will have a better understanding of the work experiences, family characteristics, health, financial status, and other important information about the lives of people in your generation. This is a voluntary study, and there are no penalties for not answering questions. However, missing responses make it more difficult to understand the issues that concern people in your community and across the country. Your answers represent the experiences of hundreds of other people your age. We hope we can count on your participation again this year.

WHO AUTHORIZES THIS STUDY?

The sponsor of the study is the U.S. Department of Labor, Bureau of Labor Statistics. The study is authorized under Title 29, Section 2, of the United States Code. The Center for Human Resource Research at The Ohio State University and the National Opinion Research Center at the University of Chicago conduct this study under a contract with the Department of Labor. The U.S. Office of Management and Budget (OMB) has approved the questionnaire and has assigned 1220-0109 as the study’s control number. This control number expires on January 31, 2007. Without OMB approval and this number, we would not be able to conduct this study.

WHO SEES MY ANSWERS?

We want to reassure you that your confidentiality is protected by law. The Privacy Act of 1974 and the Confidential Information Protection and Statistical Efficiency Act of 2002 require us to keep all information about you and your household strictly confidential. The information collected in this survey must be used exclusively for statistical purposes. All the employees who work on the survey at the Bureau of Labor Statistics and its contractors must sign a legal document stating that they will not disclose the identities of survey participants to anyone who does not work on the National Longitudinal Surveys program and is therefore not legally authorized to know those identities. In fact, only a few people have access to information about your identity because they need that information to carry out their job duties.

Some of your answers will be made available to researchers at the Bureau of Labor Statistics and other government agencies, universities, and private research organizations through publicly available data files. These publicly available files contain no personal identifiers, such as names, addresses, Social Security numbers, and places of work, and exclude any information about the States, counties, metropolitan areas, and other, more detailed geographic locations in which survey participants live, making it much more difficult to figure out the identities of participants. Some researchers are granted special access to data files that include geographic information, but only after those researchers go through a thorough application process at the Bureau of Labor Statistics. Those authorized researchers must sign a written agreement making them official agents of the Bureau of Labor Statistics and requiring them to protect the confidentiality of survey participants. Those researchers are never provided with information about the personal identities of participants. The National Archives and Records Administration and the General Services Administration may receive copies of survey data and materials because those agencies are responsible for storing the Nation’s historical documents. The information provided to those agencies does not include participants’ identities.

HOW MUCH TIME WILL THE INTERVIEW TAKE?

Based on preliminary tests, we expect the average interview to take about 60 minutes. Your interview may be somewhat shorter or longer depending on your circumstances. If you have any comments regarding this study or recommendations for reducing its length, send them to the Bureau of Labor Statistics, National Longitudinal Surveys, 2 Massachusetts Avenue, N.E., Washington, DC 20212.



Attachment 10—Justification for Political Participation Questions

December 12, 2006



This document describes the questions that the American National Election Studies (ANES) would like to place on the next runs of the NLSY79 and YA studies. We appreciate very much the opportunity to propose these questions to you and to explain the rationales for them.


This memo is divided into four sections. We begin by explaining a bit about the history and goals of the ANES, so you know about the “collective good” research program that the new questions would feed. Second, we describe the process by which we selected the items we propose to be included in your surveys. Third, we provide reassuring information about the respondent burden imposed by the questions we propose – after decades of asking these sorts of questions, we have found that respondents readily answer them. Lastly, we provide specific justifications for each of the questions we propose.


We hope this information is helpful to you as you consider these questions for inclusion in the upcoming NLSY79 and YA surveys. Please let us know if we can provide any additional information.



Sincerely,


Jon A. Krosnick and Arthur Lupia


____________________________________________


About the ANES


Since 1948, the American National Election Studies (ANES) surveys have been conducted every two years to help scholars around the world understand American election outcomes. They provide data that permit rich hypothesis testing across a wide array of variables while maximizing methodological excellence and permitting comparisons across people, contexts, and time. The ANES gives researchers a view of the political world through the eyes of citizens, helping us understand the forces that shape their actions, which in turn determine election outcomes and the governance of the nation.


ANES conducts national surveys of the American electorate in national election years, and during the other years ANES carries out R&D work through pilot studies that produce and validate new questions, which are then used in subsequent election year surveys. The longevity of the ANES time-series enhances the utility of the data, because measurements can be pooled over time, allowing illumination of long-term trends and the political impact of historic events.


All ANES questionnaires and the resulting public datasets are made available free of charge and without restriction or embargo from the ANES website (www.electionstudies.org) to any interested scholars. The ANES Bibliography documents over 5,000 citations resulting from the use of ANES data: http://www.electionstudies.org/resources/biblio/anes_bib.pdf


The questions proposed here, and the entire ANES enterprise, are designed to help Americans to better understand the functioning of their nation and the relation of its citizenry to their government. In a sense, one can think of these questions as a survey of government’s “customers,” very much in keeping with many federal surveys conducted over the years to gauge public evaluations of federal services provided by the Internal Revenue Service, the Veteran’s Administration, and many other agencies. By helping government (and scholars around the world) to understand how American citizens react to political events and evaluate government options, these surveys allow government agencies to be better informed about how to make and implement policy and how to design elections and educational efforts to educate citizens about government activities.


Development of the NLS Questionnaires


Like all other ANES questions, the questions we propose for inclusion in upcoming NLS surveys are the result of an extensive and conscientious peer review process.


The proposed NLS questions were selected in response to feedback from a large group of scholars who responded to our calls for feedback. These calls were distributed broadly to various communities of scholars during the fall of 2005 and 2006. We received detailed advice and justifications from more than 100 scholars representing over 55 universities and other organizations. The questions that scholars suggested totaled well over 400 minutes of interview time. To develop the final list, we spent many hours reading every e-mail, following up to learn more about issues raised, consulting ancillary materials, and more. Our goal was to select questions that had broad support among the community of scholars, had solid theoretical and empirical justifications, and suited the opportunity to study the intergenerational and longitudinal dynamics of electorally relevant phenomena. The questions we selected address political parties, voter turnout on Election Day, other forms of behavioral participation in politics, and variables that may be useful in explaining turnout and participation. We considered many questions that, in the end, we could not include. Although we looked for consensus among scholars with many interests, we were particularly swayed by arguments based on hard evidence, such as results from previous surveys or strong theory, documenting the value of a proposed question.


Most of the questions proposed to us for inclusion in the NLS surveys were drawn from questionnaires used in prior ANES surveys. All previous ANES questions were themselves the product of calls for feedback to the ANES user community and have undergone extensive theoretical and empirical review. The planning and execution of each ANES study take place during the two years prior to the field work and are a collaborative effort involving the Principal Investigators, the ANES Board of Overseers, and the research community. The ANES Board of Overseers is an advisory committee of prominent scholars from across the United States. They come from various disciplines, though a plurality are political scientists. The current membership can be viewed here: http://www.electionstudies.org/overview/anes_board.htm


Respondent Burden


For the lengthy, high response rate surveys ANES conducts, each respondent is interviewed twice during an election period – once before the election and once afterwards. Interview lengths have varied from year to year, with anywhere from 30 to 180 minutes of questions asked of each respondent. Respondent break-offs (partial interviews) are very rare.


We are not aware of any evidence that respondents in past ANES surveys have found any of the proposed NLS questions to be offensive or distasteful. Our interviewers are trained to record such comments about the respondent experience. Respondents have been given information for contacting IRB representatives if they have concerns about our surveys. The IRB reports back to us about such instances, and we are not aware of any IRB reports about concerns regarding the NLS questions we are suggesting.

Questions for the NLSY79 Main Youth Survey and Young Adult Survey (Ages 15-20)


The questions we propose for the upcoming NLSY79 main youth survey and for age-appropriate respondents in the NLSY79 Young Adult (ages 15-20) survey are a subset of those included in the questionnaire for the grant-funded sample in the most recent YA survey. To maximize comparability, we ask that these questions be administered in exactly the same manner (with response choice order randomization where it was done previously) as in the grant-funded sample of the YA 2006 survey (all subsequent references in this memo to the 2006 YA survey apply to the grant-funded sample only). Below each question that follows, we explain the rationale for it.




ATT-POL-77. In talking to people about elections, we often find that a lot of people were not able to vote because they were sick or they just didn't have time or for some other reason. Which of the following statements best describes you: One, I did not vote in the November 2006 election. Two, I thought about voting in the November 2006 election, but didn't. Three, I usually vote, but didn't vote in the November 2006 election. Or four, I am sure I voted in the November 2006 election.


1. I DID NOT VOTE IN THE NOVEMBER 2006 ELECTION

2. I THOUGHT ABOUT VOTING IN 2006, BUT DIDN'T

3. I USUALLY VOTE, BUT DIDN'T IN 2006

4. I AM SURE I VOTED


ATT-POL-77 measures turnout in a previous election. It follows the general format developed by the American National Election Studies and used in its production surveys since 1952. Turnout is important as a measure of civic engagement, and much of what is known about it, including its propensity to change over the life course, is based on versions of ATT-POL-77 asked on the ANES over the past five decades (Miller and Shanks 1996; Wolfinger and Rosenstone 1980).


Many scholars are concerned that answers to simple and direct voter turnout questions (“Did you vote?”) are subject to a reporting bias: over-reporting of turnout among a small set of people who usually vote but did not in this election (McDonald 2003; Presser and Traugott 1992). The wording of the question we propose is designed to minimize this bias. It includes a recitation of common reasons for not voting as a means of reducing over-reporting of turnout. This expansion of the choice set allows a more detailed set of responses than “yes” and “no”; it was first proposed and tested in the 1994 ANES Pilot Study and has been examined experimentally in ANES studies multiple times since then. The new response choice set follows from the “source confusion” perspective in psychology: its purpose is to minimize confusion of memories of having voted in other elections, or having planned to vote in this election, with memories of actually voting in this election (Belli et al. 1994). Respondents who are uncertain about their turnout status, confuse their recent turnout behavior with past or typical behavior, are concerned about the social desirability of their response, or are inclined to misreport outright have the option of providing an answer that signals uncertainty without giving an outright “no.” The broad set of response choices thus minimizes misreporting.


Experimental comparisons of this question with simple, direct questions have been conducted in multiple ANES surveys, most recently in the 2004 ANES post-election survey. No respondents refused to answer either version of the question. The simple, direct question resulted in 82% “yes (I did vote)” responses and 18% “no (I did not vote)” responses. The new, longer version of the question resulted in 75% “I am sure I voted” responses, 4% “I usually vote, but didn’t this time,” 8% “I thought about voting this time, but didn’t,” and 13% “I did not vote” responses. An analysis of the new choice set using data from an experiment in the 2002 ANES showed an 8% reduction in over-reporting, primarily among those least engaged with politics (Duff et al. 2004). Thus, the long question appears to provide more accurate measurements.


A version of this question, YASR-77, was included in the YA 2006 study, and the distribution of answers was:

[N = 1116] I did not vote in the 2004 presidential election

[N = 531] I thought about voting in 2004, but didn’t

[N = 143] I usually vote, but didn’t in 2004

[N = 1448] I am sure I voted

[N = 10] don’t know response

[N = 4] refusal to answer


Thus, almost no respondents declined to answer.
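As a quick arithmetic check on that claim, the YA 2006 counts reported above imply a combined item-nonresponse rate well under one percent. The sketch below (variable names are ours, not part of the survey instrument) computes it:

```python
# Response counts for YASR-77 in the YA 2006 study, as reported above.
counts = {
    "did not vote": 1116,
    "thought about voting, but didn't": 531,
    "usually vote, but didn't": 143,
    "sure I voted": 1448,
    "don't know": 10,
    "refusal": 4,
}

total = sum(counts.values())                          # 3,252 respondents
declined = counts["don't know"] + counts["refusal"]   # 14 nonresponses

print(f"{declined} of {total} respondents ({declined / total:.2%})")
# → 14 of 3252 respondents (0.43%)
```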



ATT-POL-78. Generally speaking, do you usually think of yourself as {ROT_PARTY}, an Independent, or what?


In this question, the placeholder {ROT_PARTY} is for the terms “Democrat” or “Republican.” The order in which these terms appear in the question is rotated randomly to reduce response choice order effects. So, some respondents hear “Republican” first, while others hear “Democrat” first.
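The randomized rotation described above can be sketched in a few lines. This is an illustrative mock-up, not the actual CAPI instrument logic; the function name and wording variables are our own:

```python
import random

def fill_rot_party(rng: random.Random) -> str:
    """Fill the {ROT_PARTY} placeholder, randomly rotating the order of
    the two party terms to reduce response choice order effects."""
    terms = ["a Democrat", "a Republican"]
    rng.shuffle(terms)  # roughly half of respondents hear each order first
    return ("Generally speaking, do you usually think of yourself as "
            f"{terms[0]}, {terms[1]}, an Independent, or what?")

print(fill_rot_party(random.Random()))
```

Over many interviews, each ordering is read to about half the sample, so any effect of hearing one party named first averages out across respondents.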


ATT-POL-78A. A strong {party} or a not very strong {party}?


In this version of the follow up question, respondents who answered “Republican” or “Democrat” are asked about the strength of their partisan attachment.


ATT-POL-78B. What party?


In this version of the follow-up question, respondents who did not choose one of the parties in the initial question are given an opportunity to enter another party name.


ATT-POL-78C. Do you think of yourself as {ROT_CLOSER}, or equally close to both?


This version of the follow-up is asked of respondents who indicated no party preference. ROT_CLOSER is a placeholder for the term "closer to the Democratic party, closer to the Republican party" or its inverse. Again, the order is rotated randomly to reduce response choice order effects.


ATT-POL-78 measures identification with political parties. This variable has been a staple of the American National Election Studies since its inception and has been the focus of numerous landmark studies in political science (e.g., Campbell, Converse, Miller, and Stokes 1960; Miller and Shanks 1996).


These questions were most recently administered in the 2004 ANES pre-election survey. In response to ATT-POL-78, 29% answered Republican, 32% answered Democrat, 33% answered Independent, 1% answered a different party, and 5% indicated no preference. Eight persons answered “don’t know” and only two persons refused to answer. Those who indicated a party identification of Republican or Democrat received ATT-POL-78A, with 54% answering “strong” and 45% answering “not very strong” – two persons answered “don’t know,” and one person refused to answer. The 33% of the sample that answered Independent received ATT-POL-78C, in which 29% responded feeling closer to the Republican party and 44% responded feeling closer to the Democratic party – 26% volunteered that they felt closer to neither, four persons said they didn’t know, and no persons refused to answer.


This question, called YASR-78, was included in the YA 2006 study, and the distribution of responses was:

[N = 1234] Democrat

[N = 510] Republican

[N = 833] Independent

[N = 13] Other party (volunteered)

[N = 471] No preference (volunteered)

[N = 176] don’t know response

[N = 15] refusal to answer


In preliminary YA 2006 data, YASR-78A had this distribution:

[N = 694] Strong

[N = 510] Not very strong

[N = 11] don’t know response

[N = 0] refusal to answer


In response to YASR-78B in the YA 2006 study, seven persons were able to name a specific party, whereas six answered “don’t know.”


YASR-78C was also included in the YA 2006 study, and the distribution of answers was:

[N = 118] Closer to Republican Party

[N = 297] Closer to Democratic Party

[N = 517] Equally Close

[N = 422] Never (volunteered)

[N = 139] don’t know response

[N = 15] refusal to answer





ATT-POL-79. How often do you follow what's going on in politics? Always, most of the time, about half the time, once in a while, or never.


ATT-POL-79 measures the extent to which people follow politics. Versions of the question have been in use in ANES surveys for decades. The variable is widely used in studies of politics and correlates with measures of turnout and political participation. The wording of this version of the question reflects improvements based on recent developments in questionnaire design. In particular, the response options have been changed to categories for which there is evidence of greater respondent comprehension and differentiation (Krosnick and Fabrigar 2007).


In the 2004 ANES, 296 persons said they follow what’s going on in government and public affairs “most of the time,” 431 said “some of the time,” 238 said “only now and then,” and 98 said “hardly at all,” with two persons answering “don’t know” and one refusing to answer.


This question, called YASR-79, was included in the YA 2006 study, and the distribution of responses was:

[N = 203] Always

[N = 567] Most of the time

[N = 658] About half the time

[N = 1184] Once in a while

[N = 623] Never

[N = 8] don’t know response

[N = 9] refusal to answer




ATT-POL-80. How often does the federal government do what most Americans want it to do? Always, most of the time, about half the time, once in a while, or never.


ATT-POL-80 is one of many ANES questions that attempt to measure respondent attitudes about government policy. In 2004, less than one percent of respondents refused to answer this sort of question in our survey. This particular question is included on the NLS battery because it is easily comprehended and can be asked and answered quickly. We expect answers to this question to vary in interesting ways over the adult life cycle.


This question, called YASR-80, was included in the YA 2006 study, and the distribution of responses was:

[N = 34] Always

[N = 259] Most of the time

[N = 957] About half the time

[N = 1194] Once in a while

[N = 688] Never

[N = 113] don’t know response

[N = 7] refusal to answer




ATT-POL-84. Generally speaking, how often can you trust other people? Always, most of the time, about half the time, once in a while, or never.


ATT-POL-84 is a widely used ANES question to measure the extent to which people trust others. The question is particularly relevant to “trusting people we don’t know” (Uslaner 2002). In many studies, trust has been shown to be linked to civic engagement (e.g., Rahn and Transue 1998). People who are less trusting tend to opt out of many civic interactions and tend to be different from others in terms of their views about the range of services that governments can provide to citizens and their evaluations of economic, social and political variables. We expect trust to vary in interesting ways over the adult life cycle.


A similar question has been used in the ANES time series surveys since 1992. In the 2004 ANES post-election interview, 471 persons answered that “most people can be trusted,” and 591 persons answered “you can’t be too careful in dealing with people,” with two don’t know responses and two refusals.


This question, called YASR-84, was included in the YA 2006 study, and the distribution of responses was:

[N = 65] Always

[N = 813] Most of the time

[N = 959] About half the time

[N = 970] Once in a while

[N = 436] Never

[N = 5] don’t know response

[N = 4] refusal to answer



Questions for the NLSY79 Young Adult Survey (Ages 21 and Older)


This set of questions is our second contribution to the YA survey. Our initial questions, asked of the grant-funded sample during the 2006 wave, established important baseline measurements that are likely to be of great interest to ANES users and to users of the NLS who are interested in interactions between political and other social factors. The list that follows repeats some of our 2006 questions, and the remainder replaces questions from the last wave for which measurements in the next YA wave are unlikely to provide new information.


These questions cover the content we would like to address. The questions asked on the previous wave (1-6) should be asked in the identical form in the next wave – with the caveat that the turnout question (1) should refer to the 2006 general election rather than the presidential election of 2004. Regarding the new questions (7-15), we would be grateful for your feedback to help improve wording where you think that could be done.


Since the first four questions and the sixth one here are identical to the five questions proposed for the NLSY79, we offer question-specific justifications only for the remaining questions.



YASR-77. In talking to people about elections, we often find that a lot of people were not able to vote because they were sick or they just didn't have time or for some other reason. Which of the following statements best describes you: One, I did not vote in the November 2006 election. Two, I thought about voting in the November 2006 election, but didn't. Three, I usually vote, but didn't vote in the November 2006 election. Or four, I am sure I voted in the November 2006 election.


1. I DID NOT VOTE IN THE NOVEMBER 2006 ELECTION

2. I THOUGHT ABOUT VOTING IN 2006, BUT DIDN'T

3. I USUALLY VOTE, BUT DIDN'T IN 2006

4. I AM SURE I VOTED



YASR-78. Generally speaking, do you usually think of yourself as {ROT_PARTY}, an Independent, or what?


In this question, the placeholder {ROT_PARTY} is for the terms “Democrat” or “Republican.” The order in which these terms appear in the question is rotated randomly to reduce response order effects. So, some respondents hear “Republican” first, while others hear “Democrat” first.


YASR-78A. A strong {party} or a not very strong {party}?


In this version of the follow up question, respondents who answered “Republican” or “Democrat” are asked about the strength of their partisan attachment.


YASR-78B. What party?

In this version of the follow-up question, respondents who did not choose one of the parties in the initial question are given an opportunity to enter another party name.


YASR-78C. Do you think of yourself as {ROT_CLOSER}, or equally close to both?


This version of the follow-up is asked of respondents who indicated no party preference. ROT_CLOSER is a placeholder for the term "closer to the Democratic party, closer to the Republican party" or its inverse. Again, the order is rotated randomly to reduce response order effects.



YASR-79. How often do you follow what's going on in politics? Always, most of the time, about half the time, once in a while, or never.


On questions that use this set of response options, we rotate the order in which the options are read to reduce response order effects.



YASR-80. How often does the federal government do what most Americans want it to do? Always, most of the time, about half the time, once in a while, or never.



YASR-81. How often is politics so complicated that you don't really understand what's going on? Always, most of the time, about half the time, once in a while, or never.


YASR-81 is an indicator of political efficacy – the extent to which people feel able to engage in politics successfully. A sense of political efficacy is strongly associated with the behavior of political participation (Verba and Nie 1972; Verba, Schlozman, and Brady 1995), and efficacy has been measured on the ANES for five decades. While responses to this question correlate with education and income, the correlations are far from perfect. This question serves many purposes in analyses and is considered to capture a distinctive component of why some people who might otherwise be expected to be interested in a range of political, economic, and social matters simply opt out. This is also a factor that we might expect to vary in interesting ways over the adult life cycle.


A variant of this question has been asked in ANES since 1952. In the 2000 ANES study, in regards to the statement “Sometimes politics and government seem so complicated that a person like me can’t really understand what is going on,” 25% of respondents said “agree strongly,” 46% answered “agree somewhat,” 7% answered “neither agree nor disagree,” 13% said “disagree somewhat,” and 9% said “disagree strongly,” with four persons answering “don’t know” and three respondents refusing to answer.


YASR-81 was included in the YA 2006 study, and the distribution of responses was:

[N = 380] Always

[N = 665] Most of the time

[N = 757] About half the time

[N = 993] Once in a while

[N = 414] Never

[N = 34] don’t know response

[N = 9] refusal to answer




YASR-84. Generally speaking, how often can you trust other people? Always, most of the time, about half the time, once in a while, or never.



YASR-85A. How interested are you in information about what’s going on in government and politics? Extremely interested, very interested, moderately interested, slightly interested, or not interested at all?


1. EXTREMELY INTERESTED

2. VERY INTERESTED

3. MODERATELY INTERESTED

4. SLIGHTLY INTERESTED

5. NOT INTERESTED AT ALL

8. DON’T KNOW

9. REFUSED


YASR-85A is an ANES question that has been asked many times and measures the extent to which people are interested in information about politics. This question is only modestly correlated with responses to YASR-79 above (how often respondents follow politics). Some respondents are very interested in information but do not follow politics often. From their inception, ANES surveys have included multiple questions measuring the extent to which respondents pursue various kinds of social information. This particular question is included on the NLS battery because it is easily comprehended and can be asked and answered quickly. We expect answers to vary in interesting ways over the adult life cycle.


A version of this question asking about interest in political campaigns has been asked on ANES surveys since 1952. In 2004, 498 persons indicated they were “very much interested,” 528 persons indicated they were “somewhat interested,” and 186 persons indicated they were “not much interested.” No respondents answered “don’t know,” and no respondents refused the question.



YASR-85B. Generally speaking, do you believe that you have a duty to vote in every national election, or do you believe that you do not have a duty to vote in every national election?


Unlike the previous questions, YASR-85B has not appeared on previous ANES surveys. It was requested by several scholars in our recent call for proposals. It has several motivations. One involves the extent to which people perceive socially valuable acts such as voting as a matter of duty or a matter of choice. Another has to do with the role of economic models of politics. The use of these models is controversial, with their application to the domain of turnout being particularly so. Since any individual voter is extremely unlikely to affect the outcome of a mass election, some economic models generate the conclusion that voting is a suboptimal behavior. Other models reach different conclusions. One set of models, with the most prominent example being Riker and Ordeshook (1968), predicts significant turnout, but only after assuming that citizens gain non-instrumental utility from taking actions that fulfill a sense of civic duty. The extent to which respondents feel that an act such as voting is a matter of duty can address both the duty and modeling issues. Responses are also likely to vary over the life cycle.



YASR-85C. Do you ever talk with friends, family, co-workers, or other people about political events?


YASR-85CA. IF YES: “During a typical week, how many days do you talk with anyone about political events?”


YASR-85C serves multiple purposes. First, it serves some scholars as a measure of civic engagement in the context of communication about political issues. It also reflects the growing interest in social networks. Again, ANES asks an extensive battery of questions to measure how often and to whom people talk about social issues. This single question is likely to capture important variations in the extent to which people present their views to others and are exposed to other worldviews. Such variations, in turn, are likely to affect how people evaluate a wide range of social and political phenomena. We expect answers to vary in interesting ways over the adult life cycle.


Variants of these questions have been asked on ANES studies for decades. In the 2004 ANES post-election survey, 850 persons said they had discussed politics with family or friends, and 215 said they had never done so, with no persons answering “don’t know” and only one person refusing.



YASR-86A. If a person works hard to be successful in life, do you think the person will probably get more rewards than he or she deserves, fewer rewards than he or she deserves, or about the amount of rewards he or she deserves?


YASR-86AA. IF MORE: A great deal more, a moderate amount more, or a little more?


YASR-86AB. IF LESS: A great deal less, a moderate amount less, or a little less?


YASR-87A. Would it be good, bad, or neither good nor bad if the federal government were to make sure that everyone has an equal opportunity to succeed in life?


YASR-87AA. If good: Extremely good, moderately good, or slightly good?


YASR-87AB. If bad: Extremely bad, moderately bad, or slightly bad?


YASR-88A. Do you think it is a good idea, a bad idea, or neither good nor bad for the federal government to provide financial help to people who have serious financial problems because they lost their job?


YASR-88AA. If good: Extremely good, moderately good, or slightly good?


YASR-88AB. If bad: Extremely bad, moderately bad, or slightly bad?


YASR-86A, YASR-87A and YASR-88A address a theme common to previous ANES surveys – equal opportunity and the extent to which the federal government intervenes when inequality occurs. Many national conversations about important domestic policy issues are framed in terms of competing narratives about why inequality occurs. Often linked to particular narratives about the causes of inequality are conclusions about the desirability of governmental action.


These three questions are designed to document respondent perceptions of core themes in a wide range of domestic policy debates. YASR-86A solicits a respondent’s views about causes of inequality. YASR-87A gauges opinions about the government ensuring equality. YASR-88A focuses on governmental intervention in a specific circumstance.


These exact questions have not been asked before, but ANES has asked many others like them. Variants of YASR-86A have been asked of ANES respondents for decades.  In the 2004 ANES post-election survey, respondents were asked whether they agree with the statement, “It’s really a matter of some people not trying hard enough; if blacks would only try harder they could be just as well off as whites.”  To this statement, 227 respondents replied “agree strongly,” 349 said “agree somewhat,” 161 said “neither agree nor disagree,” 208 answered “disagree somewhat,” and 113 said “disagree strongly,” with six “don't know” responses and two refusals to answer.  In the 1986 ANES, respondents were asked about the statement that “any person who is willing to work hard has a good chance of succeeding,” with 481 “agree strongly” responses, 446 answering “agree somewhat,” 59 saying “neither agree nor disagree,” 75 saying “disagree somewhat,” and 17 answering “disagree strongly,” with only three answering “don't know” and nine refusing to answer or providing responses that could not be coded.  In the 1972 ANES, 845 persons thought the statement "becoming a success is a matter of hard work; luck has little or nothing to do with it" was closer to the way they feel than the statement "getting a good job depends mainly on being in the right place at the right time," with 19 "don't know" responses and 26 refusals or responses that could not be coded. 

 

As to YASR-87A, the 2004 ANES asked respondents their agreement with the statement that “Our society should do whatever is necessary to make sure that everyone has an equal opportunity to succeed.” 624 respondents answered “agree strongly,” 317 said “agree somewhat,” 68 said “neither agree nor disagree,” 36 answered “disagree somewhat,” and 19 said “disagree strongly,” with one “don't know” and one refusal to answer.



In the YA 2006 study, to leverage the scholarly potential in linking data from NLS parents and children, we included questions about respondents’ perceptions of their parents’ partisanship and about how often their parents discussed politics when the respondents were growing up.


We are requesting that questions YASR-89 through YASR-93 be asked of respondents who are asked the political question sequence for the first time in 2008.


Since we do not expect any significant change in responses between 2006 and 2008, these questions need not be asked of respondents who answered these questions in 2006.


YASR-89. When you were growing up, how often did you hear the adults in your household talking about politics? Always, most of the time, about half the time, once in a while, or never.


YASR-89 was included in the YA 2006 study, and answers were distributed as follows:

[N = 87] Extremely often

[N = 202] Very often

[N = 451] Moderately often

[N = 1399] Once in a while

[N = 1091] Never

[N = 17] don’t know response

[N = 5] refusal to answer


YASR-90. When you were growing up, did your mother think of herself mostly as {ROT_PARTY}, an Independent, or what?


YASR-90A. What party?


YASR-90 (mother) was included in the YA 2006 study, and answers were distributed as follows:

[N = 1163] Democrat

[N = 493] Republican

[N = 395] Independent

[N = 4] Other party (volunteered)

[N = 242] No preference (volunteered)

[N = 945] don’t know response

[N = 10] refusal to answer


In YASR-90A, all four respondents who said “other party” answered “don’t know” when asked the name of the party.


YASR-91. How often did she follow what was going on in politics? Always, most of the time, about half the time, once in a while, or never.


YASR-91 (mother) was also included in the YA 2006 study, and answers were distributed as follows:


[N = 139] Always

[N = 418] Most of the time

[N = 522] About half the time

[N = 1106] Once in a while

[N = 611] Never

[N = 447] don’t know response

[N = 9] refusal to answer


YASR-92. Think about your father, stepfather, or someone else who was most like a father to you when you were growing up. Did he think of himself mostly as {ROT_PARTY}, an Independent, or what?


YASR-92A. What party?


YASR-92 (father figure) was included in the YA 2006 study, and answers were distributed as follows:

[N = 1023] Democrat

[N = 557] Republican

[N = 378] Independent

[N = 5] Other party (volunteered)

[N = 176] No preference (volunteered)

[N = 122] Respondent had no father figure (volunteered)

[N = 982] don’t know response

[N = 9] refusal to answer


In YASR-92A, of the five respondents who said “other party,” one was able to provide a specific party name and the other four answered “don’t know.”

YASR-93. How often did he follow what was going on in politics? Always, most of the time, about half the time, once in a while, or never.


YASR-93 (father figure) was also included in the YA 2006 study, and answers were distributed as follows:


[N = 288] Always

[N = 531] Most of the time

[N = 502] About half the time

[N = 766] Once in a while

[N = 454] Never

[N = 579] don’t know response

[N = 10] refusal to answer




References

Belli, Robert, Santa Traugott, and Steven J. Rosenstone. 1994. Reducing Over-Reporting of Voter Turnout: An Experiment using a ‘Source Monitoring’ Framework, ANES Technical Report Series, No. nes010153.

ftp://ftp.electionstudies.org/ftp/nes/bibliography/documents/nes010282.pdf


Campbell, Angus, Philip E. Converse, Warren E. Miller, and Donald E. Stokes (1960). The American Voter. New York: John Wiley.


Duff, Brian, Michael J. Hanmer, Won-ho Park, and Ismail K. White. 2004. How Good is This Excuse?: Correction of Overreporting of Voter Turnout Bias in the 2002 American National Election Study, ANES Technical Report Series, No. nes010872.

ftp://ftp.electionstudies.org/ftp/nes/bibliography/documents/nes010872.pdf


Krosnick, Jon A. and Leandre R. Fabrigar. 2007. “Labeling the Points on Rating Scales. Words? Numbers?” To be included in Designing Great Questionnaires: Insights from Psychology. Forthcoming, Oxford University Press.


McDonald, Michael P. (2003). On the Over-Report Bias of the National Election Study. Political Analysis, 11, 180-186.


Miller, Warren E., and J. Merrill Shanks (1996). The New American Voter. Cambridge, MA: Harvard University Press.


Presser, Stanley, and Michael Traugott (1992). Little White Lies and Social Science Models: Correlated Response Errors in a Panel Study of Voting. Public Opinion Quarterly, 56, 77-86.


Rahn, Wendy M., and John E. Transue (1998). Social Trust and Value Change: The Decline of Social Capital in American Youth, 1976-1995. Political Psychology, 19, 545-563.


Riker, William H., and Peter C. Ordeshook (1968). A Theory of the Calculus of Voting. American Political Science Review, 62, 25.


Uslaner, Eric M. (2002). The Moral Foundations of Trust. New York: Cambridge University Press.


Verba, Sidney, and Norman Nie (1972). Participation in America. New York: Harper.


Verba, Sidney, Kay Lehman Schlozman, and Henry E. Brady (1995). Voice and Equality: Civic Voluntarism in American Politics. Cambridge, MA: Harvard University Press.


Wolfinger, Raymond E., and Stephen J. Rosenstone (1980). Who Votes? New Haven, CT: Yale University Press.













Attachment 11—23rd Wave (2008) Proposed
Interview Schedules


The following questionnaires are attached under separate cover:


1. NLSY79 Questionnaire (main youth cohort)

--includes Mother Supplement questions for mothers of children under age 15


2. NLSY79 Young Adult Questionnaire (ages 15–20)


3. NLSY79 Child Supplement (children under age 15)

--includes Child Self-Administered Supplement (children ages 10–14)


1 Three series were extracted from the FactFinder website: PCT12, PCT12J, and PCT12H, which track the U.S. population’s age broken down by sex and race/ethnicity. PCT12 tracks the total population, PCT12J tracks black non-Hispanics, and PCT12H tracks the Hispanic population. Non-black, non-Hispanic figures are computed by subtracting PCT12J and PCT12H from PCT12. All data come from the Census 2000 Summary File 1 (SF 1), which combines both short and long form answers and does not subsample any information.

File Type: application/msword
Author: Center for Human Resource Research
Last Modified By: Amy Hobby
File Modified: 2007-10-12
File Created: 2007-10-12
