National Longitudinal Survey of Youth 1997
OMB Control Number 1220-0157
OMB Expiration Date: 8/31/2022
Summary
This package requests clearance for an interim supplement between Rounds 19 and 20 of the National Longitudinal Survey of Youth 1997 (NLSY97) on the effects of the coronavirus pandemic. The main NLSY97 sample includes 8,984 respondents who were born in the years 1980 through 1984 and lived in the United States when the survey began in 1997. Sample selection was based on information provided during the first round of interviews. This cohort is a representative national sample of the target population of young adults. The sample includes an overrepresentation of blacks and Hispanics to facilitate statistically reliable analyses of these racial and ethnic groups. Appropriate weights are provided so that the sample components can be combined in a manner that aggregates to the overall U.S. population of the same ages.
The main survey is funded primarily by the U.S. Bureau of Labor Statistics at the U.S. Department of Labor. Additional funding has been provided in some years by the Departments of Health and Human Services, Education, Defense, and Justice, and the National Science Foundation. The Bureau of Labor Statistics has overall responsibility for the project. The BLS has contracted with the Center for Human Resource Research (CHRR) at the Ohio State University to conduct the survey. The contractor is responsible for survey design, interviewing, data preparation, documentation, and the preparation of public-use data files.
The data collected in this survey are part of a larger effort that involves repeated interviews administered to a number of cohorts in the U.S. Many of the questions are identical or very similar to questions previously approved by OMB that have been asked in other cohorts of the National Longitudinal Surveys (NLS). This supplement will be the first collection in which the NLSY97 sample members will be offered internet interviews.
National Longitudinal Survey of Youth 1997 (NLSY97)
OMB Control No. 1220-0157
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
This statement covers the coronavirus pandemic supplement, an interim fielding between Rounds 19 and 20 of the National Longitudinal Survey of Youth 1997 (NLSY97). The NLSY97 is a nationally representative sample of persons who were ages 12 to 16 on December 31, 1996. The Bureau of Labor Statistics (BLS) contracts with external organizations to interview these youths in order to study how young people make the transition from full-time schooling to the establishment of their families and careers, and to follow their labor market outcomes over the life cycle. Interviews were conducted annually through Round 15; beginning with Round 16, respondents have been interviewed on a biennial basis. The longitudinal focus of this survey requires information to be collected about the same individuals over many years in order to trace their education, training, work experience, fertility, income, and program participation.
The mission of the Department of Labor (DOL) is, among other things, to promote the development of the U.S. labor force and the efficiency of the U.S. labor market. The BLS contributes to this mission by gathering information about the labor force and labor market and disseminating it to policymakers and the public so that participants in those markets can make more informed and, thus more efficient, choices. The charge to the BLS to collect data related to the labor force is extremely broad, as reflected in Title 29 USC Section 1:
“The general design and duties of the Bureau of Labor Statistics shall be to acquire and diffuse among the people of the United States useful information on subjects connected with labor, in the most general and comprehensive sense of that word, and especially upon its relation to capital, the hours of labor, the earnings of laboring men and women, and the means of promoting their material, social, intellectual, and moral prosperity.”
The collection of these data contributes to the BLS mission by aiding in the understanding of labor market outcomes faced by individuals in the early stages of career and family development. See Attachment 1 for Title 29 USC Sections 1 and 2.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
The major purpose of the longitudinal data collection is to examine the transition from school to the labor market and into adulthood. The study relates each respondent’s educational, family, and community background to his or her success in finding a job and establishing a career. This interim collection serves a unique purpose: to collect data that will permit the study of long-term effects on labor market outcomes from short-term labor market disruptions due to the coronavirus pandemic. As such, this survey will focus on current employment, employment disruptions over the past 12 months, schooling of children who live in the household, and health.
During Round 1, the study included a testing component sponsored by the Department of Defense that assessed the aptitude and achievement of the youths in the study so that these factors can be related to career outcomes. This study, begun when most participants were in middle school or high school, has followed them as they enter college or training programs and join the labor force. Continued biennial interviews will allow researchers and policymakers to examine the transition from school to work, including labor market outcomes over the life-cycle. This study will help researchers and policymakers to identify the antecedents and causes for difficulties some youths experience in making the school-to-work transition. By comparing these data to similar data from previous NLS cohorts, researchers and policymakers will be able to identify and understand some of the dynamics of the labor market and whether and how the experiences of this cohort of young people differ from those of earlier cohorts.
The NLSY97 has several characteristics that distinguish it from other data sources and make it uniquely capable of meeting the goals described above. The first of these is the breadth and depth of the types of information that are being collected. It has become increasingly evident in recent years that a comprehensive analysis of the dynamics of labor force activity requires a theoretical framework that draws on several disciplines, particularly economics, sociology, and psychology. For example, the exploration of the determinants and consequences of the labor force behavior and experience of this cohort requires information about (1) the individual’s family background and ongoing demographic experiences; (2) the character of all aspects of the environment with which the individual interacts, and the resources that the individual may use to affect that environment; (3) human capital inputs such as formal schooling and training; (4) a complete record of the individual’s work experiences; (5) the behaviors, attitudes, and experiences of family members, including spouses and children; and (6) a variety of social psychological measures, including attitudes toward specific and general work situations, personal feelings about the future, and perceptions of how much control one has over one’s environment.
A second major advantage of the NLSY97 is its longitudinal design. This design permits investigations of labor market dynamics that would not be possible with one-time surveys and allows directions of causation to be established with much greater confidence than cross-sectional analyses permit. Also, the considerable geographic and environmental information available for each respondent for each survey year permits a more careful examination of the impact that local labor market conditions have on the employment, education, and family experiences of this cohort.
Third, the supplemental samples of blacks and Hispanics make possible more detailed statistical analyses of those groups than would otherwise be possible.
The interim supplement leverages these advantages to provide a uniquely valuable source of information on the impacts of the coronavirus pandemic on the labor market outcomes of the NLSY97’s cohort. It measures employment and health outcomes that may be directly or indirectly affected by the pandemic. The value of these measures is enhanced by the ability to link them to the breadth of data collected in NLSY97 main rounds over the years that precede and follow, as well as by the ability to calculate separate measures among racial and ethnic subsamples. The supplement’s questions have also been chosen to allow for comparisons with coronavirus-related measures collected in Round 29 of the NLSY79.
The NLSY97 is part of a broader group of surveys that are known as the BLS National Longitudinal Surveys program. In 1966, the first interviews were administered to persons representing two cohorts, Older Men ages 45-59 in 1966 and Young Men ages 14-24 in 1966. The sample of Mature Women ages 30-44 in 1967 was first interviewed in 1967. The last of the original four cohorts was the Young Women, who were ages 14-24 when first interviewed in 1968. The survey of Young Men was discontinued after the 1981 interview, and the last survey of the Older Men was conducted in 1990. The Young and Mature Women surveys were discontinued after the 2003 interviews. The National Longitudinal Survey of Youth 1979 (NLSY79 – OMB Clearance Number 1220-0109), which includes persons who were ages 14-21 on December 31, 1978, began in 1979. The NLSY79 was conducted yearly from 1979 to 1994 and has been conducted every two years since 1994. One of the objectives of the National Longitudinal Surveys program is to examine how well the nation is able to incorporate young people into the labor market. These earlier surveys provide comparable data for the NLSY97.
The National Longitudinal Surveys are used by BLS and other government agencies to examine a wide range of labor market issues. The most recent BLS news release that examines NLSY97 data was published on May 5, 2020, and is available online at https://www.bls.gov/news.release/nlsyth.nr0.htm. In addition to BLS publications, analyses have been conducted in recent years by other agencies of the Executive Branch, the Government Accountability Office, and the Congressional Budget Office. The surveys also are used extensively by researchers in a variety of academic fields. A comprehensive bibliography of journal articles, dissertations, and other research that have examined data from all National Longitudinal Surveys cohorts is available at http://www.nlsbibliography.org/. More information about survey applications is provided in Attachment 2.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
The NLS program and its contractors have led the industry in survey automation and continue to use up-to-date methods for the NLSY97. This includes the continued use of computer-assisted interviewing (CAI) for the survey.
In this interim supplement, BLS proposes to collect data through a mixed-mode design combining a NIST 800-53-compliant, direct-link internet instrument and telephone interviews. All respondents will receive an advance letter describing the option to respond via the internet or by phone, along with an individualized web address for the survey. Additionally, sample members for whom we have a valid email address will be sent the individualized link via email. Mass texting will be used to encourage participation by phone. The web interview is not publicly available; only those with the direct link will be able to access the survey.
The NLSY97 has undergone mode changes in recent rounds. Through Round 17, the NLSY97 was primarily an in-person survey. In Round 18, the survey was converted to a predominantly telephone survey. Round 19 was fielded as a primarily telephone survey, with 96 percent of respondents participating by telephone. We anticipate that at least one-third of interviews will be completed by internet for the first time during the interim supplement.
A new instrument will be created for this interim supplement. The instrument is substantially shorter and contains no preloaded information. We anticipate that the internet interviews will be approximately 8 minutes long and the phone interviews with identical content will be 14 minutes long, for an average length of 12 minutes. The questionnaire items are available in Attachment 5.
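As a rough consistency check on the stated average interview length, the following minimal sketch assumes the one-third internet share anticipated above (the exact split is an assumption; the document guarantees only "at least one-third"):

```python
# Weighted average interview length under the assumed mode split:
# one-third of interviews by internet (8 minutes),
# two-thirds by telephone (14 minutes).
web_share = 1 / 3
avg_minutes = web_share * 8 + (1 - web_share) * 14
print(round(avg_minutes))  # 12, matching the stated 12-minute average
```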
Item non-response rates may change with the change in mode. Sensitive items have consistently been found to be under-reported in interviewer-administered modes relative to self-administered modes (Tourangeau and Yan, Psychological Bulletin 2007, Vol. 133, No. 5, 859–883). For instance, the items on income and earnings may have higher response rates in the internet-based, self-administered interviews than in the telephone-based, interviewer-administered interviews.
We plan a targeted sequence of outreach methods based on the cooperativeness of the respondents in past rounds and the efforts it took to get to their final status in Round 19. We will attempt to complete these interviews by internet, with initial outreach extended by mail, internet, and text. For those sample members who do not complete via internet, outreach will be extended by telephone, both to encourage participation in the internet survey and to secure a telephone interview.
To limit the number of telephone interviews that will be conducted, we will employ practices similar to those used in Rounds 18 and 19 of the NLSY97. Telephone interviewing will not be available during the first three weeks of the field period; once it becomes available, field managers may issue prior approval for a telephone interview.
Since text messaging is a preferred mode of communication for the age group of the NLSY97 cohort, field interviewers are issued cell phones by the contractor to increase their responsiveness to text messages and emails. This reduces the response time to the respondents and accommodates the respondent’s preferred mode of communication. Additionally, in limited situations where the respondent does not have their own phone or the ability to borrow someone’s phone for the length of the interview, the project will send a pre-paid cell phone to the respondent with enough minutes to allow the respondent the opportunity to participate.
We do not know of a national longitudinal survey that samples this age bracket and explores an equivalent breadth of substantive topics including labor market status and characteristics of jobs, education, training, aptitudes, health, fertility, marital history, income and assets, participation in government programs, attitudes, sexual activity, criminal and delinquent behavior, household environment, and military experiences. Data collection for the National Longitudinal Study of Adolescent Health (Add Health) is less frequent and addresses physical and social health-related behaviors rather than focusing on labor market experiences. The studies sponsored by the National Center for Education Statistics do not include the birth cohorts 1980 through 1984. The Children of the NLSY79, also part of the NLS program, spans the NLSY97 age range and touches on many of the same subjects, but does not yield nationally representative estimates for these birth cohorts. Further, the NLSY97 is a valuable part of the NLS program as a whole, and other surveys would not permit the kinds of cross-cohort analyses that are possible using the various cohorts of the NLS program.
The repeated collection of NLSY97 information permits consideration of employment, education, and family issues in ways not possible with any other available data set. The combination of (1) longitudinal data covering the time from adolescence; (2) a focus on youths and young adults; (3) national representation; (4) large minority samples; and (5) detailed availability of education, employment and training, demographic, health, child outcome, and social-psychological variables make this data set and its utility for social science policy research on youth issues unique.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary, trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
None of the listed special circumstances apply.
By collecting data during an off-year of the biennially collected NLSY97, the interim supplement will meet two key needs specific to the coronavirus pandemic:
It will collect data that are unusual and unique to the period of collection. Although we expect that the coronavirus pandemic will have long-term effects on the labor market, we expect that some immediate experiences of respondents may be unique to the time period covered in the survey: current conditions at the time of collection and the 12 months preceding collection. Such experiences may include unique circumstances regarding employment arrangements, finances, and health of respondents. There is a reasonable expectation that these circumstances will change significantly between the collections of Round 19, the interim supplement, and Round 20. Less frequent data collection would risk a less complete characterization of these rapidly changing measures.
It will allow timely reporting of valuable statistics on the impact of the coronavirus pandemic. In addition to its long-term value, the data collected in the interim supplement are expected to provide a valuable snapshot of the experiences of the NLSY97 cohort at the time of collection. NLS expects to report these statistics in Fall 2021, to provide information of current interest to the public and support the timely development of effective agency programs. Less frequent collection – e.g. waiting until Round 20 to collect these data – would delay the dissemination of these data by more than a year.
Note that, prior to Round 16, the NLSY97 was conducted annually, and that frequency had been essential for accurately capturing the educational, training, labor market, and household transitions that young people typically experience. Starting in Round 16, data collection changed to a biennial schedule. As NLSY97 respondents age, they experience fewer educational and labor market transitions, which makes biennial data collection more feasible. It is reasonable to expect that the coronavirus pandemic will temporarily disrupt the relative stability that complements biennial collection.
Note also that, as shown in Table 1, the NLSY97 experienced a decline in response rate during Rounds 16, 17, and 18, as did the NLSY79 in its transition to biennial interviewing. Although the likely causes of the decline in Round 16 were manifold, including a pause in data collection related to a partial shutdown of the federal government and an extremely severe winter, the reduction in respondent contact inherent in the transition to biennial interviewing may have played a role. The decline in response in Round 18 was larger than expected and, in part, may be attributed to the change in primary mode from in-person to telephone; the rate rebounded in Round 19 (to 79.4 percent) after NLS made adjustments to its fielding procedures.
Table 1. NLSY79 and NLSY97 Response Rates Surrounding the Transition to Biennial Interviewing

| | NLSY79 Response Rate* (Year) (Round) | NLSY97 Response Rate* (Year) (Round) |
| 2 rounds prior to transition to biennial | 92.1 (1993) (R15) | 84.4 (2010) (R14) |
| Round prior to transition to biennial | 91.1 (1994) (R16) | 83.9 (2011) (R15) |
| First round after 2-year gap | 88.8 (1996) (R17) | 80.8 (2013) (R16) |
| 2nd round after transition to biennial | 86.7 (1998) (R18) | 80.6 (2015) (R17) |
| 3rd round after transition to biennial | 83.2 (2000) (R19) | 76.7 (2017) (R18) |

* Retention rates exclude deceased and out-of-sample cases.
8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years -- even if the collection-of-information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
One comment was received as a result of the Federal Register notice published in 85 FR 46187 on July 31, 2020. The comment, which was e-mailed to BLS on July 31, 2020, expressed the opinion that the survey does not benefit the citizens of the country. The National Longitudinal Surveys are widely used by researchers to learn about the functioning of the labor market and the multiple factors that affect it. They have supported several thousand published studies. These studies, in turn, are used widely by policymakers to formulate and adjust policies that represent the interests of the Nation. Additionally, the NLS program has partnered extensively with other Federal agencies to discover facts that are especially relevant to particular policies and programs.
There have been numerous consultations regarding the NLSY97. In 1988, the National Science Foundation sponsored a conference to consider the future of the NLS. This conference consisted of representatives from a variety of academic, government, and nonprofit research and policy organizations. The participants endorsed the notion of conducting a new youth survey. The NLSY97 incorporates many of the major recommendations that came out of that conference.
The NLS program also has a technical review committee that provides advice on interview content and long-term objectives. This group typically meets twice each year. Table 2 below shows the current members of the committee.
Table 2. National Longitudinal Surveys Technical Review Committee (2020)

| Shawn Bushway | Rockefeller College of Public Affairs and Policy, Department of Public Administration and Policy |
| Jennie Brand | Departments of Sociology and Statistics, University of California, Los Angeles |
| Sarah Burgard | Department of Sociology, University of Michigan |
| Judith Hellerstein | |
| David Johnson | Survey Research Center, University of Michigan |
| Michael Lovenheim | Departments of Economics, Industrial and Labor Relations, and Policy Analysis and Management, Cornell University |
| Nicole Maestas | Harvard Medical School |
| Melissa McInerney | Department of Economics, Tufts University |
| Kristen Olson | Department of Sociology, University of Nebraska-Lincoln |
| John Phillips | Division of Behavioral and Social Research, National Institute on Aging/NIH |
| Rebecca Ryan | Department of Psychology, Georgetown University |
| Jeffrey Smith | Department of Economics, University of Wisconsin |
9. Explain any decision to provide any payments or gifts to respondents, other than remuneration of contractors or grantees.
The NLSY97 is a long-term study in which the same respondents have been interviewed repeatedly since 1997. Because minimizing sample attrition is critical to sustaining this type of longitudinal study, respondents in all prior rounds have been offered financial and in-kind incentives as a means of securing their long-term cooperation and slowing the decline in response rates.
Evidence shows that incentives can have positive effects on both respondent and interviewer behavior and are therefore indispensable for reaching the project's response rate goals. Incentives result in conversions among those least likely to participate and quicker cooperation among those more likely to do so. Many interviewers find that having something to offer respondents, such as a monetary incentive, an additional in-kind offering, or new conversion materials, allows them to open a dialogue with formerly reluctant respondents. Interviewers want a variety of options to respond to the particular needs, issues, and objections of the respondent.
We propose an incentive plan for the COVID Supplement with three components, described below; selected definitions follow the component descriptions. Incentive payments will average $15 per completed interview across the full sample, with no respondent participating in the COVID Supplement interview receiving more than $25 or less than $10. Table 3 below provides example counts of incentive offers and payments.
Table 3. Incentive Plan

| Group Definition | Number of respondents (initial assignment) | Incentive Amount | Estimated Completion Rate | Number of interviews | Final Push Incentive Amount | Number offered final push | Estimated additional interviews |
| Base Incentive group | 4,500 | $10 | .70 | 3,150 | $25 | 1,000 | 557 |
| Targeted Incentive group | 4,200 | $20 | .34 | 1,428 | $25 | 2,772 | 139 |

Sample totals: 5,274 total interviews; $14.69 average incentive per completed interview.
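The sample totals in Table 3 can be reproduced from the per-group figures. A minimal sketch of that arithmetic follows; the final-push completion counts (557 and 139) are the table's own estimates and are taken as given rather than derived:

```python
# Recompute the Table 3 totals as a consistency check.
groups = [
    # (name, respondents, base incentive $, completion rate,
    #  final-push completions, final-push incentive $)
    ("Base", 4500, 10, 0.70, 557, 25),
    ("Targeted", 4200, 20, 0.34, 139, 25),
]

total_interviews = 0
total_cost = 0
for name, n, amount, rate, push_n, push_amount in groups:
    initial = round(n * rate)  # interviews completed at the initial incentive
    total_interviews += initial + push_n
    total_cost += initial * amount + push_n * push_amount

print(total_interviews)                          # 5274, matching Table 3
print(round(total_cost / total_interviews, 2))   # 14.69, matching Table 3
```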
Details of the Incentive Plan are as follows.
Targeted Completion Incentive (n=4200 respondents). Respondents who may be less likely to participate in the web survey would be offered a $20 incentive for survey completion. These respondents would include:
Individuals not known to have internet access (for whom participation may be more challenging), or
Individuals with historically lower rates of cooperation, or
Individuals in high priority analytic subgroups.
Base Completion Incentive (n=4500 respondents). Respondents not identified for the $20 targeted incentive above would be offered a $10 incentive for survey completion. These respondents would include:
Individuals believed to have internet access, AND
Individuals with historically high rates of cooperation, AND
Individuals outside of high priority analytic subgroups.
Final push (number to be determined during data collection). After the first 8 weeks of data collection, we propose to assess data collection progress to identify analytic or operational subgroups whose participation in the COVID Supplement may be most lagging relative to the overall sample. Respondents in these subgroups may be offered a total of $25 for survey completion (increasing their incentive from $10 or $20 depending on their initial classification). If required given budgetary constraints, a random sample of individuals in a designated subgroup may receive the increased offer (for example, one-third of respondents in a category).
Numbers of cases to be offered final push will be determined based on response to date (with greater numbers of cases offered final push the lower the initial completion rate).
Candidate demographic subgroups for this purpose will be defined by: gender, race/ethnicity, educational attainment, recent work experience (income, weeks worked), cognitive ability, and presence of children (any children < 6 years, children 6-18 years, no children in household) and combinations of these characteristics. Analytic subgroups will be considered for final push offer based on sub-group completion rate in the COVID Supplement, with smaller subgroups at higher likelihood of final push offers (to ensure adequacy of data for subgroup-specific analyses).
Operational subgroups will be defined as: individuals with whom no confirmed contact has been achieved as of the 9th week of data collection; individuals in ‘special’ fielding categories in Round 19, such as incarcerated respondents, individuals living outside of the U.S., individuals without known mailing addresses on file, or individuals requiring assistance to complete the main Youth interview; or individuals who have experienced a gap of at least 3 weeks since beginning the COVID Supplement interview without completing it. Operational subgroups will be considered for final push offers based on sub-group completion rate.
Definitions:
Cooperativeness will be defined using survey participation through Round 19, including numbers of recent consecutive rounds completed, number of attempts required for completion in Rounds 18 and 19, and indications of refusal to participate in Rounds 18 and 19.
High-priority analytic groups will be prioritized based on low completion rates in Rounds 18 or 19, or sample sizes that put them at risk of analytic insufficiency for supporting subgroup-specific estimates. Candidate analytic subgroups will be defined by: gender, race/ethnicity, educational attainment, recent work experience (income, weeks worked), cognitive ability, and presence of children (any children < 6 years, children 6-18 years, no children in household) and combinations of these characteristics.
Based on the expectation that 5,275 sample members complete the interim supplement, the expected incentive cost is $79,125.
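As a check, the stated cost follows directly from the $15 average incentive per completed interview described above:

```python
# Expected incentive cost: 5,275 expected completes at the
# planned $15 average incentive per completed interview.
expected_completes = 5275
average_incentive = 15
print(expected_completes * average_incentive)  # 79125
```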
Respondents will be able to receive their payments electronically through PayPal or electronic gift cards. Electronic payments were offered to all respondents in Rounds 18 and 19. We propose to offer them again to all respondents and to use electronic payment as the default for respondents who received electronic payment in Round 19. Approximately 57 percent of the respondents who completed the Round 19 interview received their payment electronically.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
The information that NLSY97 respondents provide is protected by the Privacy Act of 1974 (DOL/BLS – 17 National Longitudinal Survey of Youth 1997 (67 FR 16818)) and the Confidential Information Protection and Statistical Efficiency Act (CIPSEA). CIPSEA is shown in Attachment 3.
The Confidential Information Protection and Statistical Efficiency Act (CIPSEA) safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality for exclusively statistical purposes by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of the BLS.
Based on this law, the BLS provides respondents with the following confidentiality pledge/informed consent statement:
“Thank you for participating in the NLS COVID Survey being conducted by NORC on behalf of the Bureau of Labor Statistics (BLS). The survey is voluntary – there are no penalties for not answering any question.
This survey will help measure the impact of coronavirus on your employment and health. The survey is authorized under Title 29, Section 2, of the United States Code. We estimate the average interview will take about 12 minutes. The U.S. Office of Management and Budget has approved the questionnaire and has assigned 1220-0157 as the survey’s control number. This control number expires on 08/31/2022. Without OMB approval and this number, we would not be able to conduct this survey. If you have any comments regarding this survey or recommendations for reducing its length, send them to the Bureau of Labor Statistics, National Longitudinal Surveys, 2 Massachusetts Avenue, N.E., Washington, DC 20212.
The Bureau of Labor Statistics, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act (44 U.S.C. 3572) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.
BLS may release records to a contractor to compile non-individually identifiable data for use by the general public and federal agencies for research purposes. BLS may provide geographic information to researchers to conduct specific research projects which further the mission and functions of the agency. Such authorized researchers must sign a written agreement making them official agents of the Bureau of Labor Statistics and requiring them to protect the confidentiality of survey participants. These researchers are never provided with the personal identities of participants.”
BLS policy on the confidential nature of respondent identifiable information (RII) states that “RII acquired or maintained by the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that ensures the information will be used only for statistical purposes and will be accessible only to authorized individuals with a need-to-know.”
By signing a BLS Agent Agreement, all authorized agents employed by the BLS, the prime contractor, and associated subcontractors pledge to comply with the Privacy Act, CIPSEA, other applicable federal laws, and the BLS confidentiality policy. No interviewer or other staff member is allowed to see any case data until the BLS Agent Agreement, BLS Confidentiality Training certification, and Department of Labor Information Systems Security Awareness training certification are on file. Respondents will be provided an advance letter, shown in Attachment 4, which includes questions and answers about uses of the data, confidentiality, and burden. These questions and answers appear on the back of the letter that respondents will receive in advance of the interim supplemental interviews. This information will also be included in an email to respondents (Attachment 4a).
NLS contractors have safeguards to provide for the security of NLS data and the protection of the privacy of individuals in the sampled cohorts. These measures are used for the NLSY97 as well as the other NLS cohorts. Safeguards for the security of data include:
1. Like all federal systems, NLS and its contractors follow the National Institute of Standards and Technology (NIST) guidelines found in special publication 800-53 to ensure that appropriate security requirements and controls are applied to our system. This framework provides guidance, based on existing standards and best practices, for organizations to better understand, manage and reduce cybersecurity risk.
2. Respondents who self-administer using the web link will access the survey application using FIPS 140-2 compliant SSL/TLS encryption, so all data are encrypted in transit from keyboard to server. Telephone surveys are conducted from contractor-owned and -managed laptops. All laptops are within the NLS system boundary and have mobile device protection compliant with NIST 800-53, including whole-disk encryption as required by NIST. For both self-administered and telephone interviews, data entered into the survey application are stored directly in the NLS Oracle database, which, as described above, is protected by the NLS network boundary and access control mechanisms and is compliant with the NIST 800-53 rev. 4 moderate baseline to prevent unauthorized access to survey data.
3. Protection of computer files against access by unauthorized persons and groups. Especially sensitive files are secured with passwords and accessible only to a restricted set of users. Access to files is strictly on a need-to-know basis. Passwords change every 90 days.
Protection of the privacy of individuals is accomplished through the following steps:
1. The first question of the web survey confirms that respondents have received BLS’s confidentiality information, which respondents will have been sent in an advance letter and via email. The question will allow respondents to acknowledge that they have received the BLS confidentiality information and understand participating is voluntary. If a respondent indicates they did not receive the confidentiality pledge, it will be provided on the following screen, which will be skipped by everyone else. For respondents receiving a telephone interview, the interviewer ensures that the respondent has received the advance letter with the BLS confidentiality information and understands that participation is voluntary. If the confidentiality pledge has not been received, the interviewer will orally provide the information and obtain oral permission to conduct the interview.
2. Information identifying respondents is separated from the questionnaire and placed into a nonpublic database. Respondents are then linked to data through identification numbers.
3. After the final interview round, respondent identifier computer files will be destroyed.
4. The public-use version of the data, available on the Internet, masks data that are of sufficient specificity that individuals could theoretically be identified through some set of unique characteristics.
5. The Field Interviewer training manual includes targeted sections to increase protection of respondent confidentiality. These include an enhanced focus on confidentiality and data security, clear instructions on what field interviewers may or may not do when working cases, continued reminders of respondent confidentiality in field communications throughout the field period, and specific email, text and voice mail scripts for initial respondent outreach.
6. Three questions, asking for the identification of Highest Degree Completed, State of Birth, and Height, will be used to verify that the correct respondent has been reached. Confirmation of these values will be carried out during post-processing. Note that the NLS usually verifies a respondent’s date of birth at the beginning of its interviews; this check has been in place since Round 16. We have modified this procedure to avoid any appearance of data phishing in the web-based application.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
This interim supplement will contain far fewer sensitive questions than the main rounds of data collection, though we do propose to include some items on mental health, income, and financial well-being.
a.) Mental health
The literature linking mental health with various outcomes of interest to the NLSY97, including labor force participation, is fairly well established. We propose to include in the interim supplement the 7-item Center for Epidemiologic Studies Depression Scale (CES-D), a screening instrument for depression. This scale was included in Round 19 and has been fielded as part of the NLSY79 data collection in several rounds.
b.) Income, Assets, and Program Participation
The interim supplement asks all respondents about whether their income from work increased or decreased due to causes related to the Coronavirus pandemic. We will not collect a detailed list of income sources such as self-employment income, receipt of child support, interest or dividend payments, or income from rental properties as part of the interim supplement.
Respondents are free to refuse to answer any survey question, including the sensitive questions described above. Our experience has been that participants recognize the importance of these questions and rarely refuse to answer.
12. Provide estimates of the hour burden of the collection of information. The statement should:
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
If this request for approval covers more than one form, provide separate hour burden estimates for each form.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
The interim supplement field effort will seek to interview each respondent identified when the sample was selected in 1997. We will attempt to contact approximately 8,750 sample members who are not known to be deceased. BLS expects that interviews with approximately 5,275 of those sample members will be completed. The interview will be much shorter than the Round 19 interview. Based upon interview length in past rounds and timings from the Household Pulse Survey, we estimate the interview will average about 8 minutes when self-administered by internet and about 14 minutes when administered by telephone. We expect one-third of the interviews to be conducted by internet and the other two-thirds by telephone, for an overall average interview length of 12 minutes.
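The 12-minute overall average follows directly from the assumed one-third/two-thirds mode split; a minimal sketch of the weighted-average arithmetic, using only the figures stated above:

```python
from fractions import Fraction

# Expected shares of completed interviews by mode (stated above).
web_share = Fraction(1, 3)
phone_share = Fraction(2, 3)

# Average interview length by mode, in minutes.
web_minutes, phone_minutes = 8, 14

average_minutes = web_share * web_minutes + phone_share * phone_minutes
print(average_minutes)  # 12
```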
Interview length will vary across respondents, though less than in main rounds of data collection. For example, the core of the interview covers the current employment of the respondent and of the respondent’s spouse/partner. Those respondents who are not currently employed or who are unmarried and unpartnered will have shorter interviews. We aim to be comprehensive in the data we collect, and this leads to variation in the time required for the respondent to remember and relate the necessary information to the interviewer. For these reasons, the timing estimate is more accurate on average than for each individual case.
To minimize the effects of attrition, we will seek to complete interviews with living respondents from Round 1 regardless of whether the sample member completed an interview in intervening rounds.
Household burden will vary with the number of in-scope sample members present: a household with two sample members may require about 24 minutes, one with three sample members more than half an hour, and so forth. Although more than 1,800 households included multiple respondents at the time of the initial interview, by Round 19 most respondents have established their own households, and very few multiple-respondent households remain. We are sensitive to the fact that interviews in households with several sample members theoretically can pose interviewing problems, but that has not been our experience in recent rounds.
During the interim supplement field period, we will conduct validation interviews with no more than 2 percent of respondents to ascertain that the interview took place as the interviewer reported and to assess the quality of the data collected. These cases will be selected purposefully, based on data and assessments by survey management that indicate a field interviewer’s caseload merits further scrutiny. These validation interviews average about two minutes each. Reasons for a validation interview include missing audio in CARI files and a high rate of CARI recording refusals.
Based on our experience recording segments of the main interview in Rounds 11 through 13 and using those segments for data quality assurance in Rounds 14 through 19, respondents will be asked to provide their consent for the recording of segments of the interim supplemental interview.
“My computer is equipped to record this interview for quality control, research, testing, and training purposes. As always, your confidentiality is protected by Federal law and the policies of the Bureau of Labor Statistics and (Name of Contractor). May I continue with the recording?”
YES
NO
If the respondent objects to the recording of the interview, the interviewer will confirm to the respondent that the interview will not be recorded and then proceed with the interview.
Recording these interviews will enable BLS to improve data quality while reducing respondent burden. These recordings help verify that the interviews actually took place and that the interviewers did not fabricate the data. Recordings also help to ensure that interviewers are reading the questions exactly as worded and entering the responses properly. In addition, they help to identify parts of the interview that might be difficult or causing misunderstanding for interviewers or respondents. Our experiences with OMB-approved interview recordings since Round 10 indicate that respondents are generally quite willing to consent to be recorded, and the quality of recordings is sufficient for meaningful data-quality assurance.
In addition, we employ statistical review of questionnaire data to investigate interviewer performance when recordings are not available (such as in the case of a recording refusal or microphone issues). Statistical review includes, but is not limited to, looking at cases with unusual lengths of interview (too short or too long), the percent of recording refusal, response outliers, etc. When statistical anomalies are identified in interviewer performance, these interviewers’ cases are then subjected to additional recording or data review.
These two methods of data quality review are conducted by project staff and reduce respondent burden as verification of the interview and its quality can be done without further outreach to the respondent. This methodology also reduces cost as statistical review can be done on an interviewer-level basis to evaluate the particular field interviewer’s data quality.
Table 2: No. of Respondents and Average Response Time, NLSY97 Interim Supplement

Form | Total Respondents | Frequency | Total Responses | Average Time per Response | Estimated Total Burden
Interim Supplement | 5,275 | One-time | 5,275 | 12 minutes | 1,055 hours
Validation interview (April 2020) | 105 | One-time | 105 | 2 minutes | 3.5 hours
TOTALS* | 5,275 | — | 5,380 | — | 1,058.5 hours
* The difference between the total number of respondents and the total number of responses reflects the fact that about 5,275 respondents are expected to complete the main interview, and about 105 of them may be interviewed twice: once in the interim supplemental survey and a second time in the 2-minute validation interview.
The total response burden for the survey is 1,058.5 hours. The total annualized cost to respondents, based on burden hours and the Federal minimum wage of $7.25 per hour, is $7,674.13.
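The burden-hour and cost totals can be reproduced directly from Table 2; a minimal arithmetic check (Decimal is used so the half-cent rounds up to the $7,674.13 reported above):

```python
from decimal import Decimal, ROUND_HALF_UP

# Burden hours: responses x minutes per response, converted to hours.
main_hours = Decimal(5275) * 12 / 60          # 1055 hours
validation_hours = Decimal(105) * 2 / 60      # 3.5 hours
total_hours = main_hours + validation_hours   # 1058.5 hours

# Annualized cost at the federal minimum wage of $7.25 per hour.
minimum_wage = Decimal("7.25")
total_cost = (total_hours * minimum_wage).quantize(Decimal("0.01"), ROUND_HALF_UP)
print(total_hours, total_cost)  # 1058.5 7674.13
```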
13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).
The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life); and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors, including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling, and testing equipment; and record storage facilities.
If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
Respondents for this survey will not incur any capital and start-up costs; respondents will not incur any operation and maintenance or purchase of service costs.
14. Provide estimates of the annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 into a single table.
The estimated cost to the federal government for the interim supplement to NLSY97 is $2 million. This cost includes survey management, questionnaire design, instrument development, pretest and main data collection including incentive payments, cleaning and preparation of data files for users, and services to users of the data files.
15. Explain the reasons for any program changes or adjustments.
In this round, the NLSY97 will use both the internet and the telephone to conduct interviews. We have kept the interview short and focused on the impact of the coronavirus pandemic in order to encourage respondents to participate. This interim supplement is much shorter than our main data collection because we are not collecting employment, training, marriage, or fertility histories, among many other data items. It is a supplemental survey intended to capture the effects of the novel coronavirus pandemic while those effects may be ongoing; it is intended to supplement data collection in the regular rounds, not replace them. The field period will be much shorter, with a lower response rate than in regular data collection.
16. For collections of information whose results will be published, outline plans for tabulations, and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
The following is the planned schedule for data collection for the interim supplement to the NLSY97.
Questionnaire Development: July 2020
Respondent Materials Development: December 2020 – January 2021
Main Data Collection: February 2021 – May 2021
Data Processing: June 2021 – August 2021
BLS Publications: September 2021
Release of Public-Use Main Data Files: September 2021
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The OMB number and expiration date will be provided in the advance letter.
18. Explain each exception to the certification statement.
We do not have any exceptions to the “Certificate for Paperwork Reduction Act Submissions” statement.