







Information Collection Request for

The National Longitudinal Survey of Youth 1997

OMB # 1220-0157

Part B


Submitted by the Bureau of Labor Statistics

TABLE OF CONTENTS


B. Collections of Information Employing Statistical Methods

1. Respondent Universe and Respondent Selection Method

2. Design and Procedures for the Information Collection

3. Maximizing Response Rates

4. Testing of Questionnaire Items

5. Statistical Consultant


B. Collections of Information Employing Statistical Methods


1. Respondent Universe and Respondent Selection Method

This section summarizes the primary features of the sampling and statistical methods used to collect data and produce estimates for the NLSY97. Additional technical details are provided in the NLSY97 Technical Sampling Report, available online at http://www.nlsinfo.org/preview.php?filename=nlsy97techsamprpt.pdf. Chapter 2 of the report describes the design of the NLSY97 sample. Chapter 3 describes the sample-selection process. Chapter 4 describes the sample weighting process. Chapters 5 and 6 describe the accuracy and representativeness of the sample.


Additional information about statistical methods and survey procedures is available in the NLSY97 User’s Guide at:

http://www.nlsinfo.org/nlsy97/docs/97HTML00/97guide/toc.htm


The initial sample was selected to represent (after appropriate weighting) the total U.S. population (including military personnel) 12 to 16 years of age on December 31, 1996. The sample selection procedure included an overrepresentation of blacks and Hispanics to facilitate statistically reliable analyses of these racial and ethnic groups. Appropriate weights are developed after each round so that the sample components can be combined to aggregate to the overall U.S. population of the same ages. Weights are needed to adjust for differences in selection probabilities, subgroup differences in participation rates, random fluctuations from known population totals, and survey undercoverage. Computation of the weights begins with the base weight and then adjusts for household screener nonresponse, sub-sampling, individual nonresponse, and post-stratification of the nonresponse-adjusted weights. The number of sample cases in 1997, the first round, was 8,984. Retention rate information for subsequent rounds is shown in the table below. BLS anticipates a somewhat higher retention rate in Round 17 than was attained in Round 16. In Round 16, the retention rate (taking into account the deceased respondents) was close to that of Round 8. Retention increased in Round 10, declined modestly in Round 11, and rose again in Rounds 12 and 13. Response rates fell slightly in Round 14 and then remained stable in Round 15. Round 16 was the first round after adoption of a biennial (rather than annual) interview schedule, and the retention rate dropped markedly in that round, although it continued to exceed 80 percent when deceased respondents are accounted for. Only sample members who completed an interview in Round 1 are considered in-scope for subsequent rounds (as long as they are not known to be deceased). Even if NORC is unable to complete an interview with an in-scope sample member in one round, interviewers attempt to complete an interview with that sample member in each subsequent round, and the interview schedule is designed to pick up crucial information that was not collected in the missed interviews.
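
To make the sequence of weighting adjustments concrete, the following is a minimal illustrative sketch of the steps described above. It is not the production weighting code; the function names, adjustment cells, and numeric values are hypothetical.

```python
# Illustrative sketch of the weighting sequence described above (base weight,
# nonresponse adjustments, post-stratification). All names and values are
# hypothetical; the actual NLSY97 weighting program is far more detailed.

def base_weight(selection_prob: float) -> float:
    """Base weight: the inverse of the probability of selection."""
    return 1.0 / selection_prob

def nonresponse_adjust(weight: float, response_rate: float) -> float:
    """Inflate respondent weights to account for nonrespondents in the
    same adjustment cell (applied for screener and individual nonresponse)."""
    return weight / response_rate

def poststratify(weight: float, population_total: float,
                 weighted_sample_total: float) -> float:
    """Ratio-adjust weights so they sum to known population totals
    within post-strata (e.g., age-sex-race/ethnicity cells)."""
    return weight * (population_total / weighted_sample_total)

# Hypothetical respondent: selected with probability 1/20,000, in cells with
# a 90 percent screener response rate and a 95 percent interview response rate.
w = base_weight(1 / 20_000)                 # 20,000.0
w = nonresponse_adjust(w, 0.90)             # screener nonresponse adjustment
w = nonresponse_adjust(w, 0.95)             # individual nonresponse adjustment
w = poststratify(w, 4_000_000, 3_900_000)   # post-stratification
print(round(w))                             # final weight for this respondent
```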


The schedule and sample retention rates of past survey rounds are shown in Table 3.


Table 3. NLSY97 Fielding Periods and Sample Retention Rates

Round | Months conducted | Total respondents | Retention rate (%) | Number of deceased sample members | Retention rate excluding the deceased (%)
1 | February–October 1997 and March–May 1998 | 8,984 | n/a | n/a | n/a
2 | October 1998–April 1999 | 8,386 | 93.3 | 7 | 93.4
3 | October 1999–April 2000 | 8,209 | 91.4 | 16 | 91.5
4 | November 2000–May 2001 | 8,081 | 89.9 | 15 | 90.1
5 | November 2001–May 2002 | 7,883 | 87.7 | 25 | 88.0
6 | November 2002–May 2003 | 7,898 | 87.9 | 30 | 88.2
7 | November 2003–July 2004 | 7,755 | 86.3 | 37 | 86.7
8 | November 2004–July 2005 | 7,503 | 83.5 | 45 | 83.9
9 | October 2005–July 2006 | 7,338 | 81.7 | 60 | 82.2
10 | October 2006–May 2007 | 7,559 | 84.1 | 77 | 84.9
11 | October 2007–June 2008 | 7,418 | 82.6 | 90 | 83.4
12 | October 2008–June 2009 | 7,490 | 83.4 | 103 | 84.3
13 | September 2009–April 2010 | 7,561 | 84.2 | 112 | 85.2
14 | October 2010–May 2011 | 7,420 | 82.6 | 118 | 83.7
15 | September 2011–June 2012 | 7,423 | 82.6 | 134 | 83.9
16 | October 2013–June 2014 | 7,141 | 79.5 | 151 | 80.8
17 | September 2015–May 2016 | 7,200* | 80.1* | 165* | 81.6*


Note 1: The retention rate is defined as the percentage of Round 1 (base year) respondents who were interviewed in a given survey year.

Note 2: * indicates projected values; actual figures are not yet known.
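
As a worked check of these definitions, the Round 16 figures in Table 3 can be reproduced directly (a minimal sketch; the numbers below come from the table itself):

```python
# Worked check of the retention rate definitions using Round 16 figures
# from Table 3: 8,984 Round 1 respondents, 7,141 Round 16 respondents,
# and 151 deceased sample members.
round1_respondents = 8_984
round16_respondents = 7_141
deceased = 151

retention = 100 * round16_respondents / round1_respondents
retention_excluding_deceased = 100 * round16_respondents / (round1_respondents - deceased)

print(f"{retention:.1f}")                      # 79.5
print(f"{retention_excluding_deceased:.1f}")   # 80.8
```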


2. Design and Procedures for the Information Collection

The NLSY97 includes personal interviews with all living Round 1 respondents, regardless of whether they subsequently become institutionalized, join the military, or move out of the United States. We employ a comprehensive strategy to contact and interview sample members. At each interview, detailed information is gathered about relatives and friends who could assist NORC field staff in locating respondents who cannot readily be found in a subsequent survey round. Every effort is made to locate respondents, and interviewers are encouraged to keep attempting contact until they reach them; there is no arbitrary limit on the number of call-backs.


Before data collection begins, the NORC interviewers are carefully trained, with particular emphasis placed on resolving sensitive issues that have appeared in the pretest and in prior rounds. Most of the NORC interviewers have lengthy field experience from participating in earlier NLSY97 rounds as well as from involvement with the NLSY79 and other NORC surveys. All new recruits are given one day of in-person training on general interviewing techniques. All interviewers (whether or not they have previous experience on the NLSY97 or NLSY79) must complete a comprehensive online training covering the questionnaire and its administration, data quality, study protocols and procedures, and respondent confidentiality. Interviewers must then attend training calls with their field managers and pass a certification in which they must demonstrate a full understanding of the questionnaire and how to administer it correctly.

Field interviewers are supervised by NORC Field Managers and their associates. All Field Managers are required to complete the same online training as their interviewers prior to the start of the study. NORC has divided the U.S. into 10 regions, each supervised by a Field Manager who is responsible for staffing and for the quality of field work in that region. The standard arrangement is one supervisor for every 15 interviewers. Field Managers are, in turn, supervised by one of the two Field Project Managers.


The interview content is prepared by professional staff at BLS, CHRR, and NORC. When new materials are incorporated into the questionnaire, assistance is generally sought from appropriate experts in the specific substantive area.


Because sample selection took place in 1997 in preparation for the baseline interview, sample composition will remain unchanged.


For several years, respondents have been encouraged to visit the NLSY97 respondent website to update their contact information. In Round 16, we expanded the online contact form by adding questions, including several from the main survey. We propose to continue using the expanded online contact form prior to the Round 17 main survey. We extend that effort by adding e-mail and text prompting activities to see how such prompts might affect participation, and by adding additional types of questions to understand online response behavior in our sample.


A random sample of 2,910 main study respondents (33 percent of the living sample members) will be invited to update their contact information on the NLSY97 respondent website concurrent with the Round 17 pretest. The expanded web contact form takes approximately 10 minutes to complete. No pre-loaded data are displayed to the respondent, which minimizes the security risk if someone other than the respondent enters the instrument. Items include a combination of substantive and operational questions, with both free-text responses and choice selection. Selected respondents would receive additional prompts requesting online response. We propose to include text message as well as e-mail and mail prompts in Round 17, given the prevalence of mobile phone communications among the sample age group. NLSY97 respondents not selected for the web prompting test would receive their advance letter (with an invitation to update contact information) closer to the launch of the main data collection period and would not be asked additional questionnaire items if they visited the NLSY97 website to update their contact information.


The expanded contact form is designed to help us understand the access issues and technical difficulties associated with completing a web-based form, as well as possible data quality differences by mode in the reporting of information among this panel of individuals who are accustomed to the in-person NLSY97 interview. Contact information includes address and other locating information, preferences for how respondents are contacted by the project team, and selected questionnaire variables that allow analysis of data quality differences in this mode. Web-collected variables will be included with the Round 17 main field period data release (when not confidential); no attempts will be made to retrieve information that was not collected by web.








3. Maximizing Response Rates

A number of the procedures used to maximize response rates have already been discussed in items 1 and 2 above. Those procedures address unit nonresponse; the other component of missing data is item nonresponse, which occurs when respondents refuse to answer or do not know the answer to a question. Almost all items in the NLSY97 have low levels of nonresponse. For example, in prior rounds there was virtually no item nonresponse for basic questions such as the type of residence respondents lived in (YHHI-4400) or the highest grade of school respondents had ever attended (YSCH-2857).


Even cognitively more difficult questions, such as “How many hours did you work per week?” (YEMP-23901), have low levels of nonresponse. In the hours-per-week example, 6 individuals out of 2,810 (0.2 percent) did not answer the question in Round 8.


Sensitive questions have the highest nonresponse. Table 4 presents examples of Round 14 questionnaire items that are among the most sensitive or cognitively difficult. Even very personal questions about sexual activity have low rates of nonresponse. The first row of the table shows that the vast majority of respondents (over 96 percent) were willing and able to answer the question, “Did you have sexual intercourse since the last interview?” Only 1.2 percent of respondents did not respond to the question on marijuana use since the last interview, and very few (1.0 percent) did not answer whether they had carried a handgun since the last interview. Lastly, almost all respondents (0.36 percent nonresponse rate) were willing to reveal whether they had earned money from a job in the past year, but many did not know or refused to disclose exactly how much they had earned (11.5 percent nonresponse rate). Because high nonresponse rates were expected for the income amount question, individuals who did not provide an exact answer were asked to estimate their income from a set of predetermined ranges. This considerably reduces nonresponse on the income question: only 7.9 percent of those who were asked to provide a range of income did not respond. These individuals represent 0.9 percent (54/5,944) of all individuals asked to provide income data in that round.



Table 4. Examples of Nonresponse Rates for Some Round 14 Sensitive Questions

Q Name | Question | Number Asked | Number Refused | Number Don’t Know | % Nonresponse
YSAQ2-299B | Have Sex Since Date of Last Interview?1 | 7,393 | 230 | 30 | 3.5%
YSAQ-370C | Use Marijuana Since Date of Last Interview? | 7,393 | 73 | 19 | 1.2%
YSAQ-380 | Carry a Handgun Since Date of Last Interview? | 7,393 | 60 | 16 | 1.0%
YINC-1400 | Receive Work Income in 2003? | 7,477 | 19 | 8 | 0.36%
YINC-1700 | How Much Income from All Jobs in 2003? | 5,944 | 45 | 641 | 11.5%
YINC-1800 | Estimated Income from All Jobs in 2003?2 | 686 | 30 | 24 | 7.9%


1 Asked of respondents who have previously reported having sexual intercourse and who do not report a spouse or partner in the household.

2 Asked of respondents who were unable or unwilling to answer the previous question (YINC-1700).
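
The nonresponse percentages in Table 4 follow directly from the counts shown there; for example, for YINC-1700:

```python
# Worked check of the YINC-1700 nonresponse rate in Table 4:
# nonresponse = (refused + don't know) / number asked.
refused, dont_know, asked = 45, 641, 5_944
print(f"{100 * (refused + dont_know) / asked:.1f}%")  # 11.5%
```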


To reduce the proportion of “don't know” or “refused” responses to questions on income or assets (such as YINC-1700, shown in Table 4), respondents who do not provide exact dollar answers are asked follow-up questions designed to elicit approximate information. For many income categories, the respondents are asked to select the applicable category from a predefined list of ranges. The approach for asset questions is slightly different: the initial question asks the respondent to provide an exact value, but if he or she is unable or unwilling to do so, interviewers are instructed to ask the respondent to define a range for the value using whatever values he or she feels are appropriate. If the respondent does not know or refuses to provide either an exact value or a range, a follow-up question asks him or her to select the appropriate range from a predefined list. This method provides researchers with some information on income, asset, and debt amounts when the respondent is reluctant or unable to furnish an exact figure.
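
The following is a minimal sketch of this follow-up sequence. The function names, prompts, and range categories are hypothetical illustrations, not the actual CAPI instrument logic:

```python
# Illustrative sketch of the dollar-amount follow-up sequence described above.
# All names, prompts, and ranges are hypothetical, not the actual CAPI code.

PREDEFINED_RANGES = ["Under $5,000", "$5,000 to $24,999",
                     "$25,000 to $49,999", "$50,000 or more"]

def ask(prompt):
    """Stand-in for a CAPI item; an empty answer represents refusal/don't know."""
    answer = input(prompt + " ").strip()
    return answer or None

def collect_income():
    exact = ask("How much income did you receive from all jobs last year?")
    if exact is not None:
        return ("exact", exact)
    # Follow-up: select from a predefined list of ranges (as with YINC-1800).
    chosen = ask("Please pick the range that applies: " + "; ".join(PREDEFINED_RANGES))
    return ("predefined_range", chosen) if chosen else ("nonresponse", None)

def collect_asset_value():
    exact = ask("What is the current value of this asset?")
    if exact is not None:
        return ("exact", exact)
    # First follow-up: the respondent defines his or her own bounds.
    low = ask("What is the lowest the value could be?")
    high = ask("What is the highest the value could be?")
    if low is not None and high is not None:
        return ("respondent_defined_range", (low, high))
    # Second follow-up: select from the predefined range list.
    chosen = ask("Please pick the range that applies: " + "; ".join(PREDEFINED_RANGES))
    return ("predefined_range", chosen) if chosen else ("nonresponse", None)
```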


4. Testing of Questionnaire Items

BLS is cautious about adding items to the NLSY97 questionnaire. Because the survey is longitudinal, poorly designed questions can result in flawed data and lost opportunities to capture contemporaneous information about important events in respondents’ lives. Poorly designed questions also can cause respondents to react negatively, making their future cooperation less likely. Thus, the NLSY97 design process employs a multi-tiered approach to the testing and review of questionnaire items.


When new items are proposed for the NLSY97 questionnaire, we often adopt questions that have been used previously in probability sample surveys with respondents resembling the NLSY97 sample. We have favored questions from the other surveys in the BLS National Longitudinal Surveys program to facilitate intergenerational comparisons. We also have used items from the Current Population Survey, the Federal Reserve Board’s Survey of Consumer Finances, the National Science Foundation-funded General Social Survey, and other Federally funded surveys.


All new questions are reviewed in their proposed NLSY97 context by survey methodologists who consider the appropriateness of questions (reference period, terms and definitions used, sensitivity, and so forth). Questions that are not well-tested with NLSY97-type respondents undergo cognitive testing with convenience samples of respondents similar to the NLSY97 sample members.


Existing questions are also reviewed each year. Respondents’ ages and life circumstances change, as does the societal environment in which the survey is conducted. Reviews of the data help us to identify questions that may cause respondent confusion, require revised response categories, or generate questionable data. Sources of information for these reviews include the questionnaire response data themselves, comments made by interviewers or respondents during the interview, interviewer remarks after the interview, interviewer inquiries or comments throughout the course of data collection, other-specify coding, recordings of items during the interviews, and comparison of NLSY97 response data to other sources for external validation. We also carefully watch the “leading edge” respondents, who answer some questions before the bulk of the sample (for example, the first respondents to attend graduate school or to get a divorce). These respondents are often atypical, but their interviews can reveal problems in question functionality or comprehensibility.


A comprehensive pretest is planned as part of this information collection request and would occur approximately three months before the main NLSY97 field period to test survey procedures and questions. The pretest includes a heterogeneous sample of 201 respondents of various racial, ethnic, geographic, and socio-economic backgrounds. On the basis of this pretest, the various questionnaire items, particularly those being asked for the first time, are evaluated with respect to question sensitivity and validity. When serious problems are revealed during the pretest, the problematic questions are deleted from the main NLSY97 instrument.


Although further edits to questionnaire wording are extremely rare, we monitor the first several hundred interviews each round with particular care. Based on this monitoring, field interviewers receive supplemental training on how best to administer questions that seem to be causing difficulty in the field or generating unexpected discrepancies in the data.


Round 17 questions that have not appeared in previous rounds of the NLSY97 include:


Screening for Business Ownership. In Rounds 24 and 25 of the NLSY79, BLS collected information about businesses owned by the respondent. In Round 17 of the NLSY97, we propose to collect initial business ownership status information from one birth cohort of sample members, as an exploration to inform possible future administration of a business ownership module in the NLSY97. The selected questions are taken directly from the NLSY79 Round 24 questionnaire without modification.


Questions on wage bargaining (in the Employment section). Four questions about the respondent’s current job are being introduced to understand patterns of wage bargaining at the start of employment across types of workers and jobs. These questions are taken from the Princeton Data Improvement Initiative. As part of that effort, the questions were administered to 2,500 individuals in a national random-digit-dial telephone survey. That administration and the resulting data are described in a 2012 paper by Hall and Krueger in the American Economic Journal: Macroeconomics (http://dx.doi.org/10.1257/mac.4.4.56).


Assets-35. The NLSY97 has collected information on financial and other assets on a periodic basis: at first economic independence or age 18, and at ages 20, 25, and 30. In keeping with this pattern, we introduce in Round 17 a module to collect asset information at age 35. No questions have been altered from the Assets-30 section.




A list of all changes to the NLSY97 questionnaire from Round 16 to Round 17 is shown in Attachment 7.



5. Statistical Consultant

Kirk M. Wolter

Executive Vice President for Survey Research

NORC

55 East Monroe Street, Suite 3000

Chicago, IL 60603

(312) 759-4206


The sample design was developed by NORC, which continues to conduct the interviewing fieldwork.


The NLS program also has a technical review committee that provides advice on interview content and long-term objectives. This group typically meets twice each year. Table 5 below shows the current members of the committee.

Table 5. National Longitudinal Surveys Technical Review Committee (2014)

Jay Bhattacharya
CHP/PCOR
Stanford University
117 Encina Commons
Stanford, CA 94305-6019

Email: [email protected]

Phone: 650-736-0404

Fax: 650-723-1919

Shawn Bushway

School of Criminal Justice
University at Albany, State University of New York

135 Western Avenue
Draper Hall 219
Albany, NY 12222

Email: [email protected]
Phone: (518) 442-5210
Fax: (518) 442-5212


Amitabh Chandra, Professor

John F. Kennedy School of Government
Harvard University
Mailbox 26
79 JFK Street
Cambridge, MA 02138

Email: [email protected]

Phone: 617-496-7356

Lingxin Hao, Professor
Department of Sociology
Johns Hopkins University
Baltimore, MD 21218
Email: [email protected]

Phone: 410-516-4022

Judith Hellerstein, Professor
Department of Economics
University of Maryland
3105 Tydings Hall
College Park, MD 20742
Email: [email protected]
Phone: 301-405-3545

Lance Lochner

Department of Economics
University of Western Ontario
Social Science Centre, Room 4071
London, Ontario, Canada, N6A 5C2
Email: [email protected]

Phone: (519) 661-3500

Alex Mas

Industrial Relations Section
Princeton University
Firestone Library A-16-J-1
Princeton, NJ 08544
Email: [email protected]

Phone: 609-258-4045

Fax: 609-258-2907

Kathleen McGarry

Department of Economics

University of California, Los Angeles

Los Angeles, CA 90095-1477

Email: [email protected]

Phone: 310-825-1011

Fax: 310-825-9528

Kelly Raley

Department of Sociology
University of Texas
Austin, TX 78712

Email: [email protected]

Phone: 512-471-8357

Jesse Rothstein

Richard & Rhoda Goldman School of Public Policy

University of California, Berkeley

2607 Hearst Avenue

Berkeley, CA 94720-7320

Email: [email protected]

Phone: 510-643-8561

Seth Sanders, Professor
Department of Economics
Duke University
213H Social Sciences Building
Durham, NC 27708
Email: [email protected]
Phone: 919-660-1800




