
Information Collection Request for

The National Longitudinal Survey of Youth 1997

OMB # 1220-0157

Part B


Submitted by the Bureau of Labor Statistics

TABLE OF CONTENTS


B. Collections of Information Employing Statistical Methods

1. Respondent Universe and Respondent Selection Method

2. Design and Procedures for the Information Collection

3. Maximizing Response Rates

4. Testing of Questionnaire Items

5. Statistical Consultant


B. Collections of Information Employing Statistical Methods


1. Respondent Universe and Respondent Selection Method

This section summarizes the primary features of the sampling and statistical methods used to collect data and produce estimates for the NLSY97. Additional technical details are provided in the NLSY97 Technical Sampling Report, available online at http://www.nlsinfo.org/preview.php?filename=nlsy97techsamprpt.pdf. Chapter 2 of the report describes the design of the NLSY97 sample. Chapter 3 describes the sample-selection process. Chapter 4 describes the sample weighting process. Chapters 5 and 6 describe the accuracy and representativeness of the sample.


Additional information about statistical methods and survey procedures is available in the NLSY97 User’s Guide at:

http://www.nlsinfo.org/nlsy97/docs/97HTML00/97guide/toc.htm


The initial sample was selected to represent (after appropriate weighting) the total U.S. population (including military personnel) 12 to 16 years of age on December 31, 1996. The sample selection procedure included an overrepresentation of blacks and Hispanics to facilitate statistically reliable analyses of these racial and ethnic groups. Appropriate weights are developed after each round so that the sample components can be combined to aggregate to the overall U.S. population of the same ages. Weights are needed to adjust for differences in selection probabilities, subgroup differences in participation rates, random fluctuations from known population totals, and survey undercoverage. Computation of the weights begins with the base weight and then adjusts for household screener nonresponse, sub-sampling, individual nonresponse, and post-stratification of the nonresponse-adjusted weights; a schematic sketch of these steps appears below.

The number of sample cases in 1997, the first round, was 8,984. Retention rate information for subsequent rounds is shown in the table below. BLS anticipates a somewhat lower retention rate in Round 15 than was attained in Round 13. In Round 12, the retention rate was close to that of Round 8. We saw an increase in the retention rate in Round 10, a modest decline in Round 11, and additional increases in Rounds 12 and 13. Only sample members who completed an interview in Round 1 are considered in-scope for subsequent rounds. Even if NORC is unable to complete an interview for an in-scope sample member in one round, it attempts to complete an interview with that sample member in each subsequent round. The interview schedule is designed to pick up crucial information that was not collected in the missed interviews.
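To make the weighting sequence described above concrete, the sketch below walks through a base weight equal to the inverse of the selection probability, multiplicative adjustments for screener and individual nonresponse, and post-stratification to a known population total. This is a simplified illustration with hypothetical values and function names, not the actual NLSY97 weighting program; the authoritative description is in the Technical Sampling Report.

```python
# Schematic sketch of the weighting steps described above; all numeric
# values here are hypothetical, not actual NLSY97 figures.

def post_stratify(weights, population_total):
    """Ratio-adjust weights so they sum to a known population total."""
    factor = population_total / sum(weights)
    return [w * factor for w in weights]

# Base weight: the inverse of each case's probability of selection.
selection_probs = [0.0004, 0.0004, 0.0012]        # hypothetical
base_weights = [1.0 / p for p in selection_probs]

# Nonresponse adjustments, applied within weighting cells; a single
# illustrative factor per stage is shown here.
screener_adjustment = 1.04    # hypothetical: 1 / screener response rate
individual_adjustment = 1.07  # hypothetical: 1 / interview response rate
adjusted = [w * screener_adjustment * individual_adjustment
            for w in base_weights]

# Post-stratify the nonresponse-adjusted weights to a known total.
final_weights = post_stratify(adjusted, population_total=19_000_000)
print([round(w) for w in final_weights])
```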


The schedule and sample retention rates of past survey rounds are shown in Table 3.


Table 3. NLSY97 Fielding Periods and Sample Retention Rates

| Round | Months conducted | Total respondents | Retention rate (%) | Number of deceased sample members | Retention rate excluding the deceased (%) |
|---|---|---|---|---|---|
| 1 | February–October 1997 and March–May 1998 | 8,984 | — | — | — |
| 2 | October 1998–April 1999 | 8,386 | 93.3 | 7 | 93.4 |
| 3 | October 1999–April 2000 | 8,209 | 91.4 | 16 | 91.5 |
| 4 | November 2000–May 2001 | 8,081 | 89.9 | 15 | 90.1 |
| 5 | November 2001–May 2002 | 7,883 | 87.7 | 25 | 88.0 |
| 6 | November 2002–May 2003 | 7,898 | 87.9 | 30 | 88.2 |
| 7 | November 2003–July 2004 | 7,755 | 86.3 | 37 | 86.7 |
| 8 | November 2004–July 2005 | 7,503 | 83.5 | 45 | 83.9 |
| 9 | October 2005–July 2006 | 7,338 | 81.7 | 60 | 82.2 |
| 10 | October 2006–May 2007 | 7,559 | 84.1 | 77 | 84.9 |
| 11 | October 2007–June 2008 | 7,418 | 82.6 | 90 | 83.4 |
| 12 | October 2008–June 2009 | 7,490 | 83.4 | 103 | 84.3 |
| 13 | September 2009–April 2010 | 7,561 | 84.2 | 112 | 85.2 |
| 14 | October 2010–May 2011 | 7,420 | 82.6 | 118 | 83.7 |


Note: The retention rate is defined as the percentage of Round 1 (base-year) respondents who were interviewed in a given survey round. All figures for Round 14 are projections.
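As a quick check of these definitions, the snippet below reproduces the two Round 14 retention figures from the projected counts in Table 3. This is an illustrative calculation, not part of the official processing.

```python
# Retention rate = completed interviews / Round 1 respondents; the
# "excluding the deceased" variant removes deceased members from the base.

ROUND1_RESPONDENTS = 8_984  # base-year completed interviews

def retention_rate(completed, deceased=0):
    """Percent of Round 1 respondents interviewed, optionally
    excluding deceased sample members from the base."""
    return 100 * completed / (ROUND1_RESPONDENTS - deceased)

print(f"{retention_rate(7_420):.1f}")                # 82.6
print(f"{retention_rate(7_420, deceased=118):.1f}")  # 83.7
```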


2. Design and Procedures for the Information Collection

The NLSY97 includes personal interviews with all living Round 1 respondents, regardless of whether they subsequently become institutionalized, join the military, or move out of the United States. We employ a thorough and comprehensive strategy to contact and interview sample members. At each interview, detailed information is gathered about relatives and friends who could assist NORC field staff in locating respondents if they cannot readily be found in a subsequent survey round. Every effort is made to locate respondents. Interviewers are encouraged to attempt to contact respondents until they reach them. There is no arbitrary limit on the number of call-backs.


Before data collection, the NORC interviewers are carefully trained, with particular emphasis placed on resolving sensitive issues that arose in the pretest and in prior rounds. Most of the NORC interviewers have lengthy field experience from earlier NLSY97 rounds as well as from the NLSY79 and other NORC surveys. All new recruits receive one day of in-person training on general interviewing techniques, followed by three days of in-person training on the questionnaire and field procedures. Experienced interviewers complete more than 8 hours of online self-study using specially designed materials covering the questionnaire and procedural specifications, with exercises on new or difficult sections and procedures.


Field interviewers are supervised by NORC Field Managers and their associates. NORC has divided the U.S. into 10 regions, each supervised by a Field Manager who is responsible for staffing and for the quality of field work in that region. A ratio of 1 supervisor to 15 interviewers is the standard arrangement. Field Managers are, in turn, supervised by one of the two Field Project Managers.


The interview content is prepared by professional staff at BLS, CHRR, and NORC. When new materials are incorporated into the questionnaire, assistance is generally sought from appropriate experts in the specific substantive area.


Because sample selection took place in 1997 in preparation for the baseline interview, sample composition will remain unchanged.


Some new activities are planned in Round 15 to supplement the main interview.


Release of Postsecondary Education Records

With BLS encouragement, a research team headed by Chandra Muller of the University of Texas at Austin has submitted a grant proposal to the National Institute of Child Health and Human Development to collect college transcripts and other postsecondary enrollment information for NLSY97 sample members. With funding from this grant, NORC collected signed permission forms from Round 14 respondents to grant BLS and NORC permission to obtain transcripts. A sample permission form is provided in attachment 7. During Round 15, NORC plans to seek permission forms from individuals who had not yet provided one, primarily those who did not complete the Round 14 interview or who completed it by phone rather than in person.


In Round 14, permission was sought from all respondents who had reported that they received a high school diploma or GED or completed coursework in a postsecondary degree program. We will seek permission from this broad group, rather than just the respondents who reported some college coursework, to help validate the educational attainment information that respondents have provided during the NLSY97 interviews. Some respondents who reported college coursework might not actually have completed such coursework. Similarly, some respondents who did not report any college experience may actually have attended college. For NLSY97 respondents who sign the permission form, we will obtain their transcripts and other information about college attendance through the National Student Clearinghouse (http://www.studentclearinghouse.org/).


Permission forms were sought from respondents first by field interviewers at the time of the Round 14 in-person interview. A follow-up mail effort will take place after the close of the Round 14 data-collection period and will request signed permission forms from sample members who completed the Round 14 interview by phone or did not complete the Round 14 interview at all. We estimate that, by Round 14, 7,425 respondents will have completed high school, earned a GED, or completed a term in a postsecondary degree program. Counting respondents both completing and not completing the Round 14 interview, we estimate that 6,311 will provide signed permission forms. BLS and NORC asked Round 14 pretest respondents to sign permission forms to test respondent cooperation, but NORC does not actually plan to obtain transcripts for pretest respondents. Notification of grant funding from the National Institute of Child Health and Human Development (NICHD) was received just in time for the main fielding of Round 14; BLS and NORC are asking respondents to sign permission forms, and NORC will obtain the transcripts for respondents who grant their consent.


OMB previously approved the collection of permission forms and transcripts as part of the Round 13 and Round 14 OMB clearances. We estimate the respondent burden to read and sign the transcript release is 1.5 minutes per respondent.



NIR Questionnaire

Although we continue to have excellent rates of return among sample members who missed some previous rounds, the NLSY97 now has a few hundred respondents who are extremely unlikely to complete an interview in any given round. At the start of Round 15, we expect there to be about 650 respondents who were last interviewed prior to Round 10. In Round 13, respondents who had missed at least 5 rounds completed interviews at a rate of 8.1%. Conceiving of these individuals as likely nonrespondents rather than as potential respondents may help us improve our understanding of our sample and our ability to convert long-term nonrespondents. We plan to field an experimental nonresponse questionnaire among sample members who have extremely low probabilities of completion.


The purpose of a noninterview respondent (NIR) questionnaire would be to capture key status information about the sample member’s life. This information could permit nonresponse analysis to understand whom we are missing and what difference it might make in our analytic results. In addition, a brief, minimally intrusive “interview” might slightly increase these sample members’ willingness to participate in a full interview in subsequent rounds.


Respondents who had not completed the prior 5 interviews would be eligible for the NIR questionnaire. They would be asked to complete an NIR questionnaire if they refuse the first three requests for a full interview or for some other reason a field manager believes that the case cannot result in a completed interview. We would offer a $15 incentive because it is Round 15. NIR questionnaire respondents would receive a modified thank you letter. These cases would still be recorded as refusals, but would also have an NIR questionnaire completed. We would still secure consent from respondents, so that we could provide the NIR questionnaire responses in the public-use data file. Respondents completing the NIR questionnaire would be fielded for the main interview in the next round as usual. Respondents would still get the full NIR premium in a future round if they completed the interview. In future rounds, if the NIR questionnaire were continued, respondents would not be permitted to complete NIR questionnaires in two consecutive rounds. Given the nature of the sample, the presumption is that almost all of these NIR questionnaires would be completed by telephone. We do not expect to field NIR questionnaires with individuals who completed the Round 13 NIR questionnaire. The Round 13 fielding yielded very few completed NIR questionnaires but yielded more completed full interviews than expected with this long-term NIR group.


Although only 8.1% of this group completed the full interview in Round 13, we assume a higher, 20% completion rate for the much shorter NIR questionnaire. Applied to the estimated 650 respondents at the start of Round 15 who were last interviewed prior to Round 10, that assumption yields about 130 completed NIR questionnaires. We will devise a more specific plan for Round 15 fielding of the NIR questionnaire after reviewing the Round 14 experience with it.
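The expected yield follows directly from that assumption; the check below is purely illustrative arithmetic.

```python
# Expected NIR questionnaire completions under the assumption stated above.
eligible = 650                 # respondents last interviewed before Round 10
assumed_completion_rate = 0.20 # assumed NIR questionnaire completion rate
print(round(eligible * assumed_completion_rate))  # 130
```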


The NIR questionnaire is shown in attachment 8.


Collection of birth certificates in pretest

Use of administrative records is a promising technique that can improve data quality and reduce respondent burden. As an initial investigation into this technique, we propose a small trial collection of birth certificates in the Round 15 pretest. Birth certificates are the optimal source of information about birth weight, a measure of considerable research interest given its relationship with child development and lifetime obesity, among other outcomes. In the Round 15 pretest, we will offer respondents $10 to permit us to record their birth certificate using a handheld scanner at the time of the in-person interview, or $5 to sign a waiver granting us access to their birth certificate from vital statistics agencies. Telephone interview respondents would be eligible for the same offer, to be transacted through mail. Such a trial will primarily provide insight into respondent reactions and concerns regarding the release of administrative records, as well as the logistical issues surrounding the handling, acquiring, and coding of such documents. Clearance for this trial was received for the Round 14 pretest, but we did not conduct the trial at that time in favor of collecting college transcript permission forms from respondents.



3. Maximizing Response Rates

A number of the procedures that are used to maximize response rate already have been discussed in items 1 and 2 above. The other component of missing data is item nonresponse. Nonresponse includes respondents refusing to answer or not knowing the answer to a question. Almost all items in the NLSY97 have low levels of nonresponse. For example, in prior rounds there was virtually no item nonresponse for basic questions like the type of residence respondents lived in (YHHI-4400) or the highest grade of school respondents had ever attended (YSCH-2857).


Cognitively more difficult questions, such as “How many hours did you work per week?” (YEMP-23901), also have low levels of nonresponse. In the hours-per-week example, 6 individuals out of 2,810 (0.2%) did not answer the question in Round 8.


Sensitive questions have the highest nonresponse. Table 4 presents examples of Round 10 questionnaire items that are among the most sensitive or cognitively difficult. Even very personal questions about sex have low rates of nonresponse. The top row of the table shows that the vast majority of respondents (over 95%) were willing and able to answer the question, “Did you have sexual intercourse since the last interview?” The third row shows that only 1.2% of respondents did not respond to the question on marijuana use since the last interview. The fourth row shows that very few respondents (0.5%) did not answer whether they had carried a handgun since the last interview. Lastly, almost all respondents (0.6% nonresponse rate) were willing to reveal whether they had earned money from a job in the past year, but many did not know or refused to disclose exactly how much they had earned (20.4% nonresponse rate). Because high nonresponse rates were expected for the income amount question, individuals who did not provide an exact answer were asked to estimate their income from a set of predetermined ranges. This considerably reduces nonresponse on the income question: only 6.4% of those who were asked to provide a range of income did not respond.



Table 4. Examples of Nonresponse Rates for Some Round 10 Sensitive Questions


| Q Name | Question | Number Asked | Number Refused | Number Don’t Know | % Nonresponse |
|---|---|---|---|---|---|
| YSAQ2-299B | Have Sex Since Date of Last Interview?¹ | 7,460 | 283 | 25 | 4.1% |
| YSAQ-370C | Use Marijuana Since Date of Last Interview? | 7,460 | 73 | 20 | 1.2% |
| YSAQ-380 | Carry a Handgun Since Date of Last Interview? | 7,460 | 32 | 9 | 0.5% |
| YINC-1400 | Receive Work Income in 2003? | 7,559 | 14 | 28 | 0.6% |
| YINC-1700 | How Much Income from All Jobs in 2003? | 6,386 | 48 | 1,252 | 20.4% |
| YINC-1800 | Estimated Income from All Jobs in 2003?² | 1,300 | 37 | 46 | 6.4% |


¹ Asked of respondents who previously reported having sexual intercourse and who do not report a spouse or partner in the household.

² Asked of respondents who were unable or unwilling to answer the previous question (YINC-1700).
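The nonresponse percentages in Table 4 are simply (refused + don’t know) divided by the number asked; the snippet below reproduces two of the rows as an illustrative check.

```python
# Reproducing Table 4's % Nonresponse column for two rows:
# (refused + don't know) / asked.
rows = {
    "YSAQ2-299B": (7_460, 283, 25),    # -> 4.1%
    "YINC-1700":  (6_386, 48, 1_252),  # -> 20.4%
}
for name, (asked, refused, dont_know) in rows.items():
    print(name, f"{100 * (refused + dont_know) / asked:.1f}%")
```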


To reduce the proportion of “don't know” or “refused” responses to questions on income or assets (such as YINC-1700, shown in table 4), respondents who do not provide exact dollar answers are asked follow-up questions designed to elicit approximate information. For many income categories, the respondents are asked to select the applicable category from a predefined list of ranges. The approach for asset questions is slightly different: The initial question asks the respondent to provide an exact value, but if he or she is unable or unwilling to do so, interviewers are instructed to ask the respondent to define a range for the value using whatever values he or she feels are appropriate. If the respondent does not know or refuses to provide either an exact value or a range, a follow-up question asks him or her to select the appropriate range from a predefined list. This method provides researchers with some information on income, asset, and debt amounts when the respondent is reluctant or unable to furnish an exact figure.
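The tiered follow-up logic for asset questions can be summarized as a simple decision flow. The sketch below is a minimal illustration of that flow, assuming hypothetical prompts, bracket values, and function names; it is not the actual NLSY97 instrument wording.

```python
# Minimal sketch of the tiered asset-question flow described above.
# Prompts, brackets, and names are illustrative assumptions.

PREDEFINED_RANGES = [
    "$1 to $5,000", "$5,001 to $10,000",
    "$10,001 to $25,000", "$25,001 or more",  # hypothetical bracket list
]

def ask(prompt):
    """Stand-in for a CAPI item; an empty answer represents
    'don't know' or 'refused'."""
    answer = input(prompt + " ").strip()
    return answer or None

def collect_asset_value():
    # Step 1: ask for an exact dollar value.
    exact = ask("What is the exact value?")
    if exact is not None:
        return ("exact", exact)
    # Step 2: ask the respondent to define a range in his or her own terms.
    own_range = ask("Could you give me a lowest and highest possible value?")
    if own_range is not None:
        return ("respondent-defined range", own_range)
    # Step 3: fall back to a predefined list of ranges.
    choice = ask("Which of these ranges applies? "
                 + "; ".join(PREDEFINED_RANGES))
    return ("predefined range", choice)
```

The same three-step pattern preserves partial information at each stage, which is why the follow-ups substantially reduce effective item nonresponse on dollar amounts.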


4. Testing of Questionnaire Items

BLS is cautious about adding items to the NLSY97 questionnaire. Because the survey is longitudinal, poorly designed questions can result in flawed data and lost opportunities to capture contemporaneous information about important events in respondents’ lives. Poorly designed questions also can cause respondents to react negatively, making their future cooperation less likely. Thus, the NLSY97 design process employs a multi-tiered approach to the testing and review of questionnaire items.


When new items are proposed for the NLSY97 questionnaire, we often adopt questions that have been used previously in probability sample surveys with respondents resembling the NLSY97 sample. We have favored questions from the other surveys in the BLS National Longitudinal Surveys program to facilitate intergenerational comparisons. We also have used items from the Current Population Survey, the Federal Reserve Board’s Survey of Consumer Finances, the National Science Foundation-funded General Social Survey, and other Federally funded surveys.


All new questions are reviewed in their proposed NLSY97 context by survey methodologists who consider the appropriateness of questions (reference period, terms and definitions used, sensitivity, and so forth). Questions that are not well-tested with NLSY97-type respondents undergo cognitive testing with convenience samples of respondents similar to the NLSY97 sample members.


Existing questions are also reviewed each year. Respondents’ age and their life circumstances change, as does the societal environment in which the survey is conducted. Reviews of the data help us to identify questions that may cause respondent confusion, require revised response categories, or generate questionable data. Sources of information for these reviews include the questionnaire response data themselves, comments made by interviewers or respondents during the course of the interview, interviewer remarks after the interview, interviewer inquiries or comments throughout the course of data collection, other-specify coding, recordings of items during the interviews, and comparison of NLSY97 response data to other sources for external validation. We also watch carefully the “leading edge” respondents, who answer some questions before the bulk of the sample – for example, the first respondents to attend graduate school or to get a divorce. These respondents are often atypical, but their interviews can reveal problems in question functionality or comprehensibility.


A comprehensive pretest is planned as part of this information collection request and would occur approximately two months preceding each round of the main NLSY97 to test survey procedures and questions. This pretest includes a heterogeneous sample of 201 respondents of various racial, ethnic, geographic, and socio-economic backgrounds. On the basis of this pretest, the various questionnaire items, particularly those being asked for the first time, are evaluated with respect to question sensitivity and validity. When serious problems are revealed during the pretest, the problematic questions are deleted from the main NLSY97 instrument.


Although further edits to questionnaire wording are extremely rare, we monitor the first several hundred interviews each round with particular care. Based on this monitoring, field interviewers receive supplemental training on how best to administer questions that seem to be causing difficulty in the field or generating unexpected discrepancies in the data.


Round 15 questions that have not appeared in previous rounds of the NLSY97 include:


Questions on respondents’ siblings: These were tested in cognitive interviews in Fall 2010 and no problems were revealed. These items appear in the Household Information section.


Work schedule questions for respondents and their spouse/partner. These questions were developed in consultation with Professors Julia Henley and Susan Lambert of the University of Chicago School of Social Service Administration, who had fielded them with multiple samples of individuals in particular industries. These questions were also tested in cognitive interviews in Fall 2010. They appear in the Employment section of the interview.


Reservation wage questions. Especially during times of high unemployment, job search is of great importance. BLS plans to ask two questions from the NLSY79 in the Round 15 questionnaire, one about the lowest wage at which a respondent would accept a job offer (known to labor economists as the reservation wage), and one about the hours of desired work for such an offer. Data from these NLSY79 items were extensively analyzed and are expected to be of interest in Round 15.


Handedness questions. Handedness, or left/right dominance, is of interest to psychologists and neurologists, who have documented handedness as predictive of later neurological conditions. In addition, handedness has been studied in the NLSY79 as a predictor of workplace injuries. Handedness is believed to be highly persistent over the life course. These questions appear in the Tell Us What You Think section of the instrument. Similar handedness questions are asked as part of the NLSY79 Young Adult interview. These questions were included in Fall 2010 cognitive interviews.


Questions capturing open-ended speech. Questions capturing open-ended speech appear in the Employment and Tell Us What You Think sections of the Round 15 instrument. The items have been tested and fielded in a variety of settings, including the cognitive interviews conducted for Round 13, the Round 13 pretest, a NORC-funded trial of these types of code-switching items, and the Moving to Opportunity longitudinal survey sponsored by the U.S. Department of Housing and Urban Development and the MacArthur Foundation. Further information about these items is provided in Attachment 10.


More information about cognitive testing of questions on respondents’ siblings, work schedules for respondents and their spouses, and handedness can be found in the memo attached to the end of this document.


Childcare. Substantial revisions have been made to questions previously asked in the NLSY97 in the long version of the childcare section, especially those pertaining to use of in-home nonrelative care. These revisions are made on the basis of extensive cognitive interviewing and other pretesting of similar items for the Design Phase of the National Study of Child Care Supply and Demand. Results of those tests and recommendations for improved questionnaire design are documented in Bowman, Datta, and Yan (2010) and followed in the revisions for the current round.1



A list of all changes to the NLSY97 questionnaire from rounds 14 to 15 is shown in attachment 9.



5. Statistical Consultant

Kirk M. Wolter

Executive Vice President for Survey Research

NORC

55 East Monroe Street, Suite 3000

Chicago, IL 60603

(312) 759-4206


The sample was designed by NORC, which continues to conduct the interviewing fieldwork.



MEMO

TO: Rupa Datta

FROM: Marietta Bowman

RE: NLSY Cognitive Interviews (employment, schedule, job search, household/family structure, voting history and handedness)

DATE: November 30, 2010

Summary of respondents

A total of fifteen respondents participated in cognitive interviews between September 28 and November 16, 2010. All respondents were administered the same cognitive interview protocol. Additional prompting questions, which included revised response options, were added on November 2. Five of the respondents received these new prompts.

Recruiting was conducted through ads posted on Craigslist and a limited number of fliers posted at coffee shops, grocery stores, and libraries around the downtown area. All respondents indicated that they had seen the Craigslist advertisement. Cognitive interviews were conducted by one staff member at NORC’s 55 E. Monroe office. Interviews averaged 20.9 minutes, with the shortest lasting 16 minutes and the longest 27 minutes. The following table summarizes demographic characteristics of the cognitive interview participants.


 

| Characteristic | Category | Count (of 15) |
|---|---|---|
| Gender | Male | 8 |
| Gender | Female | 7 |
| Marital status | Single/widowed/divorced/separated (not co-habitating) | 4 |
| Marital status | Not married (co-habitating) | 6 |
| Marital status | Married (co-habitating) | 5 |
| Children living in HH | Yes | 3 |
| Children living in HH | No | 12 |
| Employment status | Works full-time | 5 |
| Employment status | Works part-time | 5 |
| Employment status | Not working | 4 |
| Employment schedule | Set schedule (works days) | 4 |
| Employment schedule | Set schedule (works evenings/nights/weekends) | 2 |
| Employment schedule | Variable schedule (shifts or assigned) | 2 |
| Employment schedule | Variable schedule (sets own schedule) | 3 |
| Employment schedule | n/a | 4 |
| Spouse/partner employment status | Works full-time (school/training) | 7 |
| Spouse/partner employment status | Works part-time | 2 |
| Spouse/partner employment status | Works more than one job | 1 |
| Spouse/partner employment status | Not working | 1 |
| Spouse/partner employment schedule | Set schedule (works days) | 4 |
| Spouse/partner employment schedule | Set schedule (works evenings/nights/weekends) | 1 |
| Spouse/partner employment schedule | Variable schedule (shifts or assigned) | 3 |
| Spouse/partner employment schedule | Variable schedule (sets own schedule) | 2 |
| Spouse/partner employment schedule | n/a | 1 |


Respondent and Spouse/Partner Schedule

All respondents were administered the same protocol in the cognitive interview, but the last five respondents were also offered the additional response categories from the November 2 revised protocol. The following summary describes respondents’ answers and reactions to the primary questions within the protocol.

  • For respondents who work multiple jobs or who are also in school, the employment and schedule questions did not make clear which activity should be the focal point. Multiple jobs may have different types of schedules and would therefore elicit different response patterns. Directing the focus to a primary job or activity would help to clarify this.

  • Respondents easily answered questions 5-7 and 25-27, regarding the average, minimum, and maximum number of weekly hours worked in the last month, although they tended to round to the nearest 5 hours.

  • Questions 8/28, which ask how the respondent or spouse/partner is paid on the job, were difficult for self-employed respondents, who may not be paid on a regular schedule.

  • In response to questions 9/29, which ask about the general times when the respondent or spouse/partner may work, a number of respondents worked across these categories: not split shifts, but, for example, a therapist who saw clients in both the afternoons and evenings.


Questions #10/30: “How far in advance do you usually know what days and hours you will need to work?”

  A. One week or less

  B. Between 1 and 2 weeks

  C. Between 3 and 4 weeks

  D. 4 weeks or more

Revised response options:

  E. My work schedule is set and does not change so I always know when I work

  F. I always work standard business hours


These items seemed to work well for people with shift or hourly schedules, who are generally assigned their times one or two weeks ahead. Respondents who work traditional business hours or who hold salaried jobs with either a set schedule (e.g., traditional business hours, Monday-Friday, 9 a.m.-5 p.m.) or considerable flexibility (e.g., self-employed or highly autonomous working environments) had some difficulty answering. Of the original ten respondents, half had no confusion in answering these questions. The additional response options were considered by the last five respondents: one specifically liked the new response options, three preferred the old ones, and one still found the responses problematic.

One problem with this item is that it has no reference period, so respondents answered it in different ways: some considered their normal or average schedule, while others considered any variation, even variation that is minimal but could occur every week. Respondents who focused on variation in an otherwise set work schedule, such as working longer hours than expected, tended to answer “A. One week or less.” They (or their spouse/partner) may actually work standard business hours (9 a.m.-5 p.m.) but weigh the exceptions rather than the consistency of the schedule. One respondent’s partner worked in an office (Monday-Friday, 9 a.m.-5 p.m.), yet she indicated that she knew his schedule one week or less ahead of time because of extra hours he might work. Researchers would likely describe this circumstance as “D. 4 weeks or more,” but that was not how she thought of it. When probed about other potential response options, she said that option E (“My work schedule is set and does not change so I always know when I work”) fit slightly better than response A (one week or less). For that respondent, “I always work standard business hours” was not a good response because the definition of “standard” was too subjective. Two other respondents with similar schedules each indicated that one of the revised response options more effectively captured their set schedules and related overtime.

Respondents with shift schedules had the easiest time selecting one of the original response options. One respondent was confused because he received his schedule in the middle of the week for the following weekend, so he had some difficulty establishing the reference week and therefore deciding between A and B; “less than” was an unclear term for him.


Questions #11/31:

Which of the following statements best describes how your working hours are decided?

  A. Starting and finishing times are decided by my employer and I cannot change them on my own;

  B. Starting and finishing times are decided by my employer but with my input;

  C. I can decide the time I start and finish work, within certain limits;

  D. I am entirely free to decide when I start and finish work.

Revised response options:

  E. Starting and finishing times are determined by client needs or other factors outside of my employer’s control

  F. I determine starting and finishing times for work based on personal or family responsibilities and schedules

  G. Starting and finishing times are decided by my employer or in coordination with my co-workers, but with my input


Again, of the original ten respondents, half were able to answer these questions with ease. All of the last five respondents also answered easily, with two indicating that the new response options were more applicable. The original response options worked well for individuals with specific shifts that allow little or no flexibility (e.g., security guard, call center). Self-employed respondents had some trouble with this question because, although they set their own hours, factors other than an employer influence when they work (e.g., client schedules, trying to keep traditional work hours). Other respondents working a variety of schedule types noted such external factors as well: some could decide with co-workers which shifts each would take, while others were allowed the flexibility by their employer to come in early or leave late at their own discretion and made those schedule decisions based on personal or other responsibilities. One respondent indicated that his wife has the option of two work shifts, and she and a co-worker determine who will work when. The same respondent works a flexible schedule but has to notify his employer and his staff as to what it will be. This led to the development of response option G, which reflects a different decision point on schedules than option B did.

Some employment was driven directly by client schedules or appointments, and in those instances response options E and G were more accurate than the original responses. One respondent said that the key word describing his wife’s schedule as a personal trainer was “flexible”: she works around her clients’ needs and her own schedule constraints. A man whose wife is a therapist indicated C as the best response to describe her scheduling of appointments with clients, while another respondent, who owned his own sales business, did not find any of the original response options sufficient (he was not offered the revisions) and indicated that client availability drove his own schedule. This led to the development of response option E.

Finally, two individuals indicated that they have the flexibility to accept or decline work shifts (e.g., catering work), but once they accept a shift they are committed to it. They therefore had some difficulty deciding between responses A and B. One respondent suggested another response option: “My employer sets specific schedule options (or shifts) and allows me to choose from those options.”


Questions 16-19: Seeking Employment

Four respondents were unemployed and currently seeking employment and were administered questions 16-19; overall, there were no problems with this question set. All four hoped to work 30-40 hours per week. The weak economy was a factor in the minimum required salary ranges reported. Two individuals with lower education levels (e.g., high school) stated that they would hope for $8-9 per hour but would settle for minimum wage because work was so scarce. All respondents had some assumptions about benefits or commute when considering the types of jobs they would take, but, with the exception of needing a car to get there, these were not prohibitive to accepting a job. A male respondent with a graduate degree assumed health insurance, retirement, and possibly commuter benefits when thinking about his salary range, although he was not able to give a specific minimum annual salary, noting the difficult economy and that he had been looking for a job for several months. A female respondent with a college education was the only person who did not stress economic need in finding a job; she reported that finding an employer who matched her ideological views was an important consideration.


Questions 36-46: Household and Family Composition

This question set generally worked well in collecting a list of the respondent’s siblings, with the exception of a couple of scenarios. The questions allowed respondents to report varying types of sibling relationships, including full-blood, half, step, and adopted siblings. When respondents did list their siblings, question 44 allowed for more detailed collection of those relationships (e.g., shared mother and/or father, adopted, etc.). A few items may need to be clarified:

  • Question 41: It is unclear whether deceased siblings should be included in the question 40 chart.

  • Question 42: It is unclear whether any additional siblings reported here should then be listed in the question 40 chart.

  • Question 44: Response option D should allow for a distinction between whether the respondent or the sibling is adopted.

The most problematic circumstance to capture involved respondents who did not have a personal relationship with certain siblings. For example, one respondent had a sister who was given up for adoption and whom he did not know. Another respondent was unable to report on any siblings because she does not have a relationship with any of them; her mother began having children at age 13, so the respondent was not sure how many siblings she had. A third respondent has a half-sister whom he does not consider family and whose exact age he does not know.

Most respondents were able to report their siblings’ ages easily. The most common methods for recalling the ages were the year of the sibling’s birth, the sibling’s age relative to the respondent’s own, and “I just know it.”


Voting Behavior

Every respondent was able to recall his or her voting behavior in November 2008 with ease and indicated a high level of confidence in the response. It is noteworthy that the interviews were conducted in Chicago, the hometown of Barack Obama, which appeared to increase the significance of this election for many individuals. For example, respondents indicated that they easily recalled not only voting but also attending the post-election celebration in Grant Park, or that they usually did not vote but decided to do so because they felt compelled to support a certain candidate. Some respondents also stated that they always vote, which is why they were confident that they had voted in 2008.


Handedness

Every respondent was able to answer these questions with ease and no confusion was reported.


1 Marietta Bowman, A. Rupa Datta, and Ting Yan, Design Phase of the National Study of Child Care Supply and Demand (NSCCSD): Cognitive Interview Findings Report for Demand Questionnaire, Chicago: National Opinion Research Center, January 31, 2010. Also available at: http://www.acf.hhs.gov/programs/opre/cc/design_phase/cog_interview_demand.pdf

