Leave Supplement to the American Time Use Survey

OMB: 1220-0191

September 2017


SUPPORTING STATEMENT B



B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe


The proposed Leave Module will directly follow the American Time Use Survey (ATUS) in 2018, and thus all respondents to the ATUS will be asked the module questions. The ATUS sample is drawn from households that have completed their final month of the Current Population Survey (CPS), so the universe is the same as that of the CPS. The universe for the CPS is the civilian noninstitutional population residing in occupied households. From this universe, the Census Bureau selects a sample of approximately 72,000 households each month, of which approximately 60,000 households are eligible for interviews. The Census Bureau actually interviews individuals in about 53,000 households each month. For more information about the CPS sample, see www.bls.gov/cps/sample_redesign_2014.pdf.


Households that have completed their final (8th) CPS interview become eligible for selection in the ATUS. About 2,060 of these households are selected for the ATUS sample each month. The ATUS sample is a stratified, three-stage sample. In the first stage of selection, the CPS oversample in the less populous States is reduced. In the second stage of selection, households are stratified based on the following characteristics: race/ethnicity of the household reference person, presence and age of children, and the number of adults in adult-only households. In the third stage of selection, an eligible person from each household selected in the second stage is randomly selected as the designated person (respondent) for the ATUS. An eligible person is a civilian household member at least 15 years of age.


The sample persons are then randomly assigned a designated reference day (a day of the week for which they will be reporting) and an initial interview week (the week the case is introduced). In order to ensure accurate measures of time use on weekdays and weekend days, the sample is split evenly between weekdays and weekend days. Ten percent of the sample is allocated to each weekday and 25 percent of the sample is allocated to each weekend day. For more information about the ATUS sample see chapter 3 of the ATUS User's Guide: www.bls.gov/tus/atususersguide.pdf.
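The allocation described above (10 percent of the sample to each weekday, 25 percent to each weekend day) can be illustrated with a minimal sketch. This is not ATUS production code; the function name and the use of a simple weighted random draw are assumptions for illustration only.

```python
import random

# Allocation described in the text: 10 percent of the sample to each
# weekday and 25 percent to each weekend day (5*0.10 + 2*0.25 = 1.0).
DAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
WEIGHTS = [0.10] * 5 + [0.25] * 2

def assign_reference_days(n_persons, seed=0):
    """Randomly assign each sampled person a designated reference day
    using the weekday/weekend allocation above (illustrative only)."""
    rng = random.Random(seed)
    return rng.choices(DAYS, weights=WEIGHTS, k=n_persons)

# Roughly one month of ATUS sample (about 2,060 households).
days = assign_reference_days(2060)
weekend_share = sum(d in ("Sat", "Sun") for d in days) / len(days)
```

Because the two weekend days each receive 25 percent of the sample, about half of all diaries describe a Saturday or Sunday, which supports equally precise estimates for weekdays and weekend days.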


2. Description of Procedures


  1. Estimation Procedures


A complete description of the estimation procedures for the ATUS can be found in chapter 7 of the ATUS User’s Guide: www.bls.gov/tus/atususersguide.pdf.





  2. Data Collection


The 2018 Leave Module is associated with the ATUS and thus the procedures for data collection are the same as those of the ATUS. All ATUS interviews are conducted using Computer Assisted Telephone Interviewing (CATI) technology. Interviewers from the U.S. Census Bureau's Jeffersonville Call Center in Jeffersonville, Indiana, conduct the interviews and assign the activity codes.


The ATUS interview is a combination of structured questions and conversational interviewing. For the household roster update, employment status questions, the CPS updates, and the proposed Leave Module questions, Census Bureau interviewers read the question on the screen and enter the appropriate response. For the time-use diary and subsequent summary questions on childcare, paid work, volunteering, and eldercare, the interviewer uses conversational interviewing to more flexibly interview respondents, filling in the diary grid as questions are answered.


The data collection instrument includes an edit check that ensures all cells are filled before the interviewer exits the diary. Extensive interviewer training has been provided on how to do conversational interviewing—including when to selectively probe for adequate information to code activities. Refresher training is conducted periodically. Interviews are regularly monitored by supervisors, coaches, and BLS sponsors to evaluate conversational interviewing performance. Because the interviewers also are responsible for coding activity information collected in the time diary, they understand the level of detail that must be collected during the interview. Interviewers never code data from the interviews they conducted. A coding verification and adjudication process is in place to ensure activities are accurately coded. Verification continues to be done at 100 percent to ensure high and consistent data quality.


3. Methods to Maximize Response


The proposed module will be attached to the ATUS and the transition between the two will be seamless. In 2016, most people (97 percent) who participated in the ATUS also completed the module running at the time, the Eating and Health Module; because of this, the present discussion focuses on response to the ATUS.


The 2001 ATUS field test examined the effectiveness of incentives, sending advance materials by priority mail, doubling the number of eligible interviewing days by using a day-of-week substitution methodology, calling in advance to set interview appointments, “recycling” cases for field visits, and extending the field period from 4 to up to 8 weeks. (See Attachment F.) Testing showed that incentives significantly increased response rates. “Recycling” cases to the field—that is, turning nonresponse cases over to interviewers to conduct face-to-face interviews in the respondent’s home—also was effective in maximizing response rates, particularly for no-telephone-number households. However, offering incentives to all respondents and recycling cases were both cost-prohibitive. Thus, incentives are sent only to persons for whom the Census Bureau does not have a working telephone number. In mid-2008 and again in mid-2011, the ATUS expanded the definition of no-telephone-number households to include households with non-viable telephone numbers (e.g., “number could not be completed as dialed”). These households have characteristics similar to those of other no-telephone-number households. Incentives currently are offered to about 9 percent of the ATUS sample.


Findings from the 2001 study showed that calling in advance to set an appointment (“proactive appointment setting”) did not improve response, and completed interviews using that strategy required 70 percent more contact attempts than other completed interviews. As a result, advance appointment setting was rejected. Day-of-week substitution increased response rates by about 4 percentage points over 8 weeks; however, it led to a disproportionately high number of completed interviews on Wednesdays and a disproportionately low number on Fridays. To maintain the integrity of the day-of-week distribution of the sample, substitution was also rejected.


Consistent with the survey methods literature, priority mail appears to have increased response rates in the ATUS field test—by over 10 percentage points. It is relatively low-cost to implement (about $6.65 per mailing in 2016) and is currently used for sending advance materials. The optimal field period length varies depending on incentive use. Without an incentive, the field test showed that an 8-week fielding period was required for the response rate to approach 70 percent (69 percent was achieved in the field test). As a result, this 8-week fielding period was adopted for full production. To even out workload and measure time use across days of the month, one quarter of the sample is introduced each week for 4 weeks. Active cases are called up to 8 times per day on one eligible day each week for 8 weeks.


To maximize response, a toll-free number is provided to all eligible respondents in the advance materials. They can use the number to call in and set an appointment or to complete the interview (if they call on an eligible interviewing day). In addition, interviewers have job aids—answers to frequently asked questions (FAQs)—designed to help answer questions about the survey and to assist them in gaining respondents’ cooperation to participate.


In 2016, the survey’s overall unweighted response rate by sample month was 46.8 percent, and the weighted response rate was 47.1 percent. During 2016 data processing, a small percentage of completed cases were eliminated for data quality reasons. As a result, the final unweighted response rate was 45.1 percent after processing, and the weighted response rate was 45.5 percent after processing. The BLS and the Census Bureau have conducted a number of analyses of non-response in ATUS. In particular, BLS and Census have done or are doing the following to test and address response rate issues:


  • Conducted in-depth critiques and revisions of ATUS advance letters and brochures (see Attachment G), resulting in improved readability of the materials and changes to address common questions about the ATUS

  • Translated advance materials and refusal conversion materials into Spanish in order to better target Spanish-speaking households

  • Developed a refusal conversion letter

  • Revised evening call operations at the Census interviewing center

  • Implemented a policy of conducting more research into invalid phone numbers and trained interviewers to conduct this research on a more timely, interactive basis

  • Increased interviewer motivation by setting weekly goals

  • Conducted comprehensive analyses of non-response bias (see Attachment H for a list of ATUS nonresponse bias studies)

  • Developed a website containing information for ATUS respondents (www.bls.gov/respondents/tus/home.htm)

  • Evaluated returned mail (such as advance letters) to see if cases were movers and to better investigate wrong or incomplete addresses (see Attachment I)

  • Developed an ATUS-specific “gaining cooperation” workshop to teach interviewers techniques to increase respondent cooperation, and incorporated this material into other periodic training courses

  • Implemented a periodic newsletter to inform interviewers and improve interviewer morale

  • Investigated incomplete cases to identify possible causes of noncontact or refusal (such as non-viable telephone numbers) and converted some cases to incentive cases

  • Researched the feasibility of assigning cases that are soft refusals to refusal conversion specialists as soon as the case enters the field

  • Scrutinized and revised interviewer operations in several ways in order to increase the probability of completed interviews, such as redesigning the call blocks to add more call attempts during evening hours (see Attachment J)

  • Investigated the incidence and impact of cell phones on ATUS response rates and data quality (see Attachment K)

  • Examined the feasibility of implementing a substitution-of-day mechanism in the ATUS (see Attachment L)

  • Investigated the effects of substituting the designated respondent within a household on the ATUS data (see Attachment L)

  • Implemented a "We've been trying to reach you" letter that is sent via fax when ATUS calls reach fax machines

  • Added FAQs to the collection instrument that ATUS interviewers can easily reference to respond to respondents' concerns

  • Researched the possibility of implementing web-based data collection (see Attachment M)
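The unweighted and weighted response rates quoted above differ only in whether each sampled case counts equally or in proportion to its base (selection) weight. The sketch below illustrates that distinction with entirely hypothetical cases and weights; none of the numbers come from actual ATUS files.

```python
# Hypothetical cases: a completion flag and a base (selection) weight.
# These values are invented for illustration only.
cases = [
    {"completed": True,  "weight": 1200.0},
    {"completed": False, "weight": 900.0},
    {"completed": True,  "weight": 1500.0},
    {"completed": False, "weight": 2000.0},
]

def unweighted_rate(cases):
    """Share of cases completed, counting each case equally."""
    return sum(c["completed"] for c in cases) / len(cases)

def weighted_rate(cases):
    """Share of the summed base weights accounted for by completed cases."""
    completed = sum(c["weight"] for c in cases if c["completed"])
    total = sum(c["weight"] for c in cases)
    return completed / total

print(f"unweighted: {unweighted_rate(cases):.1%}")  # prints "unweighted: 50.0%"
print(f"weighted:   {weighted_rate(cases):.1%}")    # prints "weighted:   48.2%"
```

When nonrespondents carry larger weights than respondents (as in this toy example), the weighted rate falls below the unweighted rate; the reverse held for the 2016 ATUS figures cited above.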


4. Testing of Procedures


The proposed 2018 Leave Module is identical to the 2017 Leave Module. The 2017 Leave Module included several questions that were not in the 2011 module; these questions were reviewed by survey methods experts and cognitively tested. Unlike the 2011 Leave Module, the 2017 version did not include questions collecting details about the specific types of leave plans offered by respondents’ employers or general health questions. Several questions were modified to collect more detailed information about job flexibility. The job flexibility section also included new questions about advance notice of and input into one’s work schedule, shift work, and the ability to work from home and the frequency of doing so. See the Cognitive Testing Results for the 2017 Leave Module (Attachment E) for more information.


Some of the questions appearing on the proposed module were cognitively tested in 2010 before becoming a part of the 2011 Leave Module. See Attachment D for the full report.


A version of the advance notice question was fielded in the NLSY97 from 2011 to 2015, and also included in the 2014 General Social Survey. The Women’s Bureau wanted to ask this question of workers who, in an earlier question, said that their employers decide their work schedules without employee input. The final question as it appears on the 2017 Leave Module is as follows:


Advance Notice Question in the 2017 Leave Module

How far in advance do you usually know what days and hours you will need to work?

  • Less than one week

  • From 1 to 2 weeks (not including reports of 2 weeks)

  • From 2 to 3 weeks (not including reports of 3 weeks)

  • From 3 to 4 weeks (not including reports of 4 weeks)

  • 4 weeks or more


BLS proposed a change to the response options in the NLSY97 beginning in the fall of 2017. The change captures more detail than “less than one week” when recording responses. BLS staff were asked to consider implementing this change to the advance notice question in the ATUS for any future years of the Leave Module (see Attachment O). BLS examined Leave Module data from the first quarter of 2017 (the only data available at this time); the data do not support a need to refine the response categories. Further, fielding the Leave Module again in 2018 will allow researchers to combine and analyze the 2017-18 Leave Module data, and any changes between the years would make this challenging.

5. Contact Persons


The following individuals may be consulted concerning the statistical data collection and analysis operation:


Statistical Design:

Antoinette Lubich

Demographic Statistical Methods Division

U.S. Census Bureau


Statistical Analysis:

Rachel Krantz-Kent

Program Manager

American Time Use Survey

Bureau of Labor Statistics

Data Collection/Survey Design:

Beth Capps  

Assistant Survey Director for the American Time Use Survey

Associate Director for Demographic Programs

U.S. Census Bureau



Attachments:

A. Proposed Leave Module Questionnaire

B. Legal Authority

C. ATUS Advance materials

D. Cognitive Testing Results for the 2011 Leave Module

E. Cognitive Testing Results for the 2017 Leave Module

F. ATUS Field Test Analysis

G. Advance Materials Re-evaluation

H. Summary of ATUS Nonresponse Bias Studies

I. Returned Mail Analysis

J. Call Block Research Study

K. Cell Phone Research

L. Westat Final Report – Substitution of Days

M. Westat Final Report – Web Collection

N. CPS unpublished Table A7 2016 Median Hourly Earnings, Page 2

O. 2017 Advance Notice of Schedule Question Discussion



