Supporting Statement B_1220-0185 ATUS Well-being_2020


Well-being Supplement to the American Time Use Survey

OMB: 1220-0185


ATUS Well-being Module

OMB Control Number 1220-0185

OMB Expiration Date: Reinstatement


SUPPORTING STATEMENT FOR

ATUS WELL-BEING MODULE


OMB CONTROL NO. 1220-0185



1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The proposed Well-being Module will directly follow the American Time Use Survey (ATUS) in 2021, and thus all respondents to the ATUS will be asked the module questions. The ATUS sample is drawn from the Current Population Survey (CPS), so the ATUS universe is the same as the CPS universe. From this universe, the Census Bureau selects a sample of approximately 72,000 households each month, of which approximately 60,000 households are eligible for interviews. The Census Bureau actually interviews individuals in about 53,000 households each month. For more information about the CPS sample, see chapters 2-1 and 2-2 of Design and Methodology: Current Population Survey, Technical Paper 77 (https://www2.census.gov/programs-surveys/cps/methodology/CPS-Tech-Paper-77.pdf).


Households that have completed their 8th CPS interview become eligible for selection in the ATUS. About 2,060 of these households are selected for the ATUS sample each month. The ATUS sample is a stratified, three-stage sample. In the first stage of selection, the CPS oversample in the less populous States is reduced. In the second stage of selection, households are stratified based on the following characteristics: race/ethnicity of householder, presence and age of children, and the number of adults in adult-only households. In the third stage of selection, an eligible person from each household selected in the second stage is selected as the designated person (respondent) for the ATUS. An eligible person is a civilian household member at least 15 years of age.


The sample persons are then randomly assigned a designated reference day (a day of the week for which they will be reporting) and an initial interview week code (the week the case is introduced). In order to ensure accurate measures of time spent on weekdays and weekend days, the sample is split evenly between weekdays and weekend days. Ten percent of the sample is allocated to each weekday and 25 percent of the sample is allocated to each weekend day. For more information about the ATUS sample see chapter 3 of the ATUS User's Guide: http://www.bls.gov/tus/atususersguide.pdf.
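The allocation described above can be sketched in a few lines of Python. This is an illustrative check only, not BLS production code: 10 percent of the sample to each of the five weekdays and 25 percent to each weekend day, so weekdays and weekend days each receive half the sample.

```python
# Hypothetical sketch of the ATUS reference-day allocation (illustration only).
WEEKDAYS = ["Mon", "Tue", "Wed", "Thu", "Fri"]
WEEKEND = ["Sat", "Sun"]

# Shares expressed in whole percentage points: 10 per weekday, 25 per weekend day.
allocation = {day: 10 for day in WEEKDAYS}
allocation.update({day: 25 for day in WEEKEND})

weekday_total = sum(allocation[d] for d in WEEKDAYS)  # 5 x 10 = 50 percent
weekend_total = sum(allocation[d] for d in WEEKEND)   # 2 x 25 = 50 percent

# The sample is split evenly between weekdays and weekend days.
assert weekday_total == weekend_total == 50
assert sum(allocation.values()) == 100
```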


In 2019, the overall response rate for the ATUS was 40.1 percent, for a total of 9,435 respondents. Because the Well-Being Module will start in March 2021, the number of potential respondents is estimated to be 7,860. The 2013 Well-Being Module was completed by 10,378 respondents. The 2021 Well-Being Module is estimated to have fewer respondents than in 2013 due to an overall decline in response rates as well as two fewer months of collection (with a March 2021 targeted start).


Estimated Number of Respondents for 2021 Well-Being Module
(Mar. 2021-Dec. 2021)

  ATUS Universe (Persons):                             1,960 eligible cases per month
  Well-Being Module Universe (Persons):                1,960 eligible cases per month
  2019 ATUS Response Rate:                             40.1 percent
  Estimated Well-Being Module Respondents per Month:   786
  Estimated Total Well-Being Module Respondents:       7,860
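The figures in the table follow from simple arithmetic, sketched below in Python (an illustration only, not part of the survey systems): the monthly eligible cases times the 2019 response rate, summed over the 10 collection months.

```python
# Hypothetical sketch reproducing the respondent estimate in the table above.
eligible_per_month = 1960     # Well-Being Module universe, persons
response_rate = 0.401         # 2019 ATUS response rate
months = 10                   # Mar. 2021 through Dec. 2021

respondents_per_month = round(eligible_per_month * response_rate)  # 786
total_respondents = respondents_per_month * months                 # 7,860
print(respondents_per_month, total_respondents)
```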



2. Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The 2021 Well-being Module is associated with the ATUS and thus the procedures for data collection are the same as those of the ATUS. All ATUS interviews are conducted using Computer Assisted Telephone Interviewing (CATI) technology. Interviewers from the U.S. Census Bureau's National Processing Center in Jeffersonville, Indiana, conduct the interviews and assign the activity codes.


The ATUS interview is a combination of structured questions and conversational interviewing. For the household roster update, employment status questions, the CPS updates, and the proposed Well-being Module questions, Census Bureau interviewers read the question on the screen and enter the appropriate response. For the time-use diary and subsequent summary questions on childcare, paid work, volunteering, and eldercare, the interviewer more flexibly interviews the respondent, filling in the diary grid as questions are answered.


The data collection instrument includes an edit check that ensures all cells are filled before the interviewer exits the diary. Extensive interviewer training has been provided in how to do conversational interviewing—including when to selectively probe for adequate information to code activities. Refresher training is conducted at least annually. Interviews are periodically monitored by supervisors, coaches, and BLS sponsors to evaluate conversational interviewing performance. Because the interviewers also are responsible for coding activity information collected in the time diary, they understand the level of detail that must be collected during the interview. Interviewers never code data from the interviews they conducted. A coding verification and adjudication process is in place to ensure activities are accurately coded. Verification continues to be done at 100 percent to ensure high and consistent data quality.


A complete description of the estimation procedures for the ATUS can be found in chapter 7 of the ATUS User’s Guide: www.bls.gov/tus/atususersguide.pdf.



3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


The proposed module will be attached to the ATUS and the transition between the two will be seamless. In 2013, most respondents (94 percent) who participated in the ATUS also completed the Well-being Module; because of this, the present discussion focuses on response to the ATUS.

A number of efforts have been undertaken to maximize ATUS survey response rates.

  1. Field Test. The 2001 field test examined the effectiveness of incentives, sending advance materials by priority mail, doubling the number of eligible interviewing days by using a day-of-week substitution methodology, calling in advance to set interview appointments, “recycling” cases for field visits, and extending the field period from 4 to up to 8 weeks. (See Attachment B.)

  2. Use of incentives and recycling cases to the field. As discussed in Part A, section 9, testing showed that incentives significantly increased response rates. “Recycling” cases to the field—that is, turning nonresponse cases over to interviewers to conduct face-to-face interviews in the respondent’s home—was also effective in maximizing response rates, particularly for no-telephone-number households. However, providing incentives to all respondents and recycling cases were both cost prohibitive.


  3. Appointment setting. Calling in advance to set an appointment (“proactive appointment setting”) did not improve response, and completed interviews using that strategy required 70 percent more contact attempts than other completed interviews. As a result, advance appointment setting was rejected.


  4. Day-of-week substitution. Allowing day-of-week substitution increased response rates by about 4 percentage points over 8 weeks; however, this practice led to a disproportionately high number of completed interviews on Wednesdays and a disproportionately low number on Fridays. To maintain integrity in the day-of-week distribution of the sample, substitution was also rejected.


  5. Use of priority mail. Consistent with survey methods literature, priority mail appears to have increased response rates in the ATUS field test—by over 10 percentage points. It is relatively low cost to implement ($5.15 per mailing in 2020) and is currently used for sending advance materials.


  6. Fielding period. The optimal field period length varies depending on incentive use. Without an incentive, the field test showed that an 8-week fielding period was required to approach a 70-percent response rate (69 percent was achieved in the field test). As a result, this 8-week fielding period was adopted for full production. To even out workload and measure time use across days of the month, one quarter of the monthly sample is introduced each week for 4 weeks. Active cases are called up to 7 times per day on one eligible day each week for 8 weeks.


  7. Incentive expansions. Two OMB-approved incentive expansions were implemented in recent years. As of 2013, incentives are sent to designated persons (DPs) in no-telephone-number households as well as to individuals for whom the Census Bureau assigned a call outcome code of 108 (number not in service), 109 (number changed, no new number given), 124 (number could not be completed as dialed), or 127 (temporarily not in service) after the first week of collection. (See Attachment C.) The use of incentives has helped to boost response among difficult-to-reach populations. Individuals who are sent incentives are more likely to be black, of Hispanic or Latino ethnicity, to have less education, and to have lower household incomes than members of households that provide phone numbers.


BLS fielded a new incentive study starting with the December 2019 sample. The ATUS incentive study has two goals. The first is to test the effectiveness of $0, $5, and $10 cash incentives, where effectiveness is measured in terms of survey response. The second is to test whether a $5 or $10 cash incentive can boost survey response among certain underrepresented populations; in this study, the focus is on sampled persons who are 15 to 24 years old. Data collection for the incentive study began with the December 2019 sample and will continue through the September 2021 sample, after which the data will be examined to address the study’s two goals and to conduct additional analyses. (See Attachment K.)



  8. Toll-free number provided to DPs. To maximize response, a toll-free number is provided to all eligible respondents in the advance materials. They can use the number to call in and set an appointment for an interview or, if they call on their interview day, to complete the interview.



  9. Advance materials revised. In 2005, an examination of the ATUS advance materials was undertaken and the advance materials were subsequently revised. The advance materials were reviewed and updated again in 2012-13. The advance letters were revised to include information commonly asked by respondents during their first contact with interviewers. The ATUS brochure was updated and redesigned to appeal to more respondents. The debit card and instruction sheet also were redesigned to appear more prominently in the advance mailer envelope. These materials were modified based on feedback received from expert reviewers and focus groups of ATUS interviewers who examined existing materials. (See Attachments D, E, F, and G.)


  10. Respondent Web site. BLS developed a Web site to address common respondent questions about the survey. Its web address is included in the advance letters (http://www.bls.gov/respondents/tus/home.htm).


  11. Fax letters. BLS worked with Census to develop "we've been trying to reach you" letters to fax to telephone numbers that reach fax machines. Like an answering machine message, the fax letters ask the sampled person to call the Census Bureau and complete an interview.


  12. Interview operations analysis. In 2004, telephone call center operations were examined to determine if measures could be taken to increase response rates, and three basic operations were changed. First, the ATUS staff learned that while many surveys set calling goals for interviewers, the call center management was not providing ATUS interviewers with daily or weekly goals. Beginning in the summer of 2004, the telephone center management set daily goals for ATUS interviewers, providing concrete guidelines for how many completed calls are desired. Although the interviewers do not always meet their goals, these goals help the telephone center management measure daily progress and motivate the interviewers. Second, it was discovered that because of the way call blocks (times) were scheduled, many calls were being made between about 4:30 pm and 5:00 pm, before many people were home from work. Calling methods were changed so that more calls would be made after 5:30 pm, when people who work regular 9-5 hours would be more likely to be home. Finally, the Census Bureau conducted more research into invalid phone numbers in an attempt to find valid phone numbers for the contact person.


  13. Interviewer job aids. Interviewers have job aids—answers to frequently asked questions—designed to help answer questions about the survey and to assist them in gaining respondents' cooperation to participate.


  14. Interviewer incentives. An interviewer incentive study was considered but subsequently rejected because implementing interviewer incentives would have been cost prohibitive.


  15. Newsletters. In cooperation with Census, BLS periodically produces newsletters that are designed to motivate and inform interviewers.


  16. Interviewer training. BLS and Census have conducted workshops for interviewers on techniques to gain cooperation from respondents, and much of the material developed for this training was incorporated into other interviewer training courses. Interviewer operations also have been scrutinized and revised to increase the probability of completed interviews, such as redesigning the call blocks to add more call attempts during evening hours.


  17. Studies to understand nonresponse and possible nonresponse bias. In addition to the efforts listed above, a number of studies have been done to understand nonresponse in the ATUS. See Attachment H for a summary of ATUS nonresponse bias studies.


  18. Web collection of ATUS diary. BLS consulted with Westat to explore the feasibility of using a mixed-mode design that includes the collection of ATUS data via a Web instrument. A move to a mixed-mode design could potentially help the ATUS improve response and be prepared for the survey climate of the future. The project included a literature review of web and mixed-mode data collection and provided recommendations on the design of web data collection for the ATUS, including respondent allocation and contact strategies and question design considerations for a web instrument. The project also included a discussion of comparability issues between web and telephone data collection, with methods to evaluate the proposed design, including errors of nonobservation (e.g., coverage and nonresponse error) and errors of observation (e.g., measurement error). Westat also provided a preliminary mockup of the recommended diary design. (See Attachment I.)




4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Most of the questions that appear on the proposed module were cognitively tested in 2009 before becoming part of the 2010 Well-being Module. See Attachment L for the full report.


The 2012 and 2013 Well-being Modules included two additional questions that were not included in the 2010 Well-being Module. Both of these questions were reviewed by survey methods experts and cognitively tested. See the Cognitive Testing Results for the 2012 Well-being Module (Attachment M) for more information. The 2021 Well-Being Module is identical to the 2012 and 2013 modules.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The following individuals may be consulted concerning the statistical data collection and analysis operation:


Statistical Design:

Antoinette Lubich

Demographic Statistical Methods Division

U.S. Census Bureau


Statistical Analysis:

Rachel Krantz-Kent

Program Manager

American Time Use Survey

Bureau of Labor Statistics


Data Collection/Survey Design:

Beth Capps  

Assistant Survey Director for the American Time Use Survey

Associate Director for Demographic Programs

U.S. Census Bureau




Attachments:

A - 2021 Well-being Module questionnaire

B - Field Test Analysis

C - Incentive Expansion OMB Memo

D - ATUS Debit Card Mailer

E - Advance Letters

F - Advance Brochure

G - Advance Materials Reevaluation

H - Summary of Nonresponse Bias Studies

I - ATUS Web Collection Study

J - Table A7 Median Hourly Earnings

K - Cash Incentive Study Proposal for the American Time Use Survey

L - Cognitive Testing Results for the Core Well-being Module

M - Cognitive Testing Results for Well-being Module Questions added in 2012

N - Legal Authority
