SUPPORTING STATEMENT
1. Respondent Universe
The proposed Well-being Module will directly follow the American Time Use Survey (ATUS) in 2013, and thus all respondents to the ATUS will be asked the module questions. The ATUS sample is drawn from households that have completed their final month of the Current Population Survey (CPS), so the universe is the same as that of the CPS. The universe for the CPS is the civilian non-institutional population residing in occupied households. From this universe of individuals in about 111 million households, the Census Bureau selects a sample of approximately 72,000 households each month, of which approximately 60,000 households are eligible for interviews. The Census Bureau actually interviews individuals in about 55,000 households each month. See chapter 3 of Technical Paper 66 at http://www.census.gov/prod/2006pubs/tp-66.pdf for more information about the CPS sample.
Households that have completed their final (8th) CPS interview become eligible for selection in the ATUS. About 2,200 of these households are selected for the ATUS sample each month. The ATUS sample is a stratified, three-stage sample. In the first stage of selection, the CPS oversample in the less populous States is reduced. In the second stage of selection, households are stratified based on the following characteristics: race/ethnicity of householder, presence and age of children, and the number of adults in adult-only households. In the third stage of selection, an eligible person from each household selected in the second stage is randomly selected as the designated person (respondent) for the ATUS. An eligible person is a civilian household member at least 15 years of age.
The sample persons are then randomly assigned a designated reference day (a day of the week for which they will be reporting) and an initial interview week code (the week the case is introduced). In order to ensure accurate measures of time spent on weekdays and weekend days, the sample is split evenly between weekdays and weekend days. Ten percent of the sample is allocated to each weekday and 25 percent of the sample is allocated to each weekend day. For more information about the ATUS sample see chapter 3 of the ATUS User's Guide: http://www.bls.gov/tus/atususersguide.pdf.
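For illustration only, the sketch below shows one way the day-of-week allocation described above (10 percent of cases on each weekday and 25 percent on each weekend day) could be applied when randomly assigning designated reference days. This is not ATUS production code; the case identifiers, function name, and random seed are hypothetical.

# Illustrative sketch only: assigning designated reference days so that
# 10 percent of cases fall on each weekday and 25 percent fall on each
# weekend day, consistent with the allocation described above.
import random

DAYS = ["Sunday", "Monday", "Tuesday", "Wednesday",
        "Thursday", "Friday", "Saturday"]
# 25 percent for each weekend day, 10 percent for each weekday (sums to 1.0).
WEIGHTS = [0.25, 0.10, 0.10, 0.10, 0.10, 0.10, 0.25]

def assign_reference_days(designated_persons, seed=None):
    """Randomly assign each designated person a reference day of the week."""
    rng = random.Random(seed)
    return {person: rng.choices(DAYS, weights=WEIGHTS, k=1)[0]
            for person in designated_persons}

# Hypothetical monthly panel of about 2,200 designated persons.
panel = ["case_%04d" % i for i in range(2200)]
assignments = assign_reference_days(panel, seed=2013)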
2. Description of Procedures
Estimation Procedures
A complete description of the estimation procedures for the ATUS can be found in Part B of the ATUS Supporting Statement at: http://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=201006-1220-002. Estimation procedures to use with the Well-being Module data can be found in the Well-being Module data dictionary: http://www.bls.gov/tus/wbmintcodebk.pdf.
Data Collection
The 2013 Well-being Module is associated with the ATUS and thus the procedures for data collection are the same as those of the ATUS. All ATUS interviews are conducted using Computer Assisted Telephone Interviewing (CATI) technology. Interviewers from the U.S. Census Bureau's National Processing Center in Jeffersonville, Indiana, conduct the interviews and assign the activity codes.
The ATUS interview is a combination of structured questions and conversational interviewing. For the household roster update, employment status questions, the CPS updates, and the proposed Well-being Module questions, Census Bureau interviewers read the question on the screen and enter the appropriate response. For the time-use diary and subsequent summary questions on childcare, paid work, volunteering, and eldercare, the interviewer more flexibly interviews the respondent, filling in the diary grid as questions are answered.
The data collection instrument includes an edit check that ensures all cells are filled before the interviewer exits the diary. Extensive interviewer training has been provided on how to conduct conversational interviewing, including when to selectively probe for adequate information to code activities. Refresher training is conducted at least annually. Interviews are periodically monitored by supervisors, coaches, and BLS sponsors to evaluate conversational interviewing performance. Because the interviewers also are responsible for coding the activity information collected in the time diary, they understand the level of detail that must be collected during the interview. Interviewers never code data from interviews they conducted themselves. A coding verification and adjudication process is in place to ensure activities are accurately coded. Verification continues to be performed on 100 percent of coded cases to ensure high and consistent data quality.
3. Methods to Maximize Response
The proposed module will be attached to the ATUS and the transition between the two will be seamless. In 2010 and 2012, most people who participated in the ATUS also completed the Well-being Module (97 percent in 2010 and 91 percent in the first two quarters of 2012); because of this, the present discussion focuses on response to the ATUS.
The 2001 ATUS field test examined the effectiveness of incentives, sending advance materials by priority mail, doubling the number of eligible interviewing days by using a day-of-week substitution methodology, calling in advance to set interview appointments, “recycling” cases for field visits, and extending the field period from 4 to up to 8 weeks. (See Attachment E.) Testing showed that incentives significantly increased response rates. “Recycling” cases to the field (that is, turning nonresponse cases over to interviewers to conduct face-to-face interviews in the respondent’s home) also was effective in maximizing response rates, particularly for no-telephone-number households. However, offering incentives to all respondents and recycling cases were both cost prohibitive. Incentives currently are offered to just over 5 percent of the sample, the portion for which the Census Bureau does not have a telephone number. In mid-2008 and again in mid-2011, ATUS expanded the definition of no-telephone-number households to include households with non-viable telephone numbers (e.g., “number could not be completed as dialed”). These households have characteristics similar to those of other no-telephone-number households.
Calling in advance to set an appointment (“proactive appointment setting”) did not improve response, and completed interviews using that strategy required 70 percent more contact attempts than other completed interviews. As a result, advance appointment setting was rejected. Day-of-week substitution increased response rates by about 4 percentage points over 8 weeks; however, it led to a disproportionately high number of completed interviews on Wednesdays and a disproportionately low number on Fridays. To maintain the integrity of the day-of-week distribution of the sample, substitution also was rejected.
Consistent with the survey methods literature, priority mail appears to have increased response rates in the ATUS field test by over 10 percentage points. It is relatively low cost to implement (about $5.15 per mailing) and is currently used for sending advance materials. The optimal field period length varies depending on incentive use. Without an incentive, the field test showed that an 8-week fielding period was required for the response rate to approach 70 percent (69 percent was achieved in the field test). As a result, this 8-week fielding period was adopted for full production. To even out interviewer workload and measure time use across days of the month, one quarter of the sample is introduced each week for 4 weeks. Active cases are called up to 8 times per day on one eligible day each week for 8 weeks.
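To make the calling rules above concrete, the following sketch (not ATUS production code; the start date and function name are hypothetical) enumerates the one eligible interviewing day per week over the 8-week fielding period and the resulting maximum number of call attempts for a case.

# Illustrative sketch only: eligible interviewing days and maximum call
# attempts for a single case during the 8-week fielding period.
from datetime import date, timedelta

FIELD_WEEKS = 8          # length of the fielding period, in weeks
MAX_CALLS_PER_DAY = 8    # up to 8 call attempts on each eligible day

def eligible_call_days(first_eligible_day):
    """Return the single eligible interviewing day in each week of the field period."""
    return [first_eligible_day + timedelta(weeks=w) for w in range(FIELD_WEEKS)]

# Hypothetical case with a Tuesday interviewing day, introduced in January 2013.
days = eligible_call_days(date(2013, 1, 8))
max_attempts = len(days) * MAX_CALLS_PER_DAY   # 8 days x 8 calls = up to 64 attempts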
To maximize response, a toll-free number is provided to all eligible respondents in the advance materials. They can use the number to call in and set an appointment or to complete the interview (if they call on an eligible interviewing day). In addition, interviewers have job aids—answers to frequently asked questions (FAQs)—designed to help answer questions about the survey and to assist them in gaining respondents’ cooperation to participate.
Since its inception in 2003, the ATUS has had annual response rates ranging from 52.5 percent to 57.8 percent. In 2011, the response rate was 54.6 percent. (These are pre-processing response rates.) A number of initiatives have been undertaken to understand and address nonresponse. Attachment F lists seven studies on nonresponse in the ATUS and provides information about their major findings. These studies have been done by BLS, the U.S. Census Bureau, and outside researchers. (ATUS survey methods files are publicly available.) In addition to these studies, BLS and the U.S. Census Bureau have undertaken several projects that target response rates and seek to improve them:
An analysis by the Census Bureau focused on why response rates dropped between prefielding in 2002 and full production (see Attachment G).
An analysis of returned mail was completed by the Census Bureau to assess its address review process, assign more accurate case outcome codes, and improve incentive case response rates (see Attachment H).
An analysis of ATUS call outcome codes was used to justify an expansion in the definition of no-telephone-number households (incentive cases) to include certain nonviable numbers (those assigned call outcomes of "number could not be completed as dialed" and "number changed, no new number given"). The definition was expanded in mid-2008 and again in mid-2011.
BLS has conducted workshops for interviewers on techniques to gain cooperation from respondents, and much of the material developed for this training was incorporated into other interviewer training courses.
A study of the ATUS call blocks was conducted and, as a result, a "boost" was implemented within the ATUS call scheduler in 2010. The boost increases the probability that a case will be called at a time of day that had been successful for the final CPS interview (a simplified illustration appears after this list).
An interviewer incentive study was considered but was rejected because implementing interviewer incentives was determined to be cost prohibitive.
The ATUS advance materials were examined and revised (see Attachment I).
Advance and refusal conversion gatekeeper letters were developed in response to interviewer focus group concerns that parents or guardians of minor designated persons were often refusing the interview for the minor. These letters were revised to improve readability and translated into Spanish. (See Attachment J.)
BLS developed a Web site to answer respondent questions (http://www.bls.gov/respondents/tus/home.htm).
In cooperation with Census, BLS produces a periodic newsletter to motivate and inform interviewers.
Interviewer operations have been scrutinized and revised in several ways in order to increase the probability of completed interviews, such as redesigning the call blocks to add more call attempts during evening hours.
BLS is looking at the incidence and impact of cell phones on ATUS response rates and data quality.
BLS contracted with Westat to provide guidance on whether and how to implement a substitution-of-day mechanism in the ATUS to increase response rates. A secondary purpose of the project is to investigate how allowing substitution of the designated respondent within a household might affect the ATUS data.
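As noted in the call-blocks item above, the 2010 "boost" raises the probability of calling a case at a time that worked for its final CPS interview. The sketch below is a simplified illustration of that kind of weighting only; the time blocks, base weights, and boost factor are hypothetical, and this is not the Census Bureau's call scheduler.

# Illustrative sketch only: weighting the choice of call time block toward
# the block in which the household completed its final CPS interview.
import random

TIME_BLOCKS = ["morning", "afternoon", "early_evening", "late_evening"]
BASE_WEIGHTS = {"morning": 1.0, "afternoon": 1.0,
                "early_evening": 2.0, "late_evening": 2.0}
BOOST_FACTOR = 3.0  # extra weight for the block of the successful final CPS contact

def choose_call_block(cps_success_block, rng=None):
    """Pick a time block for the next call attempt, boosting the block in which
    the household completed its final CPS interview."""
    rng = rng or random.Random()
    weights = [BASE_WEIGHTS[b] * (BOOST_FACTOR if b == cps_success_block else 1.0)
               for b in TIME_BLOCKS]
    return rng.choices(TIME_BLOCKS, weights=weights, k=1)[0]

# Example: a household whose final CPS interview was completed in the early evening.
next_block = choose_call_block("early_evening")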
4. Testing of Procedures
Most of the questions appearing on the proposed module were cognitively tested in 2009 before becoming part of the 2010 and 2012 Well-being Modules. See Attachment D for the full report. The proposed 2013 Well-being Module is identical to the 2012 module.
The 2012 Well-being Module included two additional questions that were not included in the 2010 Well-being Module. Both of these questions were reviewed by survey methods experts and cognitively tested. One question is a measure of overall life satisfaction that will provide important information about respondents' well-being, beyond what can be learned from the moment-to-moment information collected using the affect questions. The second question asks about respondents' overall emotional experience yesterday; these data will be used to help explain variance in responses to the affect questions. See the Cognitive Testing Results for the 2012 Well-being Module (Attachment K) for more information.
5. Contact Persons
The following individuals may be consulted concerning the statistical data collection and analysis operation:
Statistical Design:
Andrew Zbikowski
Demographic Statistical Methods Division
Bureau of the Census
(301) 763-5939
Statistical Analysis:
Rachel Krantz-Kent
Office of Employment and Unemployment Statistics
Division of Labor Force Statistics
Bureau of Labor Statistics
(202) 691-6517
Data Collection/Survey Design:
Richard A. Schwartz
Chief, Consumer Expenditure Survey Branch
Demographic Surveys Division
Bureau of the Census
4600 Silver Hill Rd, Rm. 6H041
Washington, D.C. 20233-8400
(301) 763-7491
Attachments:
A. Proposed Well-being Module Questions
B. Legal Authority
C. ATUS Advance materials
D. Cognitive Testing Results for the 2010 Well-being Module
E. ATUS Field Test Analysis
F. Summary of ATUS Nonresponse Bias Studies
G. Response Rates Analysis
H. Returned Mail Analysis
I. Advance Materials Re-evaluation
J. Refusal Conversion Letters
K. Cognitive Testing Results for the 2012 Well-being Module
L. Well-being Module specifications
M. NRC ATUS Module Report