Supporting Statement B

American Time Use Survey-Eating and Health Supplement

OMB: 1220-0187


SUPPORTING STATEMENT



B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe


The proposed Eating and Health Module will directly follow the American Time Use Survey (ATUS) in 2014-15, and thus all respondents to the ATUS will be asked the module questions. The ATUS sample is drawn from households that have completed their final month of the Current Population Survey (CPS), so the universe is the same as that of the CPS. The universe for the CPS is the civilian noninstitutional population residing in occupied households. From this universe, the Census Bureau selects a sample of approximately 72,000 households each month, of which approximately 60,000 households are eligible for interviews. The Census Bureau actually interviews individuals in about 55,000 households each month. See chapter 3 of Technical Paper 66 at http://www.census.gov/prod/2006pubs/tp-66.pdf for more information about the CPS sample.


Households that have completed their final (8th) CPS interview become eligible for selection in the ATUS. About 2,190 of these households are selected for the ATUS sample each month. The ATUS sample is a stratified, three-stage sample. In the first stage of selection, the CPS oversample in the less populous States is reduced. In the second stage of selection, households are stratified based on the following characteristics: race/ethnicity of householder, presence and age of children, and the number of adults in adult-only households. In the third stage of selection, an eligible person from each household selected in the second stage is randomly selected as the designated person (respondent) for the ATUS. An eligible person is a civilian household member at least 15 years of age.
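
To make this third stage concrete, the following minimal sketch (in Python) illustrates random selection of a designated person from a household roster under the eligibility rule above. The HouseholdMember structure and its fields are hypothetical and do not represent the Census Bureau's production systems.

    import random
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class HouseholdMember:
        # Hypothetical record for one household member (illustration only).
        person_id: int
        age: int
        is_civilian: bool

    def select_designated_person(members: List[HouseholdMember],
                                 rng: random.Random) -> Optional[HouseholdMember]:
        # Eligible persons are civilian household members at least 15 years of age.
        eligible = [m for m in members if m.is_civilian and m.age >= 15]
        return rng.choice(eligible) if eligible else None

    # Example: a household with two eligible adults and one child under 15.
    household = [HouseholdMember(1, 42, True),
                 HouseholdMember(2, 40, True),
                 HouseholdMember(3, 9, True)]
    print(select_designated_person(household, random.Random(0)))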


Sample persons are then randomly assigned a designated reference day (the day of the week about which they will report) and an initial interview week (the week the case is introduced). To ensure accurate measures of time use on both weekdays and weekend days, the sample is split evenly between the two: 10 percent of the sample is allocated to each weekday and 25 percent to each weekend day. For more information about the ATUS sample, see chapter 3 of the ATUS User's Guide: http://www.bls.gov/tus/atususersguide.pdf.
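
As a worked illustration of this allocation (a hypothetical sketch, not the actual ATUS assignment procedure), the snippet below draws reference days with the stated probabilities; over a large number of draws, roughly half fall on weekend days.

    import random

    # Allocation described above: 10 percent per weekday, 25 percent per weekend day.
    DAY_WEIGHTS = {"Sunday": 0.25, "Monday": 0.10, "Tuesday": 0.10, "Wednesday": 0.10,
                   "Thursday": 0.10, "Friday": 0.10, "Saturday": 0.25}

    def assign_reference_day(rng: random.Random) -> str:
        days, weights = zip(*DAY_WEIGHTS.items())
        return rng.choices(days, weights=weights, k=1)[0]

    rng = random.Random(2014)
    draws = [assign_reference_day(rng) for _ in range(10000)]
    weekend_share = sum(d in ("Saturday", "Sunday") for d in draws) / len(draws)
    print(f"Share of draws on weekend days: {weekend_share:.1%}")  # close to 50 percent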


2. Description of Procedures


  a. Estimation Procedures


A complete description of the estimation procedures for the ATUS can be found in chapter 7 of the ATUS User’s Guide: http://www.bls.gov/tus/atususersguide.pdf. Estimation procedures to use with the Eating and Health Module data can be found in the Eating and Health Module User's Guide: http://ers.usda.gov/publications/ap-administrative-publication/ap-047.aspx.
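
As a simple, hedged illustration of how final weights enter into estimation (the variable names below are placeholders, not actual ATUS or Eating and Health Module variable names; the official weighting and variance procedures are those described in the guides cited above), a population share can be estimated from respondent-level records as follows:

    import pandas as pd

    def weighted_share(df: pd.DataFrame, flag_col: str, weight_col: str) -> float:
        # Weighted proportion of respondents for whom flag_col equals 1.
        w = df[weight_col]
        return (df[flag_col] * w).sum() / w.sum()

    # Toy respondent-level records; "final_weight" is a placeholder weight column.
    records = pd.DataFrame({"primary_shopper": [1, 0, 1, 1],
                            "final_weight": [1500.0, 2200.0, 900.0, 1800.0]})
    share = weighted_share(records, "primary_shopper", "final_weight")
    print(f"Weighted share of primary grocery shoppers: {share:.1%}")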


  b. Data Collection


The 2014-15 Eating and Health Module is associated with the ATUS and thus the procedures for data collection are the same as those of the ATUS. All ATUS interviews are conducted using Computer Assisted Telephone Interviewing (CATI) technology. Interviewers from the U.S. Census Bureau's National Processing Center in Jeffersonville, Indiana, conduct the interviews and assign the activity codes.


The ATUS interview is a combination of structured questions and conversational interviewing. For the household roster update, the employment status questions, the CPS updates, and the proposed Eating and Health Module questions, Census Bureau interviewers read the questions on the screen and enter the appropriate responses. For the time-use diary and the subsequent summary questions on childcare, paid work, volunteering, and eldercare, the interviewer uses a more flexible, conversational approach, filling in the diary grid as the respondent answers.


The data collection instrument includes an edit check that ensures all cells are filled before the interviewer exits the diary. Interviewers receive extensive training in conversational interviewing, including when to probe selectively for information adequate to code activities, and refresher training is conducted periodically. Interviews are regularly monitored by supervisors, coaches, and BLS sponsors to evaluate conversational interviewing performance. Because the interviewers also are responsible for coding the activity information collected in the time diary, they understand the level of detail that must be collected during the interview. Interviewers never code data from interviews they conducted themselves. A coding verification and adjudication process is in place to ensure activities are accurately coded; verification continues to be performed on 100 percent of cases to ensure high and consistent data quality.


3. Methods to Maximize Response


The proposed module will be attached to the ATUS and the transition between the two will be seamless. In 2006-08, most people (over 99 percent) who participated in the ATUS also completed the Eating and Health Module; because of this, the present discussion focuses on response to the ATUS.


The 2001 ATUS field test examined the effectiveness of incentives, sending advance materials by priority mail, doubling the number of eligible interviewing days by using a day-of-week substitution methodology, calling in advance to set interview appointments, “recycling” cases for field visits, and extending the field period from 4 weeks to up to 8 weeks. (See Attachment F.) Testing showed that incentives significantly increased response rates. “Recycling” cases to the field (that is, turning nonresponse cases over to interviewers to conduct face-to-face interviews in the respondent’s home) also was effective in maximizing response rates, particularly for no-telephone-number households. However, offering incentives to all respondents and recycling cases were both cost prohibitive. Incentives currently are offered to the just over 5 percent of the sample for which the Census Bureau does not have a telephone number. In mid-2008 and again in mid-2011, the ATUS expanded the definition of no-telephone-number households to include households with non-viable telephone numbers (e.g., “number could not be completed as dialed”). These households have characteristics similar to those of other no-telephone-number households.


Calling in advance to set an appointment (“proactive appointment setting”) did not improve response, and completed interviews using that strategy required 70 percent more contact attempts than other completed interviews. As a result, advance appointment setting was rejected. Day-of-week substitution increased response rates by about 4 percentage points over 8 weeks; however, it led to a disproportionately high number of completed interviews on Wednesdays and a disproportionately low number on Fridays. To maintain integrity in the day-of-week distribution of the sample, substitution was also rejected.


Consistent with the survey methods literature, priority mail appears to have increased response rates in the ATUS field test, by over 10 percentage points. It is relatively inexpensive to use (about $5.05 per mailing in 2013) and is currently used for sending advance materials. The optimal field period length varies depending on incentive use. Without an incentive, the field test showed that an 8-week fielding period was required to approach a 70-percent response rate (69 percent was achieved in the field test). As a result, this 8-week fielding period was adopted for full production. To even out workload and measure time use across days of the month, one quarter of the sample is introduced each week for 4 weeks. Active cases are called up to 8 times per day on one eligible day each week for 8 weeks.


To maximize response, a toll-free number is provided to all eligible respondents in the advance materials. They can use the number to call in and set an appointment or to complete the interview (if they call on an eligible interviewing day). In addition, interviewers have job aids—answers to frequently asked questions (FAQs)—designed to help answer questions about the survey and to assist them in gaining respondents’ cooperation to participate.


In 2012, the survey’s overall unweighted response rate by sample month was 53.7 percent, and the weighted response rate was 54 percent. During 2012 data processing, a small percentage of completed cases were eliminated for data quality reasons; as a result, the final unweighted response rate after processing was 51.8 percent, and the weighted response rate after processing was 52.1 percent. (A simplified sketch of the arithmetic behind unweighted and weighted response rates appears after the list below.) Because response rates have been lower than the 69-percent rate achieved (using no incentives) during the 2001 field test, BLS and the Census Bureau continue to conduct analyses of nonresponse in the ATUS. In particular, BLS and Census have done or are doing the following to test and address response rate issues:


  • Conducted in-depth critiques and revisions of advance materials (see Attachment G)

  • Translated advance materials and refusal conversion materials into Spanish in order to better reach Spanish-speaking households

  • Developed a refusal conversion letter

  • Revised evening call operations at the Census interviewing center

  • Implemented a policy of conducting more research into invalid telephone numbers and trained interviewers to conduct this research on a more timely, interactive basis

  • Increased interviewer motivation by setting weekly goals

  • Conducted comprehensive analyses of nonresponse bias (see Attachment H for a list of ATUS nonresponse bias studies)

  • Developed a Web site containing information for ATUS respondents (http://www.bls.gov/respondents/tus/home.htm)

  • Evaluated returned mail (such as advance letters) to see if cases were movers and to better investigate wrong or incomplete addresses (see Attachment I)

  • Developed an ATUS-specific “gaining cooperation” workshop to teach interviewers techniques to increase respondent cooperation, and incorporated this material into other periodic training courses

  • Implemented a periodic newsletter to inform interviewers and improve interviewer morale

  • Investigated incomplete cases to identify possible causes of noncontact or refusal (such as non-viable telephone numbers) and converted some cases to incentive cases

  • Researching the feasibility of assigning cases that are likely refusals to refusal conversion specialists as soon as the case enters the field

  • Scrutinized and revised interviewer operations in several ways in order to increase the probability of completed interviews, such as redesigning the call blocks to add more call attempts during evening hours (see Attachment J)

  • Investigating the incidence and impact of cell phones on ATUS response rates and data quality (see Attachment K)

  • Contracted with Westat to provide guidance on whether and how to implement a substitution-of-day mechanism in the ATUS as well as to investigate how allowing substitution of the designated respondent within a household might affect the ATUS data (see Attachment L)

  • Implemented a "We've been trying to reach you" letter that is sent via fax when ATUS calls reach fax machines

  • Added FAQs to the collection instrument that ATUS interviewers can easily reference to respond to respondents' concerns
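
The sketch below, referenced in the response rate discussion above, illustrates the arithmetic behind unweighted and weighted response rates. The case counts and base weights are invented for illustration and do not reproduce the official ATUS response rate methodology.

    from typing import List, Tuple

    def unweighted_response_rate(completed: int, eligible: int) -> float:
        # Share of eligible sample cases that completed the interview.
        return completed / eligible

    def weighted_response_rate(cases: List[Tuple[bool, float]]) -> float:
        # cases: (completed?, base weight) pairs for all eligible sample cases.
        total_weight = sum(weight for _, weight in cases)
        completed_weight = sum(weight for done, weight in cases if done)
        return completed_weight / total_weight

    # Invented example values for illustration only.
    print(f"Unweighted: {unweighted_response_rate(537, 1000):.1%}")
    cases = [(True, 1200.0), (False, 900.0), (True, 1000.0), (False, 1100.0)]
    print(f"Weighted:   {weighted_response_rate(cases):.1%}")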


4. Testing of Procedures


Many of the questions appearing on the proposed module were cognitively tested in 2005 before becoming a part of the 2006-08 Eating and Health Module. See Attachment D for the full report.


The proposed 2014-15 Eating and Health Module includes several questions that were not included in the 2006-08 module. These questions were reviewed by survey methods experts and cognitively tested. Unlike the 2006-08 Eating and Health Module, the 2014-15 version will not include questions collecting details about when respondents engaged in secondary drinking or questions about meals children obtained at school. Also, because the 2006-08 Eating and Health Module took less time to administer than anticipated, several new questions were added to the 2014-15 module, including two questions about soft drink consumption, two about grocery shopping, three about meal preparation, one about food sufficiency, and two about physical exercise done in the last week. See the Cognitive Testing Results for the 2014-15 Eating and Health Module (Attachment E) for more information.


Cognitive testing revealed that the initially proposed question measuring food sufficiency did not test well:


The next question is about the food eaten in your household in the last 30 days, and whether you were able to afford the food you need. Which of these statements best describes the food eaten in your household-- enough of the kinds of food (I/ we) want to eat, enough but not always the kinds of food (I/ we) want to eat, sometimes not enough to eat, or often not enough to eat?


1. Enough of the kinds of food (I/we) want to eat

2. Enough but not always the kinds of food (I/we) want to eat

3. Sometimes not enough to eat

4. Often not enough to eat


The question was confusing, and participants did not consistently understand its intended meaning. The question attempted to measure three concepts: food affordability, having enough to eat, and having the kinds of food one wants to eat. There was also some confusion over what the kinds of food one wants to eat meant: some participants interpreted it as food quality, while others interpreted it as dietary restrictions. Because the question did not test well, the expert reviewers recommended instead using a food sufficiency question from the 1995 and 1996 Food Security Supplements (FSS) to the Current Population Survey, which had tested well. See the cognitive testing results and fielding evaluation for the 1995 and 1996 FSS (Attachments M and N) for more information.

5. Contact Persons


The following individuals may be consulted concerning the statistical data collection and analysis operation:






Statistical Design:

Yang Cheng

Demographic Statistical Methods Division

Bureau of the Census

301-763-3287


Statistical Analysis:

Rachel Krantz-Kent

Program Manager

American Time Use Survey

Bureau of Labor Statistics

202-691-6517


Karen Hamrick

Economist

Economic Research Service

United States Department of Agriculture

202-694-5426

Data Collection/Survey Design:

Beth Capps  

Assistant Survey Director for the American Time Use Survey

Associate Director for Demographic Programs

Bureau of the Census

301-763-6738



Attachments:

A. Proposed Eating and Health Module Questionnaire

B. Legal Authority

C. ATUS Advance materials

D. Cognitive Testing Results for the 2006-08 Eating and Health Module

E. Cognitive Testing Results for the 2014-15 Eating and Health Module

F. ATUS Field Test Analysis

G. Advance Materials Re-evaluation

H. Summary of ATUS Nonresponse Bias Studies

I. Returned Mail Analysis

J. Call Block Research Study

K. Cell Phone Research

L. Westat Final Report

M. Cognitive Testing Results for the 1995 Food Security Supplement to the Current Population Survey

N. 1996 CPS-FSS Fielding Evaluation

O. Nonresponse Bias Analysis of Body Mass Index Data in the Eating and Health Module

P. Investigating the Time Use Patterns of Obese Americans

Q. How Much Time Do Americans Spend on Food?

R. Shopping For, Preparing, and Eating Food: Where Does the Time Go?

S. Working Parents Outsource Children’s Meals

T. How Much Time Do Americans Spend Eating?

U. List of articles and publications using data from the 2006-08 Eating and Health Modules


