ATUS 2009 Supporting Statement A


American Time Use Survey (ATUS)

OMB: 1220-0175



American Time Use Survey OMB Clearance Package, 2009

Supporting Statement

American Time Use Survey

A. Justification


  1. Necessity for the Data Collection


The purpose of this review request is for the Bureau of Labor Statistics (BLS) to renew clearance for the monthly collection of time-use data through a revised American Time Use Survey (ATUS), which began full production in January 2003. A clearance is being requested for the revised ATUS because of changes to module questions located at the end of the interview:


  • The Eating and Health module questions sponsored by the Economic Research Service (ERS) of the United States Department of Agriculture (USDA) ended in December 2008 and are no longer a part of this clearance package.


  • A well-being module sponsored by the National Institute on Aging (NIA) has been proposed to run in calendar year 2010. This module will ask general health questions, as well as how happy, tired, sad, stressed, and in pain respondents felt during randomly selected activities. Respondents will not be asked these questions about personal activities. Because this module will run during the time period covered by this three-year clearance, it is included in this clearance request.


Clearance also is being requested to expand the collection of “who” information to include times when people report that they were working or doing work-related activities. Information about who was with a respondent currently is collected for most activities that people report.


The ATUS is the Nation’s first federally administered, continuous survey on time use in the United States. A nationally representative sample of persons from households completing their final month of interviews for the Current Population Survey (CPS) is drawn for ATUS. BLS contracts with the Census Bureau to conduct one interview with one person age 15 or over from each selected household. The primary focus of the interview is on activities done "yesterday" (from 4 a.m. to 4 a.m.), though additional questions are asked about work during the prior week and trips away from home during the prior 1-2 months.


Collection of time-use data fits well within the BLS mission, as outlined in Title 29, United States Code, Section 1:


“The general design and duties of the Bureau of Labor Statistics shall be to acquire and diffuse among the people of the United States useful information on subjects connected with labor, in the most general and comprehensive sense of that word, and especially upon its relation to capital, the hours of labor, the earnings of laboring men and women, and the means of promoting their material, social, intellectual, and moral prosperity.”


According to economist William Nordhaus, “Inadequate data on time use is the single most important gap in federal statistics” (Nordhaus, 1997). Approximately 50 other countries collect, or soon will collect, time-use data. Such data are considered important indicators of both quality of life and the contribution of non-market work to national economies. They measure, for example, time spent caring for children, volunteering, working, sleeping, or doing leisure activities. Using time-use data in conjunction with wage data allows analysts to better compare production between nations that have different mixes of market and non-market activities. In the United States, several existing Federal surveys collect income and wage data for individuals and families, and analysts often use such measures of material prosperity as proxies for quality of life. Time-use data substantially augment these quality-of-life measures.



2. Purpose of the Survey

The major purpose of the ATUS is to develop nationally representative estimates of how people spend their time. Many ATUS users are interested in the amount of time Americans spend doing non-market work activities. These include unpaid childcare and adult care, housework, and volunteering. The survey also provides information on the amount of time people spend in many other activities, such as commuting, religious activities, socializing, exercising, and relaxing. To produce these estimates, data are collected not only about what people did, but also about where and with whom each activity occurred, and whether the activities were paid work or work-related. This additional contextual information enables coders to assign codes that describe each activity with consistency.


Because the ATUS sample is drawn from a subset of households that completed interviews for the CPS, the same demographic information collected in that survey is available for ATUS respondents. Comparisons of activity patterns across characteristics such as sex, race, age, disability status, and educational attainment of the respondent, as well as the presence of children and the number of adults living in the respondent’s household, are possible. Since the data are collected on an ongoing, monthly basis, five years of time-series data are available, enabling analysts to identify changes in how people spend their time. Also, the ATUS activity coding lexicon was designed to ensure that time-use information in the United States can be compared, at broad levels, with information from other countries.


To ensure the widest distribution of information, BLS releases annual and quarterly data to the public once a year in the form of published tables. Microdata sets are also available, and special analyses by BLS and outside analysts appear in the Monthly Labor Review (published by BLS) and in other publications. Five years of ATUS data have been published (2003-2007), and the data have received wide interest from a variety of users, including economists, sociologists, health researchers, journalists, and businesspersons. ATUS information has also been of interest to government policymakers, educators, lawyers, and others, as the survey information has numerous applications. In addition to appearing in many national newspapers, magazines, and television programs, ATUS data have been used in articles appearing in many academic journals. A list of publications, both BLS and non-BLS, using ATUS data is available on the ATUS Web site (http://www.bls.gov/tus/papersandpubs.htm).


The survey captures not only hours worked on a typical weekday or weekend day, but also shows the distribution of where work is being done—at home, at a workplace, or somewhere else1—and whether, over time, these distributions are changing. In addition to providing information about time spent in work activities, ATUS data have been analyzed to gain insight into commuting patterns and other behaviors associated with work.


Survey information about who people are with has been used to measure the time parents spend with their children and social contact among older Americans. This “who” information currently is collected for most activities people report, but not for times when people are working or doing work-related activities. The proposed addition of the “who” question for times when people report working or doing work-related activities will provide useful information about social contact while at work. Together with information collected in the proposed well-being module, it will provide insight into the pleasantness of the experience when one’s boss, co-workers, or others are present. Additionally, collecting the “who” information for work will make it easier to interpret the social contact information overall.


Unpaid activities, such as raising children, volunteering, and doing housework, are not currently counted in the National Income and Product Accounts—even though they are critical to society and to national well-being. ATUS data provide more comprehensive information about these activities on a continuous basis. Analysts have used ATUS measures of time spent doing such activities to estimate the contribution they make to overall economic activity.


For decades, economists have acknowledged that changes in GDP may reflect changes in institutional arrangements rather than actual changes in economic activity (Landefeld and McCulla, 2000). For example, under traditional methods used to value the Nation’s output, the worker who decides he will wash and iron his own dress shirts rather than send them to the cleaners as he has previously done contributes to a decline in GDP, because the washing and ironing activity is no longer captured as a market transaction. However, ATUS respondents report on the ways they use their own time. The availability of this detailed information allows economists to more accurately value a household’s final products by estimating the value of the time (labor services) used to produce final goods and services. Child and adult care, meal preparation, and home repair projects are just a few of the non-market activities that ATUS data can be used to evaluate. Bureau of Economic Analysis researchers have used these data as a critical input into prototype estimates of satellite accounts that measure the value of unpaid work, including volunteering, child care, and household activities (Landefeld, Fraumeni, and Vojtech, 2005).


International organizations and researchers have used the data to compare the United States with other countries. Both the UN and the OECD have published ATUS estimates in order to compare the time use of Americans with that of people living in other countries.


Sociologists have used the data to examine social contact, such as how much time people spend with their children, colleagues, or family members. They also have examined the degree to which people are trading off time spent with family or in leisure activities to do market or non-market work.


The ATUS data may help Federal, State, and local government policy makers more fully understand noneconomic, as well as economic, effects of policy decisions, and to better determine when to develop new or change existing policies to address the needs of our society. For example, ATUS data are currently being utilized in a report on low-income people’s access to healthy food.


Health researchers have used ATUS data to explore the amount of time spent in activities that impact Americans’ health, such as sleep, eating, meal preparation, and physical exercise. The data have also been used to analyze Americans’ exposure to traffic accident risk.


The data from the proposed well-being module will support the BLS mission of providing relevant information on economic and social issues. The data will provide a richer description of work; specifically, they will measure how workers feel (tired, stressed, in pain) during work episodes compared to nonwork episodes, and how often and with whom workers interact on the job. They can also measure whether the amount of pain varies by occupation and disability status.


The data also closely support the mission of the module’s sponsor, NIA, to improve the health and well-being of older Americans. By analyzing the module data, the experience of pain and aging can be studied. Changes in time use can be used as a measure of the impact of the onset of a health problem and the impact of an intervention, especially if there is a measure of how difficult it was to do the activity. Some of the questions that can be answered include:


  • Do older workers experience more pain on and off the job?

  • Is the age-pain gradient due to differences in activities or differences in the amount of pain experienced during a given set of activities?

  • Do those in poor health spend time in different activities?



3. Use of Information Technology


The Census Bureau, which collects and processes the data for BLS, uses state-of-the-art methods to conduct interviews and record respondent information.

Census Bureau interviewers conduct all interviews over the telephone, completing the respondent’s time-use diary using Computer Assisted Telephone Interviewing (CATI). Using an automated call scheduler and hourly reports from the system, cases are presented to interviewers in an order that depends on respondents’ designated interview days, pre-set appointment times, CPS information on the best time to call respondents, and other information.


The ATUS questionnaire is built in Blaise, a Windows-based software package developed by Statistics Netherlands and adopted as the Census Bureau standard. The software’s graphical user interface (GUI) enables the use of data-entry grids that accept many entries on one screen. ATUS respondents verbally report to the interviewer about the activities of the previous day—what they did, who was with them, where they were, and how long the activity lasted. The instrument enables interviewers to enter the information for each activity into the diary grid in any order, and it automatically computes the duration of an activity after each entry. This feature enables the interview to be flexible, making reporting easier for respondents. (See Attachment A for the main ATUS instrument; see Attachment B for the proposed well-being questions.)
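The duration feature can be illustrated with a brief sketch. This is not the Blaise instrument itself; the diary entries and reference date below are hypothetical, and the sketch only shows how each activity's duration follows from the next activity's start time within the 4 a.m.-to-4 a.m. diary day.

```python
from datetime import datetime, timedelta

# Hypothetical diary entries: (start time, activity). The ATUS diary day runs
# from 4 a.m. to 4 a.m. the following day.
entries = [
    ("04:00", "sleeping"),
    ("07:30", "eating breakfast"),
    ("08:00", "working"),
    ("17:00", "commuting home"),
    ("17:45", "preparing dinner"),
]

def compute_durations(entries):
    """Return (activity, minutes) pairs, closing the last activity at 4 a.m. the next day."""
    day_start = datetime(2010, 1, 1, 4, 0)            # arbitrary reference date
    starts = []
    for clock, _ in entries:
        hour, minute = map(int, clock.split(":"))
        t = day_start.replace(hour=hour, minute=minute)
        if t < day_start:                             # times before 4 a.m. fall on the next day
            t += timedelta(days=1)
        starts.append(t)
    starts.append(day_start + timedelta(days=1))      # the diary day ends at 4 a.m.
    return [(activity, int((starts[i + 1] - starts[i]).total_seconds() // 60))
            for i, (_, activity) in enumerate(entries)]

for activity, minutes in compute_durations(entries):
    print(f"{activity}: {minutes} minutes")
```

The durations of all reported activities necessarily sum to 1,440 minutes, the full 24-hour diary day.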


The ATUS activity coding system is also built in Blaise. Diary entries are imported into the Blaise coding instrument. Coders view on the screen three horizontal windows that display the activity to be coded, the coding categories, and the respondent's diary and the codes assigned. The software includes a “trigram search” feature that enables coders to type in an activity verbatim (e.g., “washed the car”), which generates a list of activity codes whose descriptions contain 3-letter combinations from the search string. This feature helps coders find the correct code for ambiguous activities.
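A minimal sketch of how a trigram search of this kind can work, assuming a small, made-up lexicon fragment: the codes and descriptions below are illustrative placeholders rather than actual ATUS lexicon entries, and the ranking logic is a simplification of whatever the Blaise coding instrument implements.

```python
def trigrams(text):
    """Return the set of 3-character sequences in a lowercased string."""
    t = text.lower()
    return {t[i:i + 3] for i in range(len(t) - 2)}

# Illustrative lexicon fragment (hypothetical codes and descriptions)
lexicon = {
    "020101": "interior cleaning",
    "020201": "food and drink preparation",
    "020799": "washing and cleaning the car or other vehicle",
    "180501": "travel related to work",
}

def search(verbatim, lexicon, top=3):
    """Rank lexicon entries by how many trigrams they share with the verbatim text."""
    query = trigrams(verbatim)
    scored = sorted(((len(query & trigrams(desc)), code, desc)
                     for code, desc in lexicon.items()), reverse=True)
    return [(code, desc) for score, code, desc in scored[:top] if score > 0]

# A coder types an activity verbatim and receives candidate codes to choose from.
print(search("washed the car", lexicon))
```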


A debit card tracking system is in place to manage incentive payments to “no-telephone-number” households in the sample. (See Part A, section 9.)



4. Efforts to Identify Duplication


No private or public institutions conduct time-use surveys at regular intervals. Two academic institutions, the Universities of Maryland and Michigan, have collected time-use data periodically since 1965, but their data collection methodologies changed across years, and no continuous survey was ever conducted in the United States prior to the ATUS. As a result, analysts must infer (or ignore) patterns that occurred between survey periods, making reliable trend analyses very difficult. Continuous data collection through the ATUS will allow analysts to determine if, and by how much, time-use patterns are changing over time.

Additionally, the ATUS sample size is large enough to enable demographic comparisons of time use not possible in earlier studies. Demographic analyses of previous time-use surveys conducted by academic institutions have been limited because sample sizes have only been large enough to yield valid statistical results at aggregate levels. The 1985 time-use survey conducted by the University of Maryland was the largest of the previous U.S. time-use surveys completed, yet it had only 5,300 respondents—just over 40 percent of the approximately 13,240 annual ATUS respondents (Robinson and Godbey, 1997). The ATUS sample is also more demographically controlled than those in previous surveys. Because the sample is drawn from the CPS, households are stratified by demographic characteristics. Black and Hispanic households and households with children are oversampled to ensure their adequate representation in the ATUS estimates. (See Part B, section 1.)



5. Involvement of Small Establishments


The ATUS is a survey of individuals in households and does not involve small businesses or other small entities.


6. Consequences of Less Frequent Data Collection


The 13,240 ATUS interviews are spread across 12 months so that a large annual sample size can be achieved while preserving the ability to examine seasonal patterns across years. Less frequent collection would reduce the analytical value of trend analyses and would eliminate analyses of seasonal patterns in time use.


In addition, monthly data collection operations are more efficient to manage than larger-scale, less frequent operations. A stable, well-trained staff has been developed, and cases are spread evenly across the work weeks and months. Each month’s ATUS sample is introduced over 4 weeks (1/4 sample each week). Each case has up to an 8-week field period. Interviewing respondents about their time use for a 24-hour period in such a way that reports can be consistently and accurately coded requires significant training and practice. Likewise, experience and familiarity with the coding rules and coding lexicon are extremely important for coders to produce accurate results. Less frequent data collection could substantially increase training costs and impede performance.



7. Special Circumstances


The ATUS requires the use of an activity coding classification system not in use in any other Federal survey. A coding lexicon was developed to classify reported activities into 17 major categories, with two additional levels of detail. (ATUS coding lexicons can be found on the Internet at http://www.bls.gov/tus/lexicons.htm). BLS designed the ATUS lexicon by studying classification systems used for time-use surveys in other countries, drawing most heavily on the Australian time-use survey lexicon, and then determining the best way to produce analytically relevant data for the United States. The coding lexicon developed for the ATUS was extensively tested by Census Bureau coders and by coders at Westat (Westat, 2001) prior to the start of full production in 2003. Development of the ATUS lexicon is described in Shelley (2005).


No other special circumstances apply.



8. Federal Register Notice and Consultations Outside the Agency

Federal Register Notice

Four comments were received as a result of the Federal Register notice published in 74 FR 14160 on March 30, 2009.


The comments focused on the proposed addition of the NIA-sponsored well-being module. They were generally positive or neutral in nature. One commentator stated, "The well-being module will make a substantial contribution to understanding the impact of time spent working on well-being and how the relationship may vary as a function of occupation and industry." The commentator also noted that, "The set of feelings included in this survey is well chosen to provide summary information on the influence of activities on physical and emotional well-being."


Two commentators suggested that BLS add questions to the survey or collect additional detail about work activities in the diary section. Adding new questions to the survey and collecting detail about the tasks people do while working would greatly expand the length and cost of the survey, as well as the burden on ATUS respondents. It is not feasible to expand the survey at this time. An avenue for adding questions to the survey exists through sponsorship of a module; BLS can provide more information about module sponsorship upon request.


One commentator suggested that BLS ask questions about the quality of sleep. The proposed module includes a question that is designed to measure sleep quality; it asks, "When you woke up yesterday, how well-rested did you feel?" Another commentator suggested that BLS divide the “who” category for "co-workers, colleagues, or clients" into multiple categories. BLS understands the value of collecting this information in separate categories and plans to include the following “who” categories in 2010: boss or manager, people whom I supervise, co-workers, and customers.


One commentator expressed concern about ATUS measures of sleep because they differ from those generated in other studies. BLS is willing to work with this commentator to improve understanding about differences in the sleep measures they cited and those from the ATUS.


Comments specific to the well-being module:


One commentator asked for clarification about the randomization process used to select activities for the feelings questions. Three activities will be randomly chosen using a frequency-based selection process. Any activity lasting 5 minutes or longer will be eligible for selection, even if it was previously reported on the diary day. For example, if a respondent reports two episodes of watching TV, each episode will have an equal probability of selection, as long as they each meet the duration requirement.

A suggestion was made to include sleep as an activity that is eligible for the feelings questions. It is not possible to judge one's feelings while asleep, and asking respondents to do this would be confusing. A question about quality of sleep is included in the module ("When you woke up yesterday, how well-rested did you feel?").
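As an illustration of the selection rules described in the responses above, a minimal sketch follows. The episode list, activity names, and the set of excluded personal activities are assumptions for illustration only; this is not the production CATI code. The sketch shows the 5-minute eligibility rule, the exclusion of personal activities such as sleep, and the equal selection probability given to repeated episodes of the same activity.

```python
import random

# Hypothetical diary episodes: (activity, duration in minutes)
episodes = [
    ("sleeping", 450),
    ("watching TV", 60),
    ("working", 480),
    ("watching TV", 90),        # a repeated activity is still a separate, eligible episode
    ("checking e-mail", 3),     # under 5 minutes: not eligible
    ("preparing dinner", 45),
]

# Personal activities excluded from the feelings questions (illustrative set)
PERSONAL = {"sleeping", "grooming"}

def select_for_wellbeing(episodes, k=3, min_minutes=5):
    """Pick up to k episodes; every eligible episode has an equal chance of selection."""
    eligible = [(i, activity) for i, (activity, minutes) in enumerate(episodes)
                if minutes >= min_minutes and activity not in PERSONAL]
    return random.sample(eligible, min(k, len(eligible)))

print(select_for_wellbeing(episodes))
```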


One commentator asked why there are two different approaches to the feelings questions: five are structured to obtain information about how a person felt during an activity and one (meaningful) is structured to determine how an activity made a person feel. The question about how meaningful an activity was is designed to help researchers sort trivial activities from important ones, beyond what can be discerned from the 5 affect questions.


A recommendation was made that BLS ask a more general question about health rather than asking about hypertension. The module does include a question about general health ("Would you say your health in general is excellent, very good, good, fair, or poor?"). The question about hypertension has been included to facilitate linkages to other studies, such as the Health and Retirement Study (HRS); similar to the well-being module, this study also is sponsored by the NIA.


One commentator expressed concern that there is no specific threshold to determine whether a respondent was interacting with others and thus the commentator feels the results would be meaningless for activities of a long duration. The interaction question is designed to capture the respondent's perception about whether she or he was interacting with others. One's perception is what matters for understanding the person's affective experience. An explicit threshold would not necessarily clarify the experience because a short period of interaction can have a high impact on affective experience and a long interaction can have a low impact.


One commentator suggested that the affect "tired" is not specific enough and that BLS instead should substitute "sleepy" or "drowsy" because they have been formally linked to excessive or pathological fatigue. The goal of this question is to assess everyday feelings rather than pathological fatigue; "tired" is a meaningful everyday concept that captures feelings of normal fatigue. Note that a separate commentator expressed that the questions related to tiredness are appropriate.



Survey Methods Research Community

ATUS sponsored a brainstorming session with survey methodologists in June 2001. ATUS research was presented for comment at the American Association for Public Opinion Research (AAPOR) conferences in 2000 and 2001 and at the International Field Directors conference in 2001. Research was also presented at FedCASIC in March 2005 and March 2008, at the May 2005 International Field Directors conference, at the August 2005 American Statistical Association meetings, at the ATUS Early Results Conference in December 2005, at the June 2006 Panel Study of Income Dynamics Conference, and at the 2006 and 2008 International Association for Time Use Research conferences. Additionally, 2003-2007 ATUS survey methodology data files have been made publicly available to enable outside survey methodologists to perform analyses.


Federal Economic Statistical Advisory Council (FESAC)

Plans for ATUS were discussed at the June 2001, December 2001, and June 2006 FESAC meetings. ATUS staff members regularly solicit feedback and consultation from members of this group.


National Academy of Sciences (NAS)

Plans for ATUS were presented and reviewed at a NAS-sponsored conference on time use held in 1999 (NAS, 2000).


MacArthur Foundation

BLS and the MacArthur Foundation jointly sponsored a conference in 1997 to discuss research applications of time-use data.


Westat

BLS has consulted with Westat on methods for programming the time-use data collection instrument and on the usability of the coding lexicon and the coding software.


Business Research Advisory Council (BRAC) to the BLS

BLS has consulted periodically with the BRAC on the ATUS.


Labor Research Advisory Council (LRAC) to the BLS

BLS has consulted periodically with the LRAC on the ATUS.


See Attachment C for names and contact information of people in the above organizations who have been in consultation with BLS on the ATUS.


Council of Professional Associations on Federal Statistics (COPAFS)

BLS consulted with COPAFS on the ATUS at the June 2004 quarterly meeting.



9. Payment to Respondents


A 2001 ATUS field test evaluated, among other operational design strategies, the effect of monetary incentives (debit cards) on response rates. Results of the evaluation showed that incentive payments significantly increased response, as well as encouraged faster response. For households for which the Census Bureau had a telephone number, a $20 incentive payment increased response significantly, from 69 percent to 77 percent, and a $40 incentive payment further increased response to 83 percent. (See Attachment D.) However, BLS determined that providing incentives to all respondents or only to refusals and noncontacts (after 4 weeks) would be cost prohibitive. Therefore, payments are not used as incentives to respondents in households for which the Census Bureau has a recent telephone number.


BLS does offer incentives to respondents from “no-telephone-number” households. Persons in these households either do not own a phone or have not given a phone number to the Census Bureau as of CPS month-in-sample 8 (final month). They account for about 5 percent of the CPS sample, and are more likely to be black, to have less education, and to have lower household incomes than members of households that provide phone numbers. The number of such cases is relatively small—approximately 1,320 potential cases each year. Because these households may differ from phone households on unobservable characteristics, including their time-use patterns, and because providing incentives to this small group is not cost prohibitive, BLS believes it is beneficial to expend additional effort and expense to secure their responses.


In the 2001 field test, designated persons in these no-telephone-number households (n=165), defined as those with no telephone or no telephone number, were sent a $60 debit card with their other ATUS advance materials. Their letter encouraged them to call a toll-free number to complete the interview. After 4 weeks, 41 percent had called in and completed the interview. To contain costs in production, BLS uses a $40 debit card as an incentive rather than the $60 used in the field test. The $40 amount was chosen for two reasons. In a field test debriefing, respondents most frequently selected $20 as the lowest amount a respondent should be paid to participate in the full survey; they chose $50 as the highest amount. In addition, most ATMs disburse money in $20 bills, so BLS only considered incentive payments in $20 increments. The debit card is sent with the advance materials. However, the PIN needed to activate the card is given to the designated person only upon completion of the interview. (See Attachment E.)


As mentioned above, the $60 incentive given to no-telephone-number households in the 2001 field test yielded a 41 percent response rate in 4 weeks. Assuming response rates increase or decrease as incentive amounts increase or decrease as demonstrated by the incentive test for telephone households, BLS projected that a $40 incentive to no-telephone-number households would yield a response rate lower than the 41 percent after 4 weeks. This has proven to be the case, as unweighted response rates for no-telephone-number households averaged about 31.6 percent in 2007, and weighted response rates averaged 32.0 percent in 2007.2


In the course of investigating ATUS nonrespondents, BLS discovered that some cases have non-viable telephone numbers (e.g., “number could not be completed as dialed”), and the response rate for these households is very low, only around 5 percent. BLS submitted a nonsubstantive change in December 2007 to obtain permission to expand the definition of “no-telephone households” to include these cases. Cases with non-viable telephone numbers are sent debit cards after the first call attempts have been made, whereas regular incentive cases receive the debit card in the advance letter. A test of the new incentive cases was conducted with the June 2008 panel, and partial implementation occurred with the August 2008 panel. Since implementation, 33 of these cases have been identified, and 13 have resulted in completed interviews (about a 39 percent response rate). It is expected that these new incentive cases will add approximately 100 cases per year, making a total of 1,420 incentive cases annually.


In 2007, the survey’s overall unweighted response rate by sample month was 52.6 percent, and the weighted response rate was 53.8 percent. During 2007 data processing, a small percentage of completed cases were eliminated for data quality reasons. As a result, the final unweighted response rate was 50.9 percent after processing, and the weighted response rate was 52.2 percent after processing. Response rates increased in the first ten months of 2008, reaching an unweighted preprocessing rate of 54.0 percent. Because response rates were lower in 2007 than the 69-percent rate achieved (using no incentives) during the 2001 field test, BLS and the Census Bureau continue to cooperate on a number of analyses of nonresponse in the ATUS. In particular, BLS and Census have done or are doing the following to test and address response rate issues:


  • Conducted in-depth critique and revision of advance materials

  • Translated advance materials and refusal conversion materials into Spanish in order to better target Spanish-speaking households

  • Developed a “minor gatekeeper” advance letter and refusal conversion letter

  • Assessed the feasibility of an incentive study

  • Revised evening call operations at the Census interviewing center

  • Implemented a policy of conducting more research into invalid phone numbers

  • Increased interviewer motivation by setting weekly goals

  • Conducted a comprehensive analysis of non-response bias (See Part B, section 4)

  • Developed a Web site containing information for ATUS respondents (http://www.bls.gov/respondents/tus/home.htm)

  • Evaluated returned mail (such as advance letters) to see if cases were movers and to better investigate wrong or incomplete addresses

  • Developed an ATUS-specific “gaining cooperation” workshop to teach interviewers techniques to increase respondent cooperation, and incorporated this material into other training courses

  • Implemented a quarterly newsletter to inform interviewers and improve interviewer morale

  • Investigated incomplete cases to identify possible causes of noncontact or refusal (such as non-viable telephone numbers) and converted some cases to incentive cases

  • Analyzing call attempt times to identify optimal call blocks

  • Researching the feasibility of assigning cases that are likely refusals to refusal conversion specialists as soon as the case enters the field


10. Confidentiality of Data


Census Bureau employees hold all information that respondents provide in strict confidence in accordance with Title 13, United States Code, Section 9. (See Attachment F.) Each interviewer has taken an oath to this effect and, if convicted of disclosing any information given by a respondent, may be fined up to $250,000, imprisoned up to 5 years, or both. In addition, Title 13 prohibits Census Bureau employees from disclosing information identifying any individual(s) in the ATUS to anyone other than sworn Census Bureau employees.


ATUS data are collected by the Census Bureau under the authority of Title 13, United States Code, Section 8. Section 9 of the law requires that all information about respondents be kept strictly confidential, and that the information be used only for statistical purposes. Respondents are informed of their right to confidentiality under Title 13 in the ATUS advance letter and brochure, mailed approximately 10 days before the interview date. (See Attachments G and H.) The ATUS advance letter also advises respondents that this is a voluntary survey. (It should be noted that the CPS advance letter, which all ATUS respondents will have received in months 1 and 5 of CPS interviewing, makes no reference to future contacts for other surveys. In the 8th month (final) of CPS CATI interviews, interviewers tell respondents that “this is the last regularly scheduled interview for this household for the Current Population Survey. We may, however, need to contact you one more time in the near future to update some information. Households like yours that were interviewed this month may be called upon to participate in a follow-up survey. As with any CPS interview, we are required to keep all information about you and your household strictly confidential. We may use this information only for statistical purposes.” The CPS “thank you” postcard makes no mention of final or future contacts.)


All Census Bureau security safeguards regarding the protection of data files containing confidential information against unauthorized use, including data collected through Computer Assisted Telephone Interviewing (CATI), apply to ATUS data collection.


The BLS Processing System design requires that ATUS data be securely transferred from the Census Bureau server to the BLS server. This process mirrors the process used to transfer Current Population Survey data.


11. Sensitive Questions


During the course of a 24-hour day, many people engage in activities—such as alcohol or drug use or sexual activities—that they may consider too personal or sensitive to report. To examine respondent concerns about the sensitivity of the diary and other survey questions, respondents were asked in the field test if they thought any of the questions were too sensitive. Ninety-two percent of respondents did not think that questions about their time use were too personal or sensitive. During full production, Census Bureau ATUS interviewers advise respondents before beginning the interview that they need not report anything they think is too personal. This instruction does not appear to lead to nonresponse. In 2007, well under one percent of the total number of activities captured was reported by respondents as “none of your business.” A potentially sensitive question is included before the diary, as part of the household roster update, about whether the respondent has any children who do not live with him or her (so that analysts may examine noncustodial parents’ time with their children).


Some of the proposed well-being module questions could be potentially sensitive. After the respondents complete the main ATUS interview, the CATI instrument will randomly select three activities, and respondents will be asked, on a six-point scale, how happy, tired, stressed, sad, and in pain they felt during the activity, and also how meaningful the activity was. These affect questions will not be asked for certain personal activities (e.g., sleep, grooming, or sex). For the remaining activities for which these questions will be asked, none of the 28 participants in the cognitive testing thought the questions were too personal (see Attachment I).


During the cognitive testing of the well-being questions, participants were also asked how they reacted to being asked how they felt during an activity. The majority (23 out of 28) either had no reaction or felt neutral about the questions (e.g., “the questions were fine”). The remaining 5 participants did not express discomfort with the questions. Their comments were mainly about the nature of the six-point scale or how interesting or revealing the questions were to them. Finally, when asked their reaction to the explanation of why the government was collecting these data, only 1 out of the 28 felt that the government should not collect the information.


12. Estimate of Information Collection Hour Burden


Starting with the sample introduced in December 2003, the ATUS sample was reduced by 35 percent. ATUS interviewers began attempting to contact one designated person in each of approximately 2,190 sample households per month, down from about 3,380 sampled households per month during the first year of production. Of the 2,190 households sampled each month, about 2,000 are actually eligible for the ATUS at the time of contact. Since the sample reduction in December 2003, an average of 1,100 interviews has been completed each month, or about 13,200 per year. A similar number is expected in future years. The expanded definition of incentive cases will add about 40 completed cases per year. The total number of completed interviews per year is thus expected to be about 13,240. Each respondent is interviewed in depth about only one day's activities and is not contacted for repeat interviews. A complete interview consists of:


  • a brief introduction

  • a household roster and employment status update

  • collection of time diary information

  • five summary question series (on paid work, childcare, volunteering, missed days, and eating as a secondary activity)

  • an update of additional information—on earnings, occupation and industry, layoff/job search, and school enrollment—collected in the CPS

  • the well-being module (calendar year 2010)


The average length of time to complete the main ATUS interview, including the updates of demographic and labor force information as well as the time diary, is approximately 16 minutes. The well-being module questions are estimated to take no longer than 5 minutes to complete (based on cognitive testing) and will run for calendar year 2010, which spans parts of FY 2010 and FY 2011.


For FY 2009, the estimated number of burden hours is 3,751. This estimate includes the 4 minutes that the ERS-USDA module questions added to the average interview from October through December 2008.


The number of burden hours for FY 2010 is expected to be 4,358. The average length of the interview for the first three months of this fiscal year is expected to remain 16 minutes. However, beginning in January 2010, the well-being questions will add 5 minutes to the interview for the remaining 9 months of FY 2010.


The number of burden hours for FY 2011 is expected to be about 3,806. The decline in burden hours from FY 2010 reflects the discontinuation of the well-being module after December 2010. The average length of the interview will return to 16 minutes.


Based on these estimates of annual burden, the overall annualized dollar cost to the respondents for collection of ATUS data is expected to be about $44,800 for FY 2009. The estimates for FY 2010 and FY 2011 will be about $52,600 and $45,400, respectively. These estimates assume an hourly wage rate for all respondents of $11.95, which equals the median hourly earnings for all wage and salary workers (paid hourly rates) in 2007.
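As a worked sketch of the arithmetic behind the burden-hour totals above (the month splits and interview lengths are taken from the preceding paragraphs; the dollar estimates in the text apply the stated $11.95 hourly rate to the resulting hours):

```python
RESPONDENTS = 13_240   # expected completed interviews per year

# For each fiscal year: (number of months, average interview length in minutes)
fiscal_years = {
    "FY 2009": [(3, 20), (9, 16)],   # Oct-Dec 2008 still carried the 4-minute ERS-USDA module
    "FY 2010": [(3, 16), (9, 21)],   # well-being module adds about 5 minutes starting Jan 2010
    "FY 2011": [(3, 21), (9, 16)],   # well-being module ends after Dec 2010
}

for year, profile in fiscal_years.items():
    average_minutes = sum(months * minutes for months, minutes in profile) / 12
    burden_hours = RESPONDENTS * average_minutes / 60
    print(f"{year}: average {average_minutes:.2f} minutes per interview, "
          f"about {burden_hours:,.0f} burden hours")
```

This reproduces the hour totals of 3,751 (FY 2009), 4,358 (FY 2010), and 3,806 (FY 2011) shown in the tables below.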


Table 1 provides details on the estimated respondent burden for the ATUS collection for FY 2009, Table 2 provides details for FY 2010, and Table 3 provides details for FY 2011.


Table 1. Estimated Respondent Burden for FY 2009 (Hours and Dollars)

Form: Full production
Total Respondents: 13,240
Frequency: One Time
Average Time per Response: 20 minutes for 3 months; 16 minutes for 9 months
Estimated Total Burden (Hours): 3,751
Estimated Total Burden (Dollars): $44,800



Table 2. Estimated Respondent Burden for FY 2010 (Hours and Dollars)

Form: Full production
Total Respondents: 13,240
Frequency: One Time
Average Time per Response: 16 minutes for 3 months; 21 minutes for 9 months
Estimated Total Burden (Hours): 4,358
Estimated Total Burden (Dollars): $52,600



Table 3. Estimated Respondent Burden for FY 2011 (Hours and Dollars)

Form: Full production
Total Respondents: 13,240
Frequency: One Time
Average Time per Response: 21 minutes for 3 months; 16 minutes for 9 months
Estimated Total Burden (Hours): 3,806
Estimated Total Burden (Dollars): $45,400



13. Cost Burden to Respondents or Recordkeepers


  1. Capital start-up costs: $0

  2. Total operation and maintenance and purchase of services: $0


Respondents to this survey are individuals and will not incur any capital start-up costs or costs related to total operation and maintenance and purchase of services agreements.


14. Cost to Federal Government


The total estimated cost to the Federal Government for the ATUS base program in each fiscal year will be about $5.2 million annually.


The total estimated additional cost for the fielding of the well-being questions in calendar year 2010 is $0.1 million. This cost is being borne by the NIA.


Costs associated with the ATUS cover survey management, questionnaire design, instrument development, training, data collection, incentive payments, data editing, preparation of the files for data users, and support for users of the data files.


15. Changes in Burden


The estimated burden is 3,751 hours for FY 2009; 4,358 hours for FY 2010; and 3,806 hours for FY 2011.


The change in burden from FY 2009 to FY 2010 to FY 2011 is due to the discontinuation of the ERS-USDA questions in December 2008 and the addition of the well-being questions in January 2010. The discontinuation of the ERS-USDA questions subtracted 4 minutes from the average interview length, while the well-being questions will add an average of 5 minutes to each ATUS interview. This is projected to result in a program change of +607 hours between FY 2009 and FY 2010, and +55 hours between FY 2009 and FY 2011.


The program change and adjustments result in a net increase of 662 burden hours over FY 2009-2011.
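These program-change figures follow directly from the fiscal-year burden totals given above; a minimal arithmetic check:

```python
# Burden-hour estimates from section 12
burden_hours = {"FY 2009": 3_751, "FY 2010": 4_358, "FY 2011": 3_806}

change_2010 = burden_hours["FY 2010"] - burden_hours["FY 2009"]   # +607 hours
change_2011 = burden_hours["FY 2011"] - burden_hours["FY 2009"]   # +55 hours
print(change_2010, change_2011, change_2010 + change_2011)        # 607 55 662
```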



16. Time Schedule for Information Collection and Publication


The following is the schedule for the ATUS data collection:


Full production data collection, without the well-being module: starting in January 2009, continuing monthly through December 2009

Full production data collection, with the well-being module: starting in January 2010, continuing monthly through December 2010

Full production data collection, without the well-being module: starting in January 2011, continuing monthly through December 2011

Release of ATUS estimates: mid-2009, mid-2010, and mid-2011


Cross tabulation, time-series, and multivariate analyses will be used to analyze the data.



17. Request to Not Display OMB Expiration Date


The Census Bureau does not wish to display the assigned expiration date of the information collection because the instrument is automated and the respondent, therefore, would never see the date. The advance letter sent to households by the Census Bureau contains the OMB survey control number for the ATUS.



18. Exceptions to “Certification for Paperwork Reduction Act Submissions”


There are no exceptions to the “Certification for Paperwork Reduction Act Submissions.”



1 Interviewers for the ATUS assign one of 24 location codes to each activity reported by respondents.

2 All response rates given are calculated using the American Association for Public Opinion Research’s (AAPOR’s) response rate 2 formula. For more information, see AAPOR’s Standard Definitions—Final Dispositions of Case Codes and Outcome Rates for Surveys, 2008.
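For reference, a minimal sketch of the RR2 calculation as defined in the AAPOR Standard Definitions; the disposition counts shown are illustrative placeholders, not ATUS figures.

```python
def aapor_rr2(I, P, R, NC, O, UH, UO):
    """AAPOR Response Rate 2: complete plus partial interviews, divided by all
    eligible cases plus all cases of unknown eligibility.
    I = complete interviews, P = partial interviews,
    R = refusals and break-offs, NC = non-contacts, O = other non-interviews,
    UH = unknown if household/occupied, UO = unknown, other."""
    return (I + P) / ((I + P) + (R + NC + O) + (UH + UO))

# Illustrative counts only
print(round(aapor_rr2(I=1_100, P=50, R=400, NC=350, O=50, UH=100, UO=40), 3))
```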


