Summary of ATUS Nonresponse Bias Studies
Last updated March 2022

Study: Response Analysis Survey: A Qualitative Look at Response and Nonresponse in the American Time Use Survey (HTML)
Grace O'Neill and Jessica Sincavage. U.S. Bureau of Labor Statistics. Statistical Survey Paper. 2004.

Summary: Response Analysis Study (RAS) conducted in 2004 to understand the response propensities of ATUS respondents and nonrespondents.

Major Findings and Suggestions for Further Research:
Reasons for responding to the ATUS:
- No specific reason (24%)
- General, survey-related reasons (28%)
- Government/Census Bureau sponsorship (20%)
- CPS participation (9%)
- Interviewer (9%)
- Topic (7%) and advance letter (2%)
Reasons for not responding to the ATUS:
- Tired of doing CPS (33%)
- Too busy to complete ATUS (16%)
- Other non-ATUS related reasons (14%)
- Other reasons for not responding: inconvenient call times, topic was too private/none of the government's business, Census/government sponsorship, interviewer, survey difficulty, and general disdain for surveys
Suggestions for Further Research:

Study: Nonresponse in the American Time Use Survey: Who Is Missing from the Data and How Much Does It Matter? (HTML)
Katharine G. Abraham, Aaron Maitland, and Suzanne M. Bianchi. Public Opinion Quarterly. Volume 70, No. 5/2006.

Summary: Tested two hypotheses:
- Busy people are less likely to respond (people who work longer hours, have children in the home, or have spouses who work longer hours).
- People who are weakly integrated into their communities are less likely to respond (renters; people who are separated or never married; people out of the labor force; households without children; households with adults who are not related to the householder).
Also looked at sex, age, race/ethnicity, household income, education, region, and telephone status.

Major Findings and Suggestions for Further Research:
- Found little support for the hypothesis that busy people are less likely to respond to the ATUS.
- There are differences in response rates across groups consistent with the social integration hypothesis: lower response rates for those out of the labor force, separated or never married, renters, living in urban areas, or in households that include adults not related to them. Noncontact accounts for most of these differences.
- When the authors reweighted the data to account for differences in response propensities, they found little effect on aggregate estimates of time use (see the sketch after this entry).
Suggestions for further research:
- Compare recent movers (those who moved between the 5th and 8th survey waves) to non-movers.
- Compare "difficult" versus "easy" respondents (based on the number of call attempts).
- Add questions to the outgoing CPS rotation group to gain better information about those selected for the ATUS who end up not responding.

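A minimal sketch of the reweighting check above, assuming a simple logistic response propensity model fit to characteristics available for both respondents and nonrespondents; the column names, file name, and model specification are illustrative assumptions, not the authors' exact method.

```python
# Sketch only: estimate response propensities from frame (CPS) characteristics,
# reweight ATUS respondents by the inverse of the estimated propensity, and
# compare weighted and unweighted mean time-use estimates.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def propensity_adjusted_mean(cases: pd.DataFrame, covariates: list,
                             responded: str, minutes: str):
    """Return (unadjusted mean, inverse-propensity-weighted mean) of daily minutes."""
    X = cases[covariates].to_numpy()
    y = cases[responded].to_numpy()
    propensity = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]

    respondents = y == 1
    weights = 1.0 / propensity[respondents]      # upweight groups that respond less often
    unadjusted = cases.loc[respondents, minutes].mean()
    adjusted = np.average(cases.loc[respondents, minutes], weights=weights)
    return unadjusted, adjusted

# Hypothetical usage with a linked ATUS-CPS extract:
# linked = pd.read_csv("atus_cps_linked.csv")
# print(propensity_adjusted_mean(linked, ["usual_hours", "kids_in_hh", "renter"],
#                                responded="atus_respondent", minutes="work_minutes"))
```

If the adjusted and unadjusted means are close, as the authors found, differential response propensities are doing little to move the aggregate estimates.
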
Study: Nonresponse bias in the American Time Use Survey (PDF)
Grace O'Neill and John Dixon. American Statistical Association Proceedings. 2005.

Summary:
- Describes nonresponse by demographic characteristics (using CPS data).
- Uses logistic analysis to examine correlates of nonresponse, such as demographic and interviewer characteristics.
- Uses a propensity score model to examine differences in time-use patterns and to assess the extent of nonresponse bias.
- Uses ATUS data from 2003.

Major Findings and Suggestions for Further Research:
- Race is the strongest predictor of refusals and noncontacts among ATUS respondents: those who were neither white nor black were less likely to complete the survey.
- Age also is an important factor in nonresponse rates, with both refusal and noncontact rates increasing as age increases.
- Estimates of refusal and noncontact bias were small relative to the total time spent in the activities (e.g., in 2003, it was estimated that the population spent an average of 12.4 hours in personal care activities; of this total, there was an estimated refusal bias of 6 minutes and a noncontact bias of 12 minutes; see the calculation after this entry).
Suggestions for further research:
- Examine the assumption that the propensity model represents nonresponse.
- Focus on better evaluations for activities in which few people participate on a given day (those data that have non-normal distributions).

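As a sense of scale for the personal care example above, the reported biases amount to well under 2 percent of the estimate; the short calculation below uses only the figures quoted in the entry.

```python
# Relative size of the estimated biases for personal care (figures quoted above).
total_minutes = 12.4 * 60                      # 12.4 hours expressed in minutes
for label, bias in [("refusal", 6), ("noncontact", 12)]:
    print(f"{label} bias: {bias} min = {bias / total_minutes:.1%} of the estimate")
# refusal bias: 6 min = 0.8% of the estimate
# noncontact bias: 12 min = 1.6% of the estimate
```
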
Study: Nonresponse Bias for the Relationships Between Activities in the American Time Use Survey (HTML)
John Dixon. U.S. Bureau of Labor Statistics. Statistical Survey Paper. 2006.

Major Findings and Suggestions for Further Research:
- There were no nonresponse biases in the time-use estimates, in the probability of use of the time categories, or in the relationships between the categories.
- The potential biases that were identified were, for the most part, small.
- Potential biases were usually in opposite directions for refusal and noncontact, which mitigates the overall effect.

Study: Nonresponse in the American Time Use Survey
Phawn M. Letourneau and Andrew Zbikowski. American Statistical Association Proceedings. 2008.

Major Findings and Suggestions for Further Research:
Findings similar to earlier studies:
- Lower response rates for people living in a central city and for renters.
- Lower contact rates for people with less education, with lower incomes, and in younger age groups.
- Higher refusal rates for people missing household income in the CPS.
- Higher response rates and contact rates for people living in the Midwest.
- Lower response rates and cooperation rates for males.
Findings different from earlier studies:
- No significant effect on response rates for people who are unemployed or not in the labor force, separated, or never married.
- No significant effect on contact rates for people who work longer hours or who are Hispanic or black.

Study: How Social Processes Distort Measurement: The Impact of Survey Nonresponse on Estimates of Volunteer Work (HTML)
Katharine G. Abraham, Sara E. Helms, and Stanley Presser. American Journal of Sociology. Volume 114, No. 4/January 2009.

Summary:
- Examines whether higher measures of volunteerism are associated with lower survey response.
- Links 2003-04 ATUS data to the September 2003 CPS Volunteer Supplement.
- Examines ATUS respondents and nonrespondents in the context of their responses to the Volunteer Supplement.

Major Findings and Suggestions for Further Research:
Findings:
- ATUS respondents were more likely to volunteer, and they spent more time volunteering, than did ATUS nonrespondents (there is evidence of this within demographic and other subgroups).
- The ATUS estimate of volunteer hours suffers from nonresponse bias that makes it too high.
- ATUS estimates of the associations between respondent characteristics and volunteer hours are similar to those from the CPS.

Study: The Relationship Between Response Propensity and Data Quality in the Current Population Survey and the American Time Use Survey (HTML)
Scott S. Fricker and Roger Tourangeau. Public Opinion Quarterly. Volume 74, No. 5/December 2010.

Summary: Examined changes in estimates when high nonresponse propensity cases were excluded from the respondent pool.

Major Findings and Suggestions for Further Research:
- Findings consistent with earlier studies: higher response rates for those who are non-Hispanic, older, and have higher levels of family income.
- Higher nonresponse for those who skipped the CPS family income question, had been a CPS nonrespondent, or were not the respondent in the last CPS interview.
- ATUS nonresponse propensity increased as a function of the number of call attempts and of the timing of those calls.
- The absence of findings supporting the busyness account of ATUS participation also is consistent with results reported in Abraham et al. (2006).
- Despite strong indications at the bivariate level that ATUS nonresponse was related to social capital variables, the multivariate social capital model failed to find the predicted effects. This is contrary to the findings of Abraham et al. (2006).
- Removing high nonresponse propensity cases produced small, though significant, changes in a variety of mean estimates and estimates of the associations between variables (i.e., regression coefficients); a sketch of this check follows this entry.

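A minimal sketch of the exclusion check reported in the last finding, assuming nonresponse propensities have already been modeled; the 10 percent cutoff and variable names are illustrative assumptions.

```python
# Sketch only: drop the respondents with the highest modeled nonresponse
# propensity and compare the trimmed mean with the full-respondent mean.
import numpy as np

def trimmed_vs_full_mean(minutes, nonresponse_propensity, drop_share=0.10):
    minutes = np.asarray(minutes)
    nonresponse_propensity = np.asarray(nonresponse_propensity)
    cutoff = np.quantile(nonresponse_propensity, 1.0 - drop_share)
    keep = nonresponse_propensity < cutoff      # retain the lower-propensity cases
    return float(minutes.mean()), float(minutes[keep].mean())

# full, trimmed = trimmed_vs_full_mean(tv_minutes, predicted_nonresponse)
# A large gap between the two means would signal sensitivity to the most
# nonresponse-prone cases; the paper reports small though significant changes.
```
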
Study: Total Survey Error in the American Time Use Survey (PDF)
John Dixon and Brian Meekins. 2012.

Summary: Models nonresponse using demographic and contact history characteristics, examines time-use patterns to assess the extent of nonresponse bias, and assesses measurement error with indicators based on item nonresponse and interviewer judgment.

Major Findings and Suggestions for Further Research:
Findings:
- Some demographic characteristics were significant predictors of refusing the ATUS. Specifically, white respondents were less likely to refuse, while married and older respondents were more likely to refuse.
- Estimates of bias were very small from all sources. Noncontact had the largest effect.

Study: Cell Phones and Nonsampling Error in the American Time Use Survey (PDF)
Brian Meekins and Stephanie Denton. U.S. Bureau of Labor Statistics. Statistical Survey Paper. 2012.

Major Findings and Suggestions for Further Research:
Findings:
- Differences in measurement error appear to be negligible. There are some differences in the estimates of time use, but these are largely due to demographic differences.

Study: Nonresponse Patterns and Bias in the American Time Use Survey
John Dixon. Joint Statistical Meetings Proceedings. 2014.

Summary:
- Using 2012 data, examines nonresponse using propensity models for overall nonresponse as well as its components: refusal and noncontact.
- Examines nonresponse based on hurdle models.
- Assesses the interrelationship between indicators of measurement error and nonresponse.
- To explore the possibility that nonresponse may be biasing the estimates because of the number of zeroes reported, compares the proportion of zeroes between the groups (see the sketch after this entry).

Major Findings and Suggestions for Further Research:
Findings:
- No nonresponse bias was found, but the level of potential bias differed by activity.
- The measurement error indicators correlated with different activity categories, and more work needs to be done before reporting potential biases.

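A minimal sketch of the zero-comparison step in the summary, assuming the groups being compared are, for example, low- and high-response-propensity respondents; the two-proportion z-test is an illustrative choice, not necessarily the paper's test.

```python
# Sketch only: compare the share of diary days with zero minutes in an activity
# between two groups of respondents.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

def compare_zero_shares(minutes_a, minutes_b):
    minutes_a, minutes_b = np.asarray(minutes_a), np.asarray(minutes_b)
    zeros = np.array([(minutes_a == 0).sum(), (minutes_b == 0).sum()])
    totals = np.array([minutes_a.size, minutes_b.size])
    stat, pvalue = proportions_ztest(zeros, totals)   # test equality of the two shares
    return zeros / totals, pvalue

# shares, p = compare_zero_shares(volunteer_minutes_low_prop, volunteer_minutes_high_prop)
```
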
Study: Enhancing the Understanding of the Relationship Between Social Integration and Nonresponse in Household Surveys (HTML)
A.E. Amaya. Dissertation for the Joint Program in Survey Methodology, University of Maryland. 2015.

Major Findings and Suggestions for Further Research:
Findings:
- While integration was predictive of nonresponse in both surveys, the details were inconsistent.
- Civically engaged individuals were significantly more likely to respond to the ATUS, suggesting that individuals integrated through other routes are not more likely to respond.

Study: American Time Use Survey Nonresponse Bias Analysis
Morgan Earp and Jennifer Edgar. 2016.

Major Findings and Suggestions for Further Research:
Findings:
- No significant differences in employment rate were found between ATUS respondents and nonrespondents in the overall sample or within the eight response propensity groups, indicating that ATUS estimates correlated with CPS employment status also may not exhibit nonresponse bias.

Study: Comparison of weighting procedures in the presence of unit nonresponse: a simulation study based on data from the American Time Use Survey (HTML)
Morgan Earp and David Haziza. U.S. Bureau of Labor Statistics. Statistical Survey Paper. 2019.

Major Findings and Suggestions for Further Research:
Findings:
- Regression tree weights tended to result in less bias than logistic regression, class, or ATUS weights; however, they also tended to have higher variance, both in terms of the weights themselves and in terms of the mean square error of the estimates (see the sketch after this entry).
- Depending on how nonresponse is simulated, trees may perform worse overall with regard to mean square error, or performance may vary by estimate.
- Given that the ATUS is used to produce trend estimates of how Americans spend their time, very careful consideration would have to be given to changing the weighting method used to adjust for nonresponse, since doing so would require reweighting previous datasets or creating a break in the time series.

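A minimal sketch of the two weighting approaches being compared, assuming inverse-propensity-style adjustment factors from a logistic response model and from a classification tree whose leaves act as weighting classes; the tree settings and variable names are illustrative assumptions, not the paper's specification.

```python
# Sketch only: form nonresponse adjustment weights from (a) a logistic response
# model and (b) a tree whose leaf response rates serve as weighting classes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

def nonresponse_weights(X, responded):
    """Return inverse predicted response probabilities for respondents, per model."""
    X, responded = np.asarray(X), np.asarray(responded)
    models = {
        "logistic": LogisticRegression(max_iter=1000),
        "tree": DecisionTreeClassifier(max_depth=4, min_samples_leaf=50),
    }
    weights = {}
    for name, model in models.items():
        p = model.fit(X, responded).predict_proba(X)[:, 1]
        weights[name] = 1.0 / p[responded == 1]
    return weights

# In a simulation, each weight set would be applied to the respondents' outcomes
# across repeated nonresponse draws, and the bias, variance, and mean square error
# of the weighted estimates compared against the full-sample benchmark.
```
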