Attachment H - Summary of ATUS Nonresponse Bias Studies


Leave Supplement to the American Time Use Survey


OMB: 1220-0191


Summary of ATUS Nonresponse Bias Studies

Last updated June 30, 2016



For each study below, the entry gives the citation, a summary of the study, and its major findings and suggestions for further research.

Grace O'Neill and Jessica Sincavage (2004), Response Analysis Survey: A Qualitative Look at Response and Nonresponse in the American Time Use Survey (PDF)

Response Analysis Survey (RAS) conducted in 2004 to understand the response propensities of ATUS respondents and nonrespondents


Reasons for responding to ATUS:

  • No specific reason (24%)

  • General, survey-related reasons (28%)

  • Government/Census Bureau sponsorship (20%)

  • CPS participation (9%)

  • Interviewer (9%)

  • Topic (7%)

  • Advance letter (2%)


Reasons for not responding to ATUS:

  • Tired of doing CPS (33%)

  • Too busy to complete ATUS (16%)

  • Other non-ATUS related reasons (14%)

  • Other reasons for not responding: inconvenient call times, topic was too private/none of the government’s business, Census/government sponsorship, the interviewer, survey difficulty, and general disdain for surveys


Suggestions for Further Research:

  • Conduct new/updated RAS


Katharine G. Abraham, Aaron Maitland and Suzanne M. Bianchi (2006), Nonresponse in the American Time Use Survey: Who Is Missing from the Data and How Much Does It Matter? (PDF)

  • Tabulated response outcomes for people with different characteristics

  • Estimated multivariate logistic regressions of the factors that determine response outcomes

Tested 2 hypotheses:

  1. Busy people are less likely to respond (people who work longer hours, have children in the home, or have spouses who work longer hours)

  2. People who are weakly integrated into their communities are less likely to respond (renters; people who are separated or never married; people out of the labor force; households without children; households with adults who are not related to the householder)

  • Also looked at sex, age, race/ethnicity, household income, education, region, and telephone status

  • Examined whether reweighting the data to account for differences in response propensities affects time-use estimates

  • Found little support for the hypothesis that busy people are less likely to respond to the ATUS

  • Found differences in response rates across groups consistent with the social integration hypothesis: lower response rates for those out of the labor force, separated or never married, renters, those living in urban areas, and those in households that include adults not related to them. Noncontact accounted for most of these differences

  • When the authors reweighted the data to account for differences in response propensities, they found little effect on aggregate estimates of time use (a sketch of this type of propensity reweighting follows this study's entry)


Suggestions for further research:

  • Compare recent movers (those who moved between the 5th and 8th survey waves) to non-movers

  • Compare “difficult” versus “easy” respondents (based on the number of call attempts)

  • Add questions to outgoing CPS rotation group to gain better information about those selected for ATUS who end up not responding
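
The propensity reweighting referenced above can be illustrated with a minimal, hypothetical Python sketch of inverse-propensity adjustment of a single time-use estimate. This is not the authors' code; the DataFrame column names (responded, base_weight, minutes_working) and the covariate handling are assumptions made only for the example.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def reweighted_estimate(sample: pd.DataFrame, covariates: list,
                        activity: str = "minutes_working"):
    """Return (base-weighted, propensity-adjusted) mean minutes for one activity."""
    # Model each sampled person's probability of responding to the ATUS.
    X = pd.get_dummies(sample[covariates], drop_first=True)
    y = sample["responded"]  # 1 = completed ATUS interview, 0 = nonrespondent
    propensity = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]

    # Keep respondents only; inflate their base weights by 1 / response propensity.
    resp = sample["responded"].to_numpy() == 1
    w_base = sample.loc[resp, "base_weight"].to_numpy()
    w_adj = w_base / propensity[resp]

    minutes = sample.loc[resp, activity].to_numpy()
    return np.average(minutes, weights=w_base), np.average(minutes, weights=w_adj)

If the two returned means are close, as the authors found, differential response propensities have little effect on the aggregate estimate for that activity.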


Grace O’Neill and John Dixon (2005), Nonresponse bias in the American Time Use Survey (PDF)


  • Describes nonresponse by demographic characteristics (using CPS data)

  • Uses logistic analysis to examine correlates of nonresponse, such as demographic and interviewer characteristics

  • Uses a propensity score model to examine differences in time-use patterns and to assess the extent of nonresponse bias (a sketch of this type of propensity-based comparison follows this study's entry)

  • Uses ATUS data from 2003


  • Race was the strongest predictor of refusals and noncontacts among those selected for the ATUS: people who were neither white nor black were less likely to complete the survey

  • Age also is an important factor in the nonresponse rates, with both refusal and noncontact rates increasing as age increases

  • Estimates of refusal and noncontact bias were small relative to the total time spent in the activities (e.g., in 2003, it was estimated that the population spent an average of 12.4 hours in personal care activities; of this total, there was an estimated refusal bias of 6 minutes and noncontact bias of 12 minutes)


Suggestions for further research:

  • Examine the assumption that the propensity model represents nonresponse

  • Focus on better evaluations for activities in which few people participate on a given day (those data that have non-normal distributions)

  • Examine differences in the relationships between the time-use categories (elasticities) for respondents and nonrespondents
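
One simple way to use predicted response propensities to gauge potential bias is to compare time-use reports across propensity groups. The following is a minimal, hypothetical Python sketch, not the study's actual method; the column names (propensity, minutes_personal_care) are illustrative assumptions.

import pandas as pd

def minutes_by_propensity_quintile(respondents: pd.DataFrame,
                                   activity: str = "minutes_personal_care") -> pd.Series:
    """Mean reported minutes by response-propensity quintile.

    If respondents who most resemble nonrespondents (the lowest quintile) report
    about the same time use as other respondents, nonresponse bias in this
    activity is likely to be small.
    """
    quintile = pd.qcut(respondents["propensity"], 5, labels=False)
    return respondents.groupby(quintile)[activity].mean()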

John Dixon (2006), Nonresponse Bias for the Relationships Between Activities in the American Time Use Survey






  • This paper follows up on the 2005 study that John Dixon and Grace O'Neill conducted

  • Focuses on nonresponse rates and nonresponse bias in the relationship between time-use categories

  • Uses ATUS data from 2004

  • Found no nonresponse bias in the time-use estimates, the probability of engaging in the time-use categories, or the relationships between the categories

  • The potential biases that were identified were small for the most part

  • Potential biases were usually in opposite directions for refusal and noncontact, which mitigates the overall effect

Scott S. Fricker (2007), The Relationship Between Response Propensity and Data Quality in the Current Population Survey and the American Time Use Survey (PDF)


(This was later published with coauthor Roger Tourangeau in Public Opinion Quarterly, Vol. 74, No. 5, December 2010.)

  • Examined characteristics that affect nonresponse in the ATUS

  • Also examined how survey results changed when high nonresponse propensity cases were excluded from the respondent pool

  • Uses ATUS data from 2003


  • Findings consistent with earlier studies: higher response rates for those who are non-Hispanic, older, and have higher levels of family income

  • Higher nonresponse for those who skipped the CPS family income question, had been a CPS nonrespondent, or were not the respondent in the last CPS interview

  • ATUS nonresponse propensity increased as a function of the number of call attempts and of the timing of those calls

  • The absence of findings supporting the busyness account of ATUS participation also is consistent with results reported in Abraham et al. (2006)

  • Despite strong indications at the bivariate level that ATUS nonresponse was related to social capital variables, the results of the multivariate social capital model failed to find the predicted effects. This is contrary to the findings of Abraham et al. (2006)

  • Removing high nonresponse propensity cases produced small, though significant, changes in a variety of mean estimates and estimates of the associations between variables (i.e., regression coefficients); this type of sensitivity check is sketched just below
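
The following is a minimal, hypothetical Python sketch of that kind of sensitivity check: recomputing a weighted mean and a regression coefficient after dropping the respondents with the highest predicted nonresponse propensities. It is not the paper's code; the column names (nonresponse_propensity, weight, age, minutes_tv) are illustrative assumptions.

import numpy as np
import pandas as pd
import statsmodels.api as sm

def with_and_without_high_propensity(resp: pd.DataFrame,
                                     activity: str = "minutes_tv",
                                     cutoff: float = 0.75) -> dict:
    """Compare estimates from all respondents vs. lower-nonresponse-propensity respondents."""
    keep = resp["nonresponse_propensity"] <= resp["nonresponse_propensity"].quantile(cutoff)

    def estimates(df: pd.DataFrame):
        # Weighted mean of the activity, plus a weighted regression slope on age.
        mean = np.average(df[activity], weights=df["weight"])
        X = sm.add_constant(df[["age"]].astype(float))
        slope = sm.WLS(df[activity], X, weights=df["weight"]).fit().params["age"]
        return mean, slope

    return {"all respondents": estimates(resp), "trimmed": estimates(resp[keep])}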


Phawn M. Letourneau and Andrew Zbikowski (2008), Nonresponse in the American Time Use Survey (PDF)

  • Analysis of nonresponse using 2006 ATUS data – comparing results to earlier studies

  • Uses logistic regression to model response propensities



Findings similar to earlier studies:

  • Lower response rates for people living in a central city and renters

  • Lower contact rates for people with less education, lower incomes, and in younger age groups

  • Higher refusal rates for people missing household income in the CPS

  • Higher response rates and contact rates for people living in Midwest

  • Lower response rates and cooperation rates for males


Findings different from earlier studies:

  • No significant effect on response rates for people who are unemployed or not in the labor force, separated, or never married

  • No significant effect on contact rates for people who work longer hours or who are Hispanic or black


Katharine G. Abraham, Sara E. Helms, and Stanley Presser (2009), How Social Processes Distort Measurement: The Impact of Survey Nonresponse on Estimates of Volunteer Work (PDF)




(This paper was published in the American Journal of Sociology, January 2009.)

  • Examines whether lower survey response is associated with higher estimates of volunteering

  • Links 2003-04 ATUS data to the September 2003 CPS Volunteer Supplement

  • Examines ATUS respondents and nonrespondents in the context of their responses to the Volunteer Supplement


Findings:

  • ATUS respondents were more likely to volunteer, and they spent more time volunteering, than did ATUS non-respondents (there is evidence of this within demographic and other subgroups)

  • The ATUS estimate of volunteer hours suffers from nonresponse bias that makes it too high

  • ATUS estimates of the associations between respondent characteristics and volunteer hours are similar to those from CPS


John Dixon and Brian Meekins (2012), Total Survey Error in the American Time Use Survey (PDF)

  • Used logistic analysis to examine correlates of nonresponse, including demographic and contact history characteristics.

  • Utilized a propensity score model to examine differences in time-use patterns and to assess the extent of nonresponse bias.

  • Assessed measurement error with indicators based on item nonresponse and interviewer judgement.

Findings:

  • Found some demographic characteristics were significant predictors of refusing the ATUS. Specifically, white sample members were less likely to refuse, while married and older sample members were more likely to refuse.

  • Estimates of bias were very small from all sources. Noncontact had the largest effect.



Brian Meekins and Stephanie Denton (2012), Cell Phones and Nonsampling Error in the American Time Use Survey (PDF)

  • Authors examine the impact of calling cell phone numbers on nonresponse and measurement error


Findings:

  • Cell phone volunteers are less likely to complete ATUS interviews due to noncontact

  • The refusal rate of cell phone volunteers is similar to that of those volunteering a landline number

  • Differences in measurement error appear to be negligible. There are some differences in the estimates of time use, but these are largely due to demographic differences

John Dixon (2014), Nonresponse Patterns and Bias in the American Time Use Survey


(This paper was presented at the 2014 Joint Statistical Meetings)

  • Using 2012 data, examines nonresponse using propensity models for overall nonresponse as well as its components: refusal and noncontact.

  • Examines nonresponse based on hurdle models (a sketch of a two-part hurdle model follows this study's entry).

  • Assesses the interrelationship between indicators of measurement error and nonresponse.

  • To explore the possibility that nonresponse may bias the estimates because of the number of zeroes reported, compares the proportion of zeroes between the groups.

Findings:

  • No nonresponse bias was found, but the level of potential bias differed by activity.

  • The measurement error indicators correlated with different activity categories, and more work is needed before potential biases can be reported.

  • The differences between the reported zeroes from the survey and the estimated zeroes for nonresponse were very small, suggesting that reasons for doing the activity were likely not related to the reasons for nonresponse.
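
The hurdle-model approach referenced above can be illustrated with a minimal, hypothetical Python sketch of a two-part model for a single activity: part 1 models whether any time was reported at all, and part 2 models the positive amounts. This is not the paper's code; the column names (minutes_volunteering and the covariates) are illustrative assumptions.

import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_hurdle(df: pd.DataFrame, covariates: list,
               activity: str = "minutes_volunteering"):
    X = sm.add_constant(pd.get_dummies(df[covariates], drop_first=True).astype(float))

    # Part 1: the "hurdle" -- probability of reporting any time in the activity.
    any_time = (df[activity] > 0).astype(int)
    participation = sm.Logit(any_time, X).fit(disp=0)

    # Part 2: amount of time, conditional on participation (log scale for skewed minutes).
    pos = df[activity] > 0
    amount = sm.OLS(np.log(df.loc[pos, activity]), X.loc[pos]).fit()

    # The implied share of zeroes (1 minus mean predicted participation) can then be
    # compared across groups, as in the zero-comparison described above.
    return participation, amount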

Amaya (2015), Enhancing the Understanding of the Relationship Between Social Integration and Nonresponse in Household Surveys


(Dissertation for Joint Program in Survey Methodology, University of Maryland)

  • Examined the components of integration and the components of nonresponse in the ATUS and SHARE

Findings:

  • While integration was predictive of nonresponse in both surveys, the details were inconsistent.

  • Civically engaged individuals were significantly more likely to respond to ATUS, suggesting that individuals integrated through other routes are not more likely to respond than isolated individuals.

Morgan Earp and Jennifer Edgar (2016), American Time Use Survey Nonresponse Bias Analysis

  • Compared the characteristics of ATUS respondents and nonrespondents with a regression tree model based on demographic variables from the CPS (a sketch of this type of tree model follows this study's entry)

  • Examines the relationship between these characteristics and employment status (from the CPS) to assess the potential for nonresponse bias, since employment status is expected to be related to time use

Findings:

  • Research in progress.
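
A tree-based comparison of respondents and nonrespondents can be illustrated with a minimal, hypothetical Python sketch that fits a classification tree to CPS demographic variables. This is not the authors' implementation; the column name atus_respondent, the feature list, and the tree settings are illustrative assumptions.

import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

def response_tree(cps: pd.DataFrame, features: list) -> str:
    X = pd.get_dummies(cps[features], drop_first=True)
    y = cps["atus_respondent"]  # 1 = responded to the ATUS, 0 = did not
    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=200).fit(X, y)
    # Readable rules showing which demographic splits separate respondents from nonrespondents.
    return export_text(tree, feature_names=list(X.columns))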




