Response Rates Analysis


Well-being Supplement to the American Time Use Survey


OMB: 1220-0185

June 7, 2004
MEMORANDUM FOR: Documentation

From: ATUS Nonresponse Analysis Team (Signed)

Subject: ATUS Response Rates Analysis Results (ATUS-09)

For questions about the attached analysis, please contact Tamara Adams (301-763-7880) or Harland Shoemaker (301-763-4275).

Attachment

cc: D. Herz (BLS)
    E. Robison, J. Sebold (DSD)
    J. Montcalm, J. Bushery (PRED)
    N. Ferraiuolo (TMO)
    L. Cahoon (DSMD)
    J. Scott
    H. Shoemaker
    T. Adams
    S. Ennis

Attachment

American Time Use Survey Response Rates Analysis
Executive Summary
The ATUS Response Rate Analysis group was formed to examine the operational data collected
from the American Time Use Survey in an attempt to determine why the response rate for the
survey dropped between its pre-fielding phase (November-December 2002) and its production
phase, which started in January 2003.
There were several hypotheses related to refusals and noncontacts the group wanted to test:

- More experienced interviewers have lower refusal rates, and as inexperienced interviewers replaced experienced interviewers on the survey, the response rate dropped.

- The refusal rate varies by day of week and/or time of day, and making calls at these times causes a higher refusal rate.

- Too few call attempts are being made, resulting in a high noncontact rate, which leads to a higher nonresponse rate.

- Call attempts are not being made at optimal times of the day for contacting respondents.

- The call scheduler is not reattempting to contact busy signals quickly enough, resulting in lost opportunities to contact the respondents.

The group also performed logistic regression analyses to test whether demographic and operational variables are related to whether or not the respondent was contacted and whether or not the respondent refused.
The group also examined the performance of individual interviewers by constructing statistical control charts of their monthly refusal rates to see whether consistently high-performing or low-performing interviewers could be identified relative to the average rate for all interviewers. The idea was that consistently high-performing interviewers could be debriefed to identify interviewing techniques that other interviewers might usefully adopt.
A summary of our analytical results follows.
Refusals

Interviewer Effect – The hypothesis that higher refusal rates are being caused by new interviewers doing more work and experienced interviewers doing less work is not supported by the data:

• We found no appreciable differences in overall refusal rates by interviewer type, except for supervisors.
• The group contributing the most to the overall refusal rate varies from month to month.
• There was no relationship between an interviewer's years of interviewing experience and his or her refusal rate.

Day of the Week/Time of Day Effect – For 3 of the 7 months of data we looked at, cases
attempted on Thursdays, Fridays, or Saturdays had higher refusal rates than those attempted on
other days of the week. There seems to be a spike in refusal rates at the 10 am, 2 pm, and 9 pm
time periods for many of the months we examined.
Missing CPS Income – Logistic regression analysis of refusal rates showed that cases missing
the CPS family income variable were more likely to be ATUS refusals than cases where the CPS
family income variable was not missing.
Noncontacts
Day of the Week/Time of Day Effect – We found considerable variation in the number of call
attempts by time of day (respondent time), with fairly low valleys at 10 am and 6 pm, and fairly
high peaks at 9 am, 12 noon, 5 pm, and 7 pm. This is most likely related to how the call scheduler
operates. The noncontact rate does not vary appreciably across time of day, however. In looking
at number of attempts by JTC time, we found a high peak at 9 am and a significant drop off after
7 pm.
Reattempting after a Busy – Calls resulting in busy signals appear to be re-attempted within
one hour, with about 20 percent of these calls resulting in a contact.
Logistic Regression Analysis – Logistic regression analysis of noncontact rates showed that in
most months we are less likely to contact people in the morning and during the weekdays.
Younger people and black people were less likely to be contacted. The presence of the CPS
family income variable was not a significant predictor of contact.

1 Introduction

The American Time Use Survey (ATUS) is designed to provide estimates of how people in the United States use their time each day. The ATUS had a dress rehearsal and pre-fielding period in September-December 2002. In January 2003, the ATUS entered its production phase. Since then, with the exception of March, response rates have dropped (see Table 1). In addition, there is a large difference between noncontact rates and refusal rates (see Chart 1). Refusals generally account for the larger share of nonresponse; however, the proportion of nonresponse accounted for by refusals varies from month to month. This analysis is aimed at uncovering what may be causing the drop in response rates between pre-fielding and production.
Table 1 – Response Rate by Completion Month(1) for the American Time Use Survey

Month        Response Rate(a)
November     65.4
December     63.9
January      61.7
February     58.9
March        64.1
April        53.1
May          56.7

(a) Response rate calculated using AAPOR Response Rate 2.

(1) There are two different months to which a case is assigned in the ATUS. The panel month is the month for which the case is selected; this month is two months after the case retires from the CPS. The completion month is the month in which the case is completed. Cases are weighted according to their completion month.


2 Background

2.1 ATUS Sample Design

The ATUS uses the Current Population Survey (CPS) sample as a frame. Respondents to the
CPS in their eighth month in sample are eligible for the ATUS. After restratifying these
households, a sample size of approximately 3,400 households is randomly selected each month
for interviewing. In this sample, we oversample households with minorities and households with
children. Then, a designated person (DP) age 15 or older from the household is randomly
selected as the person who will be interviewed. Four panels are randomly created, one for each
week of the month. The sample is also randomized by day, with 50 percent of the sample chosen
to report about Monday through Friday and 50 percent of the sample chosen to report about
Saturday and Sunday.
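As a rough illustration of the day-of-week randomization described above, the sketch below assigns a designated reporting day to each sampled case. The even allocation within the weekday and weekend groups is an assumption for illustration only; the document states only the overall 50/50 split.

```python
import random

WEEKDAYS = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"]
WEEKEND = ["Saturday", "Sunday"]

def assign_diary_day(rng: random.Random) -> str:
    """Assign a designated reporting day: 50 percent of the sample reports about a
    weekday and 50 percent about a weekend day. The even split within each group
    is an illustrative assumption."""
    if rng.random() < 0.5:
        return rng.choice(WEEKDAYS)
    return rng.choice(WEEKEND)

rng = random.Random(2003)
sample_days = [assign_diary_day(rng) for _ in range(3400)]  # roughly one month's sample
```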
2.2 ATUS Data Collection

Respondents are notified that they have been selected for inclusion in the ATUS with an advance
letter and pamphlet. Respondents are included in the ATUS two months after their household is
out of the CPS (e.g., if a household’s final interview for CPS is in November, then the members
of that household are eligible for ATUS in January). If a phone number is available from the
CPS, the Jeffersonville Telephone Center (JTC), a computer-assisted telephone interviewing
(CATI) center run by the U.S. Census Bureau, calls the household the day after their designated
response day to obtain a report about their response day. If a phone number is unavailable from
the CPS, the respondent is sent an inactivated debit card for $40 and asked to contact the JTC to
complete their interview, at which time the debit card is activated for the respondent.2 If the case
is not completed during the first day, it remains active for seven additional weeks; the day of the
week about which the sampled person is to report does not change.
Since the majority of the cases are outbound cases attempted by the JTC, we concentrate on those
cases in this analysis. The JTC attempts calls in four call blocks throughout the day and is
required to make at least one call in each call block until contact is made with each household
(see Table 2). An automated call scheduler is used in production; during pre-fielding, JTC
managers manually scheduled the calls.
Table 2 – Call Blocks in the ATUS (Times in JTC Time)
9AM-11:59AM
12PM-4:59PM
5PM-8:59PM
9PM-12AM

2.3 Nonresponse in the ATUS

Nonresponse can be divided into three main categories (Groves and Couper, 1998):

• Refusals – These are cases where the interviewer contacts the household and someone, not necessarily the DP, refuses to complete the interview. Refusals can happen after the interview begins, but tend to happen very early in the contact.

• Noncontacts – These are cases where the JTC cannot contact the household. These can be due to barriers, such as answering machines, caller ID, call butlers, and other impediments, or to having an incorrect phone number, no answer at the household, no one home, or other factors.

• Other – These are cases where an interview cannot be completed due to a language barrier or other design features of the survey.

Chart 1 shows that the main contributor to the ATUS's high nonresponse is refusals.
However, noncontacts also contribute to the overall nonresponse. The ATUS is unusual in that it
uses expired CPS sample; therefore, the respondents have already been contacted eight times in
the past 16 months by the Census Bureau for a survey. In addition, the ATUS must be conducted
on the designated day, rather than at any time during the week. Also, proxy responses are not
allowed.

2 The average completion rate for incentive cases from January through May was 33.8 percent.

Applying theories of nonresponse, we know that refusals can be caused by interviewers (Groves
and Couper, 1998). We also know that noncontacts are not caused by interviewers, but rather by
an interaction of the number of attempts, the timing of attempts, and respondent behavior
(Groves and Couper, 1998). Based on these theories, we study refusals and noncontacts
separately in the ATUS.
3 Limitations

Other than the coefficients for logistic regression models, there are no significance tests or
implied statistical comparisons in this analysis. We do not conduct statistical tests because we
are looking for patterns that may point to methods to reduce the ATUS nonresponse rates.
4 Results

Because they have different causes, we will examine refusals separately from noncontacts. This
will allow us to analyze various causes of the ATUS nonresponse problem. However, we will
not examine other noninterviews (language barriers, etc.) because these are caused by limitations
in the data collection procedures.
4.1 Refusals

We will examine refusals assuming contact, since a respondent cannot refuse until contacted.
Below, refusal rate means the following:

Refusal Rate = R / (I + P + R)

where I = Interview, P = Partial Interview, and R = Refusal.
Note that Refusal Rate = (1 - Cooperation Rate 4) from AAPOR.
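As a quick numerical check of this definition, the following sketch computes the refusal rate from case counts and confirms that it equals one minus AAPOR Cooperation Rate 4. The counts are made-up illustrative values, not ATUS figures.

```python
def refusal_rate(interviews: int, partials: int, refusals: int) -> float:
    """Refusal rate among contacted cases: R / (I + P + R)."""
    return refusals / (interviews + partials + refusals)

def cooperation_rate_4(interviews: int, partials: int, refusals: int) -> float:
    """AAPOR Cooperation Rate 4: (I + P) / (I + P + R)."""
    return (interviews + partials) / (interviews + partials + refusals)

# Illustrative counts only (not ATUS figures).
I, P, R = 1800, 50, 900
assert abs(refusal_rate(I, P, R) - (1 - cooperation_rate_4(I, P, R))) < 1e-12
print(f"Refusal rate: {refusal_rate(I, P, R):.3f}")
```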
4.1.1 High refusal rate interviewers are doing more work and low refusal rate interviewers are doing less work, so the overall response rate is going down over time.

In Chart 2, we see that in January, February, and April, more of the refusals came from
interviewers in the high refusal rate group than from interviewers in the low refusal rate group.3
These are the months with the higher refusal rates (as indicated by the green line). However, this
does not hold for March and May.

Given the absence of a strong pattern, we conclude that no single group of interviewers has a
large effect on the refusal rates from month to month. In addition, only 8 of the 38 interviewers
included in this analysis were consistently in the high refusal rate group or the low refusal rate
group.
Chart 2A shows a positive relationship between number of contacts and refusal rate. The more
contacts an interviewer has, the higher the refusal rate. This may be due to the fact that the
interviewers who have more contacts are more frequently assigned to difficult cases that will
generate more refusals.

3 High Refusal Rate Group means that at least 45 percent of the interviewer's cases were refusals. Low Refusal Rate Group means that less than 45 percent of the interviewer's cases were refusals. These cutoffs were chosen arbitrarily to divide the interviewers. They are applied month to month, so an interviewer can change refusal rate groups each month.


4.1.1.1 More experienced interviewers have lower refusal rates and as more inexperienced
interviewers replace experienced interviewers, the refusal rate increases.
Chart 3 shows that the first contact refusal rate is fairly constant across groups, with the
exception of the supervisors. This could happen if the JTC does not randomly assign cases but
uses other criteria to assign more difficult cases to supervisors. Also note that the supervisors
have far fewer cases assigned to them for the first contact.


In Chart 4, we see no relationship between years of CATI experience and refusal rates in the
ATUS. We calculated a Pearson's correlation coefficient of 0.34 between years of CATI
experience and refusal rates, which is not statistically significant at the 10% level.
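A correlation of this kind can be computed directly from the interviewer-level data. A minimal sketch follows, assuming a list of (years of CATI experience, refusal rate) pairs per interviewer; the values shown are hypothetical.

```python
from scipy.stats import pearsonr

# Hypothetical interviewer-level data: (years of CATI experience, refusal rate).
experience = [0.5, 1.0, 2.0, 3.5, 5.0, 7.0, 10.0]
refusal_rates = [0.48, 0.41, 0.45, 0.39, 0.44, 0.40, 0.46]

r, p_value = pearsonr(experience, refusal_rates)
print(f"Pearson r = {r:.2f}, p = {p_value:.2f}")  # a large p-value gives no evidence of a relationship
```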

We used statistical process control charts of interviewer refusal rates by month to determine if
there were certain interviewers with outlying refusal rates. The control charts showed that the
majority of interviewers with rates above the upper control limit were refusal conversion
specialists and supervisors. We would expect refusal conversion specialists and supervisors to
have higher refusal rates because they are assigned more difficult cases. We also identified those
interviewers with refusal rates below the lower control limit in case JTC would like to debrief
them in order to find out some interviewing techniques they use to keep their refusal rates low.
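The control limits for such charts can be set as in a standard p-chart for proportions. A minimal sketch follows, assuming each interviewer's monthly refusal count and number of contacted cases are available; the use of 3-sigma limits is an assumption, as the report does not state which limits were used.

```python
import math

def p_chart_limits(refusals: list[int], contacts: list[int]) -> list[tuple[float, float, float]]:
    """Return (lower limit, center line, upper limit) per interviewer for a p-chart
    of refusal rates. Limits vary by interviewer because workloads (n) differ."""
    p_bar = sum(refusals) / sum(contacts)          # overall refusal rate for the month
    limits = []
    for n in contacts:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), p_bar, min(1.0, p_bar + 3 * sigma)))
    return limits

# Hypothetical monthly counts for four interviewers.
refusals = [30, 55, 12, 80]
contacts = [100, 110, 90, 120]
for (lo, center, hi), r, n in zip(p_chart_limits(refusals, contacts), refusals, contacts):
    rate = r / n
    flag = "above UCL" if rate > hi else ("below LCL" if rate < lo else "in control")
    print(f"rate={rate:.2f} limits=({lo:.2f}, {hi:.2f}) -> {flag}")
```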
4.1.2 There is a day of week and/or time of day effect in that certain days and/or times have higher or lower refusal rates.

Overall, we do not see a strong pattern for day of week or time of day effects in refusal rates.
However, if we examine Chart 5, we see that November, December, and March have overall
lower refusal rates, but do not follow a day of the week pattern different from the other months.
February and April have higher refusal rates for Thursday and Saturday. There is no clear trend
between day and month of contact.

Chart 6 and Table 3 show that January, February, and April have more variable refusal rates than
the other months across time of day. We see spikes and valleys at 10AM, 2PM, and 9PM.
However, there is no clear trend from this chart. Chart 6A shows that there are more completed
interviews overall at 5PM, 6PM, and 7PM, where the chart measures number of completes every
minute.
Table 3 – Average Absolute Differences from the Mean for Hourly Refusal Rates by Month

Month        Average Absolute Difference
November     3.85
December     4.48
January      7.13
February     10.42
March        6.57
April        4.70
May          5.70
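The dispersion measure in Table 3 is the mean absolute deviation of a month's hourly refusal rates from that month's average hourly rate. A minimal sketch follows; the hourly rates are made-up values for illustration.

```python
def average_absolute_difference(hourly_rates: list[float]) -> float:
    """Mean absolute deviation of hourly refusal rates from their monthly mean."""
    mean = sum(hourly_rates) / len(hourly_rates)
    return sum(abs(rate - mean) for rate in hourly_rates) / len(hourly_rates)

# Hypothetical refusal rates (percent) for the 9 AM to 9 PM hours of one month.
hourly = [28.0, 41.0, 30.0, 33.0, 26.0, 45.0, 32.0, 31.0, 29.0, 34.0, 27.0, 44.0]
print(f"Average absolute difference: {average_absolute_difference(hourly):.2f}")
```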

4.1.3 Logistic Regression Analysis

Table 4 presents the coefficients from a logistic regression predicting refusals. In this model, a
coefficient that has a significant positive value means that cases with that characteristic are more
likely to be a refusal, holding all other factors constant. Similarly, a significant negative value
means that cases with that characteristic are less likely to be a refusal, holding all other factors
constant. We have two separate models with the same covariates – one model with cases from
November-December (the pre-fielding period) and one model with cases from January-May (the
production period).
If we compare the two models, we see that in both models there is a significant interviewer
effect. All interviewers, except the supervisors, are more likely to have completions than the
original interviewers. This may be due to the assignment of more difficult cases to more
experienced interviewers. The supervisors are more likely to have refusals than the original
interviewers. In January-May, there are more significant time of day and day of week effects
than in November-December, even after controlling for interviewer effects and respondent
demographics. However, the coefficients are quite small for both of these effects.
In addition, we see that those respondents who live in a household that is missing family income
from the CPS reports are more likely to refuse in the ATUS than those who reported income.
This could indicate that some households are more difficult to survey – missing data in the CPS
could help predict refusal in the ATUS.
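The production-period model can be fit along the lines of the following sketch. The data file, column names, and category codings are assumptions for illustration, not the team's actual variable names; the reference levels mirror those shown in Table 4.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical case-level data set: one row per contacted case, refused = 1/0,
# with categorical covariates coded as in Table 4 (file name is assumed).
df = pd.read_csv("atus_contacted_cases_jan_may.csv")

model = smf.logit(
    "refused ~ C(cps_designated_person)"
    " + C(time_of_day, Treatment(reference='9-10PM'))"
    " + C(day_of_week, Treatment(reference='Wednesday'))"
    " + C(interviewer_group, Treatment(reference='Original'))"
    " + C(gender) + C(age_group, Treatment(reference='61+'))"
    " + C(education) + C(race) + C(income_missing)",
    data=df,
).fit()

print(model.summary())  # coefficients and p-values analogous to Table 4
```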


Table 4 – Logistic Regression Coefficients from Predicting Refusal

                                         November-December      January-May
Variable                                 Estimate  P-value      Estimate  P-value
Intercept                                 0.02      0.82         0.60      0.00***

Operational Characteristics
Respondent
  CPS Designated Person                  -0.04      0.22        -0.04      0.03**
  Non-CPS Designated Person                --        --           --        --
Respondent Time of Day
  9-10AM                                 -0.33      0.00***     -0.22      0.00***
  10-11AM                                 0.18      0.12         0.34      0.00***
  11AM-12PM                               0.15      0.16        -0.24      0.00***
  12PM-1PM                                0.01      0.94        -0.04      0.52
  1-2PM                                   0.25      0.04**      -0.11      0.11
  2-3PM                                   0.49      0.00***      0.19      0.01**
  3-4PM                                   0.12      0.36         0.17      0.02**
  4-5PM                                   0.14      0.25         0.03      0.62
  5-6PM                                   0.04      0.70         0.21      0.00***
  6-7PM                                   0.00      1.00        -0.07      0.25
  7-8PM                                  -0.14      0.15        -0.01      0.82
  8-9PM                                  -0.26      0.04**      -0.25      0.00***
  9-10PM                                   --        --           --        --
Attempt Day of Week
  Monday                                 -0.05      0.41        -0.15      0.00***
  Tuesday                                 0.18      0.05*       -0.05      0.28
  Wednesday                                --        --           --        --
  Thursday                               -0.04      0.69         0.07      0.13
  Friday                                  0.12      0.20         0.10      0.04**
  Saturday                                0.06      0.49         0.20      0.00***
  Sunday                                 -0.22      0.00***     -0.11      0.00***
Interviewer Group
  Original Interviewers                    --        --           --        --
  New Interviewers                       -0.50      0.00***     -0.49      0.00***
  Coach                                  -1.75      0.00***     -0.76      0.00***
  Unknown                                -0.06      0.73         0.43      0.00***
  Refusal Converters                     -0.19      0.05**      -0.22      0.00***
  Supervisors                             3.11      0.00***      1.73      0.00***

Demographic Characteristics of the Sampled Person
Gender
  Male                                   -0.01      0.80         0.03      0.10
  Female                                   --        --           --        --
Age
  15-25                                  -0.13      0.10*       -0.02      0.59
  26-60                                   0.09      0.08*        0.07      0.01**
  61+                                      --        --           --        --
Education
  Less than High School                   0.01      0.92         0.12      0.00***
  High School Graduate or Some College    0.15      0.00***      0.02      0.34
  College Graduate or More                 --        --           --        --
Race
  White/Other                            -0.09      0.06*       -0.26      0.00***
  Black                                    --        --           --        --
Family Income
  Not Missing                            -0.44      0.00***     -0.45      0.00***
  Missing                                  --        --           --        --

*=p<0.1, **=p<0.05, ***=p<0.01

4.2 Noncontacts

The noncontact rate is defined as:

Noncontact Rate = NC / (Total Call Attempts)

where NC = the number of noncontact call attempts.
For the majority of this section (unless otherwise stated), the noncontact rate is calculated for
calls, not for cases; there can be multiple calls for a case.
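Since the rate is defined at the call level, it can be computed directly over the call records. A minimal sketch follows, assuming one record per dialing attempt with an outcome field; the column names and values are illustrative.

```python
import pandas as pd

# Hypothetical call-attempt records: one row per dialing attempt.
calls = pd.DataFrame({
    "case_id": [1, 1, 2, 3, 3, 3, 4],
    "outcome": ["noncontact", "interview", "noncontact",
                "noncontact", "noncontact", "refusal", "interview"],
})

# Call-level noncontact rate: share of attempts that did not reach anyone.
call_rate = (calls["outcome"] == "noncontact").mean()

# For comparison, a case-level rate: share of cases never contacted on any attempt.
case_rate = (calls.groupby("case_id")["outcome"]
                  .apply(lambda s: (s == "noncontact").all())
                  .mean())

print(f"call-level noncontact rate: {call_rate:.2f}")   # 4 of 7 attempts
print(f"case-level noncontact rate: {case_rate:.2f}")   # 1 of 4 cases
```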
4.2.1 Too few attempts are being made over time, causing a higher noncontact rate.

Chart 7 shows that, with the exception of February, the number of contacts has steadily increased
during the ATUS, but the noncontact rate stayed steady until April and May. However, there is
not a strong relationship between the number of contacts and the noncontact rate.

4.2.2 Call attempts are not being made at optimal higher-contact times, causing a higher noncontact rate over time.

Looking at the call attempts per respondent hour, we see several trends. First, we see little
relationship between the number of attempts and the noncontact rate. Second, we see peaks at
the 9AM, noon, 5PM, and 7PM time slots. Similarly, we see valleys at 10AM, 3PM, and 6PM.
This could be due to the way the call scheduler at JTC is programmed to operate.

Next, if we examine the number of attempts per JTC hour (the hour at which the call was placed
from the JTC), we also see some peaks and valleys with similar contact rates. Therefore, if the
number of attempts were more constant, we might see more contacts being made throughout the
day. We also see a sharp drop in attempts at 8 pm and later (data for 9 pm to midnight are not
shown here).

4.2.3 The call scheduler is not reattempting busy signals quickly when we suspect that a person is home.

The majority of busy calls are reattempted within one hour of the busy signal (see Chart 10).
Approximately one-fifth of these reattempts result in a contact, which is more than one would
expect from calling at a random time, since the overall contact rate for this survey is low. This is
consistent with the theory that someone is likely to be at home soon after a busy signal, so
contact should be reattempted quickly.

4.2.4 Logistic Regression Analysis

In the logistic regression analysis below, we are predicting contact. Therefore, a positive
coefficient means a household with those characteristics is more likely to be contacted, holding
all other covariates constant. Likewise, a negative coefficient means a household with those
characteristics is less likely to be contacted. We have two models with the same covariates – one
for pre-fielding (November-December) and one for production (January-May).
Table 5 shows that in January through May, people are less likely to be contacted in the morning.
This could be due to the fluctuating call patterns that we saw previously. Next, we notice that in
both pre-fielding and production, people were less likely to be contacted during the week than on
weekends, most likely due to at-home patterns. When we examine the demographic
characteristics, we see that answering the CPS family income question is not highly significant in
either model, unlike in the model predicting refusals. This is not unexpected, since missing data
may signal a reluctance to answer rather than difficulty being contacted. In addition, we see that
younger people and black people are less likely to be contacted. Younger people are more likely
to be out of the house at jobs during the majority of the day.
5 Conclusions and Recommendations

Based on our results, there is not a simple solution that would raise the ATUS response rates.
However, from this analysis, we can see patterns that may assist in designing a solution. These
include:

• Using CPS interview characteristics to pre-identify difficult cases – Households missing the CPS income variable are more likely to be refusals. This may indicate a general reluctance to participate in a survey. Therefore, these cases could be targeted and sent to refusal conversion specialists from the start. Alternatively, a measure of reluctance could be developed based on several missing variables within the CPS that could indicate a household's reluctance to participate (see the sketch after this list).

• Call patterns during the day – The number of attempts during the day is variable while the contact rate is roughly constant. Since the contact rate is currently constant throughout the day, we may see more contacts if calls were spread more evenly throughout the day.
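As one possible form of the reluctance measure suggested above, the sketch below counts missing CPS items per household and flags high-reluctance cases for refusal conversion specialists. The item list, scoring, and threshold are assumptions for illustration only.

```python
import pandas as pd

# Assumed list of key CPS items whose nonresponse may signal reluctance.
cps_items = ["family_income", "hours_worked", "earnings", "industry"]

def reluctance_score(household: pd.Series) -> int:
    """Count of key CPS items the household left unanswered (higher = more reluctant)."""
    return int(sum(pd.isna(household[item]) for item in cps_items))

# Hypothetical CPS-derived household records.
households = pd.DataFrame({
    "family_income": [52000, None, None],
    "hours_worked":  [40, 35, None],
    "earnings":      [None, 900, None],
    "industry":      ["retail", None, None],
})

households["reluctance"] = households.apply(reluctance_score, axis=1)
# Flag cases for refusal conversion specialists; the threshold of 2 is arbitrary.
households["send_to_specialist"] = households["reluctance"] >= 2
print(households[["reluctance", "send_to_specialist"]])
```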

6 References

Groves, Robert M., and Mick Couper. Nonresponse in Household Surveys. John Wiley and Sons,
1998.

Table 5 – Logistic Regression Coefficients from Predicting Contact

                                         November-December      January-May
Variable                                 Estimate  P-value      Estimate  P-value
Intercept                                -2.0545   0.0001***    -1.8799   0.0001***

Operational Characteristics
Respondent Time of Day
  9-10AM                                  0.0662   0.1277       -0.2572   0.0001***
  10-11AM                                 0.0739   0.1881       -0.0779   0.0457**
  11AM-12PM                              -0.0771   0.1137       -0.126    0.0001***
  12PM-1PM                               -0.3128   0.0001***    -0.7286   0.0001***
  1-2PM                                  -0.1802   0.0011***    -0.3339   0.0001***
  2-3PM                                  -0.1425   0.0218**      0.0585   0.1064
  3-4PM                                   0.2077   0.001***      0.2395   0.0001***
  4-5PM                                   0.00906  0.8734       -0.1007   0.0043***
  5-6PM                                  -0.117    0.0091***    -0.3727   0.0001***
  6-7PM                                   0.2522   0.0001***     0.3994   0.0001***
  7-8PM                                  -0.0563   0.1944       -0.2868   0.0001***
  8-9PM                                  -0.1354   0.0163**      0.2837   0.0001***
  9-10PM                                    --       --            --       --
Attempt Day of Week
  Monday                                  0.0498   0.0888*      -0.0982   0.0001***
  Tuesday                                -0.0631   0.1382        0.0371   0.1422
  Wednesday                                 --       --            --       --
  Thursday                               -0.0706   0.1079        0.0279   0.2561
  Friday                                 -0.1174   0.0098***    -0.1168   0.0001***
  Saturday                                0.1132   0.0056***     0.1497   0.0001***
  Sunday                                  0.1592   0.0001***    -0.0481   0.0156**

Demographic Characteristics of the Sampled Person
Gender
  Male                                   -0.066    0.0001***    -0.0957   0.0001***
  Female                                    --       --            --       --
Family Income
  Not Missing                             0.0543   0.0105**     -0.0113   0.3441
  Missing                                   --       --            --       --
Age
  15-25                                  -0.4351   0.0001***    -0.4719   0.0001***
  26-60                                  -0.1635   0.0001***    -0.1987   0.0001***
  61+                                       --       --            --       --
Education
  Less than High School                  -0.0341   0.2218        0.00151  0.9267
  High School Graduate or Some College   -0.0467   0.0215**     -0.0664   0.0001***
  College Graduate or More                  --       --            --       --
Race
  White/Other                             0.2865   0.0001***     0.2902   0.0001***
  Black                                     --       --            --       --

*=p<0.1, **=p<0.05, ***=p<0.01

