Attachment J

Cognitive Testing of Potential Changes
to the Annual Social and Economic
Supplement of the Current Population
Survey

Authors
Wendy Hicks
Jeffrey Kerwin

July 25, 2011

Prepared for:
U.S. Census Bureau
4600 Silver Hill Road
Suitland, MD 20746

Prepared by:
Westat
1600 Research Boulevard
Rockville, Maryland 20850-3129
(301) 251-1500

Table of Contents

Background and Objectives
Methods
    High-Level Description of Methods Common to Both Rounds
    Recruiting Cognitive Interview Participants
    First Round Cognitive Interview Design
    Second Round of Cognitive Interview Design
        Updates to the structural changes used in Round One
        Updates to Specific Income Types
            MEANS-TESTED BENEFIT PROGRAMS
            REPORTING OF ASSET INCOME
            DISABILITY
            PROPERTY
Findings
    Observations Regarding the Structural Changes
        Source first approach
        Tailoring the order of income source presentation
        Follow-up Bracket Items for "Don't Know" Responses
    Observations on the Income Sources of Key Interest for this Project
        TANF
        Food Stamps
        Disability Income
        Asset Income
Conclusions and Recommendations for Further Testing
    Structural Changes
    Reporting of Means Tested Program Participation
    Disability, SSDI and SSI Reporting
    Retirement and Asset Reporting
References

Background and Objectives
The Annual Social and Economic Supplement to the Current Population Survey (the CPS-ASEC) is
one of the most widely used of all the surveys conducted by the U.S. government. The official
poverty statistics – announced each fall for the prior calendar year – are derived from the CPS
ASEC. The survey gives researchers a unique combination of features: detailed demographic data,
detailed income data, data on a wide range of benefit programs, sufficient sample size at the state level,
cross-year consistency, and very timely data production.
As noted in the Census Bureau RFP, the ASEC data reflect errors of varying magnitudes across
income types, and by particular subpopulations. For example, overall, enrollment in benefit
programs is under-reported (as is the case in virtually all surveys). Even after Census Bureau
imputations for missing data, enrollment in public benefit programs is under-reported by
approximately 26 percent (SSI) to 43 percent (food stamps) (Wheaton, 2007). Additionally, the
ASEC data require a substantial level of imputation and allocation. Specifically, 35 percent of
Social Security income and 63 percent of asset income are allocated in the ASEC (Czajka and
Denmead, 2008). The Census Bureau has begun a research program to investigate methods to
address misreporting and item nonresponse by taking greater advantage of the functionality and
flexibility afforded by computer assisted interviewing (CAI) within the ASEC interview.
This report covers two rounds of cognitive interviewing completed using a modified version of the
ASEC instrument content. The first round reflects modifications addressing the data quality issues
noted of concern directly by the Census Bureau, as well as specific findings from the literature
review completed as part of this contract by Urban Institute and Westat (delivered to BOC
December, 2010). The second round of cognitive interviews used another version of the ASEC
instrument content that built upon the results of the first round of cognitive interviews, as well as
incorporated the findings from the Data Analysis Report, a deliverable completed as part of this
contract by the Urban Institute (delivered to BOC in April of 2011).
The overall goal of this initial testing was to develop a viable alternative approach for collecting the
ASEC data that will be further tested in the future using an actual CAPI/CATI instrument, and in
comparison to the current ASEC instrument. Thus, our analysis focused on assessing response
errors associated with the instrument as designed for these two rounds of testing.
In general, across rounds, the redesign efforts focused on addressing:

-   comprehension errors resulting from differences in the specific language used to define income sources in the ASEC relative to the language naturally used by sampled participants across geographic areas;
-   recall errors associated with income sources received at irregular or low frequencies, or received in only small amounts;
-   reporting errors associated with income sources perceived as carrying a social stigma, such as public assistance and food stamps;
-   item nonresponse, either as a result of proxy reporting (i.e., a household respondent reporting for all adult household members) or lack of knowledge about the income source;
-   errors resulting from respondent fatigue; and
-   errors reflective of changes in the structure of income sources, such as TANF eligibility, retirement accounts, and other assets, since the last ASEC redesign in the early 1990s.

These two rounds of cognitive testing, as well as the literature review and data analysis report, serve
as the initial components of the Census Bureau’s larger testing and development effort for the CPS
ASEC instrument. Since this represents the earliest stages of testing, all modifications were
developed and tested using a paper interviewer-administered instrument. Future testing will
develop alternative versions of the ASEC instrument in the Census Bureau’s Blaise environment.

Methods
High-Level Description of Methods Common to Both Rounds
Both rounds of cognitive testing used the same general interviewing approach, though each round
used a slightly different version of an alternate ASEC interview. (As noted above, the second round
instrument reflected updates based on the results of the first round, as well as results of the data
analysis effort not available prior to the first round of testing.) All interviews started with a review
of the consent form, which included the request to audio record the interview (see Attachment A).
After discussion of any questions raised by the respondent and getting the respondent’s signature,
the interviewer proceeded to explain the task to the respondent, noting that Westat is evaluating
some possible changes to the Current Population Survey - Annual Social and Economic Supplement
interview for the U.S. Census Bureau. Respondents were told that at different points in the
interview, the interviewer would stop and ask questions to get his/her feedback on some of the
ASEC questions, explaining that their feedback gives a lot of valuable information about how well
the questions work. Interviewers also emphasized that respondents did not have to answer any
questions they didn’t want to, noting a greater interest in how they interpret the questions and how
they come up with an answer rather than the actual answer. All interviews in each round started
with the work experience questions before moving into the questions about income sources. All
participants received $40 in appreciation for their time.
The same cognitive interviewers conducted the interviews in each round. Each interviewer
participated in a detailed training session prior to each round. In addition to training on the
instrument and procedures for cognitive testing, interviewer training in the first round also covered
the current ASEC interview and data collection protocol to provide a better general understanding
of the ASEC survey environment. Prior to starting their cognitive interviews, each interviewer was
instructed to conduct at least one practice interview to gain familiarity with the various components
of the interview and associated materials (these are discussed in more detail in the design and
procedures section for each round).
All interviews were conducted in focus group facilities or similar types of facilities, with the recruited
respondents traveling to the facilities to complete the interview. Conducting the interviews in a
central location allows completion of several interviews in the course of the day since interviewers
do not need to commute from one location to another. However, the disadvantage for the ASEC


testing is that respondents did not have access to their financial records, another departure from the
actual ASEC interview situation. Interviewers attempted to address this by probing at the end of the
interview on the types of records respondent had at home and whether or not they felt they would
use records to respond to this interview. However, these probes are hypothetical in nature, and
responses may not reflect actual behaviors in a production ASEC interview situation.
Interviews were conducted in the Spring of 2011, in a total of 9 different states across two rounds.
In the first round, we completed 28 interviews in the following 4 locations:

-   San Francisco, CA
-   Columbia, SC
-   New York, NY
-   Jacksonville, FL

In the second round, we completed 29 interviews across the following 5 locations:

-   Minneapolis, MN
-   Columbus, OH
-   Nashville, TN
-   Alexandria, VA
-   Edison, NJ

Recruiting Cognitive Interview Participants
BOC and Westat agreed that three income source areas would be the primary focus for the
cognitive interviews: 1) TANF and food stamps, 2) asset income, including assets held within both
retirement and non-retirement accounts, and 3) Social Security, including receipt for non-retirement reasons and partial-year receipt. Thus, we focused our recruiting efforts on finding
and scheduling persons with characteristics deemed relevant for testing the ASEC items on these
areas.
For Round 1, we focused on recruiting persons in the following four groups:

-   Persons who lived in subsidized (public) housing or received Section 8 housing assistance in 2010. We believed such persons would be especially likely to have received TANF and food stamps;
-   Persons who had investments in the stock or bond markets in 2010, including assets in both retirement and non-retirement savings;
-   Persons who were age 62 or older and retired sometime in 2010. Spouses of such persons were eligible as well. We expected such persons to be partial-year recipients of Social Security;
-   Persons who received disability-related government financial support for at least part of 2010. Spouses of such persons were eligible as well.

Some of the 28 Round 1 participants met more than one of these characteristics. Altogether, we
interviewed 8 persons receiving public housing assistance, 15 persons with stock/bond investments,
4 persons who retired in the previous year, and 13 persons who received disability payments for at
least part of the year.
Round 2 had a very similar focus. Specifically, we targeted:

-   Low-income persons with children. Since none of the Round 1 participants appeared to have received TANF in 2010, we tried to recruit low-income persons by contacting local TANF offices in each of the five interview locations, with mixed success; TANF offices in some locations would not assist us, whereas those in other locations did. If persons could not be obtained through local TANF offices, recruiters targeted persons receiving Section 8 public housing in 2010;
-   Persons who had a retirement account (such as a 401(k) or IRA) during 2010, including a mix of persons currently retired and persons of high income who are still working;
-   Persons who received disability-related government support in 2010, including some who received it for only part of the year.

Some of our 29 participants in this round met more than one characteristic. Altogether, we
interviewed 15 low-income persons, 12 persons with retirement assets, and 13 who had received
disability payments last year.
Table 1 summarizes the demographic characteristics of the cognitive interview participants, by
round.


Table 1: Demographic Characteristics of Cognitive Interview Participants

                                  Round 1    Round 2
Gender
    Male                               12          9
    Female                             16         20

Education
    High school or less                 5          9
    Some college                       12         10
    College graduate                    6          7
    Advanced degree                     5          3

Age
    20 to 39                            5          8
    40 to 59                            7         12
    60 or older                        16          9

Race/Ethnicity
    White                              16         17
    Black                              11          8
    Hispanic                            1          3
    Asian                               0          1
    Other                               0          0

First Round Cognitive Interview Design
The first round of cognitive interviewing assessed several high-level structural changes to the design
of the ASEC instrument. The structural changes, in general, reflect increased usage of the potential
functionality provided by Computer Assisted Interviewing technology. The main objectives of these
structural changes were to: 1) reduce under-reporting of income sources for self and proxies, 2)
reduce the effect of fatigue on response quality, and 3) reduce the level of item missing data,
particularly with regard to reports of specific income amounts.
Specifically, the first round cognitive interview instrument examined the following structural
changes:

-   Reduce underreporting by collecting all sources of income for a household prior to asking for any amounts. Conventional wisdom in survey research holds that an interleafed design (e.g., income source 1, amounts for source 1, income source 2, amounts for source 2, and so on) gives respondents the opportunity to learn, early in the interview, the level of follow-up asked about each identified income source. As a result, this increased awareness of the 'consequence' of reporting receipt of an income source may negatively affect reporting of sources of income that fall later in the interview. In fact, work by Kreuter et al. (2010) found evidence of this occurring for some survey topics, though the finding did not universally apply to all topics included in the study. With this in mind, the Round One cognitive interview instrument restructured the interview to collect all income sources first before proceeding to collect the amounts received by household members for each source. This structural change aims to reduce under-reporting of income, particularly for those sources that fall later in the interview.

    To assess this change, probes were built into the cognitive interview to identify whether any sources of income were missed. Additionally, interviewers were instructed to observe, and probe as necessary, any respondents who reacted in any way to the transition from income sources to collecting detailed amounts, or any reporting difficulty in connecting the earlier reported source of income to the questions about the amount of income received for that source.

-   Reduce fatigue effects by using a tailored order of presentation of income sources, depending on known characteristics of the household either from previous waves of data collection or from the core items asked earlier in the interview. For cognitive testing, Westat used screening information collected as part of participant recruitment to determine the presentation order for income sources. The goal of this manipulation was to reduce potential fatigue effects on reporting by moving the questions about the most likely sources of income for that respondent earlier in the interview. Round One used three different tailored orders of income presentation.

    As with the previous change, probes were built into the cognitive interview to identify whether any sources of income were missed under this tailored approach. In addition, interviewers paid attention to the occurrence of reports of income sources shifted to later in the interview for a specific tailored presentation order.

-   Reduce item missing data by using bracketed amounts as follow-up to initial "don't know" responses to the income amount questions, with brackets defined by geography and income type. The objective of this manipulation was to reduce item nonresponse by allowing respondents to instead provide a less precise response to questions about the amount of income received for a particular income type. Respondents may simply choose to respond with "don't know" rather than provide a point estimate that they do not feel confident about. In addition, as discussed by Tourangeau et al. (2000), changing from an open recall task to a recognition task by selecting from a provided dollar range can facilitate recall and reporting. Westat evaluated this modification by noting instances in which respondents selected a bracketed range after an initial "don't know" response. Similarly, interviewers noted and probed on instances when respondents did not provide an answer either to the point estimate request or from the bracketed ranges.
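To make the intended use of automation concrete, the sketch below (written in Python for illustration rather than in the Census Bureau's Blaise environment) shows one way a source-first instrument with bracketed follow-up could be organized. The income sources, bracket ranges, and question wording are placeholders invented for the example, not the ASEC items or the ranges that would actually be defined by geography and income type.

```python
# Minimal sketch of the source-first structure with bracketed "don't know" follow-up.
# Sources, brackets, and wording are placeholders, not the ASEC instrument.

INCOME_SOURCES = ["unemployment compensation", "Social Security", "interest", "dividends"]
BRACKETS = [(1, 100), (100, 500), (500, 2500), (2500, None)]  # would vary by geography and income type

def ask(prompt):
    """Stand-in for the interviewer reading a question and keying the answer."""
    return input(prompt + " ").strip()

def ask_brackets(source):
    """Offer dollar ranges after an initial 'don't know' instead of forcing a point estimate."""
    for low, high in BRACKETS:
        label = f"${low:,} or more" if high is None else f"${low:,} to ${high:,}"
        if ask(f"Would you say the {source} amount was {label}? (y/n)").lower() == "y":
            return (low, high)
    return "don't know"

def interview():
    # Pass 1: identify every source received, with no amount questions yet.
    received = [s for s in INCOME_SOURCES
                if ask(f"At any time in 2010, did anyone in this household receive {s}? (y/n)").lower() == "y"]

    # Pass 2: return to each reported source and collect the amount,
    # falling back to brackets on a "don't know" response.
    amounts = {}
    for source in received:
        answer = ask(f"How much {source} was received in 2010? (dollar amount, or DK)")
        amounts[source] = ask_brackets(source) if answer.upper() == "DK" else answer
    return amounts

if __name__ == "__main__":
    print(interview())
```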

In addition, the round one cognitive interview instrument also included several changes to the
collection of asset income identified by the Census Bureau as a concern in terms of data quality.

-   Added questions to collect the asset value (e.g., the account balance for a mutual fund or savings account). While the CPS is not necessarily interested in the total value of an asset, having the total value would allow for more informed imputation of dividend and interest amounts. In Round One, about half the interviews asked for the total asset value prior to asking for the interest or dividend amount. The other half maintained the current ASEC approach and asked for the interest or dividend amount, but then followed up with a request for the account balance.

    The analysis of this change focused on identifying the respondents' relative awareness of their account balance as compared to the dividend amounts or interest earned. In addition, interviewers observed, and probed as necessary, any sensitivity to reporting the asset value. The varying order of total value versus interest/dividend amount provided a way to assess the relative sensitivity of the total value request.

-   Separated reporting of retirement accounts/assets (e.g., IRA, 401(k), and 403(b) accounts) from other assets. This change was introduced to better facilitate reporting of retirement income sources other than pensions, and to better reflect the changes in retirement planning and savings approaches of today. Currently, the ASEC makes no distinction between investment income received in a retirement account and investment income received outside of a retirement account. It may be easier for people to consider and report them separately. Interviewers assessed respondents' ability to separate retirement and non-retirement accounts and, for those with both types, probed on how respondents maintain records of that information, how they think about the accounts, and the easiest way for them to report such income.

Round one testing also addressed BOC concerns with reporting of Social Security and SSI income,
as well as under-reporting of means tested assistance programs (public assistance or TANF and
SNAP).
-   Issues with respect to reporting back payments for Social Security and SSI income.
People who receive either Social Security or SSI may experience a delay in receiving their
initial payment. It is not uncommon for the first payment in either of these programs to
cover several months prior to the first month of receipt, while subsequent payments cover
one month each. In this first round, recruitment targeted respondents whose first payment
for either of these programs occurred at some point in the previous calendar year (i.e., the
ASEC reference period) in order to understand how respondents naturally think about and
report these initial ‘back payments’ within the current ASEC language for collecting this
income information. While the ASEC question wording remained unchanged, interviewers
asked probes that identified if a back payment occurred, if respondents reported that back
payment or not, and if not, why not.


-   Issues with respect to reporting public assistance and food stamps. As identified by
the BOC in their RFP, in the literature review, and later in the data analysis report, public
assistance and food stamp income is chronically under-reported in the ASEC. The
literature review results suggest that under-reporting is exacerbated among partial year
recipients within either of these programs. In round one, recruiting efforts targeted those
receiving public housing as likely recipients of one or both of these programs, but the
question wording and structure for collecting these income sources remained the same as
currently in the ASEC instrument. The only modification made was to drop the use of the
family income questions used to determine which households get asked about public assistance, food stamps, and other means-tested programs. (In fact, the Data Analysis Report provided evidence that this question likely inappropriately screens out households eligible for and participating in one or more of these programs.1) All respondents, regardless of
what we knew about their household income level from recruitment and screening, received
the recipiency question for these programs.
Interviewers asked in-depth probes to understand whether in fact respondents or anyone in
their households actually received income from either of these sources, the language or
terms the respondents naturally use in reference to these programs, the types of cues salient
to respondents in thinking about and discussing these programs and knowledge of the
amounts received, if any. Westat planned on using the results of this probing to inform any
needed language or cueing changes for the second round of interviewing.
Round one interviews included 5 different interview versions. The five versions reflect the three
different orders of presentation of income types, and the two different orders for asking total asset
value – either before or after asking for interest/dividend amount. (Attachments B1 through B5
show each interview version.) Table 2 below shows the three different orders of income. Cognitive
interview respondents fell into one of the three orders based on the following criteria.

-   Low-income: Respondents recruited as receiving housing assistance of any type, including living in Section 8 housing, received this version of the instrument. Housing assistance was used as a proxy indicator for public assistance, as we did not want to 'preview' the ASEC questions and concepts of interest in recruiting participants. The actual ASEC can make use of employment information from prior waves, household size, the presence of dependent children in the household, and address/geography to determine whether a household should receive this presentation order.

-   Seniors: Respondents who identified themselves as 62 or older in screening received this order. The actual ASEC can make use of employment information from prior waves (e.g., retired) as well as age as a determinant of eligibility for this order.

-   All others: This was the default condition if the respondent did not meet either of the above criteria. The order used here closely reflected the existing ASEC income order. However, since the cognitive testing instruments did not include the Family Income screener, all income sources were asked of all respondents.
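As a rough illustration of how an automated instrument could apply these criteria, the sketch below selects one of the three Round One orders from screening-style information. The field names, and the use of housing assistance as the low-income trigger, mirror the cognitive-testing proxy described above and are assumptions made for the example, not the production ASEC routing.

```python
# Illustrative routing only; field names and thresholds are assumptions.

def select_income_order(household):
    """Pick which of the three Round One presentation orders to administer."""
    if household.get("housing_assistance"):          # proxy used during cognitive testing recruitment
        return "LOW_INCOME"
    if any(age >= 62 for age in household.get("member_ages", [])):
        return "SENIORS"
    return "ALL_OTHERS"                              # default: order close to the existing ASEC

# Example: a household screened as receiving Section 8 assistance
print(select_income_order({"housing_assistance": True, "member_ages": [34, 10]}))  # LOW_INCOME
```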

The protocol that included both the global and specific probes for Round One interviews is
included as Attachment C.

1 Tabulations of 2008 American Community Survey data show that 12 percent of SNAP households and 20 percent of public assistance income recipients would have failed the family income item test if it had been applied on the ACS (see Table 3.3 in the Data Analysis Report).


Table 2: Round One Alternate Orders of Income Presentation.

Low Income:
    Unemployment and Workers Compensation
    Public Assistance / TANF
    Food Stamps (SNAP)
    WIC
    School Lunches
    Social Security
    Supplemental Security Income (SSI)
    Veterans
    Survivor Benefits
    Disability Income
    Retirement and Pensions
    Interest
    Dividends
    Property Income
    Education Assistance
    Child Support
    Alimony
    Financial Assistance from friends or relatives
    Other Income I (as in ASEC now)
    Other Income II (as in ASEC now)

Seniors:
    Social Security
    Supplemental Security Income (SSI)
    Veterans
    Survivors
    Retirement and Pensions
    Interest
    Dividends
    Property
    Unemployment and Workers Compensation
    Public Assistance / TANF
    Food Stamps (SNAP)
    WIC
    School Lunches
    Disability
    Education Assistance
    Child Support
    Alimony
    Financial Assistance from friends or relatives
    Other Income I (as in ASEC now)
    Other Income II (as in ASEC now)

All Others:
    Unemployment and Workers Compensation
    Social Security
    Supplemental Security Income (SSI)
    Public Assistance
    Food Stamps (SNAP)
    WIC
    School Lunches
    Veterans
    Survivor Benefits
    Disability
    Retirement and Pensions
    Interest
    Dividends
    Property Income
    Education Assistance
    Child Support
    Alimony
    Financial Assistance from friends or relatives
    Other Income I (as in ASEC now)
    Other Income II (as in ASEC now)

In order to better reflect the capabilities of an automated instrument while actually using a paper
interview instrument, interviewers used a structured notes page to record each type of income
received by each household member (see Attachment B6). Interviewers referred to this note page to
determine which income amount questions to ask for which household members. In addition,
interviewers had a calculator available to sum non-annual amounts of income (e.g., weekly, monthly,
or quarterly) reported in order to feed back a total amount for respondent verification. The note
page and the calculator will not be necessary in future stages of testing with an automated version of
the instrument.
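The read-back calculation itself is straightforward; the sketch below shows the conversion an automated instrument could perform in place of the hand calculator. The period-to-year factors are ordinary calendar conversions assumed for illustration, not values taken from the ASEC specification.

```python
# Convert reported non-annual amounts to an annual total for respondent verification.
# Conversion factors are ordinary calendar assumptions, not ASEC-specified values.

PERIODS_PER_YEAR = {"weekly": 52, "biweekly": 26, "monthly": 12, "quarterly": 4, "annual": 1}

def annual_total(payments):
    """payments: iterable of (amount, period) pairs as reported by the respondent."""
    return sum(amount * PERIODS_PER_YEAR[period] for amount, period in payments)

# Example: $200 per month plus a $150 quarterly payment feeds back as $3,000 for the year.
print(annual_total([(200, "monthly"), (150, "quarterly")]))  # 3000
```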


Second Round of Cognitive Interview Design

Updates to the structural changes used in Round One

In Round 2, we retained and continued testing the same three structural changes included in Round 1
to get a larger set of responses. Specifically, Round Two interviews continued collecting all sources
of income first, prior to collecting amounts, as a method to reduce under-reporting of income types;
used a tailored order of presentation to reduce potential fatigue effects on reporting; and included
bracketed amounts as follow-up questions to "don't know" responses as a way to reduce item missing
data.
Round One results (discussed in detail in the Findings section) did, however, suggest some additional
modifications to the implementation of the source-first approach and the specific ordering of
income types. In general, the link between the source of income received and the amount of that
income received was not always maintained with the collection of all sources of income first. This
primarily reflects the fact that interviewers did not have the benefit of an automated instrument to
store and reference an income source name or label once they began asking about the amounts
received for each income source. Interviewers needed to maintain this link themselves, which
proved difficult when respondents identified multiple sources of income or different sources for
different household members. In addition, without the immediate context of the source question
itself, the labels or terms captured in the ASEC for a source were often not specific enough to
unambiguously cue the correct source of income when returning to that income type in responding to
the amount questions. This was particularly problematic in the Interest and Dividends sections. In
order to address these issues, we designed the Round 2 instruments to use more specific cuing
within the asset reporting, but also for all other income sources that include sub-sources (Disability,
Pensions, Survivor Benefits).
Using these detailed cues for each source meant that respondents receiving a given income type
would need to respond to a few more individual questions within that income type. As a way to
reduce the perceived burden with this approach, we re-ordered the presentation of the detailed cue
questions within each income type to better reflect the proportional distribution of these income
sources, at a national level, from highest to lowest. (However, with an automated instrument, these
orders can potentially be tailored to the state-level distribution.) We also limited the specific
questions for a detailed source within each income type to only the top five by dollar amount (ranked at a
national level) as a way to minimize burden.
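The sketch below illustrates this ordering step using placeholder shares rather than actual national estimates; in an automated instrument the same ranking could instead be driven by state-level distributions.

```python
# Order detailed cue questions within an income type by (assumed) national share,
# highest to lowest, and ask only the top five explicitly; the remainder fold into
# a catch-all "any other source?" item. Shares here are placeholders, not estimates.

def order_cues(shares, top_n=5):
    ranked = sorted(shares, key=shares.get, reverse=True)
    return ranked[:top_n], ranked[top_n:]

disability_shares = {            # illustrative values only
    "SSDI": 0.40, "SSI": 0.25, "company or union disability": 0.12,
    "government employee disability": 0.08, "accident or disability insurance": 0.06,
    "workers compensation": 0.05, "military retirement disability": 0.03,
    "railroad retirement disability": 0.01,
}

asked, folded_into_other = order_cues(disability_shares)
print(asked)              # the five explicitly asked sub-sources
print(folded_into_other)  # covered by the "any other source" follow-up
```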
As examples of the changes between rounds in the level of detailed cues within an income source,
table 3a shows the differences between the source questions for disability income by round, and
table 3b shows the difference by round in the survivor benefits source questions.


Table 3a: Disability income ‘source’ questions by cognitive interviewing round
Round 1
Did you receive any income in 2010 as a result of
(your/his/her) health problem (other than Social
Security/other than VA benefits/ Other than
Social Security or VA benefits)?
(If YES) What was the source of that income?
(Response categories not read)
- Workers compensation
- Company / union disability
- Federal government disability
- U.S. Military retirement disability
- State or Local government employment
disability
- U.S. Railroad retirement disability
- Accident or disability insurance
- Black Lung miner’s disability
- State Temporary sickness
- Other or don’t know (specify)

Round 2
Did (you/name) receive any disability income in
2010 as a result of (your/his/her) disability or
health problem?
(If YES) Was this disability income from…
a. Social Security Disability Insurance, for
people who are eligible based on years of
work? (Y/N)
b. Supplemental Security Income, which
provides payments to low-income aged,
blind and disabled persons? (Y/N)
c. Company or union disability payments?
(Company/Union/None)
d. Federal, State or local government
employee disability?
(Federal/State/Local/None)
e. Accident or disability insurance? (Y/N)
f. Did you receive disability income from
any other source?
(If YES to ‘f’) What was this source of disability
income?
- Workers compensation
- U.S. Military retirement disability
- U.S. Railroad retirement disability
- State temporary sickness
- Black Lung miner’s disability
- Other or don’t know (specify)

Table 3b: Survivor benefit 'source' questions by cognitive interviewing round

Round 1
Did (you/anyone in this household) receive any survivor benefits in 2010 such as widow's pensions, estates, trusts, insurance annuities, or any other survivor benefits (other than Social Security/ other than VA benefits/ other than Social Security or VA benefits)?
(If YES) What was the source of that income? (Response categories not read)
- Company or union survivor pension (INCLUDE PROFIT SHARING)
- Federal government survivor pension
- U.S. Military retirement survivor pension
- State or Local government survivor pension
- U.S. Railroad retirement survivor pension
- Worker's compensation survivor pension
- Black Lung survivor pension
- Regular payments from estates or trusts
- Regular payments from annuities or paid-up insurance policies
- Other or don't know (specify)

Round 2
Did (you/anyone in this household) receive any survivor benefits in 2010 such as widow's pensions, estates, trusts, insurance annuities, or any other survivor benefits (other than Social Security/ other than VA benefits/ other than Social Security or VA benefits)?
(If YES) Was (your/name's) income from...
a. A company or union survivor pension (include profit sharing)? (Company/Union/None)
b. Federal, State or local government survivor (civil service) pension? (Federal/State/Local/None)
c. U.S. Military retirement survivor pension? (Y/N)
d. Regular payments from estates or trusts? (Y/N)
e. Regular payments from annuities or paid-up insurance policies? (Y/N)
f. Did you receive survivor benefits from any other source?
(If YES to 'f') What was the source of this income?
- Social Security survivor payments
- Black Lung survivor pension
- U.S. Railroad retirement survivor pension
- Worker's compensation survivor pension
- Other or don't know (specify)

The specific cues captured for each income type were then used as fills in the amount section for the
income type. For example, if a respondent indicated that she received a U.S. Military retirement
survivor pension as well as payments from annuities, the survivor income amounts section asked for
the amount she received from the U.S. Military retirement survivor pension, followed by a separate
question for the amount received from the annuity.
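The sketch below illustrates this fill behavior using invented labels: an automated instrument would store each sub-source reported at the source questions and reuse it verbatim as the cue in the corresponding amount question.

```python
# Store sub-sources reported at the source questions, then generate amount
# questions that reuse those labels as cues. Names and wording are invented.

from collections import defaultdict

reported = defaultdict(list)            # person -> sub-source labels reported for them

def record_sub_source(person, label):
    reported[person].append(label)

def amount_questions(person):
    """One amount question per recorded sub-source, with the label used as the fill."""
    for label in reported[person]:
        yield f"How much did {person} receive in 2010 from the {label}?"

record_sub_source("Mary", "U.S. Military retirement survivor pension")
record_sub_source("Mary", "regular payments from annuities")
for question in amount_questions("Mary"):
    print(question)
```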
With the addition of these detailed level cues within each income source, it became apparent that
several of the income types shared the same or very similarly named detailed sources of income. For
example, Disability income includes a probe for Social Security Disability Insurance, which could
easily be confused with the higher-level screening question for Social Security as a source of income.
Similarly, both Pensions and Survivor Benefits include a probe for company or union (survivor)
pensions. In order to reduce the potential confusion and possible duplicate or misreporting across
income types, Round Two also changed the order of presentation slightly for each of the three
categories used in Round one. Table 4 shows the revised source type order groups used in Round
Two.


Table 4: Round Two Alternate Orders of Income Presentation.

Low Income:
    Unemployment and Workers Compensation
    Public Assistance / TANF
    Food Stamps (SNAP)
    WIC
    School Lunches
    Public Housing
    Energy Assistance
    Disability
    Social Security
    Supplemental Security Income (SSI)
    Veterans
    Survivor Benefits
    Pensions
    Annuities
    Retirement Accounts – Withdrawals or distributions
    Other Income Earning Assets (outside of retirement)
    Property Income
    Education Assistance
    Child Support
    Alimony
    Financial Assistance from friends or relatives
    Other Income I (as in ASEC now)
    Other Income II (as in ASEC now)

Seniors:
    Disability
    Social Security
    Supplemental Security Income (SSI)
    Veterans
    Survivors
    Pensions
    Annuities
    Retirement Accounts – Withdrawals or distributions
    Other Income Earning Assets (outside of retirement)
    Property Income
    Unemployment and Workers Compensation
    Public Assistance / TANF
    Food Stamps (SNAP)
    WIC
    School Lunches
    Public Housing
    Energy Assistance
    Education Assistance
    Child Support
    Alimony
    Financial Assistance from friends or relatives
    Other Income I (as in ASEC now)
    Other Income II (as in ASEC now)

All Others:
    Unemployment and Workers Compensation
    Disability
    Social Security
    Supplemental Security Income (SSI)
    Veterans
    Survivor Benefits
    Public Assistance
    Food Stamps (SNAP)
    WIC
    School Lunches
    Public Housing
    Energy Assistance
    Pensions
    Annuities
    Retirement Accounts – Withdrawals or distributions
    Other Income Earning Assets (outside of retirement)
    Property Income
    Education Assistance
    Child Support
    Alimony
    Financial Assistance from friends or relatives
    Other Income I (as in ASEC now)
    Other Income II (as in ASEC now)

In addition, we modified the interviewer note sheet that interviewers used in Round One to record
the reported income types for each household member. We added a space to record the specific
term or terms used by respondents in identifying each particular source of income received. The
interviewers could then use the specific or detailed income source received as the cue in the
subsequent amount section. (See Attachment D4.)


Updates to Specific Income Types

In addition to the updates to the structural modifications, the Round One results and the Data
Analysis Report both suggested additional updates to specific income types for Round Two testing.

MEANS-TESTED BENEFIT PROGRAMS
As discussed in more detail in the Recruiting section above, Round One did not result in the
successful recruiting of TANF recipients, and only a limited number of SNAP recipients were recruited. Thus,
modifications to TANF and SNAP collection in Round Two are based almost entirely on the Urban
Institute Data Analysis Report. In particular, the Data Analysis Report demonstrated that for
Food Stamps (SNAP) and SSI, the weighted caseload totals are about two-thirds of the targets
derived from administrative totals, indicating that the ASEC misses recipients of each program.
However, the total dollar amounts are slightly higher, roughly seven-tenths of the targets, suggesting
that those who receive higher levels of benefits may be more likely to remember those
benefits and report them. This is particularly likely if those who receive a benefit for fewer than 12
months of the year are less likely to report any benefit, particularly in the case where a benefit was
received early in the calendar year, in some cases more than 12 months prior to the interview. The
Round One cognitive interview findings did identify a potential issue with the wording of the SNAP
benefit question that could contribute to missed recipients. Specifically, the question asks if
“(you/anyone) gets Food Stamps or a Food Stamp benefit card”. One respondent pointed out that
he didn’t ‘get’ the card in the previous year, he received it prior to that but was still receiving benefits
on it during 2010.
The Urban Institute data analysis found that the pattern for TANF is just the opposite of that found
with SNAP and SSI: the caseload aggregate is 57% of the target value, but the dollar total is only
43% of the target. This suggests that those with larger TANF benefits may actually be less likely
to report any receipt. Or it might be the case that respondents who report receipt of TANF simply
systematically underreport amounts received during the calendar year.
With these findings in mind, recruiting efforts again targeted individuals who just started receiving
SNAP or TANF benefits during the previous calendar year. As with Round One, all versions of the
interview instrument excluded the Family Income Screener question currently in the ASEC
interview (a modification supported by the Data Analysis results that found between 12% and 20%
of SNAP and TANF recipients would be screened out if the Family Income Screening criteria were
applied to ACS data). The Round Two wording for the SNAP question replaced "get" with "use"
as one approach for addressing missed recipients. However, in the absence of the opportunity to
evaluate the TANF question in Round One, no specific wording changes were made for Round
Two. Probing to identify knowledge of TANF, language and labels for TANF and any prior receipt
of TANF served as the main evaluative information about those income questions.


REPORTING OF ASSET INCOME
Round One cognitive interviews revealed some issues with the identification of withdrawals and
required distributions from retirement accounts, based on the wording used in that round. Thus
Round Two made additional changes that more specifically cue on both withdrawals and
distributions when asking if these are a source of income. The Data Analysis Report supported the
need for these updates, identifying that the CPS misses over 90 percent of retirement account
withdrawals.
The report also demonstrated that the CPS misses about a quarter of SOI (IRS Statistics of Income) pension recipients and
pension income. It misses nearly 40 percent of aggregate interest income. With comparable shares
of filers with interest income, the underreporting of CPS interest amounts is driven by
underreporting of dollars and not by underreporting of receipt. The CPS is also missing about 75
percent of dividend income.
As a result of these findings, the Round Two interviews further distinguished between retirement
assets and those held outside of retirement accounts, but for both, the interviewer started by asking for the
specific types of assets or accounts the person might have and then probed about interest or
dividends for the individual accounts or assets identified. This approach is based on the hypothesis
that some of the under-reporting may reflect the need for additional cueing to help respondents
think through all such assets and accounts in order to include them in their report of interest earned
or dividends received.
In addition, for respondents who refused or said ‘don’t know’ to the probe on interest or dividend
amounts, interviewers asked for the total account balance. In reaction to the observed sensitivity of
collecting total account values with some respondents (discussed further in the results section), the
account value question was prefaced with an explanation that having the value amount would allow
the Census Bureau to estimate the amount of interest earned or dividends received. If respondents
could not or would not answer the total value question, the interviewer followed up with the bracketed
amount questions to collect interest or dividend amounts. This combination of changes (cueing
for specific accounts, and then adding two different methods for capturing interest earned or
dividends received) targeted a reduction in under-reporting as well as decreased item nonresponse.
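The sketch below outlines this three-step asset sequence (amount, then account value with the explanatory preface, then brackets); the wording and bracket ranges are placeholders for illustration rather than the tested instrument text.

```python
# Round Two asset fallback sketch: interest/dividend amount -> account value
# (with an explanation of why it is requested) -> bracketed ranges.
# Wording and ranges are placeholders, not the tested instrument.

BRACKETS = ["under $100", "$100 to $499", "$500 to $2,499", "$2,500 or more"]

def ask(prompt):
    return input(prompt + " ").strip()

def asset_income(account_label):
    amount = ask(f"How much interest or dividends did the {account_label} earn in 2010? (amount or DK)")
    if amount.upper() != "DK":
        return ("amount", amount)
    balance = ask("Knowing the account value would let the Census Bureau estimate the interest or "
                  f"dividends. What was the balance of the {account_label}? (amount or DK)")
    if balance.upper() != "DK":
        return ("balance", balance)
    for bracket in BRACKETS:
        if ask(f"Was the amount earned {bracket}? (y/n)").lower() == "y":
            return ("bracket", bracket)
    return ("missing", None)

if __name__ == "__main__":
    print(asset_income("savings account"))
```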

DISABILITY
In addition to the more specific cuing for types of disability income noted above, Round Two
included one additional change. In Round One, respondents noted confusion regarding
the implied reference period in the first (screener) disability income source question. Specifically,
the question asked "(Do you/Does anyone in the household) have a health problem or disability
which prevents (you/them) from working or limits the kind or amount of work (you/they) can do?"
The question wording implies 'as of right now', which for the ASEC interview is in March following
the appropriate calendar-year reference period. This could screen out people who had a
disability and were receiving disability income in the prior year but no longer do at the time of the
interview, resulting in underreporting. Thus, Round Two changed the screener question to refer to
the prior calendar year.


PROPERTY
In Round One, one respondent noted that he did have a rental property but chose not to report this
potential source of income because it did not yield a profit in the prior calendar year. In fact, the
Data Analysis Report identified that the CPS appears to miss a significant proportion of rental
property owners with losses and the associated income losses. Thus, Round Two included specific language to include
property as a source of income even in the absence of a profit.
Attachments D1 – D3 show the three different interview instruments used in Round Two. (In this
round, the cognitive interview probes were embedded within each interview instrument.)

Findings
Observations Regarding the Structural Changes
Both rounds of cognitive testing included the following three structural changes:

-   Collecting all sources of income first, before asking for amounts for any income source.
-   Varying the order of presentation of income types, based on known respondent characteristics, with three general orders used in each round.
-   Following up "don't know" or "refused" responses with a series of questions asking for bracketed dollar amounts.

As noted above, the purpose of each of these structural changes was to make use of automation to
minimize under-reporting and missing data. In Round Two, one additional structural change was
made to increase the memory cues for respondents, both as another method for minimizing
under-reporting and to better facilitate the separation of source and amount collection. Specifically,
in the asset sections (interest and dividends), as well as within other income sources that cover
multiple sub-sources (e.g., disability, pensions, survivor benefits), we modified the instrument to
collect responses at the sub-source level.

Source first approach
In the first round, the source-first approach resulted in some difficulties for respondents at the point
when requesting the amount received for a given income source. However, the difficulty resulted
from issues with using a paper rather than an automated instrument. Interviewers did not always
record the exact sub-type of income within the source-level questions. Thus, when the interviewer
asked the corresponding amount question, he or she could not repeat back as part of the question
the necessary cue (sub-source name). With automation, CAPI can store the sub-source name and
use it as a fill in the corresponding amount question that comes later in the interview. Despite this
paper-based limitation, there were some respondents who noted a benefit to the source first
approach, saying things such as: “it’s better like you did it – to me –because then that way you didn’t
really put nobody on guard…. You could get it out of a person better than asking flat out ‘how
much did you get…?”. There were also respondents who stated a preference for the interleafed
approach, though in part this preference reflected the interviewer difficulty with maintaining the link
between the positive response at the source question and the appropriate (sub) source in the
corresponding amount question.
In the second round, interviewers did not probe specifically for respondents’ feedback on the source
first approach. Rather, interviewers attempted to determine if any income sources were missed, and
watched for any confusion between the source questions and the request for the associated amount of
income from that source. Round Two interviews did not result in any of the same confusion or
difficulties observed in Round One with maintaining the link between the two pieces of information.
In fact, in the second round there was some suggestion that the separation of source and amounts
could have an additional advantage. In two different interviews, the respondents were aware of and
could report the different types of assets they owned. They reported their assets without hesitation.
However, when it came to asking for the earned income from those assets, neither respondent knew
the amounts, and repeatedly “excused” their lack of knowledge saying that the accountant knew the
amounts. Both respondents seemed to get agitated or slightly defensive. However, because this
occurred after collecting all sources of income, the structure diminished the possible negative effect
on reporting of subsequent incomes types.
We also observed one other potential advantage of separating source and amount questions. The
separation allows for a ‘second-pass’ at thinking through the income source. An interviewer
encountered a couple of instances in which, in the amount section, respondents noted two or more
of an income type (e.g., two 401(k) accounts) that hadn't been discussed initially in the source
section. It seemed the second pass through the income type when requesting amounts could
provide an opportunity for more specific recall, perhaps encouraging better reporting. That being
said, these interviews did include additional probing between the source and amount sections, which
could also influence respondent recall, and the analysis cannot separate out that effect. However,
there was no direct indication that the source first approach resulted in any negative response
characteristics. More specifically, we did not observe any completely missed sources of income that
seemed linked to the source first approach.

Tailoring the order of income source presentation
In both Round One and Round Two, interviewers used one of three orders for collecting the
sources of income: a low-income order, a senior (62+) order, and the default order. The order
presented depended on known characteristics of the respondents. In the cognitive interview
situation, the respondent characteristics used for determining the appropriate order came from the
screening interview completed during recruitment. In the actual ASEC, automation will allow the
use of more detailed information collected in prior waves, or even from earlier in the interview (e.g.,
roster, work history) to determine the most appropriate order.


The goal for using a tailored order is to reduce the effects of respondent fatigue on reporting, by
increasing the likelihood that the most relevant sources of income are discussed and identified earlier
in the interview. The target outcome is a reduction in missing data and under-reporting.
However, without the benefit of a ‘truth’ source, the cognitive interview environment limits our
ability to assess the effect on data quality. As with the source-first assessment, our analysis focused
on identifying any missed sources of income, particularly those that came later in the order of
presentation. Round One did not reveal any such missed sources. However, in Round Two we did
observe three cases of missed income (identified by the detailed probing at the end of the interview)
from the "other income" source questions. In all three cases, however, it seemed the respondent
did not report the income for reasons other than the order of presentation of income types. For
example, one female respondent did not report two $10 gift cards she received for taking her infant
to two well-baby doctor visits. She indicated that she didn't think of these small gift cards, available
to her only for this very specific reason, as relevant to this data collection. The other instances
also reflected the respondent's perception that the money did not match the scope of this data
collection. One woman neglected to report income received for a few months of the previous year
through a special grant that covered the dietary needs of her disabled son. Another respondent's
partner received a one-time payment of $40 for driving a relative somewhere. These errors reflect
small dollar amounts, for very specific instances, and do not likely contribute to a possible
explanation of underreporting in the ASEC. Nor do these missed reports seem attributable to the
order of presentation; rather this seems to reflect the respondents’ perceived relevance to the survey
objectives.

Follow-up Bracket Items for “Don’t Know” Responses
In both Round One and Round Two, interviewers used follow-up brackets for “don’t know”
responses or refusals. There was one difference between rounds in the timing of the bracketed
response questions, but only for the Asset reporting. In Round Two (described in more detail
below), if respondents initially refused or indicated they didn’t know the amount of interest earned
or dividends received, the interviewer first attempted to get the total account value. In requesting
the account value, the interviewer explained that the Census Bureau can use the account value to
estimate interest/dividend amounts. Only if respondents were not able or unwilling to report the
account value did interviewers ask the bracketed amounts questions for Asset reporting in Round
Two.
In Round One, interviewers used brackets at least once in 13 of the 27 interviews. Interviewers
used brackets in nine instances for estimating interest amounts, both for accounts owned by the
respondent and those owned by other household members. Brackets were used to estimate earnings
for other household members in four interviews. In Round two, brackets were used in only 6
interviews, though this seemed reflective of Round Two respondents’ greater willingness to provide
an estimated dollar amount. In Round Two, interviewers used brackets for estimating interest in
checking and savings accounts and with dividends, and in one instance for estimating another
household member's earnings. Across these six cases, only one respondent went beyond the second level
of the bracket, and one respondent indicated that she could not go further than the initial question.
One of the concerns with the use of the bracketed follow-up questions is that they make it easy for
an uninformed respondent to guess. However, the self-imposed limitation we observed in Round
Two suggests that respondents don't necessarily 'acquiesce' and give responses beyond what they
think they can reasonably know. In fact, one respondent in Round Two, after going to the third
level of brackets for his own interest income, responded to the initial bracket question about the
interest on a savings account for someone else in the household by saying, "Oh, you're just going
to keep on, aren't you. I don't know." He then told the interviewer to make the answer 'don't know.' Similarly, in Round
One, three of the 13 cases who initially started using brackets ultimately decided to respond with
“don’t know” rather than select from the brackets.
We observed one other potential benefit of using follow-up brackets in round one. In three cases,
respondents used the brackets to help develop a point estimate for their own interest earned. It
seemed the brackets provided an additional opportunity to think about their answer, leading to a
point estimate for a response that originally started out as a ‘don’t know’. We did not see this same
behavior in Round Two.
As with the evaluation of the other structural modifications, the absence of a "truth" source for the
respondents' income means we limit our data quality assessment to what we observed in the
interviews when respondents used the brackets. Our observations suggest that respondents are
willing to use the bracketed follow-up questions, but limit the level of precision in their response
rather than guessing in subsequent, lower-level brackets. We also don’t have appropriate data to
speak to whether respondents might view the brackets as overly burdensome. Thus, we suggest
that the Census Bureau include the use of brackets in subsequent testing of an automated instrument,
but also include measures of perceived burden, as well as a method to assess the quality of the
bracketed response.

Observations on the Income Sources of Key Interest for this Project

TANF
Since receipt of TANF is known to be chronically under-reported in ASEC, we put a priority on
recruiting low-income persons for this project likely to have received these benefits in 2010, so as to
explore possible reasons for erroneous reporting. The strategy for determining whether the
household had members who received TANF was essentially the same across the two rounds,
though there were some notable differences. In Round 1, we used the current ASEC question
wording for determining TANF receipt. In Round 2, we converted a respondent instruction not to
include food stamps, WIC, and other benefits as TANF into an interviewer instruction, and, more
importantly, we attempted to obtain the most specific type of assistance possible so as to be able to ask
more targeted questions when asking for amounts (consistent with the approach taken in Round 2
for other income sources, as discussed in the Methods section). We also provided flexibility for
respondents to report TANF amounts separately for each child – it is possible that children of the
same mother within a household receive differing TANF amounts because one, for example,
receives child support or disability benefits that another child does not receive. In such cases, it may
be easier for respondents to report two separate amounts, rather than a single total amount. In both
rounds, and in all locations, interviewers referred to the specific state program name for TANF.
Below we show how the questions varied between Round 1 and Round 2.


Current ASEC and Round 1

At any time during 2010, even for one month, did (you/ anyone in this household) receive any CASH
assistance from a state or county welfare program such as (STATE PROGRAM NAME)?
INCLUDE CASH PAYMENTS FROM:
Welfare or welfare-to-work programs,
(State Program Name and/or acronyms),
Temporary Assistance for Needy Families program (TANF),
Aid to Families with Dependent Children (AFDC),
General Assistance/Emergency Assistance program,
Diversion Payments,
Refugee Cash and Medical Assistance program,
General Assistance from Bureau of Indian Affairs, or Tribal Administered General Assistance.
Do not include food stamps/Supplemental Nutrition Assistance Program (SNAP) benefits, SSI, energy
assistance, WIC, School meals, or transportation, childcare, rental, or education assistance.

------------------------------
Round 2
At any time during 2010, even for one month, did (you/ anyone in this household) receive any CASH
assistance from a state or county welfare program such as (STATE PROGRAM NAME)?
Yes
No (SKIP)
From what type of program did (name/you) receive the CASH assistance? Was it a welfare or welfare-to-work program such as (STATE PROGRAM NAME), General Assistance, Emergency Assistance, or some other program?
1 (State Program Name)/Temporary Assistance to Needy Families (TANF)/welfare/AFDC
2 General Assistance
3 Emergency Assistance/short-term cash assistance
4 Diversion Payments
5 Refugee Cash and Medical Assistance program
6 General Assistance from Bureau of Indian Affairs, or Tribal Administered General Assistance

IF RESPONDENT MENTIONS ANY OF THE FOLLOWING CATEGORIES 7 THROUGH 12, NOTE
THIS, BUT PROBE: “RIGHT NOW WE ARE INTERESTED IN CASH ASSISTANCE” AND SEEK
ANSWER TO 59C8-R USING CATEGORIES ABOVE.
7 Food stamps/Supplemental Nutrition Assistance Program (SNAP) benefits
8 SSI
9 Energy assistance
10 WIC
11 School meals
12 Transportation, childcare, rental or education assistance
13 Some other program (What type of program? ______________)


Earlier, you reported that [NAME(S)] received [CASH ASSISTANCE PROGRAM NAME (Q59C8)] (on
behalf of children) in 2010. What would be the easiest way for you to report the amount received - for
everyone in the household combined, or for each person separately?
COMBINED
SEPARATELY

In Round 1, no one among the 8 low-income persons we recruited reported having received TANF
during the reference year (2010). From what interviewers could gather through probing, none of
these respondents seemed to have failed to report TANF receipt in response to the ASEC
questions. No respondents seemed to have had difficulty understanding the question. Four of 5
persons probed on the state program name were aware of it. A couple of respondents had received
TANF in prior years, but it appeared that most of our low-income respondents were in fact not
eligible for it (based on income, no children, etc.).
As noted in the Methods section, for Round 2 we made a more concerted effort to recruit persons
potentially eligible for TANF. We contacted local TANF offices in each location for assistance,
though offices in 3 locations refused to provide it. We also required low-income recruits to have dependent children in the household (a strict requirement for receipt of TANF). Out of 15 low-income persons recruited for the second round, 7 reported having received TANF during the
previous year. As in Round 1, we observed nothing to suggest that someone may not have received
TANF yet did not report it in response to the ASEC questions. However, we believe that 3 of the 7
persons reporting TANF receipt did so incorrectly:

- One reported TANF receipt on the basis of having received food stamps. When asked the new question we had devised for this round regarding the type of program providing this assistance, he simply answered “welfare.” He later noted that he has always thought of food stamps as being welfare, recalling as a child being taken to the “welfare office” by his mother in order to get their food stamps;
- The other two respondents reported TANF receipt on the basis of assistance that went directly to their landlords to help pay their rent – in other words, they did not receive money to help with their general expenses. When asked the follow-up item regarding the program type, one described it as “Welfare-to-work, the DWP” (i.e., Diversionary Work Program), while the other said it was “Emergency Assistance.”

Although the new question we inserted asking for the type of program failed to catch these incorrect
reports of TANF receipt, the question did appear to be effective overall, with respondents offering a
variety of programs, including “TANF,” “Welfare,” “General Assistance,” “Emergency Assistance,”
and “DWP” or “Diversionary Work Program.” As intended, we referred to these program types
later when asking for the amounts. So we think this follow-up question is worth additional
consideration and testing for ASEC. Note that to the extent respondents mention assistance
programs that should not be reported as TANF (e.g., food stamps, WIC, rental assistance, etc.), this
information could be captured and allow for skipping later items on these programs. So the burden
associated with this additional question should be very minimal.
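As one illustration of how an automated instrument might exploit this follow-up question, the hypothetical sketch below stores the specific program name for use as a fill in the later amount question and flags non-cash mentions so that the corresponding later items could be pre-filled or skipped. The category codes mirror the test question shown above, but the routing logic and variable names are our own assumptions, not ASEC specifications.

# Hypothetical routing sketch for the program-type follow-up (codes 1-12 above).

CASH_PROGRAMS = {
    1: "(State Program Name)/TANF/welfare/AFDC",
    2: "General Assistance",
    3: "Emergency Assistance/short-term cash assistance",
    4: "Diversion Payments",
    5: "Refugee Cash and Medical Assistance program",
    6: "General Assistance from BIA or Tribal Administered General Assistance",
}
NON_CASH_MENTIONS = {
    7: "food_stamps", 8: "ssi", 9: "energy_assistance",
    10: "wic", 11: "school_meals", 12: "other_assistance",
}

def record_program_type(code, household):
    """Store a cash program name for later fills, or flag a non-cash mention
    so the corresponding later screener could be pre-filled and skipped."""
    if code in CASH_PROGRAMS:
        household.setdefault("cash_programs", []).append(CASH_PROGRAMS[code])
    elif code in NON_CASH_MENTIONS:
        household.setdefault("mentioned_elsewhere", set()).add(NON_CASH_MENTIONS[code])

def amount_prompt(household):
    programs = household.get("cash_programs") or ["cash assistance"]
    return f"How much did you receive from {' and '.join(programs)} in 2010?"

hh = {}
record_program_type(1, hh)   # respondent names the state TANF program
record_program_type(7, hh)   # food stamps mentioned: flag it, then re-probe for cash assistance
print(amount_prompt(hh))
print(hh["mentioned_elsewhere"])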


We were somewhat surprised by the “false positives” we observed with respect to TANF receipt,
given the known problem of serious under-reporting of this income source. But to minimize such
errors, the Census Bureau might want to consider referring to “general CASH assistance” in the question (rather than just “CASH assistance”). Also consider following up “yes” responses with a question such as: “Did this general cash assistance have to be used for a specific purpose, such as _____?” It would be useful if the examples referred to in such a question were state-specific.
Food Stamps
Like TANF, Food Stamps are an income source known to be chronically under-reported in ASEC.
Thus, we hoped to explore reasons for this among the low-income persons recruited for the
cognitive interviews. For the first round, we tested the current ASEC questions on food stamp
receipt, which appear as follows:
Did (you/anyone in this household) get food stamps or a food stamp benefit card at any time during 2010?
YES (SKIP)
NO
At any time during 2010, even for one month, did (you/anyone in this household) receive any food
assistance from (STATE PROGRAM NAME)?

In all locations, we referred to the unique food stamp program name for that state.
In Round 1, 6 of 8 recruited low-income participants reported receiving food stamps. We saw no
reason to believe that anyone had received food stamps yet was not reporting it in response to the
ASEC questions. One person did note that asking “Did you get…a food stamp benefit card..” was
a bit confusing, since one can receive food stamps benefits on the same card over a considerable
period of time. This person did answer the question correctly though, and all 8 low-income persons
seemed familiar with the notion of food stamps and a food stamp benefit card. For example, one of
the persons not reporting food stamp receipt discussed how he had applied for food stamps yet was
told he was ineligible. In addition, almost all seemed to correctly (and quite easily) report the
amount of the food stamp benefit they had received. One individual, in a very large and complex
household, did likely misreport the amount – it seemed he failed to multiply correctly when
generating the total benefit amount per month across all household members covered by this
assistance. Furthermore, when making this calculation he used his current monthly benefit
amount, neglecting to consider two occasions in 2010 when the benefit level had been adjusted.
In Round 2, we made a minor adjustment to the initial food stamp item in an effort to avoid
confusion regarding when some may have gotten their benefit card:
Did (you/anyone in this household) get food stamps or use a food stamp benefit card at any time during
2010?

As in Round 1, the food stamps items appeared to be very straightforward and work as intended.
Sixteen persons in this round reported having received food stamps in 2010, but the questions were
always easily answered, regardless of the response. As far as we were able to determine, no one had
received food stamps while failing to report it in response to the ASEC questions. As with TANF,
we observed nothing to suggest that under-reporting in ASEC is due to problems with
comprehension of the questions. Reporting of the food stamp benefit amount was generally easy
and straightforward as well. One person could not recall with certainty whether a drastic reduction
to her monthly benefit had occurred in late 2009 or early 2010 – she ultimately may have under-reported the amount since she decided the reduction had occurred in 2009. In addition, after
hearing the ASEC item: “What is the (monthly) value of the food assistance received in 2010?” one participant
replied with “Meaning how much did she get?” Perhaps the wording of this item is unnecessarily
formal in tone, and could be reworded in a more conversational manner, such as: “How much did
(you/name) receive in (monthly) food stamp benefits in 2010?”

Disability Income
In this section we describe the respondent difficulties and other issues we observed with respect to
reporting income received for reasons of disability, including short-term disability and longer-term disability payments from Social Security Disability Insurance (SSDI) and Supplemental Security
Income (SSI). In Round 1 we tested the current ASEC approach for collecting disability income,
detecting several problems. In Round 2 we tested major potential revisions to ASEC’s approach to
capturing this type of income.
In the first round of interviews, 13 respondents were screened as having disability income. One
significant observation from the first round was that three partial-year recipients of SSDI did not
report large “back-payments” that covered a lag in time between the determination of their eligibility
and the receipt of their first disability payment. Two respondents did not think any ASEC question
asked for the back-payment amount since it was something quite different from their regular
monthly payment amounts. Neither one even thought of it when responding to the ASEC items
collecting the amount of their disability income. Another respondent chose not to report the back-payment due to the sensitivity of having received a large amount – she discussed how revealing the
amount to her family and friends had been a source of tension among them.
Two respondents did not report receiving short-term disability in the income sources section, as
they had trouble determining the appropriate questions at which to report this income. In both
cases, the key ASEC items designed to identify persons who might have received such disability
income failed to do so. These questions are as follows:

- (Do you/Does anyone in the household) have a health problem or disability which prevents (you/them) from working or which limits the kind or amount of work (you/they) can do?
- (Did you/Is there anyone in this household who) ever (retire or leave/ retired or left) a job for health reasons?

Note that the first question (ASEC Q59A) is phrased in the present tense. Both respondents
answered “no” because they no longer have the health issue that prevented them from working for
part of 2010. They both also answered the second item with “no” since they had merely stopped
working for a short time, rather than retiring or leaving their job. As one put it: “I didn’t leave the job –
I was out for a while, but I was still employed by the same company, so I just went back to work for them, after my
recovery.”


Interestingly, both of these respondents wanted to report this short-term disability income – they
were just unsure of where in the ASEC interview to report it. One respondent asked if she should
report her disability income in response to the ASEC item on Supplemental Unemployment
Benefits. The other, upon hearing ASEC’s definition of SSI (“assistance payments to low-income
aged, blind and disabled persons, and come from state or local welfare offices, the Federal
government, or both”) wondered if his short-term disability might fit here. However, he ultimately
reported the amount of his short-term disability when asked about the financial assistance he had
received. Here he neglected to report actual financial assistance he should have reported, replacing it
with his short-term disability payments. The use of a more effective cue when asking for the
amount of financial assistance would have avoided this confusion.
At least two respondents had difficulty distinguishing SSDI from SSI benefits. One respondent
reported getting both forms of disability pay at different times in 2010. When he was asked ASEC
Q57a (“During 2010 did [you/ anyone in this household] receive: any SSI payments, that is, Supplemental Security
Income?”), he asked “That’s the same thing as Social Security, right?” When he was asked “As you
understand it, which do you get: regular Social Security or SSI?” the respondent said he received
both sources, although it was clear he was unfamiliar with the SSI program name: “I got Social Security
part of the year then it switched over to Social Security Disability.” Another respondent who received
disability income, likely through Social Security, also had trouble distinguishing the two programs.
This respondent was asked ASEC Q57a, as well as the interviewer note: “SSI are assistance
payments to low-income aged, blind and disabled persons, and come from state or local welfare
offices, the Federal government, or both.” In response, he said: “Well they say I didn't qualify for
welfare, but disability and social security are the same thing. It would fall under social security, yes” and he went
on to report receiving SSI. However, later, when he was at the disability source section, he reported
“From what I understood, I was supposed to qualify for both ... but they only qualify me for one ... that would be
under SSD.” He suggested what he received should most accurately be reported as ‘disability income’
rather than Social Security payments or SSI.
For the Round 2 interviews, a number of changes were made in the instrument after the first round to address the difficulties we observed.

- To serve as a cue for respondents to recall back-payments received, we added a question to the disability amounts section for those who reported receiving Social Security for less than a full year:

  During 2010, did (name/you) receive an initial disability payment that was larger than the usual payment? (This is sometimes done to make up for a delay in the start of payments.)

- To minimize confusion about whether to report disability income in response to early or later questions, we changed the ordering of income sources to have the disability section immediately precede the sections asking about SSI and SSDI payments, and furthermore added follow-up items into the disability income section designed to determine the specific source(s) of disability pay. This includes items on SSI and SSDI, since we expected many receiving these benefits would now report them in the disability section.

- In addition to moving the disability source section before the sections on SSI and Social Security, we added descriptions of the programs into the sub-items that referred to those programs (at Q61B_R, item a and Q61B_R, item b), to make it easier for respondents to distinguish them:

  a. Social Security Disability Insurance, for people who are eligible based on years of work;
  b. Supplemental Security Income, which provides payments to low-income aged, blind and disabled persons;

- To help prevent underreporting among those who experienced only short-term interruptions in their income due to a disability, we revised ASEC Q59A (one of the ASEC disability screener items) from its current present-tense form to the following:

  At any time in 2010 (did you/did anyone in the household) have a disability or health problem which prevented (you/them) from working, even for a short time, or which limited the work (you/they) could do?

In Round 2, we observed less evidence that respondents were misreporting receipt of disability
income. One respondent did under-report the amount of disability benefits he received due to an
unreported back-payment. A total of 21 of the 29 Round 2 respondents answered yes to one or
both of the ASEC screener questions in the disability section. Of those 21 respondents, 14 reported
that someone in their household received disability income in 2010. For one of the 14, it was
undetermined if this (Social Security) income was actually due to disability or was traditional
retirement, though the amount was reported without difficulty.
Three respondents had some difficulty responding to the follow-up items we added to the disability section that were designed to determine the specific program/source of disability pay:

- One respondent had difficulty recalling the source of his disability income. He qualified his ‘Yes’ answer to an item asking if it was ‘Social Security Disability Insurance, for people who are eligible based on years of work,’ by indicating he received income from a state temporary disability program, though he thought the source was Social Security. It seemed likely that he was reporting his disability source prematurely, as a moment later he indicated that it came from ‘Federal, State or Local government employee disability’ and he knew that he received disability income from only one source;
- One respondent had difficulty understanding the intent of the follow-up items we had added to determine the source of disability pay. She was administered the question stem (‘Was this disability income from…’) but as the sub-items were read, she appeared to think the items were asking whether she had access to these sources, and not whether she actually received income from each of them in 2010. At the item on ‘Company or union disability payments’ she seemed unsure if her employer’s long-term disability benefit plan should be reported. She demonstrated the same problem at an item meant to determine if the disability pay was ‘Accident or disability insurance’ – she paraphrased the question incorrectly as ‘Do I have accident or disability insurance…?’ As a separate issue, she was uncertain if she should include the Social Security Disability that she had already reported in this series again at an item determining whether she had gotten ‘Federal, State or Local government employee disability.’ In our view, this respondent’s difficulties could best be handled by simple clarifications provided by an interviewer;
- Another respondent appeared unsure of who the follow-up items were pertaining to. She indicated confusion at the item ‘Supplemental Security Income, which provides payments to low-income aged, blind and disabled persons,’ saying “I don’t understand that question. My kids get Social Security…uh, SSI…Social Security income, because I’m disabled and they’re in school.” She did not seem to realize the questions were asking about her own disability income. This is another instance that could easily be handled by an experienced interviewer.

Some confusion might be avoided by revising the order of the follow-up items (currently ordered from the programs with the most recipients to the fewest) and by rephrasing the stem wording to refer explicitly to the respondent or household member who received the disability pay.
Three respondents had difficulty distinguishing SSDI and SSI. Two of these reported (correctly, we
believe) household members receiving income from both programs. One respondent reported both
SSI and SSDI as sources in the disability section but when asked at the next section if she received
Social Security aside from what was already counted, she asked if SSI should be reported. She
referred to it as ‘Supplemental Social Security Income,’ suggesting that a reason for her confusion between the programs is her limited knowledge of the acronym and of the program name that it represents. The other respondent demonstrated a similar confusion, as she referred to SSI as
“Social Security…uh, SSI…Social Security income” when reporting about her children’s benefits. Such
confusion is likely to persist due to the similarities in program names, though it may be minimized
by referring to programs consistently using the same wording. In addition, respondents indicated
that SSI payments are electronically deposited on the 1st of each month and SSDI payments are
deposited on the 3rd of the month – if this can be established as true for virtually all persons
receiving such disability payments, this information should be included on CATI/CAPI screens as
interviewer help text.
Collecting income amounts from our interview participants with disability income was generally very
straightforward and almost no problems or issues were observed. Only one of the 5 respondents
who were asked the new question regarding back-payments had difficulty reporting them correctly.
This respondent initially referenced receiving a back-payment when he was deciding whether to
report it in the Social Security section despite already having reported receiving Social Security
benefits in the disability section. He omitted a sizeable amount of disability income ($9,000-$10,000)
received from back-payments. The respondent seemed to interpret the back-payment question as
referring only to the payments he received on a monthly basis, noting he received portions of his
back-payment at various times during the year. Our question, as worded, did assume that recipients
receive one-time back-payments. So we would now recommend a wording that does not make this
assumption. Furthermore, the term “back-payment” is one we heard respondents repeatedly use in
these interviews, so we suggest this term be used in the question. Consider the following wording,
for example:
During 2010, did (name/you) receive any back-payments to make up for a delay in the start of
(your/his/her) disability payments?


Asset Income
Based on the work and recommendations of the Urban Institute, we explored the feasibility of two
major refinements to the ASEC data collection of asset income. First, as a supplement to the current
questions on dividend and interest income, we developed questions asking for the values of the
assets generating this income. People may be more knowledgeable about the balances of accounts
holding stocks and mutual funds, certificates of deposit, and other forms of savings, relative to
income received within these accounts. Second, we explored ways of collecting asset income (and
values) separately for tax-advantaged retirement accounts, apart from other forms of investments.
The Urban Institute argued that collecting information on retirement accounts would yield an
improved picture of the wealth of U.S. households. With the asset income questions in the current
ASEC, it is unclear to what extent the questions are intended to collect income from retirement
accounts.
Table 5 illustrates the approach we tested in Round 1 for determining whether household members
have retirement accounts, and whether payments or withdrawals were received from these accounts.
Note that the current ASEC relies upon the respondent to recall and report income from retirement
accounts in response to rather broad questions that do not specifically cue retirement accounts.

Table 5. Determining Sources of Retirement Income: ASEC versus Round 1 Questionnaire

Current ASEC:
1) During 2010 did (you/anyone in HH) receive any pension or retirement income from a previous employer or union (other than Social Security/VA benefits)?
2) What was the source of (name’s) income?

Round 1:
1) At any time during 2010 did (you/anyone in HH) have any retirement accounts such as a 401(k), 403(b), KEOGH, or IRA?
2) During 2010 did (you/anyone in HH) receive any pension or retirement income from a previous employer or union (other than VA benefits)?
3) During 2010 did (you/anyone in HH) receive any other retirement income (other than Social Security/VA benefits) (including payments or lump sum withdrawals from a retirement account)?
4) What was the source of (name) income?

After asking the standard ASEC questions on whether household members had interest-earning
assets (e.g., money-market funds, savings accounts, savings bonds, certificates of deposit) during
the reference year, we added questions to determine if these assets were held in a retirement account
(if applicable), outside the retirement account, or in both:


[IF PERSON HOLDS A RETIREMENT ACCOUNT] Did (you/NAME) have (this asset / any of these assets) within a retirement account?
Yes
No (SKIP)
Did (you/NAME) (also) have (this asset / any of these assets) outside of a retirement account?
Yes
No

Similarly, if someone in the household was reported to have dividend-earning assets (i.e., shares of
stock in corporations or mutual funds), we sought to determine if they were held in a retirement
account (if applicable), outside the retirement account, or in both:
[IF PERSON HOLDS A RETIREMENT ACCOUNT] Did (you/NAME) own any of these shares within a retirement account?
Yes
No (SKIP)
Did (you/NAME) (also) own shares of stock or mutual funds outside of a retirement account?
Yes
No

When collecting amounts of income from interest and dividend-earning assets, we sought to obtain
the income separately for retirement and non-retirement accounts (as applicable), and to obtain asset
values in addition to the income generated. Table 6 illustrates this approach. It should be noted
that we varied the order of collecting asset income and values across interviews.
Table 6. Collecting Asset Income: Current ASEC versus Round 1 Questionnaire

Current ASEC:
1) How much did (name/you) receive in interest from these sources during 2010, including even small amounts reinvested or credited to accounts?

Round 1:
1) Within (your/NAME’s) retirement account(s), what was the value of the (interest-earning accounts or money market funds/savings bonds/treasury notes, CDs or other investments which pay interest) at the end of 2010?
2) Within (your/NAME’s) retirement account(s), how much did (name/you) receive in interest from these sources during 2010, including even small amounts reinvested or credited to accounts?
3) (Outside of (your/NAME’s) retirement account(s)), what was the value…
4) (Outside of (your/NAME’s) retirement account(s)), how much did you receive…


We observed a myriad of respondent difficulties and problems in the first round of cognitive
interviews. Some of the observations stemmed from issues with the current ASEC items, while
others were due to issues with the additional measures we had developed for this section. The most
notable problem was that respondents, as well as interviewers, often struggled to determine precisely which asset was being targeted by a question asking whether the asset was held within a retirement account, asking for the amount of interest income earned, or asking for the value of the asset. Many of our respondents had assets in both retirement and non-retirement accounts, and at least two persons had more than one retirement account (e.g., both an IRA and a 401(k), or more than one IRA). Some also reported having retirement pensions from previous employers. While the new questions we developed attempted to address assets held in retirement accounts separately from those in non-retirement accounts, they did not distinguish among different types of retirement accounts, or adequately distinguish between retirement savings accounts and pensions. One respondent, when asked if any of “these assets” (i.e., interest-earning assets she had just reported having) were within a retirement account, summed up the problem in this way:
“I don’t know how to classify different things. I think when you [ask a question] and then you don’t really list the thing that I’m looking for, it’s like ‘what category did I put it under?’”
Furthermore, the current ASEC questions on interest-earning assets ask about multiple asset types in a single question. For example, one item asks if anyone in the household had a money-market fund, an interest-earning checking account, or a savings account during the reference year. Another asks if anyone had “any treasury notes, certificates of deposit, or any other investments which pay interest.” One respondent neglected to report having a CD worth $165,000 in response to this latter item – he seemed distracted by the reference to “treasury notes” in the item. Otherwise, since these questions do not determine precisely what kind of account is being reported, the later questions asking for the amount of interest earned were frequently awkward and confusing, since much of the wording of these questions referred to accounts the respondent does not have. On the other hand, in at least two cases, the respondent had more than one of the account types mentioned in the question, and in such cases it can be unclear whether the respondent should report for the multiple accounts combined, or one by one. Thus, considerable unscripted probing and clarification between the respondent and cognitive interviewers were necessary when collecting the amount of interest income.
Several other observations from the first round of interviews are worth mentioning:

- We learned that the notion of a “retirement account” needs greater specificity. For one respondent, the retirement account was simply an account in which his Social Security check gets deposited, rather than a tax-advantaged account designed for retirement savings. This caused confusion at the new question we had created to determine if certain interest-earning assets were within his retirement account;
- Referring to a “lump sum withdrawal” can be confusing. When asked “During 2010, did anyone…receive any other type of retirement income, including payments or lump sum withdrawals from a retirement account?” one person said “They weren’t lump sums, I took the minimum amount.” In retrospect, we should not have used this wording to refer to a retirement account distribution, as it could be taken to mean withdrawing the entire amount of the account, as noted by this respondent;
- Not everyone with stock and mutual fund investments understands the difference between dividends and interest. For example, when asked how much dividend income he had received, one person said the question was identical to one we had previously asked about interest income. In fact, he had reported dividend income in response to the item seeking the amount of interest income. As another example, a respondent who owned stock/mutual fund shares, when asked how much interest she had earned from an interest-earning asset, reminded the interviewer that “some stocks pay you interest too.” She apparently believed she should be reporting dividends from the stock/mutual fund shares in combination with interest from her money-market fund and savings account;
- At least two respondents referred to having received a “required minimum distribution” from their retirement account. None of the questions we tested included this wording, but since this seems to be standard, commonly understood language, we decided it was worth using in revised questions tested in the second round of cognitive interviews;
- Asking for the “value” of an asset (such as a retirement account) was not immediately clear to at least two respondents. For example, one person thought about it briefly and sought clarification by asking: “You mean ‘what is the balance?’”;
- Asking for the value of an asset can be viewed as intrusive. In a few instances, respondents refused to divulge asset values, such as the amount in a retirement account. With the exception of one person who did not want to reveal the amount of a back-payment for disability benefits, we encountered no refusals to divulge income information requested by ASEC in these interviews;
- Respondents did seem to possess a greater level of knowledge of the values of their assets as compared to the amount of interest or dividend income these assets generated. In the vast majority of cases, respondents were able to report an approximate value of their assets (e.g., amount held in stocks and mutual funds) at the end of the previous year, but we observed frequent difficulties with generating the income earned from these assets. This pattern was particularly true for retirement accounts – a couple of persons did not even know whether their retirement account assets were invested in dividend-earning assets (i.e., stock and mutual fund shares) or interest-earning assets (e.g., money-market funds). As one person explained: “I have investments in Vanguard, I don’t know what they are.”

For Round 2, the sections of the ASEC instrument determining sources of interest and dividend
income, including that within retirement accounts, were substantially revised. The revisions were
driven directly by the notable problems we observed in Round 1.
First, we developed a short, straightforward section for retirement accounts. This section
determined who in the household held retirement accounts and what types of accounts (e.g., 401(k),
Roth IRA, SEP, etc.). This was done so as to allow for greater specificity (thus making questions
more clear) and more effectively cue respondent recall when interviewers are asking the amount of
interest or dividends earned from a retirement asset – for example, by referring to “your 401(k)
account” rather than “your retirement account.” In addition, we added a question that specifically
addresses whether the individual withdrew any money (received a distribution) from these
retirement accounts. It became clear to us after Round 1 that ASEC is intended to capture this as
income, yet the current instrument lacks a question directly asking about retirement account
withdrawals.
Second, we revised the current ASEC items determining interest and dividend-earning assets to a
somewhat longer, but simpler series of items. For example, rather than asking if anyone in the
household had money in “any kind of money market fund, interest-earning checking account, or
savings account” (Item Q63A1 in the ASEC items booklet), we sought to determine whether anyone
had money in each of these accounts, by obtaining a “yes” or “no” response for each. While this
may initially appear to increase respondent burden by increasing the number of questions, asking
short, easier-to-understand items will serve to decrease respondent burden. Also note that by
obtaining the specific types of interest-earning assets held, the latter items collecting the amounts of
interest earned can be asked with greater specificity (by referring to a single account type, rather than
multiple account types, some of which do not apply), which should further aid in increasing
respondent comprehension and cueing recall.
Finally, rather than ask all persons with these assets for both the value of the assets (which Round 1 revealed to be perceived as quite intrusive) and the interest or dividend income generated by these assets, we instead relied upon first asking for the interest or dividend income; only if the respondent indicated not knowing the amount (or refused) did we follow up by asking for the value. We attempted to “soften” the request for the asset value by offering a brief justification for the question, noting that the Census Bureau can estimate the amount of income generated if the asset value is known.
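A minimal sketch of that fallback logic appears below. The function and field names are our own, and the sketch is offered only to illustrate the routing, not to represent the CAPI specification.

# Sketch of the Round 2 logic: ask for interest/dividend income first, and only
# on "don't know" or refusal fall back to the year-end value of the account.

DONT_KNOW = "DK"
REFUSED = "REF"

def collect_asset_income(account_type, ask):
    record = {"account_type": account_type, "income": None, "year_end_value": None}
    income = ask(f"Within your {account_type} account(s), how much did you earn "
                 "in interest or dividends during 2010?")
    if income not in (DONT_KNOW, REFUSED):
        record["income"] = income
        return record
    # Soften the value request with the justification used in the test instrument.
    value = ask("The Census Bureau can estimate the amount earned based on the size "
                f"of the account. How much money was in your {account_type} at the "
                "end of 2010?")
    if value not in (DONT_KNOW, REFUSED):
        record["year_end_value"] = value
    return record

print(collect_asset_income("401(k)", ask=lambda q: DONT_KNOW))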
The questions on retirement accounts and interest/dividend-earning assets that we tested in Round
2 are shown below.
At any time during 2010 did (you/ anyone in this household) have any retirement accounts such as a 401(k),
403(b), IRA, or other account designed specifically for retirement savings?
YES
NO (SKIP)
Which members of this household ages 15 and over had such a retirement account?
What type of retirement account (did you/ NAME) have? Did (you/he/she) have…
(MARK ALL THAT APPLY)
A 401(k)?
A 403(b)?
A Roth IRA?
A Regular IRA?
A KEOGH plan?
A SEP plan?
Another type of retirement account?

(SPECIFY): ______________________


Did (you/NAME) withdraw any money or receive a distribution from (your/his/her) [ACCOUNT TYPE]
account in 2010? IF AGE 70+ ADD: including distributions you may have been required to take?
YES
NO
IF ANY RETIREMENT ACCOUNTS IN HH, READ TRANSITION:
(Now I will ask about assets that may have paid interest or dividends in 2010 outside of the [ACCOUNT TYPES].)
At any time during 2010, did (you/anyone in this household) have money in:
A. An interest-earning checking account?
YES → IF NECESSARY DETERMINE WHO
NO
B. A savings account?
YES → IF NECESSARY DETERMINE WHO
NO
C. A money-market fund?
YES → IF NECESSARY DETERMINE WHO
NO
D. CDs (certificates of deposit)?
YES → IF NECESSARY DETERMINE WHO
NO
E. Savings bonds?
YES → IF NECESSARY DETERMINE WHO
NO
F. Shares of stock in corporations or mutual funds?
YES → IF NECESSARY DETERMINE WHO
NO
G. Any other savings or investments that pay interest or dividends?
YES → DETERMINE WHO AND ASK: What type of investment is that? __________
NO
Within your [ACCOUNT TYPE] account(s), how much did (you/NAME) earn in (interest or dividends/interest/dividends) during 2010? Please include small amounts reinvested or credited to the account.
_____________
IF DK/REFUSED:
The Census Bureau can estimate the amount earned in this account based on the size of the account. So can you tell me how much money was in (your/his/her) [ACCOUNT TYPE] at the end of 2010?
_____________


In Round 2, the sections of the test ASEC instrument asking about retirement accounts and assets
earning interest and dividend income flowed much more smoothly than in Round 1. Although there
were a couple of cases where a respondent was unsure of the type of retirement account they (or another household member) had, the series of items on retirement accounts, including asking whether a
withdrawal had been made last year, generally went very well. The extended series of items on assets
outside of retirement accounts also flowed very smoothly. In addition, when collecting the amount
of interest or dividend income earned, there was much less confusion observed on the part of
respondents (virtually none, in fact) with respect to what account or asset we were asking about.
Thus, the greater specificity and cuing of account types we incorporated into the instrument for the second round appeared to have a major positive impact.
Nevertheless, among the 12 respondents who appeared to have retirement asset income in Round 2, we observed several notable issues:

- When asked if his wife had withdrawn any money or received a distribution from her retirement account, one person answered “Yes, I think she borrowed out of there.” Taking a loan out of one’s retirement account should not be reported as income. Consider adding an instruction for persons under the age of 60 to not count loans they may have taken from their retirement accounts. At a minimum, an interviewer instruction should be included on the CAPI/CATI screen for this question indicating that such loans should not be counted as withdrawals;
- When asked what types of retirement account he has, one respondent was unclear on the reference period. He wondered if we wanted to know about retirement plans that had been rolled over into his current plan. This could easily be addressed by adding the reference year to the question. That is: “What type of retirement account did you have in 2010? Did you have…”;
- Two people reported pension income amounts, but then noted that these amounts were after taxes. In both cases, getting only the after-tax amount would constitute a substantial under-report. It’s important to note that the ASEC question addressing pension income does not specify whether gross or net income should be reported – a flaw in the question that could easily be addressed. As one of these respondents put it: “You need to tell me whether or not I should be thinking about taxes and health insurance policies and stuff like that, which are…prepaid before I get a check;”
- One person found the question asking for the value of an asset to be quite intrusive (and refused to divulge it), wondering why the government would be asking for such information. Others in this round had no issue discussing asset values, however;
- When asked to report the amount of interest or dividends within his 401(k) account, one person with two 401(k) accounts thought only of the smaller one. It was the account connected to a previous employer who now pays him a pension. We had just asked for the amount of this pension income, and this led the respondent to think only of the 401(k) associated with this job (the respondent directly pointed this out as the reason for neglecting the other account);
- One respondent had withdrawn money from two separate retirement accounts and indicated it would be much easier to report the full amount withdrawn (for the two accounts combined), rather than for each account separately, which is the way our Round 2 ASEC instrument requested it. Similarly, another person had difficulty reporting interest earned on different assets (a savings account and CDs) held at the same bank. Of course, ASEC does not need this level of detail – the instrument could (and should) be flexible enough to capture this type of income for different accounts separately, or combined. It will be easier for some respondents to report for specific accounts. For others, it will be easier to report for multiple accounts combined.

Conclusions and Recommendations for Further Testing
As noted above, the primary objective of this research was to develop an alternative approach for
collecting the ASEC data that takes advantage of the functionality available with an automated
instrument in order to improve recall and reporting of income data. Automation can increase the
amount of flexibility built into the instrument for collecting the data. As done in these interviews,
automation facilitates tailoring the presentation order of income types to match the sources most likely to have been received by respondents, given certain known characteristics of the respondent. Automation
also allows for the collection of all sources of income first, separately from amounts, since the CAPI
instrument can store the detailed income source name or label and then feed that back to the
respondent later in the interview to collect the amounts.
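The sketch below illustrates the kind of bookkeeping the source-first approach implies: a first pass records each reported source with its detailed label, and a second pass feeds those stored labels back when generating the amount questions. The data layout and names are hypothetical and intended only to show the mechanics.

# Illustrative "source first" structure: pass 1 records detailed source labels,
# pass 2 feeds those labels back as fills when collecting amounts. Hypothetical.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class IncomeSource:
    person: str
    source_type: str           # e.g., "TANF", "interest", "retirement withdrawal"
    detail: str                # detailed label captured in pass 1, reused as a fill
    amount: Optional[float] = None

@dataclass
class HouseholdInterview:
    sources: List[IncomeSource] = field(default_factory=list)

    def record_source(self, person, source_type, detail):
        """Pass 1: store the source and its detailed label."""
        self.sources.append(IncomeSource(person, source_type, detail))

    def amount_prompts(self):
        """Pass 2: generate amount questions using the stored detail as a fill."""
        for s in self.sources:
            yield s, (f"Earlier you reported that {s.person} received {s.detail}. "
                      f"How much did {s.person} receive in 2010?")

hh = HouseholdInterview()
hh.record_source("Maria", "TANF", "cash assistance from the state welfare program")
hh.record_source("Maria", "interest", "interest on a savings account")
for source, prompt in hh.amount_prompts():
    print(prompt)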
Cognitive interviewers ‘simulated’ a CAPI interview using a paper interview instrument, a record
keeping worksheet and a calculator, rather than incurring the time and costs of developing an actual
CAPI instrument in this early stage of testing and development of a revised ASEC instrument.
Thus there are some limitations in the assessment of the benefits of increased automation, but the
‘simulation’ approach provided useful information regarding the effect of the structural changes
on respondent reporting more generally.

Structural Changes
In the cognitive testing environment, we relied on the detailed cognitive probes to identify any
missed sources of income or misclassification of income type that might result from the changes to
the structure of the interview, in particular the collection of all sources first, before amounts, and the
tailored order of presentation of income sources. With the exception of a few irregular, or one-time
small payments for odd-jobs ($40 total) and a missed gift card ($20 total), we did not find evidence
of missed sources of income within the revised structure of the interview. Nor did we observe true
misclassification of income types after implementing the Round Two changes. However, with very
small numbers and the absence of any truth source, a field test comparing the source first approach
to the current ASEC approach is needed to more fully speak to the impact on data quality.
The cognitive interviewing did suggest that the dual-pass approach through the income types – first identifying all sources of income, and then collecting amounts – may provide an additional memory cue to respondents in thinking about their income and can possibly improve recall and reporting. The collection of dietary intake information uses this same multi-pass approach based on data quality and validation studies (see Thompson and Subar, 2008, for a review of dietary data collection methodology). Similarly, the redesign of the National Crime Victimization Survey in 1994 also included a multi-cue, or multi-pass, approach to collecting victimization data, based on studies indicating improved recall under this design (Rand and Taylor, 1995). The next stage of testing should retain the source first collection, followed by amounts collection, for further evaluation.
The cognitive interviews did not reveal any missed income reporting with the tailored ordering of the presentation of income types, nor was there evidence of confusion due to the reordering and the resulting non-standardized context of the income reporting. Screening information gathered during respondent recruitment determined the order of presentation for these interviews. However, the ASEC will have relevant information available to determine an appropriate order of presentation (age, employment status, household size, presence of children, etc.) from either prior waves of data collection or from earlier in the current interview. The next stage of testing should incorporate an evaluation of the algorithm used to determine the order of presentation based on available ASEC data, since the one-time interview approach used in cognitive testing could not.
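To give a sense of what such an ordering algorithm could look like, the sketch below promotes likely income sources toward the front of a default sequence based on a few household characteristics. The rules, thresholds, and source list are placeholders we invented for illustration; they are not the ordering logic used in these interviews.

# Hypothetical ordering rule: start from a default sequence of income sources
# and promote those most likely to apply, given characteristics known from
# prior waves or from earlier in the interview.

DEFAULT_ORDER = [
    "earnings", "unemployment compensation", "TANF", "food stamps",
    "disability", "Social Security", "SSI", "retirement income", "asset income",
]

def tailored_order(age, employed, has_children, low_income):
    order = list(DEFAULT_ORDER)

    def promote(source):
        order.insert(0, order.pop(order.index(source)))

    if employed:
        promote("earnings")
    if low_income and has_children:
        promote("food stamps")
        promote("TANF")
    if age >= 62:
        promote("Social Security")
        promote("retirement income")
    return order

# A retired 68-year-old: retirement income and Social Security move to the front.
print(tailored_order(age=68, employed=False, has_children=False, low_income=False))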
Cognitive testing also evaluated the inclusion of bracketed ranges of amounts, customized to each income type and geographic area, as follow-up questions to “don’t know” or “refusal” responses. In both rounds, the interviews resulted in a reduction of item non-response, with respondents able to select at least one level of brackets as a response. Within these interviews, the follow-up brackets were most often used in reporting of asset income, which reflects the very high level of item nonresponse for interest and dividend reporting currently in the ASEC data. There are no apparent disadvantages of this from a respondent reporting perspective, other than a possible perception of increased burden. However, in this testing, the need for follow-up brackets occurred infrequently within a given interview, so the actual impact on burden for any one respondent is likely low. We suggest including these customized bracket follow-up questions in the next stage of testing to better assess the burden on an individual reporting unit. In addition, future testing should incorporate an assessment of how these bracketed data will affect current imputation/assignment procedures in the ASEC.
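A small sketch of how such customized bracket sets might be stored and retrieved appears below. The income types, regions, and dollar ranges are invented placeholders, and the cascade that would consume these brackets is the same kind of logic sketched earlier in this report.

# Hypothetical lookup of bracket sets keyed by income type and geography,
# consulted only when an amount is reported as "don't know" or refused.

BRACKET_TABLE = {
    ("interest", "Midwest"):  [["under $1,000", "$1,000 or more"],
                               ["under $100", "$100 to $499", "$500 to $999"]],
    ("earnings", "Midwest"):  [["under $25,000", "$25,000 to $49,999", "$50,000 or more"]],
}
DEFAULT_BRACKETS = [["under $1,000", "$1,000 or more"]]

def brackets_for(income_type, region):
    """Return the customized bracket levels, falling back to a generic set."""
    return BRACKET_TABLE.get((income_type, region), DEFAULT_BRACKETS)

print(brackets_for("interest", "Midwest"))   # customized two-level set
print(brackets_for("dividends", "South"))    # falls back to the generic set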
Each of the structural changes noted above reflects some use of data already known about the household or the respondent, each a variation on dependent interviewing methods. Mathiowetz and McGonagle (2000) provide an overview of different methods of dependent interviewing and discuss the benefits and limitations of each. Across waves, we’re suggesting using information known about a respondent or household to determine the most appropriate order of presentation. Within an interview, we’re suggesting improving the performance and effectiveness of the source first approach by using the identification of the detailed sub-source of income as a fill when collecting the amount information for that source. In addition, while bracketed ranges can be predefined based
on geography, the instrument can also make use of employment information reported earlier in the
interview, or possibly in prior waves to refine any follow-up brackets used in earned income
reporting. Based on the cognitive interviewing done to date, we believe these methods will result
in data quality gains, specifically in terms of reduced under-reporting and lower levels of missing
data. However, implementing these types of changes will require a fairly rigorous testing and
development cycle for the CPS-ASEC which we realize has timing and budget implications that the
Census Bureau must also consider.


Reporting of Means Tested Program Participation
The findings reported by the Urban Institute strongly suggest that some public assistance and food
stamp receipt is not collected by ASEC due to the Family Income Screener, which serves to skip
many households past the relevant questions. The questions asking about participation in these
programs pose relatively little burden for respondents, and thus the Census Bureau should consider
asking them in all households. With respect to public assistance (TANF), we believe further testing
should be done with a follow-up question (like the one we used in Round 2) that collects more detailed
information on the type of benefit, or the program that provided this benefit. Public assistance
comes in different forms and has many different names, and thus it would be useful for the ASEC
instrument to capture these cues for use later in the interview when collecting the amount of the
benefits received. Also consider referring to “general cash assistance” (in Q59A88 of the
instrument) when asking if anyone in the household received this type of assistance, and/or adding a
follow-up item (for those reporting receipt) to determine if the assistance was provided for a specific
expense, such as rent or utilities – ideally these examples should be tailored so as to be consistent
with what a given state provides low income households. Finally, continue examining the potential
advantages and disadvantages of allowing the respondent to report TANF amounts on a per child
basis, or as a combined amount – that is, letting the respondent choose whichever method is easiest.
The questions about receipt of SNAP (food stamp) benefits appeared to be readily understood and
answered by our participants, and we would not suggest major revisions to these questions.
However, the Census Bureau should consider rephrasing ASEC Q87 to “…get food stamps or use a
food stamp benefit card during [year]?” since the card used by program recipients may have been
obtained prior to the reference year.

Disability, SSDI and SSI Reporting
We believe the ASEC instrument’s “flow” would be improved for households with a disabled person if the sections on disability, Social Security, and Supplemental Security Income are rearranged and asked consecutively, a strategy we tested in Round 2. This change should reduce confusion and facilitate classifying disability income as SSDI, SSI, or other forms of disability pay. Additional ideas
that deserve further testing by the Census Bureau include:

- Revising the disability screener question (Q59A) so that it refers to the ASEC reference year, rather than the present health/disability status of household members;
- Asking explicit follow-up items to determine specific sources of disability income, rather than asking for the source in an open-ended fashion as the current ASEC does;
- Using the different direct-deposit dates of SSDI and SSI as cues to help respondents distinguish the two programs;
- For persons reported to have received SSDI or SSI for only part of the reference year, asking about the receipt of back-payments that may have been provided to cover the delay between the establishment of someone’s eligibility for disability benefits and the processing of payments.


Retirement and Asset Reporting
We believe the strategy that we tested in Round 2 for collecting information on retirement assets,
interest, and dividends is a good starting point for the Census Bureau to consider. Specifically:

- Determine specific asset types held by persons in the household, both within and outside of retirement accounts, and target these specific accounts separately when asking about interest and dividend income. Generally speaking, respondents should find it easier to consider and report on one account at a time, rather than trying to report on the basis of multiple accounts. However, ASEC could easily be designed to accommodate respondents who do prefer to report for multiple accounts at once;
- Maintain the focus on collecting amounts of interest and dividends earned, and ask for the value of the assets that generate these forms of income only when respondents are unable to provide these amounts;
- Directly ask respondents if money was withdrawn (distributions taken) from retirement accounts. The ASEC instrument does not currently do this. Respondents will occasionally need to be instructed that loans from retirement accounts should not be counted as withdrawals;
- When asking for the amount of pension income, ASEC should specify that it is the amount “before taxes and other deductions” that is desired.

References
Kreuter, F., McCulloch, S., Presser, S., and Tourangeau, R. (2011). “The Effects of Asking Filter Questions in Interleafed Versus Grouped Format.” Sociological Methods and Research, 40(1), 88. http://smr.sagepub.com/content/40/1/88
Mathiowetz, N., and McGonagle, K. (2000). “An Assessment of the Current State of Dependent Interviewing in Household Surveys.” Journal of Official Statistics, 16(4), 401-418.
Rand, M., and Taylor, B. (1995). “The National Crime Victimization Survey Redesign: New Understandings of Victimization Dynamics and Measurement.” Paper presented at the annual meetings of the American Statistical Association, Orlando, FL, August 13-17, 1995.
Thompson, F.E., and Subar, A.F. (2008). “Dietary Assessment Methodology.” In Coulston, A.M., and Boushey, C.J. (eds.), Nutrition in the Prevention and Treatment of Disease, 2nd ed. San Diego: Academic Press.
