April 18, 2012
2012 AMERICAN COMMUNITY SURVEY RESEARCH AND EVALUATION REPORT MEMORANDUM SERIES
#ACS12-RER-18
DSSD 2012 AMERICAN COMMUNITY SURVEY MEMORANDUM SERIES
#ACS12-MP-2
MEMORANDUM FOR
ACS Research and Evaluation Steering Committee
From:
David C. Whitford /Signed/
Chief, Decennial Statistical Studies Division
Prepared by:
Jennifer Tancreto
Chief, American Community Survey Data Collection Methods Branch
Decennial Statistical Studies Division
Subject:
Design of the American Community Survey Internet Instrument
Attached is the American Community Survey Research and Evaluation report “Design of the American
Community Survey Internet Instrument.” This report details the design considerations in developing
the American Community Survey Internet instrument used in both 2011 American Community Survey
Internet tests. This Internet instrument will also be used in American Community Survey production
(with minor changes) starting in January 2013.
If you have any questions about this report, please contact Jennifer Tancreto at 301-763-4250.
Attachment
cc:
ACS Research and Evaluation Team
Debbie Griffin
(ACSO)
Todd Hughes
Andrew Roberts
Brian Wilson
American Community Survey Research and Evaluation Program
April 18, 2012
Design of the American
Community Survey Internet
Instrument
Final Report
Jennifer Guarino Tancreto, Mary Davis, Mary Frances Zelenak
Decennial Statistical Studies Division
Table of Contents
1. Introduction
2. Global Instrument Design
   2.1 Navigation
   2.2 Access/Security
   2.3 Web Survey Appearance
   2.4 Web Survey Features
   2.5 Unfolding Design
   2.6 Auto Calculations
   2.7 Errors
   2.8 Review & Edit
3. Paradata
Acknowledgements
References
List of Figures
Figure 1. Pick the Next Person Screen
Figure 2. Welcome/Login Screen
Figure 3. Example Question
Figure 4. Unfolding Design with Error Message
Figure 5. Review and Edit Screen
1. Introduction
The American Community Survey (ACS) is an ongoing survey designed to provide communities with reliable and timely demographic, social, economic, and housing data every year. The ACS collects data in
every U.S. county and Puerto Rico and has an annual sample of about three million addresses allocated
into twelve monthly samples of approximately 250,000 addresses each.
Currently, the ACS collects data using three sequential modes: mail, followed by nonresponse follow-up by telephone, and finally personal visit.1 The 2011 ACS Internet Test was designed to evaluate the
feasibility of introducing a web response mode during the mail data collection phase. The main objective
was to determine the best way to present the Internet mode in the mailings to maximize response.
The Census Bureau tested “choice” and “push” strategies for notifying sampled units about the Internet
mode. The choice strategy allowed households to choose whether to respond by mail or on the web. Under the
choice strategy, the Census Bureau tested two approaches – a prominent versus a subtle choice. In the
prominent choice, the web option was noticeably advertised in all mailings as an alternative to the mail
questionnaire. Under the subtle choice design, the web option appeared only on the questionnaire in an
inconspicuous place. The motivation for testing a subtle choice was to combat the potential response
decrease that has occurred in previous studies when web is presented as a response mode choice (Griffin et al., 2001; Smyth et al., 2010; Gentry et al., 2008; Lesser, 2010), while still providing the option for
those who are specifically looking for it.
The push strategy directed households to respond by Internet first; if they did not respond online within a few weeks, they received a paper questionnaire in a later mailing.
While the push strategy has not proved effective in previous studies (Bentley et al., 2006; Brady et al.,
2004), it seemed important to retest because of its potential cost savings. If successful in maintaining or increasing response, this strategy could reduce costs associated with printing, postage, and data capture, as well as nonresponse follow-up costs.
Based on past research, we expected the likelihood of using the Internet to differ by the characteristics
of the housing units (Lugtig et al., 2010; Guarino, 2001; Couper, 2000). Therefore, the ACS test included
a stratification of the sample population into targeted and not targeted segments. The targeted segment
consisted of geographic areas containing households that we expected to use the Internet at a higher rate.
In general, these areas have a large proportion of residents who are either highly educated, married homeowners living in single-unit houses, or single renters with higher-than-average education living in urban multi-unit buildings. The people residing in the not targeted areas are generally as racially diverse as or more diverse than the national average, have the same or less education than the national average, and have the same or lower income than the national average (Bates, 2007).
2. Global Instrument Design
Given that the Internet mode was being offered to the entire population, the intent of our design was to
enable even novice Internet users to easily respond to the ACS online. As a starting point, we reviewed
research from previous Internet studies and consulted external web design experts. Our design was heavily based on the design principles in Couper (2008). Given the complexities and length of the
ACS, we consulted other studies as well as external experts to advise on the instrument navigation,
appearance, and content.
1 Mail and telephone nonrespondents are subsampled prior to inclusion in personal visit follow-up.
2.1 Navigation
The ACS instrument was designed for linear navigation through the survey questionnaire, similar to the
mail, Computer Assisted Telephone Interview (CATI) and Computer Assisted Personal Interview (CAPI)
modes. Users can move forward through the screens using the "Next" button, and return to a previously
visited page in a linear fashion by selecting the "Previous" button until the desired page is shown. We
chose linear navigation for this test to simplify instrument development. In future iterations, we want to research a more flexible approach, recognizing the potential complexity introduced by the length of the survey and the skip patterns in the questionnaire.
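To make the linear navigation concrete, the sketch below models the survey as an ordered list of screens, where "Next" advances past any screens excluded by skip patterns and "Previous" steps back through the screens already visited. This is an illustrative sketch only; the names (Screen, LinearNavigator, applies) are assumptions and do not describe the production instrument.

```typescript
// Illustrative sketch only -- not the production instrument's code.
// "Next" advances to the first applicable screen; "Previous" steps back
// through the screens the user has already visited.

interface Screen {
  id: string;
  // Returns false when a skip pattern makes this screen irrelevant.
  applies: (responses: Map<string, string>) => boolean;
}

class LinearNavigator {
  private history: number[] = [];   // indexes of visited screens
  private current = 0;

  constructor(private screens: Screen[], private responses: Map<string, string>) {}

  next(): Screen | undefined {
    this.history.push(this.current);
    // Skip forward past screens whose skip pattern excludes them.
    let i = this.current + 1;
    while (i < this.screens.length && !this.screens[i].applies(this.responses)) {
      i++;
    }
    this.current = i;
    return this.screens[i];
  }

  previous(): Screen | undefined {
    const prev = this.history.pop();
    if (prev === undefined) return undefined;
    this.current = prev;
    return this.screens[prev];
  }
}
```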
We consulted the research literature when considering whether to use a person-based survey design
(asking questions for one person in the household at a time) or a topic-based design (asking about the
entire household for each question topic at one time). The 2005 National Census Test compared designs
for the Census short form and found mixed results (Zajac et al., 2007). The person-based design had a higher rate of break-offs, meaning that data were incomplete or missing for some people in the household. The topic-based design resulted in higher item nonresponse rates for most items. Given these findings, as well
as the fact that completion times were comparable between the designs, this study did not provide clear
direction. Thus, we decided to emulate our current design for the CATI and CAPI modes, where we use a
topic-based approach for the basic demographic questions at the front of the survey (relationship, age,
date of birth, sex, Hispanic Origin, and race) and a person-based design for the detailed person questions
(for example, educational attainment, employment, and income). A set of housing questions sits between the topic-based and person-based sections to help ease the transition. One reason for the difference in
design between the two sections is that the detailed questions contain skip patterns, so a particular
question may not be relevant for all household members. Secondly, a topic-based design may be
particularly difficult for unrelated households, as the respondent may not know the answers for
roommates. Moreover, there may be privacy issues with asking personal questions, such as income, in
this manner.
One flexibility that the instrument provides is the ability for the user to choose the next household
member for whom they want to answer the detailed person questions (see Figure 1). This feature will
likely be most helpful in situations where there are unrelated household members.
Figure 1. Pick the Next Person Screen
2.2 Access/Security
The Census Bureau, like all federal agencies, has strict Information Technology (IT) Security policies
designed to protect the privacy and confidentiality of respondents. The challenge for this test was to find
a way to meet the security requirements in a manner that was also user-friendly. Working with our
internal IT security office, we developed plans for allowing users access to the instrument, as well as
plans to let users re-enter the instrument as necessary to complete the survey.
Users could access the web survey using a ten-digit access code, which was provided to the selected
housing unit on the mail label of the survey request. The system then generated a four-digit Personal
Identification Number (PIN) that was required to re-enter the survey at a later time. Users could leave the
survey at any time by using the “Save & Logout” feature or by manually exiting the browser. Also, users
were automatically logged out if they left the survey idle for 15 minutes. To re-enter the instrument, users
were required to enter their ten-digit User ID and the four-digit PIN they were given during their initial
session. Lost or forgotten PINs could not be reset for security purposes. After five unsuccessful reentry
attempts, users were locked out for 15 minutes. When users logged back in, they returned to the point in the survey where they had left off. Access to the instrument was disabled once a respondent submitted the
survey.
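The re-entry rules described above (ten-digit access code, four-digit PIN, a five-attempt limit followed by a 15-minute lockout, and resumption at the last screen visited) can be summarized in a short sketch. The function and field names below are assumptions for illustration; this is not the Census Bureau's implementation.

```typescript
// Illustrative sketch of the re-entry rules described in the text.
// Identifiers and storage structure are assumptions.

interface Session {
  accessCode: string;        // ten-digit code from the mailing label
  pin: string;               // four-digit PIN generated at first login
  lastScreenId: string;      // where the respondent left off
  failedAttempts: number;
  lockedUntil?: number;      // epoch ms; set after five failed attempts
  submitted: boolean;
}

const LOCKOUT_MS = 15 * 60 * 1000;   // 15-minute lockout (matches the idle timeout)

function attemptReentry(s: Session, accessCode: string, pin: string, now: number):
    { ok: boolean; resumeAt?: string; reason?: string } {
  if (s.submitted) return { ok: false, reason: "survey already submitted" };
  if (s.lockedUntil !== undefined && now < s.lockedUntil) {
    return { ok: false, reason: "locked out" };
  }

  if (accessCode === s.accessCode && pin === s.pin) {
    s.failedAttempts = 0;
    return { ok: true, resumeAt: s.lastScreenId };   // resume where the user left off
  }

  s.failedAttempts += 1;
  if (s.failedAttempts >= 5) {
    s.failedAttempts = 0;
    s.lockedUntil = now + LOCKOUT_MS;                // lock for 15 minutes
  }
  return { ok: false, reason: "invalid credentials" };
}
```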
Figure 2. Welcome/Login Screen
2.3 Web Survey Appearance
The web survey maintains the look and feel of the ACS mailing pieces, including the paper questionnaire
and associated brochures. The screen background is the same light green color as the mail questionnaire,
and the banner image was taken from a brochure that is mailed to sampled addresses (see Figure 3).
Similar to the mail questionnaire, questions were bolded and response options were not. Instructions and
examples were italicized. The questions on the web survey were numbered to map to the mail
questionnaire in case respondents wanted to follow along.
The majority of the instrument displays one question per screen to facilitate skip patterns and to keep page content short enough to avoid horizontal and vertical scrolling whenever possible. Some screens contain two questions when the topics are related. The majority of the questions use radio buttons and check boxes for responses, with text boxes for write-in fields where necessary. Question response labels that follow radio buttons and check boxes are clickable to provide users with a larger area to select an answer. For write-in fields that accept only a fixed number of characters (e.g., date of birth and telephone number), we used auto-tabbing to advance the cursor automatically. We avoided drop-down menus in most cases due to the potential for misuse (Dillman, 2000).
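A minimal sketch of the auto-tabbing behavior is shown below, assuming fixed-length input fields such as the date of birth boxes; the element IDs are hypothetical and the code is illustrative only, not the survey's actual implementation.

```typescript
// Illustrative sketch: advance the cursor to the next field once the
// current fixed-length field is full.

function enableAutoTab(fields: HTMLInputElement[]): void {
  fields.forEach((field, i) => {
    field.addEventListener("input", () => {
      // When the field reaches its maximum length, focus the next field.
      if (field.value.length >= field.maxLength && i + 1 < fields.length) {
        fields[i + 1].focus();
      }
    });
  });
}

// Hypothetical usage with assumed element IDs:
// enableAutoTab([
//   document.getElementById("dob-month") as HTMLInputElement,
//   document.getElementById("dob-day") as HTMLInputElement,
//   document.getElementById("dob-year") as HTMLInputElement,
// ]);
```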
Figure 3. Example Question
2.4 Web Survey Features
The screens provide several helpful links. First, Frequently Asked Questions (FAQs) and instructions for
completing the survey are accessible on every screen. These links contain general information about the
ACS as well as tips for using the web survey. In addition, privacy, security, and accessibility information
appear in links on the bottom right of every screen (as seen in Figure 3). These required links provide
standard information about the Census Bureau’s privacy, security, and accessibility policies.
Furthermore, topic-specific help is provided by a link immediately following the question, where
applicable. The help link generates a new window (rather than a pop-up that can be blocked) that is
overlaid on the screen such that the user can still see the web survey. Each time “Help” is clicked, the
help window refreshes with the information specific to the particular question. The navigation buttons
(Next and Previous) follow the user’s line of sight from the response options. Clicking the “Next” or
“Previous” button also saves the user’s response to that question.
We provide a specific “Save & Logout” feature to allow users to exit the survey if needed. We placed the link on the right-hand side of the top menu bar to discourage its selection, since we prefer that users complete the survey in one session in case they forget to return. There are three ways a user can leave the survey: clicking “Save & Logout”, closing the browser, or being timed out after the survey is left idle for 15 minutes. Regardless of how the user exits the survey, no data are lost because responses are saved every time the user navigates away from a screen.
We also provide a general progress indicator labeled "Where You Are" to inform the user of their location
in the survey. Because the time required to complete the ACS varies greatly depending on the age and characteristics of household members, it is very difficult to accurately present a traditional completeness indicator. Yet, given the length of the questionnaire and the risk of a user feeling lost in the
instrument, we felt we needed to provide a general measure to allow users to gauge their progress in the
survey. Our progress indicator is displayed on the right-hand side of the screen, and highlights the section
of the instrument in which the user is located at any given time while graying out completed sections.
The indicator is not interactive; that is, respondents cannot click on the various sections to navigate within
the instrument.
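The sketch below illustrates one way such a non-interactive, section-level indicator could be rendered; the section labels and class names are assumptions for illustration, not the instrument's actual markup.

```typescript
// Illustrative sketch of a display-only "Where You Are" indicator.
// Section labels below are assumed, not the instrument's actual sections.

const SECTIONS = ["Basic Information", "Housing", "Detailed Person Questions", "Review & Edit"];

// Highlights the current section and grays out completed sections.
// No click handlers are attached: the indicator is not interactive.
function renderProgress(container: HTMLElement, currentIndex: number): void {
  container.innerHTML = "";
  SECTIONS.forEach((name, i) => {
    const item = document.createElement("div");
    item.textContent = name;
    item.className = i < currentIndex ? "completed"   // grayed out
                   : i === currentIndex ? "current"   // highlighted
                   : "upcoming";
    container.appendChild(item);
  });
}
```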
2.5 Unfolding Design
Some questions do not make sense without the context of the preceding question. In these instances, the preceding question appears above the active question, but its text is gray, indicating to the user that it is shown for reference only (see Figure 4). To change the answer to the preceding
question, the user has to navigate back to the previous screen. Usability testing uncovered the need for
the gray-out feature in the unfolding design. Test results suggested that users were revisiting the
preceding question when it was intended for context only.
2.6 Auto Calculations
The ACS web instrument takes advantage of automation when computations are required. First, age is computed directly when the user provides a date of birth. Second, after the user enters income from eight different sources, the instrument sums the amounts to provide a total income for each person for verification. In both places, the user can overwrite the automatic calculation.
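The two calculations can be expressed in a few lines. The sketch below is illustrative only; the handling of the reference date for the age calculation is an assumption, not a description of the instrument's internal logic.

```typescript
// Illustrative sketch of the two automatic calculations described above.

// Age is derived from the reported date of birth as of a reference date
// (the choice of reference date here is an assumption).
function computeAge(dateOfBirth: Date, asOf: Date): number {
  let age = asOf.getFullYear() - dateOfBirth.getFullYear();
  const birthdayNotYetReached =
    asOf.getMonth() < dateOfBirth.getMonth() ||
    (asOf.getMonth() === dateOfBirth.getMonth() && asOf.getDate() < dateOfBirth.getDate());
  if (birthdayNotYetReached) age -= 1;
  return age;
}

// Total income is the sum of the eight source-specific amounts; the
// respondent is shown the total for verification and may overwrite it.
function computeTotalIncome(sourceAmounts: number[]): number {
  return sourceAmounts.reduce((sum, amount) => sum + amount, 0);
}
```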
2.7 Errors
Survey questions deemed critical to the ACS or critical to the instrument path are subject to soft edits
when left blank. The instrument also provides error messages in some cases when inconsistent responses or out-of-range values are entered. In the event of an error or a blank response, the instrument renders a message directly above the question indicating that there is a problem with the information entered. In addition, any write-in fields associated with that error are highlighted in yellow to draw attention to the source of the problem (see Figure 4). The user is given the opportunity to change the response and continue. If the user does not change the problem response, he or she can bypass the error
and continue in the instrument.
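The sketch below illustrates the soft-edit pattern for a single hypothetical write-in item (hours worked): a blank or out-of-range value produces a message and field highlighting, but never blocks the respondent from continuing. The field name and valid range are assumptions for illustration.

```typescript
// Illustrative sketch of a soft edit. Field names and the range check are
// assumptions; the pattern (message + highlight, never blocking) follows the text.

interface EditResult {
  message?: string;          // shown directly above the question
  highlightFields: string[]; // write-in fields to highlight in yellow
  blocking: boolean;         // always false: soft edits can be bypassed
}

function softEditHoursWorked(value: string): EditResult {
  if (value.trim() === "") {
    return {
      message: "Please provide an answer to this question.",
      highlightFields: ["hours-worked"],
      blocking: false,
    };
  }
  const hours = Number(value);
  if (!Number.isFinite(hours) || hours < 0 || hours > 99) {
    return {
      message: "Please check the value entered.",
      highlightFields: ["hours-worked"],
      blocking: false,
    };
  }
  return { highlightFields: [], blocking: false };   // no problem found
}
```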
Figure 4. Unfolding Design with Error Message
2.8 Review & Edit
After the user has seen all of the survey questions for each household member, he or she has the option of
reviewing all of the answers provided. The user can also choose to submit without reviewing. For those who choose to review, responses are displayed in a table with abbreviated question labels on the left
and their responses on the right. If the user left an item blank, the review screen displays "[BLANK]" in
bold font, all capitalized, in brackets (see Figure 5).
The responses on the review screen are hyperlinks that take the user to the screen containing the question
they wish to review or edit. The user can change the response or simply review the question and answer.
When they are done, the only navigation button on that screen returns them to the Review & Edit screen. If the user changes a response in a way that elicits a different instrument path than the original response, the review screen requests a response to the next question on that path. The review screen continues to request answers to the questions on the new path, one by one, until no new questions remain.
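The sketch below illustrates how the review rows could be assembled, including the "[BLANK]" marker for unanswered items and the screen identifier backing each hyperlink; the data structures are assumptions for illustration only.

```typescript
// Illustrative sketch of building Review & Edit rows: abbreviated label on the
// left, hyperlinked response text (or "[BLANK]") on the right.

interface ReviewRow {
  label: string;       // abbreviated question label (left column)
  display: string;     // response text or "[BLANK]" (right column, hyperlinked)
  screenId: string;    // screen the hyperlink takes the user to
}

function buildReviewRows(
  questions: { id: string; label: string; screenId: string }[],
  responses: Map<string, string>
): ReviewRow[] {
  return questions.map((q) => {
    const answer = responses.get(q.id);
    return {
      label: q.label,
      display: answer && answer.trim() !== "" ? answer : "[BLANK]",
      screenId: q.screenId,
    };
  });
}
```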
Figure 5. Review and Edit Screen
3. Paradata
Paradata have been defined as data or measurements about the survey process, not including the response
data. In Internet surveys, paradata include the location of break-offs, changed answers, error messages,
mouse clicks, quantity of help requests, and response times, among other things. These paradata can be
used to identify potential problems with the survey instrument from question wording to design.
Additionally, they can help researchers understand the process the respondent takes to complete the
survey, which can in turn help them identify ways to make the survey task less burdensome for
respondents.
The ACS web survey was designed to collect click data that track the user’s mouse clicks on each screen, as well as time stamps associated with each click. Using this information, we plan to study behavior related to the user’s interaction with the instrument to assess instrument design. That is, we will look at break-offs, the impact of error messages, answer changes, number of sessions, navigation, use of instrument features, and response time. In addition to evaluating the instrument design, this information will guide
the changes we make to the instrument for the next iteration of Internet testing.
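A minimal sketch of client-side click capture with time stamps appears below; the event structure and logging approach are assumptions for illustration and do not describe the ACS paradata system.

```typescript
// Illustrative sketch of capturing click paradata on a screen.

interface ClickRecord {
  screenId: string;
  targetId: string;     // id (or tag name) of the element that was clicked
  timestamp: string;    // ISO 8601 time stamp associated with the click
}

const clickLog: ClickRecord[] = [];

function recordClicks(screenId: string): void {
  document.addEventListener("click", (e) => {
    const target = e.target as HTMLElement;
    clickLog.push({
      screenId,
      targetId: target.id || target.tagName,
      timestamp: new Date().toISOString(),
    });
  });
}

// The accumulated log could later be analyzed for break-offs, answer changes,
// error-message exposure, navigation, and response times.
```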
Acknowledgements
We would like to thank the following people for their valuable contributions and assistance to the
development of this project: Michelle Ruiter, Megha Joshipura, Rachel Horwitz, Debbie Klein, Andrew
Roberts, Brian Wilson, Todd Hughes, Tony Tersine, John Studds, Chris Butler, Kathy Ashenfelter,
Temika Holland, and Victor Quach.
References
Bates, N., and Mulry, M., (2008), “Segmenting the Population for the 2010 Census Integrated
Communications Program,” C2PO 2010 Census Integrated Communications Research Memoranda Series
#1, October 24, 2008. http://2010.census.gov/partners/pdf/C2POMemoNo_1_10-24-08.pdf
Bentley, M., and Tancreto, J., (2006), “2005 National Census Test: Self-Response Options Analysis,”
2010 Census Test Memorandum Series: 2005 National Census Test, No. 26, U.S. Census Bureau.
Brady, S., Stapleton, C., and Bouffard, J., (2004), “2003 National Census Test: Response Mode
Analysis,” DSSD 2003 Memorandum Series #B-02, U.S. Census Bureau.
Couper, M., (2000), “Web Surveys: A Review of Issues and Approaches,” Public Opinion Quarterly,
Vol. 64, No. 4, pp. 464-494.
Couper, M., (2008), Designing Effective Web Surveys, Cambridge University Press, New York, NY.
Couper, M., and Miller, P., (2008), “Web Survey Methods: Introduction to the Special Issue of POQ on Web Survey Methods,” Public Opinion Quarterly, Vol. 72, No. 5, pp. 831-835. http://poq.oxfordjournals.org/content/vol72/issue5/#ARTICLES
Dillman, D., (2000) Mail and Internet Surveys: The Tailored Design Method, John Wiley & Sons, Inc,
New York, NY.
Gentry, R. and Good, C. (2008), "Offering Respondents a Choice of Survey Mode: Use Patterns of an
Internet Response Option in a Mail Survey." Presentation at the Annual Conference of the American
Association for Public Opinion Research, May 15-18.
Griffin, D., Fischer, D., and Morgan, M. (2001), “Testing an Internet Response Option for the American
Community Survey,” Paper Presented at the Annual Conference of the American Association for Public
Opinion Research, May 17-20.
Guarino, J. (2001), “Assessing the Impact of Differential Incentives and Alternative Data Collection
Modes on Census Response,” Census 2000 Testing and Experimentation Program, September 6, 2001.
Lesser, V. (2010), “Does Providing a Choice of Survey Modes Influence Response?” Paper Presented at
the Annual Conference of the American Association for Public Opinion Research, May 13-16.
Lugtig, P., Lensvelt-Mulders, G., Frerichs, R., and Greve, A. (2010), “Estimating Nonresponse Bias and
Mode Effects in a Mixed-mode Survey,” Paper presented at the Annual Conference of the American
Association for Public Opinion Research, May 13-16.
Smyth, J., Dillman, D., Christian L., and O'Neill, A. (in press), "Using the Internet to Survey Small
Towns and Communities: Limitations and Possibilities in the Early 21st Century," American Behavioral
Scientist.
Zajac, K., Allmang, K., and Barth, J., (2007), “2005 National Census Test: Response Mode Analysis,”
2010 Census Test Memorandum Series: 2005 National Census Test, No. 28, U.S. Census Bureau.