Usability Study of the online ACS


Generic Clearance for Questionnaire Pretesting Research


OMB: 0607-0725


The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We will be conducting a usability study of the online American Community Survey (ACS) with a focus on participant interactions while using a mobile device (i.e., either a smartphone or a tablet). This is a follow-up to the previous study, conducted in late 2014 and early 2015, for which we received final approval on November 18, 2014. The previous study tested the production, non-mobile-friendly ACS instrument on mobile devices and documented the difficulties respondents had using those devices to complete the ACS. Since then, the ACS has been redesigned to be mobile-friendly.


The goal of the test is to identify how respondents use mobile devices to respond to the ACS now that it has been re-skinned with a “mobile-friendly” design. We will identify positive and negative aspects of respondents’ interactions when they complete the survey on a mobile device. Based on the usability test results, the usability team will identify the screens that are most problematic for users to respond to in a mobile environment, along with suggestions on how to improve those screens for the mobile experience. Specific measures that we will track and code include whether:

    • Participants engage in an excessive amount of manipulation (e.g., pinching and zooming) to answer the question

    • Participants have to scroll horizontally

    • Participants cannot identify the survey task, particularly on screens with grayed text

    • Participants appear to mis-click or change their answer

    • Participants make oral comments, both negative and positive, or ask questions

The testing will be conducted from June 1, 2015, through June 23, 2015. Testing will take place either in the usability lab at the U.S. Census Bureau or, in what we call remote testing, at locations more convenient for participants, such as local community centers or libraries. Fourteen sessions will be conducted using a smartphone and seven using a tablet, for a total of 21 sessions.


Participants will be required to bring their own device to the usability test. Participants will be recruited from the general public, and about half will be from multi-person households. In addition, we will attempt to recruit about one-third of the participants from the 55-and-above age range. All participants will have at least six months of experience using their mobile device to access the Internet and will use it at least three times a week to search for information online. As in Round 1 of usability testing of the ACS on mobile devices, and because of the longer 90-minute duration of the usability session, participants will be compensated $60.00 for their participation.


Participants will first complete an electronic questionnaire about their mobile device use and demographic characteristics. Each participant will then be asked to complete the online ACS using their own smartphone or tablet. After the survey, each participant will answer a set of satisfaction questions, followed by debriefing questions.


The following materials will be used in the study and are attached with this letter:

  • Enc. 1: Background questionnaire (To gain an understanding of users’ mobile device use and Internet experience)

  • Enc. 2: Demographic questionnaire

  • Enc. 3: Satisfaction questionnaire

  • Enc. 4: Debriefing questions (Allows for a back-and-forth conversation between the test administrator and the user about topics related to the application that were not yet covered)

  • Enc. 5: Protocol used for the study (To see how the research study will be run)

  • Enc. 6: Screenshots of the online American Community Survey – Note: a handful of the screens have been “re-skinned” with the “mobile-friendly” design, and screenshots of those screens are also attached.


We estimate that participants will spend 90 minutes on average taking the study, including time spent on the demographic and satisfaction questions, the online survey, and the debriefing, for a total of 32 hours. In general, it takes Census Bureau staff three screener conversations to recruit one participant, and each screener conversation lasts approximately five minutes. We estimate it will take 6 hours to screen and recruit 21 respondents. Thus, the total estimated respondent burden for this study is 38 hours.
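
For reference, and assuming each subtotal is rounded up to the nearest whole hour, the burden arithmetic works out as follows:

  21 sessions × 90 minutes = 1,890 minutes ≈ 31.5 hours, reported as 32 hours
  21 participants × 3 screener conversations × 5 minutes = 315 minutes ≈ 5.25 hours, reported as 6 hours
  32 hours + 6 hours = 38 hours total estimated respondent burden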


The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Erica L. Olmsted-Hawala

Center for Survey Measurement

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-4893

[email protected]

