The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We will be conducting research using 20 questions from the 2011 Internet Test version of the American Community Survey (ACS) and the 2011 ACS Content Test. The purpose of this research is to find a way to identify confused Internet respondents. Specifically, cognitive interviews will be used to identify particular mouse cursor movements that are associated with difficulty answering a question. Tracking eye movements has been used for this purpose in laboratory settings. However, due to cost and feasibility constraints, eye tracking cannot be used in the field for production surveys. In the human-computer interaction literature, researchers and Web designers have successfully tracked mouse movements as a proxy for eye movements because mouse tracking is much cheaper and the data can be collected in real time for all users. Since mouse tracking has not been used in a survey environment, this study proposes concurrently analyzing eye and mouse movements to help explain what the different mouse movements mean. By using these two data sources together, we can gain a better understanding of which mouse movements indicate confusion, which will provide the knowledge needed to collect and analyze mouse movements in production surveys in the future.
The results will be used in future studies to create a model capable of predicting when Internet respondents are confused and what they are confused about, and then providing them with real-time assistance in the form of a pop-up help window or a chat window. The goal is to implement this feature in a future ACS Internet test.
This study will look at all the different mouse and eye movements respondents make while taking a survey to determine which are diagnostic of trouble answering a survey question. We will look at several types of trouble, such as vague or unfamiliar terms, difficulty mapping one's own experience onto the answer categories, and common words used in a way that differs from their everyday sense. The specific questions selected are believed to pose these types of problems for some respondents. As there is no current literature on mouse movements as they apply to survey taking, this study will attempt to define a set of movements that methodologists can use to determine when respondents are confused by a question. The indicators are based on the human-computer interaction literature and observations from the 2011 ACS Internet Test usability sessions, and have been used by Google in its analyses and pending patents. Specifically, we will examine regressions, instances of vertical reading, instances of horizontal reading, highlights, clicks on specific words (but not response options), hovers (over text in the question or over the Previous and Next buttons, as well as long clicks), and instances where the mouse was used as a marker (a hover on a response option before clicking).[1]
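For concreteness, the sketch below shows one way a few of these cursor behaviors might be operationalized during analysis. It is illustrative only: the (timestamp, x, y) event representation, the pixel regions, the thresholds, and the function names are assumptions made for this example and are not part of the study instrument.

    # Illustrative sketch only. Assumes cursor data arrives as (t, x, y)
    # samples and that the screen regions of question text and response
    # options are known; the thresholds below are hypothetical.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Sample:
        t: float  # seconds since question onset
        x: float  # pixels
        y: float  # pixels

    Region = Tuple[float, float, float, float]  # (x0, y0, x1, y1)

    def detect_marker_hover(samples: List[Sample], option: Region,
                            min_duration: float = 1.0) -> bool:
        """A 'marker' hover: the cursor rests on a response option for at
        least min_duration seconds, typically before clicking it."""
        start = None
        for s in samples:
            inside = (option[0] <= s.x <= option[2]
                      and option[1] <= s.y <= option[3])
            if inside:
                if start is None:
                    start = s.t
                elif s.t - start >= min_duration:
                    return True
            else:
                start = None
        return False

    def detect_regression(samples: List[Sample],
                          min_backtrack: float = 100.0) -> bool:
        """A horizontal 'regression': after moving rightward through the
        text, the cursor backtracks left by at least min_backtrack pixels,
        suggesting re-reading of earlier words."""
        if not samples:
            return False
        max_x = samples[0].x
        for s in samples[1:]:
            if s.x > max_x:
                max_x = s.x
            elif max_x - s.x >= min_backtrack:
                return True
        return False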
By comparing the location of the mouse with the respondent's focus of attention, as indicated by his or her eye movements, we can classify different mouse movements into types of difficulty and other general movements. If we repeatedly see respondents displaying the same behavior on a question, we can develop hypotheses relating mouse movements to different types of problems, which can help researchers assess how best to help respondents.
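As an illustration of how the two data streams might be compared, the sketch below computes the average distance between each eye fixation and the cursor sample nearest to it in time; the data layout and function name are assumptions for the example. A small average distance would suggest the cursor is following the eyes (for instance, being used as a reading aid), while a large one would suggest the cursor is parked while the eyes scan.

    # Illustrative sketch only. Assumes both streams are lists of
    # (t, x, y) tuples sorted by time; the field layout is hypothetical.
    import math
    from typing import List, Tuple

    Point = Tuple[float, float, float]  # (t, x, y)

    def mean_gaze_cursor_distance(fixations: List[Point],
                                  cursor: List[Point]) -> float:
        """Average pixel distance between each eye fixation and the
        cursor sample closest to it in time."""
        if not fixations or not cursor:
            return float("nan")
        total = 0.0
        for ft, fx, fy in fixations:
            # Nearest-in-time cursor sample (a linear scan keeps the
            # sketch simple).
            _, cx, cy = min(cursor, key=lambda s: abs(s[0] - ft))
            total += math.hypot(fx - cx, fy - cy)
        return total / len(fixations)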
In August 2011, we will conduct approximately 40 laboratory interviews. Recruiting will take place using advertisements in local newspapers. The interviews will be conducted at the Joint Program in Survey Methodology at the University of Maryland, in a lab equipped with Tobii eye- and mouse-tracking capabilities. Participants of varying ages, education levels, and levels of computer experience will be recruited using a screening questionnaire. A copy of the screening questionnaire is enclosed. Participants will answer 20 ACS questions on a computer using a condensed version of the ACS Internet instrument. The questions will be modified to reflect the laboratory setting: specifically, they will ask about “your” house instead of “this” house. Following each question, the participant will also be asked to rate the previous question on its level of difficulty. A copy of the survey is enclosed. Additionally, respondents will answer a series of demographic questions on a paper form. Finally, they will be debriefed on their experience with the survey. A copy of the research protocol is enclosed.
Interviews will be videotaped, and the participant’s eye and mouse movements will be tracked, with respondent permission, to facilitate analysis of the results. Participants will be informed that their response is voluntary and that the information they provide is confidential and will be seen only by employees involved in the research project. Participants will receive $30 for their participation.
Interviews will take between 45 minutes and one hour. Thus, the total estimated burden for this research is 50 hours: approximately 40 hours for the interviews themselves (40 interviews at up to one hour each), plus the burden of administering the screener.
I am planning a follow-up study to this one that will use vignettes and two versions of the questions. The purpose is twofold: the vignettes will provide us with a “true value” so that accuracy can be assessed, and the multiple question versions will allow us to manipulate the level of difficulty so that we can directly assess whether respondents behave differently when they are confused. This follow-up study will allow us to statistically test the hypotheses generated in the study described here.
The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:
Rachel Horwitz
Decennial Statistical Studies Division
U.S. Census Bureau
4600 Silver Hill Road
Washington, DC 20233
301-763-2834
[1] These mouse movements have been cited in the following papers:
Arroyo, E., Selker, T., & Wei, W. (2006). “Usability Tool for Analysis of Web Designs Using Mouse Tracks.” Proceedings of the Conference on Human Factors in Computing Systems (CHI), ACM Press, pp. 484–489.
Guo, Q., & Agichtein, E. (2008). “Exploring Mouse Movements for Inferring Query Intent.” Proceedings of SIGIR, Singapore.
Mueller, F., & Lockerd, A. (2001). “Cheese: Tracking Mouse Movement Activity on Websites, a Tool for User Modeling.” Proceedings of the Conference on Human Factors in Computing Systems, Seattle, WA.
Rodden, K., Fu, X., Aula, A., & Spiro, I. (2008). “Eye-Mouse Coordination Patterns on Web Search Results Pages.” Proceedings of the Conference on Human Factors in Computing Systems, Florence, Italy.