ACS Internet Testing Plan


Generic Clearance for Questionnaire Pretesting Research


OMB: 0607-0725



The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We will conduct research using 20 questions drawn from the 2011 Internet Test version of the American Community Survey (ACS) and the 2010 ACS Content Test.


This study is a continuation of a prior study that sought a way to identify confused Internet respondents. The request for that study was sent to OMB for approval on April 7, 2011, and approved on June 6, 2011. The original study identified specific mouse cursor movements associated with difficulty or confusion in answering a question on an Internet survey. Those results informed a set of hypotheses that will be statistically tested in the current study. The end result will be a model that can predict, in real time, when a respondent is having trouble answering a question. The principal application of this research is the ability to provide respondents with tailored real-time assistance while they complete an Internet survey. The goal is to implement this feature in a future ACS Internet test.


The first study aimed to define a set of mouse movements that may be related to specific types of difficulty. Continuing with this idea, Study 2 will test the hypotheses formulated in Study 1, which is currently ongoing. It will deliberately manipulate the difficulty respondents have in answering a question by using scenarios. Each question will have two versions: one complicated and one straightforward. The level of difficulty will be manipulated either by pairing two different scenarios with one question, or two different questions with one scenario. Each respondent will receive 10 complicated questions and 10 straightforward questions. A copy of the questions and scenarios is enclosed. Using scenarios allows us to know the true answer to each question, and thereby to assess accuracy; it also tells us exactly what the respondent is confused about. This provides the opportunity to test two hypotheses: whether respondents, as a whole, move their mouse in a similar fashion when they are confused, and whether they display different types of movements for different types of confusion.
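The counterbalanced design described above can be sketched as follows. This is an illustrative sketch only: the item labels, version names, and assignment function are hypothetical and are not taken from the actual ACS instrument; the plan does not specify how versions will be randomized across respondents.

```python
import random

def assign_versions(n_items=20, n_complicated=10, rng=None):
    """Assign each of n_items a version so that exactly n_complicated
    items are 'complicated' and the rest are 'straightforward',
    mirroring the 10-and-10 split each respondent receives."""
    rng = rng or random.Random()
    versions = (["complicated"] * n_complicated
                + ["straightforward"] * (n_items - n_complicated))
    rng.shuffle(versions)  # randomize which items are hard for this respondent
    return {f"item_{i + 1}": v for i, v in enumerate(versions)}

# One respondent's hypothetical assignment (seeded for reproducibility).
assignment = assign_versions(rng=random.Random(42))
```

Each call produces a fresh 10/10 split, so difficulty is balanced within every respondent while the particular items that are complicated vary across respondents.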


In conjunction with testing these two hypotheses, we can also create a model that predicts respondent difficulty in real time. For each of the movements analyzed in the first study, we will statistically test whether it is predictive of difficulty. From there, we will create a hierarchical model of respondent difficulty that accounts for idiosyncrasies in the way each individual uses his or her mouse. Depending on the results, we may have multiple models for different types of difficulty, or a single model that predicts general confusion. These models will indicate what respondents are having trouble with, allowing developers to offer real-time tailored help that addresses the respondent's problem more directly than currently available assistance does.
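The modeling idea in the paragraph above can be sketched in miniature. Everything here is hypothetical: the feature names (`hovers`, `regressions`) and the logistic coefficients are invented for illustration, not results from the study. The sketch follows the stated logic: normalize each movement measure within a respondent, so that idiosyncratic mouse use is absorbed, then feed the normalized features to a logistic predictor of difficulty.

```python
import math
from collections import defaultdict

def zscore_within_respondent(records):
    """Standardize each mouse-movement feature within a respondent,
    so a habitual hoverer is compared against his or her own baseline.
    records: list of dicts with 'respondent', 'hovers', 'regressions'."""
    by_resp = defaultdict(list)
    for r in records:
        by_resp[r["respondent"]].append(r)
    out = []
    for rows in by_resp.values():
        for feat in ("hovers", "regressions"):
            vals = [r[feat] for r in rows]
            mean = sum(vals) / len(vals)
            sd = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5 or 1.0
            for r in rows:
                r[feat + "_z"] = (r[feat] - mean) / sd
        out.extend(rows)
    return out

def predict_difficulty(row, coefs=(-0.5, 1.2, 0.8)):
    """Logistic prediction of difficulty from normalized features.
    The coefficients are placeholders; the real model would estimate
    them hierarchically from the laboratory data."""
    b0, b1, b2 = coefs
    eta = b0 + b1 * row["hovers_z"] + b2 * row["regressions_z"]
    return 1.0 / (1.0 + math.exp(-eta))
```

A production version would estimate respondent-level random effects rather than simple within-person z-scores, but the two-stage sketch shows how person-specific baselines feed a shared predictive model.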


In November and December 2011, we will conduct approximately 100 laboratory interviews. Recruiting will take place through advertisements in local newspapers. The interviews will be conducted at the Joint Program in Survey Methodology at the University of Maryland, in a lab equipped with Tobii eye- and mouse-tracking capabilities. Participants of varying ages, education levels, and levels of computer experience will be recruited using a screening questionnaire, a copy of which is enclosed. Participants will answer 20 ACS questions on a computer using a condensed version of the ACS Internet instrument. Following each question, the participant will rate that question on its level of difficulty. A copy of the questionnaire is enclosed. Additionally, respondents will answer a series of demographic questions on a paper form; the questions are included with the enclosed consent form. Finally, they will be debriefed on their experience with the survey. A copy of the research protocol is enclosed.


Interviews will be videotaped, and the participant's eye and mouse movements will be tracked, with respondent permission, to facilitate analysis of the results. Participants will be informed that their participation is voluntary and that the information they provide is confidential and will be seen only by employees involved in the research project. Participants will receive $30 for their participation.


Interviews will take between 30 and 45 minutes. Using the 45-minute upper bound for the approximately 100 interviews, the total estimated burden for this research is 75 hours.
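The burden figure follows directly from the interview count and the upper bound of the interview length; a quick check of the arithmetic:

```python
# Burden estimate: 100 interviews at the 45-minute upper bound.
interviews = 100
minutes_per_interview = 45          # upper end of the 30-45 minute range
burden_hours = interviews * minutes_per_interview / 60
# 100 * 45 / 60 = 75.0 hours
```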


I am planning a follow-up study that will test different ways of providing respondents with real-time assistance. Respondents currently report lower levels of satisfaction when researchers offer unsolicited help. Therefore, the purpose of that research is to test different presentation methods (text, audio, and chat) to determine which results in the highest respondent satisfaction, as well as the highest accuracy. Again, scenarios will be used so that accuracy can be measured. The best-performing presentation method will then be combined with the model created in the current study to provide respondents with assistance in a future ACS Internet Test. A separate letter will be submitted to request approval for that study.

The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Rachel Horwitz

Decennial Statistical Studies Division

U. S. Census Bureau

4600 Silver Hill Road

Washington, DC 20233

301-763-2834

[email protected]



