Generic Information Collection Request
Eye-tracking Evaluation of American Community Survey Invitation Letters
Request: The U.S. Census Bureau plans to conduct additional research for the collection of routine customer feedback (OMB number 0690-0030). We propose to conduct an eye-tracking study to evaluate new survey invitation letters for the American Community Survey (ACS). We are seeking approval for this project.
Background: The Census Bureau has undertaken a project to develop new mail materials for the ACS in order to improve self-response rates (i.e., response before nonresponse follow-up operations begin) as well as overall response rates. The new mail materials are intended to be easy to read and persuasive, encouraging respondents to decide to participate in the survey in a timely manner. To those ends, the content is concise and limited to information that is required by law and information hypothesized to be meaningful and persuasive. In addition, the new materials use enhanced visual elements (color, images, and novel formatting and structure) with the goal of being appealing to respondents. The new materials will be evaluated in a future experiment to be conducted in the ACS Methods Panel, a subsample of the ACS production panel used for research. The proposed study will focus on the initial survey invitation letters.
Purpose: We plan to evaluate the effectiveness of the visual design elements of the invitation letters via an eye-tracking study. This study will be conducted in the Census Bureau’s usability lab. Please note that a separate cognitive pretesting project will be conducted to comprehensively evaluate the information contained in all of the ACS materials. The eye-tracking study proposed herein is solely intended to evaluate the visual design strategies using a representative sample of the materials.
Population of Interest: These materials are intended for use with the general U.S. adult population.
Language: The study will be conducted in English.
Timeline: The study will be conducted March–May 2019.
General Protocol: In an eye-tracking study, participants’ eye movements are measured as they look at visual stimuli. Participants will be seated in front of the eye-tracking equipment and shown individual mail pieces on a video monitor. The eye-tracking equipment measures which parts of the materials participants look at, how long their gazes linger on particular areas, and the order in which they view them. At the start of a session, the participant will be oriented to the eye-tracking equipment, and a calibration procedure will be conducted to ensure maximum accuracy for the eye-tracking session. The eye-tracking procedure is non-intrusive; participants are only asked to remain still in a seated position while reviewing each letter. The letters will be presented on the monitor one at a time. This will allow us to assess the effectiveness of the visual design strategies in drawing participants’ attention to specific parts of the materials. After reviewing each letter, participants will complete a recall test consisting of an open-ended question followed by a questionnaire with two closed-ended questions. The recall test is intended to evaluate how attentively participants read the letters. The current ACS invitation letter will be included in the study so that reading behaviors for the new letters can be compared with those for the current letter.
Sample: Twenty adults will be selected with the goal of obtaining a sample that is diverse with regard to education level.
Recruitment: Participants will be selected from the recruitment database maintained by the Center for Behavioral Science Methods (CBSM). Additional participants may be recruited through Craigslist or other online advertising channels.
Use of Incentive: The Census Bureau will use an incentive of $40 per participant for this 60-minute interview.
Below is a list of materials to be used in the current study.
Sample testing protocol (Enclosure 1)
Demographic questionnaire (Enclosure 2) – (Previously approved by OMB for usability testing in the spring of 2015 for the 2015 Census Test.)
Survey invitation letters – current ACS letter and three test letters (Enclosure 3)
Sample recall questionnaire (Enclosure 4)
Recruitment materials (Enclosure 5)
Additional screening questions (Enclosure 6)
Consent form (Enclosure 7)
Length of interview: We estimate 60 minutes to interview each participant. In general, four screener conversations are required to recruit one participant, and each screener conversation lasts approximately six minutes. We therefore estimate it will take approximately 8 hours to screen and recruit 20 participants (20 participants X 4 candidates X 6 minutes per candidate = 480 minutes, or 8 hours). For 20 participants, the estimated burden for the interviews is 20 hours (20 participants X 60 minutes per interview), bringing the total estimated burden to 28 hours.
The contact person for questions regarding data collection and methodological aspects of the design of this research is listed below:
Alfred D. Tuttle
Center for Behavioral Science Methods
U.S. Census Bureau
Room 5K020F
Washington, D.C. 20233
(301) 763-7809