SUPPORTING STATEMENT
Generic Testing – Eye Tracking
OMB No. 0535-0248
This mini-supporting statement is being submitted to OMB to define the need for eye tracking research on formatting of survey questions. This research will focus on how to effectively display instructions and definitions associated with specific survey questions. The goal is to determine the formats that respondents are most likely to attend to when answering a survey question. No more than 40 respondents will participate in this research.
In establishment surveys, it is common to include detailed instructions and definitions with survey materials. Questions in establishment surveys are sometimes not questions at all but keywords reflecting the construct being measured, followed by definitions (e.g., Intermediate Market: a business or organization in the middle of the supply chain marketing locally- and/or regionally-branded products) (Haraldsen, 2013). Definitions are used to ensure that all respondents interpret these complex constructs in the same way. Research has shown that it is important to place these instructions/definitions as close to the survey question as possible. Gernsbacher (1990) found that in the case of keywords followed by definitions, respondents tend to read the keywords and ignore the definitions that follow. Respondents are even less inclined to look for auxiliary information (in instruction booklets or help screens) when answering a survey question, even when this information is necessary for answering it (Conrad et al., 2006; Sloan and Ott, 2016).
At NASS, we often struggle with where to put our instructions and definitions and are always concerned that our respondents are not reading them. When conducting cognitive interviews, we are often asked to probe respondents to see whether they read question instructions and definitions. However, cognitive interviewing is not an ideal method for determining this, as respondents are not very good at telling you what they have and have not read (Schall and Bergstrom, 2013; Guan et al., 2006; Albert and Tedesco, 2010). In fact, we often find in cognitive interviews that respondents will answer yes to these types of probes, but upon further probing it becomes clear that they did not read the instructions/definitions or, if they did, did not adhere to them.
Eye tracking provides an objective measure of what respondents look at when completing a survey questionnaire. Using eye tracking equipment, researchers can determine what information respondents fixated on, how long they fixated on it, and the order in which they looked at the information presented. Of course, respondents may fixate on text such as a definition without reading it; however, the more time respondents spend fixating on a definition, the more it impacts their responses (Galesic, Tourangeau, Couper, and Conrad, 2008).
The purpose of the current study is to determine the optimal placement of question instructions and definitions to increase the likelihood that they are seen by most respondents. NASS would like to conduct eye tracking to evaluate 5 common formats used on our surveys:
Definitions preceding survey questions
Definitions following survey questions
Definitions embedded within a matrix
Include and exclude statements within the survey questions
Double banked include and exclude statements
Specifically, NASS survey methodologists would like to answer the following research question: Which instruction/definition format(s) are respondents most likely to attend to when completing a survey questionnaire?
In this study, participants will be recruited from the general population. We are aiming to get eye tracking data from 30 participants. Because the eye tracking equipment does not work for everyone, we may need to run up to 40 participants.
For the survey questions, we selected question topics that could be answered by any member of the public. These questions come from different household surveys and in some cases were modified for the purposes of this study. Topics include health behaviors and conditions, and transportation expenses. We are not interested in participants' interpretations of the survey questions or their responses per se, but rather in which format encouraged them to read the associated instructions and definitions. Two versions of the survey have been created to randomize the order in which respondents see the question formats (see table below). Participants will be randomly assigned to one of the surveys.
Survey 1 | Survey 2
Participants will complete a short online survey while the Tobii eye tracker monitors their eye movements. We will evaluate the results of the eye tracking to identify any patterns in their reading.
After they complete the survey, we will conduct a post-test interview addressing their thoughts about the different question formatting options. The interview will cover the following questions:
In this study, we are looking at how the format of the question can impact responses. Specifically, we are interested in the formatting on the instructions and definitions associated with the survey questions. Did you notice/read the definitions as you were reading the survey questions?
Do you have any thoughts about the location of the definitions across the different questions?
Did you read the include/exclude statements?
Do you have any thoughts about the formatting of the include/exclude statements?
A. JUSTIFICATION
1. Circumstances making collection of information necessary.
In establishment surveys, such as the type conducted at NASS, it is common to include detailed instructions and definitions with survey materials. Questions in establishment surveys are sometimes not questions at all but keywords reflecting the construct being measured, followed by definitions (e.g., Intermediate Market: a business or organization in the middle of the supply chain marketing locally- and/or regionally-branded products) (Haraldsen, 2013). Definitions are used to ensure that all respondents interpret these complex constructs in the same way. Research has shown that it is important to place these instructions/definitions as close to the survey question as possible. Gernsbacher (1990) found that in the case of keywords followed by definitions, respondents tend to read the keywords and ignore the definitions that follow. Respondents are even less inclined to look for auxiliary information (in instruction booklets or help screens) when answering a survey question, even when this information is necessary for answering it (Conrad et al., 2006; Sloan and Ott, 2016).
2. How, by whom, and for what purpose information is to be used.
The data will be collected in the Bureau of Labor Statistics (BLS) cognitive lab using Tobii eye tracking equipment. The data will be analyzed by NASS’s Research and Development Division to determine the optimal placement of question instructions and definitions on NASS surveys.
3. Use of improved information technology.
This research will be conducted using a Tobii eye tracker.
4. Efforts to identify duplication.
Respondents will be recruited from the general population.
5. Methods to minimize burden on small businesses.
Respondents will be recruited from the general population.
6. Consequence if information collection were less frequent.
The qualitative research is planned to be conducted between November 2017 and November 2018, as resources become available. This docket expires in June 2019, so interviews not completed by that time will be resubmitted with a mini-supporting statement after this docket (0535-0248) is renewed.
7. Special circumstances.
There are no special circumstances associated with this information collection.
8. Federal Register notice and consultation with outside persons.
Not applicable.
9. Payments or gifts to respondents.
There are no payments or gifts to respondents.
10. Confidentiality provided to respondents.
The same confidentiality that is applied to the Eye Tracking Study will be applied to data collected during the qualitative research interviews.
11. Questions of a sensitive nature.
Question topics will include health behaviors and conditions and transportation expenses. No questions of a sensitive nature will be asked.
12. Hour burden and annualized costs to respondents.
All interviews will be conducted by trained survey methodologists at BLS to explore which question formats respondents are most likely to attend to when responding to a survey questionnaire.
Qualitative Interviews | Number of Respondents | Hours per Interview | Total Burden Hours
Eye Tracking Interviews | 40 | 10 minutes | 6.7
NASS uses the Bureau of Labor Statistics' Occupational Employment Statistics (most recently published on March 31, 2017, for the previous May) to estimate an hourly wage for the burden cost. The May 2016 mean wage for bookkeepers was $19.34, for farm managers $36.44, and for farm supervisors $23.47; the average of the three is $26.42. The estimated annual reporting time of 6.7 hours, rounded to 7 hours, is multiplied by the rounded wage of $26 per hour for a total cost to the public of $182.
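The burden and cost figures above can be reproduced with a short arithmetic check. This is an illustrative sketch only; the figures come from the text, and the rounding choices (burden hours to one decimal, then both factors rounded to whole numbers before multiplying) are assumptions inferred from the stated totals.

```python
# Illustrative check of the burden and cost arithmetic (assumed rounding rules).
respondents = 40
minutes_per_interview = 10

# Total burden: 40 interviews x 10 minutes = 400 minutes, reported as 6.7 hours.
burden_hours = round(respondents * minutes_per_interview / 60, 1)

# Mean of the three May 2016 OES wages (bookkeepers, farm managers, farm supervisors).
wages = [19.34, 36.44, 23.47]
mean_wage = round(sum(wages) / len(wages), 2)

# Cost uses the rounded figures: 7 hours x $26 per hour.
total_cost = round(burden_hours) * round(mean_wage)

print(burden_hours, mean_wage, total_cost)  # 6.7 26.42 182
```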
13. Total annual cost burden to respondents.
There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.
14. Annualized costs to federal government.
Costs for conducting the qualitative research interviews are estimated at $50,000. This will cover expenses for staff payroll, travel, survey analysis, and any other expenses that may be incurred while updating survey materials based on our findings.
15. Reasons for changes in burden.
This mini-supporting statement requests the burden hours needed to conduct eye tracking research.
16. Tabulation, analysis, and publication plans.
No data will be published from these tests. Data are for internal use only, but results may be presented at outside conferences or seminars.
17. Request for approval of non-display of expiration date.
There is no request for approval of non-display of the expiration date.
18. Exceptions to certification statement.
There are no exceptions to the certification statement.
B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS:
1. Respondent universe, sampling, and response rate.
Participants will be recruited from the general population, using the participant pool that is maintained at BLS. We will recruit people who do not wear glasses when using a computer and who have not participated in a study at BLS in the last 9 months.
2. Procedures for the collection of information.
Interviewers will follow standard pretesting techniques as defined in the original Supporting Statement Part A for the Generic Clearance docket (0535-0248).
3. Information collected adequate for intended uses.
Participants will be selected based on specific criteria as stated above.
4. Test of procedures or methods.
Not applicable.
5. Individuals consulted on statistical aspects of survey.
Selection of testing methods for this research was done by the Research and Development Division; contact Heather Ridolfo, (202) 692-0293.
November 2017