2016 American Community Survey Respondent Burden Testing
The U.S. Census Bureau, in its continuing effort to reduce respondent burden on the American Community Survey (ACS), plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). Specifically, the U.S. Census Bureau will test changes to seven ACS topics. The corresponding revised questions, developed under the auspices of the OMB Interagency Group on the American Community Survey and related subject-area subcommittees, are designed to facilitate respondent comprehension, reduce respondent burden, and, where possible, streamline wording. We recognize that these potential changes would reduce the analytic utility of the data for some of these questions, but they represent a first attempt to determine whether we can reduce the respondent burden associated with these topics. The Census Bureau will continue to work with the interagency group as this research process evolves and will ensure that the ACS continues to meet the mandatory and required needs of Federal agencies.
For the “2016 ACS Respondent Burden Testing,” the U.S. Census Bureau has contracted with Westat, a statistical survey research corporation headquartered in Rockville, Maryland, to perform cognitive testing of the proposed question changes. For testing purposes, the seven topics have been organized into three testing groups (see below), each of which requires a specified set of respondents:
Group 1: Telephone Service; Computer and Internet; Year Built
Group 2: Year of Naturalization and Year of Entry
Group 3: Address for Place of Work; Place of Residence One Year Ago; Weeks Worked/Hours Worked
The Telephone Service, Computer, and Internet questions are being streamlined to reduce the burden of this series of questions. Results of cognitive testing will inform us about whether this approach is easy to use and will confirm whether respondents understand what the questions are asking.
The Year Built question currently asks respondents to select from 10-year categories spanning 1940 to 2000 and then to report individual years for newer buildings. In this cognitive test, we will test two versions of the question that merge the response categories into larger ranges that align with the required uses. One mandated requirement is that data be collected on buildings built in 1939 or earlier. Data users also need specific year-built information beginning in 2010. Another need, while not mandated, is for information on buildings built before 1980, as that year is important for health-related legislation (e.g., the banning of lead-based paint). Both of our proposed versions capture this information. One version divides the years between 1940 and 2010 into four 20-year categories. The other version divides the years between 1940 and 2010 into two categories: 1940-1979 and 1980-2009. Both versions collect data according to the mandated needs, and the broader categories potentially lessen respondent burden. Testing is desired to determine whether either option is perceived as easier than the current version and whether one of the two alternatives performs better than the other.
The Year of Naturalization and Year of Entry questions currently ask for the specific year a person entered the United States and, when applicable, the year the person was naturalized. In this test, we will instead offer categorical responses. Two versions are proposed, both aligning with required uses. Several stakeholders need the specific year of entry for the last ten years, so both versions contain a category for 2005 or later. We determined that specific data for years before 1985 were not necessary, so both versions also have a category for all years before 1985. In one version, the years between 1985 and 2005 are split into four five-year categories. We believe this has the potential to lessen respondent burden while providing stakeholders with more specificity than broader categories would. The other version lengthens the categories and divides the years between 1985 and 2005 into two ranges: 1985-1996 and 1997-2004. Some stakeholders stated that this information is useful for programmatic needs in determining benefits based on the Immigration and Welfare Reform Act of 1996 (although this is not a mandatory or required need). Testing is desired to determine whether either version is easier than the other or than the current version.
The Address for Place of Work and Place of Residence One Year Ago questions will be modified to test more standard address fields in an order that we think most respondents are used to. Collection of the county, along with the question about whether the address is within the city limits, will be moved from the middle to the end of the address fields. Cognitive testing will inform us about how easy or difficult it is for respondents to provide the information in this format.
The Weeks Worked and Hours Worked questions are being tested to build on findings from a previous round of cognitive interviews conducted for the 2016 ACS Content Test. Two options will be evaluated, asking Hours Worked either before or after Weeks Worked, to see whether asking Hours Worked first provides helpful context for framing the work done in the past year. In the interviewer-administered mode, we will also evaluate a lead-in statement that uses a specific date fill for the period from one year ago to today, as well as the addition of an option for a respondent to report in months, with the interviewer then validating the corresponding number of weeks.
Between June 7 and June 24, 2016, Westat will perform 24 English-language cognitive interviews per topic group, for a total of 72 cognitive interviews. Westat plans to conduct the interviews at its Rockville, Maryland, location (69 interviews) and its Frederick, Maryland, location (3 rural-address interviews). If necessary, Westat will use additional locations to facilitate recruitment.
For the 40-minute cognitive interview, a respondent will answer a set of ACS questions specific to his or her assigned testing group. Groups 1 and 2 will each consist of 12 Internet interviews (conducted using paper screenshots) and 12 interviewer-administered interviews. Group 3 will consist of 12 paper questionnaire interviews and 12 interviewer-administered interviews. As part of the cognitive interview process, Westat interviewers will probe the respondents’ understanding and perception of the questions. The interviews will be audiotaped and may be subject to one-way observation by designated sworn employees of Westat and the U.S. Census Bureau.
To recruit 72 respondents who meet the specified characteristics, Westat will use its own participant database as a starting point and supplement it with recruits obtained through print advertising. See the attached OMB-approved flyer template and the Craigslist ad template (Attachment A). The specific headlines that will be inserted into the flyer and Craigslist ad when recruiting for a given topic/question are also included in Attachment A. If necessary, Westat will also make use of its personal and business networks and target commercial locations that attract the desired respondents.
Westat will use a screener (see Attachment A) to screen potential respondents who respond to any of its outreach efforts. Respondents who are selected and complete an interview will receive $40 to offset the cost of participation (e.g., transportation, childcare costs). Westat will screen approximately 235 people to obtain 72 completed interviews. Screening will take approximately 7 minutes per person, so the initial screening of 235 people will take approximately 27.5 hours. An interview will take approximately 40 minutes per respondent, so interviewing 72 respondents will take approximately 48 hours. The maximum burden is approximately 75.5 hours.
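For reference, the burden estimate follows directly from the per-person times stated above; the only adjustment is rounding the screening total (roughly 27.4 hours) up to 27.5 hours:

\[
\frac{235 \times 7}{60} \approx 27.5 \text{ hours (screening)}, \qquad \frac{72 \times 40}{60} = 48 \text{ hours (interviewing)}, \qquad 27.5 + 48 = 75.5 \text{ hours (total)}.
\]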
The testing protocols per testing group/mode are found in Attachments B through M. The cognitive test interviewer-administered survey instruments are found in Attachments N through R, the internet survey instruments in Attachments S through V, and the paper survey instruments in Attachments W and X. The consent form is found in Attachment Y.
Materials to be used for this project are listed below:
A: 2016 ACS Respondent Burden Testing: Recruiting Screener and Advertisements
B: ACS Respondent Burden Testing Protocol: Group 1, Version 1 (CAI)
C: ACS Respondent Burden Testing Protocol: Group 1, Version 2 (CAI)
D: ACS Respondent Burden Testing Protocol: Group 1, Version 1 (Internet)
E: ACS Respondent Burden Testing Protocol: Group 1, Version 2 (Internet)
F: ACS Respondent Burden Testing Protocol: Group 2, Version 1 (CAI)
G: ACS Respondent Burden Testing Protocol: Group 2, Version 2 (CAI)
H: ACS Respondent Burden Testing Protocol: Group 2, Version 1 (Internet)
I: ACS Respondent Burden Testing Protocol: Group 2, Version 2 (Internet)
J: ACS Respondent Burden Testing Protocol: Group 3, Version 1 (CAI)
K: ACS Respondent Burden Testing Protocol: Group 3, Version 2 (CAI)
L: ACS Respondent Burden Testing Protocol: Group 3, Version 1 (Paper)
M: ACS Respondent Burden Testing Protocol: Group 3, Version 2 (Paper)
N: ACS Respondent Burden Testing Interviewer-Administered Survey Instrument: All Groups, Roster
O: ACS Respondent Burden Testing Interviewer-Administered Survey Instrument: Groups 1 & 2, Version 1
P: ACS Respondent Burden Testing Interviewer-Administered Survey Instrument: Groups 1 & 2, Version 2
Q: ACS Respondent Burden Testing Interviewer-Administered Survey Instrument: Group 3, Version 1
R: ACS Respondent Burden Testing Interviewer-Administered Survey Instrument: Group 3, Version 2
S: ACS Respondent Burden Testing Internet Survey Instrument: Group 1, Version 1
T: ACS Respondent Burden Testing Internet Survey Instrument: Group 1, Version 2
U: ACS Respondent Burden Testing Internet Survey Instrument: Group 2, Version 1
V: ACS Respondent Burden Testing Internet Survey Instrument: Group 2, Version 2
W: ACS Respondent Burden Testing Paper Survey Instrument: Group 3, Version 1
X: ACS Respondent Burden Testing Paper Survey Instrument: Group 3, Version 2
Y: ACS Respondent Burden Testing Consent Form
The contact person for questions regarding data collection and study design is:
Todd Hughes
American Community Survey Office
Room 4K273
Washington, DC 20233
(301) 763-6686