OMB Non-substantive Change Request
Department: Commerce
Agency: U.S. Census Bureau
Title: American Community Survey Methods Panel Tests
OMB Control Number: 0607-0936
Expiration Date: 12/31/2015
2014 ACS Internet Test
Motivation:
Results from the 2011 ACS Internet Tests indicated that the Push Internet notification strategy was successful not only in driving response to the Internet, but also in keeping overall response very close to ACS production levels (Tancreto et al., 2012; Matthews et al., 2012). Based on these findings, the Census Bureau introduced an Internet mode into ACS production in January 2013 using this protocol.
In addition to informing which notification strategy to use, the 2011 ACS Internet Tests also allowed us to evaluate the design of the Internet instrument. We identified several issues that this test will attempt to remedy. Specifically, several screens had a relatively high percentage of respondents breaking off, which resulted in higher item nonresponse for questions later in the survey. Additionally, we found that fewer Internet respondents than mail respondents entered multiple ancestries, and many of the error messages rendered in the instrument appeared on screens with unfolding formats (i.e., two tasks on one screen).
Before we describe the experimental treatments, we outline instrument changes we propose making to every treatment based on the results of the 2011 tests. These changes are intended to make the instrument more user-friendly. First, on the PIN screen, we would like to add a security question so respondents have an alternate way to re-enter the instrument if they lose their PIN (see Figure 1 in Appendix A). Specifically, the security questions will appear in a drop-down box; the respondent may choose one question and provide an answer. On subsequent logins, respondents can either enter their PIN or, if they have forgotten it, click a hyperlink to go to the screen with the security question (see Figures 2 and 3 in Appendix A). If they successfully answer the security question, they will receive a new PIN for login (Figure 4 in Appendix A). If they click the link to go to the security question but never selected one, they will be routed to a screen that informs them that they never selected a security question (Figure 5 in Appendix A).
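For illustration only, the re-entry routing described above could be sketched as follows. The screen names, data fields, and PIN generator below are hypothetical and are not part of the instrument specification; the authoritative description of the flow is in Figures 1-5 of Appendix A.

// Hypothetical sketch of the proposed PIN / security-question re-entry flow.
// Screen names, record fields, and the PIN generator are illustrative only.
type ReentryScreen = "Survey" | "PinScreen" | "SecurityQuestion" | "NoQuestionOnFile";

interface RespondentRecord {
  pin: string;
  securityQuestion?: string; // chosen from the drop-down on the PIN screen
  securityAnswer?: string;
}

// Decide which screen a returning respondent sees.
function routeReentry(record: RespondentRecord, enteredPin: string | null, clickedForgotPin: boolean): ReentryScreen {
  if (clickedForgotPin) {
    // Respondents who chose a security question may answer it; those who
    // never selected one see an informational screen instead (Figure 5).
    return record.securityQuestion ? "SecurityQuestion" : "NoQuestionOnFile";
  }
  if (enteredPin !== null && enteredPin === record.pin) {
    return "Survey"; // normal login with the original PIN
  }
  return "PinScreen"; // missing or incorrect PIN: remain on the login screen
}

// A correct answer to the security question yields a new PIN (Figure 4).
function verifySecurityAnswer(record: RespondentRecord, answer: string): string | null {
  if (record.securityAnswer &&
      answer.trim().toLowerCase() === record.securityAnswer.trim().toLowerCase()) {
    return generateNewPin();
  }
  return null;
}

// Placeholder PIN generator, for illustration only.
function generateNewPin(): string {
  return Math.random().toString(36).slice(2, 8).toUpperCase();
}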
The second change to the instrument is increasing the height of the write-in box on the Ancestry question (Figure 6 in Appendix A). The 2011 tests revealed that respondents did not report as many ancestries on the Internet as on mail, and we believe the format of the box on the Web may be the reason (Horwitz et al., 2013). On the mail form, there are clearly two lines for respondents to enter their ancestries. On the Internet, there is one text box that is two lines high, with scroll arrows designed to convey that the box can accept more than one ancestry. Because this format did not generate ancestry reporting similar to mail, we want to increase the height of the box.
Finally, questions with a second task (e.g., an embedded write-in box or drop-down) following a radio button elicited many error messages in the previous tests because people overlooked the second task (Horwitz et al., 2012; Horwitz et al., 2013). In an effort to reduce these errors and the resulting respondent burden, we propose drawing more attention to the second task once the associated radio button is selected. Specifically, once a radio button is selected, an arrow would appear pointing to the next task, the border of the write-in/drop-down field would be bolded, and the inside of the field would change color (Figure 7 in Appendix A). Affected questions include: place of birth, year built, current grade level, highest grade level, citizenship, residence one year ago, health insurance, computer use, Internet subscription, race, and Hispanic origin.
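A minimal sketch of the intended screen behavior is shown below; the element IDs and styling values are hypothetical, and the production instrument's implementation may differ (see Figure 7 in Appendix A for the actual design).

// Hypothetical sketch: when a radio button with a second task is selected,
// show an arrow pointing to the associated field, bold the field's border,
// and change the field's fill color. IDs and style values are illustrative only.
function linkRadioToSecondTask(radioId: string, fieldId: string, arrowId: string): void {
  const radio = document.getElementById(radioId) as HTMLInputElement | null;
  const field = document.getElementById(fieldId) as HTMLElement | null;
  const arrow = document.getElementById(arrowId) as HTMLElement | null;
  if (!radio || !field || !arrow) return;

  radio.addEventListener("change", () => {
    if (radio.checked) {
      arrow.style.visibility = "visible";      // arrow appears, pointing to the next task
      field.style.border = "2px solid #000";   // bolded border on the write-in/drop-down field
      field.style.backgroundColor = "#fef9d7"; // field fill changes color
    }
  });
}

// Example: a hypothetical place-of-birth option with an embedded write-in box.
linkRadioToSecondTask("pob-outside-us", "pob-writein", "pob-arrow");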
These changes would apply to all test treatments discussed below. We will evaluate the impact of these changes by comparing the test treatments to standard production cases from the same monthly sample panel.
Treatments:
We propose the following six treatments for testing so we can assess their impact on reducing Internet breakoffs. Note that the changes listed in Treatment 1 (those discussed above) are included in the remaining five treatments. Treatment 1 will serve as a baseline against which the other treatments can be compared.
1. Basic Changes – includes all of the changes discussed above: the new security question to reset the PIN, the expanded write-in box for the Ancestry question, and drawing attention to write-in boxes associated with radio buttons.
2. Revised Transition Screen – eliminates one transition screen (Saved Person) and adds its information to the Pick Next Person screen. Additionally, the Pick Next Person screen has new language encouraging respondents to answer questions for other household members to the best of their ability. The 2011 Internet tests found that the Pick Next Person screen was the screen on which respondents most commonly broke off. Further, Peytchev (2009) found that respondents break off more frequently on screens that do not have a specific survey-related task, which describes the Saved Person screen. Screenshots of the original screens can be found in Figures 8 and 9 in Appendix A, and the revised screen can be found in Figures 10-12.
3. Reminder Email 1 – collects email addresses on the Respondent Name screen (Figure 13 in Appendix A). These email addresses are then used to remind respondents who have started but not completed the survey to return and finish. The email contains the survey name in the subject line and in the body. There will be two waves of emails, although each respondent will receive only one email. The first reminder will be sent after respondents receive the reminder postcard but before they receive the replacement questionnaire. The second reminder will be sent to breakoffs that first entered the survey after the first reminder email was sent and before the cut for CATI cases. The revised transition described in Treatment 2 is also included in this treatment.
4. Reminder Email 2 – mimics all aspects of Reminder Email 1, except that the survey name is excluded from both the subject line and the body of the email (except for the acronym in the survey URL). We exclude the survey name because of concerns that unsecured email correspondence could reveal survey inclusion and participation to outside parties.
5. Reminder Email 3 – mimics all aspects of Reminder Email 2, except that the link to access the survey is embedded in text that says, “Click here,” so the respondent does not see the URL or survey name. Again, we exclude the survey name due to privacy concerns.
6. Reminder Email 4 – changes the wording on the Respondent Name screen to potentially allow email addresses to be shared across the Census Bureau. Currently, the language on the screen says, “We may contact you if there is a question.” We are updating this language to give us the flexibility to contact respondents about the ACS or another Census Bureau survey. We are also updating the Help text to reflect that their information may be used in a variety of ways (Figure 14 in Appendix A). We will still send a reminder email in this treatment to those breakoffs that provide an email address. The email will follow the format in Treatment 3.
Sample:
To field this follow-up test, we are planning to use ACS production sample (Clearance number: 0607-0810, expires 6/30/2016). Thus, there is no increase in burden resulting from this test, since the treatments will result in approximately the same burden estimate per interview (40 minutes). We have divided the monthly production sample into groups of approximately 12,000 addresses. For the July 2014 panel, we will use six of the groups for our test (one group corresponding to each treatment). The remaining groups from the sample will be standard production. Because we are using production cases for our test, the test will run through the complete three-month data collection period.
Our primary evaluation measure for this test is the breakoff rate. Comparing the breakoff rate between two treatments allows us to detect a 2.2 percentage point difference with 80% power and α=0.1. A secondary evaluation measure is the item nonresponse rate, which we expect to decrease if breakoff rates decrease. Comparisons between two treatments of 12,000 addresses each will allow us to detect approximately a 1.4 percentage point difference in item nonresponse at the household level with 80% power and α=0.1. Comparisons within a single treatment of 12,000 addresses (for example, across different household sizes) will allow us to detect a difference in item nonresponse rates of 1.9 percentage points with 80% power and α=0.1. With approximately 12,000 addresses per treatment and six experimental treatments, the total sample size for this test is approximately 72,000 addresses.
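For reference, minimum detectable differences of this kind follow from the standard comparison of two independent proportions. The exact inputs used for the figures above (assumed baseline rates, effective sample sizes after restricting to Internet respondents, design effects, and whether the test is one- or two-sided) are not stated in this document, so the expression below shows only the general form, written for a one-sided test at level α (replace z_{1-α} with z_{1-α/2} for a two-sided test):

\[
\mathrm{MDD} = \left(z_{1-\alpha} + z_{1-\beta}\right)\sqrt{\frac{p_1(1-p_1)}{n_1} + \frac{p_2(1-p_2)}{n_2}}, \qquad \alpha = 0.1,\ 1-\beta = 0.80,
\]

where \(p_1\) and \(p_2\) are the assumed breakoff (or item nonresponse) rates and \(n_1\) and \(n_2\) are the effective sample sizes being compared.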
References:
Horwitz, R., Tancreto, J.G., Zelenak, M.F., and Davis, M.C. (2013). Using Paradata to Identify Potential Issues and Trends in the American Community Survey Internet Instrument. Available at: http://www.census.gov/acs/www/Downloads/library/2013/2013_Horwitz_02.pdf.
Horwitz, R., Tancreto, J.G., Zelenak, M.F., and Davis, M.C. (2012). Use of Paradata to Assess the Quality and Functionality of the American Community Survey Internet Instrument. Available at: http://www.census.gov/acs/www/Downloads/library/2012/2012_Horwitz_01.pdf.
Matthews, B., Davis, M.C., Tancreto, J.G., Zelenak, M.F., and Ruiter, M. (2012). 2011 American Community Survey Internet Tests: Results from Second Test in November 2011. Available at: http://www.census.gov/acs/www/Downloads/library/2012/2012_Matthews_01.pdf.
Peytchev, A. (2009). Survey Breakoffs. Public Opinion Quarterly, 73(1).
Tancreto, J.G., Zelenak, M.F., Davis, M., Ruiter, M., and Matthews, B. (2012). 2011 American Community Survey Internet Tests: Results from the First Test in April 2011. Available at: http://www.census.gov/acs/www/Downloads/library/2012/2012_Tancreto_01.pdf.