
Generic Clearance for Internet Nonprobability Panel Pretesting

May 2014 Test Plan

OMB: 0607-0978


The Census Bureau plans to conduct new research under the Generic Clearance for Internet Nonprobability Panel Pretesting (OMB number 0607-0978). We plan to conduct testing using an online questionnaire to gather information about email subject lines, building on the research started under the Generic Clearance for Pretesting. This “May” test using a nonprobability panel is part of the iterative testing strategy that we plan to use to supplement the 2020 Census research program. The nonprobability panel was successfully pilot tested in January 2014, and a second test was conducted in March 2014.


The goal of the iterative testing is to determine the best email subject lines to maximize self-response for the 2020 Census. Email subject lines that fail to generate significant click-through rates in the nonprobability panel are not likely to perform better in a probability panel. Eliminating poorly performing email subject lines before larger-scale testing is the purpose of the nonprobability testing we propose here.


We will send survey invitation emails to a sample of people who opted in to participate in Census Bureau research studies through the Census Bureau’s email subscription website, run by GovDelivery. There was no incentive to sign up, and there is no incentive (other than a copy of the research report) to participate. Currently, the panel contains over 10,000 email addresses. Because of its opt-in nature, this panel is considered a nonprobability panel. Based on the two tests conducted thus far, we expect no more than a 35 percent open rate for the emails and a 24 percent click-through rate to the survey. In this “May” test, we will again measure open rates and click-through rates for emails using different subject lines and formats.


There are twelve email panels in the current test (3 initial email subject line variations by 2 reminder email variations by 2 email formats). We are not testing different survey panels in this test. Each email panel consists of a sample of 125 email addresses, for a total sample of 1,500 email addresses. There are three initial emails whose only difference is the subject line. One initial email subject line in the May test is “Confidential 2014 Census Survey,” which is very similar to the initial subject line “Confidential 2014 Census Study” that generated the greatest click-through rate in the March test. The second initial email subject line is “10-minute U.S. Census Survey to help your community.” This subject line tests whether including a number in the subject line improves the email open rate and click-through rate. The third initial email subject line removes the number, using only “U.S. Census Survey to help your community.” With the sample size we have chosen, we will be able to detect a difference in the email open rate of 8 percent or higher between the three initial subject line panels. Because one of the subject lines includes “10-minute,” referring to the burden estimate for the 2020 Census, we will change the content of the emails to reference a 10-minute survey instead of a 5-minute survey, even though the average length of the survey is only 5 minutes.
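
For concreteness, the twelve-cell crossing and its sample allocation can be sketched in a few lines of Python (a checking aid, not part of the plan itself; the reminder and format labels are shorthand for the variations described in the next two paragraphs, not literal subject lines):

    from itertools import product

    # The three initial subject lines, quoted from the plan.
    subject_lines = [
        "Confidential 2014 Census Survey",
        "10-minute U.S. Census Survey to help your community",
        "U.S. Census Survey to help your community",
    ]
    reminders = ["due date in reminder subjects", "no due date"]  # shorthand labels
    formats = ["plain", "fancy"]

    panels = list(product(subject_lines, reminders, formats))
    assert len(panels) == 12             # 3 x 2 x 2 experimental cells
    assert len(panels) * 125 == 1_500    # 125 email addresses per cell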


The reminder emails also differ: for each initial email, half the sample will receive a reminder email and final reminder email with a due date in the subject line: “Reminder: Complete the U.S. Census Survey by May 23” and “Due May 23: Final reminder for the U.S. Census Survey.” The other half will receive reminder email subject lines without the due date: “Reminder Complete the U.S. Census Survey” and “Final reminder for the U.S. Census Survey.” With the sample size we have chosen, we will be able to detect a difference in the email open rate of 7 percent or higher between the two reminder email panels.


The format of the emails also differs: within each initial email and reminder panel, half the sample will receive the “standard” or “plain” format used in the January 2014 pilot test and the March test. The other half will receive a “fancy” format, defined as using colors and graphics but the same content. The March study also tested a “plain” versus a “fancy” format, and overall the “plain” format generated more click-throughs to the survey than the “fancy” format. We are rerunning this experiment with a slightly different “fancy” format in order to confirm or refute the March results. With the sample size we have chosen, we will be able to detect a difference in the email open rate of 7 percent or higher between the two email format panels.
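
The plan does not state the significance level or statistical power behind the detectable-difference figures above. A minimal check, assuming a two-sided test at the 0.05 level with 80 percent power and the 35 percent expected open rate cited earlier (these test parameters are assumptions, not figures from the plan), comes close to the 8 and 7 percent values:

    from math import sqrt

    def detectable_difference(n_per_group, p=0.35, z_alpha=1.96, z_power=0.84):
        """Minimum detectable difference between two proportions under the
        normal approximation (two-sided alpha = 0.05, power = 0.80)."""
        return (z_alpha + z_power) * sqrt(2 * p * (1 - p) / n_per_group)

    # Subject-line comparison: 1,500 / 3 = 500 addresses per panel.
    print(f"{detectable_difference(500):.3f}")  # ~0.084, about 8 percent
    # Reminder and format comparisons: 1,500 / 2 = 750 addresses per panel.
    print(f"{detectable_difference(750):.3f}")  # ~0.069, about 7 percent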


There is only one version of the online survey for the May test. This version was tested in January and March. It contains the same address layout as the 2014 Census Test, a production survey being fielded in June.


The sample will receive a maximum of three notification emails:

  • one of the initial emails on Monday, May 12,

  • a reminder email on Thursday, May 15 (if they have not yet clicked on the link to the survey), and

  • a final reminder email on Monday, May 19.


The survey will be closed on Friday, May 23 at midnight.


The survey that the emails link to collects address data. After the address screens, we ask opinion questions as part of a “debriefing.” The objective of these questions is to gather qualitative data to guide future iterations of this test, to gain a sense of how respondents want to be contacted about the census and how they want to answer it, and to learn what these highly motivated individuals think the census collects. These data will be shared with 2020 Census staff and the communications area of the Census Bureau. The answers to the demographic questions and the questions on new technologies will allow us to examine the characteristics of respondents. These questions are identical to those used in the March test, and we will collapse the results across the iterative tests for the final report.


The May test will be conducted from May 12 through May 23, 2014. Staff from the Center for Survey Measurement’s Human Factors and Usability Research Group will select the sample and send the emails through GovDelivery. The survey will be hosted on secure servers within the Census Bureau’s Application Services Division, which hosts all other secure online production surveys. The username needed to enter the survey will be the email address to which the invitation was sent (the same address used to sign up to participate in Census Bureau research studies). If a respondent starts the survey but does not complete it, that person will not be allowed to re-enter the site later. The emails and questions were tested in previous usability sessions under the Generic Clearance for Pretesting.


We estimate that respondents will spend 5 minutes on average completing the survey and approximately 5 minutes reading the emails. Thus, the total estimated respondent burden for this study is 1,500 × 10 minutes = 15,000 minutes, or approximately 250 hours, which assumes everyone reads the emails and answers the survey.


The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Elizabeth Nichols

Center for Survey Measurement

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-1724

[email protected]
