
APPENDIX I



2019 NSCG Mailout Strategy Experiment

Research Questions and Design Rationale


Background Information on Mailout Strategies

One of the most important design decisions of a survey is the contact strategy. It makes potential respondents aware of the survey and gives them a means to respond. Research has shown that the type, timing, and number of contacts (Dillman et al., 2000) as well as the level of personalization (Cook, Heath, Thompson, 2000) can influence a respondent’s decision to participate in a survey. Unfortunately, there is no definitive “best strategy” for survey designers to use. The strategy that will be most effective for a particular survey will depend on many factors, including the target population, the sponsor, and the mandatory status of the survey. We do, however, have guidelines that can help determine what features of different contacts tend to be successful in different modes. These guidelines can be used to create one or more contact strategies that can be tested and refined to efficiently maximize response rates within the survey’s cost constraints.


In mixed mode surveys like the NSCG, Dillman and his colleagues (2014) emphasize the importance of contacting sample cases in multiple modes, such as mail, email, and telephone. Using multiple modes increases the likelihood the message will be received. People often delete email messages from unknown sources and do not answer calls from unrecognized phone numbers. Nichols (2012) found that over half of the nonrespondents in the American Community Survey (ACS) either did not remember receiving mail contacts or did not open them. Contacting sample cases through multiple modes can make the survey more familiar, increasing the chances one of the contacts will be received, while also legitimizing the survey.


Even when using multiple contact modes, it is valuable to consider the guidelines for each mode. In many cases, these guidelines overlap. For example, in all modes it is essential that contacts are unique and present only the pertinent information. In their 2009 book, Dillman and his colleagues state, “Perhaps more important than whether three, four, or five contacts are used is that each communication differs from the previous one and conveys a sense of appropriate renewal of an effort to communicate.” This includes changing the look and feel of the materials as well as the message across contacts, as sending the same message repeatedly is unlikely to make someone change their mind about responding (Dillman et al., 2014). Each contact should emphasize what needs to be done and why it is important. Different reasons for participation are salient to different people, so it is important to identify what those reasons are and mention them in different messages.


In 2014, the ACS worked with a contractor to evaluate all of its mailing materials using focus groups, cognitive interviews, and visual testing (Walker, 2015). The main finding from this research was that visual devices, such as callout boxes and bolding, should be used to highlight important, actionable information, like a URL or a deadline; otherwise, that information is more likely to be missed. Similar to what Dillman and his colleagues found, the ACS participants also indicated they wanted to know and understand how responding benefits them and their communities (Dillman et al., 2009) and said that if messages or mailing packages are repetitive, contain too much information, or appear too cluttered, they are more likely to disregard them (Dillman et al., 2014). While this research focused on mail contacts, the general principles apply to all modes. Contacts should be non-repetitive, inform sample cases how to respond and why they should respond, and emphasize pertinent information.


In other cases, however, the guidelines are more mode-specific. For mail surveys, Dillman and his colleagues (2000) originally recommended a five-contact mailing strategy: prenotice letter, questionnaire, postcard, replacement questionnaire, and final reminder. Other surveys, like the NSCG and ACS, have taken this model and tweaked it to accommodate a web push strategy by replacing the first questionnaire with a request to complete the survey online. More recently, however, Dillman and his colleagues (2014) have questioned the overall effectiveness of the prenotice letter, as it contains no call to action, and have recommended replacing that contact with an additional reminder later in the contact strategy. Both the 2012 National Census Test and the ACS experimented with this new strategy in the presence of an internet push approach. The 2012 National Census Test found that sending an additional reminder in place of a prenotice letter resulted in a slightly lower internet response rate (0.9 percent), but the difference was not statistically significant. The ACS saw a small but significant negative effect of 1.7 percent on internet response from removing the prenotification. However, after the data collection activities across all modes (paper, phone, and personal visit) were completed, there was no significant difference in the overall response rate. Both programs opted to eliminate the prenotification because of the cost savings (Murphy and Roberts, forthcoming; U.S. Census Bureau, 2014). Dillman and his colleagues (2014) do note that prenotifications are particularly important in surveys that are contracted to another organization for data collection, to explain the connection to the sponsor. While this may be applicable for the new portion of the NSCG sample, the returning sample is already familiar with the sponsor and the survey, so eliminating the prenotice letter may have no effect.


While some surveys have modified the mail survey guidelines to include offering an internet option, many internet surveys prefer to invite sample cases via email instead of mail. Email invitations allow respondents to click a hyperlink that takes them directly to the survey and, in some cases, contain a user ID for an easier login process. The guidelines for email notifications are very similar to those for mail surveys (Dillman et al., 2014). Many survey administrators like to use email contacts because they are very cheap, but they are not particularly effective compared to mail invitations (Kaplowitz et al., 2004; Shih and Fan, 2008; Leece et al., 2004). For example, Kaplowitz and his colleagues (2004), comparing mail contacts for a paper survey to email contacts for a web survey, obtained a response rate of 20.7 percent when contacting students via email and 31.5 percent when using mail contacts. However, when Kaplowitz added a mail postcard to the email treatment, the response rate increased to 29.7 percent, which was not statistically different from the mail-only group. Similarly, Millar and Dillman (2011) compared a strategy with four postal mailings (two letters and two reminders) to one with two postal letters and two email reminders and found that the augmented strategy yielded a 4.3 percent increase in response. This research shows a benefit to using multiple contact modes to obtain a response; the key is to identify the optimal number and timing of contacts in each mode.


In the 2017 NSCG, the National Center for Science and Engineering Statistics (NCSES) within the National Science Foundation (NSF) experimented with a new mailing strategy that incorporated best practices from the body of literature. This included varying both the look and feel of each mail contact and sending emails at the same time as mail correspondence in order to provide legitimacy and increase the chances that contact was made with the sample case. The 2017 experiment, in a general sense, aimed to strike a balance between response and burden by determining the best way to contact sample cases, including contact mode, number of attempts, and message content.


However, there was only enough sample available to test one overall strategy without any variations. Decisions were made using all the information available, but it was not always possible to know the best way to present information when no research was available. Therefore, it is possible that some untested variations of the 2017 strategy may have been more successful. The 2019 experiment seeks to answer questions from the 2017 design that currently have minimal research associated with them in order to create a more informed final mailout strategy.


Proposed Mailout Strategies for the 2019 NSCG Mailout Strategy Experiment

The 2019 NSCG mailout strategy experiment proposes to test the following contact strategies to measure their effectiveness within the NSCG data collection effort:

  • Varying the location of the participation deadline date on the mailing materials; and

  • Varying when during data collection the participation deadline is provided.


The sections below discuss the rationale for including these contact strategies as part of the experiment.


Revised Mailout Strategy and New Survey Contact Materials

This section describes how we came to our decisions regarding the location of the due date and the timing of the participation deadline.


Dates

In the past, the NSCG requested that sample cases respond to the survey “within two weeks” of the initial letter mailout date; this request was made within the invitation letter. Then, in week 8, the questionnaire package requested a response “within two weeks” on the outside of the envelope. In the 2017 Contact Strategy Experiment, new letters were developed that included specific deadlines in the week 1 and week 2 mailouts but maintained the “respond within two weeks” request on the outside of the questionnaire packet. The overall new contact strategy was successful and will be used in 2019. Thus, the first two mailings in 2019 will include a deadline date. In 2017, the inclusion of a date was one of several changes that were analyzed jointly, not independently. The 2019 design gives us the opportunity to isolate the effect of including a deadline and to answer two questions: 1) Is an actual date (e.g., “February 22, 2019”) more effective than the generic phrase “respond within two weeks”? and 2) Is it more effective to place the deadline on the envelope or in the letter?


Participation Deadline Timing

The literature regarding deadlines is quite mixed. Some research suggests a deadline can either help overall response (Stokes, Reiser, Bentley, Hill, and Meier, 2011; Martin, 2009) or lead to faster response (Bouffard, Brady, and Stapleton, 2004), while other research found no impact (Dillman, 1991). The inconsistency in the literature may be a result of the unique conditions of each survey. For example, Brassell, Dayton, ZuWallack, Seligson, Immerwahr, and Lim (2018) found that a deadline included in the invitation letter had no impact on response. However, the deadline was one month from when sample cases received the invitation, and the researchers believe that had the deadline been closer to the mailout date, the results might have been different. Feedback from focus groups for the 2015 NSCG Contact Strategy Analysis supported this hypothesis, suggesting that a deadline should be no more than two weeks from the mailing (Morales, 2016).


This tight deadline can be problematic for surveys with lengthy data collection periods, such as the NSCG. In focus groups, respondents say they prefer dates to general statements like “respond within two weeks” or “respond as soon as possible” because a date seems more urgent (Kerwin and Moses, 2006; Morales, 2016). This feedback was tested as part of the 2017 contact strategy experiment, where a specific deadline (two weeks from the initial mailout) was provided in the first two contacts. While the experimental treatment was successful overall, we were unable to isolate the effect of the date because so many other components of the strategy were changed at the same time. Additionally, a delay in the second mailing resulted in many sample cases receiving the contact after the stated deadline, which depressed response for that week. One issue with using a specific date is that respondents may think they do not need to respond once that deadline has passed. To avoid this issue, Dillman (2014) recommends reserving a deadline or “approaching soon” language for the final mailing. The NSCG currently uses the language “last chance” for its final mailing, and that mailing only increases the overall response rate by about one percentage point (Newman, Misra, and Horwitz, forthcoming). Among people who received the last chance mailing in 2017, approximately 11.55 percent completed the survey by closeout. It is possible that the data collection period is so long that there simply are not many cases willing to respond that have not already done so.


Deadline Location

While there is limited research on the best way to present deadlines to potential respondents, even less is known about where the information should be placed. In the research cited above, the deadline was typically placed within the letter. However, results from the 2015 NSCG focus groups suggest that many people do not read letters and that many of those who do only skim them (Morales, 2016). Therefore, it is possible that many cases never see or attend to deadlines contained within invitation letters. An alternative is to place the deadline on the outside of the envelope, as the NSCG does for the week 8 questionnaire mailing. The expectation is that more people would see the deadline, but it may come across as too aggressive.


By investigating the combination of how and where the deadline is presented, we can determine whether there is an optimal combination for the NSCG’s unique sample and data collection cycle.


2019 NSCG Mailout Strategy Experimental Groups

The examination of these different contact strategies creates a total of four experimental groups that will be included in the experiment. The NSCG mailout strategy uses a deadline within the letters for the week 1 and 2 mailings. The deadline location experiment will test placing the deadline on the envelope in addition to inside the letter. Alternatively, the deadline timing experiment will test the impact of providing the deadline at week 12 instead of at the beginning of data collection (weeks 1 and 2). When these experiments are combined, some cases will receive a deadline at weeks 1 and 2 with the deadline appearing only in the letter while others will receive the deadline both on the envelope and within the letter. Another group of cases will not receive the deadline at the beginning of data collection but will receive it in the week 12 reminder mailing either within the letter or on the envelope and within the letter. Cases that do not receive a deadline until week 12 will receive contact materials prior to week 12 that indicate they should “respond within two weeks.” Table I.1 provides a description of the experimental design.


Table I.1. 2019 NSCG Mailout Strategy Experimental Groups

Deadline Location    | Deadline Timing                  | Treatment
---------------------|----------------------------------|----------------------
Within Letter        | Initial Mailout (Weeks 1 and 2)  | LI (Letter/Initial)*
Within Letter        | Late Mailout (Week 12)           | LL (Letter/Late)
Envelope and Letter  | Initial Mailout (Weeks 1 and 2)  | EI (Envelope/Initial)
Envelope and Letter  | Late Mailout (Week 12)           | EL (Envelope/Late)

*Control Group
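To make the resulting four-group design concrete, the sketch below shows one way sample cases could be randomly allocated to the treatments in Table I.1. The case identifiers, seed, and function name are illustrative assumptions and do not reflect the actual NSCG sample assignment procedure.

```python
import random

# Hypothetical illustration of assigning sample cases to the four
# 2 x 2 treatment groups in Table I.1 (not the actual NSCG procedure).
TREATMENTS = [
    ("LI", "Within Letter", "Initial Mailout (Weeks 1 and 2)"),   # control group
    ("LL", "Within Letter", "Late Mailout (Week 12)"),
    ("EI", "Envelope and Letter", "Initial Mailout (Weeks 1 and 2)"),
    ("EL", "Envelope and Letter", "Late Mailout (Week 12)"),
]

def assign_groups(case_ids, seed=2019):
    """Randomly split case IDs into four (nearly) equal-sized treatment groups."""
    rng = random.Random(seed)
    shuffled = list(case_ids)
    rng.shuffle(shuffled)
    groups = {code: [] for code, _, _ in TREATMENTS}
    for i, case_id in enumerate(shuffled):
        code = TREATMENTS[i % len(TREATMENTS)][0]
        groups[code].append(case_id)
    return groups

# Example usage with placeholder case identifiers.
if __name__ == "__main__":
    cases = [f"case_{n:05d}" for n in range(1, 101)]
    for code, members in assign_groups(cases).items():
        print(code, len(members))
```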



Research Questions for the 2019 NSCG Mailout Strategy Experiment

The experimental groups allow for the evaluation of the following research questions.

  1. Relative to the control group, does placing the deadline on the outside of the envelope result in (LI vs EI):

    1. Higher response rates?

    2. Higher R-indicators (see the note following these research questions)?

    3. Lower costs?

    4. Any impact on key estimates?


  2. Relative to the control group, does varying the participation deadline timing result in (LI vs LL):

    1. Higher response rates?

    2. Higher R-indicators?

    3. Lower costs?

    4. Any impact on key estimates?


  3. Relative to the control group, does the combination of the deadline location and the participation deadline timing result in (LI vs EL):

    1. Higher response rates?

    2. Higher R-indicators?

    3. Lower costs?

    4. Any impact on key estimates?
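
For reference, the R-indicator named in these research questions is a measure of how representative the responding cases are of the full sample. A common formulation, written here in our own notation rather than taken from NSCG documentation, is sketched below; values closer to 1 indicate response that is more balanced across the sample.

```latex
% Common formulation of the representativeness (R-) indicator.
% \hat{\rho}_i is the estimated response propensity of sample case i,
% \bar{\rho} is the mean of those propensities, and n is the number of sample cases.
R(\hat{\rho}) = 1 - 2\,S(\hat{\rho}),
\qquad
S(\hat{\rho}) = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(\hat{\rho}_i - \bar{\rho}\right)^{2}}
```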


References:

Bentley, M., Hill, J., Reiser, C., Stokes, S., & Meier, A. (2011). 2010 Census Quality Survey.

Bouffard, J., Brady, S., & Stapleton, C. (2004). 2003 National Census Test: Contact Strategy Analysis.

Brassell, T., Lim, S., Immerwahr, S., Seligson, A.L., Dayton, J., & ZuWallack, R. (2018). Finding the Right Ingredients: Mixing Incentives, Deadlines, and Different Mode Protocols to Improve Response Rates in ABS Designs. Presented at the American Association for Public Opinion Research Annual Conference, Denver, CO.

Dillman, D. A. (1991). The Design and Implementation of Mail Surveys. Annual Review of Sociology, 17, 225-249.

Dillman, D., Smyth, J., & Christian, L. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th Edition). New York: Wiley & Sons.

Horwitz, R., Newman, B., & Misra, J. (2018). Contact Strategy Experiment Results for the 2017 National Survey of College Graduates.

Horwitz, R., Newman, B., Reiser, C., & Tancreto, J. (2017). Contact Strategy Analysis for the 2015 National Survey of College Graduates.

Martin, E. (2009). Can a Deadline and Compressed Mailing Schedule Improve Mail Response in the Decennial Census? Public Opinion Quarterly, 73(2), 361-367.

Morales, G. D. (2016). Assessing the National Survey of College Graduates Mailing Materials: Focus Group Sessions.


