Improving Response Rate Document

Dillman Final report on Three Proposed ACS Marketing Strategies August 13 2014.docx

American Community Survey Methods Panel Tests

Improving Response Rate Document

OMB: 0607-0936

Document [docx]
Download: docx | pdf

Final Report from Don A. Dillman to Agnes Kee, U.S. Bureau of the Census, Washington D.C. August 13, 2014



Review of Proposed Materials for Improving response rates and

Survey awareness of the American Community Survey


By

Don A. Dillman

Pullman, Washington

[email protected]


The Request


I was asked to review three potential mail-out packages of materials the aims of which are to increase self- response rates and survey awareness for the American Community Survey.


I will begin this report by reviewing the current procedures now in use and commenting on the strengths and weaknesses of the current ACS mail-out procedures. These comments are based upon the available research literature, as well as my own experiences in designing and implementing mail requests to respond by mail and or the Internet.


As a frame of reference for the comments on current procedures I am attaching a copy of pages

298-313 from my 2000 book, “Internet and Mail Surveys: The Tailored Design Method” where I summarized results from several studies conducted with the aim of improving response rates to the 2000 Decennial Census (attached as Appendix A).


Summary of Decennial Census (Long and Short Form) Experiments


A study of why people did not respond to the 1990 Decennial Census, conducted in early 1991, revealed multiple reasons for why some households did not respond. Among the non-respondents were:


  • Some people who did not remembering receiving a response request.

  • Others who remembered receiving it, but did not open it.

  • Individuals who opened the request, but did not start to complete the census form.

  • Some people who started to complete it, but did not finish.

  • Others who finished filling it out, but made no effort to return it.

  • People who completed the census form and placed in the return envelope, but did not mail it back.


The major learning from this study was that the Census response process breaks down at different stages of receiving and responding to the questionnaire. These findings suggest that communicating with and persuading individuals to respond to a Census request is something that happens over time, rather than as the result of single focused effort on only one aspect of designing and implementing the data collection process.


During the early 1990’s experimental research was conducted to examine the impact of 16 different factors on response rates to the Decennial Census Long and Short forms. Only six factors were shown to have significant positive impacts on response rates (Dillman, Clark and Treat, 1994; Dillman, 2000). These factors and their estimated effects (based upon multiple tests) included:

  • Pre-notice letter (5-7%)

  • Reminder postcard (5-8%)

  • Replacement paper questionnaire (6-11%)

  • Respondent friendly questionnaire design (2-4%)

  • Prominent disclosure on the envelope that a U.S. Census for was enclosed and that response was mandatory (9-11%)


Among the tested issues that had no affect on response were these:

  • Prominent disclosure of benefits (as alternative to mandatory response)

  • Strong vs. standard confidentiality statement

  • Offering a choice of whether to respond by telephone or mail.

  • Additional letter reminder after the thank you postcard.

  • Color of questionnaire (blue vs. green).

  • Booklet vs. one-sheet form (short form test only).

  • Stapled vs. unstapled booklets

  • Inclusion of difficult to answer questions


Another set of issues we tested was use of a “marketing orientation” vs. “official government” emphasis in construction of the mail-out envelopes and census forms. These tests used bright yellow envelopes and questionnaires, which are shown in Appendix A. Test Version A, using a vertical form produced a response rate five percentage points lower than the official business form, while Test Version B that used an envelope and form in the same shape and size as the official business control form, reduced response rates by nine percentage points (Leslie, 1996, 1997). An extensive set of cognitive interviews (Dillman, Jenkins, Martin, DeMaio, 1996) provided considerable insight into the reasons for these declines, and is attached as Appendix B. The general reason for the decline is that the mailing materials seemed inconsistent with what people expected from government when conducting the Census. The graphical layouts and colors made the request appear like a marketing effort for a commercial product. Appendix A summarizes additional results from this test, and more detail is provided in the associated references.


This series of Census tests was run across six years, building implementation methods sequentially, i.e. results from one test were used as controls and/or components of subsequent tests (See Appendix 3, Dillman, Clark and Treat, 1994) Multiple sets of cognitive interviews were used to examine effects of various aspects of implementation. Correspondence for all tests included dated letters addressed to specific addresses, on Census stationery.


An insert or brochure with detailed instructions used in 1990 was eliminated from all tests based upon early evaluations of their lack of use and helpfulness in completing the Census forms. In developing the multiple communications that supported the redesign of Census forms, considerable emphasis was placed on the timing and connectivity, or mutual support across elements in order to remind and support people through the multiple steps of finding, opening, completing and returning process.


The results of these tests are quite consistent with existing literature. The importance of multiple contacts was well established in the survey literature prior to that time and these experiments simply confirmed the importance of using multiple contacts and replacement questionnaire (Dillman, 1978). The main new insights obtained from these experiments were the positive effect of the prominent disclosure of response being mandatory and the negative effects of a marketing approach that made the census forms appear not to be associated with government business. I will return to the relevance of these findings later in this report.


What is different about trying to obtain responses over the Internet?


Research on getting people to respond to questionnaires over the Internet has evolved during the past decade. Most of this research has been conducted outside of a Census context. Several findings from this research are relevant to designing data collection that encourages people to respond over the Internet, but provides other alternatives for people unable or unwilling to respond in that way. A summary of some of the major findings relevant to joint internet and mail data collection follows:


  1. Offering people a choice of responding by internet vs. mail in an initial contact results in most people (70-80% of respondents) completing and returning the mail option. The likely explanation is that the mail option takes less work because of the questionnaire being immediately available, and the recipient of the response request not has to switch from opening mail to going to a computer and manually entering a url and password (Smyth, Dillman, Christian and O’Neill, 2010).


  1. Offering a choice of response mode lowers the overall response rate. This effect has been observed in multiple studies, including the American Community Survey by Griffin, Fischer and Morgan (2001), a media diary survey (Gentry and Good (2008), and household surveys of the general public (Messer and Dillman, 2011).A meta-analysis showed for 19 out of 20 tests that response was lower when choice was offered. This negative effect of choice may be explained in part by the “Paradox of Choice” described by Barry Schwartz (2004). He argued that providing a choice makes deciding what to do more difficult, and encouraged thoughts of not choosing either of the available choices. A second problem is that the complexity of deciding may result in a “delay” in deciding, and once the task is set aside, fewer people feel compelled to return to and complete it. In the Decennial Census the greatest response occurs after the census questionnaire is received, and rapidly declines, unless follow-up reminders are used.


  1. Offering only an internet response option, while withholding a mail response option can successfully drive significant portions of response to the internet. However, it produced a lower response rate than a mail-only data collection process in ten experimental comparisons (Smyth, Dillman, Christian and O’Neill, 2010; Messer and Dillman 2011; Edwards, Dillman and Smyth, Forthcoming; Dillman, Smyth and Christian, 2014). These tests were conducted using university sponsored surveys of address-based samples of household populations in various U.S. states, on a variety of topics, using 12 page questionnaires with 80-140 items conducted from 2007-2012. Specific results were as follows:

    1. Postal-only treatments produce a mean response rate of 53% (range 38%-71%).

    2. Push-to-web, withholding mail until final contact, produced a mean response 43% (range 31% -55%).

    3. The push-to-web approach resulted in 62% of all responses being received over the web.

    4. See Dillman, Smyth and Christian (August, 2014) for details on all of these experiments.

A potential explanation for the lower overall response when pushing to the web is that some people are unable or unwilling to respond by web and dismiss the questionnaire as something they can or should complete. While 85% of individuals report they use the internet, only 75% have access from their homes, and only about 70% have high-speed access.


  1. Whereas following a request to respond by web with a later request to respond by mail improves response rates significantly, the reverse procedure of following a postal only request to respond over the web does not significantly increase response rates (Smyth, et al. 2010; Messer and Dillman, 2011). The explanation for this difference is probably that an earnest request for a mail response encourages types of respondents who would respond to web, to do so by mail because that option is available.


  1. A mail follow-up to a web request brings in quite different respondents, demographically, than responded to the web-only request. The mail responses tend to be less-educated, older, have lower incomes, and are more likely to live alone (Smyth, et al. 2010; Messer and Dillman, 2011).


Applying these findings to the American Community Survey should be done with appropriate caution. Overall these studies suggest that reasonable response rates can be obtained outside a Census context in which household responses are not required by a federal law. Government surveys, even when they are not mandatory, obtain high response rates than university or private sectors surveys because they have greater legitimacy. That has been the case for decades, and there are no recent data that bring that finding into question.


To achieve these response rates for the 10 studies, summarized in Dillman, Smyth and Christian, 2014), token cash incentives (up to $5) were sent with the initial survey request. Use of these incentives was especially helpful for increasing the proportion of web responses as revealed experimentally. Including $5 with the request, as “a small token of appreciation” increased web response to 31% compared to only 13% when it was not used in a statewide test in Washington State (Messer and Dillman, 2011). The mandatory response requirement may be achieving for the ACS what the token cash incentive achieves for non-government surveys.


Evaluating the Current ACS Data collection Procedures


I have been asked to evaluate the proposed revision in ACS procedures against current procedures. After reviewing your current procedures I do not think they are optimal for maximizing ACS response through self-administration by mail and web. Consequently, I am going to comment on ways that I think current ACS procedures can be improved, and use that as a referent point for discussing the proposed new procedures later in this report.


I have read three papers by Stephanie Baumgardner, one of which was coauthored by Deborah Griffin and David Raglin (2014). They provide useful information on how the change from mail to a push to web data collection strategy has affected response. The papers do not provide as much detail as I would like, or could likely learn through conversations with the authors. However, my general impression is that benefit is being realized from pushing to the Internet and following up by mail using the current methods. The papers do not have information on low income vs. high income households or minority populations. Also, I cannot tell whether blank responses are included in the returns (I remember that they were in the Decennial Census initial numbers) and I am having difficulty seeing how much response is coming from the telephone TQA. I also do not have a sense of the effectiveness of current procedures for obtaining responses from minority households and those where English is a second language.


That said, I sense you are doing reasonably well with the new data collection procedures even though there are some things that I think are less effective than they could be, which I will discuss in relation to evaluating the Reingold proposed forms.


It’s my understanding from the reports that in the early 2013 tests, about 36% were responding over the internet and a total of 62% by Internet plus mail. If I understand the table correctly this means about 58% of the self-administered responses were coming in over the Internet. This is only slightly lower that the proportion of internet responders obtained in the non-census tests I have conducted (Dillman, Smyth and Christian, 2014). However, I think it may be possible to push the ACS response somewhat higher because of the power of the “mandatory” response expectation, thus reducing the extent of follow-up needed by telephone and in-person interviews.


I will organize discussion of current ACS procedures around each of the six mail contacts now being used and concerns I have about their current designs. As a suggestion, I think you would find it helpful to read this section with copies of the current ACS mail-outs in hand inasmuch as I’m going to go into some detail on each of the mailings.



Current Mail-out # 1: Pre-notice


The pre-notice letter was considered essential for the Decennial Census in 2000 because only three mail contacts could be made prior to beginning the enumerator phase of data collection in order to complete data collection by the scheduled time for reporting the Census count to the President of the United States. Experimentally, we showed that the pre-notice had a significant influence on response. A factorial design including 50,000 households showed that it improved response by 5-7 percentage points and in combination with the use of a postcard reminder improved response by about 13 percent (Dillman, Clark and Treat, 1994). This finding was instrumental in developing the Census implementation strategy for 2000 and 2010.


Previous research outside the Census Bureau has also shown that pre-notices are important. However, the total number of contacts has also been showed to be important for improving response. Consequently, in the 1990’s, outside of Census, I conducted several experiments testing whether a pre-notice added uniquely to response rate. I found that comparing use of a pre-notice with sending a final notice to another treatment group as an alternative produced no significant differences; it is the total number of contacts that appears to be most important. The main reason I continued to use the pre-notice in my own work (e.g. see Dillman, Smyth and Christian, 2009), was that when I used incentives and thought it likely that some people would not open the mailing with the incentive and paper questionnaire. Thus, I used the pre-notice to help getting the mailing containing the questionnaire opened so that the incentive could have an impact.


When I switched to a push-to-internet approach and initially withholding the mail questionnaire, I concluded that the pre-notice was less important as a stand-alone entity. (I could substitute another contact later in the data collection process to get the same effect). Also, if someone is being told they need to respond to an internet survey, it’s easy to provide the url in that initial mailing. Also, based upon informal cognitive testing, I came to the conclusion that it seemed strange to recipients to be asked to respond by the web, and not be shown how to do it at the same time. In essence, using a pre-notice without a URL made the respondent work harder to respond!


When I considered your current pre-notice I did not get the impression that it is very effective for improving response in ways that will build a compelling argument for responding to that later request for responding over the Internet. No mention is made of the response being mandatory. The letter is not from a person, which makes it an obvious sign of being unimportant. In addition the letter is not dated (another sign of unimportance). Although the letter has an inside address, it seems strange that the line that follows it reads, “A message from the Director, U.S. Census Bureau.” Culturally, that signals that the message is unimportant. Finally, there is no signature at the bottom of the letter.


This is not the way that important mail is sent to other people in the U.S. The letter has all of the trappings of being an unimportant mass mailing, and I think many people will immediately ignore it. Sending people to the website doesn’t seem particularly useful, since the recipient is asked to go there with no particular purpose in mind.


The enclosed brochure places English on the inside where it is not likely to be see. If one opens the brochure and turns to the back side one finds English and the fact that a response is required by law. This important message is completely hidden from view.


Ordinarily, a message as important as this would be mentioned in the letter, itself. In short, the communication style defines whatever is about to come in the mail as decidedly unimportant. I doubt that this pre-notice decreases response (as opposed to sending no prior contact), but I do not think it increases it at all, especially in ways that could not be accomplished more effectively by other means.


In collaboration with several others, I conducted a series of experiments on the effects of personalization on survey response (outside of Census). These results show that personalization of correspondence improved response for household surveys a few percentage points, but did not improve response for special interest group surveys (Dillman, et al. 2007). For household surveys of the general public, which includes the Decennial Census, it appears to remain important.


The letter in its present form needs to be changed!


Current Mail-out #2: Internet Invitation


The follow-up invitation has several desirable qualities. It appears quite official, with the government return address, and penalty for private use indicator. The envelope is also larger so it will stand out in the mail box (recent research shows that larger survey envelopes are more likely to be opened, e.g. Tarnai, et al. (2012) and Dillman, Smyth and Christian 2014).


The mandatory response is an important disclosure. However, I see that the term, “American Community Survey” is used in that box. I understand the desire to “brand” the ACS. However, I would be inclined to keep the box the same as the one used for the 2000 Census, i.e. “U.S. CENSUS FORM ENCLOSED; YOUR RESPONSE IS REQUIRED BY LAW”. It’s the Census Bureau that has the perceived legitimacy to make it appropriate as a reason for requiring a response. Using the somewhat colloquial phrase, “American” in the name of the survey, makes it seem somewhat less official. I will return to this issue again later.


I was both surprised and disappointed to see the enclosed materials. The essential page is the one with numbers that must be entered when logging into the computer. The recipient is only informed generally about the need for this card at the bottom of this card where it is unlikely to be seen last if at all.


Ordinarily, the most critical information for recipients should be placed in the letter, but in this case it is not. The letter is obviously a “form” letter, and not the kind of mailing that is likely to attract attention or be carefully read. In essence you have divided the information across three separate inserts and done so in a way that makes it challenging to figure out what to do.


The letter tells where to go to respond, but then one has to go to another place (the card) to learn what number has to be used for the log in. The sentence in the letter that conveys the mandatory expectation “hides” the requirement to respond behind the sentence that the person has not personally been chosen to respond. It’s a strange sequence of words. I also noticed that the Spanish language information does not include the log in numbers so that the respondent has to turn over to the English side for that. If a person who decides to fill out the ACS on line doesn’t keep all three parts of this mailing together, I suspect that will make it more difficult to complete the form. The current procedures needlessly make it hard to figure out what the recipient should do.


Mass mailing techniques appear to have won out over targeting a particular household with an easy to follow request to respond. When people have to combine information from four different locations (three inserts plus the web site) to figure out what to do, I worry about how many people will be willing to do it, and how unhappy they may become as a result.


I suspect that the reason for this strange structuring of information has to do with cost and quality control of the mailing process. The large card fits perfectly inside the envelope and the address on the card fits perfectly. If the address line were on the cover letter, that could produce a quality control problem of having to get two inserts matched in the envelope stuffing process.


This is not the place to go into detail about reorganizing the material into a more user-friendly format, but I think that needs to be done. In some ways the desire for branding with ACS appears to have prevented using the top of the card to explain what to do and how to do it.


Current Mail-out # 3: The Reminder Postcard


Reminders are essential for improving response rates, and I was pleased to see that this follow-up is being used. However, I see that there is no mention of the need to use the numbers assigned to the household for the log in. I assume that the website directs people to use the numbers on the front side of the post card. That is desirable, but it would be more desirable to explain the connection that needs to be made in the mail out so it would be easier to relate what a person must do when he/she enter the web site.


I also see that this third communication starts: “A message from the Director, U.S. Census Bureau”. The message is about the same as that included in the mailing sent a week earlier, and the name of the Director is omitted. I also see there is no mention of needing to use the numbers on the other side of the postcard to log in. I suspect the vagueness is intentional on this postcard because of Title 13 and not wanting casual readers (including postal workers or other carriers of the mail) to see personalized information.


I’m always concerned when I see messages that “look” the same being sent one after another. Doing that makes people feel they have already seen the information, so “why should I read the same thing again?” However, on the positive side, the pre-notice, request for response and postcard are physically different, and that’s desirable.


Rethinking the Current First Three Mail-outs


I suggest that you eliminate the pre-notice (because of now using a push-to-web-approach) and integrate that contact with what is currently the second mailing. Then, I would send the third mailing inside an envelope—not as a postcard—using the same size as the current pre-notice envelope.


I would redesign the current second mailing in a significant way. I would integrate the content of the first and second mailings and probably use a folded card with the address printed in its current position, where I would provide complete instructions on use of the url and id numbers, getting them into a more logical order. I would also rework the cover letter to emphasize why this survey is being done, the importance of doing it online, etc. However, I do not think that I would include the url in the letter (but perhaps may think differently on this later). I would rely on the inserted card. If you make my proposed changes you will now have a lot more space to give the completion instructions. I might also be inclined to eliminate the brochure, getting the key points onto the front or back of the envelope.


Then, when the one week follow-up is done, I would repeat the mandatory notice on the envelope, and because I am protecting the identity information inside an envelope I would repeat the URL and code instructions. I would also date the follow-up letter. There are several things that can be done to integrate the two mailings better than is now done.


In essence, I’m willing to lose what little impetus I think you get from the initial pre-notice. However, you should recover it, by shifting the envelope to the third letter and including the mandatory notice). I think this change would result in cost savings as well as provide a stronger request to respond.


Current Mail-out # 4: The Paper Questionnaire


This mail out emphasizes that respondents are free to choose which of two options to use in responding, thus adding complexity to the response task. Such complexity is at the base of what Schwartz (2004) describes in the “Paradox of Choice” as leading people to choose none of the offered choices, which in this case means becoming a non-respondent.


In addition this mailing includes six enclosures:

1 Paper questionnaire

2 A card, similar in style to the one sent previously that states there are two options for responding.

3 An undated and separate cover letter that emphasizes in black type two options for responding and begins as all other messages have in the first three mail-outs, “A message from the Director, U.S. Census Bureau”.

4 A sixteen page booklet that explains how to complete the Census form and provides information on why The Census Bureau asks certain questions.

5 The same frequently asked question brochure as include in a previous mailing.

6 A return envelope for the paper questionnaire.


Including so many enclosures makes the response tasks appear very difficult, thus adding to the complexity of choice. The complexity of having so many components in this mail-out, plus the problem of choosing which mode to use in responding, work together to increase the likelihood of non-response.


Some of the information is redundant—information on there being two options (internet and mail) is repeated in the letter and on the card as well as the questionnaire. Information on the importance of the survey and why it is being conducted also appears in the brochure.


Other information is probably read or used by very few people, if any. The detailed instruction form is similar to one included in the 1990 Census, but was not used in 2000 because of the lack of evidence that it improved people’s responses to the form. Also, the current brochure is oriented towards only the paper form. Much of the information in it is not provided in the Internet version of the questionnaire. Trying to read this instruction form for any reason is quite tedious. Some of the information is provided in the questionnaire, e.g. “List the name of each person who lives at this address.” I doubt there is evidence that inclusion of this form improves the quality of either the paper or internet questionnaire answers. We had no such evidence in the 1990’s and I am not aware of any experimental test being done since then. I’m concerned that by trying to provide answers to every conceivable question that someone might consider asking, that you are focusing on just a few people, rather than the many others who decide that completing this form is just too hard to do. I think you are focusing on a very few at the expense of catering to the many.


In sum, this mailing probably decreases response to the ACS because of providing options without advice on which response mode to use and the reason, and includes communications that are to some extent redundant and unconnected. It also conveys an expectation that respondents must connect different pieces of paper in order to respond, and provides a great deal of information that is not read or otherwise used.


The Need to Revise Current Mail-out # 4


In order to improve response I suggest these changes be made:


First, the 16 page “Your Guide for the American Community Survey” should be eliminated from this mail out.


Second, I suggest that a signed letter come from the Director of the Census Bureau and that the content of the card be combined with the extra card so that the respondent gets a clear set of instructions on how to respond.


Third, I believe it would be helpful to change the nature of the request, and connect the letter from the Director to the response situation. At this point I would convey that you are sending a paper questionnaire for two reasons. The first reason is that it shows you recognize that for some people it is difficult to respond over the internet, and they may prefer mail. The second reason is that people, who can and will respond over the internet, may prefer seeing the kinds of questions that are included in the questionnaire, and perhaps this would be helpful in preparing to go on line to complete it. In short, I suggest getting away from two options and the “you decide” approach. Instead I would try hard to convey the idea you are trying to be helpful by providing another means of responding. And, your reason for doing this is relevant to both mail and internet responders. In essence, I am proposing a quite different communication style than you have used of mostly assembling independent parts.


I have observed another government survey in which there is a very strong preference for getting people to respond over the web. For that survey they sent a sample questionnaire to people in order to give them an idea of the questions, and it provided helpful in getting more people to respond over the web. It’s also clear that some of the ACS questions, particularly for the second and third person in a household, may be impossible for a respondent to answer. As the “respondent” proceeds through the internet questionnaire, they may find questions to which they simply don’t know the answer. Telling them they can go back into the questionnaire to answer blank questions is inconvenient and I have found it hard in my own research to convince people to go into internet questionnaires a second time. Mostly they just quit and won’t come back. I think you can turn this mailing into an integrated set of materials that in the end suggests that as an agency you are trying to be helpful to recipients, rather than just explain what you want them to do.


I also don’t think the paragraph on choosing someone’s address, not them, personally, followed by the reference to “you” are required to respond, will make sense to most recipients. Structurally, within the paragraph, it is also in a subordinated position that makes it less likely to be processed. Also, when information is repeated word-for- word in later letters those words lose their effectiveness.


I’m also afraid that at this point the mandatory nature of the request has lost its effectiveness. It’s obvious that mass mailings are being used—no dates on the letters, no actual letter, and only a very passive reference to the Director sending a message. The statement that the address, not the person, was chosen is likely to leave many people thinking they don’t really need to respond. You are in essence leaving conveyance of the mandatory message to the block on the envelope, where it appears like just so much more advertising associated with a marketing campaign. I hope you’ll reconsider how these elements are now combined.


It was mentioned to me by one of your staff that the stuffing capabilities for envelopes had been maxed out, i.e., no more inserts could be included. In a strange sort of way that suggests to me that this mail-out in particular is being approached as if bombarding people with additional messages will in some way help. Doing so is counterproductive. You have already exceeded the number of inserts that will be helpful, and the consequence has been that they are simply not integrated, nor do I believe they help encourage people to respond.


Current Mail-out #5: 2nd Postcard reminder


The message on this card is fairly strong, e.g. “Please complete the questionnaire and return it now….”


And in this case the mandatory requirement is emphasized, and the recipients are told that a Census Bureau interviewer may contact them. I think it is okay to do this. However, as worded it sounds like a threat. If it were me, I would try to build in a money-saving rationale that is consistent with your earlier request for an internet response. I understand the reason for saying an interviewer may come, i.e., sampling occurs at this stage. However, it also weakens your argument a bit. I would be inclined to rework the wording here.


Also, the start of the letter, “A message from the Director, U.S. Census Bureau,” is redundant with previous communications. Such redundancy gives the impression, “I’ve seen this letter before, so why should I continue reading?” The lack of a date also communicates that it’s a mass mailing. I also think the same argument about information for schools, hospitals, roads, etc. has been used in previous mailings. Redundancy is not our friend in convincing people to respond to surveys.


I also find it strange on this card that you have emphasized “Now is the time…” with bolding, but not put the online link in bold.


I suggested a revision in the third (now second) contact that would introduce a letter as a replacement for the earlier postcard, and for that reason I think I would stick with a postcard for this mailing, while trying to freshen the wording. In other words, a postcard at this point becomes a fresh stimulus if the prior postcard is changed to a letter.



Current Mail-out #6: A two-sided postcard (green)


This contact uses a larger card stock for the postcard and a different color. Thus it contrasts with all the previous mailings, even though the message and style are virtually the same. I don’t think that the green color is particularly helpful, but neither do I feel it hurts the likelihood of response.


Proposed Alternative to Current Mail-out #6


I am assuming the idea behind the additional postcard is to pick up a few more responses, but meantime the sampling for telephone interviews has occurred.


It’s not clear to me whether a card is sent to the households to be telephoned or whether that contact is made only by mail. I have not seen an exact description of the sampling procedure. If I am interpreting the April 30, 2014 memorandum from Stephanie Baumgardner correctly, around 40-45% of the households for which a telephone attempt was made responded with answers. I remember a presentation by Deborah Griffin at a National Academy of Sciences meeting that indicated about seven percent of ACS respondents came in over the telephone. I am not sure whether the calculations in the Baumgardner paper are limited to households for which a telephone number could actually be located or include all households for which a potential telephone number existed. It’s also not clear to me what the total yield is from the second postcard alone.


These uncertainties stated, let me simply raise a question. Have you considered sending another paper questionnaire to the households that are assigned to telephone in an effort to increase the telephone yield? I raise this issue based upon these considerations:


  1. The preference for responding over the telephone is low. Olson, Smyth and Wood (2010) have reported alternative follow-up tests with individuals who have previously stated a preference for telephone, and obtained higher response rates with a mail only approach.

  2. De Leeuw et al (2007) have shown in a meta-analysis that a prior contact by mail increases the likelihood that people will respond to a telephone call. These data are consistent over many decades of testing.

  3. You are already using the “threat” of a telephone call as a means of improving response rates by self-administration, although it is somewhat vague and mentions “maybe”.

  4. Also, one of the weaknesses in the data sets I have seen, and in my own research, is that when a combined internet-mail data collection effort is made, surveyors do not typically send a second mail questionnaire. Yet the literature is quite clear that sending a replacement questionnaire improves response rates.


I suggest that you consider sending a replacement mail questionnaire to the assigned telephone households indicating that you plan to call this address, and wanted them to be aware of that. You can also say that you are sending a replacement questionnaire because you have found that some people find it helpful to use it so the information is available when the call is made. And, then you can mention they could also respond by a particular date by mail. The Internet possibility could also be mentioned again.


For the postcard only group that will not be contacted (as I understand it) I would also suggest substituting the mailing of a final paper questionnaire for the current postcard. I would not mention the telephone call, since it won’t be made to this group (if I understand your procedures correctly), but based on the effectiveness of sending replacement copies of questionnaires that appears in the literature, I would consider that alternative.


Summary of recommendations for Improving Current Mail-out Procedures


I am pleased to see that you have been able to “hold” your self-administration response rate at about the same level (or perhaps a percentage point higher) by pushing respondents to the web so that 58% of self-administration responses are being returned over the Internet.


I also think that there are several things that might be done to gain further improvement of the self-administration response rate, and reduce costs for telephone and in-person follow-up. In general, they include:


  1. Eliminate one of the postal contacts (the pre-notice), while also strengthening the current second and third contacts by reworking several of the component parts.

  2. Develop greater connectivity across the component parts so that it takes respondents considerably less effort to understand the request.

  3. Reduce the complexity of the paper questionnaire mailing by eliminating some of the components completely, and connecting other parts together, as well as reducing the emphasis on options.

  4. Replace the third postcard reminder with a second mailing of the paper questionnaire targeted differently to the telephone-only and the households not selected for telephone follow-up.


In the paragraphs above I have provided detail on the research literature that underlies these recommendations and that led me to the conclusion that the current procedures are not optimal for improving self-administered response by internet and mail to the ACS.



Evaluation of Three Alternatives for Current ACS Data Collection Procedures


I have reviewed each of the proposed revisions, but did not do so until I had completed the above analysis of the current ACS procedures. It had been several years since I had looked at exactly what procedures were being used, and thought it would be useful to comment on existing procedures and their adequacy before discussing the proposed Reingold procedures.


Based upon the color of the American Community Survey words on the outgoing envelopes and my sense of the organizing purpose for content, I will refer to them as:


1) Gold—Community benefit

2) Blue—We the People or Patriotic

3) Green—Other language appeals


I begin here with some general comments that apply to all three of the alternatives, and then make some specific comments about each. It would be helpful if the reader of this report has all of the materials in hand when reviewing the comments that follow.


Many positive qualities


I am impressed with the quality of the work represented in each of the proposals. They have been carefully thought out. The letters and forms I received were nicely executed in the samples. I consider them professionally well done.


I also see that Reingold has questioned the value of the pre-notice and suggested dropping it. They have also recommended eliminating the massive number of enclosures with the paper questionnaire, and made a number of other recommendations that I think are quite positive, which I will discuss in this section. I also seek to ask and answer more general questions about the fundamental approach being used to improve response.


Will a marketing approach improve response to the ACS?


I mentioned earlier the Decennial Census experience of testing a marketing approach for the 2000 Census, which resulted in decreasing response rates by 5-9 percentage points (see Appendix A). It was the only tested element that produced a response decline of this magnitude. That test was particularly persuasive because it was conducted in the presence of the other five elements (pre-notice letter, reminder postcard, respondent-friendly questionnaire design, replacement questionnaire, and mandatory response announcement on the envelope). Thus, it undid a significant portion of the response improvement that had been achieved by other means (Dillman, 2000).


Individually conducted cognitive interviews (Appendix B) suggested strongly that the designs used in those tests made the envelopes and questionnaires appear like a marketing appeal from a private business rather than an official government mailing. In addition, the information that “Your response is required by law” used in those 1996 test forms has been significantly changed in current ACS mailings, i.e., “U.S. Census Form Enclosed” has been deleted. Placing the remainder of the tested messages in reverse print on both forms was (in my opinion) instrumental in making that information less accessible to people. Considerable research shows that people have difficulty moving back and forth between positive and reverse print (see Dillman, Smyth and Christian, 2014, for the reasons). The result of this change of presentation was that the information was less likely to be seen and read.


I would recommend that none of the three marketing versions produced by Reingold be adopted by the ACS staff in their present form. I am going to suggest running a test of a revised ACS procedure against revised Reingold alternatives, the nature of which I’ll describe in the paragraphs that follow.


I am raising this point here for two reasons. One reason is that I do not think focus groups have the methodological power to evaluate how people will respond to these forms in an actual ACS data collection procedure. Second, I recall a mistake that was nearly made at Census in the 1990s, when focus groups from throughout the U.S. were being used to test “Response is Required by Law: U.S. Census Form Enclosed” inscribed on the envelope in a prominent box with prominent type. The conclusion reached in these focus groups was that this mandatory statement on the envelope should not be used. Instead, these participants concluded that an alternative envelope label should be used: “It pays to be counted: U.S. Census Form Enclosed.” Focus groups constitute a situation in which some members influence other members; people were likely influenced by what they thought others in the group were already thinking (in other words, a “normative” effect), so they were less likely to say they would pay attention to the mandatory message and be influenced by it.


This research on the mandatory message is intricately related to the marketing test done in the 1990s, because it was precisely that information that was visually demoted in the marketing designs that were tested, and it was probably not seen by many respondents (a very common occurrence with reverse print—see Appendix A for elaboration). I also recall that the decision to use the marketing approach in 2000 had almost been made (one of the forms had been rolled out at the Department of Commerce) without an experimental test. Had we not done an experiment, I suspect the cognitive test results would have been ignored, despite the negative evaluations that we reported.


I have provided this background, because I do not think that I or other expert evaluators, using the usual group (focus) and individual (cognitive interview) testing methods can provide an adequate test of the Reingold forms. The reason is that a number of affective aspects of contacting people and asking for their help are associated with reactions to designs. I am also concerned about the aspect of asking people to provide sensitive information about themselves (e.g. income and sources of income).


Testing of a sequence of communications that includes an overtone of “being required” and collection of personal data is quite different than deciding to test a consumer product that costs relatively few dollars. The risk to a respondent of impulse buying such a product is relatively low. Deciding whether to respond to a questionnaire that asks for a great deal of personal information is a much more complicated request that may not have a low risk to the respondent. Thus, deciding to respond to the ACS requests goes far beyond being affected mostly by response to the addition of color and graphics.


Consequently, my approach to evaluating these three proposals is less one of judging their current adequacy than of focusing on changing certain features of them that I think will bring them closer to being consistent with available survey design research on what influences people to respond or not respond to surveys of this nature. In this regard I’ll return frequently to my earlier assessment of current ACS procedures.


Your Response is Required by Law


This issue is intricately connected to whether any of the marketing approaches will communicate to households that their response is required by U.S. Law.


Visually, three items dominate the three envelopes:


1) the United States Census Bureau Logo,

2) the American Community Survey logo, and

3) Your response is required by law.


On two of the envelopes the mandatory nature of the response request is in reverse print. Visual science research has shown that people do not navigate well from positive print to negative print. And even though the reverse print makes it eye-catching, it may not be seen if people start processing the positive print and don’t switch to the other type because it is in reverse print (see my 2009 and 2014 books with Smyth and Christian for the reasons and the visual science references that support this point).


None of these three dominant features conveys that this request for a survey response is official government business from the U.S. Census Bureau. If one goes to the fine print, they will see that the envelope locates the U.S. Census Bureau in Jeffersonville, IN (which most people have not heard of) rather than Washington, D.C. That, in combination with the three dominant visual features, may easily lead people to believe that the sender is only pretending that this is an official government survey.


I appreciate your interest in gaining name recognition for the American Community Survey, but I don’t think the envelope is the place to do it. It does not have the appeal that the U.S. Census Bureau and its activities (particularly the Decennial Census) have. The only thing on the envelope that conveys authenticity to some people is the penalty-for-private-use language.


I think the current ACS envelope is significantly more likely to be opened than any of the three marketing (proposed) envelopes. However, as discussed earlier, it could also be improved.


My suggestion to you would be to


  1. Remove the American Community Survey name from the envelope and from inside the mandatory response box.

  2. Replace this information in the mandatory box with “U.S. Census Form Enclosed; your response is required by law,” in order to bring it closer to the originally tested version that was experimentally evaluated (this applies to the Reingold forms as well as the current ACS envelopes).

  3. Remove the implication that the Census Bureau is located in Jeffersonville by inserting “U.S. Census Bureau Processing Office” (which I think is more likely to be understood by recipients). It’s hard for many people to imagine that the U.S. Census Bureau is located where the envelope says it is.

  4. Shift the US Census logo to the bottom of the envelope, where it is located on the control.


Making these changes will increase the likelihood that the marketing envelopes (and current ACS envelopes) will get opened, which is the first step to getting a response. I am raising these issues, because of what I call the “reductionist” problem. There is a tendency to reduce the question of what convinces people to respond to surveys to a very few individual attributes. I think it’s more likely that people react to the totality of how individual items on the envelope (and inside) intersect with one another.


Recently I was in a meeting that included some Census employees who were lamenting the “fact,” as they described it, that respondents prefer ugly questionnaires. My response at that time was that I did not believe people want “ugly”; instead, they want things to make sense. The envelope is the first place where one needs to make the message clear—there is something essential that you need to see on this envelope to define it as official government business.


Aesthetically, I like and appreciate the gold and blue envelopes with the visual runner along the bottom. However, I don’t think the envelopes are more likely than the plain census envelope to get opened. The improvements I am suggesting above are things I would suggest doing before doing an experimental test of the proposed census vs. current ACS forms.


Content of the Request to Respond Over the Internet


Inasmuch as the contents of the initial envelopes are somewhat different, I will discuss them individually. However, to avoid redundancy I will include some general discussion of common elements in the initial evaluation of the Gold form.


Proposed First mailing--The Gold Form, Community focus


There are some aspects of the content I find quite positive. For example, the letter is more professionally written than the ACS letter now used (see my earlier comments on the current ACS internet request). The proposed letter starts with random selection, goes to the reasons, asks for an internet response, and is straightforward about the response being required by law, even bolding it rather than sort of “hiding” it in the second line of the fourth paragraph as the current ACS letter does. In addition, it thanks people rather than telling them what to do if they need help; the latter approach comes across as a bit condescending.


I also think this letter is an appropriate place to introduce the branding of the American Community Survey. The internal form number is visually “hidden” from the respondent by placing it in the upper right corner of the page. People’s eyes are not likely to go there unless they are looking for it. The current ACS places the form number in the upper left where people’s eyes are likely to go first in normal reading.


I think it’s also good that the U.S. Census Bureau is shown as being located in Washington D.C.! This needs to be done in all letters, even though the processing office (identified as such) is the appropriate return address for the envelope.


However, the marketing features of this mailing make the request seem less important. Putting the reverse print in the middle of the page and bleeding it to the left makes it dominate the page, but if someone starts with the positive print they are likely to skip over it. Serious business letters do not do this sort of thing. Also, the graphics at the bottom make it seem less serious than desirable. “A message from the U.S. Census Bureau” also makes it seem less important, i.e., it conveys the connotation of a form letter. This concern also applies to the current ACS letters now being used by the Census Bureau.


I think that I understand the reasons (the risk of a mismatch with the address on the internet instruction card) that Census does not want to insert an address on this letter, even though it would be desirable from the standpoint of personalization. Here is an alternative I would suggest for both the control and this treatment group. Place the U.S. Department of Commerce and U.S. Census Bureau address in the upper left-hand corner, as it typically appears on normal stationery, and include the wording “Office of the Director.”


Then underneath it I would put in a current date plus:


“A Request from John Thompson, Director of the U.S. Census Bureau”


This is a second best approach to placing a signature at the bottom of the page, but it would help convey the importance of this request.


I am also concerned about the two cards. The folded card has four faces (pages) of information to process, and the address card has two faces. Graphically there is enough similarity (blue and gold) that it is a bit challenging to figure out front and back, and there is no organized sequence that people need to go through in processing the information. A number of pictures are included to increase visual impact, and reverse print is used on four of the card faces for important information, with the result that it goes unprocessed while nearby positive-print information is being read. In the end I get a sense of clutter rather than an organized request.


I think the current presentation of material will not do what is needed to get people to quickly see what they need to do, and the reasons. The desire to produce visual appeal has trumped the need to linearly process information in order to perform a task. As an aside, on posters and artistic pictures, at an aesthetic level, I’m not at all uncomfortable with what’s here, but when people are being asked to undertake a survey task this is not the best model for guiding design.


As an aside, I often think about this when walking through airports and metro systems, and when driving down streets and highways, where instructions that are quickly viewed need to be followed. In questionnaire and mailing design the same principles of keeping it simple and sequential apply. In these designs it is pretty much left up to the recipient to turn cards and letter back and forth to figure out what to do, and even then I don’t think it will be clear to some what they need to do. For background on this point you may find it helpful to look at Dillman, Gertseva and Mahon-Haft (2005), which shows the application of usability and visual design principles to recreating a large government survey.


My suggestions are to:


  1. Eliminate the second card in this envelope completely.

  2. Revise the address card to provide a linear description of what the respondent needs to do to answer the survey. “Your survey response card” has no stand-alone meaning. How about something like “Two simple steps to answer the American Community Survey”? Underneath that heading I would insert:


Step 1: Enter this information to be taken to the online questionnaire

Step 2: You will be asked to enter information from your address label to enter your answers.


The side bar on communities across the nation currently tends to clutter this page further. If a more linear and clearer navigational path is added to the right side of the card, then I would not worry so much about this competing information on the left.


  3. Use the back of the letter from John Thompson to include the content deemed absolutely essential from the card that I am proposing for deletion.


I would also fold the letter in the other direction, so that the beginning of the message from Director Thompson is on the top, rather than folded inside a page that is blank. (Note: I may have misunderstood which way the letter was folded because of the time I spent looking at the various components in different ways, but my memory is that the print was inside the fold.)


One of the harder visual design concepts to implement in surveys is the idea that “less is more.” Creating gratuitous graphics to motivate people to respond to a survey places the motivational emphasis in the wrong place. The motivational emphasis needs to rest primarily with helping people understand, with little effort, that this is a government-sponsored survey and that responses are required. The second part of that motivation is to make it easy to see what to do. Graphics can often help, but when they interfere with the response process, and attempts are made to increase their prominence in isolation (e.g., the reverse print and the banner at the bottom of the letter), they start to convey other meanings.


Proposed first mailing--Blue form--Patriotic


My reactions to this form are similar to my reactions to the first with regard to the outside of the envelope, and the contents (letter plus two large cards, included with this one).


I would change the letter in the ways suggested earlier, except I think it’s fine to have John Thompson’s signature at the bottom (and would perhaps make that change on the Gold form as well). I also think the content of the Gold form letter is stronger. This one is greatly simplified and does not hang together as well, in my opinion. I don’t think it is very persuasive to start out saying, “Enclosed you will find instructions for completing….” Doing that assumes people are ready for the instructions. They need to be told first by the sender (John Thompson) why he is writing to them and why they need to respond.


Nor do I think that likening the response to “jury duty” is helpful. There are many, many ways to avoid jury duty, and only registered voters are eligible. Saying in the third paragraph that the ACS is an official Census Bureau survey is not something that strikes me as very powerful. The reader has already been told in the previous paragraph that response is mandatory, and this sentence doesn’t add much to that.


It also seems strange to say, “If you have access to the Internet and want to learn more….” to go to the web site. That comment about if they have access to the Internet seems unusual when the thrust of the letter is to ask people to respond on line.


It’s often helpful to put things in a P.S. and such information attracts attention there. However, in this case the main purpose of the letter is to ask people to respond on line, and I was surprised to see this held until the P.S.


I would also suggest eliminating the second card and integrating that information with the letter from Thompson, using both sides. Finally, on the card itself you place emphasis on telling people that it should not be discarded, yet “Keep this card” is somewhat hidden in a circle in reverse print (see Appendix A for a similar problem with one of the 1996 marketing forms).


The first line on this treatment’s card is in a receding gray that strikes me as a little unusual: “Completing the American Community Survey online—it’s the quickest and easiest way to respond.”

When one tells people what is “quickest and easiest” for them, it sounds presumptuous. Virtually every comparison I have seen in which people are given a choice between an enclosed mail questionnaire and the internet shows that about ¾ of respondents answer by mail. That provides some indication of what many people think is easier. The reduction of instructions to “slogans,” as done here, makes the appeal less convincing than I believe it could be. In some respects it’s akin to the use of “new and improved” on store products when that’s not really the case.


Proposed first mailing—the Green form, multiple language or official approach


Most of the envelope and letter comments also apply to this form.


In addition I am concerned that the divided envelope, which pulls attention immediately to other languages, may convey that this is a marketing effort rather than a request to respond to a required survey being done by the U.S. Census Bureau.


This letter, in my opinion, is better (more complete) than the second letter. Again, I would change the front of the letter to official letterhead and add a date, as suggested earlier. The start of the first paragraph is on target and strong. However, I’m surprised that you put confidentiality in that paragraph. The mandatory requirement is more important from a persuasion standpoint than confidentiality. Some research in the 1990s also found that confidentiality could be emphasized too much. It’s not belabored here, but I don’t think I would put it in dark print.


I would put the internet request in the second or third paragraph, and make it stand out by simply putting the internet address information in bold and indented.


I am puzzled by the use of the term “internet response card.” It implies the survey is a card, and it is not. You mention the internet response card in the letter and then use that as the heading of the card. I’d be inclined to refer to it as the address card, if you think a label is needed, and talk about simple instructions for completing the questionnaire.


I sense there is some concern about being very explicit saying what information from the address card is needed for answering the ACS. I’m not sure of the reason, but since it is a sealed envelope I do not understand why you cannot be more explicit about entering the number below as step 2.


My comments on the two cards are the same as for the Gold and Blue forms, and I have discussed earlier the need for two-step instructions.


Summary Evaluation of Internet Request letter


I think it may be helpful to offer a couple of summary comments. I have suggested (as has Reingold) that the pre-notice be eliminated, and the now first mailing be strengthened.


It’s important to contemplate at this point whether response will be improved with any of the three suggested revisions. In general, a high response to the initial mailing(s) will carry over into a higher final response rate. I have expressed my concern about shortcomings that I believe exist with each of the formats. I am also concerned about the existing ACS procedures and think several aspects of the current ACS mailing need to be changed.


I am also somewhat puzzled, as an experimentalist, about what one would learn by implementing a field experiment with each of the forms in their current formats. On the three test forms there are so many things that are different (e.g., letters, envelopes, graphics, cards) that it would be impossible to determine what might cause any observed differences. Testing them in their current form would amount to a general test of a package of proposed changes, in which understanding the causes of any effects is badly confounded. My inclination would be to bring many of the details closer to being in common, so that one might be able to better pinpoint the causes of any differences observed in response.


I have sometimes run “package” experiments and understand that it is sometimes most helpful to test a package of ideas and accept the limitation of not being able to pinpoint the causes. However, as I reflected on these initial contacts, I concluded that each of them could be improved upon, as could the current ACS, so I simply note these as issues to continue to think about as your work continues.


Proposed Second mailing—the postcard


I have spent quite a bit of time reviewing and thinking about the proposed postcard follow-ups.


I think that all three of the proposed postcards will increase response rates to some extent. A follow-up postcard that reaches people shortly after they have received a previous request to respond to a survey has been found consistently to improve response rates.


I suggested in my comments about the current ACS follow-up that I would switch this follow-up to a letter. My primary purpose in doing that would be to be able to provide a little more detail on connecting the address and log in numbers to the web page, and make the instructions more complete.


I realize that a postcard is an open communication, and I assume you want to withhold details so that someone for whom the postcard is not intended will not immediately go to the web page, realizing they can log in, provide the code and complete the questionnaire. Sending this request in the privacy of an envelope would provide an opportunity to provide more detail on exactly what to do and how to do it.


I am also concerned that all three of the postcards start out with, “A few days ago, you should have received instructions for completing the American Community Survey online.” Simply because instructions have been provided is not a compelling reason for responding to this survey.


I am also concerned that you are relying on the U.S. Census Bureau and American Community Survey logos at the top of the page to carry the message. I think the current postcard would be stronger with the Washington, DC address. That message could be strengthened by stating that response is required by law and by adding the Director’s signature at the bottom of the page. The yellow form has the response-is-required-by-law message on the address side, and the others place that message in paragraph form.


I’m also concerned about the return address on the blue and green forms identifying the Census Bureau as being located in Jeffersonville. I see that on the Gold form Louisville is identified as being the National Processing Center, and that is desirable.


All three of the new postcards strike me as being marketing oriented because of multiple colors and changes in fonts. They take on the aura of “search and find” similar to techniques used in posters, without taking into consideration the sequential navigation that goes with understanding and acting upon the request to respond. I don’t think that this style of construction will improve response over the standard postcard now in use. At the same time I think the official business postcard now in use could be improved by making the changes mentioned above. And, further improvement could be made by using a letter in an envelope (assuming the pre-notice is abandoned) to strengthen the message.


Proposed Third mailing—the Paper questionnaire


I believe that all three of these paper questionnaire mailings represent a significant improvement over the current six enclosures mailing (see my earlier discussion under current ACS procedures). There is less information to process and that information is concentrated into three pieces for each of the mailings.


The 16-page guide, which was almost impossible to read, process, and use, and which represented a reason for setting the request aside until the recipient felt he or she had more time, is now gone.


However, there are four things that concern me.


First, the outsides of the envelopes are the same as for the internet request mailings. Please see the earlier section of my report for how they might be revised.


Second, there is no letter requesting a response. A letter from the Director needs to be included. It should emphasize why a paper form is enclosed, and indicate how it can be used both by those who prefer to respond by paper and by those who will decide to go to the internet. I suggest that in part to draw attention away from offering “choice,” which tends to inhibit response. This letter also provides another opportunity to underscore why response is mandatory and how important the data are. The letter needs to come from John Thompson, the Director, and not as a “message from the U.S. Census Bureau.”


Third, I would eliminate the card. This is a carryover from the earlier mailings when a card was used with the address on it. You now have the address placed on the ACS form itself, so it’s not needed for that purpose. And, as now structured it pushes the focus onto the Option 1 vs. Option 2, which then carries the baggage of offering choice. I think a letter is a more effective place to explain how the paper questionnaire can help people in responding, even when responding over the internet. Eliminating this card will then reduce the enclosures to three—the letter, the brochure and the return envelope, making it quite simple.


Fourth, although I think the content of the brochures is pretty good, there is a mechanical problem. When I pulled the gold and green enclosures out of the envelope, neither brochure came with the other materials. It reminded me of a similar problem we had in the 1990s, when “smaller” enclosures seemed to stick in the envelopes after other contents were removed. If the card is eliminated, I think you can increase the size of these brochures so that they pull out with the other materials, and they would not be in competition with the envelope-size card.


I looked at all three of the brochures carefully, and see that the content differs quite a bit among them. I think all of them are well written and relevant. They are attractively laid out and printed. I also think that the one included with the gold form has the most useful content. I like the explanations about why the survey is mandatory, why certain questions are asked, and the general information at the end on benefit to the community. My only concern is its small size.


The green form is in quite a different style. It starts by trying to be sure no one is listed twice, and makes arguments for questions that I think are overly simplistic. It does not have the helpful appeal that I believe the gold form brochure has. I also find the “guide” in the blue patriotic version less appealing, in the sense of not connecting with respondents and the questions they might have. I would not use this form or the green one. At this point I am having some difficulty following the themes that were posited as guides for each of the three versions. I may be missing some connections that I need to make.


That said, I feel more positive about these three proposed mailings than I do about the current ACS mailing. If changes are made in the manner I have suggested above, I think these will have a more positive response effect than the current ACS mailing with its six enclosures. However, in my opinion none of these three alternative approaches should be used without reinserting a letter and eliminating the internet card.


Proposed 2nd reminder card (4th contact)


This is a reasonable place for use of a postcard. A postcard follow-up to a postal questionnaire is a procedure with longstanding empirical support. The timing about a week after the previous mailing is also desirable.


I think that the blue and gold proposals will be less effective than the current ACS postcard. The marketing orientation and design elements lack freshness because they have appeared in previous mailings (e.g., the internet response banner on the gold card, and the eagle on the other card). Also, the changes in color and fonts make it difficult to process the information sequentially.


Information that gives immediate credibility to the Census Bureau, needed to justify the “your response is required by law” message, is not apparent, because the U.S. Census Bureau appears to be located in Jeffersonville. The current ACS card starts with a Washington, DC address and letterhead. I think it could be improved by adding the Director’s name and signature, and by eliminating the opening line, “A message from the Director, U.S. Census Bureau.”


The follow-up card for the Green form provides a more official look, and is also an interesting departure from the Green form graphics. The official business insignia on the back is also desirable. I think this one could be more effective than either the blue or gold graphically enhanced cards.


Proposed additional postcard (5th contact)


I have discussed this extensively with regard to the ACS and suggested that this contact might be changed to a letter. I think this could be done for the proposed new strategies as well.


However, to respond to the specifics of the cards: the Gold and Blue versions carry over the graphic concepts from previous mailings and lack linearity if one actually tries to read through the entire card. The effort required to switch from gold print on white, to black on white, to blue on white, then to reverse print (white on gray), and then back to blue on white, followed by black on white, is not insignificant.


I also see that the arguments for responding are quite different.


One card begins, “You have received multiple mailings about the American Community Survey.” The second card starts, “REMINDER: YOUR RESPONSE IS REQUIRED BY LAW.” And the third starts, “A reminder from the U.S. Census Bureau.” It is not clear to me why different arguments are presented in each of these cards. The three do not seem to be thematically related in a unique way to the Gold (community), Blue (patriotic), and Green (official business) styles being tested.


I realize these are a continuation of previous communications, and legitimacy should be established by this time. However, if one were to show these cards to a disgruntled housemate or spouse who is looking at the request for the first time, the lack of legitimacy would come through: a logo substitutes for Census letterhead with an address, and the card does not originate from a responsible official.


I do not particularly care for the color (a shade of green) of the current ACS card, but I think it is a stronger communication and will be seen as more legitimate. I would have the Director sign it and also change the address (adding “Census Processing Office”) on the address side of the card.


Finally, an additional postcard at this stage is in my view a “second best” strategy. I would be inclined to send another paper questionnaire, as discussed earlier in my evaluation of current ACS procedures.


Is Thematic Appeal Even Relevant?


I have not yet answered the question of whether one of the thematic appeals is stronger than the others, and therefore likely to produce higher response rates.


None of the three themes, i.e., Gold (community-oriented), Blue (patriotic), and Green (official business), strikes me as likely to lower response rates significantly in comparison to the others. All three are reasonable themes.


However, I also think that a particular theme is a far less powerful influence on response than other considerations. These other considerations include the mandatory response requirement, multiple contacts, the actual questionnaire content, the length of the questionnaire and whether its questions can be answered, the perceived authenticity of sponsorship by the U.S. government, respondent-friendly design of questionnaires, and personalization of communications that suggests there are real human beings behind the request. There is published research evidence available on each of these features. Also, in mixed-mode data collection using the internet and mail, there are matters of connectivity across the communications and modes, and of avoiding complexity in the mailings, that will affect response.


It is clear that surveys on some topics obtain higher response rates than others. Some questionnaires are more offensive than others, and some contain questions that are more difficult to answer. The selection of a theme does not affect this aspect of data collection. I know of no research that shows clearly that selection among alternative themes will significantly improve response in a high response rate situation. I add this qualifier because if one were trying to improve response from 2-3% to, say, 5-7%, I would consider these designs carefully. However, when my concern is with getting from 55% or so to, say, 70%, one is already using many other response-inducing factors, and it is unlikely that the specific theme will add much at that level.


Although the three themes use different graphics and appeals, they are using the same general response-inducing methods (number of contacts, invoking of mandatory response authority, and other considerations). Also, the details of the arguments for responding are stated at different times in different ways in each of the mailings. Thus, it is very difficult to see what the mechanism for inducing a higher response rate would be. I have dealt with this in my discussion above by trying to identify what are to a considerable extent shared weaknesses of the proposed implementation methods (authenticity of the mandatory request, personalized correspondence, creating communicative linkages across mailing components, and the sequential connections across mail-outs). The larger, more important goal, I think, is to improve the response of the current ACS procedures as well as of those proposed for consideration in these new tests.


Is It Desirable to Achieve a Unique Identity for the American Community Survey?


I think it is appropriate to try to create a special identity and sense of presence for the American Community Survey. However, it is my impression that several features of the current and proposed designs are trying to achieve that by decreasing the sense of linkage to the U.S. Census Bureau and its other data collection efforts, such as the Decennial Census. This is the reason I have suggested backing off certain features that I think hinder this larger and, I believe, helpful identity.


An example is placing “American Community Survey” in the “Response is required by law” box instead of the originally tested “U.S. Census Form Enclosed.” The Census has a clear and generally positive identity that the ACS does not now have. The effort to give the ACS a special color and repeated mention is something that I think can be built over time. However, I would be careful about emphasizing it at the expense of losing the U.S. Census Bureau in Washington, DC identity, as is now being done.


Summary and Conclusion


I have covered a lot of territory in this report, spending at least half of it evaluating current ACS procedures and suggesting alternatives. I then used that as context for evaluating whether the three proposed marketing designs are likely to improve self-administered response rates for the ACS. In evaluating these procedures I have also drawn on the available scientific literature on survey response and on specific research conducted in the Decennial Census (including long form) experimentation of the 1990s. In addition, I have used recently published research on the use of mail contacts to push responses to the internet, conducted during the last several years.


The changes I have recommended for the current ACS procedures may be summarized as follows:


  1. Eliminate the pre-notice postal contact because of its demonstrated ineffectiveness when pushing sample units to the Internet.

  2. Use those resources to strengthen the initial internet request and reminder contact by improving the communications, strengthening the cover letters, and making clearer the legitimacy of the mandatory request. I have also suggested improving the connectivity between them and switching the reminder postcard to a letter. The current 2nd and 3rd contacts would now become the first and second contacts.

  3. Reduce the complexity of the paper questionnaire mailing by reducing both the number of components and emphasis on choice.

  4. Consideration should be given to replacing the third postcard reminder with a second mailing of the paper questionnaire, targeted differently to the telephone-only households as well as to those households not selected for the telephone follow-up. This change is recommended because of the potential I believe it has for increasing self-response above current levels.

My conclusion from reviewing the ACS procedures is that many of the procedures now in use are not supported by current survey design research on how to improve internet response to surveys that rely on mail contact. This is the basis for my argument that many specific issues need to be rethought in light of the research reported here. I then used the suggested procedures as background for evaluating the proposed Reingold procedures.


My main concern with the Reingold procedures is that previous research has not supported the use of a marketing approach for improving response rates. The use of colorful, thematically based mail-out procedures has not been shown to be effective in achieving the high self-administered response rates (in the 50-75% range) that need to be achieved for the ACS and other Census surveys. Their use in 1996 Census tests actually lowered response rates dramatically.


However, those tests were done about twenty years ago, and some mistakes in design were clearly made (e.g. the reverse print and mandatory message content). Also, the times may have changed and marketing procedures could be more effective in today’s environment. My approach to evaluating the Reingold proposals has focused heavily on how people come to see surveys as legitimate, how mandatory requests are communicated and interpreted, and how the communication within and across multiple survey contacts need to be coordinated. I have suggested multiple changes for the Reingold procedures, which I hope will increase the likelihood that the proposed marketing procedures will be effective.


My view of the lack of effectiveness of the current pre-notice and the negative effects of the overburdened mail request are quite similar to those expressed by Reingold, and I agree with their recommendations for all three of the proposed forms that the pre-notice be eliminated and the paper questionnaire mailing be greatly simplified. I have also suggested a number of specific other changes that I think will improve the effectiveness of the three themes. Many of these changes are similar to changes I have suggested earlier for the ACS, e.g. strengthening the cover letters, better communication across mailings of the mandatory request, and related issues.


It is difficult for me to choose one of the three marketing themes as likely to perform better than the other two. None of the themes seem unreasonable to me. However, from a research perspective I know of no survey research that suggests one of them would be superior to the others. Also, as I looked at each of the proposed themes and how different topics and component procedures were emphasized in different mailings, it was increasingly difficult to identify a basis for one of them performing extraordinarily well.


I would encourage the Census Bureau to conduct an experiment using 1) the current ACS internet-push methodology, 2) a revised ACS that makes the changes I have outlined above, and 3) a marketing-approach treatment that incorporates the kinds of changes I have recommended for the current ACS. My inclination would be to focus more on the community and patriotic themes than on the third one, which was less clear to me.


I hope these comments are helpful to you.




References


Baumgardner, Stephanie, Deborah Griffin and David Raglin. 2014. The Effects of Adding an Internet Response Option to the American Community Survey. Memorandum Series ACS145-RER-21. Washington, DC: U.S. Bureau of the Census.


Baumgardner, Stephanie. 2014. Daily Self-Response Check-in Rates and Daily Internet Usage Rates for the January and February 2013 Panels. Memorandum Series #ACS13-RER-15 (February 25). Washington, DC: U.S. Bureau of the Census.


Baumgardner, Stephanie. 2014. Response Rates for the January, February, and March 2013 American Community Survey Panels. Memorandum Series #ACS14-RER-19 (April 30). Washington, DC: U.S. Bureau of the Census.


de Leeuw, Edith, Callegaro, M., Hox, J., Korendijk, E. and Lensvelt-Mulders, G. 2007. The influence of advance letters on response in telephone surveys: A Meta-analysis. Public Opinion Quarterly, 71(3), 413-443.


Dillman, Don A. 1978. Mail and Telephone Surveys: The Total Design Method. New York: John Wiley.


Dillman, Don A. 2000. Mail and Internet Surveys: The Tailored Design Method. 2nd Edition. New York: John Wiley.


Dillman, Don A., Jolene D. Smyth and Leah Melani Christian. 2009. Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method. 3rd Edition. Hoboken, NJ: John Wiley.


Dillman, Don A., Jolene D. Smyth and Leah Melani Christian. 2014. Internet, Phone and Mail Surveys: The Tailored Design Method. 4th Edition. Hoboken, NJ: John Wiley.


Dillman, Don A., Virginia Lesser, Robert Mason, John Carlson, Fern Willits, Rob Robertson, Bryan Burke. 2007. “Personalization of Mail Surveys for General Public and Populations with a Group Identity: Results from Nine Studies.” Rural Sociology 72(4): 632-646.


Dillman, Don A., Cleo Jenkins, Betsy Martin, and Theresa DeMaio. 1996. Cognitive and Motivational Properties of Three Proposed Decennial Census Forms. Technical Report 96-29 of the Social and Economic Sciences Research Center, Washington State University, Pullman, Washington. (also released by the Center for Survey Methods Research, U.S. Bureau of the Census, Washington, DC.)


Dillman, Don A., Michael D. Sinclair, and Jon R. Clark. 1993. "Effects of Questionnaire Length, Respondent-Friendly Design, and a Difficult Question on Response Rates for Occupant-Addressed Census Mail Surveys." Public Opinion Quarterly 57: 289-304.


Dillman, Don A., Jon R. Clark, and James Treat. 1994. "The Influence of 13 Design Factors on Response Rates to Census Surveys." Annual Research Conference Proceedings, U.S. Bureau of the Census, Washington, DC. Pp. 137-159.


Dillman, Don A., Jon R. Clark, and Michael A. Sinclair. 1995. "How Pre-notice Letters, Stamped Return Envelopes, and Reminder Postcards Affect Mail back Response Rates for Census Questionnaires." Survey Methodology 21(2): 1-7.


Dillman, Don A., Eleanor Singer, Jon R. Clark, and James B. Treat. 1996. "Effects of Benefits Appeals, Mandatory Appeals, and Variations in Confidentiality on Completion Rates for Census Questionnaires.” Public Opinion Quarterly 60(3): 376-389


Dillman, Don A., Arina Gertseva and Taj Mahon-Haft. 2005. “Achieving Usability in Establishment Surveys Through the Application of Visual Design Principles.” Journal of Official Statistics 21(2): 183-214


Edwards, Michelle L., Don A. Dillman and Jolene D. Smyth. Forthcoming. An Experimental Test of the Effects of Survey Sponsorship on Internet and Mail Survey Response. Public Opinion Quarterly


Leslie, Theresa. 1996. 1996 National Content Survey Results (Internal DSSD Memorandum No. 3). Washington, DC: U.S. Bureau of the Census.


Leslie, Theresa. 1997. Comparing Two Approaches to Questionnaire Design: Official Government Versus Public Information Design. Paper presented at the meeting of the American Statistical Association, Anaheim, CA.


Messer, Benjamin L. and Don A. Dillman. 2011. Surveying the General Public Over the Internet Using Address-Based Sampling and Mail Contact Procedures. Public Opinion Quarterly 75 (3): 429-457


Millar, Morgan M. and Don A. Dillman. 2011. Improving Response to Web and Mixed-Mode Surveys. Public Opinion Quarterly 75 (2): 249-269


Olson, Kristen, Jolene D. Smyth and H. Wood. 2012. Does Giving People Their Preferred Survey Mode Actually Increase Survey Participation Rates? Public Opinion Quarterly 76(4): 611-635.


Rookey, Bryan D., Hanway, Steve, and Dillman, Don A. 2008. Does a Probability-Based Household Panel Benefit from Assignment to Postal Response as an Alternative to Internet-Only? Public Opinion Quarterly 72(5): 962-984.


Smyth, J.D., Dillman, D.A., Christian, L.M., & O’Neill, A. 2010. “Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century.” American Behavioral Scientist. 53: 1423-1448


Tarnai, J., Schultz, D., Pfingst, L. and Solet, D. 2012. Response Rate Effects in an ABS Survey for Stamped vs. Business Reply Return Envelopes, With and Without Incentives, and Medium vs. Standard Size Outgoing Envelopes. Poster presented at the meeting of the American Association for Public Opinion Research, Orlando, FL, May 16-20.



Attachments


These attachments are being transmitted as separate Files. I am sending them because of their relevance to this review, and because they may not be easily accessible to Census Bureau staff.


Appendix A. Excerpt from Dillman, Don A. 2007. Mail and Internet Surveys: The Tailored Design Method, Second Edition, pages 298-313. (This was republished as part of an updated 2nd edition in 2007.)


Appendix B. Cognitive and Motivational Properties of Three Proposed Decennial Census Forms. SESRC and Bureau of the Census Technical Report prepared by Dillman, D., Jenkins, C., Martin, E. and DeMaio, T. May, 1996.


Appendix C. Dillman, Don A., Jon Clark and Michael Sinclair. 1995. How Pre-notice Letters, Stamped Return Envelopes and Reminder Postcards Affect Mail back Response Rates for Census Questionnaires. Survey Methodology 21(2): 1-7.

