Coordinated Collection Report

Attachment F - Coordinated Collection.docx

Generic Clearance for Census Bureau Field Tests and Evaluations

OMB: 0607-0971






Findings and Recommendations from Debriefing Interviews for the Odyssey Coordinated Collection Research



Prepared for:

Lisa E. Donaldson, Economic Management Division

Kimberly P. Moore, Economy-Wide Statistics



Prepared by:

Melissa A. Cidade, Response Improvement Research Staff

Aryn Hernandez, Response Improvement Research Staff



Office of Economic Planning & Innovation

Economic Programs Directorate

U.S. Census Bureau

October 30, 2020



The Census Bureau has reviewed the report for unauthorized disclosure of confidential information and has approved the disclosure avoidance practices applied. (Approval ID: CBDRB-FY22-ESMD005-003)

Table of Contents

Executive Summary

Research Objectives

Research Methodology

Findings and Recommendations

General Findings and Recommendations

Finding #1: The COVID-19 novel coronavirus global pandemic negatively impacted communications with respondents.

Finding #2: General communication with the Census Bureau yields mixed results for respondents.

Specific Findings and Recommendations

Premailer Letter

Finding #3: Respondents understand the content and purpose of the premailer letter.

Finding #4: Respondents’ evaluation of consolidating contacts is mixed.

Finding #5: Respondents understand the purpose of the Odyssey Flyer; however, many respondents did not receive – or do not remember receiving – it.

Initial Letter

Finding #6: Respondents understand the purpose of the initial letter and take action in response to it.

Finding #7: More respondents prefer the combined letter approach than prefer the multiple letter approach; even more respondents prefer email contact.

Finding #8: More respondents preferred staggered due dates than preferred combined due dates, regardless of experimental arm assignment.

Respondent Portal

Finding #9: In general, respondents are satisfied with the respondent portal, finding it easy to use and convenient.

Finding #10: Respondents suggested additional features for the portal, especially with regard to how often the portal refreshes.

Finding #11: Most respondents did not use the delegation feature on the respondent portal; of those who did, most are satisfied but would like more parameters in place.

Finding #12: Respondents use the portal to prioritize their workload and prepare to respond to the survey request.

About the Data Collection Methodology and Research (DCMR) Branch

Appendix A: Methodological Overview

Appendix B: Interview Protocol

Appendix C




Executive Summary

Over the course of three months, researchers with the Data Collection Methodology and Research (DCMR) Branch within the Economic Statistical Methods Division (ESMD) conducted 35 phone debriefing interviews to evaluate the Odyssey Coordinated Collection Experiment. The overall findings include:

General Findings and Recommendations

Finding #1: The COVID-19 novel coronavirus global pandemic negatively impacted communications with respondents.

Finding #2: General communication with the Census Bureau yields mixed results for respondents.

Specific Findings and Recommendations

Premailer Letter

Finding #3: Respondents understand the content and purpose of the premailer letter.

Finding #4: Respondents’ evaluation of consolidating contacts is mixed.

Finding #5: Respondents understand the purpose of the Odyssey Flyer; however, many respondents did not receive – or do not remember receiving – it.

Initial Letter

Finding #6: Respondents understand the purpose of the initial letter and take action in response to it.

Finding #7: More respondents prefer the combined letter approach than prefer the multiple letter approach; even more respondents prefer email contact.

Finding #8: More respondents preferred staggered due dates than preferred combined due dates, regardless of experimental arm assignment.

Respondent Portal

Finding #9: In general, respondents are satisfied with the respondent portal, finding it easy to use and convenient.

Finding #10: Respondents suggested additional features for the portal, especially with regard to how often the portal refreshes.

Finding #11: Most respondents did not use the delegation feature on the respondent portal; of those who did, most are satisfied but would like more parameters in place.

Finding #12: Respondents use the portal to prioritize their workload and prepare to respond to the survey request.

Research Objectives

Researchers conducted debriefing interviews to gain a better understanding of the process of completing Census Bureau surveys at companies sampled into more than one annual survey. During these interviews, we sought the following information:

  • Understanding how respondents comprehend specific contact materials;

  • Identifying respondents’ use of the respondent portal for accessing and delegating surveys;

  • Assessing the impact the consolidated collection may have on companies that have a single point of contact versus multiple contacts; and

  • Identifying difficulties in completing the questionnaires.

Research Methodology

Between June 1 and August 13, 2020, we conducted 35 phone interviews, each lasting no more than 60 minutes, with firms that had been selected to participate in the Odyssey Coordinated Collection experiment. This experiment included several experimental arms:

  • Consolidated contact – firms either had their contacts consolidated for this research (meaning that the firm originally had more than one contact listed for annual surveys) or were already using a single point of contact.

  • Due dates – firms were randomly assigned to either have all surveys due on the same date or surveys due on differing dates.

  • Surveys – finally, firms were randomly assigned to which surveys to consolidate. This research includes the Annual Wholesale Trade Survey (AWTS), the Annual Retail Trade Survey (ARTS), and the Services Annual Survey (SAS); firms could be assigned any two or all three of these surveys, creating four unique combinations.

With two experimental arms of two groups each and one experimental arm of four groups, any given respondent could have been in one of 16 combinations. As such, few cases represent each combination, and some combinations did not result in an interview. See Figure 1 for an overview of the experimental assignments of respondents.
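
To make the arithmetic explicit, the counts above follow directly from the arm structure described in the bullets (a worked tally, not additional data):

  • Survey combinations: any two or all three of ARTS, AWTS, and SAS, so (3 choose 2) + (3 choose 3) = 3 + 1 = 4 unique combinations.

  • Treatment cells: 2 (contact) × 2 (due dates) × 4 (survey combinations) = 16 possible assignments.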

Figure 1: Number of Respondents by Experimental Treatments



Looking across combinations of surveys, each of the four survey combinations was represented by respondents, with most respondents receiving the AWTS and the SAS for this experiment. Table 2 presents the combinations of assigned surveys. Of all 35 respondents, 18 represent firms that were assigned to the AWTS and SAS, and nine were assigned to the ARTS and AWTS combination. Five firms were assigned the ARTS and SAS combination, while three respondents represented firms that were assigned all three surveys: ARTS, AWTS, and SAS.

Finally, looking at respondents across the contact experimental arm, 26 respondents represented firms that had a single point of contact before the experiment began. Another nine respondents represented firms where the contact names were consolidated for this experiment.

While this research had originally intended to reach both respondents and nonrespondents, timing and a lack of resources ultimately truncated the research to include only those who had completed all the surveys for which they were sampled.

This research project relied on debriefing interviews. Debriefing interviews are targeted conversations after an event has occurred (in this case, completing the surveys). The strength of debriefing interviews is that researchers can ask pointed questions about specific aspects of an experience, and respondents, in turn, can respond based on their actual experience, rather than speculate on a given scenario. On the other hand, debriefing interviews can be unduly influenced by memory lapses, especially if the time between the event and the debrief is long. For the purposes of this research, the debriefing questions were focused on the respondents’ experiences with the request to participate in at least two Census Bureau surveys. The interviews followed a semi-structured interview protocol, found in Appendix B. More information about the methodology used for this project is available in Appendix A.




Findings and Recommendations

Below are the findings and recommendations from the debriefing interviews. The first section – labeled General Findings and Recommendations – outlines overall issues with the coordinated collection efforts that were tested, including the impact of the COVID-19 global pandemic and general communication with the Census Bureau. The second section – labeled Specific Findings and Recommendations – provides feedback on the specific contact materials of interest for this research.

General Findings and Recommendations

Finding #1: The COVID-19 novel coronavirus global pandemic negatively impacted communications with respondents.

On March 13, 2020, the executive branch declared a national state of emergency because of the emerging global pandemic caused by the novel coronavirus, COVID-19. As a result, non-essential workers were mostly teleworking or otherwise not at their usual place of business. At the same time, the Coordinated Collection experiment was already underway; as such, we cannot disentangle the successes and setbacks of this experiment from the impact of the pandemic. All the findings in this report are set against the backdrop of the evolving public health crisis happening concurrently.

One of the most significant impacts of the pandemic was mail delays and a lack of forwarding. The crux of the traditional contact strategy for the annual surveys is a series of paper mailings sent to firms’ physical addresses. Asked about the various letters involved in this research, respondents mentioned that they simply did not receive some of the contacts due to mandatory telework/remote work. Said one, he simply “hasn't been getting mail in a while due to COVID.” The communication breakdown varied among firms, from a full mail stoppage to long delays. Said one, she “stopped receiving business mail – [I] have not seen any paper mail since going on telework.” She went on to say that “there is no clear plan on how we are getting mail” going forward, and in fact, even her telecommunications are limited, saying “I also don't think I'm getting my phone calls, either.” Compare this with another respondent, who noted that even though he has “been teleworking since March,” he will “pop into the office periodically to swap out files,” and that “a couple of times letters have been passed to me and I look at them when I go in,” but that he is “not getting it as promptly as normally.”

Response Error: The thrust of this research centered on shifts in the contact strategy for these annual surveys. The interruption due to the global pandemic cannot be overstated when examining response rates, impressions of letters, and even which firms agreed to participate in interviews. Researchers in DCMR have heard from respondents throughout this period that participating in research is untenable due to staff shortages, rolling furloughs, noisy telework arrangements, and other consequences of the pandemic.

Recommendation: Given the unforeseen circumstances under which this experiment was conducted, each of the findings and recommendations in this report must be understood against the backdrop of the global pandemic. While the findings are still illuminating and reflective of respondents’ experiences, it is understood that they also reflect the time and place in which they were collected.



Finding #2: General communication with the Census Bureau yields mixed results for respondents.

During the course of interviewing, respondents shared with interviewers their past experiences contacting the Census Bureau. Several respondents mentioned that they have had some difficulty updating contact information for their firm; this is particularly salient given the nature of the Odyssey Coordinated Collection experiment. One respondent mentioned that she “tried to call and get [the contact information] changed to my name as the single point of contact and was unsuccessful.” Others echoed this experience, with one saying that he “tried and tried to get [the contact information] changed to me, but was told they couldn't change it,” which led a colleague to call to change the information, “and she said that they said that they changed it.” However, the respondent noted that “that was six months ago” and that, in the interim, they have not “heard anything else” about it.

Response Error: Sending communications to the wrong person at a company increases response burden and decreases respondent motivation to complete a survey. Being unable to easily update information for those respondents who do contact the Census Bureau adds a further layer of frustration that erodes goodwill with our respondents. This could lead to decreases in response rates and, subsequently, data quality.

Recommendation: Design an easier way to share respondent contact information across different parts of the Census Bureau. At the same time, be sure to “close the loop” on communications with respondents as a matter of good customer service.

Specific Findings and Recommendations

Premailer Letter

Finding #3: Respondents understand the content and purpose of the premailer letter.

Asked about the premailer letter, most respondents could identify the purpose of the letter and could describe this purpose to the interviewer. Said one matter-of-factly, the letter “gives the contact name, phone number and email address for the three different reports.” Said another, the purpose of the letter is to “streamline so one person in company is contact for these three surveys.”

As a follow-up, interviewers asked respondents to identify the first thing they saw when looking at the letter. Most mentioned the bolded text alerting respondents to a change in contact methods. Looking at the letter, one respondent said that the letter “looks important” and even “says mandatory” on it, leading him to conclude that “something has changed, I should pay attention” to this letter. Another echoed this sentiment, saying that “initially, when I first looked at [the letter], I noticed 'notice of change.' This means we need to pay attention” to this letter. A few respondents saw this letter and thought that the thrust was that there were surveys coming due. Said one respondent, the premailer letter “tells us that we are going to be required to fill out the survey that they [are] contacting us about.” Another mentioned that the letter was intended to communicate to her “the number of surveys” that were expected from her firm.

Asked to evaluate the effectiveness of the premailer letter, most respondents found it helpful. Said one, the premailer letter is “helpful to give you a heads up that something's coming” from the Census Bureau. Another mentioned that prenotification is “helpful [because it] lets us plan the scope of our work.”

Response Error: There is no response error for this finding.

Recommendation: Retain parts of the letter that resonate with respondents – the contact list, the mandatory nature of the surveys, and the information about the specific surveys for which the firm is responsible.

Finding #4: Respondents’ evaluation of consolidating contacts is mixed.

When respondents were asked about the consolidation of contacts outlined in the premailer letter, some answered favorably. Most mentioned that having a list of previous responders gave them a ‘starting point’ for completing the current surveys. Said one, the listing “gives me a starting point of where to go back to in order to ask for information on how to fill [the current survey] out.” This same respondent mentioned that by having the previous contact information, “I can find out how they reported on it and then I can reach back to them if there are issues.” Similarly, another respondent mentioned that the listing is helpful because his “company has [employee] turn-over,” and since he “took over position from previous person,” the list lets him know whom to contact with issues. One respondent even mentioned that the listing may decrease burden on the Census Bureau, saying “I would know who was doing this reporting prior to the [current] person taking over; I can reach out to them for help as opposed to calling the Census Bureau,” adding that this “gives the person an extra point of contact” for help.

Still others said that the listing was not helpful; this was particularly true of respondents from smaller firms with smaller reporting departments, or a singular person responsible for reporting. One mentioned that the list was “not necessarily” helpful because “it is going to be to [complete] it; we don’t have a lot of turnover in these roles” of reporting. Still others mentioned that the listing contained out-of-date or otherwise inaccurate information, which was ultimately not helpful. Said one, “each survey had different contact information, but the email addresses weren't correct or up-to-date. The contact name was incorrect, my name was on there for at least one [survey].” However, this same respondent went on to say that seeing the out-of-date contact information led his team to “create a general email address for the staff to access; then, as people come and go, they can be added or taken off” of that address. Another echoed accuracy concerns, saying “sometimes contact names are people no longer here that I don't even recognize.”

Response Error: There is no response error for this finding.

Recommendation: Given that those respondents who find the listing helpful use it to respond to Census Bureau surveys, and those that do not find the listing helpful do not suffer a detriment (that is, it does not impact the quality of the data they return), we recommend retaining the list of previous contacts by surveys as a reference for respondents.

Finding #5: Respondents understand the purpose of the Odyssey Flyer; however, many respondents did not receive – or do not remember receiving – it.

Many respondents mentioned that they either did not receive or do not remember receiving the Odyssey Flyer that outlines the steps for using delegation on the respondent portal. Said one, “I can’t say that I have seen” this document prior to the interview. Another stated that he had “no memory,” and went on to ask, “[is this flyer] something new?”

Asked about the content of the flyer, most respondents were able to accurately identify that the purpose of the document was to inform them about the ‘share survey’ functionality on the respondent portal. One respondent said the flyer is “trying to communicate the process of sharing the survey [including] the steps.” Another echoed this, saying that the flyer is about “how to better use the portal, [and] specifically, share survey,” and that the document includes “step-by-step instructions.”

Response Error: There is no response error for this finding.

Recommendation: When respondents were exposed to the flyer during testing, they understood the intent and content of the document. Therefore, we recommend deploying another copy of this document with future mailings. We also recommend adding the document in a “Help” or “FAQ” section of the respondent portal for those who did not receive it in paper form.



Initial Letter

Finding #6: Respondents understand the purpose of the initial letter and take action in response to it.

In the interview, we then moved on to the initial letter. Asked about the purpose of the letter, most respondents were able to correctly identify that the letter is meant to give access to upcoming surveys. One respondent said, succinctly, that the purpose of the letter is to inform him that “I have a survey to do.” Similarly, another said that the letter’s purpose is “that a survey is due.” One respondent noted that while the premailer was informative, the initial letter solicits action, saying “I would act on this one.” Finally, a few mentioned that the initial letter is similar to other Census Bureau communications with which they are familiar, with one saying it is “similar to previous letters I've received.”

Interviewers then asked respondents about the first thing they noticed upon opening the initial letter. Note that respondents were presented with a generic version (that is, without firm-specific information) of this letter during the interview. The most prevalent response was noticing the due date listed on the letter, a critical element that respondents use to estimate their upcoming workload. Said one, the prominent information to him was “three surveys are due and there are specific due dates.” Another prominent response was that respondents notice the authentication code, especially those with a lot of experience with Census Bureau surveys. Said one, “what I always refer to as soon as I see these letters is the authentication code and the due date;” this same respondent noted that she is familiar with this type of communication, saying “I've gotten [this kind of letter] so often each year, it's pretty easy to find.” Another mentioned that the “first thing I notice is the authentication code; this is a standard letter,” indicating a degree of familiarity with this type of communication. This same respondent went on to say that “sometimes I glance down at the burden estimate statement; it is never accurate.”

When asked what action respondents would take in response to receiving this letter, most said that they would log into the respondent portal and enter the authentication code to get access to the surveys. Said one, thinking about the initial letter, “this one I would pay attention to - it looks like an obligation to report. I would follow the steps to complete the surveys.” He then added that this letter is “a call to action.” Another said that in response to this letter she went to “the portal, entered the code so I can see the survey, and share it with preparers” within her company. On the other hand, some respondents mentioned that they do not immediately access the portal, but rather record the due date and come back to the request later. Said one, upon receiving the initial letter, “the first thing I did was check the due date and then I put it in my file of upcoming things to do.” Another mentioned that this letter helps them to organize their upcoming work, saying that upon receiving it, “I look at the due date, look at my internal reporting deadlines. Internal reportings are the priorities.” This respondent went on to mention that “honestly these surveys are the last thing I do. It's time consuming and hard to gather the data, have to reach out to other people, data we have might not be the right format, so it takes a lot of time.” Finally, a few respondents mentioned that upon receiving the initial letter, their first step is to log onto the platform and request a due date extension. One said, “usually someone has passed [the letter] to me, and the first step is always to go and extend the deadline as far as we can. It's not a procrastination thing; it just takes so much time” to complete the survey.

Response Error: There is no response error for this finding.

Recommendation: In this case, respondents are familiar with the layout and content of the initial letters. We recommend leveraging this familiarity by maintaining the current template for letters of this kind.



Finding #7: More respondents prefer the combined letter approach than prefer the multiple letter approach; even more respondents prefer email contact.

One of the changes in contact strategy for this experiment was combining survey requests into one letter rather than sending each survey request in a separate letter. More respondents preferred the combined letter approach – one letter with information on all of the surveys coming due – than the multiple letter approach – a separate letter for each survey. Said one, “notification of multiple surveys on one letter is fine as long as it is streamlined. One notification is fine, as long as they are all right there easily.” Said another, “one code for all of your surveys, even with different due dates, would be helpful - enter one code and then they would all populate. I would know exactly what I needed to do.” This respondent went further, saying that combining all Census Bureau reporting obligations would be ideal: “like, send it out in January and give me the year to complete them.”


A few respondents said that they prefer the multiple letter approach – one letter per survey – mostly citing organizational concerns. Said one, “I prefer one letter per survey; it would be a reminder that something needs to be done or hasn't been done.” He went on to say that with multiple letters, “I can hand it off to someone else. I don't care for multiple requests on one letter at all.” Another said that she “didn't like having multiple surveys on one code,” and that it “would be easier to keep them separate if all had different codes.” Note, however, that more respondents advocated for the combined letter than for the multiple letters.


That said, even more respondents pointed to the recent innovation of relying more on email contact as the most convenient way to alert them to upcoming surveys. One mentioned that “for me, when I have the option on how to receive correspondence, I prefer email instead of mail.” Another echoed this sentiment, saying “we have not been in the office; I appreciated that I got a couple of emails from the Bureau. This is more effective for me than the physical letters.” This respondent went on to praise the quick pivot to email by the Census Bureau, saying “the way you changed to sending emails was good.” Independently, another respondent mentioned that the mail is unpredictable, saying that “sometimes our mail takes longer, especially now,” and adding that she “would definitely say electronic is better.”



Response Error: There is no response error for this finding.

Recommendation: Because more respondents said they preferred the combined letter than the single letters, we recommend continuing to experiment with combined contacts. Further, we strongly recommend that the Census Bureau consider expanding its electronic communications capabilities, especially regarding emails.



Finding #8: More respondents preferred staggered due dates than preferred combined due dates, regardless of experimental arm assignment.

Firms in this research project were assigned either a single due date for all surveys or a separate due date for each individual survey. Regardless of assignment, more respondents preferred staggered (multiple) due dates than preferred having all surveys due on the same date.

Of those who preferred more than one due date and were assigned to the consolidated due date group, the main reasoning for wanting more than one due date was a concern that having one due date would be overly burdensome. Said one, multiple due dates “make life easier” since her firm does “four or five surveys a year for [the] Census [Bureau] and having them all due at once is a burden.” Similarly, another noted that “people tend to slip this work in between their regular work, so if they are spread out it makes it more manageable.” One respondent said that a combined due date is “not helpful” because “just getting one completed is hard enough, so having multiple due on the same day is a burden.”

Of those who preferred more than one due date and were assigned to the staggered due date group, the main reason for preferring more than one due date was the same: the stress and burden of completing multiple surveys on the same timeline. Said one, “if you give me one due date [the surveys] wouldn't get done,” going on to say that “this is not my main job; I was not hired to fill out census reporting.” One respondent outlined the issues with a combined due date for his specific firm, saying:

Because we are a concentrated group, I would say no, [a combined due date] would not be helpful. In the ways that you all have different groups to process forms, we have different groups to submit the forms. I have teams that provide that information. If it was [all due on] the same date, I would have to plan to collect the data, but something [else] would have to get postponed… if I received [due dates] at the same time, I would not be able to file it all on the same date. A lot of the due dates conflict with our year end schedule. We don't file [our] annual report until [D:DATE]; we can't release [data to the Census Bureau] until it has been released publicly.

A few respondents noted that they preferred the combined due date, citing that it is easier to organize responses if there are fewer due dates to monitor. One noted, “I have a lot of due dates already and timetables - if I knew all Census [surveys] were due on one day, it makes it a lot easier.” Another echoed this sentiment, saying that he “does like all of them having the same due date” because it is “easier to keep track of.” Another consideration in favor of a combined due date is that it may reduce response burden. Said one respondent, “I think the same [due] date is better [because] sometimes the information you need for different surveys [is] similar, so it would relieve me of going into the two surveys at once.”

Note that we asked respondents in the combined due date experimental arm if the singular due date impacted the way that they completed the surveys. Most said that there was no impact on their response procedures, with one saying that a single due date has “no impact, [I] just do them when I have the chance, or request an extension.” Similarly, another respondent mentioned that the “same due date did not make any difference” in response procedures.

Response Error: A combined due date may discourage some respondents, leading to higher nonresponse or lower quality data.

Recommendation: Respondents preferred multiple due dates over a combined due date. However, given the added (and unforeseen) stress of COVID-19 on staffing in particular, we recommend additional research on combining or staggering due dates. Specifically, we recommend testing response rates and data quality for firms with a combined due date while further integrating online communication, and doing so outside of the current public health emergency. It may be that the stress of a single due date is partially attributable to staffing pressures and other constraints arising from the change in work arrangements in the wake of the global pandemic.



Respondent Portal

Finding #9: In general, respondents are satisfied with the respondent portal, finding it easy to use and convenient.

We asked respondents specifically about their interactions with the respondent portal through which they gain access to their assigned surveys, can communicate with Census Bureau staff, and can submit data. Overall, respondents evaluated the portal positively. One respondent compared it to other online platforms he uses, saying that “it's a pretty easy platform. This one doesn't frustrate me. Some others do.” Another mentioned that the portal “is easy to navigate through, and very user-friendly,” adding that “last year I was able to forward a survey to another colleague of mine; it was easy to forward that request on to them and for them to get in and figure it out.” Specifically, respondents found entering the survey authorization code to be easy and intuitive. Said one respondent, it is “very easy to enter the authentication code,” while another said she “[didn’t] remember having any problems” entering the code.

When we asked specifically about the survey squares on the portal, respondents understood that each square represented a survey to complete. Said one, “the various surveys are right in front of you. It has exactly what you have to do, [and the] due date helps you know when things are due.” Another echoed this, saying that “what pops up when I first log in [to the respondent portal] are the surveys that are open with their due dates; that's helpful – I don't need anything else.”

Interviewers asked respondents about setting up a new account in the respondent portal. Most respondents said that it was easy to create a new account. Said one, creating a new account “was pretty easy, especially when you have the authentication code.” Another said that creating an account was “not too bad.” One respondent, however, outlined that while accessing his account was fairly easy this year, creating an account in previous years was burdensome, saying:

I went through the mess of creating an account [on the respondent portal] last year. I took it over from the controller. I was semi-familiar with it [at the time]. Entering [the authentication] code: it was brutal last year. I could not make it work. I finally had to call in [to the Census Bureau] and have somebody help me. This year, it was pretty easy. I just went in and there was a code on the letter; I had the access, so it was easier this year. Previously, I didn't have access. I didn't have a code, a log in, a password…nothing. Once I recreated it all, it made it easier for this year.

Response Error: There is no response error for this finding.

Recommendation: Respondents are generally satisfied with the respondent portal. Therefore, we recommend retaining the portal functionality and layout as it is currently.

Finding #10: Respondents suggested additional features for the portal, especially with regard to how often the portal refreshes.

Even when satisfied with the portal, some respondents also suggested additional features. These included:

  • Information on how the data are used and where the data are reported;

  • Estimated amount of time it will take to complete the survey;

  • Notification that a shared survey has been opened or altered;

  • Longer amount of time before the portal ‘times out’; and

  • Required verification of data entered in delegated surveys.

One feature that a few respondents mentioned involved how frequently the portal updates information on response status. On the one hand, the status may be updated prematurely, prompting one respondent to mention that “the day [a survey] is due, it will be marked as overdue [on the portal], even though it is the due date. I think they are marking it too early and that is annoying. If they don't have it at 12:01 on the [due date], they mark it as overdue.” On the other hand, respondents stated that the portal did not update frequently enough, with one saying “when you submit a survey, the square doesn't update right away; [it] takes a few days.” Another mentioned that her company has started tracking Census Bureau surveys independently of the portal because of this issue, saying “the survey squares don't update frequently enough; [I am] tracking responses separately because there isn't an updated status.”

Response Error: There is no response error for this finding.

Recommendation: We recommend further study into additional features that may support respondent survey completion. However, we note that the absence of the suggested features was not mentioned as an impediment to response, so this could be a lower-priority research topic.



Finding #11: Most respondents did not use the delegation feature on the respondent portal; of those who did, most are satisfied but would like more parameters in place.

We specifically asked respondents about the delegation functionality on the respondent portal. Most respondents mentioned that they did not use the delegate function. The most cited reason for not using it is that respondents were unaware that this functionality existed. Said one respondent, “I didn't know [share survey] was under [the] options [menu], and I don't know if the last guy [who had my job] knew it was there either.” This respondent suggested an “icon somewhere on the front page” to bring the delegation functionality to respondents’ attention. Another mentioned that knowing about the delegation feature may have altered the way she completed the survey: “If it is possible to make this more apparent, it would be helpful. I would have shared the survey a long time ago, and then others would have gotten those reminders.” Some respondents noted that they did not use the delegation function because it was not warranted; said one, “I have it all – [I am the] single point of contact and single respondent to the survey.” Another noted that using the delegate function is contingent on the survey questions: “all the information that is needed for these surveys is available; for others, I've used the share survey function, whenever the data are not available.”

However, a few respondents expressed reservations about the delegation functionality, wanting more control over what they were sharing and with whom. One respondent – who did use delegation – was overall satisfied with the portal; however, he noted that “the only functionality that would be nicer would be that if I delegate [a survey], I can delegate just a specific part of the survey. So, like, if I could delegate by [business] unit, that would be really nice.” Another noted that the trepidation over sharing a survey comes from the inability to verify what has been entered, saying “We're not sure what someone would enter. We have one account, and let people get on it, and then I can go in and verify [the data they have entered],” adding that “we are not utilizing share survey” because of “verifying the data entry - how to do that when we share the survey?” Yet another respondent echoed this anxiety, saying that to share the survey, “you have to be trusting - when I share [a survey] with another person, they are able to go into all the forms under that collection. It makes me apprehensive; I wish we could limit access to which forms.” This respondent went on to talk about a negative experience with survey delegation, saying “I shared the survey with another person, and she shared it with someone else, and I had to tell her you should not be sharing the survey with others.”

Of those who used delegation, most responded that they liked the functionality, with one simply stating, “we've never had issues with the delegation functionality” on the portal. Another said that “once you've done it once, it is easy [to delegate a survey]; when you want to share the survey, you add their email address and a note, and then you get a chance to proofread and check the request, which I really like.” Another echoed, saying that delegation is “very easy to use,” and that “we use it a lot.”

Response Error: There is no response error for this finding.

Recommendation: We recommend further study into additional features supporting the survey delegation functionality – specifically with regard to limiting access to certain parts of the survey. However, we note that most respondents who have used this functionality are satisfied, suggesting that this could be a lower priority future research topic.

Finding #12: Respondents use the portal to prioritize their workload and prepare to respond to the survey request.

When we asked respondents what they usually do on the respondent portal, many stated that they use the available information to help them organize and prioritize their reporting responsibilities. Said one, “due dates and prior knowledge help us prioritize which we will answer and in which order. We have to get done with the immediate need first.” Another mentioned that when he accesses the portal, he will “work on the [survey] that's due first.”

Others mentioned that they use the information on the portal to help them prepare to respond to the survey. For example, one respondent mentioned that upon logging into the portal, she “requested an extension, [then] opened the survey to make sure that the questions are the same as the previous [year]. Then, we did a worksheet for each survey so that we can pull the information from our systems. Once we had all the information it is quite easy and quick to answer the survey.” Another also admitted to perusing the instrument before responding, saying that he “usually downloads the PDF and prints [it]” to see what the survey is asking.

A few said that their first action upon entering an authentication code is to reach out to others within their firm to start gathering the necessary data. One respondent mentioned that she has “to reach out to a lot of people, [and that] takes a lot of time.” This same respondent went on to mention that she “uses a tracker to keep tabs on the various surveys they are working on” within her company. Another mentioned that she “doesn't really prepare surveys myself, I send them to others, who leave it to the last minute [to respond], especially if [it is] not mandatory.”

Response Error: There is no response error for this finding.

Recommendation: We recommend retaining the respondent portal as is.



About the Data Collection Methodology and Research (DCMR) Branch

The Data Collection Methodology and Research (DCMR) Branch in the Economic Statistical Methods Division assists economic survey program areas and other governmental agencies with research associated with the behavioral aspects of survey response and data collection. The mission of DCMR is to improve data quality in surveys while reducing survey nonresponse and respondent burden. This mission is achieved by:

  • Conducting expert reviews, cognitive pretesting, site visits and usability testing, along with post-collection evaluation methods, to assess the effectiveness and efficiency of the data collection instruments and associated materials;

  • Conducting early stage scoping interviews to assist with the development of survey content (concepts, specifications, question wording and instructions, etc.) by getting early feedback on it from respondents;

  • Assisting program areas with the development and use of nonresponse reduction methods and contact strategies;

  • Conducting empirical research to help better understand behavioral aspects of survey response, with the aim of identifying areas for further improvement as well as evaluating the effectiveness of qualitative research.

For more information on how DCMR can assist your economic survey program area or agency, please visit the DCMR intranet site or contact the branch chief, Amy Anderson Riemer.


Appendix A: Methodological Overview

Respondent Debriefing Interviews: An Overview

In the course of evaluating establishment surveys, sometimes the most appropriate method is a series of Respondent Debriefing Interviews (RDIs). This type of interviewing is nested under the wider umbrella of ‘ethnographic methods’ used in evaluating surveys, and is usually a semi-structured, protocol-guided conversation between a researcher and one or more respondents, with or without observers. Respondent debriefings are a recontact method, wherein researchers interview respondents who have already interacted in some form with the survey lifecycle, asking them about response strategies, data sources, and other interactions with the survey design (Snijkers et al. 2013: 278). These kinds of interviews use "retrospective focused interviewing techniques” (279) to identify issues within the context of the survey response process.

By asking respondents about their process of response and reflections on a survey, the debriefing interview empowers respondents to take on the role of informant rather than respondent, which can result in a survey instrument that is better attuned to the respondents’ needs (Presser and Martin 2004: 162). In this way, it is vital that debriefing interviews ask pointed questions about the process of responding and not about specific aspects of the survey; retrospective respondent debriefings are "more valuable when they do not rely on the participant's memory" but are instead focused on "features that might not generate spontaneous comments during the [survey response] session or asking preferences" related to the survey administration (Nichols et al. 2020: 339).

Determining whom to recruit for such interviews depends on the goals of the research. Respondents selected for an interview can be chosen purposefully (that is, targeting a specific 'type' of respondent) or drawn from the sample frame of the survey. Interviews can be conducted in person, over the phone, or using online communications platforms, like Microsoft’s Skype for Business.

While RDIs are used to identify a range of possibly problematic aspects of a survey, there are specific types of issues that this method is most attuned to capture. Tourangeau et al. (2019: 56-57) outline the myriad topics RDIs can cover, including:

  • The overall survey response experience;

  • Whether any questions were difficult to understand, overly burdensome, or potentially embarrassing;

  • Respondent confidence in their answers to particular survey items;

  • Difficulties in recalling or retrieving information;

  • Events or facts that respondents may have failed to report or reported incorrectly during the survey; and

  • Reactions to features of the survey design (including contact strategies).



One advantage of the RDI is that it can uncover not only issues but also, “in many situations, suggestions for dealing with the problem” (Hughes 2004: 6). At the same time, Presser and Martin (2004: 169) identify potential drawbacks of RDIs, including that they may not provide direct evidence of reporting error; they may provide only indirect evidence about questionnaire performance, which may need to be supplemented with additional performance indicators; and they are most useful when designed around a substantive and methodological theory (that is, clearly defined research questions).

In addition, RDIs work well in tandem with other survey testing methods, most notably cognitive interviewing (Hughes and DeMaio 2002: 1535; Ikart 2018: 129). However, a major difference between cognitive and usability testing and respondent debriefing is the timing. RDIs can be useful in any type of survey project, but are particularly helpful for identifying outstanding issues in existing questionnaires (Campanelli et al. 1991: 254). Likewise, while cognitive interviewing usually occurs during the operationalization phase of a survey research life cycle, respondent debriefing is intentionally a later-stage method, coming after the respondent has interacted with the survey materials (Hughes 2004: 6). Critical, then, to insightful RDIs is the amount of time between survey administration and the interview, with Presser and Martin (2004: 170) arguing that while processes to completion may be clearly communicated by respondents, “ephemeral thoughts or reactions are likely to be quickly forgotten”; the closer the RDI is to the actual date of survey completion, the more likely recollections are to be detailed and insightful.



Works Cited:

Campanelli, Pamela C., Elizabeth A. Martin, and Jennifer M. Rothgeb. 1991. “The Use of Respondent and Interviewer Debriefing Studies as a Way to Study Response Error in Survey Data.” The Statistician 40(3):253–64.

Hughes, Kristen A., and Theresa J. DeMaio. 2002. “Does This Question Work? Evaluating Cognitive Debriefing Interview Results Using Respondent Debriefing Questions.” Pp. 1535–41 in Proceedings of the 2002 Annual Conference of the American Association of Public Opinion Research. St. Pete Beach, Florida.

Hughes, Kristen Ann. 2004. Comparing Pretesting Methods: Cognitive Interviews, Respondent Debriefing, and Behavior Coding. Survey Methodology. 2004–02. Washington, DC: U.S. Bureau of the Census.

Ikart, Emmanuel Matthew. 2018. “Questionnaire Pretesting Methods: A Comparison of Cognitive Interviewing and Respondent Debriefing Vis-à-Vis the Study of the Adoption of Decision Support Systems by Knowledge Workers.” International Journal of Business and Information 13(2):119–54. doi: 10.6702.

Nichols, Elizabeth, Erica Olmsted-Hawala, Temika Holland, and Amy Anderson Riemer. 2020. “Usability Testing Online Questionnaires: Experiences at the U.S. Census Bureau.” Pp. 315–47 in Advances in Questionnaire Design, Development, Evaluation and Testing, edited by P. C. Beatty, D. Collins, L. Kaye, J.-L. Padilla, G. B. Willis, and A. Wilmot. Hoboken, NJ: Wiley.

Presser, Stanley, and Elizabeth Martin, eds. 2004. “Vignettes and Respondent Debriefing for Questionnaire Design and Evaluation.” Pp. 149–71 in Methods for Testing and Evaluating Survey Questionnaires, Wiley Series in Survey Methodology. Hoboken, NJ: John Wiley & Sons, Inc.

Snijkers, Ger, Gustav Haraldsen, Jacqui Jones, and Diane K. Willimack. 2013. Designing and Conducting Business Surveys. Hoboken, NJ: John Wiley & Sons, Inc.

Tourangeau, Roger, Aaron Maitland, Darby Steiger, and Ting Yan. 2019. “A Framework for Making Decisions About Question Evaluation Methods.” Pp. 47–73 in Advances in Questionnaire Design, Development, Evaluation and Testing, edited by G. B. Willis, D. Collins, A. Wilmot, P. C. Beatty, J. L. Padilla, and L. Kaye. New York: John Wiley & Sons, Inc.





Appendix B: Interview Protocol



CC Protocol


Start of Block: Interview Overview



Q6 INTRODUCTION: Thank the respondent for completing the surveys initially and for taking the time to talk today. Introduce all attendees including yourself. Brief overview and purpose of the call:  


The Census Bureau conducts many different types of surveys throughout the year.  Currently, we are in the process of evaluating the ways that we reach out to businesses across the country, especially those that are in more than one Census Bureau survey. 

 
Today, I am going to ask you a few questions about your recent experience with being a contact for [ODYSSEY SURVEYS:
${e://Field/_ODSY}]. Surveys Key:
R - ARTS - Annual Retail Trade Survey
S - SAS - Services Annual Survey
W - AWTS - Annual Wholesale Trade Survey
Please be candid and frank – all of your responses are confidential, and neither your name nor the name of your company will be included in our reports.







Q7 Firm Background/Primary Contact Questions:
Tell me a little bit about your business.  What types of goods or services does this business provide?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

Q8
What is your role in the company? 
What was your role in the process of responding to Census Bureau surveys?


________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



End of Block: Introductory Text


Start of Block: Pre-Notice



Q9 Instruct the respondent to open the Pre-Mailer.



Q10 Did you receive this letter?

  • Yes (1)

  • Maybe/Don't Remember/Unsure (2)

  • No (3)



Q11 Do these requests usually come directly to you, or does it take time for them to show up on your desk?

  • Directly to me (1)

  • Routed through some other means, describe: (2) ________________________________________________







Q12 Notes on receiving letter:

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Page Break




Q13
Looking at the letter now, what is your initial reaction?

What is the first thing you notice?


________________________________________________________________







Q14 In your own words, what do you think this letter is trying to communicate to you? What do you consider to be the most important pieces of information in this letter?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



Page Break






Q15 To the best of your memory: What was your reaction to this letter? In what ways, if any, did receiving this letter impact the way you responded to the Census Bureau surveys? What actions, if any, did you take as a result of this letter?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Page Break




Q16   Now, take a look at the bottom of the letter – we have included the names and contact information for previous respondents of your business to this survey. Did you find this listing helpful?  Why or why not? Is any information missing from this letter? Did you communicate with any of the listed people about this letter?  Why or why not?  If so, how and what did you communicate?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



End of Block: Pre-Notice


Start of Block: Initial Letter


Q17 Instruct respondent to open up the initial letter.







Q18 Did you receive this letter?

  • Yes (1)

  • Maybe/Don't remember/Not sure (2)

  • No (3)







Q19 Notes on receiving initial letter

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________







Q20 Looking at it now, what is the first thing you notice about this letter?  What is your initial reaction?  

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



Q21 What actions, if any, did you take as a result of this letter?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



End of Block: Initial Letter


Start of Block: Initial Letter - Stagg



Q22 Look at bolded text, “due dates.”  Did you notice those dates when you first looked at this letter?

  • Yes (1)

  • Maybe/Unsure/Don't Remember (2)

  • No (3)







Q23 In this case, each survey has a deadline for response. How, if at all, did these different dates inform the way you answered your surveys? Do you think that providing different due dates for each survey: Is helpful or not helpful? Why? Is it confusing or not confusing? In what ways?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________







Q24 Some businesses had all of their surveys due on the same date.  Do you think that having all surveys due on the same date: Makes it easier or more difficult to respond?  Why? Would encourage or discourage you to respond?  Why?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



End of Block: Initial Letter - Stagg


Start of Block: Initial Letter - Combo



Q25 Did you notice the due date when you first looked at this letter?

  • Yes (1)

  • Maybe/Don't Remember/Unsure (2)

  • No (3)







Q26 Notes on noticing the date:

________________________________________________________________







Q27 In this case, each survey had the same deadline for response.  How, if at all, did this inform the way you answered your surveys? Do you think that providing the same due date for each survey: Is it helpful or not helpful?  Why? Is it confusing or not confusing?  In what ways?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________







Q28 Some businesses had all of their surveys due on different dates.  Do you think that having surveys due on different dates: Makes it easier or more difficult to respond?  Why? Would it encourage or discourage you to respond?  Why?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



End of Block: Initial Letter - Combo


Start of Block: Initial letter ending/Begin Portal use


Q29
As I mentioned earlier, we are experimenting with different ways of contacting business respondents such as yourself.  This year, we combined requests for a few surveys into one letter. We also sent two initial letters: one was sent before the surveys were available to answer, and the other was sent once the surveys were available.




Was the earlier letter helpful or not helpful when preparing to respond to these surveys? Are there other ways you prefer we contact you to inform you that the surveys require your response? (phone call, email, etc…) Was it clear or not clear from the letters that your company would be responsible for completing multiple surveys? If we keep this survey structure (multiple surveys in one request), is there some way we could communicate this change more clearly?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Page Break




Q30 Instruct the respondent to open the screenshots document.  Start with the first - portal landing page.

This is what we call our respondent portal. Do you remember if you needed to create a new account, or did you already have an account on our respondent portal?  
Odyssey coordinated contact screenshots v2

  • Created new account (1)

  • Already had an account (2)

  • Don't know/Don't remember (3)





Display This Question:

If Instruct the respondent to open the screenshots document.  Start with the first - portal landing... = Created new account



Q31 Can you tell me about the process of creating a new account?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Page Break






Q32 Once you were logged in, how easy or difficult was it to enter the authentication code from the letter? Did you notice the squares with the different surveys on them pop up? Was it clear or unclear that each of these was a different survey you were required to complete? What information, if any, is missing from these squares?
 
Odyssey coordinated contact screenshots v2
 

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Page Break




Q33
Did you click on the ‘options’ tab on any of the squares?
 
Odyssey coordinated contact screenshots v2

  • Yes (1)

  • Maybe/Unsure/Don't Remember (2)

  • No (3)






Display This Question:

If Did you click on the ‘options’ tab on any of the squares?   Odyssey coordinated contact screensho... = Yes



Q34 Did you use any of the options?  Which?  What did you think of them?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Page Break




Display This Question:

If Checked_In Contains Y



Q35
FOR CASES THAT COMPLETED....



Once you had access to your surveys, how did you proceed?
How did you decide which survey to answer first?
How easy or difficult was it for you to work on these surveys simultaneously?
What additional information, if any, would you have liked to have known before answering your surveys?  What additional information might you have wanted on this screen?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Display This Question:

If Delegation = Yes



Q38 FOR CASES THAT USED DELEGATION: Did you need to coordinate with other people in your company? I noticed that you used the ‘share survey’ function.  How easy or difficult was it to use this function? Why did you use the delegate function? Did you find it to be helpful or not helpful?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Display This Question:

If Delegation = No



Q39 FOR CASES THAT DID NOT USE DELEGATION: Did you answer the surveys or did someone else at your firm answer the surveys? How did you communicate what data were needed with the other people/person? On the website, we have a ‘share survey’ feature that allows you to share access to the survey with others in your company. I noticed that you did not use the ‘share survey’ function.  Why is that?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





Display This Question:

If FOR CASES THAT DID NOT USE DELEGATION: Did you answer the surveys or did someone else at your firm answer the surveys? How did you communicate what data were needed with the other people/pers... Text Response Is Displayed



Q49 Odyssey flyer – INSTRUCT THE RESPONDENT TO OPEN UP THE FLYER   Do you remember seeing this flyer? In your own words, what is this document trying to communicate to you? Were you aware that there is a ‘share survey’ function available within the instrument?  [IF NO]: Would this feature have been helpful to you?  Why or why not? How can we make this function more obvious on the website?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________





End of Block: Initial letter ending/Begin Portal use


Start of Block: Process



Q41 As I mentioned earlier, the Census Bureau is looking for ways to streamline our data collection from businesses across the country.  Did getting access to the surveys all at once change the way you typically respond to our surveys? Did the timing of the surveys make a difference in the way your firm responded to the surveys?

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



Q42 End/Wrap up: Overall, what do you think of the process we discussed today? Did you contact the Census Bureau with any questions while completing the surveys? Did you call? Did you use the secure messaging center in the portal? Would you say that someone in your role was the appropriate person to answer questions like these?  If not, who would be the best person to answer questions like these? Do you have any other comments, questions, or suggestions for us?
Thank the respondent for his/her time and attention.
 

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________

________________________________________________________________



End of Block: Process






Appendix C

Premailer Letter

All data in this letter are fabricated.

NOTICE OF CHANGE:

NEW COMPANY CONTACT FOR MANDATORY CENSUS BUREAU SURVEY(S)

The Census Bureau is modifying our communication with you and your company for one or more of the 2019 annual surveys, which will be mailed in early 2020. Please read this notice carefully.

Why is the Census Bureau making this change?

In previous years, the Census Bureau requested your company’s participation in multiple economic surveys and sent separate communications regarding each one. In order to make effective use of taxpayer dollars and improve our operational efficiency, we are beginning to streamline our communications. Starting with the 2019 Annual Retail Trade Survey, 2019 Annual Wholesale Trade Survey, and 2019 Annual Services Report, we will no longer notify different points of contact at your company for each of these surveys.

Moving forward, the Census Bureau has identified the following person to be our main point of contact regarding the surveys listed below. If you have any questions, please call us at the number provided at the end of this notice.

Contact Name

111-222-3333

[email protected]

The following are the surveys affected by this change and their previous contacts:

Annual Retail Trade Survey:

Contact Name

111-222-3333

[email protected]

Annual Wholesale Trade Survey:

Contact Name

111-222-3333

[email protected]

Annual Services Report:

Contact Name

111-222-3333

[email protected]

What if I have questions about these changes?

If you have any questions or concerns regarding this change, please call our customer help line at 1-877-787-9860, Monday through Friday, 8:00 a.m. to 5:00 p.m. Eastern time.

As always, the Census Bureau thanks you for your participation in our surveys to ensure timely and accurate statistics about the U.S. economy.


Initial Letter – Combined Due Date

All data in this letter are fabricated.

XXXX-L1 (Draft)

A Message from the Director, U.S. Census Bureau:


We request your participation in Survey 1 and Survey 2 [and Survey 3 if applicable]. Data compiled from these surveys provide part of the official statistics used to measure economic performance in the United States, and provide the nation’s policymakers and business leaders like you with measures of these important economic sectors.

Authentication Code: XXXXXXXX

Due Date: March 24, 2020

  1. Sign in OR register at https://portal.census.gov

  2. Add your authentication code.

  3. Report by clicking on each survey’s “REPORT NOW” button. You can return to your account over multiple sessions to complete these surveys.

YOUR RESPONSE IS REQUIRED BY LAW and will be kept strictly CONFIDENTIAL. We estimate that the Annual Retail Trade Survey will take an average of 37 minutes to complete, and the Annual Wholesale Trade Survey an average of 31 minutes. Additional information about the authority, confidentiality, and burden of this data collection can be found on the back of this letter.

For assistance with completing these surveys, please sign into your Census Bureau account or call our customer help line at 1-877-787-9860, Monday through Friday, 8:00 a.m. to 5:00 p.m. Eastern time.

Thank you in advance for your time and participation, and for helping the U.S. Census Bureau measure America’s people, places, and economy.

Sincerely,

Steven D. Dillingham

Director

XXXX-L1 (Draft)

OMB Number

These collections have been approved by the Office of Management and Budget (OMB). Without this approval, we could not conduct these surveys. The eight-digit OMB approval number for the Annual Retail Trade Survey is 0607-0013, and for the Annual Wholesale Trade Survey is 0607-0195. The applicable number will appear in the top right corner of each reporting screen.

Authority and Confidentiality

Title 13, United States Code, Sections 131 and 182, authorizes these collections. Sections 224 and 225 require your response. The U.S. Census Bureau is required by Section 9 of the same law to keep your information confidential and use your responses only to produce statistics. The Census Bureau is not permitted to publicly release your responses in a way that could identify your business, organization, or institution. Per the Federal Cybersecurity Enhancement Act of 2015, your data are protected from cybersecurity risks through screening of the systems that transmit your data.

Burden Estimate Statement

We estimate that the Annual Retail Trade Survey will take an average of 37 minutes to complete, including the time to review instructions, search existing data sources, gather and maintain the data needed, and complete and review the survey.

We estimate that the Annual Wholesale Trade Survey will take an average of 31 minutes to complete, including the time to review instructions, search existing data sources, gather and maintain the data needed, and complete and review the survey.


Initial Letter – Staggered Due Dates

All data in this letter are fabricated.

ODYS-L1R (Draft)


The Office of Management and Budget (OMB) approval number for the Annual Services Report is 0607-0422, for the Annual Wholesale Trade Survey is 0607-0195, and for the Annual Retail Trade Survey is 0607-0013.

REMINDER OF REPORTING OBLIGATION

Recently, the U.S. Census Bureau mailed you a letter asking you to complete Survey 1, Survey 2 [, and Survey 3 if applicable]. If you have submitted any or all of these surveys in the past few weeks, thank you. If you have not yet reported, please do so before the due dates noted below.

Due Dates:

February 25, 2020 – Survey 1

March 24, 2020 – Survey 2

April 28, 2020 – Survey 3

Please check the status of each survey by following these steps:

  1. Sign in OR register at https://portal.census.gov

  2. Add your authentication code OR locate your surveys under “My Surveys”

Authentication Code: XXXX-XXXX-XXXX

(If the code has already been used, this space instead provides a message stating that the code was used and directing the respondent to log in.)

  3. View the reporting status for each survey

  4. Report by clicking on “REPORT NOW” for any survey not showing a “Complete” status

Thank you in advance for your time and participation, and for helping the U.S. Census Bureau measure America’s people, places, and economy.

Sincerely,


Kimberly P. Moore

Chief, Economy-Wide Statistics Division

U.S. Census Bureau


Odyssey Flyer

All data in this flyer are fabricated.

[Image of the Odyssey flyer]

Respondent Portal Screenshots

All data in these screenshots are fabricated.

[Images of the respondent portal screenshots]






