Attachment H
Department of Commerce
United States Census Bureau
OMB Information Collection Request
Annual Integrated Economic Survey
OMB Control Number 0607-1024
Debriefing Interviews for the 2023 AIES
September 30, 2024
Cognitive Testing Support Project: Final Report on Debriefing Interviews of the Annual Integrated Economic Survey (AIES)
Final Report
Prepared for
Economy-Wide Statistics Division
Economic Statistics and Methodology Division
The U.S. Census Bureau
Prepared by
Y. Patrick Hsieh, PhD, Katherine Blackburn,
Victoria Dounoucos, PhD, Tim Flanigan, and Chris Ellis
RTI International
3040 E. Cornwallis Road
Research Triangle Park, NC 27709
The Census Bureau has reviewed this data product to ensure appropriate access, use, and disclosure avoidance protection of the confidential source data (Project No. P-7530157, Disclosure Review Board (DRB) approval number: CBDRB-FY25-ESMD001-001).
Contents
1. Introduction
2. Methodology
2.1 Goals and Research Questions
2.2.1 Population of Interest
2.2.2 Participant recruitment
2.2.3 Debriefing interview protocol and procedure
3. Findings from the Debriefing Interviews
3.1 Module 1: Introduction (n=51)
3.1.1 Process of responding to Census Bureau surveys
3.1.2 Process of finding the data for Census Bureau surveys
3.1.3 Process of entering the data for Census Bureau surveys
3.2 Module 2: Responding to the AIES (n=51)
3.2.1 Overall positive experience with the AIES
3.2.3 Company data does not match Census Bureau requests
3.2.4 Customizing reports and other accommodating activities for responding to Census surveys
3.2.5 High respondent burden associated with the integrated survey design for the new AIES
3.2.6 Question comprehension and applicability
3.2.7 Redundant or duplicative questions
3.2.8 Technical issues related to reporting for the AIES
3.3 Module 3: Locations in Puerto Rico (n=13)
3.4 Module 4: Explicit Response Choice and Error Checking (n=51)
3.4.1 Decision to use downloadable Excel file or online spreadsheet
3.4.2 Feedback about using online spreadsheet for data entry
3.4.3 Feedback about using downloadable Excel file for data entry
3.4.4 Additional feedback on reporting to the AIES independent of both methods
3.5 Module 6: Web Standards Exploratory (n=26)
3.5.1 Screenshot 1: Step 2 Reporting Period
3.5.2 Screenshot 2: Fiscal Year Selected
3.5.3 Screenshot 3: FAQ Modal Window
3.5.4 Screenshot 4: Step 2 Grid Format
3.6 Module 7: Interactive Content Tool (Protocol Version A, n=11)
3.6.1 Comprehension, prior knowledge, and navigation of the interactive content tool
3.6.2 Ease of selecting the industry that best represents the company
3.6.3 Engagement with the question export functionality
3.7 Module 8: AIES Website (Protocol Version B, n=13)
3.7.1 Feedback on the AIES landing page on the Census website
3.7.2 Feedback on the AIES “Information for Respondents” page
3.7.3 Feedback on the AIES “FAQ” page
3.7.4 The most helpful page
3.8 Module 9: AIES Emails (Protocol Version C, n=13)
3.8.1 Feedback on postcard
3.8.2 Feedback on AIES advance email
3.8.3 Feedback on due date reminder email
3.9 Module 10: AIES Letters (Protocol Version D, n=13)
3.9.1 Feedback on AIES – initial letter (L1)
3.9.2 Feedback on AIES – past due notice letter (L2)
3.9.3 Feedback on AIES – Office of General Counsel “Light” letter (L4L)
3.9.4 Feedback on AIES – experimental “Dear CEO” letter (ECSL1)
3.10 Module 11: Ideal Field Period (n=51)
4. Recommendations
4.1 Recommendations for the AIES instrument
4.1.1 Recommended improvements for the question arrangement and instructions
4.1.2 Recommended improvements for the online spreadsheet and the downloadable Excel spreadsheet
4.2 Recommendations for the AIES materials
4.3 Recommendations for the fielding time frame
5. Concluding Remarks
Appendix A. Debriefing Interview Protocol
Appendix B. Participant Informed Consent Form
Appendix C. Recruitment Materials
Appendix D. Communication Materials
Appendix E. Report of Debriefing Interviews with Firms Primarily or Exclusively Located in Puerto Rico
Appendix F. Participant Informed Consent Form (Spanish)
Appendix G. Recruitment Materials (Spanish)
Exhibits
Exhibit 2.1. Sample Characteristics and Recruitment Outcomes
Exhibit 3.1. Time to Complete the AIES in Hours by Number of Locations
1. Introduction

The Annual Integrated Economic Survey (AIES) is a redesigned survey that integrates and replaces seven existing annual business surveys with a single streamlined instrument. The goals of the AIES redesign are to provide an easier reporting process for businesses, collect better and more timely data, and reduce costs for the U.S. Census Bureau through more efficient data collection. Designed to be conducted annually, the AIES will provide key yearly measures of economic activity, including the only comprehensive national and subnational data on business revenues, expenses, and assets.1
As part of the research effort to improve the AIES, the Census Bureau contracted RTI International and Whirlwind Technologies, LLC, to conduct debriefing interviews with respondents who completed the AIES web instrument, in order to assess the clarity, effectiveness, and usability of the first AIES instrument, launched in March 2024. From the interview feedback, the project team assessed these research questions and gathered comments about the respondent-facing communication materials to inform relevant recommendations.
This report summarizes that effort, thematic research findings, and ensuing recommendations and includes the materials used in support of the effort.
2. Methodology

2.1 Goals and Research Questions

This evaluation study explored the reporting experiences of AIES respondents, focusing on soliciting feedback about their experience with completing the survey, including the level of effort for completion and their ideal length and timing for participation; their general impressions of the survey instrument, including the screen layout, font, and other features; and the content and accessibility of the communication materials supporting the data collection. The findings from the debriefing interviews enabled the Census Bureau to understand the types of issues that emerged during the AIES, confirm the consistency of participants’ experiences with the findings from previous research efforts, and assess strategies for further improving the usability of the web instrument and the clarity of the communication materials. The study thus contributed to the Census Bureau’s continuing research effort to improve the AIES.
2.2.1 Population of Interest

The population of interest for the debriefing interviews consisted of businesses that completed the AIES instrument after April 1, 2024. Because of the rapid turnaround needed to meet the debriefing interview schedule, only businesses for which the Census Bureau had an email address and phone number in the dataset were eligible for this study. Census Bureau staff from the Economy-Wide Statistics Division (EWD) provided the project team with a sample list of businesses that completed the AIES instrument on a rolling basis throughout the debriefing interview data collection period. This sample list was used to contact potential respondents.
In addition to contact information, the EWD staff were also able to append to the recruitment file several key characteristics of the businesses, including the total number of establishments, North American Industry Classification System (NAICS) code, whether the business had establishments in Puerto Rico, and whether the business was in the manufacturing sector. However, these key characteristics were not available for all businesses and were used to track the sample diversity of completed interviews rather than for targeted recruitment. The total number of establishments was used to categorize businesses into three sizes: single unit, small business with fewer than 10 establishments, and large business with 10 or more establishments. Because of missing NAICS code data, business sector information was confirmed during the debriefing interview.
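To make the size classification explicit, the rule described above can be expressed in a few lines of code. This is a minimal, illustrative sketch: the function and argument names are hypothetical, and returning None for missing counts reflects the study's practice of tracking, rather than imputing, incomplete size data.

```python
def size_category(total_establishments):
    """Classify a business by its total number of establishments,
    using the three size classes defined in this study."""
    if total_establishments is None:
        return None            # size data were missing for some businesses
    if total_establishments == 1:
        return "Single Unit"
    if total_establishments < 10:
        return "Small (<10)"   # small business: fewer than 10 establishments
    return "Large (>=10)"      # large business: 10 or more establishments
```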
2.2.2 Participant recruitment

The goal of participant recruitment was to conduct approximately 50 debriefing interviews. Consistent with the earlier debriefing interviews for the AIES Dress Rehearsal, 50 interviews with a diverse group of businesses were deemed adequate to provide sufficient feedback on the AIES experience and the supporting communication materials.
The Census Bureau and the project team collaboratively designed a schedule for two rounds of debriefing interviews, with a brief intermission after half of the interviews were complete to conduct a preliminary analysis of the insights from the first round and to make any needed changes to the protocol. On April 11, 2024, Census Bureau staff shared the first sample list with the project team to recruit debriefing interview participants. As the project team contacted businesses on the sample list to solicit research participation, we also monitored the breakdown of sectors and business sizes among completed interviews.
The project team completed 26 interviews from April 11 to May 24, 2024, for Round 1, and another 25 interviews from June 25 to July 26, 2024, for Round 2. Across both rounds of data collection, the project team contacted 657 sampled businesses and completed 51 interviews; six scheduled participants did not attend their interview, and another 31 refused to participate. Among the 51 completed interviews, 12 participants represented a single-unit company, six represented a small company with fewer than 10 establishments, and 33 represented a large company with 10 or more establishments. Fourteen participants represented a manufacturing company, and another 11 represented a company with establishments located in Puerto Rico. Additionally, some participants in the concurrent e-Commerce Exploratory Interviews data collection were asked about their operations in Puerto Rico.2 Their comments on the AIES were included in the analysis of the debriefing interviews. Exhibit 2.1 details the distribution of completed interviews by business size; the three columns sum to 532 rather than the 657 contacted because business-size data were missing for some sampled businesses.
Exhibit 2.1. Sample Characteristics and Recruitment Outcomes
Interview status by type | Single Unit | Small (<10) | Large (≥10)
Total sample | 163 | 130 | 239
Completed | 12 | 6 | 33
No show/withdrew/refused | 19 | 6 | 18
Note: Columns categorize businesses by number of units; cell entries are counts (n) of sampled businesses.
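As a quick arithmetic check on Exhibit 2.1 (using only figures stated above), the size columns reconcile with the recruitment totals as follows:

\[
163 + 130 + 239 = 532, \qquad 657 - 532 = 125,
\]

that is, business-size data were missing for 125 of the 657 contacted businesses, and the completed row sums to \(12 + 6 + 33 = 51\), matching the 51 completed interviews.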
Data collection began with the following contact protocol: one email, followed by one call, then a second email for all sample members. Because of low engagement and the rapid data collection timeline, this contact protocol was adjusted to focus primarily on emails, which allowed for a wider reach of potential participants more quickly. All 657 sample members received at least one email communication, and 292 were contacted by phone to solicit participation. Following the calls, 206 sample members received either a second email invitation or a second phone call as the last recruitment attempt (see Appendix C for details of the recruitment messaging materials).
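For concreteness, the original and adjusted contact sequences can be summarized in a short sketch. This is illustrative only, not an operational system; the function and field names are hypothetical, and the counts in the comments are those reported above.

```python
def planned_attempts():
    """Original protocol, applied uniformly to all sample members."""
    return ["email_1", "call_1", "email_2"]

def adjusted_attempts(member):
    """Adjusted, email-focused protocol: every member receives a first
    email; phone calls and a single final follow-up (second email or
    second call) are applied selectively rather than uniformly."""
    attempts = ["email_1"]                    # all 657 members received this
    if member.get("phone_attempted"):
        attempts.append("call_1")             # 292 members were called
    if member.get("final_followup"):
        attempts.append("email_2_or_call_2")  # 206 members got a last attempt
    return attempts
```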
Participant recruitment was challenging for several reasons. Sample members refused to participate in the interview primarily because of the time commitment; with many businesses engaged in end-of-fiscal-quarter activities and the tax filing season, typical points of contact for Census Bureau surveys were overwhelmed with other work. Future data collection efforts with AIES respondents should be sensitive to timing around year- or quarter-end activities, tax-filing deadlines, and businesses' other regulatory compliance activities.
2.2.3 Debriefing interview protocol and procedure

As a survey pretesting method, debriefing interviews are typically employed at a later stage in the instrument design and evaluation process.3 Although the analytic goals of debriefing interviews may be similar to those of cognitive interviews, debriefing interviews take a more holistic approach, asking participants to respond to debriefing questions or probes in a live survey setting. This method is suitable for collecting feedback on the survey instrument of focus as part of a field test or a production survey implementation like the AIES.
Census Bureau staff from EWD provided the project team with the debriefing interview protocol for review; in turn, the project team recommended revisions, subject to the Census Bureau's final approval, to further clarify the debriefing questions and the flow of the protocol.
During data collection, RTI survey methodologists conducted all debriefing interviews virtually using Microsoft Teams. Participants were provided a phone number to call into the meeting or could join via their computer. Depending on whether the interviewers possessed a Census Bureau–issued laptop or had access to the Census Bureau's virtual desktop, approximately half of the interviews were completed with audio only and the rest with video. The debriefing interviews followed a semi-structured interview protocol (Appendix A). The protocol included questions that gauged participants' experience in completing the AIES, how their experience compared with previous Census Bureau surveys, how participants retrieved information, participants' perception of question difficulty, and the level of burden associated with data retrieval and reporting. The protocol also inquired about participants' experience with, and feedback on, the interactive data entry functionality of the instrument, including error checking, the downloadable Excel spreadsheet (for reupload), and the online spreadsheets. Lastly, the protocol included questions about participants' impression of, and engagement with, the AIES communication materials they received, including an interactive content tool for survey preview, a postcard, emails, mailed letters, and help resources on the website. Interviews lasted approximately 45 to 60 minutes, and the study did not offer monetary incentives for research participation (see Appendix D for details of the communication materials).
Before the interview, participants were asked to review the consent document, which was sent via email. At the start of each interview, the interviewer reviewed the main points of consent and confirmed that the participant consented to be interviewed. Participants were also asked to consent to the recording of the interview; if consent was provided, the interviewer began the recording. The interviewer then reviewed the introduction with the participant, explaining that the goal of the interview was to gather feedback about their experiences with the AIES.
Each interview began with the interviewer asking a few background questions about the company's core business and the participant's experience with answering Census Bureau surveys, along with how that previous experience compared with completing the AIES. Participants were then asked questions about their interaction with the AIES instrument, including estimated burden and challenges with specific questions or topics. Some participants struggled to recall specific items with which they had difficulties, but others aided their recall during the interview by consulting a copy of the survey they had saved for their records. Participants were also asked specifically about their data entry experience, focusing on their feedback on the interactive elements of the web instrument, including the error checking functionality, the online spreadsheets, and the downloadable Excel spreadsheet. The second half of the debriefing interview focused on questions regarding participants' engagement with the communication materials, including a review of an interactive content tool for survey preview, the survey information provided to respondents on the AIES website, a postcard, two emails, and four mailed letters that were sent to participants in advance of the interview. The following sections summarize the findings and suggested recommendations.
3. Findings from the Debriefing Interviews

This section is organized by the flow of the interview protocol. Each subsection reports the main themes that emerged from the corresponding module of the debriefing interviews.
3.1 Module 1: Introduction (n=51)

3.1.1 Process of responding to Census Bureau surveys

Participants' length of time in their role as point of contact for Census Bureau surveys correlated directly with their ease of response. Participants who had been reporting for more than 1 year found the process of completing the new AIES easier than those who reported for the first time this year. These first-time respondents could be overwhelmed by the number of questions, the format of data entry, the granularity of the questions, and the number of people they needed to contact. This trend of increased ease with repeated reporting was confirmed by three participants who had been reporting for Census Bureau surveys for 2 to 6 years, who indicated they believed next year's survey would be easier to complete.
“Because this was my first time it wasn’t that easy. And also, because we have several locations, they all needed to be reported separately…If I wasn’t such a novice, it would probably be easier. The next time it is going to be easier probably.”
“I will probably use the forms and reports I already created. It should be easy to move those forward and use the same methodology to populate next year’s forms as long as they don’t change a bunch of things, like columns or questions.”
3.1.2 Process of finding the data for Census Bureau surveys

Most participants find the data they need to answer Census Bureau surveys by reviewing internal reports and by requesting the support of other colleagues. Thirty-six participants reported working with one or more of their colleagues to pull the data they need. On the high end, one participant mentioned working with 18 different individuals within their large company to gather all the needed data and responses for the AIES.
“Partially due to the size of our company, there is a lot of coordination with individuals from other business units. I have payroll information, but in terms of locations and financial data for those locations, I have to work with about 8 other individuals…[in] real estate, accounting, finance.”
“Overall, this was a very wide-ranging survey. It definitely covered a lot of areas of our business and what comes to mind immediately with this question is we worked with many, many teams. It wasn’t all items we could pull ourselves. We worked with probably five to seven teams around our business to pull all of the different data just because it was really varied material.”
Those who collaborate with other colleagues typically work with individuals in the Human Resources (HR) department for payroll and employee data, information technology (IT) departments for technical questions, and Research and Development (R&D) units for specific R&D data. Although many of the points of contact at businesses are accountants or housed in finance departments, some are in administrative or HR roles. Participants in these roles must contact more people in their organization because they do not have easy access to the data they need. They serve as the liaison for the company, reviewing the questions and figuring out which colleague can provide the correct data. These participants also typically have limited specialized knowledge about finance and sometimes need assistance understanding and interpreting the questions.
Some participants must also work with outside resources when accounting is not done in house. Two participants talked about the need to consult with their accounting firm to obtain data on depreciable assets. These participants noted that working with their accounting firm can be expensive and is not a step that they have had to take for past Census Bureau surveys. A few participants also serve as dedicated external reporters for their companies; their primary role is to handle surveys and government reporting. These participants are typically at very large businesses and must work across many data sources and colleagues to gather the data needed for the AIES.
“Having to reach out to our accounting firm to get a refresher on reading the depreciation schedules on our financial statements. This cost us money to have to consult with them.”
“I’m not going to make anyone else do it or pay someone a lot of money per hour to try and figure it out.”
One pain point of the new AIES, for participants who work extensively with other colleagues, is the lack of an easy way to preview all the questions so that they can send all data requests at one point in time. Although some participants found the question preview tool, most did not find it helpful because it seemed to list more general questions than those contained in the survey, or questions that were not applicable to their industry. Only one participant thought being able to download the questions in advance was helpful and found that the downloaded questions matched those asked in the survey.
“[When I used the question preview tool,] I got every question it could possibly ask and some of those were not in my actual survey because mine is manufacturing of a certain type. It was dumb and a waste of time. It was tight, it was hard to read. It was before part 1 and you needed part 1 to know what to answer next, this was generic.”
“It was helpful to see the [question] preview but it doesn’t tell you everything you need to know. I’m trying to figure out what information I need to have and whether I will need to get information from other places. The whole process was really convoluted to have to go through all of the steps. I’d rather have one giant spreadsheet which is more or less the way it used to be.”
However, many participants seemed unaware of any way to preview the questions on the survey. Participants were accustomed to being able to preview all the survey questions, typically through a PDF, when answering Census Bureau surveys. Viewing all survey questions at once facilitates data collection when the point of contact needs to work with others to collect the information required for the survey. A PDF preview of the survey allows the point of contact to send all questions and requests for information to their colleagues at once, helping ensure the organizational burden remains at a manageable level for respondents. A preview of the questions also helps AIES respondents better prepare for the task of reporting. Many participants felt surprised and frustrated when they got to Step 3 and saw the amount of information being requested.
“One of the things that was frustrating that I tried to—and maybe it was an error on my part—but I tried to print all the questions beforehand so that I would know exactly who I needed to ask what, that way I could ask it all at one time. But I was unable to do that. So, there were some instances where I had to say ‘oh, here’s another question’… If I had a full view of what the survey looked like then I could kind of go through the questions and say okay let me parcel this portion out that’s specific to payroll so let me ask them all these specific questions…I think it would help kind of having access to view all of the questions up-front as opposed to as you’re going through the survey because that will help plan and reach out to others. I can be working on a portion of it while my counterpart is also working on a portion of it, as opposed to ‘oh, I can’t move forward because I don’t have all of this information to proceed’.”
“Maybe if it started with an outline of what it would ask me about, but when you are just clicking from question to question, there wasn’t any rhyme or reason for where they were going or what was going to be next…”
Relatedly, participants expressed frustration with their inability to move forward in the survey without providing a response. Allowing respondents to move freely through the survey would also address some of the pain points of not having a complete preview of all the survey questions. With the current format, participants had to reach out to their colleagues and wait for a response before they could proceed to the next question for data entry in the online platform. Participants copied and pasted questions, took screenshots, or sent copies of the Excel worksheet with fields highlighted to facilitate their requests. Because participants could not make all requests at once, they experienced considerable back-and-forth and time delays while collecting the data they needed.
“As far as inputting the data, the biggest piece of feedback – not being able to advance to the next question until you answered the question was really difficult. And very difficult in the coordination of others as well. Some people were ready, and some people weren’t. It would have been great if they could have popped in and input their data as it was ready, but we weren’t able to do that.”
“It wasn’t easy to navigate because it wasn’t clear what I needed. After getting some of the data, I had to have multiple calls with the people collecting the data [colleagues] and share my screen so they could pull the data at that moment to give it to me.”
Some participants entered placeholder values (e.g., 0 or 1) to be able to view all the questions in each section, but this approach could lead to data entry errors if they forgot to update a response.
“For Step 2, I wanted to know everything it was going to ask me, specifically the PP&E section. It let me bypass it at first. I just put 0s and 1s until I saw everything that I needed but I was concerned that I wouldn’t be able to go back in there but then I was able to.”
“Typically there is a link that will show you the different questions they are going to ask so you can print that out and gather the data ahead of time before you go into the system…I ended up going through the survey and putting zero on a lot of them, taking notes on what I needed to complete, logged out, gathered the information, went back to the beginning and re-entered everything from start to finish, then I hit submit.”
Among those who have direct access to the data, most use multiple reports or data sources to gather everything they need for Census Bureau surveys; 10 participants reported using three or more reports to gather the requested information.
“I pull the reports from payroll, clinical software billing report, and prior taxes.”
“I used our payroll information and our accounting software and statements, month-end statements, year-end statements.”
“I have to gather it from several different places depending on what the question is. It might come from payroll reports, some things come from financials, some from our depreciation schedule, a conglomeration of different areas.”
When asked about how easy or difficult it is to find the data participants need for Census Bureau surveys, 21 participants indicated it is easy whereas 14 shared that it is difficult. The other participants found the process somewhere in the middle with some aspects of finding the data difficult but still manageable. Many of these participants reported that while finding the data is not particularly difficult, the process is time-consuming.
“Easy would not be a word that I use… It can be a bit daunting at times. Some of the questions just don’t seem like they fit together. So, I may have to send out different requests to try to get the answer to one question…”
“The amount of time [makes these surveys a nuisance]. You have to take time away from measurable work product to complete something that really beyond you and me and the Census Bureau, no one knows we’re doing.”
“It isn’t difficult but is time consuming. It takes time [for me] to find the data on the accounting software and also to get input from payroll and HR departments for questions about employees.”
“It’s not easy. Often times, I feel like what I get from the Census Bureau is designed for a single entity business. We have hundreds. Some qualify to go on the Census, some don’t. At times, I feel like I am hammering a square peg into a round hole.”
Furthermore, there is a trend related to business size and ease of locating data, with different challenges for small and large businesses. About half of participants from small businesses (those with fewer than 10 establishments) reported that finding the necessary data is easy, although they still encountered some difficulties. Small businesses encounter issues with mismatches between the data they have access to and the data the Census Bureau is asking for, a lack of internal reporting, and a lack of granularity in their reports. About 40% of participants at larger businesses (those with 10 or more establishments) noted that finding the data is easy, and one-third noted it was difficult. Many larger businesses face issues with the granularity of questions, disparate reports, difficulty reporting for locations, and collaboration with more colleagues. However, larger businesses sometimes have the advantage of additional resources dedicated to reporting and more thorough and detailed reports.
Additionally, there are some trends comparing non-manufacturers with manufacturers, with patterns similar to those for small versus large businesses. About 40% of manufacturers reported that finding the data was easy whereas another 40% reported it was difficult; about half of non-manufacturers reported finding the data was easy. Manufacturers are required to complete additional questions in the AIES that their counterparts are not, which may influence perceptions of the difficulty of the AIES. One participant noticed this issue regarding the number of questions, sharing, “Depending on the NAICS code certain questions were asked. Manufacturing locations had a lot more questions than non-manufacturing locations.”
3.1.3 Process of entering the data for Census Bureau surveys

Of the participants who responded to the question about the level of difficulty of entering data into Census Bureau surveys, the majority indicated that it is easy. Those who reported that it is difficult experienced a range of issues, including having to round their data as requested by the Census Bureau, encountering technical issues with data entry, and understanding the type(s) of data being requested. Larger businesses find entering the data slightly more difficult than small businesses do, but there are no apparent differences in difficulty between manufacturers and non-manufacturers. These points are discussed further in the context of the AIES in the next section.
3.2 Module 2: Responding to the AIES (n=51)

Module 2 of the Round 1 interview protocol (see Appendix A) asked participants to compare their experiences with previous Census Bureau surveys to the new AIES before gathering more feedback about the AIES. After reviewing the data from the 26 interviews in Round 1 of data collection, the project team and the Census Bureau decided that the comparison probes were not producing additional insights about responding to the AIES. In fact, some of the questions comparing the AIES to other Census Bureau surveys were confusing to participants because they had completed the Economic Census in 2023, which is often confused with the AIES. For these reasons, we did not include the comparison probes in Round 2 of data collection. The other probes in Module 2 were retained for both rounds of data collection and are reported below for all 51 interviews.
3.2.1 Overall positive experience with the AIES

Most participants (n=34) expressed that they had an overall positive experience with the AIES and appreciated that the Census Bureau was trying to improve the process for mandatory reporting. However, even those who saw positives to the new approach experienced technical or usability issues, which are outlined in this section. A few participants complained that the process of gathering the requested data and entering the data into the survey was very time-consuming. However, some did note that they will likely be more efficient with the process next year. In general, most participants approached the AIES in the same way that they have approached previous Census Bureau surveys, both in terms of how they gathered and submitted the needed data.4
Participants were asked to share their overall impressions of the AIES and how the survey compared with their previous experiences. Seven participants indicated that they thought the AIES was easier than prior surveys that they had completed for the Census Bureau.5 These participants thought that the format of the data entry and the organization of the questions helped speed up the process of entering the data, giving them the impression that this year's AIES was shorter and simpler than past surveys. Two participants who report for many different companies particularly appreciated the new combined survey approach, which made data reporting easier and more straightforward for each company they report for.
“It was just the ease of getting through the survey. Last year’s survey I’d go through and populate stuff individually for the 8 different sites we have. This year, the way it was laid out, it was all there on one screen, just broken apart. It made it so much easier.”
“I thought the entry was easy this year. I didn’t have too much of a problem with that.”
“I loved [the new AIES]. I anticipated it taking a week and a half to do and it was mostly done in half a day. It was just a god send.”
“To be honest I don’t know exactly what it was but it felt a lot shorter, and less time consuming, I don’t know if it was less questions or the way it was presented, not sure what happened there... I have been doing these for a while and it just felt easier. Maybe it was just my familiarity with the data they were asking for this time around.”
“Overall impression when we learned that it was coming, happy when we learned that surveys were being consolidated. That had been a challenge. Different units were receiving different surveys, and we didn’t always know what other folks were doing or what involvement that they needed. It did bring some new challenges as far as having to coordinate and making sure everyone was providing their data in a timely manner. And, even knowing who to go to for the data.”
“I think it was much simpler just off the top of my head. I don’t think I spent as much time as I had in prior years.”
“It was a little bit more user–friendly and easier to complete because it was a combined thing instead of having to do multiple, separate censuses. I did appreciate that.”
“I think the fact that you had everything in that one [survey] as opposed to me having to do different surveys. Like I said I do surveys for 9 different companies, and before all those surveys were done other places so it felt like so many surveys. Now I noticed everything was in one place, there is just one area I need to focus on, and when I am done, I am done.”
“It was nice to have, there were a number of surveys that were combined into this one big survey, and that in itself was nice, so I didn’t have to worry about doing 10 individual surveys throughout the year. I didn’t have to worry about doing all of them, just this one. So that was the plus side.”
Many participants reported that the new AIES included more questions and more detailed information than they had to report previously. As one participant stated, “I don’t recall ever having to go into that much detail into our company expenses [in past Census Bureau surveys].” Sixteen participants expressed that they had difficulties with, or concerns about, the level of granularity that the AIES asked for, which makes reporting much more time-consuming. Participants tended to experience the most issues with the location-level questions, which many participants considered the most difficult and most time-consuming section of the survey. In particular, participants had difficulty reporting employee counts, depreciable assets, inventories, discrete expense line items, and revenue by locations. Larger businesses experienced more difficulties reporting data on their locations because they have more data to report and less insight into the detailed operations of all locations. Around 60% of the manufacturing companies interviewed reported issues with the granularity of the data requests, likely due to the additional questions required for each manufacturing location.
“It did take a little bit longer to go through all the locations because we do have 15 locations plus the other company and it’s 8 locations. That took longer because some of the other surveys are more like summary of the locations.”
“For this form, I tried to approach it different…in the prior [survey] we didn’t answer for each individual location, it used to be companywide. What is your Cap Ex for the company? Now I am looking for the Cap Ex for each location. So, it’s like a more granular level of detail. So, I had to redo the process of it to get that level of detail.”
“Oh, yeah, [the location data is] tricky too. That data's not readily available, you’ve got to go hunt it down. And then you have to make sense of it. I have to make sure I’m giving you something that makes sense and I have to make sense of it in my mind. We’ve got 30 locations and this location made XYZ income. Does that jive with the financial statements we’ve been putting out based on the report I’m about to plug into this? It’s not like I’m just flopping numbers down on a screen, it’s got to make sense.”
“If I remember correctly on this report I think somewhere in there you had to break down your management from your front-line workers, to your clerical workers, and sales people. You know, that kind of stuff. It’s just time-consuming.”
“Very comprehensive… Having everything on a location level definitely requires a lot more detail and a lot more time...It asks a lot and when you’re dealing with 600 locations and different types of businesses outside of just our warehouse business such as some of our manufacturing businesses that we have, it just felt very comprehensive in the sense of the entirety of locations and then the amount of information required for each of those.”
“If you had three or four sites, it’s probably great. If you have over 2,000 line items to complete like I do, it’s absolutely miserable. If you’re a small company and maybe have a small number of sites, if you open a location, it’s a really big deal and you know about it. If you are like we are, with locations all over the globe that we are constantly getting in and out of…it’s a lot. We’re always moving around. The way I would see all of these movements is through the payroll report.”
Compared with location-level questions, company-level questions were much easier for most participants to answer because the data are more accessible and there are fewer questions; just as location-level questions were the hardest for participants, company-level questions were almost universally the easiest. Most participants had no confusion about which parts of the company to include when responding to company-level questions in the AIES. However, one participant did experience some confusion over how to report employees by location because one location may have employees from several different companies. This participant was able to get guidance from the helpline to report on all employees and complete the company questions.
“The survey was sent to [Company A]. But we have 8 other companies under different FDINs. We have employees under [Company A] in these locations, but we also have employees from these other companies. They’re not in all locations. There were questions on who we should be reporting on. We received guidance to report on all employees, which opened up a much broader set of data.”
Participants were also probed about the difficulty of industry-level questions, and, similar to company-level questions, most find industry-level information easy to report. However, several participants experienced difficulties with industry-level questions because of the organization of their company and the associated data. One participant reports for a parent company that includes many different entities and, within each entity, multiple industries. The data for this company are organized at the location level rather than the industry level, which meant the participant had to manually assign NAICS codes to the appropriate data to report by industry. Another participant shared a similar issue because their business has multiple industries at one location, making questions about the breakdown of work challenging to answer. A final participant described issues with retrieving industry-level data primarily because the colleagues they worked with missed that some of the data needed to be provided at the location level and some at the industry level; the differing groupings requested across questions were themselves a problem.
“The [questions] reporting by NAICS [were the hardest] because we can run all of our reports by profit center [or location] but then I had to go in and assign all the profit centers to a NAICS code and then do another formula to pick up all those to populate…Each brand has numerous profit centers. So, like, if they have their wholesale division, or commerce division, or retail store division, restaurant division, so it’s all broken down into different profit centers. And then within those divisions there are more profit centers. So, it is more difficult pulling the data especially when it’s by those NAICS codes.”
“We do have multiple locations and at one, we have retail and manufacturing in one location. That’s where it got difficult for me to differentiate costs to either retail or manufacturing in the same building.”
In addition to the location-level data, participants also struggled with reporting sales, company expenses, depreciable assets, payroll, and inventory. Some of these topics were difficult for participants because gathering the requested data involved collaborating with colleagues, which means sending the data request, waiting for the data to be returned, answering clarification questions, and finally reporting the data in the survey.
“Where I’ve got to reach out to the IT or Facilities guys. Those are going to take more time…any time I’ve got to collaborate it takes more time.”
“Number of people and their individual payroll and the revenue. There was back–and–forth to figure out which employees are in which state.”
“Like this past year, you asked a lot of cyber security questions, you really made me dig for who could answer those.”
3.2.3 Company data does not match Census Bureau requests

The topics of sales, company expenses, depreciable assets, payroll, and inventory were also difficult to report because companies' internal data structures do not match the way the AIES requests the data. Seventeen participants reported that the way their data are organized does not match the way they need to report it for the AIES. Generally, the AIES asks for data at a more granular level than businesses maintain in their internal reports. Without a business need to track data at that granularity, respondents struggle to report at the detailed level the Census Bureau requests.
Location-level data were the most commonly mentioned data that participants did not have access to at a granular level. For example, one participant stated, “When it came to revenues, the survey asked for revenue by location. It’s not something we track. We don’t care about it.” This participant explained that their business does not have a need to track such revenue data by location and consequently, having to provide these data on the AIES was particularly burdensome. Another participant reported that their business organizes their finances by program rather than by location, making it more difficult to report location data. Other topics that participants had difficulty reporting at a more granular level or on a more detailed breakdown included software and hardware, operating expenses, depreciable assets, electric costs, utilities, internet sales, and capital expenditures. Larger businesses (those with 10 or more establishments) described more issues with their company data not matching the requests from the AIES compared to smaller businesses.
“Due to our own shortcomings on our internal statements and needing to add up different line items and come up with the correct total for the question on the survey was time consuming.”
“Our system doesn’t use the same terms that are used in the survey. I have to play with numbers to get them to spit out the answers.”
“Where it was asking about depreciable assets and capital expenditures, ending balance, what we had accumulated in any capital lease agreements. That section was nasty and difficult to work through. Then it gets a little bit more detailed after that asking about software vs hardware. That required more estimation because we do not split that out, we just combine our assets into one.”
“In general, it depends on how specific the data being requested is. For AIES, it is being categorized differently than we would categorize it for business use. So, it takes some backing out and exporting and reanalyzing to get to the buckets that the AIES wants to see the revenue and expense data and payroll…One of the things I ran into was in the cost information, a lot of the operational cost information. I have all the data I need existing in our systems, the difficult part becomes fitting it into the buckets requested.”
“The problem doesn’t lie with us getting the report from payroll, the problem lies with the correlation between the way our payroll department reports the data and the way Census Bureau wants it sent to them. It becomes a real issue when we try to merge the data together. It is very difficult.”
“The harder questions are when they are asking about different sales and expense categories. They don’t always line up with the way we prepare our financial statements. I know they can’t be tailored to us but basically I have to think about how they translate from us to the Census Bureau.”
3.2.4 Customizing reports and other accommodating activities for responding to Census surveys

Most participants want to report data as accurately as they can for Census Bureau surveys and have several strategies to manage data mismatches. Ten participants stated that they created custom reports to obtain the data that were being requested on the AIES. Although creating these reports took a significant amount of time this year, these participants hope that the effort will result in increased efficiencies in future years. These participants are, however, concerned that any changes to the questions could make their newly created reports unusable. Additionally, some participants wrote extensive documentation about how questions were interpreted and how data were pulled to assist in the process next year. Consistent with the data-mismatch issues described above, larger businesses are also more likely than smaller businesses to need to create custom reports to respond to the AIES.
“They are not standardized reports currently. We use [a specific accounting software] as our business software system, so it is a little bit dated. Most of it is creating custom reports to pull the information I want because we don’t have preset reports that are going to give me the information I need for these forms.”
“I’m a bean counter, so my approach is the same for all of these things. I read over it first. I say, ‘Okay, how can I get all the data together.’ I build some kind of model that brings in the data if I don’t have it readily available and then I go put it into your application…If you ask the same questions next year, it should be pretty easy.”
“One thing that we put in some effort to do as we went around the first time is make sure that the resources we were building and the step-by-step instructions we were building, the internal preparation perspective, are repeatable. So, what we did is for anything we were using our accounting software to pull, we outlined specific pages where you refresh the date, you refresh the specific locations and we can pull the data in a way that mirrors the prior year. So, we put some effort into making our research templates repeatable and then also building lists of the contacts we worked with within the company.”
A few participants admitted entering zeroes for data that they did not have readily available so that they could complete the survey, without any intention of filling in the data correctly. One participant stated, “I think I put zeroes in for some locations because I couldn’t even think about how to split that.” However, many participants spend time making manual calculations or custom reports to provide information that is as accurate as possible. Fourteen participants reported that they had to manually calculate responses to answer some of the questions within the AIES. About 53% of the manufacturing companies interviewed reported having to do custom calculations to report for the AIES. We suspect one reason for manual calculations may be that businesses were asked to report information at a more granular level (e.g., by location) than the level at which they usually track it (e.g., by company). Similarly, as noted in Section 3.2.3, there may be notable differences between the way businesses organize or break down their records and the way the AIES requests the information (e.g., the breakdown of expenses, payroll, or utility usage). As a result, participants needed to do additional data management on their records to produce the appropriate response to the AIES.
“Sometimes there are requests asking for information [that are] not in line with how we track or present our information. I have to massage the data to get [it] into the survey.”
“Utilities specifically, it asked for electric costs. We don’t record electric costs, we record utility costs, we get joint billings with gas and electric. So to break that out is a manual process. You have to look at every invoice and split the electric cost.”
“For example, like one of the questions asked me how many kilowatts of electricity were used in manufacturing. In our building, we have a café, offices, and manufacturing and electric and gas is on the same bill. I have to look through my paper copies of old bills to figure out how much of the bill was electricity and do some kind of math to figure out what percentage was for manufacturing. It took forever just to answer that one question. If I just didn’t care, I’d make something up.”
“And then all the expenses are grouped by different things so I had to go and code all of our expenses into those groups. You’re doing a lot of coding and remapping before you can get it into that data.”
“Our reporting team prepares raw balances and income statements by site and company. We massage the data. We are not fancy enough to have a reporting system that I can mess with easily. So, I take theirs in Excel and mess with it.”
Another issue for participants is year-over-year changes to the survey questions. Such changes make it more challenging to gather and report the data, especially for participants who try to keep records from each survey. One participant felt that it is useless to try to prepare for next year's survey given how much the survey questions change from year to year.
“There were a few things in past surveys that I had reported as separate line items that were now combined. There was a bit of confusion as to what I did last time and now being asked to do it differently. I was trying to figure out which items were which.”
“It can be difficult. I usually try to keep notes from the prior year but sometimes the questions change, or which question you answer change, so that makes it more difficult to gather the data.”
“I am not sure how I could prepare, it seems to change as well so I am not sure next year’s report would be the same as this one, so I am not sure what to prepare for. I could separate and isolate data per location, but it wouldn’t be relevant, and it is time consuming so it would be a waste of time.”
“The only thing I could do [to prepare for next year], because we report on our fiscal year, is tell my team, these are the questions we got asked last year, go out and do that for the fiscal year in the fall this year. Then, it would be already done [before we get the next AIES], hopefully. But, if the questions change, I’ve wasted everyone’s time and the world would be furious with me.”
3.2.5 High respondent burden associated with the integrated survey design for the new AIES

Another noteworthy issue experienced by participants is how time-consuming it was to provide and submit the requested data for the AIES. Although 13 participants spontaneously reported that having a single survey for the data collection was a positive, 20 said completing the survey was time-consuming. Such feedback may reflect the amount of information the AIES is designed to collect from survey respondents, and the respondent burden may emerge from either the data retrieval or the data entry. For example, several participants stated that they did not have any issues finding, entering, or submitting the data, but the process required a significant amount of their time. When asked which part of the process required the most time, some participants indicated spending more time on retrieving the data than entering it into the survey, whereas others reported that data entry was more time-consuming than data retrieval. As expected, larger businesses (those with 10 or more establishments) complained about the amount of time needed to complete the AIES more than smaller businesses did.
“I did appreciate that there was only one survey that I had to submit but it took way longer than I expected to get all of that data in order to enter it into the report.”
“At first, I thought this was in addition to all the other surveys. So, I thought, oh boy, another survey! So then, I think you mentioned it and the person who called me to schedule this [interview], mentioned that this is INSTEAD of some of the surveys. That changed my opinion on it. Maybe this was a little more efficient. I’m not sure how many surveys this replaced that we would normally do. But I think then there was some efficiency gained…just pulling up the reporting portal, diving into all your financial info, submitting report, doing that two or three times, just takes that much more time than just doing it the one time.”
“I do appreciate that it looks like they are trying to consolidate some of these reports, so it is just one giant one instead of all of them…In our case […] it was a waste of energy, and it was disruptive. I can imagine a larger company with thousands of employees who probably have staff designated to specifically handle stuff like this, I don’t have someone like that. I am involved in revenue production and patient care, so to have to pull myself away for half a day was disruptive to our workflow and our patients.”
“It takes time. It’s somewhat easy but does take time to generate different reports depending on what is needed from the Census questions. It’s not horribly hard but it is time-consuming.”
“I didn’t think it was difficult to answer any of the questions, I just thought it was time consuming.”
“I think in some ways more efficient but in some ways I kept thinking wow is this ever going to end?”
“This is the closest that I’ve ever come to writing my representatives ever. It is a total waste of time. It is the closest I’ve come to quitting my job. Because it’s miserable doing it. I hate it. I think it’s the dumbest thing ever. To me, it’s the definition of governmental bloat.”
When directly probed about whether they would prefer to answer one survey versus multiple throughout the year, 22 out of 28 participants shared that they would prefer to complete one longer survey rather than multiple, shorter surveys throughout the year. These participants explained that one survey is easier to keep track of, makes logging into the portal more efficient, improves efficiencies when working with colleagues, and allows for easier yearly reporting. The three participants who preferred multiple, shorter surveys throughout the year felt that shorter surveys are easier to fit in around their other work commitments and do not require them to devote a significant amount of time at once to completing a survey. The other three participants explained that they see pros and cons to both approaches and do not prefer one over the other.
“I’m definitely down for just one survey and submitting once in a year because that’s just fewer things we have to worry about as an organization.”
“It’s nice to just knock it all out in one. It is less confusing because sometimes you think you finished one survey and then you get another request for another one and you think I already did that one but it’s not the same. So, it was nice to just have one in that regard, but it was bigger.”
“It was nice being one and done. Going in once and getting it done and not having to change your password, get in again, 3 months later change the password again and to get in again.”
“I would rather do it all at once especially if this can be consistent year over year. That would be great.”
“[I prefer] shorter surveys throughout the year because I have so many companies I have to report on. Dedicating a whole week to answering questions definitely took away from running the business.”
“I think if this could be a quarterly survey that would make it a bit more palatable because it is a lot of questions.”
Participants also reported issues or challenges regarding the definition or applicability of specific survey questions in the context of their business. For example, some questions were challenging for participants to understand because of confusing or unfamiliar terminology. Twelve participants mentioned issues with question comprehension, such as in the examples below. Two participants found the length and wordiness of the questions especially challenging. Two participants also called out the potential for different interpretations across businesses, and for unreliable data as a result, because of ambiguously worded questions.
“…the question might be unclear and what do I think they are really looking for and trying to tie that back to our information. I will go back and read their instructions, see if that helps me any further, if that doesn’t, I might go back and look at the previous years, or get one of my accounting team members and see what their interpretation is.”
“Also, they had a huge section of questions like on ‘do you do last in, first out’ methods of accounting and I have never heard of that. Maybe this is some really obvious thing to a real accountant but I’m guessing that any small business owner who is not a real accountant but who knows how to enter expenses and income and pay their taxes is not going to ever have heard of this or know what these questions mean. There’s no option to say, ‘I don’t know what you’re talking about. I’m not going to answer this question.’ I even googled it. The Google answer didn’t even help me. I felt like I had to submit it because it’s the law. I honestly can’t even remember what I did. I think I just put zeroes in all of them because I had no idea. I had not encountered that before. I had never had no idea what a question was asking about until this survey.”
“Data are not hard to find. The interpretation of questions can be hard though. Sometimes contacts at Census Bureau aren’t sure exactly what it’s asking about specifically. This is often because the question is too general.”
“Sometimes I think there’s a little bit of vagueness in the interpretation especially if you’re asking six different people the same question and they interpret it their own way.”
“The [question] descriptions are very repetitive and wordy. It just takes time to read it all. You have to read it otherwise you might miss important information that you need to answer the question appropriately, but they’re hard to understand.”
Participants also reached out to the helpline when they struggled to understand the questions. Four participants who were explicitly asked about help resources in Round 2 of data collection reported contacting the customer helpline to seek guidance on ambiguous or confusing questions. One participant did not realize there was a helpline and requested access to a point person to help them better understand how to respond to questions that were confusing and unclear.
Furthermore, at least 12 participants encountered questions that they believed did not apply to their business. Navigating some of these questions was quite time-consuming, and participants felt uncertain how to answer them. Topics that were not applicable included physical location questions for remote-only workers, depreciable assets, stocks, international locations, research and development activities, robotics, manufacturing questions for a business not involved in manufacturing, and foreign affairs. One participant explained that he believed some of the questions about operating leases were no longer relevant to any business because of changes to U.S. Securities and Exchange Commission rules. Encountering these questions left him frustrated because it appeared that the Census Bureau was not up to date on the latest rule changes. Some participants explained that they would prefer a filter question that could skip them past all of the questions that were not applicable, or the ability to select a “don’t know” response for such questions. A few participants noted that the grayed-out, nonapplicable questions in both the online and Excel versions of the spreadsheet were difficult to navigate and made the spreadsheets more overwhelming.
“What I seemed to notice was that [there was much that wasn’t] applicable to our organization. It was a long survey where we answered a lot of questions with ‘No’ because they didn’t apply to us.”
“The data that I had was relatively easy to find, but most of the data requested was not relevant to our type of business. That was probably the most laborious portion to go through and read. For example, there was one section where they wanted to know how many robots we owned for production, how many we rented, how many were used, how many were new, but you had to read through the report and that was extremely laborious. I can tell you most of the data that we entered was zero…It was quite a laborious project just to read each line-item request, and very frustrating realizing I just put zeros for almost everything… The spreadsheet section that had column after column, I am not sure which one had the robot section, but I can remember some things were so off the wall, one had to do with like how many latex nipples do you produce, I guess that is pertinent to somebody but not us. It was that section where they wanted to know how much of this did you produce, how much of this did you use. I think I remember a lot of stuff about what was produced domestically or did you rely on foreign. We don’t produce anything foreign so anything that had to do with overseas production was irrelevant to us. So going through all of those categories it was just zero after zero after zero because it was geared toward a company that does manufacturing.”
“It was irritating though to have to go through non-applicable portions of the survey to make sure that they did not apply to the company. For example, the company is not in manufacturing, only in retail of clothes, but we had to go through a lot of manufacturing questions to make sure that we didn’t need to answer them.”
“There were probably some parts that I feel aren’t applicable and you just kind of have to get through them. Whether it’s putting a 0 in a number area or clicking no or NA or whatever it is. And that can be kind of cumbersome because you just want to get through it, but I don’t know what parts exactly without looking. Sometimes I feel like if you would answer certain answers to one question, it should take you and skip all that other stuff.”
In addition to being asked questions that were not applicable or relevant to their business, six participants noted that they were asked questions that seemed duplicative or redundant. They explained that it was as if they were being asked to report the same information multiple times in slightly different ways, such as by different time periods (e.g., quarterly and annually) or by different groupings (e.g., by location and by industry). They felt frustrated by having to report the same information as different totals and did not understand the purpose of doing so.
“Some of the stuff is duplicative. You’re asking for quarterly information and you’re asking for annual information. Why precisely? I mean maybe some of that stuff is seasonal. I don’t know. Some of it seems duplicative.”
“It wasn’t bad, I remember going back a lot to the previous year’s March data and then through the end of the year. It felt like I was regurgitating the same information at times for two different time frames. Not sure what the end game was, maybe that’s data that they use. It was a little duplicative there. We had 14 locations for me to enter the same questions over and over again. They wanted the March information and then the year end information, just to see if business is running smoothly through the year or if there is big swings one way or the other.”
“Some of the questions were duplicated from company and industry level, is that right? It seemed to be. Some of those were the same and a little bit more elaborate of the two. At first I found that confusing, why am I doing this again and again, but then it made sense afterward.”
“It was a little messy. This line and this line is asking the exact same thing twice, then I realized you are separating from the manufacturing from the admin at the same site. So I answered the question even though in Part 1 I [had reported the totals].”
“I found in the most recent round of the AIES it requested a lot of duplicate information. So, I would answer the questions …and then it directed me to an Excel spreadsheet that wanted the exact same thing. I felt like I was answering the same thing twice. Or questions that were very similar to each other.”
The comments about duplicative questions also reflect another problem participants shared: a lack of understanding about the purpose of the AIES, especially the level of detail the survey requests. This lack of understanding can make the reporting process more frustrating for respondents. Participants did not understand why the Census Bureau needs to collect such granular information or what the Census Bureau does with it. More participants from larger businesses (those with 10 or more establishments) expressed these sentiments about the purpose of the AIES, likely because larger businesses typically must answer a higher number of questions.
“I hated it. I wondered why they needed all of that information, honestly. I can understand the labor aspect of it but I didn’t understand why they needed to know how much you spent on rent versus utilities versus how much you spent on laptops for employees this year.”
“To be honest, it is a real nuisance sometimes. Partly why I agreed to this [interview], it’s part of our duty to provide this information. I’m sure it’s helpful in ways I don’t even understand. So, we do it none the less. But at times it can be like I just wish I could skip this question. ... I’m sure there’s reasons for it. it’s our duty to provide that information and we do so to the best of our ability.”
“To be honest with you, [completing the AIES is] just a pain. I’m not getting paid for this time. It just adds 5 hours to my work week which is already pretty full. I think I’m doing it for the government’s benefit, which is fine, if you are doing things to make our lives better. I’m just doing it because I feel like I have a responsibility to do it because I was asked.”
“What is the penalty for not completing the survey? Also, how many surveys can they really make you do each year?”
“I would say [the AIES is] …I don’t know how to describe it. It’s just brutal. The level of granularity is insane. I don’t know what value this could have for anybody. It just feels like we’re throwing numbers at something. I don’t even know if I feel good about those numbers. One finance person may be interpreting those questions in one way and grabbing those numbers and then this person pulled a different number. Then, there’s not even a comparison.”
“So, the bigger question from me is do you really need it and is somebody following up and is it really being used when we put so much time into this stuff… I’m more curious, do you really need all of that information? What are the top ten things that you guys need that you are going to use? And are people filling them out credibly or just going through quick because it’s so hard!”
“My overall impression is that it’s there. I don’t know why I felt obligated to do it. I don’t know if other people do it. I’ve never seen actually the results of doing this—and that could be my fault for not looking at them—but to me it’s not extremely valuable. I just do it because I feel like I’m supposed to.”
Although the Census Bureau quickly became aware of an issue with saving data at the beginning of AIES data collection and sought to address it, three participants mentioned experiencing this known issue when saving their data in the online survey.
“This year, I had filled out like half of it and while it was being reviewed, that information had been deleted. It was frustrating to have to do the re-upload. We had a new entity this year and that was also erased so I had to re-add the new entity. That was a pain. After I had redone it, I got an email that there was a server error during the time I had logged in and that everything had been lost. I had to do it three times. This was very frustrating. The email I got from the Census said do not go back in until we tell you it’s ok to do so but I never got the email. So eventually, I went in and it did get submitted right away so I didn’t risk it not going in.”
Some participants also reported other technical issues that interfered with their reporting process. For example, several participants reported inaccurate data populated in their surveys, such as closed locations, incorrect NAICS codes, and inaccurate primary business activities. These inaccurate data posed the biggest challenge when participants worked in the Excel spreadsheet, where they could not update or correct the information. Further detail about how participants navigated these locked cells in the Excel sheet is provided in the next section.
“There is a location that closed in 2013, it’s still there. It’s not a terrible nuisance but it is frustrating because it’s not like you can just delete the row from the input form, you had to specify why you sold it, who you sold it to, the same information from 7 years ago. Adding new locations works fine, but removing old ones is difficult.”
Several other issues were reported by participants, including the desire for a printed report of their submitted responses, complaints about duplicative reporting across government agencies, and usability suggestions for the website. Two participants commented on the lack of an easy way to save the final submitted responses. Although the Excel sheet allows participants to save the responses provided in spreadsheet format, it does not capture the responses provided only in the survey portal. Many participants preferred to keep a copy of their responses for their records in case additional questions arose and to help prepare for the following year's survey.
“Another thing: with other surveys once I completed it I’m able to pdf it like I retain a copy. For the AIES one I don’t recall there being an option to do that.”
“I’m trying to remember… I was looking for it this morning, but I assume I did it online because I couldn’t find a copy of it. I wanted to look at it again before we talked, but I couldn’t find it.”
One participant also brought up the issue of duplicative data reporting across government surveys and would prefer that the Census Bureau use the data submitted to other agencies. This participant shared, “there could be some cooperation with IRS and Census Bureau with depreciable assets. That information is all submitted with our tax return each year. It seems like why can’t the two of them talk instead of having to reproduce that for the Census Bureau.” Another participant echoed similar sentiments, sharing, “We’re a public company. We publish our financials to the world every quarter. Everybody knows what we’re doing. We’re a public company, the IRS is here constantly. We’re an open book. This is just somebody asking for a separate book that they want to look at differently.”
Finally, one participant mentioned experiencing issues with navigating through all the questions in the online portal. This participant used the Excel spreadsheet for Step 3 but wanted to have some way to navigate all items in the online portal easily, such as with a question pick list. He wanted to be able to confirm he had answered everything and easily move through the survey to check his responses.
“I think the only suggestion I had was having a dropdown menu of all the parts of the survey that needed to be completed so that if I had to go back to a question to refer, I think I had to keep clicking ‘Back’ to go back to an individual question. I may have missed something in the portal, but I didn’t see any way to look at a whole list and go “Oh, this is the question I wanted to go back to.”
Thirteen participants across the two research efforts were asked about their record-keeping practices for business locations in Puerto Rico. All participants whose businesses had at least one location in Puerto Rico explained that the data for those locations were easily obtained and reported.
“It was very easy. We treat PR as a separate company, when we set them up on the books, they are set up separately.”
“[Getting the data for Puerto Rico is] pretty simple, [the Puerto Rico location] is integrated into the same system as any of our US [locations]. There is no currency exchange that needs to be done. For us, it is streamlined who I need to talk to. The [fact that it is in] Puerto Rico doesn’t add any extra layer of difficulty than any other [location].”
“[The records are from the] exact same source. Everything is generated off of location number and the database contains all locations, including those in Puerto Rico.”
“It should still be just the same as the others because it’s all in our system. So, it wouldn’t be any different really because all the data we have is in the same places.”
All participants indicated that they use the same steps to pull data for their locations in Puerto Rico as they do for their locations in the United States. Furthermore, the data about their locations in Puerto Rico are just as accessible as the data about their locations in the United States. One participant noted that gathering detailed capital expenditures data required sending an email to their location in Puerto Rico, but the total level of effort in collecting this information was comparable to that for their domestic locations.
“They are treated and addressed the same, the only thing I’ve seen is one year they asked for more detailed Cap Ex data for them. So I have to reach out because that is not in our system. But anything operational I have. I would contact the controller in PR, it’s not an issue, send an email.”
“It is the same process as some of the others, I have to go down to the finance person, payroll person, and the operations person and ask them for the data, because we don’t have that information in our normal system, it has to all be pulled.”
Participants did not identify any issues with reporting data separately for their locations in Puerto Rico. Even the participant who explained that they needed to reach out to their locations in Puerto Rico for capital expenditures data did not have any issues with reporting and felt that the burden of obtaining and reporting this information was low.
Participants whose businesses have locations in Puerto Rico also reported that they have not encountered any language or cultural barriers in the process of gathering these data for the AIES. In some cases, the participants do not need to collaborate with any local staff to obtain the data, instead relying on reports they access themselves or colleagues based in the United States. However, even if they did need to collaborate with local staff, these participants clarified that all of the staff they interact with speak English.
In addition, the Census Bureau conducted another 11 debriefing interviews with businesses primarily or exclusively located in Puerto Rico, uncovering similar experiences, with few language or cultural barriers to completing the AIES. See Appendix E for further details of these interviews.
Although 62% of businesses in Round 1 chose to fill in their data using the online spreadsheet rather than downloading an Excel file, the results across all 51 interviews are more evenly split between the two methods: 42% used only the Excel file, 36% used only the online spreadsheet, and 22% reported using both. This is likely due to the types of businesses included in the Round 2 sample. Primarily because of the timing of data collection, as explained in Section 2, the Round 2 sample consisted of businesses that had to request extensions past the initial AIES due date. These Round 2 participants included larger businesses, which tended to use the Excel sheet more often than smaller businesses because of the complexity of their reporting. Additionally, these large businesses preferred to use the Excel spreadsheet to communicate with colleagues about the data they needed to report.
“I downloaded the Excel file from the respondent portal. We would never be able to do that using the website, we couldn’t line it up. It would take us probably 3 months to get that done. We have people who work for different entities at the same location. [The online sheet] is very cumbersome and doesn’t work well for a company that is our size with all of the different locations we have.”
“I downloaded the Excel sheet so we had all the questions and data so the accounting department could see what they needed to answer as well, so I took that and keyed in from that. I didn’t know if I could trust that to upload right and I didn’t want to start over.”
Seven participants said they were motivated to use the Excel spreadsheet because it made collaborating with colleagues easier, and they used different techniques to do so. One participant sent the Excel spreadsheet to their accounting department to fill in the data and then manually entered the returned information into the online spreadsheet. Others highlighted the cells each person needed to fill in and created multiple versions of the Excel spreadsheet containing only the data needed from each person; this approach was time-consuming and could introduce data entry errors as the point of contact synthesized all the responses from colleagues. Many stated they would not trust having multiple people in the portal because of concerns about overwriting responses and accidentally submitting the survey before it was complete.
“We chose the download option and that was mostly because we had multiple people working on the same spreadsheet. That was more helpful to be able to pass it around.”
“I had to use the downloadable Excel file. To give 18-20 people access to an online tool not knowing how your online tool would work and be able to allow people to go back and forth in that online, I wasn’t willing to take the risk of them clicking a button and saying done.”
“We opted not to use the web forms for a few reasons, primarily though because we weren’t the only team working on this so to be able to coordinate with various teams we needed to be able to download and send a copy highlighting exactly what data we needed.”
“Last year’s was way easier because I was able to just give them access to it and they could plug it in themselves. This year, it was an Excel file with a whole bunch of columns grayed out based on the locations. The questions I was constantly getting was, ‘What do I actually have to provide?’ Everything that’s not gray. I had to create a SharePoint site, pull down the file so they could access it all at the same time and complete it instead of being able to directly access the site the way they did in the prior year. So, they all complained to me the whole time. I got constant questions.”
“The majority of that survey was an old survey called the MA10000. [It] went out to about 18 different manufacturing plants in it. I could assign the cost accountant who handled that plant that survey. They could answer it themselves based on the reports I pulled for them. They could fill in the pertinent questions. The change [this year] made it that it was one person, me, and it was a spreadsheet that was from column A to column DD. That’s huge. A massive survey. I had 18-20 people having to go in so I had to make a copy of it to make sure they didn’t oops anything. I had to add, it was difficult, I had to add a couple of columns in, you had to put names in so they knew which they were assigned. It made many more steps and a lot of double checking.”
Other reasons participants chose to download the Excel file included the ability to save work incrementally on their own computer without having to log into the portal, better visibility than the online spreadsheet, and improved usability for entering location-related responses.
“Visually, it was easier to see it in the spreadsheet format, the downloaded file, than to scroll back and forth on a page within a website. You have a smaller view on a website page versus a spreadsheet which I can look at the whole view. Keep the spreadsheet download upload thing because that felt more user-friendly.”
“I find that it is much less risky that you will lose your work. Otherwise, the page can time out. You can use your arrow keys instead of the tab buttons. You can save your work so you can look back at what you answered. You can also import information.”
Although the Excel file has advantages for large businesses, those who did use the online spreadsheet thought it was easier to use than the Excel file. Many of these participants felt downloading the Excel spreadsheet was an extra step and additional effort that they worried might not work correctly. Some were concerned about their ability to fill in the cells in the Excel sheet without errors, and others worried about how user-friendly the Excel sheet would be. They thought the online option would be faster, and some worried that the Excel upload function might not work. When probed further in Round 2 about these upload concerns, participants expressed fears about data getting lost or reported inaccurately, as well as encountering errors during the upload process. One participant who chose the online spreadsheet was worried about the data security of downloading and then uploading an Excel sheet containing their responses.
“It was easier for me to read and understand on the website, I could just fill in the actual blanks instead of being worried I am putting it in the right box on an Excel spreadsheet.”
“I decided to use the web sheet, at that point I didn’t want to duplicate my efforts in any way, instead of filling out a spreadsheet. I know sometimes that can be helpful but I didn’t know if I would need to retype everything or if I could just submit the spreadsheet, so I just went with the web based.”
“I used the online spreadsheet. I thought that huge companies have this data in the spreadsheet and they could just upload it. That’s not us. It didn’t even occur to me to upload something.”
“I don’t generally like downloading spreadsheets. The online tool did the addition for you. The spreadsheet didn’t have anything the online tool didn’t have. If you download the spreadsheet, you’ve got to upload the spreadsheet. My mind worked very quickly on this decision. It’s just easier for me to get this information and plug it in here than to download the spreadsheet and then plug it in.”
“I thought it would be quicker and easier. I just thought it was removing a couple of steps, I didn’t have to download a file, scan the file for viruses, wait for it to open, enter, save, reupload. And it didn’t look like there were that many questions required for our company. It seemed like online was simple and worked.”
“My experience with downloading and uploading excel spreadsheets is that there are often formatting controls that will give you errors based on inputting the data. Since it is one location it was fairly straight forward, it was just easy to enter it instead of uploading a spreadsheet and troubleshooting through formatting controls.”
“The reason I didn’t do the [Excel] option was because I wasn’t sure about the file mapping. I didn’t want extra data to go with the file. I don’t know that it would have, I didn’t even try it but that’s why. I didn’t want information that shouldn’t have been shared or wasn’t asked for that I had to include.”
Another reason participants chose the online spreadsheet is that they liked that the online option saved automatically; however, as described previously, there were some issues with this save function early in data collection.
“To me it just seemed easier to do it via the online route because it saves as you go along as opposed to me having to worry about ‘oh wait, where did I save that spreadsheet?’ and run into the possibility of something not wanting to upload or there being compatibility issues.”
There were, however, eleven participants who used both the online spreadsheet and the downloadable Excel file, and a larger share of these represented larger businesses. Five participants used the Excel file to view the questions or gather data from colleagues but then entered the data directly into the online spreadsheet rather than uploading the populated Excel file. Two participants did not trust that the Excel file would upload correctly and instead opted to enter the information manually into the online spreadsheet. One participant may have misunderstood the upload process and thought they had to retype all data from the Excel file into the survey platform manually.
“I used the Excel file to do my work and then once I had everything I would type it in the online spreadsheet. For me visualizing is better. I have two screens. On one screen I did all my work on the spreadsheet, and then I came over and did it on the online spreadsheet.”
“I remember proceeding without the Excel sheet, seeing the number of columns and then going back, and thinking let’s do this in the spreadsheet. And I do think that ended up being the easier route.”
“It talked about uploading the spreadsheet, but I didn’t feel comfortable, I didn’t know how to enter the data in the spreadsheet, it didn’t really say. So I just did it manually because it wasn’t all clear enough for me to do in the spreadsheet. I used the spreadsheet; I downloaded it to prepare the data and then I manually entered it. I did not see how the format should be for me, for the replies, the information, it wasn’t…usually when I am filling out somebody’s spreadsheet for example our auditors, you can only enter it in a certain way, otherwise it doesn’t even take it. This was completely free format, how do I know I am doing this correctly in the way they wanted. I just didn’t feel comfortable.”
Another participant started off using the online spreadsheet but switched to the Excel file after seeing the amount of data that needed to be reported. However, others reported that they were not aware they had the option of using both methods and switching between them. As a result, these participants stuck with the method they chose even when they were dissatisfied with it. One respondent, for example, shared, “I got the impression if I downloaded [the Excel sheet], I couldn’t go online.” Two participants shared that after uploading the Excel sheet, if they had additional data to update, they updated it directly in the online portal rather than in the Excel sheet with a new upload.
Only one participant reported that their financial analyst also logged in and used the online spreadsheet; all other participants were the only ones accessing the online portal. One participant saw communications indicating that collaboration was possible in the portal but was unable to collaborate during data entry. When not using the Excel spreadsheet to collaborate with colleagues, participants sent screenshots or copied and pasted question text into emails and then entered the data online themselves.
“I read something, they were talking about a collaboration tool within these surveys where you could invite someone, put in their email and they would get a notification they were invited to collaborate and then could go in and fill out the information they knew at the same time as other people and then someone could do a final review. I haven’t seen something like that before, is that true?”
The smaller businesses that used the online data entry did not have much negative feedback about this method and were overall happy with the usability of the online worksheets. Participants liked the grayed-out cells that clearly communicated which questions they needed to answer, with one participant sharing, “It was perfect. It tells you in the instructions, whatever is grayed out you don’t have to put anything. So, it was really good. Me being the first time doing it, I was able to comprehend it.”
Participants also liked that the online option offered a spreadsheet for entering all the data at once instead of one question at a time on the page. The online spreadsheet helped make the process faster and easier, with one participant sharing, “I think [the online spreadsheet is] what made it so much easier. Last year it was quite a lengthy process of going through each of the locations’ data and entering that into the system.” One participant who used the online spreadsheet appreciated that it offered functions similar to Excel, enabling easy data entry: “I did like how the online spreadsheet was like Excel. You could drag down and highlight and make changes in it as if it were a spreadsheet. It wasn’t just like one cell specific.”
However, four participants found the questions in the online spreadsheet harder to view and navigate. Because of the wide format, they had trouble seeing all the question text and instructions, which made it easier to overlook items that needed to be reported if they did not pay careful attention.
“The one thing where the downloadable spreadsheet may have been better is the viewable amount of data. A lot of times when you’re viewing on your monitor [on the online spreadsheet], you have to keep scrolling to see the entire thing – the headings. I remember the heading being so cumbersome. There was just a lot of writing in the heading. What they should have done was put a heading and then a footnote with more information. Whoever did it in the survey wanted to put so many characters in the heading that it became obnoxiously cumbersome.”
“I think the only thing I find when entering data in other surveys and this one too is that it extends much further than I would expect sometimes. I just needed to make sure I looked through the spreadsheet and got all the data put in there.”
“The presentation would be easier if it was a vertical arrangement instead of horizontal.”
Finally, one participant experienced issues with editing data in the online spreadsheet. This participant shared, “I got the last one [Step 3 and] it was like an Excel sheet. It looked strange. It wasn’t the normal check this box, enter the number in this box…I couldn’t change the data and that part was frustrating.” This issue was reported more frequently by those who reported their data with the Excel file, as described in the next section.
The businesses that used the Excel sheet had mixed impressions. Some positive feedback about the Excel sheet included that it was well-designed and organized, that it was clear which questions needed to be answered, and that it was easier to respond to. Participants found the location data easier to review and compare against their records in the Excel spreadsheet, especially for businesses with more locations.
“I think the new system where I was able to download the template that I need to use to import the data and each step would gray out those sections, so I just had the sections I needed to fill in for each step. That was much more helpful this year for me. I think that helps the process go a little faster than having to read through every question and figure out ‘Do I need to answer this?’ for each of our programs and locations… I think the way the questions were organized and put on the template seemed very clear for what I needed to click on and what I needed to fill out to move onto the next question.”
“The downloading of the spreadsheet and uploading seemed to work a bit better – especially with the location data. Having it on a spreadsheet, I could enter it all together and look at my source documents and enter it all at once instead of going back and forth on the website. That seemed better.”
“I’ve done some in the past where they didn’t have the Excel template or it was really clunky. For this, I liked having that downloadable spreadsheet. There were no issues. It was not difficult.”
Although most participants appreciated the option to work in an Excel spreadsheet, especially those representing very large businesses, many believed the usability and functionality of the spreadsheet could be improved. Participants encountered data that could not be edited, problems with the format of the spreadsheet (e.g., visual layout, gray cells, cell values stored as text, protected cells), and an inability to sort. Three participants noted difficulties and frustrations with the process of adding new locations. They felt the process was not clear and straightforward, and once added, the new locations did not always carry through to the later steps of data entry. Although adding new locations can affect users of both the Excel spreadsheet and the online spreadsheet, only those using the Excel spreadsheet for data entry raised this problem.
“At the beginning, I probably had to read the instructions twice and watch the video to see which way I had to go to get the information in. I was able to get there though. The operation worked but it took 10–15 minutes more than I would have liked. It would have been easier with just a button to add a location. I would have preferred that. I wouldn’t have had to watch the video in order to do it...I’d like to do it on one screen. You had to go to one step to update locations, upload that file, and go to another step somewhere else to input information. There were multiple steps to get that done. It would have been more handy if it was just one step. That would have been a lot easier.”
“The spreadsheet lists locations used in the past. To delete locations or add locations was difficult. When they start a new survey, just tell respondents the number of locations and types of locations they are looking for. This way you can start fresh each time without having to delete and then add locations.”
“It was a little bit difficult and the main thing that I think definitely needs to be fixed is the fact that when you do add new locations, those locations, even though you added the descriptions of what NAICS codes they would have, it doesn’t update or group them into it. So, like, when it’s asking you for expenses or whatever per NAICS code it doesn’t include any of those new locations. So all that data’s not included… I put a note at the end of the survey that we could not report everything because nothing new was included.”
Once in the Excel spreadsheet to add their data, four participants encountered data they were unable to edit, particularly incorrect NAICS codes. One of these participants struggled with incorrect NAICS codes for a newly added location.
“The biggest difference [with the new AIES] is the spreadsheet that you are asked to download. This was challenging because many of the fields in the spreadsheet couldn’t be edited.”
“I couldn’t report all the expenses by the NAICS codes… The reason why was most of our brands sell men’s and women’s and even children’s clothing—there’s only a couple that are like just women’s or something like that—I had updated all the descriptions on even the ones that are in the system from ‘men’s only’ to ‘family’ but there were some codes that weren’t already in the survey. It wouldn’t let me add sales and all that to a new code. Because it didn’t update that on the spreadsheet I couldn’t report it. I wanted to use a new code. But I couldn’t report by that new code because it wouldn’t update it to that new code. It was all grayed out. You can’t type in there when you update something. It only lets you type in the fields for the old codes that are already in there for the NAICS.”
“There are some pages on the worksheets that are grayed out, and you can’t adjust them. So, the problem was it was set up for our one newest site and none of the other entities. I couldn’t fix this one that was prefilled in for me. I needed to change some things and they were grayed out. I just left them, I figured if you guys wanted to know it you would call me back.”
“I did the online spreadsheet. But some of the data I couldn’t change. There were 2 rows for our company and there should have only been one and I couldn’t delete one. I don’t even know if it went through correctly. Like we had 2 NAICS codes. We had one assigned to us but it was incorrect so I fixed it but then created another row for company information. It wouldn’t let me edit any of the data that was prefilled.”
Several participants struggled with the visual format of the Excel spreadsheet, both the expansive instructions and question text and the number of columns. The instructions in the first rows of the spreadsheet could contain so much text that it was difficult to view on screen while entering data. Others had trouble navigating across the columns when they needed to refer back to data in the first column, such as the establishment identifier.
“There was a question with a big spreadsheet…whoever designed that, obviously knew what answers they wanted in each box but it did not make sense. There wasn’t a clear header. It was way too big for the screen. You had to scroll back and forth to figure out what was the column header. It was so confusing. Even a 100% professional accountant I don’t think could answer that question. I had not experienced that.”
“…because of all the headers it was hard to follow where I am entering the data and to match it up with the platform, it was visually different, so it was a little more complicated.”
“I don’t love all the column headers. They’re too big and it’s not easy to freeze panes and look at the whole spreadsheet at once.”
“I had a hard time formatting to see all of the comments that were there of what should be included and what should be excluded. It was hard to get that formatted to see all the information that was in a certain cell.”
“And I know you had great explanation on the top 3 columns/rows that talked about what was there, those 3 had that much information in them. If you wanted to see all of it, it was very hard to scroll then. I had to shrink things to get on. I don’t know if some of that information can be on a secondary PDF, they could have 2 pages open. Column A is this…so it can be a little more condensed.”
“I found there were times when you’re 30 columns in and you have to refer to the location number in column 2, just kind of physically navigating that got a little difficult. The actual download and submission portion was easy. It just was when it came to actually filling out and working in the downloaded form that we found some trouble with just navigating it.”
Another issue for participants with the Excel spreadsheet was the way cells were formatted, including values stored as text and protected cells. One participant from a large business disliked that the store numbers were formatted in the Excel sheet as text instead of numbers, which interfered with creating formulas to look up information from existing data sources. This participant also disliked the subtotals that appeared in the middle of the sheet because they prevented copying and pasting or dragging a formula down to populate all the rows. They would rather upload the Excel sheet and then see the subtotal rows online when reviewing the entered data. Another participant reported similar issues with protected cells that prevented copying and pasting full columns.
“The only other difficulty I had with using this sheet was with the protected cells. Often we would have to pull large data in one sheet and then move that data into the actual form and with so many protected cells we couldn’t copy and paste full columns so we had to go segment-by-segment or cell-by-cell.”
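The text-versus-number issue the first participant describes can be shown with a minimal, hypothetical sketch in Python (the store numbers and payroll figures below are invented for illustration; the point is only that a key stored as text will not match the same key stored as a number):

# Store numbers arrive as text in the workbook ("0042"), but the
# respondent's own records key them as numbers (42), so a naive
# lookup silently finds nothing.
census_rows = {"0042": None, "0107": None}   # keys stored as text
payroll = {42: 1_250_000, 107: 980_000}      # keys stored as numbers

for store in census_rows:
    census_rows[store] = payroll.get(store)  # always None: "0042" != 42

# Normalizing the type first makes the lookup succeed.
for store in census_rows:
    census_rows[store] = payroll.get(int(store))

print(census_rows)  # {'0042': 1250000, '0107': 980000}

Spreadsheet lookup formulas fail the same way when one side of the match is text and the other is numeric, which appears to be the behavior this participant encountered.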
Several participants also found the grayed-out cells in the Excel spreadsheet confusing. One participant was unsure why there were white cells in columns that were otherwise gray and decided not to provide any data in these instances. This participant shared their screen with the interviewer and pointed out the areas of the spreadsheet that were confusing, suggesting that the two-color scheme did not clearly signal which parts of the Excel spreadsheet required responses.
“I was confused about what these columns are, they are grayed out but then this is white like it wants there to be a number. It would be cool to not see columns you are not going to fill out. Maybe if there was a way to say we are not expecting you to put something in this column might be helpful. I thought they were going to be calling me back ‘hey you didn’t put numbers here.’”
“There were columns out there that were totally grayed out because they didn’t apply. If it didn’t pertain to us, that could be not there, instead of just grayed out.”
On the other hand, another participant expressed appreciation for the gray cells, suggesting that such visual cues clearly communicated what information they did not have to fill out. This participant shared, “I really appreciate the grayed-out sections, especially because we’re non-profit – we don’t sell goods – it was super helpful to know I didn’t have to worry about reading through all of those.”
Two participants also had issues with sorting the Excel spreadsheet. Both represented very large businesses that struggled to align their records with the AIES data. They wanted to be able to sort their locations by different variables so they could align the data and provide responses. Both would rather provide the data for their locations in any order instead of the order prescribed by the Census Bureau.
“We get the report and it can be easily sorted by location. We have different companies all housed under one so we have several different EIN numbers. Once we get the report from payroll, we sort it by EIN, then Zip, then State, then address. We can group it all together and then we can enter it into the spreadsheet we downloaded from Census Bureau. The Census Bureau will not sort the data back into our format once we submit it. We have to contact one of your representatives who has to manipulate the data back into the way it is supposed to be. This took weeks to get done. You have a reference number that you assign to every location and that means nothing to our payroll department. We are not sorting by that number, we sort by the EIN, State and Address and try to correlate the best we can… In order to streamline it, we need Census Bureau to realize the data from your portal has to be manipulated to match what is in our payroll system. To get it back uploaded we need you to accept it the way we have it and not require it to be put back into the sort when it came out of your portal.”
“I think that what I would like to see happen in the future is that we would just have a spreadsheet where we could put all of our information in, that we’re able to sort, and send back… All of my problems stem not from getting all of the information from my company but trying to answer it in the way you all want to see it. In the past, it’s been a large spreadsheet where we could just enter information in all at the same time and I have over 2,000 lines so it’s a lot of data… we could download the spreadsheet but then you’re not supposed to be able to sort it. The way I eventually got my information in at all was calling multiple times and trying to talk to more of a technical person more than my rep. This person did try to troubleshoot with me for about an hour and a half. One of the solutions that we came up with is making my own working file and doing whatever I want in my working file… Then I did V-lookups on my data to get it in the way you wanted to see it. It was a lot of hoops to jump through.”
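The “working file” workaround the second participant describes, looking values up by a shared key rather than re-sorting the protected workbook, can be sketched roughly as follows in Python (the identifiers and payroll values are hypothetical):

# The Census workbook prescribes a fixed row order that cannot be sorted.
census_order = ["EIN-003|PR-0730", "EIN-001|NC-0101", "EIN-002|TX-0207"]

# The respondent keeps a separate working file sorted however payroll
# produces it, keyed on the same identifier.
working_file = {
    "EIN-001|NC-0101": 410_000,
    "EIN-002|TX-0207": 525_000,
    "EIN-003|PR-0730": 230_000,
}

# Equivalent of the participant's VLOOKUP: emit values in the prescribed
# order by joining on the key, leaving both source files untouched.
for key in census_order:
    print(key, working_file[key])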
Uploading the completed Excel spreadsheet posed challenges for some participants. Three participants mentioned issues with uploading the Excel worksheet back to the survey website, struggling with the portal's messaging and instructions about the upload. One participant was confused by a button labeled simply “upload or download,” which did not indicate that the next step was to download the Excel sheet; this participant spent about a week communicating with the Census Bureau to understand how to download the file. Another participant was confused when trying to upload because there was no clear indication on the website that the upload had been accepted; this participant reached out to the helpline and was able to confirm that the Excel worksheet had been uploaded correctly.
“I couldn’t get the dumb thing to say it was uploaded. Every time I hit submit it would say please fill this in and the only way to get in was to go back and every time I went back it would show it wasn’t uploaded. We went all the way out and back in and then it was uploaded. I had to call the [helpline] guy to work through that because it was frustrating. Logically, I was expecting it to go next.”
“I did find it a little bit complicated in the beginning. You have to download a file and then try to upload it. I did have a bit of trouble. I kept doing the file they wanted and it kept referring back to a different .jpg or something else. But, once I got the hang of it, I was able to attach it.”
Several participants commented on the request to report figures in thousands and indicated that this sometimes caused issues. Three of these participants used the Excel sheet, while one used both the online spreadsheet and the Excel spreadsheet. One participant mentioned having to type the amounts into the Excel sheet by hand because the request was in thousands, manually rounding the final numbers. Another found it unclear that numbers needed to be reported in thousands and had to upload the Excel sheet again after triggering error checks. Others commented on the overall process of reporting in thousands instead of exact numbers, which can lead to data entry errors.
“In the beginning it was a bit complicated because you need to be very careful with the numbers you put in. I think it’s in the thousands. You’re used to just typing in the numbers, so it took me a little bit to get used to typing in the thousands, but after that it was easy.”
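For context, reporting “in thousands” means dividing the exact figure by 1,000 and rounding before entry; a one-line hypothetical sketch in Python:

exact_dollars = 2_456_789
reported = round(exact_dollars / 1_000)  # enter 2457, not 2456789
print(reported)

Typing the exact figure instead of the rounded thousands overstates the value a thousandfold, which is the kind of data entry error participants described.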
One participant also did not like that the survey seemed to time out more quickly than previous Census Bureau surveys. This feedback may be related to the issues with saving data early in data collection; however, lengthening the time-out period should be considered if possible.
“The system seems to time you out quicker than it used to, so if you are stuck on a page and need to go look for information, 5 mins and it completely logs you out. I felt like the new economic survey from an IT standpoint was not very good because it would take you completely out and would not save a lot of the data.”
Of the 51 participants, 31 remembered performing error checking on their survey before submission. Those who did not recall performing the error check said they thought it was a good idea and would like error checks to confirm they had entered all the data correctly. It is possible that these participants did receive an automatic error check but, because no errors were found, did not remember running it. One participant wondered whether respondents could run the error check manually throughout the survey.
“…when I got to the submission page I just assumed if there was an error it would redirect you. Nothing popped up so I don’t remember the error check portion. It would be nice if you could integrate that into the system so if there was an error, you didn’t have to check on your own, it would just direct you to the error and you could address it.”
“No, I don’t remember [running error checks] …Maybe there was a review your data before submitting option? It didn’t flag anything. I’m sure there were many errors I put in there. There were zeroes I was putting in just because I didn’t know what else to do. It didn’t flag anything to me. I don’t remember seeing the word error on there at all. I do remember at one point there was a question that made me realize that an answer I gave to an earlier question wasn’t right, so I did go back and then correct that answer. That worked fine. You could click back, change your answer, and click forward again.”
Those who did remember running the error checks commented that they appreciated having this option and believed the checks improved the response process by ensuring all the data were entered correctly. For those whose check did not flag any errors, it gave them confidence that the data were reported correctly, that no data were missing, and that they could submit their responses.
“I didn’t have any errors, just clicked it and it said no errors so I moved on, but I appreciated that it was checking.”
“[The error checking] was helpful…sometimes when you’re filling out spreadsheets, you skip over something. So, it was helpful to have that to make sure you come back to all of those areas.”
“Yes, that was very helpful because it gave you the exact locations that needed attention. We did that at the very end before submitting and then you could go right to where the error was and correct it. I also liked that once you did that, you could refresh the check report and see ‘Now we’re down to 3 [errors], now we’re down to 1 [error].”
Those who did have errors flagged during the checking process mostly found it helpful and easy to use. Several participants indicated that the error checking caught questions they had accidentally missed, saving them from submitting the survey with missing data.
“We missed one question that we needed to answer and then it also totaled up something and I can easily see my totals matched up to my report so I could see it added up right. Previously, you didn’t have that.”
“[The error checking] was fine. I just took this to mean [I had] missed something so go back and do it! it was a safeguard that helped me I guess. I feel like they were maybe related to our – I forgot to mention, we have a wholesale warehouse where we have a lot of like products shipped to our central warehouse and then distribute internally to restaurants. There was a question related to that location that I missed and I had to go back.”
“I believe it did identify a few errors like entering commas or something like that.”
Some of the error checks, however, flagged items that were not actual reporting errors. This was somewhat confusing to participants and seemed mostly related to locations that were added or removed. One participant had errors flagged for a location she had attempted to remove from the data; she was confused about the point of the error checking because, after reviewing and confirming that the data were correctly reported, she was able to submit with the errors still flagged. Two other participants appreciated that they could proceed even with errors triggered, because they could not resolve the issues and had reported accurately. However, one participant disliked that she was able to move past Step 1 without a hard stop when three newly added locations failed to import their entity names because the names were not in all capital letters.
“When I did get my data completed [in Step 1], I didn’t understand that there were errors that I really needed to fix. It didn’t really stop me so I didn’t understand that I would never have the chance to do that again. I had three lines in there that because the entity name was not in all caps, it didn’t take it. It’s the weirdest thing ever. I couldn’t fix it… Three of those lines don’t even have an entity name associated with them which is crazy. I wish it would have hard stopped me or something.”
Another participant had issues with errors flagged for deleted and added locations and thought the process to run the error check was not intuitive because they had to save before running the checks; this participant also preferred one round of error checking instead of two. Another participant, who had errors flagged because of closed locations, had left some responses blank and needed to return to those cells and enter zero instead. This participant did not find the zero-reporting requirement clear and simply assumed that zero was what the survey wanted.
“I think a few things were triggered [during the error checking]. Some of these entities if they were closed or management entities so they don’t have the same type of sales or expense data, I would have to go in and manually enter in zeros. So, some I missed putting in those zeros and that’s what those errors were. I did that other survey, so I was familiar with the rejection process… it takes you to that cell, so I assumed [I had to put in 0]. Everything else was 0 so I thought it made sense to put 0.”
Finally, one participant expressed her dissatisfaction with uploading the Excel sheet because of the error checking process. After she input all the data, errors were triggered that she could not resolve without a call with her finance team; overall, she found the process more confusing than using the online platform alone. Others also disliked the error checking process and found the error messages too vague, making it difficult to determine which data points had triggered the errors. Several participants described having to manually review all lines of data to find the cells that triggered the errors.
“I had to get on a call with the finance team because we would get errors and have to clear those up. Using the spreadsheet was actually more confusing than inputting it into the platform itself.”
“Once you tried to submit, then you had to figure out what the errors were. The description on what the errors was, was not great. It was so generic, you literally went column by column line by line row by row to try to figure out where the error was and why it wasn’t uploading. It probably wasn’t that bad for smaller companies…I don’t remember the actual error message that popped up, maybe a reminder there of, ‘There can be no blanks’ [or maybe]… ‘Verify that all non-grayed out cells have an answer. There can be no blanks.’…I had a 0 that they wanted text. That was fun too, to find. Process of elimination. Once I realized all of my columns had 0s then I went across the top and saw one wanted written text so I put [the word] “none”. I think it took me like 2 hours just for the errors.”
For Round 2 of data collection, several questions were added about access to help resources to better understand the challenges and problems participants faced while completing the AIES. Of the 25 participants in Round 2, 12 indicated they had used some of the available help resources, including contacting the customer helpline, watching videos, attending trainings, and reading documentation and instructions. Four participants were unable to find the answers to their questions in the general resources and wanted more detailed instructions about the questions, including help applying the questions to their specific companies. Others found the help materials overwhelming and difficult to navigate because of the sheer volume of information. As one participant shared, “There was a lot of help and a lot of notes to read, and I found it all overwhelming and I didn’t find it helpful. So, then I tried to do it on my own…Because it was just too much to read.”
Others reached out to the customer helpline when they faced technical issues, such as problems uploading the Excel sheet or errors that could not be resolved. Almost all participants who called or emailed the helpline found it helpful; however, one participant felt they did not get enough information to understand the survey item they had questions about. Another participant engaged with multiple help resources but felt they did not apply to her large business.
“After multiple calls to multiple people, some guy I talked to for 1.5 hours, who wasn’t my rep, did give me an idea because I wasn’t able to sort [the Excel sheet]. He said maybe I need to create a working file and that’s what I ended up doing. My rep was nice and as helpful as he could be, but he wasn’t able to help me. I needed help from a technical person. I kept trying different methods to get a technical person. It’s really frustrating…The webinar was good but the problem is, they were looking at a cat company with 3 or 4 locations and I have a company with 2,000 lines. Another issue that I didn’t specifically mention yet, and my account rep was able to help me with this, but I had 300 lines of additions. If I had to sit there and enter all of them in myself, that would have been miserable. But he was able to go in and just give me the blank lines. After he did that, I was able to just do one step to get them in there. If you could just do it in one spreadsheet. If you’re a large company, you want to be able to just do it all at once.”
“Also, I did get in touch with your helpdesk and that was very easy and the gentleman I spoke with was very knowledgeable and able to walk me through the sections I didn’t understand.”
Two participants indicated that they regularly clicked the information icon on questions in the online survey to better understand the question text but did not use any resources outside of the survey.
After Round 1 of interviews, the project team and the Census Bureau agreed that the saturation point for findings about the questions in Module 6 had been met, and this module was subsequently dropped from the protocol in Round 2. The findings reported below are from Round 1 of interviewing and include only 26 participants. See Appendix D-1 for all Module 6 materials.
First impressions. Participants found the question displayed in this screenshot to be very easy to answer, clear, and straightforward. Only one participant noted some confusion because their fiscal year is a calendar year.
“For me, it would be a quick answer. It’s partially confusing asking for a fiscal, calendar, or partial year. Our fiscal year is a calendar year, so the information I would give you includes the whole calendar year.”
Expectations about what will happen next. When asked what they expected would happen on the screen once they answered the question, most participants said they would simply select one of the response options, click “Save and continue,” and move to the next question in the survey.
“I’ll click one of the bubbles and hit ‘Save and continue’ which will move me to the next screen.”
“The radio button will be selected, and you can select ‘Save and continue.’”
“It will go to the next screen and ask me for that data for whatever time period I’m selecting.”
Some participants expected the survey would then ask for the start and end dates of the reporting period they selected; however, these participants felt this would only apply to a partial year or fiscal year, as a calendar year would be self-explanatory.
“I guess it would populate the location data and the company data that you’re going to submit for a calendar year, fiscal year, or partial year. Or, if you hit ‘Partial Year’ it may even ask you to define what the partial year is or what the fiscal year period covers.”
“I assume if I selected ‘Fiscal Year,’ it would ask me probably to input what my year end was. Calendar year…I kind of expect nothing would happen except I would go ‘Save and continue.’ Yeah, nothing to happen if I selected ‘Calendar Year.’”
Screen layout and font. Overall, participants found the layout to be simple, straightforward, and easy to understand.
"Fine. It’s nice and simple. It’s not too cluttered.”
“It’s easy to read, simple. Not cluttered, not too busy.”
One participant noted that the additional italicized text provided more context and information that made the question easier to understand.
“I like it. [The text] underneath ‘What time period is covered by the data provided in this survey?’ is what I liked. It gave it to you in a different scenario. I mean, that was perfect.”
Two participants, when reviewing this question, repeated feedback they had provided earlier in their interviews. One mentioned that a menu indicating where this question fell within the survey would have helped them follow along with their progress, and that they would have preferred a pick list of the survey questions to help navigate between them.
“I think it’s fine. Again, having the menu to follow along maybe, or give a sense of where I am in the whole survey would have helped. But, other than that, the way the question is laid out, the Save & Continue, it’s pretty clear how to move forward.”
Another participant mentioned that they wanted to be able to advance without providing a response to skip questions within the survey (i.e., advance to the next question without answering). This participant shared, “It would be nice if I could hit ‘Next’ even though I didn’t save without putting anything in. That is a complaint. You can’t get through that screen without putting anything in.”
Most participants had no issue with the font size or thickness; five participants, however, felt the font size or thickness should be increased to make the text more readable.
“I mean, it’s fine. The wording ‘Calendar year data is preferred….’ should be better and clearer font. That is clearly in a font that is different than the answers below and it’s not as readable.”
“The [instruction] is a little small. It’s more difficult to read.”
“Maybe the definition of calendar year should be bigger.”
“’Reporting period’ could be bigger while the company name and address above ‘Reporting Period’ could be smaller.”
Participants understood that in this screenshot, the respondent had selected “Fiscal Year” and was then asked to enter the start and end dates of their fiscal year. Participants also understood that changing the selected answer would affect the follow-up question. Overall, participants thought that changing the answer to “Partial Year” would trigger a similar follow-up question for start and end dates (and would clear out the dates entered previously), whereas changing the answer to “Calendar Year” would remove the follow-up question, since the survey would already know the start and end dates of the calendar year.
“If you had selected ‘Calendar Year’ you wouldn’t see the options below to enter in dates, but since they selected ‘Fiscal Year’ those items appeared to provide clarity on what that meant.”
“If you put in calendar year, I would assume the dates would go away and you would just continue.”
“I would assume it probably would clear out the dates I entered and change the question to reflect ‘Calendar’ or ‘Partial’ year.”
Many participants mentioned that the frequently asked questions (FAQ) window followed the standard format they had seen on other websites and web forms. Given this standard format, participants felt it would be easy to navigate through the FAQ links. Participants also noticed the number of hyperlinks included in the FAQs, and some mentioned that they liked that the FAQs were in a pop-up window so they would not lose their place in the survey.
“I may have used this a few times, and I thought they were helpful. I think it’s pretty standard for a FAQ box.”
“It’s nice that it’s on the same page and you are still where you were, so you don’t have to go to a different window and have to get back to your survey, I see what they are doing. It works. It is better than getting taken to another page and losing your spot in the survey.”
When asked whether they thought the FAQs were helpful, participants had a mixed response. Some participants felt the FAQs were not helpful because they included questions that were not specific to survey items or covered very simple topics they assumed most people answering the survey would already know the answer to. Others felt the FAQs were helpful, even if they did not use them when completing the survey.
“Just that there are helpful questions and links that you can click on. I didn’t necessarily use this section when I was doing the survey, but it’s pretty self-explanatory how you could look through some questions and get information.”
“I think they are useful. They make sense.”
Most participants indicated that they did not use the FAQs when completing the survey, and this was primarily because they did not need them. One participant, however, indicated they would not have accessed the FAQs even if they had needed them because it would have been too time consuming to peruse all the information and get an answer to their question.
“I didn’t read all of them…when you are trying to get this done because you have to and you can’t spend time with your children, I’m not going to read all of these. If these are written in the same way that the survey is, I won’t understand them anyway. I googled when I didn’t know a question or phrase instead of looking at the FAQs. I googled it hoping someone would explain it to me in a way I would understand. Again, if I was an hourly worker to get paid to do this, maybe I would read all of this information and be interested but when it’s my precious time, I don’t have time for that.”
Font size. Sixteen participants felt the font size used for the FAQs was too small.
“It’s very small, since they have room to the right they should have it really big If You Have A Question and highlight it or something and should be in bigger font.”
“It’s really small and the popup could be wider.”
First impressions. Generally, participants understood what this question was asking them to report, and they were able to provide the requested information. A couple of participants indicated that they needed to ask colleagues for help in reporting numbers for these questions, and one participant discussed how difficult it was to answer this question without a background in accounting. Others commented that the clear instructions about what to include and exclude were particularly helpful. Some participants noted that responses to these items were to be provided in thousands, which, as reported previously, is something they needed to keep in mind to avoid data reporting errors.
“I would need to fill out the blank boxes with the amounts for the various assets and expenditures that we have. I would click Save & Continue to save that information. Again, it was helpful that I did this last year, because I remembered that I needed to round everything to the nearest thousand and then enter that data.”
“I remember it being a detailed question. Something that I had to get answers for because I did not have them personally. I like that it tells you what to include and what to exclude. Even down where it talks about depreciable assets sold, impairment costs. I like this level of detail without having to go back to some other help document to figure out what exactly they are asking. It’s making me round, which is fine.”
“It’s interesting that they put the zeros in. I think that’s a little weird. Maybe they should just say to report in thousands. I did find that to be a bit weird. Maybe there’s no better way to do that. This screen, I had no idea how to get these answers. I don’t know what our depreciable assets are. We have a balance sheet but I don’t know how to find what these are. This may just be showing my lack of accounting training but if you imagine that most small business owners don’t have accounting training, they are good at what their business is, not accounting. I’m guessing you are getting a lot of people scratching their heads. When you have shops and manufacturing facilities...I just don’t think you know what the difference is between these different assets. That’s what our accountant does. I did my best to guess from the balance sheet. I was able to get capital expenditures except spell that out – what exactly is a capital expenditure. Fixed assets like computers, equipment. Is that green coffee that we then sell? Packaging? I’m guess that means things that you keep – furniture, equipment, computers. Add that in so I know. What is ‘other additions and acquisitions’? What does that mean? It was not clear to me. Maybe to someone with a degree in accounting, it would be 100% clear. It made me feel like I would put in misinformation and ruin your statistics. I am not going to call my accountant and have him walk through this with me. He’s expensive. If it was written from the perspective that some of us are good at what we do with our small business, like coffee, haircutting, or car repair, and not at accounting. Maybe have two different versions? Like huge company professional accountant version and small business version.”
Horizontal lines on screen. Participants viewed the single horizontal lines as separators between questions that help break up the screen and help respondents follow along.
“Making it clearer – keeping it to where things don’t get mixed up and one question at a time is in between the lines.”
“Just trying to break it up in different thought processes to mainstream it. The same way we have different horizontal lines on an IRS form.”
Some participants did not notice the horizontal lines at all or did not have a guess as to what the lines were intended to represent or signify.
“I don’t know. I had not given that any thought. I’m not sure what purpose that would serve.”
Overall, participants tended to be unsure of what the double horizontal line was intended to represent or signify as a visual cue. Their guesses included the following:
Similar purpose as single horizontal lines, just helping to further separate out the questions;
A mistake (i.e., intended to be only one horizontal line);
A larger division between the questions on opposite sides of the double line;
An indication that some kind of total is calculated; and
Separation of categories within the larger question (e.g., assets vs. expenses).
Although these guesses may not suggest that the horizontal lines are a potential distraction for participants completing the AIES, the format and layout of the grid should be adjusted to clarify what these visual cues are intended to communicate. For example, if the AIES uses a single horizontal line as a divider between content on the webpage, then the preferred format would avoid double horizontal lines or blank space between lines. If the blank space is a programming byproduct of hidden content, then replacing it with a generic note explaining the absence of content would improve the clarity of the layout.
Eleven participants were asked to review the interactive content tool, which allows respondents to preview the survey questions (see Appendix D-2 for the communication materials reviewed in Module 7). After interacting with the tool, six of them explained that it provides instructions on how to use the tool to view survey questions, as well as guidance on responding to the AIES. Participants also noted the interactivity of the tool and how it could be helpful for them.
“I liked it because it gave me those first 3 instructions and then when you started selecting your NAICS code and got down to where you needed to be, you ultimately limited down your questions. So, you knew exactly what you needed to address or what you were going to be asked.”
“This page provides a description of the different codes depending on industry, if you will… I assume that depending on whatever code is selected that kind of drives the type of questions that are asked in the survey.”
“It’s [survey] questions. You can choose, or click, I guess whatever you want to read about, I’m assuming. Over here on the left. And then here, it’s for me to read about.”
Regarding their prior knowledge and use of the interactive content tool, five participants reported knowing about the tool prior to the interview, and only three of them had used it in the past. Those three did not provide any further feedback about their experience.
Some participants reported noticing the familiarity of the survey questions as they previewed them in the tool; this helped confirm they were viewing the correct NAICS code. When asked to find the question “What were the total sales, shipments, receipts, or revenue in 2022?” by company location, four participants correctly pointed to the Establishment Questions tab. One participant was not sure but believed this would be found under company questions or industry questions.
All participants were able to navigate the interactive tool to make their selection, and four were confident about their selection, especially after viewing the types of questions associated with the selected NAICS code. One participant used the search bar to find the correct NAICS code for their business; after adjusting search terms and reviewing the results, this participant thought it was obvious which code to select because only one code contained the keyword for their business. Another participant described a similar process of finding the code that best describes the activities of their business.
“For me the one that made the most sense is ‘Professional, scientific and technical services.’ Then within that trying to decipher down which one of these more align with what we do as an organization. I think it was one of these here because we deal with more of the research with different science related things because of the nature of the companies that we work with in the pharmaceutical area.”
Four participants found the process of selecting their industry code(s) to be somewhat difficult. These participants struggled to identify the code that applied to their business and were unsure how their business should be classified. The higher-level codes were easier for participants to select than the lower-level subcategories; one participant, for example, struggled to find the right lower-level NAICS code.
“Once you get to the subcategory there is really nothing, we are the lost forgotten profession. Maybe all other outpatient care and facilities?”
“Well, I’m not so sure now. Usually, I already have the NAICS code in front of me so I know what it is.”
All 11 participants who discussed how to export survey questions for their company were able to identify the Download button as the necessary step for exporting survey questions. However, three of them also said that they were not entirely sure how to export questions or did not know what to expect after clicking the Download button. When reviewing the downloaded questions, one participant mentioned that they had expected to need a separate download for each tab (establishment, industry, and company) and was surprised when the downloaded Excel file contained all the questions, sharing “I assume you need to do it for each tab, it’s not one all inclusive…oh it is all in here.” Another participant was confused about the differences between establishment, industry, and company and was unsure why some questions were listed in each section.
“I don’t understand like total number of employees, that is in establishment and company questions why? I don’t understand the difference between establishment and industry and company, is this all of them combined? Some questions are in here twice and some aren’t.”
Three participants indicated that the question export functionality could be helpful for them in completing the AIES. The other participants did not appear to see the need for or utility of the question export and might not use the functionality beyond keeping the file as a reference. One participant (see footnote 6) found the format of the Excel spreadsheet difficult to review and manually adjusted the columns to see more text at once. A few others also commented on the low readability of information organized in the spreadsheet format.
“It was easy [to download] but it’s ugly to read…I don’t even know. I don’t know where I would enter data. This is just like reading something. I wouldn’t use it… This is way too much information.”
“That’s too much going on there! Yeah… it needs a lot of formatting. It’s too much!... I would prefer the PDF over this kind of format.”
“It is a mess of formatting. It’s formatting as a CSV which is why it looks so messy.”
One reason participants did not believe the downloadable Excel file would be useful is that it did not match the format of the survey questions, which made it difficult for them to recognize that the file contained all the survey questions they had answered. Furthermore, one participant mentioned they were not sure where they would record the data retrieved from their colleagues, which is an essential function respondents need from a question preview tool. This participant shared, “I don’t know where I would enter data. This is just like reading something. I wouldn’t use it.”
“Yes, it would have been helpful, but as you can see it would take me a long time just to review this line by line. You are adding a whole other layer of digging. It would have taken even more time. Using the provided questionnaire and spreadsheet, most were self-explanatory, I probably wouldn’t use this.”
“Good Lord… I would probably just exit out. It’s hard to read any of that on there. I’m not sure what it all means. I wouldn’t give people the choice to open that up.”
“I don’t understand like total number of employees, that is in establishment and company questions why? I don’t understand the difference between establishment and industry and company, is this all of them combined? Some questions are in here twice and some aren’t. I don’t think this helps me at all.”
Thirteen participants were asked to review the AIES landing page on the Census Bureau website during the interviews (see also Appendix D-2 for the communication materials reviewed in Module 8). Three participants indicated that they remembered visiting the page before, or something that included similar information, but were not very confident about whether it was the same webpage.
The entry point “READY TO REPORT?” button was identified by five participants as the most important information on the AIES landing page. Other participants also identified the FAQs and the survey questions as important information on this page. At the same time, two participants reported that they expected to see the reporting deadline but could not find that information on the landing page. One participant expected to see the estimated completion time, while another expected to see downloadable spreadsheets; neither was immediately available on the landing page. Three participants reported that they would only visit the page when they needed to find out more information and would specifically access the FAQs as needed. Others indicated that they would visit the AIES website upon receipt of the email invitation, when they were ready to complete the survey, or when the survey was due. One participant indicated that reviewing the Annual Surveys report, while not totally applicable to their business, might be an interesting way to learn more.
“I think I visited it when I started to get the emails telling me that the survey is coming.”
“I guess for me it would be accessing the survey, but also maybe the FAQs, although I know you can access that in the survey portal.”
“For me, it [the most important information on this page] would be the survey questions. So [that] you know what you’re answering. It’s always a great thing…. I’m not sure I would [visit] unless I needed to go back and find more information.”
“I guess I’m not seeing immediately when it’s due. Or when you need to respond by.”
“I don’t know, maybe prior Census front pages I’ve seen had estimated time to complete. Always nice to have some clues of what you are getting yourself into. Though I am sure that is a hard number to get to.”
“Once I looked at that I would open the questions and keep them over here [different window] and come back and look at it if I need to. Since I have 9 surveys, I am going to try and go through and update as many as I can, then go back and forth and update the things.”
“Depends on my motivation – if my motivation is to complete it, then the green button. If it’s ‘I don’t know what to do, it’s FAQ.’”
“In regards to the survey taker, probably the guidance and the instructions. but for the Census, it is probably the actual report button.”
“It really gives you links to everything you would need to know. I don’t think there’s anything to add because if you add to it, it’s just becoming overly wordy.”
Participants generally found the “Information for Respondents” page to be helpful and summarized its content as providing additional information about the survey, including the purpose of the AIES. Three participants called out the detail at the top of the page about combining the surveys into one as a positive, with one participant previously unaware that the new AIES combined seven surveys into one. One participant felt the page contained too much information and was typical of government websites, explaining that it looked cramped, with small font and lots of information. None of the participants recalled visiting this information page when asked to complete the AIES.
“I think this is super helpful, super practical. I appreciate that they’re trying to condense more surveys into one. I don’t know if our organization fills all these out, but that’s great. I think this is very clear. I think, practically, what do I need to do? What are the steps? So this is helpful for how to get started in reporting. I do appreciate that there are instructions down here in the ‘How-Tos’ and, of course, it make sense to want to prove that you’re legitimate. But definitely this section about how to get started is most helpful.”
“This is an intro to the same survey, just different. I like the spot at the top where it is replacing all these other things that I did. That’s nice. That starts me off with a positive vibe. It also says the portal, I need my authentication code, then it gently leads me to some instructions if I want some. High level summary, that’s nice. This one is more likely to get me to scroll through it. Cause it doesn’t have a green button at the top that says start.”
“This looks like more information about the AIES, I am trying to see how it is different from the other tab because that tab seemed to encompass all of the information. It’s explaining which surveys were combined into this survey. And it looks like there are a lot of links on this page directly to information. Whereas the other page, if you need this click here, if you need this click here. This one has the actual PDFs and manuals that you can click right on from this page.”
Participants indicated that the description of the AIES and the supporting documentation stood out as the most important content on the page; these sections provided helpful information that they were not aware of prior to the interview. One participant thought the link to the Question Preview Tool was most important so that respondents can prepare in advance. Two participants further noted that they expected to see the reporting deadline mentioned on this page but could not find it.
“For me, those sections [How to start report, What you need to report] are most important. I recognize already that if we’re asked to do this by the Census, I take it as a no-brainer that this is important. For us and our purposes, we just want to get this done, so getting started as quickly as possible and very practical helpful steps is what I needed.”
“I could see some of this stuff potentially time consuming to sit down and watch a video. I could see all [the how to PDFs and videos] being really helpful! … Maybe this is going to save some time watching the video versus figuring out the spreadsheet myself, possibly. But I’d like to know how long each of these videos is though.”
“All the different surveys, where you need to go, the reports and the portal. This is the area I would spend a little more time on to understand what I need to get done.”
“The ready to report section, how to sign in, how to start, where to go for help, how-to videos and FAQs.”
As for the timing of visiting the information page, three participants said that they would only visit the page when instructed to do so by a Census Bureau notification or when they needed further assistance to start responding to the AIES.
Participants summarized the content of the FAQ page as a useful resource that can help answer common questions for people responding to the AIES. One participant pointed out the difference between the “General” and “Respondent” categories and found the organization very helpful; however, two other participants did not see a clear difference between these categories. When asked for further feedback, participants did not identify anything that stood out from their review but overall thought the information was helpful and could be informative for respondents seeking help while completing the AIES. One participant wanted more information on the FAQ page about how to understand NAICS code classifications.
“I can see this is very helpful, very practical. Answering questions I might run into. Especially how it’s separated into General and Respondent, that’s useful. I would definitely go through FAQs if I was ever having issues logging into a website or if I was running into software issues, yeah, like why can’t I upload the Excel spreadsheet. This is definitely the kind of thing I would look for if I was struggling…The general questions feel like bigger picture questions versus the respondent questions are very practical issues I might run into while doing the actual survey. So, that seems very helpful to me.”
“These NAICS codes is something that’s always been a little frustrating to us as a non-profit organization. Where do we fit? I think there’s been an update or there’s going to be an update this year that will move people who work in community programs…They’re always trying to fit us into these big, broad categories.”
“It looks nice. I like it. It might not have all the questions I would ask on here but it’s a start.”
“My hope is that the sorting is dynamic [driven by which questions are actually asked most frequently by users] and is not driven by the perspective of the web designer – what is the most important question? Questions are an “essay.” It’s not a surprise, but it’s a surprise that this interview comes up after this is (hopefully) the first iteration. Long way to go.”
Five of the nine participants reviewing the FAQ page reported that they had not seen it before. Two indicated that they would only visit the FAQ page “on an as needed basis,” when they ran into questions as they answered the survey. Two participants thought they might access the FAQs when they initially received notice of the survey. One participant noted his preference for question-specific information to be available in the portal alongside the survey questions so that he could be more efficient and not risk losing any data.
“I think I have always been able to find the answers to my questions by what’s located within the survey. Meaning like, either the descriptions that are included, or if there’s ever a question mark to click on to need more information. I would find it that way… I would prefer there to be a question mark on the page and that would give me a little more detail about what the question is asking…the FAQs to me is for the whole survey. I think if I had a question on a specific question, then maybe there’s a question mark for me to click on there for me to learn more. I want to keep moving. I want to move through this as quickly as possible and not have to refer back to a central location and not lose my thought or not lose all my information. I would hate for that to happen.”
Overall, the participants felt the organization of the page made sense and was clear.
There was no clear preference regarding the helpfulness of the three AIES web pages: each of the three pages was reported as the most helpful by at least three participants (out of 10 who reported a preference).
“Probably the [AIES landing page] because it had the green bar to report here. Some people won’t watch the videos or read instructions. They just wanted to get started. I liked having the instructions and videos.”
“I think the [Information for Respondents page] probably looks the most useful. It gives the review of all the different surveys and also gives you a reminder of where the portal is.”
“Probably the FAQ because depending which question you’re looking at, it would include a lot of information that was on the more generic “About” pages.”
“Probably [the FAQs would be most helpful]. The other 2 were just getting you started. Doing a government survey, we are just going to jump in and get going, and then we will run into problems and want to figure out what the issue is.”
“I think information for respondents is the best one…. Because it’s got an instruction manual, it’s got an interactive preview tool. I like all of those.”
“The first one because you can get to the FAQS from there and the actual survey from there and information from there. That general AIES program surveys site had everything on one landing page.”
First impressions. Participants understood that the postcard provided a general introduction to the AIES and, importantly, noted that the AIES combines several different surveys into one (see also Appendix D-3 for the communication materials reviewed in Module 9). One participant even seemed to “want more of the effort” as she moved the cursor around the wheel listing various Census surveys and noted, “It would be nice if it actually said that this survey will replace all of these.”
“It’s about introducing us to the new way that the AIES is being handled and stating that it’s one program that will replace others so it’s in all-in-one. Streamlining it and thanking you for taking it.”
“It just looks like the AIES is compartmentalized into seven different components.”
“It’s basically the introduction of a new survey that, from my understanding, is replacing maybe a handful of other surveys that used to take the place of or had to have been filled out individually.”
One participant appreciated that the postcard provided information about the survey and felt the graphic was colorful, sharing “it’s informative, it is telling you when you will hear from them, the QR code is nice to find more information, the graphic is colorful to look at.”
Most participants did not recall receiving the postcard. Only one participant remembered receiving it and found it useful. Two others felt the information looked familiar but did not recall seeing it on the postcard itself. Another participant further suggested that the advance notice might not reach his desk, or that he might not keep it since the actual survey would follow later.
“It looks like it’s something that’s there reminding you that you’re going to be getting an invitation coming up soon. Typically, I’d throw it away until I actually get the real thing. It might not even make it to my desk. They might toss that before it even gets to me.”
“No, I remember getting an email saying you’ll be getting this in a short time. I didn’t receive a postcard, but I did receive the email saying I would be receiving this postcard in the near future.”
“I want to say, I don’t know that I received it or saw it on a business survey. It does look familiar, but I can’t tell you when and where.”
QR code. All participants noticed the QR code on the postcard; however, none of the participants felt that they would have scanned the QR code. Perhaps unsurprisingly, the participants would have used their computer to access the AIES website because they would have completed the survey using their computer rather than a mobile device.
“I probably wouldn’t have because I would do this survey on my work computer and not a mobile device.”
“I probably would have just used my computer to visit the site that it shows on there.”
“I probably wouldn’t just because it’s saying that something’s coming and there is nothing that we must do now. Just to get more information we would use that QR code. So, I don’t think I would use it for that.”
“We do most of our stuff on our work laptop… [the likelihood is] slim to none.”
Participants also indicated that in general, they are unlikely to interact with QR codes, particularly when it comes to a work-related task because they would prefer to complete that task on a work computer or laptop. One participant noted that they found QR codes, in general, to be a “nuisance” and that they avoided using them.
“That’s a good question, I guess it just depends on what it is, with stuff for work I would rather it be on the computer. I guess there is a chance I would use the QR code and then pull up the website on the computer.”
“Probably not. I just think they’re a nuisance…[I tend to ignore them]. It might be my age.”
“It depends on the QR code and what it’s for. Typically, at my age I ignore them. I’m almost 50. I don’t live and die by them like my 20-year-old does.”
“It’s going to be on my mobile phone and this is a business-related thing and I’m going to need my laptop. There are a lot of things in my head to make me not want to deal with it on my mobile phone.”
Participants expected the QR code to lead to one of two destinations: several felt it should direct people to a site with more information about the survey, whereas others expected it to link directly to the survey. As for the level of detail, some participants felt such a site should have more general information about the survey, whereas others would prefer more detailed information. Regardless of their content expectations, some participants questioned whether a QR code made sense at all because people would be very unlikely to complete the survey on their phone.
“If I used the QR code, I would expect that it would pull up the detailed information about the survey and all of the details in regard to that and what information I need to complete the survey.”
“Homepage on the survey in general with general information about what it is and how you will be receiving information, stuff like that.”
“More information about the survey and what’s on the card. Maybe instructions. But I wouldn’t think instructions right away.”
“Based on the content of the rest of the sheet I would expect it to just be further detail highlighting the purpose of the survey.”
“I would think it would go to the link that was part of the email that was received to go to the survey.”
“Log-in for the survey or the results of past surveys.”
“Census website that explains the survey. Usually, I’m at my computer so I wouldn’t pick my phone up, I would just go to the Census website.”
“Further explanation, possibly a full link to the survey but that wouldn’t make sense seeing as I would be on a mobile device and who would want to do that on a mobile device?”
First Impressions. Most of the participants remembered receiving this email and felt its primary purpose was to provide a “heads up” that the survey was coming soon. Four participants confirmed that this email was sent directly to them, and one participant could not remember whether it was sent to them or their boss. Two participants noted that although this was a fairly standard advance notice email, similar to the postcard, it also described how things would be different for the AIES.
“It’s about the survey, giving me a heads up that it’s coming up. I can start preparing information if I want to and letting me know that I will need to start responding to it.”
“It seemed very formal. It’s from the U.S. Census Bureau so I felt like I needed to be involved in the Annual Economic Surveys.”
“It just told me to be expecting the survey.”
“That it wasn’t just here is your link, here is your passkey to the survey. It was informing me that one was coming and that it was going to be slightly different than the past.”
“It’s kind of similar to the [postcard], right? An introduction of a new survey, its purpose exists to kind of consolidate other annual surveys we had previously been participating in. And then more of a why, or what does it do…My initial impulse with this one is, by virtue of it looking pretty familiar to me, is an indication that there will be a survey and we’ll be participating in it.”
Another participant indicated that one of the first things they noticed about the email was the Census Bureau seal at the top of the email, which increased the survey’s legitimacy and helped the participant feel that the survey was not a scam. This participant shared, “the notice at the top about making sure it’s official is good to remember and nice to know that this isn’t fraud.”
Action taken upon receiving the email. Most participants indicated that after receiving the email, they put a reminder on their calendar to complete the survey on a future date. One participant, however, stated that after receiving the email they visited the website, downloaded the question preview spreadsheet, and started working on finding answers to the survey questions. A few participants did not do anything and waited for the survey to arrive. Another participant notified their supervisor and then registered in the system because they had also received the letter with the authorization code.
“I kept it in my inbox to remind me that it would need to be done in a few weeks. I waited until it popped up for sure for me to start working on it.”
“I did visit the website, and I think that is when I downloaded the spreadsheet of the questions and started working on those answers.”
“First I brought it to my boss’s attention, then I registered because I have a letter with the authorization code and due date and all that.”
Usefulness of the email. Most participants appreciated receiving this advance notice email because it made them aware that the AIES was coming, they could put it on their calendars, and they could begin gathering the information they would need to respond. A few participants did not react to the notice and waited for the survey itself before acting.
“I appreciated the heads up that it was coming so I could start getting information and then they tell you the reason they are doing it which is good to know as well.”
“Yeah, it was my first year, so I didn’t really know when to expect to begin.”
Suggested changes to the email. A few participants suggested changes to the email. One suggested including an assurance that changes had been made to the AIES in an effort to make responding to Census Bureau surveys easier. One participant suggested including the detailed questions or a preview of the downloadable Excel file. Another suggested that a summary of all the key points at the top would be helpful. Finally, one participant noted that he ignored the email because it did not mention a survey deadline.
“The detailed questions or a preview of the Excel form to know. If I had a preview of the Excel form, I could have been thinking about what I needed to do. Having a general question about Cap Ex wasn’t that helpful.”
“It doesn’t have anything on it that says ‘hey, you have to do it by a certain date or time’ so I ignored it.”
All participants agreed that this email was a standard reminder to complete the AIES by the approaching due date. Participants noted that the instructions included in the email were easy to follow and that the email, helpfully, included the due date in multiple places as well as the authorization code.
“It’s just asking that you have this [survey] completed by a specific due date. That’s all I’m getting out of it. I would try to complete it by the due date, but it just depends on how I prioritize all my other deliverables and if this makes it by the due date or not. I know last year somebody may have asked for an extension.”
“To remind you that you need to complete the AIES, and they are requesting your response. I think it also says something about if you have done so, thank you but if you haven’t, you need to go in and do it.”
“It’s nice how at the top they have all the points: register your code, how to report, when the due date is. I think it was good.”
“More specifics now about again, this is the new survey replacing existing surveys. Specifically, this is how you register, sign in to access the survey with the associated due date and the authentication codes to be able to report the data.”
Eight participants remembered receiving this email, and one participant did not. Among those who remembered receiving it, two indicated they received the email copy, four received the paper copy, and two received both. Additionally, one of the participants who remembered the email noted that they received it after they had already completed the survey. Lastly, among those who remembered receiving the email, all noted that it was sent directly to them.
First impressions. Participants felt this letter was straightforward and that it communicated general information about the survey, how to access the survey, and that participation in the AIES survey is required by law.
“Letting you know you are required to do the survey by law. How much time it will take, giving you information, you need to log in, any credentials you may need.”
“It’s trying to explain the process and provide links and the code to login.”
“Yep, so this I consider basically our notification of the survey and that our participation was required.”
Some participants indicated that some of the features of the letter, primarily the Census Bureau seal and the name and signature of Robert Santos, helped them feel more confident in the legitimacy of the survey.
“It seems legitimate. There’s a lot of ways to create counterfeit these days, but it seems legitimate…I looked up who this guy [Robert Santos] was.”
“I could google whoever that guy is and see if it’s legitimate or not.”
“Let me know that this is a legitimate survey, that I am talking to someone real. It tells me about the survey, it will be confidential. Just to make sure that I know this is all good. It is just to help you improve the process.”
Most participants remembered receiving this letter. Three of the participants remembered receiving an electronic copy, two remembered receiving a paper copy, and three remembered receiving both a paper and electronic copy.
Action taken upon receiving the letter. Two participants took immediate action to begin the survey after receiving the letter. Other participants explained that they set the letter aside and put a reminder on their calendar to complete the survey on a future date.
“I don’t know. I went on the website. I didn’t do it immediately. I think I put it aside for probably about a week and put it on my calendar to complete it on a given day.”
“I set it in my needs to be done pile.”
“Because of prior surveys, I knew to go right to the registration and login which is what I did and kind of poked around to see if I would have the same options as prior surveys to add delegates and request extensions. This first page [of the letter] is what I used mostly.”
Suggested changes to the letter. Most participants who reviewed the letter considered it straightforward and thorough and did not suggest anything for improvement. One participant specifically stated that the letter did a good job of providing the necessary information in a concise way.
“I mean it’s a good letter. It’s a one-pager. It gives you exactly what you need – no more, no less. I like the letter. In terms of brevity, this one’s good.”
Some participants did suggest changes to the letter. Two noted that including the due date would further improve its clarity. One participant suggested emphasizing the legal requirement and the penalty for not completing the survey: “The other thing you might do is to say you are required to do this by law and failure to do so may result in….” Another participant suggested removing the signature, explaining, “Having a fake signature from Robert L Santos who I don’t know and is probably not contactable. I would lose that myself….”
First impressions. Participants understood that the purpose of this letter was to inform a company that they were past due in completing the AIES survey and to resend the information needed by the business to complete the survey.
“It’s a past due notice that you didn’t complete it within the first notice, and you need to complete it now.”
“I did not do the survey before it was due. The information as to where to go.”
“This would be received if we’re past the deadline for submission. I know online we’re able to request extensions and when those expire, we can reach out to our account contact if there’s a valid reason we haven’t finished and work with her for an additional extension. So, I imagine we would receive this if we had exceeded that extension or were expected to.”
The first things participants noticed about the letter were the words “past due,” the Census seal at the top of the letter, the 2023 AIES designation, and that the past due letter had been signed by a different person than the original letter. The participant who flagged the different signatory indicated that this alone would not make them suspect the letter was a scam; however, if the letter had not been signed at all, they would have been concerned about the survey’s legitimacy.
“It would give me pause that it wasn’t signed. It wouldn’t give me pause that it was a different person.”
Three participants recalled receiving this letter, with one noting that they received the notice even though their submission was not past due. After receiving the letter, one participant checked that their survey response had been submitted and confirmed that it had been submitted before the due date. One participant filed for an extension, and another put it on their “to-do” list.
Suggested changes to the letter. Similar to the suggestions for the notification letter, several participants suggested that the past due letter should include the due date and the authentication code. One participant also suggested including information about the penalty to ensure it gets respondents’ attention. Additionally, one participant stressed the importance of ensuring that recipients of the letter were actually past due in their submission. Lastly, one participant suggested sending this letter in both electronic and paper form because they would be likely to suspect the electronic copy was spam and delete it.
“Make sure you are really past due before you get it.”
“That it doesn’t say when it was originally due…I assume they would have the authentication code on there. They should do electronic and paper. The issue I have with electronic is that I would delete it because I thought it was a scam.”
“One thing that is missing about the past due notice is a due date. I would say enforcement information. If I am an accountant and I am at something that is past due and I am looking at what is the penalty, what’s my due date. I have a lot of things competing for time and information. So, something that is past due but doesn’t necessarily have clear enforcement information is going to be de-prioritized.”
“I’d lose the rubber stamp at the bottom for someone we can’t really contact. I would only put it if there’s an email or something like that where you can inquire about that. In reporting for the federal government, the biggest complaint is finding a valid contact. You can’t necessarily find it on a letter from them, you have to go by some other route to find a contact.”
First impressions. Participants understood that this letter would follow the “Past Due Notice” if a survey response had still not been submitted, and that it directed the company to complete the survey and warned of a fine for not participating.
“Apparently, they didn’t respond to the past due notice. They are telling you they are giving you 10 days to complete and if you don’t, you could be fined. The login information is there to complete the survey.”
“This one is letting you know that if you don’t complete the survey, it is punishable. If they [Census Bureau] don’t do that, they are not going to be able to enforce this survey.”
“This one is kind of nicer. ‘Your timely response would be very helpful.’ They are being nice and gentle instead of saying we can take legal action. Oh, here you go, ‘possible prosecution,’ so they are being a little tough. Otherwise, people won’t pay attention, I feel certain.”
Several participants first noticed the phrase “Notice of Failure” and the bolded terms like “REQUIRED BY LAW” or “STRICTLY CONFIDENTIAL.” They felt these were strong phrases that would motivate action. Participants also noticed and appreciated that the letter included a deadline for completing the submission, though not all participants thought 10 days was a realistic expectation.
“This word right here – ‘failure.’ That’s got a lot of strength attached to it. There’s nothing like telling someone there’s a failure. Other than that, there’s nothing really breathtaking about this. It’s not signed. But, it seems like it’s a follow-up to the past-due letter…You might want to bold this [highlights $5,000 penalty in third paragraph].”
“There would be a due date – okay that’s good. I think probably the 10 days might not be reasonable if they didn’t have dates on the other [letters]. Is that like 10 working days because what if you are on vacation?”
“This one says within 10 days so that is important.”
Action taken upon receiving the letter. Participants indicated that, if they had received this letter, they would have logged on and completed the survey as quickly as they could. One participant noted that if they were unable to complete the survey by the 10-day deadline, they would have contacted the Census Bureau to request more time. Another participant, however, would complete the survey at her own pace, noting that she would “put it on my to-do list, put it in my inbox.”
“I’d probably go take your survey very quickly.”
“I would have tried to complete it and if I had issues, I would have contacted them to say I needed more time. But I don’t see…oh, I do see a place to contact them.”
“I’d just start looking into who was working on the survey. Internally, trying to figure out how we can get this submitted.”
“First, I would have to make sure that I submitted it, and it was received and the acknowledgement that ours was received. Then I would probably call the number right away. There seems to be plenty of help and goodwill to help make this happen. There must be some sort of consequences if you don’t make the effort. It is everyone’s responsibility.”
First impressions. Participants felt this letter was similar to the “Notice of Failure” in that its purpose was to alert the recipient that they had not completed the AIES and to provide some information on how to complete the survey. Some participants noted that this letter seemed “nicer” than the “Notice of Failure” letter because it used less extreme language and did not mention any fines. However, one participant did not see any references to past due notices and believed that would be important context to include.
“It gives more of a basic overview of the survey and who would take it. Just still saying it is past due.”
“It sounds ok to me. Just says you were supposed to do it and didn’t, it sounds nicer than the other one that said you could be fined and stuff, but if that is the reality.”
“Past due notice. It would be nice if this [name of the survey] was highlighted so we know what it is for and who to direct things to. This information is really helpful [most questions can be answered using general ledger] I don’t remember seeing it in the other letters, but it would be really helpful to have up front.”
Five participants felt sending this letter to the most senior financial official at a company would be an effective “last-ditch effort” to obtain a response to the survey. However, one participant questioned the effectiveness of this approach and suggested the letter should instead be sent to the team responsible for completing the survey.
“I think it’s probably a last-ditch effort to get the information reported. And if you didn’t get it through all the other avenues, what other choice do they really have.”
“Why not just send to the team taking the survey? It just gets passed down anyway. If this gets sent to a financial officer, they would either send this directly back down to the team filling out the survey or they would use the authentication code out of curiosity. The problem is that authentication codes can only be used once so that’s an issue if they use it and then pass the letter down to the team in charge of completing the survey to see.”
“I think it makes sense because you wouldn’t know who else to reach out to. From the standpoint of the Census Bureau, I think it makes perfect sense. It would get the attention you’re looking for, but it would go back to the person who was responsible for submitting the survey.”
“I personally would be embarrassed about it and would go to my staff and put it high on their priority list. We tried not to let it get this far.”
Suggested changes to the letter. Participants suggested adding information about who received previous notices/requests to complete the survey, the original survey due date and the deadline for responding to the past due notice, and information about the fine recipients would receive for not participating. One participant suggested making a phone call to confirm that someone had been receiving the notices about the survey. They also suggested that stronger language about the consequences of not participating may be warranted in this letter than in the previous letter. Another participant pointed out that the signature should be consistent across all these letters and that more personable language would improve how the letters are perceived.
“I don’t see the $5,000 sticking out to me. It should be in here. Some kind of ‘failure’ – you got to use some good strong language. If people are going to be pissed off at the government [for strong language], whatever! They’re going to be pissed off at the government whether you send them a notice with strong language or don’t. If you’re not paying people, there’s no sense trying to be their friend, especially through a typed letter.” [I: Okay, so for you ‘failure’ would be a stronger and more effective word?] ‘Most definitely.’ [I: is there any other language you think should be in here?] “I definitely recommend the word ‘penalty,’ ‘fine’” [I: Is there anything that is in the letter that should not be included?] “Here would be my recommendation. All this stuff about authentication code, register, etc. is not first-page information. You can put all of that on the second page. The first page is ‘Failure,’ ‘Penalty,’ ‘Required by law.’ If you’re trying to get the attention of some very, very busy person, just scream at them.”
“If you really want to get somebody’s attention, those are the two things [past due date and penalties] and that would make people angry.”
“It doesn’t say what the survey is used for in the government. Everything else is pretty standard, matches up. But the signature is different, Lisa again who was on the second one… I’d probably change to “we have not received” instead of “our records show…” make it sound less like a computer where you can.”
When asked about the time spent completing the AIES survey, participants reported a wide range of estimates, with an average of about 21 hours to complete. Comparing Round 1 and Round 2, it is clear that Round 2 participants had more complicated data and lengthier reporting times, which contributed to their need for due date extensions. Differences by business size are reported in Exhibit 3.1 below, which shows that larger businesses (those with more than 30 locations) spent the most active time reporting. Some participants reported spending a large amount of time passively waiting for responses, in some cases up to 6 weeks. However, for larger companies reporting more than 100 hours, much of this time was spent on hands-on review, manipulation, and collaboration and on entering data into the survey.
“[The survey] probably [took me] 6 or 8 hours. It would take a good portion of my day from start to finish. I think before I started the survey it said 3 and a half hours, that’s just beyond reality. I mean with something this big, and having to gather all the information and run the reports, it’s going to take way longer than 3 and a half hours. And with my position, there’s other things I have to do during the day. I can’t sit down and work on the survey all in one day, I had to do one hour here, one hour there, and get to it when I can.”
Exhibit 3.1. Time to Complete* the AIES in Hours by Number of Locations
No. of Locations | Average hours | Min | Max
Fewer than 10 | 8.5 | 0.5 | 48
10 to 30 | 8.4 | 2.5 | 35
More than 30 | 48.7 | 4.5 | 200
Total | 21.3 | 0.5 | 200
*Time to complete in this table reflects active hours reviewing questions, pulling data, managing data, and entering data into the survey. It does not include passive wait time while other colleagues gathered data.
Opinions about the 6-week reporting period for the AIES differed between participants in Round 1, who were early reporters, and those in Round 2, many of whom requested due date extensions. Those who requested extensions believed the 6-week period was too short, and many struggled to balance competing business priorities because of the time of year the survey was released. However, some who requested extensions this year hoped that the process would be smoother and quicker next year, now that they have gone through it once and prepared documentation.
“I would say it’s a little short. We have had to request extensions.”
“It’s not very much time, only because it was so confusing. If it made more sense in the beginning 6 weeks is an appropriate time if we had someone out. It wasn’t enough time, I requested as many extensions as I could for the first one because it took us so long to figure out how to do it.”
“Usually, I ask for an extension and another extension. 4-6 weeks would be adequate if it was standard and didn’t change so much year over year. We came out of covid and spun off the home health division. We have been running full out. Unfortunately, this gets pushed to the bottom of my worklist.”
“From the Census’ perspective I think they believe they give enough time but again, with business-specific requirements and workload I personally didn’t feel like I had enough time because it came during the thick of our annual audit so there were a lot of other things that were going on that needed my undivided attention.”
“Because of when you are asking for it right during the first quarter close out, I think it needs to be longer or change the due date to be more towards the summer June/July is our slow period. Whereas we are calendar year, and you are trying to close out and get with your CPA to get your taxes filed. The first quarter into the second can be very busy.”
Furthermore, 16 participants commented specifically on the timing of the AIES. Participants generally identified the Christmas holidays, tax season, and the year-end close-out as bad times during which they would rather not complete any Census surveys. At least five participants pointed to late April and May as a better time for them to answer the AIES. Some participants also mentioned not having all the data needed to respond until well after the new year; for example, tax filing information was not available to one participant until after April 15th.
“That is not a problem, the problem is coming in March and April. We are hiring extra people just for those 2 months.”
“I think I would still try to work on it as soon as I was able to, but definitely having it due around end of April because of our financing schedule. We’re very busy January through end of March, so anything due after that it is great for our schedule.”
“I think that’s enough time. You’re never going to want to do it, but if you can’t find time within 6 weeks, then you’re really busy. Again, tax time and Christmas time is terrible.”
“I’m sure they want timely information as soon after year end is possible. We have to close our year in January, then audit in February, in March tax return, in April then digging ourselves out of the hole, then May/June, then we’re feeling caught up and confident. That might be contradictory to the need for timely information, but that’s when we start to slow down.”
“Yeah, the deadline was about right. There is no good time of the year. We’re busy all year and people are taking vacations during the summertime. The perfect time is probably ‘due April or May.’ Right now, I’m trying to close our April books, but I’m not stressed.”
“For this company, it’s far enough out from year end that it gives us time to get through our year-end procedures, get our audit done, and focus on this…The end of April is a good time for us.”
Regarding potential changes to the 6-week time frame for completing the AIES, nearly all participants considered this time frame appropriate and said that shortening it would make completing the AIES more difficult. Only two participants said that they would be able to complete the survey in a shortened time frame, such as 3 or 4 weeks. Many participants (n=23) were in favor of extending the time frame, especially those in Round 2 who used due date extensions; smaller businesses with straightforward reporting for the AIES were less likely to be interested in a longer reporting period. Five participants believed that a longer time frame would lead to procrastination, and another eight did not think a longer time frame would have any impact on their completing the AIES by any deadline.
“I think if you were to extend it even more, people would just procrastinate. I think if you really give a good amount of due date, it will give us an urgency to respond to the work that you need.”
“[Extending the length of time] would be fine. Some people might be going through year-end audits.”
“I don’t know if it would impact me. When it’s time for me to do the survey, I set aside time to work on it so I am not doing it last minute.”
When asked whether they were aware that a due date extension was available for the AIES, 34 participants said they were aware, whereas another 28 did not know that they could request an extension. Of the participants who knew about the extension, 20 requested it. Twelve participants reported that they had used an extension for completing Census surveys in the past. They also said that requesting the extension was easy and that having it was helpful. However, one participant did not like having to request the extension multiple times in 2-week increments; they would prefer to request it only once.
“Yes, I have used those in the past. It was just logging in and checking a box. There wasn’t a big appeal process or anything.”
“It’s usually pretty basic. I do the ones for [another survey], you can just go in and click a button and get a couple extra weeks. Some of them if I need to I have to call the contact if I am at the end and it won’t let me do it any more but usually I try to stay on top of it but occasionally it happens.”
“It was mostly because I had a different job at that time. I had a bigger scope, so it was helpful to me to manage my time so I could give the time the survey needed.”
“It was pretty easy. There was the ‘request extension’ system through the site. There was no real issue with that.”
“It’s annoying I have to do it 2 weeks at a time. Tell me what the final due date is going to be, and that’s what you should set it as. All you’re doing is training people that they can continue to extend this due date. Just tell me when it’s due and end of discussion.”
In this section, we highlight recommendations flowing from the findings across all subsections of Section 3, focusing on tangible changes that may improve the usability of the AIES web instrument and the clarity of the AIES messaging, including the information and instructions provided on the website and in communication materials for participants and the public. It is worth noting that the reports in Section 3.3 and Appendix E offered little evidence of cultural or linguistic issues related to completing the AIES. For recommendations tailored to the local contexts of Puerto Rico, please refer to Appendix E.
Although most, if not all, participants perceived the AIES positively, completing the AIES is inherently resource demanding: participants—as representatives of their businesses—had to retrieve a notable amount of information from their business records, collaborate with colleagues when they did not have access to the requested information, process the information to formulate appropriate and accurate responses, and finally enter and submit their responses to the Census Bureau (Sections 3.1 and 3.2).
One of the most notable pain points discussed during the debriefing interviews revolves around requests for the same type of information (e.g., revenue) at different levels. As reported in Section 3.2.2, participants pointed out that requests for information at the location level were particularly demanding because records for specific establishments at individual locations may not be easily accessible. Perhaps more importantly, as detailed in Section 3.2.3, the main source of burden may be the notable differences between how businesses organize and break down their own records and how the AIES requests the information (e.g., the breakdown of expenses, payroll, or utility usage). As a result, participants needed to perform additional data management on their existing records to produce appropriate responses at a more granular breakdown (e.g., by location, industry, or time period) than their businesses may need, or be required, to track. Several participants perceived some of these questions as duplicative or redundant even when they understood the questions were asked in slightly different ways (detailed in Section 3.2.7). Based on the feedback from the debriefing interviews, we propose the following three recommendations for improvement.
Recommendation 1: Consider including specific instructions about the purpose of the AIES in the communication materials, such as the introduction and the FAQs on the public-facing website, and in the instrument, with an emphasis on clarifying the importance of collecting the business data in different ways. A clear explanation of the purpose of the AIES and how the Census Bureau will use granular data will provide transparency that may better motivate respondents and increase their buy-in.
Recommendation 2: Consider extending the timeout period if possible. Given that many participants reported that the AIES seemed to ask more questions about their business than previous Census Bureau surveys, the AIES portal should offer respondents a sufficient period of engagement with the instrument online. Additionally, including instructions about the timeout period may help set proper expectations for respondents and avoid frustration.
Recommendation 3: Consider conducting additional methodological research to explore and test designs for organizing and presenting questions that ask for the same types of information (e.g., revenue, capital expenditures) at different levels (e.g., total company, by industry, by location, by time period) so that it is more apparent to respondents that these are distinct questions requesting business information at different levels of reporting. Further research should also focus on developing instructions and identifying designs that may improve the efficiency of data entry and help respondents report accurate responses to the AIES, such as instructions about how to update inaccurate data and how to add new locations.
During the debriefing interviews (see Section 3.4), respondents provided insightful feedback explaining their decision-making process as they selected either the online spreadsheet or the downloadable Excel file to support their data gathering and reporting. Notably, the need to use either format to collaborate with colleagues seemed to be a key part of their decision making; however, the instrument and the online portal may not provide sufficient instruction and functionality to support such collaboration among respondents. Participants also reported usability issues with both types of spreadsheets that led to confusion or frustration while completing the AIES for their business. Specifically, the color scheme (i.e., greyed-out versus white cells) used to format the spreadsheet did not seem to provide the clear visual cues intended. Based on the feedback and suggestions from the debriefing interviews, we propose the following recommendations for improvement.
Recommendation 4: Consider updating the spreadsheet instructions and layout to clearly guide respondents to enter their responses in the right place(s) in the spreadsheet. The download/upload page and the spreadsheet itself should also provide respondents with a clear legend and instructions about the color scheme and any embedded functionality. Furthermore, consider adjusting how the spreadsheet presents the question text and instructions, which respondents found difficult to read and which made navigating the spreadsheet more challenging. These tangible measures will improve the usability of both spreadsheets and potentially mitigate some respondent burden.
Recommendation 5: Consider adjusting the protected cells and calculated totals within the Excel spreadsheet to make copying and pasting data easier and also consider providing the ability to sort data to better facilitate response. Although protected cells, calculated totals, or other embedded functions may be necessary elements based on the instrument design, additional instructions regarding these features should be provided to respondents in the spreadsheet (such as in the form of comments on the cells), which will guide respondents to better use the Excel spreadsheet as intended. Further research on the online spreadsheet may be needed to confirm whether there are similar usability complaints.
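For illustration only, the sketch below (in Python, using the third-party openpyxl package; the cell layout, formula, and comment wording are hypothetical rather than taken from the actual AIES spreadsheet) shows one way such guidance could be attached as a comment on a locked, calculated total while leaving the data-entry cells editable:

    from openpyxl import Workbook
    from openpyxl.comments import Comment
    from openpyxl.styles import Protection

    wb = Workbook()
    ws = wb.active
    ws["B2"] = "=SUM(B3:B4)"  # calculated total; stays locked under sheet protection
    ws["B2"].comment = Comment(
        "Calculated automatically from the rows below. Enter data in the "
        "unshaded cells; do not type over this total.",
        "AIES instrument")
    # Unlock the data-entry cells so respondents can still type in them.
    ws["B3"].protection = Protection(locked=False)
    ws["B4"].protection = Protection(locked=False)
    ws.protection.sheet = True  # all other cells remain locked by default
    wb.save("aies_step3_example.xlsx")

A comment of this kind surfaces the instruction exactly where a respondent would otherwise try to paste over a protected cell, rather than only in a separate instruction sheet.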
Recommendation 6: Consider including clear indications that the Excel spreadsheet has been correctly uploaded and allowing respondents to remove and reupload as needed. Once these two functionalities are implemented, the AIES instrument should be able to mitigate many of the usability issues reported in Section 3.4.3 and improve the user experience with the downloadable Excel spreadsheet.
Recommendation 7: Consider conducting usability testing to refine the error checking process. Some participants reported that when errors were flagged, they were unsure which data points were triggering the errors and how to fix them. This increased respondent burden because they had to review all the data to try to identify the errors, as reported in Section 3.4.5. Error messaging should direct respondents to the flagged item and clearly indicate why it is being flagged.
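To make the design principle concrete, the following minimal sketch (in Python; the item name, step references, dollar figures, and validation rule are purely illustrative and are not actual AIES edit checks) shows error messaging that names the flagged item and states why it was flagged:

    def check_revenue_total(company_total, location_revenues):
        """Return messages that name the flagged item and explain the flag."""
        errors = []
        location_sum = sum(location_revenues)
        if company_total != location_sum:
            errors.append(
                f"Total revenue (Step 2): the company-level value "
                f"(${company_total:,}) does not equal the sum of the location "
                f"values (${location_sum:,}). Review the highlighted location "
                "rows in Step 3.")
        return errors

    # Example: the company total is $50,000 short of the location sum.
    print(check_revenue_total(900_000, [400_000, 550_000]))

A message of this form spares respondents from rereading every entry to locate the discrepancy.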
Recommendation 8: Consider including instructions to guide respondents to use the Interactive Content Tool for previewing AIES questions. Highlighting the Interactive Content Tool at the beginning of the survey in Step 1 could help respondents see all questions before starting the survey, offering sufficient information to address respondents’ needs for internal coordination. Positioning this tool in Step 1 could also help reduce potential measurement error caused by respondents’ workaround of entering placeholder values to view all the questions at once so they could communicate with colleagues to prepare data. Additionally, some participants noted that the ability to see all questions at once in the Excel spreadsheet may be important to highlight on the Step 3 page when businesses decide which method to use. Therefore, providing an instruction on the Step 3 page to guide respondents to the Interactive Content Tool for previewing survey questions may be an effective way to improve the user experience with the AIES. If the Interactive Content Tool is made accessible from the survey portal to provide the preview functionality, consider outputting the questions that correspond with the NAICS codes in the respondent’s data. If the Interactive Content Tool is made accessible outside of the portal, consider providing respondent NAICS codes from previous data in the advanced communication materials to better facilitate participants’ ability to preview applicable survey questions (see Recommendations 13 and 14 for additional proposals for improving the Interactive Content Tool).
Recommendation 9: Consider emphasizing communication in the portal about collaborative use of the AIES instrument and providing clear instructions about how to facilitate respondents’ internal collaboration using the AIES portal or online spreadsheet. During the debriefing interviews, participants who used the downloadable Excel spreadsheet often reported that they chose this method because the Excel file made it easy to collaborate internally with colleagues on data gathering and entry. Only one participant reported reading about a collaboration tool within the AIES instrument or the portal, but they did not know how to use it. Therefore, if facilitating online collaboration is a goal of the instrument design, it may be necessary to promote the collaboration functionality and provide clear instructions to respondents regarding its benefits. In the future, it may be worth offering respondents the ability within the AIES to assign specific survey items to their colleagues for data entry rather than giving their colleagues access to the whole survey; participants were concerned about data entry errors and were reluctant to give colleagues access to the portal.
Recommendation 10: Consider providing a downloadable copy of all responses for respondents to keep for their records. This copy of their responses should include all questions answered online in the portal and in either the online or downloadable spreadsheets. Having all the details of their response in one file will be helpful to prepare for the survey the following year and to maintain a record of their response.
During the debriefing interviews, participants were asked to review a set of communication materials; their feedback on these materials is detailed in Sections 3.5 through 3.9. Notably, participants did not indicate any problems with the reporting year and fiscal year probes during the interview, and they also did not report issues with discerning reporting units from their AIES experience. Based on the feedback on the issues participants encountered with the specific materials they reviewed, we propose the following recommendations for improvement.
Recommendation 11: Consider increasing the font size for the text in the FAQ modal window (described in Section 3.5.3).
Recommendation 12: Consider revising the grid format on the Step 2 capital expenditures grid page. This grid format was somewhat confusing for participants, and the layout of the horizontal grey bars was a less effective visual cue. Participants did not consistently understand the layout and were unsure of the meaning of the grey bars, such as whether these represent distinct categories or whether the last two items will automatically sum (detailed in Section 3.5.4). Consistent with Recommendation 4, providing a clear legend and instructions for the key elements of the grid will improve usability and mitigate the comprehension issues on the Step 2 page.
Recommendation 13: Consider including additional explanation of the differences between establishment-, industry-, and company-level questions in the Interactive Content Tool (detailed in Section 3.6.1) and mapping these questions to the appropriate steps in the AIES. Once implemented, this measure may also serve as a reference clarifying the purpose of collecting business information at these different levels and the intended use of such data, increasing respondent buy-in as indicated in Recommendation 1.
Recommendation 14: Consider tweaking the Interactive Content Tool downloadable spreadsheet to increase readability, given the consistent feedback from multiple participants on this issue (detailed in Section 3.6.3). For example, configuring the size of cells, columns, and rows to increase spacing for the question text; breaking the sheet into multiple smaller sheets; or otherwise making the sheet more concise and less busy could improve readability. Matching the format of the Interactive Content Tool downloadable spreadsheet as closely as possible to the AIES Excel spreadsheet, as a way to preview questions, could also improve user satisfaction with the tool. Because of the existing format (Section 3.6.3), some participants currently may not believe that the Interactive Content Tool shows all the questions they will be asked in the AIES.
Recommendation 15: Consider providing the video length in the description of each FAQ video so that respondents can easily see how long it is (detailed in Section 3.7.2). This will help respondents decide whether and when to review a video for assistance.
Recommendation 16: Consider whether the QR code on the postcard is necessary, or reconsider the action it triggers. Most participants said they would not use the QR code because they complete Census Bureau surveys on their work computers (detailed in Section 3.8.1), so including QR codes on future materials should not be a high priority. Alternatively, to make better use of the QR code, its embedded action could be programmed to email the survey information (e.g., instructions, a link to the portal) to the respondent’s designated account. This would help connect respondents to the survey via a proper channel at their workplace.
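One possible implementation of that alternative is sketched below in Python with the third-party qrcode package (which requires Pillow; the subject line, body text, and portal URL are placeholders, not actual AIES values). It encodes a mailto: URI, so scanning the postcard opens a pre-drafted email containing the survey information that the respondent can send to their own designated work account:

    from urllib.parse import quote
    import qrcode

    # Placeholder content; the recipient is left blank so the respondent
    # can address the drafted email to their own designated work account.
    subject = quote("AIES survey information")
    body = quote("Survey portal: https://portal.census.gov/ (placeholder URL)")
    payload = f"mailto:?subject={subject}&body={body}"

    qrcode.make(payload).save("aies_postcard_qr.png")  # image for the postcard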
Recommendation 17: Consider including information that may motivate respondents in the advanced email. Participants liked receiving the advanced email and might find it more helpful if it included advance notice about question changes (detailed in Section 3.8.2). Including links to the Interactive Content Tool in advanced letters would also help respondents preview the questions, as mentioned in Recommendation 8.
Recommendation 18: Consider including the authentication code and the original due date in the past due letter (detailed in Section 3.9.2). This will improve clarity and emphasize how late respondents may be when submitting their responses.
Recommendation 19: Consider revising the ECSL1 letter. This experimental “Dear CEO” letter could be improved by adding more information about who received past notices/requests to complete the survey, the original survey due date and the deadline for responding to the past due notice, information about the fine they would receive for not participating, and stronger language as in the Office of General Council “Light” Letter (detailed in Section 3.9.4).
Recommendation 20: Consider featuring the due date in all communication materials in a prominent and clear way. Across Sections 3.5 through 3.9, many participants mentioned that the survey due date is the most important information they sought, or would expect to find, when engaging with these communication materials. Although the due date is already available in most materials, providing it in all materials should help ensure respondents’ awareness.
Recommendation 21: Consider maintaining a 6-week time frame as the minimum fielding period for the AIES data collection and, if possible, consider extending it. Based on the feedback reported in Section 3.10, participants elaborated on the need for enough time to curate, process, and enter proper and accurate responses into the AIES, accounting for time spent waiting for feedback from internal collaborators and juggling competing priorities at work—especially complying with quarter-end and year-end reporting and tax filing obligations. Shortening this time frame may risk introducing unnecessary burden and may discourage respondents from employing best efforts to produce the high-quality data that the Census Bureau seeks to collect. One option to consider is extending the time frame only for larger businesses to allow them more time for reporting, thus staggering the data collection deadlines by business size. Additionally, further reviewing data collected during the AIES about the amount of time spent on the survey could be useful to inform fielding time frame decisions.
Recommendation 22: Consider improving the visibility of due date extensions in the portal because about half of participants were unaware of the extensions, as reported in Section 3.10. Mentioning the option for extensions in respondent communications could also be useful. Additionally, allowing for more than a two-week extension would be appreciated by larger businesses.
Recommendation 23: Consider additional debriefing interviews that include further discussions about respondent burden, the decision to use both the Excel spreadsheet and online spreadsheet, and any of the changes stemming from the above recommendations. Respondents who report the highest hours to complete when answering the AIES could be targeted for interviews to learn more about their sources of burden.
Recommendation 24: Consider further usability testing of the Excel spreadsheet, especially with large businesses, including testing any changes made. Additionally, respondents who use both the Excel spreadsheet and the online spreadsheet could be identified through paradata and targeted for interviews to further explore usability-related issues.
Recommendation 25: Consider exploring ambiguously worded questions and confusing terminology further through cognitive interviews to improve consistent comprehension among respondents. Allowing respondents to flag confusing questions in the AIES could be helpful to identify items with which they are struggling. Reviewing calls and messages to the help desk and respondent comments in open-ended questions could also be helpful to extract insights for future improvement.
RTI has presented revised recommendations to the Census Bureau. Per Census Bureau feedback, most findings and recommendations were consistent with the results from the previous research efforts. The Census Bureau further indicated that the findings from debriefing interviews will be used to inform additional improvements for the next AIES.
2023 AIES Participant Debriefing Protocol
Participant ID:
Date:
Time:
Interviewer:
Locations in Puerto Rico?
Multi-unit or single unit?
Legend:
Blue highlight: Used only in round 1.
Yellow highlight: Used only in round 2.
No highlight: Used in both rounds 1 and 2.
Purpose:
Researchers in the Census Bureau’s Economic Statistical Methods Division (ESMD) and Economy-Wide Statistics Division (EWD), along with RTI International, will conduct participant debriefing interviews in support of the 2023 Annual Integrated Economic Survey (AIES). These interviews will cover three main topics: response process, communications materials, and instrument performance.
Not all questions in this protocol will pertain to all participants, and some modules are optional depending on the length of the interview and firm characteristics. Numbered questions are priorities, and bulleted questions underneath are optional probes.
Research Questions: The research will be guided by the following research questions:
How are respondents reporting to the AIES?
What are respondents' overall impressions of the survey?
What is the ideal length and timing of the field period for respondents?
What are the unique reporting needs of companies with locations outside of the 50 United States?
What are the barriers to reporting?
What is respondents’ feedback on the content and accessibility of respondent communications?
What support materials are respondents using when reporting to the AIES?
Are the content selection tool and summary document sufficiently supporting response?
What are respondents’ impressions of letters and emails?
What is respondents’ feedback on instrument performance and response burden?
What feedback do respondents have on updates to the instrument since the last round of research?
What are the respondent-reported reasons for item nonresponse on the AIES?
What are respondents’ impressions of screen layout, font, and other features of the online instrument?
Informed Consent: Respondents will be asked to complete a consent form electronically before the time of the interview.
Materials Needed:
Electronically signed consent form
Communication materials specific to this protocol version
Respondent recruitment and firm information, including: contact info; response status; locations in Puerto Rico; locations in manufacturing; single/multi-units; item missingness; pilot participation flag; number of locations; number of industries
Method: We will conduct the interviews via Microsoft Teams.
Expected length of interview: 1 hour (60 minutes) maximum
General probes that may be used throughout the interview:
Were these data easy to access?
What else can you tell me about this?
Can you tell me more about that?
How confident are you in that response?
What looks unclear or confusing here?
Introduction (5 minutes)
If necessary: You should have received a link to a consent form from our recruiter via email. Did you have a chance to review and sign that yet?
[IF YES] Did you have any questions about any of the information presented there?
[IF NO] Please open up the link now and review the form, then sign it. Take your time and let me know if you have any questions.
Thank you so much for agreeing to talk with me today!
As part of the rollout of the Census Bureau’s new Annual Integrated Economic Survey, we are following up with some companies to learn more about the processes you may or may not have used to complete the survey and to review some study materials.
I am working with the Census Bureau to make sure that the Annual Integrated Economic Survey is performing as expected and to get feedback about ways to improve the performance of the survey instruments. I’m talking with you today because you were identified as the person who responded to the AIES this year. When I refer to “the AIES survey” during our conversation, I’m talking about the Annual Integrated Economic Survey that you recently responded to about your business.
My job is to improve the AIES survey. I didn’t write the survey questions, so you do not have to hold back when telling me your feedback. Please be candid and frank in your responses. Our interview is being conducted under the authority of Title 13, which means that your responses are confidential, and neither your name nor the name or identifying information about your company will be included in any of our findings.
Do you have any questions before we get started?
I’d like to record our session today so that when I go to analyze the results of these interviews, I can use the recording to pick up on anything I may have missed in my notes. Do I have your permission to record our session today?
[Turn on recording]
[State respondent ID number and date, repeat consent on recording: Do you agree to participate in the interview today? Is it okay if I record our conversation?]
Topic 1: Response Process
Module 1: Warm up (5 minutes)
Universe: All participants
Let’s get started with some general questions about your job and Census Bureau surveys in general before we start talking about the AIES specifically.
[INTERVIEWER INSTRUCTION: As needed, remind respondent that before the AIES, they may have reported for Census Bureau surveys several times throughout the year on multiple different annual trade surveys. Now, through the AIES, they report for multiple trade surveys at once.]
[INTERVIEWER NOTES: The goal for this section of questions is to get a general understanding of the response process for businesses BEFORE the switch to the AIES. This process will likely overlap with the process for responding to the AIES. Ask additional probes as needed to understand how the business reports their data broadly and not specifically to the AIES.
Note that if someone is very new to the role (i.e., 1 year or less), they may not have any past experience with responding to Census Bureau surveys. Skip questions as needed in these cases.]
Tell me a little bit about your business. What types of goods or services does this business provide?
What is your role in the organization?
What is your role in the process for responding to Census Bureau surveys in general? How long have you been in this role?
How do you find the data you need to answer Census Bureau surveys?
Do you work with anyone else to get the data you need?
Do you access any reports to get the data you need?
How easy or difficult is it for you to find all the data you need for Census Bureau surveys? Why is that?
How easy or difficult is it for you to enter the data into the survey once you have the answers you need? Why is that?
Module 2: Responding to the AIES (7 minutes)
Universe: All respondents
Now let’s talk specifically about your experiences on the new Annual Integrated Economic Survey. As a reminder, this is likely the first time you have responded to the new AIES, which combines multiple Census Bureau economic surveys into one instrument.
[INTERVIEWER NOTES: The goal for this section of questions is to better understand how the response process to the AIES may be different compared to Census Bureau surveys they have responded to in the past. We also want to start understanding their overall experiences with completing the AIES. Probe as needed on any issues with response that they bring up in this section.
Note that if someone is very new to the role (i.e., 1 year or less), they may not have any past experience with responding to Census Bureau surveys. Skip questions as needed in these cases.]
What differences, if any, did you notice between this AIES and other Census Bureau surveys you have completed in the past?
In what ways was your approach the same or different for the AIES versus other annual Census Bureau survey(s)?
Did anything change in the way you retrieved the data you needed to answer the AIES compared to other annual Census Bureau surveys?
Did anything change in the way you entered the data you needed to answer the AIES compared to other annual Census Bureau surveys?
What was your overall impression of the AIES?
How easy or difficult was it to answer the survey?
Did some topics or sections take more or less time than others? Which ones? Why?
[INTERVIEWER INSTRUCTION: If respondents do not specify the level of data aggregation as a reason that the survey is easy or difficult to answer, you should ask additional probes about whether and how each level of data aggregation may be difficult for respondents to report:
[IF NEEDED:] Was it difficult to answer questions for the company level?
[IF NEEDED:] Was it difficult to answer questions for the location level?
[IF NEEDED AND BUSINESS INCLUDES MULTIPLE INDUSTRIES:] Was it difficult to answer questions for the industry level?
Are they equally difficult, or which was the most difficult to answer?
Did you have any confusion about which parts of the company to include when answering the AIES?
Overall, how easy or challenging was it to submit your response to this survey in the electronic instrument?
Did you access any help resources when completing the AIES?
Based on your experience this year, are there ways you might prepare for the AIES in the future?
Module 3: Locations Outside of the 50 States (10 minutes)
Universe: All respondents with at least one location in Puerto Rico
[INTERVIEWER NOTES: The goal for this section of questions is to better understand how the response process differs for businesses that have locations in Puerto Rico.]
In preparing for our conversation today, I noticed that your company has at least one location in Puerto Rico.
How easy or difficult was it to report some but not all data for your locations in Puerto Rico? For example, we ask for revenue by locations in Puerto Rico, but we don’t ask for capital expenditures including locations in Puerto Rico.
Are your records for Puerto Rican locations the same or different than those in the 50 states?
[INTERVIEWER NOTE: We are interested in differences in data structure, databases, or how they “keep the books” overall for US versus Puerto Rican locations]
Do you use the same steps to pull data for Puerto Rican locations as you do for those in the 50 states?
[INTERVIEWER INSTRUCTION: probe about colleagues they may work with and how they pull the data, as needed.]
Do you use different steps to report data for locations in Puerto Rico?
How accessible is the information we request on the AIES for your locations in Puerto Rico?
[INTERVIEWER NOTES: we want to learn more about the different processes that respondents may have for accessing reports and reaching out to colleagues for the Puerto Rican location data they report.]
Do you track the same information for your locations in Puerto Rico as you do for those in the 50 states?
If we asked all the questions on AIES to include all of your locations in Puerto Rico, how would that impact your response to the AIES? That is, would you have to access different systems, reach out to different people, or have other steps to report all of the questions on the AIES for your locations in Puerto Rico?
How do language barriers impact how you respond for your Puerto Rican locations, if at all?
When you reach out to your locations in Puerto Rico, do you encounter any language barriers? What do you do about it, or how do you handle language barriers?
In what ways could the Census Bureau mitigate or minimize language barriers in getting data for our survey from your company?
When you reach out to your locations in Puerto Rico, do you encounter any cultural differences that impact your ability to get requested data? What do you do about it, or how do you handle cultural differences?
Topic 2: Instrument Performance
Module 4: Explicit Response Choice and Error Checking (10 minutes)
Universe: All respondents
[INTERVIEWER NOTES: The goal for this section of questions is to understand how multi-unit respondents respond to the AIES, specifically whether they prefer to fill out the data online or through the Excel sheet, and their interactions with error checking. For Step 2 and Step 3, respondents have the option to download an Excel sheet that populates with data they filled out in Step 1. We want to know more about their decision to use the online or Excel reporting methods and what worked and what didn’t work.]
Think about when you entered your data into the AIES instrument. There were a few points in the survey where you could download an Excel file, or you could fill in the data using an online spreadsheet. How did you decide how to respond?
[INTERVIEWER NOTE: Respondents have the option to choose which data entry method to use (i.e., online vs. Excel) at three points in the survey – Step 1 (verify locations), Step 1b (verify locations continued, for only some businesses), and Step 3 (detailed data). Respondents do have the option to select different data entry methods at different steps, but this functionality is not advertised. Please confirm which method they used at each step if they changed methods across these steps. You can also remind them of these points where they made a decision about data entry method.]
Did you use just one way to respond, or did you use both?
While completing the survey, did you ever flip between the two methods?
Did anyone else at your company use either the Excel file or the online spreadsheet?
What feedback do you have on your chosen response mode? How did it go?
[IF DID NOT USE EXCEL SHEET, IF NEEDED]: Can you tell me more about why you decided not to use the Excel sheet to respond and upload to the website? [INTERVIEWER NOTE: We’re looking for more information about hesitancy to use the Excel sheet, especially focused on respondent concerns about the functionality of the Excel file or the upload feature.]
Think about when you entered your data into the AIES instrument. There were a few points in the survey where you could run error checking on your submission. Do you remember running the error checking?
Did running error checking help you to submit your data?
What feedback, if any, do you have on error checking on this survey?
What do you recall about the process of running the error checking?
Topic 3: Communications Materials
Module 6: Web Standards Exploratory (10 minutes)
Universe: All respondents (optional)
Now, let’s take a look at a few screenshots from the AIES. In the recruitment email, which included the link to the consent and the information about joining the Teams call, we also included an attachment of some study materials. There are PDF and Word versions of this document. Please open one of those attachments so we can review these materials.
Once you have the attachment open, please share your screen with me so I can follow along with what you are viewing. You can do this by clicking on the “Share” button next to the red “Leave” button; it has an upward-pointing arrow on it.
[INTERVIEWER NOTE: If for any reason the respondent cannot share their screen (e.g., technical issues, prefers not to share screen) you can continue with the interview. Ask more questions and make sure the respondent is describing aloud what they are looking at. Make note of the technical issues and talk with Katherine and Patrick immediately after the interview.]
[INTERVIEWER INSTRUCTION: Guide participant to Screenshot 1: Step 2, Reporting Period, the first page of the PDF or Word document.] In this question, we are asking about the reporting period for this survey. What is your first impression of this screen?
What do you expect will happen on this screen when you answer this question?
What do you think about the layout of this screen?
What do you think about the size and thickness of the font on this screen?
[INTERVIEWER INSTRUCTION: Guide participant to Screenshot 2: Step 2, Fiscal Year, the second page of the PDF or Word document.] Is it clear what has happened here?
Describe for me what has happened on this screen. What would you do next?
What do you expect will happen if you change your selected answer?
[INTERVIEWER INSTRUCTION: Guide participant to Screenshot 3: Step 1, FAQ Modal Window, the third page of the PDF or Word document.] This screen shows what the screen looks like when a user clicks on the FAQ link at the top. What do you notice first here?
What about the font – is it too small, too big, or the right size?
What stands out to you?
[INTERVIEWER INSTRUCTION: Guide participant to Screenshot 4: Step 2, Grid Format, the fourth page of the PDF or Word document.] This is a question from the company-level collection part of the survey. What do you notice here?
How do these questions relate to each other, if at all?
How do you anticipate this screen will work? What do you expect to see?
What are the horizontal lines trying to communicate to you?
Look about halfway down at that double line – why is that there? What do you think that double line is trying to communicate to you?
VERSION A OF PROTOCOL - Module 2: Interactive Content Tool (15 minutes)
Universe: All participants (optional)
Now, let’s take a look at a website for the AIES. I’m going to put a link in the chat of our meeting, please click on it and open it in a web browser.
https://www.census.gov/aies/questionspreview/
[INTERVIEWER NOTE: Respondent should already be sharing screen. Ask them to navigate to a web browser to display the website for you both to look at.]
Take a look at this page – please tell me what you think this page is all about. What is this? [INTERVIEWER NOTE: we’re looking for general reactions to this page. Do respondents understand what this page would be used for?]
Did you know about this page before our time together today?
Did you use this tool before today? If so, please tell me about that.
Please think about what your company does or makes. Now, select the industry or industries that best represent your company using this tool.
[INTERVIEWER NOTE: We expect respondents would need to select the 6-digit code for the information on the website to be useful. However, respondents may be confused about which code to select. Please capture as much detail as you can about the selection process and how respondents navigate the website.]
Did you find it easy or difficult to select your industry/industries?
How did you know what to select?
How confident were you in making selections?
Now that you’ve made selections, let’s preview questions for your business at the company, industry, and location levels.
What are you looking at on your screen? What do you notice?
Where would you go to find the question for “What were the total sales, shipments, receipts, or revenue in 2022?” by location? [INTERVIEWER NOTE: This is not really a usability task that respondents would go to the website to complete, but more to judge their comprehension of the information presented in the site. Spontaneously probe as needed about any problems they run into.]
We’ll move on to the next task of exporting the questions for your business at the company, industry, and location level to a spreadsheet file. Please show me how you would go about doing that and open the download.
What do you think about how the information is displayed in the Excel sheet? Is everything clear or is anything unclear?
Did you find it easy or difficult to export your survey questions to a spreadsheet file?
What would you do with this download once you had it?
Would this be helpful for you in completing the AIES? If so, how would it be helpful to you?
VERSION B OF PROTOCOL – Module 3: AIES Website (15 minutes)
Universe: All participants (optional)
Now, let’s take a look at a few website pages for the AIES. I’m going to put a link in the chat of our meeting, please click on it and open it in a web browser.
[INTERVIEWER NOTE: Respondent should already be sharing screen. Ask them to navigate to a web browser to display the different URLs for you both to look at.
For each of these pages, we want to understand if the information is clear and easy to comprehend. Is the general topic of the page clear? Is anything surprising that they would not expect to see on the page?]
[INTERVIEWER INSTRUCTION: Enter the following URL in the chat: https://www.census.gov/programs-surveys/aies.html]
Take a look at this page [PAUSE FOR RESPONDENT TO READ] – please tell me in your own words what it is all about.
Have you seen this page before our time together today?
What stands out on this page for you?
What is the most important information on this page?
What, if anything, do you expect to see that is not on this page?
When might you visit this page?
[INTERVIEWER INSTRUCTION: Enter the following URL in the chat: https://www.census.gov/programs-surveys/aies/information.html]
Now, let’s take a look at this page [PAUSE FOR RESPONDENT TO READ] – please tell me in your own words what this page is all about.
Have you seen this page before our time together today?
What stands out on this page for you?
What is the most important information on this page?
What, if anything, do you expect to see that is not on this page?
When might you visit this page?
[INTERVIEWER INSTRUCTION: Enter the following URL in the chat: https://www.census.gov/programs-surveys/aies/faq.html]
Ok, last one – let’s take a look at this page [PAUSE FOR RESPONDENT TO READ]. Please tell me in your own words what this page is all about.
Have you seen this page before our time together today?
What stands out on this page for you?
What, if anything, do you expect to see that is not on this page?
What is the most important information on this page?
When might you visit this page?
What do you think about the organization of the page into general and respondent questions? Do these sections make sense to you or is anything confusing?
Of all of the pages we just looked at, which one do you think would be most helpful in completing the survey, and why?
Do these pages make sense to you?
Is the information on these pages what you would expect to see?
Can you find the information you need to complete the survey?
VERSION C OF PROTOCOL – Module 4: AIES Emails (15 minutes)
Universe: All participants (optional)
Now, let’s take a look at a few of the emails we sent out for the AIES.
[INTERVIEWER NOTE: Ask respondent to continue scrolling down in the Word/PDF document they already have open.]
[INTERVIEWER INSTRUCTION: Guide participant to the postcard, the fifth page of the PDF or Word document.]
This is a postcard that we sent to some businesses but not all. Take a look at it [PAUSE FOR RESPONDENT TO READ] – tell me what this is all about.
Do you remember receiving this postcard?
If so, did you find receiving this information useful?
Did you notice the QR code in the corner?
What is the likelihood that you would use your mobile device to access the QR code?
Do you generally interact with QR codes when you see them, or do you generally ignore them? Why?
What would you expect this QR code to point to – that is, what information would you want this to open up to?
[INTERVIEWER INSTRUCTION: Guide participant to the AIES Advance Email, the sixth page of the PDF or Word document, titled “Important notice.”]
Take a look at this email [PAUSE FOR RESPONDENT TO READ] – we sent this to your company in late February. Tell me what it is all about.
Do you remember seeing this email before?
[IF DOES REMEMBER RECEIVING:] What did you do after receiving this email?
[IF DOES REMEMBER RECEIVING:] Did it come directly to you, or was it forwarded to you?
[IF DOES NOT REMEMBER RECEIVING:] Can you think of any reasons why you may not have received this email? [INTERVIEWER NOTE: looking for reasons such as spam, sent to someone else, new to role, etc.]
What is the first thing you notice about the email?
Would you/did you find it useful to receive this communication in advance of the survey opening?
[IF DOES REMEMBER RECEIVING:] Did you visit the websites noted and use the tools there to begin preparing in advance of the survey opening?
What, if anything, would you change about this email, having completed the survey?
[INTERVIEWER INSTRUCTION: Guide participant to the AIES Due Date Reminder Email, the seventh page of the PDF or Word document, titled “Due Date Reminder.”]
Now, take a look at this last email [PAUSE FOR RESPONDENT TO READ] – tell me what this is all about.
Do you remember seeing this email before? Did you receive a paper copy of it?
[IF DOES REMEMBER RECEIVING:] What did you do after receiving this email?
[IF DOES REMEMBER RECEIVING:] Did it come directly to you, or was it forwarded to you?
[IF DOES NOT REMEMBER RECEIVING:] Can you think of any reasons why you may not have received this email? [INTERVIEWER NOTE: looking for reasons such as spam, sent to someone else, new to role, etc.]
What is the first thing you notice about the email?
What is the purpose of this email?
What stands out about this email?
VERSION D OF PROTOCOL – Module 5: AIES Letters (15 minutes)
Universe: All participants (optional)
Now, let’s take a look at a few of the letters we sent out for the AIES.
[INTERVIEWER NOTE: Ask respondent to continue scrolling down in the Word/PDF document they already have open.]
[INTERVIEWER INSTRUCTION: Guide participant to the AIES-L1, the fifth and sixth pages of the PDF or Word document. Instruct respondents to read BOTH pages – the front and back of the letter.]
Take a look at this letter, both the front and back [PAUSE FOR RESPONDENT TO READ] – tell me what it is all about.
Do you remember seeing this letter before? Did you receive a paper copy of it?
[IF DID RECEIVE]: What did you do after receiving this letter?
What is the first thing you notice about the letter?
What is the purpose of this letter?
What stands out about this letter?
What, if anything, would you change about this letter, having completed the survey?
[INTERVIEWER INSTRUCTION: Guide participant to the AIES-L2, the seventh page of the PDF or Word document, titled “PAST DUE NOTICE.”]
Take a look at this letter now [PAUSE FOR RESPONDENT TO READ] – tell me what it is all about.
Do you remember seeing this letter before? Did you receive a paper copy of it?
[IF DID RECEIVE]: What did you do after receiving this letter?
What is the first thing you notice about the letter?
What is the purpose of this letter?
What stands out about this letter?
What, if anything, would you change about this letter, having completed the survey?
[INTERVIEWER INSTRUCTION: Guide participant to the AIES-L4L, the eighth page of the PDF or Word document, titled “NOTICE OF FAILURE TO PROVIDE MANDATORY RESPONSE.”]
Now, take a look at this letter [PAUSE FOR RESPONDENT TO READ] – not all companies receive this one. Tell me what it is all about.
What is the first thing you notice about the letter?
What is the purpose of this letter?
What stands out about this letter?
What would you do if you got this letter?
[INTERVIEWER INSTRUCTION: Guide participant to the AIES-ECSL1, the ninth page of the PDF or Word document, also titled “PAST DUE NOTICE.”]
This last one is different because it would not be sent to you; it would be sent to the most senior financial officer we had on file for your company. I want to stress that we have not sent this letter out, and we are not going to send this letter. We are just looking for feedback on a hypothetical letter like this one being sent to the most senior financial officer we can identify. In this hypothetical situation, we would have reached out to this company multiple times by email, mail, and phone, and gotten no response.
Tell me what it is all about.
What is the first thing you notice about the letter?
What is the purpose of this letter?
What stands out about this letter?
As we discussed, this letter would only be sent to those who had not responded to the AIES after multiple attempts. I know you did respond to the AIES this year. Let’s pretend that you had not responded after multiple contact attempts. How do you feel about this letter being sent to the most senior financial officer we could identify at your company? [INTERVIEWER NOTE: We’re trying to learn if sending to senior financial officer would lead to issues with alienating respondents by “going over their heads.”]
What do you think the impact would be at your company if we sent this letter to the most senior financial officer? [INTERVIEWER NOTE: We’re looking for consequences that the respondent may face if this letter goes to the senior financial officer, e.g., scolding, written up, etc.]
What would be the best way for the Census Bureau to notify your company about completing the AIES after missing the deadline?
What information should be in the letter that is not in there now? Anything else you would need to know or that should be included in the letter? Anything that should NOT be included in the letter?
Topic 4: Burden
Module 11: Ideal Field Period (5 minutes)
Universe: All Respondents (optional)
[INTERVIEWER NOTES: The goal for this section of questions is to understand how long businesses need to collect data and respond to the AIES. If running low on time, please ask each numbered question and skip follow-up questions.]
How long do you think it took you to complete the whole survey, including any time to retrieve the data you needed?
[INTERVIEWER NOTE: We want to understand both the active time responding to the survey (i.e., reviewing data, entering data) and the inactive time waiting for responses from colleagues. Probe as needed to understand.]
[IF NEEDED:] How much time were you actively responding to the survey? That is, reaching out to colleagues, reviewing data, and entering data?
[IF NEEDED:] How much time were you passively waiting for responses from colleagues?
[IF NEEDED:] How much time did your colleagues spend actively responding to your requests for information, data, and reports?
[IF NEEDED:] Your best guess is fine and it’s okay if you do not know.
The Census Bureau usually allows four to six weeks from the initial email about a survey to the due date to complete the survey. What do you think about this length of time given for completing the AIES?
How easy or difficult is it to respond with the current time frame?
How would it impact your response if we extended this length of time?
How would it impact your response if we shortened this length of time?
Were you aware that due date time extensions were available for the AIES?
[IF YES:] Did you use the due date time extension for the AIES this year?
[IF YES:] What was that process like?
[IF NO:] Why did you not use the extension?
[IF NEEDED:] Have you ever used a due date time extension on any Census Bureau survey? What was that process like?
Module 12: AIES Relative Burden (5 minutes)
Universe: All respondents (optional)
[INTERVIEWER NOTES: The goal for this section of questions is to understand the burden that the AIES poses for respondents compared to past surveys. You may want to ask additional spontaneous probes about burden, which are included as IF NEEDED probes below. Skip these probes if running low on time or if they were covered previously.]
Compared to other Census Bureau surveys you may have completed in the past, how easy or difficult was it to answer the AIES? Why?
[IF NEEDED:] Compared to other Census Bureau surveys you may have received, how similar or different was it to answer the AIES? Why?
[IF NEEDED:] How do you feel about answering all the surveys at one time compared to shorter surveys throughout the year?
[IF NEEDED:] How do you feel about logging into the portal and completing all the surveys at once compared to logging in multiple times throughout the year?
[IF NEEDED:] How do you feel about working with colleagues to gather data at multiple times during the year compared to one time?
Wrap up/Debriefing (5 minutes)
Universe: All participants
That’s all the questions I have for you today! Do you have any other comments, questions, or suggestions for us?
Thank you so much for your time today.
[CLOSE INTERVIEW AND TAKE THE FOLLOWING STEPS:
SAVE RECORDING TO APPROPRIATE FOLDER
EMAIL/TEAMS MESSAGE REBECCA Q AND KATHERINE THAT THE INTERVIEW WAS COMPLETED
COMMUNICATE ANY ISSUES OR PROBLEMS WITH THE INTERVIEW TO KATHERINE AND PATRICK
FINALIZE NOTES AS SOON AS POSSIBLE AND MOVE NOTES INTO FINAL FOLDER]
CONSENT FORM
The U.S. Census Bureau routinely conducts research on how we, and our partners, collect information in order to produce the best statistics possible. You have volunteered to take part in a study of data collection procedures.
We plan to use your feedback to improve the design and layout of the form for future data collections. Only staff involved in this product design research will have access to any responses you provide. This collection has been approved by the Office of Management and Budget (OMB). The eight-digit OMB approval number, 0607-0725, confirms this approval and expires on 12/31/2025. Without this approval, we could not conduct this study.
In order to have a complete record of your comments, your interview will be recorded (e.g., audio and screen). We plan to use your feedback to improve the design and layout of survey forms for future data collections. Only staff involved in this product design research will have access to the recording.
AUTHORITY AND CONFIDENTIALITY
This collection is authorized under Title 13 U.S. Code, Sections 131 and 182. The U.S. Census Bureau is required by Section 9 of the same law to keep your information confidential and can use your responses only to produce statistics. Your privacy is protected by the Privacy Act, Title 5 U.S. Code, Section 552a. The uses of these data are limited to those identified in the Privacy Act System of Record Notice titled “COMMERCE/CENSUS-4, Economic Survey Collection.” The Census Bureau is not permitted to publicly release your responses in a way that could identify you, your business, organization, or institution. Per the Federal Cybersecurity Enhancement Act of 2015, your data are protected from cybersecurity risks through screening of the systems that transmit your data.
BURDEN ESTIMATE
We estimate that completing this interview will take 60 minutes on average, including the time for reviewing instructions, searching existing data sources (if necessary), gathering and maintaining the data needed (if necessary), and completing and reviewing the collection of information. You may send comments regarding this estimate or any other aspect of this survey, including suggestions for reducing the time it takes to complete this survey, to [email protected].
Click here to participate in this research.
I agree to participate in this research
Enter your information below:
First Name: __________________________________________________
Last Name: __________________________________________________
Respondent Debriefing Recruitment Email:
SUBJECT: Action Requested: Schedule a Meeting with the U.S. Census Bureau
Hello,
I hope this message finds you well! I am a survey researcher at the U.S. Census Bureau. You are listed as the contact for your company, and we are interested in gathering feedback about a Census Bureau survey that you recently completed. Below is a link to select a date and time to meet with us. The meeting should last no more than 1 hour, and no advance preparation is required.
Follow this link to the Scheduler: Schedule Meeting
Or copy and paste the URL below into your internet browser:
[https://research.rm.census.gov/________________________________________________]
If you have any questions or concerns, please feel free to contact me via e-mail or phone. Your participation in this research is voluntary and invaluable.
Thanks in advance for your consideration!
[NAME], [TITLE]
[SIGNATURE BLOCK]
Nonrespondent Debriefing Recruitment Email:
SUBJECT: Action Requested: Schedule a Meeting with the U.S. Census Bureau
Hello [name],
I’m [NAME], a survey researcher with the Census Bureau, and I wanted to reach out personally to invite your feedback on Census surveys.
I am part of a team that talks to people who receive our surveys so that we can learn what you think. I’d like to hear from you – you can find a day and time that works for you using the scheduling link below, or reply to this email to find a time.
The meeting will last no more than 60 minutes, and no advance preparation is required.
Follow this link to the Scheduler: Schedule Meeting
Or copy and paste the URL below into your internet browser:
[https://research.rm.census.gov/________________________________________________]
If you have any questions or concerns, please feel free to contact me via e-mail or phone. Your participation in this research is voluntary and invaluable.
Thanks in advance for your consideration!
[NAME], [TITLE]
[SIGNATURE BLOCK]
AIES Instrument Screenshots for Testing
Screenshot 1: Step 2, Reporting Period
Screenshot 2: Step 2, Fiscal Year Selected
Screenshot 3: Step 1 FAQ Modal Window
Screenshot 4: Step 2 Grid Format
2023 AIES Websites for Testing
Interactive Content Selection Tool
https://www.census.gov/aies/questionspreview/
AIES Landing Page
https://www.census.gov/programs-surveys/aies.html
AIES FAQ Page
https://www.census.gov/programs-surveys/aies/faq.html
AIES Information for Respondents Page
https://www.census.gov/programs-surveys/aies/information.html
2023 AIES Emails for Testing
AIES Postcard
Introducing the Census Bureau’s new Annual Integrated Economic Survey
The U.S. Census Bureau is excited to announce the launch of the Annual Integrated Economic Survey (AIES). This new program will replace one or more surveys in which your company previously participated, streamlining your reporting by consolidating all necessary questions into one convenient survey. If selected to participate in the AIES, your business will receive a survey invitation in March of 2024.
Thank you for your participation in these important annual surveys. The Census Bureau is ready to support your company during this transition. For more information on the Annual Integrated Economic Survey, please visit census.gov/aies.
AIES Advance Email
Subject: IMPORTANT NOTICE: Annual Integrated Economic Survey
ANNUAL INTEGRATED ECONOMIC SURVEY
Important Notice Regarding Upcoming Census Surveys
Check our email address – it’s official if it ends in @census.gov
ATTN COMPANY
We are writing to inform you that in a few weeks, we will be sending you the 2023 Annual Integrated Economic Survey.
At this time, we encourage you to visit census.gov/aies/information for a list of surveys that are covered under the new collection and to start preparations for reporting. For a preview of the questions in this survey, visit census.gov/aies/questionspreview/.
What is the Annual Integrated Economic Survey?
The Annual Integrated Economic Survey (AIES) is a newly redesigned survey that combines multiple mandatory annual surveys into one comprehensive data collection. The new format simplifies reporting by consolidating all necessary questions into one convenient platform, requiring you to answer only industry-specific content that directly relates to your company’s activities.
With key data items standardized across all sectors, the AIES will offer a holistic annual picture of the U.S. economy. Your valuable data serve as a critical input in calculating the Gross Domestic Product (GDP), empowering policymakers and businesses, including your own, to make well-informed and precise decisions.
Please do not reply directly to this automated email.
Join us on March 13, 2024, at 11:00 a.m. Eastern Time for an informative webinar introducing the AIES, “Navigating the Annual Integrated Economic Survey (AIES): Integrating Economic Data Collection for Facilitating Response”. The webinar will walk through the survey's purpose and how it benefits your business, provide an overview of key dates to keep in mind, and highlight tools and resources that will simplify the survey response process. Limited seating is available, so register here to ensure your place for this exclusive event.
Thank you in advance for your time and participation, and for helping the U.S. Census Bureau measure America’s people and economy.
Sincerely,
Lisa E. Donaldson
Chief, Economy Wide Statistics Division
U.S. Census Bureau
Office of Management and Budget Number: 0607-1024
Expires: 06/30/2026
AIES Due Date Reminder Email
Subject: DUE DATE REMINDER: Annual Integrated Economic Survey
ANNUAL INTEGRATED ECONOMIC SURVEY
Due Date Reminder
Check our email address – it’s official if it ends in @census.gov
ATTN COMPANY
The U.S. Census Bureau recently requested your response to the 2023 Annual Integrated Economic Survey. This is a reminder that the due date is fast approaching.
_____________________________________________________________________________
Due Date: XXXXX XX, 20XX
Register or sign in at https://portal.census.gov
Add your authentication code: XXXX-XXXX-XXXX
Case-sensitive; if you copy and paste, ensure no blank spaces are captured.
_____________________________________________________________________________
What is the Annual Integrated Economic Survey?
The Annual Integrated Economic Survey (AIES) is a newly redesigned survey that combines multiple mandatory annual surveys into one comprehensive data collection on business revenues, expenses, and assets. The new format simplifies reporting by consolidating all necessary questions into one convenient platform, requiring you to answer only industry-specific content that directly relates to your company’s activities.
With key data items standardized across all sectors, the AIES will offer a holistic annual picture of the U.S. economy. Your valuable data serve as a critical input in calculating the Gross Domestic Product (GDP), empowering policymakers and businesses, including your own, to make well-informed and precise decisions.
Please do not reply directly to this automated email. For assistance with completing this survey or for a list of the surveys that are covered under this new collection, please visit https://census.gov/aies/information or call our customer helpline at 1-800-681-3012, Monday through Friday, 8:00 a.m. to 8:00 p.m. Eastern time.
Thank you in advance for your time and participation, and for helping the U.S. Census Bureau measure America’s people and economy.
Sincerely,
Lisa E. Donaldson
Chief, Economy Wide Statistics Division
U.S. Census Bureau
Office of Management and Budget (OMB) Number: 0607-1024
Expires: 06/30/2026
ID: XXXXXXXXXXX
2023 AIES Letters for Testing
AIES-L1: Initial Letter
A Message from the Director, U.S. Census Bureau:
Your firm has been selected to participate in the 2023 Annual Integrated Economic Survey. Online reporting is now open. Please use the following information to access your survey:
Register OR sign in at https://portal.census.gov
Enter your authentication code.
Report by clicking on “REPORT NOW.” You can return to your account over multiple sessions to complete the survey.
YOUR RESPONSE IS REQUIRED BY LAW and will be kept strictly CONFIDENTIAL. Information about the authority, confidentiality, and burden of this data collection can be found on the back of this letter.
For more information or assistance with completing this survey, please visit census.gov/aies/information or call our customer helpline at 1-800-681-3012, Monday through Friday, 8:00 a.m. to 8:00 p.m. Eastern time. For a preview of the questions in this survey, visit census.gov/aies/questionspreview/.
Thank you in advance for your time and participation, and for helping the U.S. Census Bureau measure America’s people and economy.
Sincerely,
Robert L. Santos
Director
[Page 1] of L1
What is the Annual Integrated Economic Survey?
The Annual Integrated Economic Survey (AIES) is a newly redesigned survey that combines multiple mandatory annual surveys into one comprehensive data collection. The new format simplifies reporting by consolidating all necessary questions into one convenient platform, requiring you to answer only industry-specific content that directly relates to your company’s activities.
With key data items standardized across all sectors, the AIES will offer a holistic annual picture of the U.S. economy. Your valuable data serve as a critical input in calculating the Gross Domestic Product (GDP), empowering policymakers and businesses, including your own, to make well-informed and precise decisions.
How do I know this is a legitimate government survey?
This survey has been approved by the Office of Management and Budget (OMB). The eight-digit OMB approval number is 0607-1024 and appears in the lower left corner of each reporting screen. Without this approval, we could not conduct this survey. We are conducting this survey under the authority of Title 13, United States Code, Sections 131 and 182.
How long will it take to complete the survey?
We estimate the Annual Integrated Economic Survey will take an average of 3.38 hours to complete. Factors such as company size, complexity, and activity will affect your actual time to complete the survey. This estimate includes the time to review instructions, search existing data sources, gather and maintain the data needed, and complete and review the survey.
Will my response be confidential?
Yes. The U.S. Census Bureau is required by law to keep your response confidential (Title 13, U.S. Code, Section 9). The Census Bureau is not permitted to publicly release your responses in a way that could identify your business, organization, or institution. Per the Federal Cybersecurity Enhancement Act of 2015, your data are protected from cybersecurity risks through screening of the systems that transmit data.
Am I required to fill out the survey?
Yes. Your business is required by law (Title 13, U.S. Code, Sections 224 and 225) to respond to this survey.
How will the Census Bureau use the information I provide?
By law, the Census Bureau can only use your business data to produce statistics.
[Page 2] of L1
AIES-L2: Past Due Notice Letter
PAST DUE NOTICE
Our records indicate that we have not received your 2023 Annual Integrated Economic Survey. This survey is mandatory and requires your immediate attention.
Authentication Code:
Due Date:
Register OR sign in at https://portal.census.gov
Add your authentication code OR locate this report under “My Surveys”
Report by clicking on “REPORT NOW.” You can return to your account over multiple sessions to complete the survey.
If you recently reported, you can verify your filing status by clicking on “Options” and then “Filing Status.” Look for an entry under the “Date Received” column.
YOUR RESPONSE IS REQUIRED BY LAW and will be kept strictly CONFIDENTIAL. The U.S. Census Bureau is authorized to collect this information under Title 13, United States Code (Sections 131 and 182). The same law requires that you respond (Sections 224 and 225) and assures the confidentiality of the information you provide (Section 9). The Census Bureau is not permitted to publicly release your responses in a way that could identify your business, organization, or institution, and may use your responses only to produce statistics.
For assistance with completing this survey or for a list of the surveys that are covered under this new collection, please visit census.gov/aies/information or call our customer helpline at 1-800-681-3012, Monday through Friday, 8:00 a.m. to 8:00 p.m. Eastern time. For a preview of the questions in this survey, visit census.gov/aies/questionspreview/.
Thank you in advance for your time and participation, and for helping the U.S. Census Bureau measure America’s people and economy.
Sincerely,
Lisa E. Donaldson
Chief, Economy Wide Statistics Division
U.S. Census Bureau
The Office of Management and Budget (OMB) approval number for the Annual Integrated Economic Survey is 0607-1024.
AIES-L4L: Office of General Counsel (OGC) “Light” Letter
NOTICE OF FAILURE TO PROVIDE MANDATORY RESPONSE
Our records indicate that we have not received your 2023 Annual Integrated Economic Survey response. If you have submitted your survey within the past few weeks, thank you for your participation. You may verify your filing status by logging into your account at https://portal.census.gov.
If you have not yet reported, we remind you that your response is REQUIRED BY LAW. Title 13, United States Code, Sections 131 and 182, authorizes this collection. Sections 224 and 225 require your response. Section 9 requires that we keep your answers STRICTLY CONFIDENTIAL.
Although Title 13 (Section 224) and the Sentencing Reform Act of 1984 (18 U.S.C. 3559 and 3571) allow for possible prosecution of responsible officials and penalties up to $5,000 (and still require response), the U.S. Census Bureau prefers to work cooperatively with businesses like yours to ensure we gather and distribute reliable statistics about the U.S. economy. We ask for your help in meeting that goal, recognizing that we are relying on your valuable time and effort to comply with this request.
Online reporting is easy, with a secure account setup necessary to begin the reporting process. For assistance with completing this survey or for a list of the surveys that are covered under this new collection, please visit census.gov/aies/information or call our customer helpline at 1-800-681-3012, Monday through Friday, 8:00 a.m. to 8:00 p.m. Eastern time. For a preview of the questions in this survey, visit census.gov/aies/questionspreview/.
Authentication Code:
Due Date:
Your timely response will save tax dollars by reducing the need for additional mailings and telephone calls. Please report within 10 days, following these steps:
Register OR sign in at https://portal.census.gov
Add your authentication code OR locate this report under “My Surveys”
Report by clicking on “REPORT NOW”
After reporting, we invite you to review the useful data and analysis tools available to aid and inform businesses such as yours by visiting the Census Bureau’s website at https://www.census.gov/aies/information.
Thank you in advance for your time and participation.
Sincerely,
Nick Orsini
Associate Director for Economic Programs
U.S. Census Bureau
The Office of Management and Budget (OMB) approval number for the Annual Integrated Economic Survey is 0607-1024.
AIES-ECSL1: Experimental “Dear CEO” Letter
PAST DUE NOTICE
Our records indicate that we have not received your company’s 2023 Annual Integrated Economic Survey. This survey is mandatory and requires your immediate attention.
This survey includes questions about your company and all the locations for your company. Most questions can be answered using your general ledger, but some are about payroll and human resources. Please designate a person at your company to be responsible for answering this survey and complete the survey as soon as possible. This designated person will use the information below to access the survey.
Authentication Code: XXXXXX
DRAFT – FOR RESEARCH ONLY
Register OR sign in at https://portal.census.gov
Add your authentication code OR locate this report under “My Surveys”
Report by clicking on “REPORT NOW.” You can return to your account over multiple sessions to complete the survey.
YOUR RESPONSE IS REQUIRED BY LAW and will be kept strictly CONFIDENTIAL. The U.S. Census Bureau is authorized to collect this information under Title 13, United States Code (Sections 131 and 182). The same law requires that you respond (Sections 224 and 225) and assures the confidentiality of the information you provide (Section 9). The Census Bureau is not permitted to publicly release your responses in a way that could identify your business, organization, or institution, and may use your responses only to produce statistics.
For assistance with completing this survey or for a list of the surveys that are covered under this new collection, please visit census.gov/aies/information or call our customer helpline at 1-800-681-3012, Monday through Friday, 8:00 a.m. to 8:00 p.m. Eastern time. For a preview of the questions in this survey, visit census.gov/aies/questionspreview/.
Thank you in advance for your time and participation, and for helping the U.S. Census Bureau measure America’s people and economy.
Sincerely,
Lisa E. Donaldson
Chief, Economy Wide Statistics Division
U.S. Census Bureau
The Office of Management and Budget (OMB) approval number for the Annual Integrated Economic Survey is 0607-1024.
Special Investigation: Debriefing Interviews with Firms Primarily or Exclusively Located in Puerto Rico
On Behalf of the Annual Integrated Economic Survey (AIES)
Melissa A. Cidade, EWD
Hillary Steinberg, ESMD
Michael Lopez Pelliccia, SPMO
Ana Jara Castro, SPMO
Clara Santiago Bello, SPMO
Marijulie Martinez Lozano, SPMO
Introduction
In addition to respondent and non-respondent debriefing interviews, we engaged in a special investigation into the record keeping practices and data accessibility of firms primarily or exclusively located in Puerto Rico and in-sample or eligible to be in-sample for the AIES. Researchers in the Strategic and Portfolio Management Office (SPMO) located at a Census Bureau office in Puerto Rico conducted these interviews primarily in Spanish, with support from researchers in the Economy-Wide Statistics Division (Office of the Division Chief) and the Economic Statistical Methods Division (Data Collection and Methodology Research Branch).
We moved forward with this investigation with two broader purposes in mind. On the one hand, there have been increasing calls for granular, timely, and accurate data on the Puerto Rican economy. While the AIES is not the only vehicle for collecting the pertinent data, we believe it could serve as a primary vehicle for the creation or expansion of economic estimates for that area. Currently, firms with locations in both the continental United States and Puerto Rico are asked to provide some information about their locations in Puerto Rico on the AIES. However, most questions are not in-scope for Puerto Rican locations, and companies must have locations in both the continental United States and Puerto Rico to be in-sample; that is, firms with locations exclusively in Puerto Rico are out of scope for the AIES.
At the same time, we undertook this investigation in response to the Census Bureau’s strategic goal to expand data equity, with a particular focus on Hard-to-Count (HTC) populations and Historically Undercounted Populations (HUPs). In a demographic sense, HUPs are those populations that are ‘missing’ from our data products, whether because of structural barriers to response, measurement issues in capturing response, or other impediments. For the AIES, an establishment survey, firms with locations in Puerto Rico represent a subset of economic activity that has had historically low response rates, if included in collection efforts at all. Likewise, socio-cultural differences in Puerto Rico – most notably, in the predominant language used – represent known impediments to survey response. As such, while we undertook this research ultimately to inform the consideration of including firms exclusively or primarily in Puerto Rico in future AIES administrations, we also look to contribute to the larger conversation on HUPs in establishment surveys like the AIES.
Methodology
First, EWD, ESMD, and SPMO collaborated to translate the pertinent sections of the respondent debriefing protocol into Spanish. This is a first investigation into the reporting processes of firms in Puerto Rico, so we decided to focus exclusively on sections of the protocol related to participant roles and responsibilities, response processes, and record keeping practices and data accessibility for establishments located in Puerto Rico. We were especially interested in any companies with locations in both the continental United States and Puerto Rico, to determine whether there are differences in reporting practices and how those differences might impact the response process for the AIES moving forward.
We then reviewed the AIES sample to identify those companies that had more than one location (establishment) with a reported physical address in Puerto Rico. We decided to cast a wider net and include those companies that were eligible for inclusion in the AIES but were not necessarily currently in-sample. We then narrowed the list to companies with 33 percent or more of their locations in Puerto Rico and/or a headquarters with a reported physical address in Puerto Rico. This list constituted our recruitment file for this project, totaling 16 companies.
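As a minimal sketch of this screening rule (the field names, types, and example values below are illustrative assumptions, not the Business Register schema):

```python
# Hypothetical sketch of the recruitment-file screening rule described above.
# Field names and example values are illustrative only.
from dataclasses import dataclass

@dataclass
class Company:
    company_id: str
    n_locations_total: int   # establishments with a reported physical address
    n_locations_pr: int      # establishments with a Puerto Rico address
    hq_in_pr: bool           # headquarters reports a Puerto Rico address
    aies_eligible: bool      # in-sample or eligible to be in-sample for AIES

def in_recruitment_file(c: Company) -> bool:
    """More than one PR location, AND (>= 33% of locations in PR OR HQ in PR)."""
    if not c.aies_eligible or c.n_locations_pr <= 1:
        return False
    return (c.n_locations_pr / c.n_locations_total) >= 0.33 or c.hq_in_pr

frame = [
    Company("A", 20, 2, False, True),  # 10% PR share, no PR HQ -> excluded
    Company("B", 6, 4, False, True),   # 67% PR share -> included
    Company("C", 3, 3, True, True),    # exclusively PR, PR HQ -> included
]
recruitment_file = [c for c in frame if in_recruitment_file(c)]
```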
Beginning in early summer 2024, we emailed the identified companies inviting them to participate in an optional interview (see Appendix F for recruitment materials for this effort). These initial emails were based on the recruitment materials used for the larger debriefing study, translated into Spanish. For companies where the initial email was undeliverable (‘bounced’), we reached out by phone using the phone number listed on the Business Register and requested an updated email address. In cases where the phone number was incorrect, we researched the company and called the general phone number; when we could reach someone at the firm, we requested a contact name and email address to send a recruitment invitation.
One week after the initial recruitment invitation, we sent a follow-up email to those who had not yet responded. Then, two weeks later, we sent a reminder email in English and Spanish. When a recipient responded to the recruitment invitation, we coordinated an interview and sent an invitation via email, including the informed consent form (see Appendix G for the informed consent form, translated into Spanish for this interviewing). About an hour before the interview, we sent a reminder email with the consent form again to ensure that it was signed prior to interviewing.
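Read as a timeline, this contact cadence might look like the following sketch (the start date is an illustrative assumption; only the offsets come from this report):

```python
# Hypothetical timeline of the recruitment contact cadence described above;
# the start date is assumed for illustration, the offsets come from the text.
from datetime import date, timedelta

initial = date(2024, 6, 3)  # assumed "early summer 2024" start
cadence = [
    (initial, "Initial recruitment email (Spanish)"),
    (initial + timedelta(weeks=1), "Follow-up email to nonrespondents"),
    (initial + timedelta(weeks=3), "Reminder email in English and Spanish"),
]
for send_date, step in cadence:
    print(send_date.isoformat(), "-", step)
# Once scheduled: interview invitation with consent form, then a consent
# reminder about an hour before the interview.
```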
We conducted all interviews virtually using Microsoft Teams and recorded all interviews. Each interview included two Census Bureau representatives, an interviewer and a note taker. Interviews lasted about one hour apiece. In total, we conducted 11 interviews across firms of various sizes, industries, and other characteristics. See the table below for an overview of interview participants. Note: One company had a headquarters in Puerto Rico, but all of its economic activity happened in other US territories. Because this research focuses on practices for locations outside the continental United States, we decided to include that interview in our analyses.
Number of Interview Participants by Business Presence in Puerto Rico, AIES Response Status, and Primary Role

Presence in Puerto Rico | Number
Located in Puerto Rico and the continental United States | 7
Exclusively located in Puerto Rico | 4

2023 AIES Response Status | Number
Did not complete the 2023 AIES/unknown | 8
Completed the 2023 AIES | 3

Primary Role | Number
Financial (including CFO, financial analyst, and others) | 6
Human Resources (including Director of HR, compensation and benefits manager, and others)/other | 4
Response Process
First, we asked participants a series of questions to better understand the response processes used at companies primarily or exclusively located within Puerto Rico. We note that participants reported using the same or similar response processes to those we hear about from other establishment surveys generally and the AIES specifically. For example, some participants report reviewing the survey and completing it themselves, with one saying simply that they are “responsible for completing the survey.” Others talked about reviewing the submitted data, with another mentioning that they “audit and revise data submitted” through the survey. This is typical behavior for responding to the AIES.
We then asked a follow-up question, “How do you find the data you need to answer Census Bureau surveys?” Again, the response process is very similar to other reported response behavior for the AIES. Some report using internal systems and accessing records at the company, with one saying that they “have the data in their system…everything is in the system,” while another mentioned that they “use their company accounting system and it is pretty easy.” Some pointed to standing documentation that they can use to complete the survey, including “financial statements and payroll reports” and “audited financial statements and accounting system.”
Still, others talked about needing to rely on others within the company in order to complete the survey. This data dispersion is not uncommon, especially in larger companies; said one participant, “I don't have all the answers at the moment. Suddenly I have to evaluate where I should go [to get the data]...This takes a bit since not all the people are available when I fill out the questionnaire and they don't have the answers at hand and I have to wait and there is a delay in looking for that information, basically this is the experience I had this year.” We asked specifically about data dispersion, and most participants indicated that it does take more than one person to complete economic surveys, with one pointing out that “it depends on the data [the survey] requests.”
Finding 1: Participants at companies primarily or exclusively located in Puerto Rico are using similar response strategies to those located in the continental United States.
Recommendation 1: Because the response processes are similar, we should make available all response supports as are available to all AIES sampled companies, including survey delegation, FAQs and other documentation, and within-instrument help text.
Burden
We asked participants, “How easy or difficult is it for you to find all the data you need for Census Bureau surveys?” Some mentioned that it is easy, citing “clear questions” and that the questions are well within their scope of work. A few said it was middle-of-the-road, with one saying that the “definitions and classifications of assets, equipment, and so on are not aligned” with the way the company keeps its records, but that they could enter the information.
Some participants reported that finding the data needed to complete Census Bureau surveys represents a heavy burden. Said one, it is “difficult, due to the amount of information they request and the number of people needed to extract the data.” Said another, it was “a little difficult” to answer the AIES and that they “spent a couple of days filling out the questionnaire. We stopped, started, [and then stopped again] to look for the information,” suggesting that completing the AIES took multiple log-ins to the survey.
Additionally, a few participants noted challenges in getting the data into the survey itself. One said it is “super easy” because they printed the survey out, completed it on paper, and then entered the data into the online form. But another mentioned that because this is their first encounter with the AIES, “it wasn’t that easy…[I] had to update all [establishments] that are open in Puerto Rico. And I think that part was more complicated because you could [update the information] in Excel or you could do it within the platform.” This is similar to feedback received in the research phase of the AIES, where we learned that some respondents – especially those at large firms – rely on Excel templates to complete their establishment listing updates at the start of the survey, while others prefer to use the browser-based platform.
Finding 2: The reported burden for companies exclusively or primarily located in Puerto Rico is dependent upon the size and complexity of the company as well as the types of data requested on the survey, similar to companies in the continental United States.
Recommendation 2: Continue exploring ways of organizing the survey into groups of questions based on topics that can be delegated to the appropriate person within the company for response.
Data Accessibility
The next section of the interview focused on the accessibility of data requested on the AIES or on the 2022 Economic Census. In this case, we were most interested in what records are available for locations in Puerto Rico, and how easy or difficult it is to access those records to support response.
Data Disaggregation by Unit
One unique aspect of the AIES is that it asks questions at three units within the business – the “company” level, representing the highest aggregation and all economic activity; the “establishment” level, representing the lowest aggregation and akin to individual locations; and the “industry” level, representing groups of establishments that make or do the same thing, based on their six-digit North American Industry Classification System (NAICS) code. For companies with locations in Puerto Rico, this unit model is further muddied: while some questions are asked at the location level, most questions asked at the industry level are intended to exclude data from locations in Puerto Rico. As such, we ask for revenue, expenditures, and payroll for locations in Puerto Rico, but we then ask for other information – like capital expenditures – at the industry level, excluding those locations in Puerto Rico. We wondered: how challenging was it for participants to disaggregate their locations in Puerto Rico from their industry reporting?
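To make the unit model concrete, here is a minimal sketch of how an industry-level item such as capital expenditures would roll up while excluding Puerto Rico locations (the establishment records and scope rule shown are illustrative assumptions, not the AIES instrument logic):

```python
# Hypothetical illustration of the three-unit model: establishments roll up
# to industry totals by 6-digit NAICS code, with Puerto Rico locations
# excluded from industry-level items like capital expenditures, even though
# selected items (revenue, expenditures, payroll) are still collected for
# those locations at the location level.
from collections import defaultdict

establishments = [
    # (6-digit NAICS, area, capital expenditures) -- illustrative values
    ("445110", "NC", 120_000),
    ("445110", "PR", 45_000),   # reported at the location level, excluded below
    ("722511", "PR", 30_000),
]

industry_capex: dict[str, int] = defaultdict(int)
for naics, area, capex in establishments:
    if area == "PR":
        continue  # industry-level items exclude Puerto Rico locations
    industry_capex[naics] += capex

company_total_activity = sum(capex for _, _, capex in establishments)
# industry_capex -> {"445110": 120000}; the company level spans all activity
```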
We asked participants, “How easy or difficult is it to report some but not all data for your locations in Puerto Rico?” Most participants noted that it is not challenging for them to exclude locations in Puerto Rico from other reported data. Remarked one, “it’s not a challenge” to engage in this reporting. Another mentioned that it is “simple” because the company has “separate accounting by location.” We did note that participants mentioned that reporting for other US territories might be more challenging, with one mentioning that reporting for “those [locations in] Puerto Rico are easier than those from the Dominican Republic.”
We probed a little further, asking, “Are your records for Puerto Rican locations the same or different than those in the 50 states?” and “Do you use the same steps to pull data for Puerto Rican locations as you do for those in the 50 states?” Again, we note that the challenge is not in locating and reporting data for locations in Puerto Rico but rather in reporting for locations in other US territories, like Guam and the Virgin Islands. Said one, “the steps are the same” to pull the data, but the data are “different, compared to the Dominican Republic.” Another mentioned that operations in Guam are “independent and have separate reporting,” while a third said that the data and the steps to access them are “different from the Virgin Islands.” Here we emphasize that while records seem to be similar for locations in Puerto Rico, this finding does not appear to apply to the record keeping practices of locations in other US territories and outlying areas, or in other sovereign states.
Finding 3: Records for locations in Puerto Rico are as accessible as those for locations in the continental United States.
Recommendation 3: Because records seem to be as accessible for locations in Puerto Rico as locations in the continental United States, consider expanding collection to include firms located exclusively in Puerto Rico in the AIES.
Finding 4: Records for other territories and outlying areas may be less accessible than for locations in the continental United States.
Recommendation 4: Consider additional investigations into the records accessibility of locations in other territories and outlying areas, including Guam and the Virgin Islands, as well as other sovereign states, like the Dominican Republic.
Linguistic and Cultural Considerations
The AIES is an English-only collection – the survey and all supporting documentation are exclusively in English. We asked participants if this posed an issue, and none mentioned a language barrier to completion. Most simply responded “no” or that there is “no issue.” One participant mused that the lack of language barriers may be because “Puerto Rico’s system is similar to the USA,” suggesting again that the record keeping, maintenance, and retrieval systems for companies in Puerto Rico and in the continental United States are congruous. Said another, “if you want to work in this business, you have to know English. We are in an American territory…I work with a lot of American staff from the United States…and the only thing they speak is English, they don’t speak Spanish.”
One mentioned that the necessarily technical language on the survey can slow down the response process, noting that “in Puerto Rico, we have a pretty good command of the language” but that “the level of complexity is a bit [higher] because [English] is not our first language.” This participant went on to mention that they “may suddenly get stuck on some words because of how the question is designed.” To deal with this, they “look for the translator to see what the question means, to see if I am understanding it correctly.” In instances where they still do not understand the question intent, they “go to the company controller [and] ask him/her to help me with this question that I am not sure what it refers to, and we come to a conclusion based on what we understand.” In this case, the participant is outlining their response process for complex questions – first, they try to understand the question themselves; then, they turn to translation tools; and if neither of these strategies is successful, they rely on additional expertise within their company.
We do note that one participant pointed to a possible language barrier and fear of reprisal for misreporting as a barrier to response, as demonstrated by this extended quote:
The information [requested in the survey] is not exact, but it is an approximation. In that sense…we are a public company and we are not going to submit information that is not correct… we are going to submit the data that we have, we will submit the totally correct information. [T]here is a little fear of what we are going to submit. When it says Census, I don't know, suddenly it is a little fear, the information that we are submitting is from our company and we want to submit all the true and real information and not make mistakes. And… because of the language, it can lend itself to this type of error that is not conscious, all the information that is submitted is with the full intention of reporting all the information of the company, but because of the language, information can be submitted that is not being asked for.
Finding 5: There are no significant reported linguistic or cultural barriers to response.
Recommendation 5: Based on this interviewing, we do not recommend additional materials development in Spanish. However, one participant did mention the advantage of having accessible Census Bureau staff in Puerto Rico in supporting response, so we recommend considering further involvement of these staff in collection efforts where appropriate.
Respondent Communications
We then asked participants about the survey communications that we send out on behalf of the AIES. We note that participants mentioned similar communication processes and issues in this interviewing as in other establishment survey interviewing, including preferences in mode of communication, the ways that physical mail travels through the office, and other considerations. Of note, most participants mentioned staff turnover and availability as key issues to receiving survey communications. Several cited specific instances of the letter or email being sent to a contact no longer with the company, with one saying that the “previous person who was the contact person [for the survey] is no longer with [the company]…and [I] found the information about our [survey] communications by searching for the former employee's email.” Several times during the interviews participants requested to update the contact information on the fly for the survey, noting that the listed contact person had left the company.
Of note, a few participants mentioned delays in receiving communications in a timely manner outside of staff turnover. We provide the following extensive quote, illustrating mail delivery issues impacting the survey response process:
If [the mail] arrives at all, it can take a week, two weeks. Right now in Puerto Rico, 15% of the mail in Puerto Rico is getting lost. The mail has a serious problem here, and it has been for 2 or 3 years. For example, I lose payments that I send to suppliers by mail. I lose around 5 to 6 checks a week, which don't reach the other person. And the mail, maybe it's in the mail, [the postal service] leaves it lying around in the mail and don't return it, or sometimes they return it…a year later.
It is not always just the mail delivery that is the issue, though. We heard from one participant, who noted that the delay in receipt is because of a delegation issue, saying that “my supervisor can take between one to two weeks to send me the information…letters take longer than emails, [but even emails] can take one to two weeks for it to reach me when my supervisor sends it to me.”
Finding 6: Participants representing firms in Puerto Rico have many of the same barriers to receipt of communications materials as those at firms in the continental United States.
Recommendation 6: Because of the similarities in barriers to receipt, we recommend that if the AIES is to include firms exclusively in Puerto Rico, the same or similar respondent locating activities and communications strategies – including contact switching and multimodal contact – be used for these firms as are used in the wider sample. Further, because of possible mail delivery issues, we recommend consideration of a longer field period or an earlier contact escalation process for firms exclusively in Puerto Rico. These considerations could be incorporated into AIES adaptive design strategies for future survey communications.
Barriers to Response
Finally, we asked those participants who had not responded to the AIES as of the interview date to tell us about the barriers to survey reporting they may be facing. In general, participants pointed to three main barriers to completion: the burden of the survey; the survey not matching the company structure; and the need for a survey preview.
Response Burden and Processes
First, some participants mentioned that they had accessed the AIES (used the authentication code and opened the survey) but had not yet completed or submitted their data. For some, this is due to the length of the survey, with one saying, “the time that must be dedicated to the survey is [a] challenge,” and another mentioning that “it takes time and effort [to complete], it's not a simple survey,” and that, in fact, “it is more tedious, long and complicated” compared to other surveys.
A few mentioned the intersection of the volume of questions with the size and complexity of their firm, with one saying, “The problem is that it takes a while to answer the questionnaire because it is a very large company since we have [more than 30] establishments, and it takes them a while to answer all the information they ask for.” Another echoed this sentiment, saying “the AIES is complex and that…especially in large organizations like [this one], I would need to communicate with various sources in the organization” to compile the requested data. One mentioned the issue of data dispersion discussed earlier, that they simply do not have access to the records themselves and must rely on others within the company which adds time to completion. Stated the participant, “I had to investigate, inquire and look for information with several people,” and especially, “when they ask how many employees I have on a date, then I have to go to HR because sometimes they ask for specific months because there is no report created for that.”
We do note that two of the 11 participants specifically mentioned issues with delegating the survey appropriately. One participant called the help line, where “they instructed [me] to forward the electronic message to the relevant person… [I] think there must be an easier way to make changes” in the survey contact. Said another, “[I have] not been able to see the AIES survey and have had access problems since I [do] not receive it directly…I have to wait for authorization from the comptroller, and [by the time] the account [is unblocked, too much] time has passed and I cannot enter the survey.”
Units of Collection
A few participants noted that the barrier to response is in a mismatch between the units on the survey and the way the company keeps its records. This mismatch is echoed on other establishment surveys but is particularly acute on the AIES because of the integrated three-unit model (company, establishment, and industry). Said one, “instead of me filling out [more than 15] stores, if it is the same company and the same [tax ID why can’t I] fill out a consolidated [survey] for the company? In my case I only have one [tax ID] with [more than 15] stores, it is difficult to fill out.” Said another, “The survey should be consolidated by its locations.”
Survey Preview
As in other interviewing, participants again rallied for a survey preview to support response. Said one, “Before [AIES]… we gathered all the information and sat down to fill it out online. Now we can't do that, now we feel trapped. And now, with all due respect, we feel like we're wasting our time.” This participant went on to mention that after completing the survey, “I couldn't print the questions I answered, I could only print the confirmation,” and this bothered them. Additionally, another mentioned that “We would like to know in advance what questions we are going to [to be] asked, if we can print out what you need us to fill out or at least look at it. If we can click forward, maybe not print it, but we can go to the other pages and look for all the information, we [can] gather all the information and then we sit down and we just do it one by one and that's it.”
Finding 7: Participants from companies primarily or exclusively in Puerto Rico experience the same barriers to response as those from companies in the continental United States, including high actual and perceived burden and issues interacting with the survey.
Recommendation 7: We strongly recommend that AIES leadership continue to engage in a robust program of respondent-centered research to continue to refine the survey as it matures. This could include additional qualitative research, analysis of paradata and item performance, mixed method research, and survey experiments. These research activities should include companies primarily or exclusively located in Puerto Rico if the AIES universe is to extend to these companies.
Respondent Debriefing Initial Recruitment Email:
SUBJECT: Action Requested: Schedule a Meeting with the U.S. Census Bureau
Check our email address: it is official if it is from census.gov
Greetings,
I hope this message finds you well! I am a survey researcher at the U.S. Census Bureau. You are listed as a contact for your company, and we are interested in your feedback on Census Bureau surveys. I would like to meet with you via Microsoft Teams or by phone for about 45 minutes.
Your feedback will be kept confidential and used only to improve our surveys. You do not need to do anything to prepare for this meeting.
What are some dates and times that might work best for you?
If you have any questions or concerns, please do not hesitate to contact me by email or phone. Your participation in this research is voluntary and invaluable.
Thank you in advance for your consideration!
[NAME], [TITLE]
[SIGNATURE BLOCK]
Respondent Debriefing Reminder Recruitment Email (sent 1 week after the initial email):
Subject: Reminder: Schedule a Brief Virtual Meeting with the U.S. Census Bureau
Check our email address: it is official if it is from census.gov
Greetings, [Name]!
I am [NAME] from the Census Bureau. I emailed you last week to schedule a meeting to get your feedback on our surveys and have not yet heard back from you.
I am part of a team that talks with businesses like yours to improve our surveys. Do you have half an hour this week to tell us about your experience?
Your participation is voluntary and invaluable!
Thank you!
[NAME], [TITLE]
[SIGNATURE BLOCK]
Outlook Meeting Invitation for the Respondent Debriefing (sent once scheduled):
Title: Meeting with the U.S. Census Bureau
Greetings [Name],
Thank you very much for agreeing to speak with us. Please let me know if you would prefer that I call you directly instead of meeting via Teams and, if so, the best number to reach you.
Before we meet, please complete the interview consent form, located at: <<URL>>
I have invited several colleagues to listen in on our conversation. If you are not comfortable with that, please let me know.
Thank you!
[NAME], [TITLE]
[SIGNATURE BLOCK]
Respondent Debriefing Reminder Email (sent the day of the interview):
Subject: Upcoming Meeting with the U.S. Census Bureau
Greetings, [Name]!
This email contains information for your upcoming Teams meeting with the Census Bureau, scheduled for: Today [at 10 a.m. Eastern Time].
Please be sure to join me at the Teams link in your invitation.
You do not need to prepare for our interview. If you have not already done so, please complete the consent form here: <<URL>>
Thank you!
[NAME], [TITLE]
[SIGNATURE BLOCK]
Consent Form for 2023 AIES Participants: Puerto Rican Businesses Special Research
The U.S. Census Bureau routinely conducts research on how we and our partners collect information to produce the best statistics possible. You have volunteered to participate in a study of data collection procedures. We plan to use your feedback to improve the design and layout of the form for future data collections. Only staff involved in this product design research will have access to the responses you provide. This collection has been approved by the Office of Management and Budget (OMB). The eight-digit OMB approval number, 0607-0725, confirms this approval and expires on 12/31/2025. This valid approval number legally certifies this information collection.
To ensure a complete record of your feedback, your interview will be recorded (e.g., audio and screen). We plan to use your feedback to improve the design and layout of survey forms for future data collections. Only staff involved in this product design research will have access to the recording.
We estimate that completing this interview will take an average of 45 minutes, including the time for reviewing instructions, searching existing data sources (if necessary), gathering and maintaining the data needed (if necessary), and completing and reviewing the information collection. You may send comments about this estimate or any other aspect of this survey, including suggestions for reducing the time it takes to complete this survey, to [email protected]
Privacy Notice
The Census Bureau is required by law to protect your information. The Census Bureau is not permitted to release your responses in a way that could identify you. We are conducting this voluntary survey under the authority of Title 13 of the U.S. Code, Sections 141 and 193. Federal law keeps your responses confidential (13 U.S.C. §9). Your voluntary responses will be used to improve the usability of the Census Bureau website. Your privacy is protected by the Privacy Act (5 U.S.C. §552a). The information you provide may be shared with other Census Bureau staff for the work-related purposes identified in this statement. Participating in this interview implies consent to the purposes identified. Your information is protected by the Census Bureau's strict compliance with the provisions of the Privacy Act.
I have volunteered to participate in this Census Bureau debriefing interview.
Enter your information below:
First name: __________________________________________________
Last name(s): __________________________________________________
1 For more information, please visit https://www.census.gov/programs-surveys/aies.html and https://www.census.gov/programs-surveys/aies/technical-documentation/methodology.html.
2 The two participants from the concurrent e-Commerce Exploratory Interviews data collection were excluded from the tabulation of the recruitment outcomes for the Debriefing Interviews.
3 Hughes, K. A. (2004). Comparing pretesting methods: Cognitive interviews, respondent debriefing, and behavior coding (Survey Methodology #2004-02). Statistical Research Division, U.S. Census Bureau.
4 This finding is based on analysis of the 26 interviews from Round 1.
5 This finding is based on analysis of the 26 interviews from Round 1.
6 Note that this participant was assigned to protocol version B but found the interactive content tool when exploring the AIES website and provided feedback on it.
7 To quantify the aggregated estimate provided by the participants, we first reviewed the participants’ open-ended responses to the two questions in the module: “How long do you think it took you to complete the whole survey, including any time to retrieve the data you needed” and “How much time were you actively responding to the survey (that is, reaching out to colleagues, reviewing data, and entering data)?” We then coded the responses by the following rules: (1) use the estimates “as is” when participants provided a specific number of hours; (2) use the average/midpoint when participants provided a range of hours; and (3) convert a (work)day to an 8-hour increment.
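For illustration only, the following minimal Python sketch (not part of the original analysis; the function name, regular expression, and examples are our own) shows how these three coding rules could be applied to a free-text time estimate:

    import re

    WORKDAY_HOURS = 8  # Rule 3: one (work)day counts as an 8-hour increment.

    def code_hours(response: str) -> float | None:
        """Code a free-text time estimate as hours per the three rules."""
        text = response.lower()
        numbers = [float(n) for n in re.findall(r"\d+(?:\.\d+)?", text)]
        if not numbers:
            return None  # No usable numeric estimate in the response.
        unit = WORKDAY_HOURS if "day" in text else 1
        if len(numbers) == 1:
            return numbers[0] * unit  # Rule 1: use the estimate "as is".
        # Rule 2: use the midpoint when the participant gave a range.
        return (min(numbers) + max(numbers)) / 2 * unit

    # Examples: code_hours("about 6 hours") -> 6.0
    #           code_hours("2 to 3 workdays") -> 20.0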
8 This summary report is meant to be incorporated into the larger AIES Participant Debriefing Findings and Recommendations Report, due to be delivered in September 2024. It is not intended to stand alone.
9 While these two phrases are conceptually different (see: https://www.census.gov/newsroom/blogs/random-samplings/2023/10/understanding-undercounted-populations.html), herein we use HUP as it is the more popular nomenclature of the Economic Directorate.
10 Note: some companies are split into more than one part (so-called “split-parts”). For this interviewing, we selected the split-part located within Puerto Rico for inclusion in recruitment efforts here.
11 We sent the final reminder recruitment email in both English and Spanish because there was a concern that communications from the Census Bureau’s Economic Directorate are exclusively in English and recipients might question the legitimacy of the invitation coming in Spanish.
12 Please note: interviews were conducted in Spanish, and participant quotes were translated to English for incorporation into this report. Likewise, throughout this report, we use the non-gendered "they" to refer to all participants, regardless of their preferred pronouns.
13 We note that the Dominican Republic is a sovereign state and not a territory of the United States, and include this quote as illustrative of record keeping practices for locations outside of the continental United States.