2022 AIES Findings and Recommendations Slide Deck

Attachment K - 2022 AIES Findings and Recommendations Slide Deck.pdf

Generic Clearance for Questionnaire Pretesting Research

OMB: 0607-0725
Annual Integrated Economic Survey Pilot: Phase III
Preliminary Findings and Recommendations
Melissa A. Cidade, EWD
Heidi St.Onge, ADEP
December 20, 2023

The Census Bureau has reviewed this data product to ensure appropriate access, use, and disclosure avoidance protection of the confidential source data
(Project No. P-7529180, Disclosure Review Board (DRB) approval number: CBDRB-FY24-ESMD010-004).

Good morning! I’m Melissa Cidade, a survey methodologist in EWD.
While I’m doing the talking, today’s presentation is a collaborative effort between
myself and my friend and colleague Heidi St.Onge, thank you for your work on all of
this, Heidi!
And, while I’m giving shout outs, let me pause here and recognize that today’s
presentation is representative of efforts from across the Economic Directorate –
thank you so much to the respondent research team and the emerging methods
team in ESMD, to the contact strategy and data handling teams in EMD, and to the
production and account management teams in EWD for your tremendous work over
the last six months or so – we simply would not be here today without your
meticulosity, creativity, and dedication.
Today we are talking about the preliminary findings and recommendations from
Phase III of the AIES Pilot, affectionately known as the Dress Rehearsal. There’s a lot
to cover, so please hold your questions until the end of my prepared remarks. I’ll talk
for about 40 minutes, and then we will have about 20 minutes for questions,
clarifications, and discussion. If you think your team needs a rehash of this
presentation, please reach out to Heidi and me and we can schedule additional
conversations about these findings and recommendations. Note that we are
scheduled to present a cleared version of this presentation at EAMS on Thursday,
January 25 at 2:30 pm.
Let’s jump right in…


AIES Pilot Overview
Phase I: “The 78” Pilot 2022
• Goal: Understand response processes and further refine the instrument
• Qualtrics instrument, 78 companies
Phase II: Response Spreadsheet Pilot 2023
• Goal: Induce independent response
• Response spreadsheet, about 900 companies
Phase III: 2022 AIES (Dress Rehearsal)
• Goal: Troubleshooting and infrastructure building
• New Centurion instrument, about 8,000 companies

CBDRB-FY24-ESMD010-004

2

Today we are talking about the findings and recommendations for Phase III of the
AIES Pilot. You’ll remember that Phase I was last year, with 78 total companies,
and was run through Qualtrics. Phase II was earlier this year, was about 900 companies,
and used the spreadsheets. If you want to refresh your memory on these earlier
rounds of research, check out the recordings from the January and August EAMS
presentations we gave earlier this year.
Today, we are looking at Phase III – the largest and last of the research collections in
support of the March 2024 launch.


Research Modalities
• Survey Response Data and Paradata (N = 4,860)
• Response Analysis Survey (RAS) (N = 465)
• Respondent Debriefing Interviews (N = 44)
• Respondent Usability Interviews (N = 28)
• Inbound Call Log (N = 924)
CBDRB-FY24-ESMD010-004

3

As in previous rounds, we are drawing on a number of research modalities for today’s
presentation. First, we conducted a preliminary review of the response data to the
survey and the corresponding paradata. We conducted a Response Analysis Survey of
respondents, a short survey sent within two weeks of submitting the DR instrument.
We conducted 44 debriefing interviews with respondents to get their feedback on the
survey. We also conducted 28 interviews dedicated to usability to better understand
how respondents interacted with the instrument. And, we did a first look at the
inbound call log in support of this survey for additional feedback from the field,
especially issues respondents were facing. Again, thank you to each of the teams
that supported these collections – brought together, these activities provide a
window into instrument performance for the Dress Rehearsal.
If you would like more information about any of these modalities, we can have
separate briefings from the teams that conducted each of them. There is more
information contained in their reporting than I can possibly include in this
presentation, and I encourage this group to explore the nuances of each of these
methodologies for additional insight into instrument performance.


Phase II Recommendations by Goal
(The original table marked each recommendation for Usability Testing, the Dress Rehearsal, and/or Future Research.)

Get feedback on the new flow
• Test new survey flow in AIES Dress Rehearsal collection.

Test key elements of the spreadsheet design
• Include functionality to clean up establishment lists.
• Consider functionality to orient respondents within the spreadsheet.
• Continue to explore ways of communicating optionality at the unit level.

Gain additional information about response burden
• Explore other ways of collecting non-numeric responses.
• Prime respondents for the change.
• Consider cutting content.

Develop respondent communications
• Continue to develop response support materials.
• Consider response support training.
• Update letters to retain resonant messaging and drop discouraging messaging.
• Consider additional research into communications materials.

CBDRB-FY24-ESMD010-004

4

With that much information coming in, we could talk about results for hours. To keep
us focused, let’s remember that after Phase II, we had some research questions that
needed further investigation. Some of these we could test in the Dress Rehearsal –
this is what we’ll cover today. On screen now is a table that ended our last
presentation at the end of Phase II, with the goals for investigation this round.
Throughout this presentation, we will talk about these four goals – note that we have
color-coded this presentation to orient you through these goals as we talk through
them.
(click) First, we wanted to test out the three-step flow of the survey. (click) Then, we
had some lingering questions about the respond-by-spreadsheet method we are
using for the granular data on this survey. (click) Next, we wanted to learn more
about respondent burden. (click) Finally, we wanted to test out additional
respondent communications materials.


Get feedback on the new flow
Three-step structure
Linear design

CBDRB-FY24-ESMD010-004

5

First up, let’s talk about Goal 1: to get feedback on the new flow. This includes the
layout – three discrete steps to complete – and the linear design, each step coming
sequentially after the other.


CBDRB-FY24-ESMD010-004

Goal 1: Get feedback on the new flow 6

Remember that the Dress Rehearsal instrument was the first time we implemented
the three-step response layout that came as a recommendation from Phase I.
Respondents first saw this screen at the start of the survey, outlining that there were
three steps to completion…


Three-step Design
Overall, which of the following AIES questionnaire sections was most challenging to complete? (N = 444)
• Step 1: Review your locations: 11.7%
• Step 2: Answer company-level questions: 21.2%
• Step 3: Answer more granular questions: 67.1%

CBDRB-FY24-ESMD010-004

Goal 1: Get feedback on the new flow 7

We asked respondents on the Response Analysis Survey to identify the most
challenging section of the survey, and about two in three said it was Step 3 – answer
more granular questions. This is the step where respondents have the option of
answering by online spreadsheet or by downloading an Excel file, filling it out, and
then submitting it through the survey.


Finding 1: The three-step design needs
additional support.
• “It would be nice to have all parts of the survey available at once, not having to
wait until a step was completed to move on to the next step.” – RAS
• “I didn’t know how many steps there were. I crossed my fingers, is this the end of
it? You didn’t know what they were going to ask on the next step. There was no
way to get a full view of what are all the questions going to be, what are all the
steps? It was all unknown until you tried to do it.” – Debriefing Interview
• “[The survey was] disjointed, not user friendly. The first two [sections] were
typical, third section was different and I really didn’t care for it, I downloaded the
template and then upload when done. I wasn’t very familiar with it, didn’t like
that the whole survey wasn’t in one format.” – Debriefing Interview
CBDRB-FY24-ESMD010-004

Goal 1: Get feedback on the new flow 8

In fact, we find that some respondents mentioned that the three-step flow of the
survey is challenging – but not necessarily because of the structure of the survey. The
first quote from the Response Analysis Survey, where the respondent mentions “not
having to wait until a step was completed to move on to the next step,” is more about the
inability to move back and forth through the survey than the actual structure of the
survey, something we’ll talk more about in a minute. The second quote from a
debriefing interview highlights another finding we’ll talk about in a minute, the need
for a survey preview, where the respondent mentions “there was no way to get a full
view of what are all the questions going to be, what are all the steps.” And, in the
third quote, the respondent is having a negative visceral reaction to the new format,
calling it “disjointed” and “not user friendly,” and saying that they “didn’t like that the
whole survey wasn’t in one format.”
This brings us to the first finding for the Dress Rehearsal: the three-step design needs
additional support and features. That is to say that while there is evidence that some
respondents don’t like the survey structure broken into the three steps, most of the
negative feedback about the structure had more to do with the need for instrument
flexibility and survey previews.


Linear design – Survey Break Points
• Step 1: All of the buttons are blue and say “resume”
• Step 2: Remarks screen
• Step 3: Estimated time to complete screen

CBDRB-FY24-ESMD010-004

Goal 1: Get feedback on the new flow 9

Let’s turn our attention, then, to this question of the linear design of the survey. The
Dress Rehearsal instrument was designed to be answered unidirectionally –
respondents answer step 1, 2, and 3 in turn and do not return to earlier steps in the
survey once completed. We designed the survey in a linear fashion so that changes
made by respondents to establishment listings in Step 1 could be generated on-the-fly in the spreadsheet in Step 3.
Each of these delineations is a survey break point – a spot after which the respondent
cannot return to the previous screens. In Step 1, this happens after the respondent
has updated information about their establishments and answered additional follow-up questions where appropriate. In Step 2, after the substantive questions, we
provide a space for remarks about the Step 2 response, and once a respondent clicks
save and continue after that remarks capture, they cannot go backwards. And, in
Step 3, after the completion of the spreadsheet – either online or downloaded and
uploaded back to the system – the respondent is asked a few questions about who
completed the survey and how long it took to complete; that screen represents the
final break point for the survey after which respondents cannot alter any of their
previous data.


Finding 2: Linear design suppressed
response.
The Flipper

“I put in 1's to go through it to see what questions there were and at the end it only allowed
me to submit so I couldn't change my answers to the actual instead of the $1
placeholder.” – RAS

The Quality Controller
“Detailed uploads were not matching totals entered in Part 1 that were uneditable.” – RAS

The Reviewer
“[I] would like the ability to be able to review questionnaire responses before submitting.
There should be an option at the end to preview survey responses so someone other
than the preparer could review for accuracy before submission. The inability to go back
and edit after answering a question is risky.” – Debriefing Interview
CBDRB-FY24-ESMD010-004

Goal 1: Get feedback on the new flow 10

What we find in the data is that the linear design is not conducive to response. We
see three major typologies of respondents that illustrate why this layout was so
problematic: the flipper, the quality controller, and the reviewer.
The Flipper is a respondent who wants to look through the survey to see what
questions are coming up, what data to collect, or out of curiosity to understand what
to expect. These respondents – like the quote on screen – often put dummy data
into the survey to be able to advance, not realizing that they could not come back
later to change their responses to the actual values. This again speaks to the need for
more robust survey preview so that respondents do not have to move through the
actual survey to anticipate which questions they will be answering. Next, we have the
Quality Controller – these respondents want the ability to check their company totals
against the establishment- or KAU-level reporting. The quote for the Quality Controller
highlights this: the response in Step 3 doesn’t match the response in Step 1 because
it was “uneditable” at that point. Finally, we have The Reviewer – these respondents
either want one last review of the entirety of their response, or they are
mandated by their company to have someone else review and approve their
submission prior to release. The quote here is representative of the second type,
with the respondent noting a desire to “preview survey responses so someone other
than the preparer could review for accuracy before submission,” calling the lack of
this functionality “risky.”
And, we have our second finding for the Dress Rehearsal: the linear design
suppressed response. It is impeding accurate survey reporting for respondents.


Recommendations: Get feedback on the new
structure
Phase III Goal: Three-step design
Finding: The three-step design needs additional support.
Recommendation: Include flexible navigation and a more robust means of survey previewing.

Phase III Goal: Linear design
Finding: Linear design suppressed response.
Recommendation: Explore the ability to move more freely throughout the survey.

CBDRB-FY24-ESMD010-004

Goal 1: Get feedback on the new flow 11

Ok, let’s pause here and take stock – we found that the three-step design is not
impeding response, but needs additional supports like flexible navigation and survey
previews. At the same time, we learned that the linear design of the instrument is
impeding response and needs reconsideration. This includes the recommendation to
allow for more free movement throughout the survey.


Test key elements of the
spreadsheet design
Include functionality to clean up establishment lists
Consider functionality to orient respondents within the spreadsheet
Continue to explore ways of communicating optionality at the unit level
CBDRB-FY24-ESMD010-004

12

Let’s keep going. Next up, we wanted to test key elements of the spreadsheet design
in the first Centurion-rendered version of the instrument. Let’s take a look at each in
turn…


Include functionality to clean up establishment lists
First screen has the list of establishments….
On the next screen, applicable establishments get follow-up questions

CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 13

In Phases I and II, we heard from respondents that the pre-listing of establishments
based on previous response could be challenging. Lists were often out of date,
missing locations or including nonoperational or other out of scope locations.
(click) For Phase III, we included in Step 1 the functionality to review the prelisted
establishments and clean up the lists – a housekeeping exercise, so to speak.
Respondents were presented with their list in an online spreadsheet, and could not
move forward from this list until they updated the operational status of each location.
(click) Then, based on their responses to the list, they were taken to a follow-up
screen that included tailored questions for applicable locations to gather additional
information about the establishment.


Finding 3: Larger companies struggled with Step 1.

Screen Purpose:
“I’m going to type all this stuff in? Are we waiting for something here? It’s not clear which question to fill out to verify. I’m still confused.” – Usability Interview

Horizontal Scrolling:
“I don’t know what I’m supposed to verify to move forward.” – Usability Interview

Long Lists:
“In the early stages of the survey (identifying and confirming locations) there’s no download/upload option. Absolutely ridiculous to scroll around a tiny window and make sure things are correct. We have 800 locations. Just silly. Especially when you offer the option later in the survey. Really stuck on this section for days before even getting the actual survey.” – RAS

Percentage of triggered warning/error messages by type of message (N = 68,693)*
• Operational status warning: 45.3%
• Other warnings: 54.7%
*Total number of warning/error messages triggered. Respondents could trigger the same error more than once.

CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 14

Testing identified three major problems with the functionality for the establishment
listing clean up. First, usability testing demonstrated that respondents were not clear
on what action to take upon reaching the first screen for Step 1. In fact, every user
involved in the usability testing assumed that they could verify the locations by
reading the information and clicking save and continue to go to the next step. Every
single participant received the error indicating that they must input a value to record
each establishment’s operational status. The quote on screen demonstrates this
confusion, as the respondent asks “are we waiting for something here? It’s not clear
which question to fill out to verify.”
Relatedly, respondents struggled with the horizontal scrolling on this screen because
questions were over to the right of the visual field. Users did not scroll right to see
the rest of the spreadsheet in Step 1, and so did not know they had to answer
questions for each establishment. Many users remained entirely unaware of the survey questions until they
received an error after attempting to move forward, as demonstrated by the second
quote where the respondent says “I don’t know what I’m supposed to verify to move
forward.”

Finally, we found that this step was especially challenging for our largest responding
businesses. The third quote on screen illustrates the issue of long lists of
establishments, where the respondent notes that it is “absolutely ridiculous to scroll
around a tiny window and make sure things are correct” as their business has 800
locations. They called this layout “just silly” and noted that they were “really stuck on
this section for days before even getting the actual survey.”
We looked at the paradata and note that almost half of all of the error messages
triggered in the survey were at Step 1, reinforcing that respondents did
not initially know how to move forward in the survey.
So, the third finding here is that companies are struggling with Step 1 – especially
our largest companies. We’ll need to consider other ways
to approach cleaning up establishment listings.


Consider functionality to orient respondents
within the spreadsheet
Online Spreadsheet

Download Spreadsheet

CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 15

Next up, we learned in Phase II that respondents were getting lost in the
spreadsheets – the AIES has a high volume of questions, and especially for very large
companies, their response spreadsheets can get unwieldy.
While we could integrate some functionality to help respondents navigate the
spreadsheets – like the ability to filter columns – we could not integrate other means
of orienting respondents in this round of the instrument.


Clip of a respondent during a usability interview
CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 16

And as a result, the spreadsheet – online and download/upload – was challenging for
respondents to navigate. Throughout each of the different research modalities, we
see convergence around this major finding: respondents are getting lost in the
spreadsheet, leading to confusion and frustration and impeding response.
Let’s watch this first respondent during a usability interview navigate the
spreadsheet. (Play video) Notice how the issue revolves around staying oriented in
the spreadsheet while finding the question, the correct unit, and the response
capture.


Finding 4: Respondents get lost in the
spreadsheet.
“The set up of the questions ran horizontally and I had to keep
scrolling up to see other parts of the question. The design was
extremely poor.” – RAS
“You had to scroll horizontal instead of vertical, and as you scrolled
right, I had to keep going back to see what line I am on, am I entering
in the right information? A freeze pane view would be good, or vertical
presentation so you can scroll that way.” – Debriefing Interview

CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 17

The first quote on screen here is from the Response Analysis Survey; the respondent
described the issue, noting that “the set up of the questions ran horizontally and I had to
keep scrolling up to see other parts of the question.” Similarly, in the second quote,
the respondent asks “am I entering in the right information?” This is an issue we will
need to address moving forward.


Continue to explore ways of communicating
optionality at the unit level
Phase II Spreadsheet

Phase III Spreadsheet

CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 18

Next up, we have the continuation of the unit problem in the AIES. AIES collects data
at the company level, the establishment or location level, and the industry level: aggregations of locations that do or make the same thing, sometimes called Kind of
Activity Units or KAUs. For the Phase II instrument, on the left, you’ll remember that
we experimented with optionality at the unit level (click) by color coding response.
We found that respondents didn’t really get it, confusing optionality at the unit level
with optionality at the question level.
For the Phase III instrument, instead, we went back to standardized aggregation for
almost all content at the KAU for non-manufacturing collection. (click) Questions
that were required by location are white for the locations and gray for the KAU; (click)
likewise, questions that are required at the KAU are white at the KAU and gray at the location.


Finding 5: Respondents struggled with units.

• “First two rows are a glitch. I would just skip them and ignore them if it’s just the two rows, that doesn’t give me information of which location you’re referring to.” – Usability Interview

• “One of [their] KAU lines is completely grayed out and will not allow [them] to enter data even though [they have] data to report for that industry that was included in company-level totals.” – Inbound Call Log

[Chart: Sum of Establishments Compared to Company-level Reporting for Four Variables for Pilot Phase II (N = 318) and Phase III (N = 4,398). Percentage of companies by value match type, shown for Total employment, Annual payroll, Q1 payroll, and Revenue, Phase II vs. Phase III. Match types: Missing, Exact match, Within 10 percent, More than 10 percent difference.]

CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 19

This brings us to finding 5 – respondents continue to struggle with the units in the
AIES. On the left are two quotes about reporting by the KAU – in the first, the
respondent calls the KAU units “a glitch” and states that they “would just skip them
and ignore them.” In the second, a respondent has called in to report that “one of
their KAU lines is completely grayed out and will not allow them to enter data,”
suggesting that KAU performance in the survey itself might be inconsistent. On the
right, I draw your attention to what we’ve been calling “the big four” since the start
of this work – four key items that are asked at the establishment level and then
compared back to company-level reporting as a makeshift measure of reporting
quality. (click) We note that for each of these four questions, we saw an increase in
the proportion of businesses for which either the company-level report or one or
more establishment-level reports were left blank. We suspect that some of this is
due to the failure of the linear design, and we further encourage investigation into
instrument flexibilities to encourage full reporting of these and other data items.
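(As an aside for anyone replicating this quality measure: the sketch below shows one way the match-type classification described above could be computed. It is a minimal illustration in Python, not the production code; the function and field names are hypothetical, and the thresholds simply follow the categories on the chart.)

```python
# Minimal sketch of the "big four" match-type classification: compare a
# company-level value against the sum of that company's establishment-level
# values for the same variable. Names and inputs are hypothetical.
def classify_match(company_value, establishment_values):
    # Missing: either side of the comparison was left blank.
    if company_value is None or any(v is None for v in establishment_values):
        return "Missing"
    total = sum(establishment_values)
    if total == company_value:
        return "Exact match"
    # Relative difference measured against the company-level figure.
    if company_value != 0 and abs(total - company_value) / abs(company_value) <= 0.10:
        return "Within 10 percent"
    return "More than 10 percent difference"

# Example: company reports 1,000 in annual payroll; establishments sum to 940.
print(classify_match(1000, [500, 440]))  # -> "Within 10 percent"
```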


Test other key elements of the spreadsheet design
• Finding 6: Rounding functionality surprised respondents.
“It auto changed to thousands. That would be fine if it told you
that. It should be an instruction on the front tab, ‘please enter in
thousands.’” – Usability Interview
• Finding 7: NAPCS reporting needs additional attention.
• Missed
• Cumbersome
• Mismatched
CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 20

We have two additional findings related to the spreadsheet performance to mention
here. First, we note that the automatic rounding functionality surprised respondents,
particularly in usability testing. Note the first quote, where the respondent isn’t
opposed to the functionality – it “would be fine if it told you that” up front. It
surprised respondents; it did not suppress response.
But, the second finding is that we need to continue to research better ways of
collecting the NAPCS data if we are to retain it on this survey. Our testing suggests
three major issues with NAPCS collection in the DR: that respondents don’t see the
NAPCS tab or don’t know what it is if they do see it; that reporting NAPCS is
cumbersome and confusing especially because of the lack of row labels; and that
NAPCS is not reflective of the ways that companies keep their records. We will need
to reconsider how we are collecting NAPCS moving forward.


Recommendations: Test key elements of the
spreadsheet design
Phase III Goal: Include functionality to clean up establishment lists
Finding: Large companies struggled with Step 1.
Recommendation: Develop download/upload functionality for Step 1.

Phase III Goal: Consider functionality to orient respondents within the spreadsheet
Finding: Respondents get lost in the spreadsheet.
Recommendation: Explore ways to freeze panes.

Phase III Goal: Continue to explore ways of communicating optionality at the unit level
Finding: Respondents struggled with units.
Recommendation: Update ways of displaying KAUs to cue respondents.

Phase III Goal: Test other key elements of the spreadsheet design
Finding: Rounding functionality surprised respondents.
Recommendation: Include instruction that entries will be rounded.
Finding: NAPCS reporting needs additional attention.
Recommendation: Investigate better performing ways of collecting NAPCS.

CBDRB-FY24-ESMD010-004

Goal 2: Test the spreadsheet 21

Ok, let’s pause here and review this section on elements of the spreadsheet design.
We noted that our largest companies especially are struggling with verifying
locations, and recommend developing download/upload functionality for this step
similar to the functionality in Step 3. We saw evidence that respondents are getting
lost in the spreadsheet, and recommend exploring ways to freeze panes within the
spreadsheet so that they can see the location or question at all times. We saw the
negative impact of the way we collected KAUs, and suggest updating ways of
displaying KAUs to cue respondents, including labeling the row and more explicit
instructions about responding at the KAU. And, we learned that respondents were
surprised by the rounding functionality – we recommend including an instruction about this on the
instructions tab in Step 3. Finally, we have evidence that NAPCS reporting is
underperforming, and suggest additional investigation into ways to collect NAPCS
moving forward.


Gain additional information
about response burden
Explore other ways of collecting non-numeric responses.
Prime respondents for the change.

CBDRB-FY24-ESMD010-004

Goal 3: Response Burden 22

Let’s shift gears now and talk about burden, something we’ve been tracking since the
start of this program.


Explore other ways of collecting non-numeric
responses.
Step 2: Banner Messages

Step 3: List Messages

CBDRB-FY24-ESMD010-004

Goal 3: Response Burden 23

Up first, we noted in Phase II that non-numeric questions – those that asked for
categorical response like yes/no or major activity codes – were underperforming in
the grid format. One suggestion coming out of this work was to include hard and soft
edit checks in the instrument to cue respondents to issues in their responses.
On the left is an example of a soft edit check within Step 2, company-level response.
These warnings are displayed as banners across the top of the relevant screen. On
the right is an example of warnings in Step 3, the spreadsheet response. These
warnings are listed, with a ‘fix’ feature to bring respondents back to the issue that
triggered the error or warning.
As an aside, we call soft edit checks – those issues identified within the data that are
still acceptable for response – “warnings.” We call hard edit checks – those issues
identified with the data that prevent the submission of data – “errors.” From here, I
will use “error checking” to encompass both warnings and errors.
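(To make that distinction concrete, here is a minimal sketch of how soft and hard edit checks might be represented in code. This is an illustration only, in Python; the rule logic and field names are hypothetical, not the instrument’s actual implementation.)

```python
# Minimal sketch of soft vs. hard edit checks. Warnings (soft edits) flag
# unusual but acceptable data; errors (hard edits) block submission.
from dataclasses import dataclass

@dataclass
class EditResult:
    severity: str  # "warning" (soft edit) or "error" (hard edit)
    message: str

def run_edit_checks(record):
    results = []
    # Hard edit: an operational status must be recorded for the establishment.
    if record.get("operational_status") is None:
        results.append(EditResult("error", "Operational status is required."))
    # Soft edit (hypothetical rule): revenue below annual payroll is unusual,
    # but the data can still be submitted.
    revenue, payroll = record.get("revenue"), record.get("annual_payroll")
    if revenue is not None and payroll is not None and revenue < payroll:
        results.append(EditResult("warning", "Revenue is less than annual payroll."))
    return results

def can_submit(results):
    # Only hard edits (errors) prevent submission.
    return not any(r.severity == "error" for r in results)
```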


Finding 8: Error checking needs additional
development.
• Automate when possible:
“It didn’t force me to do a data check…never went to the check data tab; normally
[the survey] would throw me into the data check- or error out the column.” –
Usability Interview
• Redirect to the problem when possible:
“The whole row is highlighted. It doesn’t take you right to it? Not that clear.” – Usability Interview
• Reconsider error checking report:
“[My response generated] an error. The error report was really weird. It said it
passed and then it had errors. I don’t understand that.” – Debriefing Interview
CBDRB-FY24-ESMD010-004

Goal 3: Response Burden 24

Generally, we find that error checking needs additional development as the
instrument progresses. First, respondents noted that error checking was not required
and was not running automatically. The first quote notes that the respondent “never
went to the check data tab” because “normally [the survey] would throw me into the
data check”, indicating both that they did not run the check, and that they expected
this to be an automatic process. We may want to automate this step when possible.
In the second quote, the respondent runs the error check on Step 3 and then notes
that “the whole row is highlighted” – that is, the error check highlights the entire row
and not the specific data issue. In this case, they are expecting to be automatically
brought to the issue, and we suggest this may be functionality to consider in future
iterations of the survey.
And finally, respondents noted that the results of the error checking were
incongruently labeled – the report says “Pass” in large green letters, and then lists a
set of errors. This is because the respondent had triggered soft edit check warnings –
nonfatal issues that do not have to be changed in order to successfully submit the
data. The respondent described this as “really weird,” saying, literally, “I don’t
understand that.” In future iterations of the survey, we should consider labeling this
report more clearly, while encouraging respondents to update items that failed edit
checks.


Prime respondents for the change

CBDRB-FY24-ESMD010-004

Goal 3: Response Burden 25

Last round, we found that just by changing the survey, we are increasing response
burden at least for the first few years of collection. One of the recommendations
coming out of that finding was to prime respondents that the AIES would be coming,
it would be different, and they could prepare. Later in this presentation we’ll walk
through some of the findings related to the letters and email communications we
developed for this round of collection, but here, we highlight the other ways we
attempted to prime respondents for this change. We launched the new AIES website,
pictured left, and tried out the first prototype of an interactive content selector tool
to help respondents prepare for the AIES.


Finding 9: Respondents rely on a survey
preview.
• “To make it easier for us, we need to have an overview of what the report is asking for.
Other surveys have PDF or excel version that we can download and see what the whole
survey is….If the survey requires us to reach out to other departments, then it’s easier
for us to forward them this [PDF or excel] template and ask them for the information.” –
Debriefing Interview
• Respondent asked about pdf to preview questions, whether there were more
questions after the Remarks section, how to share parts of the survey with others and
whether they’ll be able to review responses before submitting them. – Inbound Call Log
• “Not being able to get the entire survey prior to entering the data in caused significant
challenge. I usually will print the survey out and assign the various pieces to those that
are best to complete them. With this survey I had to answer as I went through it.” – RAS

CBDRB-FY24-ESMD010-004

Goal 3: Response Burden 26

Again, across all research modalities, including inbound phone calls, one issue came through
loud and clear – finding number 9: respondents rely on a survey preview. What we
provided was not robust enough, it was not accessible enough, and it did not meet
the needs of our respondents. This, coupled with the linear design, represents a
major issue in the AIES DR collection.
In the first quote, from a debriefing interview, the respondent is noting not only that
a survey preview is standard for these kinds of surveys – “other surveys have PDF or
Excel versions that we can download” – but also that this is a key component to the
response process – “if the survey requires us to reach out to other departments, then
it’s easier for us to forward” the preview. The second quote, taken from the inbound
phone log, notes specifically being “asked about pdf to preview questions” and
“whether there were more questions after the Remarks section.” The final quote is
from the Response Analysis Survey, where the respondent writes that “not being able
to get the entire survey prior to entering the data in caused significant challenge.”
The respondent clearly states that the lack of a survey preview suppressed response.


Finding 10: Content continues to be a challenge.

Which of the following, if any, were challenges to completing the AIES questionnaire? (N = 465*)
• Had to collect information from more than one database or other source: 334
• Had to wait to rely on others within my company for the requested data: 243
• Had to add, allocate, or otherwise manipulate data to fit questions: 227
• Too many questions: 189
• Unclear or inadequately defined terms: 172
• Some other challenge: 155
• The online survey was difficult to use: 153
• Already had other surveys to complete at the same time: 108
• Questions were too complicated: 89
• I had no challenges completing the AIES questionnaire: 32
*Note: Respondents could select all that apply.

• “It [took] me about 6 hours […] I don’t ever recall spending as much time. When I finished it, I was mad at the amount of time I spent. I don’t remember ever spending more than an hour doing a census survey in the past.” – Debriefing Interview
• “As a dentist, I didn't understand most questions.” – RAS

CBDRB-FY24-ESMD010-004

Goal 3: Response Burden 27

And while we are on the topic of content on the AIES, we have – for the third round in
a row – the finding that the survey questions are ambiguous and voluminous –
there’s too much content and too much of it is confusing to respondents. So, what is
making the AIES challenging for respondents? A primary issue is data accessibility:
when asked, the most common response is that respondents had to
collect information from more than one database or source, reflecting data
dispersion. Similarly, 243 Response Analysis Survey respondents said that the
challenge lies in having to wait to rely on others within the company for the
requested data. And another 227 said the challenge lies in having to add, allocate, or
otherwise manipulate data to fit the question. These represent the top three most
selected responses for this question – collecting the data or waiting for others to
deliver it, or fitting the company data to match our questions. But, there are other
challenges related to the survey content, too. 189 respondents said the challenge
was that there are too many questions, while another 172 said there are unclear or
inadequately defined terms on the survey. Note that 89 respondents said that the
questions were too complicated.
We see this challenge with content in the qualitative data, too – in the first quote, the
respondent said they were “mad at the amount of time I spent,” estimating that it
took them “about six hours” to complete. The second quote: “As a dentist, I didn’t
understand most questions” reminds us that content can be challenging both for our
largest and our smallest companies.


Recommendations: Gain additional
information about response burden
Phase III Goal: Explore other ways of collecting non-numeric responses.
Finding: Error checking needs additional development.
Recommendation: Consider automatically checking submissions. Update labels on error checking report.

Phase III Goal: Prime respondents for the change.
Finding: Respondents rely on a survey preview.
Recommendation: Make content previews more accessible and easier to use.

Finding: Content continues to be a challenge.
Recommendation: Cognitively test misreported content and consider additional cuts.

CBDRB-FY24-ESMD010-004

Goal 3: Response Burden 28

Let’s stop here and consider the findings for our goal to gain additional information
about response burden. We wanted to explore other ways of collecting non-numeric
responses, and implemented this in the Dress Rehearsal by way of rudimentary error
checking. We found that we need a little more work on this: consider error checking
as an automatic process for responses, and update the labeling on the error checking
report. We also wanted to prime respondents for the changes coming from the AIES
– and one way that we did that was through an explicit online presence this round.
We found that respondents rely on survey previews to support response, and we
recommend additional testing and refinement of the content selection tool as well as
making this tool and other documents more accessible to respondents. At the same
time, we continue to find that not only is there too much content, but that content is
ambiguous or not relevant to respondents. We support additional cognitive testing
on misreported content, and encourage survey leadership to consider additional
content cuts where appropriate.


Develop respondent
communications
Continue to develop response support materials.
Update letters to retain resonant messaging and drop discouraging messaging.

CBDRB-FY24-ESMD010-004

29

Let’s round the corner here on our fourth goal of the Dress Rehearsal: Developing
respondent communications.


Continue to develop
response support materials.
• Added phone support
• Emails and letters
• Account managers

CBDRB-FY24-ESMD010-004

Goal 4: Communications 30

In Phase I, you’ll remember, we phone recruited companies, inviting them to be a
part of the research, and all outbound survey communications were through email.
In Phase II, we added paper and email survey communications, and fielded emails
and phone calls exclusively with headquarters staff and staff at NPC from across the
directorate – an ‘all hands on deck’ approach to supporting respondents. In Phase III,
we onboarded telephone support using our typical production phone center staff.
This was the first time we’ve used this modality to support the AIES.
used a sequential mixed mode contact strategy of mail, email, and for some,
telephone follow-up communications. We also continue to look to our account
management program, and especially the Full Service Account Manager program, to
guide our largest respondents through the process of reporting the AIES.


Finding 11: Respondents report using the materials when they are relevant and accessible.

Which of the following, if any, did you use to complete the AIES? (N = 465*)
• AIES website: 246
• I did not use any of these to complete the AIES: 156
• AIES content guide: 134
• AIES How-to PDFs: 106
• Calling the AIES Helpdesk, the interactive content tool, AIES Videos, and some other material or help accounted for the remaining responses (65, 58, 49, and 21).
*Note: Respondents could select all that apply.

• “I tried calling the Help Desk—they never called back, but I followed up by logging a question on the website itself and I was finally able to get a response.” – Debriefing Interview
• “In the documentation we received about completing the survey, I do not recall seeing [the Content Summary] which outlines the content & structure of the survey. If I had been aware of this, I would have used it to gather the necessary information before filling out the survey. That would have been a huge help in planning for the survey.” – RAS

CBDRB-FY24-ESMD010-004

Goal 4: Communications 31

In support of the Dress Rehearsal, we developed a whole suite of response support
materials based on feedback we had received in prior rounds of research. On the
Response Analysis Survey, we asked respondents specifically about these response
support materials that they may or may not have accessed to complete the AIES. The
top three selected responses are the AIES website – 246 respondents – the AIES
content guide, which is a compendium of all questions asked in the survey – 134
respondents – and the AIES How-to PDFs, which provide additional instructions to
use the survey platform – 106 respondents. Note that 156 respondents said that
they did not use any of the listed response support materials to complete the AIES.
Of those that listed an “other” resource, most mentioned emailing with a specific
person at the Census Bureau, or other resources specific to the company, like
accountants, prior survey submissions, and others.
I note that respondents are using our response support materials in combination with
each other – look at the first quote, where the respondent described calling the
Help Desk: “they never called back, but I followed up by logging a question on the
website itself and I was finally able to get a response.” The respondent was
unsuccessful in the first preferred mode of communication – calling the Help Desk –
so they reached out using a secondary mode and were successful. In the second
quote, though, we see what happens when our materials are available but not
accessible – in the Response Analysis Survey, we provided links to each of the
communications we listed. In a write in space, the respondent notes that they do not
“recall seeing” this documentation, but that once they saw it on the Response
Analysis Survey, it “would have been a huge help in planning for the survey.” In this
way, it is important not only to make this support available, it must also be easily
accessible, connected to the ecosystem of available response support.


Update letters to retain resonant messaging and drop discouraging messaging.
Phase II Finding:
• Many respondents found the list of consolidated surveys intimidating and confusing
• Most respondents are only looking for a few pieces of information in the initial request

CBDRB-FY24-ESMD010-004

Goal 4: Communications

32

In Phase II, one of the research pieces looked at communications testing to support
the Dress Rehearsal. We implemented two major findings – first, that the list of
consolidated surveys is intimidating and confusing to respondents, and may suppress
response. The second was that respondents were looking for a few key pieces of
information in the initial survey request letter.
Initially, we suggested a full review of our contact materials and strategies to run
parallel to the Dress Rehearsal research; ultimately, we were unable to rally the
resources in time to execute that line of research, but we were able to embed a few
questions in our respondent debriefing interviews about three specific pieces of
communication to see how these materials performed in the field.
On screen now, you can see the initial survey request, quite sparse in messaging, but
emphasizing those key pieces of information: Due date, portal address, and
authentication code. (click) And here is the so-called “welcome letter” – this short
email is automatically sent to the respondent once they enter in the authentication
code to access the survey. You’ll note that it does not reference the consolidated
survey list, and it points the respondent to other response support materials of
interest. (click) Finally, using research from the 2022 Economic Census, we also
included a flier extending the vignette of Lilly as a respondent to the AIES. Each of
these pieces of communication was developed with the findings from the Phase II
comms research in mind.


Finding 12: Each communication piece serves
a specific function.
Initial Email: Keep it short, add preview.
Welcome Email: Already started reporting.
AIES Flier: Drop Lilly, add why.

CBDRB-FY24-ESMD010-004

Goal 4: Communications

33

For the initial email, respondents reacted positively to the short messaging. Most
indicated that this email spurs them to log in to the portal and enter in the
authentication codes. A few mentioned wanting access to a survey preview at this
step, so we may want to consider adding a link to the interactive content tool or the
content summary documentation.
For the Welcome email, respondents reported that for the most part, they had
already started completing the survey when the email was delivered to their inbox.
Some found it to be useful, and no one reported that it suppressed response.
And, for the flier, the suggestion is to drop the Lilly vignette – the resonant message is
that the AIES is for small businesses, when in fact the AIES is for all businesses. And,
respondents suggested emphasizing the “why” of the AIES – messaging like the
production of GDP and other uses of the data, and that the survey is mandatory.
This brings us to our last finding of the day: each of the pieces of communication in
the Dress Rehearsal serves a particular function, and we need to tailor our
communications to that function.


Recommendations: Develop respondent
communications
Phase III Goal: Continue to develop response support materials.
Finding: Respondents report using the materials when they are relevant and accessible.
Recommendation: Review the ecosystem of communications materials to ensure that all pieces are accessible.

Phase III Goal: Update letters to retain resonant messaging and drop discouraging messaging.
Finding: Each communication piece serves a specific function.
Recommendation: Additional communications-based research.

CBDRB-FY24-ESMD010-004

Goal 4: Communications

34

And, here we are at our last recommendations screen. First, we learned that the
additional response support materials we had developed were being used by
respondents when relevant and accessible. We recommend a review of the
communications ecosystem – all letters, emails, websites, videos, tools, in-survey
instructions, phone scripts…everything – to ensure that not only are we covering all
of the respondent needs, we are doing so in a way that links all of the
communications together coherently and consistently. Next, we updated our letters
and emails to emphasize resonant messaging, and we learned that each of our pieces
of communication serves a specific function in supporting response. We recommend
additional research into the best practices for communications materials for a survey
as complex as the AIES. This could include additional interviewing, message testing,
focus groups, and other modalities designed to better understand communication
preferences and needs of our respondents.


Get feedback on the new flow.
• Three-step design: The three-step design needs additional supports.
• Linear design: Linear design suppressed response.

Test key elements of the spreadsheet design.
• Include functionality to clean up establishment lists: Large companies struggled with Step 1.
• Consider functionality to orient respondents within the spreadsheet: Respondents get lost in the spreadsheet.
• Continue to explore ways of communicating optionality at the unit level: Respondents struggled with units.
• Test other key elements of the spreadsheet design: Rounding functionality surprised respondents; NAPCS reporting needs additional attention.

Gain additional information about response burden.
• Explore other ways of collecting non-numeric responses: Error checking needs additional development.
• Prime respondents for the change: Respondents rely on a survey preview.
• Content continues to be a challenge: Cognitively test misreported content and consider additional cuts.

Develop respondent communications.
• Continue to develop response support materials: Respondents report using the materials when they are relevant and accessible.
• Update letters to retain resonant messaging and drop discouraging messaging: Each communication piece serves a specific function.

CBDRB-FY24-ESMD010-004

Findings and Recommendations

35

Whew….here we are. All of our preliminary findings and recommendations from the
AIES Pilot Phase III. (click) We got feedback on the new flow, and found that we need
to abandon the linear design, and consider additional supports for the three-step
layout. (click) We tested key elements of the spreadsheet design, and found that the
establishment listing needs work, that the spreadsheet is big and overwhelming, that
units continue to be a source of error, that rounding functionality surprised
respondents and that NAPCS underperformed. (click) We gained additional
information about response burden, including that our error checking functionality
needs some additional development and that respondents rely on a survey preview
to support reporting. We also heard that content is continuing to be challenging for
respondents. (click) Finally, we wanted to further develop respondent
communications, and we found that respondents use the materials we provide when
they are accessible, and that each piece of communications serves a particular
function to support the response process.


Goal: Get feedback on the new flow.
• The three-step design needs additional supports. Next step: Further develop survey preview and content selection tool.
• Linear design suppressed response. Next step: Develop ability to move forward and backwards through the survey.

Goal: Test key elements of the spreadsheet design.
• Large companies struggled with Step 1. Next step: Provide download/upload functionality for Step 1.
• Respondents get lost in the spreadsheet. Next step: Freeze left columns and top rows.
• Respondents struggled with units. Next step: Update KAU display.
• Rounding functionality surprised respondents. Next step: Include instruction on instructions tab.
• NAPCS reporting needs additional attention. Next step: Additional research into NAPCS capture.

Goal: Gain additional information about response burden.
• Error checking needs additional development. Next step: Update error labeling and implement automatic error checking.
• Respondents rely on a survey preview. Next step: Further develop survey preview and content selection tool.
• Cognitively test misreported content and consider additional cuts. Next step: Conduct cognitive testing on misreported content.

Goal: Develop respondent communications.
• Respondents report using the materials when they are relevant and accessible. Next step: Review website for accessibility and ease of use.
• Each communication piece serves a specific function. Next step: Conduct additional communications-focused research.

CBDRB-FY24-ESMD010-004

36

And on this screen, I have the recommendation on the left and the next steps on the
right.


Goal: Get feedback on the new flow.
• The three-step design needs additional supports. Next step: Further develop survey preview and content selection tool.
• Linear design suppressed response. Next step: Develop ability to move forward and backwards through the survey.

Goal: Test key elements of the spreadsheet design.
• Large companies struggled with Step 1. Next step: Provide download/upload functionality for Step 1.
• Respondents get lost in the spreadsheet. Next step: Freeze left columns and top rows.
• Respondents struggled with units. Next step: Update KAU display.
• Rounding functionality surprised respondents. Next step: Include instruction on instructions tab.
• NAPCS reporting needs additional attention. Next step: Additional research into NAPCS capture.

Goal: Gain additional information about response burden.
• Error checking needs additional development. Next step: Update error labeling and implement automatic error checking.
• Respondents rely on a survey preview. Next step: Further develop survey preview and content selection tool.
• Cognitively test misreported content and consider additional cuts. Next step: Conduct cognitive testing on misreported content.

Goal: Develop respondent communications.
• Respondents report using the materials when they are relevant and accessible. Next step: Review website for accessibility and ease of use.
• Each communication piece serves a specific function. Next step: Conduct additional communications-focused research.

CBDRB-FY24-ESMD010-004

37

These bolded items are ones that are already in the works, and we anticipate will be
addressed in time for the March 2024 launch, including the ability to move forward
and backward through the survey; providing a download/upload feature for the
establishment listing clean up; freezing left columns and top rows to help orient
respondents in the spreadsheet; updates to the ways that KAUs are displayed; adding
a note about the rounding functionality in instructions; and updating error labeling
and automatic error checking. I want to take a moment here to publicly recognize
the work that our friends and colleagues in Special Internet Operations Branch in
ADSD have been able to accomplish as we have gotten results from the field; because
we had so much information coming in so quickly, we were able to start to tackle the
big ticket items in preparation for next year.
(click) The next one – further development of the survey preview and content
selection tool – is awaiting OMB clearance to get testing in front of respondents, and
is already undergoing additional refinements based on feedback from the field. We
anticipate a more robust interactive content selection tool in time for March 2024.
(click) This leaves these four items on the table – further investigation into NAPCS;
cognitive testing of underperforming content in the survey; a review of the AIES

37

website for accessibility and ease of use; and additional communications-focused
research as possible areas of further exploration into 2024. In addition to additional
usability interviewing and respondent debriefings, one or more of these may be
further explored to support future iterations of the AIES.


Positive feedback, too…
• “One of the things that I like about it because our company has grown
significantly by acquisition was being able to answer all the questions at one
time for all of the entities. I really appreciated that. Versus having multiple
surveys that I have had to do historically." – Debriefing Interview
• “When I got to the download spreadsheet, it was easy… We actually have 3 to 4
people working on [the survey], compiling it together, and submitting it as one.
It’s a little bit easier when we have the spreadsheet to work with as opposed to
manually entering something into a screen.” – Debriefing Interview
• “Very easy in our system. The questions on the survey were straightforward. It’s
easy to match the questions from the report that I run so easy to populate.” –
Debriefing Interview
CBDRB-FY24-ESMD010-004

All is not lost….

38

At this point, you may be feeling a little overwhelmed at the scope and breadth of
issues this research has identified. I just want to pause for a moment and recognize
that the purpose of this work was to identify issues with the AIES – we kind of set
ourselves up for some doom and gloom today.
But, all is not lost – as in previous rounds of research, we also got positive feedback
from the field, especially renewed enthusiasm in the original point of the AIES: the
combining of disparate survey programs. In the first quote, the respondent mentions
that the AIES gives them the ability to “answer all the questions at one time for all of
the entities” and that they “really appreciated that.”
In the second quote, the respondent notes a preference for the respond-by-spreadsheet design of the AIES, calling it “easy,” especially because of the number of
people working on the response, saying that “it’s a little bit easier when we have the
spreadsheet to work with as opposed to manually entering something into a screen.”
In the third quote, the respondent notes that “the questions on the survey were
straightforward,” and that “it’s easy to match the questions” to their internal
reporting.


Key Takeaways:

Piloting is worth it!
• Found the showstoppers
• First practice in collection
• Understand sources of burden
• Piloting is a proven method

Additional areas of investigation
• Instrument refinements
• Response support materials development
• Additional research: NAPCS, large-firm response, communications review, cognitive testing

CBDRB-FY24-ESMD010-004

Key Takeaways 39

Ok!! Wow! We’ve covered a lot of ground today! Your head may be spinning, so I
want to end our time today with two key takeaways from the Pilot Phase III.
First: This last year has been grueling – we conducted two rounds of research in less
than 12 months, including standing up an entirely new survey in an entirely new
system. I know some of you have been dismayed at how drafty our house has been, but I
want to point out to you that while there are still major issues we’ve outlined today,
this piloting program identified the absolute showstoppers and worked – often in real
time – to fix those issues and prepare for next year. That was exactly the point: we
need a dress rehearsal, and we got one, for sure! The collaborative and iterative
methods we used here allowed us to identify these issues with enough time to start
to make changes for 2024. We could not have done that if we hadn’t pushed
ourselves to stand something up this fall. At the same time, this first practice
collection got us our first glimpses of what AIES data may look like, and has bought us
invaluable time to develop and refine our data handling strategies beginning in
production. We have improved understanding of the sources of burden our
respondents are facing, and have been able to pivot across all three iterations to
respond to those pain points. In all, piloting is a powerful, proven method that we
have executed iteratively three times in support of this new program, and the survey
is better off for it.
At the same time, we recognize that drafty was ok for the practice, but in production,
we need to patch those holes so to speak. This includes key updates to the survey
instrument to address poor instrument performance, and the development of
additional response support materials and the accessibility of those materials. We
also note the opportunity for additional investigations into NAPCS reporting, large-firm response processes, communications materials, and cognitively testing content.
We have learned so much over our three rounds together – we came together as a
team, and we accomplished the improbable: we stood up a harmonized survey
instrument to inform and prepare for collection in 2024. The first year will not be
perfect, but it will be better because of the work we have put in to date. I am so
grateful for the opportunity to be involved in this work, and I am excited to see what
we cook up in Year 1.


Thank you!
• Melissa Cidade
• [email protected]
• Heidi St.Onge
• [email protected]

CBDRB-FY24-ESMD010-004

40

That concludes our prepared remarks, thank you so much for your time and
attention. If you have more specific questions, or if you want to talk about any of the
pilot findings further, please do not hesitate to reach out to either of us!


Goal: Get feedback on the new flow.
• The three-step design needs additional supports. Next step: *Further develop survey preview and content selection tool.
• Linear design suppressed response. Next step: Develop ability to move forward and backwards through the survey.

Goal: Test key elements of the spreadsheet design.
• Large companies struggled with Step 1. Next step: Provide download/upload functionality for Step 1.
• Respondents get lost in the spreadsheet. Next step: Freeze left columns and top rows.
• Respondents struggled with units. Next step: Update KAU display.
• Rounding functionality surprised respondents. Next step: Include instruction on instructions tab.
• NAPCS reporting needs additional attention. Next step: Additional research into NAPCS capture.

Goal: Gain additional information about response burden.
• Error checking needs additional development. Next step: Update error labeling and implement automatic error checking.
• Respondents rely on a survey preview. Next step: *Further develop survey preview and content selection tool.
• Cognitively test misreported content and consider additional cuts. Next step: Conduct cognitive testing on misreported content.

Goal: Develop respondent communications.
• Respondents report using the materials when they are relevant and accessible. Next step: Review website for accessibility and ease of use.
• Each communication piece serves a specific function. Next step: Conduct additional communications-focused research.

CBDRB-FY24-ESMD010-004

Bold – in progress, probable inclusion in 2024
* – awaiting OMB clearance

Recommendations and Next Steps 41

I’m going to leave this list up on screen to guide any additional conversation for today.
Cheers!


