AIES Pilot Study
Respondent Debriefing Interview Protocol
Phase II
Winter/Spring 2023
This protocol is a guide: the questions here will not necessarily be asked exactly as worded or in this order, and not all questions will be asked in every interview. As much information as possible about interview participants and their enterprises/establishments should be obtained prior to the interview. In some cases, probing may need to be adjusted based on the background research that has been conducted and/or in response to participants’ insights.
These interviews will be guided by three research questions, each with sub-questions:
Research Question 1: How are respondents completing the survey?
What is the process of gathering the data necessary to complete the survey?
Does the holistic approach lead to better data quality?
Research Question 2: What are the patterns that shape the ways respondents answer the survey?
What aspects of a company, such as size, staff, and records, dictate the data we receive?
How do respondents make decisions about how to answer items or topics?
Research Question 3: Is the new survey overly burdensome?
How, if at all, have existing survey response procedures stayed the same or changed with the new survey, as compared to the regular annual surveys?
How many and what kind of staff are involved in gathering the information for this survey?
What aspects of the survey instrument may be impeding the response process?
Materials: Protocol, consent form, survey responses, respondent instrument (overview of responses)
Method: We will conduct the interviews by phone, Microsoft Teams, or in person, depending on availability and federal employee travel restrictions.
Expected length of interview: 1 hour (60 minutes) maximum
General probes that may be used throughout the interview:
How did you arrive at this number/answer this question?
Were these data easy to access?
What else can you tell me about this?
Can you tell me more about that?
How confident are you in that response?
What, if anything, looked unclear or confusing here?
Introduction
If necessary: I sent you a consent form in an email today – did you get a chance to sign that? If not, please do so now, and then we’ll get started.
Thank you so much for agreeing to talk with me today!
As part of the pilot program for the new Annual Integrated Economic Survey, we are following up with some companies to learn more about the processes you may or may not have used to complete the survey.
I am part of a group within the Census Bureau that makes sure that our surveys are performing as expected, and provides feedback to other parts of the Bureau about ways to improve the performance of our instruments. I’m talking with you today because of your unique role in testing out the new AIES survey instrument.
Remember, my job is to improve the surveys. Please be candid and frank in your responses. Our interview is being conducted under the authority of Title 13, which means that your responses are confidential, and neither your name nor the name or identifying information about your company will be included in any of our findings.
Do you have any questions before we get started?
I’d like to record our session today so that when I go to analyze the results of these interviews, I can use the recording to pick up on anything I may have missed in my notes. Do I have your permission to record our session today?
<<Turn on recorder>>
Company Background
What is your role in the company? How long have you been with the company?
What is your role in the process for responding to Census Bureau surveys?
Examples: gathering data, entering data, consulting with data providers, etc.
Tell me a little bit about your business. What types of goods or services does this business provide?
How many locations does your business currently operate? In how many states? In how many countries?
Is this a foreign or domestic company?
Are there related companies?
Did you have any confusion about which parts of the company to include when answering the survey?
In what ways was your approach the same or different for this survey versus other annual survey(s)?
Research Question 1: How are respondents completing the survey?
Spreadsheet:
Let’s talk about the spreadsheet now. What were your first impressions?
How did you feel about all the locations and questions being in one spreadsheet?
Was it easier or harder than the current annual surveys to answer all in one place?
What do you think of the color-coded cells?
Did the shading help you respond?
Did you use the colors to decide how to respond?
What was your general approach to the spreadsheet?
Did you have a plan for how to fill it out?
What questions did you answer first? What did you leave for last?
What was your process like? Did it change from topic to topic?
What did you think of being able to answer by industry or establishment?
Was the ability to answer by either industry or establishment made clear to you?
Did you use this option?
FOR BTOS: On the spreadsheet that listed all your establishments, two columns asked for contact information at each establishment. What was your reaction to this request?
Above the columns, we explained why we were asking for this information. Did you find this explanation clear or not clear? Is there information missing from this explanation that might have helped you?
What else should we consider when asking this question?
Would you want to be included in any communication with these establishment contacts?
When you entered the response portal, you would have seen a walkthrough tutorial outlining how to use it. Do you recall going through that tutorial? Was it useful to you?
[Interviewer may want to share screen and walk through screens]
Would you have known how to get back to this tutorial, if needed?
Did you use any other help documents provided (e.g., help pages, census.gov)?
FOR USABILITY: Thinking about when you submitted your spreadsheet, how would you like to be notified of errors?
Overall, are there any other features missing that would have helped you respond?
Is there anything that would have made reporting easier if it had been included or formatted differently?
Research Question 2: Patterns in Responses (20 minutes)
I wanted to ask you some specific questions about your responses and what we noticed. I’ll share my screen now. Were you able to understand how we’d like you to format answers?
Did you notice that some cells were optional or not required?
I see you answered by establishment/industry for the spreadsheet/certain items or topics. How did you decide that?
Some of your cells/items/topics/locations were left blank. Can you tell me more about why?
How easy or difficult was it to answer the question about establishment contacts?
What was your process for providing this information?
(How did you choose each contact you selected? If you put yourself/a general inbox, why did you select that option rather than a specific person?)
How easy or difficult was it to answer this question?
Partial nonresponses: I see that some of the establishment contacts are blank. Can you tell me about that?
(What kept you from providing that establishment contact information? Why did you provide information for some establishments but not others?)
Nonresponses: I see that the columns for establishment contacts are blank. Can you tell me about that?
(What kept you from providing the establishment contact information?)
Do you have any suggestions for improving this question? (For nonresponses: How could we improve this question to get this information?)
Do you think there were characteristics about your company that impacted how you answered?
Do you think company size or the number of establishments made it easier or harder to answer?
[If applicable] I remember you reached out to us about [a specific topic]. What ended up happening?
How did you decide when it was time to contact us?
Research Question 3: Is the new survey overly burdensome?
[skip if they completed any of the survey] I noticed that you did not complete the AIES pilot survey. Can you tell me about that?
What, if anything, could we at the Census Bureau do to support you in responding to this survey?
What are the reasons why you did not fully complete (or did not complete at all) the AIES pilot survey?
Do you have any suggestions for ways the Census Bureau could encourage companies to respond to this survey?
BURDEN: How long do you think it took you to complete the whole survey? How long for the 3-step survey? How long for the spreadsheet?
Did some topics take more or less time than others? Which ones?
Which topics were the easiest to answer? Which were the hardest?
Did certain items take more or less time to answer?
Which topics or items did you have readily available in your records?
Did you have to contact anyone else to find this information?
Who did you contact?
Which topics required you to reach out to others?
Overall, which question(s) would you say was the most time-consuming or difficult to complete?
In-Person Possibility
In the past, we’ve found that for some types of interviews, conducting them in person gives results that are truer to how you would actually fill out a survey. How would you feel about an in-person interview?
Would there be certain requirements on our end that would make you more comfortable, such as knowing the interviewer’s vaccination status or seeing a recent negative test?
Would you have a preference between in-person or virtual interviews?
Does your company currently have a physical office?
How often are you in that office?
Is that somewhere we would hypothetically be able to conduct an interview?
Are there any protocols around visitors? Is your company meeting with visitors?
Is there anything else we should think about when considering in-person interviews as an option?
Wrap-up
That’s all the questions I have for you today! Do you have any other comments, questions, or suggestions for us?
Thank you so much for your time today.