Census Bureau Participant Debriefing Research for the 2023 Annual Integrated Economic Survey
Submitted Under Generic Clearance for Questionnaire Pretesting Research
March 4, 2024
Request: The U.S. Census Bureau, through a contract with RTI International (RTI), plans to conduct research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725, expiration: 12/31/2025). We will conduct participant debriefing interviews to document response processes and patterns for the 2023 Annual Integrated Economic Survey (AIES), a cross-economy annual survey that combines seven long-standing annual economic collections into one streamlined, harmonized collection designed to produce national and subnational annual economic estimates.
The debriefings outlined in this memorandum represent one piece of a suite of respondent-centered research in support of the first full implementation of the AIES; that suite also includes anticipated requests for cognitive and exploratory interviewing and for additional usability testing later this calendar year. The debriefings offer an additional research methodology for engaging respondents and non-respondents in the continued development of the AIES instrument.
Background on the AIES: In preparation for the implementation of the 2023 AIES, the Census Bureau’s Economic Directorate has engaged in a robust program of respondent-centered research. Most notably, beginning in February 2022, the Census Bureau conducted a pilot test of the harmonized survey. Phases I and II of the AIES pilot were collected under OMB 0607-0971 and included a harmonized survey instrument, respondent debriefing interviews, a Response Analysis Survey (RAS), and analyses of the content of calls and emails from respondents during the field period. In August 2023, we conducted a large-scale test of the production instrument – the 2022 AIES, a “dress rehearsal” of the harmonized instrument – under OMB 0607-1024, along with additional usability testing under OMB 0607-0725. Each of these efforts built on the research before it, yielding an iterative research program.1
At the conclusion of the final test – the dress rehearsal – we identified findings and recommendations for improving the survey program. Generally, these recommendations fall into three categories: those related to the response process for the AIES; those related to respondent communications and response support; and those related to instrument performance. Some of these lines of inquiry are best suited to usability research, and the Census Bureau plans to pursue that type of testing later in 2024. Others require cognitive testing or early-stage scoping for further investigation, and we plan to pursue those methodologies later in 2024 as well. Finally, some can be addressed through respondent debriefings, which are the purpose of this specific OMB request.
Table 1: Recommendations and Research Methodologies from the 2022 AIES to the 2023 AIES
Topic | AIES Dress Rehearsal Finding | Update or Next Step | 2024 Research Methodology
Response process | Respondents rely on a survey preview. | Further develop survey preview and content selection tool. | Debriefing interviews
Communications | Respondents report using the support materials when they are relevant and accessible. | Review website for accessibility and ease of use. | Debriefing interviews
Communications | Each communication piece serves a specific function. | Conduct additional communications-focused research. | Debriefing interviews
Instrument design | The three-step survey design needs additional supports. | Further develop survey preview and content selection tool. | Debriefing interviews
Instrument design | Linear survey design suppressed response. | Developed ability to move forward and backward through the survey. | Usability testing
Instrument design | Large companies struggled with Step 1. | Provided download/upload functionality for Step 1. | Usability testing
Instrument design | Respondents get lost in the spreadsheet. | Froze left columns and top rows. | Usability testing
Instrument design | Respondents struggled with the reporting units. | Updated industry-level collection display in instrument. | Usability testing
Instrument design | Rounding functionality surprised respondents. | Included instruction on instructions tab. | Usability testing
Instrument design | Error checking needs additional development. | Update error labeling and implement automatic error checking. | Usability testing
Instrument design | Ambiguous content needs additional testing. | Conduct cognitive testing on misreported content. | Cognitive and early-stage scoping
Purpose: The purpose of this study is to qualitatively explore response processes to inform continued development of the AIES instrument. We anticipate that the interviews will cover two categories of participants: respondents to the AIES and non-respondents. For the purposes of this research, the debriefing questions will focus on participants’ experiences in answering the survey (or not answering it), their evaluation of communications materials, and their impressions of the survey instrument.
To that end, we have identified three overarching research questions, each with sub-questions nested underneath:
1. How are respondents reporting to the AIES?
   - What are respondents’ overall impressions of the survey?
   - What is the ideal length and timing of the field period for respondents?
   - What are the unique reporting needs of companies with locations outside of the 50 U.S. states?
   - What are the barriers to reporting (non-respondents)?
2. What is respondents’ feedback on the content and accessibility of respondent communications?
   - What support materials are respondents using when reporting to the AIES?
   - Are the content selection tool and summary document sufficiently supporting response?
   - What are respondents’ impressions of letters and emails?
3. What is respondents’ feedback on instrument performance and response burden?
Population of Interest: The population of interest consists of a wide variety of businesses that were asked to participate in the 2023 Annual Integrated Economic Survey, regardless of their response status to that survey. We anticipate targeted recruitment for the debriefings to include companies with at least one establishment classified in a six-digit North American Industry Classification System (NAICS) code in the manufacturing sector (“manufacturing firms”). We will also pay special attention to complex firms, operationalized as companies with establishments classified in four or more NAICS codes.
Sample: For respondent debriefings, Census Bureau staff will periodically provide RTI with a list of businesses that have responded to the AIES since the previous list submission, including respondent contact information and job title and any missing or out-of-scope responses to the survey. From these lists, RTI staff will recruit respondent debriefing participants. Recruitment lists will reflect those companies eligible for selection into the AIES.2
For non-respondent debriefing interviews, Census Bureau staff will pull a list of delinquent companies late in the field period, after the published survey due date, so as to identify companies unlikely to respond. We anticipate this data pull occurring in July, at least six weeks after the published survey due date.
We plan to conduct a maximum of 100 interviews, with participants representing businesses of varying sizes3 and complexities, as well as varying response statuses. We anticipate the following approximate recruitment targets:
Table 2: Targeted Types of Debriefing Participants
Type | Target N
Respondent | 70
Non-respondent | 30
Total | 100
This number of interviews was selected because it is manageable within the timeframe available for this testing. The sample should be adequate to provide reactions to the survey indicative of most respondents and non-respondents across a wide variety of firms.
Recruitment: For respondent debriefing interviews, RTI will contact potential participants via email, explaining the nature of the research and asking them to participate in the study. RTI will recruit additional participants by telephone, as needed, to obtain broad participation across characteristics of interest, including manufacturing status, complexity, response status, and others. The realized sample of participants will be those who can be contacted and who agree to participate in the study.
Participants will be informed that their response is voluntary and that the information they provide is confidential and will be seen only by Census Bureau employees or those with Special Sworn Status who are involved in the research project. Once interviews are scheduled, researchers will send participants a confirmation via email.
See Attachment B for respondent recruitment materials.
Method: Researchers from the Census Bureau and from RTI will conduct all interviews using Microsoft Teams. Interviews will follow a semi-structured protocol that includes both required and optional sets of interview questions. Interviews will last no longer than 1 hour.
Interviews may be recorded. Recordings will capture audio and the shared computer screen, and may include participants’ faces and surroundings. They will be stored on a secure Census Bureau server, accessible only to researchers on the project.
Protocol: Some parts of the interview will be administered to all participants, regardless of response status or firm characteristics. Other parts will be in scope only for participants representing certain firms. Finally, some parts of the interview are optional for all participants and will be administered based on timing, participant feedback, and interviewer discretion. Subject area specialists from the Census Bureau may participate in some of the debriefing interviews for observational purposes.
Attachment A contains the protocol for the debriefings. Attachments D and E contain the letters and emails we may ask participants to evaluate. Attachments F and G contain screenshots of webpages and parts of the survey instrument on which we may ask participants to provide feedback.
Consent: In addition to the required PRA and Privacy Act notices, the consent form will also indicate that the respondent agrees that the interview can be audio- and video-recorded to facilitate analysis of the results (see Attachment C). Respondents will sign the consent form via a Qualtrics form that allows for a digital signature. Respondents who do not consent to be recorded will still be allowed to participate.
Incentive: Monetary incentives for participation will not be offered.
Interview Length and Burden Estimate: Interviews will last no more than 60 minutes each, and we will conduct no more than 100 interviews across various firm types and locations (total burden: 100 hours). We estimate it will take respondents five minutes to read and review each recruitment email, and we anticipate sending no more than two recruitment emails per participant (total burden: 17 hours). If recruitment is sluggish for particular types of firms, we may make up to five phone attempts at three minutes per call to gain participation in interviewing (total burden: 26 hours). We anticipate it will take respondents no more than five minutes to sign the consent form, schedule the session, and log on when the session begins (total burden: 9 hours). This brings the total estimated response burden to 152 hours. Recruitment will be rolling throughout the field period, so recruitment sizes are based on typical response scenarios from discrete populations in prior interviewing.
Table 3: Burden Hours Assumptions for Debriefing Interviews
Interviewing Type | Target N | Recruitment N | Interviewing burden (hours) | Recruitment materials review, 10 minutes max (hours) | Recruitment phone calls, 15 minutes max (hours) | Scheduling, consent, and log-in, 5 minutes max (hours)
Respondent | 70 | 1,000 | 70 | 12 | 18 | 6
Non-respondent | 30 | 500 | 30 | 5 | 8 | 3
Totals | 100 | 1,500 | 100 | 17 | 26 | 9
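To illustrate how the figures in Table 3 follow from the assumptions above, the minimal sketch below (Python, provided for illustration only; the variable names, per-task minute values, and round-up rule reflect our reading of the paragraph and table rather than any official calculation) recomputes each row from the Target N column.

```python
import math

# Illustrative sketch: recompute the Table 3 burden-hour cells from the
# Target N column and the per-task minute maximums described above.
# Assumption: each cell is the per-group total, rounded up to whole hours.
MINUTES_PER_TASK = {
    "interviewing": 60,              # one interview, 60 minutes max
    "materials review": 10,          # two recruitment emails x five minutes
    "phone calls": 15,               # five attempts x three minutes
    "scheduling/consent/log-in": 5,  # consent form, scheduling, log-in
}
TARGET_N = {"Respondent": 70, "Non-respondent": 30}

def burden_hours(n: int, minutes_each: int) -> int:
    """Total burden for n participants, rounded up to whole hours."""
    return math.ceil(n * minutes_each / 60)

for group, n in TARGET_N.items():
    cells = {task: burden_hours(n, m) for task, m in MINUTES_PER_TASK.items()}
    print(group, n, cells)
# Respondent 70 {'interviewing': 70, 'materials review': 12, 'phone calls': 18, 'scheduling/consent/log-in': 6}
# Non-respondent 30 {'interviewing': 30, 'materials review': 5, 'phone calls': 8, 'scheduling/consent/log-in': 3}
```

Summing the resulting columns gives 100, 17, 26, and 9 hours, for the grand total of 152 estimated burden hours.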
Timeline: Testing will begin in April 2024, within two weeks of the launch of the 2023 AIES. Interviewers will contact respondents as close to their survey submission date as possible to minimize recall bias. Non-respondent debriefing interviews will begin later in the field period, after the official due date (April 30, 2024) has passed. All interviewing will be completed no later than the close-out date of the collection, October 2024.
Language: Testing will be conducted in English only.
Attachments: Please refer to the following attachments related to this request:
Attachment A: 2023 AIES Participant Debriefing Interview Protocol
Attachment B: Recruitment Materials
Attachment C: Participant Informed Consent Form
Attachment D: 2023 AIES Letters for Testing
Attachment E: 2023 AIES Emails for Testing
Attachment F: 2023 AIES Websites for Testing
Attachment G: AIES Instrument Screenshots for Testing
Attachment H: AIES Pilot Phase I Findings and Recommendations Slide Deck
Attachment I: AIES Pilot Phase II Findings and Recommendations Slide Deck
Attachment J: 2022 AIES Usability Testing Findings and Recommendations
Attachment K: 2022 AIES (“Dress Rehearsal”) Findings and Recommendations Slide Deck
Contact: The contact person for questions regarding the design of this research is listed below:
Melissa A. Cidade, Ph.D.
Principal Methodologist, Annual Integrated Economic Survey
Economy-Wide Statistics Division, Office of the Division Chief
U.S. Census Bureau
Washington, D.C. 20233
Cc:
Nick Orsini (ADEP) with enclosure
Stephanie Studs (ADEP) “ “
Blynda Metcalf (ADEP) “ “
Michael Lopez Pelliccia (ADEP) “ “
Lisa Donaldson (EWD) “ “
Edward Watkins (EWD) “ “
Shelley Karlsson (EMD) “ “
Tom Smith (EMD) “ “
Amy Anderson Riemer (ESMD) “ “
Kevin Deardorff (ERD) “ “
Aleia Clark Fobia (ADRM) “ “
Jasmine Luck (ADRM) “ “
Danielle Norman (PCO) “ “
Mary Lenaiyasa (PCO) “ “
1 See the following attachments for additional background information on this program of research:
Attachment H: AIES Pilot Phase I Findings and Recommendations Slide Deck
Attachment I: AIES Pilot Phase II Findings and Recommendations Slide Deck
Attachment J: 2022 AIES Usability Testing Findings and Recommendations
Attachment K: 2022 AIES (“Dress Rehearsal”) Findings and Recommendations Slide Deck
2 For more information on sample considerations and sample unit eligibility for AIES, see: https://www.census.gov/programs-surveys/aies/technical-documentation/methodology.html
3 Please note: while we will aim for respondents from a wide variety of multi-unit companies, we will not be specifically targeting our very largest companies, those participating in the Full Service Account Management (FSAM) program, as feedback and input from these companies will come through the FSAM representatives.