September 26, 2023 (revised 3.11.24)
Cognitive Testing Support Project: Final Report on Development and Cognitive Testing of Emergency Economic Information Collection (EEIC) Question Bank Phase II
Prepared for
Measurement & Response Improvement for Economic Programs
Economic Statistics and Methodology Division
Economy-Wide Statistics Division
The U.S. Census Bureau
Prepared by
Y. Patrick Hsieh, Rachel Stenger, Katherine Blackburn,
Jerry Timbrook, and Chris Ellis
RTI International
3040 E. Cornwallis Road
Research Triangle Park, NC 27709
RTI Project Number 0217758.001.003
Contents
Section Page
1. Introduction 1-1
1.1 Goals and research questions 1-1
1.1.1 Cognitive interview methodology 1-2
2. Public Sector Testing 2-1
2.2 Identifying and developing public sector questions for testing 2-1
2.3 Cognitive interview protocol development 2-2
2.4 Cognitive Testing Data Collection 2-4
2.4.1 Population of interest and sampling strategy 2-4
2.4.2 Participant recruitment 2-5
2.4.3 Cognitive interviewing procedure 2-7
2.5 Analytic strategy and assessment process 2-9
2.6.1 Common themes regarding organizational differences for the response process 2-12
2.6.2 Common themes regarding cross-sector applicability of question wording 2-16
2.6.4 Justification for removing items from the Question Bank 2-21
2.7 Finalizing the Question Bank for Public Sector Questions 2-21
3. Private Sector Testing 3-1
3.2 Identifying and revising private sector questions for retesting 3-1
3.3 Cognitive interview protocol development 3-2
3.4 Cognitive Testing Data Collection 3-3
3.4.1 Population of interest and sampling strategy 3-3
3.4.2 Participant recruitment 3-4
3.4.3 Cognitive interviewing procedure 3-9
3.5 Analytic strategy and assessment process 3-10
3.6.1 Common themes regarding organizational level burden 3-12
3.6.2 Common themes regarding cross-sector applicability of questions 3-13
3.6.3 Common themes regarding the utility of examples 3-17
3.6.4 Justification for removing items from the Question Bank 3-19
3.7 Finalizing the Question Bank for Private Sector Questions 3-19
4. Conclusion 4-1
5. References 5-1
A. Summary of Final Disposition Codes for Surveys by American Association for Public Opinion Research A-1
B. EEIC Cognitive Interview Protocols for testing with the Public Sector B-1
B-1. EEIC Cognitive Interview Protocol for testing with the Public Sector (Round 1, Wave 1) B-2
B-2. EEIC Cognitive Interview Protocol for testing with the Public Sector (Round 1, Wave 2) B-19
B-3. EEIC Cognitive Interview Protocol for testing with the Public Sector (Round 1, Wave 3) B-35
B-4. EEIC Cognitive Interview Protocol for testing with the Public Sector (Round 2, Wave 1) B-52
C. EEIC Cognitive Interview Protocols for testing with the Private Sector C-1
C-1. EEIC Cognitive Interview Protocol for testing with the Private Sector (Wave 1) C-2
C-2. EEIC Cognitive Interview Protocol for testing with the Private Sector (Wave 2) C-17
C-3. EEIC Cognitive Interview Protocol for testing with the Private Sector (Wave 3) C-33
D. Recruitment and Interview Materials D-1
D-1. Template of Initial Contact Email Used During Cognitive Interview Recruitment D-2
D-2. Template of Confirmation Email Used During Cognitive Interview Appointments D-3
D-3. Consent Form Used During Cognitive Interviews D-4
E. Questions Not Included in the Question Bank (Questions Classified as “Not Acceptable”) E-1
Exhibits
2.1. Instrument Arrangement for EEIC Cognitive Testing with the Public Sector 2-2
2.2. Recruitment Outcomes by Testing Rounds for the Public Sector 2-6
2.3. EEIC Testing Public Sector Participants Descriptive Statistics 2-8
2.4. Classification Criteria for Questions Cognitively Tested 2-10
2.5. Summary of Testing Results for Public Sector Testing 2-12
3.1. Instrument Arrangement for EEIC Cognitive Testing with the Private Sector 3-3
3.2. Recruitment Outcomes by Testing Rounds for the Private Sector 3-5
3.3. Outcomes from Information Gathering Research 3-7
3.4. EEIC Testing Private Sector Participants Descriptive Statistics 3-9
3.5. Summary of Testing Results for Private Sector Testing 3-12
Research Objective
Through a contract between WhirlWind Technologies, LLC and RTI International (RTI), the Census Bureau commissioned RTI to develop, cognitively test, and finalize a Question Bank of cognitively tested survey questions appropriate for gauging and monitoring the economic impact on U.S. businesses and organizations of regional, national, or international emergencies (research conducted under OMB number 0607-0725). The final product of this work is Emergency Economic Information Collections (EEIC) Question Bank v1.0, which includes 173 unique questions that can be included in federally administered surveys without further OMB approval; the majority of these questions were cognitively tested with employees of businesses in the private sector while some were tested with employees of agencies in the public sector. EEIC Question Bank v1.0 can be found here.
After the conclusion of EEIC Question Bank v1.0, additional cognitive testing was conducted for EEIC Question Bank v2.0 under two main tasks: (1) testing survey questions with employees of agencies in the public sector and (2) testing survey questions with employees of businesses in the private sector. The survey questions tested with agencies in the public sector included existing questions in EEIC Question Bank v1.0 and newly developed questions. The survey questions tested with businesses in the private sector included questions classified as “further testing required” or “not acceptable for inclusion” after EEIC Question Bank v1.0 testing was completed. After cognitive testing with the public and private sectors was complete, the team determined which questions would enter the EEIC Question Bank v2.0.
Methodology and Outcomes
Task 1: Cognitive Testing with the Public Sector
The goal of this first task was to expand the EEIC Question Bank v1.0’s coverage to include additional items tested with public sector agencies. This was accomplished by (1) testing whether relevant items from Question Bank v1.0 work well with employees from public sector agencies and (2) developing and testing new questions specific to the public sector to increase the bank’s topical coverage.
This process resulted in 28 questions for cognitive testing. These questions were tested across two rounds. All 28 questions were tested with public sector agencies in Round 1. Round 2 was reserved for retesting questions that did not perform well in Round 1. These items were revised between rounds based on analysis of Round 1 interview results.
To complete this testing, 60 cognitive interviews were conducted from December 19, 2022, to July 24, 2023, across four waves and two rounds of data collection (three waves in Round 1 and one wave in Round 2). Each wave of data collection included 15 participants and tested between 8 and 11 questions, grouped by topic.
General findings from cognitive testing included the following:
The organization of public sector agencies impacted the response process, especially considering differences in agency size, location (i.e., urban vs rural), and type of agency (e.g., special district, municipality).
“Paid employees” was a difficult construct for public agencies because there are a variety of positions that include no pay (e.g., volunteers) or limited pay (e.g., elected board members who receive a small stipend).
Public sector respondents struggled to understand technical terms about business operations and language used by human resources (HR), such as capital expenditures and overtime.
Some language did not translate well from the private sector to the public sector, such as references to local governments and schools, and the topics of layoffs, recruitment activities, and demand.
It was difficult to develop new questions to measure concepts of interest focused on specialized language from the Census Bureau, such as agency functions, funding, and authority over an agency’s budget.
After cognitive testing from both rounds, 21 questions were classified as acceptable for use with the public sector and 8 questions as unfit for use with the public sector.
Task 2: Cognitive Testing with the Private Sector
At the conclusion of work on Question Bank v1.0, several items did not perform well enough in cognitive testing to be included in the first version of the bank. It was determined that further revisions and testing were required before these items could be included in the bank. The goals of this second task were to (1) revise these questions to address issues discovered during Question Bank v1.0 testing and (2) retest these revised items for potential inclusion into Question Bank v2.0.
This process resulted in 24 questions for cognitive retesting. Unlike the public sector questions, most of these private sector questions had been previously tested. Therefore, they were tested in only one more round for Question Bank v2.0.
Forty-five cognitive interviews were conducted from May 11, 2023, to June 19, 2023, across three waves of data collection. Each wave of data collection included 15 participants and tested 8 to 10 questions, grouped by topic.
General findings from cognitive testing included the following:
Companies with multiple establishments, or locations, faced increased response burden for questions that asked them to report about differential changes across establishments.
Findings from the public sector Round 1 testing were applicable to the private sector and helped us uncover some additional concerns about these topics.
Comprehension problems for some items existed across sectors even though the reasons for these difficulties were sector-specific.
It was difficult to adapt all questions to work across industries, especially regarding manufacturers and non-profits.
Including examples for certain terms proved useful.
After cognitive testing, 23 questions were classified as acceptable for inclusion in Question Bank v2.0, and one question was excluded from the question bank.
Since the onset of the coronavirus pandemic, the Census Bureau has sought to measure the effect of the pandemic on U.S. businesses through supplemental questions added to several of its recurring business surveys. In 2021, the Census Bureau sought a new generic Office of Management and Budget (OMB) clearance for conducting Emergency Economic Information Collections (EEIC). This generic clearance provided an avenue to collect quality data during future unanticipated emergent events (e.g., natural or human-made disasters, pandemics or other health emergencies, civil unrest or insurrection, or other emergency events). Through a contract between WhirlWind Technologies, LLC and RTI International (RTI), the Census Bureau commissioned RTI to develop, cognitively test, and finalize a Question Bank of cognitively tested survey questions appropriate for gauging and monitoring the economic impact on U.S. businesses and organizations of regional, national, or international emergencies (research conducted under OMB number 0607-0725). The final product of this work is EEIC Question Bank v1.0, which includes 173 unique questions; the majority of these questions were cognitively tested with employees from businesses in the private sector while some were tested with employees from agencies in the public sector. EEIC Question Bank v1.0 can be found here.
After the conclusion of EEIC Question Bank v1.0, additional cognitive testing was conducted for EEIC Question Bank v2.0 under two main tasks: (1) testing survey questions with agencies in the public sector and (2) testing survey questions with businesses in the private sector. The survey questions tested with public sector agencies included existing questions in EEIC Question Bank v1.0 and newly developed questions. The survey questions tested with private sector businesses included questions classified as “further testing required” or “not acceptable for inclusion” after EEIC Question Bank v1.0 testing was completed. After cognitive testing with the public and private sectors was complete, the team determined which questions would enter the EEIC Question Bank v2.0. This report summarizes the scope of, process for, and outcomes from each of these tasks.
The cognitive testing tasks (both with the private and public sectors) explored whether representatives of businesses/government agencies in the United States understood the survey questions as intended and were able to provide the requested data. This research was conducted under the generic clearance for questionnaire pretesting research (OMB number 0607-0978). The findings from the cognitive interviews enabled the Census Bureau to:
Understand participants’ thought processes and memory demands when answering EEIC questions; and
Understand the record-keeping practices and intraorganizational data retrieval process to answer EEIC questions, reflecting response burden at both individual and organizational levels.
The study results informed the final design of and determined which survey questions would be included in the Question Bank v2.0 for the proposed OMB generic clearance for EEIC.
Cognitive interviewing is a method for evaluating survey questions that assesses how participants from a population of interest understand, mentally process, and respond to the questions presented (Willis, 2004). Cognitive interviewing involves identifying questions of interest to test, creating probes to use during the interview to evaluate the questions, and finally analyzing the qualitative data collected during the interview process to improve the questions' construction. The number of interviews required to reach saturation (i.e., the point where sufficient data have been collected to make valid conclusions) can vary depending on the diversity of the population of interest. Because reaching saturation was impractical given resource, time, and budget constraints, the project team agreed that a minimum of 15 completed interviews would suffice, based on the Census Bureau's experience testing establishment survey questions across businesses and agencies of different sizes and industries or types that complete economic surveys for the Census Bureau. This cognitive testing project used a purposive sample, selected to meet these criteria (i.e., business size and industry or agency type), based on the research questions. Thus, the resulting sample should not be considered statistically representative, and it is inappropriate to use the sample to make statistical inferences about a specific target population. Furthermore, participants completed this study in an artificial, laboratory-like setting with an abbreviated set of questions about the effect of an emergency event on their business. Therefore, these findings may not generalize to realistic survey conditions.
Items in Question Bank v1.0 were mainly tested with employees from private sector businesses, with only a small number of questions tested with public agencies that process and grant permits. The goal of this first task was to expand the bank’s coverage to include additional items tested with public sector employees from across a variety of agencies. This was accomplished by (1) testing whether relevant items from Question Bank v1.0 work well with employees from public sector agencies and (2) writing and testing new questions specific to the public sector.
This process resulted in 28 questions for cognitive testing. These questions were tested across two rounds. All 28 questions were tested with public sector agencies in Round 1. Round 2 was reserved for retesting questions that did not perform well in Round 1. These items were revised between rounds based on analysis of Round 1 interview results.
The sections below detail how questions were developed, tested, analyzed, and (when appropriate) added as items to Question Bank v2.0.
Development of the public sector questions began with a collaborative effort between all members of the project team. Census Bureau economic subject matter experts (SMEs) reviewed items in Question Bank v1.0 (i.e., items that had only been cognitively tested with employees in the private sector) and identified questions that might also be applicable to agencies across the public sector. The SMEs also drafted new questions to expand the topical coverage of public sector items in the Question Bank. These new questions covered topics such as contact information, reasons for permanent closures, changes to budget, and changes to the primary function of the agency. Although the questions included for public agency testing in Question Bank v1.0 focused only on permits, the goal of the Question Bank v2.0 public sector testing was to develop public sector questions that could be used on agency-wide surveys.
After compiling the initial set of draft survey questions, an iterative expert review approach was used to assess and improve the draft questions. First, survey methodologists revised the existing items identified from Question Bank v1.0 to ensure they were applicable to the public sector (e.g., replacing phrases like “goods and/or services” with just “services”). Next, survey methodologists reviewed each of the newly drafted questions and provided recommendations for methodological changes (e.g., question wording, response options, formatting) if questions did not conform to best practices in the survey methodology literature (Dillman et al., 2014). SMEs then provided further recommendations for revision and questions were remediated using this feedback, which was followed by one final review. This process resulted in 28 questions (22 from Question Bank v1.0, 6 new) moving on to the next step in the research process.
The 28 questions were organized into three waves (Exhibit 2.1), based on question topic. Most questions were tested in only one wave. However, there were two notable reasons for exceptions.
First, during the identification and development process, small revisions were made to the questions that were selected from Question Bank v1.0 (i.e., questions that were originally tested with the private sector) to ensure that they were applicable to the public sector (e.g., replacing phrases like “goods and/or services” with just “services”). However, three of the questions selected from Question Bank v1.0 ask about concepts that are conceptually different across the private and public sectors. To maximize our opportunity to collect sufficient participant feedback, we desired a larger analytic sample. Therefore, we tested these three questions across two waves rather than just one. Using the qualitative findings collected from a larger sample allowed us to make stronger conclusions about results for these questions.
Second, two of the newly written questions asked about a complicated concept (i.e., changes to an organization’s funding/finances because of a natural disaster). We similarly desired a larger analytic sample to allow us to make stronger conclusions about results for these questions and tested them across two waves rather than just one.
After Round 1 cognitive testing was completed, eight questions were selected to be re-tested in Round 2, during one wave of data collection.
Exhibit 2.1. Instrument Arrangement for EEIC Cognitive Testing with the Public Sector
Testing Group | No. of Questions Tested
Round 1 |
Wave 1 | 8
Wave 2 | 7
Wave 3 | 8
Questions tested across multiple waves | 5
Round 1 Total | 28
Round 2 | 8
Once the draft questions were organized into waves within each round of testing, a protocol was developed for each wave’s cognitive interview. The EEIC cognitive interviews followed a semi-structured interview protocol using the concurrent probing technique. Appendix B includes all cognitive interviewing protocols used with the public sector sample.
When developing the protocols, the fill variables denoted in the draft questions (e.g., “In <time period>” or “<business/agency/etc.>”) had to be operationalized for testing. To maintain consistency of the operationalization for the entire study, decisions made about the terms used in each fill depended on the context of an individual question, the projected timeframe within which the interviews were being conducted, and any findings from previous waves of testing.
Questions were also written for testing using the generic phrase “the natural disaster” as the wording for the <event> fill. During the cognitive interview, interviewers explained to participants that this phrase was intended as a placeholder and would be filled in with something more specific once the questions were finalized, so that interviewers could guide participants to think about whatever event would be most relevant to their agency.
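The fill-variable operationalization described above amounts to substituting wave-specific wording for each fill marker in a draft question. The sketch below illustrates the idea; the fill names and chosen values are hypothetical examples, not the project's actual operationalization decisions.

```python
# Illustrative sketch of operationalizing fill variables in a draft question.
# The question text and fill values below are assumptions for illustration only.

DRAFT_QUESTION = (
    "In <time period>, did <business/agency> change the services it "
    "provided because of <event>?"
)

# One operationalization per testing wave; "<event>" keeps the generic
# placeholder wording ("the natural disaster") used during testing.
FILLS = {
    "<time period>": "the last 12 months",
    "<business/agency>": "your agency",
    "<event>": "the natural disaster",
}

def operationalize(question: str, fills: dict) -> str:
    """Replace each fill marker with its wave-specific wording."""
    for marker, value in fills.items():
        question = question.replace(marker, value)
    return question

print(operationalize(DRAFT_QUESTION, FILLS))
```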
Probing questions were written to focus on each step in the survey response process for establishment surveys (Bavdaž, 2010; Bavdaž et al., 2015; Jenkins & Dillman, 1997; Tourangeau et al., 2000; Willimack & Snijkers, 2013):
Perception (Does the respondent see the question?)
Comprehension (Does the respondent understand the meaning of terms and phrases used in the question and response options?)
Retrieval (Does the respondent have difficulty locating the information required to answer the question—either from their own memory, from another employee at the company, or from databases/records?)
Judgment (Does the respondent have any difficulty forming their answer? Do they edit their answer?)
Response (Does the respondent have any difficulty mapping their answer onto the available response options?)
Some probing questions were written to ask respondents about organizational-level burden (Bavdaž, 2010); that is, agency practices, policies, and other record-keeping or information retrieval processes that occur at the agency level (i.e., Does the agency maintain a record for the requested information? How does the respondent retrieve the record and what level of effort is needed?). These questions were asked hypothetically (e.g., If you were answering this question on a Census Bureau survey, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?). However, during the interviews, some respondents went to the database where the record was stored so that they could provide an actual answer to the survey question. This concept was key because the questions being tested might be added to already existing Census Bureau surveys, and the goal was to limit respondent burden.
As described in Section 2.1, the questions identified for Round 1 testing with the public sector included both questions from Question Bank v1.0 and newly written questions. For the newly written questions, probes focused on all steps in the response process. For questions from Question Bank v1.0, probes mainly focused on comprehension and individual- and organizational-level burden. For Round 2 testing, probing questions were written to focus on changes made to the question after testing in Round 1.
The population of interest for public sector cognitive testing consisted of local government agencies that had previously responded to Census Bureau surveys like the Government Units Survey (GUS). These local government agencies consist of five types: county, municipal, township, special district, and school district. The sampling frame was a list of local agencies on the Government Master Address File, which is a Census Bureau–maintained universe of state and local governments. This sampling frame included the contact information, which typically consisted of name, address, phone number, and email address of the past respondent at a given agency. However, not every agency had identified a point of contact for the agency and instead included more limited information, such as a phone number and/or email address. The sampling frame also included some essential agency information such as the agency type and the agency function if it was a special district.
To prepare for the agency sample recruitment, the frame was randomly partitioned into replicates of about 100 agencies, stratified by agency type. After creating the replicates, each replicate was reviewed for duplicate contact persons, which occur when one person is responsible for reporting for multiple agencies. Each contact person was contacted only once for recruitment; in those cases, the agency they were asked about was selected randomly when the replicates were created. Additionally, housing authorities were initially included in the sampling file but were removed from eligibility after it was discovered that some housing authorities operate as private businesses. These replicates were then contacted—initially by email and later by phone—and invited to participate in the research study (see Section 2.4.2 for details).
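The frame-preparation steps above (stratified partition into replicates of roughly 100 agencies, then one randomly selected agency per duplicate contact person) can be sketched as follows. The field names (`agency_type`, `contact`) and frame structure are hypothetical; the actual Government Master Address File processing surely differed in detail.

```python
import math
import random
from collections import defaultdict

def make_replicates(frame, size=100, seed=42):
    """Partition a sampling frame into replicates of about `size` agencies,
    stratified by agency type so each replicate gets a proportional mix."""
    rng = random.Random(seed)
    n_reps = max(1, math.ceil(len(frame) / size))
    strata = defaultdict(list)
    for agency in frame:
        strata[agency["agency_type"]].append(agency)
    # Shuffle within each stratum, then deal agencies to replicates round-robin.
    replicates = [[] for _ in range(n_reps)]
    i = 0
    for agencies in strata.values():
        rng.shuffle(agencies)
        for agency in agencies:
            replicates[i % n_reps].append(agency)
            i += 1
    return replicates

def dedupe_contacts(replicate, seed=42):
    """Keep one randomly selected agency per contact person, so a person
    who reports for several agencies is recruited only once."""
    rng = random.Random(seed)
    by_contact = defaultdict(list)
    for agency in replicate:
        by_contact[agency["contact"]].append(agency)
    return [rng.choice(agencies) for agencies in by_contact.values()]
```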
As stated in Section 1.1.1, this project used a purposive sample; therefore, the resulting sample should not be considered statistically representative, and it is inappropriate to use the sample to make statistical inferences about a specific target population.
As described in Section 1.1.1, the goal of participant recruitment for individual waves was to conduct 15 total interviews, with an estimated length of 45 minutes per interview.
The project team recruited participants for cognitive testing and conducted interviews with public sector employees from December 14, 2022, to January 25, 2023, in Round 1 and from June 30, 2023, to July 24, 2023, in Round 2. School districts were excluded from recruitment in Round 2 because the data collection period was during the summer months and the point of contact at schools would be difficult to reach.
Recruitment efforts focused on contacting and recruiting participants from one replicate of sampled agencies at a time. We emailed each replicate an invitation to participate in the interview, and within about 1 week of sending the first email, a recruiter called each sampled agency. All recruiters and interviewers used Census Bureau email accounts (i.e., [email protected]) when contacting sampled participants. We sent a second email to sampled agencies that did not respond to the first email and phone call. All contact attempts were addressed to the identified contact person in the sampling file, when available. Sampled agencies that agreed to participate by phone or email were then scheduled for an interview; sampled agencies that refused by phone or email were removed from further contact.
In cases where the original past respondent was no longer the point of contact for the sampled agency (i.e., the listed contact person had retired, died, changed positions, or was no longer employed at the agency), or where no point of contact was provided, interview staff attempted to identify a replacement and seek their cooperation for the cognitive interview. However, some of these new employees said they lacked the knowledge or confidence to answer our questions and declined to participate.
Recruitment for participation in each round of testing overlapped, such that some participants who were contacted in Round 1 but did not complete an interview or refused to participate were sometimes contacted again in Round 2. Overall, staff were able to establish contact (i.e., completed interview, refusal, no-show, withdrawal during the interview, or reschedule) with approximately 16% of the sampled agencies in Round 1 and approximately 24% in Round 2. In addition, we were able to complete interviews with approximately 8% of all eligible sampled agencies in Round 1 and approximately 9% in Round 2. See Exhibit 2.2 and Appendix A for details about the recruitment outcomes for the study.
Exhibit 2.2. Recruitment Outcomes by Testing Rounds for the Public Sector
 | Round 1 (Waves 1–3) | Round 2 (Wave 1)
Recruitment start date | 12/14/2022 | 6/30/2023
Interviews start and end dates | 12/19/2022–1/25/2023 | 7/10/2023–7/24/2023
Number of sampled units with at least one contact attempt | 565 | 171
Number of unknown eligibility* | 13 | 4
Number of ineligible* | 0 | 0
Total number of eligible cases** | 552 | 167
Number of refused* | 31 | 16
Number of no-shows* | 10 | 5
Number of withdrawals* | 4 | 4
Number of completed interviews* | 45 | 15
Number of eligible cases, non-contact* | 462 | 127
Cooperation rate*** | 8.2% | 9.0%
Contact rate**** | 16.7% |
*See Appendix A for the definitions of each final disposition code (AAPOR, 2016)
**For these purposes, a sampled unit was considered eligible if at least one email was sent and the email did not bounce back, regardless of whether the phone number was called.
***AAPOR Cooperation Rate 1 (COOP1), or the minimum cooperation rate, is the percentage of eligible cases with a completed interview.
****AAPOR Contact Rate 3 (CON3) is the percentage of eligible cases in which some responsible member of the sampled unit was reached by the survey (i.e., completed interview, refused, no-show, withdrawal during interview, and rescheduled).
Note: The number of sampled units where contact was attempted was not mutually exclusive by round. As recruitment efforts overlapped between rounds, sampled units may have been contacted during multiple rounds. The number of completed interviews is mutually exclusive by round; each sampled unit could be interviewed only once.
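The cooperation rates in Exhibit 2.2 can be reproduced directly from the counts in the table using the AAPOR COOP1 definition given in the footnotes (completed interviews as a percentage of eligible cases). A minimal arithmetic check in Python:

```python
# Reproducing the cooperation rates in Exhibit 2.2 via AAPOR COOP1:
# completed interviews as a percentage of total eligible cases.
rounds = {
    "Round 1": {"eligible": 552, "completed": 45},
    "Round 2": {"eligible": 167, "completed": 15},
}
for name, counts in rounds.items():
    coop1 = 100 * counts["completed"] / counts["eligible"]
    print(f"{name}: COOP1 = {coop1:.1f}%")  # 8.2% and 9.0%, matching the exhibit
```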
The sample for the Question Bank v2.0 effort included all types of local government agencies, rather than only those that process and grant permits, as in Question Bank v1.0 testing. In Question Bank v1.0, we had an exceptionally high cooperation rate (21.1%) with the permit agencies and expected a similarly high cooperation rate for a general agency sample. However, this proved not to be the case: cooperation rates were closer to those of the private sector sample than expected (as illustrated in Exhibit 2.2). It is unclear why the general agency sample had a substantially lower cooperation rate than the permit agencies.
In addition to the overall low cooperation rate, there were other challenges in recruiting public sector agencies. When we contacted an agency and explained that the interview topic was the impact of natural disasters, many points of contact did not believe they were the best fit for the topic and tried to direct us to their local government emergency coordinators instead. After we explained that the goal of the interviews was to speak with the person who normally fills out Census Bureau surveys, some of these contacts agreed to complete an interview, but not all.
Another recruitment issue was that some points of contact report for multiple agencies, especially in rural areas, where one person may wear many hats. As described in Section 2.4.1, we removed instances where multiple agencies listed the same point of contact to simplify recruitment: each person was contacted only once and asked about their experiences reporting for one given agency. During the interview, interviewers collected data about how many agencies each participant reported for and asked the participant to focus on how they would report for one given agency. Most completed interviews were with participants who reported for only one agency.
All cognitive interviews were conducted remotely (as opposed to in person) using Microsoft Teams. When faced with technical difficulties, some interviews were completed through a direct phone call to the participant. Participants and interviewers used audio only and were not required to turn on their video cameras.
The study consent and the testing instrument for each wave, including an introduction to the study, were programmed as stand-alone web instruments, hosted by Qualtrics. Once a participant was scheduled with an interviewer, the participant was sent an email confirming the appointment. This email contained the link to the consent form and a link to the testing instrument. In addition to the protocol and survey questions themselves, other study materials included invitation and confirmation emails, along with the consent form that had been approved by the Census Bureau’s Policy Coordination Office and OMB and used by Census Bureau researchers. See Appendix D for recruitment and interview materials, including templates of the invitation and confirmation emails that recruiters used (Appendices D-1 and D-2) and the programmed consent form (Appendix D-3).
During each interview, the interviewer went through the following procedure with the participants:
Verified that the participant had completed the consent form and confirmed consent to participate and consent to be recorded.
Began recording the interview (with participant consent) and re-verified both consents with the recording turned on.
Instructed the participant to navigate to the Qualtrics survey and reviewed the introduction page. During this introduction, the interviewer explained the purpose of the interview and instructed the respondent to “think aloud” as they answered (i.e., tell the interviewer what they are thinking about as they are reading and answering each question, noting if anything is confusing or unclear to them) and provided an example of thinking aloud.
Asked the participant several background information questions, including the type of agency, the participant’s title, and their years of experience at the agency.
Asked the participant about their previous experience with natural disasters or other emergency events and encouraged the participant to either think about that experience or think hypothetically about a future event while answering the questions.
Reviewed questions and probes with the participant, using concurrent probing (i.e., participants reviewed a question, interviewer asked all probing questions for that survey question, then participant moved on to the next question).
Asked concluding questions. These included asking participants whether they had any other comments about the survey questions, asking for additional information about organizational-level burden (e.g., whom else they would need to ask for input to answer the questions, and how much effort it would take to coordinate with them), and asking what device the participant was using to answer the survey.
Thanked the participant for their time and ended the recording and interview.
After the interview concluded, interviewers uploaded the recording to a secure folder within the Census Bureau’s protected network and entered their notes into a combined spreadsheet for each wave. Sixty cognitive interviews were conducted across both rounds of testing, with interviews lasting an average of approximately 35–40 minutes. Consistent with the Census Bureau’s usual practice for cognitive interviews with agencies, the study did not offer monetary incentives for research participation. See Exhibit 2.3 for descriptive statistics of participants by round of testing.
Exhibit 2.3. EEIC Testing Public Sector Participants Descriptive Statistics
|                                        | Round 1 (Waves 1–3) | Round 2 (Wave 1) |
| Number of interviews                   | 45                  | 15               |
| Length of interview (in minutes)       |                     |                  |
|   Average (Std. Dev.)                  | 40.0 (9.5)          | 35.9 (12.4)      |
|   Min                                  | 25                  | 18               |
|   Max                                  | 64                  | 60               |
| Agency Type                            |                     |                  |
|   County                               | 2 (4.4%)            | 1 (6.7%)         |
|   Township                             | 6 (13.3%)           | 2 (13.3%)        |
|   Municipal                            | 12 (26.7%)          | 6 (40.0%)        |
|   School District                      | 2 (4.4%)            | 0 (0.0%)         |
|   Special District                     | 23 (51.1%)          | 6 (40.0%)        |
| Role/Title of participant              |                     |                  |
|   Management (e.g., Director, Manager, Administrator, Supervisor)     | 22 (48.9%) | 7 (46.7%) |
|   Accounting/Finance (e.g., Secretary/Treasurer, bookkeeper, payroll) | 9 (20.0%)  | 3 (20.0%) |
|   Clerk                                | 8 (17.8%)           | 4 (26.7%)        |
|   Other (e.g., Superintendent, Consultant, Trustee, Commissioner)     | 6 (13.3%)  | 1 (6.7%)  |
| Length of time at agency (in years)    |                     |                  |
|   Average (Std. Dev.)                  | 9.1 (10.7)          | 7.3 (7.2)        |
|   Median                               | 4.75                | 6                |
|   Mode                                 | 2                   | 4                |
| Time Zone (of participant)             |                     |                  |
|   Central                              | 18 (40.0%)          | 8 (53.3%)        |
|   Eastern                              | 14 (31.1%)          | 3 (20.0%)        |
|   Mountain                             | 5 (11.1%)           | 1 (6.7%)         |
|   Pacific                              | 8 (17.8%)           | 3 (20.0%)        |
As explained previously, after each interview interviewers wrote up detailed notes and entered them into a centralized Excel spreadsheet, which was used to analyze the data collected about each question. This qualitative analysis used the survey response process (Bavdaž, 2010; Bavdaž et al., 2015; Jenkins & Dillman, 1997; Tourangeau et al., 2000; Willimack & Snijkers, 2013) as a guiding framework for understanding issues that may prevent respondents from easily producing appropriate or accurate answers to the questions (i.e., problems preventing quick completion of the response process). This is the same framework used to develop the cognitive interview protocols.
Using this framework, methodologists analyzed the interview notes for individual questions to reveal issues that may prevent survey respondents from producing appropriate or accurate answers with a reasonable level of effort. Methodologists also considered the type of government agency during analysis to discover whether any types of agencies might have more difficulty with the questions. For example, respondent burden may be low for a small government agency with a staff of three but high for a large municipality with a staff of 150.
Based on the assessment of the findings, staff classified each question into one of three categories: few minor issues, some minor and/or moderate issues, or multiple moderate and/or major issues. These classifications reflected both the quantity (i.e., few, some, a lot) and severity (i.e., minor, moderate, major) of the issues uncovered during testing. Minor issues were generally problems in the response process mentioned by only one participant, or easy-to-fix problems that did not affect burden and still allowed participants to reach a final answer consistent with the question’s intended meaning. Moderate issues were problems in the response process mentioned by multiple participants that suggested increased response burden while still allowing participants to reach a final answer; these answers, however, were not always consistent with the intended meaning of the question. Some questions were also classified as having moderate issues when we needed SMEs to weigh in on the problems observed during testing; the SME input helped us decide whether the issues were too severe for the question to be included in Question Bank v2.0. Major issues were problems that caused a stoppage in the response process and prevented participants from reaching a final answer (e.g., data requested does not exist in records, large variation in participant comprehension of key terms). The general criteria for classifying questions are summarized in Exhibit 2.4, which details the quantity and severity of problems associated with each classification.
Exhibit 2.4. Classification Criteria for Questions Cognitively Tested
| Few Minor Issues                                                | Some Minor and/or Moderate Issues                                     | Multiple Moderate and/or Major Issues                                 |
| Clear to most participants                                      | Not clear to some participants                                        | Not clear to most participants                                        |
| Small number of minor, easy-to-fix issues                       | Multiple minor issues and/or a small number of moderate issues        | Multiple moderate issues and/or at least one major issue              |
| Most participants could easily produce and enter a final answer | Some participants had difficulty producing or entering a final answer | Most participants had difficulty producing or entering a final answer |
|                                                                 |                                                                       | Question had a high degree of organizational-level burden             |
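The criteria in Exhibit 2.4 can be read as a decision rule over the counts of minor, moderate, and major issues observed for a question. The thresholds below are our reading of the report's criteria, offered as an illustrative sketch rather than an official classification algorithm.

```python
def classify_question(minor, moderate, major, high_org_burden=False):
    """Return the assessment category for a tested question, per Exhibit 2.4.

    Treats "multiple" as more than one and "a small number" as one;
    these cutoffs are assumptions, not stated in the report.
    """
    if major >= 1 or moderate > 1 or high_org_burden:
        return "multiple moderate and/or major issues"
    if moderate == 1 or minor > 1:
        return "some minor and/or moderate issues"
    return "few minor issues"

# A question with two moderate issues lands in the most severe category.
category = classify_question(minor=0, moderate=2, major=0)
```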
In addition to the testing summaries and the assessment of individual questions, staff proposed recommendations for further improvement for most questions regardless of their classification, including a revised draft when needed. For some questions, multiple revised versions were provided for the project team to discuss and choose which should move forward (either to the Question Bank or to Round 2 testing). The recommendations also included instructions advising users of the Question Bank on how to administer EEIC survey questions in their research.
In each round of testing, the results of the analysis for each tested question and recommended question changes were reviewed to determine if any adjustments were needed. The project team met to discuss these results and decide on the next steps for each question. The goal for Round 1 was to (1) identify questions that were acceptable for inclusion in the Question Bank as is for the public sector or had very minor changes and could move on to the Question Bank without further testing; (2) identify and diagnose questions that would need further revision and testing in Round 2; (3) identify questions that were not acceptable for use with the public sector even though they worked well for the private sector; and (4) deliberate on and finalize any necessary revision to be implemented for the next steps.
Round 2 analysis focused on a final decision for each question: either (1) the implemented changes resolved the issues identified in Round 1 and the question was included in the Question Bank, or (2) the question in its current form continued to exhibit issues such that any recommended revisions would require additional testing, so it was not ready for inclusion in the Question Bank. Because no further testing was planned at this time, we made no recommendations for changes to items still exhibiting severe issues.
This section begins by summarizing the assessment of the testing outcomes and the final decisions for the questions by each round and wave of testing. It then presents a discussion of the common themes identified across individual testing waves, followed by a diagnosis of the likely reasons that some questions performed poorly enough to be excluded from the Question Bank.
Exhibit 2.5 summarizes the testing results of the questions tested for the public sector for Question Bank v2.0. After Round 1, it was determined that 14 questions had few or no issues and could be included in the Question Bank either as is or with minor revisions. The team also identified seven questions with varying degrees of issues to be addressed; these underwent additional revision and moved on to Round 2 for retesting. In addition to these seven questions, one question was pulled from the existing private sector questions in Question Bank v1.0 to test with the public sector after the initial question chosen for the public sector tested poorly. Seven questions that performed poorly during Round 1 were excluded from the Question Bank. See Section 2.6.4 for justifications for excluding questions from the Question Bank and Appendix E for the details of these questions.
After Round 2, analysis supported including seven additional questions in the Question Bank and excluding one question because of poor performance and the need for further revision and testing. This resulted in 21 questions recommended for use with the public sector, in addition to the questions included in Question Bank v1.0.
Some of the issues uncovered during testing with the public sector during Round 1 raised new concerns about three questions already in Question Bank v1.0 after being tested with the private sector. These three questions were retested with the private sector and findings from both private and public sector testing were considered when making final recommendations. See Section 3.6 for further discussion of the private sector testing.
Exhibit 2.5. Summary of Testing Results for Public Sector Testing
|                  |           | Assessment Classification                | Final Decision                                                  |
|                  | Qs Tested | Few Issues | Some Issues | Major Issues   | Accepted to Q Bank | Moved to Round 2 | Excluded from Q Bank |
| Round 1          |           |            |             |                |                    |                  |                      |
|   Wave 1         | 8         | 5          | 3           | 0              | 6                  | 2                | 0                    |
|   Wave 2         | 7         | 2          | 5           | 0              | 4                  | 2                | 1                    |
|   Wave 3         | 8         | 2          | 3           | 3              | 3                  | 2                | 3                    |
|   Multiple waves | 5         | 0          | 2           | 3              | 1                  | 1                | 3                    |
| Round 1 Total    | 28        | 9          | 13          | 6              | 14                 | 7                | 7                    |
| Round 2          | 8         | 1          | 6           | 1              | 7                  | —                | 1                    |
As in the previously completed private sector testing, we observed several substantial differences among public sector agencies depending on their size and type. One stark contrast was between small, rural agencies and large, urban agencies. Although the organizational structure and functionality of larger, multi-function agencies are similar to those of private businesses or organizations, the organizational structures of smaller, rural single- and multi-function agencies are more diverse. Many smaller, rural agencies rely on volunteers to run the agency, and some have no paid employees. Smaller, rural special district agencies typically exist to perform one specific function in the community, such as sourcing ambulances for the county or managing the cemetery. The duties of these small agencies are sometimes limited, such as monthly meetings or yearly budget reviews.
As mentioned when describing the recruitment process in Section 2.4.1, some special districts have one point of contact serving multiple small agencies. Meanwhile, larger, multi-function agencies, such as municipalities or townships, act much more like private businesses, with many paid employees typically operating out of a central location. These larger, multi-function agencies provide many different services to the community, such as utilities, parks, public transportation, and other general government services.
These differences by size and agency type mean that asking one question that works well across all public sector agencies can be very difficult. We primarily observed issues for small special district agencies, which are highly specialized and vary significantly in their organizational structure. Findings were discussed with the SMEs to understand how the Census Bureau typically accommodates the varying structure of agencies. The challenges observed with asking questions across agencies were similar to the challenges that the Census Bureau often faces with surveying government agencies. Additionally, the SMEs provided context about the known measurement error that they face in their estimates, especially around topics such as number of paid employees and expenses. Additional notes regarding these findings were added to the accepted public questions in Question Bank v2.0 to ensure survey designers can consider all relevant information when selecting questions.
In addition to differences among the different types and sizes of agencies, we also saw key differences compared with the private sector, primarily around questions asking about closures. Almost all public agencies interviewed indicated it would be extremely unlikely that they would close, even temporarily, during a natural disaster. Instead, these agencies’ role during natural disasters tends to increase, taking on additional functions within the community, such as distributing disaster supplies or converting government buildings (e.g., libraries, office space) to provide shelter. Public agencies also reported that it was typical for staff (paid or volunteer) to work as much as possible, including significant overtime, to make sure the community can manage the impacts from the disaster and recover. Although there may still be instances when asking about closures is important, there is likely limited analytic utility for these questions in the public sector; however, this will depend on the type of natural disaster.
Although the concept of paid employees was well-understood for larger, multi-function agencies, small, rural agencies tended to have more confusion about this term. In small, rural agencies, especially special districts, there might only be a handful of staff, some or all of whom operate as volunteers and do not receive any pay. Additionally, when a natural disaster strikes, public agencies in rural areas reported that there was usually an “all hands on deck” reaction, with both staff from the agency and volunteers from the community working together to keep the agency functioning and provide additional community services. This was the case for rural areas regardless of whether there were any paid employees associated with the agency. Some agencies also operate only with elected board members, with each board member receiving a small stipend associated with the position. Some participants at these agencies with board members pushed back on the idea that board members would be considered paid employees because they did not receive an ongoing salary.
Q03_B From April 1, 2023 to June 30, 2023, what percentage of this agency’s paid employees (workers who received a W-2) were temporarily and permanently laid off as a result of the natural disaster? Include:
Exclude:
Temporarily laid off workers are those who have been given a date to return to work or who are expected to return to work within 6 months. Permanently laid off workers have no expectation of being rehired within 6 months. Estimates are acceptable. Enter 0 if no layoffs.
With public agencies, we saw some additional issues with technical terms that we did not observe as frequently in the private sector. Public sector respondents are less likely to be knowledgeable about HR and finance terms, especially at special district agencies in rural areas. Many of these agencies do not have HR, accounting, or finance employees to consult when responding to Census Bureau survey questions. Although we saw issues with these terms in the private sector when the point of contact was not in the given department (i.e., HR or finance), those respondents typically had someone else at their company to consult to answer the questions. Public sector respondents do not always have someone else they can consult.
Q52 As a result of the natural disaster, how did this agency change its budgeted capital expenditures from July 1, 2022 to September 30, 2022? Select all that apply.
Q20 From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency take the following actions related to shifts or hours?
Another example of a term that public sector participants struggled with was “overtime.” The intent of question Q20 (above) was to ask whether the amount of time staff worked had changed, but public sector employees struggled to consistently understand the word “overtime.” Several participants mentioned that “overtime” was not the term they would use because there was a more applicable term for their agency, or because their source of funding excluded them from using overtime. “Overtime” has a very specific definition and applies only to certain employees (i.e., typically those who are non-exempt). We suspect that the lack of traditional HR employees as points of contact for the public sector affects comprehension of the term “overtime.” The points of contact who do respond to public sector Census Bureau surveys may not know the employment status of all agency staff or understand the term “overtime” as intended. After consulting with the SMEs about these problems, we decided not to recommend this question for use with the public sector.
Q101 – Round 1 From July 1, 2022 to September 30, 2022, did the following factors related to the natural disaster influence this agency’s decision to close temporarily?
Q101 – Round 2 From July 1, 2022 to September 30, 2022, did the following factors related to the natural disaster influence this agency’s decision to close temporarily?
This question was retested with changes to these items for both the public and private sector. We adjusted the wording of several items and dropped those that did not seem to be working in the context of the public sector (see above). In Round 2, we found comprehension was improved for “state of emergency declared by the federal, state, or local government.” However, participants continued to struggle to understand the item “school closings, which led to a high number of absent staff” since schools are also a part of the public sector. Combined with the findings from the private sector, which indicated difficulties with reporting for large businesses and comprehension issues (as outlined in Section 3.6.1), we recommended removing this item from the Question Bank for both the public and private sectors.
Q22 From July 1, 2022 to September 30, 2022, did this agency have less difficulty, no change, or more difficulty in recruiting paid employees compared to what was normal before the natural disaster?
Q29 From July 1, 2022 to September 30, 2022, how much did demand for this agency’s services change compared to what was normal before the natural disaster?
Recruitment activities were also not very relevant for public agencies, which do not typically engage in recruitment like private businesses with headhunters, job fairs, and other activities. Instead, public agencies interpreted “recruitment” to include a variety of hiring activities, including posting job ads, interviewing, and retaining employees. For this question about recruitment (Q22, above), we adjusted the concept of the question to instead focus specifically on hiring, and it tested well in the second round of testing and is recommended for use with the public sector in Question Bank v2.0.
Another concept that some participants struggled to understand was that of “demand” in Q29 (above). This concept is typically well-understood in the private sector as a key aspect of business, but we found that some participants from public sector agencies struggled to apply “demand” to the services they provide. These problems were limited and after Round 1 testing we felt confident we could move forward with “demand,” but after input from the SMEs we decided to test the concept of “need” during Round 2. Unfortunately, the concept of “need” did not work well because of comprehension challenges around the phrase “did need” in the question stem. After discussion with SMEs and considering the findings from both rounds of testing, we felt comfortable moving forward with the term “demand” for the public sector while adding notes to the Question Bank v2.0 to reflect the potential issues some agencies may encounter.
New Question Q21 – Round 1 From July 1, 2022 to September 30, 2022, did the natural disaster cause this agency’s primary function to change?
New Question Q21 – Round 2 From April 1, 2023 to June 30, 2023, did the natural disaster cause this agency to add additional functions? Examples of functions are Fire Protection, Sewage and Water Supply, Utility, etc.
[IF YES:] Please describe the function(s) added: __________
New Question Q13 From July 1, 2022 to September 30, 2022, did the natural disaster affect this agency’s funding to pay its employees?
New Question Q14 Did the authority over this agency’s budget change as a result of the natural disaster?
Questions were removed from consideration for the Question Bank when testing identified substantial issues. See Appendix E for all questions that were removed from consideration. Some questions were excluded after Round 1 testing when the question could not be rewritten to overcome substantial problems with the response process identified during testing (e.g., construct too complex or vague, construct does not match the experience of the public sector). Feedback from the Census Bureau confirmed that these questions were not high-priority constructs for Question Bank v2.0. Examples of questions that were not included after Round 1 testing are outlined in the previous sections (2.6.1, 2.6.2, 2.6.3).
We revised other questions after Round 1 and then re-tested them in Round 2. Questions were eliminated after Round 2 if testing determined that they continued to pose problems for the response process and that further revision would require additional cognitive testing before being added to the Question Bank. The only question that we removed after Round 2 testing was W2Q21 about changes in the primary functions of an agency, as outlined above in Section 2.6.3.
The project team updated the EEIC Question Bank v1.0 by completing additional cognitive testing and using the findings to develop Question Bank v2.0. EEIC Question Bank v2.0 includes a description of the additional cognitive testing that was completed for 28 questions with employees from agencies in the public sector.
For the 23 questions from Question Bank v1.0 (i.e., questions that had already been tested with employees in the private sector) that were tested with public sector agency employees across both rounds, notes from this new testing were added to that question’s entry in the Question Bank. Information included in these notes is described below:
A note indicating that the question was cognitively tested with employees from agencies in the public sector.
Whether cognitive testing supported or did not support using the question on a survey with the public sector.
If cognitive testing did not support using the question on a survey with the public sector, a brief description of the findings from testing.
If cognitive testing did support using the question on a survey with the public sector, instructions about any wording changes necessary when using with the public sector (as opposed to the private sector).
Of the six newly developed public sector questions, three were categorized as “Acceptable for the Question Bank” (as described in Section 2.3.2 and Section 2.4) and a new entry was added to the EEIC Question Bank v2.0 using the same template as Question Bank v1.0.
At the conclusion of work on Question Bank v1.0, several private sector items had not performed well enough in cognitive testing to be included in the first version of the bank. It was determined that further revisions and testing were required before these items could be included. The goals of this second task were to (1) revise these questions to address issues discovered during Question Bank v1.0 testing and (2) retest the revised items for potential inclusion in Question Bank v2.0.
This process resulted in 24 questions for cognitive retesting. Unlike the public sector questions, the private sector questions had been previously tested, and some had been tested twice before with varying revisions. Therefore, these 24 questions were tested in only one final round for Question Bank v2.0.
The sections below detail how questions were identified, revised, retested, analyzed, and (when appropriate) added to Question Bank v2.0.
Project staff members collaboratively reviewed 26 questions that were tested as a part of the Question Bank v1.0 project but were classified as either “further testing required” or “not acceptable for inclusion” and not included in the bank. As a part of this review, staff assessed testing and analysis notes that were created as a part of the Question Bank v1.0 project for each question. Census Bureau economic SMEs were also asked to determine whether each question would be applicable to their division, other divisions, or other agencies/groups, if revised. Ten questions that were not applicable to these groups were dropped from further consideration.
Survey methodologists then revised the remaining 16 draft questions to incorporate testing and analysis notes from Question Bank v1.0. As a part of this revision process, four new questions were created to address methodological concerns (e.g., addressing double-barreled questions by creating two new questions). It was also determined that one question under consideration would benefit from being asked after another question already published in Question Bank v1.0, so these questions were tested as a gate and a follow-up.
Most of these 21 questions were applicable to all industries, but two questions required testing with specific industries. One question about inventories was industry-specific and applicable only to manufacturers. Another question about net profits included an instruction for use by non-profits. The inclusion of these two questions for testing required recruiting specifically for manufacturers and non-profits, as described further in Section 3.4.2.
Notably, private sector retesting was conducted after Round 1 of public sector testing (see Section 2). Based on findings from Round 1 public sector testing, project staff revised three additional questions that were tested for and included in Question Bank v1.0. These revisions allowed the project team to test whether insights gained during public sector testing of these three questions would also be applicable to the private sector.
This identification and revision process resulted in 24 questions moving on to the next step in the research process.
The 24 questions were organized into three waves based on question topic (Exhibit 3.1). Most questions were tested in only one wave, but there were three notable exceptions. First, as described in Section 3.2, one question was applicable only to the manufacturing industry. We included this question in all three waves to collect sufficient feedback from a larger analytic sample. Second, as described in Section 3.2, one question included an instruction provided for non-profits. Because this question was applicable across industries and only the instruction was industry-specific, we included it in two waves to allow for feedback on the instruction specifically.
Third, Question Bank v1.0 contains some phrases, definitions, and instructions that are shared across multiple items in the bank (e.g., multiple items use the same instructions for reporting profit). Two of the 24 questions tested in this task used one of these shared elements, and during the revision process both were revised by updating the wording of that shared element. If the updated wording tested well for these two questions, this would support updating all questions in v1.0 of the bank that use it. Therefore, to allow for stronger conclusions from a larger analytic sample, these two questions were tested across two waves rather than one.
Once the draft questions were organized into waves, cognitive interview protocols were developed for each wave using the same approach as for the public sector testing detailed in Section 2.3. As described in Section 3.1, the questions identified for testing with the private sector included questions excluded from Question Bank v1.0 in previous testing, newly written questions, and questions tested with the public sector. Because the majority of these questions had been tested previously, probing questions focused on the changes made since the previous testing. For new questions, probes focused on all steps of the response process.
Exhibit 3.1. Instrument Arrangement for EEIC Cognitive Testing with the Private Sector
| Testing Group                            | No. of Questions Tested |
| Round 1                                  |                         |
|   Wave 1                                 | 6                       |
|   Wave 2                                 | 7                       |
|   Wave 3                                 | 8                       |
|   Questions tested across multiple waves | 3                       |
| Round 1 Total                            | 24                      |
The population of interest for private sector cognitive testing consisted of businesses that had previously responded to one or more of the following Census Bureau business surveys:
Services Annual Survey (SAS17)
Annual Capital Expenditures Survey (ACES)
Annual Wholesale Trade Survey (AWTS17)
Annual Retail Trade Survey (ARTS17)
Manufacturers’ Unfilled Orders Survey (M3UO14)
Annual Survey of Manufactures (ASM).
To recruit employees in the private sector, we continued to use the same sampling file as in the Question Bank v1.0 testing. During that earlier testing, Census Bureau staff from the Economy-Wide Statistics Division provided a sampling frame: a list of businesses that had previously responded to at least one of these surveys. The frame included contact information for the past respondent at each business, typically a name, address, phone number, and email address. It also included essential business information, such as the business's industry classification under the North American Industry Classification System (NAICS) and its payroll size.
During this previous testing, project staff prepared this sample file for recruitment by removing duplicates and randomly partitioning the frame into replicates of about 100 businesses each, stratified on payroll size category (based on the Census Bureau's classification). Units in payroll size categories A (≥50,000,000) and B (10,000,000–49,999,999) were excluded from the sample for two reasons: (1) there were so few companies in these payroll size categories that we could not guarantee anonymity, and (2) the experiences of these large companies are likely not applicable economy-wide.
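The deduplicate-and-partition step described above can be sketched as follows. This is an illustrative reconstruction, not the project's actual code: the function name, record fields, and the interleaving scheme used to keep each replicate's stratum mix balanced are all assumptions.

```python
import random
from collections import defaultdict

def make_replicates(frame, size_key="payroll_category", replicate_size=100, seed=1):
    """Split a deduplicated sampling frame into random replicates of roughly
    `replicate_size` businesses each, stratified on payroll size category."""
    # Drop the largest payroll size categories (A and B), as described above.
    eligible = [b for b in frame if b[size_key] not in ("A", "B")]

    # Group businesses by stratum and shuffle within each stratum.
    strata = defaultdict(list)
    for biz in eligible:
        strata[biz[size_key]].append(biz)
    rng = random.Random(seed)
    for group in strata.values():
        rng.shuffle(group)

    # Interleave the strata so every replicate reflects the stratum mix,
    # then cut the interleaved list into consecutive replicates.
    ordered = []
    while any(strata.values()):
        for cat in sorted(strata):
            if strata[cat]:
                ordered.append(strata[cat].pop())
    return [ordered[i:i + replicate_size]
            for i in range(0, len(ordered), replicate_size)]
```

Working the sample one replicate at a time, as described in Section 3.4.2, then amounts to iterating over the returned list in order.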
For testing of Question Bank v2.0, staff contacted replicates that were unused in the Question Bank v1.0 testing. These replicates were contacted initially by email and later by phone and invited to participate in the research study (see Section 3.4.2 for details). As stated in Section 1.1.1, this project used a purposive sample; the resulting sample therefore should not be considered statistically representative, and it is inappropriate to make statistical inferences about a specific target population from it.
To ensure that participant recruitment for Question Bank v2.0 would not sample or recontact businesses that had already been contacted or had completed a cognitive interview during the testing of Question Bank v1.0, we continued to use the same sampling list from Question Bank v1.0. Before data collection began, 50 cases were randomly selected for additional online review and research to better understand the accuracy of the contact information. One staff member researched each company and attempted to establish several key pieces of information: (1) whether the business was closed, (2) whether the contact person listed still worked at the company, and (3) whether a new contact person could be identified. Further discussion of the results of this effort is provided in Section 3.4.2 under recruitment challenges. Ultimately, we decided that additional online research before calling cases would improve neither contact rates nor cooperation rates. Instead, the typical contact procedure was followed as described in Section 3.4.2.
As described in Section 1.1.1, the goal of participant recruitment for individual waves was to conduct 15 total interviews, with an estimated length of 45 minutes per interview.
Project staff recruited participants for cognitive testing and conducted interviews with private sector employees from May 11, 2023 to June 19, 2023. Recruitment efforts focused on contacting and recruiting participants using one replicate of sampled businesses at a time. We emailed each replicate of sampled businesses an invitation to participate in the interview, and within about 1 week of sending the first email, a recruiter called each sampled business. All recruiters and interviewers used Census Bureau email accounts (i.e., [email protected]) when contacting sampled participants. We sent a second email to sampled businesses that did not respond to the first email and phone call. Sampled businesses that agreed to participate by phone or email were then scheduled for an interview; those that refused by phone or email were removed from further contact.
In cases where the original past respondent was no longer the point of contact for the sampled business (i.e., the listed contact person had retired, was deceased, had changed positions, or was no longer employed there), staff attempted to identify the replacement and seek that person's cooperation for the cognitive interview. However, some of these new employees refused to participate, indicating that they lacked the knowledge or confidence to answer our questions.
Because of the content of some of the questions tested for the private sector, a specific effort was made to recruit employees in the manufacturing and non-profit industries, as described in Section 3.3. Recruitment efforts for these industry-specific cases used the NAICS codes provided in the sampling file (as described in Section 3.4.1): codes starting with 31–33 for manufacturers and 813 for non-profits. The goal for manufacturers was to gain cooperation from four manufacturers per wave across the three waves containing the manufacturing-specific inventory question, for a total of 12 manufacturers. The goal for non-profits was to gain cooperation from two non-profits per wave across the two waves containing the question with the non-profit instruction, for a total of four non-profits. During Question Bank v1.0, we had similar recruitment goals for manufacturers but never attempted to recruit specifically for non-profits. We found during Question Bank v2.0 that non-profits readily agreed to participate, and we ultimately recruited more non-profits than the original goal. We completed 12 interviews with manufacturers, seven interviews with non-profits, and 26 interviews with all other industries.
Overall, staff were able to establish contact (i.e., completed interview, refused, no-show, withdrawal during interview, or rescheduled) with approximately 15% of the sampled businesses, and we were able to complete interviews with approximately 5% of all eligible sampled businesses. See Exhibit 3.2 and Appendix A for details about the recruitment outcomes for the study.
Exhibit 3.2. Recruitment Outcomes by Testing Rounds for the Private Sector
 | Round 1 (Waves 1–3)
Recruitment start date | 5/8/2023
Interviews start and end dates | 5/11/2023–6/19/2023
Number of sampled units with at least one contact attempt | 961
Number of unknown eligibility* | 23
Number of ineligible* | 20
Total number of eligible cases** | 918
Number of refused* | 76
Number of no-shows* | 10
Number of withdrawals* | 6
Number of completed interviews* | 45
Number of eligible cases, non-contact* | 781
Cooperation rate*** | 4.9%
Contact rate**** | 14.9%
* See Appendix A for the definitions of each final disposition code (AAPOR 2016)
**For these purposes, a sampled unit was considered eligible if at least one email was sent, and the email did not bounce back, regardless of whether the phone number was called.
***AAPOR Cooperation Rate 1 (COOP1), or the minimum cooperation rate, is the percentage of eligible cases with a completed interview.
****AAPOR Contact Rate 3 (CON3) is the percentage of eligible cases in which some responsible member of the sampled unit was reached by the survey (i.e., completed interview, refused, no-show, withdrawal during interview, and rescheduled).
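The two AAPOR rates in Exhibit 3.2 follow directly from the disposition counts; the minimal sketch below reproduces them. Variable names are ours, and rescheduled cases (counted as contacts under CON3) have no separate row in the exhibit, so they do not appear here.

```python
# Final disposition counts for eligible cases, from Exhibit 3.2.
dispositions = {
    "completed": 45,
    "refused": 76,
    "no_show": 10,
    "withdrawal": 6,
    "non_contact": 781,
}
eligible = sum(dispositions.values())  # 918 total eligible cases

# AAPOR COOP1: completed interviews as a share of all eligible cases.
coop1 = dispositions["completed"] / eligible

# AAPOR CON3: cases where some responsible member of the sampled unit was
# reached (completed, refused, no-show, or withdrawal), over eligible cases.
con3 = (eligible - dispositions["non_contact"]) / eligible

print(f"COOP1 = {coop1:.1%}, CON3 = {con3:.1%}")  # COOP1 = 4.9%, CON3 = 14.9%
```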
Participant recruitment challenges. As outlined in Section 3.4.1, we began recruitment with an experiment to evaluate whether additional upfront online research would improve contact rates and make recruitment calls more productive. We did not request a new sample file because we wanted to avoid contacting any person or business who had been contacted for Question Bank v1.0. However, we had concerns about the accuracy of the information in the sample, especially given the challenges we had already faced with outdated information during testing for Question Bank v1.0. We were concerned that recruitment might require greater effort and more time to achieve the needed number of interviews, with turnover of the points of contact being the primary concern.
One option we considered was adding a step to the recruitment process: researching online to confirm the contact person at each business, with subsequent recruitment efforts focused only on those we could confirm. To evaluate the effectiveness of this approach, one staff member reviewed the information available online for 50 businesses to confirm whether the business was still open, whether the listed contact person still worked at the business, and whether a new contact person could be identified. These 50 cases were then worked by the recruiter as normal, with one email sent before one phone call attempt, followed by one final email outreach. The hypothesis was that businesses whose contact people we could confirm would be more productive, making the additional upfront research worth the time and effort. However, the review indicated no better contact rate for contacts confirmed online than for contacts not confirmed. The outcome of this effort is summarized in Exhibit 3.3. Contacts confirmed online were contacted at a rate of 55%, and contacts not confirmed online were contacted at a rate of 57%. Additionally, once contacted, there was no meaningful difference in the share of contact persons we could confirm were still the point of contact for the business: 83% for those confirmed online and 81% for those not confirmed online.
In addition to the information presented in Exhibit 3.3, all three interviews scheduled from the recruitment efforts for these 50 cases came from contacts who could not be confirmed online. All three of these businesses were small businesses. We suspect that businesses whose employees are not listed online are most commonly small operations that may have no online presence at all. We also tend to gain more cooperation from smaller businesses, which may help explain why the interviews we recruited from these 50 cases all came from small businesses we were unable to confirm online. One other notable outcome from this research was the refusal rates across contacts who could and could not be confirmed online. Among contacts who could not be confirmed (n=28), six refused to participate, a refusal rate of 21%. Among contacts who were confirmed (n=22), only three refused to participate (14%). Given the small sample, we cannot draw firm conclusions from this difference, but it is worth noting.
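The contact and refusal rates quoted above follow from the counts in Exhibit 3.3; a small sketch of the arithmetic (group labels and variable names are illustrative, not part of the project's tooling):

```python
# Counts from the 50-case review: contacts confirmed vs. not confirmed online.
groups = {
    "confirmed online": {"n": 22, "contacted": 12, "refused": 3},
    "not confirmed online": {"n": 28, "contacted": 16, "refused": 6},
}

rates = {}
for name, g in groups.items():
    rates[name] = {
        "contact_rate": g["contacted"] / g["n"],   # contacted / group size
        "refusal_rate": g["refused"] / g["n"],     # refused / group size
    }
    print(f"{name}: contact rate {rates[name]['contact_rate']:.0%}, "
          f"refusal rate {rates[name]['refusal_rate']:.0%}")
```

With only 50 cases split across the two groups, these differences are descriptive only, consistent with the caution above about drawing conclusions from the refusal rates.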
After reviewing all the findings from this effort, we concluded that additional online research before recruitment calls would likely not lead to a higher contact rate.
Exhibit 3.3. Outcomes from Information Gathering Research
 | n | Percentage
Number of contacts confirmed online | 22 | 44%
  Contacted | 12 | 55%
    Contact confirmed by phone | 10 | 83%
    Contact no longer employed | 1 | 8%
    Unknown | 1 | 8%
  Not contacted | 10 | 45%
Number of contacts not confirmed online | 28 | 56%
  Contacted | 16 | 57%
    Contact confirmed by phone | 13 | 81%
    Contact no longer employed | 3 | 19%
    Unknown | 0 | —
  Not contacted | 12 | 43%
After attempting this new strategy for recruitment and deciding it would not be fruitful, we continued to work the sample with the typical recruitment efforts. However, we did notice issues with the accuracy of the frame provided by the Census Bureau. We encountered several instances of businesses that had been closed for a significant period of time and points of contact who had left the business. This turnover increased the recruitment effort because the recruiter attempted to find a suitable new point of contact in cases where the previous point of contact was no longer employed. Although we do not have exact counts of contacts who were no longer employed, anecdotally the number seemed to have increased compared with our efforts during testing for Question Bank v1.0. Other issues we encountered included businesses that had been sold or incorporated into other entities, mismatches between the business name in the sampling frame and the business contacted, and NAICS industry codes in the sampling frame that did not match the business.
Another recruitment challenge we faced both during testing for Question Bank v1.0 and during this effort was the number of no-shows and withdrawals from the interviews. When a participant did not show up for an interview, the interviewer attempted to contact the participant by phone and email to conduct the interview at the appointed time or to reschedule. When the interviewer could not reach the participant, the recruiter followed up 1 to 2 days later to attempt to reschedule. If the participant still could not be contacted, efforts were made to replace them with a new participant. We also encountered withdrawals during the interview: interviewers explained the interview in detail during the consent process, including the time commitment, and some participants then changed their minds about participating. The effort to replace these cases was significant, especially when they represented the industry-specific recruitment needs for manufacturers and non-profits.
One additional issue with recruitment that occurred during testing for Question Bank v2.0 involved Certified Public Accountants (CPAs). Many businesses ask their CPAs to respond to Census Bureau surveys because the CPA has access to all the financial information needed to fill out the surveys. However, for the purposes of the questions we were testing, CPAs would not have the intimate knowledge of the business needed to answer them. We experienced this issue during Question Bank v1.0 testing but did not systematically exclude CPAs at that time; because the sample then contained more up-to-date contact information, fewer CPAs responded to our requests and the issue was less pronounced. Additionally, during Question Bank v1.0 testing, some CPAs self-selected out of participation once they understood they were not the ideal participants for our interviews. For Question Bank v2.0 testing, we decided to systematically exclude CPAs so we could focus our recruitment efforts on participants who were better candidates for the cognitive interview. Exhibit 3.2 includes all CPAs identified in the sample in the count of ineligible participants. CPAs were identified through conversations during recruitment phone calls and through indicators in the contact information that the person was a CPA rather than a representative of the business (e.g., email addresses that included “CPA” and did not include the name of the business).
Finally, we faced some challenges in eliciting cooperation from the employee responsible for reporting for the sampled business. Many contacted individuals expressed uncertainty about whether the call was a scam, especially because recruitment was conducted by RTI and recruiters’ phones showed North Carolina area codes (i.e., not area codes from the Washington, D.C., metropolitan area, where the Census Bureau is located). Some also thought that the request to participate through Teams was indicative of a scam because the Census Bureau surveys they usually complete are all online. Some contacted individuals also indicated that the initial contact email went to their spam folder and that they would not have known about the study without the phone call. Other reasons for refusing to participate included the following:
The estimated interview time of 45 minutes was too long;
The business had a policy of not completing voluntary surveys;
The business was not impacted by the pandemic or natural disasters and did not feel the need to participate; and
The individual did not want to help the government.
The challenges that RTI faced with recruitment were similar to the recruitment challenges that the Census Bureau often faces with other cognitive interviewing projects. Fortunately, we were able to resolve these challenges, fulfill the recruitment goals, and recruit enough participants to complete the cognitive interviews.
For the private sector testing, the same cognitive interviewing procedures that were used in public sector testing (described in Section 2.4.3) were followed. Forty-five cognitive interviews with private sector employees were conducted, lasting around 40 minutes on average. Participants were employed at businesses that varied in payroll size, from payroll size category J (<500) to C (1,000,000–9,999,999). See Exhibit 3.4 for descriptive statistics of participants.
Exhibit 3.4. EEIC Testing Private Sector Participants Descriptive Statistics
 | Round 1 (Waves 1–3)
Number of interviews | 45
Length of interview (in minutes) |
  Average (Std. Dev.) | 40.7 (9.7)
  Min | 23
  Max | 73
Role/Title of participant |
  Accounting/Finance/HR (e.g., controller, accountant) | 26 (57.8%)
  Leadership (e.g., owner, CEO, CFO) | 19 (42.2%)
Length of time at company (in years) |
  Average (Std. Dev.) | 11.8 (11.8)
  Median | 8
  Mode | 8
Time Zone (of participant) |
  Central | 10 (22.2%)
  Eastern | 28 (62.2%)
  Mountain or Pacific | 7 (15.6%)
The analytic strategy described for the public sector in Section 2.5.1 was also followed for the private sector. For the private sector, all of the data for questions that were repeated across waves were reviewed and analyzed together. Additionally, questions with similar concepts, such as structures and equipment or inventories, were analyzed together so that any common findings could be discovered. The main difference regarding analytic strategy was the focus on different industries for the private sector, whereas the public sector analysis explored different agency types. For the private sector, the main groups compared in analysis were manufacturers, non-profits, and all other industries. As outlined above in Section 3.2, one question was only tested with manufacturers and one question had probes that were specific to the non-profit sector. The same classification criteria for question problems were used for the private sector. See Section 2.5.1 for additional detail about the analytic strategy.
The focus for this round of testing for the private sector was to determine whether a question could or could not be included in the Question Bank v2.0. We accomplished this by evaluating the changes we implemented to the questions based on feedback gathered during Question Bank v1.0 development. The goals for this retesting were to (1) identify questions that were acceptable for inclusion in the Question Bank as is or had very minor changes and could move on to the Question Bank without further testing and (2) identify any questions that were still experiencing significant issues and would not be included into the Question Bank.
Additionally, three questions were included for retesting with the private sector after they exhibited problems in the public sector testing. The goals for this further testing were to (1) probe on any issues uncovered in the public sector testing; (2) test question wording changes motivated by findings from the public sector testing; and (3) determine whether these changes should be implemented with the private sector as well as the public sector.
This section begins by summarizing the assessment of the testing outcomes and the final decisions for the questions in each wave of testing. It then presents a discussion of the common themes identified across individual testing waves, followed by a diagnosis of the likely reasons that poorly performing questions could not be included in the Question Bank. The themes presented in this report outline the new findings from testing for Question Bank v2.0 but do not represent all issues faced by private sector respondents to establishment surveys. See the report for Question Bank v1.0 for additional issues, such as organizational and individual burden, including the following:
The tradeoff between precision and retrieval burden;
The applicability of questions across different business practices or organizational roles;
Problems with question design that was not well-aligned with business conventions; and
The utility of “don’t know” and “not applicable” responses.
Exhibit 3.5 summarizes the testing results of the draft EEIC questions for the private sector. After testing, it was determined that 13 questions had few or no issues and could be included in the Question Bank either as is or with minor revisions. The team also identified 10 questions with varying degrees of issues to be addressed by consulting with the SMEs. One question that performed poorly was excluded from the Question Bank. See Section 3.6.4 for justifications for excluding questions from the Question Bank and Appendix E for the details of these questions.
Exhibit 3.5. Summary of Testing Results for Private Sector Testing
 | | Assessment Classification | | | Final Decision |
Wave No. | Qs Tested | Few Issues | Some Issues | Major Issues | Accepted to Q Bank | Excluded from Q Bank
Wave 1 | 6 | 3 | 3 | 0 | 6 | 0
Wave 2 | 7 | 4 | 3 | 0 | 7 | 0
Wave 3 | 8 | 3 | 4 | 1 | 7 | 1
Multiple waves | 3 | 3 | 0 | 0 | 3 | 0
Total | 24 | 13 | 10 | 1 | 23 | 1
As suggested in the literature about business surveys (Bavdaž, 2010; Bavdaž et al., 2015; Willimack & Snijkers, 2013), an organization’s structure (e.g., location of records, policy and practices of record keeping) is a unique source of burden that can be difficult to address. Not surprisingly, the testing indicated that answering some questions would require respondents to retrieve information from multiple sources, such as other colleagues or company records (e.g., databases). For this round of testing, most of the issues with retrieval of information were not severe. Notes were included in the Question Bank v2.0 to indicate any questions with higher respondent burden so that survey designers can consider this when selecting questions. The primary source of organizational burden we observed during testing for Question Bank v2.0 was with companies that have multiple establishments, or locations. In testing for Question Bank v1.0, most burden for companies with multiple establishments was because records were not always stored in a centralized location for the entire company. In these cases, participants indicated that the required information was tracked at each of the company’s individual worksites and would be burdensome to retrieve.
By contrast, in testing for Question Bank v2.0, the primary concern raised by participants from companies with multiple establishments was uncertainty about how to answer one question when reporting across multiple establishments. They did not have difficulty retrieving the information; rather, the questions were best suited to companies with a single location. For example, Q108 (below) asks how on-site operations changed after a natural disaster. This question was retested to review changes to the instruction about when to select ‘on-site operations ceased’. Participants consistently understood the question text and instruction, and the question was classified as “few issues.”
Q108 From January 1, 2023 to March 31, 2023, how did this company’s on-site operations change compared to what was normal before the natural disaster? If operations ceased on-site but continued remotely (such as working from home), please select ‘on-site operations ceased’.
Another common theme from Question Bank v2.0 testing was the cross-sector applicability of questions, meaning both private versus public and across different industries. As outlined above in Section 3.2, three questions were retested with the private sector to evaluate the findings from the public sector testing and determine whether the issues identified also applied to the private sector. When possible, we wanted the questions to remain as similar as possible across the private and public sectors so that cross-sector comparisons could be made. The primary example of this retesting was Q101, which asks about reasons the company closed temporarily. For this round of private sector testing, we implemented changes supported by the public sector Round 1 testing and displayed only the relevant items in the grid for evaluation. The item “state of emergency declared” was expanded to include “by the federal government, state government, or local government.” The item “university and/or school closings” was adjusted to “school closings, which led to a high number of absent staff.” Retesting of this question focused primarily on these changes, and we replicated the findings from the public sector: the item about school closings was confusing for participants, and those from large companies reported that they would not necessarily know whether school closings had contributed to staff absences.
Q101 From July 1, 2022 to September 30, 2022, did the following factors related to the natural disaster influence this agency’s decision to close temporarily?
Another question that was adjusted after public sector testing and retested with the private sector was Q01_B. This question was originally part of a grid, and the individual item was tested with the public sector to explore its applicability outside of the private sector. However, this public sector testing uncovered some assumptions implicit in the question. Because the original Question Bank v1.0 question asked, “how did this company change the total number of paid employee hours?”, it implied that the company was actively deciding how many hours employees work. When testing this with the public sector, it was discovered that many public agencies work until the work is done during an emergency, and the agency itself does not typically dictate employee hours. Instead, all agency staff work as much as is needed to deal with the disaster. With this finding in mind, we altered the wording to “how did the company’s total number of paid employee hours change?” This change removes the implication that the company was actively changing the hours and instead simply asks for a report of the changes. This question was retested with both the private sector and in Round 2 of the public sector testing. Although the revised question text tested well and was consistently understood by participants, new issues were uncovered with the private sector. We believe these new issues surfaced during this round of testing because the question had initially been tested only within a grid, where such problems may have been overlooked when all the grid items were evaluated at once.
Q01_B – Question Bank v1.0 From July 1, 2022 to September 30, 2022, as a result of the natural disaster, how did this company change the total number of paid employee hours?
Q01_B – Question Bank v2.0 From July 1, 2022 to September 30, 2022, as a result of the natural disaster, how did the company’s total number of paid employee hours change?
This issue with salaried employees also reflects the difficulty of implementing these questions across all industries with a variety of employee types. Another question that proved too difficult to implement universally across industries was Q136 (below), about methods of sales. The question originally tested in Question Bank v1.0 included the instruction “Please report the main method of closing the sale or orders.” The phrase “main method” was not consistently understood by participants. This instruction was adjusted to “If individual sales of your company’s goods or services are made through multiple methods, please report the method used at the point of sale.” However, this adjustment did not significantly improve comprehension. Many participants had trouble understanding the new phrase “method used at the point of sale,” with some believing it described the payment method instead of the mode of closing the sale.
Q136 – Question Bank v1.0 In the April – June quarter of 2021 and in the July – September quarter of 2021, what percentage of this company’s goods or services were sold to clients or customers using the following methods? Please report the main method of closing the sale or orders. Enter 0 if none. Estimates are acceptable.
Q136 – Question Bank v2.0 In the October–December quarter of 2022 and in the January–March quarter of 2023, what percentage of this company’s goods or services were sold to clients or customers using the following methods? If individual sales of your company’s goods or services are made through multiple methods, please report the method used at the point of sale. Enter 0 if none. Estimates are acceptable.
Q99 – Question Bank v2.0 From January 1, 2023 to March 31, 2023, did this company have at least one physical location that was open to customers?
One of the most prominent issues facing the design of high-quality survey questions is ensuring that all terms are universally understood by respondents. As supported in the literature, phrases and terms can hold a variety of different meanings for respondents; even straightforward terms can be assigned an array of definitions (Belson, 1981; Fowler, 1995). However, providing examples in the question text can influence responses by limiting the breadth of the concept (Tourangeau et al., 2014). This negative impact of providing examples must be weighed carefully against the benefit of increased clarity. The literature also indicates that examples can improve the accuracy of responses when they remind respondents of items they might otherwise have unintentionally excluded or been unsure whether to include (Tourangeau et al., 2014).
To better explore whether providing examples was useful for respondents, three questions were presented in A and B versions: Version A without examples and Version B with examples. When shown Version A, participants were asked to define key terms on their own. When shown Version B, participants confirmed whether the examples expanded their definitions or provided additional clarity. All three questions addressed the concept of structures, and two of them also addressed equipment. All of the questions used the same examples: “buildings, parking garages, warehouses, etc.” for structures and “production machinery, computers, office supplies, etc.” for equipment. These questions were also included in testing to evaluate other changes informed by the Question Bank v1.0 testing, but the focus of this section is the performance of the examples in the question text. One of the questions is presented below.
Q178A – Without examples
From January 1, 2023 to March 31, 2023, what was the total dollar amount of the losses for all structures that were damaged, destroyed, or decreased in value as a result of the natural disaster? Please report amount before insurance compensation. If none, enter 0. Estimates are acceptable.
$__________
From January 1, 2023 to March 31, 2023, what was the total dollar amount of the losses for all equipment that were damaged, destroyed, or decreased in value as a result of the natural disaster? Please report amount before insurance compensation. If none, enter 0. Estimates are acceptable.
$__________
Q178B – With examples
From January 1, 2023 to March 31, 2023, what was the total dollar amount of the losses for all structures (such as buildings, parking garages, warehouses, etc.) that were damaged, destroyed, or decreased in value as a result of the natural disaster? Please report amount before insurance compensation. If none, enter 0. Estimates are acceptable.
$__________
From January 1, 2023 to March 31, 2023, what was the total dollar amount of the losses for all equipment (such as production machinery, computers, office supplies, etc.) that were damaged, destroyed, or decreased in value as a result of the natural disaster? Please report amount before insurance compensation. If none, enter 0. Estimates are acceptable.
$__________
As described in Section 3.5.1, all questions with similar concepts were analyzed together to discover any common findings in the data. Across the three questions that included the concept of structures, “structures” was fairly well understood even without examples; participants raised few questions when reviewing Version A. The examples participants thought of when reviewing Version A matched closely with the examples provided in Version B. Nevertheless, the examples in Version B expanded participants’ definitions of structures: some participants did not initially consider parking garages when thinking about what to include. The term “equipment” posed more challenges when presented without examples in the two questions tested. More participants had questions about what to include as equipment than as structures, primarily whether vehicles and office furniture should be included. Providing the examples in Version B helped expand participants’ definitions of equipment because many participants did not initially think of office supplies as equipment; after seeing the examples, participants agreed it made sense to include them.
After discussion with the Census Bureau, we decided to include examples of both structures and equipment for all relevant questions. Testing made clear that providing the examples reduced uncertainty about what to include and expanded respondents’ definitions. In addition, we determined that the focus on office supplies was less useful for the analytic goals of the questions and replaced “office supplies” with “vehicles.” Adding vehicles to the list of equipment examples should better measure the economic impact of a disaster because vehicles represent a larger expense than office supplies.
Questions were removed from consideration for the Question Bank when testing identified that substantial issues remained; see Appendix E for all questions removed from consideration. Only one question was excluded from the Question Bank after this round of private sector testing. This item (Q136, described above in Section 3.6.2) continued to exhibit serious issues even after multiple rounds of testing and attempts to adjust the question wording. Discussion with the Census Bureau confirmed that this question did not measure a high-priority construct and could be excluded because of the ongoing issues.
The project team updated the EEIC Question Bank v1.0 by completing additional cognitive testing with employees from businesses in the private sector and using the findings to develop Question Bank v2.0.
Twenty questions that were not acceptable for Question Bank v1.0 were revised and retested. One of these was similar to a question already published in Question Bank v1.0 (Q33). Rather than creating a new entry in the Question Bank for the newly tested item, we updated the notes for the existing Q33 to indicate that the wording of the newly tested item could be used as an alternative. Of the remaining 19 questions, 18 were categorized as “Acceptable for the Question Bank” (as described in Section 3.6), and a new entry was added for each to the EEIC Question Bank v2.0 using the same template as Question Bank v1.0.
As described in Section 3.1, public sector testing findings indicated that three questions from Question Bank v1.0 should be revised and retested with the private sector. For these questions, the wording in EEIC Question Bank v2.0 was updated based on the findings from testing.
Question Bank v1.0 includes several phrases, terms, definitions, and instructions that are used in multiple questions. As described in Section 3.2, there were several questions included for this purpose—to test a potential re-wording of these common phrases, terms, definitions, and instructions (e.g., examples of structures and equipment, terms used to describe manufacturing inventory, instructions for non-profit businesses). Where testing indicated that changes should be made to the wording of any of these phrases, terms, definitions, and instructions, we made those changes in EEIC Question Bank v2.0 where applicable.
This document summarizes the testing results and revision recommendations for the entire set of EEIC draft questions, covering both the initial effort (v1.0) and the follow-up expansion effort (v2.0). Most recommendations were considered and integrated into the final version of the EEIC questions included in Question Bank v2.0.
As a living document, the EEIC Question Bank v2.0 offers the Census Bureau a unique opportunity to continue gauging the economic impacts of emergency events on businesses and local governments in the United States. Future research and development could solicit feedback from users who adopt EEIC questions from Question Bank v2.0 to collect emergency economic information during emergent events. That feedback could then be incorporated into the Question Bank as a reference for future users, helping to assure data quality and the successful adoption of the EEIC Question Bank.
American Association for Public Opinion Research. (2016). Standard definitions: Final dispositions of case codes and outcome rates for surveys (9th ed.). AAPOR.
Bavdaž, M. (2010). The multidimensional integral business survey response model. Survey Methodology, 36(1), 81-93.
Bavdaž, M., Giesen, D., Černe, S. K., Löfgren, T., & Raymond-Blaess, V. (2015). Response burden in official business surveys: Measurement and reduction practices of national statistical institutes. Journal of Official Statistics, 31(4), 559-588. https://doi.org/10.1515/jos-2015-0035
Belson, W. (1981). The design and understanding of survey questions. Gower Publishing.
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: the tailored design method (4th ed.). Wiley.
Fowler, F. J. (1995). Improving survey questions: Design and evaluation. SAGE Publications.
Jenkins, C., & Dillman, D. (1997). Towards a theory of self-administered questionnaire design. In L. Lyberg, P. Biemer, M. Collins, L. Decker, E. de Leeuw, C. Dippo, N. Schwarz, & D. Trewin (Eds.), Survey measurement and process quality. Wiley-Interscience.
Tourangeau, R., Rips, L. J., & Rasinski, K. A. (2000). The psychology of survey response. Cambridge University Press.
Tourangeau, R., Conrad, F. G., Couper, M. P., & Ye, C. (2014). The effects of providing examples in survey questions. Public Opinion Quarterly, 78(1), 100–125. https://doi.org/10.1093/poq/nft083
Willis, G. B. (2004). Cognitive interviewing: A tool for improving questionnaire design. SAGE Publications.
Willimack, D. K., & Snijkers, G. (2013). The business context and its implications for the survey response process. In G. Snijkers, G. Haraldsen, J. Jones, & D. Willimack (Eds.), Designing and Conducting Business Surveys. Wiley.
Final disposition code (AAPOR 2016) | Details
1. Interview | Completed interview
2. Eligible, non-interview |
3. Unknown eligibility, non-interview |
4. Ineligible |
Note: For these purposes, a sampled unit was considered eligible if at least one email was sent, and the email did not bounce back, regardless of whether the phone number was called. |
Emergency Economic Information Collection Question Bank 2
Cognitive Interviewing Protocol
Wave 1 (PUBLIC SECTOR: Operational Impact Assessment)
Checklist
Legend:
Black font ‒ interview scripts and probes to be administered by interviewers
BLACK UNDERLINED UPPERCASE font ‒ conditions for additional probes
[Red font with brackets] ‒ interviewer actions to be performed
Blue italicized font ‒ read verbatim to respondents (Introduction specific)
Purple font ‒ testing question content (for interviewer reference only)
<item> ‒ varying elements to be replaced based on respondent’s answer for probing
Respondent number: ___________________
Interview date: ___________________
Interview start time: ___________________
Unit type: ___________________
Function (if special district): ___________________
Single/Multifunction: ___________________
Consent link: [link]
Survey link: [link]
[Interviewer: at the start of the interview, complete the following:
Connect to Skype for Business (audio only)
Confirm if respondent has already signed consent
If consent already signed, verbally confirm consent for participation, consent for recording, and consent for observers
Ask if respondent has any questions
Start recorder and state date, time, participant number, and reconfirm consent
Help respondent access the web survey
Ask respondent to click through to the Introduction page.]
[Interviewer: Read (or paraphrase) the Introduction to the respondent.]
[Interviewer: The think-aloud example (in blue) needs to be read verbatim.]
Introduction
The U.S. Census Bureau is requesting your help to test survey questions focused on collecting emergency economic information. If you agree to take part in this study, you will answer a few questions asking about the impacts of natural disasters on your agency’s operations. We recommend that you think about how your agency may respond to potential emergencies and unforeseen disruption.
Also, you will be asked to read the instructions and questions aloud and then “think aloud” as you answer each question. This may feel a little unnatural, but it will help us understand how you think about and answer each question. Here’s an example:
Suppose this were a survey about your home and one of the questions asked, “How many bedrooms are there in your house?” Someone thinking aloud as they respond might say, “What do you mean by bedrooms—are these just where people sleep? If so, I have 2 bedrooms. But if we are counting anything with a closet and a window, we also have a study. So perhaps, we have 3 bedrooms. I’m going with where people sleep, so I’ll answer 2 bedrooms.”
Today we are going to look at about 10-12 questions, and I’ll ask some additional debriefing questions. After we look at each question, I will ask you some follow-up questions to get your feedback. For example, I might ask you what a particular term or phrase means or ask you to put the question in your own words. There are no right or wrong answers to any of the questions that I might ask. We just want to make sure that everyone understands these survey questions in the same way, that the questions are worded clearly, and, if they are not, what changes should be made. Your participation is important because it will help the Census Bureau improve these questions to accurately measure economic activities of businesses like yours.
Before we begin, do you have any questions?
[Interviewer action: Answer questions, as needed.]
Let’s get started with some background information about your business.
What is your role or title?
How long have you worked at your office in this role?
Besides the pandemic, has your office experienced a natural disaster, like a hurricane, tornado, or flood, that caused your office to temporarily stop service to the public?
[If yes: instruct the respondent to think about that event when answering these questions:] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. When you see the term “natural disaster” in a question, I want you to think about [EVENT DISCUSSED ABOVE]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [EVENT DISCUSSED ABOVE].
[If no: instruct the respondent to think about a hypothetical natural disaster that is common in their area (e.g., a tornado, a fire, a hurricane, etc.):] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. Since you haven’t had a natural disaster that has impacted the agency while you have worked there, let’s think of a hypothetical disaster that may happen in the future. What area of the country are you in? [Guide respondent in selecting a natural disaster to talk about during the interview]. When you see the term “natural disaster” in a question, I want you to think about [HYPOTHETICAL EVENT DISCUSSED]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [HYPOTHETICAL EVENT DISCUSSED].
[Interviewer action: Ask respondent to move to next page.]
Now let’s move on to the survey. Please read each question out loud and then think aloud as you come up with your answer.
[Interviewer action: Ask respondent to move to next page.]
[Interviewer action: Observe how respondent answers the survey and watch for any difficulty in determining their answers to the questions. Prompt respondent to think aloud while answering/reading the questions if they do not.]
Probes for W1 Q1
[Remind respondents about placeholder dates and “natural disaster” as needed]
Has any of this agency’s contact information changed as a result of the natural disaster?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
INTERVIEWER: If needed, explain that the “contact information” we are asking about is the contact information that the Census Bureau uses to contact the agency for surveys. If you do have to explain this to them, please describe the participant’s confusion.
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
And would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
Probes for W1 Q1B
[IF NEEDED:] Respondents who answer “yes” to the previous question will receive this next question. I’ll still have you look at this question for our testing so that I can ask you some questions for your feedback.
What is this agency’s new contact information?
ATTN (Addressee Title or Department): ___________________
Street 1: ___________________
Street 2: ___________________
City: ___________________
State: ______
Zip Code: _____ - ____
Phone Number: ( ___ ) ___ - ____
Email address: ___________________
THINKALOUD NOTES
And what about this question—How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
Probes for W1 Q2
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency temporarily close for at least one day?
Temporary closures can include times when a small number of individuals monitor and maintain any of an agency’s physical locations while it is otherwise shut down.
⃝1 Yes
⃝2 No
THINKALOUD NOTES
Did you think that this question was clearly written or not clearly written? [IF NOT CLEAR AND IF NEEDED:] Please explain.
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
Probes for W1 Q2B
[Remind respondents about placeholder dates and “natural disaster” as needed]
[IF NEEDED:] Respondents who answer “yes” to the previous question will receive this next question. I’ll still have you look at this question for our testing so that I can ask you some questions for your feedback.
From July 1, 2022 to September 30, 2022, did the following factors related to the natural disaster influence this agency’s decision to close temporarily?
(Response options for each factor: Yes | No | Not applicable)
State of emergency declared
Stay-at-home orders
Restricted or limited access to the agency’s worksite(s)
Local government information/suggestion
Employee safety
Disruption to supply/inventory delivery
University and/or school closings
Staff’s unwillingness or hesitation to report for work
Agency’s funding was frozen or suspended
Other, please describe: ___________________
THINKALOUD NOTES
Let’s first focus on the question itself. Did you think this question was clearly written or not clearly written?
Now let’s look at the items in the table. Do you think that “State of emergency declared” and “Stay-at-home orders” mean the same thing or different things?
[IF NEEDED AND IF DIFFERENT:] How are they different?
Now let’s look at the fourth item: “Local government information/suggestion.” What are some examples of “Local government information/suggestion?”
Do you think that “Local government information/suggestion” means the same thing or something different from “State of emergency declared”?
[IF NEEDED AND IF DIFFERENT:] How are they different?
[IF NEEDED:] Do you think that “Local government information/suggestion” means the same thing or something different from “Stay-at-home orders”?
[IF NEEDED AND IF DIFFERENT:] How are they different?
Now let’s look at the third item — “Restricted or limited access to the agency’s worksite(s).” Does the term “worksite” make sense for your agency?
[IF NEEDED AND IF NOT:] What term would make more sense for your agency?
And now let’s take a look at the second to last item — “Agency’s funding was frozen or suspended.” In your opinion, do you think this item was clearly written or not clearly written?
How easy or difficult was it for you to answer this item [agency’s funding was frozen or suspended]? [IF DIFFICULT AND IF NEEDED:] Please explain.
Probes for W1 Q3
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, did this agency permanently close as a result of the natural disaster?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
When you were answering this question, what does the term “permanently close” mean to you?
Does your agency have any virtual locations?
[IF YES, AGENCY HAS VIRTUAL LOCATIONS:] When you were answering this question, were you thinking about physical locations only, or both physical and virtual locations?
[IF YES, AGENCY HAS VIRTUAL LOCATIONS:] What are some examples of how a natural disaster or other emergency event can permanently close a virtual location?
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
Probes for W1 Q3B
[Remind respondents about placeholder dates and “natural disaster” as needed]
[IF NEEDED:] Respondents who answer “yes” to the previous question will receive this next question. I’ll still have you look at this question for our testing so that I can ask you some questions for your feedback.
From July 1, 2022 to September 30, 2022, why did this agency permanently close as a result of the natural disaster?
|
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
[IF RESPONDENT IS ANSWERING HYPOTHETICALLY AND IF NEEDED:] If you were answering this question on a survey after a [FILL WITH NATURAL DISASTER] occurred, what kinds of answers do you anticipate writing into this text box?
For this question, would you know the answer right away, would you make a guess, or would you consult other colleagues?
Probes for W1 Q4
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, how much did demand for this agency’s services change compared to what was normal before the natural disaster?
⃝1 Increased a lot
⃝2 Increased a moderate amount
⃝3 Did not change
⃝4 Decreased a moderate amount
⃝5 Decreased a lot
THINKALOUD NOTES
Does your agency provide more than one type of service?
[IF YES, HAS MORE THAN ONE TYPE OF SERVICE:] How easy or difficult is it for you to answer this question about all of your agency’s services?
[IF YES, HAS MORE THAN ONE TYPE OF SERVICE AND IF NEEDED:] If a natural disaster increased demand for one type of service but decreased demand for another type of service, how would you answer this question?
[IF YES, HAS MORE THAN ONE TYPE OF SERVICE:] If this question asked you about your agency’s “primary service” instead of “services,” would this be easier, the same, or more difficult to answer?
[IF NO, HAS ONLY ONE TYPE OF SERVICE:] How easy or difficult is it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
Probes for W1 Q5
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency suspend any of the services it usually offers?
⃝1 Yes
⃝2 No
⃝3 Not applicable
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W1 Q6
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did any of the structures (such as buildings or parking garages) used by this agency sustain the following kinds of damage?
(Response options for each item: Yes | No | Not applicable)
Minor damage to structure(s)
Major damage to structure(s)
Structure(s) destroyed
THINKALOUD NOTES
In your opinion, did you think this question was clearly written or not clearly written?
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
IF NOT APPLICABLE IS SELECTED FOR ANY GRID ITEM:
You selected “Not applicable” [for respective grid item(s)]. Can you tell me more about why you chose that response? [If necessary, probe for each NA option selected.]
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
On this screen, we first have the same question we just looked at and underneath, we have the question but with a slightly different wording. Please review both of these questions.
OPTION A: From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did any of the structures (such as buildings or parking garages) used by this agency sustain the following kinds of damage?
OPTION B: From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did any of the structures (such as buildings or parking garages) in which this agency operates sustain the following kinds of damage?
(Response options for each item: Yes | No | Not applicable)
Minor damage to structure(s)
Major damage to structure(s)
Structure(s) destroyed
Option A uses the phrase “used by this agency” and Option B uses the phrase “in which this agency operates.” Do these phrases mean the same thing to you, or are they different?
Which wording do you prefer? [IF NEEDED:] Please explain why.
Probes for W2 Q13
[Remind respondents about placeholder dates and “natural disaster” as needed]
The next two questions will ask you about a different topic.
From July 1, 2022 to September 30, 2022, did the natural disaster affect this agency’s funding to pay its employees?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
In your own words, what do you think this question is asking?
This question references funding “to pay your employees.” How easy or difficult is it to answer this question, thinking only about funding to pay your employees? [INTERVIEWER: We want to know if they can separate funding specifically for paying employees from other funding, or if it’s one pot of money.]
What are some examples of how a natural disaster can affect your agency’s funding to pay your employees?
If your agency’s budget was frozen due to a natural disaster, how would you answer this question?
Probes for W2 Q14
[Remind respondents about placeholder dates and “natural disaster” as needed]
And now let’s look at our last question.
Did the authority over this agency’s budget change as a result of the natural disaster?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
And what about this question—In your own words, what do you think this question is asking?
What does the phrase “authority over this agency’s budget” mean to you?
[IF NEEDED:] What are some examples of an “authority” over your agency’s budget?
Does this phrase match your agency’s experience?
[IF NEEDED AND IF PHRASE DOES NOT MATCH:] What phrase would better match your agency’s experience?
What are some examples of how a natural disaster can change the authority over your agency’s budget?
Concluding Probe(s)
Thinking back over the entire survey, is there anything else you’d like to discuss?
If you were taking this survey normally—not thinking aloud—approximately how long do you think it would take to complete?
Would you consult or request input from any other people in answering these questions?
IF YES:
Who would you consult? [Probe to capture position title, department, and any other details helpful to knowing where this information is available within the agency.]
How easy or difficult would it be to coordinate with them and get their help?
How much time do you think it would take to find out the answer to questions you’re unsure about? [Probe to clarify (1) the time respondents think it will take the other person to answer versus (2) the lapsed time from when the respondent requests assistance until they get an answer from the other person.]
Finally, what device did you use to answer the survey? Is it a laptop or a desktop computer? Or is it a mobile device like a tablet or a smart phone?
[Interviewer action: Ask observer if they have any questions they’d like to add, being sensitive to the respondent’s time.]
Those are all the questions I have for you. Thank you very much for participating!
Emergency Economic Information Collection Question Bank 2
Cognitive Interviewing Protocol
Wave 2 (PUBLIC SECTOR: Operational Impact Assessment)
Checklist
Legend:
Black font ‒ interview scripts and probes to be administered by interviewers
BLACK UNDERLINED UPPERCASE font ‒ conditions for additional probes
[Red font with brackets] ‒ interviewer actions to be performed
Blue italicized font ‒ read verbatim to respondents (Introduction specific)
Purple font ‒ testing question content (for interviewer reference only)
<item> ‒ varying elements to be replaced based on respondent’s answer for probing
Respondent number: ___________________
Interview date: ___________________
Interview start time: ___________________
Unit type: ___________________
Function (if special district): ___________________
Single/Multifunction: ___________________
Consent link: [link]
Survey link: [link]
[Interviewer: at the start of the interview, complete the following:
Connect to Skype for Business (audio only)
Confirm if respondent has already signed consent
If consent already signed, verbally confirm consent for participation, consent for recording, and consent for observers
Ask if respondent has any questions
Start recorder and state date, time, participant number, and reconfirm consent
Help respondent access the web survey
Ask respondent to click through to the Introduction page.]
[Interviewer: Read (or paraphrase) the Introduction to the respondent.]
[Interviewer: The think-aloud example (in blue) needs to be read verbatim.]
Introduction
The U.S. Census Bureau is requesting your help to test survey questions focused on collecting emergency economic information. If you agree to take part in this study, you will answer a few questions asking about the impacts of natural disasters on your agency’s operations. We recommend that you think about how your agency may respond to potential emergencies and unforeseen disruption.
Also, you will be asked to read the instructions and questions aloud and then “think aloud” as you answer each question. This may feel a little unnatural, but it will help us understand how you think about and answer each question. Here’s an example:
Suppose this were a survey about your home and one of the questions asked, “How many bedrooms are there in your house?” Someone thinking aloud as they respond might say, “What do you mean by bedrooms—are these just where people sleep? If so, I have 2 bedrooms. But if we are counting anything with a closet and a window, we also have a study. So perhaps, we have 3 bedrooms. I’m going with where people sleep, so I’ll answer 2 bedrooms.”
Today we are going to look at about 10-12 questions, and I’ll ask some additional debriefing questions. After we look at each question, I will ask you some follow-up questions to get your feedback. For example, I might ask you what a particular term or phrase means or ask you to put the question in your own words. There are no right or wrong answers to any of the questions that I might ask. We just want to make sure that everyone understands these survey questions in the same way, that the questions are worded clearly, and, if they are not, what changes should be made. Your participation is important because it will help the Census Bureau improve these questions to accurately measure economic activities of businesses like yours.
Before we begin, do you have any questions?
[Interviewer action: Answer questions, as needed.]
Let’s get started with some background information about your business.
What is your role or title?
How long have you worked at your office in this role?
Besides the pandemic, has your office experienced a natural disaster, like a hurricane, tornado, or flood, that caused your office to temporarily stop service to the public?
[If yes: instruct the respondent to think about that event when answering these questions:] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. When you see the term “natural disaster” in a question, I want you to think about [EVENT DISCUSSED ABOVE]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [EVENT DISCUSSED ABOVE].
[If no: instruct the respondent to think about a hypothetical natural disaster that is common in their area (e.g., a tornado, a fire, a hurricane, etc.):] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. Since you haven’t had a natural disaster that has impacted the agency while you have worked there, let’s think of a hypothetical disaster that may happen in the future. What area of the country are you in? [Guide respondent in selecting a natural disaster to talk about during the interview]. When you see the term “natural disaster” in a question, I want you to think about [HYPOTHETICAL EVENT DISCUSSED]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [HYPOTHETICAL EVENT DISCUSSED].
[Interviewer action: Ask respondent to move to next page.]
Now let’s move on to the survey. Please read each question out loud and then think aloud as you come up with your answer.
[Interviewer action: Ask respondent to move to next page.]
[Interviewer action: Observe how respondent answers the survey and watch for any difficulty in determining their answers to the questions. Prompt respondent to think aloud while answering/reading the questions if they do not.]
Probes for W2 Q7
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, how did this agency's total payroll for paid employees change as a result of the natural disaster?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
Probes for W2 Q8
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, how did this agency change the total number of paid employees?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W2 Q9
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, what percentage of this agency’s part-time paid employees (workers who received a W-2) were temporarily and permanently laid off as a result of the natural disaster?
Temporarily laid off workers are those who have been given a date to return to work or who are expected to return to work within 6 months. Permanently laid off workers have no expectation of being rehired within 6 months.
Estimates are acceptable. Enter 0 if no layoffs.
Temporarily laid off: ______ %
Permanently laid off: ______ %
⃝ Not applicable
THINKALOUD NOTES
In your opinion, was this question clearly written or not clearly written?
And for this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
This question asks you about “part-time” paid employees. How easy or difficult would it be for you to answer this question about “full-time” paid employees? [IF DIFFICULT AND IF NEEDED:] Please explain.
IF NOT APPLICABLE IS SELECTED FOR ANY GRID ITEM:
You selected “Not applicable” [for respective grid item(s)]. Can you tell me more about why you chose that response? [If necessary, probe for each NA option selected.]
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
On this screen, we first have the same question we just looked at and underneath, we have the question but with a slightly different wording. Please review both of these questions.
INTERVIEWER, FOR YOUR REFERENCE: The difference in wording is…
Option A says “what percentage of this agency’s… were temporarily and permanently laid off”
Option B says “what percentage of… did this agency lay off temporarily and permanently”
OPTION A: From July 1, 2022 to September 30, 2022, what percentage of this agency’s part-time paid employees (workers who received a W-2) were temporarily and permanently laid off as a result of the natural disaster?
OPTION B: From July 1, 2022 to September 30, 2022, what percentage of part-time paid employees (workers who received a W-2) did this agency lay off temporarily and permanently as a result of the natural disaster?
Temporarily laid off workers are those who have been given a date to return to work or who are expected to return to work within 6 months. Permanently laid off workers have no expectation of being rehired within 6 months.
Estimates are acceptable. Enter 0 if no layoffs.
Temporarily laid off: ______ %
Permanently laid off: ______ %
⃝ Not applicable
In your opinion, was Option B clearly written or not clearly written?
Which wording do you prefer? [IF NEEDED:] Please explain why.
Probes for W2 Q10
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, how did this agency change the total number of paid employee hours?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W2 Q11
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency take the following actions related to shifts or hours?
                                  Yes   No   Not applicable
Reduction in overtime              ⃝    ⃝        ⃝
Increase in overtime               ⃝    ⃝        ⃝
Reduction in shifts or hours       ⃝    ⃝        ⃝
Increase in shifts or hours        ⃝    ⃝        ⃝
THINKALOUD NOTES
Let’s look at the items in the table. How well do these items match your agency’s experience?
[IF THEY DO NOT MATCH AND IF NEEDED:] Which items do not match your agency’s experience? Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED FOR ANY GRID ITEM:
You selected “Not applicable” [for respective grid item(s)]. Can you tell me more about why you chose that response? [If necessary, probe for each NA option selected.]
Probes for W2 Q12
[Remind respondents about placeholder dates and “natural disaster” as needed]
Compared to what was normal before the natural disaster, how has the number of paid employees working from home one day or more per week changed as a result of the natural disaster?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
This question uses the phrase “working from home.” How well does this phrase match your agency’s experience?
[IF IT DOES NOT MATCH AND IF NEEDED:] What phrase would better match your agency’s experience?
This question defines employees who [“work from home” OR the phrase that respondents are telling you] as working from home one day or more per week. How well does this definition match your agency’s experience?
[IF IT DOES NOT MATCH AND IF NEEDED:] How does your agency classify workers who [“work from home” OR the phrase that respondents are telling you]?
How easy or difficult is it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
[IF NEEDED:] How does your agency keep records about employees’ [“working from home” OR the phrase that respondents are telling you] status?
[IF NEEDED:] For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W3 Q20
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, did this agency have less difficulty, no change, or more difficulty in recruiting paid employees compared to what was normal before the natural disaster?
⃝1 Less difficulty
⃝2 No change
⃝3 More difficulty
⃝4 Not applicable
THINKALOUD NOTES
How easy or difficult was it for you to answer this question about all employees?
[IF NEEDED:] If a natural disaster caused recruitment to be more difficult for certain types of employees but easier for other types of employees, how would you answer this question?
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W2 Q13
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, did the natural disaster affect this agency’s funding to pay its employees?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
In your own words, what do you think this question is asking?
This question references funding “to pay your employees.” How easy or difficult is it to answer this question, thinking only about funding to pay your employees? [INTERVIEWER: We want to know if they can separate funding specifically for paying employees from other funding, or if it’s one pot of money.]
What are some examples of how a natural disaster can affect your agency’s funding to pay your employees?
If your agency’s budget was frozen due to a natural disaster, how would you answer this question?
Probes for W2 Q14
[Remind respondents about placeholder dates and “natural disaster” as needed]
Did the authority over this agency’s budget change as a result of the natural disaster?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
And what about this question—In your own words, what do you think this question is asking?
What does the phrase “authority over this agency’s budget” mean to you?
[IF NEEDED:] What are some examples of an “authority” over your agency’s budget?
Does this phrase match your agency’s experience?
[IF NEEDED AND IF PHRASE DOES NOT MATCH:] What phrase would better match your agency’s experience?
What are some examples of how a natural disaster can change the authority over your agency’s budget?
Probes for W2 Q15
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, were any of this agency’s electronic business records or data permanently lost as a result of the natural disaster?
⃝1 Yes
⃝2 No
⃝3 Not applicable
THINKALOUD NOTES
How well does the phrase “business records or data” match your agency’s experience?
[IF IT DOES NOT MATCH AND IF NEEDED:] What phrase would better match your agency’s experience?
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W2 Q16
[Remind respondents about placeholder dates and “natural disaster” as needed]
And now let’s look at our last question.
From July 1, 2022 to September 30, 2022, did this agency have a backup of its electronic business records and data?
⃝1 Yes
⃝2 No
⃝3 Not applicable
THINKALOUD NOTES
What types of “backup” for your agency’s electronic business records were you thinking about when answering this question?
For this question, would you know the exact answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Concluding Probe(s)
Thinking back over the entire survey, is there anything else you’d like to discuss?
If you were taking this survey normally—not thinking aloud—approximately how long do you think it would take to complete?
Would you consult or request input from any other people in answering these questions?
IF YES:
Who would you consult? [Probe to capture position title, department, and any other details helpful to knowing where this information is available within the agency.]
How easy or difficult would it be to coordinate with them and get their help?
How much time do you think it would take to find out the answer to questions you’re unsure about? [Probe to clarify (1) the time respondents think it will take the other person to answer versus (2) the elapsed time from when the respondent requests assistance until they get an answer from the other person.]
Finally, what device did you use to answer the survey? Is it a laptop or a desktop computer? Or is it a mobile device like a tablet or a smartphone?
[Interviewer action: Ask observer if they have any questions they’d like to add, being sensitive to the respondent’s time.]
Those are all the questions I have for you. Thank you very much for participating!
Emergency Economic Information Collections Question Bank 2
Cognitive Interviewing Protocol
Wave 3 (PUBLIC SECTOR: Operational Impact Assessment)
Checklist
Legend:
Black font ‒ interview scripts and probes to be administered by interviewers
BLACK UNDERLINED UPPERCASE font ‒ conditions for additional probes
[Red font with brackets] ‒ interviewer actions to be performed
Blue italicized font ‒ read verbatim to respondents (Introduction specific)
Purple font ‒ testing question content (for interviewer reference only)
<item> ‒ varying elements to be replaced based on respondent’s answer for probing
Respondent number: ___________________
Interview date: ___________________
Interview start time: ___________________
Unit type: ___________________
Function (if special district): ___________________
Single/Multifunction: ___________________
Consent link: [link]
Survey link: [link]
[Interviewer: at the start of the interview, complete the following:
Connect to Skype for Business (audio only)
Confirm if respondent has already signed consent
If consent already signed, verbally confirm consent for participation, consent for recording, and consent for observers
Ask if respondent has any questions
Start recorder and state date, time, participant number, and reconfirm consent
Help respondent access the web survey
Ask respondent to click through to the Introduction page.]
[Interviewer: Read (or paraphrase) the Introduction to the respondent.]
[Interviewer: The think-aloud example (in blue) needs to be read verbatim.]
Introduction
The U.S. Census Bureau is requesting your help to test survey questions focused on collecting emergency economic information. If you agree to take part in this study, you will answer a few questions asking about the impacts of natural disasters on your agency’s operations. We recommend that you think about how your agency may respond to potential emergencies and unforeseen disruption.
Also, you will be asked to read the instructions and questions aloud and then “think aloud” as you answer each question. This may feel a little unnatural, but it will help us understand how you think about and answer each question. Here’s an example:
Suppose this were a survey about your home and one of the questions asked, “How many bedrooms are there in your house?” Someone thinking aloud as they respond might say, “What do you mean by bedrooms—are these just where people sleep? If so, I have 2 bedrooms. But if we are counting anything with a closet and a window, we also have a study. So perhaps, we have 3 bedrooms. I’m going with where people sleep, so I’ll answer 2 bedrooms.”
Today we are going to look at about 10-12 questions and I’ll ask some additional debriefing questions. After we look at each question, I will ask you some follow-up questions to get your feedback. For example, I might ask you what a particular term or phrase means or ask you to put the question in your own words. There are no right or wrong answers to any of the questions that I might ask. We just want to make sure that everyone understands these survey questions in the same way, that the questions are worded clearly, and, if they are not, what changes should be made. Your participation is important because it will help the Census Bureau improve these questions to accurately measure economic activities of businesses like yours.
Before we begin, do you have any questions?
[Interviewer action: Answer questions, as needed.]
Let’s get started with some background information about your business.
What is your role or title?
How long have you worked at your office in this role?
Besides the pandemic, has your office experienced a natural disaster, like a hurricane, tornado, or flood, that caused your office to temporarily stop service to the public?
[If yes: instruct the respondent to think about that event when answering these questions:] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. When you see the term “natural disaster” in a question, I want you to think about [EVENT DISCUSSED ABOVE]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [EVENT DISCUSSED ABOVE].
[If no: instruct the respondent to think about a hypothetical natural disaster that is common in their area (e.g., a tornado, a fire, a hurricane, etc.):] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. Since you haven’t had a natural disaster that has impacted the agency while you have worked there, let’s think of a hypothetical disaster that may happen in the future. What area of the country are you in? [Guide respondent in selecting a natural disaster to talk about during the interview]. When you see the term “natural disaster” in a question, I want you to think about [HYPOTHETICAL EVENT DISCUSSED]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [HYPOTHETICAL EVENT DISCUSSED].
[Interviewer action: Ask respondent to move to next page.]
Now let’s move on to the survey. Please read each question out loud and then think aloud as you come up with your answer.
[Interviewer action: Ask respondent to move to next page.]
[Interviewer action: Observe how respondent answers the survey and watch for any difficulty in determining their answers to the questions. Prompt respondent to think aloud while answering/reading the questions if they do not.]
Probes for W3 Q17
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency have higher than usual, no change in, or lower than usual voluntary turnover for paid employees?
Voluntary turnover occurs when an employee chooses to leave a position that then needs to be refilled.
⃝1 Higher than usual
⃝2 No change
⃝3 Lower than usual
THINKALOUD NOTES
This question provides a definition of “voluntary turnover.” In your opinion, is this definition written clearly or not clearly?
In your agency’s experience, have there been times when an employee chose to leave a position but your agency chose not to refill that position?
[IF YES TO PROBE 2:] Do you think that voluntary turnover includes those instances of an employee leaving and your agency not refilling that position?
[IF YES TO PROBE 2:] Does asking you to exclude these cases make it easier or more difficult to answer this question, or does it not matter? [IF NEEDED:] Please explain.
And for this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
On this screen, we first have the same question we just looked at and underneath, we have the question but with a slightly different wording. Please review both of these questions.
OPTION A: From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency have higher than usual, no change in, or lower than usual voluntary turnover for paid employees?
OPTION B: From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency have higher than usual, no change in, or lower than usual voluntary paid employee turnover?
Voluntary turnover occurs when an employee chooses to leave a position that then needs to be refilled.
⃝1 Higher than usual
⃝2 No change
⃝3 Lower than usual
Option A uses the phrase “voluntary turnover for paid employees” and Option B uses the phrase “voluntary paid employee turnover.” Do these phrases mean the same thing to you, or are they different?
[IF DIFFERENT AND IF NEEDED:] How are they different?
Do you think that Option A is more clear, Option B is more clear, or does it not matter to you?
Probes for W3 Q18
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency put any hiring freezes in place?
⃝1 Yes
⃝2 No
⃝3 Not applicable
THINKALOUD NOTES
How well does the phrase “hiring freeze” match your agency’s experience?
[IF IT DOESN’T MATCH AND IF NEEDED:] What phrase would better match your agency’s experience?
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
Probes for W3 Q19
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency hire any additional paid employees?
⃝1 Yes
⃝2 No
⃝3 Not applicable
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W3 Q20
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, did this agency have less difficulty, no change, or more difficulty in recruiting paid employees compared to what was normal before the natural disaster?
⃝1 Less difficulty
⃝2 No change
⃝3 More difficulty
⃝4 Not applicable
THINKALOUD NOTES
How easy or difficult was it for you to answer this question about all employees?
[IF NEEDED:] If a natural disaster caused recruitment to be more difficult for certain types of employees but easier for other types of employees, how would you answer this question?
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W2 Q12
[Remind respondents about placeholder dates and “natural disaster” as needed]
Compared to what was normal before the natural disaster, how has the number of paid employees working from home one day or more per week changed as a result of the natural disaster?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
This question uses the phrase “working from home.” How well does this phrase match your agency’s experience?
[IF IT DOES NOT MATCH AND IF NEEDED:] What phrase would better match your agency’s experience?
This question defines employees who [“work from home” OR the phrase that respondents are telling you] as working from home one day or more per week. How well does this definition match your agency’s experience?
[IF IT DOES NOT MATCH AND IF NEEDED:] How does your agency classify workers who [“work from home” OR the phrase that respondents are telling you]?
How easy or difficult is it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
[IF NEEDED:] How does your agency keep records about employees’ [“working from home” OR the phrase that respondents are telling you] status?
[IF NEEDED:] For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W3 Q21
[Remind respondents about placeholder dates and “natural disaster” as needed]
INTERVIEWER: Only special districts should answer this question.
If the agency is not a special district, instruct them to skip this question.
If you are unsure or the person you are speaking with is in charge of reporting for several agencies, ask this question.
From July 1, 2022 to September 30, 2022, did the natural disaster cause this agency’s primary function to change?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
In your own words, what do you think this question is asking?
What does the phrase “primary function” mean to you?
How easy or difficult is it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
This question asks you about the July to September quarter of 2022. How easy or difficult would it be for you to answer this question if it asked about “in the last month”? [IF DIFFICULT AND IF NEEDED:] Please explain.
What about “in the last fiscal year”? [IF DIFFICULT AND IF NEEDED:] Please explain.
What about “in the current fiscal year”? [IF DIFFICULT AND IF NEEDED:] Please explain.
Probes for W1 Q6
[Remind respondents about placeholder dates and “natural disaster” as needed]
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did any of the structures (such as buildings or parking garages) used by this agency sustain the following kinds of damage?
                                  Yes   No   Not applicable
Minor damage to structure(s)       ⃝    ⃝        ⃝
Major damage to structure(s)       ⃝    ⃝        ⃝
Structure(s) destroyed             ⃝    ⃝        ⃝
THINKALOUD NOTES
In your opinion, was this question clearly written or not clearly written?
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
IF NOT APPLICABLE IS SELECTED FOR ANY GRID ITEM:
You selected “Not applicable” [for respective grid item(s)]. Can you tell me more about why you chose that response? [If necessary, probe for each NA option selected.]
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
On this screen, we first have the same question we just looked at and underneath, we have the question but with a slightly different wording. Please review both of these questions.
OPTION A: From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did any of the structures (such as buildings or parking garages) used by this agency sustain the following kinds of damage?
OPTION B: From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did any of the structures (such as buildings or parking garages) in which this agency operates sustain the following kinds of damage?
                                  Yes   No   Not applicable
Minor damage to structure(s)       ⃝    ⃝        ⃝
Major damage to structure(s)       ⃝    ⃝        ⃝
Structure(s) destroyed             ⃝    ⃝        ⃝
Option A uses the phrase “used by this agency” and Option B uses the phrase “in which this agency operates.” Do these phrases mean the same thing to you, or are they different?
[IF DIFFERENT AND IF NEEDED:] How are they different?
Which wording do you prefer? [IF NEEDED:] Please explain why.
Probes for W3 Q22
[Remind respondents about placeholder dates and “natural disaster” as needed]
The next few questions will ask about a different topic.
From July 1, 2022 to September 30, 2022, how did this agency’s operating budget change compared to what was normal before the natural disaster?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
How well does the phrase “operating budget” match your agency’s experience?
[IF IT DOES NOT MATCH AND IF NEEDED:] What phrase would better match your agency’s experience?
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W3 Q23
[Remind respondents about placeholder dates and “natural disaster” as needed]
As a result of the natural disaster, did this agency apply for any government loans or assistance from July 1, 2022 to September 30, 2022?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
Probes for W3 Q24
[Remind respondents about placeholder dates and “natural disaster” as needed]
As a result of the natural disaster, did this agency receive any government loans or assistance from July 1, 2022 to September 30, 2022?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
Probes for W3 Q25
[Remind respondents about placeholder dates and “natural disaster” as needed]
And now let’s take a look at our last question.
As a result of the natural disaster, how did this agency change its budgeted capital expenditures from July 1, 2022 to September 30, 2022? Select all that apply.
⃝1 Canceled budgeted capital expenditure(s)
⃝2 Decreased budgeted capital expenditure(s)
⃝3 Postponed budgeted capital expenditure(s)
⃝4 Increased budgeted capital expenditure(s)
⃝5 Introduced new unbudgeted capital expenditure(s)
⃝6 No changes
⃝7 Not applicable
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
Let’s take a look at the response options. How well do these response options match your agency’s experience?
[IF THEY DON’T MATCH AND IF NEEDED:] What response options would make sense for your agency?
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Concluding Probe(s)
Thinking back over the entire survey, is there anything else you’d like to discuss?
If you were taking this survey normally—not thinking aloud—approximately how long do you think it would take to complete?
Would you consult or request input from any other people in answering these questions?
IF YES:
Who would you consult? [Probe to capture position title, department, and any other details helpful to knowing where this information is available within the agency.]
How easy or difficult would it be to coordinate with them and get their help?
How much time do you think it would take to find out the answer to questions you’re unsure about? [Probe to clarify (1) the time respondents think it will take the other person to answer versus (2) the elapsed time from when the respondent requests assistance until they get an answer from the other person.]
Finally, what device did you use to answer the survey? Is it a laptop or a desktop computer? Or is it a mobile device like a tablet or a smartphone?
[Interviewer action: Ask observer if they have any questions they’d like to add, being sensitive to the respondent’s time.]
Those are all the questions I have for you. Thank you very much for participating!
Emergency Economic Information Collections Question Bank 2
Cognitive Interviewing Protocol
Wave 3 (PUBLIC SECTOR)
Checklist
Legend:
Black font ‒ interview scripts and probes to be administered by interviewers
BLACK UNDERLINED UPPERCASE font ‒ conditions for additional probes
[Red font with brackets] ‒ interviewer actions to be performed
Blue italicized font ‒ read verbatim to respondents (Introduction specific)
Purple font ‒ testing question content (for interviewer reference only)
<item> ‒ varying elements to be replaced based on respondent’s answer for probing
Respondent number: ___________________
Interview date: ___________________
Interview start time: ___________________
Unit type: ___________________
Function (if special district): ___________________
Single/Multifunction: ___________________
Consent link: [link]
Survey link: [link]
[Interviewer: at the start of the interview, complete the following:
Connect to Skype for Business (audio only)
Confirm if respondent has already signed consent
If consent already signed, verbally confirm consent for participation, consent for recording, and consent for observers
Ask if respondent has any questions
Start recorder and state date, time, participant number, and reconfirm consent
Help respondent access the web survey
Ask respondent to click through to the Introduction page.]
[Interviewer: Read (or paraphrase) the Introduction to the respondent.]
[Interviewer: The think-aloud example (in blue) needs to be read verbatim.]
Introduction
The U.S. Census Bureau is requesting your help to test survey questions focused on collecting emergency economic information. If you agree to take part in this study, you will answer a few questions asking about the impacts of natural disasters on your agency’s operations. We recommend that you think about how your agency may respond to potential emergencies and unforeseen disruption.
Also, you will be asked to read the instructions and questions aloud and then “think aloud” as you answer each question. This may feel a little unnatural, but it will help us understand how you think about and answer each question. Here’s an example:
Suppose this were a survey about your home and one of the questions asked, “How many bedrooms are there in your house?” Someone thinking aloud as they respond might say, “What do you mean by bedrooms—are these just where people sleep? If so, I have 2 bedrooms. But if we are counting anything with a closet and a window, we also have a study. So perhaps, we have 3 bedrooms. I’m going with where people sleep, so I’ll answer 2 bedrooms.”
Today we are going to look at about 10-12 questions and I’ll ask some additional debriefing questions. After we look at each question, I will ask you some follow-up questions to get your feedback. For example, I might ask you what a particular term or phrase means or ask you to put the question in your own words. There are no right or wrong answers to any of the questions that I might ask. We just want to make sure that everyone understands these survey questions in the same way, to see whether the questions are worded clearly and, if they are not, what changes should be made. Your participation is important because it will help the Census Bureau improve these questions to accurately measure economic activities of agencies like yours.
Before we begin, do you have any questions?
[Interviewer action: Answer questions, as needed.]
Let’s get started with some background information about your office.
What is your role or title?
How long have you worked at your office in this role?
Besides the pandemic, has your office experienced a natural disaster, like a hurricane, tornado, or flood, that caused your office to temporarily stop service to the public?
[If yes: instruct the respondent to think about that event when answering these questions:] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. When you see the term “natural disaster” in a question, I want you to think about [EVENT DISCUSSED ABOVE]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [EVENT DISCUSSED ABOVE].
[If no: instruct the respondent to think about a hypothetical natural disaster that is common in their area (e.g., a tornado, a fire, a hurricane, etc.):] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. Since you haven’t had a natural disaster that has impacted the agency while you have worked there, let’s think of a hypothetical disaster that may happen in the future. What area of the country are you in? [Guide respondent in selecting a natural disaster to talk about during the interview]. When you see the term “natural disaster” in a question, I want you to think about [HYPOTHETICAL EVENT DISCUSSED]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [HYPOTHETICAL EVENT DISCUSSED].
[Interviewer action: Ask respondent to move to next page.]
Now let’s move on to the survey. Please read each question out loud and then think aloud as you come up with your answer.
[Interviewer action: Ask respondent to move to next page.]
[Interviewer action: Observe how respondent answers the survey and watch for any difficulty in determining their answers to the questions. Prompt respondent to think aloud while answering/reading the questions if they do not.]
Probes for W1 Q4
[Remind respondents about placeholder dates and “natural disaster” as needed]
From April 1, 2023 to June 30, 2023, how much did need for this agency’s services change compared to what was normal before the natural disaster?
⃝1 Increased a lot
⃝2 Increased a moderate amount
⃝3 Did not change
⃝4 Decreased a moderate amount
⃝5 Decreased a lot
THINKALOUD NOTES
What does the phrase “need for this agency’s services” mean to you, as it’s used here in this question?
Does your agency provide more than one type of service?
[IF YES, HAS MORE THAN ONE TYPE OF SERVICE:] How easy or difficult is it for you to answer this question about all of your agency’s services?
[IF YES, HAS MORE THAN ONE TYPE OF SERVICE AND IF NEEDED:] If a natural disaster increased need for one type of service but decreased need for another type of service, how would you answer this question?
[IF NO, HAS ONLY ONE TYPE OF SERVICE:] How easy or difficult is it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
Probes for W3 Q21
[Remind respondents about placeholder dates and “natural disaster” as needed]
INTERVIEWER: Only special districts should answer this question.
If the agency is not a special district, instruct them to skip this question.
If you are unsure or the person you are speaking with is in charge of reporting for several agencies, ask this question.
From April 1, 2023 to June 30, 2023, did the natural disaster cause this agency to add additional functions? Examples of functions are Fire Protection, Sewage and Water Supply, Utility, etc.
⃝1 Yes
⃝2 No
[IF YES:] Please describe the function(s) added:
_________________________________________________
THINKALOUD NOTES
In your own words, what do you think this question is asking?
What kinds of functions were you thinking about while answering this question?
How easy or difficult is it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
If your agency did add additional functions because of a natural disaster, what kinds of things do you think you would write in the text box?
Probes for Q55
[Remind respondents about placeholder dates and “natural disaster” as needed]
From April 1, 2023 to June 30, 2023, how did this agency’s expenses change compared to what was normal before the natural disaster?
⃝1 Expenses increased
⃝2 Expenses did not change
⃝3 Expenses decreased
THINKALOUD NOTES
What kind of expenses were you thinking about while answering this question?
[IF NEEDED:] Can you think of an example of how a natural disaster could cause your agency’s expenses to change?
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
Probes for W3 Q20
[Remind respondents about placeholder dates and “natural disaster” as needed]
From April 1, 2023 to June 30, 2023, did this agency have less difficulty, no change, or more difficulty in hiring paid employees compared to what was normal before the natural disaster?
⃝1 Less difficulty
⃝2 No change
⃝3 More difficulty
⃝4 Not applicable
THINKALOUD NOTES
In your own words, what do you think this question is asking?
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
[IF NEEDED:] If a natural disaster caused hiring to be more difficult for certain types of employees but easier for other types of employees, how would you answer this question?
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W2 Q10
[Remind respondents about placeholder dates and “natural disaster” as needed]
From April 1, 2023 to June 30, 2023, as a result of the natural disaster, how did the agency’s total number of paid employee hours change?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
[IF RESPONDENT GAVE AN ANSWER:] How did you arrive at your answer to this question?
[IF RESPONDENT IS ANSWERING HYPOTHETICALLY AND DID NOT PROVIDE AN EXACT ANSWER:] If a natural disaster were to happen, how do you think you might respond to this question?
[IF NEEDED:] How did you arrive at your answer to this question?
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W2 Q9
[Remind respondents about placeholder dates and “natural disaster” as needed]
From April 1, 2023 to June 30, 2023, what percentage of this agency’s part-time paid employees (workers who received a W-2) were temporarily and permanently laid off as a result of the natural disaster?
Include:
Part-time employees and officials
Current employees in paid leave status
All elected or appointed officials paid any amount of pay or stipend (even small amounts of $25 per meeting or $100 annually) or paid on salary basis; by fees or commissions; on a per meeting basis; or a flat sum quarterly, semi-annually, or annually
Temporary or seasonal employees
Exclude:
Employees on unpaid leave, unpaid officials, pensioners, and contractors and their employees
Temporarily laid off workers are those who have been given a date to return to work or who are expected to return to work within 6 months. Permanently laid off workers have no expectation of being rehired within 6 months.
Estimates are acceptable. Enter 0 if no layoffs.
Temporarily laid off: _______ %
Permanently laid off: _______ %
⃝ Not applicable – This agency has no part-time paid employees
THINKALOUD NOTES
INTERVIEWER: If the participant says that they do not have any part-time employees, then ask them to think about full-time employees while answering the probes. Please note here if they are thinking/answering about part-time or full-time employees.
This question has several parts and I want to review them one piece at a time. First, let’s review the question—the text before “Include.” In your opinion, did you think this question was clearly written or not clearly written? [IF NOT CLEAR:] Please explain.
Now let’s take a look at the instructions written about who to include and who to exclude—the list in bullet points. Did you find these instructions clear or not clear? [IF NOT CLEAR:] Please explain.
Did you find these instructions about who to include and exclude helpful or not helpful? [IF NOT HELPFUL:] Please explain.
Now let’s take a look at the definition of temporarily laid off—the text written in italics. Did you find this definition clear or not clear? [IF NOT CLEAR:] Please explain.
Overall, how easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
And for this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W1 Q2B
[Remind respondents about placeholder dates and “natural disaster” as needed]
INTERVIEWER, SAY: Respondents who tell us on a previous question that they were temporarily closed during the natural disaster will receive this next question. I’ll have you look at this question for our testing so that I can ask you some questions for your feedback.
From April 1, 2023 to June 30, 2023, did the following factors related to the natural disaster influence this agency’s decision to close temporarily?
(For each factor, select Yes or No.)
State of emergency declared by the federal, state, or local government
Stay-at-home orders
Restricted or limited access to the agency’s worksite(s)
Employee safety
School closings, which led to a high number of absent staff
Staff’s unwillingness or hesitation to report for work
Other, please describe: _________________________________
⃝ Not applicable – This agency did not temporarily close from April 1, 2023 to June 30, 2023
THINKALOUD NOTES
Overall, how easy or difficult is it for you to answer this question?
Let’s focus on the reasons for temporary closures in the table. How easy or difficult was it for you to answer the first row in the table—state of emergency declared? [IF DIFFICULT:] Please explain.
Do you think that this first row [STATE OF EMERGENCY DECLARED] is the same thing as or different from the second row—stay-at-home orders? [IF DIFFERENT:] Please explain.
Now let’s take a look at the fifth row in the table—“School closings, which led to a high number of absent staff.” How easy or difficult was it for you to answer this item? [IF DIFFICULT:] Please explain.
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for W3 Q24
[Remind respondents about placeholder dates and “natural disaster” as needed]
As a result of the natural disaster, did this agency receive any government loans/grants from April 1, 2023 to June 30, 2023?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
What kinds of loans or grants were you thinking about while answering this question?
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
Concluding Probe(s)
Thinking back over the entire survey, is there anything else you’d like to discuss?
If you were taking this survey normally—not thinking aloud—approximately how long do you think it would take to complete?
Would you consult or request input from any other people in answering these questions?
IF YES:
Who would you consult? [Probe to capture position title, department, and any other details helpful to knowing where this information is available within the agency.]
How easy or difficult would it be to coordinate with them and get their help?
How much time do you think it would take to find out the answer to questions you’re unsure about? [Probe to clarify (1) the time respondents think it will take the other person to answer versus (2) the lapsed time from when the respondent requests assistance until they get an answer from the other person.]
Finally, what device did you use to answer the survey? Is it a laptop or a desktop computer? Or is it a mobile device like a tablet or a smart phone?
[Interviewer action: Ask observer if they have any questions they’d like to add, being sensitive to the respondent’s time.]
Those are all the questions I have for you. Thank you very much for participating!
Emergency Economic Information Collections Question Bank 2
Cognitive Interviewing Protocol
Wave 1 (PRIVATE SECTOR: Damage and operations)
Checklist
Legend:
Black font ‒ interview scripts and probes to be administered by interviewers
BLACK UNDERLINED UPPERCASE font ‒ conditions for additional probes
[Red font with brackets] ‒ interviewer actions to be performed
Blue italicized font ‒ read verbatim to respondents (Introduction specific)
Purple font ‒ testing question content (for interviewer reference only)
<item> ‒ varying elements to be replaced based on respondent’s answer for probing
Respondent number: ___________________
Interview date: ___________________
Interview start time: ___________________
Manufacturing? (Y/N): ___________________
Consent link: [LINK]
Survey link: [LINK]
[Interviewer: at the start of the interview, complete the following:
Connect to Microsoft Teams (audio only)
Confirm if respondent has already signed consent
If consent already signed, verbally confirm consent for participation, consent for recording, and consent for observers
Ask if respondent has any questions
Start recorder and state date, time, participant number, and reconfirm consent
Help respondent access the web survey]
[Interviewer: Read (or paraphrase) the Introduction to the respondent.]
[Interviewer: The think-aloud example (in blue) needs to be read verbatim.]
Introduction
The U.S. Census Bureau is requesting your help to test survey questions focused on collecting emergency economic information. If you agree to take part in this study, you will answer a few questions asking about the impacts of natural disasters on your company’s operations. We recommend that you think about how your company may respond to potential emergencies and unforeseen disruption.
Also, you will be asked to read the instructions and questions aloud and then “think aloud” as you answer each question. This may feel a little unnatural, but it will help us understand how you think about and answer each question. Here’s an example:
Suppose this were a survey about your home and one of the questions asked, “How many bedrooms are there in your house?” Someone thinking aloud as they respond might say, “What do you mean by bedrooms—are these just where people sleep? If so, I have 2 bedrooms. But if we are counting anything with a closet and a window, we also have a study. So perhaps, we have 3 bedrooms. I’m going with where people sleep, so I’ll answer 2 bedrooms.”
Today we are going to look at about 8-10 questions and I’ll ask some additional debriefing questions. After we look at each question, I will ask you some follow-up questions to get your feedback. For example, I might ask you what a particular term or phrase means or ask you to put the question in your own words. There are no right or wrong answers to any of the questions that I might ask. We just want to make sure that everyone understands these survey questions in the same way, to see whether the questions are worded clearly and, if they are not, what changes should be made. Your participation is important because it will help the Census Bureau improve these questions to accurately measure economic activities of businesses like yours.
Before we begin, do you have any questions?
[Interviewer action: Answer questions, as needed.]
Let's get started with some background information about your company.
First, tell me about your company: What do you do or make?
What is your role or title?
How long have you worked at your company in this role?
Can you tell me about your experience completing business surveys from the Census Bureau?
In the length of your employment at this company, have you ever experienced a natural disaster like a hurricane, tornado, or flood that seriously impacted the company?
[If yes: instruct the respondent to think about that event when answering these questions:] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. When you see the term “natural disaster” in a question, I want you to think about [EVENT DISCUSSED ABOVE]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [EVENT DISCUSSED ABOVE].
[If no: instruct the respondent to think about a hypothetical natural disaster that is common in their area (e.g., a tornado, a fire, a hurricane, etc.):] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. Since you haven’t had a natural disaster that has impacted the company while you have worked there, let’s think of a hypothetical disaster that may happen in the future. What area of the country are you in? [Guide respondent in selecting a natural disaster to talk about during the interview]. When you see the term “natural disaster” in a question, I want you to think about [HYPOTHETICAL EVENT DISCUSSED]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [HYPOTHETICAL EVENT DISCUSSED].
[Interviewer action: Ask respondent to move to next page.]
Now let’s move on to the survey. Please read each question out loud and then think aloud as you come up with your answer.
[Interviewer action: Ask respondent to move to next page.]
[Interviewer action: Observe how respondent answers the survey and watch for any difficulty in determining their answers to the questions. Prompt respondent to think aloud while answering/reading the questions if they do not.]
Probes for Q106
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, were any of this company’s locations designated as essential by the federal, state, or local government?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
What does “essential” mean to you, as it’s used in this question?
Do you think this question is applicable to natural disasters outside of the Coronavirus pandemic? Why or why not? [If yes, probe for examples.]
How confident are you about your answer to this question?
[If the participant is not confident, ask:] Would you prefer to have a “Don’t know” response option available, or are the current response options okay?
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT PAGE.
SAY: On this page, we have the same question but split into three questions: the first asking about being designated as essential by the federal government, the second asking about the state government, and the third asking about the local government. I want to ask you some questions about this different format.
INTERVIEWER: RESPONDENT DOES NOT NEED TO ANSWER THESE QUESTIONS, JUMP TO ASKING PROBES.
From <date1> to <date2>, were any of this company’s locations designated as essential by the federal government?
⃝1 Yes
⃝2 No
From <date1> to <date2>, were any of this company’s locations designated as essential by the state government?
⃝1 Yes
⃝2 No
From <date1> to <date2>, were any of this company’s locations designated as essential by the local government?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
How easy or difficult is it for you to know if your company has been designated as essential by the federal, state, or local government separately? [IF DIFFICULT:] Please explain.
How confident would you be in your answers to these three questions?
[IF THE PARTICIPANT IS NOT CONFIDENT, ASK:] Would you prefer to have a “Don’t know” response option available, or are the current response options okay?
Probes for Q122 (Wave 1 & 2)
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, what was the total amount of time this company was temporarily closed as a result of the natural disaster?
Temporary closures can include times when a small number of employees monitor and maintain any of a company’s physical locations while it is otherwise shut down.
⃝1 Did not close (remained open either in full or limited capacity)
⃝2 Less than 1 week
⃝3 1-3 weeks
⃝4 4 weeks or longer
THINKALOUD NOTES
This question gives a definition of temporary closures. Did you think that this definition was clear or not clear? [IF NOT CLEAR:] Please explain.
This definition references a “small” number of employees monitoring and maintaining locations. If this definition said a “limited” number of employees, would that be easier to understand, more difficult to understand, or make no difference? Please explain.
What about if it said “reduced” number—would that be easier to understand, more difficult to understand, or make no difference? Please explain.
How confident are you about your answer to this question?
Does your company have more than one location?
[If YES:] How easy or difficult was it for you to answer this question, thinking about all of your company’s locations?
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, how did this company’s net profits change compared to what was normal before the natural disaster? If this company is non-profit or not-for-profit, select “Not applicable.”
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
Is this company a non-profit or not-for-profit organization?
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
This question provides an instruction about when to select “Not applicable.” Did you think that this instruction was clear or not clear? [IF NOT CLEAR:] Please explain.
In this instruction, we use the terms “non-profit” and “not-for-profit.” Did you think that these terms were clear or not clear? [IF NOT CLEAR:] Please explain.
Do you think that we are missing any other terms used to describe non-profits or not-for profits?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for Q108
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, how did this company’s on-site operations change compared to what was normal before the natural disaster?
If operations ceased on-site but continued remotely (such as working from home), please select “on-site operations ceased.”
⃝1 On-site operations increased
⃝2 On-site operations continued with no change
⃝3 On-site operations decreased
⃝4 On-site operations ceased
THINKALOUD NOTES
Did you think that this question was written clearly or not clearly? [IF NOT CLEARLY:] Please explain.
What does “on-site operations” mean for your company, as it’s used here in this question?
[IF NEEDED:] How does your company define or measure its on-site operations? [INTERVIEWER: we want to know how they are thinking about this phrase as it relates to the productivity—are they thinking about profits? Number of employees? Production outputs?]
We give an instruction here for when to select “on-site operations ceased.” As used in this instruction, is the phrase “continued remotely (such as working from home)” clear or not clear to you? [IF NOT CLEAR:] Please explain.
Does your company use either of the phrases “remotely” or “working from home”?
[IF NO:] How easy or difficult was it for you to understand this instruction?
Probes for Q24
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company have higher than usual, no change in, or lower than usual voluntary turnover for paid employees?
Voluntary turnover occurs when an employee chooses to leave a position that then needs to be refilled.
⃝1 Higher than usual
⃝2 No change
⃝3 Lower than usual
THINKALOUD NOTES
This question provides a definition of “voluntary turnover.” In your opinion, is this definition written clearly or not clearly?
In your company’s experience, have there been times when an employee chooses to leave a position but your company has chosen not to refill that position?
[IF YES TO PROBE 22 (THEY INDICATE THEY HAVE EVER EXPERIENCED NOT REFILLING POSITIONS):] Do you think that voluntary turnover includes those instances of an employee leaving and your company not refilling that position?
[IF YES TO PROBE 22 (THEY INDICATE THEY HAVE EVER EXPERIENCED NOT REFILLING POSITIONS):] Does asking you to exclude these cases make it easier or more difficult to answer this question, or it doesn’t matter? [IF NEEDED:] Please explain.
And for this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
Probes for Q144
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did any of the structures in which this company operates sustain the following kinds of damage?
(For each row, select Yes, No, or Not applicable.)
Minor damage to structure(s)
Major damage to structure(s)
Structure(s) destroyed
THINKALOUD NOTES
Did you think that this question was written clearly or not clearly? [IF NOT CLEARLY:] Please explain.
What kind of structures were you thinking about while answering this question?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response? [PROBE AS NEEDED FOR EACH N/A RESPONSE GIVEN]
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
SAY: On this screen, we have the same question, but now there are examples provided of what we mean by structures—provided in the parentheses. So the examples include buildings, parking garages, warehouses, etc. Now I will ask you some questions about these examples.
INTERVIEWER: RESPONDENT DOES NOT NEED TO ANSWER THESE QUESTIONS, JUMP TO ASKING PROBES.
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did any of the structures (such as buildings, parking garages, warehouses, etc.) in which this company operates sustain the following kinds of damage?
(For each row, select Yes, No, or Not applicable.)
Minor damage to structure(s)
Major damage to structure(s)
Structure(s) destroyed
THINKALOUD NOTES
Did you find these examples helpful or not helpful? [IF NOT HELPFUL:] Please explain.
Did these examples make you think of any other structures that you had not previously considered? If so, what were they?
Probes for Q146
[Remind respondents about placeholder dates and “natural disaster” as needed]
INTERVIEWER, SAY: Respondents who tell us on the previous question that their structures received any kind of damage will receive this next question. I’ll have you look at this question for our testing so that I can ask you some questions for your feedback.
What caused the damage to the structures? Please think about the damage due to the natural disaster. Select all that apply.
☐1 Water damage from rain, flood, snow melt, storm surge, or sewer backup/overflow
☐2 Mold or mildew
☐3 Smoke or fire
☐4 Ice or hail, including damage from related fallen tree/debris
☐5 Wind, including damage from related fallen tree/debris
☐6 Mud, rock, or other debris from moving earth
☐7 Man-made disaster (intentional such as riot/bombing or technological/industrial accident)
☐8 Other (please describe):
THINKALOUD NOTES
How easy or difficult was it for you to answer this question?
Now let’s focus on the response options. Do you think that each of these response options are clear or not clear? [IF NOT CLEAR, FIND OUT WHICH ONES AND WHY]
Do you think that the first option “water damage” and the second option “mold or mildew” are the same thing or different things?
Probes for Q146A
[Remind respondents about placeholder dates and “natural disaster” as needed]
SAY: This is a follow-up to our previous question, so only the items that people report as causes for the damage on the previous question will be listed here. But today I would like you to review all the items so I can ask you questions for our feedback.
INTERVIEWER: WE DON’T EXPECT RESPONDENTS TO GIVE FINAL ANSWERS ON THESE ITEMS.
Did your company submit any insurance claims for structural damage caused by the following?
(For each cause, select Yes or No.)
Water damage from rain, flood, snow melt, storm surge, or sewer backup/overflow
Mold or mildew
Smoke or fire
Ice or hail, including damage from related fallen tree/debris
Wind, including damage from related fallen tree/debris
Mud, rock, or other debris from moving earth
Man-made disaster (intentional such as riot/bombing or technological/industrial accident)
Other (please describe): _________________________________
THINKALOUD NOTES
Let’s first look at the question itself. Did you think that this question was clearly written or not clearly written?
As far as you know, would your company have to submit a separate insurance claim for “mold or mildew,” or would this be covered under water damage?
Do you think you would know the answer to this question right away, would you make a guess, or would you consult records or other colleagues?
As far as you know, about how long after a natural disaster occurs would your company submit insurance claims for structural damage?
Probes for Q88 (Waves 1, 2, & 3)
[INTERVIEWER: This question is for manufacturing only. If the respondent’s company is not a manufacturing company, instruct them to skip the next two screens.]
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, inventories of raw materials, work-in-process, or finished goods) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
Do you think that this question was written clearly or not clearly?
We ask about inventories that were “damaged, destroyed, or decreased in value.” What does “destroyed” mean to you, as it’s used in this question?
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
SAY: On this screen, we first have Version A, which is the same question we just looked at. And then underneath that, we have Version B, which is the same question, but the example in parentheses, “raw materials,” is replaced with “materials-and-supplies.” I would like to ask you some questions about this wording for your feedback.
INTERVIEWER: RESPONDENT DOES NOT NEED TO ANSWER THESE QUESTIONS, JUMP TO ASKING PROBES.
VERSION A: From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, inventories of raw materials, work-in-process, or finished goods) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
VERSION B: From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, materials-and-supplies inventory, work-in-process inventory, or finished goods inventory) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
In this question, we list three different kinds of inventories. In Version A, what does “inventories of raw materials” mean to you? [IF NEEDED:] What does your company consider “inventories of raw materials”?
Version B says “materials-and-supplies inventory” instead of “raw materials.” Do you think that this is more clear, less clear, or is there no difference to you?
What does “materials-and-supplies inventory” mean to you? [IF NEEDED:] What does your company consider “materials-and-supplies inventory”?
Would you answer Version B about “materials-and-supplies” differently than Version A about “raw materials”? [If YES:] How would you answer this question?
Both questions list the next type of inventory as “work-in-process.” What does this mean to you, as it’s used in this question? [IF NEEDED:] What does your company consider “inventories of work-in-process”?
And both questions list the last type of inventory as “finished goods.” What does this mean to you, as it’s used in this question? [IF NEEDED:] What does your company consider “inventories of finished goods”?
Concluding Probe(s)
Thinking back over the entire survey, is there anything else you’d like to discuss?
If you were taking this survey normally—not thinking aloud—approximately how long do you think it would take to complete?
Would you consult or request input from any other people in answering these questions?
IF YES:
Who would you consult? [Probe to capture position title, department, and any other details helpful to knowing where this information is available within the company.]
How easy or difficult would it be to coordinate with them and get their help?
How much time do you think it would take to find out the answer to questions you’re unsure about? [Probe to clarify (1) the time respondents think it will take the other person to answer versus (2) the lapsed time from when the respondent requests assistance until they get an answer from the other person.]
Finally, what device did you use to answer the survey? Is it a laptop or a desktop computer? Or is it a mobile device like a tablet or a smartphone?
[Interviewer action: Ask observer if they have any questions they’d like to add, being sensitive to the respondent’s time.]
Those are all the questions I have for you. Thank you very much for participating!
Emergency Economic Information Collections Question Bank 2
Cognitive Interviewing Protocol
Wave 2 (PRIVATE SECTOR: Emergency preparedness and operating status)
Checklist
Legend:
Black font ‒ interview scripts and probes to be administered by interviewers
BLACK UNDERLINED UPPERCASE font ‒ conditions for additional probes
[Red font with brackets] ‒ interviewer actions to be performed
Blue italicized font ‒ read verbatim to respondents (Introduction specific)
Purple font ‒ testing question content (for interviewer reference only)
<item> ‒ varying elements to be replaced based on respondent’s answer for probing
Respondent number: ___________________
Interview date: ___________________
Interview start time: ___________________
Manufacturing? (Y/N): ___________________
Consent link: [LINK]
Survey link: [LINK]
[Interviewer: at the start of the interview, complete the following:
Connect to Microsoft Teams (audio only)
Confirm if respondent has already signed consent
If consent already signed, verbally confirm consent for participation, consent for recording, and consent for observers
Ask if respondent has any questions
Start recorder and state date, time, participant number, and reconfirm consent
Help respondent access the web survey]
[Interviewer: Read (or paraphrase) the Introduction to the respondent.]
[Interviewer: The think-aloud example (in blue) needs to be read verbatim.]
Introduction
The U.S. Census Bureau is requesting your help to test survey questions focused on collecting emergency economic information. If you agree to take part in this study, you will answer a few questions asking about the impacts of natural disasters on your company’s operations. We recommend that you think about how your company may respond to potential emergencies and unforeseen disruption.
Also, you will be asked to read the instructions and questions aloud and then “think aloud” as you answer each question. This may feel a little unnatural, but it will help us understand how you think about and answer each question. Here’s an example:
Suppose this were a survey about your home and one of the questions asked, “How many bedrooms are there in your house?” Someone thinking aloud as they respond might say, “What do you mean by bedrooms—are these just where people sleep? If so, I have 2 bedrooms. But if we are counting anything with a closet and a window, we also have a study. So perhaps, we have 3 bedrooms. I’m going with where people sleep, so I’ll answer 2 bedrooms.”
Today we are going to look at about 8-10 questions and I’ll ask some additional debriefing questions. After we look at each question, I will ask you some follow-up questions to get your feedback. For example, I might ask you what a particular term or phrase means or ask you to put the question in your own words. There are no right or wrong answers to any of the questions that I might ask. We just want to make sure that everyone understands these survey questions in the same way, to find out whether the questions are worded clearly, and, if they are not, what changes should be made. Your participation is important because it will help the Census Bureau improve these questions to accurately measure economic activities of businesses like yours.
Before we begin, do you have any questions?
[Interviewer action: Answer questions, as needed.]
Let's get started with some background information about your company.
First, tell me about your company: What do you do or make?
What is your role or title?
How long have you worked at your company in this role?
Can you tell me about your experience completing business surveys from the Census Bureau?
During your employment at this company, have you ever experienced a natural disaster, like a hurricane, tornado, or flood, that seriously impacted the company?
[If yes: instruct the respondent to think about that event when answering these questions:] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. When you see the term “natural disaster” in a question, I want you to think about [EVENT DISCUSSED ABOVE]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [EVENT DISCUSSED ABOVE].
[If no: instruct the respondent to think about a hypothetical natural disaster that is common in their area (e.g., a tornado, a fire, a hurricane, etc.):] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. Since you haven’t had a natural disaster that has impacted the company while you have worked there, let’s think of a hypothetical disaster that may happen in the future. What area of the country are you in? [Guide respondent in selecting a natural disaster to talk about during the interview]. When you see the term “natural disaster” in a question, I want you to think about [HYPOTHETICAL EVENT DISCUSSED]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [HYPOTHETICAL EVENT DISCUSSED].
[Interviewer action: Ask respondent to move to next page.]
Now let’s move on to the survey. Please read each question out loud and then think aloud as you come up with your answer.
[Interviewer action: Ask respondent to move to next page.]
[Interviewer action: Observe how respondent answers the survey and watch for any difficulty in determining their answers to the questions. Prompt respondent to think aloud while answering/reading the questions if they do not.]
Probes for Q122 (Wave 1 & 2)
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, what was the total amount of time this company was temporarily closed as a result of the natural disaster?
Temporary closures can include times when a small number of employees monitor and maintain any of a company’s physical locations while it is otherwise shut down.
⃝1 Did not close (remained open either in full or limited capacity)
⃝2 Less than 1 week
⃝3 1-3 weeks
⃝4 4 weeks or longer
THINKALOUD NOTES
This question gives a definition of temporary closures. Did you think that this definition was clear or not clear?
This definition references a “small” number of employees monitoring and maintaining locations. If this definition said a “limited” number of employees, would that be easier to understand, more difficult to understand, or make no difference? Please explain.
What about if it said “reduced” number—would that be easier to understand, more difficult to understand, or make no difference? Please explain.
How confident are you about your answer to this question?
Does your company have more than one location?
[If YES:] How easy or difficult was it for you to answer this question, thinking about all of your company’s locations?
Probes for Q101
SAY: Respondents who say that their company was temporarily closed on the previous question will receive this next question. I’ll have you look at this question for our testing so that I can ask you some questions for your feedback.
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, did the following factors related to the natural disaster influence this company’s decision to close temporarily?
| | Yes | No | Not applicable |
| State of emergency declared by the federal government, state government, or local government | | | |
| Stay-at-home orders | | | |
| Local government information/suggestion | | | |
| School closings, which led to a high number of absent staff | | | |
THINKALOUD NOTES
Let’s focus on the reasons for temporary closures in the table. How easy or difficult was it for you to answer the first row in the table—state of emergency declared? [IF DIFFICULT:] Please explain.
Do you think that this first row [STATE OF EMERGENCY DECLARED] is the same thing as or different from the second row—stay-at-home orders? [IF DIFFERENT:] Please explain.
Do you think that the first row [STATE OF EMERGENCY DECLARED] is the same thing as or different from the third row—local government information/suggestion? [IF DIFFERENT:] Please explain.
Now let’s take a look at the last row in the table—“School closings, which led to a high number of absent staff.” How easy or difficult was it for you to answer this item? [IF DIFFICULT:] Please explain.
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response? [PROBE AS NEEDED FOR EACH N/A RESPONSE GIVEN.]
Probes for Q178A/Q178B
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, what was the total dollar amount of the losses for all structures that were damaged, destroyed, or decreased in value as a result of the natural disaster?
Please report amount before insurance compensation. If none, enter 0. Estimates are acceptable.
$_______
From January 1, 2023 to March 31, 2023, what was the total dollar amount of the losses for all equipment that was damaged, destroyed, or decreased in value as a result of the natural disaster?
Please report amount before insurance compensation. If none, enter 0. Estimates are acceptable.
$_______
THINKALOUD NOTES
Let’s look at the first question about structures. What kinds of structures were you thinking about while answering this question?
Now let’s take a look at the second question about equipment. What kinds of equipment were you thinking about while answering this question?
Both questions ask about structures and equipment that were “damaged, destroyed, or decreased in value.” Was this phrase—“damaged, destroyed, or decreased in value” clear or unclear to you? [IF UNCLEAR:] Please explain.
Now let’s take a look at the instructions in italicized text. The first instruction says to “report amount before insurance compensation.” Was this instruction clear to you, or not clear? [IF NOT CLEAR:] Please explain.
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN
SAY: On this screen, we have the same questions, but now there are examples provided of what we mean by structures and equipment—provided in the parentheses. So the examples provided for structures are buildings, parking garages, warehouses, etc. And the examples provided for equipment are production machinery, computers, office supplies, etc. Now I will ask you some questions about these examples.
INTERVIEWER: RESPONDENT DOES NOT NEED TO ANSWER THESE QUESTIONS, JUMP TO ASKING PROBES.
From January 1, 2023 to March 31, 2023, what was the total dollar amount of the losses for all structures (such as buildings, parking garages, warehouses, etc.) that were damaged, destroyed, or decreased in value as a result of the natural disaster?
Please report amount before insurance compensation. If none, enter 0. Estimates are acceptable.
$_______
From January 1, 2023 to March 31, 2023, what was the total dollar amount of the losses for all equipment (such as production machinery, computers, office supplies, etc.) that was damaged, destroyed, or decreased in value as a result of the natural disaster?
Please report amount before insurance compensation. If none, enter 0. Estimates are acceptable.
$_______
THINKALOUD NOTES
Let’s focus on structures first. Did you find these examples helpful or not helpful? [IF NOT HELPFUL:] Please explain.
Did these examples make you think of any other structures that you had not previously considered? If so, what were they?
Now let’s talk about equipment. Did you find these examples helpful or not helpful? [IF NOT HELPFUL:] Please explain.
Did these examples make you think of any other equipment that you had not previously considered? If so, what were they?
Probes for Q167
[Remind respondents about placeholder dates and “natural disaster” as needed]
Before the natural disaster, did this company have the following in place?
| | Yes | No | Not applicable |
| Communications with key executives, such as the CEO, CFO, etc. | | | |
| Communications with employees outside of work via phone, email, or text messaging | | | |
| Systems that enable most employees to work from home or remote locations | | | |
THINKALOUD NOTES
Did you think that the three items in the table were written clearly or not clearly? [IF NOT CLEARLY:] Please explain.
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response? [PROBE AS NEEDED FOR EACH N/A RESPONSE GIVEN.]
Probes for Q168
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, did any of this company’s locations use the following due to the natural disaster?
| | Yes | No | Not applicable |
| Communications with key executives, such as the CEO, CFO, etc. | | | |
| Communications with employees outside of work via phone, email, or text messaging | | | |
| Systems that enable most employees to work from home or remote locations | | | |
THINKALOUD NOTES
[IF ANSWERED YES FOR ANY ITEM:] Please explain how you decided on your answer. [INTERVIEWER: We want to know if they say “yes” because of the natural disaster, or if they used these things for any other reason.]
[IF ANSWERING HYPOTHETICALLY:] If a [natural disaster/specific disaster respondent is thinking about] were to happen, how do you think you might answer these questions?
Does your company have more than one location?
[IF YES:] How easy or difficult is it for you to answer this question about any of your company’s locations?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response? [PROBE AS NEEDED FOR EACH N/A RESPONSE GIVEN]
Probes for Q185
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, how often did this company communicate its operating status to paid employees?
⃝1 Daily
⃝2 Weekly
⃝3 Biweekly
⃝4 Monthly
⃝5 Less often than monthly
⃝6 Not at all
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
INTERVIEWER: INSTRUCT RESPONDENT TO GO TO THE NEXT SCREEN.
SAY: For this question, we have some options for key persons your company could communicate its operating status to. Please take a look at this list now and then I will ask you some questions.
INTERVIEWER: Note that we are not trying to pick the best option here. We want to offer all of the terms on this list as options for a fill, so we want to make sure that everything in the list makes sense.
From January 1, 2023 to March 31, 2023, how often did this company communicate its operating status to paid employees?
[This bolded phrase could be replaced with one of the following:
Owners
Customers
Vendors
Contractors
Suppliers
Shareholders
The local government
The state government
The federal government]
⃝1 Daily
⃝2 Weekly
⃝3 Biweekly
⃝4 Monthly
⃝5 Less often than monthly
⃝6 Not at all
⃝7 Not applicable
THINKALOUD NOTES
Does your company communicate with all of the people in this list, or are any not applicable? [FIND OUT WHICH ITEMS ARE NOT APPLICABLE AND WHY. If needed, a potential probe could be: “Can you walk me through this list of key persons and how your company communicates with each of them?” Or, you can ask specifically about a few of the people in the list that you’re not sure about, based on the previous conversations.]
Would it be difficult for you to answer this question about each of these people, or would it be easy to answer about all of them? [FIND OUT WHICH ARE DIFFICULT AND WHY]
Do you think this list is missing any people your company would communicate its operating status to? [IF YES:] Please explain.
Probes for Q186
[Remind respondents about placeholder dates and “natural disaster” as needed]
SAY: If someone said that their company communicated about its operating status to paid employees in the previous question, then they will get this follow-up question. I will ask you to review this question for your feedback.
From January 1, 2023 to March 31, 2023, how did this company communicate its operating status to paid employees? Please select all that apply.
☐1 Automated calls
☐2 Personal telephone calls, including telephone chain/tree
☐3 Text messages
☐4 Email
☐5 Social media
☐6 Other (please describe)
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
Are the response options listed clear to you, or are any unclear? [FIND OUT WHICH ARE UNCLEAR AND WHY]
Do you think we are missing any response options from this list? [IF YES:] Please explain.
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
SAY: Just like the last pair of questions we reviewed, the bolded phrase could be replaced with any term on the same list of people.
INTERVIEWER: Note that we are not trying to pick the best option here. We want to offer all of the terms on this list as options for a fill, so we want to make sure that everything in the list makes sense.
From January 1, 2023 to March 31, 2023, how did this company communicate its operating status to paid employees? Please select all that apply.
[This bolded phrase could be replaced with:
Owners
Customers
Vendors
Contractors
Suppliers
Shareholders
The local government
The state government
The federal government]
☐1 Automated calls
☐2 Personal telephone calls, including telephone chain/tree
☐3 Text messages
☐4 Email
☐5 Social media
☐6 Other (please describe)
THINKALOUD NOTES
Would it be difficult for you to answer this question about each of these people, or would it be easy to answer about all of them? [FIND OUT WHICH ARE DIFFICULT AND WHY]
Probes for Q88 (Waves 1, 2, & 3)
[INTERVIEWER: This question is for manufacturing only. If the respondent’s company is not a manufacturing company, instruct them to skip this question.]
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, inventories of raw materials, work-in-process, or finished goods) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
Do you think that this question was written clearly or not clearly?
We ask about inventories that were “damaged, destroyed, or decreased in value.” What does “destroyed” mean to you, as it’s used in this question?
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
SAY: On this screen, we first have Version A, which is the same question we just looked at. And then underneath that, we have Version B, which is the same question but “raw materials” is replaced with “materials-and-supplies.” I would like to ask you some questions about this wording for your feedback.
INTERVIEWER: RESPONDENT DOES NOT NEED TO ANSWER THESE QUESTIONS, JUMP TO ASKING PROBES.
VERSION A: From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, inventories of raw materials, work-in-process, or finished goods) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
VERSION B: From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, materials-and-supplies inventory, work-in-process inventory, or finished goods inventory) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
In this question, we list three different kinds of inventories. In Version A, what does “inventories of raw materials” mean to you? [IF NEEDED:] What does your company consider “inventories of raw materials”?
Version B says “materials-and-supplies inventory” instead of “raw materials.” Do you think that this is more clear, less clear, or is there no difference to you?
What does “materials-and-supplies inventory” mean to you? [IF NEEDED:] What does your company consider “materials-and-supplies inventory”?
Would you answer Version B about “materials-and-supplies” differently than Version A about “raw materials”? [If YES:] How would you answer this question?
Both questions list the next type of inventory as “work-in-process.” What does this mean to you, as it’s used in this question? [IF NEEDED:] What does your company consider “inventories of work-in-process”?
And both questions list the last type of inventory as “finished goods.” What does this mean to you, as it’s used in this question? [IF NEEDED:] What does your company consider “inventories of finished goods”?
Concluding Probe(s)
Thinking back over the entire survey, is there anything else you’d like to discuss?
If you were taking this survey normally—not thinking aloud—approximately how long do you think it would take to complete?
Would you consult or request input from any other people in answering these questions?
IF YES:
Who would you consult? [Probe to capture position title, department, and any other details helpful to knowing where this information is available within the company.]
How easy or difficult would it be to coordinate with them and get their help?
How much time do you think it would take to find out the answer to questions you’re unsure about? [Probe to clarify (1) the time respondents think it will take the other person to answer versus (2) the lapsed time from when the respondent requests assistance until they get an answer from the other person.]
Finally, what device did you use to answer the survey? Is it a laptop or a desktop computer? Or is it a mobile device like a tablet or a smartphone?
[Interviewer action: Ask observer if they have any questions they’d like to add, being sensitive to the respondent’s time.]
Those are all the questions I have for you. Thank you very much for participating!
Emergency Economic Information Collections Question Bank 2
Cognitive Interviewing Protocol
Wave 3 (PRIVATE SECTOR: Hours, profits, and employees)
Checklist
Legend:
Black font ‒ interview scripts and probes to be administered by interviewers
BLACK UNDERLINED UPPERCASE font ‒ conditions for additional probes
[Red font with brackets] ‒ interviewer actions to be performed
Blue italicized font ‒ read verbatim to respondents (Introduction specific)
Purple font ‒ testing question content (for interviewer reference only)
<item> ‒ varying elements to be replaced based on respondent’s answer for probing
Respondent number: ___________________
Interview date: ___________________
Interview start time: ___________________
Manufacturing? (Y/N): ___________________
Consent link: [LINK]
Survey link: [LINK]
[Interviewer: at the start of the interview, complete the following:
Connect to Microsoft Teams (audio only)
Confirm if respondent has already signed consent
If consent already signed, verbally confirm consent for participation, consent for recording, and consent for observers
Ask if respondent has any questions
Start recorder and state date, time, participant number, and reconfirm consent
Help respondent access the web survey]
[Interviewer: Read (or paraphrase) the Introduction to the respondent.]
[Interviewer: The think-aloud example (in blue) needs to be read verbatim.]
Introduction
The U.S. Census Bureau is requesting your help to test survey questions focused on collecting emergency economic information. If you agree to take part in this study, you will answer a few questions asking about the impacts of natural disasters on your company’s operations. We recommend that you think about how your company may respond to potential emergencies and unforeseen disruption.
Also, you will be asked to read the instructions and questions aloud and then “think aloud” as you answer each question. This may feel a little unnatural, but it will help us understand how you think about and answer each question. Here’s an example:
Suppose this were a survey about your home and one of the questions asked, “How many bedrooms are there in your house?” Someone thinking aloud as they respond might say, “What do you mean by bedrooms—are these just where people sleep? If so, I have 2 bedrooms. But if we are counting anything with a closet and a window, we also have a study. So perhaps, we have 3 bedrooms. I’m going with where people sleep, so I’ll answer 2 bedrooms.”
Today we are going to look at about 8-10 questions and I’ll ask some additional debriefing questions. After we look at each question, I will ask you some follow-up questions to get your feedback. For example, I might ask you what a particular term or phrase means or ask you to put the question in your own words. There are no right or wrong answers to any of the questions that I might ask. We just want to make sure that everyone understands these survey questions in the same way, to find out whether the questions are worded clearly, and, if they are not, what changes should be made. Your participation is important because it will help the Census Bureau improve these questions to accurately measure economic activities of businesses like yours.
Before we begin, do you have any questions?
[Interviewer action: Answer questions, as needed.]
Let's get started with some background information about your company.
First, tell me about your company: What do you do or make?
What is your role or title?
How long have you worked at your company in this role?
Can you tell me about your experience completing business surveys from the Census Bureau?
During your employment at this company, have you ever experienced a natural disaster, like a hurricane, tornado, or flood, that seriously impacted the company?
[If yes: instruct the respondent to think about that event when answering these questions:] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. When you see the term “natural disaster” in a question, I want you to think about [EVENT DISCUSSED ABOVE]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [EVENT DISCUSSED ABOVE].
[If no: instruct the respondent to think about a hypothetical natural disaster that is common in their area (e.g., a tornado, a fire, a hurricane, etc.):] The questions that we’re going to review do not ask about a specific natural disaster or disruptive event. Instead, the questions use the phrase “natural disaster” as a placeholder so that we can talk about whatever event is relevant to you. Since you haven’t had a natural disaster that has impacted the company while you have worked there, let’s think of a hypothetical disaster that may happen in the future. What area of the country are you in? [Guide respondent in selecting a natural disaster to talk about during the interview]. When you see the term “natural disaster” in a question, I want you to think about [HYPOTHETICAL EVENT DISCUSSED]. When you see dates in the questions, those are just to give you an idea of the length of time we would ask about. But for our discussion, I want you to think of a time period immediately after [HYPOTHETICAL EVENT DISCUSSED].
[Interviewer action: Ask respondent to move to next page.]
Now let’s move on to the survey. Please read each question out loud and then think aloud as you come up with your answer.
[Interviewer action: Ask respondent to move to next page.]
[Interviewer action: Observe how respondent answers the survey and watch for any difficulty in determining their answers to the questions. Prompt respondent to think aloud while answering/reading the questions if they do not.]
Probes for Q99
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, did this company have at least one physical location that was open to customers?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
Did you think that this question was clearly written or not clearly written? [IF NOT:] Please explain.
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
Probes for Q99A
[Remind respondents about placeholder dates and “natural disaster” as needed]
SAY: Respondents who answer “yes” to the previous question will receive this next question. I’ll have you look at this question for our testing so that I can ask you some questions for your feedback.
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company change the number of hours its physical locations were open to customers?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
What about this question—did you think this question was clearly written or not clearly written? [IF NOT:] Please explain.
What does “physical locations” mean to you, as it’s used here in this question?
Does your company have more than one physical location?
How easy or difficult was it for you to answer this question?
[IF ANSWERED YES:] Please explain how you decided on your answer. [INTERVIEWER: We want to know if they say “yes” because of the natural disaster, or if they changed their hours for any other reason.]
[IF ANSWERING HYPOTHETICALLY:] If a [natural disaster/specific disaster respondent is thinking about] were to happen, how do you think you might respond to this question?
Probes for Q99B
[Remind respondents about placeholder dates and “natural disaster” as needed]
SAY: Respondents who answer “yes” to the previous question will receive this next question. I’ll have you look at this question for our testing so that I can ask you some questions for your feedback.
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, how did this company change the number of hours its physical locations were open to customers?
⃝1 Increased hours
⃝2 Decreased hours
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
[IF ANSWERED:] Please explain how you decided on your answer. [INTERVIEWER: We want to know if they increased/decreased because of the natural disaster, or if they increased/decreased for any other reason.]
[IF ANSWERING HYPOTHETICALLY:] If a [natural disaster/specific disaster respondent is thinking about] were to happen, how do you think you might respond to this question?
Probes for Q136
[Remind respondents about placeholder dates and “natural disaster” as needed]
In the October-December quarter of 2022 and in the January-March quarter of 2023, what percentage of this company’s goods or services were sold to clients or customers using the following methods?
If individual sales of your company’s goods or services are made through multiple methods, please report the method used at the point of sale.
Enter 0 if none. Estimates are acceptable.
| | October-December quarter of 2022 | January-March quarter of 2023 |
| In-person (such as a retail store, display showroom, or door-to-door sales) | % | % |
| Online (such as the company’s website or apps, a third party’s website or apps, emails, or virtual meetings) | % | % |
| Telephone | % | % |
| | % | % |
| Other (please describe:) | % | % |
| Total | [PROGRAMMER TOTAL] | [PROGRAMMER TOTAL] |
THINKALOUD NOTES
Did you think that this question was written clearly or not clearly? [IF NOT:] Please explain.
This question provides an instruction here in italics. Did you think that these instructions were written clearly or not clearly? [IF NOT:] Please explain.
What does the phrase “the method used at the point of sale” mean to you, as it’s used here?
Was this phrase clear or not clear to you? [IF NOT:] Please explain.
[IF NOT CLEAR:] Does your company use a phrase other than “point of sale”?
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
Probes for Q34 (Waves 1 & 3)
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, how did this company’s net profits change compared to what was normal before the natural disaster? If this company is non-profit or not-for-profit, select “Not applicable.”
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
Is this company a non-profit organization?
[IF NO:] Have you ever worked for a non-profit organization?
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
This question provides an instruction about when to select “Not applicable.” Did you think that this instruction was clear or not clear? [IF NOT CLEAR:] Please explain.
In this instruction, we use the terms “non-profit” and “not-for-profit.” Did you think that these terms were clear or not clear? [IF NOT CLEAR:] Please explain.
Do you think that we are missing any terms used to describe companies that don’t make a profit?
Probes for Q148
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, did this company submit any insurance claims related to the natural disaster for the following?
| | Yes | No |
| Loss of inventory (such as spoilage, product damage/destroyed) | | |
| Damage to structures | | |
| Damage to equipment | | |
| Workers’ compensation for injury or illness | | |
| Business interruption (lost profit due to the natural disaster) | | |
THINKALOUD NOTES
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
Let’s look at the items in the table more specifically. In the first row, we give you some examples of “loss of inventory.” Did you find these examples helpful or not helpful? [IF NOT HELPFUL:] Please explain.
Now let’s take a look at the second row. What kinds of structures were you thinking about while answering this question?
Now let’s take a look at third row. What kinds of equipment were you thinking about while answering this question?
In the fourth row, the type of insurance we have listed is “Workers’ compensation for injury or illness.” Did you think that this was clear or not clear? [IF NOT CLEAR:] Please explain.
In the last row, we give a definition of business interruption. Did you find this definition helpful or not helpful? [IF NOT HELPFUL:] Please explain.
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
SAY: On this screen, we have the same question, but now examples of what we mean by structures and equipment are provided in parentheses. The examples for structures are buildings, parking garages, warehouses, etc., and the examples for equipment are production machinery, computers, office supplies, etc. Now I will ask you some questions about these examples.
INTERVIEWER: RESPONDENT DOES NOT NEED TO ANSWER THESE QUESTIONS, JUMP TO ASKING PROBES.
From January 1, 2023 to March 31, 2023, did this company submit any insurance claims related to the natural disaster for the following?
| | Yes | No |
| Loss of inventory (such as spoilage, product damage/destroyed) | | |
| Damage to structures (such as buildings, parking garages, warehouses, etc.) | | |
| Damage to equipment (such as production machinery, computers, office supplies, etc.) | | |
| Workers’ compensation for injury or illness | | |
| Business interruption (lost profit due to the natural disaster) | | |
THINKALOUD NOTES
Let’s focus on structures first. Did you find these examples helpful or not helpful? [IF NOT HELPFUL:] Please explain.
Did these examples make you think of any other structures that you had not previously considered? If so, what were they?
Now let’s talk about equipment. Did you find these examples helpful or not helpful? [IF NOT HELPFUL:] Please explain.
Did these examples make you think of any other equipment that you had not previously considered? If so, what were they?
Probes for Q114
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, how did the total number of paid employees on this company’s payroll change compared to what was normal before the natural disaster?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
In your own words, what do you think this question is asking?
What kinds of employees were you thinking about while answering this question?
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
What does “compared to what was normal before the natural disaster” mean to you?
[IF NEEDED:] What was your point of comparison for this change?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for Q114A
[Remind respondents about placeholder dates and “natural disaster” as needed]
SAY: Respondents who answer “increased” or “decreased” on the previous question will receive this next question. I’ll have you look at this question for our testing so that I can ask you some questions for your feedback.
How much time do you anticipate will pass before the total number of paid employees on this company’s payroll returns to what was normal before the natural disaster?
⃝1 Less than 1 week
⃝2 1 week to 1 month
⃝3 2-3 months
⃝4 4-6 months
⃝5 More than 6 months
⃝6 I do not believe this company’s total number of paid employees will return to what was normal before the natural disaster
⃝7 Not applicable
THINKALOUD NOTES
Did you think that this question was clearly written or not clearly written? [IF NOT:] Please explain.
How confident are you in your answer to this question?
How easy or difficult was it for you to answer this question? [IF DIFFICULT:] Please explain.
Does your company keep any records to help you answer this question?
[IF YES:] Please tell me more about these records.
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for Q01_B
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, how did the company’s total number of paid employee hours change?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
THINKALOUD NOTES
[IF RESPONDENT GAVE AN ANSWER:] How did you arrive at your answer to this question?
[IF RESPONDENT IS ANSWERING HYPOTHETICALLY AND DID NOT PROVIDE AN EXACT ANSWER:] If a natural disaster were to happen, how do you think you might respond to this question?
[IF NEEDED:] How did you arrive at your answer to this question?
How easy or difficult was it for you to answer this question? [IF DIFFICULT AND IF NEEDED:] Please explain.
For this question, would you know the answer right away, would you make a guess, or would you consult records or other colleagues?
IF NOT APPLICABLE IS SELECTED:
You selected “Not applicable.” Can you tell me more about why you chose that response?
Probes for Q88 (Waves 1, 2, & 3)
[INTERVIEWER: This question is for manufacturing only. If the respondent’s company is not a manufacturing company, instruct them to skip this question.]
[Remind respondents about placeholder dates and “natural disaster” as needed]
From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, inventories of raw materials, work-in-process, or finished goods) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
Do you think that this question was written clearly or not clearly?
We ask about inventories that were “damaged, destroyed, or decreased in value.” What does “destroyed” mean to you, as it’s used in this question?
INTERVIEWER: INSTRUCT THE RESPONDENT TO GO TO THE NEXT SCREEN.
SAY: On this screen, we first have Version A, which is the same question we just looked at. And then underneath that, we have Version B, which is the same question but “raw materials” is replaced with “materials-and-supplies.” I would like to ask you some questions about this wording for your feedback.
INTERVIEWER: RESPONDENT DOES NOT NEED TO ANSWER THESE QUESTIONS, JUMP TO ASKING PROBES.
VERSION A: From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, inventories of raw materials, work-in-process, or finished goods) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
VERSION B: From January 1, 2023 to March 31, 2023, as a result of the natural disaster, did this company write off any inventories (that is, materials-and-supplies inventory, work-in-process inventory, or finished goods inventory) as damaged, destroyed, or decreased in value?
⃝1 Yes
⃝2 No
THINKALOUD NOTES
In this question, we list three different kinds of inventories. In Version A, what does “inventories of raw materials” mean to you? [IF NEEDED:] What does your company consider “inventories of raw materials”?
Version B says “materials-and-supplies inventory” instead of “raw materials.” Do you think that this is more clear, less clear, or is there no difference to you?
What does “materials-and-supplies inventory” mean to you? [IF NEEDED:] What does your company consider “materials-and-supplies inventory”?
Would you answer Version B about “materials-and-supplies” differently than Version A about “raw materials”? [If YES:] How would you answer this question?
Both questions list the next type of inventory as “work-in-process.” What does this mean to you, as it’s used in this question? [IF NEEDED:] What does your company consider “inventories of work-in-process”?
And both questions list the last type of inventory as “finished goods.” What does this mean to you, as it’s used in this question? [IF NEEDED:] What does your company consider “inventories of finished goods”?
Concluding Probe(s)
Thinking back over the entire survey, is there anything else you’d like to discuss?
If you were taking this survey normally—not thinking aloud—approximately how long do you think it would take to complete?
Would you consult or request input from any other people in answering these questions?
IF YES:
Who would you consult? [Probe to capture position title, department, and any other details helpful to knowing where this information is available within the company.]
How easy or difficult would it be to coordinate with them and get their help?
How much time do you think it would take to find out the answer to questions you’re unsure about? [Probe to clarify (1) the time respondents think it will take the other person to answer versus (2) the lapsed time from when the respondent requests assistance until they get an answer from the other person.]
Finally, what device did you use to answer the survey? Is it a laptop or a desktop computer? Or is it a mobile device like a tablet or a smartphone?
[Interviewer action: Ask observer if they have any questions they’d like to add, being sensitive to the respondent’s time.]
Those are all the questions I have for you. Thank you very much for participating!
RESPONSE REQUESTED: Schedule an Interview with the U.S. Census Bureau
Dear «name»,
The U.S. Census Bureau is conducting a study in partnership with RTI International, a not-for-profit research organization. We are looking for a select group of participants to review some proposed survey questions about <businesses’/government entities’> response to events, such as the Coronavirus pandemic and natural disasters. The Census Bureau is attempting to be more agile in its response to various emergent events that impact the <business sector/public sector>.
We are reaching out to you because you previously completed a survey for the U.S. Census Bureau on behalf of your <business/government entity>. As a past participant, your perspective is especially valuable and will help ensure that these questions make sense and can be answered accurately.
The interview will take no more than 45 minutes to complete. If you agree to take part in this study, you will answer a few questions about your <business’/government’s> response to different events. After answering these questions, the RTI interviewer will ask you some follow-up questions, such as how easy or difficult it was to answer the questions, or any suggestions you have to improve them.
We welcome your feedback. Please give me a call at XXX-XXX-XXXX or respond to this email to schedule an appointment.
Sincerely,
<<Name>>
Census Bureau Interview Confirmation – DATE at TIME
FNAME,
This is a confirmation of your interview on DATE at TIME (time zone). Thank you again for agreeing to participate! Below are the details about the appointment. I have also sent you a calendar invite to help reserve the time on your schedule.
Some important notes:
The interview will be conducted using Microsoft Teams phone conferencing. The phone number to dial in is XXX-XXX-XXXX. Use the meeting ID XXXXXXXX to join.
Before the interview, please review the consent information linked here: <<CONSENT LINK>> Your interviewer will review it when you meet and you can ask any questions you may have.
During the interview you will be asked to open the survey here: SURVEY LINK. You do not need to review the survey before your appointment.
Please respond to this email or give me a call if you have any questions or need to reschedule your interview.
Sincerely,
<<Name>>
PURPOSE
The U.S. Census Bureau is conducting a short study in partnership with RTI International, a not-for-profit research organization. The Census Bureau routinely conducts research on how to collect information in order to produce the best statistics possible. You are invited to take part in this study, which seeks to help the Census Bureau improve survey questions about the impact of natural disasters on your company/agency. If you agree to participate, you will be asked to complete a few survey questions about the general impacts of natural disasters on your company/agency and discuss how you answer these questions during a confidential interview.
AUTHORITY AND CONFIDENTIALITY
This survey is conducted by the Census Bureau under the authority of Title 13 U.S. Code (U.S.C.), Sections 131 and 182. Section 9 of Title 13 U.S.C. ensures the confidentiality of data reported by private companies. Routine uses of these data are limited to those identified in the Privacy Act System of Record Notice titled “COMMERCE/CENSUS-4, Economic Survey Collection.” The Census Bureau can use your responses only to produce statistics, and is not permitted to publicly release your responses in a way that could identify you, your business, organization, or institution. Additionally, per the Federal Cybersecurity Enhancement Act of 2015, your data are protected from cybersecurity risks through screening of the systems that transmit your data.
This study has been approved by the Office of Management and Budget (OMB). The eight-digit OMB number, 0607-0978, confirms this approval, which expires on 12/31/2023. Without this approval, we could not conduct this study.
BURDEN ESTIMATE
We estimate that completing this interview will take no more than 45 minutes. You may send comments regarding this estimate or any other aspect of this survey, including suggestions for reducing the time it takes to complete this survey, to [email protected].
CONSENT FORM
You have volunteered to take part in a study of data collection procedures. You will connect with our interviewer via Microsoft Teams for the virtual interview, in which you will answer the survey questions and then discuss the questions with the interviewer via computer audio or telephone. The interview will be conducted by a staff member from RTI International. We plan to use your feedback to improve the content, design, and layout of the form.
We would like to audio record what you say during the interview. Only the people who work on this study will hear the recording, which will simply be used to ensure we have understood your answers. If you don’t want to be recorded, that’s okay.
Please check the appropriate options below to indicate your consent.
I give my consent to participate in this research. I understand that my participation is voluntary and that I can stop the interview at any time.
I give my consent to have my research interview recorded on audio. I understand that the recordings are for analyzing the interviews later and that neither my name nor the name of my company will be disclosed.
I give my consent to have additional Census Bureau staff, beyond the interviewer, observe my interview. I understand that these staff are also sworn to uphold Title 13 of the U.S. Code referenced above.
_______________________________________     _________________
Signature of Research Participant           Date

_______________________________________
Printed Name of Research Participant
Public Sector: Newly Developed Questions that were Not Acceptable for the Question Bank v2.0
Q13
From July 1, 2022 to September 30, 2022, did the natural disaster affect this agency’s funding to pay its employees?
⃝1 Yes
⃝2 No
Q14
Did the authority over this agency’s budget change as a result of the natural disaster?
⃝1 Yes
⃝2 No
Public Sector: Question Bank v1.0 Questions that were Not Acceptable for the Public Sector
QB Q144
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did any of the structures (such as buildings or parking garages) used by this agency sustain the following kinds of damage?
| | Yes | No | Not applicable |
| Minor damage to structure(s) | | | |
| Major damage to structure(s) | | | |
| Structure(s) destroyed | | | |
QB Q20
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency take the following actions related to shifts or hours?
| | Yes | No | Not applicable |
| Reduction in overtime | | | |
| Increase in overtime | | | |
| Reduction in shifts or hours | | | |
| Increase in shifts or hours | | | |
QB Q24
From July 1, 2022 to September 30, 2022, as a result of the natural disaster, did this agency have higher than usual, no change in, or lower than usual voluntary turnover for paid employees?
Voluntary turnover occurs when an employee chooses to leave a position that then needs to be refilled.
⃝1 Higher than usual
⃝2 No change
⃝3 Lower than usual
QB Q53
From July 1, 2022 to September 30, 2022, how did this agency’s operating budget change compared to what was normal before the natural disaster?
⃝1 Increased
⃝2 Did not change
⃝3 Decreased
⃝4 Not applicable
QB Q52
As a result of the natural disaster, how did this agency change its budgeted capital expenditures from July 1, 2022 to September 30, 2022? Select all that apply.
⃝1 Canceled budgeted capital expenditure(s)
⃝2 Decreased budgeted capital expenditure(s)
⃝3 Postponed budgeted capital expenditure(s)
⃝4 Increased budgeted capital expenditure(s)
⃝5 Introduced new unbudgeted capital expenditure(s)
⃝6 No changes
⃝7 Not applicable
Private Sector: Questions Not Acceptable for Question Bank v2.0
Q136
In the October-December quarter of 2022 and in the January-March quarter of 2023, what percentage of this company’s goods or services were sold to clients or customers using the following methods?
If individual sales of your company’s goods or services are made through multiple methods, please report the method used at the point of sale.
Enter 0 if none. Estimates are acceptable.
| | October-December quarter of 2022 | January-March quarter of 2023 |
| In-person (such as a retail store, display showroom, or door-to-door sales) | % | % |
| Online (such as the company’s website or apps, a third party’s website or apps, emails, or virtual meetings) | % | % |
| Telephone | % | % |
| | % | % |
| Other (please describe:) | % | % |
| Total | [PROGRAMMER TOTAL] | [PROGRAMMER TOTAL] |
The Census Bureau has reviewed this data product for unauthorized disclosure of confidential information and has approved the disclosure avoidance practices applied. (Approval ID: CBDRB-FY23-ESMD001-020)