Generic Clearance for Questionnaire Pretesting Research


OMB: 0607-0725


Generic Information Collection Request:

Usability evaluation for the 2022 NAPCS

Update:


This research is dynamic; given updated timetables and functionality, we are requesting to expand testing from 60 cases to a maximum of 100 cases. This letter reflects the updated timetables and burden estimates under those new parameters.


Request: The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). Specifically, the Census Bureau plans to conduct a usability evaluation of proposed redesigns of the North American Product Classification System (NAPCS) survey items on the online 2022 Economic Census instrument.


The Economic Census is a mandatory survey conducted by the Census Bureau every five years under the authority of Title 13, United States Code (U.S.C.), Section 131. The survey collects data electronically from nearly 4 million establishments (including large, medium, and small companies representing all U.S. locations and industries) on a range of operational and financial topics. Data from the survey are used as the official five-year measure of American business and the economy. As part of the Economic Census, the NAPCS section of the survey collects detailed information about the revenue that businesses generate from the goods and services they provide. Further information regarding the Economic Census can be found on the Census Bureau’s website at https://www.census.gov/programs-surveys/economic-census.html. For more information about NAPCS, please visit https://www.census.gov/eos/www/napcs/.


Purpose: Historically, the Census Bureau requested revenue details mainly for goods and services associated with an establishment’s primary North American Industry Classification System (NAICS) classification, thus focusing primarily on where goods were produced rather than which goods were produced. However, beginning with the 2017 Economic Census, the Census Bureau introduced a new way of collecting and disseminating product revenue information through the NAPCS framework. This framework focuses on the demand-side economy, providing a summary of which goods and services are sold, regardless of the primary industry in which they are produced.


Under the NAPCS framework, businesses were asked to explicitly report detailed information on all revenue-generating goods and services they offer, even if those goods and services are not typical of the business’s industry. Although some goods and services might make up only a fraction of the business’s revenue, respondents should still report these products and provide a description of them and their associated revenue.


Item 22 in the 2017 Economic Census, the NAPCS survey question, first presented respondents with a list of products typical of their industry. After selecting products, respondents were then asked to report the revenue associated with each selected product on a subsequent screen. The survey included additional open-ended response fields where respondents could write in other revenue-generating goods or services that were not pre-listed. This two-stage design takes advantage of embedded automation (basing subsequent questions on previous responses), which is possible because of the flexibility of the electronic instrument.


This two-stage approach to asking about NAPCS in 2017 still resulted in a substantial number of write-in responses, as respondents reported items that they did not believe were captured in the pre-listed descriptions. Each of these write-in responses, in turn, must go through extensive, resource-intensive processing to assign it to the appropriate product or service lines. Reducing the number of write-in responses would therefore reduce the amount of post-hoc data handling and cleaning. To assist in the selection of products/services, and to potentially reduce the number of write-in responses, the 2022 NAPCS survey item will incorporate machine learning functionality. Additionally, the 2022 NAPCS survey item now includes the open-ended response fields to capture items that are not pre-listed at the first stage of the two-question series.
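The memo does not specify the machine learning approach that will be used. Purely as an illustration of the kind of functionality described (suggesting pre-listed, NAPCS-style product descriptions for a free-text write-in), the sketch below uses simple token-overlap scoring; the product descriptions, function names, and matching method are all hypothetical.

# Hypothetical sketch only: not the Census Bureau's actual method or product list.
import re
from collections import Counter

PRELISTED_PRODUCTS = [  # invented, NAPCS-style description strings
    "Retail sales of packaged food and beverages",
    "Repair and maintenance services for motor vehicles",
    "Consulting services for information technology",
    "Rental of commercial real estate",
]

def tokens(text):
    """Lowercased word counts, ignoring punctuation."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def suggest(write_in, candidates=PRELISTED_PRODUCTS, top_n=3):
    """Rank candidate descriptions by shared-token overlap with the write-in."""
    query = tokens(write_in)
    scored = [(sum((query & tokens(c)).values()), c) for c in candidates]
    scored.sort(reverse=True)
    return [c for score, c in scored[:top_n] if score > 0]

if __name__ == "__main__":
    # Best match is listed first:
    # 'Repair and maintenance services for motor vehicles'
    print(suggest("we fix cars and trucks, mostly maintenance work"))

The actual machine learning search tool referenced in the Method section below may work quite differently; this sketch is only meant to make the general idea concrete.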


This design proposal differs substantially from prior reporting strategies and therefore requires an evaluation to assess its feasibility prior to implementation in the 2022 Economic Census instrument. Additionally, the incorporation of machine learning and its design elements warrant input from respondents.


Data gathered from testing will help inform user-centered design decisions for the 2022 NAPCS survey items, ensuring that respondents can easily report the NAPCS codes most appropriate for them while keeping write-in responses to a minimum. Additional objectives for the evaluation of the online NAPCS survey item in the 2022 Economic Census instrument include the following:


  • Evaluate the performance of NAPCS in terms of item non-response attributed to its design, functionality and ease of use;

  • Inform the design of the machine learning functionality so that it meets the needs and expectations of respondents, allowing them to make appropriate selections as frequently as possible; and

  • Provide recommendations for improvements to the design of the NAPCS survey items that can potentially enhance data quality and respondents’ reporting experience.


Researchers will record results from the interviews and produce a report that outlines findings and provides recommendations for improvement.


Staff from the Data Collection Methodology & Research Branch (DCMRB) within the Economic Statistical Methods Division (ESMD) of the Census Bureau will conduct interviews for this testing, with support from staff in the Economy-wide Statistics Division (EWD) and the Economic Management Division (EMD). For this testing, we will interview up to 100 respondents.


Population of Interest: Large, medium and small companies from various locations and industries across the United States.


Language: Testing will be conducted in English only.


Method: The method of research will be remote usability interviews. For the purposes of this research, the usability interview sessions will focus on respondents’ interactions with the revised Item 22 (NAPCS) on the 2022 Economic Census data collection instrument. All interviews will be conducted over Skype for Business and/or phone. Researchers will ask respondents to provide feedback on the machine learning functionality planned to assist in the classification of products/services entered in the open-ended response fields for Item 22. Respondents may be given supplemental information related to their industry for additional context (e.g., respondents will be able to interact with the machine learning search tool to generate a search for their establishment and view relevant products/services for NAPCS). The interviews will follow a semi-structured interview protocol (attached). Subject area specialists from the Census Bureau will participate in some of the interviews to listen and provide clarification when appropriate.


Sample: Staff from EMD and EWD will provide DCMRB staff with a list of recent 2017 Economic Census respondents for recruiting. This listing will include contact information, a size indicator for each company, and its major industry classification. We will also attempt to obtain the responses provided to the NAPCS survey item in the 2017 Economic Census to reference during the interview session. Whenever possible, we will include companies that used the write-in capture for NAPCS in 2017.


We plan to conduct up to 100 interviews. We plan to conduct interviews with a variety of company sizes (small, medium, and large) and industry classifications (i.e., service, wholesale, retail, manufacturing, mining, and construction). Table 1 displays the targeted distribution for interviews.


Table 1. 2022 Economic Census NAPCS Testing: Potential Respondent Distribution

          Service  Wholesale  Retail  Manufacturing  Mining  Construction  TOTALS
Small           6          6       6              6       3             3      30
Medium         10          6       6              6       4             3      35
Large          10          6       6              6       4             3      35
TOTALS         26         18      18             18      11             9     100



This number of interviews was selected because it is manageable within the time period allotted. It should adequately cover the target company sizes and classifications, and it should be large enough to elicit sufficient reactions to the instrument to identify meaningful findings.


Recruitment: We will contact respondents who reported to the 2017 Economic Census and/or the 2019 Annual Survey of Manufactures/Report of Organization survey. The resulting sample of participants will consist of those who can be contacted and who agree to participate in the study. Participants will be informed about the purpose of the study, the confidentiality of their responses, and the voluntary nature of their participation.


We will not be providing monetary incentives to participants in this study. Once an interview is scheduled, respondents will receive a confirmation email as well as a reminder email or phone call prior to the interview if it is scheduled for a later time. As part of the confirmation email, we will send respondents an additional short survey to benchmark their familiarity and comfort with using computers generally and internet-based tasks specifically. This survey takes no more than five minutes to complete. The confirmation email will also include instructions for using Skype for Business, and we will encourage ‘trial runs’ before the interview to maximize interviewing time and minimize technical difficulties.


Enclosures: Below is a list of materials to be used in the current study:


  1. Attachment A: Protocol

  2. Attachment B: NAPCS screenshots illustrating the existing NAPCS design and the proposed redesign of the NAPCS survey items being evaluated. Respondents will be able to view and comment on features of the redesign

  3. Attachment C: Computer use and Internet experience questions to get an understanding of participants' experience using computers and the Internet

  4. Attachment D: Consent and PRA statement to inform participants about the privacy, confidentiality, and burden associated with participation, and to obtain consent for participation and recording of the session (may be captured via electronic signature)



Timeline: Testing will be conducted from August through November 2020.


We expect that each interview will last no more than 30 minutes (100 cases x 30 minutes per case = 50 hours). Additionally, to recruit respondents, we expect to make up to 5 phone contacts per completed case, with recruiting calls lasting on average 3 minutes each (5 attempted phone calls per completed case x 100 cases x 3 minutes per call = 25 hours). When a respondent agrees to participate, we will schedule a supplemental 10-minute pre-meeting to troubleshoot screen sharing and other technology issues (100 cases x 10 minutes per case = about 17 hours). Finally, respondents will complete a short survey on their familiarity with digital technology and platforms, lasting about 5 minutes (100 cases x 5 minutes per case = about 8 hours). Thus, the estimated burden for the entirety of this project is approximately 100 hours (50 hours for interviews + 25 hours for recruiting + 17 hours for troubleshooting + 8 hours for the self-administered questionnaire).
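As a quick arithmetic check, the component totals above can be reproduced directly from the memo’s own parameters (100 cases; 30-minute interviews; up to 5 recruiting calls of about 3 minutes each; a 10-minute technology pre-meeting; and a 5-minute pre-interview survey). The short calculation below simply restates those figures.

# Restates the burden figures given above; no assumptions beyond the memo's own numbers.
CASES = 100

components_minutes = {
    "interviews": CASES * 30,               # 3,000 minutes
    "recruiting calls": CASES * 5 * 3,      # 1,500 minutes
    "technology pre-meetings": CASES * 10,  # 1,000 minutes
    "pre-interview survey": CASES * 5,      #   500 minutes
}

for component, minutes in components_minutes.items():
    print(f"{component}: {minutes / 60:.1f} hours")   # 50.0, 25.0, 16.7, 8.3

total_hours = sum(components_minutes.values()) / 60
print(f"total burden: {total_hours:.0f} hours")       # 100 hours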


Contact: The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Melissa Cidade, Ph.D.

Data Collection Methodology & Research Branch

Economic Statistical Methods Division

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-8325

[email protected]



Cc:

Nick Orsini (ADEP) with enclosure

Carol Caldwell (ESMD) with enclosure

Diane Willimack (ESMD) with enclosure

Amy Anderson Riemer (ESMD) with enclosure

Melissa Cidade (ESMD) with enclosure

Krysten Mesner (ESMD) with enclosure

Lisa Donaldson (EMD) with enclosure

Kim Moore (EWD) with enclosure

William Samples (EWD) with enclosure

Theresa Riddle (EMD) with enclosure

Michelle Vile Karlson (EMD) with enclosure

James Jamski (EMD) with enclosure

Jennifer Hunter Childs (ADRM) with enclosure

Jasmine Luck (ADRM) with enclosure

Danielle Norman (PCO) with enclosure

Mary Lenaiyasa (PCO) with enclosure


