Memo


Generic Clearance for Questionnaire Pretesting Research


OMB: 0607-0725


The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We will be conducting usability testing for the 2016 Annual Survey of Manufactures and 2016 Report of Organization (ASM and RO, respectively). The ASM and RO are related and similar programs: They both collect establishment-level data (as opposed to strictly enterprise-level) and use the same electronic collection system. The ASM collects data for industry classification, employment, measures of output, assets, expenditures, inventories, expenses, and other industry-specific inquiries from single- and multi-unit companies. The RO collects basic classification, employment, and sales information only from multi-unit companies across all sectors, in service of frame maintenance between economic censuses. The ASM and RO are currently collected electronically, using the downloadable Surveyor application (for multi-unit companies) and via the Census Bureau’s Centurion online collection system (for single-unit companies). The Census Bureau no longer mails paper ASM or RO forms, although we provide worksheets to assist respondents in preparing their responses for entry into the electronic collection systems.


This research will focus on a new reporting system in development for the 2016 ASM/RO, which will replace Surveyor. The new reporting system will provide the same functionality as the current systems, but with enhanced visual design in order to make it more user-friendly. It will also be a web-based instrument rather than a downloadable application. We plan to conduct usability interviews with respondents from the ASM/RO universe in order to test the layout and functionality of the prototype instrument. A functioning prototype may not be available in time for the first round of testing, in which case we intend to use paper mockups. In the first round we will focus on the effectiveness of the design of screens (labels, instructions, button placement, etc.) in conveying the intended use and functionality of each screen. A fully functioning prototype will be available for the second round of testing.


In addition to the new web instrument, in the first round we will evaluate other aspects of the future reporting system. We will conduct usability testing of a prototype of the eCorrespondence system, which will serve as the online authentication portal for the 2016 ASM/RO by which respondents will access the data collection instrument. We will evaluate prototypes of Excel spreadsheet reporting templates to be used by multi-unit companies (focusing on the visual design of enhanced instructions) and of PDF worksheets (focusing on content and potential uses by respondents). We will also present samples of follow-up letters printed on pressure-sealed envelopes (pre-glued paper that is then folded by a machine that applies high pressure to seal the letter into an envelope configuration) to gauge respondents’ reactions to receiving such a mailing. In the second round we will test the functionality and usability of a fully functioning prototype online instrument.


We plan to conduct usability interviews with respondents from up to 60 multi-unit companies that responded to the 2015 ASM/RO, split evenly between two rounds of testing (up to 30 respondents in each round). The first round of usability testing interviews will include the screen mockups of the multi-unit company electronic data collection instrument for the 2016 ASM/RO and the 2017 Economic Census, the eCorrespondence prototype website, the prototype worksheet, and the prototype spreadsheet. The second round of usability testing interviews will include the completed version of the new multi-unit company electronic data collection instrument for the 2016 ASM/RO.


Research Questions

  • eCorrespondence Prototype:

    • Can respondents access the test site?

    • Is the navigational path through the screens of the site logical to respondents?

    • Do button labels and descriptions make sense to respondents and inform respondents of the correct functions of the buttons?

    • Are respondents able to discern the functions of individual screens?

    • What additional features or information do respondents want in the MyCensus prototype?

  • Multi-unit (MU) Instrument Screen Mockups:

    • Does the flow of the instrument screens support respondents’ informational and decision-making needs with regard to reporting options and instrument functionality? If not, what other information do they need, and what are the optimal locations and formats for presenting such information?

    • Do button labels and descriptions make sense to respondents and inform respondents of the correct functions of the buttons? If not, what terms do respondents suggest?

    • Are respondents able to discern the functions of individual screens? If not, what are the sources of confusion, and how do they suggest making them clearer?

    • What features do respondents typically look for when completing the survey, and does our instrument meet their expectations?

    • What additional features do respondents want in the instrument?

  • Spreadsheet Prototype:

    • Do respondents understand the intended use of the spreadsheet template?

    • Is the visual design of the spreadsheet template effective in promoting readability of questions and instructions?

    • Would respondents use the spreadsheet, and if so, how? E.g., VLOOKUP, copy/paste, send to other respondents to gather data, etc.

    • What additional features or information do respondents want in the spreadsheet template?

  • Reporting Guide Prototype:

    • Would respondents use the worksheets, and if so, how? E.g., reviewing question requirements during the course of gathering data, writing down responses, writing down procedural notes, sending to data providers, receiving data from providers, retaining for records, etc.

    • How do respondents react to the format and content of our downloadable questions/instructions document mockup?

    • Do respondents have any suggestions for changing content or visual design?

    • Are there any other formats we should consider for presenting sufficient reporting instructions? E.g., writeable PDF.


  • Pressure-Sealed Envelopes:

    • How do respondents react to receiving a pressure-sealed envelope?

      • Are respondents familiar with pressure-sealed envelopes? Do they receive this type of mail from other government sources, and if so, which ones?

      • Do they perceive this kind of mailing to be a legitimate request from the Census Bureau?

      • Do they have any problems opening such an envelope?

      • Do they have any problems finding the information they need – information about the survey, legal information, due date, log-in information, etc.?

      • Do they anticipate any problems with the way such a mailing would be handled and processed in intra-company mail systems?



Interviews will be conducted from July through October 2016 by staff from the Data Collection Methodology & Research Branch within the Census Bureau’s Economic Statistical Methods Division. Subject matter and collection operations staff may participate as observers of the interviews as they are able. Interviews will be audio recorded with respondents’ permission. After respondents agree to participate, they will schedule a meeting time and will receive confirmation emails about their appointments. Participants will be informed that their responses are voluntary and that the information they provide is confidential and will be seen only by Census Bureau or special sworn employees involved in the research project. We will not be providing monetary incentives to participants in this study.


We expect to interview one respondent at each business and estimate that each interview will take up to 1 hour (60 cases X 1 hour per case = 60 hours). Additionally, to recruit respondents we expect to make up to 5 phone contacts per completed case. The recruiting calls are expected to last on average two minutes per call (5 phone calls per completed case X 60 cases X 2 minutes = 10 hours). Thus, the estimated burden for this project is 70 hours (10 hours for recruiting + 60 hours for interviews).

Enclosed are copies of the screen mockups of the 2016 ASM/RO multi-unit company instrument, screenshots of the eCorrespondence website, a prototype worksheet, a prototype spreadsheet, draft content of a follow-up letter that will be printed on pressure-sealed envelopes, and a draft interview protocol.


The contact person for questions regarding data collection and statistical aspects of the design of this research is:


Aryn Hernandez

Data Collection Methodology & Research Branch

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-7982

[email protected]



Enclosures

cc:

William Bostic (DIR) with enclosures

Nick Orsini (DIR) " "

Kevin Deardorff (EWD) " "

Lisa Donaldson (EMD) " "

Carol Caldwell (ESMD) " "

Cynthia Hollingsworth (EWD) " "

Shelley Karlsson (EMD) " "

Julius Smith (EWD) " "

Carma Hogue (ESMD) " "

Diane Willimack (ESMD) " "

Bill Davie (ESMD) " "

James Jamski (EMD) " "

Theresa Riddle (EWD) " "

Danielle Norman (PCO) " "

