OMB Memo


Generic Clearance for Questionnaire Pretesting Research


OMB: 0607-0725




Generic Information Collection Request


Request: The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We propose to conduct an exploratory online study and in-person cognitive/usability testing to gather information about online map preference and use. In this submission, we are seeking approval for the in-person usability/cognitive interviews, which will be conducted by Center for Behavioral Science Methods (CBSM) staff. This is an addition to a prior submission that sought approval of the online questionnaire for the exploratory study.


Purpose: The results of this study will inform the design of a 2020 Census experiment, the Vacant Crowdsourcing Study. The study will ask Census respondents to identify any vacant housing units in their neighborhood using an online map. If the results of the experiment show that respondents can identify vacant units accurately, the experiment’s methods may be implemented in future censuses and other surveys to reduce the follow-up fieldwork needed to identify vacant housing units. Specifically, the findings from the current study will be used to help select the type of address display and the number of residential units to be displayed.


The goal of the in-person testing phase is to gather information for recommending (1) improvements to the mailing materials for the experiment; (2) the map(s)/list(s) to be used in this 2020 Census experiment; and (3) the number of housing units to be displayed on the recommended map(s)/list(s). The information gathered will include both quantitative and qualitative data about how participants use a map/list to complete tasks such as finding their home address on the map/list, finding the address that is four homes away from their home, and identifying vacant and/or occupied neighboring units. Quantitative data will include how long it takes participants to complete the tasks and whether they can perform the tasks accurately. Qualitative data will include any spontaneous comments participants make while completing the tasks and their responses to probing questions.


Population of Interest: People who will respond online to the Census.


Timeline: The in-person study will be conducted in April and May 2019. Recruiting for the in-person study will start in late April.


Language: Testing will be conducted in English only.



Sample: For the in-person phase, we will use a convenience sample of 30 participants from the Washington, D.C. metropolitan area. Participants will be members of the general public who we expect would respond online on behalf of their housing unit in 2020. We will recruit for this population by adding questions to the screening questionnaire about (1) whether the participant handles the mail in their household, as a proxy for being the person who would open the mail and answer the census, and (2) whether the participant prefers to do surveys on the web or by mail, as a proxy for being the person who would complete the census online. We will also recruit participants with various housing unit types (e.g., detached and attached housing, apartments, townhomes, and mobile homes), home ownership statuses (own, rent), numbers of years lived in their home, and population densities (urban, rural). Participants will have various educational levels and household compositions, including households with and without children and with related and unrelated adults.


Recruitment: Participants will be recruited by CBSM staff using the following methods: advertisement postings on Craigslist, broadcast messages distributed to Census headquarters staff, flyers, and word of mouth.


Method and Protocol: In an in-person testing session, the participant will read the mailing materials on paper and examine the map(s)/list(s) on either a Census Bureau-provided mobile device or laptop computer. Participants will first be asked to complete a demographic questionnaire. They will then be asked to read the mailing materials while thinking aloud and to give feedback in response to generic probing questions. After reviewing the mailing materials, participants will be asked to complete a questionnaire on Internet experience. They will then be shown an address display and will complete tasks while being timed and thinking aloud. After completing the tasks with one type of address display, participants will be asked to complete a confidence-in-accuracy questionnaire and a satisfaction questionnaire. They will then be shown each of the six address displays being tested in this phase, asked to rank them by preference, and asked to explain what they were thinking as they made their ranking.


Consent: Each participant will give written consent to participate in the study. We will inform participants that their participation is voluntary and that the information they provide is confidential and will be accessed only by employees involved in the research project. The consent form will also indicate that the respondent agrees to the interview being audio- and/or video-recorded to facilitate analysis of the results. Participants who do not consent to being recorded will still be allowed to participate.


Use of Incentive: For the in-person testing, we plan to offer an incentive of $40 to offset the costs of participation, such as travel and parking.


List of Materials: Below is a list of materials to be used in the current study. Included is a note on whether each attachment is new or has already been approved by OMB.


  1. Recruitment information, including additional screener questions, recruitment text, and flyer (Enclosure 1)

  2. Protocol used for the in-person phase study (Enclosure 2)

  3. Demographic questionnaire (Enclosure 3) (Previously approved by OMB for usability testing in the spring of 2015 for the 2015 Census Test.)

  4. Draft mailing materials (Enclosure 4)

  5. Address displays (Enclosures 5a – 5f)

  6. Mobile experience questionnaire (Enclosure 6)

  7. Tasks (Enclosure 7)

  8. Satisfaction and post-task confidence-in-accuracy questionnaire (Enclosure 8)

  9. Preference questionnaire (Enclosure 9)


Length of interview: For the in-person study, we estimate 60 minutes per participant.


The pre-approved generic screening questionnaire will take approximately ten minutes per person, and the additional screening questions specific to this research will take two minutes per person. We estimate that we will screen three people for each successful recruit for each of the 30 interviews, for a total of 90 people screened. At two minutes per person for the additional questions, this amounts to a screening burden of 3 hours for this request.


The total estimated respondent burden for this request is 33 hours.


Table 1. Total Estimated Burden

Category of Respondent            No. of Respondents    Participation Time    Burden
Screening                         90                    12 minutes            3 hours
Cognitive/Usability Interviews    30                    60 minutes            30 hours
Totals                                                                        33 hours



The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Shelley Feuer

Center for Behavioral Science Methods

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-0873

[email protected]



