OMB Memo

Generic Clearance for Questionnaire Pretesting Research

OMB: 0607-0725

Generic Information Collection Request:
Addendum for Usability Testing of a Self-Administered Current Population Survey (CPS)


Request: The Census Bureau and Bureau of Labor Statistics, through a contract with RTI International, plan to conduct usability testing under the generic clearance for questionnaire pretesting research (OMB number 0607-0725) for a self-administered Current Population Survey (CPS).


Addendum to Original Request: This is an addendum to the cognitive and usability testing for a self-administered CPS under the generic clearance that was approved November 9, 2023. This addendum covers only the usability testing portion of the original request. The revisions to the usability testing request include (1) recontacting participants from Round 1 for Round 2 interviews, (2) increasing the amount of the incentives provided to participants, (3) decreasing burden time due to screening in fewer people for the interviews, (4) increasing burden due to the time required to recontact individuals for a second interview, and (5) switching to the combined race and ethnicity question for the screener.


Purpose: Efforts are underway to add an internet self-response option for the CPS, which is currently conducted only in person and by telephone. A self-administered version of the CPS presents many challenges, particularly with regard to respondent burden and data quality. Several core CPS items require interviewer probing to elicit sufficient detail from respondents, such as descriptions of job titles and job duties. There are also questions that are interviewer-coded based on open-ended responses (e.g., a question on which techniques respondents looking for work used to try to find a job). The high visibility of the data and the complexity of these challenges mean that thorough testing of a self-administered CPS must be completed before this mode is implemented in production.


Therefore, in partnership with the Bureau of Labor Statistics, the Census Bureau intends to conduct usability testing of a CPS internet instrument to evaluate how participants respond to and navigate through the CPS. The CPS is a longitudinal survey in which households participate in multiple waves. The goal of the study is to conduct usability testing on the month-in-sample (MIS) 1 and MIS-2 internet instruments. There are important differences between the MIS-1 and MIS-2 instruments that we must test, including (1) different login procedures, (2) differences in the household roster, and (3) the use of dependent data in MIS-2. To mirror the CPS data collection strategy and to enable testing of the MIS-1 and MIS-2 instruments, we plan to interview the same participants in both Round 1 and Round 2. The same participants must be interviewed so that dependent data from the first instrument can be fed back into the MIS-2 internet instrument.


Population of Interest: The CPS draws a nationally representative sample of households in the United States. We plan to recruit from this general population as described further below.


Timeline: Usability testing will be conducted from June 2024 through January 2025.


Language: Testing will be conducted in English.


Sample: We plan to conduct two rounds of usability testing. For Round 1, there will be 80 interviews using the CPS MIS-1 internet instrument. In Round 2, we will re-interview 50 of the Round 1 participants using the MIS-2 instrument. In total, there will be 130 usability interviews.


Recruitment: Respondents will be recruited through a variety of methods. RTI will use methods such as posting fliers at local community organizations, putting advertisements in community newspapers, and collaborating with staff at community-based organizations to identify potential respondents. RTI will also post advertisements on Craigslist.com and social media sites such as Facebook and Reddit. Finally, broadcast messages will be distributed to Census Bureau and RTI staff to leverage personal connections.

Recruitment for the Round 1 interviews will primarily focus on identifying respondents who meet certain criteria related to critical paths through the instrument. Specifically, we will recruit respondents who are:

  • Employed full-time (35+ hours per week)

  • Employed part-time (less than 35 hours per week)

  • Working more than one job and whose hours generally vary (including gig economy workers)

  • Self-employed

  • Unemployed and actively looking for work

  • Not in the labor force (retired or disabled)

  • Not working and not looking for work, or only passively looking for work

If possible, we will also recruit respondents who are discouraged workers marginally attached to the labor force (looked for work in last year but not in last 4 weeks) and people who had a deviation in their usual work schedule during the CPS reference week (e.g., on vacation, in school, temporarily ill, on jury duty). However, we anticipate that the small size of these populations will make recruitment difficult and therefore will concentrate on the bulleted characteristics above.

Most (80%) of the respondents will live in a household with at least one other person since proxy response is a design feature of the CPS. We will also aim for demographic diversity in sex, education, race/ethnicity, geographic region of the country, and household composition. Since this research is focused on a self-administered web version of the CPS, interviews will be conducted with people who are likely to respond online (i.e., those who use the internet frequently). We will recruit respondents to complete the CPS on both laptop/desktop and mobile devices. The screening questions and advertising materials are attached (see Attachments 1 and 2).

We will schedule the Round 2 interview date and time for 50 participants at the end of Round 1. Each participant with a scheduled Round 2 interview will receive an email approximately halfway between Round 1 and Round 2 reminding them of the second interview and encouraging them to participate. A few business days prior to the Round 2 interview, they will receive one last reminder confirming that they will participate (see Attachment 3). The other 30 participants, who will not be scheduled for a Round 2 interview right away, will be informed that they may be contacted to participate in an additional round.

Method: Staff from RTI will conduct usability sessions remotely using Microsoft Teams.


Protocol: In Round 1 of the usability sessions (see Attachment 4a), participants will complete a self-administered web instrument of the month-in-sample one (MIS-1) survey, which is the initial survey CPS respondents receive in the panel. In Round 2 of the usability sessions (see Attachment 4b), participants will complete a self-administered web instrument of the month-in-sample two (MIS-2) survey, which is a subsequent survey they will complete in the panel. Participants will start the protocol by reviewing a letter to assist them with the instrument login procedures (see Attachment 5). In both rounds, participants will be asked to think out loud as they complete the surveys, as they would if they were at home (see Attachment 6 for the CPS self-response questionnaire). The interviewer will do minimal probing during the survey-taking, saying things such as "keep talking" and "what are you thinking?" After they finish, participants will be debriefed about the responses they gave, especially to measure trends in responses from Round 1 to Round 2. At the end of both rounds, the respondent will complete a satisfaction questionnaire (see Attachment 7) and be asked general debriefing probes about their experience.


Consent: We will inform respondents that their response is voluntary and that the information they provide is confidential and will be accessed only by employees involved in the research project. The consent form will also indicate that the respondent agrees that the interview can be audio- and video-recorded to facilitate analysis of the results (see Attachment 8). Verbal consent will be captured on the recordings. Respondents who do not consent to being video- and/or audio-recorded will still be allowed to participate.


Incentive: In Round 1, participants will receive $50. In Round 2, participants will receive $60. They will be paid a higher incentive to motivate them to participate in the additional interview. Due to the longitudinal nature of this study, it is important to attract and retain the participants. The second interview will also be more burdensome, with more in-depth debriefing probes about the dependent data.


Length of Interview: We estimate that each of the 130 interviews will take approximately one hour. This results in a burden of 130 hours.


The screening questionnaire specific to this research will take approximately 15 minutes per person. We estimate that we will screen 6 people for each successful recruit for each of the 80 interviews in Round 1. Therefore, we estimate a total of 480 people screened for a total of 120 hours (480 people at 15 minutes each). Since we are re-interviewing 50 participants in Round 2, we will not need to screen any additional people, but we will need to recontact participants and remind them of their next interview time. We anticipate about 14 hours of burden for recontacting and when necessary, rescheduling participants.


Thus, the total estimated burden for usability testing is 264 hours.
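The burden arithmetic above can be cross-checked with a short script. This is only an illustrative sketch; the respondent counts and per-person times are taken directly from the estimates above, and the rounding-up step reflects how the 80 × 10-minute recontact figure (13.3 hours) is reported as 14 hours.

```python
from math import ceil

# Burden components from the memo: (number of respondents, minutes per respondent)
components = {
    "screening": (480, 15),
    "recontact": (80, 10),
    "interviews": (130, 60),
}

# Convert each component to hours, rounding up to the next whole hour
hours = {name: ceil(n * minutes / 60) for name, (n, minutes) in components.items()}
total = sum(hours.values())

print(hours)  # {'screening': 120, 'recontact': 14, 'interviews': 130}
print(total)  # 264
```

The computed component hours and the 264-hour total match the figures reported in Table 1 below.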


Table 1. Total Estimated Burden

  Category of Respondent                                    No. of Respondents   Participation Time   Burden
  Screening                                                 480                  15 minutes           120 hours
  Recontact participants to schedule Round 2 interviews     80                   10 minutes           14 hours
  Interviews                                                130                  60 minutes           130 hours
  Totals                                                                                             264 hours



Below is a list of materials to be used in the current study:

Attachment 1. Screening questionnaire

Attachment 2. Recruitment methodology and ads

Attachment 3. Recontact email between interviews

Attachment 4a. Round 1 Usability protocol

Attachment 4b. Round 2 Usability protocol

Attachment 5. Contact Letters for login

Attachment 6. CPS Self-administered questionnaire

Attachment 7. Usability testing satisfaction questions

Attachment 8. Consent Form


Contact: The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Renee Stepler

Center for Behavioral Science Methods

U.S. Census Bureau

Washington, D.C. 20233

202-295-7635 

[email protected]


Jonathan Katz

Center for Behavioral Science Methods

U.S. Census Bureau

Washington, D.C. 20233

301-763-5956

[email protected]



