
2020/22 Beginning Postsecondary Students Longitudinal Study (BPS:20/22)

Cognitive and Usability Testing

Volume I

Supporting Statement

OMB # 1850-0803 v. 260

Submitted by

National Center for Education Statistics

U.S. Department of Education


December 2019

Attachments:

Attachment I – Recruitment Procedures and Materials

Attachment II – Eligibility Screening Questions

Attachment III – Consent to Participate in Research

Attachment IV – Interview Protocol

Attachment V – Interview Facsimile

Contents

Submittal-Related Information

Background

Design and Context

Estimated Respondent Burden

Estimate of Costs for Recruiting and Paying Respondents

Estimate of Cost Burden

Cost to Federal Government

Assurance of Confidentiality

Schedule for BPS:20/22 OMB requests and related activities


Tables

Table 1. Screening and participant numbers by respondent type

Table 2. Estimated respondent burden


Submittal-Related Information

The following material is being submitted under the National Center for Education Statistics (NCES) generic clearance agreement (OMB# 1850-0803), which provides NCES the capability to improve its data collection instruments by conducting testing, such as usability tests, focus groups, and cognitive interviews, aimed at refining methodologies, survey questions, and/or delivery methods.

This request is to conduct cognitive and usability testing starting in January 2020 in preparation for the 2020/22 Beginning Postsecondary Students Longitudinal Study (BPS:20/22) field test data collection (OMB# 1850-0631), which will begin in March 2021. RTI International will collect BPS:20/22 data on behalf of NCES under contract to the U.S. Department of Education. EurekaFacts is RTI’s subcontractor for aspects of the BPS:20/22 cognitive and usability testing.

This submission describes the cognitive and usability testing recruitment, screening, and procedures designed to ensure quality, performance, and reliability of the tested questions and of the overall survey usability. The results will be presented to a Technical Review Panel (TRP) for discussion of potential survey modifications prior to the field test and will be submitted to OMB for review in November 2020 as part of the BPS:20/22 field test data collection request.

Background

Students in the BPS:20/22 sample are initially identified in the 2019-20 National Postsecondary Student Aid Study (NPSAS:20). NPSAS:20 is a cross-sectional study that examines the characteristics of students in postsecondary education, with a special focus on how they finance their education. BPS:20/22 is the first follow-up survey with a subsample of NPSAS:20 sample members who were identified as first-time beginning college students (FTBs) during the 2019-20 academic year. A second BPS follow-up is planned for 2025.

As a longitudinal study, BPS:20/22 is designed to follow a cohort of students who enroll in postsecondary education for the first time during the NPSAS academic year of interest, irrespective of the date of high school completion. The study collects data on student persistence in, and completion of, postsecondary education programs; their transition to employment; demographic characteristics; and changes over time in their goals, marital status, income, and debt, among other indicators. Data from BPS are used to help researchers and policymakers better understand how financial aid influences persistence and completion, what percentages of students complete various degree programs, what early employment and wage outcomes are for certificate and degree attainers, and why students leave postsecondary education.

The cognitive and usability testing described in this submission will allow NCES to test selected survey questions that are either new to this BPS cohort or have been revised from existing questions before their inclusion in the BPS:20/22 field test data collection. The testing of these questions is intended to ensure the quality, performance, and reliability of data elements pertaining to persistence, attainment, and labor market outcomes to support the overarching purpose of BPS. Specifically, this includes questions intended to collect data on respondents’ months of enrollment in postsecondary education and enrollment intensity; education experiences, including academic and social activities; emergency aid; employment history; housing security and homelessness; and food security.

Results from cognitive and usability testing will be used to refine the survey questions, maximize the quality of the data collected, and provide information on issues with important implications for the survey design, such as the following:

  • The comprehension of certain terms in survey questions, including updated and added terminology;

  • The thought processes used to arrive at answers to survey questions;

  • Appropriate response categories to questions;

  • Sources of burden and respondent stress;

  • User interaction with the survey, which has been optimized to adjust to different screen sizes, including smaller mobile devices; and

  • Ease of survey navigation on all devices, including desktop, laptop, and mobile devices (tablet or smartphone).

Design and Context

The cognitive and usability testing described in this generic clearance package will be conducted with individuals who have similar characteristics to those in the BPS:20/22 cohort. EurekaFacts staff, who have extensive experience in cognitive and usability testing methodologies, will recruit cognitive and usability testing participants, conduct the interviews, compile interview video and audio recordings, and report the results.

Cognitive and usability testing will be conducted simultaneously using a subset of questions proposed for inclusion in the BPS:20/22 field test survey. For the cognitive testing component of the interviews, respondents will read the questions quietly to themselves and will be asked to “think aloud” as they come up with their responses to each question. The interviewer will prompt them to explain the mental steps they took to arrive at each answer. Interviewers will also use “general probing” throughout the interview when respondents show signs of difficulty with a question, in order to identify the source of confusion (see Attachment IV for a list of general probes). In addition, “specific probes” will be administered on targeted questions after the respondent has read the question and provided a response. After a response has been provided and the respondent clicks “Next,” a prompt will appear on screen alerting the interviewer and respondent that we are interested in learning more about that question through specific probes. Attachment V provides a list of the targeted survey questions and their specific probes. These three types of probing (i.e., think-aloud, general, and specific probes) will help identify how respondents understand the questions and formulate their answers, which will in turn help evaluate and revise question wording as needed.

For the usability testing component of the interviews, interviewers will observe and probe on respondents’ ease of navigation through the survey and will debrief respondents following the survey on their overall experience answering questions on their desktop, laptop, or mobile device (as applicable).

A total of 30 respondents will be invited to participate either in person or remotely, according to their location preference and the type of device on which they will complete the survey. If participating remotely, respondents will be required to have access to a desktop or laptop computer with a high-speed internet connection to establish an audio and video connection with the interviewer.

Remote testing is convenient and flexible for respondents because they can schedule the session to fit their needs and can participate from their home, institution, or another location. It allows respondents to use the survey in a real-world environment rather than in a lab setting. EurekaFacts’ web-based remote interviewing/usability solution includes webcam technology, video streaming, and an audio connection to provide real-time, face-to-face interaction between the respondent and interviewer as both view the respondent’s screen. This enables the efficacy of self-administered surveys to be evaluated on both laptop and desktop computers. Respondents who do not have access to a webcam will have one shipped to their address for use during the testing session. After the testing is completed, respondents will return the webcam using postage-paid packaging provided by EurekaFacts.

All interviews will be conducted through an audio connection while both the respondent and interviewer view the survey. Respondents recruited for usability testing on a mobile device (e.g., tablet or smartphone) will be interviewed at the EurekaFacts facilities to ensure proper positioning of a camera over the mobile device, allowing the respondent’s actions to be recorded as they navigate the survey and letting both the interviewer and respondent view the mobile screen in real time. Remote observers can log on, watch the respondent’s face, listen to the interview, follow the respondent’s screen as they complete the survey on their computer or mobile device, and listen to the debriefing. Remote observers will also be able to communicate with each other and the interviewer through e-mail.

The cognitive and usability testing sample will include individuals who first began their postsecondary education between July 1, 2017 and June 30, 2018, and who have characteristics similar to those of sample members who will participate in the BPS:20/22 field test data collection (as identified in the eligibility screener; see Attachment II for the specific eligibility screener questions). The sample will include degree and certificate completers, students who left prior to completing a credential, and students who are still enrolled. Individuals will be recruited from institutions with varied characteristics, including level of degree offered (less-than-2-year, 2-year, and 4-year) and control (public, private not-for-profit, private for-profit). See Table 1 for the expected number of testing participants by respondent type.

Recruits will be identified using EurekaFacts’ database of potential research respondents in the Washington, DC metro area. The database includes information on key demographic criteria, including gender, age, and race/ethnicity, which will be used to diversify the sample within the constraints identified in the previous paragraph. Referrals, advertisements in student newspapers and online forums, and social media postings may also be used to recruit respondents. All recruitment of potential cognitive interview respondents will be conducted using an online recruitment screener containing eligibility criteria questions specific to this study to ensure that testing participants qualify for the study.

Audio and video recordings of each interview will be available to NCES and BPS:20/22 staff at RTI for review. Either during each interview or immediately following its conclusion, EurekaFacts will code participant responses and organize observations. Upon completion of all interviews, EurekaFacts will analyze the combined data and summarize the common themes and insights from the interviews in a final written report.

Table 1. Screening and participant numbers by respondent type

Respondent type             Screened    Testing participants
Currently enrolled            265               20
Not currently enrolled        135               10
Total                         400               30

Attachment I in this submission presents the procedures and materials that will be used for recruitment of testing participants; Attachment II the screening questions that will be used to determine eligibility for cognitive and usability testing; Attachment III the consent forms; Attachment IV the cognitive and usability testing protocol; and Attachment V a facsimile of the survey, including a table listing questions with specific probes.

Estimated Respondent Burden

To yield 30 completed interviews, we anticipate screening up to 400 individuals to determine eligibility and to ensure that the desired distribution of respondent characteristics is achieved. The screening process is estimated to take about 5 minutes per person, on average (see Attachment II). Each cognitive and usability testing session will last a maximum of 60 minutes.

Table 2. Estimated respondent burden

Activity                             Number of      Number of     Minutes per     Maximum total
                                     respondents    responses     respondent      burden hours
Screening                                400            400             5                33
Cognitive and usability interview         30*            30            60                30
Study total                              400            430                              63

* Subset of the screened group.
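
As a check on Table 2, the burden-hour totals follow directly from the respondent counts and per-activity minutes shown above (screening hours are rounded to the nearest whole hour):

\[
\begin{aligned}
\text{Screening:} \quad & 400 \text{ respondents} \times 5 \text{ minutes} = 2{,}000 \text{ minutes} \approx 33 \text{ hours} \\
\text{Interviews:} \quad & 30 \text{ respondents} \times 60 \text{ minutes} = 1{,}800 \text{ minutes} = 30 \text{ hours} \\
\text{Study total:} \quad & 33 + 30 = 63 \text{ hours}
\end{aligned}
\]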

Estimate of Costs for Recruiting and Paying Respondents

To recruit a representative range of respondents, and to thank them for their time and participation, we will offer prospective participants a $50 gift card from a major credit card company for completing the 60-minute cognitive and usability interview.
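
Assuming one $50 gift card for each of the 30 completed interviews, the total incentive cost implied by these figures is:

\[
30 \text{ interviews} \times \$50 = \$1{,}500
\]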

Estimate of Cost Burden

There are no direct costs for respondents.

Cost to Federal Government

The cost to the federal government for conducting cognitive interviews will be $86,067 under the EurekaFacts subcontract to RTI. This cost includes recruitment, conducting interviews, analyses, report writing, and participant incentives.

Assurance of Confidentiality

Cognitive and usability testing respondents will be informed that their participation is voluntary and that:

EurekaFacts and RTI International are carrying out this research for the National Center for Education Statistics (NCES), part of the U.S. Department of Education. NCES is authorized to conduct this study by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

All respondents will be assigned a unique identifier (ID), created solely for data file management and used to keep all materials for each respondent together. The respondent ID will not be linked to the respondent’s name. Respondents will be sent a consent form via e-mail, which they will need to sign electronically and return to EurekaFacts to confirm their participation. The signed consent forms will be kept separate from the survey data files for the duration of the study, and all records, including the audio and video recordings obtained during the administration of this study, will be destroyed after the final report is completed.

Schedule for BPS:20/22 OMB requests and related activities

EurekaFacts will begin recruiting for the cognitive and usability testing upon receiving OMB clearance, and the testing is scheduled to begin by February 2020. Informed by the testing, a final draft of the survey will be used in a field test with approximately 3,500 sample members, beginning in March 2021.

Recruit participants                        January – February 2020
Conduct cognitive testing                   February – April 2020
Finalize revisions to item wording          April – October 2020


