
2020/22 BEGINNING POSTSECONDARY STUDENTS LONGITUDINAL STUDY (BPS:20/22)

Field Test


Appendix D

Cognitive and Usability Testing Summary



OMB # 1850-0631 v. 18

Submitted by

National Center for Education Statistics

U. S. Department of Education



August 2020




Background

The National Center for Education Statistics (NCES) is preparing the first longitudinal follow-up to the cross-sectional 2019-20 National Postsecondary Student Aid Study (NPSAS:20), which examines the characteristics of students in postsecondary education, with a special focus on how they finance their education. The Beginning Postsecondary Students Longitudinal Study (BPS:20/22) is the first follow-up survey with a subsample of NPSAS:20 sample members who were identified as first-time beginning college students (FTBs) during the 2019-20 academic year. Data from BPS are used to help researchers and policymakers better understand how financial aid influences persistence and completion, what percentages of students complete various degree programs, the early employment and wage outcomes of certificate and degree attainment, and why students leave postsecondary education.

To ensure the quality, performance, and reliability of the survey questions, EurekaFacts conducted cognitive and usability testing of a portion of the questions that are either new to this BPS cohort or revised from existing questions, before their inclusion in the study. Full details of the cognitive interview components were originally approved in December 2019, OMB# 1850-0803 v. 260 (revised and approved in April 2020, OMB# 1850-0803 v. 267). Insights from testing will help to refine survey questions, maximize data quality, and provide information on issues with important implications for the overall survey design, such as: 

  1. Comprehension of certain terms in survey questions, including updated and added terminology; 

  2. The thought processes used to arrive at answers to survey questions; 

  3. Appropriate response categories to questions; 

  4. Sources of burden and respondent stress; 

  5. User interaction with the survey, which has been optimized to adjust to different screen sizes, including smaller mobile devices; and 

  6. Ease of survey navigation on all devices, including desktop, laptop, and mobile devices (tablet or smartphone). 





Executive Summary

Introduction

The present study focuses on the Beginning Postsecondary Students Longitudinal Study (BPS:20/22) student survey, investigating issues and preferences regarding item formulation and comprehension as well as pinpointing sources of substantial participant burden.

Sample

A total of 28 students participated in one-on-one in-person or remote cognitive and usability interviews with EurekaFacts staff between March 2, 2020 and May 21, 2020. Of these participants, 71% were female and 29% were male. Most participants (89%) were aged 18-24, while the rest were either 25-29 (7%) or 40-49 (4%). Further, 39% of the sample identified as White, 29% identified as Black or African American, 7% identified as Asian, and another 7% preferred not to answer. One participant identified as mixed race White and Asian. Additionally, 14% of the sample also identified as being of Hispanic or Latino origin. Concerning education, 82% of the sample was pursuing a bachelor’s degree while 18% were pursuing an associate degree.

Key Findings

  1. Overall, participants did not closely read instructions, especially when instructions included a lot of text. Burden and difficulty could be reduced by providing more concise directions for survey questions in which participants exhibited low comprehension.

  2. When difficulty occurred on calendar-style items, it was a significant source of burden due to the format of the selection options. Instructions should emphasize the selection of all months of attendance.

  3. When comparing question versions that collect the same information, participants favored items perceived as more concise (e.g., participants preferred the single enrollment intensity item B22ASTST2, rather than the two questions B22ANENRLFTA and B22ANENRLPTA; and participants preferred the simplified household composition item B22FHHWHO, rather than the detailed B22FHHNUM).

  4. Participants relied heavily on examples when they were included in item wording and suggested adding examples to items with difficult concepts where none were provided.

  5. With a few exceptions, participants could generally report what they needed to with the response options included in the survey.



Study Design

Sample

A sample of 28 students whose first postsecondary enrollment occurred between January 1, 2017 and June 30, 2019 participated in the present study. Table 1 provides a breakdown of participant demographics.

Table 1. Number of cognitive interview participants, by demographic characteristics: 2020

Demographic characteristics              Total (N = 28)
Gender
    Female                               20
    Male                                  8
Age
    18-24                                25
    25-29                                 2
    40-49                                 1
Race
    Asian                                 2
    Asian/White                           1
    Black or African American             8
    White                                15
    Prefer not to answer                  2
Hispanic/Latino Origin
    Yes                                   4
    No                                   23
    Prefer not to answer                  1
Income
    Less than $20,000                    15
    $20,000 to $49,999                    3
    $50,000 to $99,999                    2
    $100,000 or more                      5
    Prefer not to answer                  3
Degree Pursued
    Associate degree                      5
    Bachelor’s degree                    23







Recruitment and Screening

To qualify for participation, each respondent had to be enrolled in a college, university, or trade school between January 1, 2017 and June 30, 2019. Further, students had to be attending their first postsecondary institution and working on their first postsecondary degree or certificate since completing high school.

EurekaFacts utilized an internal panel of individuals as well as targeted recruitment of individuals aged 18 to 65 from across the country. All recruitment materials, including but not limited to initial outreach communications, study advertisements and flyers, reminder and confirmation e-mails, and informed consent forms, underwent OMB approval. Recruitment materials and advertisements were distributed across social media platforms such as Facebook and Instagram. Due to the coronavirus pandemic, certain recruitment methods were limited. In-person outreach and canvassing at public and private 2- and 4-year institutions in the Washington, DC metropolitan area were conducted until mid-March 2020.

To ensure that respondents met the qualification criteria, all potential participants completed a screening survey. Participants either self-screened using an online web-intake form or were screened by EurekaFacts staff over the phone. All participants were screened using an OMB-approved screener script programmed into CATI-like software (Verint) to guarantee that the screening procedure was uniformly conducted and instantly quantifiable. During screening, all participants were provided with a clear description of the research, including its burden, confidentiality, and an explanation of any potential risks associated with their participation in the study. Qualified participants whose self-screener responses fully complied with the specified criteria were then contacted by phone and scheduled to participate in a cognitive interview. At the time of scheduling, participants were required to re-answer items from the screener identifying their school and date of first enrollment; respondents who provided inconsistent responses were excluded from participation. Participants were recruited to maintain a good mix of demographics, including gender, race/ethnicity, and socioeconomic status, as shown in Table 1. All interview participants received a $50 incentive as a token of appreciation for their efforts.

Ensuring participation 

To ensure maximum participation, all scheduled participants received a confirmation e-mail that included the consent form, date, time, and location of the interview. Confirmation e-mails for in-person participants included a map and directions for how to reach the EurekaFacts office. Those who were participating remotely received a link to the virtual meeting room along with instructions for how to set up their technology for the interview. Additionally, participants who were scheduled more than two days prior to the actual date of the session received a reminder e-mail 48 hours prior to the interview. All participants received a follow-up reminder e-mail and a reminder telephone call within 24 hours of their scheduled interview time to confirm participation and respond to any questions. Every person scheduled was required to return a signed consent form prior to participation. Participants who came into the EurekaFacts office had the opportunity to review and sign the consent form during check-in; those participating remotely had to return their signed forms over e-mail prior to their scheduled interview time.

Data Collection Procedure

EurekaFacts conducted 28 one-hour interviews. Two interviews were conducted in-person at the EurekaFacts research facility in Rockville, MD, and 26 interviews were conducted online using GoToMeeting, a virtual meeting hosting application. Interviews occurred between March 2, 2020 and May 21, 2020. To follow safety guidelines, in-person interviews were not offered after March 16, 2020. Data collection followed standardized policies and procedures to ensure privacy, security, and confidentiality as well as to maximize the reliability of the data collected. EurekaFacts established two standardized data collection procedures in this instance: one for participants coming to the EurekaFacts office for interviews and one for participants electing for online participation.

For in-person interviews, upon their arrival to the EurekaFacts office, participants were welcomed and asked to sign-in. Written consent was then obtained before participants were escorted to the interview room. For online participants, signed consent forms were required prior to the participant joining the online meeting room.

Written consent was obtained via e-mail prior to the virtual interview for most participants. However, participants that did not return a consent form prior to joining the GoToMeeting session were sent a friendly reminder via private message by a EurekaFacts staff member. The consent forms, which do include the participants’ names, are stored separately from their interview data and are secured for the duration of the study. The consent forms will be destroyed three months after the final report is released.

Cognitive and Usability Interview Procedure

Remote Interviews. Prior to each virtual interview, a EurekaFacts employee created a GoToMeeting with a unique URL. All participants were sent a confirmation e-mail that confirmed the date and time of the interview, included the unique link to the online meeting, and had the consent form attached. In addition to the consent forms, participants were provided with step-by-step instructions on how to enter the meeting room. To allow enough time for technological troubleshooting, participants were requested to enter the virtual meeting room five minutes prior to the start time of the session. Virtual participants appeared in the meeting window via webcam and microphone. At the scheduled start time, the interviewer began the interview.

Interview sessions progressed according to an OMB-approved script and interviewer protocol. The participant was reminded about the purpose of the interview, the confidentiality of their responses, the voluntary nature of participation, and the nature and purpose of the recording. After confirming that the participant was ready for the recording to begin, the interviewer initiated the audio-video recording. Using Google Chrome, the participant was then logged into the BPS survey platform and the browser was shared into the meeting window. The participant was given control of the mouse and keyboard and proceeded through the survey at their own pace, thinking aloud and responding to interviewer probing where necessary. At the end of the survey or at the end of 60 minutes, whichever occurred first, the participant was given the opportunity to make general commentary about the survey before the recording was terminated and the participant dismissed. Remote participants received a $50 Visa gift card over e-mail as a token of appreciation for their efforts.

In-person Interviews. In-person interviews were only scheduled from March 2, 2020 through March 16, 2020 due to the coronavirus pandemic. Prior to each interview at the EurekaFacts office, a staff member created a unique URL and call-in ID for observation and recording purposes. All participants received a confirmation e-mail which confirmed the date and time of the interview and included a copy of the consent form. Participants were checked in at the front desk, where they were offered the consent form a second time, and then escorted to the interview room by a EurekaFacts host at the scheduled start time. Inside the interview room, participants were seated beside the interviewer who set up the participant’s device and shared their view of the survey into the GoToMeeting.

Interview sessions progressed according to an OMB-approved script and interviewer protocol. The participant was reminded about the purpose of the interview, the confidentiality of their responses, the voluntary nature of participation, and the nature and purpose of the recording. After confirming that the participant was ready for the recording to begin, the interviewer initiated the audio-video recording. The participant proceeded through the survey at their own pace, thinking out loud and responding to interviewer probing where necessary. At the end of the survey or at the end of 60 minutes, whichever occurred first, the participant was given the opportunity to make general commentary about the survey before the recording was terminated and the participant dismissed. Each participant who completed an interview at EurekaFacts was provided with a $50 Visa gift card as they checked out prior to exiting the EurekaFacts facility.

Coding and Analysis

The interview sessions were audio and video recorded using GoToMeeting’s record functionality. After each session, standardized data-cleaning guidelines were used to review the recording and produce a data file containing a high-quality transcription of each participant’s commentary and behaviors. The completely anonymized transcriptions and observations tracked each participant’s contributions from the beginning of the session to its close. As the first step in data analysis, coders’ documentation of the interview sessions in the data file included only records of verbal reports and behaviors, without any interpretation.

Two staff members reviewed the data following the completion of the data file. One reviewer cleaned the data file by reviewing the audio/video recording to ensure all relevant contributions were captured. In cases where differences emerged, the reviewer and coder discussed the participants’ narratives and their interpretations thereof, after which any discrepancies were resolved. The second reviewer conducted a spot check of the data file to ensure quality and final validation of the data captured.

Once all the data was cleaned and reviewed, research analysts began the formal process of data analysis which involved identifying major themes, trends, and patterns in the data and taking note of key participant behaviors. Specifically, analysts were tasked with classifying patterns within the participants’ ideas, in addition to documenting how participants justified and explained their actions, beliefs, and impressions. Each topic area was analyzed using the following steps:

  1. Getting to know the data – Several analysts read the data file and viewed the video recordings to become familiar with the data. Analysts recorded impressions, considered the usefulness of the presented data, and evaluated any potential biases of the interviewer.

  2. Focusing on the analysis – The analysts refamiliarized themselves with the purpose of the interviews and research questions, documented key information needs, and focused the analysis by question or substantive topic.

  3. Categorizing information – The analysts gave meaning to participants’ words and phrases by identifying themes (e.g., categorized patterns of participants’ survey responses, qualitative feedback, and responses to interview probes).

  4. Developing codes – The analysts developed codes based on the emerging themes to organize the data. Differences and similarities between emerging themes were discussed and addressed in efforts to clarify and confirm the codes.

  5. Identifying patterns and connections within and between categories – Multiple analysts coded and analyzed the data. They summarized each coded theme, identified similarities and differences, and combined related codes into broader ideas/concepts. Additionally, analysts assessed each theme’s importance based on its severity and frequency of reoccurrence.

  6. Interpreting the data – The analysts used the themes and connections to the broader concepts to explain findings in relation to the research questions (e.g. quality, performance, and reliability of the survey). Credibility was established through analyst triangulation, as multiple analysts cooperated to identify themes and to address differences in interpretation.

Limitations

The key findings of this report were based solely on analysis of the student cognitive interview observations and discussions. Because the coronavirus pandemic arose during data collection, fewer in-person interviews than intended were scheduled and conducted. Further, because only two in-person interviews were conducted, only a few participants could participate from smartphones and tablets, as these required technological capabilities that could only be ensured at the EurekaFacts office. The project was further limited by the specific recruitment requirements. As a result, the demographic makeup of the participants does not perfectly match the BPS:20/22 sample.

Based on survey responses and session-specific time constraints, some survey questions were not seen by all participants, thus the corresponding interview probes were not administered in those instances. Further, although EurekaFacts provided participants with a guide on how to set up for the interview and encouraged participants to follow the guide prior to the interview time, this did not always occur. As a result, technological difficulties regarding participant set-up contributed to constrained time in a few interviews. Consistent with the nature of interviews requiring participant think-aloud, time constraints increased with the talkativeness of the participant.

Findings

The following section of the report provides qualitative results of the cognitive interviews, organized by survey section. Assessments of accuracy and clarity, along with results from interview probes that collected feedback on specific survey questions, are discussed, including participants’ confidence, ease or difficulty of responding, version preference, and definitions of key terminology. Findings are reported along the major themes, trends, and patterns found during data analysis. The qualitative methodology of cognitive and usability testing seeks to establish broad themes rather than quantitatively precise or absolute measures, and as such, findings are discussed in thematic patterns rather than percentages and counts. Specific implications for field test design are summarized at the end of each survey section. A list of the forms included in the cognitive interview can be found in Attachment 1 at the end of this document. In addition, readers may refer to Attachment V of OMB# 1850-0803 v. 267 for the complete BPS:20/22 cognitive survey instrument (https://www.reginfo.gov/public/do/DownloadDocument?objectID=100368601).


Enrollment Section

The enrollment section includes the following forms: B22ANENRLFT1, B22ANENRLPT1, B22ASTST2, B22ANENRLPT2, B22ANPOTHENR, B22AOTSCHENR, and B22APRSCHDB.

B22ANENRLFT1

Due to survey logic, 27 participants completed item B22ANENRLFT1. Participants were asked to report the months during which they attended their institution as a full-time student. The item has a graphic format wherein months are presented in individual boxes that change color from light gray to a darker gray when clicked on by a participant. Participants should have selected each month they attended as a full-time student, even if only for a portion of that month. Participants should not have selected months where they attended part-time, or did not attend at all due to breaks/holidays.

Accuracy and Clarity

Most students were able to accurately answer the item, selecting months they attended full-time in compliance with the survey instruction. Still, despite being able to work out the correct response, some were confused or delayed by the formatting of the item. These students could not find a button to select all months and/or felt that the breakdown by month made it difficult to align their answer with their institution’s structuring of the semester. When asked how to make this information easier to provide, one student who represented this opinion well stated, “Maybe if it just said which year did you attend full-time, or maybe semester. That would be easier. But thinking about the individual months, I kind of got confused.”

Some students did not select months where they attended for a week or less, only selected the first and last months of attendance, or only reported future attendance. Notably, students who only selected the first and last months realized their mistake when answering B22ANENRLPT1 and reported that they would go back and change their answer if taking the survey independently.

Confidence

Most students were either very confident or somewhat confident in their response. The student who reported being unconfident in their response cited the month-by-month breakdown as the source of their difficulty, suggesting the survey instead ask, “Which year/semester did you attend full-time?”

B22ANENRLPT1

Most students reported no part-time attendance in the survey and thus did not see this item. Due to survey logic, eight participants completed item B22ANENRLPT1. The item has the same graphic format as B22ANENRLFT1 wherein months are presented in individual boxes that change color from light gray to a darker gray when clicked on by a participant. Participants should have selected each month they attended as a part-time student, even if only for a portion of that month. Students should not have selected months where they attended under full-time enrollment or did not attend at all due to breaks/holidays.

Accuracy and Clarity

A majority of students were able to accurately answer the item, selecting months they attended part-time in compliance with the survey instruction. Still, the participant who completed this item on their smartphone struggled considerably with the formatting of the item’s instructions; on their device, the text was tightly packed together and lacked indentation. This participant was nonetheless able to respond accurately. The only participant who did not respond accurately failed to report a summer month in which they attended because they did not finish the courses. This participant reasoned, “…it doesn’t count on my transcript so I suppose it wouldn’t count here.”

Confidence

Most students were either very confident or somewhat confident in their response. The one student who reported being unconfident had answered this item accurately but had only selected the first and last months of their semesters on B22ANENRLFT1. The student felt that their confidence could be improved, and this issue avoided in the future, if the item was reworded: “Maybe if it stated, ‘How long was the semester that you attended or plan to attend?’ That way there can be more clarity that it wants you to select more than one month. What I inferred was since I attended an entire semester that I only need to select the month it began. But if it said, ‘Specify the months that your semester lasted,’ then I would've had more clarity that I need to select more than one month.”

B22ASTST2

All participants completed B22ASTST2. The item is a multiple-choice matrix where students were to report their enrollment intensity by academic year. For each academic year, the student was to click the bubble corresponding with one of the following categories: “full-time,” “part-time,” “mix of both full-time and part-time,” or “not enrolled.”

Accuracy and Clarity

Most students were able to accurately respond to this item. One student summed up the majority reaction to this item well: “Yeah, I feel like this is more representative of typical surveys I've seen. It makes sense that they would split it up like this, it is straightforward, easy to figure out. And it also takes into account not enrolled if you took a break or something like that; it takes into account a lot of situations.” Of the participants who provided accurate responses, one who saw this item on their smartphone and had struggled with the formatting of text on previous items reported that the number of commas and “and/or” statements was burdensome to them. They suggested rewording the second paragraph so that the enrollment intensity options (i.e., full-time, part-time, or a mix of both full-time and part-time) could be in parentheses as they are within this sentence. The student explained this would be easier because, “This way you have already planted a seed in my head that this is the information that I need to answer.” As probing continued, this participant began using their device in landscape mode, noting, “It is a lot easier and user-friendly.”

Defining “Mix of both full-time and part-time”

Most participants defined “mix of both full-time and part-time” as being enrolled full-time in one semester of the academic year and then part-time in the other semester of that same year. A few participants reported that this referred to beginning the semester with the amount of credits needed to be considered a full-time student but then dropping courses so that, by end of that semester, they are considered a part-time student. One participant could not provide an example, one participant reported this as being enrolled full-time for one semester and not enrolled at all in the next, and one participant felt this occurred when you’re enrolled in 16-week courses and 8-week courses at the same time.

Comparison: B22ANENRLFTA/B22ANENRLPTA vs. B22ASTST2

Students were reminded that they went through two versions of the items on enrollment status. After the interviewer explained the two versions, participants were to report if they preferred answering Version 1 (B22ANENRLFTA/B22ANENRLPTA) or Version 2 (B22ASTST2).

Preferences

Most participants preferred answering B22ASTST2. These participants generally reported that the visual layout of this item made it seem more concise and allowed them to think about and choose their answer more easily. Still, some students preferred the version with B22ANENRLFTA/B22ANENRLPTA.

When asked to identify which of the two versions was easiest to answer, participants generally felt the item they preferred was the easiest. This was the case for all but two participants who felt neither version was more or less difficult than the other.

Participants’ Recommendations

In addition, students had the opportunity to recommend other ways this information could be collected. Participants made the following recommendations:

  • Modifying B22ASTST2 and/or the calendar items so that participants choose their enrollment status by each semester (e.g., Fall 2017, Summer 2020). Representative of students who made this recommendation, one participant said, “I would probably put it as multiple-choice by semester. If you are doing college students, we are better off with Fall Semester, Summer Semester. Instead of having all these months and having to figure out what semester was what.”

  • Modifying B22ASTST2 to include either a label or time period for summer semesters on a separate line instead of asking about July 1st through June 30th in one fell swoop.

  • Modifying B22ASTST2 so that the survey asks about freshman year, sophomore year, etc. instead of or in addition to listing the time-period for each of these academic years.

  • Replacing the versions with a single item that has a calendar drop-down and asking participants “When did you start going full-time?” and “When did you end going full-time?”



B22ANPOTHENR

Almost all the participants answered item B22ANPOTHENR. This was a yes-no, multiple choice item which asked if students had pursued additional certificates or degrees at their postsecondary institution.

Comprehension

Of the participants who were asked to repeat back what the question was asking in their own words, all were able to restate the content of the item in a technically accurate manner. However, students defined “certificate program” differently when providing examples.

Ease or Difficulty

Most students reported the item was easy to answer. The remaining few participants experienced difficulty that stemmed from uncertainty about what should be included as “degree or certificate programs.”

B22AOTSCHENR

Most participants answered item B22AOTSCHENR. This item was a yes-no, multiple choice item which asked if students had attended any other college, university, or trade school since July 1, 2017.

Reliability

While participants did not experience any significant difficulties when answering this item, an issue of reliability was observed, as students in similar circumstances did not always provide similar answers to this item. Of the three students who studied abroad, one answered yes to having attended another college, university, or trade school and the other two answered no. One of the latter participants stated the following, which sums up all three participants’ confusion with the item: “For this one, I'm kind of 'yes' and 'no.' So, I did a study abroad where I was in exchange with another university, so I technically went to another school, but it was through US--I was still enrolled as a US student.”

B22APRSCHDB

Four participants answered item B22APRSCHDB. Students were asked to select the reason(s) why they considered a particular school to be their “main school” from a list of options and were also instructed to check all that applied.

Accuracy and Clarity

All students who answered this item were able to do so easily. These students indicated they considered that institution to be their main school because it was the school they were currently attending, the school where they had been enrolled the longest, the school where they were enrolled for a degree, and the most selective school they attended.

However, when asked to define their main school in their own words, students’ answers did not align perfectly with the available response options. Half the participants reported it was the school from which they would graduate, and the others reported it was either the school from which they were getting most of their degree credits or the school at which they were getting a four-year degree. Only one of the students utilized the ‘some other reason’ response option to specify their definition of main school.

Summary of Enrollment field test design implications

Cognitive feedback for the Enrollment section resulted in the inclusion of an experiment to test two approaches of collecting enrollment intensity, and the collection of additional debriefing information for the respondent identification of a “primary school” in the field test survey.

Cognitive participants were generally able to comprehend and provide the enrollment information similarly across both versions of enrollment intensity collection and there was no clear consensus on a preferred approach; therefore, two approaches of collecting enrollment intensity (i.e., single forced-choice grid question or a yes/no radio button gate question) will be tested with field test respondents to further investigate the best method of collecting enrollment data across multiple academic years and compare the cognitive burden. Field test respondents will be randomly assigned to one of the two versions of collecting enrollment intensity.

The identification of a student’s “primary school” or “main school” is important to BPS:20/22 because additional data elements are collected about a student’s primary school enrollment. Cognitive participants with multiple postsecondary school enrollments were requested to identify their main school. An insufficient number of cognitive participants fell into this category; thus, the field test survey will further explore how students define their primary school. The BPS:20/22 field test will feature two debriefing forms after the respondent selection of a primary school, to collect further information about the primary school identification.




Education Experiences Section

The education experiences section includes the following forms: B22BWBSHFAC, B22BFAMCOMM, B22BACDPART, B22BSOCIAL, B22BDIVPART, B22BDIVERSITY, B22BSRVUSE, B22BSRVIMP, and B22BNUMAPP.

B22BWBSHFAC

Most of the participants answered item B22BWBSHFAC. Students were to indicate the extent to which they agree or disagree with statements about the teachers they had contact with during their postsecondary education. This item was a matrix that included three statements and a Likert-type response scale that ranged from 1 (Completely disagree) to 5 (Completely agree).

Defining “a good command”

Every student was able to complete the item “Had a good command of what they were teaching” without issue. When asked to define “a good command” in their own words, students were able to answer in a way that indicated widespread comprehension. Students commonly explained that a teacher with a good command has experience in the field they are teaching, is able to answer student questions, makes sure that students understand the information, and does not simply read from the textbook. One student’s response encapsulates the most commonly used descriptors well: “I can walk into the classroom and ask them any question within means of what we are learning about and they can understand it. And it is not like they are reading from a presentation; they are using their own words and they are truly an expert in the field. It's different from high school when these high school teachers are reading from a book; they're actually experts in the field.”

B22BFAMCOMM

Almost all the participants answered item B22BFAMCOMM. Students were asked to report how often they communicate with five different groups of people. This item was a matrix which utilized the following scale, in this order: never, daily, weekly, monthly, a few times a year, and not applicable.

Accuracy and Clarity

While several students progressed through this item without issue, a majority of students experienced accuracy issues with it. Of these, most participants indicated that they would report different answers for siblings than they would for extended family, which are combined in the same statement in the survey. The opinion of students in this group was well represented by one student who stated, “Like I said, I just answered based on siblings but if I didn't have any siblings and it was just extended family, my answer would've been different for that question.” In a separate issue of accuracy, several more participants felt the provided response options could not capture the actual frequency of their communication. These latter participants selected weekly when the communication was several times a week and/or reported monthly when communication was a few times a month or every other week.

Ease or Difficulty

Almost all of the sample responded to interviewer probing about the ease or difficulty of answering this survey item. Of these, many students reported that it was easy to provide an answer about each of the different groups. Still, approximately one-third of students reported difficulty with providing answers. Several of these participants attributed their difficulty to one of the aforementioned issues of accuracy, recommending that siblings and extended family be divided into separate categories and/or that participants be allowed to choose “a few times a week” as a response. Two other participants had difficulty deciding whether or not social media interactions counted as “communication,” and a few more had trouble reporting because their frequency of communication varied widely over the course of the year, depending on whether or not they were in school. These students recommended including a definition for communication and rewording the item to emphasize that estimating was okay. The final few instances of difficulty involved one student who reported speaking to one sibling more frequently than the other and another participant who felt as though they were being “judged for not communicating with people” outside of their family.

B22BACDPART

Most participants answered item B22BACDPART. Students were asked to report whether they had participated in a list of academic activities. This item was a matrix that listed six different activities for which participants were supposed to select either yes or no.

Accuracy and Clarity

Some participants were unclear about the meaning of at least one listed activity. Most commonly, participants did not recognize ‘apprenticeship’ or ‘a learning community.’ Participants who were unfamiliar with activities recommended including examples of them in the survey. This type of recommendation was understandable as many participants indicated a reliance upon the examples as they thought aloud. Commentary in this regard is summed up well by one participant who stated, “Again, the parentheses really help get you there. If you just tell me a guided research experience, I am not sure what that is, but I did work as a research assistant for a professor though.” For apprenticeship, which already has examples in the survey, students recommended providing “an example of what it is as opposed to what it isn’t.”

Outside of the activities that were explicitly reported as unfamiliar, the data also revealed inconsistency among students who answered ‘yes’ for participation in a learning community. While not every student who selected yes for this activity explained their rationale, some of the students who selected yes for learning community reported that they happened to take more than one course with a few of the same people, and two participants who selected yes reported that they lived with a group of people with whom they were required to take classes. Not all of these participants reported this activity as unfamiliar to them.

Still, most of the participants who were asked reported being confident in their response to this item.

B22BSOCIAL

Most of the participants answered item B22BSOCIAL. Students were asked to report whether they had participated in a list of social activities. This item was a matrix that listed seven different activities for which participants were supposed to select either yes or no.

Clarity

Most participants recognized all the listed activities. The remaining participants did not immediately recognize at least one of the activities. Most of these students did not recognize the term “student affinity groups,” although many were able to work out the meaning using examples. Still, one participant reported their ballroom club as a student affinity group, indicating further explanation might be needed.

B22BDIVPART

Most participants answered item B22BDIVPART. Students were asked to report whether they had interacted with different groups of students outside of the classroom. This item was a matrix listing five categories of students for which students were supposed to select: yes, no, or don’t know.

Defining ‘interactions outside the classroom’

Participants did not experience any significant difficulties or issues with this item. When asked what they included as “interactions outside of the classroom,” participants most commonly included hanging out with friends or friends of friends and interacting with club members at club-related events. Participants also frequently reported including chatting with people they run into on campus, such as in the cafeteria, library, or dorms. Less frequently, participants included studying and working on group assignments, interacting with people at parties, at administratively organized campus events, sports games, and on the bus/train ride home.

B22BDIVERSITY

Nearly all the participants answered item B22BDIVERSITY. Students were asked to report how often they had meaningful and honest conversations with the groups they reported interacting with on the previous item. This item was a matrix which utilized the following scale, in this order: never, rarely, sometimes, often, always.

Accuracy and clarity

While participants did not experience widespread difficulty with this item, there were a few issues regarding the ‘always’ response option. A few of the students found it difficult to determine what an “always” response would mean in relation to this item. These students reported that it could mean any of the following: that you never stop talking to people in these groups, you have these conversations every day, or every interaction you are having is meaningful. Furthermore, while not every student who selected always explained how they were defining it, those who did provided a variety of feedback. Participants used always when their close friends fell into one of the groups, when they felt the appropriate response was “very often,” or because they could recall “a lot” of conversations for that demographic group.

Defining “meaningful and honest conversation”

All participants were able to provide accurate definitions of ‘meaningful and honest conversation.’ Participants emphasized that these conversations transcend small talk and involve getting to know another person’s personality, personal problems, or candid beliefs. One participant whose perception aligned with the opinion of the group stated, “Meaningful means it's more than just small talk, you're talking about something substantial. It could be any of these things, you could be talking about politics or religion. Or you are opening up to them about your personal feelings, about your problems.”

B22BSRVUSE

A majority of participants answered item B22BSRVUSE. Students were asked to report whether they had used any of the provided school services. This item was a matrix that listed 12 school services for which participants were supposed to select either yes or no to indicate usage.

Accuracy and Clarity

While there were no widespread issues with this item, there were a few instances of inconsistency regarding what students included as “use.” Of the two students who reported uncertainty about whether going to the gym counted as using “health center and services,” one decided that it did count and the other decided it did not. Of two students who reported taking their resumes into career services to be reviewed, one believed this counted as using “career planning or job placement assistance” and the other did not. Relatedly, one student felt that a meeting about interview preparation also did not count as using this service.

Of the students who reported on their recognition of the listed school services, a majority recognized every school service listed.

B22BSRVIMP

Most participants answered item B22BSRVIMP. For each service they reported using on the previous item, students were asked to indicate how important the service was in their decision to stay in school. This item was a matrix which utilized the following scale, in this order: not at all important, somewhat important, important, very important.

Ease or Difficulty

Most participants responded to this item without issue, reporting it was easy to provide their answer. However, other participants reported difficulty in providing a response. Half of the students who reported difficulty attributed it to the fact that they were asked about their “decision to stay in school.” These students had difficulty responding in terms of dropping out because they had not considered it and wondered if there was another interpretation they should have been responding to. As one of these students put it, “What’s confusing for me is that would you ‘stay in school.' Like would I not be in school without it or how important was it to my life living in college or what kept me in college?” The other half of students who reported difficulty with this item reported that the listed services were not reasons they think about when considering whether to remain in school. One of these participants stated, “Somewhat difficult because I haven’t really thought about how these specific things have related to why I stayed in school.”

B22BNUMAPP

Most participants answered item B22BNUMAPP. Students were asked to report the number of schools they applied to, excluding the NPSAS institution. This item was fill-in-the-blank but included an exclusive radio button through which students could indicate that they had only applied to the NPSAS institution.

Accuracy and Clarity

Three students used the radio button to report having applied to only one school. Despite instruction to exclude the NPSAS institution, some of the remaining participants explicitly reported that they included it in the number they reported. While this data point cannot be confirmed for each participant who saw this item, many other students explicitly reported that they did not include the NPSAS institution in their response, indicating inconsistency in how this item was answered. One student recommended rewording the item to say "Excluding [NPSAS institution], how many other..." instead of just asking “How many other colleges, universities, or trade schools did you apply to?”

Summary of Education Experiences field test design implications

Cognitive feedback from the Education Experiences section resulted in the clarification of language for two field test survey questions.

Cognitive participants experienced accuracy issues when reporting how often they communicate with five different groups of people (B22BFAMCOMM), primarily due to the combination of “Siblings and extended family” as a single response option and unclear instructions about the frequency scale. Given this feedback, the field test survey will separate “siblings” and “extended family” into two separate response options and additional language will be added that clarifies how to report frequency of communication on the question scale.

Cognitive testing results indicated that participants were unfamiliar with, and not able to accurately define, a “learning community” when asked to report on their participation in academic activities (B22BACDPART). This feedback illustrated the importance of providing a clearer description of a “learning community,” and this response option was modified for the field test survey to provide further clarification.



Financial Aid Section

The financial aid section contains the following forms: B22CEAIDAWARE and B22CEAIDAPPLY.

B22CEAIDAWARE

Almost all students answered item B22CEAIDAWARE. Students were asked if their school had an emergency aid program. This item was multiple choice, where students could select yes, no, or don’t know.

Comprehension

Most participants answered ‘don’t know’ to this item. These students included some who did not know what an emergency aid program was and some who knew what it was but did not know whether their institution had one. Including the latter individuals, most participants were able to accurately explain the meaning of ‘emergency aid program.’ Of the students who could not define the term accurately, most did not know enough to even make a guess. Students who did guess incorrectly confused emergency aid with public safety and ambulatory services or with academic intervention for poor grades.

Of the students who were asked how they distinguish emergency aid from other sources of aid, some distinguished it based on the predictability of the need for funding, reporting that emergency aid is not something you know you’ll need ahead of time; it’s something you ‘suddenly’ need. Some distinguished it by the source of the funding; these students considered that the money does not come through financial aid and includes contributions from students and staff. A few students distinguished the funding by knowing that emergency aid is based on student circumstances and is not merit-based or contingent upon GPA.

B22CEAIDAPPLY

Only the students who answered yes to B22CEAIDAWARE saw item B22CEAIDAPPLY. Students were asked if they had applied for emergency aid from their school. This item was multiple choice, where students could select either yes or no.

Of the few students who were administered this question, none had difficulty or issues with answering this item. No notable patterns emerged in this data given the limited administration.

Summary of Financial Aid field test design implications

Given the low rate of awareness regarding emergency aid programs by cognitive interview respondents, the field test will feature additional emergency aid questions to observe respondent experiences with emergency aid.





Background Section

The background section includes the following forms: B22FHHNUM, B22FHHWHO, B22FEVRHOML, and B22FHSCSAFE.

B22FHHNUM

Most participants answered item B22FHHNUM. Students were asked to consider their household at age 18 and report the number of people who lived there by type of relative/inhabitant. If they lived in more than one household during that year, students were supposed to report about the household that provided the most financial support. The item was formatted as a table. Each of the seven rows corresponded to a category of inhabitant, and students selected the appropriate number from a drop-down numbered list.

Accuracy and Clarity

While many participants experienced no issues of clarity, there were instances of inconsistency in how participants in similar situations reported their answers. Specifically, of the five students who wondered whether to report on their dorm or their parents’ home, one answered about their dorm, one answered about their parent’s home but later reported they should have answered about their dorm, and three answered about their parent’s home. Similarly, of the two students who questioned the formality of guardianship among older siblings, one decided to report their sibling as acting as their guardian and the other decided to not do so.

While most participants reported that none of the categories overlapped, a few participants did identify overlap. Two of these participants reported that “mothers or other female guardians” and its male equivalent could overlap with “grandparents acting as guardians.” It happened that both of these participants had a grandparent acting as their guardian and one of them included their grandmother in multiple categories. The other participant noted that “mothers or other female guardians” and its male equivalent could overlap with “others.” This student specified that an uncle who takes care of you like a father could still be reported in the “others” category.

Ease or Difficulty

A majority of participants reported that none of the categories were difficult to answer. The few participants who reported difficulty attributed it to deciding whether an older sibling could be a guardian or to recalling whether a relative moved in before or after their 18th birthday. One participant reported it was difficult to decide whether to report on their dorm or parent’s home, and another participant reported it was difficult to decide if their twin should be reported as a brother or other relative.

Defining “others”

The people most frequently cited as “others” were aunts, uncles, cousins, and nephews. These were followed in frequency of mentions by friends of the family and non-relative tenants or roommates. All participants who were asked were able to think of at least one person who would fall into this category.



B22FHHWHO

Most of the participants answered item B22FHHWHO. Students were asked to select the response that best described their living situation at age 18. If they lived in more than one household during that year, students were supposed to think about the household that provided the most financial support. This item was a single multiple-choice question where students were to select one of the following three responses: living with one parent or guardian, living with two parents or guardians, or not living with parents or guardians.

Accuracy

Students had no widespread difficulty or issues with answering this item, nor did any notable patterns emerge. Most participants reported living with two parents or guardians. Of the participants who reported not living with any parent or guardian, most were answering about living in a dorm and one had been kicked out of their parent’s home.

Comparison: B22FHHNUM vs. B22FHHWHO

Most participants reported that B22FHHWHO was the easier of the two to answer, although many attributed this to length and not clarity. As one participant stated, “I'd say obviously this one was easier to answer because it required less information. It took less time because there is just one question there. But I would say the previous one was easy too. I'd say they're pretty comparable it's just the length is different.” Only a few felt B22FHHNUM was easier to answer.

B22FEVRHOML

Most participants answered item B22FEVRHOML. Students were asked if they had slept in any of a variety of places because they had nowhere else to go. Participants were to exclude instances where they slept in these places on vacation or business trips. This item was a matrix which listed seven different locations for which participants were to select either yes or no to indicate sleeping there.

None of the participants experienced any notable difficulties or issues with answering the item and no trends emerged from the data. None of the participants reported sleeping in any of the listed places.

B22FHOUSEC – B22FHSCSAFE

Most participants proceeded through this section of items. The items asked the students to consider the past 12 months and report on various metrics of housing security.

Clarity

Of the participants who responded to interviewer probing about the presence of unfamiliar terms or questions in this section of items, a majority reported that nothing was unfamiliar to them. One participant expressed confusion regarding the following sub-items: “received a summons to appear in housing court” and “had an account default or go into collections.” For the latter, the student questioned if they should include accounts such as those for credit cards in addition to housing-related accounts.

Difficulty

Many participants reported having experienced no difficulty as they navigated the items. Still, several participants expressed some difficulty in deciding what to include as a “move” on item B22FNUMMOVE, and another reported that it was sometimes difficult to know the answers to items because their parents handled most of the finances. Regarding item B22FNUMMOVE, there is evidence that students handled this inconsistently, as one participant stated the following: “I wasn't sure whether to include moving from living at home to living at a dorm which is, by definition, not permanent. And I did not have to register anywhere as having changed addresses. I didn't think that would count as moving but I'm not positive on that one.”

Summary of Background field test design implications

Cognitive feedback for the Background section identified the least burdensome household structure question and informed the design of housing security questions for the field test survey.

The field test survey will collect household structure at age 16 in the simplified format (B22FHHWHO), as the majority of cognitive participants reported B22FHHWHO was easier to answer and comprehend, compared to the more detailed B22FHHNUM, which some participants indicated had overlapping or unclear categories.

Most cognitive participants had no difficulty answering the new set of housing security questions to be featured in the field test survey, except for the number of moves (B22FNUMMOVE). Some participants had difficulty understanding what should be included in a “move.” Given the importance of this data point in identifying housing-insecure students, the field test question wording will be modified to provide additional clarification on the types of “moves” to include. Additionally, the field test will include a follow-up question for those who report 4 or more moves in the year. The data resulting from the field test will help inform the full-scale collection of this information.





Employment Section

The employment section contains the following forms: B22DWKHREN01, B22DWKHRS01, and B22DREFPKLTDB.

B22DWKHREN01

Most students answered item B22DWKHREN01. Students were asked to report the number of hours they usually worked per week while also attending school. This item was a fill-in-the-blank question.

Confidence

Students had no difficulty or issues answering this item, and no notable patterns emerged in the data regarding confidence. A large majority of participants reported confidence in their answer; these were evenly split between somewhat confident and very confident. The one participant who reported being very unconfident in their response worked for their employer on an as-needed basis.

B22DWKHRS01

Most students answered item B22DWKHRS01. In this item, students were asked to report the number of hours they usually worked per week while not attending school. This item was a fill-in-the-blank question.

Confidence

Students had no difficulty or issues answering this item, and no notable patterns emerged in the data concerning confidence. The same majority of participants who reported confidence on B22DWKHREN01 also reported being confident in their answer for this item. Likewise, the participant who was very unconfident in their answer for B22DWKHREN01 was, for the same reason, also very unconfident in their response to this item.

B22DREFPKLTDB

Most students answered item B22DREFPKLTDB. Students were asked to report why, from a list of possible responses, they considered a particular job to be their main employer, and were instructed to select all that apply. Participants had the opportunity to specify their own reason by selecting “some other reason” as a response.

Defining “main employer”

When asked to define “main employer” in their own words, students provided the following descriptors, in descending order of frequency: a) the job that pays them the most; b) the job they had worked at the longest; c) the job they currently worked at the most; and d) the job they simply considered to be their “main” job. Only the reasons related to employment duration (b) and recency of employment (c) were available as response options for B22DREFPKLTDB. Even though students frequently shared “main employer” definitions during the interview that did not fit any of the available response options, they generally did not select the “Some other reason: please specify” option to add them to the survey.

Summary of Employment field test design implications

Cognitive participants provided additional definitions for their “main employer,” which informed a comprehensive set of response categories on the “main employer” debriefing forms for the field test survey.



Debriefing Section

While most students had the opportunity to offer general feedback via the debrief, fewer than half of them provided unique, usability-related commentary. During the debrief, the most common feedback was that the survey overall was easy to answer and understand. The next most frequent comments cautioned against the amount of text that respondents must read in the survey. A few students’ comments concerned the survey description they were provided; they were surprised by the content and were unsure what the data would be used for. Finally, one student mentioned that, at the time of the survey, they had already been living in their parent’s home for more than 30 days. The student noted that housing or food security status can change when a student goes home from campus, and suggested that the survey explicitly instruct students on how to account for the coronavirus pandemic in their answers.





Attachment 1



List of all cognitive interview survey items

Section name | Form name | Label
Enrollment | B22AINTRO | Survey introduction
Enrollment | B22ASAMESCH | Attended NPSAS institution after Year 1
Enrollment | B22ASAMEDEG | Continued enrollment at NPSAS institution after Year 1 for base year enrollment
Enrollment | B22ACURENR | Currently enrolled at NPSAS institution
Enrollment | B22ACMPDGN | Completed base year degree at NPSAS institution
Enrollment | B22ADGN | Date completed base year degree at NPSAS institution
Enrollment | B22ANENRLFTA | Version 1: Attended NPSAS institution full-time in years 1-3
Enrollment | B22ANENRLFT1 | Version 1: Months attended NPSAS institution full-time in years 1-3
Enrollment | B22ANENRLPTA | Version 1: Attended NPSAS institution part-time in years 1-3
Enrollment | B22ANENRLPT1 | Version 1: Months attended NPSAS institution part-time in years 1-3
Enrollment | B22ASTST2 | Version 2: Enrollment intensity at NPSAS institution in years 1-3
Enrollment | B22ANENRLFT2 | Version 2: Months attended NPSAS institution full-time in years 1-3
Enrollment | B22ANENRLPT2 | Version 2: Months attended NPSAS institution part-time in years 1-3
Enrollment | B22ANPOTHENR | Additional enrollment at NPSAS institution in Years 1-3 other than base-year enrollment
Enrollment | B22AOTSCHENR | Enrollment at any other school besides NPSAS institution in Years 1-3
Enrollment | B22AOTSCH01 | Other school 1: School coder
Enrollment | B22AOTDEGREE01 | Other school 1: Degree or certificate type
Enrollment | B22AOTNENRL01 | Other school 1: Months attended
Enrollment | B22AOTCURENR01 | Other school 1: Currently attending
Enrollment | B22AOTCMPDGN01 | Other school 1: Completed degree/certificate requirements
Enrollment | B22AOTDG01 | Any additional enrollment at [other school] in Years 1-3
Enrollment | B22AOTENR01 | Any additional enrollment at any other schools besides [other school] in Years 1-3
Enrollment | B22APRSCHLST | [primary school] pick list
Enrollment | B22APRSCHDB | [primary school] pick list debriefing
Education Experiences | B22BEDEXPINT | Education experiences introduction
Education Experiences | B22BWBSHFAC | Teacher effectiveness
Education Experiences | B22BFAMCOMM | Frequency of communication with family and friends outside of [primary school]
Education Experiences | B22BACDPART | Participation in academic activities at [primary school]
Education Experiences | B22BSOCIAL | Participation in student social groups at [primary school]
Education Experiences | B22BDIVPART | Had interactions outside of class with diverse students at [primary school]
Education Experiences | B22BDIVERSITY | Frequency of interactions with diverse students at [primary school]
Education Experiences | B22BSRVUSE | Used school services in [primary school] academic year
Education Experiences | B22BSRVIMP | Importance of school services used in NPSAS academic year
Education Experiences | B22BNPATND | Attend NPSAS again
Education Experiences | B22BNUMAPP | Number of institutions applied
Financial Aid | B22CEMERINT | Emergency aid introduction
Financial Aid | B22CEAIDAWARE | Aware of emergency aid programs at [primary school]
Financial Aid | B22CEAIDAPPLY | Applied for emergency aid at [primary school]
Financial Aid | B22CEAIDRCV | Received emergency aid from [primary school]
Employment | B22DINTRO | Employment introduction
Employment | B22DANYJOB | Worked for pay in years 1-3
Employment | B22DEMPLOY01 | Employer 1: name
Employment | B22DWRKMON01 | Employer 1: months worked in years 1-3
Employment | B22DEMPCUR01 | Employer 1: currently employed
Employment | B22DEARN01 | Employer 1: earnings
Employment | B22DWRKENR01 | Employer 1: worked while enrolled
Employment | B22DWKHREN01 | Employer 1: hours worked while enrolled
Employment | B22DWRKNEN01 | Employer 1: worked while not enrolled
Employment | B22DWKHRS01 | Employer 1: hours worked while not enrolled
Employment | B22DOTHEMP01 | Additional employment
Employment | B22DREFPKLST | Reference employer picklist
Employment | B22DREFPKLTDB | Reference employer picklist debriefing
Employment | B22DOCC | Reference employer: occupation
Background | B22FHFINTRO | Food and housing security introduction
Background | B22FMEALPLN1 | Meal plan at [primary school]
Background | B22FMEALPLN2 | Meal plan covers 11 or more meals a week
Background | B22FUSDAHH | Food bought didn't last and couldn't afford to eat balanced meals in the last 30 days
Background | B22FUSDAAD1 | Ever cut the size of meals or skip meals in the last 30 days
Background | B22FUSDAAD1A | Frequency of cutting the size or skipping meals in the last 30 days
Background | B22FUSDAAD2 | Ever eat less than you felt you should in the last 30 days
Background | B22FUSDAAD3 | Ever hungry but didn't eat in the last 30 days
Background | B22FUSDAAD4 | Lost weight because there wasn't money for food in the last 30 days
Background | B22FUSDAAD5 | Ever not eat for a whole day in the last 30 days
Background | B22FUSDAAD5A | Number of days did not eat for whole day in the last 30 days
Background | B22FEVRHOML | Places slept in the last 30 days
Background | B22FHOUSEC | Housing security measures in the last 12 months
Background | B22HSCINC | Rent or mortgage increase made it difficult to pay
Background | B22FNUMMOVE | Number of times moved
Background | B22HSCSAFE | Left household because felt unsafe
Background | B22FHHNUM | Household composition at age 18
Background | B22FHHWHO | Family structure at age 18
Background | END | End of survey



1 Refer to OMB# 1850-0803 v. 267 Attachment V for the BPS:20/22 cognitive survey instrument: https://www.reginfo.gov/public/do/DownloadDocument?objectID=100368601

