
2019–20 NATIONAL POSTSECONDARY STUDENT AID STUDY (NPSAS:20)



Appendix F

Qualitative Testing Summary for Student Records



OMB # 1850-0666 v.25

Submitted by

National Center for Education Statistics

U.S. Department of Education




August 2019



Tables

Table 1. Qualitative testing participants, by institution and individual characteristics, 2019

Figures

Figure 1. Adobe Connect software

Figure 2. Example of participant’s screen share

Figure 3. Remedial course-taking item

Figure 4. Tuition charged item

Figure 5. PDP homepage and study icons

Figure 6. PDP Task Menu

Figure 7. PDP Help Menu

Figure 8. Enrollment List Page

Figure 9. Excel Mode page

Figure 10. Example of Enrollment tab with help text

Figure 11. Excel template column headers

Figure 12. Excerpt from Instructions Tab of CSV File specifications

Figure 13. Critical Items in CSV File Specifications

Figure 14. Web Mode "Check for Errors" and "Finalize" buttons



This appendix summarizes the results of qualitative testing conducted in preparation for the 2019–20 National Postsecondary Student Aid Study (NPSAS:20) institution data collection. This testing included virtual focus groups with institution staff and usability testing of the Postsecondary Data Portal (PDP) website. Full details of the pretesting components were approved in December 2018 and March 2019 (OMB# 1850-0803 v.243 and 247). The appendix first presents the participant sample and key findings, then describes the study design, including recruitment procedures, methods for conducting the virtual sessions, and data processing and analysis. Detailed findings from the focus group sessions follow, and the appendix closes with detailed findings from usability testing.

Qualitative Testing Summary

The 2019–20 National Postsecondary Student Aid Study (NPSAS:20), conducted by the National Center for Education Statistics (NCES) within the U.S. Department of Education, collects student data directly from postsecondary institutions. To improve the quality of the data collected and reduce the burden on institution staff completing the data request, RTI International, on behalf of NCES, contracted with EurekaFacts to conduct virtual focus group sessions and usability testing interviews with institution staff who are responsible for completing the NPSAS institution data request via the Postsecondary Data Portal (PDP).

In general, the focus groups and usability testing sessions addressed the following topics:

  • Timing of the data request;

  • Instructions and resources provided to institution staff;

  • Content of the data collection instrument and the PDP website;

  • Navigability of the PDP website;

  • Ease of finding needed information; and

  • Ease of providing data via the offered modes (Web, Excel, or CSV).

Participant Sample

Participants for focus group and usability testing sessions were drawn from a list of individuals who completed the most recent round of NPSAS data collection, the 2017–18 NPSAS Administrative Collection (NPSAS:18-AC). Participants were recruited to obtain feedback across a variety of institution sizes, institution sectors, and staff roles or departments within the institution. Twenty individuals participated in one of seven focus groups, and eighteen individuals participated in one-on-one usability interviews. See table 1 for details about participant characteristics.



Table 1. Qualitative testing participants, by institution and individual characteristics, 2019

Participant characteristics                                                    Focus groups   Usability testing
                                                                               (n = 20)       (n = 18)
Institution size
  Small (< 100 students)                                                            8               7
  Medium (100–200 students)                                                         3               8
  Large (> 200 sample students)                                                     9               3
Institution type
  Private, for-profit, 4-year                                                       1               2
  Private, not-for-profit, 4-year                                                   4               8
  Public, 4-year                                                                    6               4
  Public, 2-year                                                                    9               4
Participants’ university department
  Institutional Research/Institutional Effectiveness/Institutional Planning       10               5
  Financial Aid                                                                     5               6
  Registrar/Enrollment Management/Student Services                                  4               2
  Bursar/Student Accounts/Student Finance                                           1               1
  Information Technology/Data Management                                            0               4
Participants’ years of experience working in postsecondary education
  Less than 1 year                                                                  0               0
  1–4 years                                                                         3               1
  5–10 years                                                                        6               4
  10–15 years                                                                       4               3
  15–20 years                                                                       2               4
  20+ years                                                                         5               6



Key Findings

This section summarizes the key findings of the focus groups with institution staff and the usability testing of the PDP website.

Timing of the Request. Two common themes emerged from participants’ discussion of the timing of the NPSAS data collection request:

  1. The introduction letter sitting in another staff member’s office for some time before being forwarded to the appropriate staff, leaving less time to complete the data request; and

  2. The request arriving during a busy time of year, when institutions have other data requests to fulfill.

Reference Period/Term Structure. The reference period for NPSAS:20 is July 1, 2019 through June 30, 2020 (i.e., “the NPSAS year”), which aligns with the federal financial aid year. For NPSAS, institutions are asked to report students’ enrollment status for each term in the NPSAS year, along with the tuition charged, budgeted costs of attendance, and financial aid received for this period. Participants noted that the July 1 through June 30 NPSAS year presented challenges because the period tends to cross two academic years for most institutions. This overlap led to confusion about which summer terms to include, depending on the institution’s term structure. Most participants reported that their institution uses a traditional semester-based term structure; some consider summer terms to be part of the same academic year as the following fall/winter term (referred to as a “leading summer” or “summer header”), while others consider summer terms to be part of the same academic year as the preceding spring term (referred to as a “trailing summer” or “following summer”). As a result, the data reported for NPSAS, which requests both years’ summer terms, did not align with the institutions’ defined academic years. Participants from these term-based institutions would prefer to report by term rather than by the current NPSAS year reference period.

Instructions/Functionality. Participants were largely able to understand the different mode options for providing the data and the conditions on which the mode recommendations are based. Most participants selected either the Excel mode or the CSV mode and generally had minimal difficulty completing the task. For the most part, participants reported that the instructions for uploading data in both the Excel and CSV modes were clear and easy to understand. One area of confusion was the list of critical items and when conditionally critical items (those that are critical only based on responses to other items) should be reported. Some Excel mode participants reported usability issues related to the fact that the template is locked and some formatting functions (e.g., sorting and filtering) are restricted, although these participants also said they understood the need to lock the template.

Error Messages. A few issues related to error messages were identified by the participants, including:

  1. Both Excel and CSV mode users reported issues with error messages when leaving fields blank because they did not have that data. This was a particular challenge for participants from large institutions who reported that searching through the “hundreds” of error messages looking for critical errors was difficult and time consuming.

  2. Excel users expressed a desire for a way to accurately validate their data file before uploading it to the PDP website. One of the participants advocating for this change in the review process needed to upload the file “five or more times before it was finally accepted.”

  3. Some participants mentioned having uploaded files that were “accepted with errors,” which they found confusing.

Budget. Some participants reported difficulty providing data for the budget section, particularly for students who did not apply for aid and therefore did not have an individualized budget. Participants reported that preparing a budget for these students can be a manual, and therefore time-consuming, process.

Financial Aid. Participants reported some confusion regarding the NPSAS year reference period and which financial aid awards should be reported. Participants reacted positively to the proposed changes to (1) collect financial aid by term rather than by the NPSAS year reference period and (2) collect financial aid disbursed. Some participants noted that reporting individual awards is time-consuming, preferring to report a lump sum of aid received by the student.

Ease of Finding Needed Information. The instructions provided for completing the different pages and submitting the data appear to satisfy the needs of most participants. Participants appreciated the Contact Materials and Resources links in the Help Menu and consistently reported the various links were useful and helpful to them. Some participants felt that the Announcements link should be more pronounced or indicate that new announcements have been added.

Content and Navigability of the Website. Participants were able to move through the PDP easily and reported that the overall organization of the website was logical. Participants especially appreciated that the Task Menu page shows which tasks are completed and which are remaining. In addition, although many participants reported they would not usually watch instructional videos unless they had questions, several indicated the videos presented useful information and they liked the short length.

Student Enrollment List. Participants reported that the Student Enrollment List instructions were straightforward, easy to follow, and unproblematic; however, some participants noted areas of confusion on the Student Enrollment List page of the PDP.

Study Design

Recruitment and Screening

Participants for both focus groups and usability testing sessions were recruited from a list of institution staff who participated in NPSAS:18-AC. To be eligible to participate, participants had to be employed at an institution that participated in NPSAS:18-AC, responsible for providing some or all of the data for NPSAS:18-AC, and comfortable speaking in front of a team of researchers or other postsecondary educational professionals.

For focus groups, institution staff were divided into four categories based on their department at the institution and/or the sections of the NPSAS:18-AC student records collection they were responsible for completing:

  • Bursar/Student Accounts/Student Finance

  • Financial Aid

  • Institutional Research, Institutional Effectiveness, Institutional Planning

  • Registrar/Enrollment Management/Student Services

For usability testing, the mode used to provide data on the Postsecondary Data Portal (PDP) during NPSAS:18-AC was also considered in order to capture a variety of experiences using the three different modes (CSV mode, Excel mode, and Web mode).

All recruitment materials, including but not limited to initial outreach communications, Frequently Asked Questions, reminder and confirmation emails, and informed consent forms, underwent OMB approval (OMB# 1850-0803 v. 243). Recruitment materials were distributed through email using the contacts provided by RTI. Respondents first received a recruitment email before receiving a follow-up phone call from EurekaFacts staff.

To ensure that respondents met the qualification criteria, all potential participants completed a screening survey and were then screened by EurekaFacts staff over the phone. During screening, all participants were provided with a clear description of the research, including its burden, confidentiality, and an explanation of any potential risks associated with their participation in the study. Eligible participants were scheduled at the time of the screening call.

Participants were recruited to vary by institution size, institution sector, and years in postsecondary education as much as possible. All participants were offered a $50 Amazon e-gift card as a token of appreciation for their efforts, although not all participants accepted the incentive.

To ensure maximum “show rates,” participants received a confirmation email that included the date, time, a copy of the consent form, and directions for participating in the virtual session. All participants received a follow-up email confirmation and a reminder telephone call at least 24 hours prior to their session to confirm participation and respond to any questions.

Data Collection Procedure

All focus group and usability testing sessions were conducted virtually using Adobe Connect software. Data collection followed standardized policies and procedures to protect participants’ privacy, security, and confidentiality. Upon arrival to the virtual Adobe Connect meeting, participants were welcomed and instructed to set up their microphones and speakers. Written consent was obtained via email prior to the session. The consent forms (which include participants’ names) were stored separately from participants’ data and were secured for the duration of the study.

Prior to each session, a EurekaFacts employee created an Adobe Connect meeting with a unique URL. All participants were sent a confirmation email that included the unique link for the Adobe Connect meeting (see figure 1). In order to allow enough time for technological set-up and troubleshooting, participants were requested to enter the Adobe Connect meeting room fifteen minutes prior to the start time of the session. When this time arrived, the moderator adjusted the security of the virtual room to allow participants to enter. Participants were greeted by a welcome slide that instructed them to connect their speaker, microphone, and webcam to the Adobe Connect software. However, participants were directed to keep their microphones muted and webcams off until it was time for the session to begin.

At the scheduled start time of the session, participants were instructed to turn on their microphones and webcams before being formally introduced to the moderator. Participants were then reminded they were providing feedback on the NPSAS:18-AC student records data collection and reassured that their participation was voluntary and that their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law. Participants were also informed that the session would be recorded.

At the end of each session, participants were thanked and informed when to expect the virtual $50 Amazon e-gift card. The recording of the session was then terminated and the Adobe Connect meeting ended to prevent further access to the virtual room.

Focus Group Procedure

EurekaFacts conducted seven 90-minute virtual focus groups between April 30 and May 30, 2019. Six focus groups were arranged by category so that each group was administered only the probes relevant to the sections of data collection they completed. The final “make-up” group included participants from multiple categories. The Financial Aid and Bursar/Student Accounts/Student Finance categories were combined into the same sessions because of the similarities between the two roles in NPSAS:18-AC. The focus group sessions were organized as follows:

  • Financial Aid and Bursar/Student Accounts/Student Finance (FA) (3 sessions)

  • Institutional Research, Institutional Effectiveness, Institutional Planning (IR/IE) (2 sessions)

  • Registrar/Enrollment Management/Student Services (R) (1 Session)

  • Make-up session (multiple categories) (1 Session)

Focus group sessions were conducted using the OMB-approved moderator guide. The moderator guide was used to prompt discussion of staff members’ experiences completing the different data collection tasks in addition to focusing on the NPSAS Help Desk, methods of communication, and proposed data feedback report. While the moderator relied heavily on the guide, focus group structure remained fluid and participants were encouraged to speak openly and freely. The moderator used a flexible approach in guiding focus group discussion because each group of participants was different and required different strategies to produce productive conversation.

Topics for each session varied based on participants’ department at the institution. The following topics were discussed in the focus groups:

  • Topic 1: Enrollment List Collection (R and IR/IE groups)

  • Topic 2: Student Records Role Assignments (all groups)

  • Topic 3: Institution Information (R and IR/IE groups)

  • Topic 4: Student Records Modes (all groups)

  • Topic 5: Enrollment (R and IR/IE groups)

  • Topic 6: Budget (FA groups)

  • Topic 7: Financial Aid (FA groups)

  • Topic 8: Help Desk & Other Resources (all groups)

  • Topic 9: Giving Back (all groups)

  • Topic 10: Communication (all groups)

  • Topic 11: Closing (all groups)

The goals of the discussion were (1) to gain insight about the participants’ experiences in order to identify problematic processes, specifications, questions, terms, and/or resources and to determine how respondents coordinate data collection at their respective institutions; (2) to gauge how easily participants are able to enter student records data across the PDP modes; (3) to explore participants’ interest in receiving a data feedback report and/or a follow-up “thank-you” for participating in data collection; and (4) to review the methods used by NCES to communicate information and instructions about the study.

Usability Testing Procedure

EurekaFacts conducted eighteen 90-minute virtual usability testing interviews between May 7 and June 17, 2019. Participants were asked to work with an interviewer to complete several tasks on the Postsecondary Data Portal (PDP) website, share their personal experiences with the website, and provide their impressions and suggestions on the usability of the PDP website and its tools.

Interview sessions were conducted using an OMB-approved interview protocol. When prompted by the interview protocol, a trained EurekaFacts interviewer requested that the participant share their screen before using the web link feature within Adobe Connect to launch the PDP website in the participant’s browser (see figure 2). The participant was also given a moment to clear their screen of any personal information prior to accepting the screen share request. After launching the PDP website, the interviewer shared a unique login with the participant to access the PDP prototype for the individual’s institution.

The interview protocol was used to prompt discussion of participants’ experiences with

  • using the PDP website to complete tasks;

  • using the PDP website’s data entry features to provide student records data in either CSV, Excel or Web mode;

  • utilizing and understanding the resources provided by the PDP website; and

  • navigating the PDP website.

During the interview, the interviewer asked the participant to complete several tasks, such as registering their institution and reviewing various features of the website. The interviewer also asked the participant which sections of the PDP tasks they had previously completed in order to focus on the sections most relevant to the participant and their responsibilities. Participants were also asked to select a PDP mode to evaluate and to practice providing student records data. Each mode (CSV, Excel, or Web) had a unique set of questions to evaluate the usability of the data entry features. While the interviewer relied heavily on the protocol, the interviewer took a flexible approach to guiding the interview because each participant had different experiences using and navigating the PDP, which required different strategies to produce a productive dialog and encourage the participant to speak openly and freely.

Some topics were not administered for all interviews based on the personal responsibilities of individual participants. The following topics and sections were included in the interview protocol:

  • Topic 1: Introductions & Homepage

  • Topic 2: Study Icons

  • Topic 3: Task Menu

  • Topic 4: Resources

  • Topic 5: Institution Registration Page

  • Topic 6: Student Enrollment List

  • Topic 7: Student Records – Mode Selection

    • 7a: Excel Mode

    • 7b: CSV Mode

    • 7c: Web Mode

  • Topic 8: Debriefing

The goals of the interview were (1) to gain insight about participants’ perceptions of the appearance, functionality, and clarity of the homepage and task menu; (2) to assess the usability and functionality of the help menu resources; (3) to review how easily participants can complete institution registration and the student enrollment list and select a mode for student records; (4) to gauge how easily participants are able to enter student records data across the PDP modes; (5) to identify problematic processes, specifications, questions, terms, and/or resources; and (6) to evaluate which features throughout the website are most helpful for institution staff.

Coding and Analysis

The focus group and usability testing sessions were audio and video recorded using Adobe Connect’s record meeting function. After each session, a coder utilized standardized data-cleaning guidelines to review the recording and produce a datafile containing a high-quality transcription of each participant’s commentary and behaviors. Completely anonymized transcriptions tracked each participant’s contributions from the beginning of the session to its close. As the first step in data analysis, coders’ documentation of focus group sessions into the datafile included only records of verbal reports and behaviors, without any interpretation.

Following the completion of the datafile, it was reviewed by two reviewers. One reviewer cleaned the datafile by reviewing the audio/video recording to ensure that all relevant contributions, verbal or otherwise, were captured. In cases where differences emerged, the reviewer and coder discussed the participants’ narratives and their interpretations thereof, after which any discrepancies were resolved. The second reviewer conducted a spot check of the datafile to ensure quality and final validation of the data captured.

Once all the data was cleaned and reviewed, research analysts began the formal process of data analysis. In doing so, these staff looked for major themes, trends, and patterns in the data and took note of key participant behaviors. Specifically, analysts were tasked with identifying patterns within and associations among participants’ ideas in addition to documenting how participants justified and explained their actions, beliefs, and impressions. Analysts considered both the individual responses and the group interaction, evaluating participants’ responses for consensus, dissensus, and resonance. Once all the data was cleaned and reviewed, it was analyzed by topic area using the following steps:

  1. Getting to know the data – Several analysts read through the datafile and listened to the audio/video recordings to become extremely familiar with the data. Analysts recorded impressions, considered the usefulness of the presented data, and evaluated any potential biases of the moderator.

  2. Focusing on the analysis – The analysts reviewed the purpose of the session and research questions, documented key information needs, focused the analysis by question or topic, and focused the analysis by group.

  3. Categorizing information – The analysts identified themes, trends, or patterns.

  4. Developing codes – The analysts developed codes based on the emerging themes to organize the data. Differences and similarities between emerging codes were discussed and addressed in efforts to clarify and confirm the research findings.

  5. Identifying patterns and connections within and between categories – Multiple analysts coded and analyzed the data. They summarized each category, identified similarities and differences, and combined related categories into larger ideas/concepts. Additionally, analysts assessed each theme’s importance based on its severity and frequency of reoccurrence.

  6. Interpreting the data – The analysts used the themes and connections to explain findings and answer the research questions. Credibility was established through analyst triangulation, as multiple analysts cooperated to identify themes and to address differences in interpretation.

Limitations

Qualitative research seeks to develop insight and direction, rather than obtain quantitatively precise measures. The value of qualitative testing is demonstrated in its ability to provide unfiltered comments from a segment of the targeted population. While this research cannot provide definitive answers, the sessions can play a large role in gauging the usability and functionality of the online survey, as well as identifying any consistently problematic survey items and response options.

The key findings of this report were based solely on notes taken during and following the focus group and usability testing sessions. Note that not all probes were administered to every participant, and even when probes were administered, not every participant responded to every probe due to time constraints and the voluntary nature of participation. Moreover, focus groups are prone to social desirability bias, in which some participants agree with others simply to “be accepted” or “appear favorable” to others. While it is impossible to prevent this, the EurekaFacts moderator instructed participants that consensus was not the goal and encouraged participants to offer different ideas and opinions throughout the sessions.

Findings: Focus Groups with Institution Staff

This section presents detailed findings from the focus groups with institution staff.

Topic 1: Enrollment List Collection

Participants were asked to discuss their experiences providing their institution’s student enrollment list for NPSAS:18-AC (a list of eligible students enrolled at their institution between July 1, 2017, and April 30, 2018). This topic was presented to focus group sessions that included Institutional Research and Registrar staff who would be involved in preparation of the enrollment list, totaling 9 participants.

Participants did not experience major difficulties preparing and submitting their student enrollment list but noted that the process was “time consuming.” When asked to elaborate on approximately how much time it took to complete the task, responses ranged from “a few hours” to roughly “a week or so.” Common explanations included having to pull separate information from the database and then combine it, ensuring it was the correct information needed, and certifying that the information was correctly recoded and formatted. One participant noted that they were fairly new to their role as an Institutional Research Analyst, so it took time to become familiar with the process.

One participant noted that it is “tricky” to mark high school completion status because their institution, a public two-year technical school, does not track this information as some programs do not require it.

Topic 2: Student Records Role Assignments

This section addresses how roles and responsibilities are assigned to complete the student records section at each institution. Participants were asked to describe their role in providing data for the components of the student records data request: institution term-structure and student-level data in four content sections (General Student information, Enrollment, Budget, and Financial Aid).

Section Completion

Participants were first asked whether they completed all sections by themselves or received additional help to complete sections of NPSAS:18-AC. Of those participants who responded (n = 12), ten indicated that the sections were completed by multiple people, while only two reported being the only person to complete all sections. In cases where the sections were completed by multiple people, the other departments involved included the Financial Aid and Registrar’s offices. All those who were asked to elaborate about their experience working with different departments and staff members described having a good working relationship with other departments in general and reported similarly positive experiences for the NPSAS:18-AC data collection. When asked to estimate the amount of time involved in completing the student records request, responses ranged from more than a day to three weeks.

Challenges with Retrieving Data

Participants were asked to report any challenges they experienced retrieving data. The most common challenges mentioned by participants were as follows:

  1. Miscommunication or delayed communication among different departments and staff,

  2. Lack of access to requested data, and

  3. Timing of when the request is received.


"The Introduction Letter sat in the President’s office for many, many weeks before it was even brought to anybody’s attention…Once we heard of it, we were already behind schedule.”

Miscommunication or delayed communication among staff was the most frequently referenced challenge among participants. For example, one participant who noted experiencing communication issues explained that the NPSAS introduction letter was received by their institution’s President rather than those who complete the data request. They explained, “the introduction letter sat in the President’s office for many, many weeks before it was even brought to anybody’s attention…Even though the President does need to be aware of it, maybe the original letter should go to the actual people that do the survey so that they can be prepared.” The participant further explained, “Once we heard of it, we were already behind schedule.” Another participant experienced communication issues because the Introduction letter went to the Institutional Research department, but they were not informed that the data request had come in.

Of those who encountered the challenge of missing or inaccessible data, one participant who works for a large 4-year university explained that although her office has access to their data system, budget data are not kept in that system and not to the level of detail requested by NPSAS. As a result, retrieving and accessing the data was especially difficult. Additionally, participants who indicated experiencing challenges with timing explained that when they receive the request in September or October, institutions are already busy with other projects or commitments, so the short turnaround time to submit all necessary information to NPSAS has been an issue. One participant, an Institutional Research and Assessment Coordinator at a 4-year private not-for-profit university, suggested receiving the request earlier or later in the year, such as late October for a November submission.

Assigning Responsibilities

Participants were then asked how their institution decides who is responsible for completing different components of the student records section and whether this was easy or difficult to determine. Five participants responded to these probes. Two participants stated that they “take it upon themselves” to work through what is needed. The remaining three participants discussed receiving assignments from someone else: two received their assignments from someone above them (i.e., a Director or President), while the third reported getting the request from the Business Office. Generally, the participants who answered these probes did not experience issues with determining who is responsible for completing the sections. However, one participant did note that because their institution’s President received the introduction letter rather than her, there was some confusion as to who was responsible, as “no one knew where to begin.”

Help Desk

Lastly, participants were asked whether in the future they would prefer to coordinate with other institutional data providers themselves or would prefer Help Desk staff to contact those individuals directly. All participants who responded to this item (n = 7) indicated that they would rather coordinate on their own than have Help Desk staff contact and prompt those individuals directly. Participants felt that having Help Desk staff involved would cause more confusion. Two participants also noted that, based on their past experiences, those calls would end up re-routed to them anyway whether Help Desk staff were involved or not.

Topic 3: Institution Information Page

The discussion for this topic centered on users’ experiences completing the Institution Information section of the PDP, which collects the terms occurring at the institution in the NPSAS year. Participants were asked whether they would report students’ enrollment status (e.g., full-time, half-time) by month or by term. Users were then asked for their preferences and reactions to having NCES pre-load term data on the PDP based on previous rounds of the study, with users reviewing and revising the data rather than manually entering this information. This topic was administered to the four focus groups that included Institutional Research staff, for a total of ten participants.

Term Structure

Participants were asked about their institution’s term structure (e.g., semesters, trimesters, continuous enrollment) and how they reported the summer terms in NPSAS:18-AC. One participant represented a continuous enrollment institution that does not fit into a traditional semester-based term structure, and the remaining nine participants represented institutions that operate on a traditional semester-based calendar system (e.g., fall/winter, spring, and summer semesters).

Reporting Summer Terms

For institutions that offer summer terms, the NPSAS year spans two academic years of summer sessions, and the instructions direct institutions to report both. However, for administration and internal reporting, institutions associate summer terms with either the prior academic year (a summer “trailer” term) or the following academic year (a summer “header” term). Six of the ten institutions represented treat the summer term as a header.

When asked how they reported their summer terms for NPSAS:18-AC, half (5 out of 10) of the participants reported including two years of summer terms because the instructions directed them to do so. Four participants reported some level of confusion about how to report the terms. As one participant explained, “July 1st would have been in the middle of our summer session for [2017] and June 30th would have been in the middle of our summer session [2018].” Another area of confusion related to differences across departments within the institution. For example, one participant mentioned that their institution’s fiscal year treats the summer as a header term, while their financial aid department treats the summer as a trailer term, and expressed concern about the data not being consistent.

Preloading Terms

All participants from term-based institutions reported that their term calendar is consistent from year to year and expressed a preference for the terms to be pre-loaded and reviewed rather than having to enter the terms for each round of NPSAS. None of these participants identified any potential challenges to pre-loading the terms, especially considering that they would be given the opportunity to make adjustments as needed. One participant from a continuous enrollment institution noted that it is a challenge for them to provide term data because they have many programs that have unique term schedules. The participant acknowledged, however, that entering the data by month (e.g., enrollment status for July 2017) is easy.

Topic 4: Student Records Modes

This section focuses on how participants determined which mode (Web mode, Excel mode, or CSV mode) to use to provide student records data for NPSAS:18-AC. Focus group participants also discussed their understanding of the instructions and resources provided for using Excel mode and CSV mode. This topic was administered to all seven focus groups for a total of 20 participants, but not all focus groups discussed every mode.

Selecting a Mode

Fourteen participants described how they determined which mode to use to enter student records data. Participants most frequently reported selecting the mode that the PDP instructions recommended as optimal for an institution of their size (n = 8). Four participants explained they chose the mode with which they were most familiar, either because it had historically been used at their institution or because they had preexisting familiarity with it. The final two participants had taken part in rounds of NPSAS data collection prior to NPSAS:18-AC and had the opportunity to try both the Excel and CSV modes; they reported selecting the easier one (Excel) based on their personal experience with each.

Generally, participants felt satisfied with the mode options that are made available to them for student records data submission. In fact, given the opportunity to freely suggest alternative formats for submitting data, participants could not think of a format they would find more convenient than the current mode options.

Mode Selection Instructional Video

Participants were asked about whether they viewed the Mode Selection instructional video, and eleven participants provided responses. Five participants reported that they did not watch any videos on the pages of the PDP website, three participants could not remember whether they had watched them, and three participants confirmed that they had indeed watched the instructional videos. The participants who had watched the videos each reported that they were helpful and informative, with two confirming that they returned to the videos as an additional resource multiple times during the process of attempting the data request. One participant reported that they would be unlikely to watch an instructional video under any circumstance. Five of the eleven indicated a general appreciation for instructional videos, reporting they are likely to watch them, and six participants felt they would watch an instructional video if they were having issues accomplishing the task at hand.

Topic 5: Excel Mode

This section focuses on participants’ experiences providing student records data using the Excel template file. This topic was relevant to ten participants who provided data for NPSAS:18-AC using the Excel template, and was administered in five of the seven focus group sessions. Due to two participants joining their sessions in the middle of the discussion, eight participants were probed on template-specific items, while the full ten were probed on uploading and reviewing.

Excel Template

Data Entry. Seven of eight participants provided an explicit report on the method they used to input data into the Excel template, as one participant could not recall this information. While all seven participants reported copy/pasting data, only four relied exclusively on this input method. The other three participants copy/pasted most of their data but utilized an additional method for special cases: two reported needing to hand-key certain fields and one used the template’s drop-down menus wherever they existed.


"For me, those types of help and information are invaluable.”

Help Text. All participants reported referencing the help text provided in the template. Generally, participants described the help text as helpful, with one participant adding, “for me those types of help and information are invaluable.” However, two of eight participants felt that the help text did not provide enough information for some specific situations and still needed to call the Help Desk.

Participant Edits. Participants were asked to identify what they would change about the Excel template. Participants generally understood why the template was locked but noted that this limitation made data entry more difficult. Participants made recommendations to improve their experience providing data:

  • Adjust column widths to make more efficient use of the space on the screen.

  • Add the ability to filter data or freeze a reference column.

  • Add data validation to only accept certain characters (e.g., numerals in numeric data fields); a sketch of how such validation could be scripted follows this list.
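
The data validation suggestion can be illustrated with a short script. The sketch below shows one way numeric-only validation might be attached to a column of an Excel workbook using the openpyxl library; it is a hypothetical example, and the file name, sheet name, and cell range are assumptions for illustration rather than features of the actual NPSAS:20 template.

    # Hypothetical sketch: attach numeric-only data validation to one column of
    # an Excel workbook using openpyxl. The file, sheet, and range names are
    # assumed for illustration only.
    from openpyxl import load_workbook
    from openpyxl.worksheet.datavalidation import DataValidation

    wb = load_workbook("student_records_template.xlsx")  # assumed file name
    ws = wb["Enrollment"]                                 # assumed sheet name

    # Accept only non-negative decimal values; blanks stay allowed so that
    # missing data can still be left empty.
    numeric_only = DataValidation(
        type="decimal",
        operator="greaterThanOrEqual",
        formula1="0",
        allow_blank=True,
    )
    numeric_only.errorTitle = "Invalid entry"
    numeric_only.error = "Please enter a numeric value."

    ws.add_data_validation(numeric_only)
    numeric_only.add("E2:E5000")                          # assumed numeric column

    wb.save("student_records_template_validated.xlsx")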

Uploading Data and Reviewing Error Messages

Most participants reported that they did not encounter any problems when uploading their Excel templates. Four participants reported that they received error messages on the completed files they uploaded. These error messages occurred because participants did not include “critical” data elements in their Excel template file; multiple participants mentioned the Ethnicity and Race data elements.

Participants who received error messages tended to upload a revised file rather than use Web mode to edit the necessary fields. Although all participants reported that they would not change anything about the way the error messages are displayed, describing them as “straightforward” and easy to understand, participants did express a desire for a way to validate their data file before uploading it to the PDP website. One of the participants advocating for this change in the review process needed to upload the file “five or more times before it was finally accepted.”
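
To illustrate the kind of pre-upload check participants asked for, the sketch below shows how an analyst might scan a completed template for blank critical fields, including a conditionally critical field, before submitting it. This is a minimal, hypothetical example rather than a description of the PDP’s own validation: the file name, sheet name, and the critical and conditionally critical items named here are assumptions for illustration only, and the actual items are defined in the NPSAS file specifications.

    # Minimal sketch of a local pre-upload check for blank critical fields.
    # Column names and the criticality rules below are hypothetical; the real
    # items are defined in the NPSAS:20 template and file specifications.
    import pandas as pd

    CRITICAL = ["StudentID", "Race", "Ethnicity"]  # assumed always-critical items

    # Assumed conditionally critical item: required only for undergraduates.
    CONDITIONAL = {
        "HighSchoolGradDate": lambda row: row.get("StudentLevel") == "Undergraduate",
    }

    def check_template(path, sheet="Students"):
        """Return messages describing blank critical fields in the template."""
        df = pd.read_excel(path, sheet_name=sheet, dtype=str)
        problems = []
        for idx, row in df.iterrows():
            excel_row = idx + 2  # header assumed on row 1, data starting on row 2
            for col in CRITICAL:
                if col in df.columns and (pd.isna(row[col]) or not str(row[col]).strip()):
                    problems.append(f"Row {excel_row}: critical item '{col}' is blank")
            for col, required in CONDITIONAL.items():
                if col in df.columns and required(row) and (
                    pd.isna(row[col]) or not str(row[col]).strip()
                ):
                    problems.append(f"Row {excel_row}: '{col}' is required for this student")
        return problems

    for message in check_template("npsas_student_records.xlsx"):
        print(message)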

Topic 6: CSV Mode

Four of the seven focus groups discussed CSV mode, for a total of eight participants.

File Specifications

This section addresses ease of understanding the data requested within the file specifications and problems with data formats (e.g., decimal places, field lengths). Also discussed was how participants handled recoding the data to meet file specifications as well as any changes participants would make to the specifications to make them easier to use.

Understanding the file specifications. All eight participants reported having a general understanding of the file specifications. However, a few participants (2 out of 8) credited their ease in understanding the file to past participation in NPSAS or IPEDS [the Integrated Postsecondary Education Data System data collection effort]. Despite recalling some challenges with mapping or programming their data to the file, two participants agreed that the file was “fine” or “straightforward”.

Data formats. A few participants (3 out of 8) discussed issues regarding the file formats. Two Institutional Research (IR) staff mentioned that it would be helpful if the file specifications included item labels (rather than just item descriptions) to include in their data queries. One participant initially expressed confusion about how to code missing values, but then acknowledged that the specifications file included an instruction about leaving fields blank. Two participants also expressed confusion about whether they should include a “header” row in the uploaded file; they also noted that including a header row would aid in validating the information they were entering.

Another participant expressed confusion about the “Critical Items Y/N” field in the file specifications for items that are conditionally critical (i.e., only critical based on responses to other data elements). When probed about how to improve the instructions or error messages, the participant suggested adding a comment stating, "This element is not required, but if you have responded to [X] element with [Y] response. Then it is required."

Recoding data. Four out of eight participants stated they did not have to recode data. Of these, two reported that they made sure their queries would format the results as indicated in the file specifications, one stated it was more of a problem-solving exercise to ensure they were pulling the right data, and the fourth reported that their “database programmer” developed a template that would format the data correctly to meet the file specifications. Two participants added that when it was not possible to format the output correctly using database queries, they would do additional recoding in Excel or SPSS. Another two participants mentioned they used SAS for coding and pulling data from various sources within their systems. One participant stated that getting data from other offices requires an additional step of renaming fields because they did not know how the other office labeled those fields.
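
As an illustration of the recoding participants described, the sketch below maps a hypothetical registrar extract onto coded values and a fixed format before writing it out as a CSV file. The column names, code values, and header behavior are assumptions for illustration; any actual submission would follow the NPSAS CSV file specifications rather than this sketch.

    # Hypothetical sketch of recoding a student-level extract to a CSV layout.
    # Column names, code values, and the header convention are assumptions.
    import pandas as pd

    # Example mapping from an institution's internal labels to assumed codes.
    ENROLLMENT_CODES = {
        "Full-time": "1",
        "Three-quarter-time": "2",
        "Half-time": "3",
        "Less than half-time": "4",
    }

    def build_csv(extract_path, out_path, include_header=False):
        df = pd.read_csv(extract_path, dtype=str)

        # Recode descriptive labels to coded values; labels with no match
        # become missing values.
        df["EnrollmentIntensity"] = df["EnrollmentIntensity"].map(ENROLLMENT_CODES)

        # Format dollar amounts to two decimal places; unknown values stay blank.
        df["TuitionCharged"] = pd.to_numeric(df["TuitionCharged"], errors="coerce").map(
            lambda v: "" if pd.isna(v) else f"{v:.2f}"
        )

        # Leave fields blank (not "NA" or zero) for missing data, consistent with
        # the instruction participants cited about leaving fields blank.
        df.to_csv(out_path, index=False, header=include_header, na_rep="")

    build_csv("registrar_extract.csv", "npsas_upload.csv", include_header=False)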

Uploading Data Files

This next section focuses on uploading data, including discussion of any problems encountered while uploading, ease of understanding and resolving error messages, and editing the data.

Problems encountered uploading the CSV file. Participants reported no issues with uploading the CSV file, other than those noted below related to error messages.

Error messages. Four of the seven participants specifically discussed receiving error messages after uploading the CSV file. Two participants reported that the process went well and that the error messages clearly identified what the problem was and how to fix it. The other two participants, both from large public institutions, reported encountering many error messages. One stated their institution has a large international population for whom race/ethnicity are not reported; as a result, they received numerous error messages for these students, which made finding the more critical errors difficult. The other participant explained that the error messages were difficult to understand, and they were not able to resolve all the errors due to the lack of clarification. They expressed confusion because, after reviewing their NPSAS:18-AC data, three of the four files they uploaded had errors but were accepted.

One participant added, "It would be nice to have some clarification and to know the difference of a 'fatal error,' meaning something that is not going to work, versus a kind of a 'warning error,'" similar to the way they report this value in IPEDS. Participants also expressed confusion about whether errors prevented files from uploading.

All three participants who were asked whether they revised their files within Web mode or uploaded a revised file reported uploading a revised file. Although not explicitly stated, two other participants indicated they also revised their files to fix the errors before uploading them again.

Topic 7: Enrollment

This section asked participants for feedback on data elements collected in the Enrollment section, including remedial course-taking and tuition charged. Staff were asked to report on the status of remedial course-taking at their respective institutions before discussing how they would report (or not report) such courses when completing the NPSAS data request. Additionally, this section reviews institution staff’s impressions of how data on tuition charged are collected. This topic was included in five focus groups, totaling 16 participants.

Remedial Course-Taking

Participants were asked whether their institution offers co-requisite courses (i.e., “gateway courses”) wherein students are taught both developmental and post-developmental content. Fourteen participants responded to this question, and while most (11 of 14) participants reported that their institution does offer co-requisite developmental courses, staff were divided on whether or not they would count those courses when responding to the remedial course-taking item (see figure 3).



Only a few participants confidently reported “yes,” they would consider co-requisite developmental courses to be remedial courses for the purposes of the item in figure 3 (3 out of 11). These participants felt this was the correct way to proceed because a portion of the course content is below college or program level. Participants were most likely to report that they would not consider the courses they identified as “gateway” or co-requisite developmental to be remedial (5 out of 11), noting the following reasons:

  • Two participants reported the co-requisite developmental course they had been considering was a program in which students of international backgrounds take beginner-level English for college credit. Both participants felt they would not think of such courses as remedial because “they are English-as-a-foreign-language,” which “is different than not meeting [native] English standards in terms of high school or college-level.”

  • One participant would not report her institution’s “gateway classes” on this item because at their institution the phrase describes a different type of course entirely: one in which all freshmen are grouped together to learn “basic life skills.”

  • Another participant would not include such classes in response to the item in figure 3 simply because they are too difficult to “tease out” of their data systems, despite recognizing that her 2-year private institution does offer remedial courses that are co-taught with college-level classes.

Finally, three of 11 participants were not sure about how to correctly report co-requisite developmental courses on the item in question.

Staff noted the following areas of confusion:

  • Can a course be gateway/remedial/developmental while also satisfying a degree requirement?

  • Does the co-requisite developmental course have to be an actual course or would NCES also like data on students who are required to take part in remedial workshops, support programs, or other non-course-based developmental/remedial options?

Tuition Charged

School staff were asked about challenges encountered in determining how much to report for each student’s tuition amount charged in the NPSAS year (see figure 4).



Five participants, all from term-based institutions, reported challenges related to the NPSAS year reference period.

  • One such participant had students enrolled in summer term courses that were billed in June (before the reference period) but the courses ran into August (within the reference period).

  • Another participant reported needing to include a preceding summer term that had a tuition amount different from the rest of the terms included because, as a summer-trailer institution, that summer term was part of a different academic year.


"I basically left that field blank because it was very difficult to identify what they were charged based on the criteria of the student.”

Two participants reported difficulty generating tuition charged because of challenges with their respective institutions’ data systems. One did not specify the system but felt their data may not be reliable and doubted whether the individual responsible for pulling the data would accurately interpret the many potential data elements in their system. The other participant reported using Ellucian Banner products and explained, “I basically left that field blank because it was very difficult to identify what they were charged based on the criteria of the student… So, when you go into the system it is very hard to identify what the tuition was because a student will have multiple rows of the amount being charged.” The participant was also unable to get assistance from the institution’s Financial Aid Department because that office was overwhelmed.

Fifteen participants felt they would have an easier time submitting the NPSAS data request if the item wording requested tuition charged for specific terms (i.e., Fall 2019, Spring 2020) instead of the July to June NPSAS year reference period.

Other Discussion of Enrollment Data

Institution staff were also given the opportunity to discuss any additional challenges they encountered in the Enrollment section. A variety of challenges were discussed and raised by staff, including:

  • Two participants reported that providing enrollment information can be challenging for students who transfer into the institution (e.g., high school graduation dates and ACT/SAT scores are not known for transfer students).

  • Two participants reported that because graduate students are considered full-time with fewer credits/credit hours than undergraduate students, it became very time-consuming to report data by the half-time, three-quarter-time, and less-than-half-time enrollment categories, as they had to manually confirm each student’s type of enrollment before being able to proceed.

Topic 8: Budget

This section addresses participants’ understanding of the data elements collected in the Budget section; preferences regarding reporting a full-time/full-year budget versus other budget period options; whether participants include summer terms within the full-time/full-year budget; how budgets are adjusted when students change their enrollment pattern; how participants determine an individualized budget; and any additional challenges completing the Budget section. This topic was administered in four focus groups, totaling 11 participants.

Reporting Budget Data

Most participants reported challenges providing budget data. While one participant from a career technology institution explained that the budget section was easy for them because they have a calculated budget for every program, several participants found it challenging to report budgets for students, especially for those not receiving financial aid. Two main themes emerged during this discussion: degree of manual labor required to produce the data and concerns about consistency. Two participants noted that completing this section was difficult because of having different programs with different tuition rates. As a result, completing the section was a manual process of reviewing each individual student and determining whether they should use their standard cost of attendance or if they should prorate the student’s budget. Another participant also expressed concern about the consistency of the data being reported because they used a mix of standard Pell budgets and individualized budgets.

One participant noted the similarity between the budgeted tuition data element (in the Budget section) and the tuition charged item (in the Enrollment section) and expressed confusion about why it was necessary to ask for both data elements. Some participants noted that it would help them interpret the instructions for the Budget section if they received more information about how the budget data will be used.

Budget Period

Institution staff were asked if they would prefer that the Budget section request only a full-time/full-year budget, which may require them to prorate students’ budgets, and whether their institutions’ full-time/full-year budgets would include summer sessions (i.e., a 9-month vs. 12-month budget). Just over half of the participants (6 out of 11) expressed a preference for this section to request a full-time/full-year budget, with some suggesting that requesting their Pell budget would be a common answer for all institutions. Six out of 11 staff described their full-time/full-year budget as a nine-month year. One participant suggested that the drop-down option for the budget period include a “full-time/full-year, 12 months” option and a “full-time/full-year, 9 months” option, rather than just full-time/full-year. One participant from a career technology institution suggested that the instructions should clarify the meaning of “full-time” in terms of the student’s clock hours (e.g. less than 24 hours, more than 12 hours). One participant noted that their Financial Aid office only codes students as "full time or part time,” but the budget period response options classify part-time students as “3/4 time,” “half-time,” or “less than half-time.”

Adjusting Student Budgets

Participants were asked what would happen to a student’s budget when a student’s enrollment pattern changes after the budget is prepared (e.g., the budget is prepared in the fall term for fall and spring enrollment, and then later the student decides to enroll in summer session). Three out of 11 participants reported they would re-budget the student to add the summer session to the student’s budget and increase their cost of attendance because the summer falls within the same academic year. Four additional participants reported that the situation does not apply to them because they are either a “summer header” institution and the new summer term would be included in the next academic year period or that they have “rolling terms” in that some programs start at different times of the year beyond the traditional September or January start.

Estimated Budgets

Institution staff were also asked how they would estimate a budget if an individualized budget was not available for a student. While two participants reported they always have an individualized budget, more than half the participants (6 out of 11) reported they would use a generic institution budget. Of these, one participant explained their estimated cost of attendance is based on residency status, as they do not offer campus housing; the other five participants explained that they would use a generic budget as an estimated budget for students who did not complete a FAFSA. The remaining participants were unable to respond as they were not responsible for student budgets.

Other Discussion of Budget Data

“In financial aid speak, the actual term is ‘cost of attendance’ [rather than budget]… The law requires them to provide the cost of attendance for all students receiving aid for that term. If the section is referring to it as something different it can cause very different answers about awards and cost of attendance.”

Institution staff were also given the opportunity to discuss any additional challenges they encountered in the Budget section. One participant mentioned that it is difficult to include non-credit students and, similarly, another participant reported that if a student did not complete a FAFSA, they were hard to include. Two participants mentioned that some of the data requested within the Budget section is not typically used for developing budgets, with one wondering why they had to provide data points like “health insurance” and “commuter/technology” because it was “tedious.” One participant, a Director of Financial Aid from a large public two-year institution, noted that “in financial aid speak, the actual term is ‘cost of attendance’” rather than budget. They further explained that “The law requires them to provide the cost of attendance for all students receiving aid for that term. If the section is referring to it as something different it can cause very different answers about awards and cost of attendance.”


Findings: Usability Testing

This section presents detailed findings from usability testing of the Postsecondary Data Portal (PDP) website.

Topic 1: Home Page & Study Icons

Participants were reacquainted with the PDP, asked to review the PDP homepage, and share their experiences using the PDP during NPSAS:18-AC data collection. All 18 participants were administered this section.

Over half of the participants (11 out of 18) described their overall experience using the PDP website or homepage positively, recalling their previous experiences as “straightforward,” “user-friendly,” and “fine.” Negative comments related to the secure procedures for accessing the PDP, specifically that the two-factor authentication to enter the user ID slowed things down for them. One participant described the PDP login process as “unnecessarily cumbersome.”

Study Icons

Participants were nearly evenly split between those who noticed the study icons at the bottom of the screen (9 out of 17) and those who did not (8 out of 17). A few participants noted appreciating the icons and links to resources located on the homepage. Other participants described the home page as “busy” and reported that they were “a bit confused about where I wanted to be.” Users most commonly reported not clicking on the icons because they were focused on the task at hand.

Datalab website

Due to time constraints, only fourteen participants were asked if they had ever visited the NCES Datalab website; nearly all participants (13 out of 14) stated that they had not noticed the Datalab link (see figure 5). Furthermore, six participants expressed that they were unsure of what the Datalab was or how to find it on the PDP log-in page. Almost all users (12 out of 14) stated that they would be interested in receiving materials from NCES informing them how to utilize the Datalab website.



Topic 2: Task Menu

This section focuses on assessing participants’ understanding of the Task Menu page, the tasks necessary to complete NPSAS, and the order in which they should be completed. A total of 18 participants were administered this topic.

General Understanding and Usability


"Right at the beginning it shows you where to start, where you can register your institution, and then it gives a status showing the progress stage where you left off."

All participants expressed positive opinions regarding the overall layout and usability of the Task Menu and that the tasks seemed to be presented in the chronological order in which they should be completed. As one explained, "I think it's pretty good because it's not thrown at you all at once. Right at the beginning it shows you where to start, where you can register your institution, and then it gives a status showing the progress stage where you left off." Five participants mentioned that the side bar Help Menu was especially useful, as it put many useful links in one location. Participants generally understood what they would need to do on this page in order to proceed.

Some participants expressed confusion over seeing “NPSAS:20” on this page and were not sure if they should interpret the “20” as the year 2020. One participant stated that the distinction between the "Student Enrollment List" and "Student Record Data" sections was confusing.

Task Menu and Next Steps

Nearly all participants were able to correctly describe the different tasks in the Task Menu (see figure 6). Two participants, however, explained that they were not sure exactly what type of documents would be included in ‘Archive Notes or Other Documentation,’ but did understand that they could find additional documents under this section. All but one participant was able to identify that ‘Provide Student Record Data’ was ‘In progress’ and that none of the tasks were completed. One participant mistakenly conflated ‘In Progress’ with ‘Completed’ and was the only participant who did not explain that their first step would be to ‘Register Your Institution.’

Topic 3: Help Menu

This section focuses on assessing participants’ usage of various resources found on the right-hand side bar Help Menu (see figure 7). Seventeen participants were administered this topic.

Using the Help Menu

Ten participants recalled using some of the resources provided in the Help Menu. FAQs and Resources were most commonly reported by participants. When asked whether they found the information helpful, all 10 participants indicated “yes.”

Contact Materials Link

“[Contact Materials] summarizes exactly what [the study] is… what the survey is about and the instructions.”

A majority of participants (16 out of 17) found the Contact Materials link very useful, useful, or somewhat useful. Most participants acknowledged receiving physical contact materials in the mail but found having a digital copy helpful if they “misplaced those materials” or if someone else received the hard copy and was not in the same office as the participant completing NPSAS. More specifically, participants found the Welcome Letter, the Student Enrollment List Instructions, and the NPSAS:20 Brochure especially helpful. Other participants noted the link’s usefulness if they needed to confirm information such as the purpose or dates of the study. A participant explained, “[Contact Materials] summarizes exactly what [the study] is… what the survey is about and the instructions.”

Resources Link

“[The Resources link] lists all the data that I need.”

Most participants (14 of 17) found the Resources link useful as “it lists all the data that I need,” and it was described as “… a resource in my pocket that I can use.” Participants found this link especially helpful for those who may need additional assistance in understanding each section or for any new staff who may need to familiarize themselves with the study. The Student Enrollment List Instructions and Overview Video, the Quick Guide to Providing Student Records Data, the Confidentiality and Data Security Fact Sheet, and the FERPA Fact Sheet were most commonly cited as the most helpful resources.

Announcements Link

Participants were evenly split when asked whether or not they found the information in the Announcements link to be useful, with eight participants reporting some level of usefulness and another eight participants reporting the link was not at all useful. The participants who did not find the link useful generally based their responses on the sample announcements that were posted on the test website rather than the general inclusion of announcements. Three participants further explained that they are unclear as to what information would be announced and why, while another participant offered several suggestions about the types of information that would be useful (e.g., upcoming deadlines, changes to the file layout). Those who found the Announcement link useful noted it had all announcements in one location so that it “is easy to find.”

Participants overall indicated they would like to receive alerts regarding any “urgent items” such as upcoming deadlines as well as any “new information or changes made.” Participants suggested having some sort of indication that a new announcement is available, such as a flag or number representing how many new announcements are pending their review.

What to Expect Link

Most participants (16 out of 17) found the timeline within the What to Expect link useful, specifically citing the visual display of when items are due as the feature they enjoyed most. One participant exclaimed, “This is cool! … This is great for planning!” Although the timeline was considered useful, some participants wanted more specific information, such as exact dates that could be shown when “hovering” the mouse over the timeline, rather than just an overview.

NPSAS:20 FAQs Link

“Most answers were very short and even for the longer answers, [it’s] listed in bullets so it was easier to read.”

Most participants (16 out of 17) found the information in the FAQs page very useful, useful, or somewhat useful. When asked to elaborate, participants generally agreed that it provides more information on questions that may occur during the process. One participant especially liked that “most answers were very short and even for the longer answers, [it’s] listed in bullets so it was easier to read.” Participants most commonly indicated the Student Enrollment List: Which Students to Include section as being the most helpful followed by the Background and Purpose of NPSAS.

Contacting the Help Desk

Almost all of the participants who reviewed this resource (9 out of 10) recalled reaching out to the Help Desk when completing NPSAS:18-AC. Five participants indicated using both phone and email, while the remaining four participants used only one of the two methods to contact the Help Desk. All participants who were asked whether it was easy or difficult to find the Help Desk found it very easy or easy.

Live Chat Option

Ten participants were asked how likely they would be to use a live chat option if it was offered by NCES. Most participants (9 out of 10) expressed that they would likely use this option as it would allow for “immediate help” and would not be distracting to others around them, unlike being on the phone. Only one participant noted they would not utilize this option, preferring to speak to a representative because doing so may bring up other questions they have in mind.

Manage PDP Users

Three participants acknowledged using information found on the website to get access to the PDP for someone else. When asked how they would accomplish this task, other participants indicated they would either use the Manage PDP Users link or contact the Help Desk for assistance. Two participants explained they would just give their information to the potential user if they needed access to the PDP.

When asked to demonstrate adding a PDP user, most participants (8 out of 9) described it as very easy or easy to add a user. A participant explained, “All information was laid out in a way that was easy to understand.”

Topic 4: Registration Page

This discussion focused on the “Register Your Institution” section of the PDP. Participants were asked to indicate whether they needed input from other offices at their institution when providing the information collected in this section during NPSAS:18-AC. The Registration Page collects detailed information about institutions’ term structures, which was formerly collected as part of the student records phase of NPSAS. This topic was administered to a total of 13 participants.

Six participants explained that there is no additional information about their institutions that NCES should know about to improve their experience completing the study. Although there was no consensus on additional information that NCES should know about, staff mentioned collecting information about credit-to-clock hour conversions for institutions that do not use credits and using institutions’ Carnegie classifications to filter questions.

Completing the Registration Forms

While the NPSAS:18-AC Registration Page was organized on a single vertical-scrolling webpage, the Registration Page tested for NPSAS:20 was organized as a “form wizard,” in which participants viewed one question at a time on a separate form and advanced from form to form. No participants had difficulty navigating through the form wizard. Most participants (8 out of 12) did not find any words or parts of the Registration Page confusing, and among those who raised points of confusion, there were no consistent patterns of issues identified. For example, there was some minor confusion relating to the enrollment list cutoff date of April 30, as "April 30th probably would have happened long after a drop/add period had taken place.” One institution questioned whether the Registration Page referred to enrollment status for graduate or undergraduate students.

Assistance from Other Offices

A majority of participants (9 out of 12) reported not needing input from other offices at their institution to complete the Registration Page. Those who reported needing input from other institution offices indicated that they would receive input from staff in the Registrar, Academic Dean, Provost, Financial Aid, and President’s Offices. When asked how long it took to receive information from other offices and complete this step, reports ranged from a day to one week.

Term Structure

Twelve participants provided feedback on how they would decide whether to choose “status by month” or “status by term.” Most participants (9 out of 12) noted that they would select “status by term” as their institutions utilized semesters (e.g., Fall, Winter, Spring, Summer). One participant mentioned experiencing confusion when selecting a term option because their institution had sub-terms within the larger terms (e.g. 'first part of term,' 'a second part of term,' and ‘Maymester’). Other participants reported that they would either seek help from the Help Desk when deciding which option to choose, speak to the Associate Provost to ensure that they are choosing the correct option, or select the option for “non-traditional students” as their institution offers both traditional and non-traditional online courses.

A majority of participants (8 out of 9) believed it was very easy or easy to enter/verify a term. These participants explained that they could easily extract the start and end dates of the term from their institution. Specifically, two participants liked that the page only asked for the month and year of the term, instead of specifically requiring entry of the start and end days. These participants explained that because the start and end days fluctuate every year, they would not need to look up the specific day and could easily enter the month and year of the term. One participant noted that the page had the correct pre-filled dates for their institution’s terms. The participant who reported difficulty entering or verifying a term explained that after seeking help from the Help Desk, entering/verifying the term became easy.

Topic 5: Student Enrollment List

This discussion focused on how participants prepared and then uploaded their student enrollment lists. Participants were provided sample data in order to demonstrate how they would complete this page. This section was presented to the 16 participants who were responsible for completing the student enrollment list portion of NPSAS:18-AC.

Navigating the Student Enrollment List Page

Most participants (13 out of 16) were able to accurately describe the tasks to be completed on this page (compile and upload a list of students, provide additional data about preparing the list, and provide counts of excluded students). However, only 6 participants noticed both section 3 (Provide Information About Your List) and section 4 (Upload Student Enrollment List), and only half of participants noticed the section 3 submission button (see figure 8).



When asked to rate how easy or difficult it was to complete this page, most participants (13 out of 16) described finding it very easy (4 out of 16) or easy (9 out of 16). One participant who found it very easy noted, “This particular page was straightforward.” Participants did mention the following areas of confusion:

  • the difference between the “Student Record Data” and the “Student Enrollment List” phases of NPSAS.

  • the “Comments” section, noting the section provides mixed messages because the instructions indicate a “particular format that you have to follow” but the comments section asks to clarify the layout of the enrollment list. Participants felt this section indicated they could use a different format than what was outlined in the instructions.

  • whether “High School Concurrent” students would count in the “excluded” category.

  • the meaning of the “Save Item 3 Data” button found at the bottom of this section and suggested changing the wording to “Save Form” or “Save Response from Above Item.”


"… This particular page was straightforward.”

Assistance from Other Offices

Participants were asked whether they needed input from other offices at their institution to help complete the form. A majority (12 out of 16) identified needing input or assistance from either the Financial Aid, Registrar, or Admissions Office. In general, participants described the experience as “helpful” and noted having a good working relationship with the other departments.

Some participants questioned why there were only three lines available in the “Contact Information for Staff” section of this page, which asks the user to indicate other staff/offices who helped to prepare the enrollment list. One participant was not sure whose contact she was supposed to enter and asked, “Would that be anybody who was involved? ... Do I need to track down who did what?” She also noted that she would rather leave a blank and not receive an error, questioning “How important is this to move on to the next step?”

Participants generally estimated the amount of time involved in completing the Student Enrollment List as ranging from about an hour and a half to about 4 weeks. Of those who indicated closer to a four-week timeframe, two participants reported having to call NPSAS for a “time extension” as it took time for them to receive information from another department.

Using the Instructions Link


"…these instructions provide an outline on what to do with…different situations.”

Half of the participants (8 out of 16) indicated utilizing the link to download the list preparation instructions and did not report experiencing any difficulties selecting this link. The remaining half (8 out of 16) indicated not clicking on the “Instructions” link; these participants commonly noted that they believed the task was “self-explanatory” or that they’ve “…done it before.” A few participants indicated a preference for having a supplementary printed copy of the instructions available to them. Whether or not participants used the “Instructions” link, all participants reported that the enrollment preparation instructions were very easy (5 out of 16) or easy (11 out of 16) to understand. Participants found the instructions to be “concise” and noted that they listed everything out “step-by-step.”

Topic 6: Student Records – Mode Selection

This section focuses on how participants determined which mode (i.e., file format) to use for entering student records data and feedback on the PDP Mode Selection page, including whether or not they noticed the recommendations for institutions at the bottom of each mode, why they think the Web mode is highlighted, and whether or not they watched the available video. This topic was administered to a total of 16 participants1.

Selecting a Mode

Sixteen participants provided an explicit report on how they determined which mode to use to enter student records data. Most of the participants were able to accurately describe the three modes and agreed that their mode selection would depend on the sample size of the students. Excel was the most popular mode, with half of the participants indicating that they preferred Excel (8 out of 16). Participants noted that they were most familiar with this mode, liked that it maintained the formatting of the student IDs, was easy to use, and did not result in as many errors (when compared to CSV mode). Participants who preferred CSV mode noted that it allowed them to write the codes and output the data file in the required format and prevented errors that resulted from copying and pasting data. Participants noted using Web mode for manual data entry would be time-consuming and tedious for their institutions’ sample sizes.

Recommended Mode

Most participants (14 out of 16) noticed the sample size recommendations listed at the bottom of each mode. Most participants found this information helpful (11 out of 16), noting that it served as a useful guide on each of the three modes. In addition, twelve participants were able to accurately describe why the Web mode button was a different color than the other mode buttons (because it was the recommended mode based on the test institution’s sample size).
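
As an illustration of how such sample-size-based recommendations could work, the minimal sketch below assigns a suggested mode from a student count. Only the 20-student guideline for Web mode reflects a figure cited later in this report; the cutoff separating Excel from CSV is a hypothetical placeholder, not the PDP’s actual rule.

    def recommended_mode(sample_size):
        """Suggest a data-entry mode from the number of sampled students.
        Thresholds are illustrative; only the 20-student Web mode guideline
        comes from this report."""
        if sample_size <= 20:
            return "Web"          # manual entry is manageable for small samples
        elif sample_size <= 500:  # hypothetical cutoff
            return "Excel"        # template-based entry
        else:
            return "CSV"          # programmatic export for large samples

    print(recommended_mode(15))    # Web
    print(recommended_mode(250))   # Excel
    print(recommended_mode(5000))  # CSV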

Mode Selection Instructional Video

Participants were asked for feedback on the instructional video embedded in the PDP Mode Selection page. Data was collected from 16 of 18 participants, but not all 16 were asked all the probes in this section. More than half of the participants (9 out of 16) reported they were not likely to watch the video, as they are familiar with using the portal and can find the directions elsewhere on the page. Other users noted that they may watch the video as it was short and may provide helpful information.

After viewing the video, most participants reported that they found the video to be useful (12 out of 15) and did not find any words or parts of the page or video confusing (13 out of 15). Those who did find words or parts of the page or video confusing were unclear about the meaning of the term “mode” and did “not understand the CSV mode.” A few participants reported that the video would impact their decision about which mode to use, and only one participant indicated that the video should include additional information. This participant would prefer that the video explain the rationale for the recommended mode.

Topic 7a: Student Records – Excel Mode

This section focuses on participants’ experiences providing student records data using the Excel Template. This topic was relevant to the eight participants who selected Excel mode during Topic 6.

Excel Mode Instructional Video

When asked to describe the steps they would take to complete the data request (see figure 9), most participants (7 out of 8) did not mention watching the instructional video embedded on the Excel mode page of the PDP. Six participants reported on their likelihood to watch the video, and half of these reported being somewhat likely to watch the video, especially if they became confused about some aspect of the data (3 out of 6). The other half of the participants reported being unlikely to view the video because it was easy to determine what needed to be done by reviewing the text on the webpage (3 out of 6).





Participants who did not find the video helpful felt it was redundant, containing no additional information that was not provided elsewhere in the instructions (3 out of 6). Conversely, participants who felt the video was helpful (3 out of 6) either learned something new or appreciated that it “reiterated” what they already knew.

Participants suggested the video be modified to include more detail about the following:

  • what happens after you upload the template;

  • what happens if there is an error in the template you uploaded;

  • how best to address different types of errors; and

  • how to reach the Help Desk.

Excel Template

Participants were asked to provide feedback on completing the following tasks: downloading a copy of the Excel mode template file, navigating through the template, entering data into the file, and uploading the completed file to the PDP.

Help Text. A majority of participants accessed help text throughout the template (6 out of 8). Participants who used help text knew to do so because they were previously familiar with Excel and knew that the red triangle icon indicated a cell comment or note (see figure 10). Participants who did not use help text were unfamiliar with the meaning of the red triangle. When asked what would need to change in order to increase usage of help text, participants felt the instructions could be amended to include an explanation of the help text icons alongside an example image of a cell with a cell comment icon.

Student Records Sections. Participants usually needed input from other offices or departments to complete the data request (7 out of 8). Typically, those who were not Financial Aid employees needed the Financial Aid department for the Budget and Financial Aid tabs (5 out of 8) and almost half of the sampled staff received help from the Registrar’s office in filling out the Student Information and/or Enrollment tabs (3 out of 8). Participants did not report difficulty associated with needing input from other offices, reporting it was both easy and often necessary for multiple people to complete different sections.

Participants were also asked about their overall impression of the organization of the template and felt it was easy to navigate from section to section. They were generally unable to think of a better way to group data elements (7 out of 8).

“The Instructions page was great. I printed it. That was like my little bible.”

Instructions. All participants indicated they had reviewed the Instructions tab of the Excel template file during the previous round of data collection. Most participants (5 out of 8) were able to recall using the “Tips” and “What’s New” sections of this tab during NPSAS:18-AC, reporting that this information was helpful. Generally, participants were not confused by the Instructions, describing them as “straightforward” and “self-explanatory”; one participant even referred to them as their “little bible.” Some participants indicated that they would prefer to also have the Instructions available in a separate document outside of the template.

Item Codebook. All participants indicated having reviewed the Item Codebook during NPSAS:18-AC and generally found it “helpful” and “self-explanatory.” However, a few participants reported confusion on this tab of the template (2 out of 8). Some participants were displeased that the codebook was within the data-entry template and questioned why unknown values needed to be coded into the template as negative numbers.

Participants offered the following suggestions for improvement of the Codebook:

  • enable staff to recode using only positive numbers (i.e., “Unknown” could be “2” instead of “-1”);

  • allow staff to hide rows about data that are not applicable to their institution; and

  • create a Codebook document separate from the template so that it is easier to reference.
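
As a minimal sketch of the kind of recoding the codebook requires, an institution could map its local values to template codes programmatically rather than cell by cell. The field, labels, and codes below are hypothetical stand-ins, not values from the actual NPSAS:20 codebook; the “-1” for unknown follows the negative-number convention participants described rather than a verbatim specification.

    # Hypothetical codebook mapping for a single field.
    HOUSING_CODES = {"on campus": 1, "off campus": 2, "with family": 3}

    def to_template_code(local_value):
        """Return the codebook code for a local housing value; unknown -> -1."""
        if not local_value:
            return -1
        return HOUSING_CODES.get(local_value.strip().lower(), -1)

    print(to_template_code("On Campus"))   # 1
    print(to_template_code(None))          # -1 (unknown)
    print(to_template_code("Commuter"))    # -1 (value not in codebook)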

General Student Information Tab. Most participants (7 out of 8) reported that no words or parts of the General Student Information tab were confusing. Some participants questioned the meaning of the different colors in the header rows across the template (see figure 11). Participants reported that the Student Information tab would be less confusing if the meaning of the different colors in the header rows was clarified.

No participants reported concerns with students being sorted by last name/first name and no participants reported a preference to have them sorted differently.



Enrollment Tab. Generally, participants did not report any words or parts of the Enrollment section as confusing or difficult. One participant reported having difficulty with the Enrollment Status items because her institution reported correct terms only for its undergraduate students; therefore, terms for graduate students were missing from the Excel template. The participant stated that the template should explain the appropriate way to address such an issue, noting that her institution spent a lot of time trying to identify where to report graduate student information before finally calling the Help Desk, which explained that only the Help Desk could resolve such an issue.

Budget Tab. Three of the eight Excel participants were responsible for completing the Budget section themselves, while the others reported that they would send the Budget tab to a different department for completion. Participants who completed this tab using individualized budgets reported that doing so was easy. Participants who completed this tab for institutions without individualized budgets were mixed in their assessment of this tab. Participants who found this tab difficult to complete felt it would be easier to complete if a line of text directing staff to the help text were provided directly beneath the words “Budget Period” in the header row.

Financial Aid Tab. Three of eight participants were responsible for completing the Financial Aid section themselves. Participants reported some confusion about what kinds of “private aid” to report and questioned the level of specificity required in subcategorizing types of aid. Although they noted an understanding of why this information is needed, they found it difficult to make sure aid was categorized in a consistent manner for all students.

Participants offered the following suggestions for improvement of the Financial Aid tab:

  • provide examples of private aid included in the header row; and

  • allow some types of institution aid to be condensed.

Uploading the Excel Template. Participants reported that uploading completed templates to the PDP website was easy. Some users (4 out of 8) expressed irritation that they did not know that all the data tabs needed to be uploaded in the same file, feeling this should be clearly explained on the webpage, in the text around “Step 3: Upload Your Completed Template.”

All participants reported that they received error messages when providing data. Participants found the error messages “clear” and “easy-to-understand” and did not find anything confusing about the way they were displayed. Participants reported that they were not likely to edit their data in Web Mode; all participants reported uploading a new file when errors were addressed.

Participants were generally unclear about the purpose of “Step 4: Review the Data You Provided” and did not identify it as a necessary step in completing the data request (3 out of 8). Participants who had not used this step during NPSAS:18-AC were confused as to its function, usually reporting a belief that it was merely an alternative way to identify and address errors. Participants who did review their data for previous rounds found it helpful in confirming that the formatting and content of their data file had not been altered during the upload process.

Topic 7b: Student Records – CSV Mode

This section focuses on the usability, function, and participants’ understanding of CSV mode. This topic was relevant to the eight participants who indicated they would upload data to the PDP using CSV mode during discussion of Topic 6, including one participant who used Excel mode for NPSAS:18-AC but who elected to use CSV mode for the interview2.

Overall, participants understood the process for uploading the student data via CSV mode, had no usability issues, and reported the site was easy to navigate. Based on their description of the steps necessary to complete the data request, all participants appeared to understand the procedures for compiling and uploading the student data. Most started by reviewing the instructions and file specifications in order to ensure that they gather the correct data in the proper format.

CSV Mode Instructional Video

“It was nice to see a short and concise summary of the whole process.”

Only a few participants mentioned they may choose to watch the instructional video embedded in the CSV Mode page on the PDP; when prompted to watch the video, most participants (6 of 7) reported the video was helpful and that “(i)t was nice to see a short and concise summary of the whole process.” None of the participants reported finding anything confusing, nor did they indicate that the video would change their minds about which mode to use. There were just two suggestions offered about the video: explain what to do in the case of errors and provide the content of the video in a written document.

General Usability

Seven participants reported being either familiar or very familiar with CSV files and most of the participants stated they would work with someone else to create the CSV files (e.g. a programmer). During the interviews, the interviewer recorded whether participants clicked on (or otherwise indicated they would click on) the following: instructional video, file specifications, instructions, show errors, download error list, review data, finalize all student record information, and Tips/What’s New. Although not all participants clicked on or otherwise noted all of these links or tasks, none of the participants demonstrated any usability issues.

File Specifications

Participants were able to accurately describe the columns in the file specifications document, with two respondents adding that the “comments” column was especially helpful. As one participant said, “Sometimes the magic lives in these comments.” There was nothing reported as confusing about the specifications.

Tips and/or What’s New. Two participants did not notice the “Tips and/or What’s New” section of the file specifications and, when it was pointed out to them by the interviewer, one stated they would review the section only if they had errors after uploading the data. Although several participants did not remember seeing this information during NPSAS:18-AC data collection, participants generally reported it would be helpful or very helpful information. Participants offered a few suggestions related to the “Tips and/or What’s New” sections: move this section to the top of the page, make the information accessible outside of the website, and provide this information on a separate tab.

Identifying Valid Values. Participants indicated that they understood the need to report data in the format specified in the valid values column or risk receiving error messages. Despite the explicit instruction on the “Instructions” tab of the file specifications document (see figure 12), a few participants reported not knowing how to code unknown data.
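
The sketch below illustrates how the valid values and maximum length columns of a file specification can be applied to a CSV file before upload. The field names, lengths, codes, and file name are hypothetical placeholders rather than the actual NPSAS:20 specification.

    import csv

    # Hypothetical excerpt of a file specification: field -> (max_length, valid_values).
    # An empty valid_values set means free text up to the maximum length.
    FIELD_SPEC = {
        "institution_id": (6, set()),
        "enrollment_status": (2, {"1", "2", "3", "-1"}),  # -1 = unknown
    }

    def check_row(row):
        """Return error messages for one CSV row, mirroring pre-upload checks."""
        errors = []
        for field, (max_len, valid) in FIELD_SPEC.items():
            value = (row.get(field) or "").strip()
            if len(value) > max_len:
                errors.append(f"{field}: exceeds maximum length of {max_len}")
            if valid and value not in valid:
                errors.append(f"{field}: '{value}' is not a valid value")
        return errors

    with open("student_records.csv", newline="") as f:   # hypothetical file name
        for i, row in enumerate(csv.DictReader(f), start=1):
            for err in check_row(row):
                print(f"Row {i}: {err}")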

Missing Data Elements. A few participants identified some data elements they were unable to provide, such as high school graduation dates, race/ethnicity (especially for international students), international phone numbers, and marital or veteran’s status. Participants reported they would leave the fields blank or enter zeroes.

Identifying Critical Items. Identifying the critical data elements was difficult for three of the six participants who were probed on this topic. None of the three mentioned the “Critical Items” column (see figure 13), stating that every line item is critical otherwise “it wouldn’t be asked here [in the survey].”

Figure 13. Critical Items in CSV File Specifications

Field   Field Label                           Critical Item Y/N   Max Length   Format
1       File Specification Version Number     N                   2            N
2       Institute ID                          Y                   6            N

Understanding and Resolving Error Messages

All four participants who were probed either explicitly stated or implied that understanding the error messages was easy, stating that the messages are “pretty straightforward” and provide “clear reasons.” To resolve errors received, participants reported they would examine their file to identify what the error actually was (one mentioned reviewing the CSV specifications) and fix the error. Correcting the error would happen either manually in Web mode, if there were just a few errors, or in the CSV file, which would then need to be re-uploaded. While most participants were not confused by the error messages, a few offered suggestions for improvement: include whether or not the error occurred in a “critical” field, and provide information or options about how to resolve the error within the message itself.

Topic 7c: Student Records – Web Mode

This section focuses on the usability and function of Web Mode. The Web mode section of the PDP is designed for colleges and universities with only a small number of student data records requested, with the recommendations indicating that Web mode is optimized for 20 or fewer students. This topic was relevant to the two participants who selected Web mode during Topic 6. Both participants felt that the Web mode was clear and understandable to them, with one participant describing it as “self-explanatory.”

General Understanding and Usability

Participants understood that the Web mode was designed to be used for smaller numbers of students and both expressed that it was well suited for this purpose. The participants explained that the screen clearly prompts them to enter information on the requested students. Neither participant experienced any confusion or difficulty in understanding the information displayed in the Web Mode or the directions for entering the data. Additionally, neither participant experienced any usability errors or problems navigating the Web Mode section.

Web Mode Instructional Video

One participant stated that watching the video is the first thing that they would do and explained that “[the video] makes sure that you have all the information before you upload and finalize the report. After you finalize you can't make any changes." The other participant felt that the video would only be helpful if they had encountered something that was confusing to them, in which case they would watch the video for clarification. Both participants agreed that the video would not affect their decision about which mode to use.

Data Entry by Student or by Section

Participants were split regarding how they entered the student information, with one participant electing to do so ‘by section’ and the other choosing ‘by student.’ The participants offered different reasoning for their approach, with the ‘by student’ participant explaining that doing it “one student at a time” helped them focus and make sure there was no missing information, while the other participant explained they approach it from the mindset of ‘What do they need?’ and explained they would gather all requested information at once.

Use of Check for Errors, Help Text, and Finalize Buttons

Both participants noticed the “Check for Errors” button (see figure 14) and were able to describe that it would inform them of potential errors in the data. While neither participant utilized the help text buttons during the usability interviews, both noticed them and understood their purpose. One participant explained that they would have utilized the help text buttons if they had been entering actual information during a real NPSAS administration, while the other explained they would only use the help text buttons if they encountered difficulty. Both participants explained that they expected the ‘Finalize’ button (see figure 14) to submit the information in such a fashion that they would be unable to change it after it was finalized.

Responsibilities and Interdepartmental Collaboration

Both participants stated that they needed to reach out to other departments within their institutions. One participant reported needing assistance from the bookstore and Registrar’s office, and that receiving information from the Registrar’s office “took some time” because they needed information from external organizations such as students’ academic history or their GED/high school diploma completion status. The other participant needed assistance from the Research office, Registrar’s office, and Financial Aid office and estimated that the collaboration process took around three weeks.

Topic 8: Debriefing

The debriefing section gave respondents an opportunity to offer additional opinions, suggestions, or feedback on the PDP and NPSAS in general. Due to time constraints, two participants did not participate in the debriefing, three completed an abbreviated debriefing, and thirteen participants were administered the full debriefing.

“Working with the website was the easy part, getting the data together was the hard part.”

Nearly all participants offered positive feedback on the data collection website, but roughly half the participants mentioned that they found it difficult to gather and compile all the requested information. As one participant explained, “Working with the website was the easy part, getting the data together was the hard part.”

Participants felt that the PDP website was generally easy to use, with it receiving an average score of 3.64, on a scale of 1 to 5, where 1 is difficult and 5 is very easy. Furthermore, none of the participants reported that they would be unlikely to provide the requested data in the future. A few participants had final suggestions to improve the PDP website and the end-user experience:

  • improve communication responsiveness from RTI, particularly related to updates and responses about critical issues such as extensions and administrative procedures; and

  • customize instructions and procedures to match the various Student Information Systems used by the institutions (e.g. Colleague by Ellucian or PeopleSoft).


1 One participant was not responsible for selecting the mode and another participant experienced technical issues during this discussion. As a result, these two were not included in the analysis.

2 One participant had technical issues during the interview and as a result this person is not included in the analysis.

