
2023–24 NATIONAL POSTSECONDARY STUDENT AID STUDY (NPSAS:24) FULL-SCALE STUDY

Student Data Collection and Student Records

Appendix G

Sample Member Focus Group Report


OMB # 1850-0666 v. 36


Submitted by

National Center for Education Statistics

U.S. Department of Education

September 2023








Executive Summary  

Introduction 

This study centers on the student survey of the 2023–24 National Postsecondary Student Aid Study (NPSAS:24). It summarizes focus group and one-on-one interview sessions with students who participated in the NPSAS:24 field test survey and with students who were invited but did not participate. In general, the focus groups and one-on-one interviews addressed the following topics, with time for general feedback at the end:

  1. Participation Engagement and Perceptions

    a. Impressions of Survey Letters

    b. Survey Legitimacy

    c. Embedded Links/Buttons

    d. Contacting/Interacting or Reminders

  2. Survey Participation Factors (Nonrespondents only)

    a. Reasons for Not Participating in the Survey

    b. Incentives

  3. NPSAS:24 Survey Experience (Respondents only)

  4. Open Discussion (Nonrespondents only)

Sample 

A total of 10 participants (six Respondents and four Nonrespondents) took part in focus group and one-on-one sessions conducted remotely via Zoom between June 29, 2023, and August 18, 2023. The sample included a mix of genders (60.0% female, 30.0% male, and 10.0% whose gender was not listed), races (40.0% Asian, 40.0% White, 10.0% Black or African American, and 10.0% Two or more races), and incomes (70.0% with less than $20,000, 20.0% with income between $20,000 and $49,999, and 10.0% who preferred not to answer).

Key Findings 

The following are summaries of key findings that appeared for each of the topics:

  • When examining the invitation letter, Respondents and Nonrespondents alike found the incentive to be the most important element in piquing their interest in participating in the NPSAS:24 survey.

  • Both Respondents and Nonrespondents were initially skeptical and did not consider the survey legitimate until they did their own research (internet searches; asking peers, school faculty, friends, or family; confirming the .gov domain). Once they had verified it in these ways, both groups considered the survey legitimate.

  • Respondents noted that a chatbot would be helpful for common questions on the NPSAS:24 survey, but that a human agent would be needed to answer more complex questions.

  • When it comes to communication with Respondents and Nonrespondents (e.g., contacting, interacting, or sending reminders), participants would most likely not answer calls from unrecognized numbers because they suspect fraud. They would be more likely to answer if the caller ID identified the call as coming from the U.S. Department of Education, though some would remain wary that it could be fraudulent or would ignore the call to avoid potential issues.

  • Nonrespondents generally take surveys if the survey’s purpose is appealing, an incentive is included, they have the time, and they can verify the survey’s legitimacy. Reasons Nonrespondents did not participate included procrastination, lack of motivation or forgetfulness, uncertainty about the survey’s purpose, and lack of time or availability.

  • Nonrespondents prefer receiving a check for their incentive payment because it gives them the highest level of security and peace of mind.

  • Nearly all of the participants preferred the “Click Here” button to start the survey.

  • Personalization and clear communication in survey reminders are important factors for encouraging participation in the NPSAS survey.

  • Nonrespondents suggested that the NPSAS:24 survey include weekly reminders, continue to be conducted via telephone, and personalize communications with a point of contact at, or the name of, the sample member’s institution to aid verification.

Background 

In collaboration with the National Center for Education Statistics (NCES), RTI International (RTI) is preparing to conduct the twelfth round of data collection for the National Postsecondary Student Aid Study (NPSAS), a nationally representative study of how students and their families finance education beyond high school. NPSAS comprises two parts: a survey completed by students and student data collected from institutions of higher learning. This study focuses on understanding the challenges faced by participants who were asked to complete the NPSAS:24 field test survey and on improving data collection tools and materials for NPSAS full-scale data collection.

Study Design 

Sample 

The sample comprised 10 postsecondary students who either took or did not take the NPSAS:24 survey. Focus groups and individual interview sessions were conducted remotely via Zoom. Participants were recruited from lists provided by RTI. Table 1 provides a demographic breakdown by respondent type (survey takers = Respondents; non-survey takers = Nonrespondents).

Table 1. Sample Demographics

Participant Demographics                       Percent
Respondent type
  Respondents                                  60.0
  Nonrespondents                               40.0
Gender
  Male                                         30.0
  Female                                       60.0
  Gender not listed                            10.0
Race
  Asian                                        40.0
  Black or African American                    10.0
  White                                        40.0
  Two or More Races                            10.0
Income
  Less than $20,000                            70.0
  $20,000–$49,999                              20.0
  $50,000 or more                               0.0
  Prefer not to answer                         10.0
Education Level
  Associate’s / Undergraduate-level classes    10.0
  Bachelor’s                                   50.0
  Master’s                                     40.0

Recruitment and Screening

The sample was obtained from two lists of individuals, provided by RTI, that included survey respondents and nonrespondents. Individuals on these lists did not need further verification of eligibility beyond confirming that they had the appropriate technology to participate (e.g., a laptop or computer with a working microphone and camera). For the first half of recruitment, EurekaFacts sent an email blast to the list of NPSAS:24 respondents and collected general availability for focus group sessions. Upon completion of the Respondent sessions, EurekaFacts received the second list, containing NPSAS:24 nonrespondents, from RTI and sent an email blast to collect general availability for focus group sessions. However, due to low availability and difficulty scheduling group sessions, EurekaFacts conducted one-on-one interviews with the Nonrespondent group. All recruitment materials, including but not limited to initial outreach communications, Frequently Asked Questions, reminder and confirmation emails, and informed consent forms, underwent OMB approval. The materials explained that participants would receive a virtual $75 e-gift card as a token of appreciation for their efforts.

Data Collection Procedure 

EurekaFacts conducted 90-minute focus groups and one-on-one interviews via Zoom between June 29, 2023, and August 18, 2023. Data collection followed standardized policies and procedures to ensure privacy, security, and confidentiality. Upon arrival in the virtual Zoom meeting room, participants were welcomed and instructed to set up their microphones and speakers. For most participants, written consent was obtained via email prior to the virtual focus group. Participants who did not return a consent form prior to joining the Zoom meeting were sent a friendly reminder via private message by a EurekaFacts staff member. The consent forms, which included the participants’ names, were stored separately from the focus group data and secured for the study. The consent forms will be destroyed three months after the final report is delivered.

Coding and Analysis 

The focus group and one-on-one sessions were audio and video recorded using the Zoom meeting-recording function. After each session, a EurekaFacts employee used NVivo software to upload and review a high-quality transcription of each session’s commentary and behaviors. The fully anonymized transcriptions and observations tracked each participant’s contributions from the beginning of the session to its close. One reviewer cleaned the data file by reviewing the audio/video recording to ensure all relevant contributions were captured. As the first step in data analysis, coders’ documentation of the focus group sessions included only records of verbal reports and behaviors, without interpretation.

Following completion of the NVivo analysis, a second analyst reviewed the work. Where differences emerged, the primary and secondary analysts discussed the participants’ narratives and their interpretations thereof, after which any discrepancies were resolved. The second reviewer (secondary analyst) also conducted a spot check of the data file to ensure quality and provide final validation of the data captured.

Once all the data were cleaned and reviewed, research analysts began the formal process of data analysis, which involved identifying major themes, trends, and patterns in the data and taking note of key participant behaviors. Specifically, analysts were tasked with classifying patterns within the participants’ ideas and documenting how participants justified and explained their actions, beliefs, and impressions. Analysts considered both individual responses and overarching group responses.

Limitations 

The key findings of this report are based solely on analysis of the observations and discussions of 10 participants (six Respondents and four Nonrespondents). All participants were administered questions gauging feedback on the communication materials, ways to improve communication, and survey participation. However, while all items were administered in the focus group sessions, not every participant responded to every probe, due to time constraints or technical difficulties, which limits the number of respondents providing feedback on any given question. The relatively small sample sizes often associated with qualitative studies limit the generalizability of results to larger populations.

However, the value of focus groups lies in their ability to elicit unfiltered comments and direct observations from a segment of the target population, the real users. While focus groups cannot provide absolute answers in all conditions, the sessions can play a key role in identifying areas where participants could encounter problems when taking the survey on their own. In addition, participants can react candidly to the communication methods they prefer, providing an accurate picture of how the communication materials and the survey’s ease of use are perceived, as well as insight into the user’s perception of the experience.

Findings 

Participation Engagement and Perceptions (Respondent and Nonrespondent groups)

Impression of Invitation Letters

Both the Respondent and Nonrespondent groups were asked about their impressions of the survey invitation letters sent by mail. The Respondent group was also asked about other factors influencing their survey participation, any additional information they expected or wanted in the invitation letters, and their preferred payment method.

“I really liked the compensation amount, and also at the bottom the email mentioned how the survey responses will represent experiences of other students enrolled at your institution. So, I thought there was an obligation.”

In both groups (Respondent and Nonrespondent), all participants (n=10) found that the invitation letters caught their attention, in a few different ways. Two Nonrespondents found the letter intriguing, while another Nonrespondent felt personally selected and called to action to participate in the survey. Many of the Respondent and Nonrespondent students (n=9) indicated the offered incentive also caught their attention.

In addition, the QR code linked to the survey and the layout of the survey also made an impression on a Respondent and a Nonrespondent alike (n=2).

Conversely, two Nonrespondents thought that the invitation letter appeared to be a scam because of their history of receiving emails with incentive offers and emails from unaffiliated universities. Figure 1 summarizes the initial impressions of the invitation letters from both groups.

Figure 1. Initial Impressions on Invitation Letter






Mention of Institution or Specific Person

“I didn't feel like it was very helpful because it also didn't say the institution, it felt very vague…I wasn't just really sure of the legitimacy of it all…”

When each participant was asked whether mention of their institution and/or a specific person from their institution (e.g., a financial aid counselor) would help them find the survey legitimate or would encourage them to complete it, five participants agreed that it would. Two participants were neutral on the matter. Conversely, two participants felt it would have a negative effect, making the letter appear to be a scam. One Nonrespondent did not have an opinion on this matter.

Participant Influence

When the Respondent group alone was asked if any other factors would influence their survey participation, one Respondent emphasized that the incentive, the survey topic, and the fact that it asked for their opinion on education influenced them to complete the survey. The remaining participants (n=5) did not state any additional influential factors.

Payment Method

As for their preferred payment method, four Respondents indicated they preferred PayPal. One Respondent, however, stated they preferred to receive the incentive by check due to their unfamiliarity with PayPal. Figure 2 provides Respondent preferences on payment method.



Figure 2. Respondent Preferred Payment Method



Survey Legitimacy

Both Respondent and Nonrespondent groups were asked about a range of factors regarding the legitimacy of the NPSAS:24 survey.

Concerns Regarding Legitimacy

The Respondent group alone was asked if they were concerned about the legitimacy of the NPSAS:24 survey prior to completing it; five participants said they were concerned at first. However, after examining various aspects of the survey (the email, the sender, the .gov domain), they found that the .gov domain gave the survey legitimacy. The remaining Respondent stated they were “not really concerned.”

Verifying Legitimacy

“I looked up [on Google] what the test [survey] was just to make sure... it came up with a survey that is said is done quite often, so I just figured that.”

When both groups were asked how they verified the legitimacy of the survey, two Respondents reported that they simply scanned the QR code and reviewed the first question before continuing with the rest of the survey. One Respondent relied solely on outside resources, such as friends, a college counselor, and other peers, to verify the survey’s legitimacy.

Two Nonrespondents used outside resources as well as the internet (e.g., Google) to verify the survey. One Nonrespondent used the internet as their only source. The remaining Nonrespondent declared they did nothing to verify the legitimacy of the survey.

Lingering Doubts About Legitimacy

The Respondent group alone was asked if they had any lingering doubts about the survey’s legitimacy. All Respondents reported that they did not; instead, they found that the survey questions aligned with what they saw in the invitation.

Legitimate Appearance

The Nonrespondent group was asked if they felt the survey was legitimate based on the various contact methods they received (emails, text messages, mailings, and phone calls). Two Nonrespondents indicated they found it legitimate due to either the professional email layout or the email address containing a .gov domain. However, one participant claimed that the consistent phone calls they received asking them to participate eventually made the survey feel legitimate, stating, “Maybe that they were calling, and it was a person, not really a bot or anything that seemed legitimate…it seemed like they really wanted me to do it. I get spam calls often as well from China or a fake car dealership…but they usually call once in a blue moon, so it was just kind of the consistency as well.”

The Use of Reddit, Wikipedia, or ChatGPT

Both groups were asked about their regular usage of websites like Reddit, Wikipedia, or ChatGPT. One Respondent said they use Reddit, Wikipedia, and ChatGPT frequently, ranging from daily to weekly use. Another Respondent uses both Reddit and ChatGPT routinely on a biweekly basis. Three Respondents each use only one of the sites (Reddit, Wikipedia, and ChatGPT, respectively). Meanwhile, the remaining Respondent does not use any of these online resources.

As for the Nonrespondents, one stated they use both Reddit and ChatGPT but clarified that they use Reddit for asking specific questions. Another Nonrespondent said they use Wikipedia and have tried ChatGPT, but not often. The remaining two Nonrespondents said they do not use any of these online resources. Figure 3 displays these results.

Figure 3. Using Reddit, Wikipedia, and ChatGPT



Using Online Resources to Verify NPSAS

“I think you could probably run the URL through it [ChatGPT] a couple of times and ask it some questions about it, whether it's legitimate...”

Both groups were asked if they would use Reddit, Wikipedia, or ChatGPT to verify the legitimacy of the NPSAS:24 survey. Three Respondents and one Nonrespondent said they would. One Respondent indicated that, although they would not know how to verify the survey using those sites, they would most likely use ChatGPT.

Another Respondent agreed and said they would probably copy and paste the email content into ChatGPT and see what results they got. Meanwhile, the remaining participants (n=4) reported they would not use any of those online resources to verify the NPSAS:24 survey, largely because they find the internet to be a generally untrustworthy environment.



Finding NPSAS on Online Resources

“I think it depends on the site. Probably on ChatGPT, I would expect to see more of what the survey is trying to accomplish...I just don't really believe much from Wikipedia since anyone can edit it. And then Reddit, I think I would see people's opinions maybe after taking it.”

When asked whether they would expect to find information about the NPSAS:24 survey on these online resources (Reddit, Wikipedia, ChatGPT), the two groups gave a mix of answers. Four Respondents and three Nonrespondents reported that they would expect to find information about NPSAS:24.

One Respondent said they would not expect to find information about NPSAS:24, while one Nonrespondent was unsure whether they would find information about the NPSAS:24 survey using those online resources.

Embedded Links and Buttons

Both the Respondent and Nonrespondent groups were asked whether they remembered receiving an email with a video link and whether they would click on such a link, along with the reasons for their choices, to gauge their preferences and engagement with video content in survey emails.

Remembering the Email Reminder

Across both groups, a few participants (n=4) did not recall receiving the email reminder containing the video link for the NPSAS:24 survey. Furthermore, five participants, a mix of Respondents and Nonrespondents, mentioned having a vague memory of the email but could not recall the video link within it. Only one Respondent distinctly remembered receiving the email with the embedded video link.

Clicking the Video Link

“I don't know if it's like a Trojan horse virus or something or I'm scared to click links in general, I feel like a lot of people I know have been hacked, especially on social media through clicking a link and somehow that hacks their account.”

Every participant in both groups (n=10) indicated their reluctance to click on the video link. To elaborate, three Nonrespondents voiced their reluctance based on the link's perceived sketchiness and doubts about its authenticity.

The Respondent who remembered both the email and the video link chose not to watch the video, citing concerns about potential computer viruses. Two other Respondents concurred, finding this reasoning to be acceptable. One Respondent explained, “Because I was afraid that if I click on it's going to like, I don't know, send some virus to my computer. So, I'm like, ‘Oh. No, thank you.’”

Impressions of the Video

All the Nonrespondents viewed the NPSAS video during the focus group sessions, yielding diverse opinions. One participant regarded the video as unremarkably ordinary, akin to numerous other informational videos, and concluded that it would not make them more inclined to complete the survey. Conversely, another participant expressed enjoyment and a heightened inclination to complete the survey because of the video. A film student among the Nonrespondents critiqued the video's subpar quality and use of ‘stock images,’ deeming it a poorly produced piece that would not motivate them to participate. Another Nonrespondent appreciated the video's reference to the U.S. Department of Education, though they remained neutral about whether the video would foster a greater inclination to engage with the survey.

Click Here Link versus Button

Participants were presented with two options to start a survey, a “Click Here” button and a “Click Here” link, and were asked which they would be more inclined to click. Figure 4 displays these results.



Figure 4. Click Here Link versus Button

A majority (n=9) of participants across both groups preferred the “Click Here” button over the hyperlink. The one Respondent who preferred the link explained that, in their perception, the link appeared less suspicious than the button. Three Respondents attributed their preference for the button to its noticeable size and distinctive color. Similarly, two Nonrespondents commented positively on the button's attention-grabbing visual characteristics, with one explaining, “I’ll say ‘Click Here Button’ just because it looks [more] pleasing than the second one.” The remaining participants (n=4) who favored the button over the link did not offer additional comments.

Clicking the Button to Start a Survey

Most participants (n=7) indicated they would use the button to start a survey. One Respondent attributed their choice to the button's straightforwardness. Conversely, another Respondent expressed distrust and thus would not use the button. Two participants (one Respondent and one Nonrespondent) were uncertain about using the button.

Expectation for a Government Survey Request

“Probably the ‘Click Here’ URL, just because, I don't know, it seems a little more formal to me, I guess. And you would think of the government as being formal, I guess...”

Regarding which option they would anticipate in a government survey request, most participants (n=7) expected to see a hyperlink. One Nonrespondent delved deeper, explaining that a hyperlink appeared more authentic.

The remaining three participants expected the “Click Here” button and offered their perspectives. A Nonrespondent noted that the government possesses the resources and capacity to create visually appealing graphics akin to the button. Another Nonrespondent mentioned that the button appeared more polished, leading them to assume that the government would present something of that nature. The third participant offered no additional comments.

In sum, a majority of participants (n=9) across both groups favored the “Click Here” button over the hyperlink. However, when it came to their expectations of a government survey request, most (n=7) anticipated seeing a “Click Here” link.

Contacting/Interacting or Reminders

The Respondent group was specifically asked about having a chatbot to help while completing the NPSAS survey. Both Respondent and Nonrespondent groups were asked about their response to calls with a U.S. Department of Education caller ID, their habits with unrecognized calls, and voicemail usage on their cell phones, including personalized messages.

Chatbot

All participants in the Respondent group agreed that they themselves would not have used a chatbot while filling out the survey. One participant stated, “It wouldn't hurt to have it there, but I honestly didn't need it.” However, participants did express that a chatbot would be a good supportive option for other students with common questions, with a live agent for complex questions. As one participant summarized, “Yeah, just for the most common questions, but it was like complex question, probably like a live agent would answer it.”

Registered Caller ID

Both groups were asked if they would respond to calls with a U.S. Department of Education caller ID. Across both groups, half of the participants (n=5) responded that they would answer such a call. The main reason was that a caller ID identifying the U.S. Department of Education suggested it would be an important call. One Respondent specifically emphasized the importance of caller identification: “Has a caller ID, yeah. If it has no caller ID, then no.” The other half responded that they would not answer a call from the U.S. Department of Education, even if the caller ID carried that identification; this includes one Nonrespondent who indicated they would “probably” not answer. Two main reasons were given for not answering: 1) suspicion that the call might be fraudulent or a scam, and 2) avoiding the call in case it was bad news (i.e., to avoid issues). Consistent with the first reason, one Respondent shared that they would verify that the phone number comes from the U.S. Department of Education by searching online, and then maybe call back.

Unrecognized Phone Number

“I only answer calls from people I don't recognize when I'm expecting a call from a number I don't recognize.”

Both the Respondent and Nonrespondent groups were asked if they would answer a phone call from an unrecognized phone number. Most participants from both groups (n=7) indicated that they would not, or would only sometimes, answer calls from an unfamiliar phone number, letting them go to voicemail. One Respondent shared that they usually answer calls unless they come from a 1-800 number. Another Respondent said they answer if they get irritated by multiple calls. However, both groups clarified that they would answer a call from an unknown source if expecting a business call or an interview follow-up. The few participants who do answer calls from unfamiliar phone numbers attributed it to their curious nature or their field of work (e.g., Resident Assistant in college).

Voicemail Usage

The final question asked of both groups concerned voicemail usage on their cell phones, including personalized messages. Across both groups, almost all participants (n=9) acknowledged having voicemail set up on their cell phones. The groups were also asked whether their voicemail message identifies them. Of the participants able to respond, one Respondent acknowledged that their voicemail identifies them, three participants do not have a personalized voicemail (using the default message instead), and four participants do not recall whether their voicemail message is personalized or identifies them.



Survey Participation Factors (Nonrespondents only)

Reasons for Not Participating in the Survey

The Nonrespondent group was asked about their reasons for participating in surveys generally. Nonrespondents were also asked their reasons for not participating in the NPSAS:24 survey and what could have influenced them to change their decision.

General Survey Participation

Nonrespondents noted that the main factor in general survey participation is the survey’s purpose, such as personal or professional interest, matching priorities (i.e., a call to action), and motivation. For example, one participant stated, “I think it depends on something also that I would have an interest in. I would definitely be more inclined to take a survey that sounds interesting to me.” Another participant expressed, “If it's something that seems like my opinion will be important, I'll be more likely to participate.”

The second influential factor is the incentive or compensation promised for the survey's completion. A third factor is verification from a trusted source about the survey, like a friend. Lastly, one participant also mentioned that availability or time to complete the survey is an important factor in participating in a survey. Figure 5 demonstrates the factors that influence survey participation for Nonrespondents.

Figure 5. Main Factors that Influence Survey Participation


Not Participating in NPSAS:24

Nonrespondents cited several reasons for not participating in the NPSAS:24 survey: procrastination, uncertainty about the survey’s purpose, and lack of time. Regarding procrastination, one participant explained, “I actually did want to take it. I will say I was going to take it. I actually, I got lazy, and I just forgot about it, and I didn't take it.” Another participant emphasized the importance of knowing the survey’s purpose: “Like I said, it's okay that the reward is there, which should be there. It's not like... But what other than that am I getting from this interview, also? How is this study going to help? Something like that, knowing what the study is for maybe.” The final factor for not participating was a lack of time or availability. As one participant explained, “Lack of just time or inconvenience of time when I was being called and it just appearing not that important in the email and the clutter of emails, kind of just getting lost with that as well.”

Nonrespondents were then asked what could be done to change their mind about partaking in the NPSAS:24 survey. One participant said that if the survey were conducted via telephone interview, they would partake in it. Another participant reiterated that if the communication were more personalized (e.g., identifying the point of contact or institution), it might be enough to sway their decision. A different participant agreed that communication should be personalized and added the importance of schools spreading the message, sharing, “Because kind of where I would stay in the loop at school through just a lot of posters, flyers in the halls, in the dorms, in the academic buildings, and whatnot. Those really caught my attention as well.” This participant also expressed the importance of verification not only from a trusted institution source but also from a fellow student. The last participant stated that weekly reminders might have helped them remember to take part in the survey.

In sum, Nonrespondents generally take surveys if the survey’s purpose is appealing, if an incentive is offered, if they have the time, and if they can verify the survey’s legitimacy. Participants suggested that the NPSAS:24 survey include more frequent (weekly) reminders, be conducted via telephone, and personalize communications with a point of contact at, or the name of, the sample member’s institution to aid verification.

Incentives

On this topic, Nonrespondents gave their opinions on the impact of monetary rewards on survey completion decisions, examined their effect on the perceived legitimacy of survey requests, and discussed their preferred methods of receiving such incentives.

How Influential Are Monetary Incentives

Two participants emphasized the substantial influence of monetary incentives, while the remaining participants regarded them as somewhat persuasive. One participant, highlighting their role as a college student seeking convenient ways to earn money, stated, “I believe it's influential. I think that a broke college student or any college student, anyway, would view it as a means to assist and an easy way to earn extra cash.”

Monetary Incentives Leading to Perceived Survey Legitimacy

All four participants concurred that monetary incentives lend a sense of legitimacy to a survey. Two participants specifically noted that compensating individuals for completing a task enhances its perceived legitimacy. However, one participant highlighted the caveat that legitimacy hinges on not requesting bank account information. A Nonrespondent offered a nuanced perspective, stating that monetary incentives generate both a heightened sense of legitimacy and a contradictory feeling of reduced legitimacy simultaneously. The Nonrespondent explained, “Yeah. The initial response is usually monetary incentives to make it less legitimate. But after a little bit of digging, usually the monetary incentive increases legitimacy after there are other indicators of legitimacy.”

Nonrespondent Expectations and Preferences for Receiving Incentives

One Nonrespondent participant mentioned PayPal, citing its prevalence in survey-related transactions. Another participant anticipated a traditional bank transfer, expressing mistrust toward electronic third-party options. Two participants expected to receive a check, as they viewed electronic bank and third-party applications as methods typically reserved for transactions among friends. One participant who expected a check specifically stated, “I would be less likely to expect those [Zelle, Apple Pay, Venmo, PayPal], because in my opinion, they're more of personal transfers between people that know each other and friends. And yeah, I don't know. I would expect it to be more of something like a payroll type thing. I don't know why, but that's just where my brain goes.”

Regarding their preference for receiving incentives, three of the Nonrespondent participants favored receiving a check. One participant highlighted the security aspect, stating that checks are challenging to counterfeit. Another participant appreciated the simplicity of checks compared to electronic bank transfers. The remaining participant initially expressed a willingness to consider all options, including checks and electronic third-party applications. However, they ultimately settled on PayPal as their preferred choice.

Participants were asked to express their preferred method of payment for completing government surveys among three options: a gift card (with a choice of several options), a check, or a PayPal payment. Note that not all response options were selected by Nonrespondent participants, which is why the gift card option is not displayed in Figure 6.


Figure 6. Payment Preference for Government Surveys






Nonrespondents were asked whether they would prefer an electronic payment over receiving a check. Preferences were diverse among the four participants: one favored Zelle, another preferred PayPal, a third opted for e-gifts, and the remaining participant exclusively favored receiving a check.

Open Discussion (Nonrespondents only)

Nonrespondents were asked to identify any overlooked factors that might have influenced their decision to complete the NPSAS survey. They were also prompted to pinpoint crucial aspects that could encourage others to participate and to highlight educational issues they believe policymakers should consider based on the survey's results.

Nonrespondents offered various suggestions regarding getting more individuals to complete the NPSAS survey. Two participants recommended personalizing the reminder email and articulating the importance of survey completion within the email content. Another suggested enhancing the subject line's appeal and visibility by including the incentive amount. The final participant did not propose any specific recommendations.

Regarding the most crucial factors to encourage participation among sample members, participants generally had limited input. One participant suggested personalizing the communication materials by including the recipient's institution or name. Another participant emphasized promoting the opening and viewing of the NPSAS video as a helpful tactic. However, the remaining two participants did not provide any specific advice or suggestions.

In response to the question about issues related to their education that they would like policymakers and lawmakers to be aware of, participants offered diverse insights. Financial challenges emerged as a common concern, with one participant emphasizing the burden of inflation and the struggle to manage bills alongside education expenses. Another participant underscored the necessity of providing food and shelter at affordable rates, particularly for international students adjusting to new lifestyles. Additionally, participants stressed the significance of making the college experience accessible to a broader demographic, emphasizing the value of the social and experiential aspects of education. These insights collectively call for a multifaceted approach by policymakers, considering financial assistance, basic needs provisions, and inclusivity in higher education policy planning.



NPSAS:24 Survey Experience (Respondents only)

Respondents were asked about their general survey experience with NPSAS:24, including their impressions of the survey, filling out the survey, and overall feedback.

First Impression of NPSAS:24

All Respondents (n=6) expressed that they found the NPSAS:24 survey to be typical or common of what they expect of a survey, further explaining that overall, it was easy, simple, and doable. As one participant said, “Yeah, it's just questions that as if I were to fill out my financial aid form. It's pretty general.” Another expressed, “My experience taking it was pretty easy and simple to understand when I was taking the survey, so it didn't feel like 30 minutes when I was doing it.”

“Because it's not hard and I'm answering... It felt like a reflection to my life, literally my spouse, my income. Yeah, as if they're telling me to reflect on how I'm doing so far.”

Respondents were further asked whether the NPSAS:24 survey was interesting, engaging, boring, or repetitive. On the positive side, one participant found the survey interesting and three found it engaging (including describing it as “fun”). Conversely, two participants found the survey boring, and one found it repetitive. One participant expressed a neutral sentiment: “I'm just normal. I've probably answered these same questions.” Overall, participants acknowledged that the NPSAS:24 survey was simple and like any other survey.

Experience During the Survey

Respondents were asked directly if they had trouble understanding any questions on the NPSAS:24 survey. One participant said they had trouble with the annual income question: “Just the annual income is just probably harder for people where we're at in life. I mean, we worked maybe during the school year a little bit and then more during the summer. So, none of us are probably on salary so it's kind of just a question you got to think about.” In addition to the annual income question, a few participants had briefly admitted in earlier responses to some difficulty with the financial aid questions. Furthermore, when asked directly, participants admitted to guessing on certain questions: two acknowledged guessing on the income question, and one on the financial aid question.

Participants were then asked if they looked up answers or referred to documents when answering questions. Three participants admitted to looking up or referring to documents when answering questions. One participant had to refer to their student transcript to recall the date when they started school. Another participant looked up their wages. The third participant also looked back at an account to recall the exact income amount. When further probed if looking up this information was bothersome, the participant indicated that it was not.

The final question asked of all participants was how long it took them to complete the NPSAS:24 survey. According to their estimates, the six participants took an average of 27.5 minutes to complete the survey, with a minimum of 15 minutes and a maximum of 40 minutes.

Additional Feedback

When asked for any additional feedback, three participants responded, mostly with compliments on the survey. One participant complimented the survey as “thoughtful.” Another emphasized the need for the materials to communicate the objective or mission of the survey: “if the mission word stated more explicitly or an email or if I just know exactly how this is going to benefit other people or other students, other future students.”

In sum, Respondents thought that the NPSAS:24 survey was similar to other surveys they had filled out before and was easy to complete. Some had to reference documents, but they were still able to complete the survey within the estimated time.

Closing Comments

Across both groups, participants were asked if there were any other points or opinions they would like to share. One Respondent suggested making the survey less repetitive. Another Respondent felt that personalizing communications would make the survey more enticing to complete, stating, “…if there's a way to also get an email from somebody at my school giving a little information about it. If I got information from more than just one email about it, I think it would've caught my eye sooner.” Meanwhile, the remaining participants (n=6) said they had no further points or opinions.

When Respondents and Nonrespondents were asked if they had any questions for the moderator, all participants indicated they did not. However, two Nonrespondents commented that they liked that the interview was conducted via Zoom rather than over the phone, making it more personal and easier to give the moderator their full attention. This connects back to the earlier point that offering the option to complete the survey by telephone could yield more responses.


