
2019–20 NATIONAL POSTSECONDARY STUDENT AID STUDY (NPSAS:20)



Appendix G

Student Data Collection: Audience Assessment/Focus Group Summaries



OMB # 1850-0666 v.25

Submitted by

National Center for Education Statistics

U.S. Department of Education




August 2019



Contents

Online Pretesting Summary

Incentive Structure

Contacting Preferences and Legitimacy

Recruitment Materials and Messaging

In-Person Focus Group Summary

Background

Executive Summary

Sample

Study Design

Recruitment and Screening

Data Collection Procedure

Coding and Analysis

Limitations

Findings

Topic 1: Introduction to Survey

Topic 2: First Year Education Experiences (FTB only)

Topic 3: Paying for Education

Topic 4: School Jobs (Graduate only)

Topic 5: Employment Experiences (General only)

Topic 6: Housing and Food Experiences

Topic 7: Incentives

Topic 8: E-mail Legitimacy

Topic 9: Closing

Recommendations for the Full-scale Study

References

Tables

Table 1. Number of survey tryout and crowdsourcing participants, by demographic characteristics: 2019

Table 2. Number of in-person focus group participants, by student type and demographic characteristics: 2019

Figures

Figure 1. Sample question on interactions outside of class

Figure 2. Reported classification of a “State-borrowed Student Loan”

Figure 3. Sample question on school-related jobs

Figure 4. Sample question on employment details

Figure 5. Sample questions of a dichotomous rating scale on food experiences

Figure 6. Preferred reference period for places slept




Background

Pretesting in preparation for the 2019–20 National Postsecondary Student Aid Study (NPSAS:20) full-scale student survey, data collection materials, and incentive plans consisted of several components: 1) survey tryouts, 2) an online opinion survey administered on a crowdsourcing platform to participants currently or recently enrolled in postsecondary education, 3) online focus groups, and 4) in-person focus groups with students enrolled in the 2018–19 academic year. Full details of the pretesting components were approved in December 2018 and March 2019 (OMB# 1850-0803 v.243 and 247).

In this appendix, results from the online pretesting methods are summarized first. Then, a detailed report of the in-person focus group design, including sampling and recruitment, data processing, and findings, is provided.

Online Pretesting Summary

Design choices and attributes of survey recruitment materials have been shown to substantially affect survey participation (e.g., Groves et al. 1992; Groves and Heeringa 2006; Lynn 2016; Lynn 2017); well-designed materials can therefore reduce the potential for nonresponse bias and lower survey costs. The NPSAS:20 survey tryouts, focus groups, and crowdsourcing survey provide in-depth insight, in preparation for the full-scale study, into sample members’ preferences regarding incentive structures, contacting methods, and the design of recruitment materials.

Online pretesting included 300 participants who completed the survey tryouts, 47 participants who took part in one of three online focus groups, and 1,200 “workers” on Amazon’s MTurk online crowdsourcing platform who completed a questionnaire about the NPSAS:20 data collection materials (referred to as crowdsourcing). Of the survey tryout respondents, 58 percent were female, compared to 42 percent in the crowdsourcing survey (see table 1). Over half (63 percent) of the survey tryout respondents were between the ages of 18 and 24, compared to about one-third (37 percent) of crowdsourcing respondents. Among respondents completing the tryout survey, 26 percent were working toward a sub-bachelor’s degree, 50 percent toward a bachelor’s degree, and 24 percent toward a post-bachelor’s degree. In the crowdsourcing survey, 21 percent were working toward a sub-bachelor’s degree, 51 percent toward a bachelor’s degree, and 28 percent toward a post-bachelor’s degree.

Table 1. Number of survey tryout and crowdsourcing participants, by demographic characteristics: 2019

Demographic characteristics     Survey tryout (in %)     Crowdsourcing (in %)

Gender
  Female                        58                       42

Age
  18-24                         63                       37
  25 or older                   37                       63

Degree type
  Sub-bachelor’s degree         26                       21
  Bachelor’s degree             50                       51
  Post-bachelor’s degree        24                       28

Incentive Structure

The questions around incentive structure focused on which incentive type would motivate sample members to participate, payment preferences, and potential alternatives to monetary incentives.

Responses to an open-ended question about sample members’ incentive preferences indicated that 51 percent of all survey tryout respondents would prefer some type of monetary incentive, 46 percent would be motivated by a gift card, and 10 percent would prefer another incentive (e.g., food); 24 percent of all respondents chose not to respond. For the same open-ended question in the crowdsourcing survey, the results suggest that crowdsourcing respondents favor monetary incentives (88 percent) over gift cards (7 percent) or other types of incentives (5 percent); only 2 percent of all respondents did not answer this question.

In terms of payment preferences, 52 percent of the survey tryout respondents prefer payment via prepaid credit card, followed by payment via PayPal (32 percent), check (10 percent), or another form (6 percent). For crowdsourcing respondents, PayPal is the preferred payment mechanism (66 percent), followed by other payment mechanisms such as MTurk (14 percent), prepaid credit card (12 percent), and check (8 percent).

Among the potential alternatives to monetary incentives, survey tryout respondents would most like to receive an online shopping subscription (27 percent), followed by a snack or food delivery subscription (19 percent), an online catalog to choose from (17 percent), an entertainment subscription (16 percent), and other gifts and services (combined 21 percent). Crowdsourcing respondents ranked entertainment subscriptions highest (35 percent), followed by online shopping subscriptions (17 percent), a snack or food delivery subscription (16 percent), an online catalog (15 percent), and other gifts and services (combined 17 percent). When asked why they selected a particular nonmonetary incentive, crowdsourcing respondents often cited a preference for having a choice. This may explain why 44 percent of all crowdsourcing respondents would select money over a nonmonetary incentive, even if the amount of money offered is lower than the value of the nonmonetary incentive. When probed on which nonmonetary incentive would be a better alternative to money, 68 percent of the crowdsourcing respondents reiterated that money or gift cards are their preference, and only 43 percent mentioned a true nonmonetary alternative. Furthermore, when asked how they would respond if the incentive were a service or subscription they had already subscribed to, only 37 percent of crowdsourcing respondents said they would be very likely to complete the survey.

Contacting Preferences and Legitimacy

According to the survey tryout and crowdsourcing surveys, the best method to contact respondents is e-mail (survey tryout: 93 percent; crowdsourcing: 87 percent), compared to text messaging (survey tryout: 4 percent; crowdsourcing: 7 percent), calls to cell phones (survey tryout: 3 percent; crowdsourcing: 2 percent), or mailed materials (survey tryout: <1 percent; crowdsourcing: 2 percent). None of the survey tryout respondents selected a call to a home phone, whereas 2 percent of the crowdsourcing respondents did. When asked for their reaction to a text message reminder to complete a survey, the majority of survey tryout respondents would use it (77 percent) while only 17 percent would ignore it. Additionally, only 7 percent of all survey tryout respondents would be upset by the text message reminder, 4 percent said they would block the sender, and 2 percent would report the sender as spam. Among crowdsourcing respondents, 73 percent reported that they would use a text message reminder while 19 percent would ignore it; 11 percent would block the sender, 11 percent would be upset, and 9 percent would report it as spam. Text messaging was chosen as the preferred method of contact in the online focus groups, but e-mail was perceived as the most official method of contact. E-mail and text were chosen as the best reminders to complete a survey; both were perceived as non-intrusive. By contrast, phone call reminders were perceived as intrusive, which aligns with the low preference for phone calls in the surveys. Participants noted that they were more likely to answer a call from an unrecognized number if they were expecting a call.

Turning to e-mail legitimacy, the majority of crowdsourcing respondents decide whether an e-mail is legitimate based on the sender’s e-mail address, name, and signature, as well as the look and feel of the e-mail (grammar, links, etc.). Sixty-eight percent of all crowdsourcing respondents would expect to see an unsubscribe link in an e-mail sent on behalf of a governmental agency, and 57 percent prefer HTML over plain text e-mails. Only 17 percent of the crowdsourcing respondents report difficulty viewing HTML e-mails.

Recruitment Materials and Messaging

In the crowdsourcing survey, respondents were asked to rank four different envelope designs: 1) a standard envelope used in past studies (serving as a control), and three envelope designs with a printed message hinting at cash being enclosed: 2) “$2 Gift Enclosed. See details inside” (referred to as the text envelope below); 3) “Your next cup of coffee is on us. See details inside” (referred to as the coffee envelope below); and 4) an envelope with a $2 bill printed in the corner (referred to as the bill envelope below). Of all respondents, 32 percent said that they would be most likely to open the control envelope, while 33 percent said they would be least likely to open it. The text envelope performs similarly well to the control in terms of likelihood of opening (31 percent very likely) but performs better overall, since only 7 percent of all respondents report that they are least likely to open it. The coffee and bill envelopes perform considerably worse than the former two. Seventy-nine percent of all respondents report that they are at least very likely to actually open the envelope they ranked highest.
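For illustration only, the “most likely” and “least likely” figures above can be read as the shares of respondents placing a design first or last in the ranking task. The minimal sketch below shows one way such a tabulation could be produced; the design labels, example rankings, and the assumption that the percentages come directly from first/last ranks are ours, not a description of the actual analysis.

from collections import Counter

# Hypothetical data: each respondent ranks the four designs from most to least likely to open.
# The design labels mirror the shorthand used in this section.
DESIGNS = ["control", "text", "coffee", "bill"]

rankings = [
    ["text", "control", "bill", "coffee"],
    ["control", "text", "coffee", "bill"],
    ["bill", "text", "control", "coffee"],
]

top_ranked = Counter(r[0] for r in rankings)      # ranked first = "most likely to open"
bottom_ranked = Counter(r[-1] for r in rankings)  # ranked last = "least likely to open"

n = len(rankings)
for design in DESIGNS:
    print(f"{design:8s} most likely: {top_ranked[design] / n:.0%}   "
          f"least likely: {bottom_ranked[design] / n:.0%}")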

In addition to the study envelopes, we investigated the content of the study brochures in the crowdsourcing survey as well as alternative formats in the online focus groups. Regarding the content of the brochures or pamphlets, there are three topics that respondents would like to see mentioned: the purpose of the study (62 percent), the study content (43 percent), and privacy and confidentiality (32 percent). This ranking also reflects the relative importance respondents assigned to these three topics. Fifty-four percent of respondents report that they are at least very likely to actually read a brochure or pamphlet received in the mail.

Participants of the online focus groups were shown four different versions of reminder postcards. The two favored choices were postcards that featured people (animated or photos). The color schemes of the two postcards were also mentioned as attention-getting. Participants were also asked what messages in the postcards would make them complete the survey. Many mentioned that the messaging around the incentive and the short amount of time required to complete the survey, along with the statement that their individual contribution could make a difference, were features that would make them complete the survey. Similarly, when asked what was missing from the postcards, participants reiterated that emphasizing the incentive, the time expectations, and how participation would make a difference would make the cards even more convincing.

Participants of the online focus groups were shown a series of introductory statements typically presented in a lead letter and asked which stood out the most and would motivate them to participate in the survey. The majority selected a message that congratulated them for being selected, included their name in the message, and explained the importance of the study. Many noted that the congratulatory introduction set a positive tone, made them feel special, and built further interest to read on.

Similarly, online focus group participants were shown several explanatory statements about the NPSAS:20 study. The most frequently selected statement included information about how the student would represent other students and why his or her participation was critical to the success of the study. Participants also noted that specifically mentioning the student’s school increased relevance and made the message more personal.

Finally, online focus group participants were shown several participation expectation statements and asked to select the one that stood out and would motivate them to participate. A clear preference was expressed for a statement specifying the incentive and time associated with the request to complete the survey.

In terms of visual design, participants were shown the same lead letter with variations in color, emphasis of the message (i.e., standard paragraph vs. bullet points), and QR code color and size. The majority of participants selected the design where the incentive message and QR code stood out in a different color from the rest of the text. Participants found the letters to be well formatted and official looking. Most participants could not find anything to dislike in terms of visual presentation.

Among the online focus group participants, personalization (respondent’s name and school) and mention of compensation were suggested as elements of an e-mail subject line that would encourage someone to open the message. Cash and gift cards were the most frequently mentioned incentives.

In-Person Focus Group Summary

Background

RTI International, on behalf of the National Center for Education Statistics (NCES) within the U.S. Department of Education, contracted with EurekaFacts to conduct in-depth, in-person focus groups with postsecondary students to obtain feedback for refining the NPSAS:20 full-scale student survey and recruitment processes. The focus groups collected feedback from students in the following areas:

  • the comprehensibility of survey terms and items;

  • the thought processes participants utilize to answer survey items;

  • the extent to which items can be satisfactorily answered with the given response options;

  • the overall functionality of the survey and its tools;

  • the appeal of non-monetary incentives; and

  • the prevailing attitudes toward survey-related communications.

Executive Summary

Sample

A total of 42 students from among the 300 respondents who completed the survey tryouts participated in five in-person focus groups (ranging from six to ten participants per group) between February 2019 and April 2019. Of these, approximately 62 percent were female and 38 percent were male. Nearly half (48 percent) were between the ages of 18 and 24, and 52 percent were 25 or older. Furthermore, nearly half of the participants (48 percent) identified as White, 45 percent identified as Non-White, and 7 percent preferred not to answer. Among the participants, 38 percent were working toward a sub-bachelor’s degree, 38 percent toward a bachelor’s degree, and 24 percent toward a post-bachelor’s degree. Seventeen participants (40 percent) were first-time beginning (FTB) students, 10 (24 percent) were graduate students, and 15 (36 percent) were non-FTB and non-graduate, a type of student hereafter referred to as “general.” FTB students were those who, in the 2018–19 academic year (July 1, 2018 – June 30, 2019), were attending their first postsecondary institution and working on their first postsecondary degree or certificate since completing high school. Table 2 provides a summary of participants’ demographics by student type.

Table 2. Number of in-person focus group participants, by student type and demographic characteristics: 2019

Demographic characteristics     Total (%)     FTB1 (%)

Gender
  Female                        62            47

Age
  18-24                         48            76
  25 or older                   52            24

Race
  White                         48            53
  Non-White                     45            47

Degree type
  Sub-bachelor’s degree         38            65
  Bachelor’s degree             38            35
  Post-bachelor’s degree        24            0

1 FTB students are first-time beginning students, meaning they have not enrolled in postsecondary education prior to the 2018-2019 academic year.

Study Design

Recruitment and Screening

To qualify for participation, each respondent had to be enrolled in a college, university, or trade school between July 1, 2018, and the start of testing (April 2019), and live within commuting distance of Rockville, Maryland. Postsecondary students were stratified into three student types based on their status within their college, university, or trade school: FTBs, graduate students, and general students.

EurekaFacts utilized an internal panel of individuals as well as targeted recruitment and in-person outreach to individuals aged 18 to 65 years old in the Washington, DC metro area. Recruitment materials and advertisements were distributed across social media platforms such as Facebook and Instagram. In-person outreach and canvassing at public and private 2- and 4-year institutions in the Washington, DC metropolitan area were also conducted.

To ensure that respondents met the qualification criteria, all potential participants completed a screening survey. Participants either self-screened, using an online web-intake form, or were screened by EurekaFacts staff over the phone. All participants were screened using a screener script programmed into CATI-like software (Verint) to guarantee that the screening procedure was uniformly conducted and instantly quantifiable. During screening, all participants were provided with a clear description of the research, including its burden, confidentiality protections, and an explanation of any potential risks associated with their participation in the study. Qualified participants whose self-screener responses fully complied with the specified criteria were then contacted by phone or e-mail and scheduled to participate in a focus group session. Eligible participants who were screened by EurekaFacts staff over the phone were scheduled at the time of screening.
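As an illustration of the eligibility and stratification rules described above, a minimal sketch follows. The field names, the exact date used for the start of testing, and the function structure are hypothetical assumptions for illustration; they do not describe the actual Verint screener logic.

from datetime import date
from typing import Optional

ENROLLMENT_WINDOW_START = date(2018, 7, 1)   # July 1, 2018
TESTING_START = date(2019, 4, 1)             # start of testing (April 2019), assumed date

def screen(record: dict) -> Optional[str]:
    """Return the student stratum for an eligible respondent, or None if ineligible.

    `record` uses hypothetical field names for enrollment dates, a commuting-distance
    flag, and student-type flags collected by the screener.
    """
    # Eligibility: enrolled at some point between July 1, 2018 and the start of testing,
    # and living within commuting distance of Rockville, MD.
    enrolled_in_window = (record["enrolled_from"] <= TESTING_START
                          and record["enrolled_to"] >= ENROLLMENT_WINDOW_START)
    if not (enrolled_in_window and record["within_commuting_distance"]):
        return None
    # Stratification into the three student types used for the focus groups.
    if record["first_time_beginner"]:
        return "FTB"
    if record["graduate_student"]:
        return "Graduate"
    return "General"

# Example: a hypothetical first-time beginning student near Rockville, MD.
print(screen({
    "enrolled_from": date(2018, 8, 27),
    "enrolled_to": date(2019, 5, 15),
    "within_commuting_distance": True,
    "first_time_beginner": True,
    "graduate_student": False,
}))  # -> FTB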

Participants were recruited to maintain a good mix of demographics, including gender, race/ethnicity, and degree type, as shown in table 2. Each focus group included 6 to 10 participants, for a total of 42 participants across five focus groups. All focus group participants received a $90 incentive as a token of appreciation for their efforts.

Ensuring Participation

To ensure maximum “show rates,” participants received a confirmation e-mail that included the date, time, and location of the focus group, along with a map and directions to the EurekaFacts office. Participants also received confirmation information via postal mail. Additionally, participants who were scheduled more than two days prior to the date of their session received a reminder e-mail 48 hours before the focus group session. All participants received a follow-up e-mail confirmation and a reminder telephone call at least 24 hours prior to their focus group session to confirm participation and respond to any questions.

Data Collection Procedure

EurekaFacts conducted five 90-minute focus groups at their research facility in Rockville, MD, between February and April 2019. The student survey was designed to ask a different set of questions to each of the three different types of postsecondary students. The five focus groups were arranged by student type so that each group was only administered probes relevant to the version of the survey they received. The three types of postsecondary students were:

  • First-time beginning students (FTB) (2 sessions)

  • Graduate students (1 session)

  • General students (non-FTB, non-graduate, 2 sessions)

Data collection followed standardized policies and procedures to protect the privacy of participants and the security of their information. Upon arrival at the EurekaFacts office, participants were welcomed and asked to sign in. Written consent was then obtained, and participants were provided with an ID card containing a unique link to the survey to be used during the session. Each survey link was created using a unique identifier and did not incorporate any part of the participant’s name or any other identifying information. The consent forms (which did include the participants’ names) were stored separately from the focus group data and were secured for the duration of the study.

Focus Group Procedure

At the scheduled start time of the session, participants were escorted to the focus group room and introduced to the moderator. Participants were then reminded that they were providing feedback on the NPSAS:20 survey and reassured that their participation was voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law.

Focus group sessions progressed according to a moderator guide. Participants were informed that the focus group session would take up to 90 minutes to complete and would be divided into two sections: an online survey and a period of group discussion. After participants completed the 30- to 40-minute survey, the moderator guided participant introductions before initiating the topic area discussion.

The moderator guide was used to steer discussion toward the specific questions of interest within each topic area. The focus group structure was fluid and participants were encouraged to speak openly and freely. The moderator used a flexible approach in guiding the focus group discussion, as each group of participants was different and required different strategies.

The following topics were discussed in the focus groups:

  • Topic 1: Introduction to Survey

  • Topic 2: First Year Education Experiences (FTB only)

  • Topic 3: Paying for Education

  • Topic 4: School Jobs (Graduate only)

  • Topic 5: Employment Experiences (General only)

  • Topic 6: Housing and Food Experiences

  • Topic 7: Incentives

  • Topic 8: E-mail Legitimacy

  • Topic 9: Closing

The goals of the discussion were to gain insight about the participants’ experiences taking the survey in order to identify problematic questions, terms, or response options; determine how respondents interact with the survey using their own devices; and explore how interested participants may be in receiving non-monetary incentives.

At the end of the focus group session, participants were thanked, remunerated, and asked to sign a receipt for their incentive payment.

Coding and Analysis

The focus group sessions were audio and video recorded using IPIVS recorder. In addition, during each session, a live coder documented the main themes, trends, and patterns raised during the discussion of each topic and noted key participant behaviors in a standardized datafile. In doing so, the coder looked for patterns in the ideas expressed, associations among ideas, justifications, and explanations. The coder considered both individual responses and group interaction, evaluating participants’ responses for consensus, dissensus, and resonance. The coder’s documentation of participant comments and behavior included only records of participants’ verbal reports and behaviors, without any interpretation. This format allowed easy analysis across multiple focus groups.

Following each focus group, the datafile was reviewed by three reviewers. Two of these reviewers cleaned the datafile by reviewing the audio/video recording to ensure all themes, trends, and patterns of the focus group discussions were consistent with those captured in the data. In cases where differences emerged, these first two reviewers discussed the participants’ narratives and their interpretations, after which any discrepancies were resolved. After the first two reviewers completed comprehensive reviews of the data, the third reviewer conducted a spot check of the datafile (i.e., selected a subset of cases at random to review) to ensure quality and final validation of the data captured.

Once all the data was cleaned and reviewed, it was analyzed by topic area using the following steps:

  1. Getting to know the data – Several analysts read through the datafile and listened to the audio/video recordings to become extremely familiar with the data. Analysts recorded impressions, considered the usefulness of the presented data, and evaluated any potential biases of the moderator.

  2. Focusing on the analysis – The analysts reviewed the purpose of the focus group and research questions, documented key information needs, focused the analysis by question or topic, and focused the analysis by group.

  3. Categorizing information – The analysts identified themes, trends, or patterns.

  4. Developing codes – The analysts developed codes based on the emerging themes to organize the data. Differences and similarities between emerging codes were discussed and addressed in efforts to clarify and confirm the research findings.

  5. Identifying patterns and connections within and between categories – Multiple analysts coded and analyzed the data. They summarized each category, identified similarities and differences, and combined related categories into larger ideas/concepts. Additionally, analysts assessed each theme’s importance based on its severity and frequency of reoccurrence (an illustrative sketch of this kind of tallying follows this list).

  6. Interpreting the data – The analysts used the themes and connections to explain findings and answer the research questions. Credibility was established through analyst triangulation, as multiple analysts cooperated to identify themes and to address differences in interpretation.
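To illustrate the categorizing and tallying described in steps 3 through 5, the minimal sketch below shows one way coded themes from multiple sessions could be combined and ranked by how often and how widely they recur. The theme labels, session names, and data structure are hypothetical; they are not the actual EurekaFacts codebook or datafile.

from collections import Counter
from itertools import chain

# Hypothetical coded datafile: one list of theme codes per focus group session.
coded_sessions = {
    "FTB-1": ["survey_length", "question_sensitivity", "prefers_cash_incentive"],
    "FTB-2": ["survey_length", "guided_research_confusion"],
    "Graduate-1": ["question_sensitivity", "prefers_cash_incentive"],
    "General-1": ["survey_length", "prefers_cash_incentive"],
    "General-2": ["question_sensitivity"],
}

# Total mentions of each theme across all sessions.
mention_counts = Counter(chain.from_iterable(coded_sessions.values()))

# Number of distinct sessions in which each theme appeared (a rough "reoccurrence" measure).
session_counts = Counter(
    theme for codes in coded_sessions.values() for theme in set(codes)
)

for theme, mentions in mention_counts.most_common():
    print(f"{theme}: {mentions} mentions across "
          f"{session_counts[theme]} of {len(coded_sessions)} sessions")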

Limitations

The key findings of this report were based solely on notes taken during and following the focus group discussions. Additionally, some focus group items were administered to only one or two groups. Even when items were administered to all focus groups, not every participant responded to every probe, due to time constraints and the voluntary nature of participation, which limited the number of respondents providing feedback.

Moreover, focus groups are prone to social desirability bias, in which some participants agree with others simply to “be accepted” or “appear favorable” to others. While it is impossible to prevent this entirely, the EurekaFacts moderator instructed participants that consensus was not the goal and encouraged them to offer different ideas and opinions throughout the sessions.

Qualitative research seeks to develop insight and direction, rather than obtain quantitatively precise measures. The value of qualitative focus groups is demonstrated in their ability to provide unfiltered comments from a segment of the targeted population. While focus groups cannot provide definitive answers, the sessions can play a large role in gauging the usability and functionality of the online survey, as well as identifying any consistently problematic survey items and response options.

Findings

Topic 1: Introduction to Survey

This section describes the participants’ initial impressions, reactions, likes, and dislikes of the NPSAS:20 survey. This section also summarizes and discusses how easy or difficult it was for participants to complete the survey regarding question clarity and survey length. Additionally, this section discusses whether participants were able to recall information and provide accurate reports to the survey questions.

Survey Impressions

Most participants found the survey to be well developed, reporting that the questions seemed to inquire about important topics pertaining to participants’ education experiences. However, participants identified the following key difficulties and concerns:

  • Survey length;

  • Sensitive survey questions; and

  • Unusual question content.

Survey length. Many participants felt that the survey was too long and reported that the length made it more difficult to both answer the survey items and maintain attentiveness throughout the entirety of the survey. This was expected given the additional probes embedded into the survey to collect open-ended responses and to gauge participants’ confidence in the data they were providing. The full-scale NPSAS:20 survey will not include open-ended responses or embedded probes.

Question sensitivity. Some participants reported that some of the questions asked were too personal, and that the survey covered sensitive topic areas. Questions that were considered to be too sensitive included those relating to sexual orientation, food consumption habits, and housing.

Unusual question content. A few participants felt the survey contained unusual content, including questions they considered unexpected or not applicable to their circumstances or experiences. More specifically, a few questions were found to be difficult to answer for “non-traditional” students (e.g., students over the age of 24, students with family and work responsibilities, or students who do not live or attend classes on campus) because the content and/or scenarios defined in the wording of the item did not seem to apply to them. Several participants commented that the questions about interacting with student social groups were difficult to answer because there were scenarios in which the question could be less applicable. For example, some participants reported having limited interactions with other students due to studying at a satellite campus or attending classes online.

Information Recall

Many participants agreed that recalling recent details about the 2018–19 academic year was easy. One common reason provided was that students typically conceptualize a year within the context of an academic calendar.

However, some participants expressed difficulty recalling specific details regarding their employment history. The most common challenges with accurately recalling work history information were due to the following:

  • Irregular school and/or work schedules;

  • Frequent job changes; and

  • Inability to recall due to the amount of time that had passed.

Topic 2: First Year Education Experiences (FTB only)

This section details first-time beginning (FTB) participants’ ability to understand and answer questions in the NPSAS:20 survey that ask about first-year academic experiences and activities. Probing for this topic was administered to 2 of the 5 focus group sessions, or 17 of the 42 student participants.

Academic Activities

FTBs were asked whether they had participated in the following academic experiences during their first year of postsecondary education: courses with a community-based or service learning project, a learning community where you took two or more classes with the same group of students, guided research experience, or a first-year course or seminar. Participants generally found the items relating to their first-year academic experiences understandable and clear. Participants reported considering the following activities when answering these survey items: past experiences with academic courses, living in a learning community, and extracurricular activities. However, participants expressed confusion with two aspects of reporting on information about academic activities:

  • Uncertainty about which academic activities to consider; and

  • Confusion relating to the term “Guided research experience”.

Activities to consider. Some participants explained that the specific experiences could be confusing because they were uncertain whether to include student sponsored activities in their response. Students explained that some first-year activities were provided by the school itself while others were provided by clubs or student organizations. As a result, it was unclear whether all types of activities should be considered when providing a response. For example, one participant explained that they took a Criminal Justice course online which consisted of service projects such as a “ride-along with a police officer.” Whether this type of activity should be included in their response was unclear.


“’Guided Research Experiences,’ I didn’t know what it meant exactly…. like I have done research for long-term projects, but I don’t know if that qualifies. Probably not. I am not sure what qualifies.”

“Guided research experience.” Several participants expressed confusion regarding the term “Guided research experience.” Although participants did not elaborate sufficiently on the cause for confusion, it seemed unclear what type of research projects (whether short-term or long-term) should be considered a “Guided research experience.” One participant explained their confusion about this term, stating, “Maybe just because I am a freshman, ‘guided research experiences,’ I didn’t know what it meant exactly… like I have done research for long-term projects, but I don’t know if that qualifies, probably not. I am not sure what qualifies.”

Interactions Outside of Class

Participants were also asked about their frequency of interactions outside of class with specific groups of people (figure 1).

Figure 1. Sample question on interactions outside of class


Participants expressed broad consensus on which activities they considered when responding to survey items about out-of-class interactions with peers and teachers. Participants identified direct social interactions outside of class as those occurring at student clubs and organizations, sororities and fraternities, collaborations on classwork, and time spent in the student lounge. Students considered indirect social interactions to be communication over text messages or social media platforms (e.g., Instagram and Snapchat).

Recognizing differences. When discussing their interactions with students from different types of backgrounds (e.g. race/ethnicity, sexual orientation, religious beliefs), many students agreed that they had to make assumptions about their peers based on general appearance. Participants were often unsure of other students’ sexual orientation, political beliefs, religious beliefs, and economic background because other students may not openly comment or share their experiences on these subjects. As a result, students felt that the assumptions they made, in an effort to answer the question, had the potential to be inaccurate. One student explained, “Obviously race and ethnicity, a lot of times you may not know their nationality, but you can see physically that they are of a different race than you are. But economic background, religious beliefs, political beliefs, sexual orientation, that is not something that a lot of people are too, like, open and free about. Sexual orientation probably more so now than ever before, but religious and political is kind of divisive, you know people don’t really talk about it.”

Topic 3: Paying for Education

This section discusses participants’ student loan status, as reported during the focus group session, summarizing attitudes on the ease or difficulty of reporting on loan data information. Additionally, this section addresses participants’ knowledge on student loans overall, and more specifically, whether student loans borrowed from a state would be classified as private or federal student loans. Probing for this topic was administered to 4 of the 5 focus group sessions, or 34 of the 42 student participants.1

Reporting on Student Loans

Twenty of the 34 students explicitly stated they had borrowed student loans. Of these twenty students, half were graduate students. All twenty agreed that providing student loan information within the survey was easy.

When asked if they had heard of student loans borrowed from a state, 32 of the 34 participants reported they had not. The two students who had heard of loans borrowed from a state were FTBs; one learned this information from a student loan website and the other came across it during the college application process.

Eighteen participants explicitly reported a belief that a state-borrowed student loan is a federal loan, five explicitly stated it is neither a federal loan nor a private loan, and just six participants explicitly reported that it is a private loan (figure 2). Participants who classified these loans as federal most often spoke of the government when explaining why they did so. In a statement that echoes opinions across sessions, one participant said a state-borrowed student loan is federal “because it seems to be under the umbrella of the government.” The novelty of the concept may have contributed to participants’ tendency to misclassify these loans.

Figure 2. Reported classification of a “State-borrowed Student Loan”

Topic 4: School Jobs (Graduate only)

In this section, participants discussed the ability to report on school-related jobs, including assistantships, traineeships, fellowships, and work-study jobs. Participants also discussed how easy or difficult it was for them to provide information about school-related job earnings, and what values they took into consideration when providing an answer to the questions. This topic was administered to one focus group of 10 graduate students. However, only two participants reported that these questions were applicable to them, so the results were based on those two participants.

Reporting on School Jobs

Two participants answered questions pertaining to school-related jobs (figure 3) and reported it was easy to provide answers to these questions.

Figure 3. Sample question on school-related jobs


Importantly, however, one participant expressed concern about whether the information they provided was sufficient for the survey. The participant stated that they second-guessed their response regarding the total amount of their assistantship or traineeship after a follow-up item asked how they determined their answer. The participant questioned whether the survey wanted an exact report of their earnings instead of the average amount, which they provided because the income they received on a bi-weekly basis varied. The second participant reported having a fellowship outside of school but stated that it was “super easy” to provide the details on school jobs, such as dates, earnings, and hours worked per week, because they earn hourly wages.

Topic 5: Employment Experiences (General only)

This section includes information about participants’ ability to recall details and provide information about their employment experiences. Additionally, this section discusses participants’ ability to provide information about their parent or guardians’ occupations. Probing for this topic was administered to 2 of the 5 focus group sessions, or 15 of the 42 student participants.

Information Recall

Most participants found it easy to recall details about their employment experiences, including the number of employers, dates of employment, income, and hours worked. Seven participants explicitly reported ease in recalling details about their employers. Of these participants, three mentioned that the recency of the date range (figure 4) made it easy to answer this question. Furthermore, four participants explained that having one consistent employer made it easy to answer. For example, one participant stated that “most people do not change jobs very often,” which helps report on these details.

Figure 4. Sample question on employment details


Alternatively, two participants expressed difficulty and uncertainty in recalling details about their employers. One participant reported “it was not really easy” to recall these details because she was currently unemployed and had previously been employed through cosmetology school, which paid her in tips. Another participant expressed confusion about how to categorize their employment; this participant indicated having various informal jobs (e.g., dog walking) and opted to select “self-employed” in reference to this informal work.

Reporting on Parent/Guardians’ Occupation

Only a few participants received items inquiring about their parent or guardians’ occupations. These participants expressed difficulty providing specific information or selecting accurate answers in response to the items. One participant reported confusion in selecting their parent’s occupation because they were unsure if the survey item was in reference to the job title (e.g., manager) or job type (e.g., journalist). Two participants suggested including an “unknown” response option.

The remaining participants who did not receive these survey items indicated that they would be able to provide information on their parents’ or guardians’ occupations. However, two participants stated that they could provide only a general answer in reference to job duties, as they may not be aware of specific tasks for one or both parents. Furthermore, one participant expressed a desire to receive survey items about their parents’ or guardians’ occupations.

Topic 6: Housing and Food Experiences

This section summarizes the participant experience when answering survey items on housing and food experiences and reports participant suggestions to mitigate any difficulty. This section also describes the participants’ assessment of the “places you’ve slept” list and discusses which reference periods were preferred for which item sets. Probing for Housing and Food Experiences was administered to all focus group sessions.

Survey Difficulty

Participants reported a variety of difficulties with answering survey items on food and housing experiences which can be classified into two categories: issues with item sensitivity and issues with item content/formatting.

Item sensitivity. Six participants explicitly reported experiencing an emotional reaction to items regarding their housing and food experiences due to the sensitive nature of the questions asked. Participants felt the survey did not prepare respondents to provide such sensitive information and at least nine students identified a “stark contrast” between the school and work-related items and the items on need insecurity. The perception was that survey items leading up to housing and food experiences were “straightforward and work or school-related,” aligning with what participants expected from a study like NPSAS, but this particular series of questions was unexpected and “intimate.” This surprise elicited an especially negative reaction from two of these participants, one of whom described it as “intrusive” and another who felt it was “off-putting.”


“The way that they ask them couches it as though it’s an absolute. I have to severely budget myself to sustain through the rest of my loan but that’s not what those questions are asking; that’s suggesting that you either have money to get food or you’re destitute and unable to support yourself and there’s an entire planet that exists in the middle of those two things…”

Item content/formatting. Participants felt both the content and formatting of items on food and housing could be improved to capture a more accurate representation of food and housing stability. For survey items measuring food insecurity (the USDA six-item food security module), changes to item content were recommended by participants who felt questions could focus more on the nutritional deficits associated with a restricted budget, more so than the complete skipping or reducing of meals. Additionally, more than seven participants either reported that the nature of the dichotomous rating scale format was too restrictive or felt they needed the opportunity to elaborate on their response (figure 5).

Figure 5. Sample questions of a dichotomous rating scale on food experiences


In a statement that elicited explicit agreement from three other participants, and that was echoed by two others in different sessions, one participant felt the formatting of these survey items suggested “you either have money to get food or you’re destitute and unable to support yourself.” To mitigate this, participants suggested the survey either provide an open-ended response option where more details can be included, or to revise the response options to a Likert scale.

Places slept. For survey items inquiring about places participants had slept, three participants felt the items would improve if the content were amended to also measure where a respondent’s supposed stability in housing comes from. One participant holding this opinion shared that she depends upon a domestic partner to pay their rent. As such, she feels she may qualify as housing insecure, even though she sleeps in the same house most nights. Interestingly, while 12 participants explicitly reported this section of the survey was easy to answer, discussion among the graduate group did reveal a certain degree of confusion as to the purpose of these items. This confusion led certain participants to answer housing items literally, while another subset assumed that the survey wanted “no” responses to everything if they had stable housing. It was unclear how pervasive this issue with item specificity was; participants felt the item instructions were ambiguous.

With regard to the list of places slept (e.g., a shelter, a camper, a group home, transitional housing, or an independent living program), participants found the list of potential places to sleep fairly exhaustive and did not have many unique suggestions to add. One participant suggested adding “your car,” one suggested “the library at school,” and more than six participants felt some variation of “your own home” should be included.

Preferred reference periods. Participants were randomly assigned to a “past 30 day” group or an “in the 2018–19 academic year” group to report on their food and housing experiences. For recalling their food experiences and situations, a majority of participants preferred “the past 30 days” to “2018–19 academic year” as 26 of the participants who explicitly responded to the probe found that it was easier to recall information in reference to the past 30 days.

On the other hand, participants held diverging opinions on the appropriate reference period for housing items (figure 6). Ten participants explicitly reported that the reference period does not matter because “no one would forget where you have been going to sleep;” 13 participants preferred the academic year, agreeing that such information was easy to recall but adding that the longer period could capture more places slept; and 10 participants preferred a 30-day reference period, feeling it would be easiest to remember.

Figure 6. Preferred reference period for places slept




Topic 7: Incentives

This section describes participants’ attitudes towards the receipt of incentives as compensation for completing a survey, including both monetary and non-monetary incentives. This section also summarizes whether some non-monetary incentives were perceived by participants as better than monetary incentives and discusses what the participants would do in the case in which something happened to a non-monetary incentive. Probing for incentives was administered to all focus groups.

Preferred Incentive Type


“A Netflix subscription… you may already have, or they give you something else… but you don’t want it, but money gives you the flexibility to get what you want.”

Monetary incentives. A majority of participants explained that they preferred monetary incentives over non-monetary incentives, citing that the option to choose what you use the incentive for makes it more appealing. As one participant explained, “I think money is like – you want to be flexible with what they offer you, but a Netflix subscription…you may already have, or they give you something else…but you don’t want it, but money gives you the flexibility to get what you want.”

Non-monetary incentives. Participants who felt that a non-monetary incentive was acceptable suggested a wide range of options, including Amazon gift cards, extra credit for classes, or credits to rent textbooks. Additionally, many participants mentioned that non-monetary incentives could be more attractive if the recipient were given the ability to select a type of non-monetary incentive from a list of available options (e.g., Amazon, Netflix, and Chegg). One participant described being able to choose their preferred incentive as a “personalized gift.” However, in regard to subscription services as non-monetary incentives, a few participants explained that they would not want to sign up for a subscription service due to recurring costs after their incentive ran out.


“If I was offered a bigger Amazon gift card than I was cash, I would take that because that is basically cash.”

Although most participants preferred monetary incentives, several participants felt that a non-monetary incentive would be acceptable if the value of the non-monetary incentives was significantly higher than the cash incentive. As one participant explained, “If I was offered a bigger Amazon gift card than I was cash, I would take that because that is basically cash, and I shop there a lot.”

Topic 8: E-mail Legitimacy

This section describes the perceived legitimacy of an e-mail reminder prompting one to complete a survey. Specifically, this section discusses the types of items participants typically look for when deciding if an e-mail was legitimate. This section also reports on what participants do with an e-mail they do not want. Additionally, participants’ preference for HTML versus plain text e-mails was discussed.

Perceived E-mail Legitimacy

When verifying the legitimacy of e-mail communications, nearly all participants reported using information derived directly from the content of an e-mail (e.g., language, terms used, subject line) in combination with independently verifying the authenticity of the e-mail’s message based on, for example, the domain name or sender.

Verifying authenticity. Participants described both technical and subjective appraisals of the “professionalism” of an e-mail, which they used in determining the legitimacy of an e-mail communication. Technical strategies included evaluating the sender’s e-mail address and domain name, as well as hovering over weblinks in the e-mail in order to evaluate the link’s destination. As one participant described, a more trustworthy source was one that can “have a phone number, have an address, have some type of – if I were to Google it, there’d be something there with a location or maybe people who work there.”


“Have a phone number, have an address, have some type of – if I were to Google it, there’d be something there with a location or maybe people who work there.”

Aesthetic strategies were influenced by participants’ subjective perception of the e-mail’s appearance and language. Participants looked for the inclusion of a signature and contact information within the body of the e-mail. They also considered the language used in the subject line of the e-mail, specifically noting that terms such as “Free,” “Congratulations,” or “You won” would lead them to regard an e-mail more skeptically.

E-mail receipt circumstances. When determining the legitimacy of an e-mail, many participants also described considering factors related to the circumstances under which they received the e-mail. Participants described various factors as playing into their evaluations, including:

  • Whether or not they had a reasonable expectation of receiving this type of recruitment e-mail;

  • If someone they knew personally vouched for the source;

  • If they were able to independently verify the existence and reputation of the businesses and organizations involved.

Participant actions. Nearly all participants explained that when they receive an unwanted e-mail, or one they perceive as likely to be spam, they take one or more of the following actions:

  • Deleting the e-mail;

  • Sending it to their SPAM or Junk folder;

  • Reporting the sender to their ISP or e-mail provider;

  • Ignoring it.

HTML vs Plain Text E-mails

Participants were largely divided on which e-mail type they preferred, with some feeling that HTML-based e-mails were more visually appealing, while others explained they preferred the simplicity of a plain text e-mail. There was more agreement among participants that the e-mails should at least include a logo. It is worth noting that one participant expressed concern that their computer might become infected by opening an HTML-based e-mail, stating it could leave them “vulnerable [to a computer virus].”

E-mail suffix preferences. Across all groups, a majority of participants attributed the greatest legitimacy to a federal government e-mail with a .gov address over any other possibility. Participants explained their preference for e-mails from federal government addresses (.gov) over business addresses (.org), claiming that a business e-mail address is more common and easily attainable, and therefore less trustworthy. However, some participants reported familiarity with RTI and said that they would therefore respond to an e-mail from an organization they were familiar with.

Topic 9: Closing

This section includes information about participants’ likes and dislikes about the survey with respect to the response options, item content, and topic areas. Specifically, each participant was asked to report one thing they liked and one thing they disliked about the survey.

Likes

Several participants liked the content of the survey items and were grateful for the opportunity to share their thoughts and provide reports on their experiences. The following were a few aspects of the survey that participants liked:

“Liked helping the Department of Education with the student loan thing because it is big, complicated, messy and awful.”

  • Some participants liked the survey topics on finance and students’ education. For example, these participants liked that the survey considered “different areas related to finance and student education.”

  • A few participants were grateful to have the opportunity to contribute to research related to the topics covered by the survey items. One participant liked the idea of potentially “helping the Department of Education with the student loan thing because it is big, complicated, and messy.”

  • One participant liked that the survey inquired about respondents’ confidence in their answers.

  • Another participant liked that the survey provided different follow-up questions and probes based on the respondents’ answer selections.

Dislikes

There were a few aspects of the survey that participants most commonly reported disliking, either alongside discussion of other topic areas or during closing. These include the survey length, sensitivity of the topic areas covered, and the response options offered.

Response options. Several participants considered certain response options in the survey to be too restrictive and suggested ways of giving respondents the opportunity to provide more accurate reports. A few participants found the dichotomous answer options too limiting in certain cases; for example, when asked about their food habits and experiences, they would prefer an open-ended response option. Another issue raised was that some response options combined a set of answers (e.g., “unemployed, retired, or disabled”) within a single response option. Participants reported that they would prefer each condition to be independent of the others so that the response they give would be less ambiguous and more accurate.

Question sensitivity. Several questions were considered too sensitive, uncomfortable, and/or “intrusive.” Those of particular concern related to sexual orientation, food consumption habits, and housing. As one participant explained, they “thought that was too personal” for the survey to inquire about such sensitive topics.

Recommendations for the Full-scale Study

Based on the combined results from online and in-person pretesting, we conclude that monetary incentives remain by far the preferred incentive type and that, when probed about nonmonetary alternatives, respondents still explicitly prefer monetary incentives and gift cards over anything else. Regarding contacting methods, we recommend sending e-mails from an @ed.gov e-mail address, using text messaging more often, and using the text envelope. Brochures will highlight the purpose and content of the study, as well as privacy and confidentiality. The letters will include a QR code, and a different color and bullet points will be used to highlight the incentive message.

Regarding the full-scale student survey, pretesting findings suggest participants generally enjoyed their survey-taking experience. Survey content was easily understood, participants were confident in their responses, and the information needed to respond to survey questions was relatively easy to recall. In addition, most participants reported that the response options provided within the survey were appropriate and could be satisfactorily used to answer the items. However, several items were identified as being too personal or sensitive. For these items, supplemental text will be created to explain the importance of these measures within an education survey.



References

Groves, R.M., Cialdini, R., and Couper, M. (1992). Understanding the Decision to Participate in a Survey. Public Opinion Quarterly, 56(4), 475-495.

Groves, R.M., and Heeringa, S.G. (2006). Responsive Design for Household Surveys: Tools for Actively Controlling Survey Errors and Costs. Journal of the Royal Statistical Society (Series A), 169, 439-457.

Lynn, P. (2016). Targeted Appeals for Participation in Letters to Panel Survey Members. Public Opinion Quarterly, 80(3), 771-782.

Lynn, P. (2017). From Standardised to Targeted Survey Procedures for Tackling Non-response and Attrition. Survey Research Methods, 17(1), 93-103.


1 RTI requested this topic not be administered during the second first-time beginning (FTB) focus group session in order to prioritize FTB-specific content.

