
NCER-NPSAS Grant Study

Connecting Students with Financial Aid (CSFA) 2017: Testing the Effectiveness of FAFSA Interventions on College Outcomes







Appendix F

Cognitive Interview Report


OMB # 1850-0931 v.2









Submitted by
National Center for Education Statistics
U.S. Department of Education









May 2017






NCER-NPSAS Grant Study

Connecting Students with Financial Aid (CSFA) 2017: Testing the Effectiveness of FAFSA Interventions on College Outcomes


Cognitive Interview Draft Report



April 2017








TABLE OF CONTENTS


1.0  Background and Purpose

2.0  Research Methodology and Limitations

3.0  Key Findings

4.0  Conclusions and Implications


1.0 BACKGROUND AND PURPOSE




  • In 2010, the National Center for Education Research (NCER) and the National Center for Education Statistics (NCES), both within the U.S. Department of Education’s Institute of Education Sciences (IES), began collaborating on an education grant opportunity related to the cross-sectional National Postsecondary Student Aid Study (NPSAS). Under the NCER-NPSAS grant opportunity, researchers could submit applications to the Postsecondary and Adult Education topic within the Education Research Grants program (CFDA 84.305A) to: 1) explore relationships between malleable factors (e.g., information on benefits of financial aid and FAFSA renewal) and postsecondary persistence and completion, as well as the mediators and moderators of those relationships; and 2) evaluate the efficacy of interventions aimed at improving persistence and completion of postsecondary education. Researchers approved for funding through this program can obtain indirect access to a subsample of the national NPSAS sample (after the study’s student interviews are completed) in order to conduct unique research projects that adhere to the guidelines set forth in the Request for Applications for the Education Research Grants Program, as well as guidelines set forth by NCES and the NPSAS program.


  • On July 1, 2016, a grant was awarded to this project: Could Connecting Students with Financial Aid Lead to Better College Outcomes? A Proposal to Test the Effectiveness of FAFSA Interventions Using the NPSAS Sample (referred to as “Connecting Students with Financial Aid (CSFA) 2017”; http://ies.ed.gov/funding/grantsearch/details.asp?ID=1853). The CSFA 2017 study investigates whether an intervention that provides financial aid information increases completion of the Free Application for Federal Student Aid (FAFSA). The study also provides information on how the number of college credits taken can increase the amount of financial aid received, to test whether this information influences enrollment intensity (full- versus part-time status). The primary grantee is Bridget Long, Harvard University (Grant Award #R305A160388), and the co-principal investigator is Eric Bettinger, Stanford University. Data collection for the study will be led by the contractor, Research Triangle Institute (RTI).


  • The CSFA 2017 study includes a follow-up survey component designed to strengthen researcher understanding of participant knowledge of available financial aid information, participant experience with the financial aid application process, and participant perspective on how postsecondary financial aid influences decision making and behavior. This survey is scheduled for implementation five months after the last information intervention, but prior to the beginning of the next (2018-19) FAFSA submission season.


  • The research team proposed to recruit postsecondary students and conduct cognitive interviews of the CSFA 2017 Student Survey to provide information that may help us refine the existing survey language and the organization of the survey tool, by examining:

  • the extent to which terms used in questions are comprehended correctly;

  • the extent to which students’ possible answers are adequately captured by the multiple-choice options;

  • the thought processes used to arrive at answers to survey questions; and

  • potential sources of burden and respondent stress.


  • The cognitive interviews were cleared under OMB # 1850-0803 v.191. Feedback from the cognitive interviews guided development of the revised survey instrument to be used in the CSFA study.


2.0 RESEARCH METHODOLOGY AND LIMITATIONS


Research Methodology


  • Cognitive interviews were conducted by appropriately trained interviewers during April 2017. Each interview lasted between 20 and 45 minutes and consisted of both open-ended questions and targeted probes to explore participants’ opinions, decisions, and understanding of the survey questions and associated terminology.


  • A total of 15 cognitive interviews were completed with current postsecondary students, and 3 were completed with non-enrolled professionals who have experience working with postsecondary students. Respondents represented a variety of postsecondary institutions and were recruited using a variety of methods, including networking referrals and invitations distributed via email and public postings.


  • Interested respondents scheduled an individual interview session based on availability. While 25 individuals originally expressed interest in participating, only 18 were able to schedule a time and complete the interview process.


  • Respondents completed either the online version of the survey instrument or the hard-copy paper version. This mixed-mode implementation was intentional, designed to capture insight into how the survey is perceived in different formats, how automatic skip logic may affect perception of survey length, and how compliance with survey instructions may vary between the two modes.


  • For documentation purposes, the recruiting disposition is included below:

Cognitive Interviews

Number of responses to the invitation to participate: 25
Number of recruits not available: 7
Number of interviews conducted by telephone: 7
Number of interviews completed in person: 11
Number of respondents successfully interviewed: 18


  • During each interview, respondents were informed of the voluntary nature of their participation in the interview and completed an informed consent form. Respondents did not receive any compensation for their participation in cognitive interviews.

  • The interviewer then administered the survey tool with additional probes as needed to better determine whether the respondent understood the survey questions and/or response options. Respondents were probed further about question and answer-option comprehension if they demonstrated confusion, hesitation, or uncertainty when considering a survey item. Respondents were also asked whether and how the questions and/or responses could be improved to increase comprehension or to provide response options that better match their intended answers.


  • Detailed notes were documented during each interview. Following the conclusion of each interview session, survey responses, interactions between the interviewer and respondent, and observations recorded during the interview were summarized.


Limitations

  • A qualitative research methodology seeks to develop direction rather than quantitatively precise or absolute measures. Because of the limited number of respondents involved in this type of research, the results should be regarded as directional in nature and used to generate hypotheses for future decision making.


  • The non-statistical nature of qualitative research means the results cannot be generalized to the population under study with a known level of statistical precision.


  • The recruited participants may differ from the NPSAS sample that will eventually receive the CSFA Fall Survey. The cognitive interview respondents answered questions about the 2017-18 financial aid application season while it was still in progress, whereas the actual survey will take place at the end of the application season, as a new academic term begins. In addition, some of the students interviewed are in their first year of enrollment, while the students sampled for CSFA 2017, as NPSAS:16 respondents, will have been enrolled for more than one year at the time of the survey. Lastly, the majority of the interview sample were attending full time. This sample may not accurately reflect the responses or perspectives we would have received post-FAFSA-season from a sample designed to be more diverse in postsecondary experience.



3.0 KEY FINDINGS


Survey Format and Length


Most respondents who completed the survey online were able to navigate it and reported no problem with its length. However, many of the online respondents were also individuals who did not apply for 2017-18 financial aid (or had not yet completed the application process); as a result, these respondents experienced a shorter path through to survey completion.


Respondents taking the hard-copy version of the survey had more difficulty with its format and length, indicating at the start of the interview (while paging through in advance) that it “seemed like a lot of questions” and expressing hesitation. Other recommendations included increasing the font size and spacing of the content to reduce “crowdedness” and making the skip-logic instruction boxes easier to identify.


Survey Organization and Flow


Not all hard-copy respondents experienced difficulty with the format and length of the survey. However, those who did suggested reorganizing the questions and, if our records could indicate which version a respondent should receive, offering two versions of the survey (one for 2017-18 FAFSA filers and one for non-filers) to reduce confusion and eliminate unnecessary or irrelevant questions. More than one respondent suggested moving the FAFSA “awareness” question (Q7) much earlier in the survey.


Some respondents were thrown off by the timing of the cognitive interview (Spring 2017) relative to the target date for survey implementation (Fall 2017), particularly in questions about past and current credit/course enrollment and intentions for future enrollment levels. We do not anticipate this being an issue during survey implementation, but it is cause to consider specifying a time frame where necessary to elicit more accurate responses.

Example (Q12-16): Some respondents had difficulty answering because they had applied for 2017-18 financial aid (Q2 = yes, FAFSA) but had not yet completed the process (some, for example, had received their Student Aid Report but not an award letter, or had encountered other variables in the financial aid cascade).


Survey Language and Tone


Overall, respondents consistently identified a few questions that they found difficult to understand or that contained ambiguous language. A few respondents suggested using a “less stuffy” or “less government” tone in the questions and responses: one reported feeling “judged” while progressing through the survey, and another found the repetition of “financial aid form” and the government/non-government distinction condescending. Feedback regarding specific question language, tone, and clarity is detailed in the next section.


Specific Question and Answer Option Clarity


Q3: Respondents desired examples of what “non-government” aid is. The original question design offered examples only to those who indicated “yes” to applying for non-government aid, so that they could indicate which applications they had completed. While our intention was to avoid asking irrelevant questions of respondents who did not apply, including the examples appeared to remind one respondent of aid they had actually applied for but had not originally considered in their response. This was especially apparent with the hard-copy survey, which shows all questions and answers without the omissions produced by the online display logic.


Q5: Respondents frequently asked “for federal aid or all aid?” in regard to this question. In prior questions we are very specific about which application we are referring to, yet here we failed to specify the federal (FAFSA) application.


Q9: Two respondents suggested offering media options beyond “social media.” Because this question has no “other” response option, these respondents did not feel any of the responses applied to them.


Q11: At least half of the respondents struggled with the DRT (IRS Data Retrieval Tool) section, finding it too dense and confusing. Comments included “I don’t know what this is talking about” and “Maybe my parents would know this.” It was noted that respondents were much less confused when the interviewer read the passage aloud; while they were no more familiar with the DRT, it may be worth reconsidering how the question is worded.


Q17-24: This section on enrollment intensity was universally easy for respondents to answer, though a few did question the use of the word “term” in reference to academic session. “Term” could be measured in a variety of ways (standard semester, trimester, block scheduling, J-term, summer term) depending on where the student is enrolled. It was suggested that we add a preceding question or clarifying language to help identify how the academic calendar is being measured.

Q19: Some respondents had to read this question a few times and consider what it meant in their circumstances. Because many interviewees were attending full time, they felt the question wasn’t relevant, though they may have taken more time to complete coursework rather than overloading their schedules. Respondents also reported that the decision would depend on the difficulty of the coursework that term.


Q23: Respondents suggested replacing “other” (or adding a choice) to accommodate students who are enrolled full time because of minimum-enrollment requirements to attend, to maintain eligibility for a scholarship, or to participate in activities (sports, etc.).

Q25: Some respondents did not like the Likert scale format or definitions for this series of statements; overall, they felt that a True/False/Not Applicable format would be more appropriate. Specific comments included:

Q25a) Adjustments made to the wording of this statement per OMB suggestion did not test well. For example, “the financial aid the school and government were able to give me” was described as unnecessarily confusing. Comments included: “I didn’t apply for financial aid, so that has nothing to do with it” and “Why don’t you just say: I’m attending my current school because they gave me the best financial aid?”


Q25b) “This IS my first-choice school” and, again, “I didn’t apply for or receive financial aid; it wasn’t a factor.”


Q25c) “I have to take the number of credits that I’m taking, so it doesn’t change anything.”


Q31: Those who took out loans (and therefore answered this question) found the Strongly Agree to Strongly Disagree scale more acceptable here than for the Q25 series of statements.


Q32: Feedback suggested rewording the question “How did you decide to not take out a loan?” because for some respondents there was no decision; a loan was simply not needed. Suggested new wording: “If you have NOT taken out loans, how did you make that decision?” (stressing the word “not”). Other desired answer options:

  • I received enough financial aid to not need loans

  • I did not need to apply for financial aid and did not apply for loans

  • I received a scholarship and did not need to apply for additional financial aid

  • I did not need a loan, so there was no decision to make.


Q33: Statements a, b, and c were generally relevant and applicable to most respondents, but d, e, and f are relevant only to those who have a loan (currently, all respondents answer this question). Additionally, a few respondents mentioned that 33f should actually be two different statements: 1) I had trouble trying to get a loan, and 2) I was unable to get a loan.


Relevancy of Survey Items


Respondents confirmed that the questions asked in the survey may be helpful toward understanding why and how students do or do not apply for financial aid. A few respondents questioned the relevancy of specific questions, particularly the large number of items regarding loans and loan repayment (Q26-33).


Respondents who had not completed their own financial aid forms or been involved in college financial decisions were perceived to have more difficulty answering. Some were even apologetic and expressed regret that they did not have more understanding of the process.


Respondents not applying for or receiving financial aid provided feedback that the questions assumed need and may not be adequately inclusive of those NOT in need of aid. To alleviate this, we may provide answer options that indicate a lack of financial need or a lack of need to complete an application. Similarly, scholarship recipients desired options that allowed them to indicate their circumstances.


4.0 CONCLUSIONS AND IMPLICATIONS


The cognitive interview process identified adjustments to survey language that could improve question and answer clarity, enhance the instrument’s relevance to a wider audience, and allow researchers to streamline the survey experience for the respondent.


Information gained from cognitive interviews will be considered in tandem with additional practitioner feedback, recent changes in federal financial aid policy and administration, and potential limitations to the allowable format and length of the survey instrument.


