National Postsecondary Student Aid Study
Cognitive Labs Report
OMB: 1850-0666

Cognitive Testing Summary Report





National Postsecondary Student Aid Study (NPSAS:12)

Field Test Student Interview






1. Introduction

This report describes the recruiting, procedures, and general findings from cognitive testing of the 2011-12 National Postsecondary Student Aid Study (NPSAS:12). Housed in NCES’s Postsecondary Studies Division, NPSAS:12 is a comprehensive study of how students and their families pay for postsecondary education. In addition to providing cross-sectional information about college costs and financing, NPSAS:12 will serve as the base-year data collection for the next Beginning Postsecondary Students Longitudinal Study (BPS:12/14, BPS:12/17). As a longitudinal study, BPS is able to investigate persistence and enrollment in less-than-2-year, 2-year, and 4-year institutions, transfer and graduation rates, employment, and student loan debt over time. BPS, which samples both traditionally and nontraditionally aged first-time postsecondary students, is the only nationally representative longitudinal survey of its kind.

Working together with RTI International, NCES is reconceptualizing the BPS study to better describe the decision-making processes of postsecondary students. The BPS student interview, including the NPSAS:12 base-year interview, is being redesigned to reflect this new conceptualization. Drawing upon human capital theory, the redesign team is developing a model of student decision-making in which choices are based upon probabilistic expectations of the rewards and costs of alternative choices.
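As a rough formalization (an illustrative sketch using standard human capital notation, not notation taken from the NPSAS:12 materials), such a model treats a student as choosing the alternative j (e.g., enrolling versus working) with the greatest expected net present value:

\[ V_j \;=\; \sum_{t=0}^{T} \frac{p_{jt}\,\bigl(B_{jt} - C_{jt}\bigr)}{(1+r)^{t}} \]

where B_{jt} and C_{jt} are the expected rewards and costs of alternative j in period t, p_{jt} is the student’s subjective probability that they are realized, and r is the student’s discount rate.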

In early 2010, focus groups were conducted on issues of relevance to NPSAS and BPS to inform the refinement of items used in previous surveys as well as the development of new items that help to explain the postsecondary choices of the first-time beginning (FTB) population. The focus groups helped the redesign team move from conceptualization to instrument development. Additionally, the focus groups were used to improve a select set of existing questions in the NPSAS and BPS interviews, particularly items involving financial aid terminology that may be unfamiliar to students (e.g., private loans) and items used to determine eligibility for the BPS cohort.


Cognitive testing of the NPSAS interview began later in 2010. The cognitive testing process was designed to enable the instrumentation team to:

  • examine the thought processes affecting the quality of answers provided to survey questions,

  • understand the extent to which terms in questions are comprehended,

  • evaluate the memory demands of the questions,

  • evaluate the ability of respondents to make calculations and judgments,

  • determine appropriate presentations of response categories,

  • assess the time it takes to complete the interview,

  • assess the navigational problems users face, and

  • identify sources of burden and respondent stress.


2. Recruiting

Students in the Research Triangle Park, NC and Washington DC areas were recruited via flyers and advertisements designed to describe the purpose of the cognitive interviews and the details of participation, including the time commitment, the incentive for participation, and RTI staff contact information. Participants were recruited in a variety of ways, including posting ads on Craigslist, placing advertisements in a local newspaper (The Independent Weekly), and recruiting friends and family of RTI International staff members via word of mouth and postings on the RTI Intranet. In addition, recruiters conducted outreach in 4-year, 2-year, and less-than-2-year institutions by asking the institution to publicize the study, displaying flyers on job boards and other visible areas, and advertising the study on the school’s job website.


Students interested in participating in the cognitive interviews were asked to call RTI to complete a brief screening interview to determine eligibility. Eligible participants were those determined to be first-time beginning students (FTBs) in the desired institution types with a range of demographic characteristics and educational experiences.


Cognitive interview participants were selected to reflect the three different institution levels: 4‑year, 2‑year, and less-than-2‑year. The overall goal was to recruit and screen enough students to yield 48 total interviews. Table 1 below provides detailed information on the total number of responses throughout all rounds of recruiting and interviewing.


Table 1. Participant Recruitment Status (Mutually Exclusive)

Status                                      NC    DC
Contacted RTI but not screened*             67    41
Screened but not eligible                   75    28
Eligible but not scheduled                  42    15
Scheduled but did not come to interview      2     0
Completed interview                         42     6
Total                                      228    90

*Individuals who did not return RTI’s follow-up contact attempts.



3. Interviewing Procedures

Across the four rounds of interviewing, a total of 48 participants were interviewed:

  • 18 from 4-year schools

  • 14 from 2-year schools

  • 16 from less-than-2-year schools

In order to ensure that both data collection modes utilized in NPSAS:12 were adequately tested, staff conducted 23 web interviews and 25 telephone interviews.


RTI staff incorporated both “think aloud” data capture and scripted probing. A “think aloud” interview is one in which the respondent is instructed to tell the interviewer everything that he/she is thinking about in answering a survey question. Probes utilized in interviewing were both concurrent (asked at the same time the subject answers the questions) and retrospective (asked during a debriefing session) and were prepared ahead of time. In addition, staff employed spontaneous probes as needed.


A majority of interviews were conducted in person at RTI International’s cognitive testing facilities in Research Triangle Park, NC (n = 36). In addition, 6 in-person interviews were conducted in RTI’s Washington, D.C. office. Two interviews of less-than-2-year students from North Carolina took place by telephone, and four interviews of less-than-2-year students were conducted in person at the students’ school.

4. Fundamental Observations

Throughout the course of cognitive testing, the NPSAS:12 instrumentation team made a large number of changes to the instrument. The instrument changed from testing round to testing round, with revisions driven not only by cognitive testing probes and responses but also by observations about visual presentation, flow, and appearance. A number of these issues were not the focus of cognitive testing and therefore do not appear in this report. The sections that follow summarize items where significant cognitive barriers to completing the interview were observed.


4.1 Enrollment

The questions in this section determine whether a student is a first-time beginner and identify what type of degree the student is working to achieve. In general, these questions proved most difficult for the less-than-2-year and 2-year students, whose paths through postsecondary education are not as linear as those of 4-year students. To address the difficulties these students had with the survey, the questions were revised to more clearly differentiate the types of degree and certificate programs in which the students were enrolled, eliminate confusing terminology, and provide better options for identifying students who intend to transfer to programs at a different institution.


In addition, item-specific issues and solutions identified in this section are summarized below.

  • N12ELIG asks students when they first began attending their college/university. Most students understood “attend” to mean when they started classes. However, in the initial rounds of testing, several participants misunderstood or misheard the question and thought it was asking if they attended their college/university on July 1, 2010 rather than since July 1. In subsequent rounds, the question was revised to, “Have you attended NPSAS at any time between July 1, 2010 and today?” There were no problems observed with the revised version of the question.

  • N12DEGREE asks students which type of degree they are working to achieve. In the initial version of the question, the list of degrees was long and participants had difficulty differentiating “undergraduate certificate or degree” from a bachelor’s degree. To emphasize the distinction, the response categories were separated into Degrees (associate’s, bachelor’s, etc.), Certificates and Diplomas (undergraduate certificate/diploma, post-baccalaureate certificate, etc.), and non-degrees. This change allowed participants to more quickly and accurately find their degree program.

  • N12ASSOC asks students in which particular associate’s degree program they are enrolled. The original question used abbreviations (e.g., AA, AS). Cognitive testing revealed that participants, even when enrolled in these programs, were not certain what the abbreviations meant. The question was revised to provide the full terms (e.g., Associate of Arts, Associate of Science).


4.2 Education Experiences

Overall, this section was well understood by participants and easy to complete. The item-specific issues and solutions identified in this section are summarized below.

  • N12HSCDR asks students to provide the name of their high school. While most web participants were able to complete this question without problems, there were a few participants who had difficulty. The initial version of the question had the instructions in paragraph form, which was lengthy and hard for some respondents to follow. The instructions were changed to short numbered steps to make the process clearer to participants.

  • N12GPAEST asks about GPA. The initial question included both letter grades and numeric grades (e.g. “Mostly A’s (3.75 and above)”) in the response options. This proved problematic for students who took classes with weighted grade point averages. The question was revised to include only letter grade response options.

  • N12ALTCRS. This question asks students if all, some, or none of their classes were taught online, in the evenings, or on the weekend. It combines six previous questions. In the previous version, students were asked three yes/no questions about whether they were taking an online class, an evening class, or a weekend class. For each question the student answered “yes,” they were then asked whether all of their classes were online (or in the evening or on the weekend). Participants misunderstood this follow-up to be asking whether all of the sessions for that particular class were online (or in the evening or on the weekend). In addition, the previous question about online classes also asked about distance education; while online classes were well understood by participants, distance education was not and was removed from the revised version of the question. The revised question was quicker and easier for participants to answer.

  • N12SRVUSE. This question asks students to indicate which school services they used during the school year. Many students selected the “other” category and indicated they used financial aid services. This option was added to the question in subsequent rounds.

  • N12SRVMATRX. The initial version of this question asked students how likely it was that they would have continued to attend if they had not used the school services identified in N12SRVUSE. However, this did not make sense to some participants. For example, one student indicated that she would never have gone to a school that did not have a student health service, so she did not know how to answer. The question was revised to ask students how important the service was in their decision to stay at their school.

  • N12MAJ1. This question asks students about their major. The initial version of the question included a “search” button that participants were supposed to select after entering their major. However, many participants hit the search button before entering any information. The button was changed from “search” to “enter” and the instructions were revised to include numbered steps. This helped to ensure that participants entered their major before hitting the enter button.

  • N12EXOCC. This question asks students to indicate the job title and duties for the occupation they intend to hold after they complete their degree. Initially, the question asked students to think about the title and duties of the job they would have five years after completing their degree, but many participants found it too difficult to answer so far out in time. As a result, the “five years after” phrase was removed, and the question now asks about the occupation students intend to hold after finishing their degree. The format for this question was also revised in a manner consistent with that used in N12MAJ1.


4.3 Financial Aid

The series of questions about financial aid was revised considerably. In the initial version of the questionnaire, many participants had a difficult time determining exactly what type of aid they received (e.g., federal loans versus private loans). To address this, the section was revised to include definitions and examples for all types of aid of interest. In addition, when necessary, examples of what not to include were also provided. This helped the participants better understand where to report the aid they received. For example, the original version of N12STAID asked respondents if they received a state grant or scholarship. Participants were unclear whether this included aid they received from their district or other local organizations within their state, or from the college/university they were attending (if it was in-state). Consequently, this item was revised to instruct respondents not to include aid received from their college/university, district, or other local organizations.


Furthermore, the order of the questions was revised to first ask about the different types of grants and scholarships the student might have received and then to ask about any loans the student might have taken. Previously the questions about grants, scholarships, and loans were mixed together, which some participants found confusing.


4.4 Current Employment

Overall, this section was well understood by participants and easy to complete. The item-specific issues and solutions identified in this section are summarized below.

  • N12ALTNUMJOB. This question focuses on the number of jobs students would work if not attending college. Some participants were unclear whether they should sum the total number of jobs held throughout the year or report the number of jobs worked at one time. The question was revised to keep students focused on how many jobs they would have held at one time.

  • N12ALTWAGE1 and N12ALTWAGE2. These two items ask students how much they think they would have earned at the job or jobs they would have worked instead of attending college. During initial rounds of testing, some participants expressed feeling burdened by attempting to calculate an exact amount. In later rounds of testing, these items were revised to explicitly instruct respondents that an estimate or best guess was acceptable.

  • N12SEARNS. This item asks students about the amount of time they worked while attending school. During an early testing session, participants had difficulty calculating the number of weeks worked and preferred to respond in semesters or months. In later rounds, allowing students to convert semesters and trimesters to weeks proved helpful and was well received.


4.5 Income and Expenses

Here again, the section was well understood by participants and easy to complete. The item-specific issues and solutions identified in this section are summarized below.

  • N12FAMHLP. This item asks students if family or friends help to pay for educational or other expenses. Early on, some participants were confused about whether a student’s spouse should be counted as family. In later rounds, married participants received a modified item which clarified that the focus should be on family other than the spouse.

  • N12FAMHLPTYP. This item asks students how much money from family and friends was used for a number of items. One of the biggest sources of confusion for participants was whether or not room and board should be counted under basic living expenses such as rent or food. The modified version of this question (which is now N12FAMHLPORD) clarifies that room and board should be included.

  • N12DISTHMINS. This item asks students to provide an estimate of how much time it takes them to travel from their current residence to school. Initially, this question asked about time each day, during an average week, which caused confusion for participants. Because of the focus on “an average week,” many assumed the question was asking for a total for the entire week. Revisions focused on asking about “an average day” proved helpful.


4.6 Background

Overall, this section was also well understood by participants and easy to complete. The item-specific issues and solutions identified in this section are summarized below.

  • N12STATE. This question asks students the U.S. state in which they hold legal residence. Initially, some participants interpreted “state of residence” as a status or type of residence rather than a U.S. state, not realizing the question’s focus. Revisions placed a greater emphasis on the purpose of the question by asking “Of which state are you a legal resident?”, which eliminated confusion in later rounds.

  • N12DADED and N12MOMED. These two questions initially asked students about the highest level of education their father and mother completed, respectively. However, throughout the interviews, participants demonstrated that, for first-time beginning college students, some parent/guardian relationships of interest are not with mothers and fathers. As such, the item N12GUARDED was created to offer students the opportunity to choose up to two parent or guardian types (mother, grandmother, etc.) and then indicate each person’s level of education.


4.7 Items Utilizing Sliders

Throughout the interview, participants were presented with items that utilized sliders to respond to questions focused on the likelihood of something occurring, or their level of agreement with a given statement. Cognitive protocols were focused on three high-level issues associated with use of the sliders:

  1. Usability – Were the participants able to correctly and easily use the sliders to indicate their response choices? Was there a “learning curve” observed or did participants find the use of sliders relatively easy from the start?

  2. Scale Preference – In the first two rounds of interviewing, participants were presented with a mix of likelihood/chance scales across items, with some items presented on 0-10 scales (e.g., 2 chances in 10) and others on 0%-100% scales, so that interviewers could gauge which scale participants preferred.

  3. Mode Effects – Throughout all rounds of cognitive interviewing, interviewers focused on whether the web and telephone modes of the interview would produce different types of responses, particularly whether respondents chose decimal (e.g., 3.5) responses.


In terms of usability, the sliders proved simple to use for almost all participants. Even in cases where the first item was somewhat problematic or confusing, participants quickly became comfortable with using the sliders. By the end of round 2 of interviewing, the preference for the 0-10 scale over the 0%-100% scale was stated so consistently that round 3 featured only the overwhelmingly preferred 0-10 scale. Finally, cognitive testing results do suggest that there may be a mode effect on slider items that allow students to respond using decimals. While the cognitive testing results do not contain enough cases to yield generalizable results, it was clear that telephone interview participants, in large part, consistently answered using whole numbers, while web participants were more likely to use the decimals visible on the slider’s scale.





4.8 “Discount” Items

Throughout all four rounds of interviewing, participants were asked items designed to investigate decision making related to human capital theory, in which choices are based upon probabilistic expectations of the rewards and costs of alternative choices. Participants were asked a series of questions in which they were offered the choice between receiving a fixed amount of money early and waiting a certain period of time to receive a larger amount.


In rounds 1 and 2, these questions were presented to participants as sliders, with respondents asked to indicate the minimum amount for which they would be willing to wait the longer period of time. The sliders, combined with this question format, caused a great deal of confusion among participants. Many stated that they did not understand the purpose of the questions. Upon probing, many participants missed the emphasis on the minimum acceptable amount and used the slider to choose the maximum amount possible, indicating that they wanted to get as much as possible for waiting.


In round 3, the sliders were replaced with radio buttons allowing participants to indicate whether they would take an amount of money earlier or wait for a larger amount of money at a later date. Following this item, they were again asked to use a slider to indicate the least amount of money for which they would be willing to wait the longer period of time. In round 4, this follow-up item was asked again, but the question text reminded the participant of the minimum amount (e.g., “Starting with $750…”). This adjustment helped respondents better understand the items. It also seemed to help them make the link between the initial items and the follow-up items, a relationship that was not evident in earlier rounds.
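For context, items of this form are standard time-discounting measures: the minimum later amount a respondent will accept implies a personal discount rate. As an illustration with hypothetical values (the report does not state the actual amounts and delays offered; the $750 above appears only as reminder text), a respondent indifferent between $500 now and a minimum of $750 one year later has an implied annual discount rate r given by

\[ 500 \;=\; \frac{750}{1+r} \quad\Longrightarrow\quad r \;=\; \frac{750}{500} - 1 \;=\; 0.50, \]

or, more generally, r = (X*/A)^{1/t} - 1 for an early amount A, minimum acceptable later amount X*, and a delay of t years.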




