
Attachment III BPS 2012-2017 CogInterviewReports.docx

NCES Cognitive, Pilot, and Field Test Studies System


OMB: 1850-0803






Attachment III
BPS:12/17 Cognitive Interviews Summary Reports

























Contents



Cognitive Testing Round 1 Results

Cognitive Testing Round 2 Results











Cognitive Testing of Questionnaire Items

Beginning Postsecondary Students Longitudinal Study of 2017 (BPS:12/17)

Round 1





Katherine Kenward, MA

Alisú Schoua-Glusberg, PhD







Findings

Research Support Services, Inc.

July 29, 2015

Introduction



This report summarizes findings of Round 1 of cognitive testing of the Beginning Postsecondary Students Longitudinal Study of 2017 (BPS:12/17) survey instrument conducted by Research Support Services Inc. Twenty cognitive interviews were conducted in Illinois, in the Chicago Standard Metropolitan Statistical Area between May 11 and July 15, 2015.

The following table displays the respondents’ characteristics:

Case ID | CBE School | Enrollment Status | School Level | School Control
BPS002 | Yes CBE | Leaver | 2-year | Public
BPS008 | No CBE | Completer | 4-year | Private, not-for-profit
BPS009 | No CBE | Still enrolled | 4-year | Public
BPS010 | No CBE | Completer | 4-year | Private, not-for-profit
BPS015 | No CBE | Still enrolled | 4-year | Public
BPS016 | No CBE | Completer | 2-year | Public
BPS022 | No CBE | Leaver | 2-year | Private, not-for-profit
BPS023 | No CBE | Still enrolled | 4-year | Private, for-profit
BPS024 | No CBE | Still enrolled | 4-year | Private, for-profit
BPS025 | No CBE | Completer | 4-year | Public
BPS027 | Yes CBE | Leaver | 2-year | Public
BPS045 | No CBE | Completer | Less-than-2-year | Private, for-profit
BPS047 | No CBE | Completer | Less-than-2-year | Private, for-profit
BPS057 | Yes CBE | Leaver | 2-year | Public
BPS059 | Yes CBE | Completer | Less-than-2-year | Public
BPS069 | Yes CBE | Still enrolled | 2-year | Public
BPS079 | Yes CBE | Completer | 4-year | Private, not-for-profit
BPS093 | Yes CBE | Completer | 2-year | Public
BPS131 | No CBE | Completer | Less-than-2-year | Private, for-profit
BPS136 | No CBE | Leaver | 2-year | Private, not-for-profit



Participants were recruited through online postings and flyers targeted at various postsecondary institutions and organizations. Flyers (shown in Appendix E) were handed out to students on college campuses (including Chicago community colleges) and were posted in a variety of community venues, including job search centers and businesses employing those with licenses and technical certifications, such as beauty parlors and restaurants.

Interested candidates were screened by telephone (see Appendix F). Of the 20 total participants, seven had attended a Competency-based Education (CBE) school. Ten respondents were currently or previously enrolled in a public institution, five in a private for-profit institution, and five in a private not-for-profit institution. Eight currently or previously attended a 4-year program, eight attended a 2-year program, and four a less-than-2-year program. Ten students had completed their first program, five had left their program, and five were still enrolled. Students may also have attended, or still attend, additional schools after their first one; those counts are not included above. Based on interviewer report, nine men and eleven women were interviewed.

Interviews were conducted in an interview room in a Chicago area hotel and in an office space in downtown Chicago. Both locations were convenient to multiple forms of public transportation. For some of the interviews, RTI project staff listened in real time by telephone. Each interview began with an introduction and administration of a consent form (shown in Appendix A). The respondent was then asked to complete a short questionnaire while thinking aloud and describing how he or she came to each answer (shown in Appendix B). Once the questionnaire was completed, the interviewer conducted retrospective probing on each question, continuing until the cognitive goals were met or the interviewer felt no more information on the respondent's mental process could be captured. The interview protocol appears in Appendix C. Show cards (see Appendix D) were used to have the respondent select one or multiple answers from a list.



Question-by-Question Findings and Recommendations


  1. Have you ever taken an online course?


Yes = 10

No = 10



Although half of the respondents reported having taken an online course, one of them was thinking of MOOC (massive open online course) classes she had taken for pleasure and the food safety courses she had taken for her work (not for credit or school). Another respondent signed up for what appeared to be an online course, only to discover that the major exams had to be taken in person.

Courses the respondents took in which most, but not all, of the work was online were generally not considered 'online courses.' For instance, this was the case with the respondent who had thought only of the MOOC courses as online but did not consider her Art History course as such, even though most of the work was online and there was only one session a week in a classroom. [PROBE at Q2: What made that class not an online class?] “My grade depended on me attending a class (in person), rather than being online being sufficient.”

Of the 10 respondents who reported they had not taken any classes online, nine reported later, in Q3, that they participated in some or all of the activities mentioned on the show card. Only one student had never taken an online course nor done any activities online.

Respondents generally felt they knew what an online course was. It was typically described as one where the student was not required to physically be in a classroom or go to a school.



Recommendation: Based on this and the following two questions, providing a definition of 'online course' to everyone upfront at Q1 may make it possible to eliminate Q2.



  2. How would you describe an online course?



This open-ended question was not interpreted as intended by a number of respondents. Several of them answered by providing a description of the characteristics a student needs to have to successfully take an online course, such as needing to be more "organized," and that the classes are "fast-paced," "easier," or "shorter." Other respondents focused on the benefits of such a class: "convenient" or "flexible."

When probed, most respondents struggled to get beyond (as one respondent put it) “a course you take online." It was described as a course over the internet or on the web, or on the computer.

Although a few students had taken hybrid courses and almost all had taken courses with online components, students did not appear to see these classes as ‘online’. Only one student who had signed up for an online course and discovered only afterwards that she had to take the exams in person, identified an ‘online’ class as one that included a requirement of some physical presence.



  3. Have you taken classes that conducted any activities online? What activities were online?



Fifteen respondents initially answered that they had conducted activities online. The activities they listed covered a variety of course-related things that have to or can be done online:

" ...reading articles and answering questions about them, doing homework and returning the assignment to your professor, and working on papers."

“Activities online could be anything that you individually or as a group need to work that’s classroom related it could either be finishing your homework together or lab reports or just understanding the content that will be covered in the class the following week.”

"...pick up homework assignments, check on grades"

There was only one respondent who had never had any online activities in any of her courses.

In follow up probing, respondents were shown a show card with online activities and asked if they had ever taken a course that included activities from the list:



SHOWCARD:

  1. Exams or quizzes

  2. Lectures or presentations

  3. Orientation or first day of class

  4. Homework or problem sets

  5. Discussions (e.g., e-board postings or live chat)



All respondents who reported that they had taken online courses also reported having done at least some of the listed activities in at least some of their traditional (i.e., not 'online') classes.

Three respondents who initially answered No were reminded by seeing the show card that they had actually participated in at least some of the listed activities.

One respondent noted that he took a course in which he was not able to submit work via the phone but was told he had to submit it via a computer.  When probed if an online course would include work done on the phone or only on a computer, he said it was only on a computer. However, it was not clear if he would have considered phone work as an ‘online activity.’

Of the 10 respondents who reported they had not taken any online courses in Q1, nine reported in this question that they participated in some or all of the activities mentioned on the show card.

In answering the question about whether the respondent ever took a class that included the activities listed on the show card, respondents tended to think about and respond regarding multiple classes. The follow-up question, probing to learn the extent to which course activities had been online, was also answered by discussing a variety of courses that included online activities to different degrees.


  4. Some schools offer, or may even require, courses to strengthen your skills in particular topics, such as math or reading. Some schools use placement tests to determine if these courses should be taken. Did you ever take any courses like this?



Most respondents interpreted the question as asking if they had taken a placement test, rather than if they had taken any skill strengthening courses. Some thought the question was asking about placement tests that determine if they needed skill strengthening classes and those that are used for advanced placement to skip out of lower college level classes.

Of the 16 respondents who answered Yes, only six indicated that they took skill-strengthening classes. Of the four who had answered No, three had also interpreted the question as asking about placement tests and indicated that they had not needed to take placement tests because they had high ACT scores or the equivalent. Two students who had answered Yes had taken placement tests and then placed into advanced classes at the college level.

Most respondents could not identify a specific term for these classes, with several referring to them as ‘zero’ level classes that come before the 100 level. One used the term ‘skill building classes’.

When asked if there might be an alternative way to ask the question, two students said it needed to be shorter.

Recommendation: The question as worded places emphasis on the placement test immediately before the question itself, and respondents are answering with that definition rather than the question in mind. We recommend testing the question without the second sentence.



  5. Competency-based education programs evaluate student progress based on their mastery of skills. Courses in such programs are often self-paced, with students moving through the courses at a longer or faster period of time, based on their personal situation and demonstration of skills. Does this describe any programs you are, or were ever, enrolled in?



Like question 4, this question produced false positive responses, and subsequent probing showed evidence of that overreporting. Fourteen of the 20 respondents answered Yes, although in probing only one student claimed to know what 'competency-based education programs' meant and two others thought the phrase sounded vaguely familiar.

None of the seven respondents who had attended a CBE school were familiar with the term, although three said they had taken a course that fit the definition. This included the culinary school student who was allowed to move through skill development material more quickly during the course; toward the end of the semester, he had to sit in class filling time while the other students mastered the skills. A second student had taken an online course that was self-paced. The third student described the coursework becoming progressively more difficult: “as you climb that ladder it gets more and more complicated cause they want to see that you can teach someone too …so it was getting harder as I go.”

Three non-CBE students who had responded Yes described programs or courses that had some self-paced components, including two students who took accelerated programs with fixed length classes (although both of these programs appeared to be designed only as an accelerated option). One student was allowed extra time to complete coursework at a slower pace.

Additional students described professors pacing a class to meet the needs of the individual students in the class. One described it as getting graded on skill mastery.

Pacing and self-pacing were most commonly highlighted during probing. Except in the case of the respondent who was able to take extra time to complete a class, flexible course length was not highlighted as a qualifying factor of the definition.

As one student pointed out, the definition 'evaluate student progress based on their mastery of skills' would describe a number of programs: “Well, if you want to get technical with your definition any class is going to be competency-based and there is going to be some self-pacing...”

Only one student, who had never heard of such a program, gave a clear description of what the question was asking: “if a student is outpacing their peers, they get to move on.”



Recommendation: To reduce the number of false positives, we recommend unfolding this question into the following two and testing them in round 2.

A. Are you familiar with competency-based education programs?

B. IF YES: Have you ever been enrolled in a competency-based education program?

This may lead to false negatives, so it would be important to probe fully and provide the definition during probing to see if any CBE courses are missed.



  6. Did you receive college credit for skills you learned outside of school? This is sometimes described by schools as Prior Learning Assessments (PLA). Students who earn this credit often submit a portfolio for review. Examples include credit for prior work experience or for military service. Have you ever received credit for skills you learned outside of school?


Three of the 20 respondents thought they may have heard the term Prior Learning Assessments before, although none of them had received credit. Many of the students found the question confusing or were unsure of what should be included. As one student put it, “It gets confusing at the bottom,” possibly referring to the fact that by the end of the question, with its definitions and examples, the cognitive demand on the respondent accumulates.

Five respondents answered Yes, although all five were referring to credit they got through work experiences arranged by their college programs. These experiences were referred to as internships by three respondents and volunteer work by two others. One of the internships was a culinary student working in a college-owned and -run restaurant as part of his course; another was a student who was required to do volunteer work in a pharmacy as part of a class.

One student who had answered “No; doesn’t apply to anything I would do” had received her CNA education and licensure while still in high school, in a program offered to high school students through a local college. When she went on to get her LPN, her college accepted the CNA certification and did not require her to take the CNA class required of all LPN students. [PROBE: So in this question, when you see ‘prior learning assessments’ you don’t see that as what you did?] “Well it’s similar but I wasn’t for sure.” [Probe: So you answered No because you weren’t familiar with the term?] “Yes, I wasn’t familiar with the term but thinking about it, it reminds me of when I was in high school…” “…I saw ‘learned outside of school’ and I was like, I’m just trying to get done …get in and get out so …I was just trying to get in and get finished, not other programs.” [Probe: You saw this as continuing education, not somebody like you who was trying to get finished?] "Yea, yea."

When asked about what outside of school meant in the question, people included internships, volunteer work, high school AP classes, language fluency and the examples included in the questions such as military and work experience.



Recommendation:

Test the question without definitions, labels or examples:

Did you receive college credit for skills you learned outside of high school or college?

Probe to make sure there are no false negatives.



  7. How confident are you that you could come up with $2,000 if an unexpected need arose within the next month?

    1. I am certain I could come up with the full $2,000.

    2. I could probably come up with $2,000.

    3. I could probably not come up with $2,000.

    4. I am certain I could not come up with $2,000.



Half of the respondents said they could probably come up with $2,000, another five said they were certain they could, four thought they probably could not, and only one was certain he could not.

All respondents appeared to interpret the question as intended.

When asked how they felt when answering the question, twelve respondents expressed no discomfort, although a few of them thought it was oddly placed or odd to have in the instrument. Of the remaining eight, most noted that the question itself was not an issue; rather, thinking about the reality of their financial situation and choosing a response made them feel nervous, “shaky,” or bad. In contrast, another respondent who loved this question said that it “measures a person’s drive” and that he felt great having to consider and realize that he believes in himself (despite having few resources, this person chose the first option). One respondent noted that the question itself was uncomfortable: “Income questions are never comfortable, slight unease when you have to explain money situations.”



  8. Suppose you had $100 in a savings account and the interest rate was 2% per year. After 5 years, how much do you think you would have in the account if you left the money to grow?

    a. More than $102

    b. Exactly $102

    c. Less than $102



Most respondents were able to answer this question correctly, although only a handful appeared to use math in any meaningful way.
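For clarity, a brief worked example of the arithmetic behind the intended answer (added here for reference; it assumes annual compounding, which the phrase "left the money to grow" implies):

$\$100 \times 1.02^{5} \approx \$110.41$

Even without compounding, simple interest would add $5 \times \$2 = \$10$, so the balance exceeds $102 either way and "More than $102" is the intended answer.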

Fewer respondents were uncomfortable with this question, although a couple did express that it made them feel like they were back in school, and several said they were unsure of their answer and that their lack of confidence was uncomfortable. “Are there going to be math questions like that? I feel like I’m taking a quiz now.”

One respondent who answered C (less money) said that she really did not understand this question. She said that she feels banks take away money from people in her age and financial bracket (describing monthly finance charges when your account falls below a certain amount). She said that the customer always loses money if they don't have an account that keeps the minimum amount required to earn interest. She did not have a negative reaction to the question but did not feel that it was appropriate for someone in her current situation: young, not in a high-paying job, and thus not able to keep larger funds in a bank account.

  9. Imagine that the interest rate on your savings account was 1% per year and inflation was 2% per year. After 1 year, how much would you be able to buy with the money in this account?

    a. More than today

    b. Exactly the same

    c. Less than today



Four respondents were able to answer this question correctly, three citing inflation being higher than the interest rate and one saying “The way the world is today it’s going down…with the way the world is now it seems like it’s going to be less and less.” [PROBE: When you answered this were you looking at the percentages, 1%, 2% or the way the world is going?] “The way the world is going…how the government is now.”

Half the respondents answered that they would have more than today, generally ignoring the inflationary portion of the question or noting they were confused or ‘not good at math.’ Four respondents answered they would have exactly the same, and one left the response blank, saying he did not have enough information.

A number of respondents expressed that they were confused, or that the question was confusing as worded. This, combined with the question coming directly after question 8, may have contributed to respondents’ difficulty. It appeared that several respondents thought this question was just like question 8 but with a one-year reference period.

Poor understanding of inflation, as well as the change in wording from how much is in the account (question 8) to buying power (question 9), clearly confused respondents.
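For clarity, a brief sketch of the intended reasoning (added here for reference; $100 is used only as an illustrative starting balance, since the question does not state an amount): because the interest rate (1%) is below the inflation rate (2%), the account's real purchasing power falls. After one year,

$\$100 \times \frac{1.01}{1.02} \approx \$99.02$

in today's dollars, so the intended answer is "Less than today."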

Respondents did not express additional discomfort except as it related to their confusion or lack of confidence in their response.

Recommendation: Consider question order effects and test questions 8 and 9 separated by other questions.



  10. Do you think that the following statement is true or false? “Buying a single company stock usually provides a safer return than a stock mutual fund.”

    1. True

    2. False

    3. Don’t know



Just over half the respondents responded False. Seven respondents answered don’t know and two answered True. Those who answered True or False were evenly split between feeling confident or not in their answer.

Although most respondents who were asked were able to describe a single company stock accurately, stock mutual funds were less familiar. One who said that a mutual fund was safer described it this way: “It’s like a checking or a savings account, instead of spending the money you are saving the money.” Another respondent, unsure of the meaning, said that to him it sounded like “mutual lending, Lending Tree.” He was unsure what the word “stock” meant in the context of stock mutual fund and returned to the idea of lending. Another respondent who answered False said, “I don’t know but I would guess it [a mutual fund] is multiple people investing in a stock? Or multiple stocks? Whatever it is, there is a group element so there less to risk.”

Again, the only discomfort expressed was mild and related to acknowledging that they were unsure of their answer, rather than to the question itself.


  11. If your household somehow were to get an extra unexpected $25,000 in the next few weeks, what would it do with the money? (Check all that apply)

    1. Spend it on something the household wants or needs

    2. Pay off some household debts

    3. Put it in savings or investments

    4. Donate it to family or charity

    5. Other (if Other, please probe for specifics)



No respondents expressed discomfort with this question.

Most respondents chose multiple responses. Five chose ‘Put it in savings or investments,’ although one of them saw ‘investments’ as investment in himself, in his self-employed business.

Interestingly, the concept of household was not universally thought of as the people one lives with and shares income and expenses with. Two respondents clarified that they answered for the household, but that did not include the people they were living with; their “household” consisted of family members who live elsewhere. Five responded only for themselves despite living with others. Four respondents were the only adult in the household, or the only adult who participates in financial decisions. One respondent specifically did include his roommates in his decision making.



  12. Do you have any student loan debt?


Eleven respondents reported having no student loan debt, including two who had debt in the past but no longer do. Both of these students had understood this question to be asking whether they currently have any school debt.

Six of the respondents said they understood their loans and the process of repayment, with one noting that she had to take an ‘exam’ showing that she knew about subsidies, loan repayment and so on before she was allowed to borrow.

Of the five who indicated they had not understood the loan process, one said that the repayment amount had come as a shock to her. Another noted that she knew she had to pay the loan back but had not realized repayment would begin immediately upon graduation. One noted that she is still confused: “they are not clear about when what is due”. One who thought she had understood at the time had not realized she was taking out more than one loan. She stated that had she known then what she knows now, she would not have taken a loan.

One last respondent had answered “Yes, but I don’t have to start repaying.” This student had been confused by the question and was initially unsure how to answer it. He sees ‘debt’ as a loan in arrears, not as the loan itself: “I am not in debt but I do have student loans.” The confusion in his response was that he and the school disagree on whether he should be paying and whether he is current on the payments. The respondent has received a letter from “fed-loan” saying he should repay the loan, but he maintains that he has been out of school for only two months and is still listed as ‘currently enrolled,’ even though he may not return. Thus, had the question asked “Do you have a student loan?” this respondent would have answered ‘Yes.’ But as the question is worded, he would answer ‘No,’ except for the fact that he is in arrears (according to the school).



13). Have you completed a certification/associate's or bachelor’s degree since leaving high school?

  • Yes GO TO 13A

  • No, I am still taking classes towards a degree GO TO 13B

  • No, I have stopped taking classes towards a degree GO TO 13C

13A. Did it take you longer than you expected to complete your degree? YES / NO GO TO 14

13B. Is it taking you longer to complete your degree than you expected? YES / NO GO TO 14

13C. Was it taking longer than you expected to complete your degree? YES / NO GO TO 14



Eleven people reported they had completed their degree, with five of those saying it took longer than expected. Five respondents are still taking classes, and four of them reported that it is taking longer. Four respondents are no longer taking classes; half of these said it had been taking longer than expected to complete the degree.

A show card was shown to respondents, containing reasons why some students may take longer to complete a degree:

  1. Personal reasons—medical, family, or other non-financial, non-academic reasons

  2. Financial reasons—the costs of attending required periods with less or no enrollment

  3. Transfer issues—credits lost during transfer between institutions

  4. Changed major—new major required additional coursework

  5. Extra-curricular academics—participation in activities such as internships, co-ops, study abroad

  6. Academic advising—an academic advisor suggested a lighter course load

  7. Scheduling—could not get the classes needed, when needed



Respondents appeared to understand the reasons for delays listed on the show card; however, it became apparent that several categories had potential overlap. One respondent, for example, pointed out that an expected credit for an internship did not transfer to her program. When probing about jobs, an overlap between financial reasons and job-related reasons became apparent: some respondents saw job duties or hours interfering with schoolwork as the issue, whereas others saw having one or two jobs as a financial necessity to pay for school or other expenses. "I stopped taking classes but it wasn't because it was taking too long, it was because I needed to work and my work schedule interfered with my school schedule. Not that one was more important, but at the time I just needed to work because I had to live.”

Respondents were thoughtful in coming up with additional reasons not on the list, including:

  • Accreditation issues – this impacted both transfer and the ability to secure loans for schooling

  • Hard for older students to jump back in (after being away from school for years)

  • Failing course and having to repeat (throwing off schedule)

  • AP classes not getting credit

  • Job activities - taking focus off school/interfering with getting work done

  • Job activities - Paying for school /working to pay for school

  • Stress

  • Enthusiasm of teachers (motivating students to complete a degree program)

  • Transportation issues/distance to school

  • Self-confidence/"they need a push"/Not sure what they want to do

  • Trouble with the law

  • Jury duty





14). In what ways did your undergraduate student loan debt influence your employment decisions?


Five respondents had no student loan debt. Of the remainder, seven initially said loans did not affect their employment decisions.

After the question was asked in open-ended format, a show card was shown to respondents who said the loan debt affected their employment decisions, with a list of possible ways:

  1. Took job outside field of study or training

  2. Took less desirable job

  3. Had to work more hours than desired

  4. Had to work more than one job at the same time



On viewing the show card, six respondents selected multiple items. All respondents appeared to understand the question and show card, although most appeared to read it more broadly as how loan debt has influenced life decisions rather than just job decisions.

Three respondents said that loan debt influenced their field of study (including continuation to graduate school) and suggested adding an option for loan debt influencing the choice of field of study and, thus, the jobs they were qualified to get. Some respondents noted that their current employment decisions are related to being in school and are not yet influenced by loan debt, or are made in a way to avoid additional future loan debt. One respondent noted that the job he got to avoid additional loan debt influenced his field of study.

One respondent suggested adding "change personal spending habits," while another suggested adding "take on more debt," referring to taking on consumer loan debt.


One respondent, who would have preferred to work from 9 to 5, instead worked a different shift at the same job and suggested that this ought to be a response option.



15). In your current job, do you...

  a. Supervise the work of others?

  b. Participate in hiring or firing decisions?

  c. Participate in setting salary rates for employees?


Nine respondents did not circle any response, although several of these nine had some sort of leadership role: one respondent participated in training new employees, two were responsible for setting work schedules (with one of these also responsible for directing crisis response situations in his mental health job), and one was the floor lead in her restaurant job. When probed about why she did not consider floor lead a supervisory position, this respondent said that she read the three activities as one and saw them all as ‘management’; since she does not see herself as management, she did not consider answering A. When probed further, she said that if the items were separate she probably would have said yes, and if the item had said ‘sometimes’ she definitely would have put yes.


Half the respondents said they have a supervisory role; four of these also participate in hiring and firing, and three of those also set salary rates. One respondent marked option C because she has a pet care business and sets the rates for her clients.


Recommended additions to the list were (including those mentioned above):

  • Setting schedules/assignments

  • Listening to others/ recognizing the work of others

  • Training

  • Participating in new ideas about the business or management

  • Being responsible for safety

  • Making sure others do their work/don’t sleep on the job

  • Directing in a crisis



Recommendation:

  • Include yes/no for each item or mark all that apply with a “none of the above” option.



16). On a scale from 1 to 5, where 1 is very unsatisfied and 5 is very satisfied, please indicate how satisfied you are with each of the following in your current job:

  a. Your pay

  b. Fringe benefits (such as health insurance or retirement benefits)

  c. Importance and challenge of your work

  d. Opportunities for promotion and advancement

  e. Opportunities to use your training and education

  f. Job security

  g. Opportunities for further training and education



Respondents in general understood this question as intended and appeared to understand the scale. All respondents used the scale appropriately given their understanding, with the exception of ‘neither satisfied nor unsatisfied,’ which was used both as an in-between measure of satisfaction and as a not-applicable response.

There were some slight differences in how the response options were understood. For example, satisfaction with benefits among those who get benefits ranged from 2 to 5, whereas ratings from those with no benefits ranged only slightly lower, from 1 to 4. During probing, a couple of respondents noted, both for this item and others, that they were still early in their careers or still in school, so their expectations were lower than they would be otherwise. Although not probed specifically, it appeared that medical coverage was the only benefit considered. As one self-employed respondent noted, her insurance is through her parents and she is not thinking about retirement yet.

Most respondents appeared to select option C (importance and challenge of your job) in terms of how much the job challenged them. However, three respondents appeared to see this item as the importance or challenge in relation to other career paths. One of the respondents who worked in the mental health field and marked 5 (very satisfied) noted “…so it does seem like something that does add good to the world.” Another respondent who runs a non-profit organization noted she was very satisfied because she believes in the mission of the organization and feels challenged by many of the things she does on a daily basis. A third respondent who was a cook also answered in terms of society, giving this answer a 3 for himself. “It depends on what a person does, I feel [being] a doctor is very challenging and very important, no shame. But I cook; not to say it’s not important but I just think there are other things out there that are way more important to the livelihood of others than what I do. It may be very important to me but that would be a totally different question.” [Probe: So you were thinking of this in terms of importance of your work compared to saving lives or the equivalent?] “Yea, I was thinking in terms of every other position and what people do and the importance of it. If it had been personal importance …I would rate it higher than that because I like what I do.” [Probe: What about if it were importance to your company, that is, how important you are?] “Yea, a higher rating”. [When probed on the challenge of the work separately from importance R said he would rate that differently too.] “I would definitely rate it higher, because when you think of that you think in terms of yourself not everything else”

Respondents who had no current job were unsure what to do with this question, or skipped it entirely.

There was some confusion about options E and G with some respondents thinking they overlapped.

Debriefing

At the end of the interview, we asked respondents if they wanted to make any additional comments about the questions tested. We probed on anything that seemed out of place or any questions that made them feel uncomfortable.

Six respondents mentioned that the math questions seemed out of place, were confusing, needed better wording, or were slightly uncomfortable, or they asked what the correct answers were.

Four respondents mentioned finding questions 5 and 6 too long or confusing.













Cognitive Testing of Questionnaire Items

Beginning Postsecondary Students Longitudinal Study of 2017 (BPS:12/17)

Round 2





Katherine Kenward, MA

Alisú Schoua-Glusberg, PhD







Findings

Research Support Services, Inc.

January 2016

Introduction



This report summarizes findings of Round 2 cognitive/usability testing of the Beginning Postsecondary Students Longitudinal Study of 2017 (BPS:12/17) electronic survey instrument conducted by Research Support Services Inc. Twenty cognitive interviews were conducted in Illinois, in the Chicago Standard Metropolitan Statistical Area between November 13 and December 11, 2015.

Twenty adult participants were selected to provide representation of the specific populations of the BPS:12/17 study. All respondents were first enrolled in a postsecondary program between July 1, 2010 and June 20, 2012. Respondents were selected to meet particular institutional characteristics, including level of degree offered (less-than-2-year, 2-year, and 4-year) and institution control (public, private not-for-profit, private for-profit). The student sample included program/degree completers, those still enrolled, and those who left prior to completing a program. Married respondents and respondents who had taken out student loans were also targeted to test specific questions regarding financial aid and loan debt.

The following table displays respondents’ characteristics:

RSS ID | Month/Year First Enrolled | Completion? | Level | Control | Loans? | Married?
BPS206 | 8/1/2010 | Still enrolled | 4-year | Private, for-profit | Yes Loan | Not Married
BPS220 | 10/1/2010 | Completer | Less-than-2-year | Private, for-profit | Yes Loan | Not Married
BPS223 | 10/1/2010 | Completer | 2-year | Private, not-for-profit | Yes Loan | Married
BPS224 | 1/1/2011 | Completer | Less-than-2-year | Private, for-profit | Yes Loan | Married
BPS226 | 9/1/2010 | Leaver | 4-year | Private, not-for-profit | Yes Loan | Not Married
BPS228 | 7/1/2010 | Completer | Less-than-2-year | Private, for-profit | Yes Loan | Not Married
BPS233 | 9/1/2011 | Still enrolled | 2-year | Public | Yes Loan | Not Married
BPS235 | 9/1/2011 | Still enrolled | 2-year | Public | No Loan | Not Married
BPS237 | 8/1/2010 | Leaver | 4-year | Public | Yes Loan | Not Married
BPS238 | 8/1/2010 | Completer | 4-year | Private, not-for-profit | Yes Loan | Not Married
BPS241 | 8/1/2010 | Completer | 4-year | Public | Yes Loan | Not Married
BPS246 | 8/1/2010 | Completer | 2-year | Private, not-for-profit | Yes Loan | Not Married
BPS250 | 3/1/2011 | Leaver | Less-than-2-year | Private, for-profit | Yes Loan | Married
BPS252 | 8/1/2011 | Leaver | 4-year | Public | Yes Loan | Not Married
BPS253 | 5/1/2011 | Completer | Less-than-2-year | Private, for-profit | Yes Loan | Not Married
BPS254 | 8/1/2011 | Completer | 4-year | Public | No Loan | Not Married
BPS264 | 8/1/2010 | Completer | 4-year | Public | Yes Loan | Not Married
BPS273 | 9/1/2010 | Leaver | 2-year | Public | No Loan | Not Married
BPS276 | 6/1/2012 | Completer | Less-than-2-year | Public | No Loan | Not Married
BPS278 | 8/1/2010 | Leaver | 2-year | Public | No Loan | Not Married



Participants were recruited through online postings and flyers targeted at various postsecondary institutions and organizations. Flyers (shown in Appendix E) were handed out to students on college campuses (including Chicago community colleges) and were posted in a variety of community venues, including job search centers and businesses employing those with licenses and technical certifications, such as beauty parlors and restaurants.

Interested candidates were screened for eligibility by phone (see Appendix D). Of the 20 total participants, ten were currently or previously enrolled in a public institution, six in a private for-profit institution, and four in a private not-for-profit institution. Eight were currently attending or had previously attended a 4-year program, six attended a 2-year program, and six a less-than-2-year program. Eleven students had completed their first program, six had left their program, and three were still enrolled. Fifteen respondents reported having received financial aid, and three respondents were married. Of the twenty respondents for Round 2 testing, eleven were female and nine were male.

Interviews were conducted in an interview room in a Chicago area hotel or in private office space in downtown Chicago. All locations were convenient to multiple forms of public transportation. Interviews lasted approximately 90 minutes. Each interview began with an introduction and administration of a consent form (see Appendix A). The interviewer then provided each respondent with a unique CaseID, which they used to log in to the web-based survey instrument using their own smartphone or tablet. Respondents were asked to complete the survey as they would on their own. Interviewers administered specific probes on 38 questions predetermined by NCES/RTI to be of particular interest. Interviewer probes were designed to garner feedback on issues such as student enrollment, education experiences, financial aid, financial literacy, and employment. Interviewers also noted any usability issues arising as respondents navigated through the survey instrument. Probed questions and specific interviewer probes appear in Appendix B. Interviewers presented respondents with a show card (see Appendix C) to test an alternate version of one question regarding employment industry. Findings for each of the 38 probed questions are provided below. Interviewers also noted and spontaneously probed on any additional questions where the respondent indicated there was a cognitive or usability issue.


Findings on Probed Questions

1. B17AOTSCH01




2. B17AOTSCEX01







Qs 1 and 2:

Goal: To determine respondents’ understanding of “attended” and whether or not they are providing accurate enrollment information in the survey based upon their understanding. For usability purposes, we are interested in whether or not they find their school in the results provided.



Respondents were asked to paraphrase the meaning of “attended” in Qs 1 &2. Five respondents interpreted “attended” as “enrolled,” and four as “taking classes.” One respondent commented that attending a school is not necessarily physical: “…just because you don’t physically attend campus does not mean that you are not attending classes (220/103).” Another respondent thought the opposite and said he would not count an online class: "I was just home, I just had to log in. I never physically attended the school...I was trying to make up a credit (264/117)."

Most respondents expressed a preference for Q2 (new predictive search form) and thought it was easier to select their school from the dropdown list. Only one respondent expressed a preference for Q1 (traditional coding form).

All respondents were able to find their school and all appeared to provide accurate enrollment information. One respondent had attended only one school and was directed to Q1 in error.

A few respondents were confused by being asked two versions of the same question. One respondent (237/110) thought Q2 was asking about a third school attended and skipped the question. Respondent 220/103 entered two different schools at Q1 and Q2. A third respondent (206/102) entered school name differently at Q2 after seeing how the school name appeared on the list at Q1.

Usability: One respondent (246/118) had an issue with the survey freezing and had to reload the instrument. The interviewer noted that this may have been an internet connectivity problem. One respondent (254/119) started the survey using an iPhone4 and the dropdown list at Q2 did not display on her phone.



Findings for questions 3, 35, 36, 37, 38 are discussed together:

3. B17AOTMJ01EX



35. B17AMAJ1



36. B17AMAJ1EX



37. B17BOMJ1A




38. B17BOMJ1AEX




Respondents were probed regarding preference for the traditional coding form used at Q35 and Q37 versus the new predictive search form at Q36 and Q38.

Overall, more respondents seemed to prefer the traditional coding form over the new predictive search form. They were more confident they had chosen the right major, because they saw more descriptive options. Respondent 264/117 remarked "I like the first one because it’s more laid out...Because with the dropdown list you may miss your specific field."

One respondent (238/112) who had changed majors preferred one format for one major and the other for the second major: “Ok, so when I was doing it for ‘English’ (her most recent major), I preferred the drop down box, but…for ‘Engineering’, they specified it so specifically that it was easy to look at that way as opposed to the drop down box.”

Four respondents said they preferred the new predictive search form because it was simpler, and their major “just popped up”. Two respondents expressed no preference between the two formats.

Usability: Respondent 235/116 encountered a usability issue at Q35. The page did not fit on the screen of her iPhone5 and she was not able to see the 'select' buttons. She had to swipe to the left in order to find the buttons and when she was able to view the list of majors, the screen froze. When this respondent tried clicking 'previous' she was kicked out of the survey.

The instrument froze at Q37 for respondent 238/112. When she clicked the ‘back’ button she was directed to the beginning of the survey.

The predictive search did not appear at Q36 for one respondent using an iPhone4 (254/119).





4. B17BIMPACT



Goal: Have respondents heard of these types of programs? Are the program titles being interpreted consistently across respondents, or are respondents interpreting them differently?



Research project with a faculty member outside of course or program requirements

Three respondents selected ‘Yes’ for ‘Research project with a faculty member outside of course or program requirements’; however, all three seemed to be false positives. Respondent 237/110 was thinking of a senior thesis, 250/113 referred to a classroom activity, and 246/118 was considering private music lessons.

In general, respondents seemed confused by this item. Several did not seem to understand that the question was asking about research “outside of course or program requirements” and talked about research projects they did for classes (278/124, 237/110, 250/113).

Respondents were probed as to their interpretation of this item. One respondent (224/104) thought it was something like tutoring. Another respondent (226/107) thought it could be help getting a job or something outside your major: “Somebody as a faculty member, they are helping you maybe to get a job so they are working with you to get a position outside of school. Or even if you just are doing a project with a faculty member outside of you major.”

Another respondent (238/112) expressed confusion about why you would work with a professor ‘outside of course or program requirements’: “I don’t think I understand that one. That one seemed kind of fishy…why would you need to do research projects with someone outside of your course or program? Unless you are just getting mentored by them. It is kind of weird, I am not trying to hang out with my professor if I don’t have to. ”

Community-based project as part of a regular course

Eight respondents selected ‘Yes’ for ‘Community-based project’ and all seemed to understand the item as asking about working or volunteering in the community as part of a class. Respondent 228/105 explained that "It was for a biology course. We had a community garden." Respondent 220/103 mentioned a service learning component: “Before I can become a certified addiction and drug counselor I have to complete almost 2,000 service learning hours, which are basically I have to volunteer someplace unpaid for close to 2,000 hours.”

Four respondents who selected ‘No’ at this item seemed somewhat unclear when expanding upon their responses. Respondent 206/102 was thinking of an internship, 235/116 referred to a service learning program he participated in in high school, 264/117 thought of extracurricular programs unrelated to a specific course, and respondent 278/124 thought of “Practical learning outside of school, a work study program.”

Culminating Senior Experience

Seven respondents selected ‘Yes’ at this item; however, only three seemed to understand the item as intended. Those who understood named a senior recital (246/118), a project with a professor (241/120), and a senior thesis (238/112).

Three respondents thought this item would include exams at the end of a course. Others either said they had never heard of these terms or understood the item as referring to the completion of a single course rather than the completion of a program. For example, respondent 235/116 referred to a “comprehensive exam” for a course: “I remember that for a class, I think anthropology class, we had to do a comprehensive exam and also a project. I am not sure with senior project what the ‘senior’ part is.”



Program in which you were mentored

Nine respondents selected ‘Yes’ and all seemed to understand the item as intended. Respondents mentioned faculty mentors and being mentored by a tutor. Respondent 220/103 described a specific program: “It is actually through the place where I am doing my externship at. They have a mentorship program, which is basically for students. It just provides guidance outside of our externship for a little more professional experience.” Another respondent, 238/112 described a mentorship program through her sorority: “Throughout high school I was in a program that a sorority put on at my school …when I went into college I continued the mentor program and I got mentored by one of the members of the sorority, and eventually I joined the sorority.”

Learning Community

Seventeen of the 20 respondents said they had heard of this type of program, but only five appeared to interpret this item correctly as referring to a formal program. Several respondents mentioned that it is common for students with the same major to take multiple classes together, but not as part of a ‘formal program.’ One respondent (253/115) was thinking of a work force center: “…the learning community. It's a work force center. They help you find a job. Help you with your job search.” Respondent 226/107 thought this item referred to a program for students struggling or failing a class, and respondent 235/116 interpreted it as getting extra help from a teacher.



Usability: One respondent (264/117) misread the first item as part of the overall instruction and thought that ‘outside of course or program requirements’ applied to all items. The interviewer noted that this was likely due to scrolling quickly through the items.





5. B17BCBE




Goal: Do respondents know what competency-based education programs are?

Three respondents selected ‘Yes’ at Q5, and all three seemed to be false positives. Respondent 220/103 was thinking about prior learning assessments, 235/116 was thinking about a course where a professor let her turn in assignments a week late, so she thought it was “at her own pace,” and 250/113 was thinking of a course where he did not have to attend class as long as he could pass the tests.

Fifteen of the 20 respondents included “at your own pace” as part of their explanation of what the question was asking, six mentioned ‘traditional classroom activities,’ and seven mentioned ‘demonstration of mastery.’ Nine respondents seemed to be thinking of a program, while the other 11 respondents were not. One respondent, 254/119, was thinking of an independent study with a professor for credit. Another respondent, 237/110, was thinking of massive open online courses (MOOCs).


Usability: No usability issues





6. B17BPLA



Goal: Do respondents know what credits for prior learning are?



Most respondents, 15 of the 20, generally understood that Q6 was asking whether they had ever received credit for knowledge or skills learned outside of school. Many respondents mentioned having heard of programs like this, with eight specifically referring to the fact that it was asking if you got credit for things learned “on the job or in the military” and not in school. No respondents were able to name a specific program, however.

All three respondents who answered ‘Yes’ at Q6 seemed to have been false positives: 241/120 was thinking of high school AP classes, 224/104 was thinking of ROTC, and 246/118 was thinking of placement tests.

Five respondents mentioned AP tests as a way to get credit “for prior learning,” and three of the five understood they should not be counted because they were earned in high school, not “on the job or in the military.” No respondents mentioned CLEP tests by name, but two respondents did mention taking tests to get credit.

One respondent (237/110) thought of programs intended to "acknowledge what older students bring to the table through the life experiences that they've already had and...recognizing that there's more than one way to learn stuff." This respondent was thinking specifically about "nontraditional students" as older students returning to college or students entering after military service.

Usability: No usability issues



7. B17BFEWERCRS





Goal: Is this a comprehensive list of reasons why respondents had to take fewer classes or time off school (which ultimately leads to slower progress toward a degree)? How do respondents define “take fewer classes or time off school?”

All respondents understood Question 7 as intended; there was no confusion about the meaning of ‘take fewer classes or time off school.’ All respondents were able to select appropriate responses and explain their reasoning.

All but one respondent noticed the instruction to ‘please check all that apply.’ One respondent (237/110) said she noticed the instruction and commented, “I generally assume if it's like a box rather than a circle I assume that I can check multiple ones."

In general, respondents thought that the list of response options was comprehensive. When probed as to whether the list was complete, all respondents said yes.

One respondent (254/119) also suggested adding “volunteering” to the list: "Maybe just wanting a break from school. Or maybe…this could fall under personal reasons but volunteering like…Peace Corps or something like that."

One respondent, 264/117 selected ‘other’ and entered “prerequisites.” He explained that he had to wait to take some courses until taking other prerequisite courses: “it was like certain classes that I needed so I had to wait out on taking certain classes so I could take this one."

Respondent 226/107 said the list was comprehensive when probed, but the interviewer noted that this respondent did not seem to have read the response options, having just read the question and selected ‘none of the above.’

Respondent 238/112 considered adding an option for taking fewer classes or time off due to disability: “Unless they had a difficulty with their, – but it would not cause you to take fewer classes – I am saying like a difficulty... like actually learning disability or something like that, and they had to take fewer classes. That’s the only thing I can think of, but I think that pretty much covers it.”

In debriefing, respondent 252/114 mentioned that there should be a probe "on social life while attending college and how that plays a part for why you didn't complete school."

Three or four respondents mentioned liking the ‘Other (SPECIFY)’ option in case someone wanted to add something not on the list.

When probed, respondent 253/115 said that nothing was missing from the list of response options, but then mentioned that the list did not include being unable to afford bus fare or meals, or having a job and not being able to go to school on days when certain courses are offered.

Respondent 254/119 suggested adding an N/A option for cases where a respondent feels the question does not apply.



Usability: No usability issues.



8. B17BMORECRS





Goal: Is this a comprehensive list of reasons why respondents had to take more classes? How do respondents define “take fewer classes or time off school?” As with B17BFEWERCRS, this question is intended to learn more about slower progress toward a degree.

In general, respondents had no difficulty interpreting or responding to Q8. Nineteen of 20 respondents reported having noticed the ‘check all that apply’ instruction (one respondent was not asked due to interviewer error).

The majority of respondents felt the response options for Q8 were comprehensive. Five respondents offered suggestions for additional response options:

  • Respondent 224/104 suggested adding an option for “Graduate early.”

  • Respondent 246/118 suggested adding an option to cover taking extra courses out of special interest: "I was thinking some people just like taking extra courses just because they want to know more, I don't know if that's useful to the survey but I was actually thinking if that one was there I probably would have checked that too. Because there were some times where I loaded up a little more than I needed to just because I was interested in a class."

  • Respondent 241/120 mentioned taking time off, then returning to school and taking more courses in order to graduate on time. This respondent suggested having a space to specify when selecting ‘Other’ (a ‘please specify’ box does pop up when ‘Other’ is selected).

  • Respondent 250/113 said that the list was complete but mentioned an additional option of a student being “kicked out of school” for behavior problems or poor grades.

  • Respondent 252/114 suggested adding an option for having to repeat or withdraw from a course due to low grades.

  • Respondent 220/103 mentioned that having the option to select ‘Other’ would cover anything else.



Usability: One respondent (278/124) encountered a problem with the survey freezing at Q8. The respondent was able to refresh, and the survey picked up where she had left off. No other usability issues.



9. B17CWHYPRV



Goal: To determine whether respondents can provide reasons why they took out private loans. Is this list of reasons comprehensive, do any of the reasons overlap, and are we missing any common reasons?



Three respondents reported having taken out private loans. Q9 was NA for the remaining 17 respondents. One of the three (237/110) did not understand the distinction between private and federal loans: "I mean to be honest I don't really know that much about the distinction between them." Two of the three did not understand the meaning of the term ‘deferred.’ One of the three did not understand the meaning of ‘distributed by institution's aid office.’

One respondent (224/104) seemed confused by ‘Federal loans were not offered by my school.’ He first selected this option, but then changed his answer to No, saying, "They were not offered to me." Otherwise, respondents had no difficulty selecting and explaining responses.

Usability: No usability issues





10. B17CSPLN



Goal: To determine if respondents are aware of their spouse’s student loans.



Question 10 was asked of 3 out of 20 respondents (those who reported being married). All three were aware of their spouse’s loans, although one respondent did not feel comfortable answering questions about his wife.

Respondent 250/113 answered ‘Yes’ thinking of his wife’s “Sallie Mae” loans. Respondent 224/104 answered ‘No.’ This respondent’s spouse had student loans in the past that are now paid off. Upon probing, this respondent explained that he was thinking of current loans and not past loans.

The third respondent (223/106) said he would rather not answer questions about his wife, because he had not asked her permission beforehand. The respondent was told it was okay to skip any questions he did not feel comfortable answering. Probes were skipped per the respondent’s request.



Usability: No usability issues


11. B17CSPAMT



Goal: To determine whether respondents are able to provide an amount and their assessment of how accurate their answer is.

Question 11 was asked of 2 of the 20 respondents.

Respondent 250/113 provided an estimate, saying he was unsure of the original loan amount but knew how much is paid toward the loan monthly. The estimate that the respondent provided was for the outstanding amount of the loan: “I know because the bills come to the house.”

Respondent 223/106 did not feel comfortable answering questions about his wife, so the question was skipped per his request.



Usability: No usability issues



12. B17CSPOWE



Goal: To determine whether respondents are able to provide an amount and their own assessment of how accurate their answer is.



Question 12 was asked of 2 of the 20 respondents.

As with Q11, respondent 250/113 was not positive how much his wife initially borrowed but said he knows what is paid out each month. This respondent estimated the outstanding loan amount.

Respondent 223/106 did not feel comfortable answering questions about his wife, so the question was skipped per his request.



Usability: No usability issues



13. B17CSPLNPY



Goal: To determine whether respondents are able to provide an amount and their own assessment of how accurate their answer is.



Question 13 was asked of 2 of the 20 respondents.

Respondent 250/113 entered $200, stating that this amount is part of the monthly budget calculations for his household. The respondent was confident that this amount was exact: “I see the receipts…hers is always more than mine.”

Respondent 223/106 did not feel comfortable answering questions about his wife, so the question was skipped per his request.



Usability: No usability issues





14. B17DLNINC



Goal: What kind of life plans and decisions are respondents thinking of when answering this question? How do they interpret “life plans and decisions,” and are there other terms they use for these concepts?



Q14 was asked of 15 of the 20 respondents (those who had reported taking out a student loan). Nine respondents answered ‘Yes’ at Q14 and six answered ‘No.’

Respondents were probed for their interpretation of the term ‘life plans and decisions.’ All respondents understood the term as intended and were able to explain what was taken into consideration (regardless of having answered ‘Yes’ or ‘No’). Generally, respondents considered factors such as job/career decisions, living expenses, being able to travel, car loans, and continuing education. Taking a less desirable job or working outside one’s field of study were mentioned most commonly.

Some examples of responses include:

  • “When it says life and decisions to me is like, what has this made you think about? Has it made you think about different things like is this going to impact in the future car loans, home loans, apartments, and things of that nature. What are these loans going to impact? A lot of jobs now do a credit check.” (220/103)

  • Life plans and decisions: “I think they mean exactly that. Having children, buying a home, moving across the country, moving internationally, traveling, those sorts of things.” (228/105)

  • “I had to make sure that this is a career that I want to be in for a long time, and that this is something that I’ll be successful in and something that I can get a job in, so it’s kind of making me reevaluate everything now as opposed to later once you have the degree and you’re like, oh I can’t do anything with this degree and now I’m stuck paying off all this debt…it’s a little nervewracking.” (233/109)

  • "I do think that probably means decisions of like choosing a job that you're really passionate about versus a job that's going to pay you enough money so you don't have to worry about your loans or choices to travel, where you're going live, what kind of rent you can afford. And so far for me that hasn't really affected me in a major way." (237/110)

  • "I think things like the pressure of how soon to try to get a job and what type of job to get. Like do you hold out for a job that you really love or do you have to do the part-time grind until you find something." (246/118)

  • "It may influence you to get the first job you can just to have some income, and focus later on your dream as opposed to just jumping head first into trying to follow your dream or something, you just want to make some money." (264/117)



Interviewers noted whether respondents mentioned any items from the response options in Question 15 or if any items not listed were mentioned. Many respondents did consider factors listed at Q15:

  • Took job outside of field of study or training (3 respondents: 233/109, 246/118, 264/117)

  • Took a less desirable job (5 respondents: 237/110, 241/120, 246/118, 250/113, 264/117)

  • Had to work more than one job at the same time (1 respondent: 241/120)

  • Did not attend graduate program (2 respondents: 238/112, 241/120)

  • Could not afford to buy or keep a car (5 respondents: 206/102, 241/120, 246/118, 252/114, 264/117)

  • Vacation/travel (2 respondents: 228/105, 264/117)

  • Had to strictly budget money (2 respondents: 220/103, 224/104)

  • Had to delay purchasing a home* (2 respondents: 220/103, 228/105)

  • Moved back in with parents (1 respondent: 206/102)



*Several respondents mentioned giving consideration to housing/living expenses. However, five respondents (206/102, 220/103, 237/110, 252/114, 253/115) seemed to be thinking more about renting an apartment, “where you can afford to live,” or being able to live on their own than about homeownership. It may be helpful to broaden this response option to cover consideration of living/housing expenses more generally.

Respondents also mentioned factors not included in the list of Q15 response options such as:

  • Credit history (206/102)

  • Getting married/having children (206/102, 228/105)

  • Other debt, e.g., credit cards (220/103, 241/120, 253/115)

  • Shopping/spending habits (224/104)

  • Moving expenses (228/105)

  • Commitment to field of study (233/109)

  • Childcare expenses (253/115)



Usability: One respondent (206/102) encountered a usability issue at Q14. The instrument froze and the ‘next’ button did not work. The respondent went back to the previous question and was then able to move forward. No other usability issues were observed at Q14.



15. B17DLNICA




Goal: Do respondents identify with these reasons? Are any of these reasons confusing? Are there any other situations that aren’t included on this list?



Q15 was asked of 9 of the 20 respondents (those who had answered Yes at Q14).

PROBLEM: Wording appeared to be contradictory in the instrument. Respondents were asked to choose the MAIN influence (which suggests selecting one response option) but were also instructed to check all that apply. One respondent (246/118) said he chose only one response option, thinking he should select his “main influence” only, even though there were other influencing factors he could have also selected.

Respondents were probed as to whether any response options were confusing or unclear. Eight of the nine respondents said all response options were clear. All respondents were able to clearly explain their reasoning for selected responses.

One respondent (238/112) thought that the options about buying a car or delaying purchasing a home were too specific and could instead just be “delay purchasing something big”: “I think that’s a weird thing to throw in there, ‘delay purchasing a home’. That’s just a weird thing to throw in there with a list of... The car thing, the home thing I don’t really get.”

Respondents were clear about the meaning of ‘field of study or training.’ This response option was selected by 5 of the 9 respondents.

Five of the nine respondents felt the list was comprehensive, and four offered suggestions for additional response options. One respondent (228/105) selected ‘Other reason’ and specified “Moving.” Another (246/118) suggested adding an option to cover “abstract pressure” or “existential pressure.” Two respondents suggested adding an option to cover loans affecting their credit score or credit report: respondent 252/114 said they would have selected ‘Credit Report’ if it were an option but did not select ‘Other’ and specify, and respondent 253/115 said that "affected your credit score negatively" should be added.

One respondent (220/103) seemed somewhat confused by Q15. This respondent answered ‘Yes’ at Q14, indicating that loans had influenced her life plans and decisions, but then selected only ‘None of the above’ at Q15. The interviewer noted that the respondent did not seem to recognize that Q14 and Q15 were related; she explained her responses to both questions well, but the responses did not correspond with her explanations. She mentioned things such as credit card debt, budgeting, and car/home loans at Q14, but then at Q15 stated, “I actually haven’t had to experience any of these as of yet, hopefully I don’t have to experience these things.”



Usability: No usability issues







16. B17DWRK1YR1



Goal: Can respondents accurately recall and report information from this timeframe? Do they accurately interpret the question, and are they thinking about the correct year when they answer?



Fifteen respondents answered ‘Yes’ at Q16 and five answered ‘No.’ All respondents were thinking of the right time frame, and all answered about jobs they had while in school. Most respondents had a single job. Respondents who had several jobs considered the one they held longest or worked the most hours.

Respondents were probed as to what types of jobs they included or excluded when considering a response. Examples of jobs respondents chose to exclude are as follows:

  • summer jobs and jobs they got later in their school years

  • side jobs, like occasional construction, babysitting

  • one respondent would not declare under-the-table jobs

  • one was thinking of her two regular jobs but not odd jobs

  • one was unsure about including a work study job

  • one did not consider playing gigs a job

  • one excluded volunteering

  • one excluded money made on the side for participating in focus groups

  • one excluded cutting friend's hair for pay



Usability: No usability issues



17. B17DWRK1HRS



Goal: Can respondents accurately recall and report information from this timeframe? Are they thinking of the correct year?



All respondents seemed to accurately report paid work for their first year of attendance, and no particular difficulties were observed at Q17. Many respondents were able to easily recall the number of hours worked. Respondents who had worked regularly during their first year answered about an average work week. Some with more than one job or different jobs provided an average number of hours. Respondents who had worked a varying number of hours either averaged or entered the higher end of the range of hours worked. In general, respondents seemed to round up when averaging hours. Some respondents expressed hesitation about how to record having worked 37.5 hours; two or three rounded up to 40 hours to signify full-time work.

Usability: No usability issues



18. B17DJOBZIP01



Goal: Do respondents know this information and are they able to provide it accurately? If one city has more than one zip code, how confident are they that they have selected the zip code of their location?



Respondents who knew the zip code of their employer or who worked at one location only were able to enter it without issue. The problems that did arise at Q18 can be attributed to respondents not knowing the employer's zip code, or working at multiple locations and not reading the instruction carefully.

Three respondents indicated that they had not noticed the instruction about what to do if an employer has multiple locations. One respondent did not read the instruction and entered the zip code of a central office. Another entered a central office zip code because this was the only zip code known. One respondent who works for an employer with multiple locations opted to enter “any Chicago zip code” (264/117). One respondent used Google to search for the employer's zip code.

One respondent works as a temp and chose to enter the zip code where the staffing agency is located even though she does not work at that location: "That's the location of the staffing agency that employees me." (273/121)

Three respondents left the zip code field blank and entered city and state. Respondent 246/118 explained, "This one is tricky for me. This would make sense for any job that has one single location…I drive to people's houses to teach piano lessons." This respondent left the zip code field blank and, when he went to proceed to the next survey question, was prompted to confirm that he had left the zip code field blank intentionally.

Two respondents used the autofill option at Q18. At least two others mentioned that they noticed the autofill button but did not use it.



Usability: No usability issues, but several respondents indicated that they had not read the instructions carefully.



19. B17DPREFT



Goal: This question is only administered to respondents currently working fewer than 35 hours/week, and is intended to distinguish between those working fewer hours by choice versus those who wish to work more.


Only three respondents were asked Q19 per the survey skip pattern. All three answered ‘No.’ Two of the three indicated that their current work hours were their own choice, and one said the hours were the employer's choice. The meaning of the term ‘prefer’ was clear to all three respondents.

One respondent expressed confusion at Q19. She was being asked about a former job to which she plans to return but at which she is not currently working. Upon probing, it was clear that this respondent understood the question but was unsure how to answer.



Usability: One respondent (254/119) was asked about a former job at Q19. It was not clear whether this was due to a programming error or if the respondent had entered dates of employment incorrectly.




20. B17DINDUST



Goal: Are respondents familiar with the term industry? Are they able to classify their job into an appropriate industry category? See Attachment 1 for examples of industries to show respondents.

At Q20, respondents entered a response and were then offered a showcard listing industry categories to choose from. Twelve respondents expressed a preference for the list on the showcard, while eight settled on a showcard choice but said they would prefer being able to enter a response. All respondents seemed to interpret the term ‘industry’ as intended, offering terms such as "field," "sector," "category," and "business."

Spontaneous answers collected at Q20 are as follows:

  • Health club

  • Social Services

  • Construction

  • Sales

  • Bankruptcy

  • Law Firm

  • Hospitality

  • Customer Services

  • Spa

  • Clothing

  • Home Healthcare

  • Private Music Lessons

  • Fast Food Restaurant

  • Hospitality

  • Security

  • Restaurant

  • Merchandising

  • Education enrollment

  • Fitness Center

  • Education

Respondent 224/104 settled on the ‘Retail sales/retail trade’ showcard option even though this category was not a great fit: "I have no clue, I guess sales? … kind of hard because we do so much, we help with notices and invoices."

Respondent 228/105 had entered ‘Law Firm’ at Q20 and selected “Professional, scientific, and technical services” from the showcard list. The respondent did find that “legal” was listed under examples of this category but commented that, unless he was looking at the detailed examples, the industry title in the left-hand column was too vague.

Respondent 273/121 works in a call center giving information on education enrollment. She chose to enter 'education enrollment' into the box, but when looking for a category on the showcard list she focused on the fact that it was a call center and chose a category that mentioned 'telecommunications' rather than education.

Respondent 246/118 entered “Private music lessons” at Q20 but thought of the overall industry as “Arts.” He felt this choice was appropriate but more general than the response he had entered: "I think I would prefer to enter it in just myself, because sometimes I feel like it's easier just to say exactly what I think it is than to try to fit it into one of these categories."



Usability: One respondent (224/104) encountered a usability issue at Q20. The respondent was not able to view the entire question on the screen when the keyboard function was open on her Android/LG phone.





21. B17DJBREAB



Goal: The intent of the question is to determine if respondents have positions with leadership or management responsibilities, to assess the “level” of their position. The goal of cognitive testing of the item is to see if they interpret these questions as assessing such responsibilities, and also to compare their responses to what we learn about their jobs, to determine if responses are measuring the concept of interest.



In your current job, do you supervise the work of others?

Six respondents answered “Yes” and 14 answered “No” to supervising the work of others.

In your current job, do you participate in the hiring or firing decisions?

Four respondents answered “Yes” and 16 answered “No” to participating in the hiring or firing decisions.

Respondents who answered ‘Yes’ to either of these two questions also reported having responsibilities such as scheduling staff, office function coordination, training new employees, collecting data for commission rates, store organization, store surveillance, and sending data reports.

When asked what management responsibilities were missing from the list, some respondents mentioned responsibilities such as: Performing evaluation reviews, budgeting responsibilities, managing employee schedules, and training employees.

Eight respondents associated this question with “having a supervisor or management position” – “Am I a supervisor, am I part of management” (224/104) – while seven associated it with “having supervising/management responsibilities” in general – “These are two questions that really establish leadership responsibilities” (237/110). Five respondents associated this two-part question with “supervising others” – “Just if I oversee people in my job that work under me; am I responsible for anybody else” (223/106).

One respondent, who reported not having any supervisory responsibilities, answered “Yes” to participating in hiring and firing decisions because in her current job the decision to hire a new case manager is made as a group. They all work one day with the prospective employee and, based on how the day goes, vote on whether to hire them: “The question on hiring and firing decisions is great to put there because just because you are not a supervisor does not mean that you don’t have any input on that.” (220/103)

One respondent suggested making this an open-ended question: "Do you have any leadership/management responsibilities?" (246/118)



Usability: One respondent had usability issues at Q21: the screen froze, and only after refreshing the page was she able to move forward using the ‘next’ button.





22. B17DJOBSA



Goal: The response options are intended to be discrete aspects of job satisfaction, and we are interested in whether or not these are understood, if there are overlaps, or other dimensions of job satisfaction that are not being captured.



Your pay:

Seven respondents answered “Yes” and 13 answered “No.”


Fringe Benefits:

Eight respondents answered “Yes” and 10 answered “No.”

Four respondents did not know the meaning of 'fringe benefits'. During probing, three of these four respondents mentioned that if they had to guess they would think it meant: “any benefits” (238/112); “vacation time, 401K” (224/104); “benefits in general, like life insurance, health insurance, stuff like that.” (254/119)

Importance and challenge of your work:

Fifteen respondents answered “Yes” and five answered “No” to being satisfied with the importance and challenge of their work.


Opportunities for promotion and advancement:

Nine respondents answered “Yes” and 11 answered “No.”


Opportunities to use your training and education:

Ten respondents answered “Yes” and 10 answered “No.”


Your job security:

Fourteen respondents answered “Yes” and six answered “No.”


Opportunities for further training and education:

Ten respondents answered “Yes” and 10 answered “No.”

Sixteen respondents understood the questions to be asking about their overall satisfaction with their current job – “They just want to know how satisfied you are overall with your job” (220/103) – while four understood the questions to be asking about each component of their satisfaction and did not mention their overall satisfaction at work: “They are pretty specific. Each is asking you something different.” (223/106)



Respondents were probed as to whether they would add anything to the list and offered the following suggestions:

  • Future of the organization

  • Financials and revenue of the company

  • Work environment

  • Opportunities for getting education paid for

  • Gaining valuable experience/skills that can be applied later in one's career

  • How you are treated at work

  • Schedule/hours

  • The leadership team at the job



Usability: Three respondents had usability issues at Q22:

  • Respondent 237/110 wanted to review the previous question to see if it was related to this one; after hitting “previous” and returning to this question, he/she had to re-enter all responses.

  • Respondent 264/117 had selected the “No” response option but, while scrolling down, inadvertently changed it to “Yes”; the respondent noticed and changed it back to “No.”

  • For respondent 278/124, the page froze and she had to exit the survey in order to move on; when she logged back in, the application returned to where she had left off.



23. B17DSEARCH





Goal: Is the provided definition of “looking for a different job” similar to the activities respondents consider to define “looking for a different job”? Do respondents notice and distinguish the word “different”? The question relates to job satisfaction of respondents who are currently employed, so “different” job is important (as opposed to an un- or under-employed respondent who is looking for work).



Fourteen respondents were asked Question 23: nine answered “Yes” and five answered “No.”

Of those answering yes, six mentioned that even though they were somewhat satisfied with their current job, they were looking for a job that was in their field of study.

Out of the 14 respondents who answered this question, seven did not understand “different job” as intended. They understood it as “another job” in addition to the one they already have, and said they would answer “Yes” to this question if they were looking for “a second job.” – “Yes, because I am still looking for a different job” (235/116). Another respondent commented: “I think I overlooked the bold of the different job in reading that…I thought was this asking are you looking for another job.” (246/118)

The seven respondents who understood “different job” as intended said that they would answer “No” if they were looking for a “second job.” – “No, I am not looking for a second job; I am looking for a different job” (238/112); "Different…that would be going away from my primary job." (206/102)

When probed as to their interpretation of the phrase ‘looking for a job,’ respondents mentioned activities such as sending a resume, interviewing, looking online, applying in person, asking around, looking in the newspaper, and networking with people in that business world.



Usability: No usability issues



24. B17DFIRSTJOB



Goal: To determine how the question is interpreted, and what the respondent thinks about. We are particularly interested in respondents who may have had the same job during and after enrollment – how would they answer?



Nine respondents answered ‘Yes’ and 11 answered ‘No’ at Q24.

All but two respondents understood the question as intended. One of these respondents (237/110) answered “No” although he really should have responded “Yes.” This respondent had worked moving gigs while in college, which he did not consider a “job.” His current job was actually his first job after college, but he was thinking about the fact that he had worked before. The second respondent (252/114) said that if he was working at a job while he was enrolled in school and continued with that same job after graduating, he would not consider that job his “first job after college.”

Two respondents answered “No” because they were still attending college. Another respondent (238/112) commented that she thought she should not have been asked this question, because she thought that "first job after college" made it sound like she had finished the program, when in fact she had left without completing it.

All other respondents, even those who were already working at their current job while they were in school, understood the question to be asking if their current job was their first job after graduating from college. Two of these respondents also included the first job a person could have after leaving college, even if they had not graduated.



Usability: No usability issues



25. B17DFIRSTHRS



Goal: Can respondents accurately recall and report information from this timeframe?



All 10 respondents who answered this question were thinking about the same job they talked about at Question 24.

Respondents without a regular schedule at their first job after college provided an estimate of the hours they worked per week. Most respondents used their full-time or part-time work status to calculate their working hours per week.

All but one respondent were thinking about the number of hours they worked when they started the job. This respondent (237/110) was thinking about the more recent average number of hours.



Usability: No usability issues



26. B17DFIRSTERN



Goal: Can respondents accurately recall and report information from this timeframe?



Four respondents who were working at this job while they were still enrolled in school mentioned that their rate had not changed after they had graduated. They reported their initial rate. Another respondent who was working at this job while she was still enrolled in school mentioned that her rate had changed after she had graduated. She also reported her initial rate.

Five other respondents who started a job after they graduated reported the rate they were making when they started at the job.



Usability: No usability issues


27. B17EPARHELP



Goal: What do respondents think of when considering “education and living expenses?” Who are they including in “parents or guardians?”

Six respondents selected ‘Yes’ at Q27. Of the six, one had actually received no financial support but expects to do so prior to June 2016. This respondent assumed his parents would support him if he asked, because they have been ‘fussing at him’ to get back in school. His ‘Yes’ response led him to the follow-up question on the amount of support, which he was not able to answer (252/114).

Another respondent who is also not currently enrolled in school answered ‘Yes’ because his parents supported him when he was in school, and he still lives with them now and thus receives support.

Most respondents who answered ‘Yes’ considered financial help to include support other than tuition, such as buying books, living expenses, and spending money. All respondents who answered ‘Yes’ included parents; however, some also noted that siblings and grandparents offered support and counted them as well when answering the question. One respondent who answered ‘Yes’ considered only tuition as support (and not the other support provided).

Of the 14 respondents who answered ‘No,’ nine were not currently enrolled and so did not think the time frame applied to them, although some noted they had received financial support in the past. One noted that he considered his student loans, which relatives sometimes help him with, but since he is not currently enrolled he did not think they would count. Another considered but did not include the check his in-laws gave to his wife for her educational expenses during this time period.

There was some confusion about whether or not other help such as living with a parent would count. A couple of respondents who answered ‘Yes’ also noted that they lived with their parents (and did not receive tuition help). Two respondents who answered ‘No,’ however, also either lived with their parents or received some other non-tuition support during the time frame.

One respondent who is enrolled lives with his mother but contributes to the rent and pays his own tuition so did not consider his mother’s payment of utilities, food and the remaining rent as support. Another respondent was currently enrolled in graduate school and lives with her parents but her parents do not contribute to tuition.

A few respondents noted that the question is asked in past tense but asks about a reference period that is partly in the future. This made some answer hypothetically about the future.

Usability: One respondent (252/114), who is not currently enrolled, answered ‘Yes’ in expectation of going back to school before June 30, 2016, expecting that his parents will provide support if he asks them. This response led him to the follow-up question on amount, which he was unable to answer.



28. B17F2000



Goal: How does answering a question like this make respondents feel? What sources did they consider getting the $2000 from?



Fifteen respondents said they were comfortable with question 28. Three others were also comfortable but felt they wouldn’t be if their situation were different, or that others might not be comfortable. One respondent, who could come up with the money, thought the question was a little personal: "… kind of a little personal I guess. I wasn't really offended by it but some people might be." (223/116) Another respondent stated, "It's fine. I don't think it’s too intrusive or anything. For people who could not come up with $2000 it might be a bit embarrassing, but, whatever." (228/115)

Fourteen respondents said they definitely or probably could come up with the $2000. Of those who said they certainly or probably could not come up with the $2000, all mentioned bills or (low) income as a barrier, and two ‘probably not’ respondents and one ‘certainly not’ respondent speculated that they could ask family or friends.

Of those who said they certainly or probably could come up with the $2000, 10 considered their own financial resources and four mentioned both their own resources and family/friends.

Three respondents reported needing to come up with this much money in the past, including one for entertainment purposes rather than an emergency.



Usability: No usability issues


29. B17FINTRST



Goal: How does answering a question like this make respondents feel? How confident are they in their answer?

Almost all respondents correctly responded that they would have more money. Only one respondent was confident that there would be less.

One respondent was unsure but said it would be the same; however, this respondent was also confused about whether interest would be gained or lost and said that if it is gained there would be more: “I'm confused by whether interest is gained or taken away. If interest is gained than it would be more.” (253/115)
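For context, a brief worked sketch of the arithmetic the item targets (the dollar amount and interest rate here are illustrative assumptions, not the instrument's actual wording): if $100 is left in a savings account earning 2% annual interest, after five years the balance is 100 × (1.02)^5 ≈ $110.41, which is more than the original $100. This is consistent with ‘more money’ being treated as the correct response above.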

Only one respondent reported being uncomfortable with the question, stating, “I don’t like that, I don’t like mental math.” (238/112)



Usability: No usability issues



30. B17FINFLAT



Goal: How does answering a question like this make respondents feel? How confident are they in their answer?

Twelve respondents correctly selected that they would have less than today, and five thought they would have exactly the same. Two respondents answered that they would have more, and one left the question blank because she did not know the answer. All but three of those who had responded correctly were confident in their answers. None of the five who thought they would have exactly the same were confident in their answers.

The term ‘inflation’ was identified by several respondents as a term they were not familiar with.

"I have no idea how to answer this question, 'inflation', I have no idea what that is. [Probe, have you heard the term?] No in my head, inflation is to blow something up." (224/104)

“I don’t even know what ‘freaking’ inflation is.” (238/112)

Other respondents who had answered the question also noted being unsure of the term:

"I just don't really know how inflation works, I've never really studied this…I think I have a better idea of how interest works but I'm also not exactly sure" (246/118)

“I’m not sure what inflation is. It sounds like some fees or penalties that are taking form my savings account.” (235/116)

Both respondents who answered ‘more than today’ said they were confident in their answers; however, one admitted he did not know the meaning of inflation. He had answered the previous question correctly (276/122). The other respondent appeared to be answering that there would be more money, not more spending power. Those who answered correctly and were confident appeared to know what the term inflation meant.
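As a reference point, a minimal worked sketch of the intended reasoning (the figures are illustrative assumptions, not the instrument's actual wording): if a savings account earns 1% interest while inflation runs at 2%, $100 grows to $101 after one year, but goods that cost $100 today would cost roughly $102, so the money in the account buys less than it would today. This is consistent with ‘less than today’ being treated as the correct response above.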

Although 16 respondents were comfortable with the question, more reported being less comfortable with this question than with the previous math questions. Most of the discomfort appeared to stem from uncertainty about how to respond, even among those who answered correctly.

One respondent was comfortable answering but said: “It’s stupid! It is not very clear, it is not telling you if this is based on the $100 question or if it’s based on your personal account.” (220/103)

Some questioned the need for the question within the survey: “They are really asking this? I thought it was just going to be about education.” (226/117)

Several respondents, regardless of their answer, wanted a ‘Don’t know’ response: “Is there an ‘I don’t know’ answer?” (238/112)



Usability: No usability issues



31. B17FSTOCK



Goal: How does answering a question like this make respondents feel? How confident are they in their answer?

Seven of the 20 respondents selected ‘True’ at Q31, two selected ‘False,’ and 11 chose ‘Don't know.’

One of the older respondents who responded ‘Don’t know’ had originally put down False but changed his answer because he didn’t think there was enough information to determine the response, describing how a stock in one sector might be safer than a mutual fund in another:

"I don't know what it is asking, if they had put something like an index fund then I would have said false but it says stock mutual fund. (Probe are you familiar with the term Stock Mutual Fund?) Yea they primarily invest in stock and spread the wealth and invest in different sectors.” (206/102)

Most of the respondents who answered ‘Don’t know’ reported they had no idea. Other respondents who answered ‘Don’t know’ appeared to be on the cusp of understanding mutual funds but did not feel confident in their understanding. Several of the respondents who answered ‘False’ also appeared to be unsure of themselves.

There were indications of misunderstanding among all response choices. Because respondents were provided an option to respond ‘Don’t know,’ reports of discomfort with the question were quite low. Only two respondents indicated any discomfort, and in both cases it was simply related to confusion about what the question was asking.



Usability: One respondent hit ‘next’ and was bumped out of the survey and back to the start/login page; when he logged back in, he was taken to the correct spot. (233/109)



32. B17FWDFALL



Goal: How does answering a question like this make respondents feel? How confident are they in their answer?

All respondents reported being comfortable with Q32 and all felt confident in their selected responses.

Five respondents chose only one response, although no interviewer reported that the respondent thought they had to choose only one response. Three of the five chose savings and two chose payment of debts.

Four respondents reported living alone. Of the remaining 16, only one said she was thinking of just herself; she lives with her boyfriend and noted they do not share expenses. The remaining 15 all said they were thinking of their household when answering Q32.

Usability: No usability issues

33. B17FFEDACT



Goal: How does answering a question like this make respondents feel? How confident are they in their answer?



Half of all respondents (10 of 20) selected all three response options at Q33. Three respondents selected two responses and seven chose only one.

Of those choosing only one response, one who selected ‘garnish wages’ saw the question as asking what the government should do rather than what it can do. Another respondent, who chose ‘None of the above,’ answered with what they wanted the government to do, noting that it should try to work with the student to come up with an amenable option rather than using one of the “harsh options” available to it (278/124).

One student who selected only one response thought the question was select-one-only, so he chose the response he was sure of. The remaining respondents, as well as those who chose two responses, appeared to be selecting only the answers they were sure of, not indicating that they thought the other options were unavailable.

All respondents who marked all three options were confident in their answers. The others were confident only of the answers they had selected.

Of the 10 respondents who chose one or two answers, four did not report having loans in the screener and six did.

Two respondents reported not being comfortable with the question, one because it reminded her of her situation, the other because she wasn’t sure of the answer. Two respondents weren’t sure what the term ‘Garnish wages’ meant.



Usability: No usability issues



34. B17FMATH

Goal: Are respondents able to categorize math courses they have taken into the categories provided? Is this question difficult for them to answer?



Fifteen respondents answered Q34 and five left the question blank (there is no ‘None of the above’ option). All who answered did so without difficulty or usability issues.

The majority of respondents were thinking about post-secondary courses, but a handful were also thinking about high school classes. In some cases the courses marked were prerequisites and in some cases they were core courses.

Usability: No usability issues



Other Usability and Cognitive Issues for Non-Probed Questions

Spontaneous Probes: Several respondents encountered usability issues with entered dates not registering as intended in the instrument. In one case the interviewer noticed that the respondent entered dates of attendance differently than intended and was thus asked follow-up questions that did not correspond to his situation. In other cases, the interviewers did not notice any errors in entering dates; however, the respondents were asked questions about in-school activities during dates they were not attending, were attending a different school, or were asked about completion after graduation. It was not clear to the interviewers in these cases why the skip patterns were out of sync with the respondents’ situations. For question B17BSCHRES it is possible the instrument pre-fills with the original school regardless of the school attended at the time.

CaseID 206/102: B17AOTENL01: The respondent was using a laptop where all dates of attendance were visible on the screen at a single time. He selected the first and last month of attendance, rather than all months. The interviewer thought this appeared to be a logical selection based on the way the screen appeared. However, it meant that he was asked follow-up questions that were intended for someone who had not been continuously in school. In addition, at B17BSCHRES (Housing), the instrument asked about the 2015-2016 school year at the original institution, although the respondent was not attending the original school then. He was attending a new school. The interviewer could not tell during the interview if this caused the error on B17BSCHRES or if it was caused by an instrument error while entering the original school (this also occurred in case 233/109).

CaseID 233/109: The respondent entered enrollment information accurately reflecting attendance at two schools. He transferred from the original school to a second institution but, at B17BSCHRES, the instrument prefilled the original school rather than the school currently attended for the 2015/2016 school year.

CaseID 237/110: The respondent accurately entered program completion date at B17ADGN indicating that her Bachelor’s degree was awarded in December 2014. B17AEXPN seemed to come up in error, asking for month and year of expected degree completion. Because the instrument would not allow the respondent to enter a past date, she skipped this question.

CaseID 246/118: Similarly to CaseID 237/110, the respondent accurately entered program completion date at B17ADGN indicating that he had completed a program, but he was still asked when he expected to complete his degree at B17AEXPN. Because he was not able to enter a past date, this respondent opted to select 'don't know' in order to move forward in the instrument.

CaseID 223/106: B17EPARHELP asks about financial help from parents for the 2015/2016 school year, but the respondent was not attending school during this time frame. The interviewer did not observe any respondent error in entering dates of attendance.

CaseID 273/121: The respondent had issues with several questions because they asked about time periods after she had left school. At B17ASTST, the respondent commented that it asked if she had been a full-time student during a period of time which covered 2015 through 2016, when she had already entered in a previous question that she did not attend school during any of 2015 or 2016. The interviewer did not observe any errors when the respondent was previously entering the months and years of her schooling.














