
Memorandum

United States Department of Education

Institute of Education Sciences

National Center for Education Statistics



DATE: February 17, 2009


TO: Kashka Kubzdela, NCES


FROM: Ted Socha

Postsecondary Studies Division, NCES


SUBJECT: Summary of changes for full-scale data collection for B&B:08/09

OMB forms clearance (No. 1850-0729 v2) was received in February 2008 for the field test and full-scale data collections of the 2008/09 Baccalaureate and Beyond Longitudinal Study (B&B:08/09). This memo summarizes the planned changes to the full-scale data collection relative to what was previously approved.

Overview

B&B:08/09 is the first follow-up study of baccalaureate recipients identified as part of the 2008 National Postsecondary Student Aid Study (NPSAS:08). The B&B:08/09 sample will consist of those students who were eligible to participate in NPSAS:08 and who completed requirements for a bachelor’s degree during the 2007–08 academic year at postsecondary institutions in the 50 states, the District of Columbia, and Puerto Rico.

The primary purpose of the B&B series of studies is to examine the value of obtaining a bachelor’s degree and to track the paths of recent graduates into employment and additional education. This submission to OMB requests approval for the planned changes to the previously obtained clearance (OMB No. 1850-0729 v2). NPSAS and the B&B longitudinal study are authorized under the Education Sciences Reform Act of 2002 (P.L. 107-279, Title I, Part C, and Section 183).

The B&B:08/09 study involves two data collection components: a transcript collection, which gathers postsecondary transcripts for each sample member from the NPSAS graduating institution only, and a student interview, administered by telephone or over the web and scheduled to begin in July 2009. The full-scale transcript collection received OMB clearance along with the field test submission in February 2008 and is currently underway. Additional data for the B&B:08/09 sample will be obtained from a variety of extant data sources, including the Central Processing System (CPS), the National Student Loan Data System (NSLDS), Pell grant files, and the National Student Clearinghouse (NSC).

Previous iterations of the B&B first follow-up interview have laid the foundation for the large majority of the survey items, so much of the content will simply be carried over or updated. Questions will cover topics related to finances and indebtedness, the transition to employment and/or graduate education, family formation, volunteerism, and career plans. For sample members in teaching occupations, the interview will also cover their teacher preparation, experiences, and job satisfaction relating to their first year in the field. The data elements submitted for the full-scale study were developed based on feedback from the Technical Review Panel (TRP) held in November 2008.

Summary of Planned Changes:

This section summarizes the changes planned for the B&B:08/09 full-scale data collection. Topics addressed include the incentive plan, the sample design, the final data elements, and estimated burden. Supporting documentation is provided in attachments and referenced in the sections below.

First, based on the results of the field test evaluations, some revisions have been made to the mailout and incentive plans. Attachment A provides the results of each experiment. Outlined below are the recommended changes based on those results.

  • Data collection notification materials will be sent to sample members in a plain US Department of Education 9” x 12” large envelope.

  • During the early response period (the first four weeks of data collection), sample members will be offered a $5 prepaid cash incentive, followed by a $30 postpaid check upon early survey completion.

  • Incentives will be offered for interviews completed during the early response period and the nonresponse conversion period. No incentives will be offered during production interviewing, the period between the early response and nonresponse conversion phases.

  • All incentive offers made after the early response period will be promised rather than prepaid.

Second, there have been some changes in the sample design planned for the B&B:08/09 full-scale data collection. The revised sample design is provided in Attachment B. In summary, the following revisions to the design are planned.

  • The expected sample size has been decreased from 23,600 to 17,312 because fewer students than anticipated were confirmed in the NPSAS:08 base-year study as completing bachelor’s degree requirements during the 2007–08 academic year.

  • The expected eligibility rate has been increased to 99 percent from the 90 percent stated in the original OMB submission because transcript data will be used to help determine cohort eligibility.

  • The expected response rate has been decreased to 86 percent from the original estimate of 90 percent based on a review of prior B&B 1-year follow-ups and the field test results.

Third, as stated above, the final set of full-scale data elements was developed with feedback from the TRP during its meeting in November 2008. The revised set of elements, with changes from the field test instrument noted, is provided in Attachment C. Of note is a new set of items related to language coursetaking and proficiency (Attachment D), added to better understand the labor market outcomes of recent graduates entering a workforce in which foreign language skills are increasingly considered essential. The majority of these items come from the National Assessment of Adult Literacy (NAAL) and, therefore, have already been fielded with a slightly different but overlapping population (adults ages 16 and above).

Also noteworthy is the handling of the TEACH and SMART grant items following the NPSAS:08 base-year interview. Using administrative data sources, TEACH grant recipients will be identified in the first follow-up study, making it possible to track their entry into teaching through the TEACH program. Because SMART grant program participation is limited to undergraduate enrollment, there will be too few newly qualified SMART grant recipients in the B&B first follow-up interview to permit analysis. Consequently, those items were recommended for deletion by the TRP. However, to allow longitudinal analysis, a flag will be retained on the data file identifying base-year participants in the SMART grant program.

To ensure item quality once the interview has been programmed (May 2009), cognitive testing will be performed on the entire instrument with a group of nine participants, including six who are prequalified to answer the full set of language items. Although most of the items in the B&B survey have been used in other NCES surveys, the cognitive testing will inform us about the appropriateness of the items for the B&B cohort, particularly the new items (e.g., recession items, language items), as well as the usability of the instrument. Any needed modifications will be made prior to the start of data collection in July 2009.

And finally, as shown in Attachment E, the student interview is expected to require an average of 25 minutes per response, a decrease of 5 minutes from the original submission, with response times ranging from 10 to 45 minutes. From the anticipated starting sample of 17,312, 14,722 interviews are expected, for a total burden of 6,134 hours. That burden estimate is 3,416 hours less than stated in the initial submission.
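For reference, a minimal sketch of the burden arithmetic above (all inputs are figures cited in this memo; the calculation simply converts the expected number of interviews and the average interview length into hours):

```python
# Minimal sketch of the full-scale burden calculation described above.
expected_interviews = 14_722      # expected completed interviews
minutes_per_interview = 25        # average response time (down from 30 minutes)

burden_hours = expected_interviews * minutes_per_interview / 60
print(round(burden_hours))        # about 6,134 hours, 3,416 fewer than the initial submission
```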








Attachment A


Results of Field Test Experiments



Three experiments were conducted during the B&B:08/09 field test. These experiments assessed the following questions:

  • Would study materials sent via Priority Mail produce a higher participation rate during the early response period than materials sent in a large 9” x 12” envelope via regular mail?

  • Would a $5 prepaid cash or check incentive ($5 up front, followed by a promised $30 check for NPSAS:08 interview respondents or $50 check for NPSAS:08 interview nonrespondents upon interview completion) produce a higher participation rate during the early response period than a promised $35 or $55 incentive check alone upon interview completion?

  • Would a $20 production incentive ($40 for NPSAS:08 interview nonrespondents) produce higher participation rates during the production phase than no production incentive?

Analysis of Priority Mail

To test the impact of the visibility of mailout materials on participation rates,1 the field test sample was randomly assigned to two groups prior to the start of data collection: one group received the initial study materials via regular mail in a large envelope, and the other group received the same materials via Priority Mail, also delivered in a large envelope.
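A minimal sketch of this kind of random assignment (illustrative only; the sample IDs, seed, and procedure below are hypothetical and are not the study's actual assignment software):

```python
# Illustrative sketch: randomly split a field-test sample into the two
# mailing conditions before data collection begins.
import random

def assign_mailing_groups(sample_ids, seed=2008):
    rng = random.Random(seed)
    ids = list(sample_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"priority_mail": ids[:half], "regular_mail": ids[half:]}

groups = assign_mailing_groups(range(1_820))
print(len(groups["priority_mail"]), len(groups["regular_mail"]))  # 910 910
```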

Table 1 presents the results of this experiment. Overall, those who received the study materials via Priority Mail had an early participation rate of 43 percent, compared with 41 percent for those who received them via regular mail. There is no indication of a statistically significant difference in the early participation rate between the two types of mailing.

Table 1. Early participation rates, by type of mailing: 2008

Type of initial mailing    Eligible sample    Participated
                                              Number    Percent
All cases                  1,820              760       41.6
Priority Mail              910                390       42.6
Regular Mail               910                370       40.8

NOTE: Detail may not sum to totals because of rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008 Baccalaureate and Beyond Longitudinal Study (B&B:08/09) Field Test.

Analysis of Prepaid Incentives

The effectiveness of a prepaid incentive was also examined in the field test. Prior to the start of data collection, the field test sample was randomly assigned to one of three groups: one group received $5 in prepaid cash with the promise of a $30 check upon completion, one group received a $5 prepaid check with the promise of a $30 check, and the third group received only the promise of a $35 check. Sample members had to complete the interview during the early response period in order to receive their promised checks. In addition, all NPSAS:08 interview nonrespondents were offered an additional $20 to complete the interview during the early response period. That is, if they were assigned to the $5 prepaid cash or check incentive group, they were offered a $50 check on interview completion; if they were assigned to the no-prepaid group and completed the interview within the early response period, they were offered a $55 check on interview completion.

Table 2 presents the results of the prepaid incentive experiment. Overall, the $5 cash group had a significantly higher participation rate (49 percent) during the early response period than both the $5 check group (37 percent, z = 3.54, p < .01) and the promised group (41 percent, z = 2.81, p < .01). For NPSAS:08 interview respondents, participation rates were also significantly higher for the $5 cash incentive (61 percent) than for the $5 check incentive (47 percent, z = 3.30, p < .01) or the promised incentive (52 percent, z = 2.49, p < .01). Participation rates followed the same pattern for NPSAS:08 interview nonrespondents, but the differences were not significant.
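The comparisons above are two-proportion z-tests. A minimal sketch in pure Python, using approximate cell counts reconstructed from the rounded values in table 2 (so the result is only roughly equal to the reported statistic):

```python
# Pooled two-proportion z statistic for H0: p1 = p2.
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Overall $5 cash vs. $5 check (approximate counts: 48.5% of 450 and 36.9% of 460).
print(round(two_proportion_z(218, 450, 170, 460), 2))  # roughly 3.5
```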

As an aside, those offered a $5 prepaid check participated at a nominally lower rate than those offered no prepaid incentive at all, although the difference was not statistically significant. The result is nonetheless surprising because it runs counter to the expected direction: sample members who received no prepaid incentive responded at a higher rate than those who did. (This pattern was observed for both NPSAS respondents and nonrespondents.)

Table 2. Early participation rates, by prepaid incentive status: 2008


Prepaid incentive    Overall                           NPSAS interview respondents       NPSAS interview nonrespondents
                     Eligible  Participated  Percent   Eligible  Participated  Percent   Eligible  Participated  Percent
$5 cash              450       220           48.5      310       190           60.5      150       40            23.6
$5 check             460       170           36.9      310       140           47.2      150       20            16.0
Promised             910       370           40.5      610       320           51.8      300       50            17.7

NOTE: Detail may not sum to totals because of rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008 Baccalaureate and Beyond Longitudinal Study (B&B:08/09) Field Test.

Analysis of Production Incentives

Prior results from other studies (BPS:04/06 field test) suggested that paying an incentive during the production interviewing phase of data collection does increase the likelihood that sample members will participate. The effect, however, was not a strong one. Consequently, the experiment was conducted again for the B&B:08/09 field test. Prior to data collection, the field test sample was randomly assigned to a production incentive group according to their NPSAS interview response status. NPSAS respondents were randomly assigned a $0 or $20 production incentive, while NPSAS nonrespondents were randomly assigned a $0 or $40 production incentive. Once the early response period ended, interviewers began contacting the remaining sample members to complete the interview over the telephone. Sample members were notified of the production incentive (if one was assigned to them) by an interviewer, or through an e-mail and/or letter.

Table 3 presents the results of the production incentive experiment. No significant difference was found between the $0 and $20 groups for NPSAS respondents, or between the $0 and $40 groups for NPSAS nonrespondents.

Table 3. Interview participation rates, by production incentive status: 2008

Type of production incentive      Eligible sample    Participated
                                                     Number    Percent
NPSAS interview respondents
  $0                              610                67        11.0
  $20                             610                65        10.7
NPSAS interview nonrespondents
  $0                              300                21        7.1
  $40                             300                22        7.4

NOTE: Detail may not sum to totals because of rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008 Baccalaureate and Beyond Longitudinal Study (B&B:08/09) Field Test.




Attachment B

Revised Full-Scale Sample Design

Sampling Specifications for the Full-Scale Study

Section A. Introduction

Identification of the B&B:08/09 full-scale sample requires a multistage process that began with selection of the NPSAS:08 sample of institutions, followed by selection of students within institutions. The third and final stage is a B&B-specific activity that will confirm the cohort eligibility of sample members identified during NPSAS as baccalaureate recipients during the 2007–08 academic year. The sampling specifications presented here describe that confirmation process. Section B describes the target population, and Section C details the specific sample design to be applied in identifying the B&B:08 cohort.

Section B. B&B:08 Target Population

B&B-eligible persons are individuals who completed requirements for the bachelor’s degree from NPSAS-eligible institutions between July 1, 2007 and June 30, 2008 and were awarded their baccalaureate degree by the institution from which they were sampled no later than June 30, 2009. This definition provides theoretically complete coverage of the population of students completing their degree requirements during the 2007–08 academic year because every completer is associated with exactly one 4-year institution on the NPSAS sampling frame; that is, there is no student multiplicity. Moreover, it assigns a known and well-defined probability of selection to each student in the B&B sample. Through the institution awarding the degree, each completer has exactly one linkage to the B&B sampling frame. Consequently, although NPSAS sample weights must include a multiplicity adjustment to account for multiple linkages to the NPSAS sampling frame, the B&B sample weights need not.
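As a minimal sketch of the multiplicity point (the weights and linkage counts below are hypothetical, not NPSAS values): a multiplicity adjustment divides the base weight by the number of frame linkages, so a completer with exactly one linkage, as on the B&B frame, keeps the base weight unchanged.

```python
# Hypothetical illustration of a multiplicity adjustment to a sampling weight.
def multiplicity_adjusted_weight(base_weight, n_frame_linkages):
    # Divide by the number of chances the case had to enter the frame.
    return base_weight / n_frame_linkages

print(multiplicity_adjusted_weight(120.0, 2))  # case with two frame linkages: 60.0
print(multiplicity_adjusted_weight(120.0, 1))  # B&B-style case, one linkage: 120.0 (no change)
```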

Section C. B&B:08 Sample Design

The B&B:08 sample will consist of all students who completed requirements for the bachelor’s degree at any time between July 1, 2007 and June 30, 2008. Eligibility for the B&B:08 full-scale cohort will be based primarily on information obtained from the student’s transcript. If a transcript is not available, eligibility will be based on responses provided during the NPSAS:08 student interview. If neither the transcript nor the interview is available, eligibility will be based on the student’s institutional record obtained through NPSAS:08 CADE or on the enrollment list provided by the NPSAS institution at the time of student sampling.

Other data sources, such as the Central Processing System (CPS) and the National Student Clearinghouse (NSC), might provide information on degree completion, but the value of these sources is unknown. These sources will be evaluated over the course of the study but will not be used to confirm cohort eligibility. Table 8 shows the distribution of the 25,000 NPSAS:08 sample members who are potentially eligible for membership in the B&B cohort according to their current interview, CADE, and/or enrollment list status. It should be noted that the B&B:08 sample is not intended to be representative at the state level.

The final step in identifying the B&B:08 sample will occur in the spring of 2009, in time to begin student contacting in June. At the time of sample selection, the transcript collection will be nearing completion, and B&B eligibility based on transcripts will be known for most of the potential B&B:08 sample members. Transcripts are expected for about 90 percent of the 25,000 students for whom transcripts were requested.

The sample sizes presented in this document are based on the B&B:08/09 field test results and will be updated prior to sample selection based on transcript results to date. Applying the rates observed in the field test to the 18,005 students confirmed in the NPSAS full-scale interview to be B&B eligible, we expect about 81.1 percent (14,597) to have a transcript that confirms eligibility, 6.6 percent (1,193) to be ineligible based on transcripts, and 12.3 percent (2,215) to have no transcript. Table 9 shows the expected transcript status of the B&B:08 sample with baccalaureate receipt confirmed in the NPSAS:08 interview.

In order to have full population coverage of the B&B sample, a subsample of 500 of the 7,000 NPSAS:08 interview nonrespondents who were either confirmed in CADE to be degree candidates or listed by the NPSAS sample institution as bachelor’s degree candidates will be selected. A subsample size of 500 has been typical for the NCES postsecondary longitudinal studies (B&B and BPS). Because the subsample will not be analyzed separately, 500 is a sufficient size to ensure that the full sample is representative of all baccalaureate recipients. Also, because the response rate for sample members who were NPSAS interview nonrespondents is expected to be lower than the response rate for NPSAS interview respondents, a larger subsample size is likely to result in a lower overall response rate.

The 7,000 NPSAS:08 interview nonrespondents will be stratified based on study respondent, transcript, CADE, and list statuses. Within each stratum the nonrespondents will be sorted by institution sector to ensure representation of the sample. The samples will be drawn within each stratum with probabilities proportional to the NPSAS:08 sampling weight. The sampling rates used in each stratum will be different in order to maximize response rates while also representing the various types of sample members. This approach is known to introduce some unequal weighting and, consequently, result in larger design effects, but the overall mean design effect is expected to be between 2.0 and 3.0, which is similar to past postsecondary longitudinal studies.
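A minimal sketch of the unequal-weighting (Kish) design effect referred to above; the weights below are hypothetical and chosen only so that the result falls in the anticipated 2.0 to 3.0 range:

```python
# Kish's design effect due to unequal weighting: deff = n * sum(w^2) / (sum(w))^2.
def kish_design_effect(weights):
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

# Hypothetical mix of lightly and heavily weighted cases.
weights = [1.0] * 85 + [8.0] * 15
print(round(kish_design_effect(weights), 2))  # about 2.49
```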

Based on the B&B:08/09 field test results, the highest sampling rates are expected among students who were NPSAS study respondents, are potentially eligible based on CADE or the enrollment list, and are confirmed eligible by the transcript. The next highest sampling rates will be among students who were NPSAS study respondents, are potentially eligible based on CADE or the enrollment list, but have no transcript, and among students who were not NPSAS study respondents, are potentially eligible based on CADE or the enrollment list, and are confirmed eligible by the transcript. The lowest sampling rates are expected among students who were not NPSAS study respondents, are potentially eligible based on CADE or the enrollment list, but have no transcript.2 Table 10 shows the estimated eligible sample and subsample sizes of the NPSAS:08 potential baccalaureate recipients without a NPSAS interview.

Table 8. Distribution of the NPSAS:08 sample by B&B eligibility

NPSAS:08 B&B eligibility                          Count
Total potentially B&B eligible                    25,045
  Baccalaureate receipt confirmed in interview    18,005
  Baccalaureate receipt confirmed in CADE         4,623
  Listed as potential baccalaureate recipient     2,417


Table 9. Estimated transcript status of the B&B:08 sample members with baccalaureate receipt confirmed in the NPSAS:08 interview

Transcript status             Count
Total                         18,005
  Confirmed B&B eligible      14,597
  Confirmed B&B ineligible    1,193
  No transcript               2,215


Table 10. Estimated eligible sample and subsample sizes of the NPSAS:08 potential baccalaureate recipients without a NPSAS interview

Study respondent    Transcript    CADE or list    Expected number eligible    Preliminary sample size
Total                                                                         500
Yes                 Yes           CADE            3,280                       384
Yes                 Yes           List            580                         58
Yes                 No            CADE            540                         27
Yes                 No            List            210                         11
No                  Yes           CADE            140                         7
No                  Yes           List            200                         10
No                  No            CADE            60                          2
No                  No            List            30                          1


Based on the sample sizes in tables 9 and 10, the sample size for student data collection is expected to be about 17,310. The eligibility rate of these sample members is expected to be about 99 percent, which will give an eligible sample size of about 17,130. We also expect a response rate of about 86 percent among the eligible sample members, which will yield about 14,720 responding baccalaureate recipients. See table 11 for expected eligibility and response rates by base-year response status and transcript status.
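A minimal sketch of the yield arithmetic in this paragraph. Applying the overall rates to the starting sample gives totals close to the rounded figures cited here; the exact totals in table 11 (17,125 eligible and 14,722 respondents) reflect rates that vary by stratum.

```python
# Chain the expected eligibility and response rates onto the starting sample.
starting_sample = 17_312
eligibility_rate = 0.99
response_rate = 0.86

eligible = starting_sample * eligibility_rate
respondents = eligible * response_rate
print(round(eligible), round(respondents))  # about 17,140 eligible and 14,740 respondents
```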

Table 11. Expected eligibility and response rates, by base-year response status and transcript status

Sample of NPSAS:08 interview respondents

NPSAS:08      Transcript   Transcript status                                  Count     Sample        Assumed        Expected    Expected    Expected
interview     receipt                                                                   available     eligibility    eligible    response    interview
respondent                                                                              for B&B:09    rate from      cases       rate        yield
                                                                                        student data  student
                                                                                        collection    interview
Y             Y            Received, confirmed B&B eligible                   14,597    14,597        100%           14,597      87%         12,699
Y             Y            Received, confirmed B&B ineligible                 1,193
Y             N            Transcript not received, eligibility unconfirmed   2,215     2,215         92%            2,038       87%         1,773
                           Total                                              18,005    16,812        99%            16,635      87%         14,472

Subsample of NPSAS:08 interview non-respondents

NPSAS:08      Transcript   NPSAS:08       Expected    Sample        Assumed        Expected    Expected    Expected
study         receipt      eligibility    number      available     eligibility    eligible    response    interview
respondent                 source         eligible    for B&B:09    rate from      cases       rate        yield
                                                      student data  student
                                                      collection    interview
Y             Y            CADE           3,280       384           100%           384         51%         196
Y             Y            List           580         58            100%           58          51%         30
Y             N            CADE           540         27            75%            20          51%         10
Y             N            List           210         11            75%            8           51%         4
N             Y            CADE           140         7             100%           7           51%         4
N             Y            List           200         10            100%           10          51%         5
N             N            CADE           60          2             75%            2           51%         1
N             N            List           30          1             75%            1           51%         0
                           Total          290         500           98%            490         51%         250

Total sample                                          17,312        99%            17,125      86%         14,722




Attachment C

Proposed Full-Scale Data Elements










Changes to the B&B:09 Data Elements for Full-Scale



Data element



Applies to3



Purpose/issues



Changes for full scale

B&B eligibility




Confirm received bachelor’s degree or completed

requirements between July 1, 2007 and June 30, 2008?

All

Eligibility confirmation


Date received bachelor’s (month and year)

All

Eligibility confirmation






Undergraduate enrollment history




Institution granting the degree (confirm NPSAS or add code on-line)

All

Path/time to degree


Term and year first began undergraduate education

All



Undergraduate enrollment at other institutions between high school and bachelor’s degree

All



Names of other colleges attended (on-line coding) (up to 6)

Attended multiple



Terms/years attended other colleges

Attended multiple



Degree program and degrees attained at other colleges

Attended multiple



Class level at other colleges

Attended multiple


Dropped attendance status element, retained class level

Previous educational attainment prior to bachelor’s (previous certificate, associate’s, bachelor’s, other) at NPSAS3

NPSAS non-respondents


Ask only of respondents for whom this information is not available from NPSAS and revise question wording to mimic NPSAS

Dates of NPSAS attendance (month and year)

All



Continuous enrollment for bachelor's degree

All



Reasons for enrollment gaps

Stopouts



Reasons for attending a 2-year college

Attended a 2-year


Revise question wording to ask generally about reasons for 2-year institution enrollment

Original major at NPSAS

NPSAS non-respondents


Ask only of respondents for whom this information is not available from NPSAS

Number of times changed major

NPSAS non-respondents


Ask only of respondents for whom this information is not available from NPSAS and revise question wording to mimic NPSAS

Final major at NPSAS

NPSAS non-respondents


Ask only of respondents for whom this information is not available from NPSAS

Transfer or multiple enrollment (transfer/multiple

enrollment/both)

Attended multiple



Credits attempted to transfer/were accepted from other colleges

Attended multiple



Reasons for transferring (financial/academic

/personal/location/other)

Transfers


Drop; not analytically useful

Purpose of overlapping enrollment (transfer/additional courses/additional degree/financial/other)

Overlapping enrollment


Drop; not analytically useful





Performance




Withdrew from any course because failing

All



Repeated any course to improve grade

All



Received any incompletes

All



Ever on academic probation

All



Graduated with academic honors

All



Ever on Dean's list

All


New





SMART grants




Received Pell grant after July 2006

All


Drop all SMART and Pell grant items; the TRP recommended using this item set for NPSAS only. We will include a flag to identify SMART recipients (obtained as part of NPSAS from NSLDS) so analysts can track outcomes of B&B SMART recipients

Received SMART grant 3rd/4th years

All


Chose or changed major to qualify for SMART grant

SMART recipients


Major change from what to what

Changed major


Reason not eligible in 4th year (no Pell/not full time/not

qualifying major/didn’t earn 3.0 GPA)

Received SMART 3rd year/not 4th






Undergraduate student loan debt




Confirm total amount borrowed in student loans (from NPSAS)

All

Debt and finances

Was not included in field test, will not be in full scale

Loan type

Borrowers



Amount borrowed

Borrowers



Amount owed

Borrowers



Currently repaying student loans

Borrowers



Amount of monthly payments

Repaying



Parents helping to repay the loans

Repaying



Reasons not repaying

Borrowers not repaying



Deferment reason

Borrowers not repaying



Participation in loan forgiveness program

Borrowers



Has the debt influenced career plans

Borrowers



Consider the student loan debt a worthwhile investment

Borrowers







Assessment of education




Undergraduate education was worth cost

All



Satisfaction with quality of education from NPSAS

All



Satisfaction with undergraduate major choice

All



Current status (at time of interview)

All

Labor market outcomes


Working for pay at a full-time or part-time job




Taking courses toward a graduate or professional degree or postbaccalaureate certificate




Taking courses toward an undergraduate degree or certificate




Taking other courses, not for a formal award




Serving in an internship or training program



Drop; not analytically useful

Serving on active duty in the armed forces




Keeping house (full-time homemaker)




Holding a job but on temporary layoff from work or waiting to report to work




Looking for work




Traveling




Disabled




Volunteering (Peace Corps, VISTA)












Post-baccalaureate enrollment




Enrolled since earning bachelor’s degree

All



Enrolled for degree/certificate

Enrolled


Added; missing in FT data elements

When first enrolled for post-BA degree/certificate

Enrolled for degree/certificate


Exclude nondegree enrollees

Name of institution attending (on-line coding)

Enrolled for degree/certificate


Exclude nondegree enrollees

Currently enrolled

Enrolled for degree/certificate


Exclude nondegree enrollees

Degree type

Enrolled for degree/certificate


Exclude nondegree enrollees

Type of master's/doctoral/professional degree



Drop; not analytically useful

Degree program/field of study (on-line coding)

Enrolled for degree/certificate


Exclude nondegree enrollees

Attendance status

Enrolled for degree/certificate


Exclude nondegree enrollees

When completed/expect to complete program

Enrolled for degree/certificate


Exclude nondegree enrollees

Reasons for enrolling:

- To gain further education before beginning a career

- To prepare for graduate school or further education

- To change your academic or occupational field

- To gain further skills or knowledge in your academic or

occupational field

- For licensure or certification

- To increase opportunities for promotion, advancement, or

higher salary

- Required or expected by employer

- For leisure or personal interest

Enrolled


Drop; not analytically useful

Reason for choosing this institution

(reputation/faculty/location/financial aid/can go part time/other)

Enrolled


Drop; not analytically useful

Reason for choosing this program

(academic/financial/personal/other)

Enrolled


Was not included in field test, will not be in full scale

Type of financial aid received

Enrolled for degree/certificate


Exclude nondegree enrollees

Receiving any employer aid to support post-baccalaureate education

Enrolled for degree/certificate and working


Exclude nondegree enrollees

Number of hours worked per week while enrolled

Enrolled for degree/certificate and working


Exclude nondegree enrollees

Consider yourself primarily an employee or student

Enrolled for degree/certificate and working


Exclude nondegree enrollees





Plans for future enrollment in degree/certificate program




Expect to pursue a higher degree/certificate

Not enrolled


Drop; behavior predictions are of limited utility

Reasons not going to continue education (no

interest/academic/job/financial/personal)

STEM majors only


Drop; dropped all STEM-related items

When expect to start (coming year/next year/2 years/5 years/more than 5 years)

Expect more education


Drop; behavior predictions are of limited utility

Taken GRE or other graduate/first professional entrance exam

All



Type of degree/field of study expected

Expect more education


Drop; behavior predictions are of limited utility

Intended enrollment intensity

Expect more education


Drop; behavior predictions are of limited utility

Expect tuition reimbursement

Expect more education


Drop; behavior predictions are of limited utility

Reasons for delay (financial, academic/personal)

Expect more education


Drop; behavior predictions are of limited utility





Other coursetaking (nondegree)




Taken/taking formal courses (credit or noncredit) other than those taken while enrolled in a degree or certificate program

All


Drop; replaced by "Enrolled in any nondegree coursework"

Enrolled in any nondegree coursework

All


New

Reasons for taking courses

Enrolled in nondegree coursework



Any of school-related costs paid by employer

Taken/taking courses


Drop; not analytically useful





Employment at time of interview




Employed as an elementary/secondary school teacher


Employed

Labor market outcomes

Drop; this can be determined from the occupation coder

Date began job




Employed full-time or part-time




Prefer to have a full-time job

Employed PT



Number of jobs held currently

Employed



Number of jobs held since graduation




Type of occupation (on-line coding)




Type of duties (specify)




Type of industry (on-line coding)




Type of firm




Size of the company (number of employees)



Drop; not analytically useful

Salary (indicate per time period)




Average number of hours per week worked




Self-employed




Future plans for self-employment



Drop; not analytically useful

Reasons for self-employment

Future plans for self-employment


Drop; only applies to a very small percentage so won't be useful data

College degree required to obtain this job

Employed


Drop; we can get a measure of prestige from the occupation codes

Related to undergraduate major




Job part of career path




Difficult to get hired

On career path


New

Type of non-career job

Not on career path



Flexibility of job (very flexible/somewhat flexible/not flexible)

Employed


Drop; not analytically useful

Could do this job without flexibility

Flexible job


Drop; not analytically useful

Able to telecommute (y/n/does not make sense)

Employed


Drop; not analytically useful

Frequency of telecommuting

Telecommuters


Drop; not analytically useful





Job satisfaction




Compensation

Employed

Labor market outcomes

Revised item to "compensation" from "pay and fringe benefits"

Importance and challenge




Opportunity for advancement



Drop; not analytically useful

Opportunity to use training and education



Drop; not analytically useful

Job security




Opportunity for further training and education



Drop; not analytically useful

The job as a whole








Benefits




Medical and/or other health insurance (dental, vision, etc.)

Employed

Labor market outcomes


Life insurance




Retirement or other financial benefits, such as

401(k)/403(b)




Other




Stock options



Drop; not analytically useful

Flexible spending accounts



Drop; not analytically useful

Employee discounts



Drop; not analytically useful

Other employee facilities or subsidies, such as for

childcare, transit, or fitness



Drop; not analytically useful

Employee assistance program (counseling/legal)



Drop; not analytically useful

Tuition reimbursement



Drop; not analytically useful





Responsibilities




Supervise work of others

Employed

Labor market outcomes

Drop; not applicable to this group (1 year after bachelor's degree receipt)

Participate in hiring/firing decisions



Participate in setting salary rates



Level of autonomy







Reasons for part-time work

Part-time


Drop; not analytically useful

Full-time unavailable



None of the employees worked a full-time schedule



Family responsibilities



Attended school while working



No need or desire to work full-time



Pursuing other interests or hobbies



Health problems prohibited full-time work



Other







Job search




Looking for work

All



Job search strategies

Looking for work


Drop; not analytically useful

Method for finding job



Drop; not analytically useful

Which job search activity led to job

Employed


Drop; not analytically useful

Number of jobs applied to before current job



Drop; not analytically useful

Location of job search

Employed or looking for work


Drop; not analytically useful

Employed since earning bachelor's

All


Added; missing in FT data elements

Employment status by month




July 2007 (working/looking for work)



Added some months that were missing from the FT data elements, edited years

August 2007 (working/looking for work)




September 2007 (working/looking for work)




October 2007 (working/looking for work)




November 2007 (working/looking for work)




December 2007 (working/looking for work)




January 2008 (working/looking for work)




February 2008 (working/looking for work)




March 2008 (working/looking for work)




April 2008 (working/looking for work)




May 2008 (working/looking for work)




June 2008 (working/looking for work)




July 2008 (working/looking for work)




August 2008 (working/looking for work)




September 2008 (working/looking for work)




October 2008 (working/looking for work)




November 2008 (working/looking for work)




December 2008 (working/looking for work)




January 2009 (working/looking for work)




February 2009 (working/looking for work)




March 2009 (working/looking for work)




April 2009 (working/looking for work)




May 2009 (working/looking for work)




June 2009 (working/looking for work)




July 2009 (working/looking for work)




August 2009 (working/looking for work)




September 2009 (working/looking for work)








Work-related training




Any work-related training such as workshops or seminars (not college courses)

Employed


Drop; not analytically useful

Areas of training (management or supervisor/training in occupational field/general professional training, such as speaking, writing, computer software skills/other

Had training


Drop; not analytically useful

Reasons for training (facilitate change in occupational field/gain skills or knowledge in current occupational field/licensure or certification/increase opportunities for

advancement or salary increases/learn skills for recently acquired position/required or expected by employer)

Had training


Drop; not analytically useful

Most important reason (select one of above)

Had training


Drop; not analytically useful





Current demographics




Date of birth

NPSAS non-respondents

Background information for analyses of debt/teaching/other employment

Limit question to those for whom we do not have preloaded information

Citizenship status (citizen, permanent resident, other)

NPSAS non-respondents and non-citizens in NPSAS


Limit question to those for whom we do not have preloaded information saying they are a citizen

Current state of legal residence

All



Live more than 50 miles from NPSAS institution

All



Live more than 50 miles from where attended high school

All



Reasons live more than 50 miles from where attended high school (work/school/location preference/family/other personal)

Those who lived more than 50 miles from where attended high school


Drop; not analytically useful

Reasons live more than 50 miles from where graduated from college (work/school/location preference/family/other personal)

Those who lived more than 50 miles from NPSAS institution


Drop; not analytically useful

Household composition

All



Marital status (never married/ married/ separated/divorced/partner)

All



Date of last change in status

All


Drop item; detail not necessary

Number of dependent children

All



Age of youngest dependent child

Have children



Employment/enrollment status of spouse/partner

Have spouse/ partner


Condensed items

Income/debt of spouse/partner

Have spouse/ partner


Condensed items

Income in 2008

All


Adjust question wording to make clear that respondent should report his/her income only, not household income

Type of disability

All


Longitudinal item from NPSAS:08; inadvertently omitted from FT data elements

Main disability

Disabled



Native language

All


New

Other language

All


New

Language coursetaking

Know a non-English language


New

Non-English language use during childhood

Know a non-English language


New

Use of non-English language

Know a non-English language


New

Proficiency in non-English language

Know a non-English language


New





Assets and debt




Own home or rent

All

Debt and finances


Monthly mortgage/rent amount

All



Other type of housing (parents/military/job

includes/religious/other)

No mortgage/rent payment


Drop; the important items here are captured other places (military service and living with parents)

Own any motor vehicles

All



Monthly auto payments

Vehicle owners



Untaxed benefits

All


Drop; not analytically useful

Impact of recession on enrollment and employment decisions

All


New





Civic and volunteer activity




Registered to vote in U.S.

U.S. citizens



Voted in any election

U.S. citizens



Military status (veteran, active, reserves, none)

All



Perform any community service/volunteer work in last year

All



Types of service and time commitment

All



Volunteer hours per month

Volunteers


Added; missing in FT data elements

Reasons why volunteered

Volunteers


Drop; not analytically useful

Volunteer benefits

Volunteers


Drop; not analytically useful

Future plans to volunteer

Volunteers


Revised from "Volunteer again in next 12 months" to "Future plans to volunteer"





Identifying prospective teacher pipeline members




Teaching experience at K-12 level

All

Screen for K–12 teaching pipeline


Prepared for teaching

All who hadn’t taught



Considering teaching

All who hadn’t taught or prepared







Teaching experiences




Types of teaching positions held since NPSAS school:

regular, short-term substitute, long-term substitute,

teacher’s aide, support, itinerant, student teacher

All who had taught

Identify K-12 teachers (those who had regular, long-term substitute, support, or itinerant positions in a public or private K-12 school)


Number of schools/districts held teaching positions since

NPSAS school




For types held, month/year when first taught


Teaching career paths


Held substitute or teacher’s aide position to get permanent K-12 job

Taught only in short-term substitute or teacher’s aide positions

Identify transition jobs into teaching

Drop; not analytically useful

Current teaching position/most recent position if not currently teaching

K-12 teachers


Drop; this information can be derived from responses to items in the school/district loop

Participated in teacher internship program

K-12 teachers

Teaching career paths


How well did your student teaching or internship

experience prepare you for teaching?

K-12 teachers

Teacher education/training, teaching career paths

Drop; not analytically useful

How well did your education courses in college prepare you for teaching?

K-12 teachers

Teacher education/training, teaching career paths

Drop; not analytically useful

How well did your academic courses in college prepare you for teaching?

K-12 teachers

Teacher education/training, teaching career paths

Drop; not analytically useful

How many K-12 teaching jobs (not including teacher’s aide, short-term substitute, or student teaching jobs)?

K-12 teachers

Teaching career paths

Was not included in field test, will not be in full scale

For each school/district (not including teacher’s aide, short-term substitute, or student teaching jobs):

K-12 teachers

Teaching career paths


Type of teaching job (regular, long-term substitute,

support, itinerant)




Start and end date




Number of schools at which taught in this job




School(s) where taught (CCD/PSS coder)




Sector and level of school

If school not in coder



County and district of school for itinerant position

Itinerant teachers



Whether participated in a formal induction program (first job only)




Grades taught




Subject areas taught




Taught any college prep, AB/IB, honors, bilingual/ESL, gifted, or remedial classes (check all that apply)



Drop; not analytically useful

Whether prepared to teach all subjects taught




Whether taught full or part-time




Academic year base salary and other compensation




Why did you leave that school/district?




Degree of preparation for first teaching position

K-12 teachers

Teaching career paths


Support from school or district in first teaching job

K-12 teachers

Teaching career paths


Satisfaction with aspects of teaching

K-12 teachers

Teaching career paths


How long do you plan to continue teaching?

Currently teaching

Teaching career paths

Drop; not analytically useful

Do you plan to return to teaching?

Taught but no longer teaching

Teaching career paths

Drop; not analytically useful

Certification and preparation


Teacher education/training, teaching career paths


Ever certified to teach at the K-12 level? (Do not include emergency certificates or waivers.)

Prepared to teach


Drop; not analytically useful

Currently certified to teach in any of grades K-12 in any state?

Ever certified



State of certification

Ever certified


Drop; not analytically useful

Type of certification

Ever certified


Added; missing in FT data elements

Name of teaching certification

Ever certified


Drop; was used only to generate data to help determine response categories for type of certification item

Date first certified (month, year)

Ever certified



Field(s) in which certified

Ever certified



Certified through an alternative certification program?

Ever certified


Drop; not analytically useful

Type of alternative certification program

Certified by alternate route


Drop; not analytically useful

Completed or completing student teaching or teacher practicum

No regular certification or not teacher education majors


Ask only of respondents without regular/standard certifications or who were not teacher education majors

Taken or taking courses towards certification

Prepared but never certified







Teaching job applications




Applied for teaching jobs since completing degree

Not taught, but had prepared or were currently considering

Teaching career paths


Received any offers?

Applied



Rejected all offers?

Received an offer


Was not included in field test, will not be in full scale

Reasons for not taking offered teaching job(s)

- Received offer after another job was accepted

- Pay was not adequate

- Job offer too far from home

- Job offer in dangerous/difficult school

- Offer not in area for which I was qualified

- Another job offered more interesting/challenging

work

- Poor teaching conditions

- Already in another job

- Received better offer

Rejected offers


Was not included in field test, will not be in full scale

Reasons for not applying for a teaching position

Did not apply



Have you had any non-teaching jobs in elementary or secondary education? (principal, assistant principal, program administrator, curriculum coordinator, department head, school

psychologist/counselor/advisor, coach, library media specialist/librarian, support staff (e.g., secretary), other)


Taught, prepared, or were currently considering

Teaching career paths

Was not included in field test, will not be in full scale

Do you have any plans to move into or continue in a non teaching job in elementary or secondary education?

Taught, prepared, or were currently considering

Teaching career paths


Why did you major in teacher education?

Teacher education majors who said no to screeners 1 and 3 and had neither applied nor taught since graduation


Drop; not analytically useful

Plan to teach in future

Teacher education majors who said no to screeners 1 and 3 and had neither applied nor taught since graduation



Loan forgiveness program awareness and participation

All who taught, prepared, or are currently considering

Loan forgiveness programs

NOTE: This addresses TEACH grants





Locating information

All

Tracing for next follow-up






Attachment D


Additional Foreign Language Questions


Proposed wording: Is English your native language?
NAAL 2003 question: A-5, A-6
Similar/modified/new: Similar
Justification: This is a gateway question to differentiate between native and non-native speakers.

Proposed wording: What language do you consider to be your native language?
NAAL 2003 question: A-7
Similar/modified/new: Similar
Justification: This question will be used to establish the respondent's primary language.

Proposed wording: Do you know any other language(s) or have you ever taken classes in a foreign language?
NAAL 2003 question: A-8
Similar/modified/new: Modified
Justification: The NAAL currently asks this question with respect to ESL classes. We would like to modify it to apply to all people who know more than one language.

Proposed wording: Which second language do you know best?
NAAL 2003 question: A-13
Similar/modified/new: Similar
Justification: This question will be used to establish the respondent's secondary language.

Proposed wording: How long ago did you last take a [T_LNGNAM] class?
NAAL 2003 question: A-10
Similar/modified/new: Similar
Justification: This question is asked to determine when the last formal class was taken.

Proposed wording: Growing up, did you speak [T_LNGNAM] at home always, sometimes, or never?
NAAL 2003 question: A-5
Similar/modified/new: Similar
Justification: This question is asked to determine how often the non-English language was spoken in the home.

Proposed wording: In comparison to your English, how proficient in [T_LNGNAM] are you in the following:
NAAL 2003 question: A-14, A-15
Similar/modified/new: Similar
Justification: This question is asked to obtain a self-assessment of non-English language skills relative to English.

Proposed wording: Currently, do you interact with people in [T_LNGNAM] on a regular basis?
NAAL 2003 question: ---
Similar/modified/new: New
Justification: These questions are not asked in the NAAL. We would like to cognitively test them so that we can get an accurate picture of how people are maintaining their second-language skills.

Proposed wording: Do you use, or plan to use, your [T_LNGNAM] in your career?
NAAL 2003 question: ---
Similar/modified/new: New
Justification: See above.




Attachment E


Change to Response Burden


The burden estimates and estimates of costs to respondents are provided in tables 4 and 5 below. The response time for participating institutions was described in the previously approved package that covered the transcript collection. The focus here will be on student sample members.


Projected estimates for response burden and costs for B&B:08/09 are based on experiences from B&B:93/03 and more recent studies, including NPSAS:08 and BPS:04/09 as well as the B&B:08/09 field test. Estimated response burden for students is based on extensive timing analysis conducted in previous B&B interviews.


Table 4. Estimated burden on B&B:08/09 full-scale respondents

Full-scale data collection activity    Sample    Expected    Expected response    Number of      Average time burden    Range of          Total time
                                                 eligible    rate (percent)       respondents    per response           response times    burden (hours)
Student interview                      17,312    17,125      86                   14,722         25 min.                10 to 45 min.     6,134

NOTE: B&B:08/09 = 2008/09 Baccalaureate and Beyond Longitudinal Study.

Table 5. Estimated costs to students for the B&B:08/09 full-scale implementation

Full-scale data collection activity    Sample    Response rate    Number of      Average          Total burden    Rate per    Total
                                                 (percent)        respondents    burden (time)    (time)          hour ($)    cost ($)
Student interview                      17,312    86               14,722         25 min.          6,134 hrs.      10          61,342

NOTE: B&B:08/09 = 2008/09 Baccalaureate and Beyond Longitudinal Study.




1 Participation, rather than response, was measured as the outcome for these analyses. The participation rate includes cases that initiated the interview but were determined to be ineligible; such cases are not counted as completes and thus are not reflected in response rates. Because it is the response to different data collection strategies that is of primary interest here, participation is the more appropriate measure. In practice there was little difference between the number that participated and the number that completed, because there were very few ineligible cases.

2 If the number of students who were not NPSAS study respondents, are potentially eligible based on CADE or the enrollment list, but have no transcript is lower than expected, then the sampling rate for that group may be increased slightly so that at least a few such cases are sampled.

3 Column 2 indicates the subsample to whom the item will apply. Column 3 provides the purpose or issue being addressed by the data element, and column 4 indicates how the data element has been changed since the original OMB submission if it has changed. Rows with data elements to be deleted are highlighted.
