Q & As

National Postsecondary Student Aid Study

OMB: 1850-0666

Memorandum

United States Department of Education

Institute of Education Sciences

National Center for Education Statistics



DATE: December 12, 2007

TO: Edie McArthur

Assistant to the Commissioner, NCES


FROM: James Griffith

Postsecondary Studies Division, NCES

SUBJECT: Responses to questions (ICR 200709-1850-006)

1. What were the major data elements (and their sources) from NPSAS:04 used to estimate “the average amounts of the federal education tax benefits (Hope, Lifetime Learning, and Tuition and Fees Deductions) and their distribution among students”? How confident is NCES in the quality of those data elements? Will the data items and methodology remain the same in 2008?

Since there is no student-level data available to determine the value of these federal tax benefits, estimates of the tax benefit amounts for the NPSAS:04 sample students were imputed based on the eligibility requirements and data published by the Internal Revenue Service (IRS).


The IRS publishes tables based on a sample of tax returns that show the number of returns by adjusted gross income (AGI), the amount of tax liability, as well as the number and amount of various types of credits and deductions claimed (U.S. Department of the Treasury 2005). The amounts of the Hope and Lifetime Learning tax credits claimed are shown as a combined total "education tax credits" in the IRS tables. These are the amounts claimed prior to any adjustment for taxes owed, which may limit the actual amount of the benefit received. The tuition and fees deduction amounts in the IRS tables are shown as the amount of the deduction claimed, which is substantially larger than the value of the tax benefit. The number of education tax credit and deduction claims and the total dollars claimed by AGI levels that are reported in the IRS tables for the 2003 tax year were used as the target numbers for imputing the claims estimates in the NPSAS:04 sample data. The results were checked for consistency and are reasonable estimates given the assumptions. A comparison with IRS totals is shown in NCES-2006-186 (p. B-22).
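To make the target-based approach concrete, the sketch below illustrates one simplified way to impute education tax credit claims so that the weighted number of claims and the average claim amount within each AGI band line up with published IRS totals. The band boundaries, target values, and data structures are placeholders, not actual IRS figures; the procedure actually used (including its treatment of eligibility rules and tax liability limits) is the one documented in appendix B of NCES-2006-186.

```python
import random

# Hypothetical IRS targets by AGI band for tax year 2003: number of education
# tax credit claims and total dollars claimed. Placeholder values only.
IRS_TARGETS = {
    (0, 30_000):       {"claims": 2_500_000, "dollars": 2.0e9},
    (30_000, 60_000):  {"claims": 3_100_000, "dollars": 3.1e9},
    (60_000, 107_000): {"claims": 1_600_000, "dollars": 1.9e9},
}

def impute_credits(students, targets):
    """Randomly designate claimants within each AGI band so that the weighted
    number of claims and the average claim amount match the published totals.
    `students` is a list of dicts with 'agi', 'weight', and 'eligible' keys."""
    for (lo, hi), t in targets.items():
        band = [s for s in students if lo <= s["agi"] < hi and s["eligible"]]
        if not band:
            continue
        band_weight = sum(s["weight"] for s in band)
        claim_rate = min(t["claims"] / band_weight, 1.0)  # share of band claiming
        avg_amount = t["dollars"] / t["claims"]           # mean claim in the band
        for s in band:
            s["credit"] = avg_amount if random.random() < claim_rate else 0.0
    return students
```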


The methodology, assumptions, and caveats in deriving estimates in the NPSAS:04 study are described at length in the appendix B to NCES-2006-186 (pp B-16 to B-29).


A similar method is planned for use in NPSAS:08.


2. NCES indicates that among the primary issues for NPSAS:08 are the HERA changes in the EFC calculation. Please confirm that the CCRA changes also will be addressed in the collection.

Yes, the relevant College Cost Reduction Act (CCRA) changes will be considered, although some of these do not go into effect until 2008-09.

3. Understanding students' private loan debt, and even the degree to which they understand whether their loans are private, are important policy questions. To what degree can NPSAS:08 collect and provide basic information about private loans?

The subject of private loans was discussed extensively at the recent NPSAS:08 Technical Review Panel meeting (held in August 2007). Our experience in past NPSAS surveys, which have included some questions about private loans and amounts borrowed, suggested that many students do not understand the difference between federal loans and other types of financial aid, including private loans.

For NPSAS:08, we have designed the interview questions to better differentiate among the various types of aid. For instance, in addition to asking whether the student has taken out any private loans, an item has been included to ask if a cosigner is needed for any of the student’s loans (since a cosigner is only required for a private loan). The NPSAS:08 student interview also includes items pertaining to the types of activities students engage in to learn about their financial aid options. A facsimile of the survey questions about this topic is attached with this memo.

4. Just as NPSAS:04 was able to provide baseline information that allows for a 2008 evaluation of the SMART grant program, there is policy interest in establishing a baseline for TEACH grants, which will first be implemented in late 2008. To what degree can NPSAS:08 collect and provide baseline information about whether students indicate that they are interested in becoming a teacher (in concert with the item on the institutional data collection about TEACH grant recipiency)? Specifically, what is the feasibility of asking the intention-to-teach questions (currently planned for the B&B subsample) of the full sample? Were any of these questions asked in the 2004 NPSAS?

The Teacher Education Assistance for College and Higher Education (TEACH) Grant program begins with the 2008-09 academic year and is targeted to academically qualified students who are willing to make a commitment to teach full time for at least 4 academic years within 8 years of completing their program of study. The grant will be available to eligible undergraduate and graduate students who go on to teach a high-need subject at a school serving low-income students.

In NPSAS:04, students who were in the cohort of first-time beginners (FTBs) were asked the following question:

Do you plan on becoming a teacher at the K-12 (Kindergarten-grade 12) level?

Definitely yes

Probably yes

Probably not

Definitely not 

In NPSAS:08, the B&B subsample will be asked if they have ever taught or if they are currently considering teaching.

Have you ever been employed as a K-12 teacher at a public, private, or parochial school? (asked of all B&B eligible students)

Are you currently considering teaching at the K-12 level at a public, private, or parochial school? (asked of all B&B eligible students who have not taught)

Since the grant will not be implemented until late 2008, the NPSAS:08 survey could be used to collect baseline measures of interest in teaching. The grant will be available to both undergraduate and graduate students, so the teacher questions will be asked of everyone to determine the level of interest even among students who are not currently preparing to become teachers. However, we plan to restrict the question about having ever taught to students who are at least 22 years old; all students will be asked whether they are considering teaching.

5. Please provide a copy/link to the 2006 field test results specifically regarding incentives.

A draft of the field test methodology report is attached with this memo, but the summary below describes field test data collection periods and results pertaining to experiments using incentives.


Data Collection Periods


Overall, 2,020 of approximately 2,950 (68 percent) eligible sample members completed either a full or partial student interview. Interviews were completed in three periods:


1)   Early Response 

After initial locating of sample members, a period of 3 weeks was allotted for students to complete the self-administered interview via the Web. All respondents who completed the student interview during the early response period were offered a $30 incentive.

About 1,050 interviews (36 percent of the eligible sample) were completed during the early response period and were thus eligible for the $30 incentive.


2)   Production 

Sample members who did not complete the NPSAS:08 field test interview during the early response period were contacted by telephone interviewers during the production interviewing phase.

Approximately 480 interviews were completed during the production phase; these cases were not eligible for an incentive, either because the interview was completed after the early response phase or because it was only partially completed.


3)   Nonresponse Conversion

The final phase of student interviewing involved the nonresponse conversion of refusal cases and of those students who were difficult to locate. Respondents who completed the student interview during the final phase were offered a $30 incentive.

Among the 1,420 sample members eligible for the nonresponse conversion incentive of $30, about 34 percent (n=490) completed the interview. 


Experiment Results


The NPSAS:08 field test study included experiments designed to evaluate the impact of various data collection strategies on student response rates. In NPSAS:04, a $30 incentive was offered to nonrespondents late in the data collection period with apparent effectiveness. However, questions remained regarding the effectiveness of such an incentive in terms of early response and interview completion by self-administration. Moreover, the literature suggests that the type of mailing may play an important role in determining whether or not sample members actually receive and read the notification letter and, therefore, whether or not they become aware of the incentive offering.


Two experiments in the NPSAS:08 field test focused on response rates during the early response period, during which students are asked to complete a self-administered interview.

The first experiment examined whether the use of Priority Mail to send study materials produced a higher response rate in the early response period than First-Class Mail. The second experiment during the early response period examined the effectiveness of prompting calls: about halfway through the early response period, telephone calls were made to remind sample members about the study and to assist with login information if needed. Results showed that both Priority Mail and prompting calls were associated with higher early response rates.


Another experiment considered the use of prepaid incentives during the final phase of data collection, the nonresponse conversion phase. Sample members who became eligible for nonresponse conversion (i.e., those who had not completed the student interview and had at some point refused or were particularly difficult to locate) were offered a $30 incentive. Sample members had already been randomly assigned to one of two incentive conditions: either being sent a check for $10 up front along with a letter offering another $20 upon interview completion, or receiving a letter promising the entire $30 incentive after the interview was completed. There was no significant difference, however, in response rates between those who received the prepaid and those who received the promised nonresponse conversion incentive.
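As an illustration of how such a comparison might be tested, the sketch below applies a simple two-proportion z-test to hypothetical arm-level counts; the memo reports only the combined totals (roughly 1,420 eligible cases and 490 completions), so the per-arm numbers here are invented. A production analysis would use design-based variance estimates that account for the survey weights.

```python
from math import sqrt

def two_prop_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test for a difference in response rates between two
    experimental conditions (prepaid vs. promised incentive)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented per-arm counts consistent with the reported overall totals:
z = two_prop_z(250, 710, 240, 710)
print(f"z = {z:.2f}")  # |z| < 1.96, so no significant difference at alpha = .05
```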

6. Why is the data collection period six months long? How will NCES address reference point issues given this lengthy data collection period?

A summary of sampling and data collection procedures will help to address the question of the time required for data collection.

The NPSAS study employs a two-stage sample design. First, institutions are selected from a frame consisting of all Title IV eligible institutions in IPEDS and are asked to provide student enrollment lists. Depending on the institution's calendar system, institutions submit enrollment lists on a flow basis between January and June of the study year.

The student sample is then selected in the second stage from the enrollment lists, which contain students enrolled from July 1, 2007, through April 30, 2008. While many institutions will be able to provide lists of such students early in 2008, others (e.g., those with continuing enrollment) will not be able to provide lists until later in the spring. Table 1 presents the flow of lists experienced in the NPSAS:04 study. As can be seen, the data collection period is not the same for all students and, in fact, can be quite restricted for those who are sampled late in the process.
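To make the two-stage design concrete, the sketch below shows a simplified version: systematic probability-proportional-to-size selection of institutions from an IPEDS-style frame, followed by random selection of students from an enrollment list. The actual design stratifies both stages and sets sector-specific sample sizes, so treat this as a schematic only.

```python
import random

def pps_systematic(frame, n):
    """Stage 1: systematic probability-proportional-to-size selection of
    institutions from the IPEDS-based frame ('size' is an enrollment measure)."""
    total = sum(u["size"] for u in frame)
    step = total / n
    r = random.uniform(0, step)
    sample, cum, k = [], 0.0, 0
    for u in frame:
        cum += u["size"]
        while k < n and r + k * step < cum:
            sample.append(u)  # a very large unit can be hit more than once
            k += 1
    return sample

def sample_students(enrollment_list, n):
    """Stage 2: random selection of students from an institution's enrollment
    list (the actual design stratifies students by type)."""
    return random.sample(enrollment_list, min(n, len(enrollment_list)))
```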

Table 1. Cumulative flow of enrollment list receipt: 1996 and 2004

         Cumulative percentage of lists received
Month    NPSAS:96    NPSAS:04
1        17.7        12.5
2        42.2        38.4
3        63.6        58.8
4        85.1        75.4
5        95.9        88.7
6        98.8        98.2
7        100.0       100.0

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1996 National Postsecondary Student Aid Study (NPSAS:96) and 2004 National Postsecondary Student Aid Study (NPSAS:04).

Another factor driving the proposed data collection schedule is the large sample size of about 138,000 students in NPSAS:08. A sample this large will require significant time and resources to work all the sample cases, especially those that are difficult to locate or that initially refuse to participate. Also, students are initially given three weeks to complete the interview by self-administration on the Web before computer-assisted telephone interviewing (CATI) begins, so the CATI data collection period is shorter than the overall data collection period.

Thus, the data collection schedule for NPSAS is determined by several constraining factors. Data collection cannot begin until the sample has been selected, which cannot happen until enrollment lists have been received. Depending on the institutional calendar system, some enrollment lists cannot be submitted until June. The scheduled period of time for data collection is necessary given the level of effort required to locate and interview students, as well as collect institutional record data and data from extant sources, and to prepare the data files for the contractually scheduled delivery.

The second part of this question pertains to the impact of the proposed data collection period on the “reference period.” The NPSAS student survey collects information about enrollment and experiences during an academic year (between July 1 and June 30) rather than at a single point in time. Data collection generally begins in the spring of that year and goes through the summer. Depending on when the enrollment list is received and the student sample selected, a student may be interviewed during the NPSAS year, or soon thereafter. Even when interviews occur during the NPSAS year, some students are still enrolled and others are no longer enrolled.

To address this reference point issue, we have implemented conditional wording for many of the interview questions. The wording will vary for students who are currently enrolled at the time of the interview and for those who are no longer enrolled. During the study year, students not enrolled at the time of the interview will be asked about their “most recent term of enrollment during the 2007-2008 school year” to ensure consistency of responses.

For interviews conducted on or after July 1, 2008 (i.e., after the NPSAS year is over), all students will be asked about the “most recent term of enrollment during the 2007-2008 school year” to ensure that all students respond about their experiences during the reference period of interest rather than their current enrollment experiences.
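A minimal sketch of this conditional-wording logic appears below. Only the “most recent term” phrasing is quoted from the instrument; the wording of the currently-enrolled variant is assumed for illustration.

```python
from datetime import date

NPSAS_YEAR_END = date(2008, 7, 1)  # NPSAS year: July 1, 2007 to June 30, 2008

def enrollment_reference(interview_date, currently_enrolled):
    """Return the reference-period wording for enrollment questions under the
    conditional-wording rules described above (simplified)."""
    if interview_date >= NPSAS_YEAR_END or not currently_enrolled:
        return "your most recent term of enrollment during the 2007-2008 school year"
    return "your current term of enrollment"  # assumed wording for illustration

# A student interviewed in August 2008 is asked about the most recent 2007-08
# term even if currently enrolled in a new term:
print(enrollment_reference(date(2008, 8, 15), currently_enrolled=True))
```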

7. Is any special outreach planned to the approximately 20 institutions that will be included in both the field test and the full-scale study?

All 20 institutions sampled for both the full-scale and field test are part of the 6-state oversample. Institutions from two states—Minnesota and California—were not sampled for the field test expressly because those states initially expressed interest in funding their own state oversample. Hence, only 4 states have institutions that were sampled for both the field test and full-scale study.

Of the institutions selected for both the field test and full-scale studies, four schools did not participate in the field test and will be treated as refusal conversion schools for the full-scale study. Data will be provided at the system level for four other schools, and thus they will not be burdened by repeat participation since the requested data are provided centrally.


For the remaining 12 institutions that did participate in the field test study, we will build upon the strong working relationships we developed with these institutions during the field test.

For example, when working with an institution that participated in the field test, we plan to work directly with the Institutional Coordinator (IC) to inform him/her that the institution has been selected for the full-scale study before sending the initial mailing to the Chief Administrator. This is intended to minimize any questions or concerns that a blind mailing to the Chief Administrator’s office might engender. We will emphasize the importance of their participation for the success of the state-representative component of the study. As with any institution where potential burden becomes an issue, these institutions will be offered compensation for their time and effort in providing both list and student record data, and the assistance of RTI field data collectors in data entry of student record information for sampled students.

 

We are also working closely with representatives from the 6 oversample states and with system-level staff to provide support and encourage the participation of these institutions in the full-scale study. (Four of the schools are part of state systems where we have established strong working relationships.)

8. What are the “minimum data requirements, regardless of source” for a student to be considered a respondent?

NPSAS:04 introduced the “study respondent” concept, which was continued in the NPSAS:08 field test. Because multiple data sources were used in the NPSAS:08 field test study, it was possible for a sample member to have all of the crucial pieces of information while only matching or responding to a small number of data sources. As a result, the study respondent definition was variable-based rather than source-based, with key variables identified across sources.

For both studies, a respondent is defined as any sample member who is determined to be eligible for the study and, minimally, has valid data from any source for the following (a code sketch of this rule appears after the list):

  • student type (undergraduate or graduate/first-professional);

  • date of birth or age;

  • gender; and

  • at least 8 of the following 15 variables:

  1. dependency status;

  2. marital status;

  3. any dependents;

  4. income;

  5. expected family contribution (EFC);

  6. degree program;

  7. class level;

  8. baccalaureate status;

  9. months enrolled;

  10. tuition;

  11. received federal aid;

  12. received nonfederal aid;

  13. student budget;

  14. race; and

  15. parent education.
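A minimal sketch of the rule, assuming a flat record in which items missing from every source are None (the variable names here are ours, not the actual file's):

```python
REQUIRED = ["student_type", "dob_or_age", "gender"]
KEY_VARIABLES = ["dependency_status", "marital_status", "any_dependents",
                 "income", "efc", "degree_program", "class_level",
                 "baccalaureate_status", "months_enrolled", "tuition",
                 "received_federal_aid", "received_nonfederal_aid",
                 "student_budget", "race", "parent_education"]

def is_study_respondent(record):
    """Apply the study-respondent rule: the sample member must be eligible,
    have all three required items from any source, and have at least 8 of
    the 15 key variables."""
    if not record.get("eligible", False):
        return False
    if any(record.get(v) is None for v in REQUIRED):
        return False
    return sum(record.get(v) is not None for v in KEY_VARIABLES) >= 8
```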

The vast majority of study respondents considerably exceeded the minimum requirements to be classified as a study respondent, and overall data completeness was quite high. Table 2 shows that, in the NPSAS:08 field test, 99 percent of study respondents had CADE data, and over two-thirds of study respondents had both student interview and CADE student record data. Approximately 45 percent of study respondents had data from all three of the primary data sources (CADE student record data, student interview data, and Central Processing System [CPS] data).

Table 2. Data completeness for NPSAS:08 study respondents, by data source: 2007

Sources of data                                          Number    Percent
Total                                                     2,900      100.0
Student interview, CADE student record, and CPS data      1,300       45.0
Student interview and CADE student record data              690       23.7
CADE student record and CPS data                            590       20.5
Student interview and CPS data                               20        0.6
CADE student record data only                               290        9.8
Student interview data only                                  10        0.5

NOTE: Detail may not sum to totals because of rounding. CADE = computer-assisted data entry; CPS = Central Processing System.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008 National Postsecondary Student Aid Study (NPSAS:08) Field Test.

Table 3 presents data completeness for study respondents in the NPSAS:04 full-scale study. In total, 92 percent (weighted) of the study respondents have student record data from the NPSAS institution (CADE data). The percentage of study respondents who have student interview data is 70 percent. Additionally, 52 percent of study respondents had a federal aid application for the 2003–04 academic year in the CPS database. The percentage of study respondents who matched to the NSLDS loan database for the 2003–04 academic year is 34 percent, and the percentage who matched to the NSLDS federal Pell Grant database for the same year is 23 percent.


Table 3. Percent of study respondents with data, by institutional characteristics, student type, and source: 2004

(Each source column shows the unweighted / weighted percent.)

Institutional characteristics            Number of      Student        Interview      CPS            NSLDS loans    NSLDS Pell
and student type(5)                      responding     record         percent(2)     percent(3)     percent(4)     Grants
                                         students(6)    percent(1)                                                  percent(4)
Total                                    90,750         90.2 / 91.7    68.5 / 69.7    62.7 / 52.4    37.7 / 33.8    32.0 / 23.1

Institutional level
Less-than-2-year                         9,690          87.6 / 87.8    49.8 / 49.6    83.3 / 69.9    44.5 / 47.6    63.8 / 47.7
2-year                                   31,260         87.4 / 88.9    66.3 / 68.5    59.1 / 43.1    24.3 / 16.5    36.1 / 24.7
4-year non-doctorate-granting            19,400         91.7 / 94.4    71.3 / 70.0    71.2 / 63.0    51.5 / 48.7    33.5 / 26.7
4-year doctorate-granting                30,400         92.8 / 93.7    74.8 / 72.8    54.4 / 55.0    40.4 / 43.5    16.8 / 16.4

Institutional control
Public                                   56,990         91.3 / 91.6    71.1 / 70.5    55.5 / 46.8    28.2 / 25.6    27.3 / 21.5
Private not-for-profit                   20,630         89.2 / 93.2    70.8 / 70.8    67.7 / 62.5    49.0 / 50.7    27.6 / 19.5
Private for-profit                       13,120         86.9 / 89.2    53.1 / 59.8    86.0 / 82.5    61.2 / 71.8    59.5 / 47.4

Type of institution
Public less-than-2-year                  1,930          94.6 / 95.9    62.1 / 60.6    71.1 / 34.1    14.9 / 11.0    50.4 / 21.0
Public 2-year                            26,320         90.5 / 89.6    68.2 / 69.0    53.9 / 40.5    17.3 / 13.0    32.1 / 22.5
Public 4-year non-doctorate-granting     8,160          90.8 / 94.8    72.2 / 71.1    63.2 / 55.8    40.0 / 38.4    30.8 / 25.0
Public 4-year doctorate-granting         20,600         92.2 / 93.2    75.4 / 72.9    53.0 / 53.2    38.6 / 40.6    17.6 / 18.2
Private not-for-profit 2-year or less    2,570          64.9 / 70.8    52.5 / 55.6    83.1 / 77.4    41.0 / 45.1    58.8 / 46.6
Private not-for-profit 4-year non-doctorate-granting
                                         8,550          91.2 / 93.1    73.0 / 69.8    74.8 / 65.6    56.7 / 52.6    32.2 / 25.0
Private not-for-profit 4-year doctorate-granting
                                         9,510          93.9 / 94.7    73.8 / 72.7    57.1 / 58.1    44.2 / 48.9    15.1 / 11.7
Private for-profit less-than-2-year      7,150          87.3 / 87.0    47.9 / 48.1    86.8 / 76.1    54.9 / 55.0    66.8 / 51.7
Private for-profit 2-year or more        5,970          86.3 / 90.2    59.3 / 65.0    85.2 / 85.4    68.6 / 79.2    50.8 / 45.4

Student type
Total undergraduate                      79,850         90.0 / 91.5    67.1 / 69.0    66.4 / 53.8    38.7 / 33.0    36.3 / —
Potential FTB                            35,510         89.7 / 90.5    70.4 / 76.6    72.1 / 59.0    38.9 / 31.3    40.6 / 30.9
Other undergraduates                     44,340         90.2 / 92.0    64.5 / 65.4    61.9 / 51.3    38.5 / 33.8    32.9 / 24.3
Graduate/first-professional              10,890         91.6 / 93.0    78.3 / 74.5    35.5 / 43.0    30.2 / 39.0    0.4(7) / 0.7(7)

(1) Percent of study respondents who met the criteria for qualification as a computer-assisted data entry (CADE) completion.

(2) Percent of study respondents who met the criteria for qualification as a student interview completion.

(3) Percent of study respondents who matched to CPS, which contains federal aid application (FAFSA) data.

(4) Percent of study respondents who matched to the National Student Loan Data System (NSLDS) for loans and Pell Grants during the 2003–04 academic year.

(5) Both institutional characteristics and student classifications were verified (where possible) to correct classification errors on the sample frame.

(6) A responding student is defined as any eligible student for whom sufficient data were obtained from one or more sources, including the student interview, institutional records, and the Department of Education's Central Processing System (CPS).

(7) The small percentage of matched graduate and first-professional study respondents were undergraduates at some time during the year and as such were eligible for this type of aid during the year.

NOTE: Detail may not sum to totals because of rounding. — Not available. FTB = first-time beginner.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Postsecondary Student Aid Study (NPSAS:04).

9. Does receipt of data from another source impact nonresponse follow-up of students?

Data receipt from a source other than the student interview does not impact nonresponse follow-up of students; it would be very difficult and costly to monitor data collection at that level, given the simultaneous data collection efforts. However, contact information collected from all sources is used to help locate and contact the student.

Data elements from multiple sources are useful for creating derived variables and for use in imputation and nonresponse bias analysis. For instance, when data are available from multiple sources, decision rules can be used to determine the best source for a particular data element.

Multiple data sources are also useful for nonresponse bias analysis of items that are known for both respondents and nonrespondents. Furthermore, while there is some overlap across sources for some data elements, most data items are unique to a particular data source – that is, they are only obtained from the student interview or CADE and not from other sources. For these reasons, we still attempt to complete student interviews even with students for whom we have the other data sources.
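A minimal sketch of such a source-priority decision rule, with an assumed ordering and illustrative source names (the actual rules are element-specific and more nuanced):

```python
# Assumed priority order for a hypothetical data element.
SOURCE_PRIORITY = ["cps", "cade", "interview"]

def best_value(element, records):
    """Return the value of `element` from the highest-priority source that
    reported it, along with the source used. `records` maps source name to
    a dict of that source's reported values."""
    for source in SOURCE_PRIORITY:
        value = records.get(source, {}).get(element)
        if value is not None:
            return value, source
    return None, None

# Example: income reported by both CADE and the interview resolves to CADE.
print(best_value("income", {"cade": {"income": 32000},
                            "interview": {"income": 30000}}))
```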

10. Please provide more information on the types of institutions that tend to be nonrespondents. How does this pattern affect the representativeness of the student sample? What nonresponse bias analysis is planned?

Our experience in NPSAS:04 showed that public less-than-2-year and private not-for-profit 4-year institutions had the lowest institution participation rates. Table 4 shows the NPSAS:04 unweighted and weighted participation rates by institutional level, control, and type.



Table 4. Numbers of NPSAS:04 sampled, eligible, and participating institutions and enrollment list participation rates, by institutional characteristics: national sample

                                            Sampled        Eligible           Institutions providing lists(1)
Institutional characteristics(2)            institutions   institutions(3)    Number   Unweighted %   Weighted %
All institutions                            1,670          1,630              1,360    83.5           80.0

Institutional level
Less-than-2-year                            260            250                200      82.1           80.8
2-year                                      490            480                410      85.4           78.0
4-year non-doctorate-granting               460            460                380      83.3           74.6
4-year doctorate-granting                   450            450                370      82.4           85.6

Institutional control
Public                                      810            800                680      84.9           79.6
Private not-for-profit                      570            560                450      81.2           79.8
Private for-profit                          290            270                230      84.2           86.7

Type of institution
Public less-than-2-year                     70             60                 50       76.6           74.3
Public 2-year                               380            380                320      85.4           77.6
Public 4-year non-doctorate-granting        130            130                110      85.1           70.3
Public 4-year doctorate-granting            230            230                200      86.3           87.1
Private not-for-profit less-than-4-year     70             70                 70       89.0           92.6
Private not-for-profit 4-year non-doctorate-granting
                                            280            270                220      81.9           78.1
Private not-for-profit 4-year doctorate-granting
                                            220            220                170      77.7           80.8
Private for-profit less-than-2-year         170            160                140      84.0           82.3
Private for-profit 2-year or more           110            110                90       84.4           88.2

(1) Percents are based on the eligible institutions within the row under consideration.

(2) Institutional characteristics are based on data from the sampling frame, which was formed from the 2000–01 and 2002–03 Integrated Postsecondary Education Data System (IPEDS).

(3) Among the 30 ineligible institutions, 10 closed after the sampling frame was defined, and 10 failed to meet one or more of the criteria for institutional NPSAS eligibility. The remainder were treated as merged institutions because two or more campuses were included on one combined student list.

NOTE: Detail may not sum to totals because of rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Postsecondary Student Aid Study (NPSAS:04).

To prevent institution nonresponse from affecting the representativeness of the student sample, institution weights will be computed and then incorporated into the student weights. These institution weights will be adjusted for nonresponse and poststratified to IPEDS totals. Institution nonresponse bias analysis will be performed before and after weight adjustments for any institution sector with a response rate less than 85 percent (per NCES statistical standards). Likewise, student nonresponse bias analysis will be performed before and after weight adjustments for any institution sector with a student study response rate less than 85 percent. If significant institution or student bias remains after weight adjustments, consideration will be given to recomputing the weights in a way that better reduces bias.
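A minimal sketch of the two weighting steps described above, assuming one weighting cell per sector and illustrative field names (the production weights use more refined cells and additional adjustments):

```python
def nonresponse_adjust(institutions):
    """Within each sector (used here as the weighting cell), inflate the base
    weights of responding institutions to account for eligible nonrespondents."""
    cells = {}
    for inst in institutions:
        cells.setdefault(inst["sector"], []).append(inst)
    for cell in cells.values():
        eligible_wt = sum(i["base_weight"] for i in cell)
        respondent_wt = sum(i["base_weight"] for i in cell if i["responded"])
        factor = eligible_wt / respondent_wt if respondent_wt else 0.0
        for i in cell:
            i["adj_weight"] = i["base_weight"] * factor if i["responded"] else 0.0
    return institutions

def poststratify(institutions, ipeds_totals):
    """Ratio-adjust each sector's weights so they sum to the known IPEDS total."""
    for sector, target in ipeds_totals.items():
        cell = [i for i in institutions if i["sector"] == sector and i["responded"]]
        current = sum(i["adj_weight"] for i in cell)
        if current:
            for i in cell:
                i["adj_weight"] *= target / current
    return institutions
```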

11. During the CADE process, will institutions be aware of the fact that they are being asked for last names and SSNs only of aid applicants for whom there was a non-match to CPS? Does a non-match have any particular administrative meaning (e.g., that the school's records for a particular student may be inaccurate)?

Currently, the CADE process involves asking institutions to provide or confirm the name and SSN of all sampled students regardless of whether they were matched successfully to CPS. In dataCADE, we ask for this information for all students because the data file specifications are generic, not customized by student or institution. In webCADE, this information is collected or confirmed in the Student Characteristics section of the instrument. If the institution's staff provided name and SSN on the student enrollment list, then the information is displayed and the institution coordinator is asked to confirm it.

After revisiting this procedure in response to this question, we have decided to ask institutions to confirm or provide a student's SSN in webCADE only if the student was not matched successfully to CPS. The CPS match rate in NPSAS:04 was approximately 60 percent, so we expect institution coordinators will need to confirm or provide student SSNs in webCADE for less than half of the students under this approach.

Institutions that opt to complete webCADE will be informed that they are being asked for SSNs only of students who did not match to CPS, even when the record data indicate that those students received federal aid. We will include this information in the webCADE User's Guide and in the data elements available on the institution website, as well as in the Frequently Asked Questions documents that are sent to institutions in the webCADE request packet and provided on the NPSAS institution website. Please note that in dataCADE we will continue to request SSNs for all students.

A non-match does not necessarily have an administrative meaning for the institution; a non-match can result from one or more of several reasons. The most likely reason is that the student did not apply for federal aid. Students for whom we have valid match criteria (SSN and the first two letters of the last name) who do not match to CPS are assumed to be non-applicants. An analysis presented in the BPS:04/06 methodology report (forthcoming) evaluated the match rate to CPS among federal aid recipients and found that about 97 percent of students known to have received federal aid (either by the presence of a record in the NSLDS file or by information abstracted from each sample member's institution record as part of NPSAS:04 CADE) matched to records in the CPS.

Another possible reason for a non-match is that the student's last name or SSN was missing from the student enrollment list, or that there was a data-entry error. There may also have been a last name change between the time that the student applied for federal aid and the time when the student enrollment list was provided. While such mismatches are rare, we minimize this possibility by making a second match attempt if a new last name or SSN is obtained in CADE.
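A minimal sketch of the matching logic described above, with assumed data structures (a dict keyed by SSN plus the first two letters of the last name); the actual matching system is more involved.

```python
def cps_match_key(ssn, last_name):
    """Build the match criteria described above: a valid 9-digit SSN plus the
    first two letters of the last name; returns None if either is unusable."""
    digits = "".join(ch for ch in (ssn or "") if ch.isdigit())
    name = (last_name or "").strip().upper()
    if len(digits) != 9 or len(name) < 2:
        return None
    return (digits, name[:2])

def match_to_cps(student, cps_index, cade_updates=None):
    """First match attempt from the enrollment list values, then a second
    attempt if CADE supplied a corrected last name or SSN."""
    key = cps_match_key(student.get("ssn"), student.get("last_name"))
    if key and key in cps_index:
        return cps_index[key]
    if cade_updates:
        key = cps_match_key(cade_updates.get("ssn", student.get("ssn")),
                            cade_updates.get("last_name", student.get("last_name")))
        if key and key in cps_index:
            return cps_index[key]
    return None
```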

12. How many locating companies other than Accurint will be used in the student locating operations? With whom is the Accurint (and other) contract/purchase order/subscription? What security arrangements are in place with those companies? Has NCES confirmed that the credit database companies in particular are not retaining/using the search information?

RTI is planning to use four locating vendors for intensive tracing activities on NPSAS:08.

  1. Accurint,

  2. FastData,

  3. Experian, and

  4. TransUnion

Each of these companies is regularly used by RTI for locating information across a variety of studies. RTI maintains signed contractual agreements with each of the vendors that restrict RTI's use of the information to locating research subjects and that assure the confidentiality of the information collected.

IT system security includes access restricted to unique, password-protected individual account logins. Access is further restricted to users at RTI-defined IP origination sites.


DATE: December 20, 2007

SUBJECT: Responses to 2nd set of questions (ICR 200709-1850-006)


1. There is interest in examining the relationship between the TEACH 2008 grants and students preparing to teach. What items do you propose to use to accomplish this?


We were asked to speak with OPE regarding the necessary data and respondent group to examine this issue relating to the TEACH 2008 grant program.  We spoke with David Bergeron of OPE, and he recommended that we include items in the student interview that ask all students in 2- and 4-year postsecondary institutions whether they plan to prepare to teach.  Thus, we propose to ask the non-B&B students in the NPSAS:08 sample (including those in less-than-2-year institutions) the following: 


A) For the non-B&B-eligible students, we would ask:


Do you plan on becoming a teacher at the K-12 (Kindergarten-grade 12) level? (If you are currently teaching or have taught, please answer this question about your plans to continue teaching.)

1  Definitely yes

2  Probably yes
3  Probably no
4  Definitely no


If answer = YES (1 or 2), then ask:


Are you currently taking or planning to take any course(s) at a college or university that would prepare you to teach at the K-12 (Kindergarten-12th grade) level?


1 Yes
2 No


B) For B&B-eligible students, we will ask the following questions (as originally proposed):


Have you ever been employed as a K-12 teacher at a public, private, or parochial school?

1 Yes
2 No


If answer = NO (2), then ask:

Are you currently considering teaching at the K-12 level at a public, private, or parochial school?

1 Yes
2 No


For the B&B students who have taught, items in the interview collect additional detail on their teaching experience; for those who are considering teaching, we ask additional questions about what they have done to prepare for teaching.


During data file preparation, we will construct a derived variable that combines answers for both sets of respondents (B&B and all others) to indicate interest in teaching (a code sketch follows the response categories):


Have you ever been employed, or do you plan to teach, at the K-12 level?

1 Yes, I have been employed at the K-12 level

2 Yes, I am currently considering teaching, or I definitely or probably will teach at the K-12 level

3 No, I am not currently considering teaching, or I definitely or probably won't teach at the K-12 level
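A minimal sketch of the derived-variable logic, assuming illustrative item names and that items not administered to a respondent are None:

```python
def teaching_interest(rec):
    """Combine the B&B and non-B&B teaching items into the single derived
    variable described above."""
    if rec.get("ever_taught_k12") == "Yes":                # B&B item
        return 1
    if rec.get("considering_teaching") == "Yes":           # B&B item
        return 2
    if rec.get("plan_to_teach") in ("Definitely yes", "Probably yes"):  # non-B&B item
        return 2
    return 3
```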


2.  Please provide information on the number of sampled institutions receiving incentive payments and the range of such payments. 


Detailed records of NPSAS:04 institution reimbursements from 2003-2004 are no longer available; however, RTI has compiled estimates of the NPSAS:04 reimbursements from our project records.


Approximately 24% of the participating institutions requested reimbursement for providing student record data. Of these, the average reimbursement was approximately $350, with a range of $300 to $1,200 per institution.


The average of $350 is consistent with the estimated cost to each institution for participation in NPSAS:08 as indicated in Table 3 of the previously submitted OMB package.

 

3.  Please respond to concerns about the use of data that NCES shares with tracing / locating companies.  Specifically, has NCES confirmed that the credit database companies in particular are not retaining / using the search information?


Tracing sample members is an essential component of conducting a successful survey and has proven to be particularly important in past NPSAS studies.


RTI has corporate agreements with vendors for tracing and locating activities. Vendors have been selected based on cost, results of test samples, and historic success rates with each. Vendors do not collect, store, or reuse any of the source data provided by RTI. Privacy statements from tracing/locating vendors are attached in “Q3_Vendor_Info.doc.”


For all tracing and locating activities, access to subject data is limited to staff who have been approved by the project and have an acceptable need to access the data. For interactive searches, no files are transferred from RTI to vendors; rather, approved RTI staff are given access to vendor databases where search criteria are entered and matches are returned.


For batch searches, the information (names, addresses, telephone numbers, and case IDs) transferred to external tracing sources for locating a sample member is transmitted electronically via an access-controlled FTP server using a password-protected login. Case identifiers in these files will be different from the case identifiers on the final data files produced for release.
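One simple way to keep vendor-facing case IDs distinct from both internal and release identifiers is a random crosswalk held only by the contractor; the sketch below is an illustration of that idea, not the project's actual procedure.

```python
import secrets

def vendor_case_ids(internal_ids):
    """Build a crosswalk from random vendor-facing case IDs to internal IDs;
    the crosswalk never leaves the contractor, and the vendor IDs carry no
    embedded meaning."""
    crosswalk = {}
    for internal_id in internal_ids:
        vendor_id = secrets.token_hex(8)   # 16-character random identifier
        while vendor_id in crosswalk:
            vendor_id = secrets.token_hex(8)
        crosswalk[vendor_id] = internal_id
    return crosswalk
```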


Our existing contracts with these vendors do not specifically address these security measures in sufficient detail. We are currently in the process of obtaining updated contracts with these tracing/locating vendors to specify requirements for data usage, security, storage, and disposal. If desired, the relevant contractual language regarding security requirements may be provided to NCES/OMB as soon as it becomes available.


4.  Please provide documentation on the precision for estimates of various subgroups in the NPSAS:08 sample. 


The attached document, “Q4_Domainsandoutcomes.doc,” describes the methodology used to determine the necessary sample sizes to achieve the desired precision for estimates of various subgroups in the sample. 


The document “Q4_Domainsandoutcomes.xls” lists these various subgroups and estimates.


The cost/variance optimization process was not conducted for state estimates in NPSAS:08 as it had been for NPSAS:04. In NPSAS:04, the actual sample yield was not sufficient in many of the states, especially the smaller states. Therefore, for NPSAS:08, we set sample sizes based on our NPSAS:04 experience. Specifically, as stated in section B.2.a of the OMB package:


We estimate that we will need approximately 1,200 respondents per state in the 4-year and for-profit sectors and 2,000 respondents in the public 2-year sector in order to yield a sufficient number of full-time, dependent, low-income undergraduates—the subset of students that is of particular relevance for the study of postsecondary access.
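The attached document describes the actual precision methodology; purely as a generic illustration of a precision-driven sample size calculation, the sketch below computes the respondents needed to estimate a proportion to a given margin of error under an assumed design effect. The inputs are hypothetical.

```python
from math import ceil

def required_n(p, moe, deff=1.0, z=1.96):
    """Sample size needed to estimate a proportion p to within +/- moe at 95
    percent confidence, inflated by a design effect for the complex sample."""
    n_srs = (z ** 2) * p * (1 - p) / (moe ** 2)
    return ceil(n_srs * deff)

# Illustrative only: a 50 percent characteristic estimated within +/- 3.5
# percentage points under an assumed design effect of 1.5 requires ~1,176
# cases, on the order of the 1,200 respondents per state cited above.
print(required_n(0.5, 0.035, deff=1.5))
```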


5. Please delete from NPSAS:08 materials the references to NESA and the E-Government Act. Resubmit the revised materials.


In looking at the student brochure for NPSAS:08, we could not find any reference to either the National Education Statistics Act (NESA) of 1994 or the E-Government Act of 2002. Therefore, we do not believe any changes to the student brochure are needed.


We did find reference to NESA and the E-Government Act in the NPSAS:08 institution brochure, which was part of the institution contacting material that had already been approved by OMB in fall 2006. These brochures were sent to institutions in September 2007.


The institution website is already live, but we can make modifications to it. The student website is currently in development. As directed, we will remove references to NESA and the E-Government Act from the institution and student websites.

 

6.  Please provide information on how students participating in NPSAS:08 will be given "informed consent" for participation in B&B.


Appropriate language for obtaining informed consent for participation in the B&B longitudinal study is included in the initial screens of the interview for both web and telephone administration. RTI’s IRB has reviewed and approved the attached wording.

See attachment, “Q_6_Consent_Wording.doc.”


7.  Please provide documentation on experiments of incentives in the BPS:04/06 study.


The attached document, “Q_7_BPS06_FT_Exp_Results.doc,” describes results of the BPS:04/06 field test incentive experiments.


Results of the B&B:93/2003 field test incentive experiment are provided in the attached document, “Q_7_BB03_FT_Exp_Results.doc.”


8. Incentive amounts are not actually listed in Part A, item 9. There is a discussion of the experiments in Part B, 4. OMB understands that the intention is to offer $30 (max) to students if they are early responders or if they are nonresponders who convert late in the process. Can you provide a clear explanation of this in Part A, 9 as part of a revised package?


The section below serves to replace Section A.9 in the OMB package.


Paying incentives is expected (a) to encourage sample members to respond early, primarily via self-administration on the Web, and (b) to encourage nonresponding sample members to participate in the study.


Students will be offered an incentive of $30 for interview completions during the early response period and during the nonresponse conversion phase.


In the NPSAS:08 field test, we tested several specialized plans for making sample members aware of the study and the incentive offers and for improving response rates. The tests included evaluations of (a) contacting efforts using United States Postal Service Priority Mail, (b) prompting calls, and (c) the use of prepaid incentives to students during the nonresponse conversion phase. The results of these tests are described in Section B.4.


The use of incentives provides significant advantages to the government in terms of increased overall response rates and timely data collection. In addition, the use of incentives can reduce data collection costs.




