NPSAS 2012 Field Test Student Data Part A


National Postsecondary Student Aid Study

OMB: 1850-0666










2011-12 National Postsecondary Student Aid Study (NPSAS:12)


Student Interview and Student Records





Supporting Statement Part A

Request for OMB Review

(OMB # 1850-0666 v.8)







Submitted by

National Center for Education Statistics

U.S. Department of Education








January 17, 2010

Contents

A. Justification

1. Circumstances Making Collection of Information Necessary

a. Purpose of this Submission

b. Legislative Authorization

c. Prior NPSAS Studies

d. Prior and Related Studies

e. Study Design for NPSAS:12

2. Purposes and Uses of the Data

3. Use of Information Technology

4. Efforts to Identify Duplication

5. Method Used to Minimize Burden on Small Businesses

6. Frequency of Data Collection

7. Special Circumstances of Data Collection

8. Consultants outside the Agency

9. Provision of Payments or Gifts to Respondents

10. Assurance of Confidentiality

11. Sensitive Questions

12. Estimates of Response Burden

13. Estimates of Cost

14. Costs to Federal Government

15. Reasons for Changes in Response Burden and Costs

16. Publication Plans and Time Schedule

17. Approval to Not Display Expiration Date for OMB Approval

18. Exceptions to Certification for Paperwork Reduction Act Submissions

B. Collection of Information Employing Statistical Methods

1. Respondent Universe

2. Statistical Methodology

3. Methods for Maximizing Response Rates

a. Collection of Data from Institutional Records

b. Student Survey: Self-Administered Web and CATI

4. Tests of Procedures and Methods

a. Experiment #1: Increasing Survey Participation Using Informational Video

b. Experiment #2: Response Propensity Approach

c. Experimental Design

5. Reviewing Statisticians and Individuals Responsible for Designing and Conducting the Study

6. Other Contractors' Staff Responsible for Conducting the Study

C. Overview of Analysis Topics and Survey Items

D. References

Appendixes

A. Studies Addressing Issues Relevant to NCES' Postsecondary Longitudinal and Sample Surveys Studies Program

B. Linkages to Administrative Data Sources

C. Technical Review Panel Contact List

D. Data Security Language for Vendor Contracts

E. Confidentiality

F. Letters and Contacting Materials

G. Endorsing Associations for NPSAS:12

H. NPSAS:12 Student Interview

I. NPSAS:12 Student Re-interview

J. NPSAS:12 Student Records Data Elements Table

K. NPSAS:08 User's Manual for Web-Based Data Entry Interface

List of Tables

1. Chronology of NPSAS and its longitudinal components

2. Maximum estimated burden on respondents

3. Maximum estimated costs to respondents for the NPSAS:12 field test and full-scale implementations

4. Individual and total costs to the National Center for Education Statistics (NCES) for the NPSAS:12 field test and full-scale implementations

5. Contract costs for NPSAS:12

6. Operational schedule for NPSAS:12

7. NPSAS:12 field test expected student sample sizes and yield

8. NPSAS:12 full-scale preliminary student sample sizes and yield

9. Weighted false positive rate observed in FTB identification, by sector: NPSAS:04

10. Detectable differences for field test experiment hypotheses

List of Figures

1. Candidate Variables for Response Propensity Modeling

2. NPSAS:12 field test experiment schedule of interventions

  2011-12 National Postsecondary Student Aid Study (NPSAS:12)

  A. Justification

    1. Circumstances Making Collection of Information Necessary

      a. Purpose of this Submission

  1. The National Center for Education Statistics (NCES), within the U.S. Department of Education (ED), is requesting approval to conduct student interviews (including a re-interview of about 10 percent of respondents), collect student records, and perform file matching for the 2011-12 National Postsecondary Student Aid Study (NPSAS:12), a nationally representative study of how students and their families finance education beyond high school. A separate request for review of NPSAS:12 materials and procedures for institution contacting and enrollment list collection was approved by OMB in July 2010 (OMB# 1850-0666 v.7). This study is being conducted by RTI International and its primary subcontractor, MPR Associates, under NCES Contract Number ED-IES-09-C-0039. Other subcontractors include Branch Associates, Kforce Government Solutions, Inc. (KGS), Research Support Services, Millennium Services 2000+, Inc., and consultants Dr. Cynthia Decker and Ms. Andrea Sykes.

  2. As with previous NPSAS submissions, we are requesting clearance for data elements and procedures. Following a field test study in 2011, NCES will provide the Office of Management and Budget (OMB) with a memorandum summarizing any changes planned for the full-scale data collection and a revised OMB package. We anticipate only minimal changes between the field test and the full-scale implementation, and therefore we are seeking a waiver of the 60-day Federal Register Notice for both components (institution and student) of the full-scale data collection submission planned for 2011.

  3. NPSAS was first implemented by NCES during the 1986–87 academic year to meet the need for national-level data about significant financial aid issues. Since 1987, NPSAS has been fielded every 3 to 4 years, most recently conducted during the 2007–08 academic year. This implementation is the eighth in the series and will be conducted during the 2011–12 academic year. NPSAS:12 also will serve as the base year study for the Beginning Postsecondary Students Longitudinal Study (BPS), a study of first-time postsecondary students that will focus on issues of persistence, degree attainment, and employment outcomes.

  4. Previous studies related to or based on data from NPSAS or its longitudinal spin-offs are listed in appendix A. A description of the security procedures in place for the linkages to administrative data sources is provided in appendix B. Appendix C lists the study’s Technical Review Panel (TRP). Data security language for vendor contracts is shown in appendix D. A sample of the confidentiality pledge and affidavit of nondisclosure completed by all project staff having access to individually identifying data is provided in appendix E. Letters to institutions and students selected for participation in the NPSAS study are found in appendix F. A list of endorsing institutions and associations supporting NPSAS:12 is provided in appendix G. Appendix H presents the NPSAS:12 student interview data collection instrument, and appendix I contains the student re-interview data collection instrument. Appendix J is a table of the data elements collected from student records, and appendix K contains the user’s manual for institutions using RTI’s web-based data entry interface to submit student record data.

      b. Legislative Authorization

  1. NPSAS:12 is conducted by NCES within the Institute of Education Sciences (IES) in close consultation with other offices and organizations within and outside the U.S. Department of Education (ED). NPSAS is authorized under the Higher Education Opportunity Act of 2008, 20 U.S.C. § 1015:

  2. (A)(k) Student aid recipient survey

  3. (1) Survey required

  4. The Secretary, acting through the Commissioner for Education Statistics, shall conduct, on a State-by-State basis, a survey of recipients of Federal student financial aid under Title IV—

  5. (A) to identify the population of students receiving such Federal student financial aid;

  6. (B) to describe the income distribution and other socioeconomic characteristics of recipients of such Federal student financial aid;

  7. (C) to describe the combinations of aid from Federal, State, and private sources received by such recipients from all income categories;

  8. (D) to describe the—

  9. (i) debt burden of such loan recipients, and their capacity to repay their education debts; and

  10. (ii) the impact of such debt burden on the recipients’ course of study and post-graduation plans;

  11. (E) to describe the impact of the cost of attendance of postsecondary education in the determination by students of what institution of higher education to attend; and

  12. (F) to describe how the costs of textbooks and other instructional materials affect the costs of postsecondary education for students.

  13. (2) Frequency

  14. The survey shall be conducted on a regular cycle and not less often than once every four years.

  15. (3) Survey design

  16. The survey shall be representative of students from all types of institutions, including full-time and part-time students, undergraduate, graduate, and professional students, and current and former students.

  17. (4) Dissemination

  18. The Commissioner for Education Statistics shall disseminate to the public, in printed and electronic form, the information resulting from the survey.

      c. Prior NPSAS Studies

  1. As noted above, NPSAS:12 will be the eighth NPSAS in a series dating back to 1986–87. The first in the series, the 1987 National Postsecondary Student Aid Study (NPSAS:87), based on a sample of students enrolled in the fall term of 1986, is not completely comparable to later studies. Beginning in 1989–90, NPSAS surveys sampled students enrolled at any time during a full academic year, so that students enrolled only during the summer or spring terms, as well as those who began at any time in institutions (primarily vocational) not on a traditional calendar system, were included. Additional detailed information about each of the prior NPSAS studies and related longitudinal studies conducted by NCES can be found at http://nces.ed.gov/surveys/npsas.

  2. Since the inception of NPSAS, the data collection techniques and sources used for these studies have improved and expanded over time. NPSAS:90 was based on institutional data transcribed on paper forms, computer-assisted telephone interviews (CATI), and only one external data source, the Pell Grant payment file. NPSAS:93 introduced the computer-assisted data entry system, allowing institutions to enter data from student records directly into electronic files, which facilitated matching student records to the federal student loan and Pell Grant files. NPSAS:96 made greater use of electronic data files to supplement the survey information from the data entry system and CATI. In addition to the Pell Grant files, student records were matched with the electronic Institutional Student Information Records (ISIR) of the Central Processing System (CPS) for federal financial aid applications, the federal student loan history records of the National Student Loan Data System (NSLDS), and the files of the College Board and ACT for student scores on the SAT and ACT tests. NPSAS:04 introduced a web-based student interview that allowed both self-administration and interviewer administration via CATI. This multi-mode approach to data collection has increased flexibility and convenience for study participants and reduced burden. NPSAS:04 also saw more institutions submitting student-level data electronically for entire school systems, which reduced the burden on individual campuses and increased the efficiency of data submission. NPSAS:08 continued the multi-mode data collection approach and added the National Student Clearinghouse (NSC) as an administrative data source.

  3. NPSAS is the only periodic, nationally representative survey of student financial aid. There is no other single national database containing student-level records for students receiving aid from all of the numerous and disparate programs funded by the federal government, the states, postsecondary institutions, employers, and private organizations. NPSAS:12 data will allow for the continued evaluation of trends regarding financial aid and postsecondary enrollment. This information is critical to the development of government policy regarding higher education. The NPSAS studies reflect the changes made in government guidelines for financial aid eligibility and availability, and provide a good measure of the effect of those changes.

  4. The NPSAS studies also inform policymakers about what is working and what needs changing in the future. A central focus of all of the NPSAS studies is the effect of the federal financial aid programs. Major changes in federal financial aid policy are usually made every 5 to 6 years through Reauthorization of the Higher Education Act (HEA), the legislation establishing the basic rules for the federal grant, loan, and work-study programs, including eligibility criteria and need analysis requirements. The federal financial aid described in the NPSAS:90 and NPSAS:93 studies was awarded under the policies set in the 1986 Reauthorization of HEA. The Reauthorization of 1992 made many substantial changes. It established a single need analysis formula for Pell Grants and the other need-based federal programs, eliminated home equity from consideration in need analysis, created an unsubsidized student loan program for dependent students which has no need requirements, and increased borrowing limits in the federal loan programs. The results of the NPSAS:96 survey reflected these changes. For example, the proportion of middle-income students with federal loans increased substantially at four-year colleges, and annual student loan and cumulative debt amounts increased at all income levels and at all types of institutions.

  5. NPSAS:2000 and NPSAS:04 reflected the 1998 reauthorization legislation, which made relatively few changes to the federal financial aid programs. The changes to need analysis were minor. The student income protection allowance increased somewhat, requiring a smaller contribution from prior-year earnings. Student loan limits were kept at the same levels that had been in effect since 1993, although interest rates were lower. The Pell Grant maximum was increased to $4,050 for 2003–04. Since the basic financial aid programs and policies had not changed since NPSAS:96, the results of the NPSAS:2000 and NPSAS:04 surveys created a clearer picture of the underlying trends in the effects of tuition increases on various categories of students. In addition to documenting the continuing increases in college prices, these two surveys showed the parallel increases in grant awards and student loan borrowing. In 2003–04, three-fourths of all full-time undergraduates were receiving financial aid, and the average amount received was almost $10,000. The percentage of full-time undergraduates receiving grant aid (62 percent) continued to be greater than the percentage with student loans (50 percent) in 2003–04. Cumulative student loan debt continued to increase: among graduating seniors at private not-for-profit institutions, nearly three-fourths graduated with student loan debt, averaging $22,000.

  6. NPSAS:04 was innovative in a number of ways. The sample size was substantially increased to yield about 90,000 study members (compared with 60,000 in NPSAS:2000). For the first time, the NPSAS sample was designed to provide representative state-level estimates of undergraduates in 12 selected states, which has provided data for comparisons of differences in college prices and financial aid programs among states. As noted above, in addition to the usual telephone interviews, for the first time the NPSAS:04 student interview was offered as a self-administered web survey. Also for the first time, the NPSAS survey data were used to estimate the average amounts of the federal education tax benefits (Hope, Lifetime Learning, and Tuition and Fees Deductions) and their distribution among students. Nearly one-half of all undergraduates were found to benefit from one of these federal tax reductions.

  7. NPSAS:08 had an even further expanded sample size of more than 130,000 students and included state-representative undergraduate samples for four degree-granting institutional sectors in six states: California, Georgia, Illinois, Minnesota, New York, and Texas. It reflected the legislative changes of the Higher Education Reconciliation Act of 2005, which increased Stafford loan limits, expanded PLUS loans to graduate students, and added the Academic Competitiveness Grant (ACG) and National SMART Grant programs. NPSAS:08 contained a representative sample of likely grant recipients to study the impact of these new federal grants. For the first time, the sample weights in NPSAS:08 were adjusted to the sum of net Stafford loan disbursements instead of gross loan commitments. This provided a more accurate estimate of Stafford borrowing but required that all previous surveys based on gross Stafford loan commitments (NPSAS:96, NPSAS:2000, and NPSAS:04) also have their weights revised to net disbursements so they would be comparable. In addition to documenting the continuing increases in college prices, these surveys showed the parallel increases in grant awards and student loan borrowing. In 2007–08, some 80 percent of all full-time undergraduates were receiving financial aid, and the average amount received was $12,700. The percentage of full-time undergraduates receiving grant aid (64 percent) continued to be greater than the percentage with student loans (53 percent) in 2007–08.

      d. Prior and Related Studies

  1. Two longitudinal studies conducted as part of the Postsecondary Longitudinal and Sample Studies (PLSS) Program within the Postsecondary, Adult, and Career Education (PACE) Division of NCES were designed to address a variety of issues regarding higher education. Based on samples of students attending postsecondary education in a particular year regardless of age, each of these studies incorporates base year data from the cross-sectional NPSAS and extends it through longitudinal follow-up components focusing on first-time students and on recent college graduates in alternate NPSAS survey years: Beginning Postsecondary Students (BPS) and Baccalaureate and Beyond (B&B). The chronology of the previous administrations of the NPSAS study and its associated longitudinal components is presented in table 1.

  2. Table 1. Chronology of NPSAS and its longitudinal components

     Base year     First follow-up   Second follow-up   Third follow-up
     NPSAS:90      BPS:90/92         BPS:90/94          †
     NPSAS:93      B&B:93/94         B&B:93/97          B&B:93/03
     NPSAS:96      BPS:96/98         BPS:96/01          †
     NPSAS:2000    B&B:2000/01       †                  †
     NPSAS:04      BPS:04/06         BPS:04/09          †
     NPSAS:08      B&B:08/09         B&B:08/12          †
     NPSAS:12      BPS:12/14         BPS:12/17          †

  3. † Not applicable.

  4. NOTE: NPSAS = National Postsecondary Student Aid Study; BPS = Beginning Postsecondary Students; B&B = Baccalaureate and Beyond.

  5. The six major issues addressed in these PACE studies are:

  1. undergraduate access/choice of institution;

  2. persistence;

  3. progress/curriculum;

  4. attainment/outcome assessment;

  5. graduate/professional school access; and

  6. rates of return to individuals and society.

  1. Specific studies that use data from NPSAS, BPS, or B&B to explore some of these issues are listed in appendix A for reference.

      e. Study Design for NPSAS:12

  1. Data for NPSAS:12 will be collected from both postsecondary institutions and students. The target population includes all students enrolled in a sample of institutions in a given academic year (2010–11 for the field test and 2011–12 for the full-scale study). A stratified sample of students within the sampled institutions will be selected.
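The two-stage design described above (institutions sampled within strata, then students sampled within each sampled institution) can be sketched as follows. This is a minimal illustration only: the strata, institution names, roster sizes, and sample sizes are invented and do not reflect the actual NPSAS:12 sampling specifications.

```python
import random

# Hypothetical institution frame grouped into strata (e.g., by sector).
random.seed(42)
institutions = {
    "public-4yr":  [f"pub4-{i}" for i in range(20)],
    "private-4yr": [f"priv4-{i}" for i in range(15)],
    "public-2yr":  [f"pub2-{i}" for i in range(25)],
}
ROSTER_SIZE = 100  # placeholder enrollment-list length per institution

def two_stage_sample(strata, n_inst_per_stratum, n_students_per_inst):
    """Stage 1: draw institutions within each stratum.
    Stage 2: draw students from each sampled institution's enrollment list."""
    sample = {}
    for stratum, insts in strata.items():
        for inst in random.sample(insts, n_inst_per_stratum):
            roster = [f"{inst}-student-{s}" for s in range(ROSTER_SIZE)]
            sample[inst] = random.sample(roster, n_students_per_inst)
    return sample

sample = two_stage_sample(institutions, n_inst_per_stratum=5, n_students_per_inst=10)
```

In the real design, within-institution student samples are themselves stratified (e.g., by student level), and selection probabilities vary by stratum rather than being equal as in this sketch.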

  2. Institutions will be asked to provide information from student financial aid records and other institutional sources. Much of the required student financial aid data contained in institutional records is also available in the Central Processing System (CPS), which houses and processes data contained in the Free Application For Federal Student Aid (FAFSA) forms; these data will be obtained through file matching/downloading with this system. This process will reduce the data collection burden on sampled institutions. As in NPSAS:08, institutions will be asked to verify institutional characteristics and financial aid program participation and to provide enrollment lists for sampling purposes. Data from students will be collected via a self-administered survey on the Internet or through web-based CATI.

  3. Additional data for the NPSAS:12 student sample will be obtained from a variety of administrative data sources. These include queries of CPS, the National Student Loan Data System (NSLDS), the Pell Grant payment files, the National Student Clearinghouse (NSC), and the vendors of national undergraduate, graduate, and professional student admission tests for ACT and SAT scores. A description of the security procedures in place for the linkages to administrative data sources is provided in appendix B.

    2. Purposes and Uses of the Data

  1. The fundamental purpose of NPSAS is to create a research data set that brings together information about a variety of programs for a large sample of undergraduate, graduate, and first professional students. NPSAS provides the data for comprehensive descriptions of the undergraduate and graduate/first professional student populations in terms of their demographic characteristics, academic programs, types of institutions attended, attendance patterns, and employment. Demographic and enrollment data establish the appropriate context that allows research and policy analysts to address basic issues about postsecondary affordability and the effectiveness of the existing financial aid programs. These results are published in four statistical briefs with accompanying web tables: a profile of undergraduates, a report on undergraduate financing, a profile of graduate students, and a report on the financing of graduate studies. The financing reports describe the “sticker” price, the net price after grant aid, and the “out-of-pocket” price (reduced by both grant and loan aid).
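The three price measures used in the financing reports can be restated schematically as follows (with "student budget" denoting the total price of attendance; this is a summary of the definitions above, not an official formula):

```latex
\begin{align*}
\text{sticker price} &= \text{student budget (tuition, fees, and other costs of attendance)}\\
\text{net price} &= \text{student budget} - \text{grant aid}\\
\text{out-of-pocket price} &= \text{student budget} - \text{grant aid} - \text{loan aid}
\end{align*}
```

The out-of-pocket price is the smallest of the three because loans defer, rather than eliminate, a portion of the cost.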

  2. A second purpose of NPSAS is to gather base year data on a subset of students who become the sample for a longitudinal study. NPSAS:12 will establish the base year cohort for a Beginning Postsecondary Students Study of students who are just beginning their college education, with a follow-up survey 2 years later (BPS:12/14) and another follow-up in 2017. A section of the student interview will capture information about student knowledge, experiences, and perceptions of the costs and benefits of education to support analysis of student choices related to major, persistence, and completion.

  3. A third purpose of NPSAS – new in 2012 – is to provide a nationally representative sample that can be used to rigorously address fundamental research questions through experimental research methodologies. NCES plans to expand the use of NPSAS through collaboration with the National Center for Education Research (NCER). NPSAS provides a rich source of data that could potentially be used to support experimental research funded by NCER. To date, interested researchers have submitted two proposals to RTI, detailing their experimental design and analysis plans. A decision about funding of these proposals will be made in 2011. If proposals are funded, RTI will serve as a subcontractor to the grantee, will conduct any needed data collection and/or file-matching activities, and will deliver a restricted data file to NCES to provide to the grantee for analysis. RTI may also be responsible for sampling, programming an instrument and control system, data processing and cleaning, and weighting. For any NCER grants that are funded, we will submit documentation about data collection plans to OMB in the clearance package for full-scale student data collection.

      b. NPSAS:12 Research and Policy Issues

  5. Many of the important research questions remain the same across all of the NPSAS studies. Price increases, net price levels, remaining need after financial aid, and increases in student loan debt will continue to be central issues. The NPSAS:12 data will be used to address policy issues relating to the changes in federal financial aid programs resulting from the College Cost Reduction and Access Act of 2007 (CCRAA) and the Higher Education Opportunity Act of 2008 (HEOA). These legislative changes included increases to the maximum Pell award, reductions in the interest rate for subsidized Stafford loans, and greater consumer transparency regarding college tuition and lender disclosures about loan terms. There are other potentially far-reaching legislative changes on the horizon, such as the Student Aid and Fiscal Responsibility Act of 2009 (SAFRA), which may affect how the federal student loan program operates.

  6. Some of the primary research and policy issues to be addressed through the use of NPSAS:12 data will likely be:

  Student demographics:

  • What is the distribution of student enrollment among types of institutions by gender, race/ethnicity, age, dependency, and income?

  • What types of institutions are serving the largest proportions of low-income, non-traditional, and ethnic minority students?

  • What proportion of undergraduates are first generation college students, and what types of institutions are they attending?

  • What proportion of students are immigrants or children of immigrants, and what types of institutions are they attending?

  • How much are students with disabilities participating in postsecondary education?

  • What proportion of students enrolled in postsecondary education are veterans and what types of institutions do they attend?

  Academic preparation and programs:

  • What proportion of undergraduates enroll in college courses while still in high school?

  • What proportion of college students have taken remedial courses?

  • What types of students are enrolled in vocational certificate, associate’s, and bachelor’s degree programs, and what are their fields of study?

  • What is the extent of internet-based and other distance education, and what types of institutions and students are using it?

  • What are students’ primary purposes for enrolling in postsecondary education and their educational goals?

  Financial aid:

  • What proportion of students have financial aid need and what is the average amount of need by income?

  • What proportion of students receive Federal Pell grants and where do they attend college?

  • What proportion of students are receiving aid from states, institutions, employers, and private sources, and what are the average amounts received?

  • What proportion of students are receiving need-based or merit-based aid?

  • How does the amount and type of aid vary by dependency and income level?

  • What is the ratio of federal to non-federal aid at various types of institutions?

  • What is the ratio of grants to loans at various types of institutions?

  • What proportion of students receive veterans and other Department of Defense benefits?

  Price of attendance:

  • What are the differences in the average tuition and total price of attendance by type of institution and among students by dependency, income, and full-time or part-time attendance status?

  • What is the average net price of attendance (student budget minus financial aid) at various income levels at different types of institutions?

  Student borrowing:

  • What are the differences in the percentage borrowing and the average amounts borrowed through the federal student loan programs by institution type, attendance status, class level, and income?

  • What proportion of students borrow the maximum Stafford loan amounts?

  • What is the difference in the proportion of students receiving subsidized or unsubsidized Stafford loans by dependency and income level?

  • What is the relationship between the level of student debt and persistence in postsecondary education?

  • What is the average cumulative debt of students by class level, especially among graduating college seniors?

  • What proportion of students borrow private loans, in what amount, and how does this borrowing vary by institution type?

  Student employment:

  • What proportion of students engage in paid work while enrolled and what are the average hours per week they work?

  • What is the average amount earned from work while enrolled?

  Sources of funds:

  • What types of financial support are dependent students receiving from their parents?

  • What is the estimated proportion of students who might benefit from the federal education tax benefits (such as the Hope and Lifetime Learning tax credits) based upon family income, tuition paid, and grant aid received?

  1. Answers to these and other questions are vital if policymakers at the local, state, and national levels are to respond adequately to the changing environment of postsecondary education. As the publications listed in appendix A indicate, since inception, the NPSAS, BPS, and B&B series have resulted in numerous NCES publications addressing these issues. The data from these studies have also been used extensively to explore PACE program issues through the NCES Postsecondary Education Descriptive Analysis Report (PEDAR) series.

    3. Use of Information Technology

  1. To improve the efficiency of student data collection and virtually eliminate the need to burden the respondent with a re-contact for data retrieval, both the NPSAS:12 field test and full-scale studies will use web-based student interviewing. The modes of data collection will be web-based self-administered surveys and web-based CATI. The survey instrument displays questions for the respondent or interviewer in program-controlled sequences on a computer screen. Computer control of the survey administration and the monitoring of responses offer substantial improvements in data quality and data collection efficiency over standard paper-and-pencil surveys. The incidence of missing or inconsistent data is greatly reduced because questionnaire skip patterns are computer controlled. Moreover, invalid entries or entries inconsistent with previous responses are rejected by the computer and must be corrected by the respondent or interviewer during the interview.
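The computer-controlled routing and validation described above can be illustrated with a small sketch. The question names, skip rule, and value ranges here are hypothetical stand-ins, not the actual NPSAS:12 instrument logic.

```python
# Hypothetical sketch of program-controlled question sequencing (a skip
# pattern) and entry validation, as a web/CATI instrument would apply them.

def next_question(current, answers):
    """Skip pattern: only students who report borrowing are asked the amount."""
    if current == "borrowed_this_year":
        return "loan_amount" if answers["borrowed_this_year"] == "yes" else "work_hours"
    if current == "loan_amount":
        return "work_hours"
    return None  # end of this illustrative section

def validate(question, value):
    """Reject out-of-range or inconsistent entries at the point of entry."""
    if question == "borrowed_this_year":
        return value in ("yes", "no")
    if question == "loan_amount":
        return isinstance(value, (int, float)) and 0 < value <= 100_000
    return True

# A "no" answer routes past the loan-amount question entirely, so that
# item can never be left inconsistently missing or inconsistently filled.
assert next_question("borrowed_this_year", {"borrowed_this_year": "no"}) == "work_hours"
```

Because the routing is computed rather than left to the respondent, skip errors of the kind common on paper forms cannot occur.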

  2. The self-administered web-based student interview adds considerable flexibility to the interviewing process. The wording and presentation of subsequent questions can be tailored to reflect answers already received as well as preloaded information. Online help screens are also available to provide respondents with more in-depth explanations of questions and examples of the categories of answers listed. Perhaps the most important feature of the self-administered web-based student interview is that respondents can complete portions of the interview, save their responses, and return to finish the interview at a later time rather than being required to complete it in one session. These features reduce participant burden while ensuring that the most accurate data are collected.

    4. Efforts to Identify Duplication

  1. NCES has consulted with other federal offices, such as ED’s Office of Postsecondary Education (OPE), the Office of Planning, Evaluation and Policy Development (OPEPD), the Congressional Budget Office (CBO), and the Office of Management and Budget (OMB). Consultations with non-federal associations such as the American Council on Education (ACE), the National Association of Independent Colleges and Universities (NAICU), and the National Association of Student Financial Aid Administrators (NASFAA) confirm that the data to be collected through NPSAS are not available from any other source. These consultations provide methodological insights from the results of other financial aid investigations and ensure that the data collected through NPSAS meet the needs of the federal government and other relevant organizations.

5. Method Used to Minimize Burden on Small Businesses

1. The student survey for NPSAS:12 does not involve small businesses or entities. However, for-profit schools and other small public and private schools will be asked to provide enrollment lists and student records as part of NPSAS:12. To minimize burden on all participating institutions, NPSAS:12 will offer institutions a choice of several methods for submitting the requested data. Each institution may select the format that it finds most convenient and least burdensome. Available methods include: (1) uploading an electronic file to the project's secure website; (2) downloading an Excel workbook from the project's website, then uploading the completed file to the site; and (3) entering data through a web-based interface, which has recently been redesigned to allow users to enter data in the manner most convenient for them.

6. Frequency of Data Collection

  1. This cycle of NPSAS will take place 4 years after the last data collection. The rationale for conducting NPSAS periodically is based on the historical need for information on financial aid programs. The large-scale and rapid changes in federal policy concerning postsecondary student aid necessitate frequent studies. Eligibility restrictions change, size of grant and loan amounts fluctuate, and the balance between various aid options changes dramatically. Since these changes affect students’ ability to finance postsecondary education and the level of debt that students are accumulating, data collections every 3 to 5 years are necessary. A recurring study is essential to helping predict future costs for financial aid because loan programs create continued obligations for the federal government as long as the loans are being repaid.

  2. Repeated surveys, such as NPSAS, are also necessary because of the dynamic nature of the postsecondary environment. For example, for-profit institutions have recently assumed a much more prominent role than was the case in years past. Changes in private-sector lending, increases in tuition and fees, and changes in federal student aid policies (such as the recent increase in maximum Pell Grant awards) further highlight the need for periodic data collections. Effects of these changes on federal policy and postsecondary education participation create an opportunity, as well as a need, to monitor this rapidly changing situation on a regular basis.

7. Special Circumstances of Data Collection

  1. No special circumstances of data collection are anticipated.

8. Consultants outside the Agency

1. The 60-day Federal Register notice was published on October 25, 2010 (75 FR, No. 205, p. 65464). One public comment was received in response to this notice. The comment and the NPSAS response are included with the package materials.

  2. Recognizing the significance of NPSAS data collection, several strategies have been incorporated into the project work plan that allow for the critical review and acquisition of comments relating to project activities, interim and final products, and projected and actual outcomes. These strategies include consultations with persons and organizations both internal and external to NCES, ED, and the federal government.

  3. Previous NPSAS implementations have benefited from a standing federal review panel composed of staff from several offices in ED (the Office of Postsecondary Education [OPE] and the Office of Planning, Evaluation, and Policy Development [OPEPD]) and representatives of OMB and CBO. Members of this panel also belong to the Technical Review Panel (TRP) for NPSAS:12. The membership of the TRP (see appendix C) represents a broad spectrum of the postsecondary and financial aid communities. The non-federal members serve as expert reviewers on the technical aspects of the study design, data collection procedures, and instrument design, especially item content and format. The TRP reviewed the plans for study design and key topics during their July 2010 meeting.

9. Provision of Payments or Gifts to Respondents

1. In an effort to maximize response rates, the use of incentives is proposed for two purposes: to encourage early response using the self-administered web survey, and to limit nonresponse bias through refusal conversion. The Tests of Procedures and Methods section of this document (section B.4) discusses in detail specialized plans for improving weighted response rates and reducing nonresponse bias through response propensity modeling, which will identify, and provide special treatment for, sampled cases predicted to be less likely to respond. We propose to use the NPSAS:12 field test to conduct an experimental evaluation of this approach. Roughly half of the cases will be assigned to a low-propensity group and half to a high-propensity group based on the results of the modeling. We will use the median calculated response propensity from the NPSAS:04 modeling work, 0.6102, as our high/low cutoff in NPSAS:12; this value is consistent with the overall response rate. We will then randomly assign cases within propensity groups to either an experimental group or a control group. We will offer a higher incentive of $45 to low-propensity cases in the treatment group and $30 to low-propensity cases in the control group. Within the high-propensity group, cases will also be assigned to either an experimental group, receiving an incentive offer of $15, or a control group, receiving $30.
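The assignment scheme just described can be sketched as follows. This is an illustrative outline only, not the NPSAS:12 fielding system; the function name and randomization mechanics are assumptions, while the cutoff and dollar amounts come from the plan above.

```python
import random

# Illustrative sketch of the incentive experiment described above.
# Cases below the NPSAS:04 median modeled propensity (0.6102) form the
# low-propensity group; within each group, cases are randomized to a
# treatment or control incentive amount.
CUTOFF = 0.6102

def assign_incentive(propensity, rng=random):
    """Return (propensity_group, arm, incentive_dollars) for one sampled case."""
    group = "low" if propensity < CUTOFF else "high"
    arm = rng.choice(["treatment", "control"])
    if group == "low":
        incentive = 45 if arm == "treatment" else 30  # $45 vs. $30
    else:
        incentive = 15 if arm == "treatment" else 30  # $15 vs. $30
    return group, arm, incentive
```

A case with a modeled propensity of 0.42, for example, falls in the low-propensity group and is offered either $45 (treatment) or $30 (control).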

  2. In summary, if the propensity modeling is shown to successfully identify sample members who have high and low likelihood of response, a new incentive plan can be implemented for the full-scale data collection that maximizes the use of the project resources for data collection. Specifically, by motivating low propensity cases to respond with the strategic use of a higher incentive amount, data collection costs will be minimized since fewer resources will need to be invested to locate and interview the cases. At the same time, the low propensity cases will be more equally represented among the respondent groups, with the goal of reducing bias in key survey variables.

10. Assurance of Confidentiality

  1. NCES assures participating individuals and institutions that any data collected under NPSAS and related programs shall be in total conformity with NCES’s standards for protecting the confidentiality of identifiable information about individuals and adhere to the confidentiality provisions of the Education Sciences Reform Act (ESRA) of 2002 (20 U.S.C. § 9573).

  2. Data security and confidentiality protection procedures are in place to ensure that RTI and its subcontractors comply with all privacy requirements, including:

  3. The Statement of Work of this contract;

4. The Privacy Act of 1974, 5 U.S.C. § 552a (2009);

  5. The U.S. Department of Education Incident Handling Procedures (February 2009);

  6. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

7. The U.S. Department of Education ACS Directive OM:5-101, Contractor Employee Personnel Security Screenings;

8. The Family Educational Rights and Privacy Act of 1974 (FERPA), 20 U.S.C. § 1232g (2009);

  9. ESRA, 20 U.S.C. § 9573 (2009); and

10. Any new legislation that impacts the data collection under this contract.

  11. To ensure that confidentiality is appropriately maintained at all times, RTI requires that vendors who assist in locating and tracing sample members follow procedures to safeguard personally identifying information. RTI’s vendor contracts outline requirements for information security policies and assessments, security awareness training, physical and environmental security, monitoring, and access control. They also specify the means by which information may be transmitted between RTI and the contractor. Appendix D documents the data security language contained in vendor contracts.

  12. RTI will also comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance as well as IT security requirements in the Federal Information Security Management Act (FISMA), Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance.

  13. RTI will adhere to NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2002/std4_2.asp.

14. The NPSAS:12 Data Security plan for maintaining confidentiality includes notarized nondisclosure affidavits obtained from all personnel who will have access to individual identifiers (copies of the agreement and affidavit are provided in appendix E). Also implemented are personnel training regarding the meaning of confidentiality; controlled and protected access to computer files; built-in safeguards concerning status monitoring and receipt control systems; and a secure, staffed, in-house computing facility. The Data Security plan will detail guidelines and procedures for securing sensitive project data, including (but not limited to) physical/environmental protections, building access controls, system access controls, system login restrictions, user identification and authorization procedures, encryption, and project file storage/archiving/destruction. We do not anticipate receiving many enrollment lists via fax (none were received in NPSAS:08), but any that are received will be securely shredded upon study completion. Electronic enrollment lists will reside within an independent secure network and be deleted upon study completion.

  15. There are several security measures in place to protect data during file matching procedures. NCES has a secure data transfer system, which uses Secure Socket Layer (SSL) technology, allowing the transfer of encrypted data over the Internet. The NCES secure server will be used for all administrative data sources with the exception of the National Student Clearinghouse (NSC) which has its own secure FTP site. All data transfers will be encrypted using FIPS 140-2 validated encryption tools.

  16. Furthermore, the Department has established a policy regarding the personnel security screening requirements for all contractor employees and their subcontractors to secure the confidentiality of NPSAS respondents. The contractor must comply with these personnel security screening requirements throughout the life of the contract. The Department directive that contractors must comply with is OM:5-101, which was last updated on 1/29/08. There are several requirements that the contractor must meet for each employee working on the contract for 30 days or more. Among these requirements are that each person working on the contract must be assigned a position risk level. The risk levels are high, moderate, and low based upon the level of harm that a person in the position can cause to the Department’s interests. Each person working on the contract must complete the requirements for a “Contractor Security Screening.” Depending on the risk level assigned to each person’s position, a follow-up background investigation by the Department will occur. Materials related to these security features are provided in appendix E.

  17. Study notification materials sent to institutions will describe the voluntary nature of the NPSAS:12 survey and convey the extent to which study member identifiers and responses will be kept confidential. Similarly, the scripts to be read by telephone staff will be very specific in the assurances made to sample members and contacts. Contacting materials are presented in appendix F. The following confidentiality language is provided in the study brochure that is supplied to all sample members:

  18. The 2011-12 National Postsecondary Student Aid Study is conducted under the authority of the Higher Education Opportunity Act (HEOA) of 2008 (20 U.S.C. § 1015) and the Education Sciences Reform Act (ESRA) of 2002 (20 U.S.C. § 9512) which authorizes NCES to collect and disseminate information about education in the United States. Collection is most often done through surveys.

  19. NCES is required to follow strict procedures to protect the confidentiality of persons in the collection, reporting, and publication of data. All individually identifiable information supplied by individuals or institutions to a federal agency may be used only for statistical purposes and may not be disclosed or used in identifiable form for any other purpose, unless otherwise compelled by law (20 U.S.C. § 9573).

11. Sensitive Questions

  1. The student interview collects information about earnings, assets, and marital and dependency status. Regulations governing the administration of these questions require (a) clear documentation of the need for such information as it relates to the primary purpose of the study, and (b) provisions to clearly inform sample members of the voluntary nature of participation in the study, as well as assurances that their responses will be treated confidentially.

  2. Financial data related to earnings and assets, as well as marital and dependency status, are key items used in calculating need for financial aid, parental contributions, and financial aid awards. Consequently, the data elements are critical to the conduct of policy-related analyses and to the modeling and projection of the effects of federal program changes on students and on program costs. Several procedures have been implemented (see section A.10) to provide assurances to sample members about the voluntary nature of participation in the study as well as the confidential treatment of survey responses.

3. Early file matching activities, described in section B.2.b, will be an essential step in accurately identifying BPS cohort members. Past experience has shown that accurate identification of FTBs has been extremely difficult. A review of data from NPSAS:04 (the last NPSAS to serve as a base year for a BPS cohort) showed that approximately 22 percent of the false-positive cases would have been prevented if NSLDS data had been available prior to sampling. Matching to NSC alone would be expected to flag about 7 percent of matched cases as false positives, and matching to both NSC and NSLDS would be expected to identify about 16 percent of all potential FTBs over the age of 18 as false positives (based on NPSAS:04 data). For NPSAS:12, a pre-sampling match to NSLDS and NSC will identify cases with evidence of prior enrollment to ensure that they are not sampled as potential FTBs.

4. SSNs will be needed to (1) conduct file matches to administrative records and (2) maintain the sample for the longitudinal study (BPS). File matching to administrative records is a crucial element of the NPSAS study and would not be possible without the collection of SSNs. Data obtained from file matching will both minimize respondent burden and increase data quality.

12. Estimates of Response Burden

  1. NPSAS:12 materials and procedures for institution contacting and enrollment list collection were approved by OMB in July 2010 (OMB# 1850-0666 v.7). The total burden on respondents requested in this submission is a sum of the institutional burden approved in the previous package (OMB# 1850-0666 v.7) and the student data collection burden presented here.

  2. Two data collection activities will take place: (1) student record collection from eligible institutions that provided enrollment lists; and (2) student interviews. The respective burden estimates for each data collection activity are provided in table 2.

  3. Table 2. Maximum estimated burden on respondents for the student data collection

| Data collection activity | Sample | Expected eligibles | Expected number of respondents/responses | Expected response rate (percent) | Average time burden per response | Range of response times | Total time burden (hours) |
|---|---|---|---|---|---|---|---|
| NPSAS:12 Field Test | | | | | | | |
| Student record collection* | 150 | 149 | 141 | 95 | 13.0 hrs. | 1 to 40 hrs. | 1,833 |
| Student interview | 4,530 | 4,302 | 3,000 | 70 | 38.1 min. | 20 min. to 1 hr. | 1,905 |
| Student re-interview | 300 | 300 | 240 | 80 | 15.0 min. | 10 to 20 min. | 60 |
| Number of responding institutions and students/Number of responses | | | 3,141/3,381 | | | | 3,798 |
| NPSAS:12 Full-scale Study | | | | | | | |
| Student record collection* | 1,406 | 1,395 | 1,339 | 96 | 22.2 hrs. | 1 to 40 hrs. | 29,726 |
| Student interview | 117,256 | 111,427 | 83,876 | 75 | 25.1 min. | 20 min. to 1 hr. | 35,088 |
| Number of responding institutions and students | | | 85,215 | | | | 64,814 |

  4. * “Sample” is the number of institutions that provided enrollment lists for student sampling.

  5. Student record collection. Based on results from the NPSAS:08 full-scale study, about 40 percent of schools are expected to provide programmer-created electronic data files to the contractor, resulting in an average estimated response burden of about 18 hours. Sixty percent are expected to enter some portion of the requested information themselves at 25 hours per response (on average). This distribution of responses results in an estimated average of 22 hours per institution response for the full-scale study. The estimated burden for field test institutional data collection is lower (approximately 13 hours) due to the smaller sample size for each institution.
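The weighted average behind the 22.2-hour full-scale figure can be verified directly; this is a quick arithmetic check, not project code.

```python
# 40% of institutions supply programmer-created files (~18 hrs each);
# 60% enter some portion of the data themselves (~25 hrs each).
share_file, hrs_file = 0.40, 18
share_manual, hrs_manual = 0.60, 25

avg_hours = share_file * hrs_file + share_manual * hrs_manual
print(round(avg_hours, 1))  # 22.2 hrs per responding institution (table 2)

# 1,339 expected responding institutions at ~22.2 hrs each:
print(round(1339 * avg_hours))  # 29726, the full-scale total in table 2
```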

  6. Student interviews. Although many of the data elements to be used in NPSAS:12 appeared in the previously approved NPSAS:04 and NPSAS:96 studies (the last NPSAS studies to include a BPS cohort), additional items will also be included in NPSAS:12. (Facsimiles of the student interview and re-interview are presented in Appendixes H and I, respectively). Due to the addition of new items, burden for the field test interview is estimated to be about 38 minutes. Based on field test results, the interview will be shortened and streamlined, resulting in a lower estimate of 25 minutes for completion of the full-scale interview.

  7. Table 3 presents estimated costs to respondents participating in the NPSAS:12 field test and full-scale studies.

  8. Table 3. Maximum estimated costs to respondents for the NPSAS:12 field test and full-scale implementations

| Data collection activity | Sample | Expected eligibles | Expected number of respondents | Expected response rate (percent) | Average time burden per response | Total time burden | Rate per hour | Total cost |
|---|---|---|---|---|---|---|---|---|
| Institutions (NPSAS:12) | | | | | | | | |
| Field test | 150 | 149 | 141 | 95 | 13.0 hrs. | 1,833 hrs. | $17 | $31,161 |
| Full-scale study | 1,406 | 1,395 | 1,339 | 96 | 22.2 hrs. | 29,726 hrs. | 17 | 505,342 |
| Student interview (NPSAS:12) | | | | | | | | |
| Field test | 4,530 | 4,302 | 3,000 | 70 | 38.1 min. | 1,905 hrs. | 10 | 19,050 |
| Full-scale study | 117,256 | 111,427 | 83,876 | 75 | 25.1 min. | 35,088 hrs. | 10 | 350,880 |
| Student re-interview (NPSAS:12) | | | | | | | | |
| Field test | 300 | 300 | 240 | 80 | 15.0 min. | 60 hrs. | 10 | 600 |
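The "Total cost" column in table 3 is simply total burden hours multiplied by the assumed hourly rate ($17 for institution staff, $10 for students). A quick check (the row labels below are for illustration only):

```python
# (total burden hours, assumed hourly rate) pairs from table 3.
rows = {
    "institution, field test":  (1833, 17),
    "institution, full scale":  (29726, 17),
    "interview, field test":    (1905, 10),
    "interview, full scale":    (35088, 10),
    "re-interview, field test": (60, 10),
}
costs = {name: hours * rate for name, (hours, rate) in rows.items()}
print(costs["institution, field test"])  # 31161, i.e., the $31,161 in table 3
```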

13. Estimates of Cost

  1. There are no capital, startup, or operating costs to institutions or students for participation in the project. No equipment, printing, or postage charges will be incurred.

14. Costs to Federal Government

  1. A summary of estimated costs to the federal government for NPSAS:12, categorized by field test, full-scale study, and total costs is shown in table 4. Included in the contract estimates are all staff time, reproduction, postage, and telephone costs associated with the management, data collection, analysis, and reporting for which clearance is requested.3 A more detailed breakdown of contract costs is provided in table 5.

  2. Table 4. Individual and total costs to the National Center for Education Statistics (NCES) for the NPSAS:12 field test and full-scale implementations

| Costs to NCES | Amount |
|---|---|
| NPSAS:12 Field Test | |
| Salaries and expenses | $62,370 |
| Contract costs | 5,895,937 |
| Total | 5,958,307 |
| NPSAS:12 Full-scale Study | |
| Salaries and expenses | 197,739 |
| Contract costs | 21,897,611 |
| Total | 22,095,350 |
| Total Costs | |
| Salaries and expenses | 260,109 |
| Contract costs | 27,793,548 |
| Total | 28,053,657 |

  3. Table 5. Contract costs for NPSAS:12

| Task | Study area and task | Budgeted amount |
|---|---|---|
| 110 | Post award conference | $40,891 |
| 120 | Schedules | 57,488 |
| 130 | Monthly reports | 803,152 |
| 140 | Integrated monitoring system | 587,665 |
| 150 | Technical review panels | 1,521,527 |
| | Field test (FT) data collection | |
| 211 | Institution sampling | 173,953 |
| 212 | Institution contacting | 584,390 |
| 213 | Student sampling | 134,559 |
| 220 | FT RIMG/OMB forms clearance | 160,116 |
| 231 | Instrumentation | 1,571,218 |
| 232 | Tracing | 123,657 |
| 233 | Training for institution level data collection | 99,756 |
| 234 | Training for CATI data collection | 138,464 |
| 235 | Institution level data collection | 168,915 |
| 236 | Web/CATI data collection | 675,734 |
| 237 | Data processing | 716,487 |
| 240 | Methodology report | 157,591 |
| | Full-scale (FS) data collection | |
| 311 | Institution sampling | 56,935 |
| 312 | Institution contacting | 950,674 |
| 313 | Student sampling | 545,454 |
| 320 | FS RIMG/OMB forms clearance | 109,159 |
| 331 | Instrumentation | 1,087,697 |
| 332 | Tracing | 1,391,671 |
| 333 | Training for institution level data collection | 183,487 |
| 334 | Training for CATI data collection | 589,646 |
| 335 | Institution level data collection | 602,650 |
| 336 | Web/CATI data collection, general | 7,840,060 |
| | Web/CATI data collection, incentives | 2,338,459 |
| 337 | Data processing | 1,602,292 |
| 338 | Weighting, imputations & nonresponse bias analysis | 617,271 |
| 339 | Data disclosure planning and prevention | 48,872 |
| 340 | Methodology report | 259,782 |
| | Descriptive reporting | |
| 410 | First Look | 193,000 |
| 420 | Data analysis system | 364,102 |
| 430 | Additional special tabulations | 355,533 |
| 440 | Descriptive reports | 531,525 |
| 450 | Respond to information requests | 388,137 |
| 460 | Final technical memo | 21,579 |
| | Total | 27,793,548 |

  4. NOTE: Costs presented here do not include base or award fee. CATI = computer assisted telephone interview.

15. Reasons for Changes in Response Burden and Costs

  1. The first portion of the NPSAS data collection (institution contacting and enrollment list collection from institutions) was approved in July 2010 (OMB# 1850-0666 v.7). This submission requests clearance for the remaining portion of the field test study – the student data collection, and hence the apparent burden increase in relation to current OMB inventory. The total burden for both components associated with the NPSAS field test to be conducted in 2011 consists of 3,441 total respondents (the 300 already approved under institutional collection plus the 3,141 institutions and students requested here); 4,093 separate responses (712 for institutional collection plus 3,381 [including student interviews and re-interviews] requested here); and 4,256 burden hours (458 for institutional collection plus 3,798 requested here).
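The combined field-test totals cited above are straightforward sums of the two components (an arithmetic check only):

```python
# Previously approved institutional component + student component requested here.
respondents = 300 + 3141    # total respondents
responses = 712 + 3381      # total separate responses
burden_hours = 458 + 3798   # total burden hours
print(respondents, responses, burden_hours)  # 3441 4093 4256
```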

  2. Projected estimates for response burden and costs for NPSAS:12 are based on experiences from NPSAS:08. Institutional response burden is difficult to estimate due to the wide variation in response times experienced in NPSAS:08, particularly since student sample sizes and record abstraction methods varied widely. Furthermore, accurate timing data are not available for institutional record abstraction. However, the figures presented in tables 2 and 3 are believed to portray an accurate assessment of the estimated time required for participation.

3. Certain assumptions guide the estimates of institution response burden. We assume that each institution will need approximately 2 hours to prepare and review instructions prior to performing record abstractions, plus an average of approximately 20 to 40 minutes per student. In the field test, we anticipate an average of 29 responding students per responding institution; in the full-scale study, we anticipate an average of 60 responding students per responding institution. Given the increased use of electronic submission for student record data, institutions will benefit from economies of scale that will reduce the overall average for institutional data collection.
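These assumptions (about 2 hours of setup plus roughly 20 to 40 minutes of abstraction per student) imply per-institution burden ranges that bracket the averages in table 2. The function below is an illustrative sketch of the stated assumptions, not project code:

```python
def institution_burden_hours(n_students, minutes_per_student, setup_hours=2.0):
    """Estimated record-abstraction burden for one institution."""
    return setup_hours + n_students * minutes_per_student / 60

# Field test: ~29 responding students per institution -> roughly 12 to 21 hrs,
# consistent with the 13.0-hr average in table 2.
ft_range = (institution_burden_hours(29, 20), institution_burden_hours(29, 40))

# Full scale: ~60 responding students -> roughly 22 to 42 hrs; the 22.2-hr
# average sits near the low end, reflecting the economies of scale noted above.
fs_range = (institution_burden_hours(60, 20), institution_burden_hours(60, 40))
```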

4. Changes to the design of the student records collection web interface may also decrease burden on institutions. The NPSAS:12 interface has been redesigned to capitalize on increased institutional use of electronic data submission, improvements in technology, and greater use of the Internet. The data entry option now allows institution staff either to enter data for one student at a time (as in the past) or to enter data for multiple students at the same time in a grid mode. The redesigned interface also provides a new option that allows institution staff to download an Excel template, enter the requested data, and submit the completed file. The data provided can then be reviewed and edited using the web interface. This review and editing functionality is also available to institutions that opt to have programming staff create and upload a data file, eliminating the need to resubmit edited data files.

  5. Extensive timing analyses were conducted in NPSAS:08, and the core of the NPSAS:12 interview remains largely the same. However, the interview for NPSAS:12 will include several new items and timing estimates have been adjusted accordingly.

16. Publication Plans and Time Schedule

  1. The formal contract for NPSAS:12 requires the following reports, publications, or other public information releases:

  1. Statistics-In-Brief and Web Tables for online dissemination to a broad audience;

  2. a detailed methodological report describing all aspects of the full-scale study design and data collection procedures (a working paper detailing the methodological findings from the field test will also be produced);

3. complete data files and documentation for research data users, in the form of both a restricted-use electronic codebook (ECB) and public-use data tools (i.e., QuickStats, PowerStats); and

  4. special tabulations of issues of interest to the higher education community, as determined by NCES.

  1. The operational schedule for the NPSAS:12 field test and full-scale study is shown in table 6.

  2. Table 6. Operational schedule for NPSAS:12

| NPSAS:12 activity | Start date | End date |
|---|---|---|
| Field test | | |
| Contacts with institutions to request enrollment lists | Oct. 4, 2010 | Feb. 7, 2011 |
| Enrollment list collection | Jan. 24, 2011 | Apr. 23, 2011 |
| Select student sample | Feb. 1, 2011 | Apr. 23, 2011 |
| Collect student data from institution records | Mar. 15, 2011 | Jun. 30, 2011 |
| Self-administered web-based data collection | Mar. 15, 2011 | Jun. 30, 2011 |
| Conduct telephone interviews of students | Apr. 7, 2011 | Jun. 30, 2011 |
| Process data, construct data files | Jan. 25, 2011 | Aug. 30, 2011 |
| Prepare/update field test reports | Apr. 4, 2011 | Oct. 26, 2012 |
| Full-scale study | | |
| Contacts with institutions to request enrollment lists | Sept. 12, 2011 | Jun. 15, 2012 |
| Select student sample | Jan. 24, 2012 | Jul. 16, 2012 |
| Collect student data from institutional records | Jan. 31, 2012 | Sept. 28, 2012 |
| Self-administered web-based data collection | Feb. 7, 2012 | Sept. 28, 2012 |
| Conduct telephone interviews of students | Feb. 28, 2012 | Sept. 28, 2012 |
| Process data, construct data files | Nov. 3, 2011 | Jun. 17, 2013 |
| Prepare/update reports | Aug. 24, 2012 | Sept. 30, 2014 |

  3. Note: The current request for OMB review includes only student data collection activities for the field test study.

17. Approval to Not Display Expiration Date for OMB Approval

1. The expiration date for OMB approval of the information collection will be displayed on data collection instruments and materials. No exception to this requirement is requested.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

  1. There are no exceptions to the certification statement identified in the Certification for Paperwork Reduction Act Submissions of OMB Form 83-i.

1 RTI International is a trade name of Research Triangle Institute.

2 Re-interview items will be selected after cognitive testing of the student interview data collection instrument. The student re-interview data collection instrument will be available in January 2011.

3 This package requests clearance for field test student data collection. A previously submitted package requested clearance for field test institution contacting and enrollment list collection. Costs shown here are for the full study, including institution and student data collection efforts.
