




National Center for Education Statistics





Volume I

Supporting Statement




2012/17 Beginning Postsecondary Students Longitudinal Study (BPS:12/17) Pilot Test



OMB # 1850-0803 v.150











January 2016





Contents



Attachments

Attachment I – Technical Review Panel (TRP) List

Attachment II – Pilot Test Communication Materials

Attachment III – Cognitive Interviews Summary Report

Attachment IV – BPS:12/17 Survey Items


Tables

Table 1. Chronology of BPS: 1990–2017
Table 2. Sample member disposition, by sample member and prior round response status
Table 3. BPS:12/17 sample size by institution characteristics: 2015
Table 4. BPS:12/17 pilot test projections
Table 5. Estimated BPS:12/17 pilot test cost and response burden
Table 6. Individual and total costs to NCES for the BPS:12/17 pilot test
Table 7. Operational schedule for BPS:12/17





  1. Circumstances Necessitating Data Collection

    a. Purpose of this Submission

The following material is being submitted under the National Center for Education Statistics (NCES) clearance agreement (OMB # 1850-0803), which allows NCES to improve the methodologies, question types, and/or delivery methods of its survey and assessment instruments by conducting testing, such as pilot tests, pretests, focus groups, or cognitive interviews.

This request is to conduct a pilot test in preparation for the 2012/17 Beginning Postsecondary Students Longitudinal Study (BPS:12/17). In place of a larger field test, the pilot, for which web data collection will begin in March 2016, will test a subset of the BPS:12/17 survey items. The data collection for this study is being carried out for NCES by RTI International under contract to the U.S. Department of Education (Contract # ED-IES-09-C-0039). A report of the results will be provided to NCES by August 2016. The results from this pilot test, combined with feedback from a Technical Review Panel (TRP) scheduled for June 2016, will be used to finalize the survey instrument for the BPS:12/17 full-scale data collection.

    b. Background

BPS is designed to follow a cohort of students who enroll in postsecondary education for the first time during the same academic year, irrespective of when they completed high school. The study collects data on student persistence in and completion of postsecondary education programs; their transition to employment; demographic characteristics; and changes over time in their goals, marital status, income, and debt, among other indicators. Data from BPS are used to help researchers and policymakers better understand how financial aid influences persistence and completion, what percentages of students complete various degree programs, what early employment and wage outcomes certificate and degree attainers experience, and why students leave school.

Sampled from the 2011–12 National Postsecondary Student Aid Study (NPSAS:12), BPS:12/17 will be the second follow-up interview with sample members who were first-time beginning (FTB) students during the 2011–12 academic year. Sample members were first contacted in 2012 as part of the NPSAS:12 data collection and again in 2014 as part of BPS:12/14, the first follow-up.

Prior to the pilot, two rounds of cognitive testing were conducted. The first round (OMB # 1850-0803 v.134), which focused on a subset of draft questions, was conducted from May 2015 through July 2015 and was administered as a paper questionnaire. The second round of testing (OMB # 1850-0803 v.143) focused on questionnaire content and understanding and usability of the self-administered instrument (web and mobile-friendly versions) and was conducted from September 2015 through December 2015. Findings from both rounds were incorporated into the survey instrument to be fielded in the pilot test. The results of both rounds of testing will be delivered to NCES in late January 2016.

    c. Legislative Authorization

BPS is authorized by the Education Sciences Reform Act of 2002 (ESRA; 20 U.S.C. § 9543) and is being conducted in close consultation with other offices and organizations within and outside the U.S. Department of Education (see section 6 below for details on the consultation outside NCES).

  2. Purpose and Uses of the Data

    a. The BPS Cohort

BPS follows a cohort of students who entered postsecondary education for the first time in the same academic year. BPS differs from other studies in two key ways: the population it follows and the sources of data from which it draws. First, it is the only nationally representative study of all beginning college students. Unlike other studies, it includes students entering postsecondary education immediately after high school as well as those entering after being away from school for years. In addition, unlike other studies that focus only on baccalaureate students, BPS includes not just students seeking bachelor’s degrees but also students pursuing certificates, working toward associate’s degrees, and taking postsecondary classes outside of a degree or certificate program. BPS is also unique in that it includes a student interview and does not rely solely on institution-reported data. The inclusion of a student interview allows BPS to provide a more accurate portrait of students’ persistence and attainment anywhere within postsecondary education and not just their retention and attainment at a specific institution.

BPS is an essential source of data on FTB students’ demographics, high school preparation, enrollment and employment while enrolled, financial aid and borrowing, and education and career expectations. The primary purpose of BPS is to improve our understanding of how these factors relate to three key outcomes: postsecondary persistence, degree attainment, and employment.

The first BPS cohort began in 1990 (BPS:90), and the BPS:12 cohort is the fourth study of beginning postsecondary students. Beginning with the BPS:96 cohort, FTB students are surveyed at three points in time for up to 6 years: in the base year (through the NPSAS student interview) and 3 and 6 years later in the BPS follow-up interviews. The BPS:90 cohort was also surveyed at three points in time, but its second follow-up occurred 5 years after entry. Table 1 shows the data collection timeline for the base-year and subsequent follow-up studies for each BPS cohort.

Table 1. Chronology of BPS: 1990–2017

BPS Cohort    Base year study    First follow-up    Second follow-up
BPS:90        NPSAS:90           BPS:90/92          BPS:90/94¹
BPS:96        NPSAS:96           BPS:96/98          BPS:96/01
BPS:04        NPSAS:04           BPS:04/06          BPS:04/09
BPS:12        NPSAS:12           BPS:12/14          BPS:12/17

1 The second follow-up for the BPS:90 cohort was conducted 5 years after postsecondary enrollment. All subsequent second follow-ups were conducted after 6 years.

NOTE: BPS = Beginning Postsecondary Students Longitudinal Study. NPSAS = National Postsecondary Student Aid Study.

This most recent cohort of BPS includes students who first entered postsecondary education in 2011–12. Data on their first academic year were collected in 2012, and then data on their second and third year were collected in the BPS:12/14 first follow-up study conducted in 2014. The BPS:12/17 second follow-up study will provide data on these sample members’ fourth, fifth, and sixth year after entering postsecondary education.

This particular cohort has several features that will enhance our understanding of FTB students’ experiences and will be examined in several BPS:12/17 publications. First, this BPS has adopted the human capital model as its research framework, which addresses the costs and benefits associated with enrolling and persisting in higher education (Becker 1975). New interview items measure students’ physical and emotional health, earnings possibilities given their current education, earnings expectations after completing anticipated credentials, and the extent to which students prefer current over future rewards (also known as their discount rate), enabling researchers to investigate how these factors shape students’ education and employment outcomes. Second, given increased interest in the relationship between subbaccalaureate credentials and employment, this BPS cohort features an oversample of students seeking educational certificates in 2-year public, 2-year for-profit, and 4-year for-profit institutions. Third, analyses of students’ labor market experiences will be further aided by the expanded employment section, which builds on the BPS:12/14 employment history covering FTB students’ first through third years after entering postsecondary education by collecting employment history for their fourth through sixth years. Finally, to help us further understand how education and employment during school relate to students’ employment outcomes 6 years after entering college, BPS:12/17 will collect greater detail than past cohorts on students’ current or most recent job.

    b. Current Research and Policy Issues Related to BPS

Growing competitiveness abroad and structural changes to the U.S. economy have increased interest in improving Americans’ postsecondary educational attainment and labor market preparedness. While in 1990 the United States led the world in 4-year degree attainment, it now ranks 12th. President Obama has called for every American to pursue at least one year of education beyond high school, including postsecondary occupational education. Prominent foundations such as the Lumina Foundation and the Bill & Melinda Gates Foundation have also set goals for increasing Americans’ attainment of postsecondary credentials valued in the labor market. Designed to improve our understanding of the factors related to postsecondary persistence, degree attainment, and employment, BPS:12/17 will address several key research and policy issues relevant to policymakers’ search for ways to improve Americans’ educational attainment and readiness for today’s job market.

First, as the face of America changes, policymakers and practitioners are increasingly concerned about demographic differences between who enters postsecondary education and who attains postsecondary credentials. Federal TRIO programs and the Obama administration’s My Brother’s Keeper initiative are just a couple of programs tackling this issue. BPS:12/17 will be able to provide the latest nationally representative numbers on how key populations are entering and faring in postsecondary education, particularly students from low socioeconomic and minority backgrounds. These data will allow researchers and policymakers to explore the factors related to populations experiencing greater success on the outcome measures of interest.

Second, there is greater focus on how the educational and employment outcomes of students are shaped by the control and level of the institution they attend. The recently launched College Scorecard is designed to publicize key metrics about student outcomes so families can make more informed college choice decisions. Additionally, Gainful Employment regulations seek to ensure that students are able to find employment in a recognized occupation with earnings that can cover their student loan repayments. BPS:12/17 has the sample size and the key measures to enable researchers and policymakers to analyze students’ attainment and employment by sector, while also controlling for other variables that may be related to these outcomes.

Third, as the cost of college, the percentage of students borrowing, and the amounts students are borrowing have increased, the extent to which college costs, financial aid, and student loans affect students’ ability to complete credentials is an ever more pressing research and policy issue. In recent years, Pell Grant eligibility has been expanded, and the amount of the grant has increased. There have also been calls at the national, state, and local levels to make community college tuition free for 2 years, with Tennessee featuring prominently in such discussions given its new Tennessee Promise scholarship. BPS:12/17 can help inform these policy decisions by providing data on how grants and other financial aid, as well as net college costs, affect students’ road to a credential.

Finally, researchers and policymakers are interested in how attainment and employment are affected by several aspects of the postsecondary experience, such as remedial education, online education, and employment while in school. Students’ lack of college readiness and need for remedial or developmental education have been identified as impediments to students’ time to degree, as well as factors in students dropping out without a credential. In fact, the Department of Education launched a new Center for the Analysis of Postsecondary Readiness (CAPR) to strengthen the research, evaluation, and support of college readiness efforts across the nation. BPS:12/17 includes information on high school coursetaking, grades, and test scores, as well as developmental coursetaking in different subjects while in college, which can provide key data for the Center’s work. The growth of online courses and degree programs has also attracted attention, with researchers and practitioners wanting to better understand their potential for shortening students’ time to degree and the ways such courses and programs are perceived by employers. The degree to which working while enrolled helps or hurts postsecondary attainment, time to degree, and later employment outcomes is a key debate. Through its employment history, BPS:12/17 will be able to add real data to this discussion.

Following are some of the many research and policy issues to be addressed with BPS:12/17 data:

Postsecondary Enrollment Characteristics and Experiences

  • How are FTB students distributed across institutions of varying control and levels?

  • How are FTB students distributed across different degree programs?

  • What fields of study do FTB students pursue, and in which fields do they obtain degrees?

  • How frequently do FTB students change their field of study, particularly from science, technology, engineering, and mathematics (STEM) to non-STEM fields and vice versa?

  • To what extent do FTB students participate in online, night, and weekend courses and programs?

  • To what extent do FTB students feel a sense of belonging at their institution?

  • How do FTB students rate their mental and physical health, and how do their ratings vary in their first, third, and sixth year after postsecondary entry?

  • How do answers to the above questions differ by factors like demographic characteristics, control and level of institution, and field of study?

Employment During Enrollment

  • What percentage of students work while enrolled, and how many hours do they work?

  • Of those who work while enrolled, do they work on or off campus, and how many hours per week do they work?

  • How do students’ individual patterns in working while enrolled change by year of enrollment and U.S. economic conditions?

  • How do answers to the above questions differ by factors like demographic characteristics, control and level of institution, and field of study?

Financial Aid and Borrowing

  • How much financial support do dependent FTB students receive from their parents or other relatives and friends for their postsecondary education?

  • What proportion of FTB students receive federal Pell Grants or veterans or other Department of Defense education benefits?

  • What proportion of FTB students take out private loans, and in what amounts?

  • How do the percentage of FTB students taking out federal loans and the average amount borrowed vary by demographic and enrollment characteristics?

  • How does the amount of grants and loans FTB students receive from federal, institutional, and private sources differ during each year of enrollment?

  • How much do FTB students borrow in private loans?

  • What is the average cumulative debt of FTB students after 6 years?

  • How does the amount of student loan debt that FTB students incur compare to any credit card debt or car loan debt?

  • What kinds of borrowers struggle in repayment and default on their student loans after 6 years?

  • How do answers to the above questions differ by factors like demographic characteristics, control and level of institution, and other enrollment characteristics?

Education and Career Expectations

  • What degrees or certificates do FTB students expect to attain, when do they expect to complete them, and how confident are they in these expectations?

  • What is the relationship between these attainment expectations and actual attainment outcomes 6 years after students begin college?

  • How much social and emotional support do FTB students receive from their families and friends in their pursuit of their educational goals?

  • To what careers do FTB students aspire, and what do they think they will earn in these positions?

  • How close are students’ predicted earnings to actual average earnings in their expected careers?

  • What do FTB students think they would do if they were not in school, and, if they think they would be working, what do they think they would be earning?

  • To what extent do FTB students place value on more immediate versus more remote rewards? In other words, what is their discount rate?

  • How do answers to the above questions differ by demographic characteristics?

Persistence

  • At what rate do students stop out of postsecondary education, how often do they do it, and when do they do it?

  • At what rate do students transfer between institutions, when do they transfer, and what are the most common transfer patterns in terms of the types of institutions left and entered?

  • What proportion of certificate attainers enter another certificate or degree program? Are their subsequent certificates and degrees in related fields of study?

  • What proportion of FTB students are enrolled in their first institution 6 years after initially enrolling but have yet to earn a credential?

  • What proportion of FTB students are enrolled in any institution 6 years after first enrolling but have yet to earn a credential?

  • Among students who leave postsecondary education without a credential, in what year did they leave?

  • How do answers to the above questions differ by demographic characteristics, high school preparation, control and level of institution, attendance intensity, employment during enrollment, financial aid and borrowing, physical and mental health, sense of belonging at institution, discount rate, and education and career expectations?

Attainment

  • What percentage of FTB students earn a certificate, associate’s degree, or bachelor’s degree?

  • How long does it take FTB students to earn each of these credentials?

  • How do answers to the above questions differ by institution level and control, attendance intensity, transfer patterns, stopouts, changes in major and major choice? What role do demographic characteristics, high school preparation, employment during enrollment, financial aid and borrowing, physical and mental health, sense of belonging at institution, discount rate, and education and career expectations play?

Employment Outcomes After Leaving Postsecondary Education

  • How much do FTB students earn after 6 years, and what benefits do they receive?

  • What percentage of FTB students are employed in their field of study? How do their employment outcomes compare to those who are not employed in their field of study? In what occupations are those not employed in their field of study employed?

  • Among FTB students who did not enter postsecondary education directly from high school, to what extent does their employment before and after postsecondary education differ? To what extent does employment prior to postsecondary education influence employment outcomes after postsecondary education?

  • How do FTB students’ employment outcomes after leaving postsecondary education compare to their employment during their postsecondary education?

  • What percentage of FTB students have experienced unemployment spells? How many spells have they had, and how many months has each spell lasted?

  • How do answers to the above questions differ by degree and certificate attainment; field of study; and level, control, and selectivity of institution attended? What role do demographic characteristics, employment prior to and during postsecondary enrollment, debt, and earlier education and career expectations play?

Answers to these and other related questions must be obtained so that policymakers at the local, state, and national levels can craft informed policies that respond to America’s changing student demographics, postsecondary landscape, and labor force needs.

    c. Previous Agency Use of the Data

NCES has used data from the previous cycles of BPS in a variety of publications. NCES also makes BPS data available for use by researchers, policymakers, and others via both restricted-use data files and the public-use data tools, PowerStats and QuickStats.

  3. Use of Information Technology

The BPS:12/17 pilot interview will primarily be a web-based interview. In NPSAS:12 and the BPS:12/14 full-scale study, 80.6 percent and 78.8 percent of interviews, respectively, were completed online as self-administered surveys. For BPS:12/17, the interview will be adapted to be mobile friendly because more respondents are using mobile devices to complete the interview. In the BPS:12/14 full-scale study, approximately 22 percent of responses were completed on a mobile device.

  4. Method Used to Minimize Burden on Small Businesses

The student survey for BPS:12/17 does not involve small businesses or entities.

  5. Frequency of Data Collection

BPS studies have been conducted periodically since 1990, as described in section 2a of this package. The first follow-up BPS:12/14 full-scale data collection was conducted in 2014, 2 years after the base year NPSAS:12 full-scale collection from which the BPS student sample was selected. BPS:12/17 is the second follow-up. In addition to the interview in 2017, administrative record matching, student records collection, and a one-time transcript collection may occur in 2018, if funded.

NPSAS and its longitudinal spin-off studies, BPS and the Baccalaureate and Beyond Longitudinal Study (B&B), are conducted to reflect the large-scale and rapid changes in federal policy concerning postsecondary student aid. Eligibility restrictions change, sizes of grant and loan amounts fluctuate, and the balance between various aid options can change dramatically. A recurring study is essential, first, to help predict future costs for financial aid because loan programs create continued obligations for the federal government as long as the loans are being repaid. Second, repeated surveys can capture the changing nature of the postsecondary environment. With the longitudinal design of the NPSAS survey and BPS follow-ups, representative national samples of FTB students with similar base-year characteristics may be compared over time to determine the effects of changes in federal policy and programs. Third, repeated surveys can help researchers understand the effect of economic conditions on the employment outcomes for subbaccalaureate educational certificate holders. The new oversample of certificate seekers that will be available for the full-scale study, combined with the longitudinal nature of BPS, allows for analysis of how the value of these credentials shifts in response to market forces.

  6. Consultants Outside the Agency

Recognizing the significance of the BPS:12/17 data collection, several strategies have been incorporated into the project work plan to ensure that efforts do not duplicate other studies and to allow for critical review of, and comment on, project activities, interim and final products, and projected and actual outcomes.

Consultations with other federal offices include the U.S. Department of Education’s Office of Postsecondary Education; the Office of Planning, Evaluation and Policy Development; and other agencies, such as the Government Accountability Office; the Congressional Budget Office; and the Office of Management and Budget. In addition, NCES collaborates with the National Center for Science and Engineering Statistics (NCSES) at the National Science Foundation (NSF) to ensure that each unit is kept up-to-date on each other’s studies pertaining to postsecondary students and institutions. NCES and NSF meet on a regular basis to cover topical issues relevant to both offices and each has staff serving on study TRPs. NCES routinely consults with non-federal associations, such as the American Council on Education, the Association of Private Sector Colleges and Universities, the National Association of Student Financial Aid Administrators, the National Association of Independent Colleges and Universities, the Council of Graduate Schools, and the Institute for Higher Education Policy.

NCES also consults with academic researchers, several of whom attend the BPS TRP meetings. These consultations provide methodological insights from the results of similar and related studies conducted by NCES, other federal agencies, and nonfederal sources. The consultations also assure that data collected through BPS will meet the needs of the federal government and relevant organizations. The membership of the TRP (see attachment I) represents a broad spectrum of the postsecondary community. The nonfederal members serve as expert reviewers on the technical aspects of the study design, data collection procedures, and instrument design, especially item content and format.

In August 2015, the TRP reviewed the proposed pilot-scale interview content and study design during its meeting. A second meeting is scheduled for June 2016 when results of the cognitive testing and the pilot study will be presented. The 2016 meeting will focus on recommendations for the full-scale collection.

  7. Provision of Payments or Gifts to Respondents

A $30 incentive will be provided to all sample members who complete the pilot test interview to encourage their participation and thank them for their time and information. This is the same amount as the baseline incentive in BPS:12/14. In the pilot test, sample members will be able to select their incentive payment in the form of a check, PayPal payment, or electronic gift certificate. Respondents choosing an electronic gift certificate will receive an e-mail from the vendor (Creative Group, Inc.) with credentials to log in to the website. Once logged in, respondents can select a gift card of their choice. Gift card options include Amazon.com, Starbucks, Target, Barnes & Noble, and Walmart.

  8. Design and Context

The pilot, for which web data collection will begin in March 2016, will test a subset of the BPS:12/17 survey items. The following section outlines the pilot test sampling design, the survey items to be assessed during the pilot, and the data collection procedures.

    a. Sampling design

At the conclusion of the NPSAS:12 field test, approximately 2,000 students had been interviewed and confirmed to be FTB students. Subsequently, the BPS:12/14 field test included all of the students who responded to the NPSAS:12 field test and were confirmed to be FTBs. In addition, the BPS:12/14 field test included approximately 1,500 students who did not respond to the NPSAS:12 field test but were potential FTBs according to institution lists and/or student records. As shown in table 2, of the 3,496 field-test sample members, 143 were found to be ineligible (98 NPSAS study members and 45 NPSAS non-study members). Of the remaining 3,353 eligible sample members, 1,884 responded to the BPS:12/14 field test.

Table 2. Sample member disposition, by sample member and prior round response status

Study member and prior-round response status   Number in sample   Number fielded in BPS:12/17 pilot
Total                                          3,496              2,308
NPSAS:12 Non-Study Member                      347                0
  NPSAS:12 Nonrespondent                       343                0
    BPS:12/14 Ineligible                       45                 0
    BPS:12/14 Nonrespondent                    233                0
    BPS:12/14 Respondent                       65                 0
  NPSAS:12 Respondent                          4                  0
    BPS:12/14 Nonrespondent                    4                  0
NPSAS:12 Study Member                          3,149              2,308
  NPSAS:12 Nonrespondent                       1,150              309
    BPS:12/14 Ineligible                       98                 0
    BPS:12/14 Nonrespondent                    743                0
    BPS:12/14 Respondent                       309                309
  NPSAS:12 Respondent                          1,999              1,999
    BPS:12/14 Nonrespondent                    489                489
    BPS:12/14 Respondent                       1,510              1,510

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2012/17 Beginning Postsecondary Students Longitudinal Study (BPS:12/17), Pilot Test.

A subset of the eligible sample members will be asked to participate in the BPS:12/17 pilot test. Sample members who did not meet the definition of an NPSAS:12 study member¹ will be excluded, as will sample members who did not respond to NPSAS:12 and who were either deemed ineligible in, or did not respond to, BPS:12/14. As shown in table 3, a total of 2,308 sample members will be included in the BPS:12/17 pilot test.
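To make these inclusion rules concrete, the following minimal sketch expresses them as a single check (illustration only; the field names study_member, npsas12_respondent, and bps1214_status are hypothetical and do not correspond to actual BPS file variables):

def fielded_in_pilot(member):
    """Return True if a sample member is fielded in the BPS:12/17 pilot test.

    Hypothetical record layout:
      member["study_member"]       - True if the case is an NPSAS:12 study member
      member["npsas12_respondent"] - True if the case responded to the NPSAS:12 interview
      member["bps1214_status"]     - "respondent", "nonrespondent", or "ineligible" in BPS:12/14
    """
    if not member["study_member"]:
        return False  # NPSAS:12 non-study members are excluded
    if member["npsas12_respondent"]:
        return True   # study members who responded to the NPSAS:12 interview are all fielded
    # NPSAS:12 nonrespondents are fielded only if they responded to BPS:12/14
    return member["bps1214_status"] == "respondent"

# Example: a study member who skipped NPSAS:12 but responded to BPS:12/14 is fielded.
print(fielded_in_pilot({"study_member": True, "npsas12_respondent": False,
                        "bps1214_status": "respondent"}))  # True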

The BPS:12/17 full-scale sample will include the same groups as those fielded in the pilot, with the addition of NPSAS:12 study members who did not respond to the NPSAS:12 student interview and NPSAS:12 non-study members who responded to the BPS:12/14 student interview. These two groups will not be fielded in the pilot because the items that ask about base-year enrollment to establish study eligibility do not need to be tested in the pilot.

Table 3. BPS:12/17 sample size by institution characteristics: 2015

Institution characteristics        Total
Total                              2,308
Public
  Less-than-2-year                 8
  2-year                           968
  4-year non-doctorate-granting    160
  4-year doctorate-granting        356
Private nonprofit
  Less-than-4-year                 25
  4-year non-doctorate-granting    172
  4-year doctorate-granting        126
Private for-profit
  Less-than-2-year                 88
  2-year                           88
  4-year                           317

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2012/17 Beginning Postsecondary Students Longitudinal Study (BPS:12/17), Pilot Test.

    b. Overview of Survey Items Being Tested

The BPS:12/17 pilot study will field a subset of items planned for the BPS:12/17 full-scale collection. The pilot test instrument will emphasize items that are new or experimental, revised, of interest to the TRP, or necessary for instrument function (e.g., routing to appropriate items). Attachment IV provides all interview items currently planned for the full-scale data collection with an indication of which of these items will be tested in the pilot study. Attachment IV also provides information on previous use of these items in related studies, including the first follow-up in the current (2012) cohort, BPS:12/14, and the equivalent to the BPS:12/17 second follow-up in the previous cohort, BPS:04/09. Any revisions to the full-scale instrument will result from the pilot study findings and from the subsequent TRP meeting and will be reflected in the full-scale clearance package that will be submitted to OMB in 2016.

Many of the data elements to be used in BPS:12/17 were fielded in NPSAS:12, BPS:12/14, and/or BPS:04/09 interviews. Since BPS:12/14, items have been added to the BPS:12/17 instrument to address the data collection needs for the FTB cohort 6 years after entering postsecondary education. The added items were assessed in previous rounds of cognitive testing (see summary of results in attachment III) or originate from prior full-scale rounds, such as from the BPS:04/09 instrument. New or revised items that were cognitively tested were also presented to the 2015 TRP for review. The BPS:12/17 pilot test provides an opportunity to test these items with a larger sample and in conjunction with some of the other full-scale survey items.

Additionally, the BPS:12/17 pilot test instrument includes new approaches to assisted coding systems, or “coders,” used to identify standardized codes for text string responses. The pilot test will assess a predictive search algorithm that provides potential matching results in real time. This real-time search is in contrast to the process traditionally used in coders, in which respondents had to manually type an entire search entry before querying the system. Predictive searches will be familiar to many respondents, given their wide adoption on web-based tools, including search engines such as Google.

BPS:12/17 includes three types of coders, for major/field of study, postsecondary institutions, and ZIP codes. Traditionally, for major and postsecondary institutions, respondents enter text strings that are used to perform a keyword search of an underlying database. The coder returns a series of possible matches for the respondent to review and from which to select. For the ZIP code coder, a 5-digit numeric code is entered by the respondent and matched to a ZIP code database. With the new predictive coders, the respondent enters three or more characters in the search field, and potential matching results are displayed immediately in the search field. In addition, the predictive search on the ZIP coder can match a partially entered ZIP code to city and state names, resulting in a list of matched ZIP codes from which respondents may more easily select.
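As an illustration of this predictive behavior, the sketch below returns candidate matches as soon as at least three characters have been entered (the tiny major database and the simple substring-matching rule are assumptions for this example; the production coders are backed by full NCES-maintained databases and more sophisticated ranking):

# Minimal sketch of a predictive ("type-ahead") coder: candidate matches are returned
# immediately once three characters are available, rather than after a full keyword
# search is submitted.
MAJORS = {
    "11.0701": "Computer Science",
    "26.0101": "Biology/Biological Sciences, General",
    "52.0201": "Business Administration and Management, General",
}  # illustrative subset only

def predictive_matches(partial_entry, database=MAJORS, min_chars=3, limit=10):
    """Return (code, label) pairs whose label contains the partial entry."""
    text = partial_entry.strip().lower()
    if len(text) < min_chars:  # suggestions begin at three characters
        return []
    hits = [(code, label) for code, label in database.items() if text in label.lower()]
    return hits[:limit]

print(predictive_matches("bio"))  # [('26.0101', 'Biology/Biological Sciences, General')]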

Testing of these three predictive search coders will be performed using an experimental design. At the initiation of the interview, respondents will be randomly assigned to either a treatment or control group. The treatment group will be administered the new predictive coders, while the control group will be administered the traditional coders. Upon completion of the pilot test, treatment and control data will be compared for item timing and rates of missing data (i.e., when a respondent does not select a code).

In addition to the predictive coders, the pilot will test a survey item asking respondents to provide their grade point average (GPA); this test is motivated by the importance of GPA in education research and by the social desirability of reporting a higher GPA. Many studies have examined the reliability of self-reported GPAs, often with mixed results (e.g., the meta-analysis by Kuncel, Credé, and Thomas 2005). The pilot test will assess a GPA question that uses "forgiving" introduction text in the question wording, suggesting normative or common behavior (Tourangeau and Yan 2007). This approach, sometimes referred to as "question loading," is intended to reduce measurement error associated with sensitive questions. Testing of the GPA question will also use an experimental design, with the GPA question used in past BPS surveys serving as the control and the new "forgiving" version of the question as the treatment. Respondents will be randomly assigned to a treatment or control group. Results for the two groups will be compared, including mean GPAs as well as question timing and rates of missing responses.
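The sketch below illustrates the kind of treatment/control comparison planned for the coder and GPA experiments, under the assumption that each response record carries a random group assignment, a reported GPA (or None when missing), and an item timing in seconds; all field and function names here are hypothetical:

import random
from statistics import mean

def assign_group(rng=random):
    """Randomly assign a respondent to the treatment or control form at interview start."""
    return rng.choice(["treatment", "control"])

def summarize(responses):
    """Compare groups on sample size, mean reported GPA, missing-data rate, and mean item timing."""
    summary = {}
    for group in ("treatment", "control"):
        rows = [r for r in responses if r["group"] == group]
        answered = [r["gpa"] for r in rows if r["gpa"] is not None]
        summary[group] = {
            "n": len(rows),
            "mean_gpa": round(mean(answered), 2) if answered else None,
            "missing_rate": round(1 - len(answered) / len(rows), 2) if rows else None,
            "mean_seconds": round(mean(r["seconds"] for r in rows), 1) if rows else None,
        }
    return summary

# Toy usage: two simulated respondents, randomly assigned at the start of the interview.
example = [{"group": assign_group(), "gpa": 3.2, "seconds": 25.0},
           {"group": assign_group(), "gpa": None, "seconds": 40.0}]
print(summarize(example))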

    c. Data collection plans and procedures

The pilot test data collection period will last approximately 6 weeks. Data collection efforts will focus primarily on encouraging sample members to complete the interview on the Web. Nonresponse conversion will be limited to written communication and will not include telephone follow-up. However, sample members who prefer to complete a telephone interview will be able to contact the BPS help desk to complete the interview over the phone with a member of the project team. Materials and procedures used for the pilot test will be evaluated and refined prior to full-scale implementation. As shown in table 4, given the recruitment protocol and timeframe, an estimated yield of 992 of the 2,308 fielded cases (43 percent) is expected.

Table 4. BPS:12/17 pilot test projections

Sample type                     Eligible cases   Target response rate   Target yield
Total                           2,308            43%                    992
BPS:12/14 FT respondents        1,819            50%                    910
BPS:12/14 FT nonrespondents     489              15%                    73

NOTE: Detail may not sum to totals because of rounding. FT = field test.

  9. Assurance of Confidentiality

NCES assures participating individuals that all identifiable information collected under BPS may be used only for statistical purposes and may not be disclosed or used in identifiable form for any other purpose except as required by law (Education Sciences Reform Act of 2002 [ESRA], 20 U.S.C. § 9573). BPS:12/17 data security and confidentiality protection procedures are in place to ensure that RTI and its subcontractors comply with all privacy requirements, including:

  • the Statement of Work of this contract;

  • Privacy Act of 1974, 5 U.S.C. § 552(a);

  • the U.S. Department of Education Incident Handling Procedures (February 2009);

  • the U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  • the U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  • Family Educational Rights and Privacy Act (FERPA) of 1974, 20 U.S.C. § 1232(g);

  • ESRA, 20 U.S.C. § 9573; and

  • all new legislation that impacts the data collected through this contract.

RTI will comply with the Department of Education’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), OMB circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/.

The BPS:12/17 procedures for maintaining confidentiality include notarized nondisclosure affidavits obtained from all personnel who will have access to individual identifiers; personnel training regarding the meaning of confidentiality; controlled and protected access to computer files; built-in safeguards concerning status monitoring and receipt control systems; and a secure, staffed, in-house computing facility. BPS:12/17 follows detailed guidelines for securing sensitive project data, including but not limited to physical/environment protections, building access controls, system access controls, system login restrictions, user identification and authorization procedures, encryption, and project file storage/archiving/destruction.

There are security measures in place to protect data during file-matching procedures. NCES has a secure data transfer system, which uses Secure Sockets Layer technology, allowing the transfer of encrypted data over the Internet. All data transfers will be encrypted.

The Department of Education has established a policy regarding the personnel security screening requirements for all contractor employees and their subcontractors. The contractor must comply with these personnel security screening requirements throughout the life of the contract. The Department of Education directive with which contractors must comply is OM:5-101, which was last updated on 7/16/2010. There are several requirements that the contractor must meet for each employee working on the contract for 30 days or more. Among these requirements are that each person working on the contract must be assigned a position risk level. The risk levels are high, moderate, and low based upon the level of harm that a person in the position can cause to the Department of Education’s interests. Each person working on the contract must complete the requirements for a “Contractor Security Screening.” Depending on the risk level assigned to each person’s position, a follow-up background investigation by the Department of Education will occur.

Sample member contact materials will describe the voluntary nature of the BPS:12/17 interview and convey the extent to which study member identifiers and responses will be kept confidential. Similarly, informed consent scripts included in the survey will provide sample members with assurances that strict procedures are in place to protect their personal information. Contacting materials are presented in attachment II. The following confidentiality language is also provided in the study brochure that is supplied to all sample members:

The 2012/17 Beginning Postsecondary Students Longitudinal Study (BPS:12/17) is conducted under the authority of the Higher Education Opportunity Act (HEOA) of 2008 (20 U.S.C. § 1015) and the Education Sciences Reform Act (ESRA) of 2002 (20 U.S.C. § 9543), which authorizes NCES to collect and disseminate information about education in the United States. NCES is required to follow strict procedures to protect personal information in the collection, reporting, and publication of data. All individually identifiable information supplied by individuals or institutions may be used only for statistical purposes and may not be disclosed or used in identifiable form for any other purpose, except as required by law (20 U.S.C. § 9573).

  10. Sensitive Questions

The BPS:12/17 interview contains items about income, earnings, debts, academic performance, and marital and family status. Federal regulations governing the administration of these questions, which might be viewed as sensitive due to personal or private information, require (1) clear documentation of the need for such information as it relates to the primary purpose of the study, (2) provisions to respondents that clearly inform them of the voluntary nature of participation in the study, and (3) assurances that responses may be used only for statistical purposes, unless otherwise compelled by law (20 U.S.C. § 9573).

The collection of data related to income, earnings, indebtedness, academic performance, and employment is essential to the key policy issues motivating this study. Financial resources and obligations can play an important role in student persistence in and attainment of postsecondary credentials, as can academic performance. In addition, income and earnings are critical outcome measures in analyzing students’ employment and assessing the rate of return students receive for their investment in postsecondary education.

The collection of information about marital and family status also facilitates the exploration of key policy issues. Social and financial support provided by spouses can play an important role in students enrolling and persisting in postsecondary education, as well as in employment decisions. The financial and time demands of dependents can also influence educational choices, like length of degree program to pursue, major, whether to persist and attain a credential, as well as employment choices, like hours worked and benefits needed.

  11. Estimate of Respondent Burden

Table 5 provides the projected estimates of response burden and respondent burden time costs for the BPS:12/17 pilot test. The pilot test questionnaire is estimated to require approximately 15 minutes, on average, to complete. Review of pilot test recruitment materials is estimated to require 3 minutes per sample member. Estimating an hourly rate of $19.03² for respondents, the burden time cost for the pilot test total of 363 hours is approximately $6,908. In addition, beginning in October 2016, panel maintenance (address update) will be conducted on the full-scale sample, resulting in an additional 279 burden hours, translating to an estimated $5,310 in burden time cost.
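As a worked check of these figures (hours from table 5, hourly rate from footnote 2):

HOURLY_RATE = 19.03                      # estimated respondent hourly rate (footnote 2)

pilot_hours = 115 + 248                  # pilot test recruitment + pilot test survey hours
panel_hours = 279                        # full-scale panel maintenance hours

pilot_cost = pilot_hours * HOURLY_RATE   # 6,907.89, reported as approximately $6,908
panel_cost = panel_hours * HOURLY_RATE   # 5,309.37, reported as approximately $5,310
print(round(pilot_cost), round(panel_cost))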

Table 5. Estimated BPS:12/17 pilot test cost and response burden

Activity                       Sample size   Expected response rate   Number of respondents*   Number of responses   Average burden (min)   Total burden (hours)
Pilot Test Recruitment         2,308         -                        2,308                    2,308                 3                      115
Pilot Test Survey              2,308         43%                      992                      992                   15                     248
Full-scale Panel Maintenance   37,166        15%                      5,575                    5,575                 3                      279
Total                          -             -                        7,883                    8,875                 -                      642

* Respondent totals do not include duplicative counts of individuals.

  12. Estimates of Cost to Respondents

Respondents will incur no costs associated with participation in this study beyond the response burden time cost.

  13. Cost to Federal Government

Cost estimates include staff time, reproduction, postage, and telephone costs associated with the management, data collection, analysis, and reporting for which clearance is requested. Table 6 provides a more detailed breakdown of contract costs for pilot and full-scale collection.

Table 6. Individual and total costs to NCES for the BPS:12/17 pilot test

BPS:12/17 Pilot Test             Costs to NCES
NCES salaries and expenses       $111,290
Contract costs                   1,903,689
Total                            1,914,979

  14. Publication Plans and Schedule

The contract for BPS:12/17 requires multiple reports, publications, and other public information releases. Results of the pilot study will be appended to the full-scale data file documentation, and will be included in the full-scale OMB request. In addition, the following will be produced from the full-scale data:

  • a First Look report and Web Tables;

  • complete data files and documentation for research data users in the form of both a restricted-use file and public-use data tools (i.e., QuickStats, PowerStats);

  • detailed data file documentation describing all aspects of the full-scale study design and data collection procedures; and

  • special tabulations of issues of interest to the higher education community, as determined by NCES.

Additional reports, including Statistics in Brief and Statistical Analysis Reports, may also be published from the full-scale data.

The operational schedule for the BPS:12/17 full-scale study is shown in Table 7.



Table 7. Operational schedule for BPS:12/17

Activity                                                                   Start date   End date
Pilot Test
  Sample specifications                                                    6/16/2015    8/11/2015
  Data collection
    Self-administered web interviews                                       3/7/2016     4/18/2016
  Process data, construct data files                                       2/29/2016    6/17/2016
  Prepare data file documentation                                          3/8/2016     8/18/2017
Full-scale
  Update/deliver sample specifications                                     4/26/2016    6/28/2016
  Conduct panel maintenance activities                                     10/1/2016    1/15/2017
  Conduct locating and tracing activities (cleared under 1850-0631 v.8)    11/1/2016    10/17/2017
  Conduct help desk and CATI training                                      12/12/2016   3/9/2017
  Data collection
    Self-administered web interviews                                       2/20/2017    10/17/2017
    Telephone interviews                                                   3/10/2017    10/17/2017
  Process data, construct data files                                       12/12/2016   7/19/2018
  Prepare and publish reports                                              9/18/2017    6/27/2019

NOTE: BPS:12/17 = 2012/17 Beginning Postsecondary Students Longitudinal Study; CATI = computer-assisted telephone interviewing.

  15. Reviewing Statisticians and Individuals Responsible for Designing and Conducting the Study

The study is being conducted by the Sample Surveys Division of the National Center for Education Statistics (NCES), U.S. Department of Education. NCES’s prime contractor for BPS:12/17 is RTI. NCES staff responsible for the statistical aspects of the study include: Dr. David Richards, Dr. Sean Simone, Dr. Tracy Hunt-White, Mr. Ted Socha, and Dr. Chris Chapman. RTI staff include: Mr. Jason Hill, Dr. Jennifer Wine, Dr. David Wilson, Ms. Nicole Ifill, Dr. Austin Lacy, Ms. Kristin Dudley, Dr. Alexandria Radford, Mr. Michael Bryan, Mr. Peter Siegel, and Dr. Emilia Peytcheva. Additional RTI staff for BPS:12/17 include: Ms. Donna Anderson, Mr. Jeff Franklin, Ms. Chris Rasmussen, and Dr. Jennie Woo.

  16. References

Becker, G.S. (1975). Human Capital: A Theoretical and Empirical Analysis, With Special Reference to Education. 2nd ed. New York: Columbia University Press.

Kuncel, N.R., Credé, M., and Thomas, L.L. (2005). The Validity of Self-Reported Grade Point Average, Class Ranks, and Test Scores: A Meta-Analysis and Review of the Literature. Review of Educational Research, 75: 63–82.

Tourangeau, R., and Yan, T. (2007). Sensitive Questions in Surveys. Psychological Bulletin, 133(5): 859–883.

1 NPSAS:12 staff identified key variables across the various NPSAS:12 data sources—student records; student interviews; and administrative federal and private databases such as the Central Processing System (CPS), National Student Loan Data System (NSLDS), National Student Clearinghouse (NSC), ACT files, and SAT files—to define a minimum set of data points necessary to support the analytic objectives of the study. Sample members for whom those key variables were available were classified as study members, the NPSAS:12 unit of analysis.

2 The hourly rate was obtained by averaging the first quartile earnings of full-time wage and salary workers for a combination of adults 25 years and over with a bachelor’s degree or higher and the median earnings of workers with some college or associate’s degree in the fourth quarter of 2015. (Table 5, http://www.bls.gov/news.release/pdf/wkyeng.pdf.)
