BPS 2012-17 Full Scale Response to Passback


Memorandum United States Department of Education

Institute of Education Sciences

National Center for Education Statistics

OMB # 1850-0631 v.10

DATE: December 13, 2016


TO: Robert Sivinski and Ann Carson, OMB


THROUGH: Kashka Kubzdela, OMB Liaison, NCES


FROM: David Richards, BPS:12/17 Project Officer, NCES

Tracy Hunt-White, Team Lead, Postsecondary Longitudinal and Sample Surveys, NCES


SUBJECT: 2012/17 Beginning Postsecondary Students Longitudinal Study (BPS:12/17) Responses to the OMB Passback (OMB # 1850-0631 v.10)


December 12, 2016 - OMB Passback


On December 12, 2016, OMB posed three questions regarding the BPS:12/17 OMB package for the full-scale student data collection. The questions (in bold) appear below:


  1. Have any of the procedures or policies regarding privacy or confidentiality been updated since the BPS:12/14 went through? Please let us know if they have, as well as which ones.


NCES Response: New procedures have been implemented since BPS:12/14. In BPS:12/17, for the first time, security questions were added that ask respondents to select the answer known to them from a list of several options. This was implemented to provide an additional level of security for respondents. There has been no change in policy since BPS:12/14; rather, this is an effort to build in more security, as data security has become a topic of increasing concern to our respondents over the years.



2. Is the 12/17 student instrument identical to the 12/14 instrument? I ask because the only student survey I could find on ROCIS was for 12/14. I assume you will be asking about the intervening years in 12/17.


NCES Response: The BPS:12/17 instrument includes many of the same items as the BPS:12/14 survey, but it is not identical. Many of the items common to the two instruments do ask about the years since the BPS:12/14 interview. For example, enrollment and employment questions ask about postsecondary attendance and jobs during the intervening years.


In the pilot test clearance package (OMB # 1850-0803 v. 150, Agency ICR Reference # 201406-1850-002, approved on February 26, 2016), the Attachment IV document (under ICs) lists which items were dropped since BPS:12/14, and Table 1 in that document identifies items that were used in the BPS:12/14 survey versus added or revised for the BPS:12/17 pilot test. Items for which only the date changed between BPS:12/14 and the BPS:12/17 pilot test are not considered revisions in this table.


In the current submission for the BPS:12/17 full-scale collection (OMB # 1850-0631 v.10), the Appendix G BPS 2012-17 FS Interview Facsimile document provides all of the question wording and response options for the BPS:12/17 student survey. Table 1 in that document summarizes changes to the content of the BPS:12/17 survey relative to the BPS:12/17 pilot test instrument. The table provides the variable names and labels, whether a particular variable was modified, and the rationale for each modification. Following Table 1, the question wording and help text for each BPS:12/17 student survey item are provided. The survey items are identified by question names that correspond to the variable names in Table 1.


3. In Appendix H, I noticed that for each section (enrollment, budget, financial aid), the records instrument progresses from one year to the next. Wouldn't it be easier to do all the sections for a given year, then progress on to the next year?  Is each section in a different data file?


NCES Response: There are a few reasons that the student records instrument is arranged by section:


      • The primary reason to arrange by section is that different institution staff commonly complete the various sections (e.g., the financial aid office may complete Financial Aid, while the Registrar completes Enrollment and the Bursar completes Budget). Grouping by section rather than by year makes it easier for institution staff to work quickly through their assigned section.


      • Related to the point above, institutions often reference multiple data sources to compile the information requested in student records; enrollment data may come from one report, while financial aid data may come from another. Arranging by section lets them see, all at once, the data they will need to pull from each source.


      • Each section of the instrument has a separate data file, as you noted. If an institution provides data in CSV format, it uploads a separate file for each of the four instrument sections.


      • Arranging by section keeps the appearance, workflow, and order of items as consistent as possible with how these data were collected in the base-year as part of the National Postsecondary Student Aid Study (NPSAS). We anticipate that the same institution offices and staff who completed NPSAS will also complete the BPS Student Records request.
