IPEDS 2017-18 through 2019-20 Responses to 30-day Public Comments


Integrated Postsecondary Education Data System (IPEDS) 2017-18 through 2019-20


OMB: 1850-0582


Comments Received During the 30-day Public Comment Period and NCES Responses

February 2017

Integrated Postsecondary Education Data System (IPEDS) 2017-20

ED-2016-ICCD-0127 Comments on FR Doc # 2017-00944


Comments related to response burden and reporting complexity

Document: ED-2016-ICCD-0127-0018

Submitter’s Name: Sonia Schaible

Date posted: January 26, 2017

I believe that surveys such as IPEDS are part of the reason that college costs have expanded so much. To comply with these surveys, institutions need a level of expertise that comes with significant cost. The continued expansion, modification and burden only serve to further exacerbate college tuition at public institutions. When you add the regulations from HEA, Financial Aid, Gainful Employment, accreditation "federal compliance" needs, etc., it comes at a cost, and those costs are passed on to the consumer. The irony is that you blame colleges for all of the increases when in fact, you require colleges to have staff to ensure compliance with all of your rules. You need to take a serious look at what really is "required" versus what your lobbyists want to have and take ownership of your part of the increase in college tuition.

Document: ED-2016-ICCD-0127-0031

Name: Anonymous Anonymous

Date posted: February 21, 2017

Many of us at the community college level are already devoting an overwhelming percentage of our time and resources to IPEDS reporting. Some of us are a 1-2 person office, with many other responsibilities. So adding these additional layers of reporting, particularly when they are this complicated, will definitely put a burden on our IR office. Our reporting has been established for quite some time now, and retroactively capturing data going back 8 years will be very involved and time-consuming. We are unsure as to how much more value this new layer of reporting will add to the information that we already provide regarding completion rates... It is complicated enough to understand even for people in higher education.

Document: ED-2016-ICCD-0127-0019

Name: Anonymous Anonymous

Date posted: January 27, 2017

All of our IPEDS processes are designed around the October 15 reporting deadline; there is no Spring equivalent. We would have to create an entirely new process for generating Spring student cohorts in order to be able to report them. Further, because of the longitudinal nature of the Outcomes Measures survey we are always required to implement them retroactively; it will be a SUBSTANTIAL burden to have to work outside of our data system to try to identify and code Spring starts. And while it is true that we have students that start in Spring, the vast majority at our institution start in the Fall based on program requirements, so it is not clear what benefit we would obtain from this effort.

Adding a 4-year time point to Outcomes Measures is, for a 2-year college, redundant with the existing 200% graduate rate survey.

Document: ED-2016-ICCD-0127-0021

Name: Anonymous Anonymous

Date posted: January 31, 2017

We are a small institution on a quarter system. All cohort information we have available to us is based on our fall quarter. Because we are required to report data from at least six years ago, we would have no means of gathering the proper cohorts needed for retroactive reporting. This would place an enormous burden on our small office.

We have very few students who begin their studies with us in a quarter other than fall. Adding in an entire year of students creates another layer of complexity. Will we then report four graduating dates for each of our quarters per year? We do not have the manpower available to us to recreate the various cohorts needed in addition to revising our existing cohorts to include Pell recipient information.

In addition, to change our processes to be based on a fiscal year rather than academic year is an additional burden. This is asking us to try to split our various summer subterm sessions into two separate cohorts.

Please consider using an academic year for your model. In addition, allow a phase-in period to allow us to begin collecting the necessary data over the next 4 to 6 years. This could be done in a manner similar to the ethnic code changes we did in the past. We could begin to report Pell recipient data based on our fall cohorts while phasing in additional cohort groups to be reported beginning six years from now.

Response

Thank you for your feedback responding to a 30-day request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) 2017-20. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The Paperwork Reduction Act (PRA) provides an opportunity for an open and public comment period where comments on collections can be made. We are grateful for this process and your comment.

We understand your concern that additional reporting to IPEDS, a required annual federal reporting by U.S. institutions that participate in the federal student aid program, will increase response burden and may be interpreted as contributing to the increase of college costs. However, the additional Outcome Measures requirements will allow for broader coverage of student graduation rate data to reflect a more diverse student population at two-year institutions and improve the collection of overall student progression and completion data. Requiring institutions to report Outcome Measures (OM) data on a full-year cohort meets an imperative need of the federal government. The proposed changes would allow for the inclusion of more students, in particular those who enroll in the spring and have not been included in prior cohorts.

NCES strongly considered the increased institutional burden and determined that the need to be accountable and transparent to the public outweighs the change in burden, particularly for Pell Grant recipients. In 2014-15 the federal government disbursed $30.3 billion in Pell Grants to 8.4 million full- and part-time undergraduate students (Federal Student Aid Data Center). The Pell Grant program is a large commitment of public dollars. IPEDS began collecting Pell Grant and Subsidized-No-Pell data, which are the Higher Education Act disclosure rates, on its Graduation Rates survey. Thus, institutions already have the data set up for first-time, full-time cohorts. As an extension, NCES seeks to enhance the information on graduation rates of Pell Grant recipients through OM. The federal government needs comparable and comprehensive institutional data that reflect the outcomes of undergraduate and Pell-recipient populations. To this end, NCES has proposed definitions that will allow for the selection of students based on matriculation date and enrollment status. Institutions keep enrollment and degree records for students that include term beginning date, term ending date, enrollment status, information on course taking, degrees and awards received, payment information, financial aid information, and many other data elements. The cohorts can be constructed from these student unit record systems.
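
To illustrate the kind of cohort construction described above, the following sketch (a hypothetical Python example; the record layout, field names, and entry-status rules are illustrative assumptions rather than official IPEDS definitions) selects students whose first matriculation falls within the full reporting year and assigns each to one of the four OM sub-cohorts:

```python
from datetime import date
from typing import NamedTuple, Optional

# Hypothetical student unit record extract; actual systems and field names vary.
class StudentRecord(NamedTuple):
    student_id: str
    first_entry_date: date   # date of first matriculation at the institution
    entering_status: str     # "full-time" or "part-time" at entry
    first_time: bool         # True if no prior postsecondary attendance

COHORT_START = date(2009, 7, 1)
COHORT_END = date(2010, 6, 30)

def om_sub_cohort(rec: StudentRecord) -> Optional[str]:
    """Assign a degree/certificate-seeking entrant to one of the four OM
    sub-cohorts, or return None if the student entered outside the year."""
    if not (COHORT_START <= rec.first_entry_date <= COHORT_END):
        return None
    intensity = "full-time" if rec.entering_status == "full-time" else "part-time"
    history = "first-time" if rec.first_time else "non-first-time"
    return f"{intensity}, {history}"

# A spring entrant, previously excluded from fall-only cohorts, is captured here.
spring_admit = StudentRecord("A123", date(2010, 1, 19), "full-time", True)
print(om_sub_cohort(spring_admit))   # -> "full-time, first-time"
```

A Pell Grant recipient flag could be layered onto the same records to form the proposed Pell sub-cohorts; the published survey instructions remain the authoritative source for those definitions.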

With regard to spring admits, we understand that their number may be small. However, the primary purpose of the Outcome Measures (OM) survey is to address the limitations of the Graduation Rates survey component, which collects data on first-time, full-time (FTFT) undergraduate students across degree-granting institutions. Academic reporting institutions are currently required to report a Fall census-based cohort, leaving out students who were admitted outside of the Fall census. The limitations of defining a cohort as FTFT and fall census have been strongly critiqued by several past Administrations, Congress, the media, and data users. The OM survey component allows for a more complete collection and a more inclusive depiction of overall student progression for an increasingly diverse undergraduate student population.

The addition of a 4-year time point to OM is not redundant with the existing 200% graduation rate survey. The 200% graduation rate survey only collects data on FTFT students. OM would collect new data on part-time, first-time; full-time, non-first-time; and part-time, non-first-time students. Also, the 4-year time point would require institutions to identify the type of award received (i.e., certificate, associate's, or bachelor's degree), which the 200% graduation rate survey for less-than-4-year institutions does not collect.

Comments related to needing historical records for OM reporting

Document: ED-2016-ICCD-0127-0020

Submitter’s Name: Anonymous Anonymous

Date posted: January 27, 2017

Dear Mr. Reeves:

Thank you for the opportunity to submit additional comments on the proposed changes to IPEDS.

In the Report and Suggestions from IPEDS Technical Review Panel #50, regarding the burden associated with having academic reporters who traditionally report on a fall snapshot move to full-year reporting, it was noted "The approximately 4,000 academic reporters that report on fall cohorts would experience an increase in burden as a result of this change. ... This change would require academic reporters to track and report data on one set of entering cohorts for OM reporting and a different set of statutorily defined cohorts for GR reporting. The burden on affected institutions would be compounded should the OM cohorts be further disaggregated."

Regarding the burden associated with creating Pell cohorts for OM cohorts, it was noted that "using a different definition from that in the disclosure requirements would require institutions to redefine Pell recipients in their data collection and enrollment systems in order to assign them to a cohort. Institutions without enterprise resource planning systems would have to define the cohorts using separate enrollment and financial aid systems (or possibly multiple financial aid systems)."

IPEDS should consider the enormity of the burden that will be placed upon institutions to implement the proposed changes. As was done when the new race and ethnicity standards were put into place, there should be a phase in period for the additional reporting changes to allow institutions to build needed capacity for reporting. Additional staff may need to be hired and/or systems and reports modified and expanded. It is my understanding that the first instance of the Outcomes Survey resulted in a substantial amount of errors. Adding a further layer of complexity to the Outcomes Survey without allowing institutions sufficient time to adapt their business processes will more than likely result in further "bad data" submissions.

I am also concerned that there seems to be confusion about what the statute requires for reporting six-year graduation rates by financial aid status. It was noted by the Technical Review Panel that "Pell at entry means Pell in the student's first year." This is not aligned with HEOA reporting requirements, which as noted on page 55944 of the Federal Register, Volume 74, No. 208, clearly specify that it be the fall term of each year for "an institution that offers a predominant number of its programs based on semesters, trimesters, or quarters." Therefore, defining Pell at entry as Pell in the first year will most certainly place a high degree of burden upon institutions that have been defining and collecting Pell data indicators as per the statute requirements.

Thank you for your consideration. Implementing the proposed changes for the 2017-2018 reporting cycle is simply too soon to allow institutions time to build the needed capacity to respond.

Sincerely,

Anonymous

Document: ED-2016-ICCD-0127-0022

Submitter’s Name: Anonymous2 Anonymous2

Date posted: February 6, 2017

After a careful review of the proposed changes to the IPEDS - Outcome Measures collection, I agree 100% with the comment posted on January 27 from Anonymous Anonymous. That person has articulated the difficulties this change will impose on my institution as well. We are an institution that has seen declining enrollment, as well as less funding in some part due to the State of Illinois' lack of budget and higher ed disbursement over the past 2 years now. The burden imposed by this change on the docket for 2017-18 will likely require additional personnel in technical services to program the changes and reporting personnel that will pull money from other needed areas of the institution. I would just like to voice my plea to delay this change until some time in the future.

Document: ED-2016-ICCD-0127-0028

Submitter’s Name: Anonymous Anonymous

Date posted: February 16, 2017

If you change the fall cohort reporting to a full year reporting, please don't start by going back 8 years. If you could give us a warning that we need to start tracking these students for a few years, before we actually have to begin reporting on their graduation rates, it would give us time to collect good data each year before we have to track the students. I do not have transfer lists from 2009, so I had to try to recreate the list from a grade report for that semester. It took me hours trying to find the transfers out of the entire student list for that semester.

Document: ED-2016-ICCD-0127-0032

Submitter’s Name: Anonymous Anonymous

Date posted: February 21, 2017

Please do not require this information going back historically. This is overly burdensome to institutions already struggling under budget and resource restrictions. It invites poor quality data and will not provide meaningful results. Getting Pell and full-year cohort data going back to 2009 is an unrealistic expectation.

Response

Thank you for your feedback responding to a 30-day request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) 2017-20. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The Paperwork Reduction Act (PRA) provides an opportunity for an open and public comment period where comments on collections can be made. We are grateful for this process and your comment.

Requiring institutions to report Outcome Measures (OM) data on a full-year cohort establishes what should be a simpler method of creating a cohort. Rather than basing the cohort on a specific term, an institution may use its enrollment extracts for a complete year. Creating a cohort from these extracts will be additional work the first year but ultimately simplifies the creation of OM cohorts. The increase in time the first year to adhere to this new requirement is justified by the need to comply with the recommendations made in the report issued by the Committee on Measures of Student Success. Most institutions have historical enrollment and degree records. While these records are sometimes stored in archives for enrollment and degree verification purposes, it is possible to construct the proposed cohorts from these record sets. NCES will work with institutions that have difficulty creating these cohorts, and the IPEDS Help Desk will continue to support all institutions that reach out and request assistance with any aspect of IPEDS reporting.

The proposed changes would allow for the inclusion of more students, in particular those who enroll in the spring and have not been included in prior cohorts. NCES strongly considered the increased institutional burden and determined that the need to be accountable and transparent to the public outweighs the change in burden, particularly for Pell Grant recipients. In 2014-15 the federal government disbursed $30.3 billion in Pell Grants to 8.4 million full- and part-time undergraduate students (Federal Student Aid Data Center). The Pell Grant program is a large commitment of public dollars. IPEDS began collecting Pell Grant and Subsidized-No-Pell data, which are the Higher Education Act disclosure rates, on its Graduation Rates survey. Thus, institutions already have the data set up for first-time, full-time cohorts. As an extension, NCES seeks to enhance the information on graduation rates of Pell Grant recipients through OM. The federal government needs comparable and comprehensive institutional data that reflect the outcomes of undergraduate and Pell-recipient populations. To this end, NCES has proposed definitions that will allow for the selection of students based on matriculation date and enrollment status.

The language you cite from the Higher Education Opportunity Act (HEOA) of 2008, as amended, and the Federal Register references institutional selection criteria for compliance with IPEDS reporting requirements. Specifically, this language concerns the selection of program reporting institutions. It does not reference the specific OM measurement methods proposed in this package.

Comments related to instructions for OM reporting

Document: ED-2016-ICCD-0127-0023

Submitter’s Name: Andrea Galliger

Date posted: February 6, 2017

The proposed changes to the Outcome Measures survey will be burdensome to fulfill but it is difficult for my institution to calculate how best to handle them without more detail. If the changes are passed then my institution will report on awards achieved by full year cohorts for the first time. There are no further instructions in the proposal as to how to define at what time it has been four, six or eight years to report an award earned (exactly four/six/eight years after entry term, sometime during their fourth/sixth/eighth academic year, etc.). Until we know that detail it is impossible for us to know if our current system is capable of fulfilling the new proposed requirements.

Response

The proposed screens and instructions in the OMB clearance package have been created based on interactions with degree-granting postsecondary institutions that called either NCES or the IPEDS Help Desk. In previous OM collections, institutions have reported their students' outcomes at the six-year and eight-year census dates after the time of entry. This methodological design has not changed for the six-year and eight-year censuses and will be applied to the new four-year census date. While we have not received a high volume of questions related to the duration or timeframes over which awards are measured, we will share this comment with the Help Desk and work on materials to support institutions that have difficulty determining how to define time. It is possible that the source of confusion is that some institutions have multiple dates related to degree attainment, including an award date, a date when all coursework needed for the award was completed, and a date of award conferral. While it is rare that there is confusion on when institutions will report students at the four-, six-, and eight-year census dates, NCES will develop guidelines for those institutions that call for help. If a substantial number of institutions call with questions, NCES will propose adding the materials to the formal survey instructions.
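
As an illustration of the point-in-time design (the August 31 status date shown here follows the proposed instructions as described in the public comments later in this document and should be confirmed against the final survey materials), the 4-, 6-, and 8-year status dates for a full-year cohort can be derived directly from the cohort year:

```python
from datetime import date
from typing import Dict

def om_status_dates(cohort_start_year: int) -> Dict[str, date]:
    """Return the 4-, 6-, and 8-year status dates for the full-year OM cohort
    that begins July 1 of cohort_start_year. An August 31 status date is
    assumed here for illustration; the survey instructions are authoritative."""
    return {
        "4-year": date(cohort_start_year + 4, 8, 31),
        "6-year": date(cohort_start_year + 6, 8, 31),
        "8-year": date(cohort_start_year + 8, 8, 31),
    }

# For the July 1, 2009 - June 30, 2010 cohort reported in the 2017-18 collection:
print(om_status_dates(2009))
# {'4-year': datetime.date(2013, 8, 31),
#  '6-year': datetime.date(2015, 8, 31),
#  '8-year': datetime.date(2017, 8, 31)}
```

All three status points are reported in the single collection that occurs eight years after the cohort year, rather than in three separate collections.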

Comments related to OM and specificity of measurement

Document: ED-2016-ICCD-0127-0026

Submitter’s Name: Anonymous Anonymous

Date posted: February 14, 2017

On the Outcome Measures survey, changing the established cohort from a fall to a full-year cohort would confound the interpretation of the outcome point, especially for institutions like ours that do not confer degrees at the end of each term and that run on three primary sessions per year. Not all students would have the same attendance point when counted. Defining the full year cohort based on a fiscal year calendar is problematic. Regarding the proposed changes to the reporting of Pell cohorts, our institution would spend a significant amount of time creating data for each cohort causing undue and significant burden for minimal benefit.

Document: ED-2016-ICCD-0127-0029

Submitter’s Name: Sharon Gorsch

Date posted: February 16, 2017

I am a parent of 2 with one in community college and one in high school. Eventually they will be going off to a 4 year college. College students transition during semesters; some transfer from community college to a 4 year university, some graduate mid year. These young adults are quite fluid, therefore any data collected must have the flexibility to acknowledge and track their actual enrollment. Otherwise, how will you identify key indicators which enable you to be effective?

Response

Your comments regarding the fluidity of student enrollment and the potentially shorter duration for degree completion are valid observations. It is true that a student matriculating in the Spring semester would have less time to complete a degree than a student matriculating in the Fall term. However, the OM survey is collected 8 years after the Fall term, and a Spring matriculant would still have 15 semesters to complete an award. Also, the OM survey allows institutions to indicate that a student remains enrolled at the institution pursuing a degree. Therefore, the few students still enrolled at the institution 15 semesters later will be reported as enrolled at the reporting institution. The OM survey's four cohorts facilitate the measurement of full-time, part-time, first-time, and transfer students. By establishing these cohorts and measuring whether the student receives an award, is still enrolled, or transferred to another institution, NCES asserts that this is an appropriate means to measure multiple enrollment patterns within an aggregate data collection without being too burdensome to postsecondary institutions.

In 2014-15 the federal government disbursed $30.3 billion in Pell Grants to 8.4 million full- and part-time undergraduate students (Federal Student Aid Data Center). The Pell Grant program is a large commitment of public dollars. IPEDS began collecting Pell Grant and Subsidized-No-Pell data, which are the Higher Education Act disclosure rates, on its Graduation Rates survey. Thus, institutions already have the data set up for first-time, full-time cohorts. As an extension, NCES seeks to enhance the information on graduation rates of Pell Grant recipients through OM. The federal government needs comparable and comprehensive institutional data that reflect the outcomes of undergraduate and Pell-recipient populations.

Comments related to OM and graduation rates measurement

Document: ED-2016-ICCD-0127-0030

Name: Laura Birch

Date posted: February 17, 2017

Thank you for this opportunity to comment on proposed changes to the 2017-18 IPEDS OM report identified in ED-2016-ICCD-0127. For small institutions with limited resources, these changes would be quite burdensome. In particular, the change from a fall cohort to a year cohort will complicate the tracking and reporting of data with minimal perceived benefit, particularly because the majority of our students matriculate in the fall. Presumably, the determination of 4-6-8-year graduation rates for a year cohort would have to take into account EACH student's start (matriculation) term and be calculated based on yearly increments from that start term. For instance, to calculate a 4-yr grad rate for a student who matriculates in Spring 2009, we would need to identify if the student graduated by the Spring 2013 term; 6-yr would be by Spring 2015; 8-yr would be by Spring 2017. With this in mind, providing 4-6-8-yr grad rates for the proposed year cohort of July 1 2009 - June 30 2010 for the 2017-18 report would not be possible; the most recent year cohort that could provide the requested 8-yr grad rate would be July 1 2008 - June 30 2009. Many of the previous comments left by others resonate with us, from expressing concerns about the burdens of these changes to the confusion that may be caused with multiple cohort time frames for end users.

Response

The issues of added response burden have been addressed in our comments earlier in this document. However, your comment about the 4-, 6-, and 8-year graduation rates indicates a misunderstanding. OM proposes to collect information on students enrolling in a single year, measured eight years later, while also collecting those students' degree attainment at the 4-year and 6-year points in time. We do not propose that institutions work with multiple entering-year groups for OM, only a single entering year separated into four cohorts with point-in-time measures at the 4-year, 6-year, and 8-year timeframes. With regard to your comment about the availability of data prior to 2010, virtually all institutions have record-level data for more than 10 years stored in archives. For institutions that experience technical difficulties defining cohorts from archival data, or have any other questions regarding definitions and preparing IPEDS data, NCES recommends calling the IPEDS Help Desk for support.

Comments from WAICU

Document: ED-2016-ICCD-0127-0027

Name: Gary Evenson

Date posted: February 15, 2017

This is a comment from the Wisconsin Association of Independent Colleges and Universities (WAICU). WAICU represents 24 private, nonprofit colleges and universities in Wisconsin. 22 of those members serve undergraduate students; all have significant enrollment by Pell eligible students.

Pell Grants serve an important role in helping secure college opportunity for students with financial need. Pell Grants make a demonstrable difference in college access and persistence. It is important and reasonable that the outcomes of all students - including Pell recipients - be monitored. In that regard, the Outcome Measures (OM) IPEDS survey, to support valid and useful measures to undergird policy, needs to examine multiple variables. As necessary, improvements and refinements to the OM survey should be pursued, if benefits from the changes can be realized without imposing inordinate burdens or creating mismatches in data trends or "after-this-because-of-this" conclusions.

Changes recently proposed for the OM survey starting in 2017-18 have generated concerns among many WAICU members. This brief comment only touches on some of the issues that have been voiced by our members.

The change from a fall-cohort basis to a full-year basis for OM reporting purposes will cause problems for a number of colleges. At many colleges or universities, there is a difference in the characteristics of fall entrants and non-fall entrants. It is not clear that combining these students in outcome representations will result in useful data. In addition, the full-year cohort reporting period (July 1 - June 30) may well cross academic scheduling periods, possibly skewing, but certainly imposing (for some) significant (perhaps manual) handling issues in order to gather and report on the new basis. For example, institutions of higher learning operating under a trimester, semester, or quarterly system may not "fit" the July 1- June 30 cycle - nor would January terms or summer sessions.

The requirement to track student status at 4-, 6-, and 8-years is a new mandate that does not necessarily readily match how some colleges record student progress. Students who begin studies other than in the fall may have times of completion that vary from the standard methods of tracking. This could lessen the viability of comparisons and the trending of outcomes. Unintended consequences could reduce the usefulness and significance of compared results. At a minimum, clear instructions on how to count and record and report results given these differing schedules must be presented.

These changes require that the colleges look back eight years to start the reporting for the covered cohorts. Data availability and data structures from that time may well not match what the new OM changes require. To the extent that the OM changes are deemed necessary by NCES to properly track Pell cohorts, it may be nonetheless reasonable to delay implementation for one year. This delay, coupled with specificity of what the new report requires and when it will be implemented, would give the colleges and universities more time to prepare for the changes and thus make the revised data collection both more useful for comparison and less of a burden for staff at many institutions.

WAICU anticipates the filing of comments on this OM change from several of our members (and numerous other colleges and universities across the country). These colleges know their students, know their data, and take seriously their obligations to monitor student progress. We urge the NCES to carefully weigh the comments from these entities and those data professionals who are "in the trenches" of recording, finding, and reporting on these cohort outcomes. We urge the NCES to consider and accommodate the concerns about timing, costs, and data relevancy that would be attendant to the proposed OM modifications.

Thank you for inviting input on these proposals.

Gary A. Evenson, Director of Research and Analysis

Wisconsin Association of Independent Colleges and Universities

122 W. Washington Avenue, Suite 700

Madison, WI 53703-2723

608-204-5243

[email protected]

Response

We understand that the methods for accounting for spring admits in cohort completion rates are not common for reporting at independent colleges and universities. However, the primary purpose of the Outcome Measures (OM) survey is to address the limitations of the Graduation Rates survey component, which collects data on first-time, full-time (FTFT) undergraduate students across degree-granting institutions. Academic reporting institutions are currently required to report a Fall census-based cohort, leaving out students who were admitted outside of the Fall census. The limitations of defining a cohort as FTFT and fall census have been strongly critiqued by several past Administrations, Congress, the media, and data users. The OM survey component allows for a more complete collection and a more inclusive depiction of overall student progression for an increasingly diverse undergraduate student population. The 4-, 6-, and 8-year outcomes are not new measurement concepts for most institutions and lend themselves to the student unit record systems implemented at virtually all postsecondary degree-granting institutions. The definitions are universal in their use of dates rather than program definitions and ultimately reflect a simpler, more complete reporting tool.

Requiring institutions to report Outcome Measures (OM) data on a full-year cohort meets an imperative need of the federal government. The proposed changes would allow for the inclusion of more students, in particular those who enroll in the spring and have not been included in prior cohorts. NCES strongly considered the increased institutional burden and determined that the need to be accountable and transparent to the public outweighs the change in burden, particularly for Pell Grant recipients. In 2014-15 the federal government disbursed $30.3 billion in Pell Grants to 8.4 million full- and part-time undergraduate students (Federal Student Aid Data Center). The Pell Grant program is a large commitment of public dollars. IPEDS began collecting Pell Grant and Subsidized-No-Pell data, which are the Higher Education Act disclosure rates, on its Graduation Rates survey. Thus, institutions already have the data set up for first-time, full-time cohorts. As an extension, NCES seeks to enhance the information on graduation rates of Pell Grant recipients through OM. The federal government needs comparable and comprehensive institutional data that reflect the outcomes of undergraduate and Pell-recipient populations.

Comments related to OM and Importance of IPEDS to BEA

Document: ED-2016-ICCD-0127-0024

Name: Sarah Garrick

Date posted: February 9, 2017

Response

Thank you for your feedback responding to a 30-day request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) 2017-20. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The Paperwork Reduction Act (PRA) provides an opportunity for an open and public comment period where comments on collections can be made. We are grateful for this process and your comment. NCES appreciates your support of the proposed improvements and your statement on the record regarding the importance of IPEDS to BEA. We thank you for taking the time to provide comment.

Comments from NAICU

Document: ED-2016-ICCD-0127-0033

Name: Sarah Garrick

Date posted: February 21, 2017


February 16, 2017

Ms. Kate Mullan

Acting Director, Information Collection Clearance Division

Office of the Chief Privacy Officer

Office of Management

Dear Ms. Mullan,

On behalf of the more than 1,000 member institutions and associations of the National Association of Independent Colleges and Universities (NAICU), I write in response to a request for comments regarding the Integrated Postsecondary Education Data System (IPEDS), 2017-18 through 2019-20 (Docket ID ED–2016–ICCD–0127, as published in the January 18, 2017, Federal Register). The following addresses proposed changes to IPEDS, and in particular the addition of a Pell Grant recipient cohort to the Outcome Measures (OM) survey.

NAICU is the national public policy association for the nation’s private, non-profit colleges and universities. Our member institutions include major research universities, church-related colleges, historically black colleges, art and design colleges, traditional liberal arts and science institutions, women’s colleges, two-year colleges, and schools of law, medicine, engineering, business, and other professions.

As the representative of these institutions, NAICU recognizes the need for appropriate levels of federal data collection. We have historically supported efforts to provide useful and reliable information to students and families that at the same time recognize the diversity and integrity of our higher education institutions.

It is our opinion that the proposed addition of a Pell Grant recipient cohort to the OM survey will (a) provide little additional benefit to students, families, and policy-makers seeking to determine the academic success of low-income students, and (b) add substantial burden and complexity to an already sizable reporting responsibility. For the following reasons, we recommend that you reconsider collecting Pell Grant recipient outcomes, and in particular those disaggregated by subgroups and/or based on a full-year cohort:

  • We disagree with the idea that reliable and complete information on the federal investment of the Pell Grant program is unavailable. Currently, longitudinal surveys conducted by the National Center for Education Statistics (NCES) document financial aid, demographic, and attainment data for Pell recipients from the time they enter college to several years after they leave higher education. These recurring surveys not only provide ample information about grantees and their academic progress, but also provide a comparative context in relation to students in the same cohort who did not receive need-based aid. In addition, these surveys will soon be supplemented by the addition of Pell Grant and Stafford Loan cohorts to the Graduation Rates (GR) component. Combined, these two sources should provide sufficient data to study federal aid recipients and to compare them to other members of the same cohort.

  • Institutions must already disclose graduation rates for Pell Grant recipients under 20 U.S.C. 1092, thereby providing students and families the ability to determine which schools best serve those from low-income backgrounds. Given that national data exist for policy makers and institutional data exist for students and families, the need to introduce yet another extensive data collection is onerous on institutional providers and overkill for students and families.

  • We question whether the new OM survey will provide additional value to policy makers or the public. For one, we believe the Pell Grant and other federal programs that help low-income students access and complete college have proven invaluable to national attempts to educate populations that have been historically underrepresented in higher education. Two, the reasons for access to and success in higher education for low-income students are numerous and the resulting availability of data provided by these surveys can potentially be misconstrued. The potential misuse of these data could be detrimental to decades of efforts to facilitate higher education attainment for all students.

  • Finally, we cannot overstate our concern about the burden these changes will place on institutions, particularly smaller ones, as this proposal will have a substantial effect on colleges with limited staff, infrastructure, resources, or access to longitudinal data. In particular, the requirement to use a full-year cohort of Pell Grant recipients (as opposed to the established fall cohort) would require substantial changes to collection and reporting mechanisms for many of our institutions. Should the proposed changes be adopted, the time needed to accommodate them could be considerable.

Thank you for the opportunity to comment on this proposal. Please feel free to contact our office should you have questions or comments.

Sincerely,

Frank Balz

Vice President for Research and Policy Analysis

Response

Thank you for your feedback responding to a 30-day request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) 2017-20. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The Paperwork Reduction Act (PRA) provides an opportunity for an open and public comment period where comments on collections can be made. We are grateful for this process and your comment.

The longitudinal studies conducted by NCES are effective, valid, and reliable in documenting financial aid, demographic, and attainment data for Pell recipients from the time they enter college to several years after they leave higher education. However, because these data are collected using statistical sampling methods, institution-level data are not available to the public. In addition, these recurring surveys are not conducted every year but on a two-year or longer basis. NCES is excited about the ability to supplement the sample survey data with Pell Grant and Stafford Loan data but must acknowledge that institutions are uniquely equipped to define cohorts of students for purposes of measuring graduation and outcome measures.

The disclosure of graduation rates for Pell Grant recipients under the Higher Education Act (2008) requires institutions to post these rates on their websites. By collecting these rates in a central location, NCES can offer these statistics to the public through the College Navigator consumer tool and the IPEDS Data Center. Since the statistics are already being calculated by institutions, adding the fields to IPEDS should not require much additional burden. Having degree-granting institutions submit this already collected information to a central repository will have value to policy makers, prospective students and their families, government entities, and others with an interest in postsecondary education.

Title IV financial aid programs have had an undeniable effect on the ability of low- and middle-income students to access and flourish in postsecondary education. Understanding the total counts of Pell Grant recipients at every degree-granting institution, coupled with their outcomes (transferred out, graduated, still enrolled, or unknown), will provide a piece of information that has remained elusive to prospective students and their families, policy makers, researchers, and others interested in postsecondary education. This new set of outcome measures will have a profound impact on the nation's understanding of enrollment and completion propensity for students who receive Pell Grants and those who do not.

We understand that creating annual cohorts causes some additional burden. However, the primary purpose of the Outcome Measures (OM) survey is to address the limitations of the Graduation Rates survey component, which collects data on first-time, full-time (FTFT) undergraduate students across degree-granting institutions. Academic reporting institutions are currently required to report a Fall census-based cohort, leaving out students who were admitted outside of the Fall census. The limitations of defining a cohort as FTFT and fall census have been strongly critiqued by several past Administrations, Congress, the media, and data users. The OM survey component allows for a more complete collection and a more inclusive depiction of overall student progression for an increasingly diverse undergraduate student population. We thank you for taking the time to provide comment.

Comment

Document: ED-2016-ICCD-0127-0034

Name: Anonymous Anonymous

Date posted: February 21, 2017

We oppose the proposed changes to the Outcome Measures survey. Not only will these changes place an unfunded burden on reporting institutions, but also we believe the methodology will produce results that are difficult to interpret. The proposed instructions for the Outcome Measures Survey specify that all institutions will report using a full-year cohort, which includes all degree/certificate-seeking undergraduate students entering the institution between July 1, 2009 and June 30, 2010. Further, the instructions state that students will be assigned one of eight sub-cohorts upon entry and will remain in the assigned sub-cohort. Here are our concerns about this methodology:

Concerns with defining the full-year cohort

Almost all our students who start in the summer are enrolled in a part-time status; thus, we would place these students in a part-time sub-cohort. However, many of these students enroll in a full-time status for the remainder of their academic career. Grouping those who are primarily full-time but happened to start as part-time in a summer term with those who are primarily part-time will obfuscate the outcomes data for part-time sub-cohorts.

The partition of the summer term into two academic years is challenging. Our institution offers several sessions throughout our summer term, some of which start before July 1 and some of which start after. To determine the appropriate full-year cohort for students who start in summer, we must examine the start date of each summer class in which the student is enrolled and determine whether the earliest class started before or after July 1. This requires the examination of much more granular data compared to those who start in the fall or spring terms.

In addition, the start date of some summer sessions is not consistent year-to-year. For example, in summer 2009, all of our major summer sessions started before July 1; thus, we would not include any of the students who started in summer 2009 in the July 1, 2009 through June 30, 2010 full-year cohort - these students would have been in the previous cohort. This reduces the year-to-year stability and the interpretability of our four-year award status rate.

Concerns with reporting award status

We are concerned that all students in the full-year cohort have the same award status date of August 31 regardless of when they started during the academic year. For example, the award status at four years after entry for those who start on July 1, 2009 represents four years and 2 months while the status of those who start on June 30, 2010 represents three years and 2 months. Institutions who enroll a larger percentage of their students at the beginning of the cohort year appear to have an advantage over institutions who enroll a smaller percentage of students at beginning of the cohort year.

Response

Thank you for your feedback responding to a 30-day request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) 2017-20. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The Paperwork Reduction Act (PRA) provides an opportunity for an open and public comment period where comments on collections can be made. We are grateful for this process and your comment.

Requiring institutions to report Outcome Measures (OM) data on a full-year cohort meets an imperative need of the federal government. The proposed changes would allow for the inclusion of more students, in particular those who enroll in the spring and have not been included in prior cohorts. NCES strongly considered the increased institutional burden and determined that the need to be accountable and transparent to the public outweighs the change in burden, particularly for Pell Grant recipients. In 2014-15 the federal government disbursed $30.3 billion in Pell Grants to 8.4 million full- and part-time undergraduate students (Federal Student Aid Data Center). The Pell Grant program is a large commitment of public dollars. IPEDS began collecting Pell Grant and Subsidized-No-Pell data, which are the Higher Education Act disclosure rates, on its Graduation Rates survey. Thus, institutions already have the data set up for first-time, full-time cohorts. As an extension, NCES seeks to enhance information on the graduation rates of Pell Grant recipients through OM. The federal government needs comparable and comprehensive institutional data that reflect the outcomes of undergraduate and Pell-recipient populations. To this end, NCES has proposed definitions that will allow for the selection of students based on matriculation date and enrollment status.

Many institutions in the United States have enrollment patterns specific to that institution or a group of similar institutions. The fact that a specific enrollment pattern would show a different completion rate is one of the advantages of the OM survey, because this information will become available to students and their families, policy makers, researchers, and others interested in postsecondary education. Additionally, your observation that institutions sometimes change their academic calendar is true but does not detract from the need to share this valuable information with the public; deviations from a prior year can be explained when they result from changes in administrative process. It is also true that an institution that brings in students later in the academic year would have less time to show those students as graduates. However, given that the OM reporting period is 8 years, a difference of a few months is unlikely to have a significant effect on outcome rates, especially considering that one of the reporting options is to report the student as still enrolled at the reporting institution.
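
As a minimal sketch of the kind of rule at issue (the field names and the July 1 boundary logic below are illustrative assumptions rather than official instructions), a student's full-year cohort could be assigned from the start date of the earliest session in which the student first enrolled:

```python
from datetime import date
from typing import Iterable

def cohort_year(session_start_dates: Iterable[date]) -> str:
    """Label the full-year cohort (e.g., '2009-10') based on the start date of
    the student's earliest session. A July 1 - June 30 reporting year is
    assumed, matching the proposed full-year cohort definition."""
    first_start = min(session_start_dates)
    # Sessions beginning on or after July 1 belong to the year that starts that July 1.
    start_year = first_start.year if first_start >= date(first_start.year, 7, 1) else first_start.year - 1
    return f"{start_year}-{str(start_year + 1)[-2:]}"

# A student whose earliest summer session began June 22, 2009 falls in the
# 2008-09 cohort; one whose earliest session began July 6, 2009 falls in 2009-10.
print(cohort_year([date(2009, 6, 22), date(2009, 7, 20)]))  # -> '2008-09'
print(cohort_year([date(2009, 7, 6)]))                      # -> '2009-10'
```

Under such a rule the assignment depends only on the earliest session start date, which institutions already record, rather than on a full term-by-term reconstruction.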

Comments related to OM full-year cohort interpretability and data quality

Document: ED-2016-ICCD-0127-0025

Name: Kathleen Schwan Minik

Date posted: February 13, 2017

We have reviewed proposed changes to the IPEDS Outcome Measures survey. While we share concerns about the importance of representing outcomes of all students, we feel the need to express concerns about some of the changes. In particular, switching to a full-year entering cohort elicits a number of concerns. The inclusion of more student data does not necessarily make the data more meaningful or aid interpretability, while adding to institutional reporting burden. Following are some of the issues we are concerned with regarding the proposed changes:

1. Different measurement intervals. Students who start in different semesters, e.g. Fall and Spring, would have the same ending point. What might be called a six-year rate would actually be a 5 year rate for spring entrants. Institutions that have many spring entrants would be affected by this and would, all else equal, show lower rates than those with mostly fall entrants.

2. Different enrollment characteristics. Students entering in fall and spring semesters may have different characteristics and enrollment patterns. Institutions with a large number of fall entrants would be compared to those who have a mix of fall and spring entrants, and this could provide misleading data.

3. Use of retrospective cohorts. When institutions recreate cohorts from eight years back, it would seem there is the potential for some issues with data quality for some institutions. Is there enough confidence that all institutions will have the capability to make these substantial changes?

4. Students may change their load during the year. Students could change from full- to part-time or from part-time to full-time during the year. Guidance will need to be provided to ensure consistency in reporting.

5. Multiple completion rates. There will be a graduation rate and the outcome measure completion rate that will have substantially different cohorts of students. This has the potential of causing confusion by those viewing this data.

6. Comparisons might include the same students. Students could be counted at more than one institution in the same interval if they attend an institution in fall and then change to another in spring.

7. Small cohorts of students. The problem of small population sizes could be exacerbated by using sub-cohorts of Pell grant students, which could further reduce population sizes. For example, our institution generally has relatively few first-time, part-time students. There may be a potential for student information to be identifiable when there is a small number of students. There are also concerns about the use of percentages when the population size is small, as percentages could fluctuate widely from year to year. Consideration might be given to requiring data only when the population reaches a certain size.

8. Pell cohorts. Guidance will need to be provided about who to include as Pell recipients. For example, there may be some students who need to return their award due to attendance issues and we would need to know whether to include these students.

While it would be ideal to be able to adequately represent the outcomes of all students and provide useful comparisons, proposed changes to the Outcome Measures survey, particularly the switch to a full-year cohort, do not appear to us to substantially increase data that could be meaningfully used. Considering the potential data quality issues, the extra work done to provide this data would be unduly burdensome.

Kathleen Schwan Minik, Sr. Research Analyst

Jane Baranowski, Associate Registrar

Rebecca McHugh-Reindl, Sr. Assistant Registrar

Alverno College, Milwaukee, Wisconsin

Response

In earlier responses in this document NCES has addressed comments about the different measurement intervals. With regard to Spring matriculants, it is true that they would have the same ending point as Fall matriculants, but it is also true that the 4-, 6-, and 8-year rates are all collected at the 8-year point in time. In addition, "still enrolled at the reporting institution" is one of the reporting options, so students who are still enrolled will be reported as such. One of the reasons the OM survey was created was to address the limitations of the Graduation Rates survey, which defines a fall cohort for academic reporters. The OM survey is a more complete measure of student outcomes: it covers a more diverse set of students and allows for a broader set of awards and outcomes. Measuring the ultimate outcomes for all students matriculating in a single year will provide an essential piece of information to the nation on its Title IV eligible postsecondary institutions.

In virtually all cases, institutions have been reporting to IPEDS for a decade or more and have kept records on their reporting methods and the data used. Institutions have been reporting 12-month enrollment, completions, and other student-based reports for many years and likely have the requisite data to prepare the OM reports accurately. NCES has worked to prepare priority rules describing how to define students in these cohorts and will be certain to have the IPEDS Help Desk prepared to support any institutions needing technical assistance.

NCES is working on a communication to explain the differences between OM and GR and asserts that the OM statistics are already being collected and are already part of the public space. Students have always had the potential to be in multiple institutional reports in IPEDS and OM is no different. We agree that small cell sizes will need to be considered when publishing statistics along with proper documentation of the cohorts.




