IPEDS 2013-2016 Response to 60-day Public Comment

Integrated Postsecondary Education Data System (IPEDS) 2013-2016

OMB: 1850-0582

Public Comments Received During a 60-day Comment Period and NCES Responses

Comments on ED-2013-ICCD-0029-0001
Agency Information Collection Activities; Comment Request; Integrated Postsecondary Education Data System (IPEDS) 2013-2016

Docket: ED-2013-ICCD-0029
Integrated Postsecondary Education Data System (IPEDS), Web-Based Collection System
Comment on FR Doc # 2013-05958

Comment 1

Document: ED-2013-ICCD-0029-DRAFT-0009
Name: Nicole Vachon
Address: Bangor,

Email: [email protected]
Organization: New England School of Communications
Government Agency Type: Federal

Date: March 20, 2013


Please be aware that schools may not have software capable of manipulating data requested on the IPEDS report. For departments that are already understaffed and trying to keep up with ongoing federal regulation, this report takes valuable time that could be spent working with students.

NCES Response:

Dear Ms. Vachon,

Thank you for your comment dated March 20, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The Paperwork Reduction Act (PRA) provides for an open public comment period during which comments on proposed collections can be made. We are grateful for this process and for your comment.

We recognize that not all institutions required to report IPEDS data have the same level of staffing and technological resources. We work closely with the postsecondary education community to provide well-designed and efficient collection forms and work directly with the IPEDS keyholders to facilitate their IPEDS data submissions. NCES has worked to provide a data submission tool that can be used with most major web browsers without the purchase of additional software. We continue to work to provide institutions with as many resources as possible to help ease the burden of the IPEDS reporting process.

NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with the task of minimizing the additional institutional burden as a result of increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

We recognize that IPEDS reporting takes time; however, the information collected not only provides a common set of information for prospective students to use when deciding whether to enroll in postsecondary education, but is also regularly used by the Department of Education and other federal agencies, researchers, policy makers, and others to monitor and improve postsecondary education for the public.

If you need further assistance with data, the IPEDS Help Desk is available at 877-225-2568 and at [email protected].

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 2

Document: ED-2013-ICCD-0029-DRAFT-0010
Name: Bernard Fryshman
Address: Brooklyn, NY

Contact Phone: 718-253-4857

Date: March 20, 2013



To: The United States Department of Education


Introduction


As always, I am most grateful for the opportunity to comment. Unfortunately comments such as mine are evaluated in the same office which proposed the regulations. This creates an inevitable conflict which, in my opinion, prevents input from the public and the educational community from being evaluated in an objective and transparent manner. The Department should create a venue wherein serious presentations from the public are discussed comprehensively, in open session, to ensure that input has been taken into account prior to a final decision being made.


Burden


While the estimated number of responses (77,600) may be reasonable, the estimated number of annual burden hours has been seriously underestimated. An average 12.3 hours per response does not take into account the fact that submitting IPEDS documents in a responsible manner requires school officials to identify sources of information, to assign tasks to various staff members, to gather data, to validate it, to clarify questions, to complete the documents and to submit them.


The smallest responding institution will require more than 12.3 hours for this purpose; moderate to large institutions can require four, five and six times as many hours. Furthermore, the process of completing IPEDS requires trained individuals who must be maintained as part of staff throughout the year. In essence, gathering data for NCES has become a serious cost center in a postsecondary setting, diverting human and monetary resources away from teaching and learning.


A much more realistic assessment of burden than the 954,030 hours listed would be the total cost associated with this process. It is not likely that any of the 7,500 institutions mentioned spends less than $2000 a year in staff time and in activities associated with IPEDS. This provides a base cost to Higher Education of at least 15 million dollars a year. If one extrapolates from small institutions enrolling less than 350 students to systems which enroll as many as 40,000 students, the cost burden to the nation could readily rise by a factor of 5!


I would only add that this is not money furnished by the Department of Education, but funds taken directly out of the budgets of hard pressed institutions. 75 million dollars could make an important difference in the lives of thousands of students, were it made available for tutorial service, extra classes, improved laboratory and study equipment, and counseling.


"Burden" provides the background against which my comments should be read. The Department of Education, acting through the National Center for Educational Statistics, should make every effort to reduce burden and demonstrate that IPEDS Data Collection adds value to Higher Education.


Sampling


The expertise of people associated with the field of data collection is extremely impressive and raises the question as to why most of the purposes of IPEDS cannot be served using sophisticated sampling techniques. The onus must be on the Department to establish that information cannot be determined through well established sampling techniques. Information that is needed for each individual school can be gathered every several years rather than on an annual basis, since changes in Higher Education are gradual rather than precipitous.


Small Entities


The Department has undertaken initiatives to reduce the number of IPEDS elements to be collected from small entities (say, schools enrolling 350 or fewer students). Unfortunately the number of such data points eliminated has been relatively small, and much more needs to be done. One must keep in mind that gathering information in order to develop broad national trends certainly does not need figures from schools which make up so infinitesimal a part of the total.


Students seeking information are evidently not served by IPEDS Data regarding such small entities. A school enrolling less than 350 students has usually not benefitted from decades of IPEDS Data Collection, and if it were free to choose, would likely elect to avoid the expense and travails associated with IPEDS Data Collection. This is true, even if it means the school's data would not be available for students looking for a suitable school.


I believe the Department could develop a one page 'ez IPEDS' to gather only that information which is explicitly specified in the statute; small entities could be permitted to submit information every three years instead of every year, since year by year changes are usually insignificant. And finally, the Department could ask small institutions for suggestions as to how to ease the IPEDS Data Collection burden.


Sunset


NCES must demonstrate that it examines all data elements collected for continued relevance, justifying their inclusion in subsequent IPEDS Collection instruments.


In carrying out such an examination, the Department must structure its TRP committees in a manner representative of Higher Education as a whole. Populating committees mainly with IR professionals distorts the process, since these people appreciate data, use it, and view its collection as highly important to their work. This is a legitimate point of view, but it is certainly not a perspective which reflects the thinking of Higher Education as a whole.


Optimally, there would be an automatic Sunset provision which would require that the collection of every data element be justified anew. In the absence of such a provision, there must be a much more effective examination of elements in the future.


Policy


The Department of Education must demonstrate that the decades of data collected as part of IPEDS have led to enhancements to policy and practice in Higher Education, and are useful to the public and to legislators.


Governmental Sources


Schools should not be taxed with the responsibility of gathering data which can be obtained elsewhere in government. Once again, it is the responsibility of the Department of Education to ascertain this and to simplify the IPEDS burden accordingly.


NCES Response:

Dear Dr. Fryshman,

Thank you for your comment dated March 20, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The Paperwork Reduction Act (PRA) provides for an open public comment period during which comments on proposed collections can be made. We are grateful for this process and for your thoughtful comments.

The burden of the IPEDS data collection has been and remains a crucial factor in decision making surrounding the collection. NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with the task of minimizing the additional institutional burden as a result of increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

These meetings of the TRP have had a significant impact on how the IPEDS collection is designed and implemented, as is evident in the citations provided in the Supporting Statement Part A. NCES is aware of the careful balance that must be struck between the burden of the collection on postsecondary institutions and the value of the information collected for regulatory purposes. Although there is no formal method established for the calculation of burden, NCES has employed the recommended practice of working with respondents to better understand burden and the respondent experience. NCES strives to provide accurate burden estimates based on these interactions and on empirical data from the collection systems. Burden costs to respondents are calculated by multiplying the estimated number of response burden hours by an estimate of the loaded average hourly cost of a data analyst and related equipment ($37.15 for 2013-14). This information is detailed in the Supporting Statement Part A (Section A.12). Per Part A.12, total estimated costs to respondents for 2013-14 through 2015-16 are as follows:

Table 12.

            Estimated Total Burden Hours    Estimated Cost to      Average Estimated Costs
            for All Institutions            All Institutions       Per Institution
2013-14     843,820                         $31,347,913            $4,180
2014-15     907,640                         $34,390,480            $4,585
2015-16     1,037,130                       $40,085,075            $5,345
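
To make the calculation described above concrete, the following minimal sketch (in Python, for illustration only; not an official NCES tool) reproduces the 2013-14 row of Table 12 from the stated $37.15 loaded hourly rate, and shows that dividing the total cost by the average cost per institution implies roughly 7,500 reporting institutions. Hourly rates for the later years are not stated above, so only the first row is checked.

    # Illustrative check of the 2013-14 row of Table 12 (assumes the stated
    # $37.15/hour loaded rate applies to all 2013-14 burden hours).
    burden_hours_2013_14 = 843_820      # estimated total burden hours, all institutions
    loaded_hourly_rate = 37.15          # loaded average cost of a data analyst and equipment

    total_cost = burden_hours_2013_14 * loaded_hourly_rate
    print(f"Estimated cost to all institutions: ${total_cost:,.0f}")        # ~$31,347,913

    avg_cost_per_institution = 4_180    # from Table 12
    implied_institutions = total_cost / avg_cost_per_institution
    print(f"Implied number of reporting institutions: {implied_institutions:,.0f}")  # ~7,500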



Executive Order 13610, “Identifying and Reducing Regulatory Burdens,” requires agencies to take continuing steps to reassess regulatory requirements and, where appropriate, to streamline, improve, or eliminate those requirements. While IPEDS already reflects many of the recommendations cited to reduce burden, NCES remains vigilant in exploring the possibility of using a short form option (Item 2), the use of sampling (Item 5), and exemptions or streamlining for small entities (Item 3). In the most recent meeting of the National Postsecondary Education Cooperative (NPEC), on 6/13/2013, the Program Director for IPEDS outlined a plan to investigate the efficacy of using short forms in the IPEDS collection for smaller institutions. Timing for findings related to this investigation is contingent on available resources, but the investigation should be completed prior to the submission of the next IPEDS clearance.

There is always outreach for TRP sessions. In general, members of NPEC and keyholders are among those who are notified of upcoming TRPs; keyholders and NPEC members tend to be from the institutional research community. The topic of each TRP also influences who is contacted. For instance, the recent TRP on financial aid had more financial aid professionals attending. RTI also makes an effort to identify consortia that are relevant to each topic and invites them to the meeting. Lastly, there is always an invitation for comments from the public related to each topic, and all received comments are reviewed.

The data collected by IPEDS have been central to postsecondary education policy research for decades and have been utilized by students, parents, postsecondary institutions, nonprofit organizations, federal, state, and local governments, individual researchers, and other groups. NCES remains committed to soliciting feedback on these collections through the PRA process, outreach to data providers and users, coordination with federal constituents and partners, TRPs, advisory committees like NPEC, cognitive research, and outreach to the data community. The thoughtful advancement of this collection is essential for it to remain a key data source for postsecondary education.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 3

Document: ED-2013-ICCD-0029-DRAFT-0011
Name: James Smith
Address: Monroe, LA,

Email: [email protected]
Organization: Louisiana Delta Community College
Government Agency Type: State

Date: April 8, 2013


Virtually every year IPEDS asks for more information, much of which is difficult to extract. IPEDS has taken on a life of its own. It is difficult for smaller institutions to keep up with all the changes and additions due to staff limitations.

Much of the information, Accounting and Human Resources in particular, is none of the government's concern and should be omitted. IPEDS is a prime example of a governmental organization which started out small, has grown out of control, and is accountable to no one.

Start removing items from IPEDS rather than adding to the burden of already overworked IR departments.

NCES Response:

Dear Mr. Smith,

Thank you for your comment dated April 8, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.

NCES works closely with the postsecondary education community, and feedback on possible changes to the IPEDS collection is actively sought through Technical Review Panel (TRP) meetings, the National Postsecondary Education Cooperative (NPEC), and the Paperwork Reduction Act (PRA) comment process.

NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. NCES has worked with the postsecondary community, regulators, and data users to identify areas where the data collection may be reduced and is committed to continuing these efforts. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with the task of minimizing the additional institutional burden as a result of increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

The Human Resources (HR) survey component, and specifically the reporting of racial/ethnic and gender data for institutional staff, is also mandated by P.L. 88-352, Title VII of the Civil Rights Act of 1964, as amended by the Equal Employment Opportunity Act of 1972 (29 CFR 1602, subparts O, P, and Q), in odd-numbered years (i.e., 2007-08, 2009-10, etc.) for institutions with fifteen (15) or more full-time employees.

The federal government has made an effort to align data collections to postsecondary institutions under a single program, IPEDS. The HR component that collects race, ethnicity, and gender data (previously referred to as the fall staff section, and required in odd-numbered years) replaces the former EEO-6 survey, and is used by the Equal Employment Opportunity Commission (EEOC) in place of their data collection efforts. Under Public Law 88-352, Title VII of the Civil Rights Act of 1964, as amended by the Equal Employment Opportunity Act of 1972, all institutions of higher education that have 15 or more (full-time) employees are required to keep records and to make such reports biennially to EEOC. NCES now collects the data and provides them to EEOC as required in their regulations. The Office for Civil Rights (OCR) and the Office of Federal Contract Compliance Programs (OFCCP) of the Department of Labor also use these data. The filing of race, ethnicity, and gender data on staff is mandated under Section 709(c) of Title VII.

The data provide information on staffing levels at the institutions for various occupational categories and are used extensively in peer institution analysis, manpower utilization studies, and in examining the health of the institutions. Good quality data on racial/ethnic composition of postsecondary employees are useful to EEOC and OCR for monitoring compliance with Title VII. On an annual basis, institutions also classify all of their employees by full- or part-time status, faculty status, and occupational category; in addition, medical school staff are reported separately.

Salary outlays for full-time instructional staff and other full-time employees are also collected annually. These data are used by:

  • The U.S. Department of Education's Grants and Contracts Service, which makes frequent use of the salary data collected by NCES to set standards for expected salary outlays during grants and contracts negotiations processes;

  • The U.S. Bureau of Labor Statistics (BLS), Department of Labor, which includes salary data when developing its Occupational Outlook Handbook.

The House Labor and Human Resources Committee, the Office for Civil Rights, and the Bureau of the Census have requested trend data. State agencies rely on salary data to determine budgets for their state-supported institutions and to make comparative studies with other states. Institutions use salary data to establish their own compensation packages, and institution officials study the compensation packages offered by their peers and/or competitors prior to developing their salary schedules.

The Accounting data, or Finance survey component as it is referred to in the supporting statement, are needed for reporting and projecting the revenues and expenditures of a national activity representing a significant component of the GNP. To ease reporting burden and enhance the comparability and utility of the finance data, IPEDS redesigned the data collection instruments to conform to the accounting standards governing both public and private institutions.

The Department of Education's Title III (Institutional Aid) grant program relies on the finance data to help determine whether or not an applicant college or university is eligible to receive a grant. These data are needed annually. The Government Accountability Office published a report on Postsecondary Education Financial Trends in Public and Private Nonprofit Institutions for the U.S. Senate Committee on Health, Education, Labor, and Pensions that used IPEDS finance data. The National Science Foundation is a regular user of IPEDS finance data. The Bureau of the Census relies on this form to collect data required in its census of governments. NCES and Census worked closely to ensure that one instrument satisfied the needs of both agencies. The Bureau of Economic Analysis also contributed significantly to this endeavor. The Office of Management and Budget asked NCES to collect these data because the Bureau's survey universe was a subset of the IPEDS universe. The Bureau of the Census also uses the data from other parts of the survey to:

  • develop estimates of state and local governments' finances to provide to the Bureau of Economic Analysis for calculation of the Gross National Product; and

  • collect supplemental data that their census of governments does not collect.

The Bureau of Labor Statistics and the Federal Mediation and Conciliation Service are secondary users of NCES/Census finance data. The Office for Civil Rights has used finance data to determine states' or institutions' compliance with anti-discrimination laws. From these data OCR was able to determine whether or not predominantly black, publicly controlled institutions were being discriminated against through funding decisions made by state boards of higher education. The Bureau of Economic Analysis of the U.S. Department of Commerce uses financial statistics to prepare totals and forecasts on total non-farm expenditures for structures and equipment, and to develop Gross National Product accounts. Increasing numbers of state agencies use the NCES Finance report to assemble data to plan and evaluate their higher education policies.

Among associations, the American Council on Education (ACE), the Association for Institutional Research, the Brookings Institution, the Carnegie Foundation for the Advancement of Teaching, and The Delta Cost Project are frequent users of Finance data. Researchers from these and other organizations use the data to assess the economic future of the nation's colleges and universities.

NCES will continue to assess the IPEDS collection for its appropriateness and utility by reviewing existing regulations, working with data users and postsecondary institutions, and through public comment periods like this one.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 4

Document: ED-2013-ICCD-0029-DRAFT-0012
Name: Mary Harrington
Address: University, MS,

Email: [email protected]
Organization: University of Mississippi
Government Agency Type: State
Government Agency: University of Mississippi

Date: April 8, 2013


Having to calculate graduation rates for non-first-time students will create a tremendous reporting burden for institutions. And the resulting calculation will not be meaningful because transfer students can come with one semester of academic work or eight semesters of academic work, resulting in very different graduation rates. If the decision is made to go ahead with this, please consider postponing it for another year so that we can get prepared.

NCES Response:

Dear Ms. Harrington,

Thank you for your comment dated April 8, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.

The IPEDS Technical Review Panel discussed the issue of cohort size at length. The TRP acknowledged that some of the four cohorts could be very small at some institutions, but noted that the “traditional” full-time, first-time cohort used for Graduation Rates reporting is itself very small and non-representative of the student body at many institutions. The TRP felt that expanding IPEDS to collect outcome information for part-time and non-first-time students would significantly increase the comprehensiveness of outcome data available at the federal level.

NCES understands that any aggregation of students will not allow for a complete description of the students within it. However, NCES works to balance the potential burden imposed upon an institution with the value of the data provided. In this case, the use of only four cohorts is an attempt to strike a meaningful but not overly burdensome balance.

It is true that the cohort definition does not account for the academic circumstance of the students within the cohort. This is similarly true of admitted first-time students that may test out of classes or bring in college credits from advanced placement coursework and/or exams. While the assumption that all students within a cohort are similarly prepared cannot be made, data collected through the new Outcome Measures component will provide a more complete representation of completion for degree and certificate-seeking undergraduates in U.S. postsecondary education.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 5

Document: ED-2013-ICCD-0029-DRAFT-0013
Name: Rick Jenkins
Address: Little Rock, AR,

Email: [email protected]
Submitter's Representative: Rick Jenkins
Organization: Ark. Dept. of Higher Education
Government Agency Type: State
Government Agency: Ark. Dept. of Higher Education

Date: April 8, 2013


Institutional Burden – I believe that this will be a substantial administrative burden on many smaller institutions. In Arkansas, we have 22 two-year colleges, many of which have only 1 or 2 dedicated institutional research staff. Some have no dedicated institutional research staff. The requirements of TRP 40 will be an immense burden to these institutions. The same is true for our private/independent institutions. And for Arkansas, the only increases in funding for higher education for several years have come through increases in tuition.

Utility of Measure – I also question the usefulness of the measure. Since you are including non-first-timers, this will include second-year freshmen, sophomores, juniors, and seniors. Will the resulting percentage rate be comparable from year-to-year? I have real concerns that it will not be.

Fall Cohort – I believe that a focus on the fall cohort is a problem. Nationwide, many institutions are offering courses in many different time frames than the typical fall, spring and summer terms. Many institutions are offering 3-4 terms in addition to the traditional term. I know of one school that has 4 sub-terms in the fall, 4 in the spring, and another 4 in the summer. The fall term is no longer considered the term in which college students begin their higher educational career. Some 2-year colleges in Arkansas have only 55-65% of first-time entering students beginning in the fall. An unduplicated annual count would be much more representative.

Transfers – All institutions do not participate in the National Student Clearinghouse due to the cost. Many institutions will not be able to produce transfer information.

NCES Response:

Dear Mr. Jenkins,

Thank you for your comment dated April 8, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.

We recognize that not all institutions required to report IPEDS data have the same level of staffing and technological resources.  We work closely with the postsecondary education community to provide well designed and efficient collection forms and work directly with the IPEDS keyholders to facilitate their IPEDS data submissions. We continue to work to provide institutions with as many resources as possible to help ease the burden of the IPEDS reporting process.

NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with the task of minimizing the additional institutional burden as a result of increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

The exclusion of non-first-time students has brought criticism that the IPEDS collection fails to properly measure postsecondary student movement. It is true that the cohort definition does not account for the academic circumstances of the students within the cohort. This is similarly true of admitted first-time students who may test out of classes or bring in college credits from advanced placement coursework and/or exams. It would be inappropriate for a data user to assume homogeneity of cohorts across different institutions, or within the same institution across different cohort years. However, many institutions feel underrepresented in the first-time-only cohort reporting system because they matriculate a high number of non-first-time students. While the assumption that cohorts are similarly prepared cannot be made, the additional cohorts will provide a more complete representation of completers for the U.S. postsecondary education sector.

The Fall Enrollment collection provides valuable information on what is, for many institutions, the beginning of their school year. NCES is aware that fall enrollment statistics do not account for all students, and in some cases do not count large numbers of students. For this reason, NCES also has the 12-month Enrollment collection, which collects an unduplicated headcount for postsecondary institutions. These two collections capture the majority of enrollment patterns while minimizing the burden that would be needed to collect all or even most of the other enrollment patterns in postsecondary education.

NCES is aware that many institutions do not participate in the National Student Clearinghouse. This topic was discussed at a technical review panel (TRP) on outcome measures. The TRP discussed the issue of institutions reporting subsequent enrollment data and suggested an outcome category of “subsequent enrollment status unknown” to address situations where these data are not available. The collection allows institutions to report “Unknown” as a valid response. Many institutions will have some information on external outcomes because of their participation in State Longitudinal Data Systems (SLDS) and other reciprocity or consortium-like arrangements that allow for the identification of students who re-enroll at other institutions.

Prior to addressing your comments on the new Outcome Measures component of IPEDS, we would like to clarify that IPEDS will continue to collect graduation rates as it has in the past. The impetus and work to develop the new outcome measures have taken several years to bring to fruition.

The Higher Education Opportunity Act of 2008 established the U.S. Department of Education’s Committee on Measures of Student Success to advise the Secretary of Education in assisting 2-year degree-granting institutions of higher education in meeting the completion or graduation rate disclosure requirements outlined in the Higher Education Act of 1965, as amended. The Committee completed its work in December 2011.

In its final report to the Secretary, the Committee noted that the “current federal graduation rate measure is incomplete and does not adequately convey the wide range of student outcomes at 2-year institutions.” In addition, the Committee observed that “data are not collected on other important outcomes achieved by students at 2-year institutions.”

The Committee recommended that the Department:

  • Broaden the coverage of student graduation data to reflect the diverse student populations at 2-year institutions;

  • Improve the collection of student progression and completion data;

  • Improve technical guidance to institutions in meeting statutory disclosure requirements; and

  • Encourage institutions to disclose comparable data on employment outcomes and provide incentives for sharing promising practices on measuring student learning.

Although its work focused on 2-year institutions, the Committee suggested that its recommendations be considered and implemented for 4-year institutions as well. The Committee’s final report is available at http://www2.ed.gov/about/bdscomm/list/cmss-committee-report-final.pdf.

In April 2012, the Department released an action plan for improving measures of postsecondary student success in support of the Administration’s college completion agenda and based on the recommendations of the Committee on Measures of Student Success (http://www.ed.gov/edblogs/ous/files/2012/03/Action-Plan-for-Improving-Measures-of-Postsecondary-Student-Success-FINAL2.pdf).

The Department’s action plan is designed to improve the quality and availability of student success data at the federal level for consumers, institutions, policymakers, and researchers. This plan also includes activities to help institutions, systems, and states increase their capacity for collecting and disseminating data on student success. Various offices within the Department are responsible for implementing the activities within the plan. In the plan, the Department has committed to “revise, where feasible under its current authority, existing data collection vehicles to include more comprehensive measures of student success for a broader group of students.”

As part of this activity, the NCES has taken steps to enhance graduation rate and transfer rate reporting in IPEDS. Using existing processes for considering changes to IPEDS, NCES examined the feasibility of broadening measures by collecting outcome information for part-time, degree/certificate-seeking undergraduate students and non-first-time, degree/certificate-seeking undergraduate students in IPEDS.

IPEDS TRP 37 was convened in February 2012 to discuss the feasibility of collecting outcome information on part-time, first-time students. The TRP suggested that NCES clarify the definition of a degree/certificate-seeking student for IPEDS reporting purposes and collect certain outcome information in IPEDS for part-time, first-time students.

IPEDS TRP 40 was convened in October 2012 to discuss the feasibility of collecting outcome information on non-first-time students. The TRP suggested that NCES collect certain outcome information in IPEDS separately for full-time and part-time, non-first-time students, similar to information that TRP 37 proposed for part-time, first-time students. The TRP also suggested that similar outcome information be collected for full-time, first-time students. Outcome measures information will be submitted by degree-granting institutions only.

The new outcome information that institutions would report to IPEDS is designed to provide consumers, policymakers, and researchers context for and an alternative to the graduation rates calculated for the purposes of the Student Right to Know and Campus Security Act of 1990.

To expedite the availability of data that will be useful to consumers, policymakers, and researchers, TRP 40 suggested that institutions report on student outcomes retrospectively. If a prospective reporting model were used, outcome measures data would not be available until 2023.

As a result of TRP suggestions and public comments, NCES has requested clearance to implement a new Outcome Measures component in the Winter collection. The burden for this part of the collection is substantial, with an estimated 30.1 hours per institution for a total of 147,490 hours in the first year. After the first year of reporting, when the method for organizing this information is established, the estimated burden drops to 15 hours per institution, or 73,500 hours for all institutions. Based on the work done in the TRPs, we believe that adding this information to the national knowledge base on outcomes is justified.
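
As a quick consistency check on the estimates above, the sketch below (Python, for illustration only) divides the total hours by the per-institution hours for each year; under the assumption that both figures refer to the same universe of degree-granting reporting institutions, each implies roughly 4,900 institutions.

    # Illustrative consistency check of the Outcome Measures burden estimates above.
    # Assumes the per-institution and total figures cover the same set of institutions.
    first_year = 147_490 / 30.1      # total hours / hours per institution, first year
    later_years = 73_500 / 15        # total hours / hours per institution, later years
    print(round(first_year), round(later_years))   # both imply about 4,900 institutions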

The use of third party data services is not required for the completion of the Outcome Measures component of IPEDS. One outcome category for students who did not receive an award at the reporting institution is that their subsequent enrollment status is unknown; NCES will accept this response even if subsequent enrollment status is unknown for all students. NCES has no expectation or requirement that an institution must share its data with a third party to complete the survey. However, many institutions already participate in state longitudinal data systems (SLDS), the National Student Clearinghouse, or data systems related to articulation and reciprocity agreements that they may use for this reporting if they choose to do so.

After reviewing comments from the public and considering the difficulty in creating retroactive cohorts from 2006, NCES is now proposing in the Information Collection Review (ICR) to begin collecting outcomes information one year later, in 2015-16, using the cohorts that began in 2007. It is our hope that providing an extra year to prepare will alleviate some of the difficulty in creating the cohorts and completing the Outcome Measures component of IPEDS. NCES is committed to working with the postsecondary education community to provide resources that will assist institutions with their outcome measures reporting.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 6

Document: ED-2013-ICCD-0029-DRAFT-0014
Name: Robert Pfaff
Address: Rensselaer, IN,

Email: [email protected]
Organization: Saint Joseph's College (IN)

Date: April 8, 2013


I am the IPEDS keyholder at a 4-year college of about 1,100 students. In my world, institutional research is a part-time obligation. Currently, I also have a 3/4-time teaching load.

For institutions such as ours, IPEDS collection is a nightmare. Whoever at IPEDS thought that changing the 2012-2013 collection schedule so that only 1 survey was due in February and 5 surveys were due on April 10th had absolutely no concept of the hardship that placed on small institutions.

The proposed addition of more survey components in 2014-2015 is a recipe for trouble. Many small institutions, I predict, will not be able to comply on IPEDS' timetable. I really don't think IPEDS needs specific information on library holdings, for example. And I don't see a need to split admissions off into its own survey.

Further, IPEDS only collects three times a year; the workload for institutions should be better balanced across the 9 academic months (for keyholders such as myself who are on 9-month contracts who work alone in IR).

The load placed on IPEDS reporting has gotten out of hand. Please reconsider the addition of more component surveys and reexamine the submission schedule.

NCES Response:

Dear Mr. Pfaff,

Thank you for your comment dated April 8, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. We recognize that not all institutions required to report IPEDS data have the same level of staffing and technological resources.  We work closely with the postsecondary education community to provide well designed and efficient collection forms and work directly with the IPEDS keyholders to facilitate their IPEDS data submissions. We continue to work to provide institutions with as many resources as possible to help ease the burden of the IPEDS reporting process.

NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with the task of minimizing the additional institutional burden as a result of increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

The most recent addition to IPEDS, the Academic Libraries component, was previously an independent data collection that has now been brought into IPEDS from another part of the Department of Education. In this instance, the IPEDS collection is growing not because new information is being asked of institutions, but because it represents movement toward an omnibus postsecondary institutional data collection.

NCES made the changes to the collection cycle in collaboration with the postsecondary institution sector and will monitor their feasibility in the upcoming cycle. So far, your comment on timing is the only one received in this comment period, but NCES will continue to evaluate the IPEDS collection and make every reasonable attempt to reduce burden on the reporting institutions while complying fully with the law and regulations related to this collection.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 7

Document: ED-2013-ICCD-0029-DRAFT-0015
Name: Nancy Iacovone
Address: Immokalee, FL,

Email: [email protected]
Organization: Immokalee Technical Center
Government Agency Type: Local

Date: April 10, 2013


I recommend that there be a District employee assigned as IPEDS Keyholder for the 2 Vocational/Technical Schools of Collier County Public Schools. The IPEDS reporting process is very time-intensive and requires a full-time position to collect the data.

NCES Response:

Dear Ms. Iacovone,

Thank you for your comment dated April 10, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register.

We are aware that not all institutions have the same staffing and technological resources available, and we continue to work with the postsecondary community to provide well-designed and efficient collection forms. NCES has worked to provide a data submission tool that can be used with most major web browsers without the purchase of additional software. Reporting burden has been the subject of multiple technical review panels (TRPs), and NCES continues to identify areas where burden can be reduced. The decision on keyholder assignment resides with the institution, and if a change in keyholder is necessary, the IPEDS Help Desk can assist with that process.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 8

Document: ED-2013-ICCD-0029-DRAFT-0016
Name: Dana Malone
Address: Austin, TX,

Email: [email protected]
Organization: University of Texas System
Government Agency Type: State
Government Agency: State of Texas

Date: April 10, 2013


We are commenting on item #4 "How might the department enhance the quality, utility, and clarity of the information to be collected" related to the Finance Survey as we do not believe that it represents our financial statements due to the way the survey is forcing collection. The survey currently requires all positive impacts on net assets to be reflected as revenues and other additions and negative impacts on net assets as expenses and other deductions. Some of the additions and deductions are transfers and other items which are not properly classified as revenues or expenses, yet when the data is produced, that is where the numbers are reflected. As a result, we often have to explain to our Executive Management why our institution’s numbers appear greater than what is represented on the financial statements. Also some fields will not allow negatives to be entered, thus requiring us to enter the amounts in another section for Net Assets to be reflected properly. In addition, IPEDS forces us to allocate O&M of Plant, Depreciation expense, and Interest expense to the other functions which does not agree to the way our financial statements are reported as a GASB institution. As a result, in the end, what we are reporting does not represent our published financial statements. We believe that what is produced from IPEDS should represent what we publish to be more useful.

NCES Response:

Dear Ms. Malone,

Thank you for your comment dated April 10, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The IPEDS Finance survey component is designed to be based on an institution’s General Purpose Financial Statement (GPFS) to the extent practicable. Based on suggestions by the IPEDS Technical Review Panel in a meeting held in 2007, changes to the Finance component were proposed and approved by the Office of Management and Budget. These changes were designed to improve utility of the data and comparability across data collected from institutions following the Governmental Accounting Standards Board (GASB) and those following the Financial Accounting Standards Board (FASB). We recognize that in some instances, this requires institutions to report data to IPEDS in a slightly different manner than may be represented on their published financial statements.

We will continue to explore how we may address your concerns about whether and when negative numbers are allowable in finance reporting, and we appreciate your providing us with feedback on this issue.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 9

Document: ED-2013-ICCD-0029-DRAFT-0017
Name: Michele Delisle
Address: Rochester, MN,

Email: [email protected]
Organization: Nova Academy of Cosmetology

Date: April 12, 2013


I am responding towards minimizing the burden through Information Technology. I am the Key Holder and as I am completing the surveys it seems much of the information being asked for is contained within the student's ISIR and the NSLDS system. If there could be a way to pull that information electronically for IPEDS there would be very little information left (burden) for the schools to compile and report. I feel the accuracy would be much better coming straight from the source (ISIR) as well. Much less chance for re-keying error.

NCES Response:

Dear Ms. Delisle,

Thank you for your comment dated April 12, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. NCES works closely with the Office of Federal Student Aid and is dedicated to exploring ways in which already existing administrative data systems may be utilized to reduce reporting burden on institutions in the future.

We also want to make sure you are aware that, in addition to manual entry, several options exist for uploading data into the IPEDS data collection system that could help reduce keying errors. These other formats do require that an institution extract the data from its institutional systems and format it into the necessary layout for upload into the IPEDS system. Upon upload, the system runs the same data editing procedures that are applied when information is manually entered. Please contact the IPEDS Help Desk for more information (at 877-225-2568 and at [email protected]).
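
For illustration only, a keyholder's "extract and format" step might look like the minimal sketch below. Every database, table, column, and file layout name in it is a hypothetical assumption for this example; the actual import file specifications are published within the IPEDS data collection system and should be followed when preparing an upload file.

    # Hypothetical sketch of extracting institutional data and formatting it as a
    # flat file for upload. Table, column, and layout names are illustrative only,
    # not the actual IPEDS import specification.
    import csv
    import sqlite3

    conn = sqlite3.connect("student_records.db")   # hypothetical institutional database
    rows = conn.execute(
        """
        SELECT gender, race_ethnicity, COUNT(*) AS headcount
        FROM fall_enrollment
        WHERE term = '2013FA'
        GROUP BY gender, race_ethnicity
        """
    ).fetchall()

    with open("ipeds_upload.csv", "w", newline="") as f:   # file prepared for upload
        writer = csv.writer(f)
        writer.writerow(["gender", "race_ethnicity", "headcount"])
        writer.writerows(rows)

    conn.close()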

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 10

Document: ED-2013-ICCD-0029-DRAFT-0018
Name: Ruthie J. Orsborn
Address: Selma, AL,

Government Agency Type: Local

Date: April 15, 2013


What happens if NCES is not given authorization to continue its IPEDS data collection? If NCES is not authorized to collect IPEDS data, does this mean institutions participating in Title IV financial aid programs are not mandated to report IPEDS data? If the answer is no and they are still mandated to report IPEDS data, to whom do they report?

Is there a reason why reporting IPEDS data is not voluntary for all? If IPEDS collection is voluntary, is authorization required or even necessary?

NCES Response:

Dear Ms. Orsborn,

Thank you for your comment dated April 15, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register.

The IPEDS collection reflects a conglomeration of statutes that mandate the reporting of information. These include a general mandate whereby NCES is authorized by law under Section 153 of the Education Sciences Reform Act of 2002 (P.L. 107-279). Accordingly, NCES "shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including -

  • collecting, acquiring, compiling (where appropriate, on a state by state basis), and disseminating full and complete statistics on the condition and progress of education, at the pre-school, elementary, secondary, and postsecondary levels in the United States, ...;

  • conducting and publishing reports and analyses of the meaning and significance of such statistics;

  • collecting, analyzing, cross-tabulating, and reporting, to the extent feasible, so as to provide information by gender, race, ...; and

  • assisting public and private educational agencies, organizations, and institutions in improving and automating statistical and data collection activities..."

Mandatory Reporting for Institutions with Program Participation Agreements

The completion of all IPEDS surveys, in a timely and accurate manner, is mandatory for all institutions that participate in or are applicants for participation in any Federal financial assistance program authorized by Title IV of the Higher Education Act (HEA) of 1965, as amended. The completion of the surveys is mandated by 20 USC 1094, Section 487(a)(17) and 34 CFR 668.14(b)(19).

Vocational Education Data

IPEDS responds to certain of the requirements pursuant to Section 421(a)(1) of the Carl D. Perkins Vocational Education Act. The data related to vocational programs and program completions are collected from postsecondary institutions known to provide occupationally specific vocational education.

Data on Race/Ethnicity and Gender of Students

The collection and reporting of race/ethnicity and gender data on students and completers are mandatory for all institutions which receive, are applicants for, or expect to be applicants for Federal financial assistance as defined in the Department of Education (ED) regulations implementing Title VI of the Civil Rights Act of 1964 (34 CFR 100), or defined in any ED regulation implementing Title IX of the Education Amendments of 1972 (34 CFR 106). The collection of race/ethnicity and gender data in vocational programs is mandated by Section 421(a)(1) of the Carl D. Perkins Vocational Education Act.

Data on Race/Ethnicity and Gender of Staff

The collection and reporting of race/ethnicity and gender data on the Human Resources (HR) component are mandatory for all institutions which receive, are applicants for, or expect to be applicants for Federal financial assistance as defined in the Department of Education (ED) regulations implementing Title VI of the Civil Rights Act of 1964 (34 CFR 100). The collection of these data is also mandated by P.L. 88-352, Title VII of the Civil Rights Act of 1964, as amended by the Equal Employment Opportunity Act of 1972 (29 CFR 1602, subparts O, P, and Q). Institutions with 15 or more full-time employees are required to respond to the IPEDS Human Resources component under this mandate.

Student Right-to-Know

Sections 668.41, 668.45, and 668.48 of the Student Assistance General Provision (34 CFR 668) were amended to implement the Student Right-to-Know Act, as amended by the Higher Education Amendments of 1991 and further by the Higher Education Technical Amendments of 1993 and 1999. The final regulations require an institution that participates in any student financial assistance program under Title IV of the Higher Education Act of 1965, as amended, to disclose information about graduation or completion rates to current and prospective students. The final regulations also require such institutions that also award athletically related student aid to provide certain types of data regarding the institution's student population, and the graduation or completion rates of categories of student-athletes, to potential athletes, their parents, coaches, and counselors.

Consumer Information

Section 101 of the Higher Education Amendments of 1998 (P.L. 105-244) requires that NCES collect the following information about undergraduate students from institutions of higher education: tuition and fees, cost of attendance, the average amount of financial assistance received by type of aid, and the number of students receiving each type.

Section 132 of the Higher Education Opportunity Act of 2008 (P.L. 110-315) requires that NCES make the following consumer information about postsecondary institutions available on the College Navigator college search web site:

  • the institution’s mission statement;

  • a link to the institution’s website that provides, in an easily accessible manner, information on student activities, services for individuals with disabilities, career and placement services, and policies on transfer of credit;

  • admissions rates and test scores;

  • enrollment by race and ethnicity, gender, enrollment status, and residency;

  • number of transfer students;

  • students registered with the disability office;

  • retention rates;

  • graduation rates within normal time of program completion and at 150% and 200% of normal time;

  • number of certificates and degrees awarded, and programs with the highest number of awards;

  • student-to-faculty ratio and number of faculty and graduate assistants;

  • cost of attendance and availability of alternative tuition plans;

  • average grant aid and loans, and number of students receiving such aid, by type;

  • total grant aid to undergraduates;

  • number of students receiving Pell Grants;

  • three years of tuition and fees and average net price data;

  • three years of average net price disaggregated by income;

  • a multi-year tuition calculator;

  • College Affordability Lists and reports;

  • Title IV cohort default rate; and

  • campus safety information.

State spending charts and a link to Bureau of Labor Statistics information on starting salaries are also required.



For the IPEDS collection to become voluntary, each of the above statutes would have to be modified and new regulations written.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 11

Document: ED-2013-ICCD-0029-DRAFT-0019
Name: Marina Meier
Address: Idaho Falls, ID,

Email: [email protected]
Submitter's Representative: Marina Meier
Organization: Eastern Idaho Technical College
Government Agency Type: State
Government Agency: Educational Institution

Date: April 16, 2013


Regarding the proposed data collection on the new Outcome Measures (OM), our institution finds it cumbersome if not impossible to dig up 2006 information; it was in an archaic database that produced only paper. Our institution cannot provide 2006 cohort information.

NCES Response:

Dear Ms. Meier,

Thank you for your comment dated April 16, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.

As systems are upgraded, it is understandable that some information becomes more difficult to retrieve in electronic format. Given that the IPEDS data collection is more than 30 years old, we are familiar with a time when almost all reporting was based on paper forms and manual counts, and we understand that a system upgrade can pose a barrier to extracting the information required for your IPEDS submission. We strongly encourage your institution to begin work on extracting and digitizing the necessary information for IPEDS reporting as soon as the final Information Collection Review (ICR) is approved.

After reviewing comments from the public and considering the difficulty in creating retroactive cohorts from 2006, NCES is now proposing in the ICR to begin collecting outcomes information one year later, in 2015-16, using the cohorts that began in 2007. It is our hope that providing an extra year to prepare will alleviate some of the difficulty in creating the cohorts and completing the Outcome Measures component of IPEDS. NCES is committed to working with the postsecondary education community to provide resources that will assist institutions with their outcome measures reporting.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 12

Document: ED-2013-ICCD-0029-DRAFT-0020
Date: April 17, 2013

We are a cosmetology school, for-profit. With classes starting every month, it is very hard to follow the normal IPEDS timelines that you create. Our timelines don't follow the preset ones, so it's difficult to report the information accurately. It takes a lot of time to figure out which students to use, and then when and where. I have now resorted to using our academic year of June to the next July.
Also, in a small school there is only one person to put in all the time that is required, which makes it a heavy burden of time taken away from my daily tasks at hand.
Looking at the future plans for proposed reports, it looks even more time consuming.

NCES Response:

Dear Cosmetology School Commenter,

Thank you for your comment dated April 17, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. We recognize that not all institutions required to report IPEDS data have the same level of staffing and technological resources. We work closely with the postsecondary education community to provide well-designed and efficient collection forms and work directly with the IPEDS keyholders to facilitate their IPEDS data submissions. We continue to work to provide institutions with as many resources as possible to help ease the burden of the IPEDS reporting process.

NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with minimizing the additional institutional burden that results from increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

We recognize that IPEDS reporting takes time; however, the information collected not only provides a common set of information for prospective students to use when making a decision to enroll in postsecondary education, but is also regularly utilized by the Department of Education and other federal agencies, researchers, policymakers, and others to monitor and improve postsecondary education for the public. Please do not hesitate to contact the IPEDS Help Desk for further assistance (at 877-225-2568 or [email protected]).

Sincerely,


Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 13

Document: ED-2013-ICCD-0029-DRAFT-0021
Name: Lou McClelland
Address: Boulder, CO,

Email: [email protected]
Organization: U of Colorado Boulder

Date: April 18, 2013


Comments from University of Colorado Boulder Institutional Research [email protected] Lou McClelland

Outcomes

  • There are conflicting definitions of the starting cohorts. The proposed changes say “Fall cohorts” (for academic reporting institutions). But for years we’ve formed cohorts using the definition of “first time” in the glossary http://nces.ed.gov/ipeds/glossary/?charindex=F: First-time student (undergraduate)

      • A student who has no prior postsecondary experience (except as noted below) attending any institution for the first time at the undergraduate level. This includes students enrolled in academic or occupational programs. It also includes students enrolled in the fall term who attended college for the first time in the prior summer term, and students who entered with advanced standing (college credits earned before graduation from high school).

  • CU-Boulder and the entire state of Colorado (through the Student Unit Record Data System) have for many many years formed “fall cohorts” of students enrolled in fall but entering in fall or in the prior summer. This should be continued. The writeup of proposed changes makes it sound as if it’s changing.

    • The glossary definitions of cohort are not helpful on this.


Passwords

  • We already need more entry passwords than are available. With these additional parts (e.g. libraries) we’ll need even more. We would continue to lock all portions.


Veterans and related

  • Proposed addition to IC, Table 2, which of the following are available to veterans, military service members, or their families?

    • Post-9/11 GI Bill, Yellow Ribbon Program” – This could mean, check yes if

      • Have BOTH post-9/11 GI bill AND Yellow Ribbon Program

      • Have EITHER

      • Have Yellow Ribbon Program, which is a subset of Post-9/11 GI Bill

      • In other words, need a more precise definition

    • We think the other items listed are clear

  • Proposed additions to SFA, Table 3. As above, need more precise definitions. For example

    • Does “Post-9/11 GI Bill benefits” include Yellow Ribbon?

    • Are “Post-9/11 GI Bill benefits” and “DoD Tuition Assistance” mutually exclusive? If a student has both, should he be counted in both?

    • Are any of the benefits to be included paid directly to the student, rather than through an institutional bill? If so, knowing which students got them, and the amounts, will be very difficult.



NCES Response:

Dear Mr. McClelland,

Thank you for your comment dated April 18, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. NCES will review the definitions provided and ensure they reflect the intended purpose of the collection. To the extent feasible, the same guidance that NCES already provides for setting a cohort will continue to be used for the new Outcome Measures component.

NCES has begun work to allow for additional proxy users for the IPEDS data collection system. We hope that this will be helpful, particularly for the purposes of the Academic Libraries component.

Your comments on the veterans and related information have highlighted a need to review the instructions and assure that they are as explicit as possible. NCES is undertaking this review. In the meantime, please do not hesitate to contact the IPEDS Help Desk for further assistance (at 877-225-2568 and at [email protected]).

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 14

Document: ED-2013-ICCD-0029-DRAFT-0022
Name: Andrea Galliger
Address: Minneapolis, MN,

Email: [email protected]
Organization: University of Minnesota
Government Agency Type: State

Date: April 22, 2013



I have a comment on IPEDS 2013-2016 Supporting Statement Part A Table 4: Proposed New Outcome Measures Component. The table states that for each of the 4 cohorts in question, institutions will be required to state whether those students "Did not receive award, subsequently enrolled at another institution." The only available method for a large number of institutions to find out such information is to use the National Student Clearinghouse, which requires institutions to pay a fee for that information or to provide extra information to them in exchange for that information, creating an extra burden on the institutions. It seems unfair for the federal government to force higher education institutions to 1) enter into agreements with the National Student Clearinghouse if they have not already done so, and to 2) take on the extra cost or burden the National Student Clearinghouse requires for that information. Thank you.

NCES Response:

Dear Ms. Galliger,

Thank you for your comment dated April 22, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register.

The use of third party data services is not required for the completion of the Outcome Measures component of IPEDS. One outcome category for students who did not receive an award at the reporting institution is that their subsequent enrollment status is unknown; NCES will accept this response even if subsequent enrollment status is unknown for all students. NCES has no expectation or requirement that an institution must share its data with a third party to complete the survey. However, many institutions already participate in state longitudinal data systems (SLDS), the National Student Clearinghouse, or data systems related to articulation and reciprocity agreements that they may use for this reporting if they choose to do so.

After reviewing comments from the public and considering the difficulty in creating retroactive cohorts from 2006, NCES is now proposing in the Information Collection Review (ICR) to begin collecting outcomes information one year later, in 2015-16, using the cohorts that began in 2007. It is our hope that providing an extra year to prepare will alleviate some of the difficulty in creating the cohorts and completing the Outcome Measures component of IPEDS. NCES is committed to working with the postsecondary education community to provide resources that will assist institutions with their outcome measures reporting.

Sincerely,



Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 15

Document: ED-2013-ICCD-0029-DRAFT-0023
Name: Ronald Crowl
Address: Alliance, OH,

Email: [email protected]
Organization: University of Mount Union

Date: April 24, 2013



Determining where students have enrolled after leaving our institution will place an undue and potentially expensive burden on our institution. We would not support this change.

Thank you.

NCES Response:

Dear Mr. Crowl,

Thank you for your comment dated April 24, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register.

NCES is aware that many institutions do not participate in the National Student Clearinghouse. This topic was discussed at a technical review panel (TRP) on outcome measures. Many institutions will have some information on external outcomes because of their participation in State Longitudinal Data Systems (SLDS) and other reciprocity or consortium-like systems that allow for the identification of students who re-enroll at other institutions. The TRP discussed the issue of institutions reporting subsequent enrollment data and suggested an outcome category of “subsequent enrollment status unknown” to address situations where these data are not available. The collection allows institutions to report “unknown” as a valid response.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Responses to Comments 16-23

Comment 16

Document: ED-2013-ICCD-0029-DRAFT-0024
Name: Donald Eastman
Address: St. Petersburg, FL,

Email: [email protected]
Submitter's Representative: Bill Young
Organization: Eckerd College

Date: April 24, 2013



The proposed IPEDS changes are unwise, unhelpful and, if implemented, will result in erroneous statistics that will mislead the public.

The proposed IPEDS reporting changes will require institutions to extrapolate past cohorts from data that will often be incomplete. In addition to the extra workload this creates, it depends on secondary information, like the National Student Clearinghouse, that we cannot realistically be held accountable for. That is, we must report and sign off on reports that are based on data we did not generate and have no way of checking for accuracy. It would seem that a more direct approach would be to request reports directly from the source and then combine them in whatever form is desired. In addition, not all institutions partner with the NSC, which may result in artificially high counts of "status unknown." How will this missing data be reported to stakeholders?

The proposed change states that "[a] total of students who did not receive an award will be calculated." Some students who leave will graduate elsewhere, some will leave because their educational needs have been met (particularly adult learners) and some will leave and not pursue any further education, but under this simplistic calculation all of these scenarios would be lumped together, resulting in a rather unhelpful and misleading statistic. This exacerbates an already bad situation of aggregating statistics for entirely different programs (traditional residential and adult evening), so that the result misrepresents both.

Comment 17

Document: ED-2013-ICCD-0029-DRAFT-0025
Name: Sandra Kinney
Address: Baton Rouge, LA,

Email: [email protected]
Organization: Louisiana Community and Technical College system
Government Agency Type: State

Date: April 29, 2013



Regarding the outcome measures for two-year colleges, specifically the portion quoted below:
"Collect the status update from both 2-year and 4-year institutions 8 years after the cohort enters the institution. Outcome Measures data collection will begin in 2014-15. Institutions will report on their 2006 cohorts."

1) The amount of time and burden involved in collecting data for an 8 year period for a 2 year college does not make sense. I have worked in two different state system offices and 99 percent of all outcomes occur within 6 years. The amount of time/burden and the diminishing returns on tracking students out 8 years will create quality issues, especially in those two-year institutions with few resources. While it makes sense for a four year institution to track out for 8 years (200% time to graduation), it does not for a two-year institution (400% time to graduation). It makes more sense to track out for 6 years (300% time to graduation) to account for part-time and stop out behaviour patterns common in the two-year sector.

A suggestion may be to align the outcome measures for community colleges and other two-year colleges with the VFA, which requires a 6-year follow-up and makes more sense for a two-year institution.

Comment 18

Document: ED-2013-ICCD-0029-DRAFT-0026
Name: Kristen Douglas
Address: Douglasville, GA,

Organization: West Georgia Technical College
Government Agency Type: State

Date: April 29, 2013



IPEDS OMB Survey: Please align the outcome measure for Technical and Community Colleges with a 4-6 year follow up report, rather than the proposed 8 year period.



Comment 19

Document: ED-2013-ICCD-0029-DRAFT-0027
Name: Kathryn Davis
Address: Dublin, GA,

Email: [email protected]
Organization: Oconee Fall Line Technical College
Government Agency Type: State
Government Agency: Technical College System of Georgia

Date: April 29, 2013



With respect to how the Department might minimize the burden of this collection on respondents, I would recommend that two-year institutions only report 6-year follow-up student data. For two-year institutions, reporting 8 years of follow-up data seems excessive.

Comment 20

Document: ED-2013-ICCD-0029-DRAFT-0030
Name: Diane Bosak
Address: Harrisburg, PA,

Email: [email protected]
Organization: PA Commission for Community Colleges

Date: May 1, 2013


We applaud and support the Department of Education's ongoing efforts to provide the most up-to-date and comparable higher education information. Proposed changes to IPEDS reporting to include part-time students address a long-standing concern of community colleges that our students' achievements are undercounted and overall success is underrepresented.

In support of the Department's efforts to update IPEDS to reflect the mission and role of two-year degree-granting institutions, we feel that the Department should adopt a six-year tracking period, which affords consistency with the national Voluntary Framework of Accountability (VFA) – the first national system of accountability specifically for community colleges and by community colleges.

The largest benefit to institutions is the reduced reporting burden. Institutions participating in the VFA would simply re-report the same data to IPEDS. Continuing with two different datasets measuring nearly identical outcomes doubles the reporting burden and is likely to generate confusion among consumers, policymakers, and researchers.

Eight-Year Tracking

We concur with the Committee on Measures of Student Success' observation that community college students are often balancing school with work and family responsibilities and therefore need a longer period to complete a degree. We do not concur that an eight-year tracking period is necessary or even desirable for a two-year institution of higher education.

The VFA and the National Community College Benchmark Project (NCCBP) survey, administered annually by Johnson County Community College in Kansas, have established six-year tracking periods for completions and transfers. A total of 269 community colleges participated in the 2012 NCCBP survey. All 14 Pennsylvania community colleges have participated in NCCBP for the past six years and are expected to participate in the 2013 survey. A growing number of community colleges are committing to the VFA. 

Independent research [the comment ends here]

Comment 21

Document: ED-2013-ICCD-0029-DRAFT-0032
Name: Melea Fields
Address: Whittier, CA,

Email: [email protected]
Submitter's Representative: Chief Research Officer
Organization: Southern California University of Health Sciences
Government Agency Type: Federal

Date: May 9, 2013

As presented by the AICCU and NAICU, we agree with the summary of the proposed changes and the effects they would have. An institution like mine would not benefit from the changes, especially the retrospective reporting, as it would not present an accurate picture of our enrollment trends. Therefore, as requested, we agree with the following points and ask that the changes not be made to IPEDS data reporting:
1. This is a “productivity measure” – students that enroll, walk away with “something” (without regard to what that “something” is, its value, its quality or timely completion). It adds burden without meaningful utility or information coming out of it.
2. Outcomes produced will have no comparability across institutions or relationship to an institution’s own graduation rate. The results from data points that represent a variety of credential types aggregated together are unique and incomparable for each institution and within each cohort year at a single institution. Each institution is different; therefore we are not comparing apples to apples and need to account for that.
3. Student snapshot timeframes have no correlation to appropriate program completion timeframes; an institution’s outcomes can be skewed and misleading making them only “appear” to foster student success.
4. Possible “retrospective” cohort year requirement to allow for immediate reporting of data is problematic – particularly with the addition of “new” cohort report groupings. Institutions with few transfer and/or part-time students may not have or be separately tracking them, nor following them if they transfer, making retrospective data impossible to report. Institutions often do not know if a student that intended to transfer subsequently enrolled and is actively attending another institution – certainly, wouldn’t have this retrospectively.

Thank you for considering the comments and the potential impact of the proposed changes.

Comment 22

Document: ED-2013-ICCD-0029-DRAFT-0036
Name: Karen Warner
Address: Canton, Ohio, 

Email: [email protected]
Organization: Malone University

Date: May 10, 2013

1) The number of part-time, first-time undergraduates on our campus is a very small number: Fall 2012 – 1; Fall 2011 – 3; Fall 2010 – 1; Fall 2009 – 5; Fall 2008 – 2; Fall 2007 – 4; Fall 2006 – 4; Fall 2005 – 4; Fall 2004 – 3; Fall 2003 – 12; Fall 2002 – 7
2) While our transfer-in numbers are considerably more substantial than the above cohort, they are of 2 distinctly different types: traditional undergraduate and adult degree-completion undergraduates for which the admissions and data-collection processes are quite different.
3) We have no mechanism to determine if a student who intended to transfer subsequently enrolled and is actively attending another institution.
4) We only track and report what is required due to budget and personnel constraints; our IR office consists of approximately 50% of the workload of ONE FT employee, who is also the IPEDS Keyholder. The additional tracking of 3 more cohorts (part-time, first-time; full-time “other;” and part-time “other”) will create additional reporting burden for a very small staff.
5) We currently do not partner with the National Student Clearinghouse or a Statewide Longitudinal Data System. Even if “enrollment unknown” cells are provided, extensive use of these cells (due to the near impossibility of following “transfer-outs”) could lead to misinterpretation by the public and policymakers and possibly even punitive action.

Comment 23

Document: ED-2013-ICCD-0029-DRAFT-0044
Address: DC

Email: [email protected]
Submitter's Representative: Kent Phillippe
Organization: American Association of Community Colleges

Date: May 13, 2013


Comments of the American Association of Community Colleges and Association of Community College Trustees on Proposed Integrated Postsecondary Education Data System (IPEDS) 2013‐2016 Agency Information Collection Activities


The American Association of Community Colleges, representing the nation’s 1,100 community colleges, and the Association of Community College Trustees, representing the nation’s community college boards of directors, would like to commend the Department of Education and NCES for seeking to provide better data collection in the Integrated Postsecondary Education Data System (IPEDS). In particular, AACC is supportive of the new “Outcome Measures” component proposed in IPEDS. Our organizations support a more holistic view of student outcomes in higher education that goes beyond the limited and incomplete picture of student outcomes provided by the current measures collected and reported by NCES.


The advantage of collecting better outcome data can mitigate some of the increases in burden associated with our member colleges reporting this new data. However, in response to the questions raised in the call for comments, our organizations are concerned with the proposed eight‐year cohort tracking time period for 2‐year colleges in the new Outcome Measures component.


Community colleges are embracing increased reporting of appropriate measures. AACC, along with ACCT, has recently launched the Voluntary Framework of Accountability, which many states are adopting or building into state accountability systems, and California has recently released a community college report card. One consistent aspect across these, and other, community college accountability systems is that colleges track student cohorts for six years and report their outcomes. Since many community colleges will be reporting outcomes for six-year cohorts for these accountability systems, using a six-year reporting period is less burdensome for those institutions than having to calculate an additional set of rates for an 8-year follow-up. In addition, consistent metrics for federal and state or local measures will provide dramatically greater utility in the use of the data, rather than creating confusion over different outcome data reporting for different agencies.


For these reasons, AACC and ACCT strongly encourage NCES and the Department of Education to change the cohort tracking period for two-year colleges from eight years to six years in the Outcome Measures section of the proposed IPEDS changes.


American Association of Community Colleges

Association of Community College Trustees



NCES Response:

Dear Mr. Eastman, Ms. Kinney, Ms. Douglas, Ms. Davis, Ms. Bosak, Ms. Fields, Ms. Warner, and Dr. Phillippe,

Thank you for your comments, dated April 24 through May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.

NCES understands that requiring institutions to report on cohorts that were not defined at the time of matriculation will impose additional burden and has accounted for this in the OMB clearance package. The IPEDS Technical Review Panel (TRP) discussed the issue of cohort size at length. The TRP acknowledged that using the two new cohorts increases the likelihood of small cohorts, particularly at very small institutions, but noted that the “traditional” full-time, first-time cohort used for Graduation Rates reporting is itself very small and non-representative of the student body at many institutions. The TRP felt that expanding IPEDS to collect outcome information for non-first-time students would significantly increase the comprehensiveness of outcome data available at the national level. These additional cohorts will give the nation a more complete picture of completion.

NCES understands that any aggregation of students will not allow for a complete description of the students within; that is, no cohort is perfect. However, NCES works to balance the potential burden imposed upon an institution with the value of the data provided. In this case, the use of only four cohorts is an attempt to strike a meaningful but not overly burdensome balance.

The diversity of postsecondary education institutions makes it difficult, if not impossible, to create a single cohort definition. These four cohorts represent the most recent best effort to allow institutions to report outcomes for the majority of their students. As with any data collection of aggregate statistics, small cell sizes will occur, and their usage in reports will be the obligation of the reporting person or organization. NCES trains data users annually and actively works with the research community to ensure the proper use of IPEDS and other federal data.

We are aware that not all institutions have the same staffing and technological resources available and we continue to work with the industry to provide well designed and efficient collection forms. Certain providers of postsecondary education included in the IPEDS universe of Title IV eligible institutions - operators of proprietary (private for-profit) schools - are small businesses. NCES has taken several actions to reduce reporting burden for these entities. These actions include requesting a reduced set of data items from schools offering only certificates below the baccalaureate level, and maintaining a close liaison with the Association of Private Sector Colleges and Universities, which represents proprietary postsecondary institutions, to assure the appropriateness of data being requested and the feasibility of collecting it.

Prior to addressing your comments on the new Outcome Measures component of IPEDS, we would like to clarify that IPEDS will continue to collect graduation rates as it has in the past. The impetus and work to develop the new outcome measures have taken several years to bring to fruition.

The Higher Education Opportunity Act of 2008 established the U.S. Department of Education’s Committee on Measures of Student Success to advise the Secretary of Education in assisting 2-year degree-granting institutions of higher education in meeting the completion or graduation rate disclosure requirements outlined in the Higher Education Act of 1965, as amended. The Committee completed its work in December 2011.

In its final report to the Secretary, the Committee noted that the “current federal graduation rate measure is incomplete and does not adequately convey the wide range of student outcomes at 2-year institutions.” In addition, the Committee observed that “data are not collected on other important outcomes achieved by students at 2-year institutions.”

The Committee recommended that the Department:

  • Broaden the coverage of student graduation data to reflect the diverse student populations at 2-year institutions;

  • Improve the collection of student progression and completion data;

  • Improve technical guidance to institutions in meeting statutory disclosure requirements; and

  • Encourage institutions to disclose comparable data on employment outcomes and provide incentives for sharing promising practices on measuring student learning.

Although its work focused on 2-year institutions, the Committee suggested that its recommendations be considered and implemented for 4-year institutions as well. The Committee’s final report is available at http://www2.ed.gov/about/bdscomm/list/cmss-committee-report-final.pdf.

In April 2012, the Department released an action plan for improving measures of postsecondary student success in support of the Administration’s college completion agenda and based on the recommendations of the Committee on Measures of Student Success (http://www.ed.gov/edblogs/ous/files/2012/03/Action-Plan-for-Improving-Measures-of-Postsecondary-Student-Success-FINAL2.pdf).

The Department’s action plan is designed to improve the quality and availability of student success data at the federal level for consumers, institutions, policymakers, and researchers. This plan also includes activities to help institutions, systems, and states increase their capacity for collecting and disseminating data on student success. Various offices within the Department are responsible for implementing the activities within the plan. In the plan, the Department has committed to “revise, where feasible under its current authority, existing data collection vehicles to include more comprehensive measures of student success for a broader group of students.”

As part of this activity, the NCES has taken steps to enhance graduation rate and transfer rate reporting in IPEDS. Using existing processes for considering changes to IPEDS, NCES examined the feasibility of broadening measures by collecting outcome information for part-time, degree/certificate-seeking undergraduate students and non-first-time, degree/certificate-seeking undergraduate students in IPEDS.

IPEDS TRP 37 was convened in February 2012 to discuss the feasibility of collecting outcome information on part-time, first-time students. The TRP suggested that NCES clarify the definition of a degree/certificate-seeking student for IPEDS reporting purposes and collect certain outcome information in IPEDS for part-time, first-time students.

IPEDS TRP 40 was convened in October 2012 to discuss the feasibility of collecting outcome information on non-first-time students. The TRP suggested that NCES collect certain outcome information in IPEDS separately for full-time and part-time, non-first-time students, similar to information that TRP 37 proposed for part-time, first-time students. The TRP also suggested that similar outcome information be collected for full-time, first-time students. Outcome measures information will be submitted by degree-granting institutions only.

The new outcome information that institutions would report to IPEDS is designed to provide consumers, policymakers, and researchers context for and an alternative to the graduation rates calculated for the purposes of the Student Right to Know and Campus Security Act of 1990.

To expedite the availability of data that will be useful to consumers, policymakers, and researchers, TRP 40 suggested that institutions report on student outcomes retrospectively. If a prospective reporting model were used, outcome measures data would not be available until 2023.

As a result of TRP suggestions and public comments, NCES has requested clearance to implement a new Outcome Measures component in the Winter collection. The burden for this part of the collection is substantial, with an estimated 30.1 hours per institution, for a total of 147,490 hours in the first year. After the first year of reporting, when the method for organizing this information is established, the estimated burden drops to 15 hours per institution, or 73,500 hours for all institutions. Based on the work done in the TRPs, we believe that adding this information to the national knowledge base on outcomes is justified.
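
For reference, a rough check of the arithmetic behind these totals, assuming both figures are computed over the same set of reporting institutions (a number inferred from the figures above rather than stated here):

  147,490 hours ÷ 30.1 hours per institution = 4,900 institutions
  4,900 institutions × 15 hours = 73,500 hours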

The use of third party data services is not required for the completion of the Outcome Measures component of IPEDS. One outcome category for students who did not receive an award at the reporting institution is that their subsequent enrollment status is unknown; NCES will accept this response even if subsequent enrollment status is unknown for all students. NCES has no expectation or requirement that an institution must share its data with a third party to complete the survey. However, many institutions already participate in state longitudinal data systems (SLDS), the National Student Clearinghouse, or data systems related to articulation and reciprocity agreements that they may use for this reporting if they choose to do so.

There has been much discussion about using a six-year outcome measure for 2-year institutions and an eight-year outcome measure for 4-year institutions. When creating the Outcome Measures component, there was an explicit goal to keep it as simple and straightforward as possible, both to keep the burden on respondents low and to facilitate ease of use by data users. If the Outcome Measures section allowed different tracking durations by type of institution, it would be difficult to explain why a student seeking the same award is given different amounts of time based on institutional type. In addition, an increasing number of institutions offer multiple award types (certificates, associate’s degrees, and bachelor’s degrees), which makes institutional type a misleading variable for an Outcome Measures statistic.
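
For reference, using nominal program lengths as an illustrative assumption (not a figure from the collection), the tracking windows under discussion correspond to the following multiples of normal time:

  8 years ÷ 4-year program = 200%    8 years ÷ 2-year program = 400%
  6 years ÷ 4-year program = 150%    6 years ÷ 2-year program = 300%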

After reviewing comments from the public and considering the difficulty in creating retroactive cohorts from 2006, NCES is now proposing in the Information Collection Review (ICR) to begin collecting outcomes information one year later, in 2015-16, using the cohorts that began in 2007. It is our hope that providing an extra year to prepare will alleviate some of the difficulty in creating the cohorts and completing the Outcome Measures component of IPEDS. NCES is committed to working with the postsecondary education community to provide resources that will assist institutions with their outcome measures reporting.

Please do not hesitate to contact the IPEDS Help Desk for further assistance (at 877-225-2568 and at [email protected]).

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 24

Document: ED-2013-ICCD-0029-DRAFT-0028
Name: Kimberly Carter
Address: Northwest, DC,

Email: [email protected]
Organization: Bureau of Economic Analysis
Government Agency Type: Federal
Government Agency: DOC

Date: April 30, 2013


Director of Information Collection Clearance Division

U.S. Department of Education

400 Maryland Avenue, SW LBJ, Room 2E117

Washington, DC 20202-4537


Dear Director:


The Bureau of Economic Analysis (BEA) strongly supports the National Center for Education Statistics' continued efforts to collect data using the Integrated Postsecondary Education Data System (IPEDS). The data collected on this survey are crucial to key components of BEA's economic statistics.


IPEDS data are used in the national income and product accounts (NIPAs) to help estimate personal consumption expenditures (PCE). Additionally, IPEDS data are used to estimate output for private education in the benchmark input-output accounts. Finally, there are benefits from the IPEDS data used by the U.S. Census Bureau in the preparation of its Annual Survey of Government Finances (ASGF). BEA uses the ASGF data to estimate components of state and local government spending in the NIPAs. The attachment shows the IPEDS data codes used by BEA.


Please keep BEA informed about any modifications to the form. We are particularly interested in any modifications proposed during the form's approval process that would substantially affect our use of these data. For additional information, please contact Ruth Bramblett, Source Data Coordinator, on 202-606-9653 or by e-mail at [email protected]. Should you need assistance in justifying this form to the Office of Management and Budget, please do not hesitate to contact BEA.


Dennis J. Fixler

Chief Statistician

BEA Usage of the IPEDS

Data Codes Used

BEA Uses for the Data

F2e011

Estimation of PCE for gross operating expenses of private higher education, schools, colleges, and universities at an annual level.

F2e031

F2e041

F2e051

F2e061

F2e112

F2d01

F2dll

F3d01

F3d06

F3d07

Estimates used in the benchmark for the input-output accounts.



NCES Response:

Dear Mr. Fixler,

Thank you for your comment dated April 30, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. We thank the Bureau of Economic Analysis (BEA) for its support of the IPEDS data collection and are pleased to provide useful information to our IPEDS stakeholders. We will keep you informed of any possible changes to the IPEDS data elements used by your office.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 25

Document: ED-2013-ICCD-0029-DRAFT-0029
Name: Charles Stewart
Address: New York, NY,

Email: [email protected]
Organization: City College Libraries

Date: May 1, 2013

I am not sure this is the appropriate place for this question, but under "Supporting Documents" and within "Forms and Instructions" the last document, "IPEDS 2014 IC Outcome Measures-Admissions-Academic Libraries" at: http://www.regulations.gov/contentStreamer?objectId=09000064812339cc&disposition=attachment&contentType=pdf
says:
"The finalized content of all 2014-15 data collection instruments will posted by April 30th, 2013."

Can you please tell me where this is posted?

Thank you.

NCES Response:

Dear Mr. Stewart,

Thank you for your comment dated May 1, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The updated documents were posted for public review in the same location where you accessed the initial version of those documents (the updated versions replaced the initial versions) at http://www.regulations.gov/#!docketDetail;D=ED-2013-ICCD-0029.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 26

Document: ED-2013-ICCD-0029-DRAFT-0031
Name: Gary Nigh
Address: Trenton, NJ

Email: [email protected]
Organization: New Jersey Higher Education
Government Agency Type: State
Government Agency: Office of the Secretary of Higher Education

Date: May 2, 2013


The Office of the Secretary of Higher Education for the State of New Jersey opposes the proposed elimination of the Estimated Enrollment items from the Institutional Characteristics Survey. This item has been used to fill our need for state-wide enrollment data in a timely manner. Under the proposed change we would not have a stable, authoritative number for fall enrollment until the following May, a full eight months after the semester traditionally begins. In our experience these data have been reasonably reliable. For fall 2011, the difference between the early estimate and final numbers for the 61 institutions we coordinate was .26% or 1,153 students out of a final number of 442,878. We routinely compile and post this information and would miss this comprehensive source for enrollment information.

NCES Response:

Dear Mr. Nigh,

Thank you for your comment dated May 2, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. Your positive experience with the estimated enrollment items is among the minority of experiences with those data reported to NCES. The estimated enrollment items on IC were added in 2006-07 to fill a perceived need for an early estimate of fall enrollment, because the IPEDS Fall Enrollment component is not collected until the following spring. However, NCES has determined that overall these data are not of particularly good quality. Their inclusion in the IPEDS Data Center is a source of confusion to many (if not most) data users, and there is evidence that they are not being widely used. Further, these data are not displayed on College Navigator because they are estimates.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 27

Document: ED-2013-ICCD-0029-DRAFT-0033
Name: Donna Tudor
Address: Nashville, TN, 

Email: [email protected]
Organization: Trevecca Nazarene University

Date: May 9, 2013


I am the Director of Institutional Research at Trevecca Nazarene University. I am very much against this data collection because I don’t see how it will provide useable data for decision-making…especially if the data users don’t know the reasons that students drop out or transfer out. Does the proposed data collection follow the same aggregate format (percentages) or does it involve reporting students individually? This type of data collection will lead to "duplicate reporting" and add an unnecessary burden to the already slim resources of most research offices. Thank you for this opportunity to express my sincere concerns.

NCES Response:

Dear Ms. Tudor,

Thank you for your comment dated May 9, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register.

The NCES has taken steps to enhance graduation rate and transfer rate reporting in IPEDS and ensure that there is not any duplicate reporting of outcomes. Using existing processes for considering changes to IPEDS, NCES examined the feasibility of broadening measures by collecting outcome information for part-time, degree/certificate-seeking undergraduate students and non-first-time, degree/certificate-seeking undergraduate students in IPEDS.

IPEDS TRP 37 was convened in February 2012 to discuss the feasibility of collecting outcome information on part-time, first-time students. The TRP suggested that NCES clarify the definition of a degree/certificate-seeking student for IPEDS reporting purposes and collect certain outcome information in IPEDS for part-time, first-time students.

IPEDS TRP 40 was convened in October 2012 to discuss the feasibility of collecting outcome information on non-first-time students. The TRP suggested that NCES collect certain outcome information in IPEDS separately for full-time and part-time, non-first-time students, similar to information that TRP 37 proposed for part-time, first-time students. The TRP also suggested that similar outcome information be collected for full-time, first-time students. Outcome measures information will be submitted by degree-granting institutions only.

The TRP felt that expanding IPEDS to collect outcome information for part-time and non-first-time students would significantly increase the comprehensiveness of outcome data available at the national level. The new outcome information that institutions would report to IPEDS is designed to provide consumers, institutions, policymakers, and researchers context for and an alternative to the graduation rates calculated for the purposes of the Student Right to Know and Campus Security Act of 1990.

The data collected are used by over 1 million College Navigator users a month to view institutional data. In addition, the data are used by the College Scorecard hosted on the White House website. The information collected not only provides a common set of information for prospective students to use when making a decision to enroll in postsecondary education, but is also regularly utilized by the Department of Education and other federal agencies, researchers, policymakers, and others to monitor and improve postsecondary education for the public.

Related to your comment on the burdensome nature of this collection, NCES will continue to evaluate the IPEDS collection and make every reasonable attempt to reduce burden on reporting institutions while complying fully with the law and regulations related to this collection.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 28

Document: ED-2013-ICCD-0029-DRAFT-0034
Name: Paula Krist
Address: San Diego, CA,

Email: [email protected]
Organization: University of San Diego

Date: May 10, 2013


MEMORANDUM

To: IPEDS

CC: Andrew Allen, Ph.D., Associate Provost

From: Paula S. Krist, Ph.D., Director, Institutional Research and Planning



Table 4: Proposed New Outcome Measures Component

In addition to representing a tremendous burden for personnel in reporting institutions, this proposal raises several issues that will affect the ability of colleges and universities to provide accurate and meaningful data:

  1. For institutions of all types, the immediate retrospective cohort year requirement for reporting data is problematic – particularly with the addition of new cohort report groupings that have not been tracked previously. This would result in a large commitment of institutional resources for very little yield.

    1. Institutions with few transfer and/or part-time students may not be separately tracking them, or following them if they transfer, making retrospective data impossible to report. At USD, we have been tracking transfers, but part-time students are few and often change their status.

    2. Subsequent enrollment data will be difficult to supply accurately. Institutions often do not know if a student who intended to transfer subsequently enrolled and is actively attending another institution. Further, not all institutions are members of the National Clearinghouse; this would force non-members to join and increase their financial burden. Most institutions would not have historical data on this.

  2. Some of the data would have very small cell sizes; small cell sizes make comparisons irrelevant. Part time students are the exception at many private institutions, especially smaller ones. Others report very few transfer students. Creating percentages would exaggerate the small numbers tremendously; that means of comparison would not be useful. If data were reported to IPEDS, resulting small cell sizes would prohibit the publication of data for many institutions. This would be a great resource burden with no value for NCES/policymakers, researchers, consumers or institutions.

  3. This proposed data collection asks information about students receiving an award, without specification of what the award is. If the interest is in degrees, degrees should be specified. For example, at USD we award degrees and certificates. Students may also earn credentials awarded by the state. Many credentials are already tracked at the state and federal levels.

  4. Because of #3 above, data produced may not be at all comparable across institutions or have a relationship with an institution’s own graduation rate.

    1. There could be results from data points that represent a variety of credential types aggregated together, possibly unique to each institution, creating a situation in which the results could not be compared. Further, the aggregated groups from one year to the next could be very different, even at the same institution.

    2. Detail must be provided regarding how awards are defined and what will be aggregated. If all awards are aggregated with no distinction between types or levels, it will lead to misinterpretation of data.

  5. Student snapshot timeframes may not accurately capture program completion timeframes.

  6. “Enrollment Unknown” cells could lead to a misinterpretation of the absence of data.



Table 7: Proposed Integration of Academic Libraries Survey into IPEDS Data

It is not clear whether the library data collection would become an annual requirement. Presently, these data are reported biennially via the ALS survey. If the collection becomes annual, the reporting burden on institutions would double.

NCES Response:

Dear Dr. Krist,

Thank you for your comment dated May 10, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. The IPEDS Technical Review Panel (TRP) acknowledged that certain of the four cohorts could be very small at some institutions, but noted that the “traditional” full-time, first-time cohort used for Graduation Rates reporting is itself very small and non-representative of the student body at some institutions. The TRP felt that expanding IPEDS to collect outcome information for part-time and non-first-time students would significantly increase the comprehensiveness of outcome data available at the national level.

The diversity of postsecondary education institutions makes it difficult, if not impossible, to create a single cohort definition that represents all students and serves all institutions equally. These four cohorts represent the most recent best effort to allow institutions to report outcomes for the majority of their students. As with any data collection of aggregate statistics, small cell sizes will occur, and their usage in reports will be the obligation of the reporting person or organization. NCES trains data users annually and actively works with the research community to ensure the proper use of IPEDS and other federal data.

The proposed collection requests information on degrees obtained by 150% and 200% of normal time. In addition, the Outcome Measures section requests completions, including certificates, associate’s degrees, and bachelor’s degrees, after a fixed 8-year period. The graduation rate statistics are mandated by the Student Right to Know Act, while the Outcome Measures stem from the Higher Education Opportunity Act of 2008 (HEOA). The HEOA established the U.S. Department of Education’s Committee on Measures of Student Success to advise the Secretary of Education in assisting 2-year degree-granting institutions of higher education in meeting the completion or graduation rate disclosure requirements outlined in the Higher Education Act of 1965, as amended. The Committee completed its work in December 2011.

In its final report to the Secretary, the Committee noted that the “current federal graduation rate measure is incomplete and does not adequately convey the wide range of student outcomes at 2-year institutions.” In addition, the Committee observed that “data are not collected on other important outcomes achieved by students at 2-year institutions.”

The Committee recommended that the Department:

  • Broaden the coverage of student graduation data to reflect the diverse student populations at 2-year institutions;

  • Improve the collection of student progression and completion data;

  • Improve technical guidance to institutions in meeting statutory disclosure requirements; and

  • Encourage institutions to disclose comparable data on employment outcomes and provide incentives for sharing promising practices on measuring student learning.

Although its work focused on 2-year institutions, the Committee suggested that its recommendations be considered and implemented for 4-year institutions as well. The Committee’s final report is available at http://www2.ed.gov/about/bdscomm/list/cmss-committee-report-final.pdf.

In April 2012, the Department released an action plan for improving measures of postsecondary student success in support of the Administration’s college completion agenda and based on the recommendations of the Committee on Measures of Student Success (http://www.ed.gov/edblogs/ous/files/2012/03/Action-Plan-for-Improving-Measures-of-Postsecondary-Student-Success-FINAL2.pdf).

The Department’s action plan is designed to improve the quality and availability of student success data at the federal level for consumers, institutions, policymakers, and researchers. This plan also includes activities to help institutions, systems, and states increase their capacity for collecting and disseminating data on student success. Various offices within the Department are responsible for implementing the activities within the plan. In the plan, the Department has committed to “revise, where feasible under its current authority, existing data collection vehicles to include more comprehensive measures of student success for a broader group of students.”

As with the current Completions and Graduation Rate components, the data reported to IPEDS would be for formal awards conferred by the institution. NCES and the National Postsecondary Education Cooperative (NPEC) have done much work over recent years to further clarify the definition of a formal award and to help ensure consistency in reporting across institutions and states.

The TRP did discuss the issue of collecting data on students who received any award versus separating outcomes by award type, and it was noted that collecting data on any award would not provide as much information for institutions that offer multiple award levels as collecting data by award level would. However, the panel felt that making a distinction between award levels would be even more burdensome and would not necessarily add corresponding value to the data collection. The panel was sensitive to the fact that student success can mean many different things and felt that making a distinction between certificate and degree completion was too limiting (https://edsurveys.rti.org/IPEDS_TRP/documents%5CTRP40_Suggestions_final.pdf, page 5, second complete paragraph).

NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with minimizing the additional institutional burden that results from increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

We recognize that IPEDS reporting takes time; however, the information collected not only provides a common set of information for prospective students to use when making a decision to enroll in postsecondary education, but is also regularly utilized by the Department of Education and other federal agencies, researchers, policymakers, and others to monitor and improve postsecondary education for the public.

We would also like to clarify that IPEDS will continue to collect graduation rates as it has in the past; as described above, the impetus and work to develop the new outcome measures have taken several years to bring to fruition.


As part of this activity, NCES has taken steps to enhance graduation rate and transfer rate reporting in IPEDS. Using existing processes for considering changes to IPEDS, NCES examined the feasibility of broadening measures by collecting outcome information for part-time, degree/certificate-seeking undergraduate students and non-first-time, degree/certificate-seeking undergraduate students in IPEDS.

IPEDS TRP 37 was convened in February 2012 to discuss the feasibility of collecting outcome information on part-time, first-time students. The TRP suggested that NCES clarify the definition of a degree/certificate-seeking student for IPEDS reporting purposes and collect certain outcome information in IPEDS for part-time, first-time students.

IPEDS TRP 40 was convened in October 2012 to discuss the feasibility of collecting outcome information on non-first-time students. The TRP suggested that NCES collect certain outcome information in IPEDS separately for full-time and part-time, non-first-time students, similar to information that TRP 37 proposed for part-time, first-time students. The TRP also suggested that similar outcome information be collected for full-time, first-time students. Outcome measures information will be submitted by degree-granting institutions only.

The new outcome information that institutions would report to IPEDS is designed to provide consumers, policymakers, and researchers context for and an alternative to the graduation rates calculated for the purposes of the Student Right to Know and Campus Security Act of 1990.

To expedite the availability of data that will be useful to consumers, policymakers, and researchers, TRP 40 suggested that institutions report on student outcomes retrospectively. If a prospective reporting model were used, outcome measures data would not be available until 2023.

As a result of TRP suggestions and public comments, NCES has requested clearance to implement a new Outcome Measures component in the Winter collection. The burden for this part of the collection is substantial, with an estimated 30.1 hours per institution, for a total of 147,490 hours in the first year. After the first year of reporting, once the method for organizing this information is established, the estimated burden drops to 15 hours per institution, or 73,500 hours across all institutions. Based on the work done in the TRPs, we believe that adding this information to the national knowledge base on outcomes is justified.
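
A minimal sketch of the burden arithmetic above, assuming only the figures already stated; the implied count of roughly 4,900 reporting institutions is derived here from the totals rather than stated in the collection.

    # Hedged sketch: back out the institution count implied by the stated totals,
    # then reproduce the first-year and later-year burden estimates.
    first_year_hours_per_institution = 30.1
    first_year_total_hours = 147_490
    implied_institutions = first_year_total_hours / first_year_hours_per_institution  # about 4,900

    later_year_hours_per_institution = 15
    later_year_total_hours = later_year_hours_per_institution * implied_institutions  # about 73,500
    print(round(implied_institutions), round(later_year_total_hours))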

The use of third-party data services is not required for the completion of the Outcome Measures component of IPEDS. One outcome category for students who did not receive an award at the reporting institution is that their subsequent enrollment status is unknown; NCES will accept this response even if subsequent enrollment status is unknown for all students. NCES has no expectation or requirement that an institution must share its data with a third party to complete the survey and understands that there is a lack of data on outcomes after students leave. However, many institutions already participate in state longitudinal data systems (SLDS), the National Student Clearinghouse, or data systems related to articulation and reciprocity agreements that they may use for this reporting if they choose to do so.

The previous Academic Libraries Survey was administered every other year, and NCES is now proposing to collect the information annually. However, the library survey has been significantly shortened, so the increased burden of completing the survey annually is mitigated by the decreased burden of a shorter form. The proposed form has an estimated burden of 4.2 hours, while the previous form had a burden of 8.23 hours. NCES believes that collecting more timely data on a shorter form will better serve the knowledge base for academic libraries.

By their nature, cohort reporting methods are a reflection of institutions following specific enrollment policies and as such will not serve all institutions equally. While some institutions take very few part-time students, others enroll very few first-time students. The four cohorts proposed in this collection represent the four most common groups that enroll at IPEDS reporting institutions. As has been the case in the past, some institutions will have small cell sizes that could be misinterpreted. However, collecting these four cohorts will allow for more institutions to accurately report outcomes for their students and will give data users a more complete measurement of student enrollment and completion in postsecondary education.

After reviewing comments from the public and considering the difficulty in creating retroactive cohorts from 2006, NCES is now proposing in the Information Collection Review (ICR) to begin collecting outcomes information one year later, in 2015-16, using the cohorts that began in 2007. It is our hope that providing an extra year to prepare will alleviate some of the difficulty in creating the cohorts and completing the Outcome Measures component of IPEDS. NCES is committed to working with the postsecondary education community to provide resources that will assist institutions with their outcome measures reporting.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 29

Document: ED-2013-ICCD-0029-DRAFT-0035
Name: Peggy Foster
Address: Lewiston, ID, 

Email: [email protected]
Organization: Headmasters School

Date: May 10, 2013

The IPEDS reports are just another government waste of our time. The information is not useful in that it only reflects numbers, which are difficult to report since the program tells you that a change outside of its preset acceptable range is incorrect. It does not show how schools are unique in offering education to different segments of the population, especially in areas with a high poverty rate and poor economic conditions. The time and financial burden for a small school to track, report, and defend their numbers is much better spent in the classroom and on education for our students. The report sent back to the school is not useful since it compares schools in different parts of the country that cover completely different demographic and economic areas. I am really wondering who reads these reports.

NCES Response:

Dear Ms. Foster,

Thank you for your comment dated May 10, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The IPEDS data collection satisfies the Higher Education Opportunity Act (2008) requirements for reporting for institutions using Title IV funds. The data collected are used by the Department of Education in the College Navigator search tool that receives an average of over 1 million unique users a month. The information is also used by educational researchers, policymakers, state organizations, and others. Related to your comment on the burdensome nature of this collection, NCES has taken steps in the past to limit the amount of information requested from smaller institutions, and will continue to address this important issue in the future.

The Data Feedback Report is an annual report designed to be useful to institutional executives and institutions for benchmarking and peer analysis, and also to help improve the quality and comparability of IPEDS data. Institutions are encouraged to establish their own custom comparison group of institutions for the report, in order to make the report more useful and relevant.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 30

Document: ED-2013-ICCD-0029-DRAFT-0037
Name: Thomas Haakenson
Address: Minneapolis, 

Email: [email protected]
Organization: Minneapolis College of Art and Design

Date: May 10, 2013

As part of the IPEDS collection process, the U.S. government should provide comprehensive software for institutional and student records maintenance that would provide, without an intermediary, the required data from each accredited institution in the U.S.

NCES Response:

Dear Mr. Haakenson,

Thank you for your comment dated May 10, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register.

The IPEDS Technical Review Panel (TRP) discussed the possibility of creating software to facilitate the submission of IPEDS data at a 2010 meeting dedicated to identifying ways to reduce IPEDS reporting burden. A detailed summary of the discussion held at TRP #30 can be found on the TRP website: https://edsurveys.rti.org/IPEDS_TRP/Default.aspx. The panel agreed that the utility of such a tool would vary widely across institutions and states. Consequently, the TRP suggested soliciting additional input and feedback from the larger IPEDS community regarding the feasibility and utility of a new tool that would allow for student records maintenance and the aggregation of those records necessary for IPEDS reporting. We would welcome your input on this issue.

The IPEDS web-based data collection system allows data providers to populate the IPEDS forms by manual key entry or by several different file upload options, including an XML option. This allows NCES to update and maintain the software related to data submission without interfering with institutions’ local computing systems. The comprehensive IPEDS data collection system is 508 compliant and accessible through multiple web browsers and, at this time, represents the best and most cost-effective means to allow for the efficient and accurate submission of data by the varied types of institutions that provide information for IPEDS.

If you need further assistance with data, IPEDS also provides a help desk at 877-225-2568 and at [email protected].

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 31

Document: ED-2013-ICCD-0029-DRAFT-0038
Name: Michael Chambers
Address: Portland, OR, 

Email: [email protected]
Organization: Oregon Alliance of Independent Colleges & Universities

Date: May 10, 2013

The Table 4 Outcome Measures certainly improve on the Full Time, First Time only measurement, but will be imperfect; for example, there will be no way to account for disparities in the number of transfer credits applied to a non-first-time student’s degree. And, definitions will be necessary to clarify AP, dual enrollment, concurrent registration, as well as cohort modifications based on non-enrollment or student migration between cohorts. 

Retroactive definitions are problematic, depending on original census methods and tracking systems; some institutions’ changes in record keeping systems since 2006 may prevent accurate classification of students. 

The four status categories also raise questions. For 4-year institutions, the usefulness of sub-categories for non-awarded students no longer enrolled is moot— subsequent enrollment elsewhere in and of itself has limited meaning, especially for non-traditional and part time students—and reporting would require a costly and laborious synchronizing of externally procured data. How will this inform about migrations in ways that the new cohort reporting at 4-year institutions themselves would not? 

When considering reporting burden, data integrity, or potential to inform, it is useful to consider to what extent this reporting attempts to answer questions that can ultimately be answered only by student-level longitudinal data systems.

Full Time, First Time cohort graduation rates are displayed in isolation as a proxy for institutional effectiveness on the White House College Scorecard. If displayed together with the percentage of incoming students the cohort represents, it would provide a clearer picture. Presenting the percentages of an incoming class represented by each of the four proposed cohorts together with graduation rates calculated for the aggregate and each cohort could provide prospective students a much more useful view of a 4-year institution’s success with the student populations it serves.
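
As a purely illustrative sketch of the display suggested here, with invented numbers rather than data from any institution, each cohort's share of the incoming class can be shown alongside its graduation rate, and an aggregate rate can be computed as the enrollment-weighted combination of the four cohort rates:

    # Hypothetical sketch: cohort shares, cohort graduation rates, and a
    # size-weighted aggregate rate. All figures are invented for illustration.
    cohorts = {
        "full-time, first-time":     {"entrants": 1200, "completers": 720},
        "part-time, first-time":     {"entrants": 300,  "completers": 90},
        "full-time, non-first-time": {"entrants": 400,  "completers": 260},
        "part-time, non-first-time": {"entrants": 100,  "completers": 35},
    }
    total_entrants = sum(c["entrants"] for c in cohorts.values())
    for name, c in cohorts.items():
        share = c["entrants"] / total_entrants
        rate = c["completers"] / c["entrants"]
        print(f"{name}: {share:.0%} of incoming class, {rate:.0%} graduation rate")
    aggregate = sum(c["completers"] for c in cohorts.values()) / total_entrants
    print(f"aggregate graduation rate: {aggregate:.0%}")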

NCES Response:

Dear Mr. Chambers,

We thank you and the Oregon Alliance of Independent Colleges and Universities for these comments.

With regard to the four cohort categories and their appropriateness to different types of institutions: The Outcome Measures component is a direct result of the Higher Education Opportunity Act (2008) and the work of the Committee on Measures of Student Success, which was tasked with recommending additional or alternative measures of student success for 2-year institutions. However, the IPEDS TRP felt strongly that the new measures should be implemented for 4-year institutions as well, much like the traditional graduation rates are reported by all institutions. Definitions for cohorts have been prepared and will address AP, dual enrollment, concurrent enrollment, and cohort modification.

As I’m sure you are aware, IPEDS is prohibited from collecting student-level data. The collection of student-level data would be able to answer a wide range of questions more easily than the Outcome Measures component can.

After reviewing comments from the public and considering the difficulty in creating retroactive cohorts from 2006, NCES is now proposing in the Information Collection Review (ICR) to begin collecting outcomes information one year later, in 2015-16, using the cohorts that began in 2007. It is our hope that providing an extra year to prepare will alleviate some of the difficulty in creating the cohorts and completing the Outcome Measures component of IPEDS. NCES is committed to working with the postsecondary education community to provide resources that will assist institutions with their outcome measures reporting.

Thank you for your suggestions as to the effective presentation of these data to the public. Your comment regarding using the proportion of the student body that the cohort represents is a salient one, and we will pass it on to the Scorecard development group for consideration in a future version of the College Scorecard.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 32

Document: ED-2013-ICCD-0029-DRAFT-0039
Name: Katherine Tromble
Address: Washington, DC

Email: [email protected]
Organization: The Education Trust

Date: May 10, 2013


The Education Trust Comments: Integrated Postsecondary Education Data System (IPEDS) 2013-2016


The Education Trust (Ed Trust) promotes high academic achievement for all students at all levels, pre-kindergarten through college. Our goal is to close the gaps in opportunity and achievement that consign far too many young people, especially those from low-income families or who are black, Latino, or American Indian, to lives on the margins of the American mainstream. In our research, policy, and advocacy work, we maintain an unflinching commitment to thorough data analysis, and as such, rely heavily on the Integrated Postsecondary Education Data System (IPEDS), the most comprehensive data source for information on institutions of higher education.


We laud the Department of Education’s efforts to continue, enhance, and improve this annual data collection. IPEDS provides crucial information for consumers, policymakers, institutions, and researchers about institutional characteristics, pricing and tuition, student financial aid, admissions, enrollments, completions, graduation rates, institutional finance, and human resources. In particular, the additional data elements proposed for inclusion in the 2014-15 and 2015-16 data collections will provide a more thorough understanding of student progression through and success in college. These additional data will prove immensely valuable in measuring student success. Once they are publicly available, Ed Trust likely will incorporate these supplemental measures of student success into College Results Online (CRO, www.collegeresults.org), a public-access Web database that facilitates comparisons of salient characteristics across similar institutions of higher education. This improved reporting also will allow other key consumer tools like College Navigator, the White House’s College Scorecard, and the Financial Aid Shopping Sheet to provide more comprehensive information on student success, facilitating institutional improvement, enhanced consumer choice, and better informed policymaking.


Feasibility of collecting expanded data on student completion


As part of a partnership with the National Association of System Heads (NASH), Ed Trust launched the Access to Success Initiative (A2S) in 2007. Through A2S, 22 state systems of higher education, representing more than 300 institutions, have committed to cutting in half their gaps in access and success for low-income students and students of color by 2015. As part of this commitment, each year they report to us data measuring their system and institution-level progress.


When the Initiative launched, the system heads knew that traditional measures of student success — those that only include first-time, full-time students — would be insufficient to benchmark their progress. Instead, they agreed upon common definitions of success that went beyond those included in IPEDS. These expanded measures include part-time and transfer students, who are missing from current IPEDS graduation rates, as well as low-income students, whose success rates are not discernible under our current federal reporting scheme. Not only did this diverse array of systems agree upon common definitions, but they have been reporting this broader set of data for five years. These systems were ahead of the curve in their commitment to data and transparency, but perhaps even more important, they have proven that it is feasible to report detailed information on student success and that the more detailed reporting doesn’t impose a huge burden.


Additional graduation-rate cohorts


In addition to Ed Trust’s role in the A2S Initiative, we also have a keen understanding of existing IPEDS data, including its strengths and weaknesses. As mentioned, our CRO Web tool relies on IPEDS data, including graduation rates. The “Student Right to Know” graduation rates — which measure the percent of first-time, full-time students who complete a credential at their initial institution within 150 percent of normal time — are the only comprehensive and comparable statistics on completion rates at individual institutions of higher education. We use these data in CRO, and for other research purposes, but recognize their limitations. In 2011, IPEDS graduation rates accounted for only 47 percent of undergraduates.2


Clearly, supplementary data on the success of non-first-time, full-time students is crucial to understanding outcomes for a broader range of today’s college students. As a result, we support the recommendations of Technical Review Panel #40 (TRP #40), Additional Selected Outcomes of the Advisory Committee on Measures of Student Success, that IPEDS collect data on students who are:


Full-time, first-time (have no prior postsecondary experience and have enrolled full-time with the intent to earn a degree, certificate, or other formal award);

Part-time, first-time (have no prior postsecondary experience and have enrolled part-time with the intent to earn a degree, certificate, or other formal award);

Full-time, transfer-in (have prior postsecondary experience and have enrolled full-time with the intent to earn a degree, certificate, or other formal award); and

Part-time, transfer-in (have prior postsecondary experience and have enrolled part-time with the intent to earn a degree, certificate, or other formal award).3


Success outcome reporting


In addition to recommending data collection on non-first-time, full-time students, TRP #40 also recommended that IPEDS collect information on subsequent enrollment of students who (1) receive formal awards and (2) do not receive formal awards. Institutions sometimes claim that their graduation rates look artificially low because students take longer than the allowed 150 percent graduation timeframe to complete, or they transfer to another institution. Providing information on subsequent enrollment will help evaluate these claims and provide a more accurate, comprehensive picture of student movement through the postsecondary system. Ed Trust supports collecting these additional outcome data. However, to improve the quality, utility, and clarity of the data, we recommend two changes to the following outcome categories outlined by TRP #40:


Received formal award:

o Subsequently enrolled at the reporting institution;

o Subsequently enrolled at another institution; or

o Subsequent enrollment unknown.

Did not receive formal award:

o Still enrolled at the reporting institution;

o Subsequently enrolled at another institution; or

o Subsequent enrollment unknown.


First, we recommend disaggregating these data by the type of award granted. This disaggregation is a crucial piece of information because completion rates may vary dramatically when considering completion of bachelor’s degrees, as compared with associate degrees or certificates. Many institutions offer a mixture of credential types, so the above categories would generate confusing and potentially misleading results. For example, in 2011-12:


Miami Dade College in Florida granted 1,497 undergraduate certificates, 11,959 associate degrees, and 667 bachelor’s degrees.

University of Phoenix’s Online campus awarded 311 undergraduate certificates, 39,341 associate degrees, and 32,432 bachelor’s degrees.

Santa Barbara City College in California awarded 1,738 associate degrees, and 1,049 certificates.4


Failing to disaggregate completion data by type of award would obfuscate precise outcomes at these institutions and others like them that provide more than one type of credential. While the TRP “agreed that student success can mean many things and felt that making a distinction between certificate and degree completion was too limiting,” we disagree. Collecting data with more specificity does not determine what qualifies as student success, nor is it limiting. Rather, it is less limiting and provides more information for consumers, policymakers, institutions, and researchers about the kind of success a student is likely to experience at a particular institution. As such, we suggest collecting outcome data, including information on subsequent enrollment (see additional comments below), for each type of formal award received (at a minimum: bachelor’s, associate, and certificate), as well as for students who attempted but did not attain a formal award.


Second, the Department can enhance the quality, utility, and clarity of these data by requiring institutions to report not just whether students subsequently enrolled in another institution, but what type of institution they enrolled in, as follows:


Subsequently enrolled at another four-year institution;

Subsequently enrolled at another two-year institution; or

Subsequently enrolled at another less-than-two-year institution.


Providing this level of detail does not, as some critics claim, make value judgments about different sectors. Rather, it provides information in a way that promotes accurate and useful decision-making. As we’ve stated previously in response to TRP #37’s recommendations, this information is vitally important not only to prospective students who enroll in two-year institutions intending to transfer, but also to policymakers who need to evaluate whether 2+2 models are actually providing a viable pathway to the bachelor’s degree in their states. Finally, many institutions are already able to report such information, as evidenced by the hundreds of colleges participating in the A2S and Complete College America initiatives. Both of these initiatives require systems or states to report transfer-out rates for two-year colleges by level of receiving institution. While some institutions may face challenges in tracking student transfer, the requirement to submit these data, along with the clear appetite for this information, should drive states to improve their State Longitudinal Data Systems.


Finally, while additional information on continued and subsequent enrollment is helpful, we urge the Department not to include students falling in these “still enrolled” categories in the numerator of a success-rate calculation, except in the case of students who transfer from a two-year to a four-year institution. Rates of subsequent enrollment provide useful contextual information on student progression, but they should not be counted as “successes” in a graduation-rate measure. Just as a student who is still enrolled in year 5 is not included in the numerator, she also should not be counted in the numerator if she is still enrolled (at the same or another institution) in year 6 (or later). Graduation rates should count as successes only students who achieved an intended outcome: earning a credential or transferring from a two-year to a four-year institution.
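
A minimal sketch of the numerator rule recommended here, using hypothetical field names and records that are not IPEDS data elements: completers always count as successes, still-enrolled students never do, and transfers count only when a student moves from a two-year to a four-year institution.

    # Hypothetical sketch of the recommended success-rate rule; field names are
    # invented for illustration only.
    def success_rate(cohort):
        successes = sum(
            1 for s in cohort
            if s["completed"] or (s["reporting_sector"] == "2-year" and s["transferred_to_4yr"])
        )
        return successes / len(cohort)

    cohort = [
        {"completed": True,  "reporting_sector": "2-year", "transferred_to_4yr": False},  # completer
        {"completed": False, "reporting_sector": "2-year", "transferred_to_4yr": True},   # vertical transfer
        {"completed": False, "reporting_sector": "2-year", "transferred_to_4yr": False},  # still enrolled
        {"completed": False, "reporting_sector": "2-year", "transferred_to_4yr": False},  # no award, not enrolled
    ]
    print(f"{success_rate(cohort):.0%}")  # 50%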


Timing of new reporting


TRP #40 suggests two options for the timing of reporting these expanded success measures:


Option 1: Report an Interim Status and a Final Status, or

Option 2: Report Final Status.


Ed Trust strongly recommends Option 1: Report an Interim Status and a Final Status, which would provide information and data in a more relevant and actionable time period. Option 1 would require reporting of student success data at the 150 percent and 200 percent points, rather than waiting until students have completed the 200 percent timeframe and reporting retrospectively on the 150 percent rate. Graduation-rate data already are reported retrospectively, providing a snapshot of the performance of students who entered six years earlier (at a four-year institution). While this lag is unavoidable because of the nature of graduation-rate data, the Department should make efforts to limit this lag period as much as possible. In fact, the data would be even more relevant and timely if institutions were to report data at the 100 percent, 150 percent, and 200 percent points. Eight years is far too long to wait for a glimpse into institutional performance.


Additional recommendation for improving data on student success


The A2S systems report student success information on part-time and transfer students, but also report on the success rates of another key demographic: Pell Grant recipients. Current IPEDS data do not include graduation rates by financial aid status. While institutions are required to disclose these data, they are not reported to IPEDS, nor are they available in one common location. Furthermore, research has found that only 25 percent of institutions are in compliance with the disclosure requirement.5 In order to reach our national attainment goals, we must graduate more low-income students, and to truly drive improvement, we must first understand our current levels of performance. The Department of Education should require institutions to report graduation rates by financial aid status to IPEDS, using the following categories:


Pell Grant recipients,

Subsidized Stafford loan recipients who do not receive Pell Grants, and

Students who receive neither Pell Grants, nor subsidized Stafford loans.


The Department of Education’s changes to IPEDS’ measures of student success – especially if made as outlined in these comments – will provide clearer, more accurate, and more timely data on postsecondary outcomes. These data will paint a more precise portrait of student success that captures the diversity of today’s college students. The data also will prove immensely valuable for students making college decisions, institutions of higher learning working to advance student success, and policymakers attempting to evaluate and improve existing and future policies. For us, the data will better inform our work on ways to close gaps and increase completion for low-income students and students of color. Thorough and accurate data hold enormous power to drive our postsecondary system toward better student outcomes, and we recommend the Department make the proposed changes to IPEDS to achieve these more accurate and thorough data.


NCES Response:

Dear Ms. Tromble,

Thank you for your comment dated May 10, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS and appreciates Ed Trust’s support of the additional metrics related to student progress and completion included in this proposed collection. We understand Ed Trust’s desire to collect more specific information on completion type with the graduation rate metric. For the data user community, more data are almost always a welcome addition. However, NCES must balance data users’ need for more and better information with minimizing the burden on data providers. Between those two groups is the legal and regulatory framework that supports this very important federal data collection. In general, NCES will try to adhere to the recommendations of the TRP sessions and considers those forums essential for preparing a relevant and achievable collection.

With respect to your comments about the proposed new Outcome Measures (OM) component, two separate TRPs were held on this topic, and these issues were discussed at length and then carefully considered by NCES. After much discussion, the TRP suggested that distinguishing between transfer from a 2-year institution to a different 2-year institution, as opposed to transfer from a 2-year to a 4-year institution, was unnecessarily burdensome for institutions. Because this information describes an institution’s offerings and not necessarily the program that the student enrolled in (i.e., a student transferring to a 4-year institution could be enrolled in a sub-baccalaureate certificate program, an associate’s degree program, or a bachelor’s degree program), it may not be particularly meaningful for these purposes. Further, given the substantial increase in burden that this new collection already places on institutions, NCES has proposed that this information be collected only from degree-granting institutions and only at one point in time, namely 8 years after a student’s entry.

During the TRP, it was noted that collecting data on any award received would not be as useful for institutions that award multiple degree levels, or as constructive for addressing policy questions, as collecting data by award level (e.g., number of associate’s degree completers, number of bachelor’s degree completers). However, the panel agreed that student success can mean many things and felt that making a distinction between certificate and degree completion was too limiting. Further, collecting information on any award better accounts for students who receive an award and subsequently enroll at the reporting institution or another institution. There was also a concern that collecting information about subsequent enrollment would imply that institutions are required to track this information. Despite the challenges with measuring transfer activity, the panel agreed with TRP #37 that reporting such information on non-first-time students who are enrolled either full-time or part-time would provide a meaningful measure in the context of progression and completion outcomes.

It is not yet clear what nomenclature will be used for statistics based on the new Outcome Measures data on subsequent enrollment. In other federal statistics, such as unemployment, there are different variations of the statistic that are used for various purposes. It is likely that collecting the new IPEDS information will lead to a more detailed and varied set of statistics on outcomes used by the federal government and postsecondary community, and that these new statistics will be used to complement the graduation rate statistics already in use.

Collecting data at an interim data point for cohorts was discussed at a TRP. It was decided that asking institutions to provide status updates at two different points in time would further increase burden and make the number of different cohorts that an institution reports on in a given year unwieldy.

Your suggestions for collecting data to calculate cumulative debt at graduation and graduation rate information for Pell recipients have been discussed at past meetings of the IPEDS Technical Review Panel (TRP) and National Postsecondary Education Cooperative (NPEC), and have been carefully considered by NCES. Currently, much of this information is available at a national level through the National Postsecondary Student Aid Study (NPSAS) and the postsecondary longitudinal sample surveys. In addition, steps have been taken to allow for ED’s National Student Loan Data System (NSLDS) to provide this information at the institution level in the future. Therefore, in an effort to obtain this information from already existing data systems and keep reporting burden at a minimum, NCES feels that IPEDS should not collect this information from institutions.

Given your support of the new cohort groups, you should be aware that after reviewing comments from the public and considering the difficulty in creating retroactive cohorts from 2006, NCES is now proposing in the Information Collection Review (ICR) to begin collecting outcomes information one year later, in 2015-16, using the cohorts that began in 2007. It is our hope that providing an extra year to prepare will alleviate some of the difficulty in creating the cohorts and completing the Outcome Measures component of IPEDS. NCES is committed to working with the postsecondary education community to provide resources that will assist institutions with their outcome measures reporting.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 33

Document: ED-2013-ICCD-0029-DRAFT-0040
Name: Michael Self
Address: Miami, FL,

Email: [email protected]
Organization: Miami Dade College

Date: May 11, 2013

When collecting and reporting graduation rates, maintain separate rates. Institutions will benefit more from having one rate for full-time student cohorts and one rate for part-time student cohorts instead of having one overall graduation rate that includes both. This will increase an institution’s ability to develop strategies to improve graduation rates that can be tailored to the population of interest.

Concerning reporting of cost of attendance, provide institutions with the opportunity to describe the population that they are reporting cost of attendance for (i.e., dependent living with family, independent living off campus). This will enhance an institution’s ability to effectively compare themselves to other peer institutions.

NCES Response:

Dear Mr. Self,

Thank you for your comment dated May 11, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. On the new Outcome Measures component, institutions will report data for all four cohorts separately, so those data will be available separately through the IPEDS Data Center and other data use tools and tabulations. The Technical Review Panel was quite clear that the utility of these new measures can be found in their disaggregation, not as a single measure.

Institutions currently report cost of attendance information separately by living arrangement (on-campus, off-campus with family, off-campus without family), so those data are similarly available disaggregated through the IPEDS Data Center and College Navigator. If there are additional places where seeing these data disaggregated would benefit institutions, please let us know.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 34

Document: ED-2013-ICCD-0029-DRAFT-0041
Name: Linda Miller
Address: Ithaca, NY,

Email: [email protected]
Organization: Cornell University Library

Date: May 13, 2013

Hello-

Comments from Cornell University Library:

Having read the definitions currently available, we know we will not be able to provide these measures because of our workflows:
• Databases
• Media
• Number of branch and independent libraries, excluding the main or central library (national definitions do not provide enough guidance)
• Are staff fringe benefits paid out of the library budget? Y/N (see next entry)
• Fringe benefit expenditures (for the Library, except for sponsored programs, only endowed unit benefits are paid through the Library budget)
• Material expenditures breakouts: one-time purchases of books, serial backfiles and other materials; ongoing commitments to subscriptions; other information resources
• Other operations and maintenance expenditures breakouts: preservation services; all other operations and maintenance expenditures

We may also not be able to provide this measure
• Circulation – digital/electronic

Thank you, Linda Miller

NCES Response:

Dear Ms. Miller,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register.

The IPEDS Technical Review Panel (TRP) that discussed and made suggestions concerning the reintegration of the Academic Libraries Survey (ALS) into IPEDS was composed of 42 individuals representing the federal government, state governments, institutions, data users, association representatives, and others. Of these, seven individuals represented academic libraries.

Although the size and scope of the Academic Libraries Survey (ALS) data collection is decreasing from what was collected biennially until now, most of the remaining items will not change. Consequently, instructions and definitions will remain the same where the items are not changing, and new instructions and definitions will be provided where needed. One notable exception is in the area of staffing. Currently, when reporting staff data to the ALS, libraries provide the number of filled or temporarily vacant full-time equivalent (FTE) positions and the corresponding salary and wage data. The TRP suggested that instead of collecting FTE, IPEDS should collect a count of part-time and full-time library staff to remain consistent with how data are collected throughout IPEDS. These data are collected on the Human Resources (HR) component of IPEDS and can be used to generate FTE estimates. Starting with the 2012-13 data collection, librarians and library technicians are reported in separate categories.

It does appear that Cornell University was able to respond to the longer form of this survey in 2010 and provided many of the items of concern listed in the comment. In the case of a new item like Databases, Cornell University was able to account for many of the constructs that will be used in the Databases item on the new form.

Please do not hesitate to contact the IPEDS Help Desk for further assistance (at 877-225-2568 and at [email protected]).

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 35

Document: ED-2013-ICCD-0029-DRAFT-0042
Name: Linda Anderson
Address: Allendale, MI,

Email: [email protected]
Organization: GVSU Libraries

Date: May 13, 2013

We use information from the sections listed below that are proposed to change (or no longer be collected):

  • FTE total, including the breakdown by librarians, professional staff, and other paid staff.

  • Information on student employees

  • Operating expenditures (which would include computer hardware and software)

  • Circulation information

  • Information services to individuals*

  • *Regarding questions for virtual references, this isn't useful without understanding of the other information services provided.

  • Of these items listed, we would most like to see the FTE and salary information remain in the survey.


NCES Response:

Dear Ms. Anderson,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The IPEDS Technical Review Panel (TRP) discussed the structure of the expenditures portion of the component at length.

The proposed Academic Libraries Survey (ALS) is shorter than the previous version. Information on library staff and faculty has been moved to the Human Resources (HR) component of IPEDS, which is proposed to collect employee information on library staff. Specifically, there will be questions on ‘library technicians’; ‘librarians’; and ‘Archivists, Curators, and Museum Technicians.’ This information will be collected with respect to gender, race/ethnicity, full-/part-time status, and tenure track. Total salary and wage expenditures and benefit expenditures will remain on the ALS. Because the proposed ALS is a shorter form, some broader categories of information will be collected; computer hardware and software will be collected in an ‘other’ category and not delineated specifically. Below is some background information on the reasoning for the proposed changes.

The current Academic Libraries Survey (ALS) 300 series collects data on funds expended by the library in the most recent fiscal year from its regular budget and from all other sources—for example, research grants, special projects, gifts and endowments, and fees for services. Several data elements in the ALS 300 series are disaggregated into detailed categories. A number of panelists agreed that defining and repurposing the existing categories is problematic because of the lack of clarity surrounding the current definitions and how the elements can be affected by changing technology. The panel noted that the IPEDS Finance component does not capture detailed expenditure data at the level of granularity in the ALS. As a result, data from this series can be gathered from the library budget but cannot be pulled from the institution’s general purpose financial statement. Consequently, the panel suggested collapsing detailed categories into aggregate categories.

Removing the detail significantly decreases reporting burden, and the panel agreed that this reporting method preserves ALS trend data on expenditures. While there is value in collecting more detailed information on expenditures and capturing data to reflect the changing dynamic from the purchase of materials to the leasing of materials, the panel concluded that IPEDS is not the appropriate instrument for collecting this information.

The TRP also discussed the collection of staffing information. Currently, when reporting staff data to the ALS, libraries provide the number of filled or temporarily vacant full-time equivalent (FTE) positions and the corresponding salary and wage data. The TRP suggested that instead of collecting FTE, IPEDS should collect a count of part-time and full-time library staff to remain consistent with how data are collected throughout IPEDS. These data are collected on the Human Resources (HR) component of IPEDS, and can be used to generate FTE estimates. Starting with the 2012-13 data collection, librarians and library technicians are reported in separate categories; unfortunately, information about student employees is not collected in IPEDS, although information on graduate assistants is.
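
As a rough illustration of how FTE can be estimated from those headcounts, the sketch below assumes the common convention of counting each part-time staff member as one-third of an FTE; both the convention and the numbers are assumptions for illustration, not definitions stated in this collection.

    # Hedged sketch: estimating library staff FTE from full-time and part-time
    # headcounts, assuming a one-third weight for part-time staff (an assumed
    # convention, not a definition from this collection).
    def estimated_fte(full_time, part_time):
        return full_time + part_time / 3

    print(estimated_fte(full_time=24, part_time=9))  # 27.0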

Collection of information on both physical and digital/electronic library collections and circulation will be retained.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 36

Document: ED-2013-ICCD-0029-DRAFT-0043
Name: Lauren Asher
Address: Oakland, CA,

Email: [email protected]
Submitter's Representative: Matthew Reed
Organization: The Institute for College Access & Success

Date: May 13, 2013



Ms. Kate Mullan

Acting Director, Information Collection Clearance Division

U.S. Department of Education

400 Maryland Avenue SW, LBJ Room 2E117

Washington, DC 20202-4537

[email protected]

(submitted electronically via: www.regulations.gov)




Dear Ms. Mullan:


We are writing in response to the request for comments on the proposed revision of the Integrated Postsecondary Education Data System (IPEDS), published in the Federal Register on March 14, 2013, docket number ED-2013-ICCD-0029. The Institute for College Access & Success (TICAS) is an independent, nonprofit organization that works to make higher education more available and affordable for people of all backgrounds. By conducting and supporting nonpartisan research and analysis, TICAS aims to improve the processes and public policies that can pave the way to successful educational outcomes for students and for society.


As TICAS has long recommended, incremental changes to IPEDS could result in substantial improvements in the availability of meaningful data about student borrowing and outcomes. In these comments, we recommend specific changes that will greatly improve the information available to consumers, researchers, and policymakers while minimizing reporting burden for colleges.


The recent efforts of the U.S. Department of Education (ED) to make better data available to students and consumers underscore the urgent need for better information on these fronts. Both the College Scorecard and the Financial Aid Shopping Sheet were designed to help students and families make informed decisions about whether and where to attend college and how to pay for it. However, the absence of good available data has led ED to make compromises that may mislead rather than enlighten consumers. For example, without cumulative debt figures for all colleges, the College Scorecard compares median debt figures that are apples-to-oranges, not distinguishing between colleges where few or all students borrow, or where few or all students graduate6.


We applaud ED's efforts to expand the collection and reporting of debt and outcome data, and provide detailed comments below.


Cumulative debt at graduation


As noted above, there is an urgent need for better data on cumulative student loan debt for use in the College Scorecard and Financial Aid Shopping Sheet. The best data currently available, from the Common Data Set (CDS), are grossly insufficient: they are reported voluntarily, by only some four-year colleges, and for bachelor’s degree recipients only. We commend ED for working toward obtaining better data through the National Student Loan Data System (NSLDS), but such data will not be available until late 2014 at the earliest and will cover only federal student loans. Consumers need better data right now.


We recommend that IPEDS immediately start collecting data on cumulative debt at graduation for completers of undergraduate certificates, associate's degrees, and bachelor's degrees. To minimize reporting burden and ensure apples-to-apples comparisons, we suggest applying the Common Data Set (CDS) definitions already established for questions on this topic to certificates and associate’s degrees as well as bachelor’s degrees7. That is, collect data for students who started as first-time undergraduates at the reporting institution and count only student debt accumulated at the reporting institution (excludes transfer-in students and debt accumulated at other institutions).


Specifically, for each award level noted above, IPEDS should collect and report the following data points:

Number of students in graduating class (as defined above)

Number of graduating students with debt

Total debt of the graduating class


Collecting those data would allow the National Center for Education Statistics (NCES) or stakeholders to calculate the percent of graduates with debt and the average debt per borrower. Minimally, these data should be collected for all student loans, federal student loans, and non-federal student loans separately. Ideally, non-federal loans should be further divided by source (states, colleges, and banks/lenders).
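
A minimal sketch of those derived statistics, using invented numbers rather than data from any college:

    # Hypothetical sketch: deriving the two statistics named above from the three
    # recommended data points. All figures are invented for illustration.
    graduating_class = 1_000        # number of students in graduating class
    graduates_with_debt = 620       # number of graduating students with debt
    total_debt = 16_740_000         # total debt of the graduating class, in dollars

    percent_with_debt = graduates_with_debt / graduating_class      # 0.62, i.e., 62%
    average_debt_per_borrower = total_debt / graduates_with_debt    # 27,000 dollars
    print(f"{percent_with_debt:.0%} borrowed; average debt ${average_debt_per_borrower:,.0f}")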


We recommend collecting these data in IPEDS starting in 2013-14. If and when such data are validated through NSLDS calculations, they may be dropped from required IPEDS reporting.


Graduation rates for Pell recipients


In 2008, the Higher Education Opportunity Act added a requirement that colleges disclose graduation rates for Pell Grant recipients, Subsidized Stafford Loan recipients without Pell Grants, and all other students8. However, many colleges do not make these data available9. This hampers the ability of researchers, policymakers, and consumers to understand which colleges not only enroll substantial numbers of low-income Pell Grant recipients, but also graduate them.


We commend ED for working toward the collection and reporting of comprehensive data on this topic using NSLDS. However, such data will not be available for several years. Therefore, IPEDS should immediately begin collecting these data, particularly graduation rates for Pell Grant recipients. Minimally, these data should be collected for graduation rates tracking first-time, full-time students completing within 150% of normal time. As colleges are already required to calculate and disclose these numbers, reporting them to IPEDS would not constitute an additional burden on colleges.


Ideally, a sub-cohort of Pell recipients should be tracked in all of the required cohorts for 2014-15 reporting and beyond, currently proposed as follows:

Within 150% of normal time for first-time, full-time students

Four, five, and six years after entry for bachelor's degree-seeking first-time, full-time students.

Eight years after entry for first-time full-time, first-time part-time, non-first-time full-time, and non-first-time part-time cohorts.


Annual private (non-federal) loan data for all undergraduates


As student debt levels continue to rise, it is important to note that the type of borrowing as well as the amount of borrowing matters. Private (non-federal) student loans are one of the riskiest ways to pay for college, generally lacking the capped, fixed interest rates, flexible repayment plans, and other borrower protections built into federal loans.


Currently, IPEDS collects data on annual federal and private loan borrowing for first-time, full-time undergraduates, but only collects data on annual federal loan borrowing for all undergraduates. With so many undergraduates following non-traditional pathways through higher education and private student loan volume starting to rise once again, it is crucial that consumers and policymakers have timely data about private student loans for all undergraduates at each school, not just those entering as first-time, full-time students. Ideally, these data should be disaggregated by source (states, colleges, or banks/lenders).


We have also long called for ED to track private as well as federal student loans in its student loan database, which is ultimately the best way to provide accurate and comprehensive data on these loans. But as there are no immediate plans to do so, it is imperative that IPEDS collect these data.


Number of loan-eligible students (for CDR PRIs)


Colleges facing the loss of eligibility for federal student aid on the basis of persistently high cohort default rates (CDRs) have a number of ways to appeal to ED to avoid sanctions. Colleges with high CDRs but a low share of students borrowing federal loans may appeal using the Participation Rate Index (PRI). As part of this process, the college calculates the number of students eligible for federal student loans, the number receiving such loans, the participation rate (number receiving divided by number eligible) and the PRI (participation rate times CDR). The PRI recognizes that CDRs may not be representative indicators of institutional quality at colleges where CDRs – which only describe the share of borrowers who default – reflect outcomes for only a small share of students. However, the data required to calculate the participation rate and the PRI are not available publicly. This is because IPEDS collects the number of undergraduates receiving federal student loans, but not the total number of students receiving federal student loans. In addition, for most colleges, the cohort for these data is fall enrollees only, not all enrollees during a full 12-month award year.


We recommend that IPEDS collect and report the following data points to facilitate the calculation of PRIs:

Number of students receiving federal Direct Loans (undergraduates and graduate students, full 12-month award year)10

Number of regular students who were enrolled at the institution on at least a half-time basis during any part of the award year11


With these data points, NCES, ED, or other stakeholders could calculate colleges’ participation rates and PRIs. These data would provide important context for the public and policymakers by distinguishing between schools where CDRs are more and less meaningful indicators of quality. Collecting these figures would also help colleges better understand their risk of sanctions and prevent schools from unnecessarily withdrawing from the student loan program, which cuts students off from the safest way to borrow if they cannot otherwise afford to stay in school12. Colleges would still be able to appeal CDR sanctions using a different 12-month period from the academic year by submitting the relevant data to ED.
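
A minimal sketch of the participation rate and PRI arithmetic described above, with invented figures in place of real institutional data:

    # Hypothetical sketch of the Participation Rate Index (PRI) calculation.
    # All figures are invented for illustration.
    students_receiving_federal_loans = 300
    loan_eligible_students = 2_000
    cohort_default_rate = 0.32      # 32% CDR

    participation_rate = students_receiving_federal_loans / loan_eligible_students  # 0.15, i.e., 15%
    pri = participation_rate * cohort_default_rate                                  # 0.048
    print(f"participation rate = {participation_rate:.0%}, PRI = {pri:.3f}")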


Reporting better graduation-rate data


ED has taken a number of promising steps to implement the recommendations of the Committee on Measures of Student Success (CMSS). In particular, ED’s proposed changes to IPEDS include a substantial expansion of the graduation rate data collected and reported. For the first time, data would be collected not only for first-time full-time undergraduates, but also for first-time part-time, non-first-time full-time, and non-first-time part-time undergraduates.


We make three recommendations related to how these data should be reported:

- As noted in our earlier comments on this topic13, we strongly recommend that only "vertical transfer" (e.g., 2-yr to 4-yr institution) be included in success measures. The currently proposed way of capturing transfer is to count students who “subsequently enrolled at another institution,” which presumes that subsequent enrollment at any other institution of higher education be considered a successful outcome. However, enrolling at one college after another is not necessarily a sign of student success. Consider a student who moves quickly from one college to another because they were not satisfied with the first one, or a student who realizes after a year of coursework that their credits will not transfer and subsequently opts to start over elsewhere. Including these types of subsequent enrollments as successes would make the progression and completion measures less meaningful, not more so.

- We also recommend that these graduation rates apply to all institutions, not just degree-granting institutions. Many students choose between certificate programs in a particular field at degree-granting and non-degree-granting institutions. Some colleges are classified as degree-granting on the basis of a small number of degrees, but are in fact predominantly certificate-granting institutions.

- Finally, because the time it takes to complete a degree or certificate matters, IPEDS should collect the outcomes status of each cohort twice: four years and eight years after entry for four-year colleges and three years and six years after entry for two-year colleges. This was one of the options originally put forward for consideration by the technical review panel (TRP) on this topic. As currently proposed, all degree-granting institutions would report outcomes data for these cohorts only once, eight years after students’ entry.


Better data about and for veterans and service members


As noted in the IPEDS TRP report on this topic14, since the Post-9/11 GI Bill went into effect in August 2009, there has been substantial growth in the number of students receiving education benefits for veterans or service members and the total dollars received under these programs, reaching a total of $8 billion in fiscal year 2010. It is critical that veterans and service members, policymakers, and the public have sufficient information about participating colleges to ensure that this investment is not only supporting increased access for this student population, but also that it is helping them to complete meaningful post-secondary credentials without incurring burdensome levels of debt.


Specifically, these stakeholders are interested in where these students enroll, what services are offered for them, how successful they are at different institutions, and what the costs are for students and taxpayers. The current proposal includes appropriate changes to IPEDS to collect information on veteran and service member enrollment and access to services. However, these changes do not go far enough, as they neglect crucial questions about outcomes and provide very limited information about costs for these student populations. Specifically:


- It is crucial to track not only where these students enroll but also where they are successfully completing degrees and certificates. Just showing how many veterans and service members have enrolled at a particular school, what services that school provides, and limited information on tuition benefits received at that school will do little to help these students determine how much it will cost them to attend or their odds of success. Indeed, it could mislead veterans and service members to see the enrollment of a lot of students like them and the availability of certain services as direct indicators of institutional quality and value.


To correct for this, we recommend that IPEDS collect data on the number of veterans and service members completing degrees or certificates by award level and field of study (CIP Code). Veterans and service members should be disaggregated in the current retention rates. For graduation rates, minimally these student populations should be disaggregated when tracking first-time, full-time students graduating within 150% of normal time. Ideally, they should be disaggregated in the new graduation rate data collected under ED’s proposed changes. Given that colleges are already required to identify veterans and service members in some ways (e.g., the process of benefit certification or self-identification on the FAFSA), tracking their outcomes should not be burdensome.


- Information about affordability is also crucial both for veterans and service members and for policymakers. We commend ED for including in the proposed changes data on the number of undergraduates receiving assistance from the Post-9/11 GI Bill and Department of Defense (DoD) Tuition Assistance and the total tuition and fee amounts received by these students. This information is already available to institutions, and the collection and dissemination of it will shed important light on where substantial federal investments are being made.


We urge ED to continue to work with the Department of Veterans Affairs (VA) to make more comprehensive data available on the number of students receiving benefits under all VA and DoD programs and the total dollar amounts received under these programs.


Our previous comments on this topic include additional recommendations, including incorporating military/veterans benefits into net price calculators15.


Data integration


The lack of a common identifier for colleges across different federal datasets continues to pose substantial challenges to users of ED's data, including consumers, researchers, and policymakers16. Until common identifiers are established among NCES, Federal Student Aid (FSA), the Office of Postsecondary Education (OPE), and federal agencies outside the Department, such as VA and DoD, ED should at a minimum provide a definitive crosswalk and mapping tools to help users integrate FSA data with IPEDS data using OPEID numbers. Within IPEDS, ED can take the simple step of asking all colleges for all eight-digit OPEID numbers that correspond to each UNITID every year. Including this information in the institutional characteristics data released on the IPEDS Data Center would be a first step toward providing a comprehensive crosswalk.


Marketing/recruiting expenses at for-profit colleges


ED’s proposed changes to IPEDS include increasing the level of detail in for-profit colleges’ reporting of revenues, expenses, assets, and liabilities. While this represents a good first step, IPEDS should also collect data on expenditures for recruiting, advertising, and marketing. In this, we concur with the comments submitted by Senator Tom Harkin, Representatives Elijah Cummings and Raúl Grijalva, and the National Association for College Admission Counseling (NACAC) in response to the TRP on this topic17.


Thank you for the opportunity to share our suggestions and concerns on this important topic. Please feel free to contact me or my colleague Matthew Reed via email at ljasher@ticas.org or m[email protected], or by phone at (510) 318-7900, with any questions.



Sincerely,



Lauren Asher

President


NCES Response:

Dear Ms. Asher,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.

NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with the task of minimizing the additional institutional burden as a result of increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

Your suggestions for collecting data to calculate cumulative debt at graduation and graduation rate information for Pell recipients have been discussed at past meetings of the IPEDS Technical Review Panel (TRP) and National Postsecondary Education Cooperative (NPEC), and have been carefully considered by NCES. Currently, much of this information is available at a national level through the National Postsecondary Student Aid Study (NPSAS) and the postsecondary longitudinal sample surveys. In addition, steps have been taken to allow for ED’s National Student Loan Data System (NSLDS) to provide this information at the institution level in the future. Therefore, in an effort to obtain this information from already existing data systems and keep institutional reporting burden at a minimum, NCES feels that IPEDS should not collect this information from institutions.

Your suggestion for collecting data on private borrowing for all undergraduates is one that may be considered by a future meeting of the IPEDS TRP. At past meetings of the TRP, institutions have made it clear that it is not always known to an institution if the student has taken a private loan in addition to Title IV, state, or institutional aid they may be receiving. In an effort to minimize reporting burden, the data on aid received by all undergraduates were kept to a minimum, as the data required to calculate an average net price for an institution per the Higher Education Opportunity Act substantially increased the reporting burden for the Student Financial Aid component of IPEDS.

NCES appreciates your interest in the Cohort Default Rate (CDR) calculation and in providing as much context as possible for this rate. However, the calculation of the CDR and its appeal process rests with the Office of Federal Student Aid (FSA). Suggestions for improvements to this process and additional data that may help inform the public and policymakers about the rate are best directed to FSA. NCES will pass these suggestions on to FSA, but encourages you to do so as well.

With respect to your comments about the proposed new Outcome Measures (OM) component, two separate TRPs were held on this topic and these issues were discussed at length and then carefully considered by NCES. After much discussion, the TRP suggested that distinguishing between transfer from a 2-year to a different 2-year institution as opposed to transfer from a 2-year to a 4-year institution was unnecessarily burdensome for institutions. Because this information describes an institution’s offerings and not necessarily the program that the student enrolled in (i.e., a student transferring to a 4-year institution could be enrolled in a sub-baccalaureate certificate program, an associate’s degree program, or a bachelor’s program), it may not be particularly meaningful for these purposes. Further, given the substantial increase in burden that this new collection already places on institutions, NCES has proposed that this information be collected only from degree-granting institutions and only at one point in time—namely, 8 years after a student’s entry. Asking institutions to provide status updates at two different points in time would further increase burden and make the number of different cohorts that an institution is reporting in a given year unwieldy.

NCES appreciates your thoughtful comments on how better data may be made available to veterans and servicemembers, as well as on how to provide policymakers and the public with more information about the education benefit programs offered to these students. The IPEDS TRP held on this topic suggested that IPEDS is not the appropriate vehicle for collecting graduation rate information for these students, especially given that servicemembers are a particularly transient population due to the nature of their work, and IPEDS, as an institution-level data collection, would not capture this activity well. ED is working closely with the Department of Veterans Affairs (VA) and the Department of Defense (DoD) as they work towards providing this information from their already existing data systems.

Your suggestion that IPEDS collect data on marketing and recruiting expenses through the Finance survey component for for-profit institutions was also discussed at length at a meeting of the IPEDS TRP on this topic. The TRP suggested that because the scope of advertising and marketing expenditures is too complex, crossing several functional expense categories, it is not practical to include as a separate category in IPEDS at this time.

ED is aware that the different institutional identifiers used by ED complicate data integration activities and has been actively looking into ways to aid this type of work. However, each of these identifiers was created to satisfy a specific law and as such has different requirements related to it. Your comments on a definitive crosswalk to be provided by ED to the public will be passed on to the leadership at NCES to determine if this request is feasible.

Finally, NCES suggests that you send any comments specifically related to the College Scorecard to [email protected]. ED invites public comment on the College Scorecard and suggestions for how it may be improved in the future.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 37

Document: ED-2013-ICCD-0029-DRAFT-0045
Name: Scott Filter
Address: Washington, DC,

Email: [email protected]
Organization: Bridgepoint Education, Ashford University, University of the Rockies

Date: May 13, 2013


Comments on IPEDS Data Collection

These comments are being provided by Bridgepoint Education on behalf of Ashford University and the University of the Rockies to the National Center for Education Statistics regarding the new data elements being proposed in the IPEDS surveys.

Selected Outcomes of the Advisory Committee on Measures of Student Success

Overall, we are generally supportive of the inclusion of non-first-time and non-full-time students. However, we believe that the variables requested do not truly reflect part-time and non-full-time students at all institutions, but instead reflect those students at two-year institutions. This focus reflects a limited understanding of non-first-time, full-time students and their educational habits. We would encourage NCES to conduct further study to better understand part-time and non-first-time students beyond those who attend community colleges. For example, the proposed collection requires reporting graduation rates eight years after entry, which assists community colleges but disadvantages institutions that offer bachelor’s degrees. By requiring all institutions to report graduation rates at the eight-year mark, rather than at a percentage of normal completion time, the benefit goes to shorter programs over longer ones. For instance, a four-year institution would report graduation at 200% of program length, while a community college would report it at 400% of program length, giving community colleges an unfair advantage.

Veterans

NCES has requested comments on adding questions regarding the identification of veteran enrollment and institutional policies directed towards veterans. IPEDS proposes to include information on the benefits available to students, such as the Post-9/11 GI Bill, credit for military training, membership in Servicemembers Opportunity Colleges, and having a dedicated point of contact for student support for veterans. Coupled with this, IPEDS requests the number of students receiving Post-9/11 GI Bill benefits and DoD Tuition Assistance. In addition, IPEDS requests the specific URL where the university provides tuition policies related to veterans and military service. Given the large number of veterans that our universities support, we believe it is a benefit to report and disseminate information on this important group of students. We believe that collecting this information will be a first step in showing the success that veterans and those in military service are having at our institutions.

While we are supportive of the request to include data for this important and overlooked group of students, we are concerned about the request for the URL for tuition policies for veterans and service members, which will presumably be given to students in a disclosure, such as through College Navigator. As a non-term institution, we are concerned that the information gathered through the URL may be inaccurate by the time NCES publishes it, as information and policies change faster at a non-term institution than they do at a traditional, semester-based institution. Institutions that are non-term based should not be penalized for information that changes between the point of submission and the point at which NCES publishes that information.

Finance

IPEDS requests comments on changes to its Finance survey to better gather information from the For-Profit sector of higher education. The changes separate out a number of variables to determine where funding comes from and also add two variables that request information on the amount of taxes paid to federal, state, and local governments. These changes are beneficial to us because they provide more information for accurate comparisons between our institution and others (both in the For-Profit sector and in the Not-For-Profit sector). In addition, including taxes paid by For-Profit institutions will reflect the amount that we provide back to the government and, in turn, demonstrate that we do not only take funds from the government but also give back a fair share of the profits earned. We believe that this will benefit our institutions by providing a fairer depiction of them in the federal data reported.

We strongly support changes to the Finance survey to include more information on For-Profit institutions. We believe that this information will show the benefit that For-Profit institutions provide to the community. It is beneficial to have this university-level disaggregated information available to the education community in a format with which they are familiar. We believe the information on tax rates to be particularly beneficial and helpful.

However, regional accrediting agencies currently take into account other data points, not included in IPEDS, to assess the financial strength of institutions. We encourage NCES to align IPEDS data with the information that regional accreditors use, to ensure that data exist in both formats and that variable definitions are consistent, which will limit confusion in understanding the financial data supplied on For-Profit institutions.

NCES Response:

Dear Dr. Filter,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. We agree that there will be increasing demand for information for active service members and veterans on educational services available to them and that this IPEDS enhancement is an important first step. As with all data collected in IPEDS, the URL provided in the collection will be released. When users access the released URL, they should be able to see the most up-to-date information posted by the institution at that URL.

For discussion of graduation rates and the new outcome measures, please see responses to comments 16-23 above. Thank you for your support of the enhancement of the financial section of IPEDS. We agree that it will help to provide a more complete understanding of the for-profit sector as well as all of postsecondary education. NCES will continue to work with states and other consortia to align definitions and collections where possible.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program


Comment 38

Document: ED-2013-ICCD-0029-DRAFT-0046
Name: Zora Ziazi
Address: Chicago, IL,

Email: [email protected]
Organization: National Louis University

Date: May 13, 2013

A primarily adult-serving institution, National Louis University (NLU) does not have a “traditional” student population composed of a large number of 18 to 22 year-old, full-time students fresh out of secondary school. Therefore, the retention and graduation rates that institutions with “non-traditional” student populations must report for Title IV purposes, under the mandated calculation method (specifically, the cohort definition), are based on extremely small, non-representative numbers of students. Exacerbating this issue is the fact that, overwhelmingly, adult students come to institutions like NLU with some college experience under their belts, making the IPEDS method for calculating retention and graduation rates based on cohorts of full-time, first-time undergraduate students doubly inappropriate. Because institutions like NLU have student bodies largely composed of graduate students and undergraduates transferring in credits from other institutions, the current calculation method results in extremely skewed data being reported to the public – data that completely misrepresent the institution’s actual retention and completion rates.

Because IPEDS retention and graduation rates are publicly available and often used to evaluate an institution’s relative effectiveness, we believe that the highly inaccurate rates that institutions like ours are forced to report have severe, deleterious effects on institutional reputations by effectively disseminating misinformation. This situation is remarkably unfair, in that the calculation method is skewed to produce reliable results only for institutions with large undergraduate populations of first-time, full-time students. Institutions like NLU that serve populations of students seeking degree completion and graduate study should not be forced to provide clearly unrepresentative and incorrect data to the public.

NCES Response:

Dear Zora Ziazi,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. Your statement that the IPEDS graduation rate does not serve all institutions is one that has been discussed in several formal settings including the Technical Review Panels (TRPs) (https://edsurveys.rti.org/IPEDS_TRP/). To that end, NCES has worked with the postsecondary education community to broaden the types of students reported on in the degree completion cohorts. These improved graduation rate metrics may better reflect the students attending National Louis University. There will now be four different cohort definitions for graduation rates: Full-time, first-time students; Part-time, first-time students; Full-time, non-first-time entering students; Part-time, non-first-time entering students. The addition of the two new cohorts should provide an opportunity for National Louis University to report outcomes that are representative of more students.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program


Comment 39

Document: ED-2013-ICCD-0029-DRAFT-0047
Name: Martha Kyrillidou
Address: Washington, DC,

Email: [email protected]
Organization: Association of Research Libraries
Government Agency Type: Regional

Date: May 13, 2013

We are writing to encourage IPEDS to use the ARL Statistics survey questions and instructions as much as possible for the new reintegrated Academic Library Survey (ALS). The survey form is located at: http://www.arlstatistics.org/documents/admin/12arlstatistics.pdf. The instructions are located at: http://www.libqual.org/documents/admin/12webinstruct.pdf (also attached with the comment).

The ARL Statistics is a survey form used by all academic libraries, and using it would make the collection of national data more streamlined and would considerably lessen the burden of data collection.

In particular, we note that the collection variables in the proposed ALS/IPEDS survey form ask libraries to report collection data for different formats. Libraries are moving away from tracking different formats for national-level reporting and are instead using a single ‘title’ count. We would like to encourage ALS/IPEDS to move towards a unified count of ‘titles’ across all formats.

We believe these recommendations would enhance the utility of the ALS/IPEDS.

Sincerely,

Robert E. Fox, Jr.

Dean and Professor, University Libraries

Chair, ARL Statistics and Assessment Committee

University of Louisville

Louisville, KY 40292


Martha Kyrillidou

Senior Director, ARL Statistics and Service Quality Programs

Association of Research Libraries

Washington, DC 20036


21 Dupont Circle

Washington, DC 20036

202 296 2296 telephone

202 872 0884 fax

http://www.arl.org/


NCES Response:

Dear Dr. Fox and Ms. Kyrillidou,


Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.

The IPEDS Technical Review Panel #35 (TRP) that discussed and made suggestions concerning the reintegration of the Academic Libraries Survey into IPEDS was composed of 42 individuals representing the federal government, state governments, institutions, data users, association representatives, and others. Of these, seven individuals represented academic libraries.

The TRP was asked to examine the various sections of the ALS and consider whether the level of detail at which the data are currently collected should be maintained, given the amount of reporting burden already faced by institutions. The TRP recognized that other mechanisms will preserve ALS data elements not suggested for reintegration into IPEDS. The ARL survey was not a topic of discussion during the TRP, but many of the items on the ARL survey appear in the proposed collection, either in the library, human resources, or other sections of IPEDS.

A number of the items currently collected by the ALS are not proposed to change, but in some cases the panel felt that reporting less detailed information would reduce reporting burden while still preserving trends, so it suggested that some reporting categories be collapsed into broader categories. The proposed collection represents a movement towards broader categories of formats for title counts. The panel was concerned that the ALS does not make a distinction between a physical count of materials and an electronic/digital count of materials. Thus, the panel suggested capturing the allocation of online and physical materials to allow institutions to make peer comparisons on the distribution of resources. This also allows for the presentation of key trend data on the redistribution of resources from physical to electronic. As a result of this discussion, the panel suggested revising the 400 series to include a count of physical and digital/electronic titles in each category of library collection.

To assure comparability with previous ALS collections, the instructions and definitions will remain the same where the items are not changing. The form and its instructions will be available for preview as defined in the Supporting Statement Part A for this collection.

One notable exception is in the area of staffing. Currently, when reporting staff data to the ALS, libraries provide the number of filled or temporarily vacant full-time equivalent (FTE) positions and the corresponding salary and wage data. The IPEDS Technical Review Panel suggested that instead of collecting FTE, IPEDS should collect a count of part-time and full-time library staff to remain consistent with how data are collected throughout IPEDS.

Further, OMB requires all Federal agencies that publish occupational data for statistical purposes to use the 2010 Standard Occupational Classification (SOC) system to increase data comparability across Federal programs. Had the ALS remained a separate data collection, the next OMB review would have required alignment with the SOC. As of the 2012-13 data collection year, IPEDS is fully aligned with the 2010 SOC system.

The 2010 SOC minor group 25-4000 covers Librarians, Curators, and Archivists, and its hierarchical structure includes detailed occupations. Each level of detail includes a corresponding definition. After reviewing the SOC definitions, the panel agreed that the IPEDS HR component should collect separate counts for Librarians; Library Technicians; and Archivists, Curators, and Museum Technicians. The IPEDS Human Resources component calculates an FTE value from the reported full-time and part-time staff counts.

NCES proposes that the Academic Libraries component be collected annually to align with the other IPEDS components. This will ensure that data are available on a more frequent basis than is currently the case.

A detailed summary of the TRP discussion can be found at https://edsurveys.rti.org/IPEDS_TRP/documents/TRP35_SummaryPackage_Suggestions_final.pdf.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program



Comment 40

Document: ED-2013-ICCD-0029-DRAFT-0048
Name: Terri Fisher
Address: Los Angeles, CA,

Email: [email protected]
Organization: Los Angeles Community Colleges
Government Agency Type: State
Government Agency: Education

Date: May 13, 2013

Out-of-range questions: please include the percentage by which we are out of range, such as ±20%.

NCES Response:

Dear Terri Fisher,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. Including the out-of-range percentage information in the edit report will help keyholders, and we will investigate how we can accomplish this.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program


Comment 41

Document: ED-2013-ICCD-0029-DRAFT-0049
Name: Gigi Jones
Address: Washington, DC,

Email: [email protected]
Organization: NASFAA

Date: May 13, 2013

N·A·S·F·A·A

Ms. Kate Mullan

Acting Director, Information Collection Clearance Division

U.S. Department of Education

400 Maryland Ave SW, LBJ Room 2E117

Washington, DC 20202-4537

ICDocketMgr@ed.gov


Dear Ms. Mullan:

We are writing in response to the request for comments on the proposed revision of the Integrated Postsecondary Education Data System (IPEDS), published in the Federal Register on March 14, 2013, docket number ED-2013-ICCD-0029. Founded in 1966, the National Association of Student Financial Aid Administrators (NASFAA) is a non-profit, professional association representing more than 18,000 student financial assistance professionals at over 3,000 institutions of higher education, serving over 16 million students. The primary goal of NASFAA is to promote maximum funding and effective delivery of financial assistance to students who are in need of additional funds to pursue their education beyond high school.


After careful review of the proposed data collection changes to IPEDS submitted by the U.S. Department of Education (ED), Institute of Education Sciences, and the National Center for Education Statistics, we applaud ED's efforts to improve the collection of and access to data on postsecondary students and institutions. In particular, NASFAA supports the improvement of graduation rate metrics, which will show graduation rates for part-time and transfer students, who currently are not included in graduation rate calculations. In a changing educational landscape that has seen more students choose to attend school part-time and move between institutions, colleges and universities should be able to include and show these populations in their graduation rates. Similarly, students, families, and policymakers should be able to see how well institutions serve such students. We also support the effort to collect more information on veterans, as there is minimal information on the veteran student population and their use of Veterans Affairs educational benefits to attain postsecondary education.


Notwithstanding these improvements to IPEDS data collection, we feel it necessary to also state our concern about the potential increase in burden that will be placed on our institutions, which already face numerous reporting requirements. The list of disclosures required of Title IV institutions is extensive18. IPEDS states that the new graduation rate data on part-time and transfer students will be extensively burdensome and the veteran data will be moderately burdensome. While we think it is a step in the right direction that IPEDS will gather voluntary, baseline data on estimated burden time in 2012 and then every three years beginning in 2014, we hope that IPEDS, along the way, will also seek ways of streamlining the data collection and frequently revisit how other federal organizations or agencies can help collect the data. In this way, it would be possible to lessen the time and burden of reporting this important data for consumers and policymakers and to advance the critical effort to improve accountability and transparency at postsecondary institutions.


Thank you for the opportunity to share our suggestions and concerns on this important topic. Please do not hesitate to contact me with questions by email, [email protected], or by phone, (202) 785-6943.


Sincerely,


Gigi Jones, Ph.D.

NASFAA Director of Research


1101 CONNECTICUT AVE NW, SUITE 1100, WASHINGTON, DC 20036-4303

PHONE: 202.785.0453

FAX: 202.785.1487

WEB: www.nasfaa.org


NCES Response:

Dear Dr. Jones,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. NCES appreciates your support of the improvement of the graduation rate metrics. As noted in your comment, these metrics will provide information on part-time and transfer students, which has not been available consistently for every Title IV institution. NCES remains committed to balancing the need for information from Title IV institutions with the burden imposed on those institutions and has committed to investigating ways of minimizing burden on smaller institutions over the next three years.

NCES is actively involved in several initiatives with the Department of Defense (DoD) and the Department of Veterans Affairs (VA) to measure the postsecondary activities of active service members as well as veterans. These measures are related to benefits received and are designed to provide a set of sufficient and complete measures for U.S. service members and veterans.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program


Comment 42

Document: ED-2013-ICCD-0029-DRAFT-0050
Name: Joe May
Address: Baton Rouge, LA,

Email: [email protected]
Submitter's Representative: Ruth
Organization: Rebuilding America's Middle Class (RAMC)
Government Agency Type: State

Date: May 13, 2013


Ms. Kate Mullan

Acting Director, Information Collection Clearance Division

U.S. Department of Education

400 Maryland Avenue SW, LBJ Room 2E117

Washington, DC 20202-4537


Dear Ms. Mullan:


We are writing to comment on the proposed changes to the Integrated Postsecondary Education Data System (IPEDS) data collection instruments for 2014-15 and 2015-16. Rebuilding America’s Middle Class (RAMC) is a coalition of state and individual community college systems from across the nation—representing over 120 colleges and 1.5 million students—that share the common belief that community colleges are one of America's primary solutions to building a strong, more competitive workforce and therefore, a strong middle class. We appreciate the opportunity to comment.


IPEDS is an important data tool that allows the most comprehensive comparisons across institutions of higher education and is useful to the institutions themselves, policymakers, researchers, and the public, including students and families. However, IPEDS, especially as it relates to the data collected on community colleges and the students who attend them, is significantly lacking in its ability to accurately report on the outcomes of students attending such schools. We welcome the opportunity to begin to address these deficiencies as the U.S. Department of Education (the Department) seeks to renew its authorization to collect data under IPEDS and revise what data it collects.


These comments will focus on the Department's proposed changes designed to reflect the recommendations of the Advisory Committee on Measures of Student Success. Among the proposed changes the Department seeks is the expansion of the collection of data under IPEDS to better reflect transfers and graduation rate data at community colleges. We support the Department's proposed changes to add data collection on part-time, first-time students; full-time, non-first-time entering students; and part-time, non-first-time entering students.


In addition, we support the specific data collected for each of these cohorts. We especially appreciate the Department's inclusion of data on students who did not receive a degree or certificate at a certain institution, but were subsequently enrolled at another institution. As you are aware, community colleges, while playing an important role in educating and training students through certificate or two-year degree programs, also offer critical opportunities for students seeking a bachelor’s degree to transfer to a four-year institution of higher education. Data collection that reflects this is important to accurately portraying the role that community colleges play in allowing many students to obtain a four-year degree.


As part of this data collection the Department indicates that the data collected for these cohorts will not be disaggregated by race, ethnicity, or gender. We recommend that the Department reverse this decision and collect data by these demographic categories. The collection of data in a disaggregated fashion allows institutions and the public a more accurate picture of student success at all institutions. Failing to do so will not allow sufficient analysis and improvement of graduation and transfer rates, especially at community colleges.


Lastly, we want to point out the need to balance the costs associated with additional data collection against the outcomes that are sought through these proposed IPEDS changes. In going forward with these changes, we urge the Department to carefully consider the additional resources that the proposed changes will require, especially for smaller institutions such as community colleges.


Thank you for the opportunity to comment on the proposed data collection requirements.


Sincerely,

Joe May

Board Chair, Rebuilding America’s Middle Class (RAMC)

President, Louisiana Community & Technical College System

RAMC Board

Richard Carpenter, Chancellor, Lone Star College System

Glenn DuBois, Chancellor, Virginia Community College System

Joe May, President, Louisiana Community & Technical College System

Jeff Rafn, President, Northeast Wisconsin Technical College

Scott Ralls, President, North Carolina Community College System

Tom Snyder, President, Ivy Tech Community College



NCES Response:

Dear Mr. May and RAMC Board,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS. In particular, we appreciate your support of the new proposed Outcome Measure component. The decision to not collect these data disaggregated by gender, racial/ethnic, or financial groupings was made in an effort to allow for expanded data on student outcomes in IPEDS while minimizing the increased institutional reporting burden as much as possible.

NCES will continue to dedicate significant time and attention to understanding the potential for allowing more limited reporting by smaller institutions while still satisfying the legal and regulatory reporting requirements.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program


Comment 43

Document: ED-2013-ICCD-0029-DRAFT-0051
Name: Julie Strawn
Address: Washington, DC,

Email: [email protected]
Organization: Center for Law and Social Policy (CLASP)

Date: May 13, 2013



Ms. Kate Mullan

Acting Director, Information Collection Clearance Division

U.S. Department of Education

400 Maryland Avenue SW, LBJ Room 2E117

Washington, DC 20202-4537

[email protected]


Dear Ms. Mullan:


Thank you for the opportunity to comment on the proposed revision of the Integrated Postsecondary Education Data System (IPEDS), published in the Federal Register on March 14, 2013, docket number ED-2013-ICCD-0029. The Center on Postsecondary and Economic Success (C-PES) at CLASP works for policies and investments that can increase the number of low-income adults and youth who earn postsecondary credentials that open doors to good jobs, career advancement, and economic mobility. C-PES has in-depth knowledge of federal higher education, workforce, and human services policy and also provides technical assistance to states and colleges on postsecondary access and completion as well as on career pathways and performance measurement.


Data is a particular focus of our work. In our recent report for the Gates Foundation's Reimagining Aid Design and Delivery project, we analyzed potential uses of performance data in federal higher education policy and offered a number of recommendations for ensuring that students and policymakers get the information they need to guide their decisions. Under the Shifting Gears initiative, we helped five Midwest states improve alignment of their data systems to support their system reform efforts. Through our Alliance for Quality Career Pathways (AQCP), we are partnering with 10 states to develop a shared set of performance metrics that support the creation of high-quality career pathway systems that lead to higher attainment of stackable postsecondary credentials with labor market value. And we are also a founding member of the Workforce Data Quality Campaign, which works at the federal and state level to promote more inclusive, aligned, and market-relevant education and workforce data systems that provide useful information about skills and labor market outcomes to the public, the private sector, and policymakers.


In general we support ED's efforts to expand the collection and public reporting of student outcome, financial aid, and debt data that students, parents, and policymakers need to make informed decisions. While the Department’s proposed IPEDS changes do not go as far as we think is necessary, they are a positive step forward. Our comments below for improving postsecondary data further are based on the research and analysis in our February 2013 report, Reforming Student Aid: How to Simplify Tax Aid and Use Performance Metrics to Improve College Choices and Completion.19


Improving Postsecondary Data for Consumers and Policymakers


We urge the federal government to improve collection and reporting of data on postsecondary education in several ways:

  • expand federal collection and reporting on key measures of affordability, student progress, and completion;

  • require public reporting of important information that is now only required to be disclosed on request; and

  • expand collection and reporting of employment and earnings outcomes, whether through the states or the federal government.

These proposals aim to provide students, parents, and policymakers with much better information on results to inform their postsecondary decision-making. Colleges would also benefit from this data as they could use it to improve their performance on access and completion.


Expand public reporting of institutional measures of affordability, student progress, and credential completion and change disclosure requirements to public reporting.


We recommend modifying existing institutional reporting and disclosure requirements under the Higher Education Act to implement expanded public reporting that includes the addition of some new measures and shifts some existing measures from institutional disclosures to reporting requirements through IPEDS. Specifically we suggest:


  • Expanded reporting by institutions to address data gaps for measuring access and success for low-income students, including key measures of institutional access and affordability (from our Tier One measures in Table 5 below, excerpted from our report) such as percent receiving Pell Grants and other need-based financial aid (grants only), a measure of debt burden per student or graduate20, and net price information; interim measures of student progress (from Tier Two), such as developmental education course completion and progression in a program of study; and reporting of credential and degree attainment rates (Tier Three), using both the current definition of these rates and an expanded student cohort along the lines of the Committee on Measures of Student Success.


  • A stronger role for the Department of Education and the National Center for Education Statistics, including the development of common definitions and data elements and the development of comparable information on these measures. The Department of Education should make these results available for currently reported subcategories of students, such as gender and race/ethnicity, and for Pell Grant recipients and by enrollment status. This information should be made public through improved websites with better search capability so that results for key groups can be observed easily. Key measures should be included, as appropriate, on the Department of Education’s College Scorecard and Financial Aid Shopping Sheet.



  • Modification of Higher Education Act requirements, changing to reporting requirements certain elements currently included as disclosure requirements. This would include, at a minimum, Pell Grant graduation rates, transfer policies, and data on cost.



  • A full review of all existing Higher Education Act reporting and disclosure requirements by the Department of Education, including input from institutions, the research community, and consumers, resulting in a report to Congress with recommendations for streamlining and simplifying these requirements.



  • Exploration by the Department of Education of technical options for institutions to report required data in a more cost-effective manner than the current IPEDS process. This might include the option for institutions to replace some portion of the summary reporting requirement by submitting student-level data to a national clearinghouse, such as the National Student Clearinghouse. Another alternative would be for Congress to replace IPEDS entirely with a national student unit record system, which would make the task of collecting and reporting this data far easier, and facilitate the inclusion of employment and earnings data in consumer information.



Currently consumers and policymakers lack critical data needed to understand how well institutions perform on access and completion, especially for low-income students. For example, the Education Sector and the American Enterprise Institute surveyed 152 public and private four-year colleges and universities to assess the availability of required information under the Higher Education Act.21 The central finding was that “[t]he large majority of colleges are in total noncompliance with some of the most widely cited provisions of HEA: those meant to focus attention on the struggle of low-income students to graduate from college.” This included provisions for collecting and reporting such data elements as the graduation rate for Pell Grant recipients, for which only 25 percent of sample institutions had publicly available information. Some type of employment placement information was provided by 67 percent of the institutions, but this largely consisted of “anecdotal information about the jobs and employers of recent graduates” for about 11 percent of the institutions. The report recommended the conversion of all HEA “disclose” requirements to “report” requirements so that the NCES can function as a central clearinghouse for comparable information.


We agree with the Education Sector/AEI recommendations and would go further to urge the inclusion of data on results for interim measures of progress, as recommended by the Committee on Measures of Student Success. We also recommend reporting an expanded graduation rate that includes part-time students in the observed student cohorts, and that includes transfers and those substantially prepared for transfer in the numerator. We also encourage the Department to explore the possibility of breaking out key data by enrollment status over time, including students who always attend full-time, those who always attend part-time, and those with mixed enrollment status. A recent study by the National Student Clearinghouse of nearly two million undergraduates found that more than half (51 percent) attended a mix of full-time and part-time over a six-year period, while just 7 percent attended exclusively part-time. These data highlight how problematic it is to group students in IPEDS by their enrollment status at initial enrollment, since for half of those students that initial status does not accurately describe their attendance over time. These additional requirements would be balanced at least to some extent by potential reductions in the reporting burden that could result from the review of institutional disclosure requirements.


The additional reporting requirements would enable the development of better profile information for colleges along the lines of the NCES College Navigator site or the College Portrait of Undergraduate Education developed for colleges participating in the Voluntary System of Accountability. Further, these improved profiles would include results for types of students that frequently encounter difficulty persisting in college and completing a credential. Such profile information should be provided through well-designed web interfaces that have multiple paths to information and that allow users to avoid extraneous material, while drawing their attention to important contextual elements. While the added reporting burden to colleges is significant, the benefits of having this information are substantial, and a review of existing disclosure requirements may identify opportunities to reduce reporting burdens to at least partly offset the additional requirements. Such a review was recommended by the Advisory Committee on Student Financial Assistance (2011) in its study of federal higher education regulations. Institutions and the Department of Education could also explore producing IPEDS reports via a third-party clearinghouse, such as the National Student Clearinghouse, to ease the process. We also note that Congress could choose to create a national student unit record system which would eliminate the need for IPEDS, lift much of the reporting burden from colleges, and solve a myriad of issues that arise from an institution-based postsecondary data system.


Expand collection and reporting of employment and earnings outcomes for students by postsecondary program, whether through the states or the federal government.


We also believe that students need access to information about the employment and earnings of graduates, broken out by postsecondary program and institution so that they can shop around for the best value program for their career goals. Access to usable information on the labor market results of program graduates and noncompleters is a critical unmet need for all students, but it is particularly critical for low-income students and first-generation college goers. According to the Higher Education Research Institute’s survey of freshmen at bachelor’s-degree-granting institutions, 86 percent of freshmen cited “to be able to get a better job” as a “very important” reason for deciding to go to college, followed by “to learn more about things that interest me” (83 percent), “to get training for a specific career” (78 percent), “to gain a general education and appreciation of ideas” (72 percent), and “to be able to make more money” (72 percent).22


The top five reasons cited by freshmen students for selecting the particular college they were attending were: “very good academic reputation” (64 percent), “graduates get good jobs” (55 percent), “offered financial assistance” (44 percent), “a visit to the campus” (43 percent), and the “cost of attending” (41 percent). The top three objectives considered to be “essential” or “very important” for freshman survey respondents were: “being very well off financially” (80 percent), “raising a family” (73 percent), and “helping others who are in difficulty” (70 percent). Finally, the survey found that 72 percent of incoming freshmen agreed strongly or somewhat with the statement: “The chief benefit of a college education is that it increases one’s earning power.” This was the highest percentage among all such statements in the survey.


Evidence developed by Jennie Brand and Yu Xie suggests that those students who are the least likely to attend college due to socioeconomic barriers are the most likely to benefit from it in terms of subsequent earnings.23 Andrew Kelly and Mark Schneider found that when parents were “provided with graduation-rate data, 15 percent switched their preference to the school with the higher graduation rate.”24 In addition, these effects were stronger among parents with lower educational attainment levels and lower incomes. A review of focus group studies of how students select colleges found that “the focus group findings with low-income, first-generation, and academically underprepared students were consistent with research on adult students in that these students also collapse the search and choice stages into one abbreviated step. They tend to focus on a single college or two, primarily due to cost considerations and the fact that their grades and test scores limit their choices.”25 For these latter students, having program-level data is especially important because it may help them expand the range of program and institutional options they explore.


Each of these research findings supports the idea that providing better employment and earnings information to students and parents on the labor market outcomes resulting from occupational programs of study at individual institutions will improve the ability of students to select programs and colleges that best meet their needs. Despite this, the availability of high-quality, comparable data on labor market results at the institution and program levels is very limited. There are two principal options for addressing this:


  • Encourage states to gather and disclose aggregate student employment and earnings for all programs of study. The Department of Education could build on existing State Longitudinal Data System grants to encourage states to develop a common definition of postsecondary program enrollment and standardized collection of data on certificate and degree attainment, so that students enrolled in and successfully completing programs of study can be identified in a comparable manner. This standard approach to defining program enrollment would most likely be based on student course-taking patterns rather than students’ stated intent. In addition to comparable data on program enrollments and completions (Tier 3 measures), the Department of Labor could build on existing Workforce Data Quality Initiative grants to require inclusion of UI earnings data (Tier 4) as part of longitudinal student records accessible through the State Longitudinal Data System. Education and Labor could work together with the states, building on efforts such as the Wage Record Interchange System to provide cross-state access to UI earnings data so that employment and earnings results for programs of study can be developed in a cost-effective manner that protects student privacy. Congress should include language in the appropriations for each department specifically authorizing access to these UI earnings data, notwithstanding other provisions of law. States could be required to submit these aggregate results to the Department of Education for use by NCES to expand institutional-level profile information to include employment and earnings results for all occupational programs of study (not just certificate programs) and for all students, including those who complete a credential or degree and those who do not.


  • Create a national student unit record system and match education outcome data with employment and earnings data, broken out by institution and program.



It would be possible to have a national student unit record system that allows the matching of student-level education and employment and earnings data while protecting individual and employer privacy. Congress would have to act to remove the current bar on such a system; if it did so, the process of producing usable consumer information on labor market outcomes would be far easier than under a state-based system.


Regardless of the approach, the benefits to consumers of having employment and earnings data available by program are very clear. Recent research documents the wide variation in returns to postsecondary education, even within the same levels of certificate and degree attainment.26 Furthermore, such data will be useful for more than improving students’ career and college choices. The earnings results will be of interest to colleges as they develop and improve programs of study and career pathways, and they will be of interest to policymakers at all levels who seek to assess returns on the investment of public resources. One concern is that earnings data at the program and institutional levels could be misinterpreted and colleges compared on earnings results in inappropriate ways. These concerns can be addressed with careful design of the metrics and attention to the presentation of the data. While some disadvantages of adding labor market data to consumer information on colleges may remain, they are outweighed by the potential benefits of giving students access to this critical information.


Thank you for your consideration of our comments.


Julie Strawn, Senior Fellow

Tim Harmon, Consultant

Center on Postsecondary and Economic Success, CLASP


NCES Response:

Dear Ms. Strawn and Mr. Harmon,


Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.


NCES appreciates your support of the expanded sections of IPEDS and understands the need for more high-quality data from postsecondary institutions. We read your suggestions with interest. NCES works hard to balance the desire for more and better information against the burden imposed on reporting institutions, while ensuring that postsecondary education information required by federal law and regulations is provided to the public.


With regard to your comments on common data standards, changes in congressional mandate, and student unit record data systems: while changes to the legislative authority of the Department of Education, and by extension NCES, are outside the scope of this FRN process, the U.S. Department of Education has several initiatives to build common data standards. The Common Education Data Standards initiative (ceds.ed.gov) is the Department's attempt to address many of the topics in your comment within the current legislative authority given to the Department.


With regard to the suggested expansions to IPEDS reporting, NCES uses Technical Review Panels (TRPs) as a forum to investigate potential changes to the IPEDS data collection. Many of the suggestions in your comment related to improving postsecondary data for consumers and policymakers have been topics of previous TRPs. A summary of TRP discussions can be found at https://edsurveys.rti.org/IPEDS_TRP/Default.aspx. NCES will work with advisory groups to review your suggestions and determine which warrant one or more TRPs.


Thank you again for your input.


Sincerely,


Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program


Comment 44

Document: ED-2013-ICCD-0029-DRAFT-0052
Organization: Career Education Corporation

Date: May 13, 2013


Career Education Corporation appreciates the time and attention that the Department has invested in exploring ways to enhance and improve the IPEDS data collection effort. We believe that the Technical Review Panels are the best way for the Department to solicit comments and suggestions from users in the field, and we commend NCES for the work that was accomplished through the Committee on Measures of Student Success. Many of the suggestions included in the proposed changes to the IPEDS data collection instruments for 2014-15 and 2015-16 (OMB No. 1850-0582 v.13) are positive and would enhance the usefulness of the IPEDS database. In particular, we support the expansion of the database to include first-time part-time students, non-first-time full-time students, and non-first-time part-time students; the collection of additional information regarding services and support provided to military personnel, veterans, and their families; the addition of reporting elements related to academic libraries; and the proposed changes to the Finance Form in order to better align financial reporting between non-profit and proprietary institutions.

However, while we support many elements of the proposal, in general we believe that it is beyond the authority of the Department of Education or the Office of Management and Budget to authorize such a significant expansion of IPEDS through the regulatory process. Since the parameters of IPEDS data collection are dictated by statute, we believe that the recommendations of the TRP and the Committee on Measures of Student Success should be provided to Congress and potentially incorporated in the Administration’s proposal for reauthorization of the Higher Education Act, with the final IPEDS parameters to be determined, as they have been in the past, through the legislative process. The Department’s current efforts to substantially expand and modify IPEDS through the regulatory process set a dangerous new precedent which could result in perpetual changes being made to the IPEDS collection effort in the future, making it useless as a tool for studying trends and outcomes over the long term.

With regard to the specifics of the proposal, we believe that the Department of Education must recognize that institutions of higher education are under an extreme burden to meet a growing number of federal, accreditor and state reporting initiatives at a time when resources are more constrained than ever and the impetus to drive down the cost of higher education has never been greater. Increased reporting burden leads to higher administrative costs, so it is critically important that IPEDS expansion be considered in the context of the total regulatory burden so that Congress can eliminate other unnecessary, outdated and redundant reporting requirements, thereby allowing institutions to redirect current resources toward meeting the new requirements of IPEDS.

While we wholeheartedly support the idea of including all degree seeking students, and not just first-time, full-time students in IPEDS, this transition will require the reprogramming of data systems as well as additional data entry and quality assurance steps, all of which take time and cost money. For that reason, we recommend that these changes be made prospectively, or at a minimum that the phase-in period be extended for two or more years to give all institutions the time needed to gear up for these additional reporting requirements.

We support the collection of data which allows a more accurate comparison of financial metrics regardless of ownership structure.  However, modifications to the public and non-profit surveys should be made and released at the same time as the updated Finance Form for proprietary institutions to ensure comparability between forms.

Again, we appreciate the continuing effort on the part of the National Center for Education Statistics to collect timely and relevant data and to engage IPEDS users in discussions about how to improve the data system. We fully support most elements of the proposal, but believe that changes of this magnitude should be made only through the legislative process where the appropriate checks and balances are in place and where the reporting burden associated with an expanded IPEDS collection effort can be offset by the elimination of other unnecessary, redundant or outdated reporting requirements.

NCES Response:

Dear Career Education Corporation,

Thank you for your comment dated May 13, 2013, responding to a request for comments on proposed changes to the Department of Education’s Integrated Postsecondary Education Data System (IPEDS) published in the Federal Register. The National Center for Education Statistics (NCES) appreciates your interest in IPEDS.

NCES is authorized by law under Section 153 of the Education Sciences Reform Act of 2002 (P.L. 107-279). Accordingly, NCES "shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including -

  • collecting, acquiring, compiling (where appropriate, on a state by state basis), and disseminating full and complete statistics on the condition and progress of education, at the pre-school, elementary, secondary, and postsecondary levels in the United States, ...;

  • conducting and publishing reports and analyses of the meaning and significance of such statistics;

  • collecting, analyzing, cross-tabulating, and reporting, to the extent feasible, so as to provide information by gender, race, ...; and

  • assisting public and private educational agencies, organizations, and institutions in improving and automating statistical and data collection activities..."

Determining the scope and nature of the IPEDS data collection is well within the legislative authority granted to NCES. The introduction of new data collection items does not set a dangerous new precedent; legislative action is only one of the ways in which the IPEDS data collection has been shaped over its history. Additionally, the Technical Review Panel is mindful of the importance of preserving trends.

We recognize that not all institutions required to report IPEDS data have the same level of staffing and technological resources. We work closely with the postsecondary education community to provide well-designed and efficient collection forms and work directly with the IPEDS keyholders to facilitate their IPEDS data submissions. We continue to work to provide institutions with as many resources as possible to help ease the burden of the IPEDS reporting process.

NCES is very sensitive to the need to balance increased reporting burden with the utility of the data requested. Each Technical Review Panel meeting, regardless of the topic being discussed, is specifically charged with the task of minimizing the additional institutional burden resulting from increased IPEDS reporting requirements. Further, entire meetings of the IPEDS Technical Review Panel have been dedicated to the topic of IPEDS reporting burden.

We recognize that IPEDS reporting takes time; however, the information collected not only provides a common set of information for prospective students to use when making a decision to enroll in postsecondary education, but is also regularly used by the Department of Education and other federal agencies, researchers, policymakers, and others to monitor and improve postsecondary education for the public.

After reviewing comments from the public and considering the difficulty in creating retroactive cohorts from 2006, NCES is now proposing in the Information Collection Review (ICR) to begin collecting outcomes information one year later, in 2015-16, using the cohorts that began in 2007. It is our hope that providing an extra year to prepare will alleviate some of the difficulty in creating the cohorts and completing the Outcome Measures component of IPEDS. NCES is committed to working with the postsecondary education community to provide resources that will assist institutions with their outcome measures reporting.

Sincerely,

Richard J. Reeves

Program Director

Postsecondary Institutional Studies Program

1 NPEC was established by NCES in 1995 as a voluntary organization that encompasses all sectors of the postsecondary education community including federal agencies, postsecondary institutions, associations and other organizations with a major interest in postsecondary education data collection.

2 Education Trust analysis of IPEDS 2011 Graduation Rate Survey cohorts.

3 “Report and Suggestions from IPEDS Technical Review Panel #40, Additional Selected Outcomes of the Advisory Committee on Measures of Student Success.”

4 College Navigator (nces.ed.gov/collegenavigator), National Center for Education Statistics, Accessed May 3, 2013.

5 Kevin Carey and Andrew P. Kelly, “The Truth Behind Higher Education Disclosure Laws,” Education Sector and American Enterprise Institute, 2011.

6 TICAS. 2013. New College Scorecard: Two Steps Forward, One Step Back. http://views.ticas.org/?p=982.

7 See: Common Data Set Initiative. http://www.commondataset.org/.

8 See 20 U.S.C. 1092(a)(7)(A).

9 Education Sector and American Enterprise Institute. 2011. The Truth Behind Higher Education Disclosure Laws. http://www.educationsector.org/sites/default/files/publications/HigherEdDisclosure_RELEASE.pdf.

10 As defined in 34 CFR 668.195(b)(i) and 34 CFR 668.214(b)(i).

11 As defined in 34 CFR 668.195(b)(ii) and 34 CFR 668.214(b)(ii).

12 TICAS, 2011. Still Denied: How Community Colleges Shortchange Students by Not Offering Federal Student Loans. http://projectonstudentdebt.org/files/pub/still_denied.pdf

13 TICAS. 2012. Comments in response to the “Report and Suggestions from IPEDS Technical Review Panel #37, Selected Outcomes of Advisory Committee on Measures of Student Success.” http://www.ticas.org/files/pub/TICAS_comments_on_TRP37_CMSS_final_05-29-12.pdf.

14 RTI International. 2012. Report and Suggestions from IPEDS Technical Review Panel #36 Collecting Data on Veterans. https://edsurveys.rti.org/IPEDS_TRP/documents/Report%20and%20Suggestions%20from%20TRP36_final.pdf.

15 TICAS. 2012. Comments in response to the “Report and Suggestions from IPEDS Technical Review Panel #36, Collecting Data on Veterans.” http://www.ticas.org/files/pub/TICAS_comments_on_TRP36_veterans.pdf.

16 See: TICAS. 2012. Report from Education Department Advisory Group Calls for Improvements to Financial Aid Data. http://views.ticas.org/?p=842.

17 For example see: Reps. Cummings and Grijalva. 2012. Comments in response to the “Report and Suggestions from IPEDS Technical Review Panel #39, Improving Finance Survey Forms for For-Profit Institutions.” http://democrats.oversight.house.gov/images/stories/2012-12-07.%20EEC%20to%20IPEDS%20Project%20Director%20RE%20Comments%20to%20TRP.pdf.

18 Fuller, C. & Salerno, C. (2009, Oct). Information Required to Be Disclosed Under the Higher Education Act of 1965: Suggestions for Dissemination. National Postsecondary Education Cooperative. http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2010831rev

20 See, for example, the Student Default Risk Index Score, suggested by The Institute for College Access and Success (Reed, Matthew, and Deborah Cochrane, Student Debt and the Class of 2011, October 2012) and the Debt to Degree measure suggested by Education Sector and the American Enterprise Institute (Carey, Kevin, and Andrew P. Kelly, The Truth Behind Higher Ed Disclosure Laws, 2011).

21 Carey and Kelly, 2011.

22 Pryor, John H., Linda DeAngelo, Laura Palucki Blake, Sylvia Hurtado, and Serge Tran. The American Freshman: National Norms Fall 2011. Higher Education Research Institute, UCLA, 2011.

23 Brand, Jennie E., and Yu Xie. “Who Benefits Most from College? Evidence for Negative Selection in Heterogeneous Economic Returns to Higher Education.” American Sociological Review, 2010.

24 Kelly, Andrew P., and Mark Schneider. Filling in the Blanks: How Information Can Affect Choice in Higher Education. American Enterprise Institute, 2011.

25 MacAllum, Keith, Denise M. Glover, Barbara Queen, and Angela Riggs. Deciding on Postsecondary Education: Final Report. National Postsecondary Educational Cooperative, 2007.

26 Carnevale, Anthony P., Ban Cheah, and Jeff Strohl. Hard Times: Not All College Degrees Are Created Equal. Georgetown Center on Education and the Workforce, 2012.
