1840-0766 2017 IS APR Supporting Statement Part A - 30 day


Annual Performance Reports for Title III, Title V, and Title VII Grantees

OMB: 1840-0766


EDICS Tracking and OMB Number: (XXXX) XXXX-XXXX Revised: XX/XX/XXXX

RIN Number: XXXX-XXXX (if applicable)



SUPPORTING STATEMENT

FOR PAPERWORK REDUCTION ACT SUBMISSION


     


A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a hard copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information, or you may provide a valid URL link or paste the applicable section1. Specify the review type of the collection (new, revision, extension, reinstatement with change, reinstatement without change). If revised, briefly specify the changes. If a rulemaking is involved, make note of the sections or changed sections, if applicable.


Under Titles III, V, and VII of the Higher Education Act of 1965 (HEA), as amended, discretionary grants are awarded to eligible institutions of higher education and organizations (under the Minority Science and Engineering Improvement Program (MSEIP), Title III, Part E only) to support improvements in educational quality, institutional management, and fiscal stability. The Office of Institutional Service (IS) is authorized to award one-year planning grants and five-year development grants to institutions with low per-student expenditures that enroll large percentages of minority and financially disadvantaged students. The communities served by Titles III, V, and VII of the HEA include: Asian American and Native American Pacific Islander-Serving Institutions (AANAPISI); Alaska Native-Serving Institutions and Native Hawaiian-Serving Institutions (ANNH, NHSI); Historically Black Colleges and Universities (HBCU); Historically Black Graduate Institutions (HBGI); Hispanic-Serving Institutions (HSI); Native American-Serving Nontribal Institutions (NASNTI); Predominantly Black Institutions (PBI); American Indian Tribally Controlled Colleges and Universities (TCCU); and other institutions that serve a significant number of minority and financially disadvantaged students and have low average educational and general expenditures per student.


There are major forces that continue driving the Annual Performance Report (APR): (1) the need to improve the quality and effectiveness of our program monitoring efforts; (2) the need to provide more reliable and valid data for the Government Performance and Results Act (GPRA); (3) the need to evaluate grantee and Program effectiveness; and (4) capacity-building efforts toward a Title III, Title V, and Title VII community of practice. The Office of Inspector General (IG) has repeatedly identified the aforementioned needs as areas that the Department of Education should resolve, and for the past several years the Department has focused on addressing them. The data elements for all programs that use this APR continue largely unchanged, with minor changes to standardize data collection across programs.


This APR, designed specifically for Title III and V programs (as well as Title VII, Part A, Master’s Degree Programs at Historically Black Colleges and Universities and Master’s Degree Programs at Predominantly Black Institutions), captures the diverse and unique properties of grant projects, as well as overall program accomplishments. The APR casts a wide net over the Title III, V, and VII programs, but is flexible enough to address the specific needs of each program. The APR allows grantees to measure their progress against their institution's own baseline data, select their areas of emphasis, and provide additional qualitative information in narrative form if they wish to do so.


The APR uses a standard format, making it far easier to elicit specific responses, aggregate data, and compare responses within the entire grantee pool or across years. Although narrative responses are allowed, our grantees’ time is more efficiently spent collecting and entering data that, for the most part, already exists in their institution’s records or results from their project evaluation plan (which is part of their original grant application). The APR incorporates the summative and formative independent grant evaluations and provides IS program officers with data that was not previously captured electronically and therefore could not be aggregated and analyzed in a systematic manner.


Authorization for the collection of information can be found in the following sections of the HEA, by program CFDA:


Additional references can be found in the Education Department General Administrative Regulations (EDGAR) parts 606, 607, 608, 609, and 637.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The information gathered by the APR will be used to (1) monitor the yearly progress of Title III, V, and VII grantees; (2) determine future funding of awards to grantees; (3) collect GPRA data to report to policymakers; (4) follow through on corrective action plans resulting from IG audits; (5) analyze and report Program profiles, trends, and practices; and (6) evaluate Program and grants management success. The project directors compile the information for the report and submit it to the Department of Education via a secure web-based report at https://apr.ed.gov. For the six percent of grantees that fail to meet the submission deadline, an optional paper format is available. Since inception, we have captured more than 7,500 annual reports from Title III, Title V, and Title VII grantees. Once received, the Title III, V, and VII program office and other applicable internal and external entities may analyze the APR information. The results of the report have played, and will continue to play, a central role in analyzing project and Program information, forecasting, creating a transparent view of Title III, Title V, and Title VII programs, and demonstrating the U.S. Department of Education’s success in improving access to our nation’s higher education system. Trend and Profile Reports have been developed for all programs.


The program office makes grant awards for the following year in the G5 grants management system, which provides at least 90 days to inform grantees of their funding status. Grantees must demonstrate that they have made significant progress towards meeting the goals of their project objectives in order to receive funding for the next cycle of an award. The APR records the accomplishments or progress of a project, provides grantees with an opportunity to articulate why grant objectives were or were not met, and documents their planned and actual federal expenditures. In addition, the APR has narrative sections that allow grantees to communicate important information that is harder to capture in the quantitative output sections of the report, such as unexpected results from their Title III, Title V, or Title VII projects.


The APR is structured to provide varying levels of analysis, the most expansive of which is the collection of GPRA data and independent evaluation information. The most detailed and individualized level of analysis focuses on the specific grant activities identified in the grantee’s original application. As grantees report on the status of their activities, the configuration of the APR allows for broader inquiry by grouping activities into categories that are identified in the legislation governing Titles III, V, and VII. The flexible structure of the APR is further conducive to a program-wide analysis and allows us to measure the targeting of federal resources, the program outputs, and subsequently, the success of meeting the programs’ legislative intent. These analyses are central to our compliance with GPRA requirements, the President’s transparency initiative, and the need to evaluate national programs and individual projects from independent sources. A standardized Executive Summary was developed across IS programs. In this format, some questions were removed. Based on feedback from the 60-day FR notice, questions regarding evidence are now included for programs that have included priorities or selection criteria focused on front-end evidence of effectiveness, evidence of promise, or back-end rigorous evaluations as defined by EDGAR. This will improve the Department’s ability to evaluate and collect information on the performance of grantees that have received special consideration for evidence-using or evidence-building activities.


As the Executive Summary was standardized, it became clear that the Project Status section should, and could, also be standardized to collect better and more pertinent data as described earlier. To do so, a tabular section was added for reporting on Project Objectives. This new format could reduce the burden for both grantees and Program staff. Additionally, within the Institutional Profile section, Institutional Measures will be in tabular form. These measures look at the institution before and during the grant. Each measure is collected at the end of the grant year to illustrate the institutions’ success in achieving the Programs’ legislative intent.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision of adopting this means of collection. Also describe any consideration given to using technology to reduce burden.


The APR is housed and maintained under contract with our consultants. Respondents can upload information, save and return to the report before submitting it to IS, print out the report at any time, and benefit from web security.


The advantages of a web-based APR for IS are significant. For clarity in completing the report, the web-based version displays only the relevant portions of the APR to the grantee, based on the program in which the grantee participates and the type of institution the grantee represents. Given that the APR is intended to serve multiple programs and diverse institutions, the number of options is overwhelming if the report is viewed in its entirety. Based on the information that a grantee provides when logging in to the system (creating a profile), only the pertinent sections of the report are selected and displayed to the grantee. For example, a 2-Year Institution would not see questions about enrollment at 4-Year Institutions, making the report easier to understand and complete. The paper version of the APR that existed prior to 2001 encompassed every option for every type of institution and program; the web-based version displays only what is pertinent to the program and the type of institution reporting.
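As a rough illustration of this profile-based display logic, the sketch below shows how section selection of this kind can work. The section names and profile fields are hypothetical examples, not the actual APR data model or implementation.

    # Hypothetical sketch of profile-based section selection; names are
    # illustrative only and do not reflect the actual APR system.
    SECTIONS = {
        "four_year_enrollment": lambda p: p["institution_type"] == "4-year",
        "two_year_enrollment": lambda p: p["institution_type"] == "2-year",
        "hbgi_graduate_programs": lambda p: p["program"] == "HBGI",
    }

    def sections_for(profile):
        """Return only the report sections relevant to this grantee's profile."""
        return [name for name, applies in SECTIONS.items() if applies(profile)]

    # Example: a 2-year HSI grantee would not see 4-year enrollment questions.
    print(sections_for({"institution_type": "2-year", "program": "HSI"}))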


The web-based APR facilitates data management and, subsequently, information management. Once the reports are complete, the responses must be entered into a database in order to make use of the data; manually building that database from paper copies of the APR would be an extremely daunting and inefficient task.


Since inception, 96% of the approximately 7,500 individual performance reports have been completed through the online APR, and the information is therefore available for analysis. (The approximately 7,500 reports collected do not include the final performance reports generated by the system.) The APR is accessible from all personal computers, handheld PDAs, and mobile phones with web browsers in a Linux, Apple, or Microsoft environment. The most recent completion rate across all programs for the online APR (for the FY 2015 data collection) was 98.6%.


Considerable effort has been devoted to providing training to program staff and technical assistance to grantees. A training manual is available to all grantees and staff 24 hours a day under the “training tab” at https://apr.ed.gov. Staff can practice exercises as if they were grantees or potential applicants. Additionally, the general public can become familiar with the information needed to report the success or failure of Title III, Title V, and Title VII grants. A technical assistance phone number and customer service e-mail are available while grantees are completing the APR; the e-mail address is [email protected].


A test system is available at https://opeweb.ed.gov/title3and5beta/ to access the electronic version of the APR in beta testing mode. A login name and password are available to OMB upon request. The current APR electronic system has not yet been updated with the submitted APR components because of the most recent data collection period and the need for public comment. Once the components are cleared for use, the contractor will update the current APR for each program to improve the user experience.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


Duplications found in the report deal solely with the Institutional Profile (Section Two) data collection in the APR. As noted in the instructions, the tables correspond to surveys from the Integrated Postsecondary Education Data System (IPEDS), which is administered by the National Center for Education Statistics (NCES), located within the U.S. Department of Education. IPEDS is a comprehensive system of surveys designed to collect institution-level data in such areas as enrollments, program completions, faculty, staff, and finances. Approximately 9,900 postsecondary institutions complete the IPEDS surveys every year.


The Institutional Profile data that the APR collects is essential because it lends relevant context to the report. It is important to make clear the operating conditions of the institutions we serve, especially since so many of them focus on disadvantaged students, underrepresented groups, or “at-risk” students. This institutional context also helps gauge how our programs have institution-wide effects. IPEDS offers a meaningful institutional context by providing data regarding student body characteristics, enrollment, and graduation/completion rates. Rather than create our own method for collecting this data, we felt that it would be less burdensome for the grantee to align our report with the IPEDS survey.


Furthermore, when most grantees log into the APR, the Institutional Profile section is already populated with data. IS has been working closely with NCES to ensure that this duplication of data places a minimal burden on institutions. The grantee will not have to enter this data, as it will have been pre-loaded into their report. During our consultation with the grantee community, grantees asked that we display the data on their institution for their review, a request that we honor.


The exceptions to the aforementioned process will occur when (1) an institution does not report any data to IPEDS; or (2) a branch campus reports data to IPEDS as an aggregated part of a multi-campus system. Our consultation with the grantee community informed us that when a branch campus (which may receive its own Title III, V, or VII grant) is part of a multi-campus system that reports to IPEDS as a single entity, the branch campus data frequently exists in their institutional records. In this case, we will ask the branch campus to disaggregate their IPEDS data and report directly in the APR only their particular branch campus data.


When an institution does not report to IPEDS, the NCES policy is to impute the data based on a number of variables. To maintain regularity, if an institution does not provide the requested information, we will follow NCES policy and use the imputations supplied by NCES. The following year, both the IPEDS surveys and the APR will again provide the institution with another opportunity to provide first-hand data.


In the rare circumstance where an institution or branch campus is unable to provide any IPEDS data (and it cannot be imputed), we will provide a narrative section that the institution may use to explain why providing this data for the purposes of the APR would be too burdensome or expensive for the institution to absorb. If the institution provides a satisfactory justification, it will be excused from completing the Institutional Profile section.


Based on the scope of institutions participating in the IPEDS survey and our consultation with the grantee community, we believe that providing the data for this section will be of little burden to the majority of institutions. In regard to the aforementioned exceptions, we will be able to identify those schools in advance and work closely with them to ensure that their participation will not be an excessive burden.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden. A small entity may be (1) a small business which is deemed to be one that is independently owned and operated and that is not dominant in its field of operation; (2) a small organization that is any not-for-profit enterprise that is independently owned and operated and is not dominant in its field; or (3) a small government jurisdiction, which is a government of a city, county, town, township, school district, or special district with a population of less than 50,000.


The collection of information will not have a significant impact on small businesses or entities.



6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Without the use of an APR, we can expect three major consequences. First, our efforts to monitor programs will be greatly hindered. As the IG audit reports have made clear, we need to improve our program monitoring, and the APR is central to this challenge. By revitalizing and improving our performance reports, we can gain a deeper understanding of our programs without substantially increasing our grantees’ existing reporting burden. While the recommendations made by the IG are certainly a motivating force, even more so is the expectation that, with more adequate tools, we can serve our grantees better and more successfully demonstrate the effectiveness of our programs to policymakers and the general public.


Second, without a standardized APR it is very difficult to aggregate data in a way that satisfies GPRA requirements and IG concerns. The immense diversity of Title III, V, and VII grant activities, as well as the variety of goals expressed in the authorizing legislation, has made it challenging to measure program outcomes in a reliable manner. With the APR we are collecting information that is more reliable, reasonable, and informative.


Third, without this data collection we cannot present to the American public and the higher education community a comprehensive, transparent view of the Title III, Title V, and Title VII Programs.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:


  • requiring respondents to report information to the agency more often than quarterly;


  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;


  • requiring respondents to submit more than an original and two copies of any document;


  • requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;


  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;


  • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;


  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or that unnecessarily impedes sharing of data with other agencies for compatible confidential use; or


  • requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


There are no special circumstances as outlined in #7 of the Supporting Statement Instructions.


8. As applicable, state that the Department has published the 60-day and 30-day Federal Register notices as required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instruction and record keeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years – even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


During the period of the prior cleared APR (2013-2016), IS has actively consulted with our Title III, V, and VII grantee communities. Our goal has been to solicit our grantees’ input, guidance, and support in developing a new system that will more fairly and accurately measure institutional and program performance. OPE’s Institutional Service has begun a long-term effort to reconsider and revise the entire proposal and performance-report process, including consultation with the grantees and other stakeholder communities. This future revision would move IS APRs from compliance-based to outcomes-based reporting. In particular, discussions on potential revisions to the proposal and reporting process, the feedback we have already received, efforts to develop outcome measures, and possible improvements to the major focus areas and legislatively allowed activities are currently being reviewed in consultation with Title III, V, and VII grantee communities. These efforts are not reflected in the current APR renewal; we anticipate that revision and consultation with stakeholders will take at least one year before we have a version of the proposal/data collection process ready for trial.


The 60-day notice was published in the Federal Register on November 22, 2016 (Vol. 81, page 83825). To date, no public comment has been received. A 30-day Federal Register notice will also be published to solicit public comments.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees with meaningful justification.


No payment or gifts are provided to respondents.

10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy. If personally identifiable information (PII) is being collected, a Privacy Act statement should be included on the instrument. Please provide a citation for the System of Records Notice and the date a Privacy Impact Assessment was completed as indicated on the IC Data Form. A confidentiality statement with a legal citation that authorizes the pledge of confidentiality should be provided.2 If the collection is subject to the Privacy Act, the Privacy Act statement is deemed sufficient with respect to confidentiality. If there is no expectation of confidentiality, simply state that the Department makes no pledge about the confidentiality of the data.


The Department makes no pledge about the confidentiality of the data. No personally identifiable information, other than contact information for the grantee project director, is provided. As such, requests for this information are in accordance with the following ED and OMB policies: the Privacy Act of 1974, OMB Circular A-108, OMB Circular A-130, OMB M-06-15, and OM:6-104 (Privacy Act of 1974).


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. The justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


There are no questions of a sensitive nature within the APR.


12. Provide estimates of the hour burden of the collection of information. The statement should:


  • Indicate the number of respondents by affected public type (federal government, individuals or households, private sector – businesses or other for-profit, private sector – not-for-profit institutions, farms, state, local or tribal governments), frequency of response, annual hour burden, and an explanation of how the burden was estimated, including identification of burden type: recordkeeping, reporting or third party disclosure. All narrative should be included in item 12. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.


  • If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in the ROCIS IC Burden Analysis Table. (The table should at minimum include Respondent types, IC activity, Respondent and Responses, Hours/Response, and Total Hours)


  • Provide estimates of annualized cost to respondents of the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.



Prior to the first submission of this package in fiscal year 1999, nine (9) grantees voluntarily reviewed and completed the APR as a “pilot test.” In addition to providing valuable insights and recommendations, the grantees were able to supply a reliable burden estimate based on their experiences. The hour burden on respondents is expected to vary by program, as the APR is structured around the number of activities that a grantee is undertaking. Typically, different projects funded by Title III, V, and VII have more or fewer activities than others, which causes variation in the burden on respondents. Each of the Title III/V/VII programs is identified in the following tables by CFDA, Program Name, and corresponding Title:

*Estimate based on total burden hours × $22.00 estimated hourly wage:


  • Number of respondents: 1,114

  • Frequency of response: Once per year for all 1,114 respondents

  • Annual hour burden: Total burden hours (23,390) ÷ number of respondents (1,114) ≈ 21 hours per respondent

  • Estimated annualized cost to respondents: $514,580 (estimate based on total burden hours × $22.00 estimated hourly wage; see the arithmetic check below)
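The figures above can be reproduced directly from the totals reported in this section. The following is a minimal arithmetic check only, not part of the APR itself; the 23,390 total burden hours, 1,114 respondents, and $22.00 hourly wage are taken from the estimates above.

    # Minimal check of the hour and cost estimates reported above.
    total_burden_hours = 23_390   # total annual burden hours across all programs
    respondents = 1_114           # number of respondents (one response per year)
    hourly_wage = 22.00           # estimated hourly wage used to monetize burden

    hours_per_respondent = total_burden_hours / respondents   # about 21 hours
    annualized_cost = total_burden_hours * hourly_wage        # $514,580

    print(round(hours_per_respondent), f"${annualized_cost:,.0f}")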


13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14.)


  • The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life); and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and acquiring and maintaining record storage facilities.


  • If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.


  • Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government or (4) as part of customary and usual business or private practices. Also, these estimates should not include the hourly costs (i.e., the monetization of the hours) captured above in Item 12


Total Annualized Capital/Startup Cost:

Total Annual Costs (O&M):

____________________

Total Annualized Costs Requested:


There are no costs to respondents other than those listed in question 12.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.


The contract for data collection, site maintenance, data checking, and updates is approximately $150,000 per option year. Staff support, including technical and substantive contract monitoring, is approximately 200 hours per year at $66 per hour, totaling $13,200. The estimated overhead cost is $594. Total estimated cost: $163,794.
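As a rough check, the total can be reproduced from the components listed above; the contract amount, staff hours, hourly rate, and overhead figure used below are taken directly from this section.

    # Check of the annualized cost to the Federal government reported above.
    contract = 150_000     # data collection, maintenance, checking, updates (per option year)
    staff_hours = 200      # technical and substantive contract monitoring, per year
    staff_rate = 66        # dollars per hour for staff support
    overhead = 594         # estimated overhead cost

    total = contract + staff_hours * staff_rate + overhead
    print(f"${total:,}")   # $163,794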


15. Explain the reasons for any program changes or adjustments. Generally, adjustments in burden result from re-estimating burden and/or from economic phenomena outside of an agency’s control (e.g., correcting a burden estimate or an organic increase in the size of the reporting universe). Program changes result from a deliberate action that materially changes a collection of information and generally are the result of a new statute or an agency action (e.g., changing a form, revising regulations, redefining the respondent universe, etc.). Burden changes should be disaggregated by type of change (i.e., adjustment, program change due to new statute, and/or program change due to agency discretion), type of collection (new, revision, extension, reinstatement with change, reinstatement without change) and include totals for changes in burden hours, responses, and costs (if applicable).


The increase in burden results from a more accurate count of the respondents who complete the APR for Title III, V, and VII programs. The prior OMB clearance did not fully account for all Title III, V, and VII programs that would complete the APR.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Institutional Service will not be publishing the results of the information collection.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


Institutional Service is not seeking approval to not display the expiration date for OMB approval of the information collection.


18. Explain each exception to the certification statement identified in the Certification of Paperwork Reduction Act.


There are no exceptions to the certification statement in the Certification of Paperwork Reduction Act.


1 Please limit pasted text to no longer than 3 paragraphs.

2 Requests for this information are in accordance with the following ED and OMB policies: Privacy Act of 1974, OMB Circular A-108 – Privacy Act Implementation – Guidelines and Responsibilities, OMB Circular A-130 Appendix I – Federal Agency Responsibilities for Maintaining Records About Individuals, OMB M-03-22 – OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002, OMB M-06-15 – Safeguarding Personally Identifiable Information, OM:6-104 – Privacy Act of 1974 (Collection, Use and Protection of Personally Identifiable Information)




