UNITED STATES FOOD & DRUG ADMINISTRATION
Pilot to Develop Standardized Reporting Forms for Federally Funded
Public Health Projects and Agreements
OMB Control No. 0910-NEW
SUPPORTING STATEMENT Part A – Justification:
1. Circumstances Making Collection of Information Necessary
The Food and Drug Administration (FDA, the agency, us, or we) is conducting a pilot regarding federally funded public health projects administered by FDA’s Office of Regulatory Affairs (ORA). Consistent with applicable regulations, ORA collects information 3 to 4 times annually related to an awardee’s progress in completing agreed-upon performance metrics. The authority for collecting such information is found in Code of Federal Regulations Title 2: Grants and Agreements (Uniform Guidance). Section 200.301, Performance Measurement (2 CFR 200.301), instructs that “[t]he Federal awarding agency must measure the recipient's performance to show achievement of program goals and objectives, share lessons learned, improve program outcomes, and foster adoption of promising practices,” for grants and cooperative agreements. Similarly, Federal Acquisition Regulation Part 42, Contract Administration and Audit Services, subpart 42.15, Contractor Performance Information, requires that agencies collect sufficient data to support past performance evaluations for contracts.
Related to, but distinct from, performance and progress measures and metrics, ORA is seeking to increase its efficiency in analyzing and evaluating program effectiveness, return-on-investment (ROI), and return-on-value (ROV) for the federal partnership projects it administers. This effort aligns with HHS-specific performance measurement requirements described in 45 CFR 75.301, Performance measurement, which states that “Performance reporting frequency and content should be established to not only allow the HHS awarding agency to understand the recipient progress but also to facilitate identification of promising practices among recipients and build the evidence upon which the HHS awarding agency's program and performance decisions are made.” Currently, respondents submit the requisite information to FDA as free-text narrative in portable document format (PDF) via email for the Mid-Year Report or, in the case of the Annual/End of Year Report, via eRA Commons using the Research Performance Progress Report (RPPR) forms from OMB 0970-0334 and the Federal Financial Report (FFR) SF-425.
Data resulting from the proposed collection will be used to monitor awardee progress towards project goals and objectives, for quality improvement, and to respond to inquiries from the Department of Health and Human Services, Congress, and other sources. Monitoring and assessment of program activities allows ORA to provide oversight of the use of Federal funds. This information collection will allow ORA to monitor the increased emphasis on partnerships and programmatic collaboration and is expected to enhance program impact and maximize the use of Federal funds.
The proposed collection instruments herein are necessary and not duplicative, as the eRA Commons forms and submission platform cannot provide the data uniformity and aggregation capabilities ORA needs to conduct programmatic evaluations on the timeline necessary to make informed decisions for the next funding cycle. The forms available in OMB 0970-0334 were designed to evaluate general performance in meeting grant- and cooperative agreement-specific requirements. ORA’s programmatic reporting elements specific to the public health objectives for the program are both quantitative and qualitative and need to be aggregated and analyzed categorically. ORA’s program managers currently spend months reviewing text-based PDFs that may be hundreds of pages long, sifting through extraneous information to find and extract data of interest.
The lack of specificity of these reporting mechanisms relative to ORA’s program-specific data needs imposes a substantial time and psychological burden on award recipients and on the FDA internal stakeholders conducting program evaluations. Specific indicators of this burden include turnover among the award coordinators responsible for reporting at recipient state regulatory agencies, the numerous follow-ups needed each reporting cycle by ORA program managers to obtain the information required to complete their programmatic assessments, and disproportionate hours spent reviewing reports and attempting to extract key program data with poor or unusable results.
A real example of the extraneous burden imposed on ORA award recipients in trying to meet program reporting requirements under current reporting mechanisms comes from a state agency under an ORA funding program that had been trying, unsuccessfully, for over a year to document satisfactory progress and provide required data under the program. State project leadership characterized reporting as “impossible”. Only after numerous meetings, follow-ups, research, and coordination with FDA project managers was the state able to return the program to good standing under the agreement. The specifically tailored forms submitted under this collection are designed to alleviate the confusion and reduce the time needed to understand and meet program reporting requirements.
Generalized reporting forms require a substantial time investment from state agency staff, who must cross-reference program requirements against ambiguous data entry fields and seek assistance from ORA staff to understand requirements. Despite this level of effort, ORA program staff still receive tens of thousands of pages of extraneous information that is not relevant or useful for their assessments. Turnover among state program coordinators magnifies this burden on both sides. The example described above involves a smaller state regulatory agency; such agencies often have limited administrative staff available to devote to reporting requirements and may be particularly susceptible to the burdens observed under the current reporting system. These are also the agencies and state public health programs with great potential to benefit in meaningful ways from participation in ORA-funded public health projects, making any barrier to participation that results from reporting alone highly undesirable.
Generalized reports further contribute to sub-optimal data quality because they lack programmatic specificity and structure. Without specific questions and guidance, ORA has seen identical questions answered in widely different ways across awardees within the same program. Every reporting cycle, numerous submitted reports must be returned to awardees. In addition, regardless of the original format, attachments uploaded to the eRA Commons system are converted to a flattened PDF, which prevents any form of aggregation other than manual recognition and extraction of the needed data elements by individual ORA program staff. These factors limit the reliability of within- and cross-program analyses and ORA’s ability to respond to inquiries on program effectiveness. Cumulative spending reports using the SF-425 that do not distinguish spending by tracks within an agreement or spending by major expense type limit ROI/ROV analysis, program evaluations, and identification of successful funding applications. For years, ORA has tried to work within the constraints of existing data collection mechanisms, with limited success in answering questions about how public health funds are being invested and whether they are achieving the desired results under these programs.
In contrast to other competitive federal funding award programs, ORA-funded public health projects are designed to foster cooperation and partnership between federal and state public health agencies and representative regulatory associations. Programs are also highly specific in their public health objectives, as informed by current public health risks, and have a project lifespan of 3 to 5 years, necessitating a strong working relationship with open communication between ORA and project recipients. We have developed several digital forms that contain targeted, standardized questions designed to capture the data elements we believe are necessary to monitor and analyze performance, as well as measure and track ROI/ROV. Feedback used to inform development of the report forms under the current application for clearance included:
Fifteen program-specific form development workgroups that included report data users representing FDA stakeholders both internal and external to ORA.
Evaluation of data quality from reports submitted using comparable forms under programs that had fewer than nine participants and were not subject to the PRA.
Frequent (at times daily) conversations between ORA program managers or technical staff and award recipients.
Feedback received by issuing draft forms to program participants with a blanket request for feedback referencing the Federal Register notice for public comment.
Informal listening sessions and comments ORA staff received at annual program meetings.
However, at this stage we anticipate that the forms and questions will need to be revised as we receive new data and user feedback during the pilot. ORA expects to continue receiving feedback, solicited and unsolicited, via e-mail, phone, and in-person and virtual meeting communication between ORA project managers, technical staff, and award recipients, given the nature of these agreements and projects. We will continue to use lessons learned from data received under programs not subject to PRA review and will ask our project managers and technical staff to confirm they are receiving usable data or to notify ORA data collection staff when data deficiencies resulting from form design are identified. Program staff will continue to host remote and in-person listening sessions at national meetings for program participants to provide feedback. Also aligned with FDA’s and ORA’s goal of applying limited funding resources most effectively, as indicated by current public health needs, we anticipate that further development of the forms and questions is still needed.
We have developed the following forms as part of the pilot project:
Animal Feed Regulatory Standards (AFRPS) Program Report
Animal Food Contract Quarterly Summary Report
Animal Food Safety Inspection Audit Form
Corrective Action Plan for Program and Individual Performance Deficiencies
Egg Contract Quarterly Summary Report
Emergency Response Course Preregistration Workbook
FDA 3610 Field Inspection Audit
Flexible Funding Model (FFM) Program Report
Food Protection Task Force (FPTF) Program Report
General Program Report Form (non-specific for new cooperative agreement and grant programs)
Human Food Contract Quarterly Summary Report
Laboratory Flexible Funding Model (LFFM) Program Report
LFFM Instructions QTR LFFM Chem_LFFM HAF Results Sheet
LFFM Instructions QTR Data Template Micro HAF Product Testing
LFFM Instructions Sample and Activity Plan Proposal
LFFM ORS Capability Inquiry Template
LFFM QTR Chem_LFFM_HAF_Results_Sheet
LFFM QTR Data Template Micro HAF Product Testing
LFFM Sample and Activity Plan_Proposal Template
LFFM_SRP-Lab Agreement Template_HAF Tracks
Manufactured Food Course Preregistration Workbook
Medical Devices Contract Quarterly Summary Report
MQSA MEU and Spending Update Report
Produce CAP_Project Plan outline
Produce CAP Assessment Template
Produce Course Preregistration Workbook
Produce Educational Needs Assessment Submission Template
Produce Inspection Aggregate Data Workbook
Produce Instructions Inspection Aggregate Data
Produce Inventory & Education Aggregate Data workbook
Produce Instructions Inventory & Education Aggregate Data
Produce Program Report
Request for Audit Reduction
Scientific Conference Program Report
State Implementation Agreement and Year End Evaluation
Veterinary Medicine Course Preregistration Workbook
A spreadsheet indexing the forms is provided with this submission. We are therefore requesting OMB approval for the information collection covered by the pilot project, collected using the proposed forms and discussed in this supporting statement.
2. Purpose and Use of the Information Collected
Respondents to the collection of information are recipients of FDA federally funded public health projects, or participants in related non-funded projects administered by ORA. Feedback from ORA internal and external stakeholders on the draft forms, and experience with comparable forms for programs not subject to PRA review, confirms that the use of standardized forms will reduce the time needed by awardees to complete and submit required progress reports and related performance data, as well as facilitate ORA review of the requisite information. Targeted questions will eliminate ambiguity in responses, allow categorical and quantitative review and evaluation of the reporting data, and facilitate ORA’s ability to establish and employ best practices for future program and funding decisions for the public health projects it administers. This will be assessed in the pilot by reviewing aggregated data with the established program-specific forms workgroups (which include the project managers and technical staff responsible for program assessment) to confirm that data quality is adequate or to identify revisions needed to achieve it, by program. Other criteria to be considered include the total number of pages program staff need to review, the time required to aggregate program-specific data, and the number of follow-ups needed to complete program assessments.
As part of the pilot, respondents will complete approximately 2 to 4 reports that include specific questions regarding project updates. For some projects, ORA is also introducing an initial report to be submitted annually. The initial report will not be used to capture progress; instead, it will ask the awardee for an activity plan, which will then be used to set up targeted report forms for that year. Historically, this was accomplished by email or phone communication between ORA program staff and participants; the initial report will provide a standardized format for communicating this information. Based upon public feedback, we hope to revise the digital forms, tailoring them to capture specific project data elements and to improve question clarity, formatting, usability (e.g., drop-down menu selections), and potential common response indicators that will reduce the time respondents need to provide information. Feedback will be received via individual communication between program staff and program participants, listening sessions, and the established program-specific form development workgroups.
To ensure data quality, on a case-by-case basis, we anticipate the potential need for follow-up questionnaire(s) and/or ancillary supporting documentation to supplement the scheduled reports as the standard instruments of collection are developed and fine-tuned through this effort. Examples of categorical supporting performance documentation include, but are not limited to, documentation related to training and verifying conformance activities.
Standardization of data elements and field designations will enable implementation of an easy-to-use data analysis dashboard for internal use and greatly reduce the burden of review activities for project managers as well. We expect this system will eliminate the need to manually search long narratives for specific indicators of awardee progress toward required performance metrics, because the applicable progress narratives will be linked as the awardee enters information in a specified data entry field. The reduction in review time should improve project managers’ response times to awardees and greatly reduce incidences of requesting additional information due to missed performance metrics (e.g., metrics not addressed in the report by mistake or missed during the review). This will be confirmed during the pilot by internal assessment of these criteria with ORA program staff, specific comparative analysis of pages submitted using the proposed forms versus previously used generalized reports, comparisons of the FDA staff time needed to generate aggregated reports previously versus currently, and the number of follow-ups with state agencies regarding the information presented.
3. Use of Improved Information Technology and Burden Reduction
Currently, project performance data are reported in free-text and narrative form in portable document format (PDF) submitted by email for the Mid-Year Report or, in the case of the Annual/End of Year Report, via eRA Commons using the Research Performance Progress Report (RPPR), OMB 0970-0334, and the Federal Financial Report (FFR) SF-425. Under the pilot project, we will utilize digital forms in MS Excel or fillable PDF format with standardized reporting elements common to all awardees. These forms are intended to capture the specific data needed to document planned progress, linked to the specific performance elements for an individual award, at the beginning of each budget year. Planned progress items from the standardized forms will be captured in an aggregate data management dashboard for easy review and extraction by project managers. Planned items may also be used to pre-populate data fields in subsequent progress reports for easy reference by respondents. This approach should reduce the time and effort burden for awardees by providing previously planned progress items and a standard structure with labeled data fields for project progress and performance report data that clearly connect progress narratives to the applicable performance metrics for an award. The addition of drop-down menus for data capture, when possible, will also reduce the dependence on free-form narrative text and further standardize progress data.
Each progress report form will also include a section customizable by public health project so that project managers have the flexibility to capture the performance data types unique to that project award. Customized sections will still use labeled data fields and standard drop-down menus rather than free text when possible. We propose to use early-stage information collection and analysis to determine whether the performance and progress data gathering and the data gathering used to help determine program effectiveness (ROI/ROV) are related enough to be done on the same instruments, or distinct enough to require separate instruments.
4. Efforts to Identify Duplication and Use of Similar Information
We are unaware of duplicative information collection. The RPPR and other forms available from OMB 0970-0334 are designed to measure grant or agreement performance metrics and do not capture the data needed to evaluate the ROI/ROV and program effectiveness metrics ORA must assess. The format of data submitted in response to broadly focused, generalized questions with a free-text narrative response structure is completely inconsistent, both across submissions within a single program and even within a single recipient project, preventing uniform data collection and confidence in aggregation for analysis. The eRA Commons system is also restricted to flattened PDF data outputs, requiring numerous project managers and many hours to review and manually extract the data needed for their assessments. We believe developing customized reporting forms/collection instruments will better allow ORA to meet its programmatic evaluation needs with regard to budget funding cycles.
5. Impact on Small Businesses or Other Small Entities
No undue burden is imposed on small entities as a result of the information collection.
6. Consequences of Collecting Information Less Frequently
The information collection schedule is consistent with current regulatory requirements and functions in a follow-on reporting capacity for respondents who engage in contractual activities utilizing federal funds.
7. Special Circumstances Relating to the Guidelines in 5 CFR 1320.5
Federally funded public health project managers are charged with ensuring federal monies are used for their intended purpose and are achieving the desired ROI and ROV goals. The only times the pilot study may require participants to report information more often than quarterly, or to provide a written response in fewer than 30 days, would be for those projects where the data collected via the digitized forms are found to be incomplete at the time of review. While this is expected to be rare, the project manager may send a follow-up questionnaire to collect the additional data needed to complete their review.
There are no other special circumstances associated with this information collection.
8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
In accordance with 5 CFR 1320.8(d), we published a 60-day notice soliciting public comment in the Federal Register of July 29, 2021 (86 FR 40853). No comments were received.
ORA has received the following feedback from internal and external stakeholders to date by soliciting input outside of the PRA public comment opportunity:
ORA program reviewers:
The new forms have successfully provided usable program effectiveness data for their evaluation without the need for follow-up.
Reports that do not use the templates fail to address required program report criteria and lack the consistency needed to generate aggregated reports.
Reports submitted via eRA Commons (i.e., the End of Year report) are in flattened PDF form and require many more hours, days, and weeks to review and confirm requirements are met compared to reports submitted using the new templates.
New forms allow for quick data aggregation (within a day in some cases).
Aggregated reports allow for more efficient review because data can be filtered and reviewed by specific criteria or tracks depending on the reviewing assignment, and numerical data can be quantified and visualized using chart functions.
Need to be able to document expenses by track and major category for a project to answer questions about how the money is spent and inform ROI/ROV.
Developed initial drafts and helped fine-tune questions as workgroup participants.
Participants from programs using comparable forms not subject to the PRA or used under established collections:
The forms reduce or remove the ambiguity from reporting requirements.
There was a learning curve, just like any new form, but once they got used to working in the MS Excel format they liked it.
Noticed a couple of cases where we asked for the same information in a different way for different programs. We could improve consistency in how we ask for some data across forms for those participating in multiple programs.
Some data requested is not something they routinely track for their program.
Would like to see a frequently asked questions document or job aid developed for using the MS Excel forms that includes things like helpful shortcut keys and tips for avoiding errors.
The forms will reduce their reporting burden for OP-required reports and will also be implemented and used by the state agency for its own program evaluation and recordkeeping purposes.
Participants representing programs with which draft forms were shared, soliciting feedback that referenced the published Federal Register notices for public comment on this collection:
Like the pre-population of administrative information such as agreement number and performance period dates, would like more of that where possible.
Would prefer that fields with pre-populated data from the previous report remain editable so they can fix mistakes at the subsequent submission if needed.
The Excel forms can be difficult to review for internal approval due to display limitations for longer entries.
Printing Excel forms does not work well without adjusting settings.
Spell check is not automatic in MS Excel.
Some individuals had trouble copying information into the template.
For some projects, the individuals reporting differ by track. In this case, they would like to fill out and submit multiple copies of the form from a single agency to avoid waiting for different individuals to complete their sections.
Some of the aforementioned feedback was received after this collection was submitted for clearance and will therefore be implemented at the next revision cycle for the collection. The rest was either incorporated into the current drafts or discussed with the submitter as to why it was not incorporated at this time.
9. Explanation of Any Payment or Gift to Respondents
No payment or gift to respondents is associated with this information collection.
10. Assurance of Confidentiality Provided to Respondents
This information collection does not request trade secret or commercial confidential information.
Privacy Act
Although the ICR is collecting personally identifiable information (PII), it is collected in the context of the subject individuals’ professional capacity and the FDA-related work performed for their employer. Information will be collected through the listed report forms. The PII collected is name, address, telephone number, and email address. Although PII is collected, the information collection is not subject to the Privacy Act of 1974, and the particular notice and other requirements of the Privacy Act do not apply. Specifically, FDA does not use name or any other personal identifier to retrieve records from the information collected.
11. Justification for Sensitive Questions
There are no sensitive questions associated with this information collection.
12. Estimates of Annualized Hour Burden and Costs
12a. Annualized Hour Burden Estimate:
Table 1.--Estimated Annual Reporting Burden¹

Activity | No. of Respondents | No. of Responses per Respondent | Total Annual Responses | Average Burden per Response | Total Hours
Initial report | 330 | 1 | 330 | 10 hours | 3,300
Update reports | 330 | 2 | 660 | 40 hours | 26,400
Supplement or Follow-up reports (if applicable) | 100 | 1 | 100 | 10 hours | 1,000
TOTAL | | | | | 30,700
¹There are no capital costs or operating and maintenance costs associated with this collection of information.
We estimate that 330 respondents will participate under this pilot project and will submit an average of 3 to 4 reports (including the initial report) annually (Table 1). To ensure adequate reporting over the course of this pilot, the option for a supplement or follow-up report is included in the estimated reporting burden; however, the need for these reports will be determined on a case-by-case basis with the FDA project manager. Examples of ancillary supporting performance documentation include, but are not limited to, training records, audit verification, or other documentation required under a funded project’s performance metrics.
Table 2.--Estimated Annual Recordkeeping Burden¹

Activity | No. of Recordkeepers | No. of Records per Recordkeeper | Total Annual Records | Average Burden per Recordkeeping | Total Hours
Records related to Initial Report | 330 | 1 | 330 | 0.5 hour | 165
Records related to Update Reports | 330 | 2 | 660 | 0.5 hour | 330
Records related to Supplement or Follow-up Report (if applicable) | 100 | 1 | 100 | 0.5 hour | 50
TOTAL | | | | | 545
¹There are no capital costs or operating and maintenance costs associated with this collection of information.
Recordkeeping activities include storing and maintaining records related to submitting a request to participate in the project and compiling reports. We assume respondents use their current record retention capabilities for electronic or paper storage to accomplish these activities. We assume it will take 0.5 hour per year to ensure that the documents related to submitting a request to participate in the program and the compiled reports are retained properly according to their existing recordkeeping policies, but for no less than three years, as recommended by FDA (Table 2).
Table 3.--Estimated Annual Third-Party Disclosure Burden¹

Awardee Activity | No. of Respondents | No. of Disclosures per Respondent | Total Annual Disclosures | Average Burden per Disclosure | Total Hours
Coordination with partnering entities related to Initial Report | 200 | 2 | 400 | 8 hours | 3,200
Coordination with partnering entities related to Update Reports | 200 | 4 | 800 | 8 hours | 6,400
Coordination with partnering entities related to Supplement or Follow-up Report (if applicable) | 100 | 2 | 200 | 8 hours | 1,600
TOTAL | | | | | 11,200
¹There are no capital costs or operating and maintenance costs associated with this collection of information.
For those pilot projects in which a participant is composed of partnering entities, we are taking into consideration the time that partnering entities will spend coordinating with each other during a pilot project. We estimate 200 respondents will work with their respective partnering entities, and the average number of partnering entities will be 2. We assume each respondent will spend eight hours coordinating with each partnering entity on each response for this pilot. Respondents will coordinate with their partnering entities to create the progress reports and the final report submitted to FDA (Table 3).
12b. Annualized Cost Burden Estimate:
The annualized cost to all participants for the hour burden for the collection and reporting of information is estimated at $1,014,436 (42,445 hours x $23.90 per hour), where 42,445 hours is the sum of the reporting, recordkeeping, and third-party disclosure hours in Tables 1 through 3 (30,700 + 545 + 11,200). The hourly wage estimate is the average of the mean wages received by Agricultural and Food Science Technicians at $22.08, Biological Technicians at $23.79, and Chemical Technicians at $25.82 (May 2020 National Occupational Employment and Wage Estimates, United States), which represent the primary roles expected to contribute effort in compiling report information. See http://www.bls.gov/oes/current/oes_nat.htm.
Table 4.--Estimated Annual Burden Cost

Activity | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs
Initial Report | 6,665 | $23.90 | $159,294
Update Reports | 33,130 | $23.90 | $791,807
Supplement Report | 2,650 | $23.90 | $63,335
Total | | | $1,014,436
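For readers who wish to verify the arithmetic, the short calculation below reproduces the hour totals and costs in Table 4 from the figures in Tables 1 through 3 and the cited BLS wage data. It is an illustrative Python sketch only; the half-up rounding to whole dollars is an assumption inferred from the table totals, and the labels are shorthand rather than official field names.

from decimal import Decimal, ROUND_HALF_UP

# Average hourly wage from the three BLS May 2020 mean wages cited above (assumed half-up rounding).
wage = ((Decimal("22.08") + Decimal("23.79") + Decimal("25.82")) / 3).quantize(
    Decimal("0.01"), rounding=ROUND_HALF_UP)           # -> 23.90

# Hours per activity = reporting (Table 1) + recordkeeping (Table 2) + third-party disclosure (Table 3).
hours = {
    "Initial Report":    3300 + 165 + 3200,             # 6,665
    "Update Reports":    26400 + 330 + 6400,            # 33,130
    "Supplement Report": 1000 + 50 + 1600,              # 2,650
}

for activity, h in hours.items():
    cost = (h * wage).quantize(Decimal("1"), rounding=ROUND_HALF_UP)
    print(f"{activity}: {h:,} hours x ${wage} = ${cost:,}")

total_hours = sum(hours.values())                       # 42,445
total_cost = (total_hours * wage).quantize(Decimal("1"), rounding=ROUND_HALF_UP)
print(f"Total: {total_hours:,} hours x ${wage} = ${total_cost:,}")   # -> $1,014,436

Run as written, this reproduces the per-activity costs of $159,294, $791,807, and $63,335 and the $1,014,436 total shown in Table 4.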
13. Estimates of Other Total Annual Cost Burden to Respondents and/or Recordkeepers/Capital Costs
There are no capital, start-up, operating, or maintenance costs associated with this pilot program.
14. Annualized Cost to the Federal Government
ORA intends to communicate with all pilot project participants to ensure that lessons learned from the pilot project(s) complement and inform the development of the new fillable forms for partners. ORA will work with participants to develop an appropriate schedule for the submission of update reports based on the design and duration of the pilot project.
The annualized government cost estimate is $46,210.77, as shown in Table 5; this cost will be supported by existing ORA program budgets.
Table 5.--Estimated Government Costs Using the 2020 Salary Tables

Government Personnel | Effort Commitment | Average Annual Salary | Total Costs
GS-11 (1) | 25% | $64,009 | $16,002.25
GS-13 (9 @ 3% each) | 27% | $91,796 | $24,784.92
GS-14 (1) | 5% | $108,472 | $5,423.60
Total | | | $46,210.77
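As with the respondent cost estimate, the government cost in Table 5 can be reproduced directly from the effort commitments and average salaries shown. The following is an illustrative Python sketch only; the figures come from Table 5 and the 2020 salary tables cited in its title.

from decimal import Decimal, ROUND_HALF_UP

# Effort commitments and average annual salaries taken from Table 5.
positions = [
    ("GS-11 (1)",           Decimal("0.25"), Decimal("64009")),
    ("GS-13 (9 @ 3% each)", Decimal("0.27"), Decimal("91796")),   # 9 staff x 3% each = 27%
    ("GS-14 (1)",           Decimal("0.05"), Decimal("108472")),
]

total = Decimal("0")
for label, effort, salary in positions:
    cost = (effort * salary).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
    total += cost
    print(f"{label}: ${cost:,}")

print(f"Total annualized government cost: ${total:,}")   # -> $46,210.77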
15. Explanation for Program Changes or Adjustments
This is a new information collection. The burden estimate was adjusted from the 400 respondents published in the 30-day notice (87 FR 38165) to 330 respondents to reflect actual enrollment in the ORA-funded public health projects under this collection.
16. Plans for Tabulation and Publication and Project Time Schedule
There are no plans for tabulation, publication, or a project time schedule.
17. Reason(s) Display of OMB Expiration Date is Inappropriate
The OMB expiration date will be displayed as required on all subject forms.
18. Exceptions to Certification for Paperwork Reduction Act Submissions
There are no exceptions to the certification in 5 CFR 1320.9.