March 21, 2007
Supporting Statement Part A for Paperwork Reduction Act Submission
OMB Control Number: 1660-0032
Title: National Fire Academy End of Course Evaluation Form
Form Number(s): FF 95-20
A Supporting Statement, including the text of the notice to the public required by 5 CFR 1320.5(a)(1)(iv) and its actual or estimated date of publication in the Federal Register, must accompany each request for approval of a collection of information. The Supporting Statement must be prepared in the format described below, and must contain the information specified in Section A below. If an item is not applicable, provide a brief explanation. When Item 17 of the OMB Form 83-I is checked “Yes”, Section B of the Supporting Statement must be completed. OMB reserves the right to require the submission of additional information with respect to any request for approval.
To complete the supporting statement, type in your responses in the white space below each question. Your responses should be full and complete and provide sufficient information to help the OMB desk officer understand what you are planning to do and why, and how the Agency/Federal Government will benefit from and use the information you will be obtaining or soliciting.
1. Explain the circumstances that make the collection of information necessary (give details as to why this information is being collected). Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information. Provide a detailed description of the nature and source of the information to be collected.
The National Fire Academy (NFA) is congressionally mandated to provide training and education to the Nation’s fire service and emergency response personnel (Exhibit #4). The state-of-the-art programs offered by the NFA serve as models of excellence, and state and local fire service agencies rely heavily on the curriculum to train their personnel. To maintain the high standards of these programs, it is critical that courses be evaluated to determine student satisfaction and reaction to the course materials, instructional delivery, and the training environment.
NFA courses are offered through two channels, on-campus and off-campus, and two separate evaluation forms have been used for them. However, the agency believes it will be more efficient, for both information collection and analysis, to consolidate the two forms into one. This request seeks approval for a single evaluation form that will be used to evaluate all traditional classroom-based course deliveries, at both on-campus and off-campus locations, ranging in duration from two to ten days. The questionnaires have been modified for the consolidation: some of the previous questions have been eliminated and others updated.
To coincide with the completion of each course offering, paper evaluation forms will be mailed to each off-campus training location and administered to students. In contrast, for all on-campus courses, an online electronic evaluation form will be used, and students will be given the opportunity to access the electronic form through a web-based application at the completion of the course (Exhibit #3). The data collection will be managed under the overall guidance of the NFA project officer, but it will be implemented by contracted staff with considerable training and experience in data collection, analysis and reporting.
2. (1) Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, (2) indicate the actual use the agency has made of the information received from the current collection. If applicable, provide a detailed description of how the information will be shared and for what programmatic purpose.
(1) The data collected by NFA for its end-of-course evaluation form will be used by:
(a) contract instructor faculty
(b) instructional development/training specialist staff, and
(c) U.S. Fire Administration/NFA management staff
Information collected with this evaluation form will enable NFA staff to monitor and recommend changes in course materials, subject selection criteria, the training experience and the classroom environment. From the processed data, reports will be made available to management, staff and contracted instructional faculty.
Analysis Plan
The paper forms, once received in the Academy’s Training Evaluation Center, will be optically scanned and the data will be processed using the Statistical Package for the Social Sciences (SPSS) software. Data from the online electronic form will be processed through an Oracle database.
The NFA End of Course Evaluation Form is organized into four parts and includes two types of data. The first type is descriptive statistical data concerning respondent demographics, course materials, instructional delivery methods, and the overall training experience. The second type is narrative comment data from students, indicating suggested improvements for NFA training (and the reasons for them), feedback about individual instructors, and the most and least beneficial aspects of the training. The data will be processed using the SPSS software (for the optically scanned forms) and through the Oracle database (for the online web form). Microsoft Word reports (scanned forms) and Crystal Reports (online forms) will be compiled for review by Training/Instructional Systems Specialists and NFA managers.
In addition to the standardized descriptive and narrative types of data provided in regular end-of-course reports to Training/Instructional Systems Specialists and NFA managers, special reports will be run from the end-of-course data for other internal and external audiences as required.
(2) There are several ways in which the data from this data collection are used. For example,
(a) contract instructor faculty receive both item-specific and scaled assessment scores from student evaluations [Parts I-II of Exhibit #1] and are able to determine their overall effectiveness in teaching the NFA training course. Narrative data are also included in the NFA course evaluation reports and indicate student suggestions for course improvements as well as suggestions or comments students might have for course instructors,
(b) instructional development/training specialist staff receive the same data (see Exhibit #1) and use it to review and determine appropriate methodologies and delivery modes for NFA training; and finally,
(c) U.S. Fire Administration/NFA management staff use the EOC data to project overall fire service training needs and suggestions for future training courses.
By way of analysis, Parts I-II from the first page of the printed report [Exhibit #1] indicate mean scores for scales presenting student assessments of several dimensions of the training experience (i.e., materials, environment, and instructional delivery). Narrative data from the report (see pages 2 and following of Exhibit #1) indicate the need for suggested course revisions or updates, as well as student feedback to instructors, as indicated by the numbered items above.
Additional uses of the data include the ability of instructors, training managers and NFA management and staff to synthesize comparative instructor ratings.
In the case of instructors, each NFA instructor can query the End of Course [Oracle] database to display his or her course (and/or curriculum) performance score relative to both (a) the overall average score for all instructors who have taught in the same course and/or curriculum, and (b) the overall average score for all NFA instructors teaching across all curriculum areas.
In the case of Training Evaluation staff, each NFA Training Specialist can also query the End of Course [Oracle] database to display instructor course (and/or curriculum) performance score(s), and again, with these in comparison to (a) all instructors who have taught in the same course and/or curriculum, and (b) all NFA instructors teaching across all curriculum areas.
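By way of illustration only, the comparative calculation described above can be sketched in a few lines of code; the record layout and field names used below (instructor_id, course_id, score) are assumptions for the example, not the actual Oracle schema or NFA reporting tool.

from statistics import mean

def comparative_scores(records, instructor_id, course_id):
    # records: iterable of dicts with keys 'instructor_id', 'course_id', 'score'
    records = list(records)
    own = [r["score"] for r in records
           if r["instructor_id"] == instructor_id and r["course_id"] == course_id]
    course = [r["score"] for r in records if r["course_id"] == course_id]
    overall = [r["score"] for r in records]
    return {
        "instructor_mean": mean(own) if own else None,    # the instructor's own average
        "course_mean": mean(course) if course else None,  # (a) all instructors, same course
        "nfa_mean": mean(overall) if overall else None,   # (b) all NFA instructors, all curricula
    }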
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.
The survey activity will partially rely on electronic and automated methods. Data entry and processing will be electronic and automated (consistent with Implementation of the Government Paperwork Elimination Act, Office of Management and Budget, Part I, Section 2.a. and Section 3.a.(8)). Reports will be produced in both printed and electronic form. Details follow:
The form will be made available as an online, electronic form for use by students attending on-campus course deliveries. Each classroom/computer lab desktop will display an evaluation icon that will allow students to (1) access, (2) complete, and (3) submit their responses online. The data from these submissions will be processed using an Oracle database, and both the data and the subsequent reports will be available in real time for users with access credentials. These access credentials will include a user identification and secure password.
For off-campus use at state and local training locations, the current plan is to print the form in an optically scannable format. Students will be asked to complete the form by marking the appropriate response categories with either a pen or a pencil. The forms will then be scanned using a self-feeding electronic scanner, and the reports generated from the scanned data will be distributed electronically via e-mail and web posting. Over time, NFA staff will work with states and local training institutions to determine their ability to use online electronic resources, because individual states and local training entities currently vary in their ability to use information technology in support of training delivery and evaluation. Therefore, utilizing both methods, printed and electronic, minimizes the burden as much as possible.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.
A separate NFA course evaluation form, OMB 1660-0031, previously coexisted with and was similar to the proposed form; it covered NFA Field/Off-campus and State Weekend Programs. That form was discontinued at the agency’s request as of 11/30/2005. The request reflected the agency’s effort to eliminate two similar sets of data collection instruments and to gain the efficiency of using one standard data collection form for all NFA traditional classroom-based course deliveries.
No other systematic collection exists for all of the information gathered here, such as NFA student demographics, course content, and quality of instructional delivery. Linking the NFA admissions form to the course evaluation form was considered as a way to supply the demographic data fields and reduce the respondent hour burden. However, such a linkage was rejected because it would expose the student’s name and social security number, whereas students should remain anonymous when they complete the evaluation form. Thus, it is necessary to collect both demographic data and student opinions within the same form so that the information in all data fields can be correlated without compromising anonymity.
5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.
Only individual students who attend NFA traditional classroom based training are asked to complete this evaluation form. Small businesses and/or other small entities are not required to complete this course evaluation form.
6. Describe the consequence to Federal/FEMA program or policy activities if the collection of information is not conducted, or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
If the data collection is not conducted, the USFA/NFA will not be able to determine the appropriateness of course materials for the target audiences. The collection also enables staff to track trends in participants’ reactions to course content over time. Without the data, the evaluation database would not be usable, participants’ evaluations of training events could not be assessed, and it would be difficult to determine the need for improvements and the degree of student satisfaction with the course experience.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
(a) Requiring respondents to report information to the agency more often than quarterly.
Respondents are only required to provide the information once, after completion of each training course. There are no special circumstances that require the collection to be conducted in a manner that is inconsistent with the general information collection guidelines in 5 CFR 1320.6.
(b) Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it.
The end-of-course evaluation process elicits Level I training data, which measures student reaction and satisfaction with the training course. In order to capture the student’s immediate reaction and satisfaction, the evaluation form is administered immediately following the conclusion of the training course, and the completed form is expected to be turned in the same day for on-campus deliveries or within approximately ten days for off-campus deliveries. Completion of the form is voluntary.
(c) Requiring respondents to submit more than an original and two copies of any document.
Respondents are required to submit only the original evaluation form.
(d) Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years.
There is no recordkeeping requirement involved in this collection.
(e) In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study.
The survey methodology used in this collection follows accepted standards of data quality and reliability. The methodology used in this collection is the same methodology approved by OMB under the last PRA submission. Refer to Part B for details on the proposed statistical methodology.
(f) Requiring the use of a statistical data classification that has not been reviewed and approved by OMB.
The information collected is for internal use for program planning, management and evaluation and is not intended as general statistical information for public dissemination.
(g) That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use.
There is no confidentiality pledge to respondents that will hamper sharing data with program managers involved in this information collection.
(h) Requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.
No proprietary, trade secret, or other confidential information will be requested from respondents in this collection.
8. Federal Register Notice:
a. Provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
A 60-day Federal Register Notice inviting public comments was published on January 19, 2007, Vol. 72, No. 12, PP. 2537-2538. No comments were received.
b. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
The USFA’s National Fire Academy is in the first year of a five-year contract with Synthesis, Inc., an 8(a) contractor, to provide on-site evaluation services. As such, contracted personnel provide expert assistance in the formulation of questionnaires and data collection instruments that the Academy and other divisions within the USFA use to evaluate their training and education programs. Evaluation Center staff also develop the reporting formats used to display the results of the data analysis and routinely prepare reports in conjunction with individual training courses.
c. Describe consultations with representatives of those from whom information is to be obtained or those who must compile records. Consultation should occur at least once every three years, even if the collection of information activities is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
The NFA consults with its residential student population on a regular basis regarding course evaluations. The Superintendent holds a “Superintendent’s Lunch” and meets with class representatives to discuss issues concerning training development, course delivery and evaluation every delivery cycle, which occurs approximately once every two weeks. He also visits each class every delivery cycle, and provides an open forum for any issues the students and instructors want to discuss.
9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
There is no remuneration to respondents for their participation in this collection.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy. Provide details on:
The information to be collected complies with the Privacy Act of 1974. Students will be instructed on each form not to write their name or any other information that may reveal their identity.
a. Whether respondents are informed on the mandatory or voluntary nature of providing the information,
Although completion of the survey is encouraged, respondents are informed that providing the information is voluntary.
b. Opportunities to decline participation or to consent to particular uses of the information, and
Respondents may decline participation simply by not completing and submitting the questionnaire.
c. How can respondents grant such consent?
Since respondents are informed that participation in this survey is voluntary, they grant consent by choosing to complete and submit the questionnaire.
d. State any administrative and/or technological control to secure the information.
The information collected from the survey instruments is entered into a secured database.
The system is protected by multiple layers of physical and electronic security under DHS control. The database is password-protected and can only be accessed by a select number of authorized DHS staff and contractors on a need-to-know basis. Data are not released to external requestors without the appropriate approval. The online web-based form is part of the USFA web farm application, has received the necessary certification and accreditation and privacy impact assessment, and has the authority to operate.
e. Will data findings be analyzed and reported in a way that protects respondents’ anonymity?
Data will be analyzed and reported in an aggregate format to prevent the disclosure of any individual respondent’s information. Demographic data elements are used exclusively for analytical purposes.
f. For electronic information collections (web-based): In addition to the above information, provide a detailed description of the use of any agency-authorized tracking of respondents (due to a compelling need), and whether there is an intent to identify individual respondents in conjunction with other data elements (i.e., gender, race, age, geography, and other descriptors).
Individual responses will not be used in conjunction with demographic data elements. Collective/summary demographic data for courses and/or broad curriculum areas will be used to analyze fire service training needs.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
There are no questions of a sensitive nature.
12. Provide estimates of the hour burden of the collection of information. The statement should:
a. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desired. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
NFA used to have two similar course evaluation forms: one for on-campus-only courses and the other for off-campus-only courses. Seeking a more efficient method of information collection and analysis, NFA decided to consolidate the two forms into one form to be used for both on-campus and off-campus courses. The off-campus-only course evaluation form (OMB Control Number 1660-0031) was discontinued on 11/30/2005 as part of the agency’s effort to have “one form for all” NFA traditional classroom-based courses. Thus the number of respondents (14,000) was estimated by summing the number of registrations for on-campus courses (9,000) and off-campus courses (5,000), based on FY 2005 NFA Admissions data. The hour burden estimate for both the paper and the electronic form is 15 minutes. The hour burden for the paper form was estimated from the agency’s observation of students completing the currently approved course evaluation form during previous years. The hour burden for the electronic form was estimated through informal usability tests of the web form system using the current version of the End-of-Course form.
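For reference, the burden figures in Table 1 below follow from a single calculation, using 15 minutes (0.25 hours) per response:

Off-campus: 5,000 respondents x 1 response x 0.25 hours = 1,250 hours
On-campus: 9,000 respondents x 1 response x 0.25 hours = 2,250 hours
Total: 1,250 + 2,250 = 3,500 hours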
Table 1. Annual Hour Burden

Project/Activity (Survey Form(s), Focus Group, etc.) | Number of Respondents (A) | Frequency of Responses (B) | Hour Burden per Response (hours) (C) | Annual Responses (D = A x B) | Total Annual Hour Burden (hours) (E = C x D)
FF 95-20: Off-campus students | 5,000 | 1 | .25 | 5,000 | 1,250
FF 95-20: On-campus students | 9,000 | 1 | .25 | 9,000 | 2,250
TOTAL | 14,000 | 1 | .25 | 14,000 | 3,500
b. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
c. Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. (The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead this cost should be included in Item 14.)
The annualized cost to respondents totals $65,800, with an average cost of $4.70 per respondent. Due to the diversity of participants’ occupations and geographic locations (e.g., firefighters, engineers, medical personnel, emergency managers, educators), the national median hourly rate for all occupations, per the Bureau of Labor Statistics, is used to estimate respondent cost.
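The totals in Table 2 below follow directly from the hour burden estimated in Item 12.a and the median hourly rate:

Cost per respondent: 0.25 hours x $18.80 per hour = $4.70
Annualized cost: 14,000 respondents x $4.70 = $65,800 (equivalently, 3,500 hours x $18.80 per hour = $65,800)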
Table 2. Annual Cost to Respondents for the Hour Burden

Program | Total Annual Hour Burden (hours) | Median Hourly Rate ($) | Average Cost per Respondent ($) | Annualized Cost to Respondents ($)
FF 95-20 | 3,500 | $18.80 (1) | $4.70 | $65,800
Total | 3,500 | | | $65,800
(1) Median hourly rate for Firefighter, Bureau of Labor Statistics, 2005.
13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14.)
There is no start-up, maintenance or operational cost to respondents involved in this collection.
14. Provide estimates of annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing and support staff), and any other expense that would have been incurred without this collection of information.
Table 3. Annual Cost to the Federal Government

Item | Cost ($)
Contract Costs (include survey development, implementation, analysis, and reporting) | 100,000.00
Staff Salaries | 9,000.00
Printing | 4,000.00
IT Maintenance | 15,000.00
Total | 128,000.00
The following provides details about the approximate cost to the Government of this data collection:
Contract costs/operational costs are estimated at $100,000.00 per year. They include staffing, maintaining the electronic database, and report generation. The Academy's Evaluation Center is staffed with three full-time contractor staff who oversee database management and report generation under the guidance of the National Fire Academy Project Officer.
Staff salaries are estimated at $9,000.00 per year. They include approximately 1 hour per day of the Project Officer’s time calculated at the GS-12 level.
Supplies and forms are estimated to cost about $4,000.00 per year.
IT maintenance for the online web farm application is estimated at $15,000.00 per year.
15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I in a narrative form. Present the itemized changes in hour burden and cost burden according to program changes or adjustments in Table 5. Denote a program increase as a positive number, and a program decrease as a negative number.
Definitions
Program changes should not be confused with adjustments.
i) Program change
A "Program increase" is an additional burden resulting from an action or directive of a branch of the Federal government (e.g., an increase in sample size or coverage, amount of information, reporting frequency, or expanded use of an existing form). This also includes previously in-use and unapproved information collections discovered during the ICB process, or during the fiscal year, which will be in use during the next fiscal year.
A "Program decrease", is a reduction in burden because of: (1) the discontinuation of an information collection; or (2) a change in an existing information collection by a Federal agency (e.g., the use of sampling (or smaller samples), a decrease in the amount of information requested (fewer questions), or a decrease in reporting frequency).
ii) An "Adjustment" denotes a change in burden hours due to factors over which the government has no control, such as population growth, or in factors which do not affect what information the government collects or how (e.g., changes in the methods used to estimate burden or correction of errors in burden estimates).
Table 4. Itemized Changes in Hour Burden

Information Collection Activity/Instrument | Current Hour Burden (hours) | Proposed Hour Burden (hours) | Program Changes | Adjustment
FF 95-20: Off-campus Course Evaluation | 0 | 1,250 | +1,250 | 0
FF 95-20: On-campus Course Evaluation | 1,450 | 2,250 | 0 | +800
Total | 1,450 | 3,500 | +1,250 | +800
This information collection retains the statistical methodology and target population characteristics approved for the same information collection, 1660-0032, in the current OMB inventory. However, since OMB Collection 1660-0031 was discontinued and merged into this collection (see A.1, A.4, and A.12 for details), the hour burden increases by 1,250 hours as a program change, attributable to the off-campus students who previously responded under OMB 1660-0031. The adjustment increase of 800 hours reflects students in the Academy’s State Weekend Program, which is actually an on-campus program but previously had been counted among the off-campus respondents.
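The figures in Table 4 above reconcile as follows:

Program change (off-campus): 5,000 respondents x 0.25 hours = 1,250 hours, all new to this collection because that burden was previously reported under OMB 1660-0031.
Adjustment (on-campus): 2,250 proposed hours - 1,450 currently approved hours = +800 hours, reflecting State Weekend Program students now counted as on-campus respondents.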
There is no change in cost burden from the currently approved IC.
16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
There is no plan to publish the results of the data collection at this time.
17. If seeking approval not to display the expiration date for OMB approval of the information collection, explain reasons that display would be inappropriate.
A valid OMB Control Number and expiration date will be displayed on the survey forms.
18. Explain each exception to the certification statement identified in Item 19 “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.
No exceptions referenced above are sought for this collection.
Attachments
Exhibit #1: NFA EOC Sample Report
Exhibit #2: SPSS Missing Values Test Narrative
Exhibit #3: Screen Shots of the Online Electronic Form
Exhibit #4: Public Law 93-498