Supporting Statement for Paperwork Reduction Act
7/14/10
OMB Control Number: 1660-0032
Title: U.S. Fire Administration’s National Fire Academy Evaluation Collection
Form Number(s): FEMA Form 064-0-4, FEMA Form 064-0-5, FEMA Form 064-0-10.
B. Collections of Information Employing Statistical Methods.
When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:
1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.
Since the NFA Distance Learning Course Evaluation (FF 064-0-4) and the USFA Conference/Symposium Evaluation (FF 064-0-10) are new data collection requests, the total population universe will serve as the potential respondent universe: an estimated 40,000 respondents for the Distance Learning evaluation (based on FY 2009 NFA Online completion data) and 600 for the conference/symposium evaluation (based on FY 2009 conference/symposium attendance records).
The NFA End of Course Evaluation (FF 064-0-5) is an established collection whose population universe has not changed; it has a historical response rate in the 95-99 percent range. Although no historical data exist for the two new collections, we expect similarly high response rates for the NFA Distance Learning Course Evaluation and the USFA Conference/Symposium Evaluation.
Form / FEMA Form Number | Universe: Num of Entities [Description] | Universe: Size | Sample: Num of Entities [Description] | Sample: Size
NFA Distance Learning Course Evaluation Form / FEMA Form 064-0-4 | 1 [Individual - student] | 40,000 | 1 [Individual] | 40,000
NFA End of Course Evaluation Form / FEMA Form 064-0-5 (Formerly FEMA Form 95-20) | 1 [Individual - student] | 14,000 | 1 [Individual] | 14,000
USFA Conference/Symposium Form / FEMA Form 064-0-10 | 1 [Individual] | 600 | 1 [Individual] | 600
Total | 3 | 54,600 | 3 | 54,600
2. Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Each of the data collections is a census-style evaluation, so no sample selection is necessary.
Analysis Plan FF 064-0-5 End of Course Evaluation
The paper forms, once received in the Academy's Training Evaluation Center, will be optically scanned and the data processed using the Statistical Package for the Social Sciences (SPSS) software. Data from the online electronic form will be processed through an MS SQL 2000 database.
The NFA End of Course Evaluation Form is organized into four parts and collects two types of data. The first type is descriptive statistical data concerning respondent demographics, course materials, instructional delivery methods, and the overall training experience. The second type is narrative comments from students indicating suggested improvements for NFA training and the reasons for them, feedback about individual instructors, and the most and least beneficial aspects of the training. The data will be processed using the SPSS software (for the optically scanned forms) and through the MS SQL 2000 database (for the online web form). Microsoft Word reports (scanned forms) and Crystal Reports (online forms) will be compiled for review by Training/Instructional Systems Specialists and NFA managers.
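As an illustration only, the sketch below shows how descriptive statistics of this kind might be compiled once the evaluation records are in tabular form; the library choice (pandas) and all table and column names are assumptions for the example, not the actual EOC database schema or report layout.

```python
import pandas as pd

# Hypothetical extract of End of Course evaluation records; the column
# names are illustrative and do not reflect the actual FEMA Form 064-0-5 fields.
records = pd.DataFrame({
    "course_code":        ["R123", "R123", "R123", "R456"],
    "materials_rating":   [4, 5, 3, 4],   # 1-5 scale
    "delivery_rating":    [5, 5, 4, 3],   # 1-5 scale
    "environment_rating": [4, 4, 4, 5],   # 1-5 scale
})

# Descriptive statistics (mean, standard deviation, count) per course,
# comparable to the scale summaries compiled for reviewers.
summary = records.groupby("course_code").agg(["mean", "std", "count"])
print(summary)
```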
In addition to the standardized descriptive and narrative types of data provided in regular end-of-course reports to Training/Instructional Systems Specialists and NFA managers, special reports will be run from the end-of-course data for other internal and external audiences as required.
There are several ways in which the data from this data collection are used. For example,
(a) contract instructor faculty receive both item-specific and scaled assessment scores from student evaluations and are able to determine their overall effectiveness in teaching the NFA training course. Narrative data are also included in the NFA course evaluation reports and indicate student suggestions for course improvements as well as suggestions or comments students might have for course instructors,
(b) instructional development/training specialist staff receive the same data and use them to review and determine appropriate methodologies and delivery modes for NFA training,
(c) U.S. Fire Administration/NFA management staff use the EOC data to project overall fire service training needs and identify suggestions for future training courses.
By way of analysis, the first page of the report presents mean scores for scales capturing student assessments of several dimensions of the training experience (i.e., materials, environment, and instructional delivery). Narrative data from the report indicate suggested course revisions or updates and student feedback to instructors.
Additional uses of the data include the ability of instructors, training managers and NFA management and staff to synthesize comparative instructor ratings.
In the case of instructors, each NFA instructor can view the End of Course [MS SQL 2000] database to display his or her course (and/or curriculum) performance score relative to both (a) the overall average score for all instructors who have taught in the same course and/or curriculum, and (b) the overall average score for all NFA instructors teaching across all curriculum areas.
In the case of Training Evaluation staff, each NFA Training Specialist can also view and/or query the End of Course [MS SQL 2000] database to display instructor course (and/or curriculum) performance score(s), and again, with these in comparison to (a) all instructors who have taught in the same course and/or curriculum, and (b) all NFA instructors teaching across all curriculum areas.
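The comparisons described in the two paragraphs above amount to contrasting an instructor's average score with a course-level average and an academy-wide average. The sketch below illustrates that calculation under assumed data; the score table, column names, and values are hypothetical and do not represent the actual End of Course [MS SQL 2000] database.

```python
import pandas as pd

# Hypothetical instructor ratings; all names and values are illustrative only.
scores = pd.DataFrame({
    "instructor": ["A", "A", "B", "C", "C", "D"],
    "course":     ["R123", "R123", "R123", "R456", "R456", "R456"],
    "rating":     [4.6, 4.8, 4.2, 4.9, 4.7, 4.1],
})

instructor, course = "A", "R123"

own_avg = scores.loc[scores["instructor"] == instructor, "rating"].mean()
# (a) average for all instructors who have taught the same course
course_avg = scores.loc[scores["course"] == course, "rating"].mean()
# (b) average across all instructors and curriculum areas
overall_avg = scores["rating"].mean()

print(f"Instructor {instructor}: {own_avg:.2f} "
      f"(course average {course_avg:.2f}, overall average {overall_avg:.2f})")
```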
Analysis Plan FF 064-0-4 Distance Learning Evaluation
The NFA DL Evaluation Form is organized into three parts, which together request two types of data.
The first type of data supports general descriptive statistics such as respondent demographics, levels of course satisfaction, satisfaction with the training experience and online facilities, and satisfaction with NFA's servicing/support options. These data are necessary because they allow NFA to confirm empirically that it is reaching the audiences it has identified for training deliveries.
The second type of data is narrative in form and includes end-user comments indicating suggestions for course improvements. These data are necessary because narrative comments provide training specialists, instructional system specialists, and USFA/NFA managers with information useful in considering curriculum revision.
These data will be compiled into course-specific reports which will be distributed to training specialists, instructional system specialists and USFA/NFA managers to provide statistical overviews of student responses.
In particular, these statistical overviews will include the following percentage distributions (a brief computational sketch follows the list):
(1) Summaries of student background characteristics (i.e., percent distributions of previous NFA training experiences, department type, jurisdiction/population served and job responsibilities).
(2) The number and percent of students who judge that the training course/materials:
Increased their knowledge,
Were well organized and easy to navigate,
Met stated course objectives,
Provided useful ways to measure progress throughout the class,
Had a good fit between material and test questions,
Met training expectations,
Provided up-to-date links and information,
Were of good technical quality,
Will be a useful reference on the job,
Fit comfortably with the student's background in technology,
Helped improve their job performance,
Were consistent with their department's training expectations,
Were useful for a department the size of their own,
Included materials that helped their department's prevention efforts,
Will help reduce the fire-related risks within the student's community,
Provided information their department could use when either preparing for or responding to an all-hazards and/or terrorist event, and
Were generally worth recommending to others.
(3) The number and percent of students who indicated the need for the following suggested class improvements:
Leave alone: No improvement is needed,
Clarify course objectives,
Link test questions to course objectives,
Improve course navigation/add help menu(s),
Provide opportunity for access to instructor,
Include more “check your progress” features,
Include video links,
Expand content, course is too basic,
Limit the content, course is too detailed, and
Add online chat feature.
(4) The number and percent of students reporting their satisfaction with the system configuration and its available online tools, as well as their experience of the course materials and intent to use the continuing education credits.
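As a hedged illustration of how these counts and percentages could be derived, the sketch below tabulates one hypothetical improvement item; the response wording is borrowed from the list above only for the sake of the example, and pandas is an assumed tool rather than the system of record.

```python
import pandas as pd

# Hypothetical responses to a single "suggested class improvements" item.
responses = pd.Series([
    "No improvement is needed",
    "Clarify course objectives",
    "No improvement is needed",
    "Add online chat feature",
    "No improvement is needed",
])

counts = responses.value_counts()
percents = responses.value_counts(normalize=True).mul(100).round(1)

# Number and percent of students per response category.
distribution = pd.DataFrame({"n": counts, "percent": percents})
print(distribution)
```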
The second type of data provided by the DL reports prepared for training and instructional systems specialists and NFA managers is narrative data. Narrative data vary, but generally include comments that indicate:
Needed improvements in the NFA training;
Descriptions of incidents in which students have applied NFA training; and
Topics for future training classes.
Summaries of descriptive statistics and narrative data will be provided in regular Distance Learning reports to training and instructional systems specialists and NFA managers. Special reports will be run from Distance Learning data for other internal and external audiences as required.
Analysis Plan FEMA Form 064-0-10 USFA Conference/Symposium
The NFA Conference/Symposium Evaluation Form is organized into four parts, which together request two types of data.
The first type of data includes such general and descriptive statistics as respondent demographics; levels of satisfaction with conference/symposium format; satisfaction with opportunities for peer networking and participation; and the usefulness of presentations and workshops. These data are necessary because they provide NFA the opportunity to confirm empirically that it is meeting the needs of its stakeholders.
The second type of data is narrative in form and includes end-user comments indicating suggestions for conference/symposium/workshop improvements. These data are necessary because narrative comments provide training specialists, instructional system specialists, program managers, and USFA/NFA leadership with information necessary for considering curriculum and program revision and development initiatives.
These data will be compiled into conference/symposium specific reports which will be distributed to training specialists, instructional system specialists and USFA/NFA managers to provide statistical overviews of participant responses.
In particular, these statistical overviews will include the following percentage distributions:
(1) Summaries of participant background characteristics (i.e., percent distribution of State, national, local affiliations).
(2) The number and percent of participants who judge that the conference/symposium presentations/workshops and materials:
Increased participants' knowledge,
Were well organized,
Met participant expectations,
Provided up-to-date and relevant information,
Will be a useful reference on the job,
Will improve participants' job performance, and
Provided information participants will take back to their departments.
(3) Summaries of narrative data which indicate:
Needed improvements in the conference/symposium format,
Suggestions for sessions and workshops,
Suggestions for new and/or additional exhibit resources.
Estimation procedure,
Because each of these data collection forms is a simple course or conference/symposium evaluation used for instructor and/or program feedback, we expect the response rate to be high (over 80 percent). We do not anticipate needing diagnostic tests for missing data. The data represent a census of the population of students or symposium attendees, so no estimation procedures are anticipated; population parameters will be obtained directly.
In the event of missing data, however, SPSS diagnostic tests available through the SPSS Missing Values component will be employed. These diagnostic tests identify patterns of missing data through item-by-item analyses that test for differences between respondents and non-respondents. Thus, these diagnostics permit estimates of population parameters for all items in which missing values are recorded and give the user greater confidence in the accuracy of those items. SPSS software provides an extended summary of its programming capabilities for identifying the kinds of missing-data patterns that can occur, and Exhibit 1 summarizes the specific types of information provided by these tests if substantial missing data are encountered.
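For illustration only (the actual diagnostics would be run with the SPSS Missing Values component), the sketch below shows one item-by-item check of the kind described above: comparing cases that skipped an item with cases that answered it. The data, column names, and the specific test (a Welch t-test via scipy) are assumptions for the example.

```python
import pandas as pd
from scipy import stats

# Hypothetical evaluation items with some missing responses.
df = pd.DataFrame({
    "materials_rating": [4, 5, None, 4, 3, None, 5, 4],
    "delivery_rating":  [5, 4, 3,    5, 4, 2,    5, 3],
})

# Do respondents who skipped "materials_rating" differ from those who
# answered it on another item?  (Illustrative test only; SPSS Missing
# Values performs a fuller pattern analysis.)
missing = df["materials_rating"].isna()
answered = df.loc[~missing, "delivery_rating"]
skipped = df.loc[missing, "delivery_rating"]

t_stat, p_value = stats.ttest_ind(answered, skipped, equal_var=False)
print(f"Missing rate: {missing.mean():.0%}; t = {t_stat:.2f}, p = {p_value:.3f}")
```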
Degree of accuracy needed for the purpose described in the justification,
Degree of accuracy (margin of error) is not directly applicable to these information collections because no sampling is involved and response rates are expected to exceed 80 percent. Highly precise estimates of true scores are not needed because the findings will be used for administrative and program purposes only, not for statistical inference. In addition, the homogeneity of the target population and respondents' interest in the subject support satisfactory levels of validity and reliability, based on respondents' ability to provide useful and consistent information.
Unusual problems requiring specialized sampling procedures, and
There are no unusual problems requiring specialized sampling procedures.
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
There is no use of periodic data collection cycles to reduce burden.
3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
FF 064-0-5 End of Course Evaluation - In order to maximize response rates, clear instructions are provided to instructors or delivery personnel at the outset of the course delivery cycle and information collection stage. Written instructions are provided to students completing either the paper-based form or the online web-based form, with adequate time allowed for completion. The NFA has also established an online evaluation web page and a direct mailbox to assist respondents with questions.
FF 064-0-4 Distance Learning Evaluation - In order to maximize response rates, the NFA will place the sharable content object for the Distance Learning Course Evaluation Form into each NFAOnline course. Students will be encouraged to complete the evaluation form before they print their course completion certificate. In addition, system-generated email reminders will be sent through the learning management system if a response target of at least 50 percent is not initially met. These measures are expected to maintain response rates high enough to support analysis.
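A minimal sketch of the 50 percent reminder rule described above, under assumed counts; the figures are hypothetical and the actual reminder logic lives in the learning management system.

```python
# Hypothetical counts for one NFAOnline course offering.
completions = 1200          # students who completed the course
evaluations_received = 540  # Distance Learning evaluation forms returned

response_rate = evaluations_received / completions
TARGET = 0.50  # reminder threshold noted in this supporting statement

if response_rate < TARGET:
    # In practice the learning management system sends these reminders.
    print(f"Response rate {response_rate:.0%} is below the {TARGET:.0%} target; "
          "send system-generated email reminders.")
else:
    print(f"Response rate {response_rate:.0%} meets the {TARGET:.0%} target.")
```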
FF 064-0-10 USFA Conference/Symposium - In order to maximize response rates, forms will be distributed to conference/symposium participants immediately at the close of the presentation/workshop for a quick return. Clear instructions will be provided to participants by the conference/symposium program manager(s). Paper-based forms contain instructions for completion and will be collected as participants exit the final presentation/workshop.
4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
FF 064-0-5 End of Course Evaluation - A usability test was conducted for the online evaluation survey instrument to ensure that the web-based application was in working order for the currently approved NFA EOC evaluation form (Implementation of the Government Paperwork Elimination Act, Office of Management and Budget, PART I, Section 2.a.). This test was conducted over a period of 4 months (December 2005-March 2006), with specific attention to (1) the ease of navigation within the form, (2) the clarity of the form's items, (3) the length of completion time, and (4) the instructions for use. The test was conducted under the oversight of the Project Officer, the NFA Web Master, and trained staff from the NFA's Evaluation Center. Students volunteered assessments and comments and, where applicable (e.g., in navigation directions), this feedback was incorporated into the online version of the currently approved NFA EOC evaluation form.
FF 064-0-4 Distance Learning Evaluation - Lessons learned from previous usability testing for the NFA's Level I end-of-course form and its Level III long-term evaluation form have been incorporated into the development of this evaluation instrument. Additional usability tests were scheduled in early 2010 to ensure that the web-based application was in working order. The test was conducted over a period of approximately 6-9 months, with specific attention directed to (1) the ease of navigation within the form, (2) the clarity of form instructions, and (3) the clarity of the form's items. The test was conducted under the oversight of the Project Officer, the USFA Web Master, and trained staff from the NFA's Evaluation Center.
FF 064-0-10 USFA Conference/Symposium - Lessons learned from previous usability testing for the NFA's paper-based Level I end-of-course form have been incorporated into the development of this evaluation instrument. The test was conducted over a period of approximately 6-8 weeks, with specific attention directed to (1) the clarity of form instructions and (2) the clarity of the form's items. The test was conducted under the oversight of the Project Officer and trained staff from the NFA's Evaluation Center.
5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
From FEMA – U.S. Fire Administration/National Fire Academy
Person 1: Laura Chevalier (301) 447-1614
Training Administration Planning and Analysis Section, Admin. & Delivery Branch
National Fire Academy/U.S. Fire Administration
Federal Emergency Management Agency
16825 South Seton Avenue
Emmitsburg, MD 21727