
FEMA Public Assistance Program Evaluation and Customer Satisfaction Survey

OMB: 1660-0107


Date: April 18, 2011


Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660 - 0107


Title: FEMA Public Assistance Program Customer Satisfaction Survey



B. Collections of Information Employing Statistical Methods.



When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:


1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


All assistance recipients comprise the universe of this collection, and all of them are included in this survey; every primary unit in the sampling universe is contacted. The total potential universe of grantees and sub-grantees varies from year to year with the number of disasters and is particularly difficult to predict. The number of recipients in 2006, for example, was unusually high at 9,018 because of Hurricane Katrina. Our universe of selection is all recipients during the upcoming years of this collection. Therefore, the proposed number of applicants from whom we are asking to collect data reflects what we expect in a typical year, based on a modest return rate.



The target population is composed of those that applied for and received Public Assistance funds as a result of a federally declared disaster. The sampling frame is the list of all the public units that applied for and received FEMA Public Assistance and is derived from EMMIE. We are asking for approval to survey 3,360 of them and are making efforts to get a response from all applicants. FEMA contacts the representatives of all Business or other for-profit, Not-for-profit institutions, Federal Government, and State, Local or Tribal Government institutions in the sampling universe.


The purpose of the Public Assistance Program Customer Satisfaction Survey is to assess customer satisfaction with different processes and human performance aspects of FEMA’s Public Assistance Program in order to improve or maintain the quality of service offered to these entities.


The disaster survey response rate is calculated using the following formula:

Response Rate = Number of Completed Surveys Returned ÷ (Number of Applicants – Number of Undeliverable Surveys)
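
For illustration only, a minimal sketch of this calculation in Python (the figures in the example are hypothetical):

  def response_rate(completed: int, applicants: int, undeliverable: int) -> float:
      # Completed surveys returned, divided by deliverable surveys
      # (applicants minus undeliverable surveys).
      deliverable = applicants - undeliverable
      if deliverable <= 0:
          raise ValueError("No deliverable surveys for this disaster")
      return completed / deliverable

  # Hypothetical example: 140 completed, 200 applicants, 25 undeliverable
  print(f"{response_rate(140, 200, 25):.0%}")  # -> 80%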


Historical Response Rates for This Collection

This is a new information collection request. Upon our next request in three years we will provide detailed descriptions of the response rates from this collection.


This collection of information is necessary to enable the Agency to garner customer and stakeholder feedback in an efficient, timely manner, in accordance with our commitment to improving service delivery. The information collected from our customers and stakeholders will help ensure that users have an effective, efficient, and satisfying experience with the Agency’s programs. This feedback will provide insights into customer or stakeholder perceptions, experiences and expectations, provide an early warning of issues with service, or focus attention on areas where communication, training or changes in operations might improve delivery of products or services. These collections will allow for ongoing, collaborative and actionable communications between the Agency and its customers and stakeholders. It will also allow feedback to contribute directly to the improvement of program management.


Measures:

Frequencies of responses are converted to percentages of survey respondents selecting one of the affirmative response choices (e.g., very satisfied). No complex analytical techniques or scaling methods are used to compile the survey responses received. These percentages are then used to compute the percent customer satisfaction for each performance measure by summing the percentages of respondents selecting “very satisfied,” “satisfied,” and “slightly satisfied.” Twenty-six performance measures are grouped into six performance standards, and the percent customer satisfaction for the measures within each standard is averaged to compute the average percent customer satisfaction with that performance standard. The average percent satisfactions for the six performance standards are then averaged to compute the overall average customer satisfaction with FEMA’s assistance for the particular disaster.
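
A minimal sketch of this roll-up in Python, using hypothetical standard names and percentages:

  from statistics import mean

  # Percent customer satisfaction for each performance measure,
  # grouped under its performance standard (values are hypothetical).
  standards = {
      "Standard A": [82.0, 75.5, 90.1],
      "Standard B": [68.4, 71.0],
  }

  # Average the measure percentages within each standard ...
  standard_satisfaction = {name: mean(values) for name, values in standards.items()}

  # ... then average the standards for the overall satisfaction for the disaster.
  overall_satisfaction = mean(standard_satisfaction.values())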


The percent customer satisfaction with each performance measure and standard is tabulated in individual disaster survey reports and compared with the targets established for the measures and standards. The supporting survey response choice frequencies, percents, and cumulative percents for each performance measure are tabulated in the state report addendum, along with written comments in response to survey requests for additional information. Disasters with fewer than five survey respondents do not have a report addendum prepared, and the survey responses for those disasters are not aggregated with the other disasters for the annual report.


After all state disaster surveys have been compiled and reported, FEMA will prepare an annual survey report, which aggregates the individual disaster survey data collected during the year for those disasters with five or more responses. Each disaster is weighted equally, regardless of the number of responses received, when aggregating the survey data. The computed average percent customer satisfaction for each measure and standard therefore represents an average of that recorded for each disaster surveyed that year. Statistical analyses consist of simple frequencies and percentages by response choice and computing average customer satisfaction for each performance measure and standard. The aggregated annual survey data will undergo comparative analysis by evaluating average percent customer satisfaction among different categories, against performance targets, and against the satisfaction levels for the previous year as follows:


  • Disaster Types (e.g., hurricanes, tornados, floods, wildfires, winter storms),

  • Disaster Size: defined by total obligated dollars, with Small being less than $10M, Medium between $10M and $25M, and Large greater than $25M (see the sketch following this list),

  • Respondent Types, i.e., grantee versus sub-grantee, and

  • Project Size: as defined by the statutory dollar value for small versus large projects.
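
A minimal sketch of the disaster-size grouping described above; how the exact $10M and $25M boundaries are assigned is an assumption:

  def disaster_size(total_obligated_dollars: float) -> str:
      # Categorize a disaster by total obligated dollars; treatment of the
      # exact $10M and $25M boundaries is assumed, not stated in the text.
      if total_obligated_dollars < 10_000_000:
          return "Small"
      if total_obligated_dollars <= 25_000_000:
          return "Medium"
      return "Large"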


Both the state and regional disaster survey reports and the annual reports will be compiled for use when implementing the program for future disasters. The reporting schedule is based on the surveyed disaster declaration date. FEMA contacts the participant by phone to conduct the survey, or emails, mails, or faxes the web survey link or paper/fillable form approximately 180-270 days after the declaration date. This time lag is necessary to allow applicants time to complete all the steps necessary to apply for and obtain funds from FEMA, since the survey asks for feedback on all these processes.
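
As a minimal sketch, the contact window can be computed from the declaration date as follows (the declaration date in the example is hypothetical):

  from datetime import date, timedelta

  def survey_contact_window(declaration_date: date):
      # Approximate contact window: 180-270 days after the declaration date.
      return (declaration_date + timedelta(days=180),
              declaration_date + timedelta(days=270))

  earliest, latest = survey_contact_window(date(2011, 4, 18))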


If the web survey or paper/fillable form is used, the applicant is allowed a maximum of 60 days to respond to the survey, or applicants can opt to respond via phone. Responses are gathered in a database survey tool. The responses are compiled and summarized, and simple frequencies and percents are tabulated in the report addendum and summarized in a report. Once the data from the surveyed disasters are complete, the reports will be distributed to the Program Office and Regions for review. The reports will contain an exhibit comparing the overall customer satisfaction for the specific disaster with satisfaction rates for the other disasters surveyed or with the performance goal.


After the data from all disasters surveyed within the year are compiled, the Annual Report will be prepared and reviewed. The Annual Report is then tabulated and given to FEMA HQ and the Regions for approval.



Focus Group:

The focus group format is an open-forum discussion with a mix of state grantee and local subgrantee respondents. Discussions may be recorded.


Respondents to participate in the focus groups will be selected according to the criteria listed below. We expect to do this in the form of 4 focus groups, one in each region, of approximately 20 people each. FEMA staff will identify four regions in which to hold focus groups with stakeholders. They will then contact FEMA Regional staff to inform them of the focus groups, and ask them to recommend a location and work with nearby states to identify Public Assistance Program applicants and others whom they feel could describe the impact of the Public Assistance Program in their community. The Regional staff and states will then assemble lists of potential attendees who meet the following criteria:


  • A mix of state grantees and local subgrantees;

  • A mix in the type of subgrantee organizations;

  • A mix in the functional roles of people within their grantee/subgrantee organization;

  • A mix in the type of disaster;

  • A mix in the project sizes of participants; and

  • To the degree appropriate, a mix of applicants who wrote their own PWs and who did not write their own PWs.



2. Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,


The sampling frame is the entire EMMIE list of grantees and sub-grantees. The survey is sent to all the primary units in the sampling frame for each disaster. No stratification has been adopted in the sampling. We hope to reach all but expect that not all will return their questionnaires.


The state grantee in the case of large-scale disasters typically is represented by as many as six primary grantee staff—the State Director, the Governor’s Authorized Representative (GAR), the alternate GAR, the State Public Assistance Officer (PAO), the deputy State PAO, and the State Coordinating Officer (SCO). Smaller disasters (defined by dollar amount) may have one staff member fill these different roles; larger disasters may have up to six different staff filling these roles. Depending upon the individual state organizations, one individual often serves in multiple grantee roles and usually fills out only one survey. Sub-grantees are typically State, local, or tribal governments and may also include private nonprofit organizations that received Public Assistance Program funds. The sub-grantee list consists of one contact person per organization that submitted a Request for Public Assistance (RPA) to FEMA and received funds through the Public Assistance Program.


  • Estimation procedure:


Satisfaction with each performance measure is calculated by combining the positive response choices (very satisfied, satisfied, and slightly satisfied) to a question. Responses that are not included when calculating satisfaction—such as do not know, the various not applicable response choices, and missing responses—are removed from the universe of affirmative responses for that question. The formula for calculating percent satisfaction for 24 of the 26 performance measure items is as follows:


Percent Satisfaction = (Very Satisfied + Satisfied + Slightly Satisfied) ÷ (Total Respondents – (Do Not Know + All Not Applicable Responses + Not Answered/Bad Response))


In the case of the other two items—‘was staff turnover a problem’ and ‘were site visits performed in a timely manner’—the response choice options differ from those described above. The response choices for whether staff turnover was a problem are yes or no; in this case, the positive response choice is no—i.e., staff turnover was not a problem. The response choice options for the measure concerning the timing of Project Worksheet site visits are too soon after the disaster, at the right time, or too late to be helpful; the positive response choice for this measure is at the right time. In these two cases, the formula for calculating the percent satisfaction is:

Percent Satisfaction = Positive Response Choice ÷ (Total Responses – (Do Not Know + Not Applicable + Not Answered/Bad Response))
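
A minimal sketch of this calculation for a single performance measure in Python; the response-choice labels and counts are hypothetical:

  POSITIVE = {"very satisfied", "satisfied", "slightly satisfied"}
  EXCLUDED = {"do not know", "not applicable", "not answered/bad response"}

  def percent_satisfaction(response_counts: dict) -> float:
      # Positive responses divided by total responses, after removing
      # do-not-know, not-applicable, and not-answered/bad responses.
      total = sum(response_counts.values())
      excluded = sum(n for choice, n in response_counts.items() if choice in EXCLUDED)
      positive = sum(n for choice, n in response_counts.items() if choice in POSITIVE)
      denominator = total - excluded
      return 100.0 * positive / denominator if denominator else 0.0

  # Hypothetical counts for one survey question
  counts = {"very satisfied": 40, "satisfied": 30, "slightly dissatisfied": 10,
            "do not know": 5, "not answered/bad response": 5}
  print(percent_satisfaction(counts))  # 70 positive / 80 valid = 87.5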


Each affirmative survey response is given equal weight when calculating percent customer satisfaction for a given disaster, regardless of whether it comes from a grantee or a sub-grantee applicant. This disaster-specific customer satisfaction rate is presented in the disaster survey report prepared for each disaster.


For the annual report covering multiple disasters surveyed within a given year, each disaster with five or more responses is given equal weight regardless of the number of respondents for the surveyed disaster; the individual survey responses within a disaster are therefore weighted in proportion to the number of responses within that disaster so that each disaster carries equal weight. The percent customer satisfaction recorded in the annual report, therefore, represents an average over the disasters surveyed during that year.
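
A minimal sketch of this equal-weight aggregation in Python, using hypothetical disaster numbers and per-disaster satisfaction percentages:

  from statistics import mean

  # Percent satisfaction already computed per disaster (five or more responses each).
  per_disaster_satisfaction = {"DR-0001": 78.2, "DR-0002": 85.0, "DR-0003": 69.5}

  # Each disaster carries equal weight in the annual figure,
  # regardless of how many respondents it had.
  annual_satisfaction = mean(per_disaster_satisfaction.values())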


  • Degree of accuracy needed for the purpose described in the justification,


Surveying the entire universe of Public Assistance fund recipients, including both grantees and subgrantees, yields the highest degree of precision, accuracy, representativeness, reproducibility, and completeness of the survey sample. Steps have been taken to reduce sampling error by surveying as much of the population of grantee and subgrantee applicants as we can reach and by increasing the number of responses returned through follow-up attempts and reminders. We also attempt to reduce non-sampling error caused by participant non-response. Strategies have been adopted to maximize the response rate so that the results reported are representative of the entire population of grant applicants for any given disaster (e.g., follow-up survey phone calls, emails, mailings, or follow-up faxes for disasters with small populations).


  • Unusual problems requiring specialized sampling procedures, and


We do not anticipate any unusual problems with hard-to-reach populations other than those whose phone numbers, email addresses, or mailing addresses are not up to date in the EMMIE system. The list of phone numbers, email addresses, and mailing addresses comes from that system and reflects the office locations where assistance was recently awarded. This contact information is generally accurate in the EMMIE database because the applicants recently requested funds using a correct phone number or address.


We do not provide an incentive for the respondents to return our questionnaire and, therefore, use follow-up in the form of phone calls, re-emailing, re-faxing, or re-mailing the web survey link or the fillable form with a reminder as a way of reaching all respondents and encouraging them to return the questionnaire. Each respondent is an applicant that has received assistance, and the incentive for completing the questionnaire is the opportunity to express their opinion and evaluate their satisfaction or dissatisfaction with their recent service.


  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The survey tool does not require periodic (less frequent than annual) data collection cycles to reduce burden.



3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

Form/Questionnaire: To maximize response rates, we will do the following:

* Ensure a short time lag between the disaster and data collection

* Conduct follow-ups

* Provide a Public Assistance Survey Hotline to help with questions and increase the likelihood of a returned response

Surveys are expected to be conducted with all Public Assistance fund recipients approximately 180-270 days from the disaster declaration date, with appropriate follow-up activities performed to achieve a survey response rate goal of approximately 80% per disaster surveyed. The response rate of 80% is a goal; mail and on-line survey tools, however, typically achieve a response rate of between 5% and 30%. Improved timeliness in providing the survey questionnaire, more frequent and timely follow-ups on the survey, and prompt responses to respondent questions received via the dedicated Public Assistance survey hotline and email address will be used to increase the survey response rates.


It is expected that these measures will help to maintain response rates sufficiently high for analysis, but in the event of response rates falling below 80%, a non-response analysis will be performed on the group(s) in question. These analyses will be conducted using the “SPSS Analysis of Missing Data” module of the general SPSS software package or a similar analysis process, and the findings of the analysis will be addressed accordingly.
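
As one minimal sketch of such a “similar analysis process” (not the SPSS module itself), response rates could be compared across respondent groups to see whether non-respondents differ systematically; the groups and figures below are hypothetical:

  from collections import Counter

  # Hypothetical sampling frame: (respondent type, whether a survey was returned)
  frame = [("grantee", True), ("sub-grantee", False), ("sub-grantee", True),
           ("sub-grantee", False), ("grantee", True)]

  totals = Counter(kind for kind, _ in frame)
  returned = Counter(kind for kind, responded in frame if responded)

  for kind, total in totals.items():
      print(f"{kind}: {returned[kind] / total:.0%} response rate")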



4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

Pilot Test

At the beginning of each collection period, a pilot test may be conducted on no more than 10 persons to discover any potential problems with the survey instrument or process. For quality assurance purposes, data from the pilot is reviewed and improvements are made to the survey process as deemed necessary.



5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



Customer Satisfaction Analysis Section, Texas National Processing Service Center

Kathy Canaday at (940) 891-8856

Maggie Billing at (940) 891-8709


FEMA-Information Resources Management Branch, IC-Records Management

Nicole Bouchet
Records Management Division
Office of Management
Federal Emergency Management Agency
Attention: OM-RM
500 C Street, SW
Washington, DC 20472
Office: (202) 646-2814
Fax: (202) 646-3347




