
March 24, 2017


Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660-0107


Title: Public Assistance Customer Satisfaction Surveys


Form Number(s):

FEMA Form 519-0-32, Public Assistance Initial Survey (Telephone);

FEMA Form 519-0-33, Public Assistance Initial Survey (Internet);

FEMA Form 519-0-34, Public Assistance Assessment Survey (Telephone);

FEMA Form 519-0-35, Public Assistance Assessment Survey (Internet)


B. Collections of Information Employing Statistical Methods

When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:


1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


Part B #1. Annual Estimates of Universe of Completed Surveys


Respondent Type | Survey Form | Estimated Annual Universe (FY2013-2015) | Projected Monthly Universe | Projected Monthly Completes (based on ~70% response rate) | Projected Annual Completes (based on ~70% response rate)
Non-Profit Institutions (~10% of universe) | Public Assistance Initial Survey, FEMA Form 519-0-32 (Telephone) | 450 | 38 | 27 | 324
State, Local or Tribal Government (~90% of universe) | Public Assistance Initial Survey, FEMA Form 519-0-32 (Telephone) | 4,051 | 338 | 237 | 2,844
Non-Profit Institutions (~10% of universe) | Public Assistance Initial Survey, FEMA Form 519-0-33 (Internet) | 113 | 9 | 6 | 72
State, Local or Tribal Government (~90% of universe) | Public Assistance Initial Survey, FEMA Form 519-0-33 (Internet) | 1,012 | 84 | 59 | 708
Subtotal, PA Initial Survey | | 5,626 | 469 | 329 | 3,948
Non-Profit Institutions (~10% of universe) | Public Assistance Assessment Survey, FEMA Form 519-0-34 (Telephone) | 354 | 30 | 21 | 252
State, Local or Tribal Government (~90% of universe) | Public Assistance Assessment Survey, FEMA Form 519-0-34 (Telephone) | 3,182 | 265 | 186 | 2,232
Non-Profit Institutions (~10% of universe) | Public Assistance Assessment Survey, FEMA Form 519-0-35 (Internet) | 88 | 7 | 5 | 60
State, Local or Tribal Government (~90% of universe) | Public Assistance Assessment Survey, FEMA Form 519-0-35 (Internet) | 796 | 66 | 46 | 552
Subtotal, PA Assessment Survey | | 4,420 | 368 | 258 | 3,096
Grand Total | | | 837 | 587 | 7,044


Note. Calculations were completed from left to right in the table, and each result was rounded to the nearest whole number. There may be slight variances due to rounding, but these figures are intended as estimates only. Additionally, the whole universe is sampled each month, so the projections have no effect on the sampling methodology.


The universe of respondents will consist of organizations or entities that applied for Public Assistance (PA) following a presidentially declared emergency or major disaster. The universe consists of approximately 90% state, local or tribal governments and 10% eligible private non-profit organizations. More specifically, the PA Initial Survey universe will consist of applicants who are flagged as having had their first one-on-one meeting with FEMA (e.g., Recovery Scope Meeting) and may contain applicants that are either eligible or ineligible for disaster assistance. Ineligible applicants were not included as potential respondents in the previous PA collection.


The PA Assessment Survey universe will consist of applicants who are flagged as having funds obligated for payment by FEMA, which is the same universe as the previous PA collection. Each universe will be updated on a monthly basis according to the applicant's recent interaction with FEMA. Applicants in the PA Assessment Survey universe will most likely be surveyed at two different time points during the PA process.


The questions in the PA Initial and PA Assessment surveys are different. Each survey will be mixed-mode, which incorporates both internet and phone administration.


In order to estimate the universe size for the PA Initial Survey, we examined the average number of PA applicants per year (eligible or ineligible) for the three-year period FY 2013-2015. There were approximately 5,626 applicants on an annual basis. Dividing that total by 12 months gives an average of 469 applicants per month. The average response rate across FY 2013-2015 was approximately 70%. This suggests that about 70% of the universe will complete the PA Initial Survey each month, resulting in approximately 329 survey completions per month, or 3,948 completions annually.


For the PA Assessment Survey, there were approximately 4,420 eligible applicants on average (out of 5,626 total applicants) for the three-year period FY 2013-2015. Dividing that total by 12 months gives an average of 368 eligible applicants per month. The average response rate across FY 2013-2015 was approximately 70%. This suggests that about 70% of the universe will complete the PA Assessment Survey each month, resulting in approximately 258 survey completions per month, or 3,096 completions annually.


In sum, we estimate 3,948 completions annually for the PA Initial Survey (329 completes x 12 months = 3,948) and 3,096 completions annually for the PA Assessment Survey (258 completes x 12 months = 3,096), for a total of 7,044 completions. Qualitative interviews (e.g., focus groups, one-on-one interviews and small group interviews) will not be subject to statistical sampling methods (they will usually be based on a convenience sample) or statistical analysis.
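As an illustration only, the projection arithmetic above can be reproduced with the short calculation sketched below. This sketch is not part of the collection instrument; the function name, the use of Python, and the rounding convention are assumptions, and results may differ from the table by a few completions because each step is rounded.

    # Illustrative sketch (assumption, not part of the collection): reproduces the
    # projection arithmetic using the FY2013-2015 average annual universes and the
    # ~70% historical response rate. Rounding at each step can shift results slightly
    # relative to the table above.
    def project_completions(annual_universe, response_rate=0.70, months=12):
        """Return (monthly universe, monthly completes, annual completes)."""
        monthly_universe = round(annual_universe / months)
        monthly_completes = round(monthly_universe * response_rate)
        annual_completes = monthly_completes * months
        return monthly_universe, monthly_completes, annual_completes

    # PA Initial Survey: 5,626 applicants per year on average
    print(project_completions(5626))   # (469, 328, 3936); table reports 469, 329, 3,948
    # PA Assessment Survey: 4,420 eligible applicants per year on average
    print(project_completions(4420))   # (368, 258, 3096); matches the table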


These numbers are rough approximations; some months will be more heavily affected by disasters than others. There is no statistical sampling method for the PA Information Collection because we will be surveying the entire available universe each month.


2. Describe the procedures for the collection of information including:


-Statistical methodology for stratification and sample selection:


The sampling frame is the universe of applicants for Public Assistance, which consists of state, local and tribal governments and eligible private non-profit organizations. The sample is generated from data extracted from FEMA's Enterprise Data Warehouse (EDW), sourced from the Emergency Management Mission Integrated Environment (EMMIE) electronic database or a similar program, which contains points of contact names, phone numbers, email addresses, mailing addresses and disaster-related information.


-Estimation procedure:


This collection is based on a universe of applicants for Public Assistance; therefore, no estimation procedure is utilized.


-Degree of accuracy needed for the purpose described in the justification:


Surveying the entire universe of Public Assistance applicants, including state, local and tribal governments and eligible private non-profit organizations, yields the highest degree of precision and confidence. The monthly universe is typically small, usually only a couple hundred applicants, so surveying the entire universe ensures we have enough survey completions to draw reliable and valid conclusions about the population. Additionally, the PA Initial Survey will now include applicants who may receive an ineligible determination, in order to gain a more complete representation of customer satisfaction.

-Unusual problems requiring specialized sampling procedures:


We do not anticipate unusual problems or hard-to-reach populations, other than applicants whose phone numbers or email addresses are not up to date in the EMMIE database system, which provides the sampling list of phone numbers and email addresses. For applicants who have recently applied for Public Assistance, contact information in the EMMIE database is generally current and accurate.




-Any use of periodic (less frequent than annual) data collection cycles to reduce burden:


Data will be collected on a monthly basis. Applicants will be imported into the survey sample if they have had a one-on-one meeting with FEMA (e.g., Recovery Scope Meeting) or if they have received an obligation of disaster funds from FEMA. Currently we survey six to nine months following the disaster declaration, and applicants have a difficult time evaluating services and procedures that occurred early in the Public Assistance process. This is why we are transitioning to surveying at two time points in the Public Assistance process for the revised information collection.


Surveying less frequently than annually would further increase recall errors and decrease the reliability and validity of the data gathered.

3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


The average response rate for Public Assistance Surveys completed in FY 2013-2015 was 70%.


There is no financial incentive for respondents to answer the survey questions. The incentive is the opportunity for applicants to express their opinions and evaluate their satisfaction with the service they recently received. We have found this to be an effective incentive, as evidenced by our approximately 70% response rate, which is very high compared to industry averages. Additionally, previous survey respondents have requested web survey forms, which we are now making available to reduce respondent burden and further maximize response rates.


All applicants who have an email address on file will receive an email invitation with a web link to complete the survey. Reminder emails will be sent if the applicant does not complete the web survey within a certain time period. Phone interviewers will contact applicants who do not have an email address on file, as well as applicants who do not complete the web survey after a certain time period has elapsed. We are attempting to reduce non-sampling error, caused by non-response or bad contact data, by contacting applicants through mixed administration modes (internet and phone). Additionally, if applicants wish to contact FEMA and schedule a more convenient time to complete the survey, we now have a toll-free 800 number with an answering machine dedicated solely to Public Assistance Surveys.


The goal is to contact the entire available universe on a monthly basis for surveying, so no sampling method is used. Everyone will be contacted to ensure we collect enough information to make generalizations about the population. Data accuracy will increase by minimizing the elapsed time between the services received by the respondent and survey administration. If a month has low disaster activity and there is not enough survey data available to draw valid conclusions, a disclaimer about the small sample size and lower precision will be included at the beginning of any reports.


Below are survey efforts that will be performed regularly in order to maintain high response rates.

  • Phone surveys will be scheduled during normal business hours. Hours may be changed depending on disaster activity and the time zone of the respondents being surveyed.

  • Follow-ups or reminders in the form of emails or phone calls will be used to encourage participation.

  • Callbacks will be made to applicants who request a more convenient time or day to take the survey.

  • The opening statement will explain the purpose of the survey and that participation is voluntary.

  • Multiple attempts will be made to contact each applicant. Those who receive the web survey will be sent approximately two email reminders within roughly a two-week period. If the survey is not completed within a certain time frame, the applicant's data may be placed in the phone queue.

  • The questions are straightforward, short, and easy to answer. We have also incorporated numeric scales in the current collection to reduce respondent burden, since listening to and remembering long lists of verbal response options can be tedious.

  • Applicants will be told their survey responses will in no way affect the outcome of their application for FEMA Public Assistance.

  • On-going training will be provided to interviewers.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

At the beginning of each collection period, a pilot test may be conducted with fewer than 10 persons to discover any potential problems with the survey instrument or process. For quality assurance purposes, data from the pilot will be reviewed and improvements made to the survey process as deemed necessary. Read-aloud testing was conducted with multiple interviewers for time trials, as well as to gain feedback on awkward wording or phrasing. FEMA staff with varying levels of Public Assistance experience were asked to read the survey to assess question clarity.

5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Kristin Brooks

Statistician

Customer Survey and Analysis Section

Reports and Analytics Division, FEMA

[email protected]

Office: (940) 891-8579; Alt phone: (310) 569-3347




