June 23, 2014
Supporting Statement for
Paperwork Reduction Act Submissions
OMB Control Number: 1660 – 0129
Title: Federal Emergency Management Agency Individual Assistance Survivor Centric Customer Satisfaction Survey (formerly, Follow-Up Program Effectiveness & Recovery Survey)
B. Collections of Information Employing Statistical Methods.
When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:
If the collection does not involve statistical methodology please enter “THERE IS NO STATISTICAL METHODOLOGY INVOLVED IN THIS COLLECTION” and delete Q1 through 5.
1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.
2. Describe the procedures for the collection of information including:
-Statistical methodology for stratification and sample selection: This survey is time-limited and based on a target population of individuals and households who are disaster survivors seeking federal assistance after a declared disaster. Because the disaster recovery process covers a span of time, the goal is to measure and then report on satisfaction with services and assistance over that span, accumulating responses into a statistically valid result.
The sampling frames consist of the names of all disaster survivors in the target audience who have contacted FEMA for disaster assistance, some of whom may have been surveyed under a separate collection. Misclassification and eligibility confusion cannot occur because the frames are generated strictly from the definition of the target population; no element is excluded, and no alternative sample frame is used. The responses are aggregated to estimate the customers' satisfaction level for the target population.
For the Survivor Centric Customer Satisfaction Survey, two sampling processes may be utilized: a sample generated from respondents previously surveyed under separate collection 1660-0128, or a random sample from the entire target population drawn from the electronic data files in the National Emergency Management Information System (NEMIS) Individual Assistance (IA) Client, replicated to the Enterprise Data Warehouse (EDW). These files contain the names, phone numbers, and disaster-related information of all such survivors. The survey sample is imported into the Customer Satisfaction Analysis System (CSAS), where the survey is stored.
Survivor Centric Customer Satisfaction Survey (formerly, Follow-Up Program Effectiveness & Recovery Survey):
The Survivor Centric Customer Satisfaction Survey is conducted by phone approximately 1-24 months after the survivor has registered for assistance. The sample is drawn either from the registrants for assistance by disaster for the target audience or from the sample previously surveyed under collection 1660-0128. The survey may be repeated approximately six times a year.
-Estimation procedure: The sample size is calculated as the number of completed responses required to achieve the precision goal for the survey (see the illustrative sketch following this list).
-Degree of accuracy needed for the purpose described in the justification: Although extremely accurate statistical inference is not necessary for this information collection, the goal is to estimate customer satisfaction from a response volume sufficient for a 95% confidence level with a margin of error of plus or minus 5%, assuming a 50% response distribution.
-Unusual problems requiring specialized sampling procedures:
There are no unusual problems requiring specialized sampling procedures.
-Any use of periodic (less frequent than annual) data collection cycles to reduce burden: A periodic data collection cycle is not applicable to this particular type of information collection, since disaster occurrences are not predictable enough to schedule a collection cycle in advance.
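For reference, the precision parameters above imply the standard proportion sample size formula n = z^2 * p(1-p) / e^2. The following minimal Python sketch, offered as an illustration only, computes the roughly 385 completed responses those parameters imply and draws a simple random sample from a registrant frame. The record layout, function names, and the use of the 32.37% response rate (taken from Part B #3) as an oversampling factor are illustrative assumptions, not the actual NEMIS/EDW or CSAS interfaces.

    import math
    import random

    def required_completions(z=1.96, margin=0.05, p=0.5):
        """Completed surveys needed to estimate a proportion.

        z: z-score for the confidence level (1.96 for 95%).
        margin: margin of error (plus or minus 5% -> 0.05).
        p: assumed response distribution (0.5 is most conservative).
        """
        return math.ceil(z**2 * p * (1 - p) / margin**2)

    def draw_survey_sample(frame, sample_size, seed=None):
        """Draw a simple random sample of registrant records from the frame."""
        rng = random.Random(seed)
        return rng.sample(frame, min(sample_size, len(frame)))

    # Hypothetical frame of registrant records for one disaster.
    frame = [{"registrant_id": i, "phone": "000-000-0000"} for i in range(10000)]

    target = required_completions()                        # 385 completed responses
    # Oversample by the expected response rate (see Part B #3).
    sample = draw_survey_sample(frame, math.ceil(target / 0.3237), seed=1)
    print(target, len(sample))                             # 385 1190

In practice, the draw is performed against the NEMIS IA Client data replicated to the EDW, and the resulting sample is imported into CSAS as described above.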
3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
Extremely accurate statistical inference is not necessary for the intended use of this Information Collection. Results provide reliable customer satisfaction levels as well as information about areas that need improvement.
The estimated average response rate for this Information Collection (IC) is based on a similar survey of similar length under IC 1660-0128. The actual response rate for IC 1660-0128 for FY 2013 was 32.37%, higher than the range normally expected for phone surveys. (Research studies follow below.) Survey efforts to maximize and further increase the response rate are listed below.
The scheduling of the phone surveys covers a span of time between 9 a.m. and 8 p.m., typically Monday-Friday in the time zone of the respondent, with additional attempts made during different time frames.
Callbacks are made to survivors who state that another time within the survey period would be more convenient.
The interviewer explains how important the feedback is.
Multiple attempts are made to reach the survivor by phone each time the case systematically returns to the call queue during the survey period.
The opening statement briefly explains the purpose of the study and its voluntary nature, and asks for the survivor's help in improving FEMA's quality of service.
The questions are straightforward, short, and require little time to answer.
An explanation is given that the questions will in no way affect the outcome of the disaster survivor’s application for assistance.
Information gathered from focus groups will be used to ensure that the survey items included are of interest to disaster survivors, making respondents more likely to see the survey as relevant.
On-going training is provided to interviewers.
Interpreters are used to obtain results from survivors more comfortable with other languages.
The time frame for the survey is structured to be focused on specific topics of interest to the targeted audience and the service providers.
Note: Sending a pre-notification letter for the survey is not desirable because of the time constraints for this survey type and survey period. (See Part B #2 above.)
The response-rate formula used is recognized by the American Association for Public Opinion Research (AAPOR) as follows:
RR = I / {(I+P) + (R+NC+O) + U}, where
RR = Response rate
I = Complete interview
P = Partial interview
R = Refusal and break-off
NC = Non-contact
O = Other (bad/wrong numbers, technical phone problem, etc.)
U = Unknown eligibility (= 0 in this case, see B #2.)
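As a worked illustration of this formula, the following minimal Python sketch computes RR from a set of hypothetical disposition counts; the counts are invented for the example and are not actual survey results.

    def aapor_response_rate(I, P, R, NC, O, U=0):
        """AAPOR response rate: complete interviews over all sampled cases.

        I  = complete interviews        R  = refusals and break-offs
        P  = partial interviews         NC = non-contacts
        O  = other (bad/wrong numbers, technical phone problems, etc.)
        U  = unknown eligibility (0 for this collection; see B #2)
        """
        return I / ((I + P) + (R + NC + O) + U)

    # Hypothetical disposition counts for 1,000 sampled cases.
    rr = aapor_response_rate(I=324, P=26, R=150, NC=400, O=100)
    print(f"{rr:.2%}")  # 32.40%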
While the response rate of 32.37% estimated for this customer satisfaction collection is greater than that found in a recent 2012 study by the Pew Research Center (see Research Studies 2 below), several disaster-related factors contribute to the non-response portion. Communities of survivors often do not have all public services restored, and the businesses employing survivors may still be rebuilding or may have chosen not to rebuild. Individuals may still be in the rebuilding phase of their recovery and unavailable for a survey. Frequent relocations and displacements are anticipated to affect a respondent's availability to complete the survey. Survivors may not want to use their cell phone minutes to respond to a survey. Disaster trauma may be a factor: a survivor may not remember contacting FEMA or may not be familiar with the case. Other factors include bad/wrong phone numbers, busy signals, no answer, voice mail, and privacy managers.
Research Study 1-Response Rate:
McCarty et al. (2006), a paper concerning phone survey response rates from 205 phone surveys conducted at the University of Florida Survey Research Center at the Bureau of Economic and Business Research, states on pages 172-173: "…recent research has shown that the effect of nonresponse on data is less critical than previously thought (Curtin, Presser, and Singer 2000; Keeter et al. 2000). This helps put response rates in perspective and reduces the tendency to disregard survey research simply because of low response rates."
On page 183, Figure 1 is a histogram of the response rates for the 205 telephone surveys, which shows a modal response rate of 25% and a mean of about 41.5% (McCarty et al., 2006, Effort in phone survey response rates: The effects of vendor and client-controlled factors, Field Methods, 18(2), 172-188).
Research Studies 2-Response Rate, Sample Size:
Other research reflects an industry-wide increase in sample size just to keep response rates at the current level. Below are points from various research articles that put low response rates in context, including cell phone usage and survey fatigue.
Sample sizes are increasing in order to keep response rates at the current level.
Cell phones are becoming primary residence phones.
Respondents are unwilling to answer "unknown" numbers; in one recent survey, 54.4% would not answer (Buskirk et al., 2008).
There is an industry-wide trend of increasing difficulty in reaching respondents (see Figure 2).
Survey fatigue is widespread (Häder et al., 2012; Kohut et al., 2012).
Nonresponse Bias:
Nonresponse bias is not directly correlated with the nonresponse rate; a low response rate does not by itself imply bias (Groves, 2006).
Increasing response rates by reducing non-contacts can exacerbate disparities among respondents (e.g., income, urbanicity) and create a bias (Brick et al., 1996; Dennis et al., 1999; Groves, 2006).
Figure 2 (below) shows the decline of typical response rates for surveys conducted from 1997-2012. Results were extracted from the Pew Research Center's 2012 methodology study. Response rates were computed according to the AAPOR standard formula.
Figure 2: Surveys Face Growing Difficulty Reaching, Persuading Potential Respondents (all figures in %)

                      1997   2000   2003   2006   2009   2012
Contact rate            90     77     79     73     72     62
  (percent of households in which an adult was reached)
Cooperation rate        43     40     34     31     21     14
  (percent of households contacted that yielded an interview)
Response rate           36     28     25     21     15      9
  (percent of households sampled that yielded an interview)
Other sample size research references:
Brick, J. M., Allen, B., Cunningham, P., & Maklan, D. (1996). Outcomes of a calling protocol in a telephone survey. Proceedings of the Survey Research Methods Section of the American Statistical Association, Alexandria, VA.
Buskirk, T. D., Rao, K., & Kaminska, O. (2008). My cell phone's ringing, "caller unknown," now what? Usage behavior patterns among recent landline cord cutters who have become cell phone only users. Presented at the American Association for Public Opinion Research (AAPOR) 63rd Annual Conference, 2008.
Dennis, J. M., Saulsberry, C., Battaglia, M. P., Roden, A., Hoaglin, D. C., Frankel, M., et al. (1999). Analysis of Call Patterns in a Large Random-Digit-Dialing Survey: The National Immunization Survey. Conference website of the International Conference on Survey Nonresponse 1999: 1-23.
Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70(5), 646-675.
Häder, S., Häder, M., & Kühne, M. (Eds.). (2012). Telephone Surveys in Europe. London: Springer.
Kohut, A., Keeter, S., Doherty, C., Dimock, M., & Christian, L. (2012). Assessing the representativeness of public opinion surveys. The Pew Research Center for the People & the Press.
Reimer, B., Roth, V., & Montgomery, R. (2012, July). Optimizing call patterns for landline and cell phone surveys. Presentation delivered at the Joint Statistical Meetings, San Diego, CA.
Van Rooy, C., & van Steenis, J. C. (1999). Bellen & gebeld worden: Fabels en feiten [Calling and being called: Fables and facts]. MOA Jaarboek.
Research Study: Disaster Trauma
Considering that, even during normal stages of everyday life, time-limited polls often yield very low response rates and survey fatigue is common, this collection has achieved a very good response rate, if not the best possible for this particular type of population. The survey efforts described above in Part B #3 are used to achieve this response rate even though respondents may still be experiencing disaster trauma during the survey period.
Disaster trauma psychology symptoms may include the following based on the Community Emergency Response Team-Citizen Corps Training for disaster psychology:
http://www.citizencorps.gov/cert/IS317/medops/medops/index03.htm
Irritability or anger
Relationship conflicts/marital discord
Self-blame or the blaming of others
Loss of appetite
Isolation and withdrawal
Headaches or chest pain
Fear of recurrence
Diarrhea, stomach pain, or nausea
Feeling stunned, numb, or overwhelmed
Hyperactivity
Feeling helpless
Increase in alcohol or drug consumption
Mood swings
Nightmares
Sadness, depression, and grief
The inability to sleep
Denial
Fatigue or low energy
Concentration and memory problems
4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
Many of the questions in the survey have been in use for several years and were initially based on comments from past focus groups as well as contractor opinion. FEMA personnel also reviewed questionnaire content and wording to improve readability and clarity. Tests with fewer than 10 applicants may be performed by FEMA's customer satisfaction analysis staff when updates are desirable, and all updates to questionnaires will be submitted to OMB for approval.
5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Maggie Billing
Program Analyst
Customer Satisfaction Analysis Section
National Processing Service Center
940 891-8709 or 940 891-8500 (switchboard)
Or
Kyle M. Mills, P.E.
Manager
Customer Satisfaction Analysis Section
National Processing Service Center
940 891-8881