

January 13, 2014


Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660 – 0128


Title: Federal Emergency Management Agency Individual Assistance Program Effectiveness & Recovery Survey


Form Number(s):

Program Effectiveness & Recovery Survey, FEMA Form 007-0-20


B. Collections of Information Employing Statistical Methods.



When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:


If the collection does not involve statistical methodology please enter “THERE IS NO STATISTICAL METHODOLOGY INVOLVED IN THIS COLLECTION” and delete Q1 through 5.


  1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


Part B Question #1: Description of Respondent Universe, Sampling Method, Response Rates

Surveys

  Type of Respondent / Entity:        Individuals and Households
  Form Name / Form Number:            Program Effectiveness & Recovery Survey, FEMA Form 007-0-20
  Respondent Universe Estimate Basis: Average of eligible registered applicants over the last 3 years (FY2011-FY2013)

  A. Potential Respondent Universe Numerical Estimate:                  178,548
  B. Estimated Number of Disasters per Year (3-year average):           22
  C. Sample Size Calculation (average number of respondents
     sampled to achieve one completion):                                3.28
  D. Annual Sample Size from FEMA Applicants (D = A/B*C):               26,620

  Estimated Completions per Week (based on 50 weeks per year):          168.96
  Sampling Criteria for Target Population:                              Sample by disaster, 60-90 days after
                                                                        close of the application period
  FY2013 Completed Surveys:                                             3,443
  Actual FY2013 Average Response Rate:                                  32.37%

  Total Survey Sample Size: annual sample of 26,620; 168.96 estimated completions per week; 3,443 FY2013 completed surveys; 32.37% FY2013 average response rate.

Focus Groups

  Type of Respondent / Entity (all rows): Individuals and Households; Partners In Service Staff
  Respondent Universe Estimate Basis: Average of eligible registered applicants over the last 3 years (FY2011-FY2013); travel row N/A

  Collection Method        Universe (A)   Disasters/Year (B)   FY2013 Completions
  Focus Group              178,548        22                   0
  Travel to Focus Group    N/A            N/A                  N/A
  One-on-One Interviews    178,548        22                   0
  On-Line Interviews       178,548        22                   0
  Focus Groups Total       annual sample size 0; 0 completions per week; 0 FY2013 completions

Surveys and Focus Groups (combined totals)

  Annual Sample Size:                          26,620
  Estimated Completions per Week:              168.96
  FY2013 Completed Surveys and Focus Groups:   3,443
  FY2013 Average Response Rate:                32.37%
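For readers checking the arithmetic, the annual sample size follows from the table's stated formula D = A/B*C. A minimal Python sketch using only the table's own values (illustrative only; not part of FEMA's systems):

    # Arithmetic check of the Question 1 table (values copied from the table).
    A = 178_548   # potential respondent universe, 3-year average
    B = 22        # estimated disasters per year, 3-year average
    C = 3.28      # average respondents sampled to achieve one completion

    D = A / B * C  # annual sample size, as defined by the table's D = A/B*C
    print(f"{D:,.0f}")  # 26,620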


2. Describe the procedures for the collection of information including:


-Statistical methodology for stratification and sample selection: This survey is time-limited and targets a population of individuals and households who are disaster survivors seeking federal assistance after a declared disaster. The disaster process covers a span of time, and the goal is to measure and report satisfaction with services and assistance over that span, accumulating the responses into a statistically valid result.

Sampling is independent for each disaster: every disaster receives its own sample, and no stratification is involved. The sampling frames consist of the names of all disaster survivors who have contacted FEMA for disaster assistance and who fall within the targeted audience of eligible applicants. Misclassification and eligibility confusion cannot occur in the sampling frames because the frames are generated strictly from the definition of the target population stated above. No element is excluded, no alternative sampling frame is used, and no post-stratification procedure is applied. The responses are aggregated to estimate the satisfaction level of the target population.

For the Program Effectiveness & Recovery Survey (PE&R), a random sample is generated from the entire target population using the electronic data files in the National Emergency Management Information System (NEMIS) Individual Assistance (IA) Client data replicated to the Enterprise Data Warehouse (EDW), which contain the names, phone numbers, and disaster-related information of all such survivors. The survey sample is imported into the Customer Satisfaction Analysis System (CSAS), where the survey is stored.
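The document does not publish the sampling code; the following is a minimal Python sketch of an independent simple random sample per disaster. The record layout is hypothetical, and the counts are rough per-disaster averages derived from the Question 1 table (178,548 / 22 and 26,620 / 22):

    import random

    # Illustrative sketch only: one independent simple random sample per disaster.
    # Real records come from the NEMIS IA data replicated to the EDW.
    applicants = [
        {"registration_id": i, "phone": "000-000-0000"}  # placeholder fields
        for i in range(8_116)  # ~eligible applicants in an average disaster
    ]

    survey_sample = random.sample(applicants, k=1_210)  # without replacement
    print(len(survey_sample))  # 1210 records to import into CSAS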

Program Effectiveness & Recovery Survey:

The Program Effectiveness & Recovery Survey is conducted by phone approximately 60-90 days after the close of the disaster application period for the specific disaster. The one-time random sample per declared disaster is sized according to the volume of recipients of assistance.


-Estimation procedure: The sample size is calculated from the average number of respondents that must be sampled to achieve the survey's target number of completions (see the Question 1 table).


-Degree of accuracy needed for the purpose described in the justification: Although extremely accurate statistical inference is not necessary for this information collection, the goal is to estimate customer satisfaction from a response volume sufficient for a 95% confidence level, plus or minus 5%, at a 50% response distribution.
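The formula behind that target is not spelled out in this document; under the standard sample-size calculation for a proportion, the stated parameters imply roughly 385 completions per disaster, slightly fewer after a finite population correction. A hedged Python sketch (the 8,116 per-disaster universe is an average derived from the Question 1 table):

    from math import ceil

    # Standard sample-size formula for a proportion (an assumption, not quoted
    # in this document): n0 = z^2 * p * (1 - p) / e^2.
    z = 1.96   # 95% confidence level
    p = 0.50   # assumed response distribution (worst case)
    e = 0.05   # margin of error, plus or minus 5%

    n0 = z ** 2 * p * (1 - p) / e ** 2  # ~384.16 completions

    def finite_population(n0: float, N: int) -> int:
        """Adjust the required completions for a universe of N applicants."""
        return ceil(n0 / (1 + (n0 - 1) / N))

    print(ceil(n0))                      # 385 for a very large disaster
    print(finite_population(n0, 8_116))  # ~367 for an average-size disaster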

-Unusual problems requiring specialized sampling procedures:

There are no unusual problems requiring specialized sampling procedures.


-Any use of periodic (less frequent than annual) data collection cycles to reduce burden: A periodic data collection cycle is not applicable to this type of information collection, since disaster occurrences are not predictable enough to schedule a collection cycle in advance.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


Extremely accurate statistical inference is not necessary for the intended use of this Information Collection. Results provide reliable customer satisfaction levels as well as information about areas that need improvement.


The actual average response rate for FY2013 was 32.37%, higher than the range normally expected for phone surveys (see the research studies below). Efforts to maximize and further increase the response rate are listed below.


  • Phone surveys are scheduled between 9 a.m. and 8 p.m. in the respondent's time zone, typically Monday through Friday, with additional attempts made during other time frames.

  • Callbacks are made to survivors who state that another time within the survey period would be more convenient.

  • The interviewer explains how important the feedback is.

  • Multiple attempts are made to reach the survivor by phone; unanswered cases systematically return to the call queue for another attempt.

  • The opening statement briefly explains the purpose of the study and its voluntary nature, and asks for the survivor's help in improving FEMA's quality of service.

  • The questions are very straightforward and easy to answer.

  • The questions are short and require little time to answer. Historically, the survey has averaged 9 minutes 55 seconds, based on the questions administered in FY2013.

  • An explanation is given that the questions will in no way affect the outcome of the disaster survivor’s application for assistance.

  • Information gathered from focus groups will be used to ensure that the survey items included are of interest to disaster survivors, making respondents more likely to see the survey as relevant.

  • On-going training is provided to interviewers.

  • Interpreters are used to survey survivors who are more comfortable in languages other than English.

  • The survey time frame is structured to focus on specific topics of interest to the targeted audience and the service providers.


Note: Sending a pre-notification letter for the survey is not desirable because of the time constraints for each survey type and survey period. (See Part B #2 above.)


The response-rate formula used is recognized by the American Association for Public Opinion Research (AAPOR) as follows:


RR = I / {(I+P) + (R+NC+O) + U}, where


RR = Response rate

I = Complete interview

P = Partial interview

R = Refusal and break-off

NC = Non-contact

O = Other (bad/wrong numbers, technical phone problem, etc.)

U = Unknown eligibility (= 0 in this case, see B #2.)
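
As a worked check of this formula, a short Python sketch with hypothetical disposition counts (the counts are made up for illustration; U = 0 as stated above):

    # Worked check of the AAPOR formula; disposition counts are hypothetical.
    def response_rate(I: int, P: int, R: int, NC: int, O: int, U: int = 0) -> float:
        """RR = I / ((I + P) + (R + NC + O) + U)."""
        return I / ((I + P) + (R + NC + O) + U)

    # Example: 324 completes out of 1,000 worked cases.
    print(f"{response_rate(I=324, P=26, R=150, NC=400, O=100):.2%}")  # 32.40%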


While the response rate of 32.37% for this customer satisfaction collection is greater than that found in a recent 2012 study by the Pew Research Center (see Research Study 2 below), several disaster-related factors may contribute to the non-response portion: communities of survivors often do not have all public services restored, and businesses are still rebuilding or may have chosen not to rebuild. Individuals may still be in the rebuilding phase of their recovery and unavailable for a survey. Frequent relocations and displacements are anticipated to affect respondents' availability to complete the survey. Survivors may not want to use their cell phone minutes to respond to a survey. Disaster trauma may also be a factor: the survivor may not remember contacting FEMA or may not be familiar with the case. Other factors include bad or wrong phone numbers, busy signals, no answer, voice mail, and privacy-manager services.


Research Study 1-Response Rate:

McCarty et al. (2006), a paper concerning phone survey response rates from 205 phone surveys conducted at the University of Florida Survey Research Center at the Bureau of Economic and Business Research, states on pages 172-173, “…recent research has shown that the effect of nonresponse on data is less critical than previously thought (Curtin, Presser, and Singer 2000; Keeter et al. 2000). This helps put response rates in perspective and reduces the tendency to disregard survey research simply because of low response rates.”

On page 183, Figure 1 is a histogram of the response rates for the 205 telephone surveys; it shows a mode response rate of about 25% and a mean of about 41.5%. (McCarty et al. (2006). Effort in phone survey response rates: The effects of vendor and client-controlled factors. Field Methods, 18(2), 172-188.)



Research Studies 2-Response Rate, Sample Size:

Other research reflects an industry-wide increase in sample sizes needed just to keep response rates at current levels. The bullets below, drawn from various research articles, document factors behind low response rates, including cell phone usage and survey fatigue.


  • Sample sizes are increasing in order to keep response rates at their current level

  • Cell phones are becoming primary residence phones

  • Unwillingness to answer calls from “unknown” numbers; in a recent survey, 54.4% of respondents would not answer (Buskirk et al., 2008)

  • An industry-wide trend of increasing difficulty in reaching respondents (see Figure 2)

  • Survey fatigue (Häder et al., 2012; Kohut et al., 2012)

  • Nonresponse bias is not directly correlated with the nonresponse rate; a low response rate does not by itself imply bias (Groves, 2006)

  • Increasing response rates by reducing non-contacts can exacerbate disparities among respondents (e.g., by income or urbanicity) and create bias (Brick et al., 1996; Dennis et al., 1999; Groves, 2006)



Figure 2 (below) shows the decline of typical survey response rates from 1997 to 2012. Results were extracted from the Pew Research Center's 2012 methodology study. Response rates were computed according to the AAPOR standard formula.



Figure 2: Surveys Face Growing Difficulty Reaching, Persuading Potential Respondents:


                                                               1997  2000  2003  2006  2009  2012
  Contact rate (percent of households in which an
  adult was reached)                                             90    77    79    73    72    62
  Cooperation rate (percent of households contacted
  that yielded an interview)                                     43    40    34    31    21    14
  Response rate (percent of households sampled that
  yielded an interview)                                          36    28    25    21    15     9

  (All figures are percentages.)
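
A note on how these rows relate: the response rate is approximately the product of the contact and cooperation rates. A small Python sketch using the Figure 2 values (this check is illustrative and not taken from the Pew report):

    # The response rate is roughly contact rate x cooperation rate.
    years = [1997, 2000, 2003, 2006, 2009, 2012]
    contact = [0.90, 0.77, 0.79, 0.73, 0.72, 0.62]
    cooperation = [0.43, 0.40, 0.34, 0.31, 0.21, 0.14]

    for year, c, p in zip(years, contact, cooperation):
        print(f"{year}: ~{c * p:.0%}")
    # Prints roughly 39%, 31%, 27%, 23%, 15%, 9%; the reported rates are
    # 36%, 28%, 25%, 21%, 15%, 9%. The gaps suggest an additional factor
    # (such as retention) in Pew's calculation, so treat this as approximate.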

Other sample size research references:

  1. Brick, J. M., Allen, B., Cunningham, P., & Maklan, D. (1996). Outcomes of a calling protocol in a telephone survey. Proceedings of the Survey Research Methods Section of the American Statistical Association, Alexandria, VA.

  2. Buskirk, T. D., Rao, K., & Kaminska, O. (2008). My cell phone's ringing, “caller unknown,” now what? Usage behavior patterns among recent landline cord cutters who have become cell-phone-only users. Paper presented at the American Association for Public Opinion Research (AAPOR) 63rd Annual Conference.

  3. Dennis, J. M., Saulsberry, C., Battaglia, M. P., Roden, A., Hoaglin, D. C., Frankel, M., et al. (1999). Analysis of call patterns in a large random-digit-dialing survey: The National Immunization Survey. International Conference on Survey Nonresponse 1999, 1-23.

  4. Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70(5), 646-675.

  5. Häder, S., Häder, M., & Kühne, M. (Eds.). (2012). Telephone surveys in Europe. London: Springer.

  6. Kohut, A., Keeter, S., Doherty, C., Dimock, M., & Christian, L. (2012). Assessing the representativeness of public opinion surveys. The Pew Research Center for the People & the Press.

  7. Reimer, B., Roth, V., & Montgomery, R. (2012, July). Optimizing call patterns for landline and cell phone surveys. Presentation at the Joint Statistical Meetings, San Diego, CA.

  8. van Rooy, C., & van Steenis, J. C. (1999). Bellen & gebeld worden: fabels en feiten [Calling and being called: Fables and facts]. MOA Jaarboek.


Research Study: Disaster Trauma

Considering that even during normal stages of everyday life “time-limited polls often yield very low response rates” and respondents experience “survey fatigue,” this collection has achieved a very good response rate, if not the best possible for this particular type of population. The survey efforts described in Part B #3 above are used to achieve this response rate even though respondents may still be experiencing disaster trauma during the survey period.


Based on the Community Emergency Response Team (CERT)-Citizen Corps training for disaster psychology, disaster trauma symptoms may include the following:

http://www.citizencorps.gov/cert/IS317/medops/medops/index03.htm


  • Irritability or anger

  • Relationship conflicts/marital discord

  • Self-blame or the blaming of others

  • Loss of appetite

  • Isolation and withdrawal

  • Headaches or chest pain

  • Fear of recurrence

  • Diarrhea, stomach pain, or nausea

  • Feeling stunned, numb, or overwhelmed

  • Hyperactivity

  • Feeling helpless

  • Increase in alcohol or drug consumption

  • Mood swings

  • Nightmares

  • Sadness, depression, and grief

  • The inability to sleep

  • Denial

  • Fatigue or low energy

  • Concentration and memory problems



4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Many of the questions in the survey have been in use for over ten years and were initially based on comments from past focus groups as well as contractor opinion. FEMA personnel have also reviewed questionnaire content and wording to improve readability and clarity. Tests with fewer than 10 applicants may be performed by FEMA's customer satisfaction analysis staff when updates are desirable, and all updates to questionnaires will be submitted to OMB for approval.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Maggie Billing

Program Analyst

Customer Satisfaction Analysis Section

Texas National Processing Service Center

(940) 891-8709 or (940) 891-8500 (switchboard)


Or


Kyle M. Mills, P.E.

Manager

Customer Satisfaction Analysis Section

Texas National Processing Service Center

(940) 891-8881



