Date: May 9, 2007
Supporting Statement for
Paperwork Reduction Act Submissions
Title: Federal Emergency Management Agency Housing Inspection Services Customer Satisfaction Survey
OMB Control Number: 1660-NW31
Form Number: FEMA Form 86-26 (MW), SEP 04
Headnote: The proposed information collection (IC), Housing Inspection Services Customer Satisfaction Survey, is currently bundled with other customer satisfaction surveys under OMB control number 1660-0036 as the ‘Housing Inspection Survey’. After consultation with the OMB desk officers, the survey has been separated from that bundle and is presented here as a new IC request.
B. Collections of Information Employing Statistical Methods.
When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:
1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.
The proposed IC is a time-constrained survey.
The target population of the proposed IC consists of individual victims of a particular disaster (applicants) who received housing inspections within 21 days after the disaster declaration while seeking recovery assistance from FEMA. The sample frame consists of all individual names in the target population; therefore, there is no alternative sample frame. No elements are excluded from either the target population or the sample frame. The sample frame is constructed from the FEMA Housing Inspections database, which is maintained electronically in the National Emergency Management Information System. Undercoverage may occur because some housing inspections are conducted after the 21st day following a disaster declaration. Historically, when undercoverage occurred, the rate was approximately 20 %. In past experience, undercoverage has not significantly influenced the results of statistical analyses. Data are collected by mail survey. The frequency of the IC is one time per disaster, and the timing of collections varies with disaster occurrences.
The effective sample size is determined by (1) the number of completed housing inspections, (2) FEMA’s precision requirement of a 95 % confidence level with a 5 % margin of error at a 50 % response distribution, and (3) the historical response rate, which sets the goal for the target number of completed surveys. The objective is to obtain a number of returned, completed surveys that will estimate the true level of customer satisfaction with housing inspection services for a particular disaster as accurately as possible with minimal time and cost burden. Historically, the overall response rate to these mail surveys ranges from 30 % to 50 % per disaster. For the most recently completed customer satisfaction survey (disaster DR-1662-IN), survey instruments were sent to a sample of 1,200 individuals from the target population of 2,820 applicants. A total of 393 completed surveys were received, which resulted in a response rate of 32.8 %.
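Under the stated criteria, the target number of completions and the initial mailing size can be sketched as follows (a standard normal-approximation sample-size formula with finite population correction is assumed here for illustration; FEMA's exact computation may differ):

```python
import math

def sample_size(N, z=1.96, e=0.05, p=0.5):
    """Sample size for estimating a proportion at the given confidence
    level (z) and margin of error (e), with a finite population
    correction for a target universe of N applicants."""
    n0 = (z ** 2) * p * (1 - p) / e ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))      # finite population correction

def mailings_needed(N, response_rate):
    """Initial mailings required to reach the completion target,
    given a historical response rate."""
    return math.ceil(sample_size(N) / response_rate)

# DR-1662-IN: target universe of 2,820 applicants, ~33 % historical response.
target = sample_size(2820)            # completions needed for 95 % / 5 %
mailed = mailings_needed(2820, 0.33)  # initial mailings to reach that target
```

With these assumptions the computation yields a completion target of a few hundred surveys and an initial mailing on the order of a thousand, consistent with the 1,200 instruments actually mailed and 393 completions received for DR-1662-IN.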
Table 6 shows the data on the number of entities in the sample universe covered by the collection and the corresponding sample, for the universe as a whole and for each declaration.
Table 6. Universe and sample sizes, entities, and degree of accuracy.

Survey: Housing Inspection Services Customer Satisfaction Survey
Total Universe per Year: 343,500
Number of Entity [description]: 1 [Individual]
Avg # of DRs per Year: 34
Target Universe per Disaster: 10,100 [Mean]*; 2,300 [Median]
Sample per Disaster: 950 [Mean]; 835 [Median]
Confidence Level [Margin of Error at 50 % response distribution]: 95 % [<= 3 %]
* The unusually large mean size of the target universe per disaster is due to the inclusion of the Hurricane Katrina case.
2. Describe the procedures for the collection of information.
Following a disaster declaration, an electronic database is created, which consists of the applicants’ attributes such as names and addresses. The list of those names is the sample frame for our target population. A random sample is selected from the sample frame utilizing the Statistical Package for the Social Sciences (SPSS) random sample generator. The initial survey materials are mailed to all individuals in the sample. Sample sizes depend not only on the actual size of the applicant population but also on other factors such as an area’s disaster history or overall literacy level. Some locations, for example, are subject to repeated flooding, hurricanes, tornadoes, etc. We use historical survey response rates from prior disasters in these locations to determine an adequate number of contact attempts. Because survey response rates are also positively correlated with literacy levels, we have found that in states with a highly educated populace, such as Massachusetts, we can achieve the completion target with fewer initial mailings.
For large disasters, mailings sent to 1,500 randomly selected applicants are usually sufficient to achieve the target number of completions. For exceptionally devastating disasters, such as Hurricane Katrina, where applicants were scattered around the country, the initial number of mailings may need to be as high as 2,000. Figure 1 shows the overall procedure of the Housing Inspection Services Customer Satisfaction Survey from the sampling stage to the final report.
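The random selection step can be sketched in Python as a stand-in for the SPSS random sample generator described above (the applicant IDs below are hypothetical):

```python
import random

def draw_sample(applicant_ids, n, seed=None):
    """Simple random sample without replacement from the sample frame,
    analogous to the SPSS random sample generator step."""
    rng = random.Random(seed)
    return rng.sample(applicant_ids, min(n, len(applicant_ids)))

# Hypothetical sample frame of 2,820 applicant IDs for one disaster.
frame = [f"APP-{i:05d}" for i in range(1, 2821)]
sample = draw_sample(frame, 1200, seed=42)
```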
Figure 1. A diagram describing the procedure of Housing Inspection Services Customer Satisfaction Survey from the sampling stage to the final report.
2.1. Statistical methodology for stratification and sample selection.
A probability sample is selected from the sample frame utilizing the Statistical Package for the Social Sciences (SPSS) random sample generator. In general, the SPSS program uses the uniform distribution function to generate random numbers. FEMA has established a standard that the number of completed surveys per disaster must be large enough to produce a 95 % confidence level with a 5 % margin of error at a 50 % response distribution. For each disaster, the target number of completed surveys and the sample size are computed to be large enough to meet these criteria. Table 7 shows the sample sizes generally used for different sample universe sizes. There is no sample stratification.
Table 7. Sample sizes used for different sample universes, depending on each disaster case and its target number of completed surveys.

Sample Universe Size | Sample Size
1,100 or fewer | Universe survey
1,101 to 10,000 | 1,100 to 1,450
Over 10,000 | 1,500
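The tiered rule in Table 7 can be expressed as a small sketch (the value used for the middle tier here is an illustrative midpoint; the actual size within 1,100–1,450 is set per disaster from the completion target):

```python
def planned_sample_size(universe_size):
    """Sample-size tiers from Table 7. For universes of 1,100 or fewer,
    every applicant is surveyed (a universe survey); for the middle
    tier, the midpoint of 1,100-1,450 is used as a placeholder."""
    if universe_size <= 1100:
        return universe_size          # universe (census) survey
    if universe_size <= 10000:
        return 1275                   # placeholder within 1,100-1,450
    return 1500
```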
Estimation procedure,
The proposed IC is a structured psychometric survey using a combination of dichotomous questions and bipolar Likert response scales, with an additional middle or uncertainty option in some cases. Population parameters are empirically derived for each question as the percentage of total responses that chose a particular answer. In addition to the observed response percentages, adjusted response percentages are presented, which exclude uncertain responses.
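The observed and adjusted percentages described above can be illustrated as follows (the response labels and counts are hypothetical):

```python
from collections import Counter

def response_percentages(answers, uncertain="Neutral"):
    """Observed percentages over all responses to a question, plus
    adjusted percentages that exclude the uncertain (middle) option."""
    counts = Counter(answers)
    total = sum(counts.values())
    observed = {k: 100 * v / total for k, v in counts.items()}
    definite = total - counts.get(uncertain, 0)
    adjusted = {k: 100 * v / definite
                for k, v in counts.items() if k != uncertain}
    return observed, adjusted

# Hypothetical responses to one Likert-type question.
obs, adj = response_percentages(
    ["Satisfied"] * 60 + ["Neutral"] * 20 + ["Dissatisfied"] * 20)
```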
Degree of accuracy needed for the purpose described in the justification,
Though extremely accurate estimates of population response percentages are not necessary for this IC, we pursue a 95 % confidence level with a +/– 5 % margin of error at a 50 % response distribution for results based on the target number of survey completions. For each question, the margin of error is usually smaller for the dominant response, because the dominant response distribution is typically greater than 50 %.
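As a worked illustration of why the per-question margin shrinks for a dominant response, using the standard normal-approximation margin of error (an assumption for illustration; the exact computation is not specified in this statement):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Normal-approximation margin of error for a response proportion p
    estimated from n completed surveys (z = 1.96 for 95 % confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# The margin is widest at p = 0.5 and shrinks for a dominant response.
worst = margin_of_error(0.5, 384)      # about 0.050
dominant = margin_of_error(0.8, 384)   # about 0.040
```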
Unusual problems requiring specialized sampling procedures, and
Sample size may be increased in the event that the disaster is particularly devastating and large numbers of applicants have been dispersed around the country, as happened with Hurricane Katrina. In such a special case, a larger sample size may be necessary to ensure that the target number of completed surveys is received.
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Only one data collection is conducted per disaster for this IC.
3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
The target respondents of the proposed survey are disaster victims who applied for housing assistance. The response rate for the most recently completed HI customer satisfaction survey in the current OMB inventory (OMB 1660-0036) was 32.8 %, using a response-rate formula recognized for mail surveys by the American Association for Public Opinion Research (AAPOR), as follows:
RR = I / {(I + P) + (R + NC + O) + U}, where:
RR = Response rate
I = Returned with more than 80 % of the questions answered
P = Returned with 50 % to 80 % of the questions answered
R = Refusal and break-off (not returned, returned blank, or less than 50 % answered)
NC = Non-contact (deceased, physical or mental disability, language/literacy problem, etc.)
O = Other (returned too late, other miscellaneous reasons)
U = Unknown eligibility (= 0 in this case; see B. #1.)
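Applying the formula to the DR-1662-IN figures cited above (this statement does not break the 807 non-completes down by category, so they are grouped under "Other" purely for illustration):

```python
def aapor_response_rate(I, P=0, R=0, NC=0, O=0, U=0):
    """Response rate per the formula above: completes divided by all
    sampled cases (complete, partial, refusal, non-contact, other,
    unknown eligibility)."""
    return I / ((I + P) + (R + NC + O) + U)

# 393 completes of 1,200 mailed; the remaining 807 are grouped under
# "Other" here only because their category breakdown is not given.
rr = aapor_response_rate(I=393, O=807)   # 0.3275, i.e. 32.8 %
```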
Our response rate of 32.8 % is compared with the mail survey response rate presented in Kaplowitz et al. (2004), “A Comparison of Web and Mail Survey Response Rates,” Public Opinion Quarterly, Vol. 68, No. 1, pp. 94–101 (see Figure 2). Their study objective was to compare response rates between mail and electronic survey modes. The target population for the study was MSU undergraduate, graduate, and professional students enrolled for academic year 2001–2002. Our intention is not to compare the two survey modes, but to evaluate our mail survey response rate against theirs. In fact, their results provide a useful benchmark for the response rate of a typical mail survey, which was 31.5 %.
Figure 2. Mail and e-mail survey response rates presented in Kaplowitz et al. (2004).
Because the proposed survey is time-constrained and conducted almost immediately after a disaster, the victims must be surveyed while they are still experiencing disaster trauma. In most cases, the victims may be in the worst stage of disaster trauma when they receive the survey materials. Disaster trauma symptoms may include [http://www.citizencorps.gov/cert/downloads/training/PM-CERT-Unit7Rev3.doc]:
Irritability or anger.
Self-blame or the blaming of others.
Isolation and withdrawal.
Fear of recurrence.
Feeling stunned, numb, or overwhelmed.
Feeling helpless.
Mood swings.
Sadness, depression, and grief.
Denial.
Concentration and memory problems.
Relationship conflicts/marital discord.
Loss of appetite.
Headaches or chest pain.
Diarrhea, stomach pain, or nausea.
Hyperactivity.
Increase in alcohol or drug consumption.
Nightmares.
The inability to sleep.
Fatigue or low energy.
In addition to disaster trauma, frequent relocations are anticipated for the victims after a disaster, which contributes to the non-contact portion of non-response. Considering that even during normal stages of everyday life “time-limited polls often yield very low response rates” (McCarty et al. 2006), we believe our response rate of 32.8 % is very good, if not the best possible, for this particular type of target population.
Regardless of the barriers and difficulties posed by this particular type of target population, we take the following steps to maintain our usual level of response:
In the event that the initial mailing does not produce the target number of completed surveys, a follow-up mailing shall be sent to the non-respondents to the initial mailing to remind them of the importance of the survey and encourage their participation. The follow-up shall be handled in the same manner as the original survey. If an adequate number of completed surveys is not obtained to establish the confidence level by the process stated above, we will distribute and collect an additional round of surveys from the sample in the same manner as the original survey in order to achieve the confidence level.
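The follow-up process described above can be sketched as a loop (the `mail_round` function is a hypothetical stand-in for one actual mailing and collection cycle):

```python
def completions_after_followups(initial_sample, target, mail_round,
                                max_rounds=3):
    """Mail the sample, then re-mail only the non-respondents until the
    target number of completions is reached or max_rounds is exhausted.
    mail_round is a stand-in for one mailing/collection cycle and
    returns the set of respondent IDs from that round."""
    completed = set()
    pending = set(initial_sample)
    for _ in range(max_rounds):
        if len(completed) >= target:
            break
        returned = mail_round(pending)
        completed |= returned
        pending -= returned      # only non-respondents get a follow-up
    return completed
```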
4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
No testing will be undertaken. The current procedures and methods have been demonstrated, in hundreds of previous disaster-related surveys, to produce reliable data with minimal burden on applicants.
5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Survey | Contractor
FEMA Housing Inspection Services Customer Satisfaction Survey | J & E Associates, Inc.; Michael D. Campbell, Ph.D.; 301-587-4220 x241

Survey | Program Officer
FEMA Housing Inspection Services Customer Satisfaction Survey | Chris Trice; FEMA; (540) 678-2109