Supporting Statement part B 1660-0102 8-2010

Federal Emergency Management Agency Housing Inspection Services Customer Satisfaction Survey.

OMB: 1660-0102


November 19, 2013


Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660-0102


Title: Federal Emergency Management Agency Housing Inspection Services Customer Satisfaction Survey


Form Number(s): FEMA Form 007-0-1 (formerly FEMA Form 86-26)



B. Collections of Information Employing Statistical Methods.



When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:


1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


The proposed IC is a time-constrained survey.


The target population of the proposed IC consists of individual victims of a particular disaster (applicants) who have received housing inspections within 21 days of the disaster declaration while seeking recovery assistance from FEMA. The sample frame consists of all individual names in the target population; therefore, there is no alternative sample frame. No elements are excluded from either the target population or the sample frame. The sample frame is constructed from the database of FEMA Housing Inspections, which is maintained electronically in the National Emergency Management Information System (NEMIS). Data are collected by mail survey. The frequency of the IC is one time per disaster, and the timing of the collections varies according to disaster occurrences.


In compliance with the FEMA Housing Inspection Services contract, inspectors meet with disaster survivors who have registered with FEMA and who are seeking assistance in designated federally declared disaster areas. The data are used in making financial grants for disaster assistance. When a disaster survivor registers with FEMA, the survivor is told that a housing inspection contractor will contact them within 7 to 10 days from the date of registration. FEMA treats the FEMA Information Data and Analysis (FIDA) report (the list of survivors receiving a housing inspection) as the survey universe and estimates the level of satisfaction with housing inspection services for that universe. If the universe is small (1,100 or fewer), it is surveyed in its entirety; if it is larger, a random sample of those individuals is drawn.


The purpose of this customer service survey is to measure the etiquette of the housing inspector, not the FEMA program.  Customer service survey results are tabulated and, along with other measurements, used in FEMA's Quality Assurance process to determine incentive or disincentive payments to the housing inspection contractor based on customer service ratings.  Demographic data are maintained in NEMIS for all disaster survivors who receive a housing inspection, to verify that there are no significant demographic differences between early and late responders; those data are available upon request.


An effective sample size is determined by (1) the number of completed housing inspections; (2) the confidence-level and margin-of-error criteria of 95% ± 3%, which are slightly more stringent than the required FEMA precision standard (95% ± 5%), at a 50% response distribution; and (3) the historical response rate, which sets a goal for the target number of completed surveys. The objective is to obtain the number of completed, returned surveys that will estimate the true level of customer satisfaction with housing inspection services for a particular disaster as accurately as possible with minimum time and cost burden. Historically, the overall response rate to these mail surveys ranges from 30% to 50% per disaster.
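The sample-size criterion above can be checked with a short calculation. The sketch below is an illustration only, not part of the approved methodology; it uses the standard proportion sample-size formula with a finite population correction and, with the mean and median per-disaster universes from Table 6 (8,367 and 1,062), it reproduces the 947 [Mean] and 533 [Median] sample figures.

```python
import math

def required_sample_size(universe, z=1.96, margin=0.03, p=0.5):
    """Sample size for estimating a proportion at the given margin of error,
    with a finite population correction for small universes."""
    # Infinite-population sample size: n0 = z^2 * p * (1 - p) / e^2
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction: n = n0 / (1 + (n0 - 1) / N)
    n = n0 / (1 + (n0 - 1) / universe)
    return math.ceil(n)

# 95% confidence, +/-3% margin of error, 50% response distribution
print(required_sample_size(8367))   # 947 (mean universe per disaster)
print(required_sample_size(1062))   # 533 (median universe per disaster)
```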


Number of Disasters for Housing Inspections FY11-12


FY      DRs                             # of Disasters    # of Applicants Receiving Inspections
2011    DR 1942 NC through DR 4031 NY   37                242,881
2012    DR 4040 PR through DR 4081 MS   13                146,721
Total                                   50                389,602


*Note: # of disasters declared in 2012 is well below the historical average of declared disasters.


This yields an average of 25 disasters over the last two fiscal years, FY11 and FY12, and an average universe of 194,801 over the same period.



Table 6 shows the number of entities in the sample universe covered by the collection and in the corresponding sample, for the universe as a whole and for each declaration.


Table 6. Universe and sample sizes, entities, and degree of accuracy.

Survey: Housing Inspection Services Customer Satisfaction Survey
Total Universe per Year: 194,801
Number of Entities [description]: 1 [Individual]
Avg # of DRs per Year: 32
Target Universe per Disaster: 8,367 [Mean]*; 1,062 [Median]
Sample per Disaster: 947 [Mean]; 533 [Median]; rounded sample value = 950
Confidence Level [Margin of Error at 50% response distribution]: 95% [<= 3%]

2. Describe the procedures for the collection of information including:


-Statistical methodology for stratification and sample selection:


Following a disaster declaration, an electronic database is created that contains the applicants' attributes, such as names and addresses. The list of those names is the sample frame for our target population. A random sample is selected from the sample frame using the Statistical Package for the Social Sciences (SPSS) random sample generator. The initial survey materials are mailed to all individuals in the sample. Sample sizes depend not only on the actual size of the applicant population but also on other factors, such as an area's disaster history or overall literacy level. Some locations, for example, are subject to repeated flooding, hurricanes, tornadoes, etc. We use historical survey response rates from prior disasters in these locations to determine an adequate number of contact attempts. Because survey response rates are also positively correlated with literacy levels, we have found that in states with a highly educated populace, such as Massachusetts, we can achieve the completion target with fewer initial mailings.
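For illustration only, the random selection step can be sketched in Python (the program of record is SPSS; the frame, sample size, and seed below are hypothetical):

```python
import random

def draw_sample(frame, n, seed=None):
    """Simple random sample without replacement from the applicant frame."""
    if n >= len(frame):
        return list(frame)          # small universe: survey everyone
    rng = random.Random(seed)       # seeded for reproducibility in this sketch
    return rng.sample(frame, n)

# Hypothetical frame of 10,000 applicant identifiers
frame = [f"applicant-{i:05d}" for i in range(10_000)]
sample = draw_sample(frame, 1_500, seed=42)
print(len(sample))  # 1500
```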


For large disasters, mailings sent to 1,500 randomly selected applicants are usually sufficient to achieve the target number of completions. For exceptionally devastating disasters, such as Hurricane Irene and Tropical Storm Lee, where applicants were scattered around the country, the initial number of mailings may need to be as high as 2,000. Figure 1 shows the overall procedure of the Housing Inspection Services Customer Satisfaction Survey from the sampling stage to the final report.




Figure 1. A diagram describing the procedure of Housing Inspection Services Customer Satisfaction Survey from the sampling stage to the final report.


2.1. Statistical methodology for stratification and sample selection.


A probability sample is selected from the sample frame using the Statistical Package for the Social Sciences (SPSS) random sample generator. In general, the SPSS program uses the Uniform function to generate random numbers. FEMA has established a standard that the number of completed surveys per disaster must be large enough to achieve the effect of a sample size that produces a 95% confidence level with a 5% margin of error at a 50% response distribution. This study exceeds the FEMA standard by using a 3% margin of error. For each disaster, a target number of completed surveys and a sample size are computed that are large enough to meet these criteria. Table 7 shows typical sample sizes for different sample universe sizes. There is no sample stratification.


Table 7. Sample sizes used for different sample universes, depending on each disaster and its target number of completed surveys.

Sample Universe Size    Sample Size
1,100 or fewer          Universe survey
1,101 to 10,000         1,100 to 1,450
Over 10,000             1,500
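The tiers in Table 7 can be expressed as a small lookup. This is a sketch only: for the middle tier the upper bound of 1,450 is assumed here, whereas in practice the figure varies within the 1,100 to 1,450 band with the target number of completions.

```python
def mailing_size(universe):
    """Initial sample size per the tiers in Table 7 (sketch)."""
    if universe <= 1100:
        return universe   # universe survey: everyone is mailed
    if universe <= 10000:
        return 1450       # band is 1,100-1,450; upper bound assumed here
    return 1500

print(mailing_size(800), mailing_size(5000), mailing_size(25000))  # 800 1450 1500
```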



  • Estimation procedure,


The proposed IC is a structured psychometric survey using a combination of dichotomous questions and bipolar Likert response scales, with an additional middle (uncertainty) option in some cases. Population parameters are empirically derived for each question as the percentage of total responses to the question that chose a particular answer. In addition to the observed response percentages, adjusted response percentages are presented, which exclude uncertain responses.
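The observed and adjusted percentages can be illustrated with a short sketch. The answers and the label of the middle option below are hypothetical; the actual tabulation software is not specified in this IC.

```python
from collections import Counter

def response_percentages(answers, uncertain="Neither"):
    """Observed percentages over all responses, plus adjusted percentages
    that exclude the middle/uncertain option."""
    counts = Counter(answers)
    total = sum(counts.values())
    observed = {k: 100 * v / total for k, v in counts.items()}
    definite = total - counts.get(uncertain, 0)
    adjusted = {k: 100 * v / definite
                for k, v in counts.items() if k != uncertain}
    return observed, adjusted

obs, adj = response_percentages(["Yes", "Yes", "No", "Neither"])
print(obs["Yes"], round(adj["Yes"], 1))  # 50.0 66.7
```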


  • Degree of accuracy needed for the purpose described in the justification,


Although extremely accurate estimates of population response percentages are not necessary for this IC, we pursue the effect of a 95% confidence level with a ±3% margin of error at a 50% response distribution for the results from the target number of survey completions. For each question, the margin of error is typically smaller for the dominant response, because the dominant response distribution is usually greater than 50%.
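The narrowing of the per-question margin of error for a dominant response can be verified with the normal-approximation half-width z·sqrt(p(1−p)/n). This is a sketch; the 950 completions assumed below come from the rounded sample value in Table 6.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of the normal-approximation confidence interval
    for a proportion p observed in n completed surveys."""
    return z * math.sqrt(p * (1 - p) / n)

# With 950 completions, the worst case is p = 0.5; a dominant response
# (say 90% answering "courteous") has a tighter margin:
print(round(100 * margin_of_error(0.5, 950), 2))   # 3.18
print(round(100 * margin_of_error(0.9, 950), 2))   # 1.91
```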


  • Unusual problems requiring specialized sampling procedures, and


Sample size may be increased in the event that the disaster is particularly devastating and large numbers of applicants have been dispersed around the country, as happened with Hurricane Irene and Tropical Storm Lee. In such a special case, a larger sample size may be necessary to ensure that the target number of completed surveys is received.


  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Only one data collection is conducted per disaster for this IC.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


The target respondents of the proposed survey are disaster victims who applied for housing assistance. The response rate is obtained using a response-rate formula recognized for mail surveys by the American Association for Public Opinion Research (AAPOR), as follows:


RR = I / {(I + P) + (R + NC + O) + U}

where


RR = Response rate

I = Returned and more than 80 % of the questions answered

P = Returned and 50 % to 80 % of the questions answered

R = Refusal and break-off (not returned, returned blank or less than 50% answered)

NC = Non-contact (dead, physical or mental disability, language/literacy problem, etc.)

O = Other (returned too late, other miscellaneous reasons)

U = Unknown eligibility (= 0 in this case, see B. #1.)
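The formula can be sketched as code. The counts below are hypothetical, chosen only to illustrate a 31% response rate on a 1,500-piece mailing.

```python
def response_rate(I, P, R, NC, O, U=0):
    """AAPOR-style mail-survey response rate:
    completes (I) over all sampled cases."""
    return I / ((I + P) + (R + NC + O) + U)

# Illustrative counts only: 465 completes out of 1,500 mailed surveys
rr = response_rate(I=465, P=35, R=600, NC=350, O=50, U=0)
print(round(100 * rr, 1))  # 31.0
```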


Our response rate of 31% is compared with the mail survey response rate presented in Kaplowitz et al., 2004, "A Comparison of Web and Mail Survey Response Rates," Public Opinion Quarterly, Vol. 68, No. 1, pp. 94-101 (see Figure 2). Their study objective was to compare response rates between mail and electronic survey modes; the target population was MSU undergraduate, graduate, and professional students enrolled for the 2001-2002 academic year. Our intention is not to compare the two survey modes, but to evaluate our mail survey response rate against theirs. Their results provide a useful benchmark for the response rate of an arbitrary mail survey, 31.5%, with which our response rate of 31% compares favorably.


Figure 2. Mail and e-mail survey response rates presented in Kaplowitz et al., 2004.


Because the proposed survey is time-constrained and follows almost immediately after a disaster, the victims must be surveyed while they are still experiencing disaster trauma. In most cases the victims may be in the worst stage of disaster trauma when they receive the survey materials. Disaster trauma symptoms may include [http://www.citizencorps.gov/cert/downloads/training/PM-CERT-Unit7Rev3.doc]:


  • Irritability or anger

  • Self-blame or the blaming of others

  • Isolation and withdrawal

  • Fear of recurrence

  • Feeling stunned, numb, or overwhelmed

  • Feeling helpless

  • Mood swings

  • Sadness, depression, and grief

  • Denial

  • Concentration and memory problems

  • Relationship conflicts/marital discord

  • Loss of appetite

  • Headaches or chest pain

  • Diarrhea, stomach pain, or nausea

  • Hyperactivity

  • Increase in alcohol or drug consumption

  • Nightmares

  • The inability to sleep

  • Fatigue or low energy


In addition to disaster trauma, frequent relocations are anticipated for the victims after a disaster, which contributes to the non-contact portion of non-response. Considering that even during normal stages of everyday life "time-limited polls often yield very low response rates" (McCarty et al. 2006), we believe that our response rate of 31% is very good, if not the best possible, for this particular type of target population.


Despite the barriers and difficulties we face with this particular type of target population, we take the following steps to maintain our usual level of response:

  • A follow-up mailing shall be sent to all respondents to remind them of the importance of the survey and encourage their participation.

  • The follow-up mailings are sent approximately 21 calendar days after the initial mailing and again, if necessary, approximately 21 calendar days after the first follow-up. Follow-ups are sent only to non-responders. The contractor tracks the returns using a unique identifier printed at the top of the survey form (e.g. 1933-0001 to 1933-1500). There is no personally identifying information on the form. The contractor's mailing/tracking database contains only the identifier number and the information FEMA provides. It is a stand-alone database that is password protected and accessed only by those who generate mailings and track completed surveys. Survey data are entered into a separate stand-alone database that contains only the identifier number and the respondent's answers. The two databases are never integrated. Mailing clerks work only with the first database and data clerks only with the second. All data are aggregated for reporting to FEMA, with no personally identifying information included. Through these safeguards, respondents' and non-respondents' privacy is protected at all times.

  • The follow-up shall be handled in the same manner as the original survey. If an adequate number of completed surveys is not obtained to establish the confidence level by the process stated above, we will distribute and collect an additional round of surveys from the sample in the same manner as the original survey in order to achieve the confidence level.
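The identifier-based follow-up tracking described above can be sketched as follows. The helper names and counts are hypothetical; the contractor's actual database software is not specified in this IC.

```python
def make_identifiers(dr_number, n):
    """Generate the unique identifiers printed on the survey forms,
    e.g. '1933-0001' through '1933-1500'."""
    return [f"{dr_number}-{i:04d}" for i in range(1, n + 1)]

def followup_targets(mailed_ids, returned_ids):
    """Identifiers that should receive the next follow-up mailing:
    those with no recorded return."""
    returned = set(returned_ids)
    return [i for i in mailed_ids if i not in returned]

mailed = make_identifiers(1933, 1500)
returned = {"1933-0001", "1933-0777"}   # hypothetical completed returns
print(len(followup_targets(mailed, returned)))  # 1498
```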


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


No testing will be undertaken. The current procedures and methods have been shown to produce reliable data with minimal burden on applicants in hundreds of previous disaster-related surveys.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



Survey


FEMA Housing Inspection Services Customer Satisfaction Survey

Contractor:

J & E Associates, Inc.

Michael D. Campbell, Ph.D.

301-587-4220 x241



FEMA Housing Inspection Services Customer Satisfaction Survey


Program Officer:

Michael Hockman

FEMA

(540) 686-3802


