
SUPPORTING STATEMENT FOR

Federal Emergency Management Agency Programs Customer Satisfaction Surveys

OMB Control Number: 1660-0145

COLLECTION INSTRUMENT(S):


FEMA Form FF-104-FY-21-180 (formerly 519-0-44) Preparedness - Phone

FEMA Form FF-104-FY-21-181 (formerly 519-0-45) Preparedness - Electronic

FEMA Form FF-104-FY-21-182 (formerly 519-0-46) Transitional Sheltering Assistance (TSA) - Phone

FEMA Form FF-104-FY-21-183 (formerly 519-0-47) Transitional Sheltering Assistance (TSA) - Electronic

FEMA Form FF-104-FY-21-184 (formerly 519-0-48) Temporary Housing Units (THU) - Phone

FEMA Form FF-104-FY-21-184 (formerly 519-0-49) Temporary Housing Units (THU) - Electronic


B. Collection of Information Employing Statistical Methods.



The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent that it applies to the methods proposed:


1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.

The sample consists of disaster survivors who registered for Federal Emergency Management Agency (FEMA) assistance for Presidentially declared major disasters and emergencies. The communication preference each applicant provides during registration (U.S. Postal Service (USPS) correspondence or electronic) determines the survey administration mode.


Historical data from January 2021 through June 2021 were examined to determine the proportion of disaster survivors who preferred each communication method. The following percentages will fluctuate with disaster activity.


1. Phone Survey: Approximately 61% of applicants preferred USPS correspondence. It is assumed that applicants who prefer USPS correspondence will prefer a phone survey rather than an electronic survey; mail surveys are not provided. Response rates for phone-administered program surveys average approximately 28%, based on data from 2016-2020.


2. Electronic Survey: Approximately 39% of applicants preferred email correspondence. Electronic surveys were not previously administered for this collection. The expected response rate for electronic administration is approximately 30%, based on empirical research on web-based surveys and industry standards.


The proportion of disaster survivors who prefer USPS versus email communication can vary widely from disaster to disaster, so these percentages are approximations only. For each study, a proportionate number of disaster survivors will be surveyed based on their communication preference. For example, if 90% of applicants from a particular disaster preferred email communication, then 90% of the sample would consist of respondents who preferred email communication, and they would receive the survey by email (see the illustrative sketch below). Stratification is based on the interaction between disaster and communication preference.
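The sketch below illustrates how such a proportionate allocation could be computed. It is a minimal example only; the function, field names, and registration figures are hypothetical and are not drawn from FEMA systems or the actual sampling procedures.

    # Illustrative sketch only: allocate a survey sample across communication-
    # preference strata in proportion to registrations. All names and figures
    # are hypothetical examples, not actual FEMA data or systems.
    def proportionate_allocation(registrations_by_preference, total_sample):
        """Return per-stratum sample sizes proportional to registrations."""
        total = sum(registrations_by_preference.values())
        return {
            preference: round(total_sample * count / total)
            for preference, count in registrations_by_preference.items()
        }

    # Example: a disaster in which 90% of applicants preferred email.
    print(proportionate_allocation({"email": 9000, "usps_phone": 1000}, 400))
    # -> {'email': 360, 'usps_phone': 40}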


Respondents who participated in the individual programs are surveyed as follows:


The Preparedness Survey (Phone or Electronic) measures the preparedness levels of FEMA applicants. The sample will include a proportionate number of eligible and ineligible applicants by disaster.


The total yearly population of applicants is approximately 125,049, based on an average of Preparedness data for 2016-2020. The target number of completions per quarter is 400, which supports statistical inference at a 95% confidence level with a 0.5 variability assumption for the population and 5% precision (margin of error); a worked calculation is shown below.
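For reference, applying the standard sample-size formula for estimating a proportion under these assumptions (z = 1.96 for 95% confidence, p = 0.5, e = 0.05) gives

    n_0 = \frac{z^2 \, p(1 - p)}{e^2} = \frac{(1.96)^2 (0.5)(0.5)}{(0.05)^2} \approx 385,

which the 400-completion target exceeds, providing a small margin. This calculation is shown only as an illustration; it is not quoted from the collection instruments.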



The Transitional Sheltering Assistance (TSA) Survey (Phone or Electronic) measures the quality of disaster assistance information and service received regarding eligibility for and availability of hotel accommodations for disaster survivors. The sample will include TSA-eligible applicants who participated in the program.


The total yearly population of applicants is approximately 54,357, based on an average of TSA data for 2016-2020. On average, four disasters per year included TSA activations. The target number of completions per disaster is 400, which supports statistical inference at a 95% confidence level with a 0.5 variability assumption for the population and 5% precision (margin of error).

 

The Temporary Housing Units (THU) Survey (Phone or Electronic) measures the quality of disaster assistance information and service received regarding eligibility for and availability of housing units for disaster survivors. The sample will be disaster-specific and include THU-eligible applicants who participated in the program. The total yearly population of applicants is approximately 22,062, based on an average of THU-eligible data for 2016-2020. On average, four disasters per year included THU activations. The target number of completions per disaster is 400, which supports statistical inference at a 95% confidence level with a 0.5 variability assumption for the population and 5% precision (margin of error).


Due to the infrequency of TSA and THU program activations, data from the past five years were averaged and used when available.


Qualitative research (focus groups and interviews) will not be subject to probabilistic sampling methods; it will typically rely on purposive or convenience sampling. Historical data show response rates of approximately 35% for focus groups when no incentive to participate is offered. However, given changes over the past few years (COVID-related constraints on travel, in-person meetings, etc.), the expected response rate has been lowered.


Part B Question #1: Description of Respondent Universe, Sampling Method, Response Rates

The table below shows the estimated size of the universe covered by the collection and the corresponding sample for each survey.

| Type of Respondent / Entity | Form Name / Form Number | Potential Respondent Universe Numerical Estimate (A) | Estimated Completions per Quarter (B) | Estimated Completions per Year (B x 4 Qtrs.) | Sampling Criteria for Universe | Actual or Expected Survey Response Rate | Target Annual Adjusted Sample Size (D) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Surveys | | | | | | | |
| Individuals and Households | Preparedness Survey - Phone, FEMA Form FF-104-FY-21-180 (formerly 519-0-44) | 76,280 | 244 | 976 | Quarterly sample proportionate to applications per disaster by communication preference | 28% | 3,486 |
| Individuals and Households | Preparedness Survey - Electronic, FEMA Form FF-104-FY-21-181 (formerly 519-0-45) | 48,769 | 156 | 624 | Quarterly sample proportionate to applications per disaster | 30% | 2,080 |
| | Preparedness Survey (subtotal) | 125,049 | 400 | 1,600 | | | 5,566 |
| Individuals and Households | TSA Survey - Phone, FEMA Form FF-104-FY-21-182 (formerly 519-0-46) | 33,158 | 244 | 976 | Sample proportionate to applications per disaster | 28% | 3,486 |
| Individuals and Households | TSA Survey - Electronic, FEMA Form FF-104-FY-21-183 (formerly 519-0-47) | 21,199 | 156 | 624 | Sample proportionate to applications per disaster | 30% | 2,080 |
| | TSA Survey (subtotal) | 54,357 | 400 | 1,600 | | | 5,566 |
| Individuals and Households | THU Survey - Phone, FEMA Form FF-104-FY-21-184 (formerly 519-0-48) | 13,458 | 244 | 976 | Sample proportionate to applications per disaster | 28% | 3,486 |
| Individuals and Households | THU Survey - Electronic, FEMA Form FF-104-FY-21-184 (formerly 519-0-49) | 8,604 | 156 | 624 | Sample proportionate to applications per disaster | 30% | 2,080 |
| | THU Survey (subtotal) | 22,062 | 400 | 1,600 | | | 5,566 |
| | Total Survey Sample Size | 201,468 | 1,200 | 4,800 | | | 16,698 |
| Qualitative Studies | | | | | | | |
| Individuals and Households | One-on-One Interviews | 125,049 | | 768 | | | |
| Individuals and Households | On-Line Interviews | 125,049 | | 768 | | | |
| | Qualitative Studies Total | | | 1,536 | | | |
| | Surveys and Qualitative Studies | | | 6,336 | | | |



2. Describe the procedures for the collection of information including:

-Statistical methodology for stratification and sample selection:

The TSA and THU surveys are sampled on a per-disaster basis. The sample size for each survey is determined by the size of the disaster and the desired confidence interval (an illustrative calculation follows this paragraph). Each survey's population will be divided into subgroups (strata) proportionate to registrations by disaster and communication preference. The Preparedness survey sample will additionally be prorated by eligible and ineligible respondents. Stratification provides gains in the precision, or reliability, of the survey estimates; the gains are greatest when units within each stratum are as homogeneous as possible, so that the strata differ from one another as much as possible.
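For illustration only, a per-disaster sample target can be derived by applying the standard finite-population correction to the unadjusted sample size n_0 ≈ 385 computed earlier (95% confidence, 0.5 variability, 5% precision). The population figure below is hypothetical:

    n = \frac{n_0}{1 + \frac{n_0 - 1}{N}} = \frac{385}{1 + \frac{384}{2{,}000}} \approx 323

for a hypothetical disaster with N = 2,000 eligible registrations; larger disasters approach the uncorrected target of roughly 385-400 completions.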


-Estimation procedure:

The estimated sample is determined by the five-year average of disaster registrations for each program. This sample can be adjusted to accommodate historical response rates, disaster activity, changes to programs, and similar factors in order to improve reliability.


-Degree of accuracy needed for the purpose described in the justification:

The degree of accuracy is obtained by using a 0.5 variability assumption for the population (response distribution), 5% precision (margin of error), and a 95% confidence level. This sample size allows statistical inference about the population.

-Unusual problems requiring specialized sampling procedures:

There are no unusual problems requiring specialized sampling procedures.


-Any use of periodic (less frequent than annual) data collection cycles to reduce burden:


TSA and THU programs may not be activated for every disaster. For these programs, data will be collected and reported in two-week cycles as they become available. The Preparedness survey is conducted for every disaster and reported quarterly.


The TSA data collection cycle can begin after the end of the initial 14-day eligibility period and continue through three extension periods. The THU survey is conducted using pay period cycles with the first sample pulled 90 days after program activation. These surveys are conducted soon after the eligibility period in order to capture the experience respondents had with program services. The Preparedness survey is conducted 90 days or more after the application period closes. Preparedness is conducted months after a disaster declaration in order to understand where a respondent is in the disaster recovery process.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

Maximizing Response Rates

Survey fatigue, survey bombardment, and declining confidence in the privacy of survey data have led to fewer surveys being completed across the survey industry. To combat these issues, several methods are employed to help maximize response rates. Providing multiple survey administration modes ensures respondents can answer in their preferred and most convenient way. Surveys are also reviewed to ensure they are concise and respondent burden is low. Respondents are contacted at various times of day or receive multiple electronic invitations, giving them several opportunities to complete the survey at their convenience.



Reliability and Validity (Accuracy)


The sample is checked to ensure that only applicants who applied for the specific program covered by a survey are provided that survey. The reliability of the data provided by the responses is maintained by adjusting the sample size using historical data on completion rates: for instance, if 400 completions are desired and the expected response rate is 50%, the sample size would be doubled to 800 (a worked example follows).
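This adjustment also appears to underlie the Target Annual Adjusted Sample Size column in the Question 1 table (stated here as an interpretation, not quoted from the instruments):

    \text{adjusted sample size} = \frac{\text{target completions}}{\text{expected response rate}}, \qquad \frac{976}{0.28} \approx 3{,}486, \qquad \frac{624}{0.30} = 2{,}080.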


Response options are screened for technical terms and jargon, and plain language is used when possible. Questions are also monitored to ensure that response options do not overlap or contradict one another.


Unreliable Data


Several factors can contribute to non-response: the nature of the disaster (the trauma associated with answering questions related to it), lack of access to a phone or the internet, and frequent relocations or displacements from a home or temporary residence while recovering from the disaster. To accommodate these factors, sample size is adjusted on a per-disaster basis using historical non-response rates.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Customer Survey & Analysis (CSA) staff and FEMA personnel with subject-matter expertise in each survey topic reviewed the questionnaires to improve the readability and clarity of the survey content.


CSA staff conducted readability tests to improve both the phone and electronic surveys. Plain language, clarity, timing, and accuracy were reviewed.


Tests of the surveys with fewer than 10 survivors may be undertaken by CSA when the surveys are updated or revised.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The Customer Survey & Analysis (CSA) Section plans, designs, administers, and analyzes the results of the surveys. This includes survey methodology and sample selection, as well as data collection, tabulation, and reporting.


Dr. Brandi Vironda, Statistician

Customer Survey & Analysis

Federal Emergency Management Agency

(940) 891-8572


Dr. Kristin Brooks, Statistician

Customer Survey & Analysis

Federal Emergency Management Agency

(202) 826-6291


Kristi Lupkey, Supervisory Program Analyst

Customer Survey & Analysis

FEMA Recovery Reports and Analytics

(940) 891-8852




