
Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660 – NW102


Title: Federal Emergency Management Agency Individual Assistance Customer Satisfaction Surveys


Form Number(s):

FEMA Form 519-0-36, Initial Survey - Phone

FEMA Form 519-0-37, Initial Survey - Electronic

FEMA Form 519-0-38, Contact Survey - Phone

FEMA Form 519-0-39, Contact Survey - Electronic

FEMA Form 519-0-40, Assessment Survey - Phone

FEMA Form 519-0-41, Assessment Survey - Electronic


To streamline the paperwork process, this new collection of surveys replaces two unexpired collections of FEMA Individual Assistance Customer Satisfaction Surveys: (1) 1660-0036, with 11 surveys, and (2) 1660-0128, with 1 survey, which expire 9/2017 and 1/2018, respectively. Upon approval of this new collection, the two current collections will be discontinued.


B. Collections of Information Employing Statistical Methods.



When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:


If the collection does not involve statistical methodology please enter “THERE IS NO STATISTICAL METHODOLOGY INVOLVED IN THIS COLLECTION” and delete Q1 through 5.


1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.

The sampling frame consists of all disaster survivors who registered for assistance with FEMA. Registrants are then grouped into subpopulations according to the scope and intent of each survey, as follows:

Initial Survey (Phone or Electronic) measures the quality of disaster assistance information and services received during the initial registration process. Possible registration methods are (1) with a FEMA representative or (2) online via the DisasterAssistance.gov website.


Stratifying registrations into homogeneous subgroups by registration method reflects the primary interest of the Initial Survey. The sample will include a proportionate number of applicants who used each registration method (FEMA representative or online). A proportionate sample across registration methods (e.g., 63.77% Representative, 36.23% Online) will be drawn for weekly administration; the sample is then divided by the applicant's preferred communication method.
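The weekly allocation and split described above can be sketched as follows. This is an illustrative sketch under stated assumptions, not FEMA's production sampling system; the weekly draw of 283 (roughly the 14,720 annual adjusted Initial Survey sample divided by 52 weeks) and the function and label names are assumptions made for the example.

```python
# Illustrative sketch only (not FEMA's production system): allocate a weekly
# Initial Survey sample across registration modes in proportion to their
# historical shares, then split each allocation by preferred communication
# method, which determines phone vs. electronic administration.

def proportionate_allocation(sample_size, shares):
    """Allocate a sample across strata in proportion to their population shares."""
    return {name: round(sample_size * share) for name, share in shares.items()}

registration_shares = {"FEMA Representative": 0.6377, "Online": 0.3623}
communication_shares = {"Phone (US Mail preferred)": 0.68, "Electronic (email preferred)": 0.32}

weekly_draw = 283  # assumed: ~14,720 annual adjusted sample / 52 weeks
by_registration_mode = proportionate_allocation(weekly_draw, registration_shares)
by_mode_and_delivery = {
    mode: proportionate_allocation(n, communication_shares)
    for mode, n in by_registration_mode.items()
}

print(by_registration_mode)   # {'FEMA Representative': 180, 'Online': 103}
print(by_mode_and_delivery)   # each mode split ~68% phone / ~32% electronic
```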


The total population of applicants is approximately 378,674, based on a 5-year average of disaster registration data. The target number of completions per month is 400, which supports statistical inference at a 95% confidence level with a 0.5 variability assumption on the population and 5% precision (margin of error).
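For reference, the 400-completion target is consistent with the standard sample-size formula for these parameters. The short sketch below is illustrative only; the finite population correction is included simply to show that it barely changes the result at this population size.

```python
import math

# Illustrative sketch of the standard sample-size calculation implied by the
# parameters above: 95% confidence (z = 1.96), p = 0.5 variability assumption,
# and a 5% margin of error. Not part of FEMA's production system.
z = 1.96       # z-score for a 95% confidence level
p = 0.5        # most conservative assumption about population variability
e = 0.05       # desired precision (margin of error)
N = 378_674    # 5-year average population of applicants

n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # infinite-population sample size, ~384.16
n = n0 / (1 + (n0 - 1) / N)              # finite population correction, ~383.77
print(math.ceil(n0), math.ceil(n))       # 385 384 -- rounded up to a 400-completion target
```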


The system flags each applicant's preferred communication method, which determines the administration method used to deliver the survey (phone or electronic).

1. Phone Survey: Historical data show approximately 257,498 applicants (68%) with a preferred communication method of US Mail; the expected response rate is approximately 34%, based on historical data.

2. Electronic Survey: Historical data show approximately 121,176 applicants (32%) with a preferred communication method of email. The expected response rate for the electronic survey is approximately 30%, based on empirical research on web-based surveys and industry standards.

 

Contact Survey (Phone or Electronic) measures the quality of disaster assistance information and services received during additional contacts with FEMA. These contact methods consist of (1) phone contact with a FEMA representative, (2) contact with a FEMA inspector, and (3) online account access via the DisasterAssistance.gov website. Examples include checking the status of an application or calling a representative to ask questions about a case. The sample will include a proportionate number of each contact method.


Stratifying contacts into subgroups by contact method reflects the primary interest of the Contact Survey. A proportionate sample across contact methods (e.g., 63.19% calls to representatives, 6.30% online status checks, and 30.51% inspections) is drawn for weekly administration; the sample is then divided by the applicant's preferred communication method.


The total population of registered applicants who had post-registration interactions is approximately 263,099, based on a 5-year average. The target number of completions per month is 600, which supports statistical inference at a 95% confidence level with a 0.5 variability assumption on the population and 5% precision (margin of error).


The system flags each applicant's preferred communication method, which determines the administration method used to deliver the survey (phone or electronic).

1. Phone Survey: Historical data show approximately 178,907 registered applicants (68%) with a preferred communication method of US Mail; the expected response rate is approximately 36%, based on historical data.

2. Electronic Survey: Historical data show approximately 84,192 registered applicants (32%) with a preferred communication method of email. The expected response rate for the electronic survey is approximately 30%, based on empirical research on web-based surveys and industry standards.


Assessment Survey (Phone or Electronic) measures the quality of disaster assistance information and services received after eligibility is determined. The sample will include an equal number of applicants from each eligibility status, since the 5-year average of the population is approximately 50% for each eligibility status.


The target number of completions per month is 800 (400 eligible and 400 ineligible), which supports statistical inference at a 95% confidence level with a 0.5 variability assumption on the population and 5% precision (margin of error) for each eligibility status.
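A minimal sketch of the equal allocation described above (the variable names are illustrative assumptions, not FEMA system identifiers):

```python
# Equal allocation across the two eligibility strata: unlike the proportionate
# allocation used for the Initial and Contact Surveys, each stratum receives the
# same number of completions so that inference is supported for each status.
monthly_target = 800
strata = ["Eligible", "Ineligible"]
allocation = {status: monthly_target // len(strata) for status in strata}
print(allocation)  # {'Eligible': 400, 'Ineligible': 400}
```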


The system flags each applicant's preferred communication method, which determines the administration method used to deliver the survey (phone or electronic).

1. Phone Survey: Historical data show approximately 257,498 applicants (68%) with a preferred communication method of US Mail; the expected response rate is approximately 28%, based on historical data.

2. Electronic Survey: Historical data show approximately 121,176 applicants (32%) with a preferred communication method of email. The expected response rate for the electronic survey is approximately 30%, based on empirical research on web-based surveys and industry standards.


Qualitative research (focus groups and interviews) will not use probabilistic sampling methods; it will typically rely on purposive or convenience sampling. Historical data show a response rate of approximately 6% for focus groups when no incentive to participate is offered.

The table below shows the estimated size of the universe covered by the collection and the corresponding samples for each survey.

Part B Question #1: Description of Respondent Universe, Sampling Method, Response Rates

| Type of Respondent / Entity | Form Name / Form Number | Respondent Universe Numerical Estimate Basis (5-Year Average) | Potential Respondent Universe Numerical Estimate | Target Completions per Month | Target Completions per Year | Sampling Criteria for Universe | Actual or Expected Response Rate (actual 2nd Qtr. FY16 avg. 33.71%) | Target Annual Adjusted Sample Size |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Surveys | | | | | | | | |
| Individuals and Households | Initial Survey - Phone / FEMA Form 519-0-36 | Registration modes: FEMA Representative or On-Line; preferred method of communication is phone | 257,498 | 272 | 3,264 | Weekly sample proportionate to the registration mode | 34% | 9,600 |
| Individuals and Households | Initial Survey - Electronic / FEMA Form 519-0-37 | Registration modes: FEMA Representative or On-Line; preferred method of communication is email | 121,176 | 128 | 1,536 | Weekly sample proportionate to the registration mode | 30% | 5,120 |
| | Initial Survey (total) | | 378,674 | 400 | 4,800 | | | 14,720 |
| Individuals and Households | Contact Survey - Phone / FEMA Form 519-0-38 | Contact modes: calling the Helpline, On-Line, or Inspection; preferred method of communication is phone | 178,907 | 408 | 4,896 | Weekly sample proportionate to contact modes | 36% | 13,600 |
| Individuals and Households | Contact Survey - Electronic / FEMA Form 519-0-39 | Contact modes: calling the Helpline, On-Line, or Inspection; preferred method of communication is email | 84,192 | 192 | 2,304 | Weekly sample proportionate to contact modes | 30% | 7,680 |
| | Contact Survey (total) | | 263,099 | 600 | 7,200 | | | 21,280 |
| Individuals and Households | Assessment Survey - Phone / FEMA Form 519-0-40 | Eligibility determinations: eligible vs. ineligible applicants; preferred method of communication is phone | 257,498 | 544 | 6,528 | Monthly sample equally distributed between eligible and ineligible applicants | 28% | 23,314 |
| Individuals and Households | Assessment Survey - Electronic / FEMA Form 519-0-41 | Eligibility determinations: eligible vs. ineligible applicants; preferred method of communication is email | 121,176 | 256 | 3,072 | Monthly sample equally distributed between eligible and ineligible applicants | 30% | 10,240 |
| | Assessment Survey (total) | | 378,674 | 800 | 9,600 | | | 33,554 |
| | Total Survey Sample Size | | | 1,800 | 21,600 | | 34% | 69,554 |
| Qualitative Studies | | | | | | | | |
| Individuals and Households | Focus Group (2 hours plus 1 hour travel) | Based on 5-year average of total registrations and eligible/ineligible Housing Assistance applicants | 252,138 | | 960 | | 6% | |
| Individuals and Households | One-on-One Interviews | Based on 5-year average of total registrations and eligible/ineligible Housing Assistance applicants | 252,138 | | 768 | | | |
| Individuals and Households | On-Line Interviews | Based on 5-year average of total registrations and eligible/ineligible Housing Assistance applicants | 252,138 | | 768 | | | |
| | Qualitative Studies Total | | | | 2,496 | | | |
| | Surveys and Qualitative Studies Total | | | 1,800 | 24,096 | | | |



2. Describe the procedures for the collection of information including:

-Statistical methodology for stratification and sample selection:

A probability-based stratified sampling method will be used to ensure that each homogeneous subgroup within the population is represented proportionately. To ensure that each subgroup in the overall sample is estimated at a similar level of precision, the sample is adjusted using historical response rates for each subpopulation. This ensures there are enough completed surveys within each stratum to support statistical inference on the disaster survivor population and the subpopulations of interest.

“Simple random samples (where all units and all equal-numbered combinations of units have the same probabilities of selection) are rare in practice for a number of reasons. … Thus, other probability-based methods that employ multiple stages of selection, and/or stratification, and/or clustering are used to draw more practical samples that can be generalized with known degrees of sampling error.” [https://www.whitehouse.gov/sites/default/files/omb/inforeg/pmc_survey_guidance_2006.pdf]

Stratification provides gains in the precision, or reliability, of the survey estimates; the gains are greatest when units within each stratum are as similar as possible, so that most of the variability lies between strata rather than within them.
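The precision gain can be illustrated with a small simulation. The sketch below is purely illustrative and uses invented stratum shares and satisfaction scores, not FEMA data; it compares the spread of a simple-random-sample mean with that of a proportionately stratified sample mean when the strata differ from one another.

```python
import random
import statistics

# Illustrative simulation only: a synthetic population of 10,000 "applicants"
# in two strata whose mean scores differ. Stratified sampling removes the
# between-stratum component of variance, so its estimate varies less.
random.seed(1)
population = (
    [("Representative", random.gauss(8.0, 1.0)) for _ in range(6377)]
    + [("Online", random.gauss(6.0, 1.0)) for _ in range(3623)]
)

def srs_mean(pop, n):
    """Mean score from a simple random sample of size n."""
    return statistics.mean(score for _, score in random.sample(pop, n))

def stratified_mean(pop, n):
    """Weighted mean from a proportionately stratified sample of size n."""
    estimate = 0.0
    for label in ("Representative", "Online"):
        stratum = [score for lab, score in pop if lab == label]
        share = len(stratum) / len(pop)
        take = max(1, round(n * share))
        estimate += share * statistics.mean(random.sample(stratum, take))
    return estimate

srs_estimates = [srs_mean(population, 400) for _ in range(500)]
strat_estimates = [stratified_mean(population, 400) for _ in range(500)]
print(statistics.stdev(srs_estimates), statistics.stdev(strat_estimates))
# The stratified estimates show a smaller standard deviation, i.e., greater precision.
```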


-Estimation procedure:

For the Initial, Contact, and Assessment Surveys, the sample is based on the FY2016 response rates of previous, similar surveys with similar subpopulations. The sample is adjusted to accommodate historical response rates to improve reliability.


-Degree of accuracy needed for the purpose described in the justification:

The degree of accuracy is obtained by using a 0.5 variability assumption on the population (response distribution), 5% precision (margin of error), and a 95% confidence level. The resulting sample size allows statistical inference about the population.

-Unusual problems requiring specialized sampling procedures:

There are no unusual problems requiring specialized sampling procedures.


-Any use of periodic (less frequent than annual) data collection cycles to reduce burden:


Data will be collected weekly for the Initial and Contact Surveys and monthly for the Assessment Survey. Because the Initial and Contact questions depend on accurate recall, those surveys are conducted within a week after the interaction with FEMA. The Assessment Survey asks overall questions about service, assistance, and recovery that require more time to experience after the disaster; therefore, it is conducted 30 days or more after eligibility is determined.




3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

Maximizing Response Rates

Maintaining adequate survey response rates continues to be a challenge as more people experience survey fatigue, and highly publicized confidentiality breaches at various organizations have made people uneasy about providing information. Because of this, several methods are used to maximize response rates: survey burden time has been reduced, voluntary-participation and confidentiality disclosures are given, and mixed-mode administration (phone and electronic) is offered with follow-ups or reminders for the respondent's convenience.


Reliability and Validity (Accuracy)


The sample is adjusted to accommodate historical response rates to improve reliability. This is done by dividing the targeted number of completions by the previous year's response rate. For example, if 400 completions are needed and the expected response rate is 50%, 800 applicants would be surveyed to ensure the 400 completions are received.
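A minimal sketch of this adjustment, using the worked example above and figures from the table in Question 1 (the function name is an illustrative assumption):

```python
import math

# Response-rate adjustment: the number of applicants to survey equals the
# target number of completions divided by the expected response rate.
def adjusted_sample(target_completions, expected_response_rate):
    return math.ceil(target_completions / expected_response_rate)

print(adjusted_sample(400, 0.50))    # 800   -- the worked example above
print(adjusted_sample(3264, 0.34))   # 9600  -- Initial Survey - Phone, annual adjusted sample
print(adjusted_sample(1536, 0.30))   # 5120  -- Initial Survey - Electronic, annual adjusted sample
```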


Questions are screened for readability through research on best practices and read-aloud testing. Response options are also screened to ensure they are independent and non-overlapping, to avoid dubious replies caused by unclear or overlapping response scales. Complex wording, technical terms, jargon, and difficult phrases are closely monitored.


Unreliable Data

Several factors that contribute to nonresponse stem from the nature of the disaster itself: survivors often do not have telephone, cell phone, or electrical service in their community. Frequent relocations and displacements are also anticipated to affect respondents' availability to complete the survey. Survivors may not want to use their cell phone minutes to respond to a survey, and disaster trauma may mean a survivor does not remember all of the interactions with FEMA or is not familiar with the case. Due to these factors, the sample size is adjusted to accommodate historical nonresponse rates and mitigate the risk of unreliable data.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Most of the survey questions have been administered for ten to fifteen years and were initially based on comments from past focus groups and contractor recommendations. FEMA personnel have also reviewed the questionnaire content and wording to improve readability and clarity.


Tests for readability are conducted by staff to help with reliability and accuracy. This includes question layout, wording, definitions, and timing. Questions are also analyzed for plain language.


Tests with fewer than 10 survivors will be performed by FEMA’s Customer Survey and Analysis staff for updates or revisions as needed.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The Customer Survey & Analysis (CSA) Section plans, designs, administers, and analyzes the results of the surveys. This includes the survey methodology and sample selection, as well as the collection, tabulation, and reporting of the data.


Jessica Guillory, Statistician

Customer Survey & Analysis

Federal Emergency Management Agency

940 891-8528


Dr. Kristin Brooks, Statistician

Customer Survey & Analysis

Federal Emergency Management Agency

940 891-8579


Gena Fry, Program Analyst

Customer Survey & Analysis

Federal Emergency Management Agency

940 891-8543


Maggie Billing, Program Analyst

Customer Survey & Analysis

Federal Emergency Management Agency

940 891-8709


Kyle M. Mills, P.E., Section Manager

Customer Survey & Analysis

Federal Emergency Management Agency

940 891-8881





