
Supporting Statement for
Paperwork Reduction Act Submissions
OMB Control Number: 1660-0107
Title: Public Assistance Customer Satisfaction Surveys
Form Number(s):
FEMA Form 519-0-32, Public Assistance Initial Survey (Telephone);
FEMA Form 519-0-33, Public Assistance Initial Survey (Internet);
FEMA Form 519-0-34, Public Assistance Assessment Survey (Telephone);
FEMA Form 519-0-35, Public Assistance Assessment Survey (Internet)
B. Collections of Information Employing Statistical Methods
When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be
included in the Supporting Statement to the extent it applies to the methods proposed:
1. Describe (including numerical estimate) the potential respondent universe and any sampling or
other respondent selection method to be used. Data on the number of entities (e.g., establishments,
State and local government units, households, or persons) in the universe covered by the collection
and in the corresponding sample are to be provided in tabular form for the universe as a whole
and for each of the strata in the proposed sample. Indicate expected response rates for the
collection as a whole. If the collection has been conducted previously, include the actual response
rate achieved during the last collection.

Part B #1. Annual Estimates of Universe of Completed Surveys

| Respondent Type | Survey Form | Estimated Annual Universe (FY 2018-2019) | Projected Annual Completes* | Projected Monthly Completes (Annual Completes / 12) |
|---|---|---|---|---|
| Non-Profit Institutions (~19% of universe) | Public Assistance Initial Survey, FEMA Form 519-0-32 (Telephone) | 624 | 328 | 27 |
| State, Local or Tribal Government (~81% of universe) | Public Assistance Initial Survey, FEMA Form 519-0-32 (Telephone) | 2,653 | 1,394 | 116 |
| Non-Profit Institutions (~19% of universe) | Public Assistance Initial Survey, FEMA Form 519-0-33 (Internet) | 156 | 82 | 7 |
| State, Local or Tribal Government (~81% of universe) | Public Assistance Initial Survey, FEMA Form 519-0-33 (Internet) | 663 | 348 | 29 |
| PAI Subtotal | | 4,096 | 2,152 | 179 |
| Non-Profit Institutions (~19% of universe) | Public Assistance Assessment Survey, FEMA Form 519-0-34 (Telephone) | 399 | 201 | 17 |
| State, Local or Tribal Government (~81% of universe) | Public Assistance Assessment Survey, FEMA Form 519-0-34 (Telephone) | 1,702 | 857 | 71 |
| Non-Profit Institutions (~19% of universe) | Public Assistance Assessment Survey, FEMA Form 519-0-35 (Internet) | 100 | 50 | 4 |
| State, Local or Tribal Government (~81% of universe) | Public Assistance Assessment Survey, FEMA Form 519-0-35 (Internet) | 425 | 214 | 18 |
| PAA Subtotal | | 2,626 | 1,322 | 110 |
| Grand Total | | 6,722 | 3,474 | 289 |

*Based on historical response rates: PAI 52.56%; PAA 50.35%.

Note. Calculations proceed from left to right in the table, and these figures are estimates only. Because the respondent pool is small, we typically survey the entire population, so the projections have no effect on sampling methodology. Annual subtotals and grand totals are rounded to the nearest whole number.
The population of respondents consists of organizations or entities that applied for Public Assistance
(PA) following a major disaster declared under the Stafford Act (42 U.S.C. §§ 5121 et seq.). The entities
consist of approximately 81% local, state, territorial, or tribal governments and 19% eligible private
non-profit organizations based on FY 2018-2019.
Statistical methods may be applied to the analysis of the survey results, but there is no statistical
sampling methodology because all qualified applicants are surveyed each month.
There are two surveys in this collection. Sample for both surveys will be imported on a monthly basis.
Periodically, sample may be imported on a bi-weekly basis depending on disaster activity and survey
administration needs.
1) Applicants who completed a Recovery Scoping Meeting will be qualified to take the Public
Assistance Initial (PAI) Survey. These applicants may be eligible or ineligible for FEMA Public
Assistance.
2) Applicants who had funds obligated for at least one of their projects will be qualified for the
Public Assistance Assessment (PAA) Survey, which means the respondent universe includes eligible applicants only. Applicants who participated in Stafford Act Section 428 Alternative Procedures for one of their projects (received funds at the beginning of the process) or had a specialized project (increased completion time/complexity) may be placed on hold for surveying. Ideally, these applicants will be surveyed closer to the end of the PA process to capture a more accurate representation of satisfaction. A sketch of this qualification logic follows.
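To make the two qualification rules and the hold condition concrete, here is a minimal sketch in Python. The field names (e.g., completed_recovery_scoping_meeting) are hypothetical placeholders for illustration only, not actual FEMA data elements.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    # Hypothetical fields; actual FEMA data schemas are not described here.
    applicant_id: str
    completed_recovery_scoping_meeting: bool
    projects_with_obligated_funds: int
    uses_section_428_procedures: bool
    has_specialized_project: bool

def qualifies_for_pai(a: Applicant) -> bool:
    # PAI: qualified after completing a Recovery Scoping Meeting,
    # whether or not the applicant is ultimately eligible for PA funds.
    return a.completed_recovery_scoping_meeting

def paa_status(a: Applicant) -> str:
    # PAA: qualified once funds are obligated for at least one project,
    # so the PAA universe includes eligible applicants only.
    if a.projects_with_obligated_funds < 1:
        return "not qualified"
    # Section 428 Alternative Procedures or specialized projects may be
    # placed on hold and surveyed closer to the end of the PA process.
    if a.uses_section_428_procedures or a.has_specialized_project:
        return "hold"
    return "qualified"
```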
In extreme circumstances (e.g., COVID-19), sampling procedures may be adapted to account for unforeseen PA programmatic changes. For example, if PA stopped conducting Recovery Scoping Meetings, we could not pull sample based on that flag. We would keep the data pull as consistent as possible with our normal procedures and would likely import the data 60-90 days after the declaration date instead.
Both surveys will eventually be mixed mode (phone and electronic). We currently do not have an electronic administration capability, but we are in the process of acquiring new survey software for which electronic administration is a requirement. We anticipate that electronic administration will be operational for the revised surveys, which is why it is included in this submission. We estimate that 80% of survey completions will be phone administered and 20% will be completed via the internet.
PA applicants may be surveyed at two different time points during the PA process through the PAI and
the PAA Surveys. The two surveys ask different questions, reflecting the interactions that occur during specific time frames. The PAI Survey was designed to capture feedback at the onset of the PA
process, whereas PAA captures more general satisfaction at the end of the PA process. The surveys
were split up this way because the Public Assistance process can take years to complete. During the
first few weeks, Public Assistance applicants receive initial directions, deadlines, information on what
types of assistance they might qualify for, and other important guidelines from their PA representative
that can set the tone for the rest of the process. If applicants were only surveyed at the end of the
process, they would have a difficult time recalling the early interactions.
The PAI Survey was first approved in the previous information collection, and administration began at the start of FY 2018. To estimate the population for the PAI Survey, we examined the respondent pool for FY 2018-2019. Across those two years, an average of 4,096 applicants per year were qualified to participate in the PAI Survey. Applying the 52.56% response rate yields roughly 2,152 annual completions [1] and 179 monthly completions.
The PAA Survey was also first approved in the previous information collection, and administration began at the start of FY 2018. To estimate the population for the PAA Survey, we examined the respondent pool for FY 2018-2019. Across those two years, an average of 2,626 applicants per year were qualified to participate in the PAA Survey. Applying the 50.35% response rate yields roughly 1,322 annual completions and 110 monthly completions.

[1] There is a slight rounding error: 4,096 × .5256 = 2,152.85. In the table, the response rate was multiplied by each respondent type (non-profit vs. state) and administration mode (phone vs. internet) to get completion projections for each subpopulation. Those projections were rounded to the nearest whole number and summed, which gave an overall estimate of 2,152 annual completions (as opposed to 2,153).
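This rounding behavior is easy to reproduce. The short sketch below uses the PAI universe counts from the table; it is a worked check of the footnote's arithmetic, nothing more.

```python
# PAI universe counts from the table above (FY 2018-2019 annual averages).
PAI_RATE = 0.5256
pai_universe = {
    ("non-profit", "phone"): 624,
    ("government", "phone"): 2653,
    ("non-profit", "internet"): 156,
    ("government", "internet"): 663,
}

# As in the table: round each subpopulation's projection, then sum.
per_cell = {k: round(v * PAI_RATE) for k, v in pai_universe.items()}
summed = sum(per_cell.values())    # 328 + 1394 + 82 + 348 = 2152

# Multiplying the total universe by the rate first rounds to 2153,
# which is exactly the discrepancy the footnote describes.
total_first = round(sum(pai_universe.values()) * PAI_RATE)

print(summed, total_first)   # 2152 2153
print(round(summed / 12))    # 179 projected monthly completions
```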

In sum, we estimate 2,152 completions annually for the PAI Survey and 1,322 completions annually for the PAA Survey, for a total of 3,474 completions. Qualitative research (e.g., focus groups and interviews) will not be subject to statistical sampling methods, as it is usually based on convenience samples, or to statistical analysis.
These numbers are rough approximations; some months are going to be more heavily affected by
disasters than others. Additionally, there is great variance in disaster activity from year to year. High
hurricane activity, for example, can cause a dramatic increase in the number of PA applicants for a given
year. There is no statistical sampling methodology for the PA Information Collection because we
survey the entire qualified population each month. These numbers are meant to be an estimate of the
population only.
If a special circumstance arose (e.g., COVID-19 or another catastrophic event) in which there was more available sample than we could contact in a month, or more than is necessary to make reliable generalizations about the population, we would draw a random sample using a 95% confidence level with a 5% margin of error. These situations are rare and impossible to anticipate.
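For reference, a common way to size such a random sample is Cochran's formula with a finite population correction, assuming maximum variability (p = 0.5). The document does not specify the exact computation FEMA would use, so the sketch below is an assumption.

```python
import math

def required_sample_size(population: int, z: float = 1.96,
                         margin: float = 0.05, p: float = 0.5) -> int:
    """Sample size for a proportion at a given confidence level and margin
    of error, with finite population correction.

    Assumes the standard normal-approximation (Cochran) formula; this is
    an illustrative assumption, not FEMA's documented procedure."""
    # Infinite-population sample size.
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction, relevant for small universes.
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Hypothetical example: a catastrophic event yields 5,000 qualified
# applicants in one month; a 95%/±5% sample needs about 357 completions.
print(required_sample_size(5000))  # -> 357
```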
Additionally, because of the nature of emergency management, there are sometimes situations where program changes or special circumstances (e.g., COVID-19) can make survey questions irrelevant. In such situations, FEMA may choose to omit survey questions that are no longer applicable to respondents. This would not change the nature of the survey, except for a potential decrease in respondent burden. For example, if we knew respondents did not receive a Program Delivery Manager under a special disaster declaration, we would not ask them to rate their Program Delivery Manager.
Reports will be provided to Public Assistance and FEMA Headquarters management on a quarterly
basis. Management uses these reports to monitor performance and identify possible areas of
improvement. The reports will display summary statistics for demographic information to give an
overview of the population, as well as descriptive breakdowns for questions (e.g., means and
percentages). The reports will also display trends over time for certain items. Stakeholders may request additional reports more frequently than quarterly if they want to examine trends for a particular disaster, state, or region, but that would be on a request-only basis. Stakeholders may also request more in-depth analysis from statisticians if there are significant changes in customer satisfaction that warrant further analysis. Statisticians may examine demographics (e.g., work experience, previous experience with PA, FEMA Region) crossed with individual items, and perform statistical tests such as correlations, t-tests, crosstabs with Pearson's chi-square, analysis of variance (ANOVA), and regression. Additionally, there is now a Tableau dashboard for each of the PA surveys that displays descriptive statistics (e.g., averages) for selected survey questions. Dashboards are refreshed on a monthly basis.
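As an illustration of the kinds of tests named above, the sketch below runs a crosstab with Pearson's chi-square and a one-way ANOVA using scipy on fabricated example data; the column names and values are hypothetical and do not represent actual survey results.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Fabricated survey extract; column names are illustrative only.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "fema_region": rng.choice(["R4", "R6", "R9"], size=300),
    "prior_pa_experience": rng.choice(["yes", "no"], size=300),
    "satisfied": rng.choice(["yes", "no"], size=300),
    "overall_score": rng.integers(1, 6, size=300),  # 1-5 rating item
})

# Crosstab with Pearson's chi-square: satisfaction by prior PA experience.
table = pd.crosstab(df["prior_pa_experience"], df["satisfied"])
chi2, p_chi2, dof, _ = stats.chi2_contingency(table)

# One-way ANOVA: mean overall score compared across FEMA regions.
groups = [g["overall_score"] for _, g in df.groupby("fema_region")]
f_stat, p_anova = stats.f_oneway(*groups)

print(f"chi-square={chi2:.2f} (p={p_chi2:.3f}); "
      f"ANOVA F={f_stat:.2f} (p={p_anova:.3f})")
```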
2. Describe the procedures for the collection of information including:
-Statistical methodology for stratification and sample selection:
The sampling frame is the universe of applicants for Public Assistance, which consists of state, local, territorial, and tribal governments, and eligible private non-profit organizations. The Customer Survey & Analysis Section (CSA) imports information concerning each PA entity's point of contact (POC) from the Operational Data Store (ODS) and the Enterprise Data Warehouse (EDW) into the Customer Satisfaction Analysis System (CSAS). Specifically, contact/call lists include POC names, phone numbers, and email addresses.
-Estimation procedure:
This collection is based on surveying the entire population of applicants for Public Assistance; therefore,
no estimation procedure is utilized.
-Degree of accuracy needed for the purpose described in the justification:
Surveying the entire universe of Public Assistance applicants, including state, local and tribal
governments, and eligible private non-profit organizations, yields the highest degree of precision and
confidence. The monthly universe is typically small, usually only a couple hundred applicants.
Surveying the entire universe ensures we have enough survey completions to make reliable and valid
conclusions about the population.
-Unusual problems requiring specialized sampling procedures:
We do not anticipate unusual problems or hard-to-reach populations, other than applicants who do not have up-to-date phone numbers or email addresses. For applicants who have recently applied for Public Assistance, contact information is generally accurate.
-Any use of periodic (less frequent than annual) data collection cycles to reduce burden:
Sample for both surveys will typically be imported on a monthly basis. Periodically, sample may be
imported on a bi-weekly basis depending on disaster activity and survey administration needs.
Respondents would likely have difficulty with memory recall if data collection were less frequent than
annual, which would lead to distortions in performance measures. Applicants who completed a
Recovery Scoping Meeting will be eligible to take the PAI Survey. Applicants who had funds obligated
for at least one of their projects will be eligible for the PAA Survey.
Because surveying is based on the stage in the Public Assistance process rather than a fixed amount of time, applicants with less complex projects will likely take the PAA Survey sooner than applicants with more complex projects.
3. Describe methods to maximize response rates and to deal with issues of non-response. The
accuracy and reliability of information collected must be shown to be adequate for intended uses.
For collections based on sampling, a special justification must be provided for any collection that
will not yield “reliable” data that can be generalized to the universe studied.
The average response rate for PAI Survey in FY 2018-2019 was 52.56%.
The average response rate for PAA Survey in FY 2018-2019 was 50.35%.
Taking both surveys into account for FY 2018-2019, the combined response rate was 51.70%.

There is no financial incentive for respondents to answer the survey questions. The incentive is the
opportunity for applicants to express their opinion and evaluate their satisfaction with their recent
service. We have found this to be an effective incentive. Our response rates have fallen from the
previous information collection, but according to the Pew Research Center, response rates have
continued to decline in recent years across all telephone surveys (Kennedy & Hartig, 2019). Possible
explanations for the response rate decline include a growing refusal among respondents to participate
and difficulties in contacting individuals due to the increased use of answering machines, call screening
devices, and cellular telephones (Tourangeau, 2004; Ehlen & Ehlen, 2007). In addition, new
technologies sometimes mistakenly flag survey calls- even those conducted by the government- as
“spam” (Kennedy & Hartig, 2019).
Although our response rates have declined, they are still significantly higher than industry averages. We believe we can improve response rates by incorporating an internet administration option.
Previous survey respondents have requested web survey forms. Internet administration will give
respondents more flexibility on when they complete the survey (e.g., they may be busy at work when we
call), as well as a sense of anonymity. Our goal was to incorporate internet administration in the
previous information collection, but we were unable to accomplish this goal due to setbacks with
software acquisition. We are in the process of acquiring new survey administration software, and
internet administration capability is a requirement.
Once internet administration is operational, all applicants who have an email address on file will receive
an email invitation with the ability to complete the survey via a web link. Reminder emails will be sent
if the applicant does not complete the web survey in a certain time period. Phone interviewers will
contact applicants who don’t have an email address on file, as well as contact applicants who do not
complete the web survey after a certain time period has elapsed. We are attempting to reduce non-sampling error, caused by non-response or bad contact data, by contacting applicants through mixed administration modes (e.g., internet and phone).
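The contact sequence just described (email invitation, roughly two reminders within a 2-week window, then phone follow-up) could be scheduled with logic along the following lines. The specific intervals and return values are illustrative assumptions, not documented FEMA parameters.

```python
import datetime as dt
from dataclasses import dataclass
from typing import Optional

@dataclass
class SurveyCase:
    email: Optional[str]               # None if no email address on file
    invited_on: Optional[dt.date] = None
    reminders_sent: int = 0
    completed: bool = False

def next_action(case: SurveyCase, today: dt.date) -> str:
    """Decide the next contact step for one applicant.

    The ~1-week reminder spacing and 2-week phone escalation are
    illustrative assumptions consistent with the text above."""
    if case.completed:
        return "done"
    if case.email is None:
        return "phone_queue"            # no email on file: interviewers call
    if case.invited_on is None:
        return "send_email_invitation"
    elapsed = (today - case.invited_on).days
    if case.reminders_sent < 2 and elapsed >= 7 * (case.reminders_sent + 1):
        return "send_email_reminder"    # ~two reminders in a 2-week window
    if elapsed >= 14:
        return "phone_queue"            # unanswered web survey moves to phone
    return "wait"
```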
The goal is that the entire available universe will be contacted during the survey time period, so there is
no sampling method. Everyone will be contacted to ensure we have enough information collected to
make generalizations about the population. If we have months with low disaster activity and there is not
enough survey data available to draw valid conclusions, a disclaimer about the small sample size and
low precision rate will be included at the beginning of any reports.
Below are survey efforts that will be performed regularly in order to maintain high response rates.
• Scheduling of phone surveys will be during normal business hours. Hours may be changed
depending on disaster activity and time zone of the respondents being surveyed.
• Follow-ups or reminders in the form of emails or phone calls will be used to encourage
participation.
• Callbacks will be attempted to applicants who request a different time/day to take the survey that is
more convenient.
• The opening statement will explain the purpose of the survey and that participation is voluntary.
Respondents will also be told that the survey phone call will not affect the outcome of their
application for FEMA assistance.

• Multiple attempts will be made to contact each applicant. Those who receive the web survey will be sent approximately two email reminders within roughly a 2-week period. If the survey is not completed within a certain time frame, the applicant data may be placed in the phone queue.
• The questions are straightforward, short, and easy to answer. Several questions have been revised to cut down on wordiness. Terminology has been revised to reduce confusion.
• On-going training will be provided to interviewers.

4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an
effective means of refining collections of information to minimize burden and improve utility.
Tests must be approved if they call for answers to identical questions from 10 or more
respondents. A proposed test or set of tests may be submitted for approval separately or in
combination with the main collection of information.
At the beginning of each collection period, a pilot test may be conducted with fewer than 10 persons to discover any potential problems with the survey instrument or process. For quality assurance purposes, data from the pilot will be reviewed and improvements made to the survey process as deemed necessary.
Read aloud testing was conducted with multiple interviewers for time trials, as well as to gain feedback
on awkward wording/phrasing. Public Assistance subject matter experts were asked to read the survey
to assess question clarity and appropriateness of terminology.
5. Provide the name and telephone number of individuals consulted on statistical aspects of the
design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will
actually collect and/or analyze the information for the agency.

Kristin Brooks, Ph.D.
Statistician
Customer Survey and Analysis Section
Reports and Analytics Division, FEMA
[email protected]
Office: (940) 891-8579; Cell: (202) 826-6291

References

Ehlen, J., & Ehlen, P. (2007). Cellular-only substitution in the United States as lifestyle adoption. Public Opinion Quarterly, 71, 717–733.

Kennedy, C., & Hartig, H. (2019). Response rates in telephone surveys have resumed their decline. Pew Research Center. Retrieved from https://www.pewresearch.org/fact-tank/2019/02/27/response-rates-in-telephone-surveys-have-resumed-their-decline/

Tourangeau, R. (2004). Survey research and societal change. Annual Review of Psychology, 55, 775–801.

