
February 23, 2023

Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660-0107


Title: Public Assistance Customer Satisfaction Surveys


Form Number(s):

  1. FEMA Form FF-104-FY-21-155 (formerly 519-0-32), Public Assistance Initial Customer Satisfaction Survey (Telephone);

  2. FEMA Form FF-104-FY-21-156 (formerly 519-0-33), Public Assistance Initial Customer Satisfaction Survey (Internet);

  3. FEMA Form FF-104-FY-21-157 (formerly 519-0-34), Public Assistance Assessment Customer Satisfaction Survey (Telephone);

  4. FEMA Form FF-104-FY-21-158 (formerly 519-0-35), Public Assistance Assessment Customer Satisfaction Survey (Internet)

B. Collections of Information Employing Statistical Methods

When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:


1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State, local, and Tribal government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


The target population consists of organizations or entities that applied for Public Assistance (PA) following a major disaster declared under the Stafford Act (42 U.S.C. § 5121 et seq.). Based on FY 2018-2019 applicant data, these entities are approximately 81% local, state, territorial, or tribal governments and 19% eligible private non-profit organizations. More recent data were not used because COVID-19 declarations in 2020-2021 were highly atypical and significantly skewed population estimates.


Statistical methods may be applied to the analysis of the survey results, but there is no statistical sampling methodology because all qualified applicants are surveyed each quarter.


There are two surveys in this collection. The sample for both surveys is imported on a disaster basis, and sampling frequency depends on disaster activity and sample availability. Eligibility criteria for each survey are as follows; a brief sketch of the qualification logic appears after the list:


  1. Public Assistance Initial Survey (PAI) - Includes applicants who completed a Request for Public Assistance (RPA). Applicants are qualified to receive the survey roughly 60 days after the disaster declaration, once the majority (~70%) of Recovery Scoping Meetings for the disaster have been completed. These applicants may be eligible or ineligible for FEMA Public Assistance.

  2. Public Assistance Assessment Survey (PAA) - Includes applicants who were eligible for Public Assistance. Applicants are qualified to receive the survey roughly 210 days after the disaster declaration, once the majority (~70%) of funds have been obligated to projects for that disaster. In some cases, applicants who participated in Stafford Act Section 428 Alternative Procedures for one of their projects (and therefore received funds at the beginning of the process) or who had a specialized project (increased completion time or complexity) may be placed on hold for surveying. Ideally, these applicants are surveyed closer to the end of the PA process to obtain a more accurate representation of satisfaction.
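
The following is a minimal sketch of the qualification logic described above. The field names (declaration_date, rsm_completion_rate, funds_obligated_rate, is_eligible, on_hold) are hypothetical illustrations, not actual system fields:

```python
from datetime import date, timedelta

# Thresholds taken from the criteria above; all field names are hypothetical.
PAI_DAYS, PAA_DAYS = 60, 210
COMPLETION_THRESHOLD = 0.70  # ~70% of Recovery Scoping Meetings / obligated funds

def qualifies_for_pai(declaration_date, rsm_completion_rate, today=None):
    """Applicant submitted an RPA; disaster is ~60 days past declaration
    and most Recovery Scoping Meetings are complete."""
    today = today or date.today()
    return (today - declaration_date >= timedelta(days=PAI_DAYS)
            and rsm_completion_rate >= COMPLETION_THRESHOLD)

def qualifies_for_paa(declaration_date, funds_obligated_rate,
                      is_eligible, on_hold, today=None):
    """Applicant was eligible for PA; disaster is ~210 days past declaration,
    most funds are obligated, and the applicant is not on a survey hold."""
    today = today or date.today()
    return (is_eligible and not on_hold
            and today - declaration_date >= timedelta(days=PAA_DAYS)
            and funds_obligated_rate >= COMPLETION_THRESHOLD)
```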


In extreme circumstances (e.g., COVID-19), sampling procedures may be adapted to account for unforeseen PA programmatic changes. For example, if PA stopped conducting Recovery Scoping Meetings, a different qualification criterion might be needed to determine survey administration. We would keep the data pull as consistent as possible with our normal procedures.


Both surveys are currently mixed mode (phone and electronic). Electronic surveying began in July 2021. All applicants receive the electronic survey first; if it is not completed within two weeks, the applicant is forwarded to the phone surveying queue. Response rates for electronic surveys are lower than for phone surveys, but this was expected.

PA applicants may be surveyed at two points during the PA process, through the PAI and PAA Surveys. The two surveys ask different questions based on the interactions that occur during each time frame. The PAI Survey was designed to capture feedback at the onset of the PA process, whereas the PAA Survey captures more general satisfaction at the end. The surveys were divided this way because the Public Assistance process can take years to complete. During the first few weeks, applicants receive initial directions, deadlines, information on the types of assistance they might qualify for, and other important guidelines from their PA representative that can set the tone for the rest of the process. If applicants were surveyed only at the end of the process, they would have a difficult time recalling these early interactions.

Population estimates are rough approximations; some months are more heavily affected by disasters than others, and disaster activity varies greatly from year to year. High hurricane activity, for example, can dramatically increase the number of PA applicants in a given year. There is no statistical sampling methodology for the PA Information Collection because we survey the entire qualified population each month. These numbers are meant only as an estimate of the population.


If a special circumstance arose (e.g., COVID-19 or another catastrophic event) in which there was more available sample than we could contact in a month, or than is necessary to make reliable generalizations about the population, we would draw a random sample using a 95% confidence level with a 5% margin of error. These situations are rare and impossible to anticipate.
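
As a hedged illustration of what drawing such a sample would involve, the required sample size at a 95% confidence level and 5% margin of error can be computed with Cochran's formula plus a finite population correction; the population size used below is hypothetical:

```python
import math

def required_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Cochran's sample size for a proportion, with finite population correction.
    z = 1.96 for 95% confidence; p = 0.5 is the most conservative assumption."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size (~385)
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite population correction

# Hypothetical example: a catastrophic event yields 5,000 qualified applicants in one month.
print(required_sample_size(5000))  # -> 357
```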


Additionally, because of the nature of emergency management, program changes or special circumstances (e.g., COVID-19) can sometimes make survey questions irrelevant. In such situations, FEMA may choose to omit survey questions that are no longer applicable to respondents. This would not change the nature of the survey except for a potential decrease in respondent burden. For example, if we knew respondents did not receive a Program Delivery Manager under a special disaster declaration, we would not ask them to rate their Program Delivery Manager.


Qualitative research (e.g., focus groups and interviews) will not be subject to statistical sampling methods (it is usually based on a convenience sample) or statistical analysis.


Please reference the table below for projected annual universe, response rates, and completions.



Part B #1. Annual Estimates of Universe of Completed Surveys


| Respondent Type | Survey Form | Population Description | Estimated Annual Universe (FY 2018-2019) | Response Rate (FY 2020-2022) | Projected Annual Completes | Projected Monthly Completes (Annual/12) |
| --- | --- | --- | --- | --- | --- | --- |
| Non-Profit Institutions (~19% of universe) | Public Assistance Initial Survey, FEMA Form FF-104-FY-21-155 (formerly 519-0-32) (Telephone) | Submitted a Request for Public Assistance (RPA); approx. 60 days past declaration date; did not respond to electronic survey invitation | 692 | 40% | 277 | 23 |
| State, Local, or Tribal Government (~81% of universe) | Same as above | Same as above | 2,953 | 40% | 1,181 | 98 |
| Non-Profit Institutions (~19% of universe) | Public Assistance Initial Survey, FEMA Form FF-104-FY-21-156 (formerly 519-0-33) (Internet) | Submitted a Request for Public Assistance (RPA); approx. 60 days past declaration date | 778 | 11% | 86 | 7 |
| State, Local, or Tribal Government (~81% of universe) | Same as above | Same as above | 3,318 | 11% | 365 | 30 |
| Subtotal (PAI) | | Estimated annual universe = internet universe (phone is a subset of internet) | 4,096 | | 1,909 | 158 |
| Non-Profit Institutions (~19% of universe) | Public Assistance Assessment Survey, FEMA Form FF-104-FY-21-157 (formerly 519-0-34) (Telephone) | Eligible for FEMA Public Assistance; approx. 210 days past declaration date; did not respond to electronic survey invitation | 434 | 47% | 204 | 17 |
| State, Local, or Tribal Government (~81% of universe) | Same as above | Same as above | 1,850 | 47% | 870 | 72 |
| Non-Profit Institutions (~19% of universe) | Public Assistance Assessment Survey, FEMA Form FF-104-FY-21-158 (formerly 519-0-35) (Internet) | Eligible for FEMA Public Assistance; approx. 210 days past declaration date | 499 | 13% | 65 | 5 |
| State, Local, or Tribal Government (~81% of universe) | Same as above | Same as above | 2,127 | 13% | 277 | 23 |
| Subtotal (PAA) | | Estimated annual universe = internet universe (phone is a subset of internet) | 2,626 | | 1,416 | 118 |
| Grand Total | | | 6,722 | | 3,325 | 276 |


Note. Calculations were completed going from left to right in the table. These numbers are meant as an estimate only. We typically survey the entire population because the respondent pool is so small, so the projections have no effect on sampling methodology. Annual subtotals and grand totals are rounded to the nearest whole number.
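
As an illustrative check (not the official computation), the figures cascade because each applicant first receives the internet survey and the phone universe consists of internet non-respondents, as the subtotal rows state. A minimal sketch using the PAI numbers from the table:

```python
# PAI internet universes and response rates, taken from the table above.
internet_universe = {"non_profit": 778, "government": 3318}
INTERNET_RATE, PHONE_RATE = 0.11, 0.40

for group, n in internet_universe.items():
    internet_completes = round(n * INTERNET_RATE)         # e.g., 778 * 0.11 -> 86
    phone_universe = round(n * (1 - INTERNET_RATE))       # non-respondents: 778 * 0.89 -> 692
    phone_completes = round(phone_universe * PHONE_RATE)  # 692 * 0.40 -> 277
    print(group, internet_completes, phone_universe, phone_completes)
```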


2. Describe the procedures for the collection of information including:


-Statistical methodology for stratification and sample selection:


The sampling frame is the universe of applicants for Public Assistance: state, local, territorial, and tribal governments, and eligible private non-profit organizations. The Customer Survey & Analysis (CSA) Section imports information concerning each PA entity's point of contact (POC) from the Operational Data Store (ODS) and Enterprise Data Warehouse (EDW) into the Customer Satisfaction Analysis System (CSAS). Specifically, contact/call lists include POC names, phone numbers, and email addresses.


There is no stratification because all qualified applicants receive both Public Assistance Surveys. For details on the target population, see Question 1.


-Estimation procedure:


This collection is based on surveying the entire population of applicants for Public Assistance; therefore, no estimation procedure is utilized.


Reports are provided to Public Assistance and FEMA Headquarters management on a quarterly basis. Management uses these reports to monitor service delivery and identify possible areas of improvement. The reports include descriptive breakdowns for survey questions (e.g., means and percentages) and respondent counts, and they display trends over time for certain items. Stakeholders may request additional reports more frequently than quarterly, for example to examine trends for a particular disaster, state, or region, but only on a request basis. Stakeholders may also request more in-depth analysis from statisticians if there are significant changes in customer satisfaction that warrant it. Statisticians may examine demographics (e.g., work experience, previous experience with PA, FEMA Region) crossed with individual items, and perform statistical tests such as correlations, t-tests, crosstabs with Pearson's chi-square, analysis of variance (ANOVA), and regression. Additionally, there is now a Tableau dashboard for each of the PA surveys that displays descriptive statistics (e.g., averages) for selected survey questions. Dashboards are refreshed monthly.
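
As a non-authoritative sketch of the kinds of tests named above, using hypothetical column names and made-up values:

```python
import pandas as pd
from scipy import stats

# Hypothetical survey extract; column names and values are illustrative only.
df = pd.DataFrame({
    "satisfaction": [5, 4, 3, 5, 2, 4, 5, 3, 4, 5, 2, 4],
    "mode":         ["phone", "web"] * 6,
    "prior_pa":     ["yes", "yes", "no", "no", "yes", "no"] * 2,
})

# Independent-samples t-test: satisfaction by administration mode.
phone = df.loc[df["mode"] == "phone", "satisfaction"]
web = df.loc[df["mode"] == "web", "satisfaction"]
print(stats.ttest_ind(phone, web, equal_var=False))

# Crosstab with Pearson's chi-square: mode vs. prior PA experience.
table = pd.crosstab(df["mode"], df["prior_pa"])
print(stats.chi2_contingency(table))

# One-way ANOVA would compare satisfaction across more than two groups,
# e.g., FEMA Regions: stats.f_oneway(group1, group2, group3)
```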


-Degree of accuracy needed for the purpose described in the justification:


Surveying the entire universe of Public Assistance applicants, including state, local, territorial, and tribal governments and eligible private non-profit organizations, yields the highest degree of precision and confidence. The monthly universe is typically small, usually only a couple hundred applicants. Surveying the entire universe ensures enough survey completions to make reliable and valid conclusions about the population.

-Unusual problems requiring specialized sampling procedures:


There is no expectation of unusual problems or hard-to-reach populations, other than applicants who do not have up-to-date phone numbers or email addresses. Contact information for applicants who have recently applied for Public Assistance is generally accurate.


-Any use of periodic (less frequent than annual) data collection cycles to reduce burden:


No data are collected less frequently than annually. The sample for both surveys is typically imported on a disaster basis. The PAI Survey is conducted roughly 60 days after the disaster declaration, and the PAA Survey roughly 210 days after. Respondents would likely have difficulty with memory recall if data collection were less frequent than annual, which would distort performance measures.


Because surveying is based to some extent on the status of the disaster (e.g., the percentage of Recovery Scoping Meetings completed or funds obligated), rather than a fixed amount of time, disasters with less complex projects or lower volume may be surveyed sooner than more catastrophic disasters.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


The entire collection has a response rate of roughly 37% (PAI: 35%; PAA: 39%).


Response rates for phone surveys were examined for FY 2020-2022, whereas data for electronic surveys were available only from July 2021 through September 2022.
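
For reference, one simple way to express the combined figure (an illustrative formulation, not necessarily the exact weighting used) pools completes over eligible invitations across the two surveys:

$$\mathrm{RR}_{\text{overall}} = \frac{C_{\text{PAI}} + C_{\text{PAA}}}{N_{\text{PAI}} + N_{\text{PAA}}}$$

where C denotes completed surveys and N denotes eligible applicants invited for each survey.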


There is no financial incentive for respondents to answer the survey questions. The incentive is the opportunity for applicants to express their opinion and evaluate their satisfaction with their recent service, and we have found this to be effective. Our response rates have fallen since the previous information collection, but according to the Pew Research Center, response rates have continued to decline in recent years across all telephone surveys (Kennedy & Hartig, 2019). Possible explanations for the decline include a growing refusal among respondents to participate and difficulty contacting individuals due to the increased use of answering machines, call-screening devices, and cellular telephones (Tourangeau, 2004; Ehlen & Ehlen, 2007). In addition, new technologies sometimes mistakenly flag survey calls, even those conducted by the government, as "spam" (Kennedy & Hartig, 2019).


Although our response rates have declined, they remain significantly higher than industry averages. The addition of an electronic administration option is preferred by some respondents and is more efficient. Previous survey respondents have requested web survey forms, and electronic administration gives respondents more flexibility in when they complete the survey (e.g., they may be busy at work when we call), as well as a sense of anonymity.


All applicants who have an email address on file receive an email invitation with a web link to complete the survey. Reminder emails are sent if the applicant does not complete the web survey within a certain time period. Phone interviewers contact applicants who do not have an email address on file (rare), as well as applicants who do not complete the web survey after approximately two weeks. Contacting applicants in mixed administration modes (internet and phone) is intended to reduce non-sampling error caused by non-response or bad contact data.
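
A minimal sketch of this mixed-mode escalation, with hypothetical status labels and the approximately two-week web window described above:

```python
from datetime import date, timedelta

WEB_WINDOW = timedelta(days=14)  # approximate two-week web survey window

def next_step(invited_on, completed, has_email, today=None):
    """Route an applicant through the mixed-mode workflow described above.
    Status labels are hypothetical, not actual CSAS states."""
    today = today or date.today()
    if completed:
        return "done"
    if not has_email:
        return "phone_queue"            # rare: no email on file, go straight to phone
    if today - invited_on < WEB_WINDOW:
        return "send_email_reminder"    # reminders during the web window
    return "phone_queue"                # unanswered after ~2 weeks: forward to phone
```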


The entire available universe will be contacted during the survey time period, so there is no sampling method; everyone is contacted to ensure enough information is collected to make generalizations about the population. If there are months with low disaster activity and not enough survey data to draw valid conclusions, a disclaimer about the small sample size and low precision will be included at the beginning of any reports.


Below are survey efforts that will be performed regularly to maintain high response rates.

  • Phone surveys will be scheduled during normal business hours. Hours may be adjusted depending on disaster activity and the time zone of the respondents being surveyed.

  • Follow-ups or reminders in the form of emails or phone calls will be used to encourage participation.

  • Callbacks will be attempted for applicants who request a different, more convenient time or day to take the survey.

  • The opening statement will explain the purpose of the survey and that participation is voluntary. Respondents will also be told that the survey phone call will not affect the outcome of their application for FEMA assistance.

  • Multiple attempts will be made to contact each applicant. Those who receive the web survey will be sent approximately two email reminders within roughly a two-week period. If the survey is not completed within a certain time frame, the applicant may be placed in the phone queue.

  • The questions are straightforward, short, and easy to answer. Several questions have been revised to cut down on wordiness. Terminology has been revised to reduce confusion.

  • On-going training will be provided to interviewers.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


No testing was done for this collection because the surveys were not revised.


FEMA has administered customer satisfaction surveys for the last 10-15 years, and the initial surveys were designed based on comments from past focus groups and contractor recommendations.

Whenever revisions are made to the surveys, staff conduct readability tests to help ensure reliability and accuracy. This includes question layout, wording, definitions, and timing. Questions are also reviewed for plain language. Thorough testing is performed in each administration mode by multiple staff. Public Assistance subject matter experts are asked to read the survey to assess question clarity and the appropriateness of terminology.


Discussions with interviewers who have one-on-one experience with public respondents are held to inform revisions to survey content.



5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The Customer Survey & Analysis (CSA) Section plans, designs, administers, and analyzes the results of the surveys. This includes the survey methodology and sample selection, as well as the collection, tabulation, and reporting of the data.


Dr. Kristin Brooks, Statistician

Customer Survey & Analysis

Federal Emergency Management Agency

202-826-6291


Dr. Brandi Vironda, Statistician

Customer Survey & Analysis

Federal Emergency Management Agency

940-205-9576



Kristi Lupkey, Supervisory Program Analyst

Customer Survey & Analysis

Federal Emergency Management Agency

940-368-2571


Chad Faber, Section Chief

Customer Survey & Analysis

Federal Emergency Management Agency

940-535-8364


Jason Salazar, Program Analyst

Customer Survey & Analysis

Federal Emergency Management Agency

940-268-9245





References



Ehlen, J., & Ehlen, P. (2007). Cellular-only substitution in the United States as lifestyle adoption. Public Opinion Quarterly, 71, 717-733.

Kennedy, C., & Hartig, H. (2019). Response rates in telephone surveys have resumed their decline. Pew Research Center. Retrieved from: https://www.pewresearch.org/fact-tank/2019/02/27/response-rates-in-telephone-surveys-have-resumed-their-decline/

Tourangeau, R. (2004). Survey research and societal change. Annual Review of Psychology, 55, 775-801.



