
March 1, 2023

Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660 - 0107


Title: Public Assistance Customer Satisfaction Surveys


Form Number(s):

  1. FEMA Form FF-104-FY-21-155 (formerly 519-0-32), Public Assistance Initial Customer Satisfaction Survey (Telephone);

  2. FEMA Form FF-104-FY-21-156 (formerly 519-0-33), Public Assistance Initial Customer Satisfaction Survey (Internet);

  3. FEMA Form FF-104-FY-21-157 (formerly 519-0-34), Public Assistance Assessment Customer Satisfaction Survey (Telephone);

  4. FEMA Form FF-104-FY-21-158 (formerly 519-0-35), Public Assistance Assessment Customer Satisfaction Survey (Internet)

  5. FEMA Manual FM-104-FY-22-102, CSA Qualitative Research Protocol


A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information. Provide a detailed description of the nature and source of the information to be collected.


This collection contains two (2) surveys and two (2) qualitative methods. Surveys in this collection support the Agency’s 2022-2026 Strategic Plan: Unify coordination and delivery of Federal Assistance (Objective 3.3). The Public Assistance Program (PA) utilizes survey feedback to assess applicant satisfaction with service delivery. Survey questions are used for general performance management and support PA’s reporting for the Government Performance and Results Act (GPRA). Examples of survey questions that are currently used for GPRA include:


  • Overall customer service

  • Helpfulness of PA staff throughout the PA process 

  • Reasonableness of the level of documentation required for grant processing

  • Satisfaction with the simplicity of the Grants Portal

  • Timeliness of when the PA grant award was approved


The target population consists of organizations or entities that applied for Public Assistance (PA) following a major disaster declared under the Stafford Act (42 U.S.C. §§ 5121 et seq). Applicants for Public Assistance are surveyed at the beginning and end of the customer journey. Applicants surveyed at the beginning of the process may be eligible or ineligible for funding, whereas applicants surveyed at the end of the process are eligible and have received funding for at least one of their projects.


The following legal authorities mandate the collection of the information in this request:


  • The September 11, 1993, Executive Order 12862, “Setting Customer Service Standards,” and its March 23, 1995, Memorandum addendum, “Improving Customer Service,” require that all Federal agencies ask their customers what is most important to them and survey their customers to determine the kind and quality of services the customers want and their level of satisfaction with existing services. The Government Performance and Results Act (GPRA) of 1993 requires agencies to set missions and goals and to measure performance against them.

  • The E-Government Act of 2002 (Public Law 107-347) promotes innovative ways to improve Government performance, including collaboration on the use of information technology to improve the delivery of Government information and services.

  • The GPRA Modernization Act of 2010 requires quarterly performance assessments of Government programs for purposes of assessing agency performance and improvement, and it establishes agency Performance Improvement Officers and the Performance Improvement Council. Executive Order 13571, “Streamlining Service Delivery and Improving Customer Service,” and its June 13, 2011, Memorandum, “Implementing Executive Order 13571 on Streamlining Service Delivery and Improving Customer Service,” set out guidelines for establishing customer service plans and activities; they also expand the definition of customer and encourage the use of a broader set of tools to solicit actionable, timely customer feedback, capture insights, and identify early warning signals.

  • Following the Sandy Recovery Improvement Act (SRIA) of 2013 and the response provided by FEMA staff from all divisions during Hurricane Sandy, the Disaster Survivor Assistance (DSA) Program was formed to provide additional in-person customer service during the initial phase of the recovery process.


The survey results provide FEMA an overall gauge of performance at different points in the Public Assistance process. Drops in overall satisfaction or customer service ratings prompt analysts to examine specific survey questions more closely in order to pinpoint underlying causes for dissatisfaction and identify possible strategies for improvement.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Provide a detailed description of: how the information will be shared, if applicable, and for what programmatic purpose.


This collection includes FEMA’s Public Assistance Customer Satisfaction Surveys. The collection is managed by the Recovery Directorate through the Reporting & Analytics Division’s Customer Survey & Analysis (CSA) Section. CSA is responsible for administering the surveys and reporting survey results. CSA also assists with survey design in conjunction with stakeholders. Questions used for performance management are chosen by the FEMA Recovery Performance Management Team and Leadership.


FEMA’s mission is helping people before, during and after disasters. The purpose of the Public Assistance Customer Satisfaction Surveys is to assess customer satisfaction with the Public Assistance Program, and to improve the quality of service for applicants (State, Local, Tribal governments, and eligible Private Non-Profit organizations) who have been affected by a disaster. The results provide continuous feedback and help management make data-driven decisions to improve service delivery across all customer touchpoints. The survey results are also used in FEMA’s GPRA measures and performance measures under guidance from the FEMA Recovery Performance Management Team. The questions utilized may change over time as priorities evolve.


Survey results reports are usually distributed by email to stakeholders, including Public Assistance Leadership and the Recovery Reporting and Analytics Branch. Reports are distributed on a quarterly basis and include descriptive breakdowns of each question (e.g., means and percentages). Stakeholders may request reports more often than quarterly if they want to examine customer satisfaction for a given disaster, state, or FEMA Region. Additionally, there is a Tableau Dashboard, accessible to anyone in FEMA, that displays survey results (averages for select questions). The dashboards are refreshed on a monthly basis.

The purpose for each survey, as a part of our ongoing improvement process and mandates from various Acts and Executive Orders, is as follows:


FEMA Form FF-104-FY-21-155 (formerly 519-0-32) Phone and FEMA Form FF-104-FY-21-156 (formerly 519-0-33) Electronic, Public Assistance Initial (PAI) Survey assesses whether applicants are satisfied with the service and materials they receive from Public Assistance at the onset of the process. All applicants have a Recovery Scoping Meeting where their Program Delivery Manager sets expectations and provides timelines. Applicants receive important instructions and materials that can set the tone for the rest of the grant process. A disaster is qualified for surveying when ~60 days have elapsed since the declaration date and ~70% of Recovery Scoping Meetings have been completed.


FEMA Form FF-104-FY-21-157 (formerly 519-0-34) Phone and FEMA Form FF-104-FY-21-158 (formerly 519-0-35) Electronic, Public Assistance Assessment (PAA) Survey assesses customer satisfaction throughout the entire Public Assistance process. Survey topics include knowledge and helpfulness of FEMA representatives, timeliness of awards, simplicity of the process, reasonableness of requirements, accuracy of materials, satisfaction with communication, and usability of the Grants Portal. A disaster is qualified for surveying when ~210 days have elapsed since the declaration date and ~70% of funds have been obligated to projects.
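The disaster-qualification thresholds described above can be expressed as simple rules. The sketch below is illustrative only: the function and field names are hypothetical, and the ~60/~210-day and ~70% values are the approximate thresholds cited in this section.

```python
# Illustrative sketch of the sampling-qualification thresholds described above.
# Function and field names are hypothetical; the day counts and 70% thresholds
# are the approximate values cited in this section.

def pai_qualified(days_since_declaration: int, pct_rsm_completed: float) -> bool:
    """PA Initial Survey: ~60 days elapsed and ~70% of Recovery Scoping Meetings completed."""
    return days_since_declaration >= 60 and pct_rsm_completed >= 0.70

def paa_qualified(days_since_declaration: int, pct_funds_obligated: float) -> bool:
    """PA Assessment Survey: ~210 days elapsed and ~70% of funds obligated to projects."""
    return days_since_declaration >= 210 and pct_funds_obligated >= 0.70

print(pai_qualified(75, 0.82))   # True
print(paa_qualified(180, 0.90))  # False: fewer than ~210 days have elapsed
```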


In addition to the surveys, specialized qualitative research (e.g., focus groups, interviews) may be conducted periodically to assess program areas or program changes that the Public Assistance surveys do not capture. These projects are usually based on a convenience sample, and the target population will vary depending on the research question. Such projects are conducted at the request of Public Assistance when funding is available. This type of research allows for flexibility to assess programmatic changes that surveys are unable to capture. Examples might include technological upgrades to the Grants Portal or changes to existing programs such as 428 Alternative Procedures. Qualitative research can also help inform survey development by identifying new topic areas that need to be assessed. Funding is usually limited, so sampling is often restricted to a few geographic regions. Focus groups are more feasible when applicants are densely packed in one area, whereas interviews are more practical when applicants are spread out over a geographic region.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


The Customer Survey and Analysis Section (CSA) is responsible for administering the Public Assistance Surveys. In the previous submission of this collection, a software acquisition that would enable electronic survey administration was in process but not yet implemented. Electronic survey administration began in July 2021. Currently, both phone and electronic survey administration are utilized.


Most applicant organizations responding to the FEMA Public Assistance Customer Satisfaction Surveys have an email address on file. All applicants initially receive electronic surveys. In the rare case where an applicant does not have a valid email address on file, a phone survey is administered instead. All electronic surveys are open for two weeks. Once the electronic surveying period closes, applicants who have not completed an electronic survey are contacted via phone.


Response rates have declined since the previous submission of this collection, although they remain relatively high compared to industry standards. For the collection as a whole, response rates are around 37%. The Public Assistance Assessment Survey has slightly higher response rates than the Public Assistance Initial Survey (39% vs. 35%), and phone surveys have significantly higher response rates than electronic surveys (42% vs. 12%). Based on previous research, response rates for electronic surveys were expected to be lower than phone administered surveys. For example, in a research study examining response rates by administration mode, telephone administered surveys produced the highest response rates (30.2%), whereas internet administered surveys had the lowest response rates (4.7%; Sinclair et al., 2012).


Incorporating electronic administration has allowed respondents, who are often extremely busy responding to disaster operations, to complete surveys when it is convenient. Electronic surveys are also more efficient and reduce overall burden. To keep response rates high, phone surveys are still offered if electronic surveys are not completed in a 2-week timeframe.


Usability testing has been conducted on this collection. As a result, a reduction of 63 burden hours has been recognized and included as an update to the collection.



4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


As this collection was just approved last year, CSA did not go through rigorous efforts to identify duplication for this filing.


As of last filing, the information gathered in the survey is not available from any other source. CSA met with stakeholders to ensure the surveys are adequately assessing the Public Assistance Program and reflect current practices.


5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.


There is no impact from this collection of information on small businesses or other small entities.


6. Describe the consequence to Federal/FEMA program or policy activities if the collection of information is not conducted, or is conducted less frequently as well as any technical or legal obstacles to reducing burden.


Failure to conduct the Public Assistance Customer Satisfaction Surveys would result in the absence of documentation about customer input on the quality and timeliness of disaster assistance for Public Assistance applicants. The survey results serve as a vital tool for measuring customer satisfaction and are a requirement of Executive Orders 12862 and 13571 and the resulting Memoranda on “Streamlining Service Delivery and Improving Customer Service.” The surveys also support the Administrator’s Strategic Plan when it comes to measuring service delivery. If the surveys were conducted less frequently, applicants may have difficulty recalling specific aspects of the process (e.g., too much time has elapsed) and satisfaction scores may be distorted. Additionally, leadership would receive less timely customer feedback, which would lead to fewer actionable insights.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:


  a. Requiring respondents to report information to the agency more often than quarterly.


This information collection does not require respondents to report information more often than quarterly.


  b. Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it.



This information collection does not require respondents to prepare a written response in fewer than 30 days after receipt of it.



  c. Requiring respondents to submit more than an original and two copies of any document.


This information collection does not require respondents to submit more than an original and two copies of any document.


  d. Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years.


This information collection does not require respondents to retain records (other than health, medical, government contract, grant-in-aid, or tax records) for more than three years.


  e. In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study.


This information collection contains a statistical survey that is designed to produce valid and reliable results that can be generalized to the universe of study.


  f. Requiring the use of a statistical data classification that has not been reviewed and approved by OMB.


This information collection does not use a statistical data classification that has not been reviewed and approved by OMB.

  g. That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use.

This information collection does not include a pledge of confidentiality that is not supported by established authorities or policies.


  h. Requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.

This information collection does not require respondents to submit trade secrets or other confidential information.


8. Federal Register Notice:



 a. Provide a copy and identify the date and page number of publications in the Federal Register of the agency’s notice soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


A 60-day Federal Register Notice inviting public comments was published on December 9, 2022, at 87 FR 75643. One public comment containing multiple questions was received. FEMA has provided detailed responses to these questions in this Supporting Statement A.


Comment 1 (FEMA-2022-0053-0002): The State of Iowa’s comments and questions appear first (shown in red in the original submission), followed by FEMA’s responses (shown in blue).


The State of Iowa supports FEMA’s continued efforts to obtain customer use data and improve the Public Assistance program. Below are a few comments and questions related to the various documents and forms. The State of Iowa wishes to thank FEMA for the opportunity to provide feedback on the program/process.


CSA QUALITATIVE RESEARCH PROTOCOL Public Assistance (PA) Surveys


Are follow-up surveys part of every disaster? If not, how are disasters selected?


Customer Satisfaction Surveys are administered for all presidentially declared disasters for Public Assistance.


Is the recipient notified when these surveys occur?


All applicants for Public Assistance receive email invitations to participate in the survey(s). If there is no email address on file, or the applicant does not respond to the email invitation, applicants are contacted via phone to participate.


Is the recipient ever invited to participate?


Anyone with an application ID on file is invited to participate in the surveys. This includes both applicants and recipients. FEMA uses “applicants” as a generic term to refer to anyone who has applied for Public Assistance.


Is the recipient ever invited to observe focus groups or interviews?


Yes.


Are the results of these surveys shared with recipients?


The results are currently not shared with recipients.


When do focus groups or interviews occur?


Focus groups and interviews occur on an ad-hoc basis and are largely dependent on funding availability. The Public Assistance Program typically requests focus groups to capture programmatic feedback on topics that the surveys may not capture.


For the capture of the entire project life cycle, it would make sense that the focus groups and interviews would occur after the applicant has been closed for the specific disaster.


The timing of focus group administration is related to the research topic. If FEMA were interested in assessing something early in the PA process, it’s possible focus groups may be conducted earlier in the process to ensure more accurate recall. That being said, most focus groups occur post-obligation.


How and who determines whether an in-person interview, phone interview or focus group occurs?


Leadership for the Public Assistance Program primarily makes requests for qualitative research projects. Availability of funding would dictate whether qualitative research was conducted in-person or via phone.


When are the various forms for collection used? They appear to be two different set of questions, two set of questions per type.


The Public Assistance Initial (PAI) Survey assesses customer satisfaction with the service and materials they receive from Public Assistance at the onset of the process. Applicants have a Recovery Scoping Meeting where their Program Delivery Manager sets expectations and provides timelines. Applicants receive important instructions and materials that can set the tone for the rest of the grant process. A disaster is qualified for surveying when ~60 days have elapsed since the declaration date and ~70% of Recovery Scoping Meetings have been completed.


The Public Assistance Assessment (PAA) Survey assesses customer satisfaction throughout the entire Public Assistance process. Survey topics include knowledge and helpfulness of FEMA representatives, timeliness of awards, simplicity of the process, reasonableness of requirements, accuracy of materials, satisfaction with communication, and usability of the Grants Portal. A disaster is qualified for surveying when ~210 days have elapsed since the declaration date and ~70% of funds have been obligated to projects.


Both surveys have a phone version and an email version. The questions are the same, but the instructions are a little different.


Define PA staff as FEMA PA staff.


The verbiage “PA staff” was chosen because “FEMA PA staff” was causing confusion in previous survey versions. Some disasters are eligible for participation in the PA State Led Program, which is where recipients assume some of the traditional FEMA responsibilities. In these situations, applicants may not have any interactions with FEMA staff.


“The target population will always be Public Assistance applicants. Disasters will be chosen for sample selection based on the research question.”


What about recipients?


Anyone with an application ID (which includes recipients) is eligible to participate in the surveys.


“Generate questions;”


Who generates the purpose and goals?


Customer Service and Analysis within FEMA’s Recovery Reporting and Analytics Division works with Public Assistance to define purpose and goals.


Who selects topics for questions?


Customer Service and Analysis within FEMA’s Recovery Reporting and Analytics Division works with the Public Assistance Program to develop the questions.


Can other topics be included, such as:

New thresholds;

Mitigation;

EHP;

Duplication of benefits;

Eligibility;

Closeout;

Documentation such as maintenance records;

Insurance:

Etc.


New questions can be added when the surveys are re-submitted to OMB. Typically, survey rewrites occur approximately every three years. At that time, Customer Survey and Analysis reaches out to relevant stakeholders, mainly in the Public Assistance Program, to gain feedback on question content. There are always more questions that could be added; however, it is important to also consider the length of the survey, how that will impact response rates, and the overall public burden.


What is FEMA CSA?


Customer Survey and Analysis (CSA) is a section within the Recovery Reporting and Analytics Division that specializes in survey administration. CSA does not generate the survey topics; rather, it acts as an intermediary that helps with requirements, survey design, OMB filings, survey administration, analysis, and reporting.


FF-104-FY-21-155 (formerly 519-0-32) phone


The questionnaire might be better organized to reflect the PA Process.


The Public Assistance Initial Survey only deals with the beginning of the PA process. Since some projects may take years, FEMA feels it is important to capture satisfaction with the first steps soon after they occur while memory recall is more accurate.


It appears the survey only deals with items prior to obligation. What about post obligation activities and closeout?


These topics are captured in the Public Assistance Assessment Survey.


What about questions that deal with EHP, mitigation and insurance?


Additional topics, such as the ones outlined above, may be considered for the next survey rewrite depending on the needs and priorities outlined by the Public Assistance Program.


Iowa’s Questions are noted below


What is the impact review?


This is a fairly new PA term. PA used to refer to this as the “exploratory phone call.” The Exploratory Call is a 15- to 30-minute introductory phone discussion between the PDMG and Applicant. It is meant to establish a relationship with the Applicant, review or complete the survey, and schedule the Recovery Scoping Meeting.


3. Should project be developed at the RSM?


This question refers to Public Assistance Policy, which is not within the scope of this submission.


4. Is all documentation provided at the RSM?


The Applicant, Recipient, and FEMA conduct an RSM to review and refine the list of impacts. The PDMG, if assigned, facilitates discussion of the PA delivery process, hazard mitigation opportunities, and eligibility requirements, including insurance and environmental and historic preservation considerations. For Applicants pursuing direct application, the Recovery Public Assistance Program Delivery Guide (Draft) Scoping Video provides this information. The RSM or video starts a 60-day regulatory period during which the Applicant must identify and report all eligible impacts and damage for FEMA to review. Further information on Public Assistance Policy and terms can be found here: https://www.fema.gov/sites/default/files/documents/public-assistance-program-delivery-guide-operational-draft.pdf.


9. What is meant by interact with the Site Inspector?


If the applicant met with a site inspector to tour damages, FEMA wants to assess their satisfaction with the inspector (e.g., customer service and knowledge).


d. Is the Site Inspector supposed to validate damages or report damages?


The Site Inspector (SI) collects and validates information about Applicants’ damage claims. The SI prepares for and performs timely site inspections and develops detailed damage descriptions with dimensions with supporting photos, sketches, and calculations. The SI role is critical as it is one of the only PA roles that sees claimed impacts and damage in person. Further information on Public Assistance Policy and terms can be found here: https://www.fema.gov/sites/default/files/documents/public-assistance-program-delivery-guide-operational-draft.pdf.


19. What is meant by staff? PDMGs, Mitigation, EHP, Recipient.



FEMA wants to understand overall customer satisfaction. This may encompass many interactions with many different people. The following is stated in the survey instructions:



“You were involved with an application that has recently received funding under the Public Assistance Program, also known as PA. You may have been assigned a Program Delivery Manager, or PA representative, to lead you through the process. You may have also interacted with other staff who provided PA guidance. Please consider all interactions when answering the following questions.”


FF-104-FY-21-156 (formerly 519-0-33) internet


The questionnaire might be better organized to reflect the PA Process.


The organization is based on guidance from the Public Assistance Program.


It appears the survey only deals with items prior to obligation. What about post obligation activities and closeout?


The Public Assistance Assessment Survey does ask about timeliness of awards. Questions pertaining to close-out may be considered for the next revision or as a topic for an entirely different survey. Assessing close-out can be difficult in terms of survey administration because close-out can take years, and information about the other topics in the survey would not be as fresh or as easily remembered.


What about questions that deal with EHP, mitigation and insurance?


Additional survey topics will be revisited during the next rewrite. The purpose of this submission is only to adjust burden calculations.


Iowa’s Questions are noted below


What is the Impact Review?


Answered above.


Should project be developed at the RSM?


Answered above


Is all documentation provided at the RSM?


Answered above.


What is meant by interact with the Site Inspector? Is the Site Inspector supposed to validate damages or report damages?


Answered above.


What is meant by staff? PDMGs, Mitigation, EHP, Recipient.


Answered above.


FF-104-FY-21-157 (formerly 519-0-34) phone


It appears the survey only deals with items prior to obligation. What about post obligation activities and closeout?


Answered above.


What about questions that deal with EHP, mitigation and insurance?


As previously mentioned, additional topics may be considered in the next survey revision. The number of applicants who are able to answer questions about these topics is typically low in a given disaster, and there is a balancing act in trying to keep the surveys short to minimize burden and keep response rates high.


Iowa’s Questions are noted below


1. What is meant by customer service? Interpretation of customer service is widely varied. Please define or rephrase question.


Customer Service can be nuanced and have slightly different meanings to different people. Part of the difficulty in designing surveys is keeping them straightforward and simple, while also being specific enough for respondents to understand the question. Similar versions of these surveys have been administered since 2018. FEMA has not encountered any issues with having to define customer service to respondents.


4. Is the update of your projects up to obligation?


This question refers to any status updates the applicant may have received about their projects, up until the time the survey is administered. Typically, this survey is administered when ~210 days have elapsed since the declaration date and ~70% of funds have been obligated to projects.


22. Is the question regarding funding pertinent since funding is part of the recipient’s (pass-through) responsibility and not FEMA? Does this question need to be adjusted?


Sometimes the Public Assistance Program or the Performance Management Team within the Recovery Reporting and Analytics Division may request questions be included on the survey to address specific business questions. If during the next survey rewrite it is determined that the question is no longer needed, it will be removed.


FF-104-FY-21-158 (formerly 519-0-35) internet


It appears the survey only deals with items prior to obligation. What about post obligation activities and closeout?


See answers above


What about questions that deal with EHP, mitigation and insurance?


See answers above


Iowa’s Questions are noted below


What is meant by customer service? Interpretation of customer service is widely varied. Please define or rephrase question.


See answers above


5. Is the update of your projects up to obligation?


See answers above


23. Is the question regarding funding pertinent since funding is part of the recipient’s (pass-through) responsibility and not FEMA? Does this question need to be adjusted?


See answers above


A 30-day Federal Register Notice inviting public comments was published on March 1, 2023, at 88 FR 12973. The public comment period is open until March 31, 2023.


 b. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Because the purpose of this submission is to reduce burden and not make changes to survey content, no efforts were made to consult with persons outside the agency.

Whenever survey content is changed, the Recovery Directorate and Public Assistance Program Managers are consulted for input about the data collected in the survey questionnaires. FEMA External Affairs is consulted regarding the use of plain language and clarity. Since 2015, Customer Survey and Analysis (CSA) has retained survey statisticians on staff who review the methodology in detail. CSA statisticians review the survey collection to ensure respondent burden is minimized while survey reliability and validity are maximized.

FEMA is often limited by budget constraints when it comes to consulting with persons outside the agency.


c. Describe consultations with representatives of those from whom information is to be obtained or those who must compile records. Consultation should occur at least once every three years, even if the collection of information activities is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

Because the purpose of this submission is to reduce burden and not make changes to survey content, no efforts were made to consult with respondents.


For the previous submission of this collection, budget constraints prevented FEMA from contracting to consult with applicants. In 2004, FEMA’s Recovery Directorate contracted to perform four focus groups to ensure that the information collected was meaningful to customers and that the survey questions were clearly understood. Although not a direct consultation with respondents, members of Customer Survey and Analysis were deployed to Iowa in 2015 to learn about the Public Assistance New Delivery Model and observed Recovery Scoping Meetings. This allowed for a better understanding of procedures and aided questionnaire development.


Whenever survey content is revised, analysis is performed on open-text questions to identify new topic areas or questions that might need clarification. Performance management analysts have previously conducted comment analysis on the survey results to extract themes from text boxes and “other” response options to identify topics important to customers that aren’t assessed. For example, a previous common critique of the PA program was “Lack of training in the Grants Portal.” To better capture this information, a question was added in the previous submission to assess whether respondents felt they received adequate training.


Customer Survey and Analysis also has a Quality Performance Team (QPT) that is responsible for monitoring phone interviewers. QPT reviewed the revised surveys for the previous submission of this collection. They provided feedback on things that might be difficult for interviewers to read, or for respondents to understand. Phone interviewers also provided survey writers with feedback regarding which survey items were consistently confusing to respondents on the prior information collection. For example, scale descriptions were restructured to improve sentence flow and reduce burden.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


There are no payments or gifts to respondents for this data collection.


10. Describe any assurance of confidentiality provided to respondents. Present the basis for the assurance in statute, regulation, or agency policy.



A Privacy Threshold Analysis (PTA) was completed by FEMA and adjudicated by the DHS Privacy Office on August 6, 2021.


This collection is covered by an existing Privacy Impact Assessment (PIA), DHS/FEMA/PIA-035 Customer Satisfaction Analysis System (CSAS), approved by DHS on February 27, 2014, and the existing System of Records Notice (SORN), DHS/FEMA-009 Hazard Mitigation, Disaster Public Assistance, and Disaster Loan Programs System of Records, 79 FR 16015, dated March 24, 2014.


There are no assurances of confidentiality provided to the respondents for this information collection.  Survey information is stored in the Customer Satisfaction Analysis System (CSAS), warehoused on secure FEMA servers. This information collection request proposes to collect information via a variety of voluntary information collection activities (see section 12). Although the agency is not invoking statutory support for confidentiality, the quality of this type of information requires respondent candor and anonymity. Therefore, the agency pledges to keep the information collected private unless otherwise required by law. Respondents will be notified on the data collection forms that their information will only be reported in aggregated form and no personally identifiable responses will be publicly released.


11. Provide additional justification for any question of a sensitive nature (such as sexual behavior and attitudes, religious beliefs and other matters that are commonly considered private). This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


There are no questions of a sensitive nature related to sexual behavior and attitudes, religious beliefs, or other matters that are commonly considered private in the surveys.


Some questions of a demographic nature are included to help identify whether certain groups of people vary in their satisfaction with the Public Assistance Program, although the questions are not very personal or of a sensitive nature. Examples include how long the respondent has worked in their current position, whether they have applied for PA disaster assistance previously, and the number of personnel that worked on their PA project(s). It is possible that respondents with fewer resources (staff) and less relevant work experience perceive the difficulty associated with the Public Assistance process differently from an applicant who is more experienced or who has previously applied for Public Assistance. Asking these questions allows FEMA to better identify whether all customers are being served equally, and whether products and services need to be tailored to meet the needs of certain groups of people.


12. Provide estimates of the hour burden of the collection of information. The statement should:



 a. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated for each collection instrument (separately list each instrument and describe information as requested). Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desired. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.


Current administrative data on average survey completion times indicate that the burden estimates for the phone surveys used in the previous submission are accurate. There is no need for further testing of the phone surveys because no changes are being made to the survey forms.


In the last submission of this collection, Customer Survey and Analysis (CSA) estimated the burden associated with the electronic surveys with the information available. At the time of submission CSA had not acquired electronic survey administration software to perform testing. Electronic survey administration became active in July 2021, and data collected over the past year indicates that respondents can complete the surveys faster than previously estimated. The average burden per response was updated to reflect this. There is no need to do further testing because no changes are being made to the survey forms, and burden estimates are based on real data.


Below is a list of all the forms in this collection with associated number of responses, frequency of response, and annual hour burden.


FEMA Form FF-104-FY-21-155 (formerly 519-0-32)-Phone, Name of Instrument: Public Assistance Initial Customer Satisfaction Survey (Telephone) is estimated to have 1,458 respondents (277 non-profit institutions + 1,181 State, Local or Tribal Government) submitting 1 response per year. It is estimated that each response will require 0.1667 hours (10 minutes) to complete, for a total annual burden of 243 hours (1,458 x 0.1667).


FEMA Form FF-104-FY-21-156 (formerly 519-0-33)-Electronic, Name of Instrument: Public Assistance Initial Customer Satisfaction Survey (Internet) is estimated to have 451 respondents (86 non-profit institutions + 365 State, Local or Tribal Government) submitting 1 response per year. It is estimated that each response will require 0.0833 hours (5 minutes) to complete, for a total annual burden of 38 hours (451 x 0.0833).


FEMA Form FF-104-FY-21-157 (formerly 519-0-34)-Phone, Name of Instrument: Public Assistance Assessment Customer Satisfaction Survey (Telephone) is estimated to have 1,074 respondents (204 non-profit institutions + 870 State, Local or Tribal Government) submitting 1 response per year. It is estimated that each response will require 0.2167 hours (13 minutes) to complete, for a total annual burden of 233 hours (1,074 x 0.2167).


FEMA Form FF-104-FY-21-158 (formerly 519-0-35)-Electronic, Name of Instrument: Public Assistance Assessment Customer Satisfaction Survey (Internet) is estimated to have 342 respondents (65 non-profit institutions + 277 State, Local or Tribal Government) submitting 1 response per year. It is estimated that each response will require 0.1333 hours (8 minutes) to complete, for a total annual burden of 46 hours (342 x 0.1333).


FEMA Manual FM-104-FY-22-102, Focus Group for 2 Hrs Plus Travel 1 Hr, is estimated to have 360 respondents (68 non-profit institutions + 292 State, Local or Tribal Government) submitting 1 response per year. It is estimated that each response will require 3 hours to complete, for a total annual burden of 1,080 hours (360 x 3).


FEMA Manual FM-104-FY-22-102, One-on-One Interviews, is estimated to have 200 respondents (38 non-profit institutions + 162 State, Local or Tribal Government) submitting 1 response per year. It is estimated that each response will require 1 hour to complete, for a total annual burden of 200 hours (200 x 1).


Qualitative research is conducted on a request basis; there is no special schedule for implementation. Completions for qualitative interviews were estimated from previous experience, and administration can occur in person, by phone, or online. The decision to conduct interviews vs. focus groups usually depends on the density of the target population (e.g., sometimes applicants are spread all over the state and it is not feasible to meet in a central location) and on available funding. It is likely that applicants who participate in qualitative research sessions will have previously taken the PA Initial or PA Assessment Survey; this will depend on at what point in the PA process qualitative interviews take place and what topics are being researched. Qualitative interviews are used to gather more detailed information that cannot adequately be captured in a short, quantitative survey measure. This most often occurs when Public Assistance implements a new program or policy that is not addressed in the surveys. Examples of potential qualitative topics include changes to 428 Alternative Procedures (estimates based on fixed costs), comparing the PA New Delivery Model to the old delivery model, assessing changes in the Grants Portal, and state-led disasters.
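As a plain-arithmetic check, the sketch below recomputes the annual burden hours from the respondent counts and per-response times listed above. The values are copied from this section and the calculation is illustrative only; summing the rounded instrument totals gives 1,840 hours, while the table in item 12c reports 1,839 hours because each respondent type is rounded separately.

```python
# Illustrative recomputation of the annual burden hours described above.
# Respondent counts and per-response times are copied from this section;
# each respondent submits 1 response per year.

instruments = {
    "PA Initial (Telephone), FF-104-FY-21-155": (1458, 0.1667),
    "PA Initial (Internet), FF-104-FY-21-156": (451, 0.0833),
    "PA Assessment (Telephone), FF-104-FY-21-157": (1074, 0.2167),
    "PA Assessment (Internet), FF-104-FY-21-158": (342, 0.1333),
    "Focus Group (2 hrs plus 1 hr travel), FM-104-FY-22-102": (360, 3.0),
    "One-on-One Interviews, FM-104-FY-22-102": (200, 1.0),
}

total_hours = 0
for name, (respondents, hours_per_response) in instruments.items():
    annual_burden = round(respondents * hours_per_response)
    total_hours += annual_burden
    print(f"{name}: {annual_burden} hours")

# Prints 1,840; the 12c table totals 1,839 because it rounds each respondent type separately.
print(f"Total annual burden: {total_hours} hours")
```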


b. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.


Please see our response for 12a above and 12c below.

c. Provide an estimate of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. NOTE: The wage-rate category for each respondent must be multiplied by 1.46 and this total should be entered in the cell for “Avg. Hourly Wage Rate”. The cost to the respondents of contracting out or paying outside parties for information collection activities should not be included here. Instead this cost should be included in Item 13.


The universe of respondents will consist of all applicants that applied for Public Assistance following a major disaster declaration under the Stafford Act (42 U.S.C. §§ 5121 et seq).


Estimates based on a two-year average (FY 2018-2019) of the annual applicant population show that 19% of Public Assistance applicants are non-profit institutions and 81% are State, Local, or Tribal governments. Projected numbers of respondents, burden hours, and respondent costs are calculated accordingly. Population estimates for 2020-2021 were not used because they were highly atypical due to COVID-19 declarations.



Estimated Annualized Burden Hours and Costs

Type of Respondent | Form Name / Form No. | No. of Respondents | No. of Responses per Respondent | Total No. of Responses | Avg. Burden per Response (in hours) | Total Annual Burden (in hours) | Avg. Hourly Wage Rate | Total Annual Respondent Cost
Non-Profit institutions | Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-155 (formerly 519-0-32) (Telephone) | 277 | 1 | 277 | 0.1667 | 46 | $40.61 | $1,868
State, Local or Tribal Government | Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-155 (formerly 519-0-32) (Telephone) | 1,181 | 1 | 1,181 | 0.1667 | 197 | $48.51 | $9,556
Non-Profit institutions | Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-156 (formerly 519-0-33) (Internet) | 86 | 1 | 86 | 0.0833 | 7 | $40.61 | $284
State, Local or Tribal Government | Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-156 (formerly 519-0-33) (Internet) | 365 | 1 | 365 | 0.0833 | 30 | $48.51 | $1,455
Non-Profit institutions | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-157 (formerly 519-0-34) (Telephone) | 204 | 1 | 204 | 0.2167 | 44 | $40.61 | $1,787
State, Local or Tribal Government | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-157 (formerly 519-0-34) (Telephone) | 870 | 1 | 870 | 0.2167 | 189 | $48.51 | $9,168
Non-Profit institutions | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-158 (formerly 519-0-35) (Internet) | 65 | 1 | 65 | 0.1333 | 9 | $40.61 | $365
State, Local or Tribal Government | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-158 (formerly 519-0-35) (Internet) | 277 | 1 | 277 | 0.1333 | 37 | $48.51 | $1,795
Non-Profit institutions | Focus Group for 2 Hrs Plus Travel 1 Hr, FM-104-FY-22-102 | 68 | 1 | 68 | 3 | 204 | $40.61 | $8,284
State, Local or Tribal Government | Focus Group for 2 Hrs Plus Travel 1 Hr, FM-104-FY-22-102 | 292 | 1 | 292 | 3 | 876 | $48.51 | $42,495
Non-Profit institutions | One-on-One Interviews, FM-104-FY-22-102 | 38 | 1 | 38 | 1 | 38 | $40.61 | $1,543
State, Local or Tribal Government | One-on-One Interviews, FM-104-FY-22-102 | 162 | 1 | 162 | 1 | 162 | $48.51 | $7,859
Total | | 3,885 | | 3,885 | | 1,839 | | $86,459


Instruction for Wage-rate category multiplier: Take each non-loaded “Avg. Hourly Wage Rate” from the BLS website table and multiply that number by 1.45.1 For example, a non-loaded BLS table wage rate of $42.51 would be multiplied by 1.45, and the entry for the “Avg. Hourly Wage Rate” would be $61.64.


For Non-Profit Respondents: According to the U.S. Department of Labor, Bureau of Labor Statistics, the May 2021 Occupational Employment and Wage Estimates wage rate for All Occupations (SOC: 00-0000) is $28.01.2 Including the wage rate multiplier of 1.45, the fully loaded wage rate is $40.61. Therefore, the burden hour cost is estimated to be $14,132 annually ($40.61 x 348 hours = $14,132).


For State, Local Government, or Tribal Respondents: According to the U.S. Department of Labor, Bureau of Labor Statistics, the May 2021 Occupational Employment and Wage Estimates wage rate for All Occupations (SOC 00-0000) is $30.13. Including the wage rate multiplier of 1.61,3 the fully loaded wage rate is $48.51. Therefore, the burden hour cost is estimated to be $72,328 annually ($48.51 x 1,491 hours = $72,328).


$14,132 + $72,328 = $86,460, which differs from the table total of $86,459 by $1 due to rounding of individual row costs.
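The loaded wage rate and cost arithmetic above can be verified with the brief sketch below. The base wages, load multipliers, and burden hours are the figures cited in this section; the script is an illustrative check, not part of the official calculation.

```python
# Illustrative reproduction of the annualized respondent cost calculation above.
# Base wages, load multipliers, and burden hours are the figures cited in this section.

respondent_types = {
    # respondent type: (BLS mean hourly wage, load multiplier, total annual burden hours)
    "Non-profit institutions": (28.01, 1.45, 348),
    "State, local, or tribal government": (30.13, 1.61, 1491),
}

grand_total = 0
for name, (base_wage, multiplier, hours) in respondent_types.items():
    loaded_wage = round(base_wage * multiplier, 2)  # e.g., 28.01 x 1.45 = 40.61
    annual_cost = round(loaded_wage * hours)        # e.g., 40.61 x 348 = 14,132
    grand_total += annual_cost
    print(f"{name}: ${loaded_wage}/hr x {hours} hrs = ${annual_cost:,}")

# Prints 86,460; the table shows $86,459 because each table row is rounded separately.
print(f"Estimated total annual respondent cost: ${grand_total:,}")
```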


13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. (Do not include the cost of any hour burden shown in Items 12 and 14.)


The cost estimates should be split into two components:


a. Operation and Maintenance and purchase of services component. These estimates should take into account cost associated with generating, maintaining, and disclosing or providing information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred.


b. Capital and Start-up-Cost should include, among other items, preparations for collecting information such as purchasing computers and software, monitoring sampling, drilling and testing equipment, and record storage facilities.


Annual Cost Burden to Respondents or Recordkeepers

Data Collection Activity/Instrument | *Annual Capital Start-Up Cost (investments in overhead, equipment, and other one-time expenditures) | *Annual Operations and Maintenance Costs (such as recordkeeping, technical/professional services, etc.) | Annual Non-Labor Cost (expenditures on training, travel, and other resources) | Total Annual Cost to Respondents
Focus Group Travel, FM-104-FY-22-102 | $0 | $0 | $13,500 | $13,500
Total | $0 | $0 | $13,500 | $13,500



The Annual Non-Labor Cost for travel to focus groups is based on the U.S. General Services Administration (GSA) mileage rate for Privately Owned Vehicles (POV)4 effective July 1, 2022, of $0.625 per mile. Maximum travel to a focus group is assumed to be 30 miles one way, or 60 miles round trip. Using this information, 60 miles round trip x 360 respondents = 21,600 miles at $0.625 per mile = $13,500 annual cost for mileage.


14. Provide estimates of annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing and support staff), and any other expense that would have been incurred without this collection of information. You may also aggregate cost estimates for Items 12, 13, and 14 in a single table.



Annual Cost to the Federal Government

Item

Cost ($)

Contract Costs:

Contractor to provide a hotel conference room and incentive for 3 locations at $27,517.95 per location and focus group rental facilities for 3 locations at $22,240 per location at 20%

[(3 x $27,517.95) + (3 x $22,240)] x .20= $29,855

$29,855

Staff Salaries:

15 GS 9 Step 5 ($73,617) at 15% time x 1.45 loaded wage rate = $240,175

10 GS 11 Step 5 ($89,069) at 15% time x 1.45 loaded wage rate = $193,725

9 GS 12 Step 5 ($106,759) at 15% time x 1.45 loaded wage rate = $208,981

3 GS 13 Step 5 ($126,949) at 15% time x 1.45 loaded wage rate = $82,834

1 GS 14 Step 5 ($150,016) at 15% time x 1.45 loaded wage rate = $32,629

$240,175 +$193,725 + $208,981+ $82,834+ $32,629 = $758,344

$758,350

Facilities [cost for renting, overhead, etc. for data collection activity]

CSA department's % of rent, utilities, security, recycling: $75,591 at 15% cost

$75,591 x .15 = $11,339

$11,339

Computer Hardware and Software [cost of equipment annual lifecycle]

Adobe Presenter $300, IT SLA-4 Components $196,185, WinCati Maintenance-Training $21,896, Medallia Contract $293,836 at 15% cost

$300 + $196,185 + $21,896 + $293,836 = $512,217 x .15 = $76,833

$76,833

Equipment Maintenance [cost of annual maintenance/service agreements for equipment]

Annual Maintenance, Laptop and accessories for all 38 staff $30,963.41 at 15% cost

Printers: 1 Desktop & shared network $483.33 at 15% cost

$30,963.41 + $483.33 = $31,447 x .15 = $4,717

$4,717

Travel (not to exceed)

Total cost of 6 trips that average airfare, meals, lodging for 2 travelers at $7,252.49 per trip at 20%.

6 x $7,252.49 x .20 = $8,703

$8,703

Other:

Long Distance Phone Charges

AT&T phone charges at $0.0321 per call at 366,251 calls

Long distance completed calls at $0.01015 per minute x 9.781 minutes x average calls of 37,044

Long distance attempted calls at $.01015 per minute x 1.25 Minutes x 329,207 calls

($0.0321 x 366,251 calls) + ($0.01015 x 9.781 minutes x 37,044 calls) + ($0.01015 x 1.25 minutes x 329,207 calls) = $19,611 at 15% cost

$19,611 x .15 = $2,942


Cost for call management system to CSA per person at $744.64 for 38 staff members at 15% cost.

$744.64 x 38 x .15 = $4,244


Average yearly expenses for pens and notepads for 38 staff members.

$3,229 at 15% cost

$3,229 x .15 = $484


$2,942 + $4,244 + $484= $7,670

$7,670

Total

$897,467

1 Office of Personnel Management 2023 Pay and Leave Tables for the Washington-Baltimore-Arlington, DC-MD-VA-WV-PA locality. Available online at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2023/DCB.pdf . Accessed January 17, 2023

2 Wage rate includes a 1.45 multiplier to reflect the fully-loaded wage rate.


15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I in a narrative form. Present the itemized changes in hour burden and cost burden according to program changes or adjustments in Table 5. Denote a program increase as a positive number, and a program decrease as a negative number.

A “Program increase” is an additional burden resulting from a Federal Government regulatory action or directive. (e.g., an increase in sample size or coverage, amount of information, reporting frequency, or expanded use of an existing form). This also includes previously in-use and unapproved information collections discovered during the ICB process, or during the fiscal year, which will be in use during the next fiscal year.

A “Program decrease” is a reduction in burden because of: (1) the discontinuation of an information collection; or (2) a change in an existing information collection by a Federal Agency (e.g., the use of sampling (or smaller samples), a decrease in the amount of information requested (fewer questions), or a decrease in reporting frequency).

An “Adjustment” denotes a change in burden hours due to factors over which the government has no control, such as population growth, or due to changes in the methods used to estimate burden or corrections of errors in burden estimates, which do not affect what information the government collects.


Itemized Changes in Annual Burden Hours

Data Collection Activity/Instrument | Program Change (hours currently on OMB inventory) | Program Change (new) | Difference | Adjustment (hours currently on OMB inventory) | Adjustment (new) | Difference
Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-155 (formerly 519-0-32) (Telephone), Non-Profit institutions | | | | 55 | 46 | -9
Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-155 (formerly 519-0-32) (Telephone), State, Local or Tribal Government | | | | 232 | 197 | -35
Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-156 (formerly 519-0-33) (Internet), Non-Profit institutions | | | | 11 | 7 | -4
Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-156 (formerly 519-0-33) (Internet), State, Local or Tribal Government | | | | 46 | 30 | -16
Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-157 (formerly 519-0-34) (Telephone), State, Local or Tribal Government | | | | 186 | 189 | 3
Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-158 (formerly 519-0-35) (Internet), State, Local or Tribal Government | | | | 39 | 37 | -2
Total | 0 | 0 | 0 | 569 | 506 | -63



Explain: The overall burden has decreased for the customer satisfaction surveys in this collection. In order to comply with the DHS Burden Reduction Initiative, Customer Survey and Analysis (CSA) re-evaluated its information collections for ways to reduce burden. All of the changes below reflect adjustments to previous inaccuracies and updates to response rates.


CSA identified that the burden estimates for the electronic surveys were inaccurate; actual completion times are shorter than previously estimated. The inaccurate estimates were due to CSA being unable to test the electronic surveys prior to the last submission (the survey software had not yet been acquired). This resulted in an adjustment to burden hours for the electronic surveys.


In addition to shorter completion times for electronic surveys, response rates were updated. Previously, response rates for phone and electronic surveys were estimated to be the same because no data were available. Electronic response rates are in fact much lower than phone response rates, and these figures have been adjusted to reflect current data. Telephone response rates have also declined slightly. Possible explanations for the decline include growing refusal among respondents to participate and difficulties in contacting individuals due to the increased use of answering machines, call screening devices, and cellular telephones (Tourangeau, 2004; Ehlen & Ehlen, 2007). In addition, new technologies sometimes mistakenly flag survey calls, even those conducted by the government, as “spam” (Kennedy & Hartig, 2019).


Lastly, the population totals were adjusted to better reflect the mixed-mode methodology. Previously, we estimated that 80% of the target population would complete phone surveys and 20% would complete electronic surveys. In practice, 100% of the population receives electronic surveys and 11 to 13% respond. The population that does not respond to an electronic survey (roughly 87% to 89% of the original population) becomes the target population for the phone surveys. This was a significant upward adjustment to the population totals for the electronic surveys, but because the electronic surveys are shorter, the overall impact on burden was still a net decrease.
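To make the mixed-mode arithmetic concrete, the minimal Python sketch below walks through the calculation described above. All numeric inputs (population size, per-mode completion times, and the telephone response rate) are hypothetical placeholders chosen for illustration; only the 11 to 13% electronic response rate range comes from the narrative above.

    # Illustrative sketch of the mixed-mode burden arithmetic; inputs are placeholders.
    population = 1000                # hypothetical number of PA applicants contacted
    electronic_response_rate = 0.12  # roughly 11-13% per the narrative above
    electronic_minutes = 5           # hypothetical electronic completion time
    phone_response_rate = 0.30       # hypothetical telephone response rate
    phone_minutes = 15               # hypothetical telephone completion time

    # Every applicant receives the electronic survey first.
    electronic_respondents = population * electronic_response_rate

    # Non-respondents (roughly 87-89% of the original population) become the
    # target population for the telephone survey.
    phone_target = population - electronic_respondents
    phone_respondents = phone_target * phone_response_rate

    # Total annual burden in hours: responses times completion time, converted to hours.
    burden_hours = (electronic_respondents * electronic_minutes
                    + phone_respondents * phone_minutes) / 60
    print(round(burden_hours, 1))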


The total annual burden was 1,902 hours in the previous collection; the current collection has an annual total of 1,839 burden hours.


Total Adjustment to Annual Burden Hours = 1,839 (current) – 1,902 (previous) = -63 hours



Itemized Changes in Annual Cost Burden

Data Collection Activity/Instrument | Program Change (cost currently on OMB inventory) | Program Change (new) | Difference | Adjustment (cost currently on OMB inventory) | Adjustment (new) | Difference
Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-155 (formerly 519-0-32) (Telephone), Non-Profit institutions | | | | $1,949 | $1,868 | -$81
Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-155 (formerly 519-0-32) (Telephone), State, Local or Tribal Government | | | | $14,730 | $9,556 | -$5,174
Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-156 (formerly 519-0-33) (Internet), Non-Profit institutions | | | | $390 | $284 | -$106
Public Assistance Initial Customer Satisfaction Survey, FEMA Form FF-104-FY-21-156 (formerly 519-0-33) (Internet), State, Local or Tribal Government | | | | $2,921 | $1,455 | -$1,466
Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-157 (formerly 519-0-34) (Telephone), Non-Profit institutions | | | | $1,559 | $1,787 | $228
Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-157 (formerly 519-0-34) (Telephone), State, Local or Tribal Government | | | | $11,809 | $9,168 | -$2,641
Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-158 (formerly 519-0-35) (Internet), Non-Profit institutions | | | | $319 | $365 | $46
Public Assistance Assessment Customer Satisfaction Survey, FEMA Form FF-104-FY-21-158 (formerly 519-0-35) (Internet), State, Local or Tribal Government | | | | $2,476 | $1,795 | -$681
Focus Group for 2 Hrs Plus Travel 1 Hr, FM-104-FY-22-102, Non-Profit institutions | | | | $7,228 | $8,284 | $1,056
Focus Group for 2 Hrs Plus Travel 1 Hr, FM-104-FY-22-102, State, Local or Tribal Government | | | | $55,617 | $42,495 | -$13,122
One-on-One Interviews, FM-104-FY-22-102, Non-Profit institutions | | | | $1,346 | $1,543 | $197
One-on-One Interviews, FM-104-FY-22-102, State, Local or Tribal Government | | | | $10,285 | $7,859 | -$2,426
Total | $0 | $0 | $0 | $110,629 | $86,459 | -$24,170

Explain: Wage rates were also updated with new occupational classifications and current rates. A standard occupational classification was used for both non-profit and government workers because survey respondents can hold a wide range of positions (the previous classifications were too specific).


Total Adjustment to Annual Cost Burden = $86,459 (current) – $110,629 (previous) = -$24,170
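The cost figures above follow the same basic arithmetic used throughout this statement: burden hours multiplied by a fully-loaded hourly wage (the base wage times the benefits multiplier described in the footnotes). The Python sketch below illustrates that calculation; the hour and wage inputs are illustrative placeholders, not the exact values behind any particular row in the table.

    # Illustrative sketch of the cost-burden arithmetic; inputs are placeholders.
    def fully_loaded_wage(base_wage, multiplier):
        """Hourly base wage times the benefits multiplier (e.g., 1.45 or 1.61)."""
        return base_wage * multiplier

    def cost_burden(burden_hours, base_wage, multiplier):
        """Annual cost burden = burden hours x fully-loaded hourly wage."""
        return burden_hours * fully_loaded_wage(base_wage, multiplier)

    # Example: 46 burden hours at a hypothetical $28.00 mean wage with the 1.45
    # multiplier used for non-profit (private-sector) respondents.
    print(round(cost_burden(46, 28.00, 1.45), 2))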


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.



Reports are provided quarterly to internal stakeholders within FEMA, such as the Public Assistance management offices and various Recovery Directorate offices. These reports include a breakdown of each question (basic descriptive statistics: averages and percentages), as well as an overall analysis of patterns seen in the data each quarter and trends over time. Data can also be aggregated by region, disaster, state, etc., depending on the needs of Public Assistance; stakeholders may therefore also request reports on a monthly and/or yearly basis.
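As a simple illustration of the tabulation described above, the Python sketch below computes per-question averages and aggregates results by region using the pandas library. The DataFrame and its column names are hypothetical placeholders, not the actual CSA data schema.

    # Illustrative sketch of quarterly tabulation; the data and columns are hypothetical.
    import pandas as pd

    responses = pd.DataFrame({
        "region":   ["IV", "IV", "VI", "VI", "VI"],
        "disaster": ["DR-4611", "DR-4611", "DR-4673", "DR-4673", "DR-4673"],
        "overall_satisfaction": [5, 4, 3, 5, 4],   # hypothetical 1-5 ratings
    })

    # Basic descriptive statistics for a single question: average and percent satisfied.
    print(responses["overall_satisfaction"].mean())
    print((responses["overall_satisfaction"] >= 4).mean() * 100)

    # Aggregation by region (could also be by disaster, state, etc.).
    print(responses.groupby("region")["overall_satisfaction"].agg(["mean", "count"]))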


Aggregated survey results reports can also be accessed by FEMA employees on the internal Customer Survey and Analysis (CSA) SharePoint site, where employees can additionally view survey dashboards showing aggregated results for select questions. The dashboards are refreshed more frequently than the reports are published.


Statisticians may be asked to conduct more in-depth analysis if there is a significant drop in customer satisfaction scores and stakeholders want to understand why satisfaction decreased. This may involve correlation, t-tests, crosstabulations with Pearson's chi-square, and analysis of variance (ANOVA); nonparametric analyses may be used where appropriate. Demographic data will typically be used to describe the sample of respondents, but statisticians may also look for differences in satisfaction across demographic groups if a more in-depth analysis is requested. Statisticians are careful to include disclaimers about generalizability when the sample size is small.
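For reference, the Python sketch below shows the kinds of tests named above (a t-test, Pearson's chi-square on a crosstabulation, ANOVA, and a nonparametric alternative) applied to hypothetical satisfaction scores using scipy. It is a minimal illustration, not the official analysis code.

    # Illustrative sketch of the in-depth tests named above; data are hypothetical.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    group_a = rng.integers(1, 6, size=50)   # hypothetical 1-5 satisfaction scores
    group_b = rng.integers(1, 6, size=50)
    group_c = rng.integers(1, 6, size=50)

    # Independent-samples t-test comparing two groups of respondents.
    t_stat, t_p = stats.ttest_ind(group_a, group_b)

    # Pearson's chi-square on a crosstabulation (e.g., satisfied vs. not, by group).
    crosstab = np.array([[40, 10],
                         [30, 20]])
    chi2, chi_p, dof, expected = stats.chi2_contingency(crosstab)

    # One-way ANOVA across three or more groups.
    f_stat, f_p = stats.f_oneway(group_a, group_b, group_c)

    # Nonparametric alternative (Kruskal-Wallis) when ANOVA assumptions do not hold.
    h_stat, h_p = stats.kruskal(group_a, group_b, group_c)

    print(t_p, chi_p, f_p, h_p)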


17. If seeking approval not to display the expiration date for OMB approval of the information collection, explain reasons that display would be inappropriate.


FEMA will display the expiration date for OMB approval of this information collection.



18. Explain each exception to the certification statement identified in Item 19 “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.


FEMA does not request an exception to the certification of this information collection.


References



Curtin, R., Presser, S., & Singer, E. (2005). Changes in telephone survey nonresponse over the past quarter century. Public Opinion Quarterly, 69, 87–98.

Czajka, J. L., & Beyler, A. (2016). Declining response rates in Federal surveys: Trends and implications. Mathematica Policy Research, 1, 1–86. Retrieved from https://aspe.hhs.gov/system/files/pdf/255531/Decliningresponserates.pdf

Duffy, B., Smith, K., Terhanian, G., & Bremer, J. (2005). Comparing data from online and face-to-face surveys. International Journal of Market Research, 47(6). doi:10.1177/147078530504700602

Ehlen, J., & Ehlen, P. (2007). Cellular-only substitution in the United States as lifestyle adoption. Public Opinion Quarterly, 71, 717–733.

Kennedy, C., & Hartig, H. (2019). Response rates in telephone surveys have resumed their decline. Pew Research Center. Retrieved from https://www.pewresearch.org/fact-tank/2019/02/27/response-rates-in-telephone-surveys-have-resumed-their-decline/

Sinclair, M., O'Toole, J., Malawaraarachchi, M., & Leder, K. (2012). Comparison of response rates and cost-effectiveness for a community-based survey: Postal, internet and telephone modes with generic or personalized recruitment approaches. BMC Medical Research Methodology. doi:10.1186/1471-2288-12-132

Szolnoki, G., & Hoffman, D. (2013). Online, face-to-face and telephone surveys: Comparing different sampling methods in wine consumer research. Wine Economics and Policy. doi:10.1016/j.wep.2013.10.001

Tourangeau, R. (2004). Survey research and societal change. Annual Review of Psychology, 55, 775–801.


1 Bureau of Labor Statistics, Employer Costs for Employee Compensation, Table 1.  Available at https://www.bls.gov/news.release/archives/ecec_03182022.pdf. Accessed October 12, 2022.  The wage multiplier is calculated by dividing total compensation for all workers of $40.35 by wages and salaries for all workers of $27.83 per hour yielding a benefits multiplier of approximately 1.45.

2 Information on the mean wage rate from the U.S. Department of Labor, Bureau of Labor Statistics is available online at: https://www.bls.gov/oes/2021/may/oes_nat.htm

3 Bureau of Labor Statistics, Employer Costs for Employee Compensation, Table 1.  Available at https://www.bls.gov/news.release/archives/ecec_03182022.pdf.  Accessed October 12, 2022.  The wage multiplier is calculated by dividing total compensation for State and local government workers of $54.96 by wages and salaries for State and local government workers of $34.09 per hour, yielding a benefits multiplier of approximately 1.61.

4 Privately Owned Vehicle (POV) Mileage Reimbursement Rates. U.S. General Service Administration. Last accessed: November 30, 2022. Available at: https://www.gsa.gov/travel/plan-book/transportation-airfare-pov-etc/privately-owned-vehicle-pov-mileage-reimbursement-rates
