Supporting Statement for OMB Collection 1660-0107 (Part A) Pass_back_4.CLEAN

Public Assistance Customer Satisfaction Surveys

OMB: 1660-0107


March 28, 2017


Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660-0107


Title: Public Assistance Customer Satisfaction Surveys


Form Number(s):

FEMA Form 519-0-32, Public Assistance Initial Customer Satisfaction Survey (Telephone);

FEMA Form 519-0-33, Public Assistance Initial Customer Satisfaction Survey (Internet);

FEMA Form 519-0-34, Public Assistance Assessment Customer Satisfaction Survey (Telephone);

FEMA Form 519-0-35, Public Assistance Assessment Customer Satisfaction Survey (Internet)


This is a request to revise 1660-0107 to accomplish the following goals:

1. Update the Public Assistance Customer Satisfaction Surveys to reflect changes in the Public Assistance Program;
2. Increase the accuracy of survey results by surveying at two different time points;
3. Reduce burden hours by restructuring survey questions and scales;
4. Revise survey questions based on respondent feedback (e.g., correct confusing wording and repetitive questions);
5. Continue to measure customer satisfaction with the Public Assistance process and gauge performance.


The major changes are listed here and explained in more detail in Sections A.12 and A.15.


A. Justification


  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information. Provide a detailed description of the nature and source of the information to be collected.


The following legal authorities mandate the collection of the information in this request:


Executive Order 12862 of September 11, 1993, “Setting Customer Service Standards,” and its March 23, 1995 memorandum addendum, “Improving Customer Service,” require that all Federal agencies ask their customers what is most important to them, and survey their customers to determine the kind and quality of services the customers want and their level of satisfaction with existing services. The Government Performance and Results Act of 1993 (GPRA) requires agencies to set missions and goals, and to measure performance against them.


The E-Government Act of 2002 calls for innovative ways to improve the performance of governments, including collaboration in the use of information technology to improve the delivery of Government information and services.


The GPRA Modernization Act of 2010 requires quarterly performance assessments of Government programs for purposes of assessing agency performance and improvement, and establishes agency Performance Improvement Officers and the Performance Improvement Council. Executive Order 13571, “Streamlining Service Delivery and Improving Customer Service,” and its June 13, 2011 memorandum, “Implementing Executive Order 13571 on Streamlining Service Delivery and Improving Customer Service,” set out guidelines for establishing customer service plans and activities; they also expand the definition of customer and encourage the use of a broader set of tools to solicit actionable, timely customer feedback to capture insights and identify early warning signals.

The new Public Assistance (PA) surveys were submitted to FEMA’s Office of Policy and Program Analysis (OPPA) for review.  The survey items ultimately included in GPRA calculations are at the discretion of OPPA.  We have recommended that overall satisfaction with the PA program (Initial Survey), satisfaction with the simplicity of the PA process (Initial Survey), overall satisfaction with FEMA customer service (Assessment Survey), and how essential FEMA funding was to the organization’s disaster response and recovery (Assessment Survey) be considered for the Public Assistance GPRA measures.  These questions will give FEMA an overall gauge of performance at different points in the Public Assistance process.  Drops in overall satisfaction or customer service ratings will signal Public Assistance to examine specific survey questions more closely, pinpoint underlying causes of dissatisfaction, and identify possible strategies for improvement.


The information collection will assess customer satisfaction with the FEMA Public Assistance process. Applicants will be surveyed at the beginning and end of the Public Assistance process. Applicants surveyed at the beginning of the process may be eligible or ineligible for funding, whereas applicants surveyed at the end of the process will have all received funding. The Public Assistance Initial Survey, which is new to this collection, will assess whether applicants are satisfied with the service and materials they receive from FEMA at the onset of the process (after their first one-on-one meeting). The Public Assistance Assessment Survey will assess customer satisfaction throughout the entire process, and will be administered after applicants have funds obligated by FEMA for payment. Survey topics include knowledge and helpfulness of FEMA representatives, timeliness of the process, communication, accessibility of materials/information, and difficulty associated with various phases of the process. Specialized qualitative interviews (e.g., focus groups, one-on-one interviews, small group interviews) may be conducted periodically to assess program areas or program changes that the Public Assistance surveys do not capture.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Provide a detailed description of: how the information will be shared, if applicable, and for what programmatic purpose.


FEMA’s mission is to support the citizens of the United States and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards. FEMA uses the collected information to measure customer satisfaction, gauge performance against objectives, and make improvements that increase customer satisfaction.


This collection is the Public Assistance (PA) Customer Satisfaction Surveys, managed by the Recovery Directorate, through the Reporting & Analytics Division, Customer Survey & Analysis Section (CSA) of the Federal Emergency Management Agency. The purpose of the Public Assistance Customer Satisfaction Surveys is to assess customer satisfaction with the Public Assistance Program, and to improve the quality of service for applicants (State, Local, Tribal government, and eligible Private Non-Profit organizations) who have been affected by a disaster and receive funding. This collection of information has enabled FEMA managers to garner customer feedback and measure satisfaction against standards for performance and customer service in an efficient, timely manner, helping ensure that users have an effective, efficient, and satisfying experience with the Agency’s programs. The collection has allowed for ongoing, collaborative, and actionable communication between the Agency and its stakeholders. Results from the previous Public Assistance (PA) collection have given FEMA the ability to identify weaknesses in the PA program, as well as given applicants an opportunity to voice their concerns.


Several improvements to the Public Assistance Program were initiated based on survey results.  In the previous PA survey, applicants sometimes commented that it was difficult to find current information on the FEMA application process.  Public Assistance has worked to make information more readily available to applicants by ensuring website links are updated and that forms/templates provided to applicants are standardized.  Timeliness of funding was another survey item that often received lower ratings.  To improve satisfaction with timeliness, Public Assistance implemented procedures to expedite assistance.  For example, the PA Alternative Procedures Program and the new PA Model both have built-in mechanisms that allow applicants with smaller projects to receive assistance more quickly as long as they meet certain requirements. Turnover of FEMA representatives was another common complaint identified in previous survey data.  The new PA Program assigns each applicant one FEMA representative who guides the applicant throughout the process.  This should reduce miscommunication and lost paperwork, and build a sense of trust.  Another issue identified in previous survey data was that submitting paperwork and checking application status was burdensome.  To improve this part of the process, Public Assistance has built an online portal where applicants can now submit documentation and check their application status.  The goal of the online portal is to improve the simplicity of the application process.  Survey data have also prompted teaching moments with staff and informed training procedures.


Reports are usually distributed by email to stakeholders, which mainly include Public Assistance Leadership and the Recovery and Analytics Branch. Reports will be distributed on a quarterly basis, and will mainly include descriptive breakdowns of each question (e.g., means and percentages). Stakeholders may request reports more often than quarterly if they want to examine customer satisfaction for a given disaster, state, or FEMA Region. Demographic items will primarily be used to describe the sample, but statisticians may be asked to conduct more in-depth analyses using inferential statistics. This would be most likely if there were a significant drop in customer satisfaction from one quarter to the next, and stakeholders wanted to better understand the underlying causes. The reports are primarily used to monitor performance and identify areas of possible improvement.

3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


All survey responses are stored in the Customer Satisfaction Analysis System for easy retrieval, statistical analysis, and reporting.  Collection techniques include phone interviews as well as electronic submission of responses.  The Public Assistance Customer Satisfaction Surveys align with E-Government Act of 2002 and Executive Order 13571 of 2011 initiatives by allowing those who prefer electronic communications to complete and submit their survey responses electronically. The Customer Survey & Analysis Section, which administers the surveys, has survey software that expands functionality and connectivity, providing a means of reducing burden through the transition to electronic distribution and submission of surveys.


The information collection will utilize mixed-mode survey administration, which incorporates both telephone and internet surveying techniques. Applicant organizations responding to the FEMA Public Assistance Customer Satisfaction Surveys will be able to respond via phone call (computer-assisted telephone interviewing) or a web-based link. Applicants who have an email address on file will first receive an email invitation. If the applicant does not complete the survey via the web within a designated amount of time (approximately 2 weeks), interviewers will attempt to contact the respondent by phone. Allowing mixed-mode administration should reduce respondent burden and administration cost, although it will be important to examine the possibility of mode effects. Currently we only send fillable forms via email when a respondent makes a request, and this is typically a small percentage of the sample (<10%). In the last submission we accounted for internet completions, but we were unable to utilize this technology because we encountered problems upgrading our survey software (WinCATI) to the latest version. Internet surveys can only be sent outside of the firewall with the newest version of WinCATI, which will be operational when the updated collection is approved.

Response rates for our PA surveys are typically very high compared to the industry average, although it is unclear how response rates will differ by mode. Response rates for online administration are typically lower than for phone-administered surveys, but it is difficult to predict how much lower. For example, in a research study examining response rates by administration mode, telephone-administered surveys produced the highest response rates (30.2%), whereas internet-administered surveys had the lowest (4.7%; Sinclair et al., 2012). Response rates differ depending on variables such as survey length, convenience of administration, who the respondents are, and survey importance. For the past three years, the response rate for the Public Assistance Survey has averaged 70.17%.


For the updated collection, internet completions are estimated to be around 19% of the entire collection, although that percentage could increase as time goes on. Phone completions are expected to be approximately 74% of this collection, with the final 7% coming from qualitative interviews (focus groups, one-on-one interviews, small group interviews). If we exclude qualitative interviews from the calculations, we expect about 80% of the PA surveys to be phone administered and 20% of the surveys to be web administered. Allowing mixed-mode administration should improve response rates and reduce burden by allowing respondents to reply in their preferred administration method. Each administration method will have identical questions. The exception is qualitative interviews, which will vary depending on which program area needs to be assessed.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


The information gathered in the survey is not available from any other source.


5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.


There is no impact from this collection of information on small businesses or other small entities.


6. Describe the consequence to Federal/FEMA program or policy activities if the collection of information is not conducted, or is conducted less frequently as well as any technical or legal obstacles to reducing burden.


If FEMA’s surveys were not conducted, the consequence would be the absence of documentation about customer input on the quality and timeliness of disaster assistance for Public Assistance applicants. The survey results serve as a vital tool for measuring customer satisfaction and are a requirement of Executive Orders 12862 and 13571, and the resulting memoranda on “Streamlining Service Delivery and Improving Customer Service.” The surveys also measure the effectiveness of the Administrator’s Strategic Plan based on the Public Assistance applicant’s perspective.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:


 (a) Requiring respondents to report information to the agency more often than quarterly.


 (b) Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it.


 (c) Requiring respondents to submit more than an original and two copies of any document.


 (d) Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years.


 (e) In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study.


 (f) Requiring the use of a statistical data classification that has not been reviewed and approved by OMB.


 (g) That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use.


 (h) Requiring respondents to submit proprietary, trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


None of the special circumstances in Item 7 of the supporting statement apply to this information collection.



8. Federal Register Notice:



 a. Provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.



A 60-day Federal Register Notice inviting public comments was published on December 8, 2016, 81 FR 88696. FEMA received one positive comment supporting FEMA’s effort to survey their customers and FEMA’s commitment to continually improving the service provided to citizens during times of crisis. Please see the positive comment from Brenda Kohlmyer uploaded in ROCIS.

A 30-day Federal Register Notice inviting public comments was published on March 7, 2017, 82 FR 12823. No comments were received. See attached copy of the published notice included in this package.

 b. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Budget constraints have prevented FEMA from consulting with persons outside the agency.


Recovery Directorate and Public Assistance Program Managers were consulted for input about the data collected in the survey questionnaires and the reporting format. Additionally, two statisticians have been hired in the past year to offer their expertise on data collection, survey writing, and reporting. Both statisticians have worked on the new survey collection to ensure respondent burden is minimized, while also maximizing survey reliability and validity.


c. Describe consultations with representatives of those from whom information is to be obtained or those who must compile records. Consultation should occur at least once every three years, even if the collection of information activities is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

Budget constraints have prevented FEMA from contracting to consult with Public Assistance applicants since FY 2004, when FEMA’s Recovery Directorate contracted to perform four focus groups to ensure that the information collected was meaningful to customers and that the survey questions were clearly understood.


In 2016, small group interviews were conducted in six states to assess customer satisfaction with the Public Assistance Alternative Procedures Program. Although not the main focus of the interviews, applicants were asked if they had ever taken a customer satisfaction survey regarding their experience with Public Assistance. If they were familiar with the survey, they were asked if they had any feedback or suggestions for improvement. Small group interviews were also conducted in Oregon and Iowa to assess customer satisfaction with the redesigned Public Assistance Program (pilot test). During these interviews, applicants were shown sample survey questions from the new information collection and provided feedback on content and clarity. No additional focus groups have been held due to budget constraints.


Although direct contact with customers has been minimal, applicants often provide comments or feedback when completing the Public Assistance Survey. This feedback has been thoughtfully reviewed and applied in revising the current survey collection. Statisticians have conducted comment analysis on the survey results to extract themes from text boxes and “other” response options to identify topics important to customers that are not currently assessed. Additionally, phone interviewers provided survey writers with feedback regarding which survey items were consistently confusing to respondents.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


There are no payments or gifts to respondents for this data collection.


10. Describe any assurance of confidentiality provided to respondents. Present the basis for the assurance in statute, regulation, or agency policy.



A Privacy Threshold Analysis (PTA) was completed by FEMA and adjudicated by the DHS Privacy Office on September 23, 2016.


The Privacy Impact Assessment (PIA) is covered under the Department of Homeland Security FEMA/PIA-035 Customer Satisfaction Analysis System (CSAS), approved by DHS on February 27, 2014. The existing System of Records Notice (SORN) is DHS/FEMA-009 Hazard Mitigation, Disaster Public Assistance, and Disaster Loan Programs System of Records, 79 FR 16015, approved by DHS on March 24, 2014.


There are no assurances of confidentiality provided to the respondents for this information collection. Survey information is stored in the Customer Satisfaction Analysis System (CSAS), warehoused on secure FEMA servers. FEMA limits CSAS access to staff who are directly responsible for system administration and data collection functions. In addition to limiting system access, FEMA only shares aggregate survey response information with its components.



11. Provide additional justification for any question of a sensitive nature (such as sexual behavior and attitudes, religious beliefs and other matters that are commonly considered private). This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


There are no questions of a sensitive nature related to sexual behavior and attitudes, religious beliefs, or other matters that are commonly considered private in the surveys.


Some questions of a demographic nature have been added to help identify whether certain groups of people are more or less satisfied with FEMA customer service, although these questions are not personal or sensitive in nature. Examples include how long the respondent has worked in their current position, whether they have applied for PA disaster assistance previously, and the number of staff that worked on their FEMA project. It is possible that respondents with varying resources (staff) and relevant work experience interpret the difficulty associated with the Public Assistance process differently. Asking these questions will allow us to better identify whether we are serving all our customers equally, and whether our products and services need to be tailored to meet the needs of certain groups of people.


12. Provide estimates of the hour burden of the collection of information. The statement should:



 a. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated for each collection instrument (separately list each instrument and describe information as requested). Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desired. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.


Read aloud testing by FEMA interview staff was conducted to approximate average survey response times. Previous research in survey methodology suggests online surveys can be completed faster than telephone surveys (e.g., Szolnoki & Hoffmann, 2013; Duffy et al., 2005; Kellner, 2004). Based on these findings, we estimate internet versions will be completed 2 minutes faster on average.


Therefore, estimated total survey response time is 6 minutes for the PA Initial Survey (Phone) with a skilled interviewer, and 4 minutes for the PA Initial Survey (Internet). Estimated total response time is 9 minutes for the PA Assessment Survey (Phone) with a skilled interviewer, and 7 minutes for the PA Assessment Survey (Internet). For qualitative interviews, focus groups typically take 2 hours to conduct, plus 1 hour of round-trip travel time to the session, for a total of 3 hours. One-on-one or small group interviews typically take 1 hour to conduct, with no travel time.


Projected completions for the PA Initial Survey are based on an annual applicant population of 5,626 (eligible and ineligible applicants) and a 70.17% response rate based on a 3-year average. Projected completions for the PA Assessment Survey are based on an annual applicant population of 4,412 (eligible applicants only). For more information about estimated universe and projected completions, see Question 1 in Supporting Statement B.


In order to calculate projected completes for the PA Initial Survey, the estimated population of applicants (5,626) was multiplied by response rate (70.17%), which resulted in an estimate of 3,948 completions annually. In order to calculate projected completes for the PA Assessment Survey, the estimated population of applicants (4,412) was multiplied by the estimated response rate (70.17%), which resulted in an estimate of 3,096 completions annually.
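The projected-completion arithmetic above can be sketched as a short script (illustrative only; the population and response-rate figures are taken from the text):

```python
# Illustrative check of the projected annual completions.
RESPONSE_RATE = 0.7017   # 3-year average response rate

initial_population = 5626      # eligible and ineligible applicants
assessment_population = 4412   # eligible applicants only

initial_completes = round(initial_population * RESPONSE_RATE)        # 3,948
assessment_completes = round(assessment_population * RESPONSE_RATE)  # 3,096

print(initial_completes, assessment_completes)
```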


For each of our surveys (excluding qualitative interviews, the majority of which are administered in person), we expect about 20% to be completed via the internet and about 80% to be completed via phone.


For the PA Initial Survey, internet completions are estimated by multiplying 20% by 3,948, which would result in approximately 790 completions. Phone completions are estimated by multiplying 80% by 3,948, which would result in 3,158 completions.


For the PA Assessment Survey, internet completions are estimated by multiplying 20% by 3,096, which would result in approximately 619 completions. Phone completions are estimated by multiplying 80% by 3,096, which would result in 2,477 completions.
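The 80/20 phone/web split described above can likewise be sketched (figures taken from the text):

```python
# Illustrative check of the phone/web mode split for each survey.
for total in (3948, 3096):        # PA Initial, PA Assessment completions
    web = round(0.20 * total)     # 790 and 619
    phone = round(0.80 * total)   # 3,158 and 2,477
    print(total, web, phone)
```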


Qualitative interviews are conducted on a request basis (no set implementation schedule). These interviews may be conducted at the beginning or end of the Public Assistance process, depending on what type of information stakeholders are trying to gather. Completions for qualitative interviews were estimated from previous experience (see Q12b for more details). Participants in the qualitative interviews may have previously taken the PA Initial or PA Assessment Survey (depending on the point in the PA process at which the interviews take place), but prior participation is not a requirement. Qualitative interviews are used to gather more detailed information that cannot adequately be captured by a short, primarily quantitative survey measure. This most often occurs when Public Assistance implements a new program or policy that is not addressed in our surveys.


In the Question 12 figure below, the total estimated annual burden is 2,093 hours based on the following: 316 burden hours for the PA Initial Survey-Phone (6 minutes*3,158 completions), 53 burden hours for the PA Initial Survey-Internet (4 minutes*790 completions), 372 burden hours for the PA Assessment Survey-Phone (9 minutes*2,477 completions), 72 burden hours for the PA Assessment Survey-Internet (7 minutes*619 completions), and 1,280 burden hours for qualitative interviewing ((Focus Groups: 3 hours*360 participants) + (Interviews: 1 hour*200 participants)). This estimate accounts for a total of 7,604 completions across three survey measures. Some participants may be surveyed only once, while others may be surveyed on three separate occasions. Burden hours per respondent could range anywhere from 4-6 minutes on the low end (PA Initial Survey only) to 3 hours and 15 minutes on the high end (PA Initial Phone, 6 min; PA Assessment Phone, 9 min; and 1 focus group, 3 hours). The majority of applicants will complete the PA Initial and PA Assessment Surveys by phone, for a total of 15 burden minutes per participant.
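The burden-hour totals above can be reproduced with a short sketch (figures taken from the text; minutes are converted to hours and rounded per survey):

```python
# Illustrative check of the total annual burden-hour estimate.
surveys = {
    "Initial-Phone": (3158, 6),        # completions, minutes per response
    "Initial-Internet": (790, 4),
    "Assessment-Phone": (2477, 9),
    "Assessment-Internet": (619, 7),
}
survey_hours = {name: round(n * minutes / 60)
                for name, (n, minutes) in surveys.items()}

qualitative_hours = 360 * 3 + 200 * 1    # focus groups + interviews = 1,280
total_hours = sum(survey_hours.values()) + qualitative_hours
total_completions = sum(n for n, _ in surveys.values()) + 360 + 200

print(survey_hours)             # {..316, 53, 372, 72..}
print(total_hours)              # 2093
print(total_completions)        # 7604
```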


b. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.


Below, as well as in the Question 12 figure, is a description of the universe and hour burden by survey instrument:


[FEMA Form 519-0-32] PA Initial Survey (Telephone) may be conducted and gathered by phone, with responses stored electronically. The number of responses collected by phone is estimated to be 3,158 or approximately 41% of the whole collection with an hour burden of 316. It has been estimated to take 6 minutes for the applicant to complete the survey with a skilled interviewer.


[FEMA Form 519-0-33] PA Initial Survey (Internet) may be submitted through an internet link, with responses stored electronically. The number of responses collected by internet link is estimated to be 790, or approximately 10% of the whole collection, with an hour burden of 53. It is estimated to take 4 minutes for the applicant to complete the survey online.


[FEMA Form 519-0-34] PA Assessment Survey (Telephone) may be conducted and gathered by phone, with responses stored electronically. The number of responses collected by phone is estimated to be 2,477 or approximately 33% of the whole collection with an hour burden of 372. It has been estimated to take 9 minutes for the applicant to complete the survey with a skilled interviewer.


[FEMA Form 519-0-35] PA Assessment Survey (Internet) may be submitted through an internet link, with responses stored electronically. The number of responses collected by internet link is estimated to be 619, or approximately 8% of the whole collection, with an hour burden of 72. It is estimated to take 7 minutes for the applicant to complete the survey online.


Qualitative interviews will most likely be conducted in person or by phone. For focus groups, the number of participants is estimated to be 360, with an hour burden of 1,080. The number of focus group participants was calculated by estimating 3 sessions with 12 applicants per focus group in each of the 10 FEMA Regions (3 sessions*12 applicants*10 Regions = 360 participants). This estimate is the same as the one used in the previous collection, which was adequate to meet all requests from stakeholders. The length of each focus group is estimated to be 2 hours, with an additional 1 hour of round-trip travel time, for a total of 3 hours per participant (360 participants*3 hours = 1,080 burden hours). For interviews, the number of participants is estimated to be 200, with an hour burden of 200. Because the Public Assistance Program is undergoing significant changes, we anticipate more requests from stakeholders to conduct interviews with applicants. Based on our previous experience and conversations with stakeholders, we estimated 2 participants per 1-hour interview, with 10 interviews in each of the 10 FEMA Regions (2 participants*10 interviews*10 FEMA Regions = 200 participants, at 1 hour each = 200 hours). No travel time is required for applicants. This totals 1,280 hours for qualitative interviews, which comprises approximately 7% of the whole collection.
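The qualitative-interview estimates above reduce to simple products, sketched here for verification (figures taken from the text):

```python
# Illustrative check of the focus-group and interview burden estimates.
focus_group_participants = 3 * 12 * 10   # 3 sessions * 12 applicants * 10 Regions = 360
focus_group_hours = focus_group_participants * 3   # 2 h session + 1 h travel = 1,080

interview_participants = 2 * 10 * 10     # 2 participants * 10 interviews * 10 Regions = 200
interview_hours = interview_participants * 1       # 1 h each, no travel = 200

print(focus_group_hours + interview_hours)  # 1280
```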

c. Provide an estimate of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. NOTE: The wage-rate category for each respondent must be multiplied by 1.4 and this total should be entered in the cell for “Avg. Hourly Wage Rate”. The cost to the respondents of contracting out or paying outside parties for information collection activities should not be included here. Instead this cost should be included in Item 13.


See Question 12 figure below. For type of respondent, historical data (2-year average) shows 10% of Public Assistance applicants are non-profit institutions, and 90% are state, local, or tribal government. Projected number of respondents, burden hours, and respondent costs are calculated accordingly.



Question 12: Estimated Annualized Hour Burden and Costs

Telephone and Internet Surveys:

| Type of Respondent | Form Name / Form Number | No. of Respondents | No. of Responses per Respondent | Avg. Burden per Response (minutes) | Total Annual Burden (hours) | Avg. Hourly Wage Rate (×1.4) | Total Annual Respondent Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Non-Profit institutions | Public Assistance Initial Customer Satisfaction Survey, FEMA Form 519-0-32 (Telephone) | 316 | 1 | 6 | 32 | $46.73 | $1,495.36 |
| State, Local or Tribal Government | | 2,842 | 1 | 6 | 284 | $67.54 | $19,181.36 |
| Sub-Total | | 3,158 | | | 316 | | $20,676.72 |
| Non-Profit institutions | Public Assistance Initial Customer Satisfaction Survey, FEMA Form 519-0-33 (Internet) | 79 | 1 | 4 | 5 | $46.73 | $233.65 |
| State, Local or Tribal Government | | 711 | 1 | 4 | 48 | $67.54 | $3,241.92 |
| Sub-Total | | 790 | | | 53 | | $3,475.57 |
| Non-Profit institutions | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form 519-0-34 (Telephone) | 248 | 1 | 9 | 37 | $46.73 | $1,729.01 |
| State, Local or Tribal Government | | 2,229 | 1 | 9 | 335 | $67.54 | $22,625.90 |
| Sub-Total | | 2,477 | | | 372 | | $24,354.91 |
| Non-Profit institutions | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form 519-0-35 (Internet) | 62 | 1 | 7 | 7 | $46.73 | $327.11 |
| State, Local or Tribal Government | | 557 | 1 | 7 | 65 | $67.54 | $4,390.10 |
| Sub-Total | | 619 | | | 72 | | $4,717.21 |
| Total (Telephone and Internet) | | 7,044 | | | 813 | | $53,224.41 |

Other: Qualitative Surveys (average burden per response is in hours):

| Type of Respondent | Activity | No. of Respondents | No. of Responses per Respondent | Avg. Burden per Response (hours) | Total Annual Burden (hours) | Avg. Hourly Wage Rate (×1.4) | Total Annual Respondent Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Non-Profit institutions | Focus Groups: 12 participants per session, 3 sessions in each of 10 Regions; each session lasts 2 hours, with an additional hour for travel (3 hours total) | 36 | 1 | 3 | 108 | $46.73 | $5,046.84 |
| State, Local or Tribal Government | | 324 | 1 | 3 | 972 | $67.54 | $65,648.88 |
| Sub-Total | | 360 | | | 1,080 | | $70,695.72 |
| Non-Profit institutions | Interviews: 2 participants per 1-hour interview, 10 interviews in each of the 10 Regions; travel not required (1 hour total) | 20 | 1 | 1 | 20 | $46.73 | $934.60 |
| State, Local or Tribal Government | | 180 | 1 | 1 | 180 | $67.54 | $12,157.20 |
| Sub-Total | | 200 | | | 200 | | $13,091.80 |
| Total (Qualitative Surveys) | | 560 | | | 1,280 | | $83,787.52 |
| Total (All Instruments) | | 7,604 | | | 2,093 | | $137,011.93 |

  • Note: The “Avg. Hourly Wage Rate” for each respondent includes a 1.4 multiplier to reflect a fully-loaded wage rate.


According to the U.S. Department of Labor, Bureau of Labor Statistics website (www.bls.gov), the wage rate for all workers in non-profit professions is $33.38 per hour; with the 1.4 multiplier, this is $46.73 per hour. The estimated burden-hour cost to respondents from Non-Profit institutions is therefore $9,766.57.


According to the U.S. Department of Labor, Bureau of Labor Statistics website (www.bls.gov), the wage rate for state and local officials, based on all management occupations, is $48.24 per hour; with the 1.4 multiplier, this is $67.54 per hour. The estimated burden-hour cost to respondents from State, Local or Tribal Government is therefore $127,245.36.


Therefore, the total annual respondent cost is $137,011.93.
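As an illustrative cross-check (not part of the official submission), the respondent-cost totals follow from the burden hours and fully loaded wage rates in the Question 12 figure; the variable names below are our own:

```python
# Illustrative cross-check of the respondent cost totals above.
# Hours and base wage rates are copied from this section.
nonprofit_rate = round(33.38 * 1.4, 2)   # $46.73 fully loaded
govt_rate = round(48.24 * 1.4, 2)        # $67.54 fully loaded
nonprofit_hours = 32 + 5 + 37 + 7 + 108 + 20     # 209 hours across all instruments
govt_hours = 284 + 48 + 335 + 65 + 972 + 180     # 1,884 hours across all instruments
nonprofit_cost = round(nonprofit_hours * nonprofit_rate, 2)   # $9,766.57
govt_cost = round(govt_hours * govt_rate, 2)                  # $127,245.36
total_cost = round(nonprofit_cost + govt_cost, 2)             # $137,011.93
```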


13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. (Do not include the cost of any hour burden shown in Items 12 and 14.)


The cost estimates should be split into two components:


a. Operation and Maintenance and purchase of services component. These estimates should take into account cost associated with generating, maintaining, and disclosing or providing information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred.


b. Capital and Start-up-Cost should include, among other items, preparations for collecting information such as purchasing computers and software, monitoring sampling, drilling and testing equipment, and record storage facilities.



Question 13. Annual Cost Burden to Respondents or Record-keepers

| Data Collection Activity/Instrument | *Annual Capital Start-Up Cost (investments in overhead, equipment, and other one-time expenditures) | *Annual Operations and Maintenance Cost (such as recordkeeping, technical/professional services, etc.) | Annual Non-Labor Cost (expenditures on training, travel, and other resources; *see Note below) | Total Annual Cost to Respondents |
| --- | --- | --- | --- | --- |
| Focus Group Travel | N/A | N/A | $11,664.00 | $11,664.00 |


Note: The Annual Non-Labor Cost for travel to focus groups is based on the U.S. General Services Administration (GSA) mileage rate for Privately Owned Vehicles (POV) effective January 1, 2016, of $0.54 per mile. Travel to a focus group is not to exceed 30 miles one way, or 60 miles round trip. Using this information: 60 miles round trip × 360 respondents = 21,600 miles × $0.54 per mile = $11,664 annual cost for mileage.
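The mileage arithmetic can be sketched as follows (an illustrative check using the figures above; the variable names are our own):

```python
# Sketch of the focus-group travel cost estimate above.
gsa_rate = 0.54           # 2016 GSA POV mileage rate, dollars per mile
round_trip_miles = 60     # maximum of 30 miles each way
respondents = 360         # total focus group participants
total_miles = round_trip_miles * respondents      # 21,600 miles
annual_cost = round(total_miles * gsa_rate, 2)    # $11,664.00
```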


14. Provide estimates of annualized cost to the federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing and support staff), and any other expense that would have been incurred without this collection of information. You may also aggregate cost estimates for Items 12, 13, and 14 in a single table.




Annualized Cost to the Federal Government

Administration and performance of surveys, analysis and reporting, recommendations for improvement, database and desktop application, and maintenance of survey tools.

Staff Salaries:

| Role | Title and GS Level | 2016 Salary (Dallas-Ft. Worth locality pay) | No. of Staff at GS Level | Multiplier | Cost (fully loaded, ×1.4) | % of Time on Collection | Total Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Management, survey administration | Section Manager, GS-14 Step 5 | $123,228 | 1 | 1.4 | $172,519.20 | 16% | $27,603.07 |
| Administrative Assistant | Administrative Assistant, GS-6 Step 5 | $43,216 | 1 | 1.4 | $60,502.40 | 16% | $9,680.38 |
| Program Analyst | Program Analyst, GS-12 Step 5 | $85,189 | 2 | 1.4 | $238,529.20 | 16% | $38,164.67 |
| Supervisory, survey administration | Supervisory Customer Service Specialist, GS-13 Step 5 | $101,303 | 1 | 1.4 | $141,824.20 | 16% | $22,691.87 |
| Project management, administer survey program, recommend improvements, oversee reports and software application implementation, testing and maintenance of survey tools | Customer Satisfaction Analyst, GS-12 Step 5 | $85,189 | 4 | 1.4 | $477,058.40 | 16% | $76,329.34 |
| Statistician: OMB compliance, data analysis and reporting | Customer Satisfaction Analyst, GS-12 Step 5 | $85,189 | 2 | 1.4 | $238,529.20 | 16% | $38,164.67 |
| Survey management: administer surveys and focus groups, prepare sample, track data, analyze survey data, write reports and recommend improvements, software application implementation, testing and maintenance of survey tools | Customer Service Specialist, GS-11 Step 5 | $71,073 | 6 | 1.4 | $597,013.20 | 16% | $95,522.11 |
| Supervisory, QC, training administration | Supervisory Customer Service Specialist, GS-11 Step 5 | $71,073 | 1 | 1.4 | $99,502.20 | 16% | $15,920.35 |
| QC, training | Customer Service Specialist, GS-11 Step 5 | $71,073 | 2 | 1.4 | $199,004.40 | 16% | $31,840.70 |
| Supervisory, survey administration | Supervisory Customer Service Specialist, GS-12 Step 5 | $85,189 | 2 | 1.4 | $238,529.20 | 16% | $38,164.67 |
| Survey and special projects | Customer Service Specialist, GS-9 Step 5 | $58,742 | 19 | 1.4 | $1,562,537.20 | 16% | $250,005.95 |
| Subtotal | | | 41 | | $4,025,548.80 | 16% | $644,087.81 |

Other Costs:

| Item | Percent of Cost | Total Cost |
| --- | --- | --- |
| Facilities (cost for renting, overhead, etc., for data collection activity) | 16% | $10,961.56 |
| Computer hardware and software (cost of equipment annual lifecycle) | 16% | $5,619.64 |
| Equipment maintenance (cost of annual maintenance/service agreements for equipment) | 16% | $33,975.84 |
| Travel | | $0.00 |
| Other: long-distance phone charges at a 21.26% average (number of phone collections × minutes × cost), toll-free phone line at 100%, and office supplies at 16% | Varies | $2,881.52 |
| Subtotal | | $53,438.56 |
| Total | | $697,526.37 |


* Note: The “Salary Rate” includes a 1.4 multiplier to reflect a fully loaded wage rate. The Office of Personnel Management (OPM) publishes Federal salaries annually. The staff administering the surveys work in the Dallas-Fort Worth area, and their estimated pay rates reflect that locality.
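As an illustrative check (not part of the official table), the salary subtotal and grand total above can be recomputed from the (salary, staff count) pairs in the table; the variable names below are our own:

```python
# Illustrative recomputation of the federal cost table above.
# (salary, staff) pairs and other-cost figures are copied from the table.
multiplier, pct_time = 1.4, 0.16
staff = [(123228, 1), (43216, 1), (85189, 2), (101303, 1), (85189, 4),
         (85189, 2), (71073, 6), (71073, 1), (71073, 2), (85189, 2), (58742, 19)]
loaded_salaries = sum(s * multiplier * n for s, n in staff)    # $4,025,548.80
salary_subtotal = round(loaded_salaries * pct_time, 2)         # $644,087.81
other_costs = 10961.56 + 5619.64 + 33975.84 + 0.00 + 2881.52   # $53,438.56
grand_total = round(salary_subtotal + other_costs, 2)          # $697,526.37
```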



15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I in a narrative form. Present the itemized changes in hour burden and cost burden according to program changes or adjustments in Table 5. Denote a program increase as a positive number, and a program decrease as a negative number.

A "Program increase" is an additional burden resulting from a federal government regulatory action or directive. (e.g., an increase in sample size or coverage, amount of information, reporting frequency, or expanded use of an existing form). This also includes previously in-use and unapproved information collections discovered during the ICB process, or during the fiscal year, which will be in use during the next fiscal year.

A "Program decrease", is a reduction in burden because of: (1) the discontinuation of an information collection; or (2) a change in an existing information collection by a Federal agency (e.g., the use of sampling (or smaller samples), a decrease in the amount of information requested (fewer questions), or a decrease in reporting frequency).

"Adjustment" denotes a change in burden hours due to factors over which the government has no control, such as population growth, or in factors which do not affect what information the government collects or changes in the methods used to estimate burden or correction of errors in burden estimates.

The previously approved collection was composed of one Public Assistance Customer Satisfaction Survey that was administered through a variety of methods (telephone, internet, fillable forms, mail, or fax). We were not able to administer the survey via a web link due to technical limitations: the software upgrade that enabled online administration outside the firewall was delayed. The survey software will be operational before the revised collection is approved, so the internet forms are included in the updated collection.

The revised collection has the same purpose and assesses the same subject matter, but we have streamlined the process to reduce response burden and improve data validity. The updated collection is split into two short surveys with fewer overall questions. The new sampling method will allow for easier respondent recall.

Total Program Decrease to Burden Hours = 2,093 (current) – 4,341 (previous) = (-2,248 hours)

The overall burden change for the Public Assistance Surveys is a program decrease. Previous universe estimations were based on years with extremely high disaster activity, whereas the current estimations are based on more variable data (years with high and low activity). Additionally, the current collection assumes a 70.17% response rate based on the current response rate and calculates burden hours accordingly, whereas the previous collection calculated burden hours for the entire population. The number and length of questions have also decreased, even with the current collection having an additional survey. The estimated annual population was 12,740 in the previous collection. The current collection estimates an annual population of 5,626 respondents, with some of those respondents being surveyed at two different time points.

In the previous Public Assistance Survey Collection, applicants were surveyed 6 to 9 months after the disaster declaration. At this point in time, some applicants had been finished for months, some applicants were close to completion but still waiting on funding, and a few applicants with large projects still had years left until completion. Applicants who weren’t finished couldn’t answer questions about things like speed of funding, whereas applicants who were finished shortly after the disaster declaration (e.g., quick debris removal projects) had trouble remembering interactions with FEMA because 6+ months had passed. In order to address this issue, we decided to survey based on where applicants were in the Public Assistance Process instead of what disaster affected them. The sample for the PA Initial Survey will include applicants who completed a Recovery Scope Meeting (initial one-on-one meeting with FEMA) in the month we are sampling, whereas the PA Assessment Survey will include applicants who had funds obligated in the month we are sampling. Because we are surveying so early for the PA Initial Survey, it is possible that some applicants may not have received an eligibility decision. Ineligible and eligible applicants will likely be in the respondent pool. Applicants who receive an ineligible decision are still serviced by FEMA and can comment on the initial service provided by FEMA.

Public Assistance will have flags in their system signifying whether someone has had a Recovery Scope Meeting or has had funds obligated, but these flags likely will not be operational until early next year. Currently everything is in a transitional state: some disasters are being completed under the old program, and some under the new program. Until we are able to pull the sample based on these flags, the PA Initial Survey will be conducted 60 days after the initial disaster declaration (enough time to complete a Recovery Scope Meeting), and the PA Assessment Survey will be conducted 210 days following the disaster declaration (by which point most applicants will have funds obligated).

Among the large-scale changes in the new Public Assistance Program is an online portal where applicants can upload documentation and check the status of their application. Previously, all paperwork was submitted via email, fax, or in person, which was time consuming and often led to lost documentation. The new survey asks applicants whether they have used the portal and how satisfied they are with various aspects of it. Additionally, more standardization has been implemented at the beginning of the PA process regarding when initial phone calls and meetings occur and what documentation and instructions applicants should receive; this is intended to improve consistency within the process. The PA Initial Survey asks whether applicants received the appropriate documentation, whether they spoke to FEMA representatives about special topics (e.g., hazard mitigation), and whether they were informed of significant deadlines. In the old program, applicants may have dealt with several FEMA representatives, whereas the new program assigns each applicant one Program Delivery Manager (PDM) who guides the applicant throughout the process. In previous interviews that assessed pilot versions of the new program, we learned that demographic data such as previous FEMA experience influenced applicant perceptions, and we have incorporated such items into the new surveys. Lastly, terminology has changed with the new program, and we have updated it in the new collection.

15a) Change in Annual Hour Burden by Instrument:


  • PA Initial (Phone) is a new survey that the previous collection did not include, with an increase in annual burden hours of:

  • 316 hours currently - 0 hours previously = (+316 hours).

  • Program increase due to new form.


  • PA Initial (Internet) is a new survey that the previous collection did not include, with an increase in annual burden hours of:

  • 53 hours currently - 0 hours previously = (+53 hours).

  • Program increase due to new form.

  • PA Assessment (Phone) replaces Public Assistance Customer Satisfaction Survey (Phone), with a change in annual burden hours of:

  • 372 hours currently - 2,600 hours previously = (- 2,228 hours).

  • Program decrease due to smaller samples, fewer and shorter questions, and less frequent sampling.


  • PA Assessment (Internet) replaces Public Assistance Customer Satisfaction Survey (Web), with a change in annual burden hours of:

  • 72 hours currently - 413 hours previously = (-341 hours).

  • Program decrease due to smaller samples, fewer and shorter questions, and less frequent sampling.


  • The following methods will be discontinued in the new collection (program decrease; these methods were rarely used, and the online and telephone methods should be adequate):

    • Online Fillable: (-206 burden hours)

    • Fax: (-21 burden hours)

    • Mail: (-21 burden hours)


  • For qualitative interviews, annual burden hours are:

  • 1,280 hours currently - 1,080 hours previously = (+200 hours)

  • Program increase due to increases in the types of information to be gathered. Public Assistance is currently revamping its program. The Public Assistance process is constantly evolving, and qualitative interviews add much-needed flexibility in gaining insights into customer satisfaction with specific changes that are not captured in the surveys. Interviews are also useful because it is sometimes difficult to gather enough respondents in a concentrated area to conduct Public Assistance focus groups.


  • Breakdown: Focus Groups: 1,080 hours currently - 1,080 hours previously = (no change).

Interviews: 200 hours currently - 0 hours previously = (+200 hours).


Note. In the previous collection, interviews were specified as a possible methodology under focus groups. The current collection disentangles the two methods to derive more accurate hour estimates, but both methods are still considered specialized qualitative interviews.


Question 15 a: Itemized Changes in Annual Hour Burden

| Data Collection Instrument | Survey Administration Mode | Hours Currently on OMB Inventory | New Hours | Difference | Explanation |
| --- | --- | --- | --- | --- | --- |
| Public Assistance Initial Survey, FEMA Form 519-0-32 | Telephone | 0 | 316 | +316 | Program increase due to new form |
| Public Assistance Initial Survey, FEMA Form 519-0-33 | Internet | 0 | 53 | +53 | Program increase due to new form |
| Public Assistance Customer Satisfaction Survey, FEMA Form 519-0-1T (now Public Assistance Assessment Survey, FEMA Form 519-0-34) | Telephone | 2,600 | 372 | -2,228 | Decrease due to smaller samples, fewer and shorter questions, and less frequent sampling |
| Public Assistance Customer Satisfaction Survey (Web), FEMA Form 519-0-1INT (now Public Assistance Assessment Survey, FEMA Form 519-0-35) | Internet | 413 | 72 | -341 | Decrease due to smaller samples, fewer and shorter questions, and less frequent sampling |
| Public Assistance Customer Satisfaction Survey, FEMA Form 519-0-1, fillable, sent by email/electronically | Email fillable form | 206 | 0 | -206 | Decrease due to discontinuation of administration method |
| Public Assistance Customer Satisfaction Survey, FEMA Form 519-0-1, fillable, sent by fax | Fax fillable form | 21 | 0 | -21 | Decrease due to discontinuation of administration method |
| Public Assistance Customer Satisfaction Survey, FEMA Form 519-0-1, fillable, sent by mail | Mail fillable form | 21 | 0 | -21 | Decrease due to discontinuation of administration method |
| Qualitative Interviews | Focus groups, one-on-one interviews, small-group interviews | 1,080 | 1,280 | +200 | Program increase due to increases in types of information |
| Total | | 4,341 | 2,093 | -2,248 | |

Note: There were no program “adjustments” for item 15a.



15b) Change in Annual Cost by Instrument (see table 15b on following page)

  • PA Initial (Phone) is a new survey that the previous collection did not include, with an annual increase of $20,676.72. Program increase due to new form.


  • PA Initial (Internet) is a new survey that the previous collection did not include, with an annual increase of $3,475.57. Program increase due to new form.


  • PA Assessment (Phone) replaces the Public Assistance Customer Satisfaction Survey (Phone), with an annual decrease of $120,538.70. Program decrease due to smaller samples, fewer and shorter questions, and less frequent sampling.


  • PA Assessment (Internet) replaces the Public Assistance Customer Satisfaction Survey (Internet), with an annual decrease of $18,281.78. Program decrease due to smaller samples, fewer and shorter questions, and less frequent sampling.


  • The following methods will be discontinued in the new collection, resulting in an annual cost decrease (program decrease; these methods were rarely used, and the online and telephone methods should be adequate):

    • Online Fillable: (-$11,499.49)

    • Fax: (-$1,149.95)

    • Mail: (-$1,149.95)

  • For qualitative interviews, there is an annual cost increase of $23,596.31. Program increase due to increases in the types of information that need to be gathered. Additionally, there has been a wage increase since the last submission:


    • Wage Increase for Non-Profit Workers: $32.68 previously, $46.73 currently (includes 1.4 multiplier).

    • Wage Increase for State, Local, or Tribal Government: $61.14 previously, $67.54 currently (includes 1.4 multiplier).


  • For the total collection, there is a cost decrease of $104,871.27 (see table 15b).

Question 15 b: Itemized Changes in Annual Costs

| Data Collection Instrument | Survey Administration Mode | Cost Currently on OMB Inventory | New Cost | Difference | Explanation |
| --- | --- | --- | --- | --- | --- |
| Public Assistance Initial Survey, FEMA Form 519-0-32 | Telephone | $0.00 | $20,676.72 | +$20,676.72 | Program increase due to new form |
| Public Assistance Initial Survey, FEMA Form 519-0-33 | Web | $0.00 | $3,475.57 | +$3,475.57 | Program increase due to new form |
| Public Assistance Customer Satisfaction Survey, FEMA Form 519-0-1T (now Public Assistance Assessment Survey, FEMA Form 519-0-34) | Telephone | $144,893.61 | $24,354.91 | ($120,538.70) | Decrease due to smaller samples and shorter administration |
| Public Assistance Customer Satisfaction Survey (Web), FEMA Form 519-0-1INT (now Public Assistance Assessment Survey, FEMA Form 519-0-35) | Web | $22,998.99 | $4,717.21 | ($18,281.78) | Decrease due to smaller samples and shorter administration |
| Public Assistance Customer Satisfaction Survey, FEMA Form 519-0-1, fillable, sent by email/electronically | Web fillable form | $11,499.49 | $0.00 | ($11,499.49) | Decrease due to discontinuation of administration method |
| Public Assistance Customer Satisfaction Survey, FEMA Form 519-0-1, fillable, sent by fax | Fax fillable form | $1,149.95 | $0.00 | ($1,149.95) | Decrease due to discontinuation of administration method |
| Public Assistance Customer Satisfaction Survey, FEMA Form 519-0-1, fillable, sent by mail | Mail fillable form | $1,149.95 | $0.00 | ($1,149.95) | Decrease due to discontinuation of administration method |
| Qualitative Interviews | Focus groups, small-group interviews, one-on-one interviews | $60,191.21 | $83,787.52 | +$23,596.31 | Program increase due to increases in amount of information and increases in wages |
| Total | | $241,883.20 | $137,011.93 | ($104,871.27) | |

Note: There were no program “adjustments” for item 15b.

16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.



FEMA does not intend to employ the use of statistics or the publication thereof for this information collection.


We will provide reports to Public Assistance management and Headquarters management on a quarterly basis. These reports will include a breakdown of each question (basic descriptive statistics: averages and percentages) as well as an overall analysis of patterns seen in the data each quarter and trends over time. Data can also be broken down by region, disaster, state, etc., depending on the needs of Public Assistance, so stakeholders may occasionally request reports more frequently than quarterly.


Statisticians may be asked to perform more in-depth analysis if there is a significant drop in customer satisfaction scores and stakeholders want to understand why. This may involve correlations, t-tests, crosstabs with Pearson's chi-square, and analysis of variance (ANOVA). Demographic data will typically be used to describe the sample of respondents, but statisticians may also look for differences in satisfaction across demographic groups if a more in-depth analysis is requested.


17. If seeking approval not to display the expiration date for OMB approval of the information collection, explain reasons that display would be inappropriate.


FEMA will display the expiration date for OMB approval of this information collection.



18. Explain each exception to the certification statement identified in Item 19 “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.


FEMA does not request an exception to the certification of this information collection.



Contact Information

Kristin Brooks, Ph.D.

Statistician

Customer Survey and Analysis Section

Reports and Analytics Division, FEMA

[email protected]

Office: (940) 891-8579; Alt phone: (310) 569-3347



