
U.S. Small Business Administration

SBA Form 2313, Customer Satisfaction Survey

OMB Control Number 3245-0370

Justification – Part A Supporting Statement


Overview of Information Collection: Provide a brief overview of the information being collected, disclosed, or the recordkeeping requirement imposed by the agency.


The Small Business Administration is authorized to make loans to survivors of declared disasters for the purpose of restoring their damaged property to, as near as possible, pre-disaster conditions. SBA is also authorized to make loans to businesses that have suffered an economic loss as a result of the disaster. This authority is found in 15 U.S.C. 636 as amended (copy attached). SBA’s Office of Disaster Assistance provides customer service to individual and business loan applicants on the phone and via email through its Disaster Assistance Customer Service Center (DACSC) and in-person through its Field Operations Centers (FOCs).


SBA is seeking renewal of an approved survey to collect basic customer satisfaction data regarding service provided at the DACSC and FOCs. The survey questions remain unchanged except for the removal of two open-ended questions: (1) "In addition, is there anything else I can assist you with?" and (2) "Based on your experience with the SBA, do you have any suggestions for making the process easier?" This renewal also seeks authorization to expand the scope of the collection to include automated means: an emailed invitation to an online form and a fully automated post-call telephone survey. Possible collection methods also include the option of a live (telephone) interview conducted by DACSC employees.


The proposed fully automated approach will reduce the on-going burden to the government to a negligible amount. The capability for implementing this new approach was made possible as the result of a recent upgrade to the DACSC telecommunications system which includes the required functionality. No additional equipment acquisition or capital expenditures are expected to automate the survey.


  1. Need & Method for the Information Collection. Explain the circumstances that make the collection of information necessary.


The DACSC is the national contact center for SBA’s Office of Disaster Assistance (ODA).  Operating from offices in Buffalo, New York, and Citrus Heights, California, the DACSC provides customer support to disaster survivors throughout the United States and U.S. Territories.  The DACSC averaged more than 8 million calls annually over the past three years, due largely to the increased activity resulting from pandemic-related loan programs. A typical call volume of 45,000 calls per month is more representative of the future activity to which this survey would apply. Customer Service Representatives (CSRs) at the DACSC respond to a variety of SBA inquiries concerning the disaster loan program.  ODA also operates two Field Operations Centers – the FOC-East in Atlanta, Georgia and the FOC-West in Citrus Heights, California. The FOCs deploy CSRs to staff temporary disaster recovery centers and SBA disaster loan outreach centers in disaster-affected locales. During a typical year, the FOCs deploy hundreds of CSRs to the field to aid tens of thousands of disaster survivors.



The DACSC and FOCs use various ‘output’ metrics to assess effectiveness. Key Performance Indicators (KPIs) for the call center, including wait times, abandonment rates, and average call handling times, are tracked and compared with industry benchmarks. Similarly, the FOCs track productivity including customer contacts, applications accepted, and summary declines processed. While these output measures provide production information, they are not considered to be effective indicators of ‘customer satisfaction.’ A customer satisfaction survey is more “outcome” oriented and a much better indicator of the overall effectiveness of the program.


Information may be collected by the phone system if completed with automation, by trained Quality Assurance staff if conducted manually, or by a web-based application if the survey is conducted using email.


  2. Use of the Information. Indicate how, by whom, and for what purpose the information is to be used (e.g., program administration, application for benefits or services, regulatory compliance, inform policy development).


The DACSC may conduct its customer surveys through a variety of methods including automated telephone applications, email and/or live interviews with trained specialists. Regardless of method, the survey will ask a sample of callers a few brief questions for purposes of gauging customer satisfaction with the DACSC and FOCs (the Centers).  The results are strictly used internally to evaluate performance and provide management with timely feedback regarding areas of concern.  Customer satisfaction surveys are commonly used by successful organizations in both the private and public sectors, and SBA’s interest in this data demonstrates its commitment to delivering quality customer service for the nation’s taxpayers. This information will not be utilized by other Federal agencies.


  3. Use of Information Technology. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


The DACSC may administer the survey telephonically using automated means, through a web-based application (emailed invitation), or at times, manually, using live phone agents.  An automated approach allows for a broader reach, as the survey can be offered to every caller.  At times, the DACSC may elect to contact customers and conduct the survey “live” using active agents to conduct the interview. This approach provides the ability for the survey specialist to address any customer service concerns raised by the respondent during the survey.  The results will be captured through the DACSC’s phone system and saved to a secure cloud-based database for analysis and reporting purposes.


  4. Non-duplication. Describe efforts to identify duplication.


The SBA disaster loan program participates annually in a survey conducted as part of the “American Customer Satisfaction Index” (ACSI).  This survey, while more comprehensive than the proposed DACSC survey, is administered only once a year, making it of limited value for assessing real-time customer satisfaction.  The infrequent nature of the ACSI survey also renders it ineffective for identifying areas of concern in a timely manner.  Additionally, because the annual ACSI survey is geared toward assessing customer satisfaction on a broad level, it is often difficult to correlate its results to specific work units.  For example, when surveying a respondent about “customer service” (after multiple interactions spanning several weeks), it can often be difficult for a customer to attribute their experience to a specific business unit within SBA.


From an operational perspective, a survey designed to elicit timely feedback based on a specific interaction and work unit (e.g., the DACSC or the FOC) is considered a better indicator of on-going customer satisfaction, and provides managers with timely feedback to address problems as they occur, rather than months after, as is the case with the ACSI.  It also provides the ability to link specific agents to surveys, which will enhance development and accountability processes at the DACSC.  We believe an on-going survey is beneficial for providing the type of specific, targeted, timely, and actionable feedback that will make a difference in the customer experience at the DACSC.


  5. Burden on Small Business. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden. Did the agency consider any exemptions, alternate options, or partial or delayed compliance options for small businesses?

This survey will not have a significant economic impact on small businesses or other small entities.

  6. Less Frequent Collection. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Failure to implement the proposed methodology will limit ODA’s ability to accurately assess customer satisfaction levels and therefore, will affect management’s ability to take appropriate measures to improve delivery of critical financial assistance to disaster survivors.


  7. Paperwork Reduction Act Guidelines. Explain any special circumstances that would cause an information collection to be conducted in a manner…


There are no special circumstances.


  8. Consultation and Public Comments. Provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB.


Comments were solicited in a Federal Register notice published on January 28, 2022, at 87 FR 4703, copy attached. The comment period closed on March 29, 2022, and no comments were received.


  9. Gifts or Payment. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


There are no payments or gifts to respondents.


  10. Privacy & Confidentiality. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


No assurance of confidentiality is provided. The data captured through this survey will be maintained in a secure database accessible by a small number of authorized users. Management reports are not specifically linked to any person or entity, but rather depict the aggregate results of surveys administered over a specified period. The information provided is subject to disclosure under the Freedom of Information Act (5 U.S.C. §552). The Agency will not collect any personally identifiable information.


  11. Sensitive Questions. Provide additional justification for any questions of a sensitive nature.


No sensitive questions are asked.


  12. Burden Estimate. Provide estimates of the burden of the collection of information.


When using the automated process, customers will be given the choice of whether or not to participate in the survey; for these types of surveys, it is common for a relatively small percentage of customers to opt in. For those agreeing to participate, phone-based surveys will be completed at the conclusion of the call.

Automated (electronic) phone surveys will be completed using a text-to-speech feature within the phone system. Participants will respond to questions posed by the Interactive Voice Response (IVR) system within the DACSC’s phone system using their telephone keypad. When the outbound manual method is used, the survey requirement will be based on a random sample of calls to the DACSC using a 90% Confidence Level and a 10% margin of error. A similar process will be used for identifying FOC survey candidates. Regardless of method (after-call vs. outbound), participation is strictly voluntary.
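
As an illustration only, the following sketch shows how keypad (DTMF) responses to an automated post-call survey might be validated and stored. No specific IVR platform or API is implied, and the question labels and field names are hypothetical placeholders rather than the approved survey wording.

    # Illustrative sketch (hypothetical field names): validating keypad (DTMF)
    # responses captured by an automated post-call survey.
    from typing import Optional

    RATING_KEYS = {"1", "2", "3", "4", "5"}   # 1-5 satisfaction scale
    YES_NO_KEYS = {"1": "Yes", "2": "No"}     # e.g., press 1 for Yes, 2 for No

    def record_rating(digit: str) -> Optional[int]:
        """Return the 1-5 rating, or None if the keypress is invalid."""
        return int(digit) if digit in RATING_KEYS else None

    def record_yes_no(digit: str) -> Optional[str]:
        """Return 'Yes'/'No', or None if the keypress is invalid."""
        return YES_NO_KEYS.get(digit)

    # A completed survey could be stored as a simple record for aggregate reporting.
    response = {"q1_yes_no": record_yes_no("1"), "q2_rating": record_rating("5")}
    print(response)  # {'q1_yes_no': 'Yes', 'q2_rating': 5}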


The web-based application would entail emailing an invitation with a link to the survey to all callers with an email address on record.


The survey is brief, consisting of 6 questions: 1 requiring a “Yes/No” response and 5 requiring a rating on a 1-5 scale. Historically, the survey takes less than 5 minutes per respondent to administer (see attached survey sample). Based on recent activity levels for the DACSC, we expect to conduct between 100 and 1,500 surveys per month. Based on customer visits to field locations, we estimate surveying approximately 100 field customers per month on behalf of the FOCs to achieve statistically significant results. See the response to B.1 below for more information on the calculation of the expected number of responses. The survey is optional, and the cost to the customer in terms of time is negligible.



Customer Service Center Customer Survey - Manual

Total Surveys = 100 surveys per month

100 surveys/month x 12 months = 1,200 annual responses

1,200 x .083 (5 minutes) = 99.6 burden hours


Customer Service Center Customer Survey - Automated

Total Surveys = 1,500 surveys per month

1,500 surveys/month x 12 months = 18,000 annual responses

18,000 x .083 (5 minutes) = 1,494 burden hours


Field Operations Customer Survey-Manual

Total Surveys = 100 surveys per month

100 surveys/month x 12 = 1,200 annual responses

1,200 x .083 (5 minutes) = 99.6 burden hours


Total number of surveys: Range from 2,400 to 20,400 respondents annually


Total burden hours: Range from 199.2 to 1,693.2 hours
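
For clarity, the arithmetic behind these figures can be restated as a short worked example. The Python sketch below merely reproduces the calculations above (5 minutes per response, expressed as .083 hours); it is illustrative only and not part of the collection instrument.

    # Reproduces the burden-hour arithmetic above: surveys/month x 12 x .083 hours.
    HOURS_PER_RESPONSE = 0.083  # 5 minutes, as used in the calculations above

    def annual_burden(surveys_per_month: int):
        """Return (annual responses, annual burden hours) for a given monthly volume."""
        responses = surveys_per_month * 12
        return responses, round(responses * HOURS_PER_RESPONSE, 1)

    print(annual_burden(100))    # DACSC manual: (1200, 99.6)
    print(annual_burden(1500))   # DACSC automated: (18000, 1494.0)
    print(annual_burden(100))    # FOC manual: (1200, 99.6)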


Explain the reason for any changes to the burden and fill out the tables below (*or another table that explains the changes, as appropriate).


The availability of automation to deliver the survey could result in an increase in the number of surveys conducted; however, the burden on each respondent remains unchanged.


Manual Method – DACSC and FOC

(Columns: Requested | Program Change Due to New Statute | Program Change Due to Agency Discretion | Change Due to Adjustment in Agency Estimate | Change Due to Potential Violation of the PRA | Previously Approved)

Annual Number of Responses for this IC: Max 200/month (2,400 annual) | 0 | 0 | 0 | 0 | 2,400

Annual IC Time Burden (Hours): Max 5 min/response = 199.2 hours | 0 | 0 | 0 | 0 | 199

Annual IC Cost Burden (Dollars): No cost to respondent | 0 | 0 | 0 | 0 | No cost to respondent






Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting from the collection of information.


Burden per Response:

(Columns: Time Per Response | Hours | Cost Per Response)

Reporting: 5 minutes | .083 hours | $0

Record Keeping: N/A | N/A | N/A

Third Party Disclosure: N/A | N/A | N/A

Total: 5 minutes | .083 hours | $0





Annual Burden:

(Columns: Annual Time Burden (Hours) | Annual Cost Burden (Dollars))

Reporting: 199.2 to 1,693.2 hours (depending on collection method) | $0

Record Keeping: N/A | N/A

Third Party Disclosure: N/A | N/A

Total: 199.2 to 1,693.2 hours | $0


Automated Method – Telephone/Email

(Columns: Requested | Program Change Due to New Statute | Program Change Due to Agency Discretion | Change Due to Adjustment in Agency Estimate | Change Due to Potential Violation of the PRA | Previously Approved)

Annual Number of Responses for this IC: 1,500/month = 18,000 annual | 0 | 18,000 – 1,200 = 16,800 | 0 | 0 | 0

Annual IC Time Burden (Hours): 5 min per response = 1,500 hours | 0 | 1,500 – 99.6 = 1,400.4 | 0 | 0 | 0

Annual IC Cost Burden (Dollars): No cost to respondent | 0 | 0 | 0 | 0 | No cost to respondent


  13. Estimated nonrecurring costs. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).


There are no additional costs beyond those identified in Item 12 above.


  14. Estimated cost to the Government. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


The functionality necessary to administer a fully automated survey is included as a standard feature of the DACSC’s phone system. Implementation of the survey through total automation will result in negligible additional costs to the Agency.


When using a manual outbound dialer approach with live agents, we estimate, based on actual experience, that it takes approximately 10 minutes (including unsuccessful attempts) to obtain a completed survey. Agency burden hours are calculated below:


Customer Service Center Customer Survey – Manual (QA Outbound)

1,200 (DACSC) + 1,200 (FOC) surveys x .167 hours (10 minutes) per survey = 400 Agency burden hours


The annual cost estimate for the Agency is based on the salary of a GS-11, Step 1, ($32.98 per hour for the Buffalo locality), which is representative of an employee performing these surveys. The cost is calculated as follows:


400 hours x $32.98 per hour = $13,192 Annual cost to the Government
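
As a minimal cross-check, the sketch below restates the cost arithmetic above (.167 hours per completed outbound survey at the GS-11, Step 1 Buffalo-locality rate); it assumes nothing beyond the figures already given.

    # Restates the agency cost arithmetic for the manual (QA outbound) method.
    SURVEYS = 1200 + 1200        # DACSC + FOC outbound surveys per year
    HOURS_PER_SURVEY = 0.167     # ~10 minutes per completed survey, incl. unsuccessful attempts
    HOURLY_RATE = 32.98          # GS-11, Step 1, Buffalo locality

    agency_hours = int(SURVEYS * HOURS_PER_SURVEY)   # 400 hours (400.8 truncated, as above)
    annual_cost = agency_hours * HOURLY_RATE         # 400 x $32.98
    print(f"{agency_hours} hours, ${annual_cost:,.0f}")  # 400 hours, $13,192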


Customer Service Center Customer Survey – (Automated)

Due to the acquisition of the new phone system at the DACSC, and the implementation of a fully automated process, there would be no incremental cost to the agency for the proposed survey.



  15. Reasons for changes. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


The recent acquisition of a new phone system makes it possible for the DACSC to incorporate automation and efficiency into the survey process. Automation also opens the possibility of offering the survey to every caller. The ongoing cost to the government for conducting the survey is minimal when using a fully automated approach.


Employing a manual process entails staffing costs to conduct the survey, which will vary based on the degree to which the manual process is used throughout the year.




  16. Publicizing Results. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


No publication is anticipated.


  17. OMB Not to Display Approval. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


SBA will display the OMB expiration date.


  18. Exceptions to "Certification for Paperwork Reduction Act Submissions." Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”


There are no exceptions to the certification statement.


  19. Surveys, Censuses, and Other Collections that Employ Statistical Methods. If this request includes surveys or censuses or uses statistical methods (such as sampling, imputation, or other statistical estimation techniques), a Part B supporting statement must be completed.


See attached.



B. Collections of Information Employing Statistical Methods.


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Activity in FY 20 and FY 21 was unprecedented and impacted by new COVID-19 loan programs. This level of activity may not be representative of on-going future call volumes. However, based on a normal year, with typical disaster activity, it would be reasonable to expect the DACSC to provide service to 500,000 or more callers per year for whom this survey would apply.


Under an “opt-in” approach, where the caller is asked if they would be willing to participate in a post-call survey, a 5% penetration rate would be expected, which could result in 1,500 survey respondents monthly. Automated (electronic) phone surveys will be completed using a text-to-speech feature within the phone system. Participants will respond to questions posed by the Interactive Voice Response (IVR) system within the DACSC’s phone system using their telephone keypad. An emailed invitation with a link to a web-based survey allows participants the option to complete the survey at their convenience and could yield slightly higher participation rates.

If a manual outbound campaign is used, the goal would be to achieve a statistically significant sample of callers (90% confidence level, 10% margin of error), which would require approximately 100 successful surveys per month. The outbound approach will be administered by a live agent and historically has a successful contact rate of approximately 45%. Therefore, approximately 225 call attempts are required to achieve 100 successful surveys using a manual outbound method.
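
The roughly 100 completed surveys per month can be cross-checked against the standard sample-size formula for proportions. The sketch below (plain Python, no statistics library) uses z = 1.645 for a 90% confidence level, a 10% margin of error, and the conservative assumption p = 0.5; for a large monthly caller population this yields approximately 68 completed surveys, so the ~100-survey target comfortably meets the stated precision, and the 45% contact rate translates that target into roughly 225 call attempts, as described above.

    import math

    # Minimum completed surveys for a 90% confidence level and 10% margin of error,
    # assuming the most conservative proportion p = 0.5 and a large caller population.
    z, p, e = 1.645, 0.5, 0.10
    n_required = math.ceil((z**2 * p * (1 - p)) / e**2)   # 68 completed surveys

    # Call attempts needed to hit the ~100-survey monthly target at a 45% contact rate.
    target, contact_rate = 100, 0.45
    attempts = math.ceil(target / contact_rate)           # 223, i.e. roughly 225 attempts

    print(n_required, attempts)  # 68 223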


  2. Describe the procedures for the collection of information.


For automated surveys (telephone or web based), data will be captured and saved to a secure database. Regardless of the automation deployed, all callers to the DACSC will be offered the opportunity to participate in the survey. We anticipate a participation rate of 5%.


When the manual method is used, the Quality Assurance staff will be responsible for conducting the survey using a scripted format to ensure uniformity in explaining the purpose of the survey, as well as the questioning, recording of results, and survey closure. The list of survey subjects will be taken from a random sample of callers to the DACSC within the previous 72 hours. A 90% confidence level with a 10% margin of error is deemed acceptable for this purpose. While this survey is designed to measure customer satisfaction on an on-going basis, safeguards have been implemented to ensure the same customer is not surveyed more than once during a twelve-month period: all phone numbers contacted during the previous year are filtered from the list provided to the survey administrator.
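
A minimal sketch, under hypothetical data fields, of the selection safeguards described above: drawing a random sample from callers within the previous 72 hours while excluding any phone number already surveyed in the past twelve months. The field names ('phone', 'call_time') are illustrative assumptions, not actual DACSC system fields.

    import random
    from datetime import datetime, timedelta

    def select_survey_sample(calls, surveyed_last_year, sample_size, now=None):
        """Randomly sample recent callers, excluding numbers surveyed in the past year.

        calls: iterable of dicts with hypothetical 'phone' and 'call_time' keys.
        surveyed_last_year: set of phone numbers contacted within the previous 12 months.
        """
        now = now or datetime.now()
        cutoff = now - timedelta(hours=72)   # only callers from the previous 72 hours
        eligible = [c for c in calls
                    if c["call_time"] >= cutoff and c["phone"] not in surveyed_last_year]
        random.shuffle(eligible)
        return eligible[:sample_size]

    # Example (hypothetical inputs): pick up to 100 candidates for the outbound list.
    # candidates = select_survey_sample(recent_calls, surveyed_numbers, sample_size=100)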

There are no plans to stratify the population of callers to the DACSC for purposes of this survey. The results of this survey are intended to be used internally to measure the effectiveness of service at the DACSC and identify any potential areas for improvement.


  3. Describe methods to maximize response rates and to deal with issues of non-response.


Under the automated method, every caller will be advised of the survey and invited to participate. Callers opting-in will be transferred to the survey immediately following the call or provided an emailed link for the web-based survey method. When a manual outbound campaign is used, prospective survey participants will be contacted at their residence (or the contact number provided on the application) during reasonable hours being mindful of time zone differences. Care will be taken not to contact individuals at their place of employment when an outbound approach is used. Business customers will be contacted at the number provided for their business during customary business hours (8:00 am – 5:00 pm). To ensure completion of the requisite number of surveys, a random sample of customer contacts from the preceding 72 hours will be extracted from the DACSC phone database. To ensure completion of the surveys conducted on behalf of the Field Operations Centers (FOCs), a list of customers who visited field locations will be provided to the survey administrators by the FOCs.


  4. Describe any tests of procedures or methods to be undertaken.


The employees conducting the survey are trained in the proper procedure for administering the questions and are required to rehearse the survey with training personnel prior to conducting actual surveys. This training will ensure a standardized survey process and improve the reliability of the data obtained by the Government employees conducting the survey.


  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


No individuals were consulted on the statistical design of this survey. Perry Pedini, Customer Service Supervisor, and his staff will be responsible for collecting and summarizing the survey data on behalf of Center Director, Jeffery Zinn. The telephone number for the Customer Service Center is 716-843-4100.


