
Supporting Statement for
Paperwork Reduction Act Submissions
OMB Control Number: 1660 - 0107
Title: Public Assistance Customer Satisfaction Surveys
Form Number(s):
FEMA Form 519-0-32, Public Assistance Initial Customer Satisfaction Survey
(Telephone);
FEMA Form 519-0-33, Public Assistance Initial Customer Satisfaction Survey
(Internet);
FEMA Form 519-0-34, Public Assistance Assessment Customer Satisfaction Survey
(Telephone);
FEMA Form 519-0-35, Public Assistance Assessment Customer Satisfaction Survey
(Internet)
This is a request to revise 1660-0107 to accomplish the following goals: 1. Update the Public Assistance Customer Satisfaction Surveys to reflect changes in the Public Assistance Program; 2. Allow for elaboration on key questions to better understand trends; 3. Improve clarity by simplifying survey questions and updating terminology; 4. Continue to measure customer satisfaction with the Public Assistance process and gauge performance.
The information collection includes two surveys and qualitative research (focus
groups/interviews).
The major changes are listed above and explained in more detail in Sections A.12 and A.15.
A. Justification
1. Explain the circumstances that make the collection of information necessary. Identify any
legal or administrative requirements that necessitate the collection. Attach a copy of the
appropriate section of each statute and regulation mandating or authorizing the collection
of information. Provide a detailed description of the nature and source of the information
to be collected.
The surveys align with the Department of Homeland Security (DHS) mission for the Federal Emergency
Management Agency (FEMA) to ensure disaster resilience and with FEMA’s 2018-2022 Strategic Plan.

The specific objective is 3.1 Streamline Disaster Survivor and Grantee Experience. Additionally, the
surveys align with the FEMA Recovery Directorate’s 2019 Strategic Plan and GPRA Performance
Measure 3.1: Raise Applicant Satisfaction with Simplicity of the Public Assistance (PA) Process, Using
Customer Survey Results. The measures include customer satisfaction with:
• Helpfulness of staff in guiding you through the PA process
• Simplicity of the PA process
• Overall satisfaction with the PA program

This information collection assesses customer satisfaction with the FEMA Public Assistance process.
Applicants are surveyed at the beginning and end of the Public Assistance process. Applicants surveyed
at the beginning of the process may be eligible or ineligible for funding, whereas applicants surveyed at
the end of the process are eligible and have received funding for at least one of their projects.
The Public Assistance Initial (PAI) Survey assesses whether applicants are satisfied with the service and
materials they receive from Public Assistance at the onset of the process. All applicants have a
Recovery Scoping Meeting where their Program Delivery Manager sets expectations and provides
timelines. Applicants receive important instructions and materials that can set the tone for the rest of the
grant process. Applicants are eligible to participate in the PAI Survey after completing the Recovery
Scoping Meeting.
The Public Assistance Assessment (PAA) Survey assesses customer satisfaction throughout the entire
Public Assistance process. Survey topics include knowledge and helpfulness of FEMA representatives,
timeliness of awards, simplicity of the process, reasonableness of requirements, accuracy of materials,
satisfaction with communication, and usability of the Grants Portal. Applicants are eligible to
participate after receiving funds for at least one of their projects.
Specialized qualitative research (e.g., focus groups, interviews) may be conducted periodically to assess program areas or program changes that the Public Assistance surveys do not capture. This research is usually based on a convenience sample, and the target population will vary depending on the research question.
The following legal authorities mandate the collection of the information in this request:
The September 11, 1993 Executive Order 12862, “Setting Customer Service Standards,” and its March 23, 1995 Memorandum addendum, “Improving Customer Service,” require that all Federal agencies ask their customers what is most important to them and survey their customers to determine the kind and quality of services the customers want and their level of satisfaction with existing services. The 1993 Government Performance and Results Act (GPRA) requires agencies to set missions and goals and to measure performance against them.
The E-Government Act of 2002 includes finding innovative ways to improve the performance of
governments in collaborating on the use of information technology to improve the delivery of
Government information and services.
The GPRA Modernization Act of 2010 requires quarterly performance assessments of Government
programs for purposes of assessing agency performance and improvement, and to establish agency

performance improvement officers and the Performance Improvement Council. Executive Order 13571, “Streamlining Service Delivery and Improving Customer Service,” and its June 13, 2011 Memorandum, “Implementing Executive Order 13571 on Streamlining Service Delivery and Improving Customer Service,” set out guidelines for establishing customer service plans and activities; they also expand the definition of customer and encourage the use of a broader set of tools to solicit actionable, timely customer feedback to capture insights and identify early warning signals.
Following the Sandy Recovery Improvement Act (SRIA) of 2013 and the response provided by FEMA staff from all divisions during Hurricane Sandy, the Disaster Survivor Assistance (DSA) Program was formed to provide additional in-person customer service during the initial phase of the recovery process.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new
collection, indicate the actual use the agency has made of the information received from the
current collection. Provide a detailed description of: how the information will be shared, if
applicable, and for what programmatic purpose.
FEMA’s mission is to support the citizens of the United States and first responders to ensure that as a
nation we work together to build, sustain, and improve our capability to prepare for, protect against,
respond to, recover from, and mitigate all hazards. FEMA uses the collected information to measure customer satisfaction, gauge progress toward objectives, and make improvements that increase customer satisfaction.
This collection includes the Public Assistance (PA) Customer Satisfaction Surveys, managed by the
Recovery Directorate, through the Reporting & Analytics Division, Customer Survey & Analysis Section
(CSA) of the Federal Emergency Management Agency.
The purpose of the Public Assistance Customer Satisfaction Surveys is to assess customer satisfaction
with the Public Assistance Program, and to improve the quality of service for applicants (State, Local,
Tribal government, and eligible Private Non-Profit organizations) who have been affected by a disaster.
This collection of information has enabled FEMA managers to gather customer feedback and measure satisfaction against performance and customer service standards in an efficient, timely manner, helping ensure that users have an effective, efficient, and satisfying experience with the Agency’s process.
This collection has allowed for ongoing, collaborative and actionable communications between the
Agency and its stakeholders. Results from the previous Public Assistance (PA) collection have given
FEMA the ability to identify weaknesses in the PA program, as well as given applicants an opportunity
to voice their concerns.
The survey results from the previous collection have helped provide a more complete picture of the
Public Assistance process. For example, both in the field and in our surveys, applicants have
commented about the complexity of the PA process. Generally, satisfaction with the program is high,
but satisfaction with PA program simplicity consistently trends the lowest. This includes areas such as
documentation and program requirements. The Public Assistance Program continually tries to
streamline its processes. The 428 Program, or fixed estimates, is meant to provide more flexibility to applicants and get them the money they need faster. Additionally, the Public Assistance Program
introduced the Grants Portal during the last information collection, and we were able to provide
customer feedback on the new technology system. The survey results have helped initiate updates to the

Grants Portal, such as improved ease of access and better understanding of the portal. Lower survey ratings for “explaining the PA Process to applicants” have also prompted changes in the information
presented to applicants. The surveys have also highlighted areas in which Public Assistance continues
to excel, such as with general customer service and helpfulness of the PA representatives.
Reports are usually distributed by email to stakeholders, which includes Public Assistance Leadership
and the Recovery and Analytics Branch. Reports are distributed on a quarterly basis, and include
descriptive breakdowns of each question (e.g., means and percentages). Stakeholders may request
reports more often than quarterly if they want to examine customer satisfaction for a given disaster,
state, or FEMA Region. Additionally, there is a Tableau Dashboard that displays survey results
(averages for select questions) that anyone in FEMA can access. The dashboards are refreshed on a
monthly basis. Demographic items are primarily used to describe the sample, but statisticians may be
asked to do more in-depth analysis using inferential statistics. This would most likely be if there was a
significant drop in customer satisfaction from one quarter to the next, and stakeholders wanted to better
understand the underlying causes. The reports are used to monitor performance and identify areas of
possible improvement.
Specialized qualitative research is conducted at the request of Public Assistance leadership when a need
arises, and funding is available. This type of research allows for flexibility to assess programmatic
changes that surveys are unable to capture. Examples might include technological upgrades to the
Grants Portal, or changes to existing programs like 428 Alternative Procedures. Qualitative research can
also help inform survey development through identifying new topic areas that need to be assessed.
Funding is usually limited, so sampling is often restricted to a few geographic regions. Focus groups are
more feasible when applicants are densely packed in one area, whereas interviews are more practical
when applicants are spread out over a geographic region.
3. Describe whether, and to what extent, the collection of information involves the use of
automated, electronic, mechanical, or other technological collection techniques or other forms of
information technology, e.g., permitting electronic submission of responses, and the basis for the
decision for adopting this means of collection. Also describe any consideration of using
information technology to reduce burden.
All survey responses are stored in the Customer Satisfaction Analysis System (CSAS) for easy retrieval,
analysis, and reporting. The Customer Survey and Analysis Section (CSA) planned on incorporating
internet administration in the previous information collection but was unable to do so because of a lack
of software functionality.
The current plan is to have mixed mode administration (phone and electronic) for the revised
information collection. CSA is in the process of acquiring software and an electronic capability is a
requirement. Collection techniques include phone interviews and electronic submission of responses.
Incorporating electronic administration will improve overall response rates, increase accessibility, and reduce overall administration costs. It will also be important to examine possible administration-mode effects.
Applicant organizations responding to the FEMA Public Assistance Customer Satisfaction Surveys will
be able to respond via phone call (computer assisted telephone interviewing) or a web-based link. When

the software is available, applicants that have an email address on file will first receive an email
invitation. If the applicant does not complete the survey via the web within a designated amount of time
(approximately 2 weeks), interviewers will attempt to contact the respondent via phone.
Response rates for our PA surveys are high compared to the industry average for customer satisfaction surveys, although it is unclear how response rates will differ by mode. Response rates for online administration are typically lower than for phone-administered surveys, but it is difficult to predict how
much lower. For example, in a research study examining response rates by administration mode,
telephone administered surveys produced the highest response rates (30.2%), whereas internet
administered surveys had the lowest response rates (4.7%; Sinclair et al., 2012). In 2018, public opinion
polls had a response rate of approximately 6-7% when administered by telephone (Kennedy & Hartig,
2019). Federally administered household surveys tend to have much higher response rates but can vary
widely (for review, see Czajka & Beyler, 2016). Response rates differ depending on variables such as
survey length, convenience of administration, who is administering the survey, the use of incentives,
who the respondents are, and survey importance. For the past two years, the response rate for the Public
Assistance Surveys has been 51.70% (PAI: 52.56%; PAA: 50.35%).
For the updated collection, internet completions are estimated to be around 17% of the entire collection,
although that percentage could increase as time goes on. Phone completions are expected to be
approximately 69% of this collection, with the final 14% coming from qualitative interviews (focus
groups and interviews). If qualitative interviews are excluded from the calculations, we expect about
80% of the PA surveys to be phone administered and 20% of the surveys to be web administered.
Allowing mixed-mode administration should improve response rates and reduce burden by letting respondents reply via their preferred administration method. Each administration method will have
identical questions. The exception is qualitative interviews, which will vary depending on which
program area needs to be assessed.
4. Describe efforts to identify duplication. Show specifically why any similar information already
available cannot be used or modified for use for the purposes described in Item 2 above.
The information gathered in the survey is not available from any other source. CSA meets with
stakeholders to ensure the surveys are adequately assessing the Public Assistance Program and reflect
current practices.
5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB
Form 83-I), describe any methods used to minimize burden.
There is no impact from this collection of information on small businesses or other small entities.
6. Describe the consequence to Federal/FEMA program or policy activities if the collection of
information is not conducted, or is conducted less frequently as well as any technical or legal
obstacles to reducing burden.
Failure to conduct the Public Assistance Customer Satisfaction Surveys would result in the absence of
documentation about customer input on the quality and timeliness of disaster assistance for Public
Assistance applicants. The survey results serve as a vital tool for measuring customer satisfaction and
are a requirement of Executive Orders 12862 and 13571 and the resulting Memorandums for

“Streamlining Service Delivery and Improving Customer Service.” The surveys also contribute to
measuring FEMA’s 2018-2022 Strategic Plan: 3.1 Streamline Disaster Survivor and Grantee Experience.
If conducted less frequently, applicants may have difficulty recalling the specific aspects of the process
if surveyed later (e.g., too much time has elapsed) and satisfaction scores may be distorted.
Additionally, leadership would receive less timely customer feedback, which would lead to fewer
actionable insights.
7. Explain any special circumstances that would cause an information collection to be conducted
in a manner:
(a) Requiring respondents to report information to the agency more often than quarterly.
(b) Requiring respondents to prepare a written response to a collection of information in
fewer than 30 days after receipt of it.
(c) Requiring respondents to submit more than an original and two copies of any
document.
(d) Requiring respondents to retain records, other than health, medical, government
contract, grant-in-aid, or tax records for more than three years.
(e) In connection with a statistical survey, that is not designed to produce valid and reliable
results that can be generalized to the universe of study.
(f) Requiring the use of a statistical data classification that has not been reviewed and
approved by OMB.
(g) That includes a pledge of confidentiality that is not supported by authority established
in statute or regulation, that is not supported by disclosure and data security policies that
are consistent with the pledge, or which unnecessarily impedes sharing of data with other
agencies for compatible confidential use.
(h) Requiring respondents to submit proprietary trade secret, or other confidential
information unless the agency can demonstrate that it has instituted procedures to protect
the information’s confidentiality to the extent permitted by law.
None of the special circumstances contained in Item 7 of the supporting statement apply to this information collection.
8. Federal Register Notice:
a. Provide a copy and identify the date and page number of publication in the Federal
Register of the agency’s notice soliciting comments on the information collection prior to
submission to OMB. Summarize public comments received in response to that notice and describe
actions taken by the agency in response to these comments. Specifically address comments
received on cost and hour burden.

A 60-day Federal Register Notice inviting public comments was published on January 20, 2020, 85 FR 5461. No comments were received.
A 30-day Federal Register Notice inviting public comments was published on June 19, 2020, 85 FR
37102. No comments were received.

b. Describe efforts to consult with persons outside the agency to obtain their views on the
availability of data, frequency of collection, the clarity of instructions and recordkeeping,
disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or
reported.
Budget constraints have prevented FEMA from consulting with persons outside the agency.
Recovery Directorate and Public Assistance Program Managers were consulted for input about the data
collected in the survey questionnaires and the reporting format. FEMA External Affairs was consulted
regarding the use of plain language and clarity. Additionally, a statistician has reviewed the new survey
collection to ensure respondent burden is minimized, while also maximizing survey reliability and
validity.
c. Describe consultations with representatives of those from whom information is to be
obtained or those who must compile records. Consultation should occur at least once every three
years, even if the collection of information activities is the same as in prior periods. There may be
circumstances that may preclude consultation in a specific situation. These circumstances should
be explained.
Budget constraints have prevented FEMA from contracting to consult with Public Assistance applicants
since FY2004 when FEMA’s Recovery Directorate contracted to perform four focus groups to ensure
that the information collected was meaningful to customers and the survey questions were clearly
understood. Although not a direct consultation with respondents, members of Customer Survey and
Analysis were deployed to Iowa in 2015 to learn about the Public Assistance New Delivery Model and
observed Recovery Scoping Meetings. This allowed for a better understanding of procedures and aided
with questionnaire development.
Although direct contact with customers has been minimal, applicants often provide comments or
feedback when completing the Public Assistance Survey. This feedback has been thoughtfully reviewed
and applied in revising the current survey collection. Performance Management Analysts have
conducted comment analysis on the survey results to extract themes from text boxes and “other”
response options to identify topics important to customers that aren’t currently assessed. For example, a
common critique of the PA program included “Lack of training in the Grants Portal.” To better capture
this information, we have added a question assessing whether respondents feel they received adequate
training.
Customer Survey and Analysis also has a Quality Performance Team (QPT) that is responsible for
monitoring phone interviewers. QPT was asked to review the new surveys and give feedback on things

that might be difficult for interviewers to read, or for respondents to understand. Additionally, phone
interviewers provided survey writers with feedback regarding which survey items were consistently
confusing to respondents on the previous information collection. For example, we restructured the scale
descriptions to improve sentence flow and reduce burden.
9. Explain any decision to provide any payment or gift to respondents, other than remuneration
of contractors or grantees.
There are no payments or gifts to respondents for this data collection.
10. Describe any assurance of confidentiality provided to respondents. Present the basis for the
assurance in statute, regulation, or agency policy.
A Privacy Threshold Analysis (PTA) was completed by FEMA and adjudicated by the DHS Privacy
Office on January 2, 2020.
The Privacy Impact Assessment (PIA) is covered under the Department of Homeland Security
FEMA/PIA-035 Customer Satisfaction Analysis System (CSAS), approved by DHS on February 27,
2014 and the existing System of Records Notice (SORN), is DHS/FEMA-009 Hazard Mitigation,
Disaster Public Assistance, and Disaster Loan Programs System of Records, 79 FR 16015 approved by
DHS on March 24, 2014.
There are no assurances of confidentiality provided to the respondents for this information collection.
Survey information is stored in the Customer Satisfaction Analysis System (CSAS), warehoused on
secure FEMA servers.
11. Provide additional justification for any question of a sensitive nature (such as sexual behavior
and attitudes, religious beliefs and other matters that are commonly considered private). This
justification should include the reasons why the agency considers the questions necessary, the
specific uses to be made of the information, the explanation to be given to persons from whom the
information is requested, and any steps to be taken to obtain their consent.
There are no questions of a sensitive nature related to sexual behavior and attitudes, religious beliefs, or
other matters that are commonly considered private in the surveys.
Some questions of a demographic nature are included to help identify whether certain groups of people
vary in their satisfaction with the Public Assistance Program, although the questions aren’t very personal
or of a sensitive nature. Examples include how long the respondent has worked in their current position,
whether they’ve applied for PA disaster assistance previously, and the number of personnel that worked
on their PA project(s). It is possible that respondents with fewer resources (staff) and less relevant work experience interpret the difficulty of the Public Assistance process differently from an applicant who is more experienced or who has previously applied for Public Assistance. Asking these questions allows us to better identify whether we are serving all our customers equally and whether our products and services need to be tailored to meet the needs of certain groups of people.
12. Provide estimates of the hour burden of the collection of information. The statement should:

a. Indicate the number of respondents, frequency of response, annual hour burden, and an
explanation of how the burden was estimated for each collection instrument (separately list each
instrument and describe information as requested). Unless directed to do so, agencies should not
conduct special surveys to obtain information on which to base hour burden estimates.
Consultation with a sample (fewer than 10) of potential respondents is desired. If the hour burden
on respondents is expected to vary widely because of differences in activity, size, or complexity,
show the range of estimated hour burden, and explain the reasons for the variance. Generally,
estimates should not include burden hours for customary and usual business practices.
Read-aloud testing by FEMA interview staff was conducted to approximate average survey response
times. Previous research in survey methodology suggests online surveys can be completed faster than
telephone surveys (e.g., Szolnoki & Hoffmann, 2013; Duffy et al., 2005). Based on these findings, we
estimate internet versions will be completed 2 minutes faster on average.
Therefore, estimated total survey response time is 10 minutes for the PA Initial Survey (Phone) with a
skilled interviewer, and 8 minutes for the PA Initial Survey (Internet). Estimated total response time is
13 minutes for the PA Assessment Survey (Phone) with a skilled interviewer, and 11 minutes for the PA
Assessment Survey (Internet). For qualitative research, focus groups typically take 2 hours to conduct,
plus 1 hour for round trip travel time to the session, or 3 hours. Interviews typically take 1 hour to
conduct with no travel time.
The surveys approved in the previous information collection have been active since the start of FY 2018.
This marked a transition from administering one questionnaire to two PA Customer Satisfaction
Surveys: PA Initial Survey and PA Assessment Survey. Response rates and population estimates were
examined for both surveys for FY 2018-2019 to make the estimates for the revised collection.
Projected completions for the PA Initial Survey are based on an annual applicant population of 4,096
(eligible and ineligible applicants) and a 52.56% response rate, resulting in approximately 2,152
respondents. Projected completions for the PA Assessment Survey are based on an annual applicant
population of 2,626 (eligible applicants only) and a 50.35% response rate, resulting in approximately
1,322 respondents. For more information about estimated universe and projected completions, see
Question 1 in Supporting Statement B.
For each survey (excluding qualitative research), we expect about 20% to be completed via the internet,
and about 80% to be completed via phone.
For the PA Initial Survey, internet completions are estimated by multiplying 20% by 2,152, which
would result in approximately 430 completions. Phone completions are estimated by multiplying 80%
by 2,152, which would result in 1,722 completions.
For the PA Assessment Survey, internet completions are estimated by multiplying 20% by 1,322, which
would result in approximately 264 completions. Phone completions are estimated by multiplying 80%
by 1,322, which would result in 1,058 completions.
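To make the arithmetic above easy to verify, here is a minimal Python sketch (illustrative only; the function name and the use of truncation are our assumptions, chosen to reproduce the rounded figures reported in this section):

```python
# Illustrative sketch of the projected-completion estimates described above.
# Populations and response rates are the FY 2018-2019 figures from this section;
# int() truncation is an assumption that matches the document's rounding.

def projected_completions(population, response_rate, internet_share=0.20):
    """Return (total, internet, phone) completion estimates for one survey."""
    total = int(population * response_rate)   # e.g., 4,096 x 52.56% -> 2,152
    internet = int(total * internet_share)    # about 20% expected via web link
    phone = total - internet                  # remainder completed by telephone
    return total, internet, phone

print(projected_completions(4096, 0.5256))  # PA Initial: (2152, 430, 1722)
print(projected_completions(2626, 0.5035))  # PA Assessment: (1322, 264, 1058)
```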
Qualitative research is conducted on a request basis; there is no special schedule for implementation.
These surveys may be conducted at the beginning or end of the Public Assistance Process depending on

what type of information stakeholders are trying to gather. Completions for qualitative interviews were
estimated from previous experience (see Q12b for more details). It is likely that participants in the
qualitative interviews will have previously taken the PA Initial or PA Assessment Survey. This will depend on the point in the PA process at which qualitative interviews take place and the topics being
researched. Qualitative interviews are used to gather more detailed information that cannot adequately
be captured in a short, quantitative survey measure. This most often occurs when Public Assistance
implements a new program or policy that isn’t addressed in the surveys. Examples of potential
qualitative topics include changes to 428 procedures (estimates based on fixed costs), comparing PA
New Delivery Model to old delivery model, assessing changes in the Grants Portal, and state-led
disasters.
In the Question 12 figure below, the total estimated annual burden is 1,902 hours, based on the following:
• 287 burden hours for PA Initial Survey-Phone (10 minutes * 1,722 completions),
• 57 burden hours for PA Initial Survey-Internet (8 minutes * 430 completions),
• 230 burden hours for PA Assessment Survey-Phone (13 minutes * 1,058 completions),
• 48 burden hours for PA Assessment Survey-Internet (11 minutes * 264 completions), and
• 1,280 burden hours for qualitative interviewing ((Focus Groups: 3 hours * 360 participants) + (Interviews: 1 hour * 200 participants)).

This estimate accounts for a total of 4,034 completions across the two surveys and the qualitative research measures. Some respondents may be surveyed only once, while others may participate on three separate occasions. Burden hours per respondent could range anywhere from 8-10 minutes on the low end (complete the PA Initial Survey only) to roughly 3 hours and 23 minutes on the high end (complete the PA Initial Phone survey, 10 minutes; the PA Assessment Phone survey, 13 minutes; and one focus group, 3 hours). The majority of applicants will complete the PA Initial and PA Assessment Surveys by phone, a total of 23 burden minutes per participant.
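The burden-hour arithmetic in the list above can be checked with a short Python sketch (illustrative only; the document rounds some figures by respondent type, which is why 229.3 hours for the PA Assessment phone survey is reported as 230):

```python
# Sketch of the annual burden-hour arithmetic: completions x minutes / 60.
instruments = {
    "PA Initial (Phone)":       (1722, 10),  # (completions, minutes per response)
    "PA Initial (Internet)":    (430, 8),
    "PA Assessment (Phone)":    (1058, 13),
    "PA Assessment (Internet)": (264, 11),
}
for name, (completions, minutes) in instruments.items():
    print(f"{name}: {completions * minutes / 60:.1f} hours")

# Qualitative research: focus groups (3 hours each) plus interviews (1 hour each).
qualitative_hours = 360 * 3 + 200 * 1
print(f"Qualitative research: {qualitative_hours} hours")  # 1,280 hours
```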
b. If this request for approval covers more than one form, provide separate hour burden
estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
Below, as well as in Question 12 figure, is a description of the universe and hour burden by survey
instrument:
[FEMA Form 519-0-32] PA Initial Survey (Telephone) may be conducted and gathered by phone, with
responses stored electronically. The number of responses collected by phone is estimated to be 1,722 or
approximately 43% of the whole collection with an hour burden of 287. It has been estimated to take 10
minutes for the applicant to complete the survey with a skilled interviewer.
[FEMA Form 519-0-33] PA Initial Survey (Internet) may be submitted through an internet link, with responses stored electronically. The number of responses collected by internet link is estimated to be 430, or approximately 11% of the whole collection, with an hour burden of 57. It has been estimated to take 8 minutes for the applicant to complete the survey online.
[FEMA Form 519-0-34] PA Assessment Survey (Telephone) may be conducted and gathered by phone,
with responses stored electronically. The number of responses collected by phone is estimated to be

1,058 or approximately 26% of the whole collection with an hour burden of 230. It has been estimated to
take 13 minutes for the applicant to complete the survey with a skilled interviewer.
[FEMA Form 519-0-35] PA Assessment Survey (Internet) may be submitted through an internet link, with responses stored electronically. The number of responses collected by internet link is estimated to be 264, or approximately 6% of the whole collection, with an hour burden of 48. It has been estimated to take 11 minutes for the applicant to complete the survey online.
Qualitative research will most likely be conducted in person or by phone. For focus groups, the number
of participants is estimated to be 360 with an hour burden of 1,080. The number of focus group
participants was calculated by estimating 3 sessions with 12 applicants per focus group, in each of the 10
FEMA Regions (3 sessions*12 applicants*10 Regions= 360 participants). The length of each focus
group is estimated to be 2 hours with an additional 1 hour round trip travel time, for a total of 3 hours
per participant (360 participants*3 hours = 1,080 burden hours). For interviews, the number of
participants is estimated to be 200 with an hour burden of 200. We estimated 200 hours based on 2
participants per 1 hour interview, with 10 interviews in each of the 10 FEMA Regions (2 participants*10
interviews*10 FEMA Regions = 200 hours). No travel time is required for applicants. The decision to
conduct interviews vs. focus groups usually depends on the density of the target population (e.g.,
sometimes applicants are spread all over the state and it is not feasible to meet in a central location).
These burden estimates are the same as those used in the previous collection, which were adequate to meet all requests from stakeholders. Because the Public Assistance Program is undergoing significant changes (e.g., New Delivery Model, more state-led disasters, changes to 428 policy, updates to the Grants Portal), we anticipate requests from stakeholders to conduct qualitative research as their funding becomes available. That is a total of 560 completions with 1,280 burden hours for qualitative research, comprising approximately 14% of the whole collection.
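As a quick cross-check of the qualitative-research estimates, here is a minimal sketch of the participant and burden arithmetic (variable names are ours; all counts come from the assumptions stated above):

```python
# Focus groups: 3 sessions x 12 applicants x 10 FEMA Regions, 3 hours each
# (2-hour session plus 1 hour of round-trip travel).
focus_group_participants = 3 * 12 * 10            # 360 participants
focus_group_hours = focus_group_participants * 3  # 1,080 burden hours

# Interviews: 2 participants x 10 interviews x 10 FEMA Regions, 1 hour each.
interview_participants = 2 * 10 * 10              # 200 participants
interview_hours = interview_participants * 1      # 200 burden hours

print(focus_group_participants + interview_participants)  # 560 completions
print(focus_group_hours + interview_hours)                # 1,280 burden hours
```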
c. Provide an estimate of annualized cost to respondents for the hour burdens for collections of
information, identifying and using appropriate wage rate categories. NOTE: The wage-rate
category for each respondent must be multiplied by 1.46 and this total should be entered in the cell
for “Avg. Hourly Wage Rate”. The cost to the respondents of contracting out or paying outside
parties for information collection activities should not be included here. Instead this cost should
be included in Item 13.
See Question 12 figure below. For type of respondent, historical data (FY 2018-2019) shows 19% of
Public Assistance applicants are non-profit institutions, and 81% are state, local, or tribal government.
Projected number of respondents, burden hours, and respondent costs are calculated accordingly.
Question 12: Estimated Annualized Hour Burden and Costs

| Type of Respondent | Form Name / Form Number | No. of Respondents | No. of Responses per Respondent | Avg. Burden per Response (in hours) | Total Annual Burden (in hours) | Avg. Hourly Wage Rate | Total Annual Respondent Cost |
|---|---|---|---|---|---|---|---|
| Non-Profit institutions | Public Assistance Initial Customer Satisfaction Survey, FEMA Form 519-0-32 (Telephone) | 328 | 1 | 0.1667 | 55 | $35.43 | $1,949 |
| State, Local or Tribal Government | Public Assistance Initial Customer Satisfaction Survey, FEMA Form 519-0-32 (Telephone) | 1,394 | 1 | 0.1667 | 232 | $63.49 | $14,730 |
| Sub-Total | | 1,722 | | | 287 | | $16,679 |
| Non-Profit institutions | Public Assistance Initial Customer Satisfaction Survey, FEMA Form 519-0-33 (Internet) | 82 | 1 | 0.1333 | 11 | $35.43 | $390 |
| State, Local or Tribal Government | Public Assistance Initial Customer Satisfaction Survey, FEMA Form 519-0-33 (Internet) | 348 | 1 | 0.1333 | 46 | $63.49 | $2,921 |
| Sub-Total | | 430 | | | 57 | | $3,311 |
| Non-Profit institutions | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form 519-0-34 (Telephone) | 201 | 1 | 0.2167 | 44 | $35.43 | $1,559 |
| State, Local or Tribal Government | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form 519-0-34 (Telephone) | 857 | 1 | 0.2167 | 186 | $63.49 | $11,809 |
| Sub-Total | | 1,058 | | | 230 | | $13,368 |
| Non-Profit institutions | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form 519-0-35 (Internet) | 50 | 1 | 0.1833 | 9 | $35.43 | $319 |
| State, Local or Tribal Government | Public Assistance Assessment Customer Satisfaction Survey, FEMA Form 519-0-35 (Internet) | 214 | 1 | 0.1833 | 39 | $63.49 | $2,476 |
| Sub-Total | | 264 | | | 48 | | $2,795 |
| Total (Telephone and Internet) | | 3,474 | | | 622 | | $36,153 |
| Non-Profit institutions | Other-Qualitative: Focus Groups based on 12 participants per session, with 3 sessions for each of 10 regions; each session lasts 2 hours, with an additional hour for travel (3 hours total) | 68 | 1 | 3 | 204 | $35.43 | $7,228 |
| State, Local or Tribal Government | Other-Qualitative: Focus Groups (as above) | 292 | 1 | 3 | 876 | $63.49 | $55,617 |
| Sub-Total | | 360 | | | 1,080 | | $62,845 |
| Non-Profit institutions | Other-Qualitative: Interviews based on 2 participants per 1-hour interview, with 10 interviews for each of the 10 regions; travel not required (1 hour total) | 38 | 1 | 1 | 38 | $35.43 | $1,346 |
| State, Local or Tribal Government | Other-Qualitative: Interviews (as above) | 162 | 1 | 1 | 162 | $63.49 | $10,285 |
| Sub-Total | | 200 | | | 200 | | $11,631 |
| Total (Qualitative Surveys) | | 560 | | | 1,280 | | $74,476 |
| Total | | 4,034 | | | 1,902 | | $110,629 |

For Non-Profit Respondents: According to the U.S. Department of Labor, Bureau of Labor Statistics [1], the May 2019 Occupational Employment and Wage Estimates wage rate for Community and Social Service Occupations (Standard Occupational Classification 21-0000) is $24.27 per hour. Including the wage rate multiplier of 1.46 [2], the fully-loaded wage rate is $35.43 per hour. Therefore, the estimated burden hour cost to respondents for non-profit institutions is $12,790.23 ($35.43 x 361 hours).
For State, Local Government, or Tribal Respondents: The average wage rate for Emergency Management Directors (Standard Occupational Classification 11-9161) is $39.68 per hour. Including the wage rate multiplier of 1.60 [3], the fully-loaded wage rate is $63.49 per hour. Therefore, the estimated burden hour cost to respondents for State, Local or Tribal Government is $97,838.09 ($63.49 x 1,541 hours).
Therefore, the total annual respondent cost is $110,629.
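The cost arithmetic above follows a simple pattern (base BLS wage x benefits multiplier x total burden hours); here is a minimal Python sketch using the figures cited in this section (the small difference from $110,629 reflects per-row rounding in the Question 12 figure):

```python
# Fully loaded hourly rates: BLS base wage x benefits multiplier.
nonprofit_rate = round(24.27 * 1.46, 2)    # SOC 21-0000 -> $35.43 per hour
government_rate = round(39.68 * 1.60, 2)   # SOC 11-9161 -> $63.49 per hour

# Cost = fully loaded rate x total burden hours per respondent type.
nonprofit_cost = nonprofit_rate * 361      # 361 hours -> $12,790.23
government_cost = government_rate * 1541   # 1,541 hours -> $97,838.09

print(round(nonprofit_cost, 2), round(government_cost, 2))
print(round(nonprofit_cost + government_cost, 2))  # 110628.32
```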
[1] Information on the mean wage rate from the U.S. Department of Labor is available online at: https://www.bls.gov/oes/tables.htm.
[2] “Employer costs per hour worked for employee compensation and costs as a percent of total compensation: Civilian workers, by major occupational and industry group, March 2019.” Available at http://www.bls.gov/news.release/archives/ecec_06182019.pdf. Accessed May 12, 2020. The wage multiplier is calculated by dividing total compensation for all workers ($36.77 per hour) by wages and salaries for all workers ($25.22 per hour), yielding a benefits multiplier of approximately 1.46.
[3] “Employer costs per hour worked for employee compensation and costs as a percent of total compensation: Civilian workers, by major occupational and industry group, March 2019.” Available at http://www.bls.gov/news.release/archives/ecec_06182019.pdf. Accessed May 12, 2020. The wage multiplier is calculated by dividing total compensation for State and local government workers ($50.89 per hour) by wages and salaries for State and local government workers ($31.75 per hour), yielding a benefits multiplier of approximately 1.60.

13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. (Do not include the cost of any hour burden shown in Items 12 and 14.)
The cost estimates should be split into two components:

a. Operation and Maintenance and purchase of services component. These estimates
should take into account cost associated with generating, maintaining, and disclosing or providing
information. Include descriptions of methods used to estimate major cost factors including system
and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the
time period over which costs will be incurred.
b. Capital and Start-up-Cost should include, among other items, preparations for collecting
information such as purchasing computers and software, monitoring sampling, drilling and
testing equipment, and record storage facilities.

Question 13: Annual Cost Burden to Respondents or Record-keepers

| Data Collection Activity/Instrument | Annual Capital Start-Up Cost (investments in overhead, equipment and other one-time expenditures) | Annual Operations and Maintenance Cost (such as recordkeeping, technical/professional services, etc.) | Annual Non-Labor Cost (expenditures on training, travel and other resources; see note below) | Total Annual Cost to Respondents |
|---|---|---|---|---|
| Focus Group Travel | N/A | N/A | $12,420 | $12,420 |

The Annual Non-Labor Cost for travel to focus groups is based on the U.S. General Services Administration (GSA) mileage rate for Privately Owned Vehicles (POV), effective January 16, 2020, of $0.575 per mile. Maximum travel to a focus group is not to exceed 30 miles one way, or 60 miles round trip. Using this information, 60 miles round trip * 360 respondents = 21,600 miles at $0.575 per mile = $12,420 annual cost for mileage.
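A one-line check of the mileage arithmetic (illustrative only; figures are those stated above):

```python
# GSA POV mileage rate x round-trip miles x estimated focus-group participants.
gsa_rate = 0.575        # dollars per mile, effective January 16, 2020
round_trip_miles = 60   # capped at 30 miles each way
participants = 360      # estimated focus-group respondents
print(round(gsa_rate * round_trip_miles * participants, 2))  # 12420.0
```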
14. Provide estimates of annualized cost to the federal government. Also, provide a description of
the method used to estimate cost, which should include quantification of hours, operational
expenses (such as equipment, overhead, printing and support staff), and any other expense that
would have been incurred without this collection of information. You may also aggregate cost
estimates for Items 12, 13, and 14 in a single table.

Annualized Cost to the Federal Government
Performance of surveys, analysis and reporting, recommendations for improvement, desktop application of survey tools, and maintenance of tools.

| Survey Administration or Functions | Title and GS Level | Salary [1] | Number of Staff at GS Level | Fully Loaded Wage Rate Multiplier [2] | Cost (for Salaries, includes the Wage Rate Multiplier) | Percent of Time | Total Cost |
|---|---|---|---|---|---|---|---|
| Management, survey administration | Section Manager, GS-14 Step 5 | $131,695 | 1 | 1.46 | $192,274.70 | 16.00% | $30,764 |
| Administrative Assistant | Administrative Assistant, GS-9 Step 5 | $64,628 | 1 | 1.46 | $94,356.88 | 16.00% | $15,097 |
| Program Analyst | Program Analyst, GS-12 Step 5 | $93,724 | 2 | 1.46 | $273,674.08 | 16.00% | $43,788 |
| Supervisory, survey administration | Supervisory Customer Service Specialist, GS-13 Step 5 | $111,448 | 1 | 1.46 | $162,714.08 | 16.00% | $26,034 |
| Project management, administer survey program, recommend improvements, oversee reports and software application implementation, testing and maintenance of survey tools | Customer Satisfaction Analyst, GS-12 Step 5 | $93,724 | 4 | 1.46 | $547,348.16 | 16.00% | $87,576 |
| Statistician: OMB compliance, data analysis and reporting | Customer Satisfaction Analyst, GS-12 Step 5 | $93,724 | 2 | 1.46 | $273,674.08 | 16.00% | $43,788 |
| Supervisory, survey administration | Supervisory Customer Service Specialist, GS-12 Step 5 | $93,724 | 1 | 1.46 | $136,837.04 | 16.00% | $21,894 |
| Survey management: administer surveys and focus groups, prepare sample, track data, analyze survey data, write reports and recommend improvements, software application implementation, testing and maintenance of survey tools | Customer Service Specialist, GS-11 Step 5 | $78,192 | 7 | 1.46 | $799,122.24 | 16.00% | $127,860 |
| Supervisory, QC, training administration | Supervisory Customer Service Specialist, GS-11 Step 5 | $78,192 | 1 | 1.46 | $114,160.32 | 16.00% | $18,266 |
| QC, training | Customer Service Specialist, GS-11 Step 5 | $78,192 | 2 | 1.46 | $228,320.64 | 16.00% | $36,531 |
| Supervisory, survey administration | Supervisory Customer Service Specialist, GS-12 Step 5 | $93,724 | 2 | 1.46 | $273,674.08 | 16.00% | $43,788 |
| Survey interviews and special projects | Customer Service Specialists, GS-9 Step 5 | $64,628 | 14 | 1.46 | $1,320,996.32 | 16.00% | $211,359 |
| Subtotal | | | 38 | | $4,417,152.62 | 16.00% | $706,745 |

Other Costs

| Item | Cost | Percent of Time | Total Cost |
|---|---|---|---|
| Contract for Focus Group Incentives and Rental Facilities | $149,273.84 | 19.00% | $28,362 |
| Facilities (cost for renting, overhead, etc. for data collection activity) | $75,590.74 | 16.00% | $12,095 |
| Computer Hardware and Software (cost of equipment annual lifecycle) | $218,381.00 | 16.00% | $34,941 |
| Equipment Maintenance (cost of annual maintenance/service agreements for equipment) | $31,446.75 | 16.00% | $5,031 |
| Travel (to Focus Group when not on Disaster Funds) | $43,514.94 | 19.00% | $8,268 |
| Other: Long Distance Phone Charges (number of data collections by phone x minutes x cost) | $19,611.09 | 16.00% | $3,138 |
| Other: C3MP Usage / Licenses | $28,296.48 | 16.00% | $4,527 |
| Other: Supplies | $3,229.24 | 16.00% | $517 |
| Subtotal | $569,344.07 | | $96,879 |
| Total | | | $803,624 |

[1] Office of Personnel Management 2020 Pay and Leave Tables for the Dallas-Ft. Worth, TX-OK locality. Available online at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/20Tables/html/DFW_h.aspx. Accessed May 12, 2020.
[2] Wage rate includes a 1.46 multiplier to reflect the fully-loaded wage rate.
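The staff rows in the table above all follow the same formula (salary x number of staff x 1.46 multiplier x percent of time). Here is a minimal sketch using three rows as examples (the function name is ours):

```python
# Annual cost to the government for one staff row of the table above.
def staff_cost(salary, staff, pct_time=0.16, multiplier=1.46):
    """salary x staff x fully-loaded multiplier x share of time on this work."""
    return salary * staff * multiplier * pct_time

print(round(staff_cost(131_695, 1)))   # Section Manager, GS-14 Step 5 -> 30,764
print(round(staff_cost(93_724, 4)))    # Customer Satisfaction Analysts, GS-12 -> 87,576
print(round(staff_cost(64_628, 14)))   # Customer Service Specialists, GS-9 -> 211,359
```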

15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the
OMB Form 83-I in a narrative form. Present the itemized changes in hour burden and cost
burden according to program changes or adjustments in Table 5. Denote a program increase as a
positive number, and a program decrease as a negative number.
A "Program increase" is an additional burden resulting from a federal government regulatory action
or directive. (e.g., an increase in sample size or coverage, amount of information, reporting frequency,
or expanded use of an existing form). This also includes previously in-use and unapproved information
collections discovered during the ICB process, or during the fiscal year, which will be in use during the
next fiscal year.
A "Program decrease", is a reduction in burden because of: (1) the discontinuation of an information
collection; or (2) a change in an existing information collection by a Federal agency (e.g., the use of
sampling (or smaller samples), a decrease in the amount of information requested (fewer questions), or
a decrease in reporting frequency).
"Adjustment" denotes a change in burden hours due to factors over which the government has no
control, such as population growth, or in factors which do not affect what information the government
collects or changes in the methods used to estimate burden or correction of errors in burden estimates.
The revised surveys in this collection are comparable to the previously approved surveys. Both
collections include two surveys (PA Initial and PA Assessment) with similar content. In the previous
collection we accounted for internet administration. We were unable to administer the survey via a web
link due to a lack of survey software functionality. We are currently in the process of acquiring new
survey administration software and an electronic capability is a requirement. The survey software
should be operational before the revised collection is approved, so the internet forms are included.
The revised collection has the same purpose and assesses the same subject matter. The target
populations are still the same, and we attempt to survey everyone within the target population. We have
updated some of the language and added questions to reflect programmatic changes and better assess
performance measures. The length of the surveys has increased slightly, but overall burden has
decreased due to lower overall disaster activity (FY 2018-2019) and lower response rates (adjustments).
Total Change in Burden Hours (Adjustment) = 1,902 (current) – 2,093 (previous) = (-191 hours)
The overall burden change for the Public Assistance Surveys is largely due to population adjustments,
which resulted in a decrease to burden hours. The previous population estimations included years with
very high disaster activity. The current collection uses FY 2018-2019 for population and response rate
calculations because those are the years in which we have historical data for both surveys. We only had
one PA customer satisfaction survey prior to FY 2018. The current collection assumes a 52.56%
response rate for the PAI Survey and a 50.35% response rate for the PAA Survey, which is a
considerable decline from the previous collection (70.17%). In general, response rates for telephone
surveys have been declining over the past several decades (Kennedy & Hartig, 2019; Curtin et al.,
2005). Possible explanations for the response rate decline include a growing refusal among respondents
to participate and difficulties in contacting individuals due to the increased use of answering machines,
call screening devices, and cellular telephones (Tourangeau, 2004; Ehlen & Ehlen, 2007). In addition,

new technologies sometimes mistakenly flag survey calls, even those conducted by the government, as
“spam” (Kennedy & Hartig, 2019). As a result, lower population estimates and response rates led to a
decrease in estimated burden for the revised collection. It is important to note that disaster activity can
vary greatly from year to year, so population sizes can shift rather easily. Despite population shifts, the
goal is always to survey the entire qualified population.
Typically, the sample for both surveys will be pulled on a monthly basis. Periodically, the sample may be pulled on a bi-weekly basis, depending on disaster activity and survey administration needs.
Applicants who completed a Recovery Scoping Meeting will be eligible to take the Public Assistance
Initial (PAI) Survey. Applicants who had funds obligated for at least one of their projects will be
eligible for the Public Assistance Assessment (PAA) Survey. Applicants who participated in 428 for
one of their projects (received funds in the beginning of the process) or had a specialized project
(increased completion time/complexity) may be placed on hold for surveying. Ideally these applicants
will be surveyed closer to the end of the PA process to get a more accurate representation of satisfaction.
Survey administration time has increased for both surveys due to the following:
• The addition of questions assessing satisfaction with key aspects of the 428 Alternative Procedures program (e.g., fixed estimates).
• There has been an increase in state-led disasters, which involve varying levels of FEMA involvement. The survey language was modified so that all questions are understandable whether the applicant interacted primarily with FEMA or State personnel. Questions were also added to understand the extent to which applicants worked with State and local emergency management.
• Questions were added to measure the key performance metrics of timeliness, simplicity, and accuracy.
• Concepts that frequently appeared in comments were added to the survey.
• Text boxes were added to some of the questions that have trended lower so that applicants can provide more information about their experience.

15a) Change in Annual Hour Burden by Instrument:
• PA Initial (Phone) has an estimated:
o 287 hours currently – 316 hours previously = -29 hours.
o Adjustment due to lower response rates and population estimates.
• PA Initial (Internet) has an estimated:
o 57 hours currently – 53 hours previously = +4 hours.
o Program increase due to longer administration time.
o Note: The population decrease for the PAI Survey was of a smaller magnitude than the population decrease for the PAA Survey. Internet administration forms also have a smaller proportion of applicants compared to telephone forms. Together, these factors resulted in the increased survey length for PA Initial (Internet) having a stronger effect on burden hours (slight increase) compared to the other forms.
• PA Assessment (Phone) has an estimated:
o 230 hours currently – 372 hours previously = -142 hours.
o Adjustment due to lower response rates and population estimates.
• PA Assessment (Internet) has an estimated:
o 48 hours currently – 72 hours previously = -24 hours.
o Adjustment due to lower response rates and population estimates.
• For qualitative research (focus groups and interviews), annual burden hours are:
o 1,280 hours currently – 1,280 hours previously = no change.
o No change. The Public Assistance process is constantly evolving, and qualitative interviews add much-needed flexibility when it comes to gaining insights into customer satisfaction with specific changes that are not captured in the surveys. There are often changes to things like Alternative Procedures, the Grants Portal, the New Delivery Model, and the State-Led Disaster Framework on which Public Assistance needs immediate feedback. Interviews are useful because it is sometimes difficult to gather enough respondents in a concentrated area to conduct Public Assistance focus groups.
o Breakdown: Focus Groups: 1,080 currently – 1,080 previously = no change. Interviews: 200 currently – 200 previously = no change.
Question 15a: Itemized Changes in Annual Hour Burden

| Data Collection Instrument | Program Change (hours currently on OMB Inventory) | Program Change (New) | Difference | Adjustment (hours currently on OMB inventory) | Adjustment (new) | Difference | Explanation |
|---|---|---|---|---|---|---|---|
| Public Assistance Initial Survey, FEMA Form 519-0-32 | 0 | 0 | 0 | 316 | 287 | -29 | Adjustment due to lower population estimates and response rates |
| Public Assistance Initial Survey, FEMA Form 519-0-33 | 53 | 57 | 4 | 0 | 0 | 0 | Program increase due to longer administration time |
| Public Assistance Assessment Survey, FEMA Form 519-0-34 | 0 | 0 | 0 | 372 | 230 | -142 | Adjustment due to lower population estimates and response rates |
| Public Assistance Assessment Survey, FEMA Form 519-0-35 | 0 | 0 | 0 | 72 | 48 | -24 | Adjustment due to lower population estimates and response rates |
| Qualitative Interviews | 0 | 0 | 0 | 1,280 | 1,280 | 0 | No change |
| Total | 53 | 57 | 4 | 2,040 | 1,845 | -195 | |
| Grand Total for adjustments and program change | | | | 2,093 | 1,902 | -191 | |

15b) Change in Annual Cost by Instrument (see table 15b below):
• PA Initial (Phone) has an annual decrease of $3,998. Adjustment due to lower population estimates, lower response rates, and a decrease in wage rates.
• PA Initial (Internet) has an annual decrease of $165. Program change: a slight increase in burden hours, but a decrease in costs due to lower wage rates.
• PA Assessment (Phone) has an annual decrease of $10,987. Adjustment due to lower population estimates, lower response rates, and a decrease in wage rates.
• PA Assessment (Internet) has an annual decrease of $1,922. Adjustment due to lower population estimates, lower response rates, and a decrease in wage rates.
• For qualitative interviews there is an annual cost decrease of $9,312. Adjustment due to a decrease in wage rates. Instead of using management occupations to estimate wages for State, Local, or Tribal Government workers, we used emergency management directors, which is more representative of our population. There was no distinct Non-Profit category on the last published BLS wage estimate report for May 2019, so estimates for all community and social service occupations were used.
o Wage rate for Non-Profit workers: $46.73 previously, $35.43 currently (includes the 1.46 multiplier).
o Wage rate for State, Local, or Tribal Government: $67.54 previously, $63.49 currently (includes the 1.60 multiplier).
• For the total collection, there is a cost decrease of $26,383 (see table 15b).

•

For the total collection, there is a cost decrease of $26,383 (see table 15b).

Question 15b: Itemized Changes in Annual Costs

| Data Collection Instrument | Program Change (costs currently on OMB Inventory) | Program Change (New) | Difference | Adjustment (costs currently on OMB inventory) | Adjustment (new) | Difference | Explanation |
|---|---|---|---|---|---|---|---|
| Public Assistance Initial Survey, FEMA Form 519-0-32 (Telephone) | $0.00 | $0.00 | $0.00 | $20,677 | $16,679 | -$3,998 | Adjustment due to lower response rates, population estimates, and wage rates |
| Public Assistance Initial Survey, FEMA Form 519-0-33 (Internet) | $3,476 | $3,311 | -$165 | $0.00 | $0.00 | $0.00 | Program change in costs: longer survey administration time, but lower wage rates resulted in slightly lower costs |
| Public Assistance Assessment Survey, FEMA Form 519-0-34 (Telephone) | $0.00 | $0.00 | $0.00 | $24,355 | $13,368 | -$10,987 | Adjustment due to lower response rates, population estimates, and wage rates |
| Public Assistance Assessment Survey, FEMA Form 519-0-35 (Internet) | $0.00 | $0.00 | $0.00 | $4,717 | $2,795 | -$1,922 | Adjustment due to lower response rates, population estimates, and wage rates |
| Qualitative Interviews | $0.00 | $0.00 | $0.00 | $83,788 | $74,476 | -$9,312 | Adjustment due to lower wage rates |
| Total | $3,476 | $3,311 | -$165 | $133,537 | $107,318 | -$26,219 | |
| Grand Total for adjustments and program change | | | | $137,012 | $110,629 | -$26,383 | |

16. For collections of information whose results will be published, outline plans for tabulation and
publication. Address any complex analytical techniques that will be used. Provide the time
schedule for the entire project, including beginning and ending dates of the collection of
information, completion of report, publication dates, and other actions.
FEMA does not intend to publicly publish results from this information collection.

We will provide reports to Public Assistance management and Headquarters management on a quarterly basis. These reports will include a breakdown of each question (basic descriptive statistics; averages and percentages) as well as an overall analysis of patterns seen in the data each quarter and trends over time. Data can also be broken down by region, disaster, state, etc., depending on the needs of Public Assistance, so it is possible that stakeholders will occasionally request reports more frequently than quarterly.
There is also a Public Assistance Tableau Dashboard on the Customer Survey and Analysis Website that
can be accessed by anyone in FEMA. The dashboard is refreshed on a monthly basis. The dashboard
displays averages for a subset of survey questions.
Statisticians may be asked to do more in-depth analysis if there is a significant drop in customer
satisfaction scores, and stakeholders want to understand why there was a decrease in satisfaction. This
may involve correlation, T-tests, Crosstabs with Pearson’s Chi-Square, and Analysis of Variance
(ANOVA). Demographic data will typically be used to describe the sample of respondents, but
statisticians may also look for differences in satisfaction across demographic groups if a more in-depth
analysis is requested.
17. If seeking approval not to display the expiration date for OMB approval of the information
collection, explain reasons that display would be inappropriate.
FEMA will display the expiration date for OMB approval of this information collection.
18. Explain each exception to the certification statement identified in Item 19 “Certification for
Paperwork Reduction Act Submissions,” of OMB Form 83-I.
FEMA does not request an exception to the certification of this information collection.

Contact Information
Kristin Brooks, Ph.D.
Statistician
Customer Survey and Analysis Section
Reports and Analytics Division, FEMA
[email protected]
Office: (940) 891-8579; Alt phone: (310) 569-3347

Maggie Billing
Program Analyst
Customer Survey & Analysis Section
Reports and Analytics Division, FEMA
[email protected]
Office: (940) 891-8709

References

Curtin, R., Presser, S., & Singer, E. (2005). Changes in telephone survey nonresponse over the past quarter century. Public Opinion Quarterly, 69, 87–98.

Czajka, J. L., & Beyler, A. (2016). Declining response rates in Federal surveys: Trends and implications. Mathematica Policy Research, 1, 1–86. Retrieved from https://aspe.hhs.gov/system/files/pdf/255531/Decliningresponserates.pdf

Duffy, B., Smith, K., Terhanian, G., & Bremer, J. (2005). Comparing data from online and face-to-face surveys. International Journal of Market Research, 47(6). doi:10.1177/147078530504700602

Ehlen, J., & Ehlen, P. (2007). Cellular-only substitution in the United States as lifestyle adoption. Public Opinion Quarterly, 71, 717–733.

Kennedy, C., & Hartig, H. (2019). Response rates in telephone surveys have resumed their decline. Pew Research Center. Retrieved from https://www.pewresearch.org/fact-tank/2019/02/27/response-rates-in-telephone-surveys-have-resumed-their-decline/

Sinclair, M., O'Toole, J., Malawaraarachchi, M., & Leder, K. (2012). Comparison of response rates and cost-effectiveness for a community-based survey: Postal, internet and telephone modes with generic or personalized recruitment approaches. BMC Medical Research Methodology, 12, 132. doi:10.1186/1471-2288-12-132

Szolnoki, G., & Hoffmann, D. (2013). Online, face-to-face and telephone surveys: Comparing different sampling methods in wine consumer research. Wine Economics and Policy. doi:10.1016/j.wep.2013.10.001

Tourangeau, R. (2004). Survey research and societal change. Annual Review of Psychology, 55, 775–801.

