BUREAU OF CONSUMER FINANCIAL PROTECTION
PAPERWORK REDUCTION ACT SUBMISSION
INFORMATION COLLECTION REQUEST
SUPPORTING STATEMENT PART A
DEBT COLLECTION QUANTITATIVE DISCLOSURE TESTING
(OMB CONTROL NUMBER: 3170-XXXX)

OMB TERMS OF CLEARANCE: Not applicable. This is a new collection. There are no
terms of clearance at this time.

ABSTRACT:
The Dodd-Frank Wall Street Reform and Consumer Protection Act and other federal consumer
financial laws authorize the Consumer Financial Protection Bureau (BCFP or Bureau) to engage
in consumer protection rule writing. This PRA clearance request seeks approval from the Office
of Management and Budget (OMB) to conduct a web survey of 8,000 individuals as part of the
Bureau’s research on debt collection disclosures.
The survey will explore consumer comprehension and decision making in response to debt
collection disclosure forms. The survey will oversample respondents who have had experience
with debt collection in the past.

JUSTIFICATION
1. Circumstances Necessitating the Data Collection
The Dodd-Frank Wall Street Reform and Consumer Protection Act (Pub.L. 111–203) and other
federal consumer financial laws authorize the Consumer Financial Protection Bureau (BCFP or
Bureau) to engage in consumer protection rule writing. The Bureau relies on empirical evidence
and rigorous research to improve its understanding of consumer financial markets for
regulatory purposes.
The Fair Debt Collection Practices Act (FDCPA) establishes the rights, liabilities, and
responsibilities of participants in the debt collection system, including third-party debt
collectors, debt buyers, and consumers. Among other things, the FDCPA was enacted to
“eliminate abusive debt collection practices by debt collectors, [and] to insure that those
debt collectors who refrain from using abusive debt collection practices are not
competitively disadvantaged.”
To achieve these purposes, the FDCPA: (1) prohibits debt collectors from engaging in
abusive, deceptive, or unfair practices; (2) imposes restrictions on debt collectors’
communications with consumers and on their communications with others to locate
consumers; and (3) mandates a debt dispute process under which collectors provide
consumers with basic information about their alleged debts, consumers have the right to
dispute their alleged debts, and collectors must verify disputed debts before continuing to
collect on them.

The FDCPA requires that debt collectors make certain disclosures as part of the collection
process. Most notably, Section 809 of the FDCPA requires debt collectors to provide
“validation notices” (sometimes called “g-notices”) to consumers at the start of the collection
process. These notices contain information about the debt collection process, such as the
consumer’s right to dispute the debt, as well as information about the debt being collected,
such as the name of the debt’s owner and the amount owed.
Certain other disclosures are also required by the FDCPA. For instance, Section 807(11)
requires what is commonly called the “mini-Miranda” warning. It requires that, in the collector’s initial communication, collectors state that they are calling to collect a debt and that any information obtained during the course of the call may be used to collect that debt. For all
communications, it also requires that debt collectors disclose that the communication is from a
debt collector.
As part of a potential upcoming rulemaking implementing the FDCPA, the BCFP is
considering whether additional information should be added to the validation notice to help
consumers recognize whether they owe the debts. The BCFP also is considering whether
additional information about consumer rights under the FDCPA should be disclosed to
consumers at the time the validation notice is given. The BCFP further is considering whether
consumers should receive disclosures in validation notices or subsequent communications
regarding time-barred debts (i.e., debts that are older than the applicable state statute of
limitations) or if other disclosures should be provided.
2. Use of the Information
The BCFP will use information gathered as part of this research study to help assess
whether it can improve the clarity of forms used during debt collection to facilitate
consumer decision making. Insights from this survey may provide information about how
consumers respond to disclosures that can be leveraged to inform the development of future
consumer disclosures.

The BCFP plans to conduct a web-based survey that will test a number of outstanding questions related to disclosures the Bureau is developing in conjunction with its debt collection rulemaking, especially with regard to “time-barred” debt. The survey will test these issues with a large sample of consumers possessing a broad range of demographic characteristics, oversampling consumers who indicate that they have experience with debts in collection.
The BCFP has retained a contractor to conduct the proposed research; the contractor will
subcontract with a survey research firm to assist with administration of the web survey. The
study will be conducted in English and will use the subcontractor’s proprietary online panel.
The survey will not involve ongoing data collection; it is a one-time web survey. Participation
will be voluntary.
The BCFP plans to share aggregated findings from the survey with the public as appropriate,
for example, in a future study on debt collection or in connection with any potential
rulemakings related to debt collection.
3. Use of Information Technology
The survey will be a web-based data collection effort. Respondents will be recruited from
GfK’s KnowledgePanel, an online panel. Panelists will receive an email containing a
personalized URL (e.g., www.researchsurvey/123456) for the web survey that includes a unique, non-sequential identifier for secure login. Upon clicking the URL, which our contractor will host, the respondent will be directed to the survey. Respondents will be asked to read a
validation notice and then answer questions based on a hypothetical situation. The web
instrument will automatically guide the respondent through the survey questions. Respondents
may save their responses and suspend/resume the survey where they left off. At any time,
respondents will be able to refer to the validation notice.
Collecting data electronically will help to reduce errors and improve data reliability by:

• Providing paradata, helping us understand how people interact with the survey (i.e., how often they refer to the validation notice and for how long, and whether they return to previous questions during the survey);
• Providing uniform question sequencing;
• Automatically skipping questions, where appropriate, based on prior answers to questions;
• Randomizing disclosure forms to participants; and
• Rejecting invalid responses or data entries.

Additionally, the subcontractor may collect data on the length of the survey and unit and item
non-response rates. This type of information can be used to improve the data collection
process.
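
To make the mechanics described above concrete, the following is a minimal illustrative sketch (in Python) of the kind of logic such a web instrument typically implements: per-respondent randomization of disclosure forms, answer-driven skip logic, rejection of invalid entries, and simple paradata counters. All names here (DISCLOSURE_FORMS, assign_form, should_skip, Paradata) are hypothetical and are not drawn from the contractor’s actual platform.

```python
# Illustrative sketch only; names and logic are hypothetical, not the
# contractor's proprietary survey platform.
import random
from dataclasses import dataclass, field


@dataclass
class Paradata:
    """Simple paradata counters for one respondent."""
    notice_views: int = 0
    notice_view_seconds: float = 0.0
    answers: dict = field(default_factory=dict)


DISCLOSURE_FORMS = ["form_A", "form_B", "form_C"]  # hypothetical test conditions


def assign_form(respondent_id: str) -> str:
    """Randomly assign one disclosure form per respondent (seeded for stability)."""
    return random.Random(respondent_id).choice(DISCLOSURE_FORMS)


def should_skip(question_id: str, answers: dict) -> bool:
    """Skip a follow-up question based on a prior answer (hypothetical rule)."""
    if question_id == "q2_collection_details":
        return answers.get("q1_had_debt_in_collection") == "no"
    return False


def validate_answer(question_id: str, value: str) -> bool:
    """Reject entries that are not among the allowed responses."""
    allowed = {"q1_had_debt_in_collection": {"yes", "no", "unsure"}}
    return value in allowed.get(question_id, {value})


def record_notice_view(paradata: Paradata, seconds: float) -> None:
    """Count each time the respondent re-opens the validation notice."""
    paradata.notice_views += 1
    paradata.notice_view_seconds += seconds


# Example usage with a hypothetical respondent
p = Paradata()
form = assign_form("panelist-123456")
record_notice_view(p, 42.0)
if validate_answer("q1_had_debt_in_collection", "no"):
    p.answers["q1_had_debt_in_collection"] = "no"
print(form, should_skip("q2_collection_details", p.answers), p.notice_views)
```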

4. Efforts to Identify Duplication
The proposed consumer survey will not duplicate empirical research that the BCFP has
identified to date. The debt collection disclosure form alternatives that will be tested through
the survey are currently being developed, informed by previous qualitative research performed
under OMB Control # 3170-0055, Generic Information Collection Plan to Conduct Cognitive Research and Pilot Testing, under an information collection titled “Debt Collection Disclosure Testing Quantitative Study, Pretesting of Survey Questions.” No empirical studies to date have
quantitatively tested consumers’ comprehension and decision making around these debt
collection disclosure form alternatives. Moreover, the quantitative testing will not be
duplicative of the qualitative form testing study. The qualitative study uses much smaller
sample sizes to identify any large trends in consumers’ reactions to specific aspects of the
forms (e.g., the forms’ formatting and layout). The quantitative form testing study will test
consumers’ comprehension and decision making using updated versions of the forms with a
much larger and representative sample.
The BCFP will continue to monitor empirical research and related work by federal regulatory agencies and other researchers to ensure that the BCFP’s research techniques
reflect the most current knowledge and best practices.
5. Efforts to Minimize Burdens on Small Entities
Not applicable. The data collection will not burden small entities because the survey will
only collect information from individuals.
6. Consequences of Less Frequent Collection and Obstacles to Burden Reduction
Each surveyed individual will only participate once.
If the survey were not implemented, the BCFP would be limited in its ability to provide
an analysis of how the debt collection disclosure form alternatives facilitate
consumers’ comprehension and decision making.

By implementing the survey, the BCFP will be able to test for differential patterns in form
comprehension and decision making across different types of disclosures. If the survey were not
implemented, the BCFP would not be able to assess these critical questions.
7. Circumstances Requiring Special Information Collection
There are no special circumstances. The collection of information is conducted in a manner
consistent with the guidelines in 5 C.F.R. 1320.5(d)(2).
8. Consultation Outside the Agency

In accordance with 5 C.F.R. 1320.8(d)(1), the Bureau published a Federal Register notice
(FRN) allowing the public 60 days to comment on this proposed new collection of
information.
Further, and in accordance with 5 C.F.R. 1320.5(a)(1)(iv), the Bureau has published a notice
in the Federal Register allowing the public 30 days to comment to OMB on the submission
of this information collection request. Additionally, as noted above, the questions in this survey were pretested in pilot testing conducted under OMB Control #3170-0055.
The BCFP received 9 responsive comments during the 60-day notice period, and 5
comments were directed to OMB during the 30-day notice period. Commenters included
industry groups, consumer advocates, academics, and private citizens. Commenters were
generally supportive of research into debt collection disclosures, but asked that we delay the
information collection. In response, we pulled this collection from OMB review, and are
now re-submitting for review and republishing another 30-day notice inviting the public to
submit comments to OMB about this collection. We also thoughtfully considered the areas
of improvement that the commenters proposed, and we address those comments below.

Disclosure Notices
Several commenters expressed concern that the PRA submission materials did not include the
disclosure notices and text to which survey respondents will be asked to respond. The Bureau
has included the various versions of the model form and disclosure options that will be tested.
The Bureau has also previously released examples of possible consumer disclosures as part of
the Outline of Proposals Under Consideration for the Small Business Review Panel for Debt
Collector and Debt Buyer Rulemaking. The Bureau has received and continues to receive
feedback from stakeholders on these examples and related topics, and these disclosures
continue to be under consideration and development. Any disclosures that become part of a
rulemaking will be released at a later date and will be subject to public notice and comment.

Use of Hypothetical Scenario in Survey Questions
Commenters also expressed concern about the applicability of hypothetical questions and
scenarios to real world decisions. Bureau researchers acknowledge that there is a large
literature suggesting that consumers may be inaccurate in predicting how they will react to
hypothetical future events. The Bureau has therefore taken steps to evaluate this methodology,
and believes the methods proposed are the most appropriate for three reasons: (1) the performance of the methodology in qualitative testing and consultant support; (2) a focus on treatment effects over baseline estimates; and (3) empirical support for the methodology. These are discussed in more detail below.
(1) Testing and consultant support of the hypothetical vignette method.
To evaluate the proposed vignette methodology, the Bureau has explored different research
methodologies with expert contractors and visiting scholars, and performed qualitative testing
of the disclosures and the survey instrument, including the vignette. In previous versions
where consumers were asked to estimate their own behavior rather than that of a hypothetical
Person A, researchers found that consumers without debt collection experience dwelled on the
idea that they would never be in the position of owing a debt, which interfered with their ability
to complete the survey. Switching to a third-person framing made it easier for respondents both with and without debt collection experience to answer questions about the information on the form.

(2) Focusing on treatment effects
In addition, the Bureau is interested in relative differences between groups in disclosure
comprehension, depending on the disclosure that each group receives; the Bureau does not
intend to rely on this research project to understand incidence rates in the population. The
hypothetical nature of the questions should have similar effects (if any) on participants in all
experimental groups, and therefore would be a common factor across groups. Comparing
relative responses across groups, as opposed to measuring the incidence rate of responses for a
particular group, should render any effect of the hypothetical nature of the questions irrelevant
for the Bureau’s purposes.
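As a concrete illustration of this relative, between-group comparison, the following Python sketch (with made-up numbers) computes the difference in the share of respondents answering a comprehension item correctly under two randomly assigned disclosure forms and applies a standard two-proportion z-test. It is only an illustration of the general approach, not the Bureau’s actual analysis plan.

```python
# Illustrative only: made-up counts, not Bureau data or its analysis plan.
from math import erf, sqrt


def two_proportion_z(correct_a: int, n_a: int, correct_b: int, n_b: int):
    """Two-sided z-test for a difference in comprehension rates between
    two randomly assigned disclosure groups."""
    p_a, p_b = correct_a / n_a, correct_b / n_b
    pooled = (correct_a + correct_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return p_a - p_b, z, p_value


# Hypothetical results: 62% vs. 54% answer a comprehension item correctly.
diff, z, p = two_proportion_z(620, 1000, 540, 1000)
print(f"treatment effect = {diff:.1%}, z = {z:.2f}, p = {p:.4f}")
```
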
(3) Empirical support for the methodology
Using “vignettes” (also called factorial or decision scenarios) to ask survey questions is a
common methodology in the social sciences. Evidence suggests that what people express on
web surveys is associated with their actual behavior in the real world, 1,2,3 and external
validation of the vignette method suggests responses are somewhat consistent among different
demographic groups. 4 For example, evidence suggests that how people respond in surveys
using the vignette method of questioning is related to how they behave in field studies,
although there are biases, including in the reporting of more prosocial behavioral norms
compared to behavior in the real world. 5 There may also be biases in survey responses based on
automatic processes which affect consumer behavior but of which the consumer is not
consciously aware.6 However, these biases are not limited to hypothetical questions, but rather are common in surveys in general.

1 Couper, Mick, Singer, Eleanor, Conrad, Frederick, and Groves, Robert. 2010. “Experimental Studies of Disclosure Risk, Disclosure Harm, Topic Sensitivity, and Survey Participation.” Journal of Official Statistics, 26(2): 287-300.
2 Hensher, David A. 2009. “Hypothetical Bias, Choice Experiments and Willingness to Pay.” Transportation Research Part B, 44: 735-752.
3 Adams, P., Guttman-Kenney, B., Hayes, L., and Hunt, S. 2018. “Helping Credit Card Users Repay Their Debt: A Summary of Experimental Research.” Financial Conduct Authority Research Note. Available online at: https://www.fca.org.uk/publication/research/research-note-helping-credit-card-users-repay-their-debt-summaryexperimental-research.pdf
4 Teti, Andrea, Gross, Christiane, Knoll, Nina, and Bluher, Stefan. 2016. “Feasibility of the Factorial Survey Method in Aging Research: Consistency Effects Among Older Respondents.” Research on Aging, 38(7): 715-741.
5 Eifler, Stefanie. 2010. “Validity of a Factorial Survey Approach to the Analysis of Criminal Behavior.” Methodology, 6(3): 139-146.
6 Verneau, Fabio, La Barbera, Francesco, and Del Guidice, Teresa. 2017. “The Role of Implicit Associations in the Hypothetical Bias.” The Journal of Consumer Affairs, 51(2): 312-328.

The BCFP employs strategies in this research study to mitigate the impact of hypothetical bias. One way is to highlight the importance of the study such that “the participant
cares about the results of the research, and believes that his or her answers will influence
decisions to be made as a result of the research,” and to ask about the likelihood of various
decisions rather than indicating a decision with “yes” or “no.” 7 In fact, qualitative testing
revealed that asking about likelihoods was more effective than asking about a list of potential
behaviors. Another method the Bureau is using to minimize hypothetical bias is to probe
respondents for the certainty or confidence of their answers, rather than asking consumers to indicate whether or not they will engage in a particular behavior.8

Other Survey Question Comments
Several commenters suggested that the Bureau track whether survey participants refer back to
the notices during the online survey. Other commenters suggested that the Bureau look at
differences in disclosure comprehension between subgroups. In addition, commenters urged
the Bureau to ensure that the survey has enough statistical power to see differences between
groups, and to perform robustness checks related to the study’s overweighting of people with
debt collection experience. The BCFP plans to do each of these things by collecting survey
paradata (which tracks respondents’ process flow throughout the survey) and individual
difference measures, which we plan to use in the analysis of this study. We will also receive demographic information on respondents from GfK. To the extent that it is possible to
estimate the effect sizes that will be observed, the Bureau has also conducted power analyses
to ensure sufficient statistical power.
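
For context, the sketch below shows one common way such a power calculation can be approximated for a two-group comparison of proportions (normal approximation, two-sided test at the 5 percent level). The effect sizes and group sizes are illustrative assumptions, not the Bureau’s planning values.

```python
# Illustrative power approximation; the effect sizes and group sizes are
# assumptions for demonstration, not the Bureau's planning values.
from math import erf, sqrt


def normal_cdf(x: float) -> float:
    return 0.5 * (1 + erf(x / sqrt(2)))


def power_two_proportions(p1: float, p2: float, n_per_group: int) -> float:
    """Approximate power to detect p1 vs. p2 with n_per_group respondents
    per arm, two-sided test at alpha = 0.05 (z critical value 1.96)."""
    z_crit = 1.96
    se = sqrt(p1 * (1 - p1) / n_per_group + p2 * (1 - p2) / n_per_group)
    z_effect = abs(p1 - p2) / se
    return 1 - normal_cdf(z_crit - z_effect) + normal_cdf(-z_crit - z_effect)


# e.g., power to detect a 5-percentage-point difference in comprehension
for n in (500, 1000, 2000):
    print(n, round(power_two_proportions(0.55, 0.60, n), 2))
```
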
One commenter suggested that a field trial would be more impactful. The Bureau agrees that
field trials are highly valuable, but the Bureau cannot compel cooperation in a field trial.
Furthermore, the Bureau believes that the survey methodology proposed by Bureau researchers will provide the necessary knowledge to evaluate the disclosures.

7 Fifer, Simon, Rose, John, and Greaves, Stephen. 2014. “Hypothetical Bias in Stated Choice Experiments: Is it a Problem? And if so, How do We Deal With it?” Transportation Research Part A, 61: 164-177.
8 Blumenschein, Karen, Blomquist, Glenn C., Johannesson, Magnus, Horn, Nancy, and Freeman, Patricia. 2007. “Eliciting Willingness to Pay Without Bias: Evidence from a Field Experiment.” The Economic Journal, 118(525).

In addition, several commenters expressed concern about changes to the survey that the
Bureau may make after the “soft launch” and before the “full launch.” The Bureau expects
that any changes identified during the soft launch will not have PRA implications.
In addition to, and preceding, the “soft launch,” the Bureau intends to pilot new questions on a small group of 200 respondents from the GfK panel, evaluate these questions for effectiveness, and decide whether to retain them in the final survey instrument. This is consistent with the spirit of the PRA in that ineffective questions can be removed or refined in order to decrease the burden on the remaining respondents. The Bureau does not anticipate that any changes made during the pilot will have PRA implications, as changes will consist of refining wording or excluding ineffective items rather than substantive changes.
During the soft launch, the Bureau will review the results to make sure responses seem
correct from a technical perspective. Because of the Bureau’s pretesting work, however, the
Bureau believes that the probability of identifying concerns that would significantly change the questions of interest is very small.
The Bureau considered other commenter suggestions about whether to add or omit certain
questions, but decided either that the Bureau found value in the current questions, or that the
new questions were outside the scope of this study. One commenter disagreed with the
Bureau’s plan to ask respondents about their subjective beliefs in the survey instrument. The
Bureau believes that these questions are important controls to better understand how
respondents are interpreting the disclosure forms.
Another commenter suggested using financial literacy questions as controls and as a way to understand the perspective of the least sophisticated consumer. Given space limitations in the survey and the challenges consumers face in answering financial literacy questions,9 the Bureau will make use of demographic information such as education, race, age, gender, and income to understand the perspectives of a very diverse group of consumers, including the most vulnerable and least sophisticated consumers.

Commenters also had suggestions about the objective comprehension questions: one commenter did not think the Bureau asked enough questions to ascertain whether respondents comprehend the disclosure, and another thought that the comprehension questions should be open-ended. The Bureau has added additional multiple-choice comprehension questions and believes that the current number and scope of comprehension questions are sufficient to understand differences between forms.

9. Payments or Gifts to Respondents
Survey recipients will receive a cash payment, currently expected to be five dollars, as an
inducement to complete and return the survey questionnaire. Recipients who fail to respond
to the initial survey solicitation may receive an additional cash inducement of a similar
amount.
Meta-analyses of mail surveys find that incentives given initially with the questionnaire yield
significantly higher response rates than do incentives contingent on return of the survey or no
incentives; furthermore, monetary incentives produce a stronger effect than non-monetary
incentives. 10, 11 Many recurring federally-funded surveys use monetary incentives, including
the Survey of Consumer Finances, the Survey of Income and Program Participation, and the
National Survey of Drug Use and Health, and self-administered surveys such as the Survey of
Doctorate Recipients, the National Survey of Recent College Graduates, and the National
Survey of Mortgage Borrowers. 12, 13 Incentives have consistently been found to improve
response rates across a variety of survey topics and modes.14,15 Incentives have been found to be cost-effective in different modes, often reducing the effort required to contact and interview sample persons or reducing the number of follow-up mailings.16, 17, 18

10 Allan H. Church, “Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis,” Public Opinion Quarterly 57, no. 1 (1993): 62-79.
11 Fernandes, D., Lynch Jr., J. G., and Netemeyer, R. G. 2014. “Financial Literacy, Financial Education, and Downstream Financial Behaviors.” Management Science, 60(8): 1861-1883.
12 Phil Edwards, Ian Roberts, Mike Clarke, Carolyn DiGuiseppi, Sarah Pratap, Reinhard Wentz, and Irene Kwan, “Increasing Response Rates to Postal Questionnaires: Systematic Review,” British Medical Journal 324 (2002): 1183-1189.
13 Fan Zhang, “Incentive Experiments: NSF Experiences,” NSF Working Paper, 2010.
14 Eleanor Singer (2002), “The Use of Incentives to Reduce Nonresponse in Household Surveys.” In R.M. Groves, D.A. Dillman, J.L. Eltinge, and R.J.A. Little (eds.), Survey Nonresponse. New York: Wiley, pp. 163-177.
15 Eleanor Singer and Cong Ye (2013), “The Use and Effects of Incentives in Surveys.” The Annals of the American Academy of Political and Social Science, 645(1): 112-141.
16 Martha Berlin et al. (1992), “An Experiment in Monetary Incentives.” Proceedings of the Survey Research Methods Section, American Statistical Association, pp. 393-398.
17 Eleanor Singer, John Van Hoewyk, and M. Patricia Maher (2000), “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly, 64(2): 171-188.
18 Gwen L. Alexander et al. (2008), “Effect of Incentives and Mailing Features on Recruitment for an Online Health Program.” American Journal of Preventive Medicine, 34(5): 382-388.

The public will also have an opportunity to comment on the proposed disclosures when the Bureau publishes its Notice of Proposed Rulemaking for the rule that this research will support.

10. Assurances of Confidentiality
The BCFP will not provide an explicit pledge of confidentiality. The BCFP shall treat the information in accordance with applicable federal law, the Bureau’s own privacy rules, and all applicable laws and regulations that apply to federal agencies for the protection of the privacy, security, and integrity of information.
The BCFP provides notice to individuals to explain how their information will be used
through Privacy Act Statements. Privacy Act Statements are made available prior to the
collection of information and explain whether providing the information is mandatory or voluntary; the authority for the information collection; whether there are any opportunities to consent to sharing and submission of information; how the information will be secured; and what System of Records applies.
In the survey’s introduction, respondents will be informed about the study’s purpose, the
authority under which the data are being collected, that cooperation is voluntary, and that
direct identifying information will not be provided to the BCFP or to any other party.
Regarding respondents’ personally identifiable information (“PII”), the subcontracted survey
research firm uses user- and role-based access controls, separating identifying and non-identifying
data into different database systems, each of which has its own defined security roles. Access
to survey data is limited to the relevant research staff but explicitly denied to anybody who
may deal with panelists’ PII. Only the subcontractor’s IT, Panel Management staff, and
selected vendors with a need to know have access to panelists’ PII. The BCFP will not have
access to panelists’ PII.
The contractor will deliver to the BCFP the data as received from the subcontracted survey
research firm, so that BCFP can analyze the data. The BCFP will only receive and keep
response data stripped of direct identifying PII. Moreover, in order to limit the amount of potentially identifying information that the BCFP receives through demographic variables, the BCFP will seek to receive the demographic variables included in the data provided by the contractor/subcontractor in ranges (e.g., age 18-34) rather than as specific values (e.g., age 21), where appropriate.
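
As a simple illustration of this banding, the snippet below converts an exact age into a range before delivery; the specific brackets shown are hypothetical.

```python
# Hypothetical age brackets used only to illustrate banding a demographic value.
AGE_BRACKETS = [(18, 34), (35, 49), (50, 64), (65, 120)]


def age_to_range(age: int) -> str:
    """Return the age band containing `age`, e.g., 21 -> '18-34'."""
    for low, high in AGE_BRACKETS:
        if low <= age <= high:
            return f"{low}-{high}"
    return "unknown"


print(age_to_range(21))  # prints "18-34", matching the example in the text
```
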
Conducting this survey implicates privacy concerns because a breach of confidentiality, or re-identification, could result in an individual suffering harm. To reduce the risk of breaches of
privacy, the BCFP designs recruitment materials so as not to disclose sensitive information
about those it seeks to recruit, and uses appropriate security controls to protect information
used in research. There is also risk related to misuse of information collected for research.
Misuse might involve secondary types of research that are incompatible with the purposes of
the initial collection, or a use of the information that individuals do not understand or to which
they have not provided consent.

To reduce the risk of misuse, the BCFP minimizes access to PII based on need-to-know; any
contractor staff assigned to the project also sign confidentiality agreements. Any responses
transmitted to the Bureau from this survey will be de-identified and/or aggregated before the
Bureau receives them. When appropriate, survey results will be presented in aggregated form
to protect the privacy of firms or consumers, and any publicly released version of data will
use disclosure protection techniques (e.g., rounding, imputation, exclusion of some variables,
aggregation of categorical responses) to minimize the risk of releasing personally identifiable
or otherwise sensitive information (12 C.F.R. 1070.40 et seq.). The Bureau treats the
information collected from participating persons in a manner consistent with the Bureau’s
privacy regulations, and all data and analyses are subject to legal and privacy review prior to
their release. For the assurances of confidentiality provided to respondents by
KnowledgePanel, please see: http://www.knpanel.com/participate/privacy2.html.
The Bureau vets research proposals to ensure that they serve an authorized purpose, and evaluates the potential privacy risk and harm to individuals of specific research relative to that authorized purpose. Surveys will be consistent with the Privacy Act and the E-Government Act. The requisite SORNs and PIAs will document the collection, use, disclosure, and retention of PII, and the technical, administrative, and physical controls used to minimize privacy risks. This collection is covered by the CFPB.022 Market and Consumer Research Records System of Records Notice (77 FR 67802) and the Consumer Experience Research PIA.

11. Justification for Sensitive Questions
Questions about an individual’s finances, for example, whether a person has experience with
debt collection, are commonly considered sensitive. Nonetheless, the BCFP must ask these
kinds of questions in order to understand consumer behavior and recognize financial trends
and emergent risks relevant to consumers. Because these types of questions are central to the
BCFP mission, we believe that we are justified in asking these types of sensitive questions.
In addition, some people may consider questions about race or other socioeconomic factors to be sensitive. It is the Bureau’s opinion that these consumer characteristics are important to measure because: (1) they are an important source of variance that can be accounted for; (2) this information allows researchers to determine whether Bureau disclosures operate similarly for a diverse body of consumers, from the most vulnerable to the most sophisticated; and (3) measuring demographic characteristics permits Bureau researchers to evaluate the extent to which the survey sample is similar to other samples.
routinely asked by the online panel we are using for this study. For these reasons, we feel
justified in asking these types of sensitive questions. For information collections involving
questions of race/ethnicity, we will ensure that the OMB standards for Classification of Federal Data on Race and Ethnicity (Federal Register, October 30, 1997, Volume 62, Number 210, pages 58782-58790) are followed.
Respondent participation is voluntary; subjects will be made aware of this fact. All
respondents are free to opt-out of a data collection at any time and for any reason.
12. Estimated Burden of Information Collection
Information Collection Requirement | No. of Respondents | Frequency | Annual Responses | Average Response Time (hours) | Annual Burden Hours
Screening / Recruitment | 17,750 | 1 | 17,750 | 0.05 | 888
Web Survey | 8,000 | 1 | 8,000 | 0.33 | 2,667
Totals | 17,750* |  | 25,750 |  | 3,555

*Respondents to the Web Survey are a subset of those who responded to the screener.
The screening and recruitment responses are estimated to require an average response time of
approximately three minutes, as the number of screening questions will be limited. The estimate
for average burden per response to the web survey is based on the contractors’ study proposal
and test plan.
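
The burden-hour figures in the table follow directly from multiplying the number of respondents by the average response time; a quick arithmetic check (assuming the 0.33-hour web-survey estimate corresponds to roughly 20 minutes) reproduces the rounded totals:

```python
# Quick check of the burden arithmetic in the table above (times in hours).
screening_hours = round(17_750 * 0.05)    # ~3 minutes per screener  -> 888
survey_hours = round(8_000 * 20 / 60)     # ~20 minutes per survey   -> 2,667
print(screening_hours, survey_hours, screening_hours + survey_hours)  # 888 2667 3555
```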

13. Estimated Total Annual Cost Burden to Respondents or Recordkeepers
There are no capital/start-up or ongoing operation/maintenance costs associated with
this information collection.
14. Estimated Cost to the Federal Government
There will be no annualized capital/start-up costs for the government to receive the survey
information. The testing is funded with non-appropriated funds. The contract to carry out the
study will cost $445,806.80.

15. Program Changes or Adjustments
This is a new, one-time information collection request. Therefore, all the burden is considered
to be new burden and will be accounted for as a “program change” for the purposes of OMB’s
PRA inventory. The burden will be removed from the OMB PRA inventory after the survey is
completed.
16. Plans for Tabulation, Statistical Analysis, and Publication
The contractor’s report will provide tabulations at the aggregate level. Once the data is
tabulated, it will be presented to the BCFP along with an executive summary and detailed
findings about consumer comprehension and decision-making related to our debt collection
form alternatives for participants in the study.
The BCFP will also receive the underlying data from the contractor to conduct its own additional analysis, if appropriate. As discussed above, the BCFP may share aggregate
findings from the survey with the public as appropriate, for example, in connection with the
release of a further study of debt collection, or in connection with any potential rulemaking
related to debt collection. BCFP will only release unweighted analyses as part of any
publications related to this study.
17. Display of Expiration Date
The BCFP plans to display the OMB number and expiration date for OMB approval in the
survey instruments. Additionally, the OMB control number and expiration date will be
displayed on the Federal government’s electronic PRA docket at www.reginfo.gov.
18. Exceptions to the Certification Requirement
The Bureau certifies that this collection of information is consistent with the requirements of 5 C.F.R. 1320.9 and the related provisions of 5 C.F.R. 1320.8(b)(3), and is not seeking an exemption from these certification requirements.