DEPARTMENT OF THE TREASURY
WASHINGTON, D.C. 20220
Supporting Statement B
Evaluation of the Community Development Financial Institutions Fund Bank Enterprise Award Program
B. Collections of Information Employing Statistical Methods
1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The data collection is a census survey of all 156 BEA Program Applicants and therefore does not employ sampling methodology. A response rate of approximately 70 percent is expected for this data collection effort. Beyond the planned data collection and follow-up procedures, this expectation reflects the fact that prospective respondents have a vested interest in the BEA Program and in other Federal government programs designed to promote local community and economic development. In addition, the CDFI Fund will communicate generic encouragement of participation through various media, which is likely to have a positive effect on response rates. However, the CDFI Fund will not take any action to encourage individual BEA Program applicants/awardees to participate in the survey.
2. Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The data collection is a census survey and therefore does not employ sampling methodology. The survey will be administered one time and will consist primarily of closed-ended questions (e.g., Likert scale, rating scale, rank order, or multiple response items), with a minimal number of “other (specify)” items and open-ended questions. The survey will take approximately 35 minutes to complete.
ARDX will program and test the web-based survey instrument using customized software. Following OMB Clearance, the survey will be published (with the OMB clearance number and burden statement) and administered as outlined below:
Validation calls to confirm contact person and contact information
Personalized letter (electronic) from the CDFI Fund announcing the survey
Letter from ARDX (electronic) one week following the initial letter including a personalized link
Reminder email sent to non-respondents (one week after the survey link)
Follow-up telephone call to non-respondents offering respondents an option of completing the survey by telephone (one week following the reminder email)
ARDX will develop a receipt control system that is directly linked to the online survey program to track response rates in real time and identify non-respondents for follow-up. ARDX’s goal is to achieve a final online survey response rate of approximately 70 percent.
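As an illustrative sketch only (the actual ARDX receipt control system is not described in this document, and the function and identifier names below are hypothetical), the core bookkeeping such a system performs can be expressed as computing a running response rate and flagging non-respondents for follow-up:

```python
# Hypothetical sketch of receipt-control bookkeeping; not the ARDX system.
# It tracks which invited respondents have completed the survey, computes
# the running response rate, and lists non-respondents for follow-up.

def receipt_control(invited_ids, completed_ids):
    """Return the running response rate and a sorted list of non-respondents."""
    invited = set(invited_ids)
    completed = set(completed_ids) & invited
    non_respondents = sorted(invited - completed)
    rate = len(completed) / len(invited)
    return rate, non_respondents

# Example: a census of 156 BEA Program applicants with 109 completions so far.
invited = [f"BEA-{i:03d}" for i in range(1, 157)]
completed = invited[:109]
rate, pending = receipt_control(invited, completed)
print(f"Response rate: {rate:.1%}; {len(pending)} pending follow-up")
```

In this example, 109 of 156 completions corresponds to a response rate just under the 70 percent goal, with 47 non-respondents identified for reminder emails or telephone follow-up.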
3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
Respondents will receive a personalized electronic letter from the CDFI Fund announcing the survey. That will be followed by a personalized survey invitation from ARDX containing a URL, login name, and password to complete the survey online. ARDX will send an email reminder one week after distribution of the survey invitation to those who have not responded, which should increase the response rate. Telephone reminder calls will be made to remaining non-respondents one week later, offering them the option of completing the survey by telephone; this choice of modes is expected to further increase response rates. Telephone reminders will be made at different times of the day and on different days of the week. The survey website will be available 24 hours a day, seven days a week. A toll-free technical support phone number will be listed in the electronic survey letter from ARDX in case a respondent has questions about the research study or encounters difficulty navigating the survey on the Internet.
4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The evaluation team conducted a cognitive pre-test of the survey instrument with five (5) respondents to ensure that all research questions were addressed, questions were not ambiguous, and response choices were mutually exclusive and exhaustive. The pre-test was also used to estimate respondent burden in terms of the amount of time required to complete the survey.
The pretest replicated the live data collection process, including selection of actual BEA Program awardees. Following online survey completion, the evaluation staff administered a brief telephone follow-up interview to obtain feedback from pretest respondents regarding the survey instrument. The selected respondents will be removed from the distribution list for the live survey, and their responses will be incorporated into the overall results as appropriate.
In addition to the design pretest, the online instrument was tested for technical factors such as programming accuracy and browser compatibility. Comprehensive testing scenarios were developed to verify each step of the programming process, and the instrument was subjected to developer testing, quality assurance review, and user acceptance testing (UAT) prior to distribution to pretest respondents. Testing included item-by-item checks for:
Accuracy of skip patterns and logic checks for all appropriate scenarios
Accuracy of programming of radio buttons for single-response items or check boxes for multiple response items
Adequate field length for open-ended questions
Connectivity
Functionality of survey links
508 compliance
Correct export order and variable names
In general, the pretest respondents found the online survey straightforward and felt that the questions adequately measured the effectiveness of the BEA Program. However, one modification will be incorporated as a result of the pretest and follow-up discussions with the CDFI Fund. The instrument will include the following “pop-up” BEA Program definition of Distressed Communities:
BEA Distressed Communities are census tracts that individually or collectively meet each of the following requirements:
Economic Requirement:
At least 30% of the population living in poverty; and
An unemployment rate at least 1.5 times the national average.
Geographic Requirement:
A total population of at least 4,000, if near an MSA with a population of 50,000 or greater;
A population of at least 1,000, if no portion is located near an MSA; or
Located within an Indian Reservation.
One pretest respondent suggested that the survey include a definition of Distressed Communities, since individuals who work in the banking industry hold differing definitions of the term “distressed,” which might influence the survey results.
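The two-part definition above (the economic requirement plus any one geographic test) can be sketched as a simple eligibility check. This is an illustrative sketch only, not an official CDFI Fund eligibility tool; the function name and parameters are hypothetical, and the thresholds are taken directly from the definition above:

```python
# Hypothetical sketch of the BEA Distressed Community test described above.
# A tract (or group of tracts, taken collectively) must meet BOTH economic
# conditions AND at least ONE of the three geographic conditions.

def is_bea_distressed(poverty_rate, unemployment_rate, national_unemployment,
                      population, near_msa_50k, near_any_msa,
                      in_indian_reservation):
    """Return True if the area meets the BEA Distressed Community definition."""
    economic = (poverty_rate >= 0.30
                and unemployment_rate >= 1.5 * national_unemployment)
    geographic = ((near_msa_50k and population >= 4000)
                  or (not near_any_msa and population >= 1000)
                  or in_indian_reservation)
    return economic and geographic

# Example: 32% poverty, 9% unemployment vs. 5% national, population 5,200
# near an MSA of 50,000+ -> meets both requirements.
print(is_bea_distressed(0.32, 0.09, 0.05, 5200, True, True, False))
```

Note that the economic conditions are joined by "and" while the geographic conditions are alternatives joined by "or," which the sketch mirrors.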
5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the bureau unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the bureau.
The agency responsible for receiving and approving contract deliverables is:
The Community Development Financial Institutions Fund
U.S. Department of the Treasury, 6th Floor
1801 L Street NW
Washington, DC 20036
Person Responsible: Greg Bischak, (202) 653-0315, [email protected]
The organization responsible for administering the online survey of BEA Program applicants and awardees is:
A. Reddix & Associates
5700 Lake Wright Drive, Suite 203
Norfolk, VA 23502
Persons Responsible: Ms. Angela Reddix, (757) 410-7704, [email protected]
Ms. Sadie Bennett, (757) 321-4123, [email protected]
The organizations responsible for data analysis are:
Woodstock Institute
29 E. Madison, Suite 1710
Chicago, IL 60602
Person Responsible: Dr. Spencer Cowan, (312) 368-0310, [email protected]
National Community Reinvestment Coalition (NCRC)
727 15th Street, Suite 900
Washington, DC 20005
Person Responsible: Mr. Jason Richardson, (202) 464-2722, [email protected]