B.1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, state and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
SBA is interested in deriving annual estimates of satisfaction and outcomes for Emerging Leaders, as well as the descriptive characteristics of the small business owners participating in the initiative. Because participants may vary across sites and site-specific feedback is important, a census (rather than a sample) of the approximately 20 participants per site will be conducted. Exhibit 1 details the expected survey population size and response rate.¹ Estimates are based on the response rates obtained in previous years of this data collection. Across 48 sites, SBA estimates a survey population of 960 small businesses. Response to the intake and feedback forms administered during the training program is expected to remain high, as in previous years (over 80 percent). Response to the follow-up surveys is expected to decline over the three years of follow-up, starting at a high of 65 percent in the first year after the initiative.
Exhibit 1. Expected survey population and response rate for each data collection instrument
| Instrument | Survey population | Response rate | Total respondents |
|---|---|---|---|
| Intake form | 960 | 100% | 960 |
| Feedback form | 960 | 87% | 838 |
| Follow-up surveys (graduates only) | | | |
| 1st year | 838 | 65% | 541 |
| 2nd year | 838 | 39% | 324 |
| 3rd year | 838 | 20% | 165 |
B.2. Describe statistical methodology for stratification and sample selection, estimation procedure, degree of accuracy needed for the purpose described in the justification, unusual problems requiring specialized sampling procedures, and any use of periodic (less frequent than annual) data-collection cycles to reduce burden.
As noted in response to B.1, no sampling strategy is used in this data collection. The estimates are intended to be generalizable to the initiative participants as a whole. One purpose of the collection is to monitor the extent to which the initiative is reaching its target population of disadvantaged or under-represented entrepreneurs. Given the high response rate at intake, no statistical procedures are needed to estimate characteristics of the population, such as the proportion located in a low- or moderate-income area. A second purpose is to estimate business outcomes after participation in the initiative. If the response rate falls below the 80 percent threshold included in Office of Management and Budget (OMB) guidance, a nonresponse bias analysis will be conducted and reported. Business information collected in the intake survey (e.g., industry, business size, location, race/ethnicity of owner), which serves as the survey frame for the follow-up surveys, will be used to measure differences between the respondent and nonrespondent populations. Differences will be tested using standard statistical tests such as Student's t-test and the chi-square test.
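As an illustration, the nonresponse bias analysis described above could be carried out along the following lines. This is a minimal sketch, assuming the intake records are available as a flat file with a response flag for the relevant follow-up wave; the file name and column names (responded, employees, industry) are hypothetical placeholders, not fields from the actual instruments.

```python
# Minimal sketch of the nonresponse bias checks described above.
# File and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats

frame = pd.read_csv("intake_frame.csv")  # intake records, one row per business

respondents = frame[frame["responded"] == 1]
nonrespondents = frame[frame["responded"] == 0]

# Student's t-test for a continuous intake characteristic (e.g., employee count).
t_stat, p_value = stats.ttest_ind(
    respondents["employees"].dropna(),
    nonrespondents["employees"].dropna(),
    equal_var=False,  # Welch's variant; robust to unequal group variances
)
print(f"Employees: t = {t_stat:.2f}, p = {p_value:.3f}")

# Chi-square test for a categorical intake characteristic (e.g., industry).
contingency = pd.crosstab(frame["industry"], frame["responded"])
chi2, p_value, dof, expected = stats.chi2_contingency(contingency)
print(f"Industry: chi-square = {chi2:.2f}, p = {p_value:.3f}")
```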
Student's t-test will also be used to measure whether changes over time in key small business outcomes (e.g., small business revenue or employment) are statistically significant and in the hypothesized direction. The analysis may also use multivariate regression techniques to assess change using a pooled sample of several years of data, particularly given the lower response rate by year three of the survey. The multivariate regression will control for factors such as cohort, time period, location, time-variant community economic characteristics (e.g., local unemployment rates), and time-variant small business characteristics (e.g., industry). Estimated standard errors will be adjusted appropriately for the clustering of data by site. Standard inferential statistics will be employed to determine whether estimated differences are statistically significant, using a 5 percent significance level as the threshold for rejecting the null hypothesis of no difference.
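A minimal sketch of these tests is shown below, assuming a long-format panel with one row per business per survey wave. The file name, variable names (log_revenue, years_since_program, cohort, industry, local_unemployment_rate, site_id, business_id, wave), and model specification are hypothetical placeholders, not the actual analysis plan.

```python
# Minimal sketch of the pooled clustered regression and paired t-test
# described above. All names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

panel = pd.read_csv("followup_panel.csv").dropna()  # assume complete cases

# Pooled multivariate regression controlling for cohort, industry, and
# local economic conditions, with standard errors clustered by site.
model = smf.ols(
    "log_revenue ~ years_since_program + C(cohort) + C(industry)"
    " + local_unemployment_rate",
    data=panel,
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": panel["site_id"]})
print(result.summary())

# Paired t-test for change in a key outcome between intake and the first
# follow-up year, evaluated at the 5 percent significance level.
wide = panel.pivot(index="business_id", columns="wave", values="log_revenue")
t_stat, p_value = stats.ttest_rel(wide["intake"], wide["year1"], nan_policy="omit")
print(f"Change in log revenue: t = {t_stat:.2f}, p = {p_value:.3f}")
```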
B.3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
To maximize the survey response rate and minimize respondent burden, the following data-collection techniques will be used (see appendices C-1 through C-9):
Each of the three surveys (intake, feedback, and follow-up) will be sent from the official most likely to elicit a response from the participant. The intake survey is sent by the instructor with whom the participant will spend the next seven months. The feedback and follow-up forms are sent by the Emerging Leaders administrators (i.e., the contract awardee) so that participants can fully and honestly express their satisfaction or dissatisfaction without concern about offending the instructor. Because the follow-up surveys have historically had a lower response rate, they will also be preceded by an invitation e-mail sent jointly from SBA and the administrators. Including the federal sponsor will help legitimize the multi-year follow-up activities and highlight their importance.
To encourage participants to respond, survey invitations and introductions will outline the potential benefits the survey offers to businesses. For example, the intake survey explains that these data are collected at the beginning of the curriculum so that the instructor can help identify specific areas of improvement for the business. The follow-up survey provides a link to the aggregate survey results from the previous year and a link to the respondent’s survey answers from the previous year to promote self-evaluation, growth, and development.
Reminder e-mails are used to address nonresponse due to bad timing for the respondent. The survey system will send automatic reminders over a period of approximately one month to encourage potential respondents to complete the survey before the deadline. The reminders will be sent at different times of the day to increase the probability of reaching respondents at an opportune time. For the intake survey, a reminder e-mail is sent from the SBA District Office to participants as a follow-up to the instructor’s e-mail; this underscores the survey’s importance to the program funder.
To decrease nonresponse due to issues with e-mail addresses, the follow-up invitation e-mail requests that the participant add the survey software address to his or her safe-sender list. It also asks the participant for the best e-mail address for delivering the survey if the one on file is not preferred. If an e-mail invitation “bounces,” the program administrators will work to replace it using other contact information requested at intake and in each year of the survey.
The surveys are designed with skip patterns and simple, often multiple-choice answer options to make them easy to complete. Surveys can be halted and resumed by participants for greater flexibility, and participants are free to skip questions that they do not know how to answer or do not want to answer. The invitations alert the participant to what information he or she should gather before starting the survey. All of these efforts are intended to decrease the burden on the participant and increase the response rate.
B.4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The follow-up survey, which is the most extensive, was pre-tested with three small business owners to ensure reliability, minimize measurement error, and minimize respondent burden. First, it was administered to two Emerging Leaders participants using a cognitive interviewing method: respondents were asked to verbalize their thought process by “thinking out loud” with a researcher on the phone while completing the survey online. Follow-up probes were used to determine whether respondents had difficulty understanding the meaning of questions or particular words, problems remembering the information needed to answer questions, issues with unnecessary questions or response options, or trouble with missing questions or response categories. Items that respondents reported, or the researcher noted, as problematic, difficult, or time-consuming to answer were discussed further to determine the reasons for the difficulties. Second, the survey was reviewed by an additional Emerging Leaders participant, who provided written feedback on the instrument and any areas of confusion or doubt he experienced.
Pretests with three internal research staff were completed to provide a burden estimate for the final instruments. Staff were instructed to assign themselves a fictional business identity and to complete the survey in one sitting, at a time and place with minimal distractions. The average burden across testers is detailed in Exhibit 2.
Exhibit 2. Burden estimates of the test surveys
| Instrument | Average burden (minutes) |
|---|---|
| Intake survey | 17 |
| Feedback survey | 9 |
| Follow-up survey | 25 |
B.5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Questions regarding this collection should be directed to SBA. Contact information for SBA staff associated with this program is provided in the table below. The consultants who designed this data collection for the agency are from Optimal Solutions Group, LLC; their contact information is also provided in the table.
| Specialization | Name | Title | Organization | E-mail address |
|---|---|---|---|---|
| Evaluation | Brittany Borg | Program Analyst | U.S. Small Business Administration | |
| Evaluation | Jennifer Auer | Project Director | Optimal Solutions Group, LLC | |
| Labor economist | Mark Turner | CEO | Optimal Solutions Group, LLC | |
¹ Although the intake survey does not always obtain a 100% response rate, it is typically 95% or higher. Thus, 100% is used to account for the maximum potential burden.