
Supporting Statement for ScaleUp America Initiative

PART B. Collections of Information Employing Statistical Methods


(OMB Control Number: 3245-XXXX)

  1. Respondent Universe and Sampling Selection

Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The intake and follow-up instruments will be sent to all ScaleUp participants in Cohorts 1 and 2 at the eight sites. The participant universe is 272 small businesses that received ScaleUp assistance during fiscal year 2015. A sampling strategy will not be used because of the small size of the cohorts. The universe of comparable small businesses that are not participating in the initiative at each site is unknown. The goal is to identify 2,000 potentially comparable businesses and, after screening and nonresponse, to maintain a comparison group of about 270 businesses. The initial list of 2,000 will be identified from local (e.g., Chambers of Commerce) and national (e.g., Hoovers) small business directories. It is estimated that 20 percent of the businesses identified will respond to the screening questions (400) and that 80 percent of those businesses will be eligible to participate (320). The recruitment strategy is built around the need for a minimum number of response units to detect statistically significant differences from the participants, as well as the need to maintain approximately comparably sized participant and comparison groups across all sites.
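
The recruitment and response-rate arithmetic described above can be summarized in a minimal illustrative sketch. The rates and counts are the planning assumptions stated in this section; the code itself is illustrative only and is not part of the data collection.

```python
# Illustrative sketch of the comparison group recruitment funnel and expected
# respondent counts, using the planning assumptions stated in this section.

initial_list = 2000              # businesses identified from local and national directories
screening_response_rate = 0.20   # assumed share responding to the screening questions
eligibility_rate = 0.80          # assumed share of screened businesses found eligible
follow_up_response_rate = 0.80   # expected survey response rate (Exhibit 1)
participant_universe = 272       # ScaleUp participants in Cohorts 1 and 2

screened = initial_list * screening_response_rate                                 # 400
eligible = screened * eligibility_rate                                            # 320
expected_comparison_responses = eligible * follow_up_response_rate                # 256
expected_participant_responses = participant_universe * follow_up_response_rate   # ~218

print(f"Screened comparison businesses:   {screened:.0f}")
print(f"Eligible comparison group:        {eligible:.0f}")
print(f"Expected comparison respondents:  {expected_comparison_responses:.0f}")
print(f"Expected participant respondents: {expected_participant_responses:.0f}")
```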


Exhibit 1 details the expected sample sizes. Because matched intake and follow-up surveys are required for the planned difference-in-difference analysis, the minimum response rate (most likely that of the follow-up survey) is shown. The comparison group response rate estimate is based on a similar Department of Labor study that used an incentive for its comparison group (Michaelides and Benus 2012). That study achieved an 80 percent response rate. An 80 percent response rate from participants is higher than is often achieved in business assistance outcome surveys.1 However, an intentional awareness strategy was developed at the start of the ScaleUp initiative. This strategy informed ScaleUp administrators of the types and purpose of the data collections so that they could inform participants very early in their acceptance to the program. Moreover, the cohort design of the ScaleUp initiative ensures regular contact between participants and administrators.


Exhibit 1: Expected Response Rate by Survey Population


Type of respondent           Universe   Expected number of respondents   Expected response rate
Participant surveys          272        218                              80%
Comparison group surveys     320        256                              80%


  2. Describe the Procedures for the Collection of Information

Statistical methodology for stratification and sample selection, estimation procedure, degree of accuracy needed for the purpose described in the justification, unusual problems requiring specialized sampling procedures, and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


As noted in question 1, no sampling strategy is used in this data collection. Data will be collected from all ScaleUp participants as well as from a matched comparison group of small business owners in the same communities. The estimates will be generalizable to the eight ScaleUp sites, which were selected purposefully to fulfill predefined SBA contract criteria, but will not necessarily be representative of other regions of the U.S. or other locations in which the initiative may be deployed in the future. If the follow-up response rate falls below the expected 80 percent, a non-response bias analysis will be conducted using the demographic and business information collected in the intake survey and will be reported in the final report.2 The primary objective of the impact evaluation is to provide statistically valid and reliable estimates of the incremental effects of ScaleUp services on key business outcomes, including revenue and employment growth. The entire program, rather than the individual sites, is the unit of analysis for the statistical methods employed.
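
Should a non-response bias analysis be needed, it could take a form along the following lines. This is a hypothetical sketch only: it assumes the intake data are available as a flat file with a follow-up response indicator, and all file and column names are illustrative rather than the evaluation's actual data dictionary.

```python
import pandas as pd
from scipy import stats

# Hypothetical intake data: one row per business, with characteristics collected
# at intake and an indicator for whether the business responded to the follow-up
# survey. File and column names are illustrative only.
intake = pd.read_csv("intake_survey.csv")

characteristics = ["employees_at_intake", "annual_revenue", "firm_age_years"]

respondents = intake[intake["responded_followup"] == 1]
nonrespondents = intake[intake["responded_followup"] == 0]

# Compare respondents and non-respondents on each intake characteristic;
# large, statistically significant differences would signal potential bias.
for col in characteristics:
    t_stat, p_value = stats.ttest_ind(
        respondents[col].dropna(), nonrespondents[col].dropna(), equal_var=False
    )
    print(f"{col}: respondent mean={respondents[col].mean():.1f}, "
          f"non-respondent mean={nonrespondents[col].mean():.1f}, p={p_value:.3f}")
```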


The primary analysis method is regression-adjusted difference-in-difference estimation, which allows for the estimation of differences between participants and comparison group members (the first difference) in the changes over time in key variables (the second difference). The changes over time in key small business outcomes (e.g., small business revenue or employment) that are attributable to participation in ScaleUp are estimated, controlling for time period, program participation, location, time-variant community economic characteristics (e.g., local unemployment rates), and time-variant small business characteristics. Estimated standard errors are adjusted appropriately for the grouping of data by site. Double-differencing helps control for selection issues by eliminating the influence of all observed and unobserved time-invariant characteristics (that have time-invariant effects on the outcome in question). Standard inferential statistics will be employed to determine whether estimated differences are statistically significant, using a standard threshold of a 5 percent significance level to reject the null hypothesis of no difference.
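
For illustration, the regression-adjusted difference-in-difference model described above could be estimated along the following lines. This is a minimal sketch, not the evaluation's actual estimation code: it assumes the matched intake and follow-up records are stacked in long form (one row per business per period), and the file name, variable names, and use of the statsmodels library are illustrative assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-form panel: one row per business per period (intake, follow-up).
# Variable names are illustrative only.
df = pd.read_csv("scaleup_panel.csv")

# outcome:      e.g., log of business revenue or employment
# participant:  1 if ScaleUp participant, 0 if comparison group member
# post:         0 at intake, 1 at follow-up
# participant:post is the difference-in-difference interaction of interest.
model = smf.ols(
    "outcome ~ participant * post + local_unemployment_rate + C(site)",
    data=df,
)

# Cluster standard errors by site to account for the grouping of data by site.
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["site"]})
print(result.summary())
```

The coefficient on the participant-by-period interaction is the difference-in-difference estimate of the incremental effect of ScaleUp participation; clustering the standard errors by site corresponds to the adjustment for grouping of data by site described above.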


  3. Describe Methods to Maximize Response Rates and to Deal with Issues of Non-Response

The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


To maximize the survey response rate and minimize respondent burden, the following data collection techniques will be used. Surveys will be introduced to cohort participants and comparison group members via an official introduction e-mail from the ScaleUp initiative director (or from SBA in the case of the comparison group). This will help legitimize the survey and highlight its importance. The introduction outlines the potential benefits to the business community resulting from the evaluation. It highlights that the survey will directly contribute to finding ways to further improve the program so that the needs of growth-oriented small business owners and entrepreneurs are better served. It also explains the confidentiality of the information provided by the respondent.


After the introduction letters, Optimal will send an e-mail to cohort participants and comparison group members with the direct survey link embedded. The surveys are designed with skip patterns and simple, often multiple-choice answer options so that they can be completed easily. Optimal will also send up to four reminder e-mails encouraging potential respondents to complete the survey before the deadline. The reminders will be sent at different times of the day to increase the probability of reaching respondents at an opportune time.


Importantly, response rates are expected to be extremely low for comparison group members, who have not received the free ScaleUp services. As a result, all comparison group members will be offered a $50 VISA gift card incentive for completing each survey. This compensation is equivalent to about $100 per hour and is expected to generate high interest in completing the surveys. Among participants, SBA has worked to promote the importance of the study to administrators on multiple occasions since the start of the program and to have them convey this information to participants so that participants expect, and are willing to complete, the surveys.


  4. Describe Any Tests of Procedures or Methods to be Undertaken

Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from ten or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The participant and comparison group member intake surveys, the participant and comparison group member follow-up surveys, and the comparison group screening questionnaire were pretested to ensure reliability, minimize measurement error, and minimize respondent burden. The surveys were revised based on the feedback received from four invited respondents who are comparable in business type and size to the ScaleUp participants. Cognitive interviewing methods were used to review the instruments; items that the respondent or the researcher noted as problematic, difficult, or time-consuming to answer were discussed further to determine the reasons for the difficulties and ways to improve the questions.


In addition, pretests with internal research staff were completed to provide a burden estimate for the final instruments. Staff were instructed to complete the survey in one sitting, at a time and place with minimal distractions. The longer of the intake and follow-up instruments was used for the estimate. The average burden across testers was as follows.


Exhibit 2: Burden estimates


Instrument                  Average burden in minutes
Intake survey               17
Screening questionnaire     6



  5. Expert Contact Information

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The consultants engaged to design the collection and to collect and/or analyze the information for the agency are from Optimal Solutions Group, LLC. The contact information for the Optimal research team is below.


Program Evaluation/ Labor economist

Laura Leete, Ph.D.

Senior Research Associate

Optimal Solutions Group, LLC

 [email protected]

Program Evaluation/ Labor economist

Mark Turner, Ph.D.

CEO and President

Optimal Solutions Group, LLC

[email protected]

Program Evaluation

Jennifer Auer, Ph.D.

Project Director

Optimal Solutions Group, LLC

[email protected]


For questions regarding the study or questionnaire design or statistical methodology, contact the staff members listed above at:


Optimal Solutions Group, LLC

M Square Research Park

5825 University Research Court, Suite 2800

College Park, MD 20740-9998

Telephone: 301-918-7301

E-mail: [email protected]


References

Michaelides, M., and Benus, J. (2012). Are Self-Employment Training Programs Effective? Evidence from Project GATE. IMPAQ International.








2 It is anticipated that the response rate to intake surveys will be close to 100% because participants will be in weekly classes and administrators can prompt them to complete the survey.


