Part A - SET demonstration-revised 01 02 2013


Self-Employment Training (SET) Demonstration Evaluation

OMB: 1205-0505


Self-Employment Training Demonstration Evaluation

PART A: SUPPORTING STATEMENT FOR PAPERWORK
REDUCTION ACT SUBMISSION

The U.S. Department of Labor (DOL), Employment and Training Administration (ETA) contracted with Mathematica Policy Research to implement and evaluate the Self-Employment Training (SET) Demonstration. This demonstration is a reemployment program targeted towards dislocated workers, as defined by the Workforce Investment Act (WIA)1, who are interested in starting or growing a business in their fields of expertise. The demonstration will seek to connect such workers to self-employment training, intensive business development assistance, and other services (including seed capital microgrants) to help them become more successful in self-employment.

The main objective of the evaluation of the SET Demonstration is to understand whether providing dislocated workers with access to intensive business development services and self-employment training increases their likelihood of reemployment, their earnings, and their propensity to start a business. The evaluation will use a rigorous experimental design in which approximately 3,000 applicants to the program in up to eight study sites are randomly assigned to a program group or a control group with equal probability. Enrollment will occur over an 18- to 24-month period2, and members of the program group will receive ongoing access to intensive business development counseling from a self-employment advisor, as well as training and other assistance related to their specific self-employment needs, free of charge to them, for up to 12 months. Program group members who achieve key program participation milestones (such as completing a business plan) will have the opportunity to apply for seed capital microgrants of up to $1,000 to help pay for inventory, equipment, licenses, or other business establishment costs. The control group will not have access to SET services during the demonstration’s 30- to 36-month implementation period and will be ineligible for the SET microgrants. Both groups will be able to seek out and make use of other self-employment services offered by existing community providers, although program group members will have such services partially subsidized through the demonstration. Impacts will be measured 18 months after randomization. An implementation study will also be conducted to provide information that will help ETA further refine the self-employment services made available to dislocated workers and other customers of the workforce system.

This package requests clearance for five data collection efforts to be conducted as part of the SET Evaluation:3

  1. An Application Package. This package includes: (1) a consent form for study enrollment, which will be administered to ensure that applicants are fully informed about the study’s goals and expectations; (2) a dislocated worker screener form, which will allow the evaluation team to determine whether applicants qualify for the demonstration under one of the WIA-defined dislocated worker categories; (3) a background information form, which will be used to obtain baseline information about applicants’ demographic characteristics and previous work experiences; (4) a business idea form requesting detailed information about the applicant’s proposed business and how it relates to his or her prior work experience; and (5) a contact information form requesting information on three relatives or friends who could help locate the applicant for follow-up data collection (only if needed). The application package forms are included as Appendix A.

  2. Program Participation Records. These data will be collected to better understand the flow of individuals through the SET program and will include selected information from the following forms: (1) participant tracking forms that will be used by the program’s self-employment advisors to record participant status and assessment at intake, the types and quantities of specific training and assistance received by program group members each month, and the business development milestones reached; (2) a seed capital request form that will be used for qualifying SET program participants to request seed capital funds to cover the costs of approved business expenses; and (3) a service termination form to be filled out by the provider in the event that a participant exits the program. Appendix B includes copies of the forms that will be used to collect program participation records.

  3. A Follow-Up Survey. The follow-up survey of study members will be administered 18 months after random assignment to gather information about their economic outcomes (Appendix C).

  4. Site Visit Interviews. Two rounds of in-person visits to each study site will provide information about implementation of the SET Demonstration program. Interviews will be conducted with SET program staff, workforce staff, and others involved in the SET demonstration program. A master protocol is included as Appendix D.

  5. Case Study Interviews. Qualitative information about the experiences of selected study participants who were assigned to the program group will be gathered through a series of case study interviews. These interviews will provide an opportunity to explore in greater depth the patterns of service utilization and self-employment trajectories among selected members of the program group. A master protocol is included as Appendix E.


1. Circumstances Necessitating the Data Collection

As of October 2011, the unemployment rate in the United States was 9 percent.4 Almost 14 million people were looking for jobs. Although not as high as during the trough of the recession that began in 2007, the unemployment rate has not fallen notably since April 2011 and remains higher than at any point over the 24-year period leading up to the recession. The economic downturn has drawn greater attention to the role of microenterprise in helping individuals return to work and improving the economy. Accordingly, ETA seeks to implement and rigorously evaluate the effectiveness of innovative strategies for promoting reemployment based on the authority granted to DOL under Title I of WIA.5

The SET Demonstration focuses specifically on self-employment as a reemployment strategy for dislocated workers. The demonstration is premised on the hypotheses that (1) self-employment could be a viable strategy for dislocated workers to become reemployed; (2) starting a small business is difficult, especially for individuals who lack business development expertise or access to start-up capital; and (3) dislocated workers might experience difficulties identifying and accessing training programs and other forms of assistance that could effectively prepare them for self-employment via the existing workforce infrastructure.

The SET Demonstration will implement a new service delivery model that seeks to better connect dislocated workers to self-employment training and related assistance. This approach differs from previous large-scale demonstration programs, which have provided mixed evidence on the effectiveness of self-employment services on earnings and employment, because the SET Demonstration will (1) offer more intensive business development services than prior demonstrations have offered, (2) concentrate on individuals who have fairly limited traditional employment prospects but are well-positioned to benefit from self-employment counseling and training, and (3) make available seed capital microgrants to those individuals who engage strongly in the program and achieve key participation milestones to help them establish their businesses. The SET Evaluation will assess the effectiveness of the SET Demonstration model.

The remainder of this section provides additional information on the context and nature of the SET Evaluation in five subsections. The first subsection provides an overview of the Federal policy environment in which the demonstration will be implemented. The second discusses prior ETA research projects that have examined the effects of self-employment service provision. The third subsection describes the main features of the program model to be implemented in the SET Demonstration. The fourth describes the criteria that will be used to select sites implementing the demonstration and the target population of dislocated workers that the demonstration seeks to serve. The fifth subsection outlines the main features of the SET Evaluation.

a. Policy Context

Federal policymakers place a high priority on increasing entrepreneurship, self-employment, and business ownership, while recognizing the difficulties that individuals face when starting a business, such as limited business development expertise and access to start-up capital. Notably, the American Recovery and Reinvestment Act of 2009 (ARRA) included $730 million for the Small Business Administration (SBA) to increase small business lending opportunities. Likewise, the Small Business Jobs Act (P.L. 111-240), signed by President Obama on September 27, 2010, sought to promote small business job creation through a variety of mechanisms, including increasing access to capital and targeted counseling and technical assistance. More recently, the Middle Class Tax Relief and Job Creation Act of 2012 (P.L. 112-96) included provisions for an expansion of state Self-Employment Assistance (SEA) programs serving dislocated workers receiving unemployment insurance (UI) benefits.

The goal of the SET Demonstration is to rigorously test whether access to a self-employment assistance package that is centered around intensive business development counseling from a self-employment advisor (described later) increases the employment, earnings, and rates of self-employment among dislocated workers. The transition to self-employment could be particularly challenging for dislocated workers because their work experience is strongly tied to previous wage and salary employment. Although some skills and substantive knowledge can be transferrable to self-employment, starting and running a small business often calls for a broader set of managerial skills (such as knowledge of finance, bookkeeping, marketing, and human resources) than what is required in many wage or salary jobs. Frontline staff members of the Federal workforce system, such as at WIA American Job Centers (AJCs) and state employment offices, also have typically focused on traditional wage and salary employment when supporting their customers in their reemployment efforts. AJC staff tend to be familiar with traditional job-search strategies and job-matching assistance or avenues for pursuing occupation-specific training, and have less experience in linking customers to providers of self-employment assistance and training. Thus, the service model implemented in this demonstration could improve the ability of the existing workforce system to leverage available self-employment services for dislocated workers interested in pursuing self-employment as a reemployment strategy.

The program model implemented in this demonstration potentially augments SEA programs, which have so far been implemented in seven states and serve a small subset of dislocated workers who are likely to exhaust their UI benefits and are interested in self-employment. Initially created as a temporary program under the North American Free Trade Agreement Implementation Act, SEA received permanent authorization in 1998. The seven states that have implemented a permanent SEA program are Delaware, Maine, Maryland, New Jersey, New York, Oregon, and Pennsylvania. California passed enabling legislation, but its SEA program is currently inactive. Eligibility for SEA programs is limited to UI recipients identified by state Worker Profiling and Reemployment Services (WPRS) systems as being likely to exhaust their regular UI benefits.


SEA programs seek to address the challenges facing such dislocated workers by providing self-employment training and technical assistance.6 However, although SEA legislation requires that participants be provided with such services, a comparison of programs found that “the intensity or extent of the provision varie[d] greatly” across SEA states (Kosanovich and Fleck 2002, p. 11). The new SEA provisions contained in the Middle Class Tax Relief and Job Creation Act allow states the option of establishing SEA programs for individuals eligible for Extended Benefits (EB) or Emergency Unemployment Compensation (EUC). The new provisions also authorize the distribution of $35 million in grants to help states implement or improve administration of SEA programs (for both claimants of regular UI and extended benefits). These funds may be used to promote the SEA programs and enroll individuals in these programs, but cannot be used to pay for training.

The SET Demonstration could provide valuable information to states seeking to establish a more intensive service model to complement their SEA programs. The evaluation will provide evidence on innovative approaches that could be used to more consistently connect dislocated workers to services that may ease the challenges they face in their transition to self-employment. Additionally, the SET Demonstration will examine a broader dislocated worker population beyond likely UI exhaustees.

b. Prior Research on Self-Employment Services

ETA has examined how entrepreneurial services affect the outcomes of individuals seeking self-employment through two waves of prior research. During the late 1980s and early 1990s, ETA funded the implementation and evaluation of two UI Self-Employment Demonstration (UISED) projects. During the first decade of the 2000s, ETA sponsored and evaluated a multistate initiative known as the Growing America Through Entrepreneurship project (Project GATE). Prominent features and results of the UISED projects and Project GATE are described in additional detail in this subsection.7

UI Self-Employment Demonstrations. UISED programs were implemented in two states: (1) the Enterprise Project in Massachusetts and (2) the Self-Employment and Enterprise Development (SEED) Project in Washington state. In both sites, UI recipients interested in self-employment were offered the opportunity to attend self-employment classes and workshops that covered a core set of business training modules after attending an orientation session and completing an application to the program. Participants were also given financial assistance similar to participants in state SEA programs. There were two notable differences between the Massachusetts and Washington UISED programs. First, the Enterprise Project restricted eligibility to UI recipients who, according to the state’s WPRS statistical model, were likely to exhaust their benefits, whereas the SEED project permitted all UI recipients to participate. Second, Washington allowed program participants who had completed certain required training activities to receive their remaining UI benefits in a lump sum that could be used as business seed capital.8 The Enterprise Project in Massachusetts did not include this provision.

Both UISED programs were evaluated using experimental designs that compared program and control group outcomes at 21 months and 32 months, on average, after random assignment. While the findings from both evaluations were generally positive, there were notable differences in the results between the states (Benus et al. 1995). In Massachusetts, the Enterprise Project increased the propensity to enter self-employment during the evaluation period by 12 percentage points—a substantial effect compared with the 50 percent rate of entry into self-employment seen in the control group. However, at the time of the final follow-up, there was no significant difference between the two groups in the percentage of individuals who remained self-employed. The Enterprise Project also increased overall earnings during the follow-up period by more than 50 percent, but the higher earnings were largely due to wage and salary jobs, not self-employment. In Washington, the rate of entry into self-employment was 22 percentage points higher in the SEED program group than in the control group—63 percent, compared with 41 percent—by the last follow-up. The SEED program also substantially increased the likelihood of remaining in self-employment through the end of the study period by 12 percentage points. Yet, the program had no effect on employment or total earnings at any point, because the increase in the rate of, and greater earnings from, self-employment were almost exactly offset by a decrease in the rate of, and earnings from, wage and salary employment.

Project GATE. Although the previous Federal demonstration projects had targeted UI recipients exclusively, Project GATE sought to use AJCs to help a broader population of interested participants gain access to self-employment training and technical assistance, including help in applying for business loans. The AJCs conducted outreach and hosted the program’s orientations. Although many of the services available to Project GATE participants were already available in the community from Small Business Development Centers (SBDCs), Women’s Business Centers (WBCs), and community-based organizations (CBOs), the demonstration project sought to augment this existing infrastructure in three important ways:

  1. Any individual who expressed an interest in self-employment and completed an application was enrolled in the demonstration.

  2. Program participants received a formal assessment after being enrolled in the program to establish services most appropriate for them. After this assessment, participants were referred to a single provider (that is, an SBDC, a WBC, or a CBO) to attend entrepreneurship classes and/or receive technical assistance.

  3. All services were provided to Project GATE participants free of charge.

Project GATE was implemented in five sites—Philadelphia, Pittsburgh, Minneapolis-St. Paul, Northeast Minnesota, and the state of Maine—from fall 2003 to summer 2005.

Project GATE was evaluated using a random assignment design in which the program group, which could receive demonstration services, was compared with a control group that could not participate in Project GATE, but could access any other self-employment services available in its communities. The evaluation found that the program had a significant, but small, impact on the rate of business ownership in the early quarters after program enrollment, but this impact eroded over time (Benus et al. 2008). By the 18-month follow-up survey, members of the program group were 3 percentage points more likely to own a business, relative to the control group (in which the rate of business ownership was 40 percent). At 18 months, the impact of GATE was slightly larger for individuals receiving UI benefits at the time of random assignment: 5 percentage points, compared with an insignificant 2-percentage-point difference between UI nonrecipients in the program and control groups. However, five years after random assignment, Project GATE participants and control group members were equally likely to own a business (Benus et al. 2009). Further, the five-year follow-up also indicated that Project GATE had no significant impact on total employment at any point during the five years after randomization and no impacts on UI benefit receipt, receipt of public assistance benefits, or household income. Finally, although the GATE program group earned less than the control group during the first six months after random assignment—presumably because they were engaged in entrepreneurial training—there was no statistically significant effect of GATE on total earnings at any other time during the remainder of the five-year evaluation period.

c. The SET Program Model

The SET Demonstration builds on the lessons from previous demonstration projects by seeking to implement an intensive business development counseling model that (1) promptly engages program group members in services, (2) periodically (re)assesses their evolving needs and helps link them to the most appropriate services, and (3) provides ongoing motivation and support to overcome obstacles and persist in efforts at self-employment (as appropriate). As described in the next subsection, the demonstration services will be targeted toward dislocated workers whose prior experience makes them likely to benefit from self-employment assistance and training. Another important component of the demonstration will be the offer of seed capital microgrants of up to $1,000 to those program participants who engage strongly with the program and achieve key participation milestones (such as completing a business plan), to help them cover important business establishment costs.

A self-employment advisor at a local partner organization will be assigned to each member of the SET program group. Similar to the GATE demonstration, this advisor will assess the participant’s initial situation, develop a customized package of training and technical assistance services tailored to the specific needs of the participant, and help connect SET participants with appropriate providers in the community. The demonstration’s program model also calls for the self-employment advisor to promptly engage SET program group members (within a week of random assignment) and develop a deeper ongoing relationship with these clients, by meeting periodically to assess the participants’ progress toward self-employment and their evolving service needs, and by facilitating linkages to appropriate self-employment assistance services. Although the package of specific self-employment supports will be customized for each participant, it is expected that a common theme among self-employment advisors will be to promote business development outcomes that are associated with self-employment success. Examples of such intermediate milestones include gaining access to startup capital, registering a business, and completing a business and/or marketing plan, as applicable.

SET program group members are likely to vary substantially in their self-employment assistance needs. Knowledgeable and experienced self-employment advisors are ideally positioned to help these participants identify and marshal the most appropriate and effective resources that are readily available in their communities. Recommended services may include microenterprise training, individualized business development counseling and technical assistance, access to mentors or peer support/networking groups, and other logistical support (such as incubator office space, discounted business services, meeting rooms, or high-speed Internet access). Hence, the demonstration’s approach aims to provide greater flexibility in developing a customized package of services for SET program group members than was feasible under Project GATE, which relied on a relatively small number of pre-specified providers and offered a core set of services. Preliminary research by the SET evaluation contractor suggests that limited staff availability and other resource constraints prevent many microenterprise development organizations (MDOs) from widely adopting an intensive business development counseling model similar to that proposed for the SET Demonstration. Thus, this approach has the potential to generate substantial differences between program and control group members in both the quantity and quality of services received.

Another important feature of the SET Demonstration is that program group members will be promptly engaged by their designated self-employment advisors after being accepted into the program. In Project GATE, initial assessments occurred, on average, 3.6 weeks after random assignment (Bellotti et al. 2006). This delay could have contributed to the 10 percent drop-off in the number of individuals receiving assessments, compared with the number of individuals who were randomized to the GATE program group. Preliminary research by the SET evaluation contractor suggests that capacity constraints might result in similarly demotivating delays for individuals seeking assistance from MDOs today. In light of these issues, the study team will link SET program group members to partner providers with adequate capacity to engage them in services promptly after random assignment.

The SET self-employment advisors will also be expected to follow up with program participants on a regular basis during a 12-month period to monitor their progress toward self-employment and periodically reassess their service needs. Such longer-term support is likely to represent an important improvement over the one-time, initial assessments used in Project GATE. As the needs of SET program participants change, self-employment advisors should be able to link them to additional self-employment services, if necessary, and help them troubleshoot difficulties that they encounter. Long-term follow-up is also expected to address the marked drop-off in service utilization over time seen in the GATE program, by promoting stronger engagement and persistence with self-employment efforts beyond the initial enrollment period.

A key component of the SET demonstration will be the offer of seed capital microgrants to qualifying program participants. That is, those SET participants who engage strongly with the program and achieve key participation milestones will have the opportunity to apply for and receive seed capital microgrants of up to $1,000 to help them cover direct costs associated with establishing and beginning to operate their businesses. Access to seed capital may be an important determinant of self-employment success for aspiring business owners, but few MDOs offer such support currently. Startup microgrants may help meet SET program participants’ immediate business capital needs while they wait for microloan applications or other funding sources to be approved. Such microgrants could help pay for start-up equipment and supplies or help defray business costs that may otherwise prevent aspiring business owners from establishing their businesses. They may also improve access to other sources of seed capital among dislocated workers who have a poor credit history and limited collateral.

The SET Evaluation contractor is in the process of exploring the feasibility of various approaches to implementing the SET Demonstration program. For example, the demonstration could rely on MDOs that already use a similar model or negotiate with existing providers to add such services and offer them exclusively to the SET program group members. The preferred approach is likely to be contingent on the providers available in those sites that ultimately are recruited to operate the demonstration and, hence, will be refined as the demonstration moves forward.

The study team is also working with state agencies to assess the feasibility of allowing program group members who are UI recipients to continue receiving their remaining UI benefits as long as they are making satisfactory progress in the SET program (as was done in the UISED programs). This provision could increase the benefits of the SET program for those participants who qualify for UI and still have benefits remaining.

d. Selecting Study Sites and Participants

The SET Demonstration will be implemented in up to eight study sites at which recruitment will target dislocated workers likely to meet the study’s eligibility criteria (described below). It is expected that up to 4,000 individuals will apply to the SET program after completing a web-based orientation session describing the program. The evaluation team will select 3,000 individuals meeting the eligibility criteria—the application process will be closed once this target is reached.9 Successful applicants will then be randomly assigned to a program group, which will have access to the services described previously, or to a control group, which will not be entitled to receive the demonstration services. Descriptions of the factors to be used to select study sites and potential participants follow, while Section A.2 describes in greater detail the respondents that will be included in each of the study’s data collection efforts.
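The equal-probability lottery described above can be sketched in a few lines of code. This is purely an illustration of the assignment logic (the function name, seed, and applicant identifiers are hypothetical; the evaluation will use its own random assignment system):

```python
import random

def assign_groups(applicant_ids, seed=None):
    """Assign each eligible applicant to the program or control group
    with equal (50/50) probability, as in the SET design."""
    rng = random.Random(seed)  # fixed seed makes the lottery reproducible
    return {aid: rng.choice(["program", "control"]) for aid in applicant_ids}

# Illustrative run: 3,000 eligible applicants, as targeted by the evaluation
assignments = assign_groups(range(3000), seed=42)
n_program = sum(1 for g in assignments.values() if g == "program")
```

With 3,000 applicants, simple random assignment yields roughly 1,500 per group; in practice an evaluator might instead use blocked randomization within sites to guarantee exact balance.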

Selecting high-demand and high-capacity sites. ETA is currently working with the contractor to identify states and local sites with sufficient demand to permit the evaluation to meet the study enrollment target and where the capacity of microenterprise providers and the workforce system will allow the demonstration to deliver a strong intervention. The evaluation team is focusing on large metropolitan statistical areas (MSAs) with the following characteristics:

  • High unemployment rates. To meet the recruitment goal of 3,000 sample members (with an average of 375 potential participants in each site), the study team will concentrate on large MSAs in which the unemployment rates are high.

  • A dislocated worker population with diverse industry experience. Choosing sites with diverse types of dislocated workers will increase the relevance of the demonstration to other states and reduce the odds that study participants will be faced with excessive competition in launching a small business.

  • A strong network of AJCs and workforce system partners. Identifying states where local sites have strong AJC and workforce system partnerships will enable the demonstration to draw effectively on existing capacity for program marketing, intake, and referrals.

  • A strong presence of MDOs. MDOs and other similar CBOs are likely to have self-employment advisors with experience delivering the intensive business development counseling services described previously. The study team is using information from the Aspen Institute’s microTracker database and other sources to identify sites with a strong MDO presence.

To support adequate implementation of the demonstration’s program activities, the evaluation contractor will provide modest compensation to MDO and workforce system partners in each study site. Local MDOs that partner with the demonstration to deliver SET services to members of the program group will be compensated for delivering such services, providing information on participant engagement and service receipt for the evaluation, and cooperating with other evaluation data collection activities, such as site visit interviews. Compensation will be provided according to the terms of memoranda of understanding negotiated between these organizations and the evaluation contractor. Partner workforce agencies and state UI agencies will also receive modest compensation to cover the costs of outreach and recruitment activities undertaken in direct support of the SET demonstration, as well as cooperation with evaluation activities. Compensation terms for these organizations will be negotiated and established via formal memoranda of understanding between such agencies and the evaluation contractor.

Selecting a study population likely to benefit from SET services. The services offered as part of the SET Demonstration will be concentrated on dislocated workers who, at baseline, already have established behaviors suggesting that they will be responsive to and benefit from self-employment training. In Project GATE, any interested individual could enter into the program, and the UISED projects enrolled all UI recipients (or likely exhaustees) who expressed an interest in self-employment. The SET Demonstration departs from this approach in recognition that self-employment might not be a realistic option for underprepared individuals. Moreover, concentrating scarce program resources on those individuals who are likely to receive some benefit from the program increases the likelihood that the SET Evaluation will detect statistically significant impacts. It also helps to avoid creating false expectations for success in self-employment among individuals likely to experience significant difficulties.

To identify dislocated workers who are likely to benefit from the program, the SET Demonstration will concentrate on applicants whose prior work experience explicitly relates to their proposed business idea. This decision is motivated, in part, by research suggesting that individuals who have prior work experience in a related occupation, business, or industry are more likely to succeed as small business owners (Fairlie and Robb 2008; Harada 2003; Baptista et al. 2007). In addition, related work experience can be an important factor in lenders’ decisions to approve business loans. For example, the SBA’s web site notes that when lending institutions assess an applicant, “experience in business as well as past achievements in your industry will be reviewed.”10

To operationalize this targeting strategy, SET program applicants will be asked to describe in detail how their proposed business idea relates to their prior work experiences. As the design of the demonstration is refined and study sites are identified, the study team will explore the feasibility of having evaluation staff, staff at the participating AJCs, or staff at the partner MDOs screen applications based on detailed guidelines for assessing this business–experience match. If local partner organizations do not have the capacity to reliably and uniformly apply the proposed screening procedures, the study team will assess applications to the SET Demonstration centrally.

The eligibility criteria for the demonstration will be explicitly described in SET publicity materials and during the online orientations that all individuals interested in applying to the program will be required to complete. At these orientations, prospective applicants will also be informed that (1) applications not meeting SET’s eligibility criteria will be screened out and (2) meeting the eligibility criteria qualifies them only for a 50 percent chance of entering the SET program, based on the outcome of the random assignment lottery. Even with the eligibility criteria made clear through the mandatory online orientations, it is assumed that up to one in four applicants could be screened out after submitting an application. Thus, achieving a total pre-randomization study enrollment of approximately 3,000 individuals implies that applications could be needed from up to 4,000 individuals.
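The application arithmetic above can be checked with a quick calculation; the 25 percent screen-out rate is the stated planning assumption, not an observed figure:

```python
# Planning arithmetic from the text: if up to one in four applications
# is screened out before random assignment (an assumed upper bound),
# reaching 3,000 enrollees requires roughly 4,000 applications.
TARGET_ENROLLMENT = 3000      # individuals reaching random assignment
MAX_SCREEN_OUT_RATE = 0.25    # assumed worst case: one in four screened out

applications_needed = TARGET_ENROLLMENT / (1 - MAX_SCREEN_OUT_RATE)
print(round(applications_needed))  # -> 4000
```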

As previously discussed, the 3,000 individuals meeting the eligibility screens will be randomly assigned to the program and control groups with equal probability. The program group will be eligible to receive services through the SET Demonstration. The control group will be ineligible for such services. Both groups will have access to other existing services available through AJCs and community providers of standard self-employment assistance and training. The evaluation team will select partner MDOs that will help support the integrity of this control-group design by providing SET services only to the members of the program group for the demonstration’s implementation period.

e. Overview of SET Evaluation

The SET Evaluation will analyze the effectiveness of the SET Demonstration and will include two major components: (1) an implementation study and (2) an impact analysis. The results of the evaluation will provide ETA with valuable information to determine the extent to which the SET program model can serve as an effective and realistic approach for helping the target population of dislocated workers become reemployed.

Implementation study. This component of the evaluation will describe: (1) the implementation of the SET Demonstration program in each of the study sites and (2) the experiences of up to 32 individuals participating in the program. Much of the analysis for the implementation study will be qualitative (see Part B of this Office of Management and Budget (OMB) package for further details) and is structured to address the following research questions:

  • What is the context in which the SET Demonstration is implemented? Documenting the community setting and existing program infrastructure is essential for understanding the potential effects of the services provided as part of the SET Demonstration. This information will also enable the evaluation team to understand the “counterfactual” against which the SET Demonstration is being tested—that is, the entrepreneurial infrastructure that would be available to program participants in the absence of the intervention.

  • What organizations participate in SET service delivery, and what are their responsibilities? Describing the characteristics and roles of workforce development partners and MDO providers is important for understanding the quality of program implementation and determining the kind of partnerships necessary for successful provision of SET services.

  • What services are offered to program group members as part of the SET Demonstration and what other services are available to them? Examining the types of services offered through the SET program and how these differ from or complement microenterprise development and other services available to program participants will make it possible to determine whether the demonstration improves access to services. Of particular interest is whether SET providers are able to conduct ongoing follow-up with program group members, offer more customized technical assistance, and facilitate or provide increased access to capital.

  • How well was the SET program implemented, and how did implementation vary across sites? Understanding any challenges in implementing the SET program model will provide ETA with information on the feasibility of scaling up the demonstration and/or lessons for how access to self-employment services might be improved in the future within the context of the Federal workforce system. Variation across sites provides information on contextual factors that might influence implementation. This variation can also help understand differences in the demonstration’s impacts, as discussed in later sections.

  • What are the characteristics of the SET Demonstration study population? Learning about the baseline demographic and economic characteristics of applicants who met the study’s eligibility screens will help us understand which segments of the dislocated worker population were enrolled in the demonstration. This could provide ETA with insight on the extent to which the demonstration’s impacts have applicability in the broader customer base of the WIA-funded AJC system. In addition, comparing the characteristics of program participants with control group members will be essential for confirming the validity of the evaluation’s random assignment procedures.

  • What were the experiences of selected SET Demonstration program group members? This component of the implementation study will seek to understand the experiences of SET program participants and the responses of selected participants to the SET program model. Of particular interest is whether SET provides customized guidance and intensive follow-up and whether the program helps address perceived barriers to starting a business, such as difficulties accessing startup capital and/or lack of technical expertise. In order to make efficient use of scarce resources, the study team will seek to gather additional detail about the experiences of both program participants who succeed in achieving key self-employment milestones and/or becoming self-employed and program participants who fail to do so. Examining differences in program experiences among participants who persist in their self-employment efforts and successfully establish a business, as well as those who do not and decide to focus on wage and salary employment, can shed light on the conditions and ways in which the demonstration’s services meet or fail to meet participants’ needs. Participants’ experiences can also suggest potential gaps in the overall service delivery infrastructure available to aspiring or nascent business owners, including SET demonstration services. For those who choose to focus on wage and salary employment, it will be useful to understand the role that SET had, if any, in their decision to return to wage/salary employment (or to combine self-employment and wage/salary employment) and on the industry or occupation of the reemployed participant.

As shown in Table A.1, the implementation study will rely primarily on site visits and case study interviews to obtain the data needed to answer these research questions, with additional information coming from the SET Demonstration’s application forms, program participation records, and the follow-up survey. Each of these data sources is described in greater detail in Section A.2.

Table A.1. Research Questions for Implementation Study by Data Source

                                                                      Data Source
Research Question                                                AP   PPR   FS   SV   CS
1. What is the context in which the SET Demonstration
   is implemented?                                                               X
2. What organizations participate in SET service delivery
   and what are their responsibilities?                                          X
3. What services are offered to SET program group members
   and what are the other services available to them?                 X          X
4. How well was the program implemented and how did
   implementation vary across sites?                                  X          X
5. What are the characteristics of the SET Demonstration
   study population?                                             X
6. What were the experiences of SET participants with
   the program?                                                  X    X     X    X    X

Note: AP = Application Package; PPR = Program Participation Records; FS = Follow-Up Survey; SV = Site Visit Interviews; CS = Case Study Interviews.

Impact analysis. To rigorously estimate impacts, a randomized design will be used to compare the outcomes of approximately 1,500 program group members with the outcomes of approximately 1,500 control group members. Random assignment will enable the evaluation to obtain causal evidence on the effects of SET Demonstration services, relative to what might be obtained by members of the target population from existing community providers only. Additional information on the statistical methods used to estimate impacts and assess their statistical significance is presented in Part B of this package.
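As a minimal illustration (not the evaluation contractor's actual assignment procedure), equal-probability random assignment of an eligible applicant pool can be sketched as a shuffle-and-split:

```python
import random

def assign_with_equal_probability(applicant_ids, seed=20130201):
    """Shuffle the eligible applicants and split the list in half,
    giving each person a 50 percent chance of the program group.
    The fixed seed exists only to make this sketch reproducible."""
    rng = random.Random(seed)
    ids = list(applicant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"program": set(ids[:half]), "control": set(ids[half:])}

groups = assign_with_equal_probability(range(3000))
print(len(groups["program"]), len(groups["control"]))  # 1500 1500
```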

The impact analysis will address the following research questions:

  • What is the net impact of the SET Demonstration program on participants’ overall employment status and total earnings? In light of the goals of ETA for the SET Demonstration, as well as findings from the UISED demonstrations, two of the three primary outcome measures for the evaluation will be employment in any kind of job and total earnings. Findings on these outcomes will be used to summarize the overall effectiveness of the program in achieving its goal of improving the reemployment prospects of the dislocated workers served by the demonstration. The evaluation will also consider outcomes related specifically to self-employment and wage or salary employment separately. To augment the analysis of earnings, the analysis will examine job quality measures such as fringe benefits and availability of health insurance.

  • Does the SET Demonstration increase the likelihood of self-employment? The evaluation will also consider the effects of the demonstration on participants’ likelihood of becoming self-employed and their likelihood of remaining self-employed through the end of the evaluation’s follow-up period. The latter measure is the third primary study outcome.

  • Does the SET program improve intermediate business development outcomes? In order to better understand the channels through which the SET program operates, the evaluation will consider how effectively it encourages participants to take steps associated with self-employment success. The study will specifically consider intermediate milestones such as whether participants were able to gain access to startup capital, register their businesses, and develop and complete a business and/or marketing plan.

  • How does participation in the SET Demonstration affect economic well-being and participation in other programs? It is also of interest to know the demonstration’s impacts on the economic circumstances of program participants. Accordingly, the impact analysis will examine total household income, measures of financial hardship, receipt of UI benefits, receipt of other forms of public assistance (including assistance from the Temporary Assistance for Needy Families [TANF] program and the Supplemental Nutrition Assistance Program [SNAP]), and receipt of government-sponsored job training and supportive services.

  • Do program impacts differ for subgroups of participants defined by baseline characteristics? In addition to assessing whether the SET Demonstration worked, the evaluation seeks to shed light on the groups of individuals for whom the program has the greatest impacts. Accordingly, subgroup analyses will be conducted based on characteristics such as age, gender, race/ethnicity, education level, and previous work experience, industry, and occupation. Subgroups will also be formed based on psychological traits, such as risk tolerance, openness to new experiences, and perceptions of autonomy, which have been associated with entrepreneurial success by previous research (Caliendo et al. 2010, 2011; Evans and Leighton 1989). Receipt of UI benefits through state UI systems, the EB program, and/or the EUC program is of particular interest because the GATE evaluation provided evidence suggesting that UI recipients might have received early benefits from participating in GATE. Differentiating among UI recipients according to factors associated with their likelihood of exhaustion could also inform states’ use of WPRS models to identify candidates for SEA programs.

  • Through what programmatic mechanisms might the SET Demonstration affect participant outcomes? It is important to know the extent to which individuals actually receive intensive business development counseling from a self-employment advisor, because this represents the key channel through which all other study outcomes are hypothesized to be affected. The analysis of this question will also examine the effects of participating in the SET Demonstration on the receipt of services at existing community providers. Drawing on the results of the implementation study, the study team will also determine whether impacts vary for participants in states and sites with different contextual or programmatic features.

The data used to answer these questions will primarily be derived from the demonstration’s application materials, program participation records, and follow-up survey, which are described in detail in Section A.2. Additional information will come from the site visits and case study interviews conducted for the implementation study. Table A.2 displays how these various data sources map to the research questions for the impact analysis.

Table A.2. Research Questions for Impact Analysis by Data Source

                                                                      Data Source
Research Question                                                AP   PPR   FS   SV   CS
1. What is the net impact of the SET Demonstration program
   on participants’ overall employment status and total
   earnings?                                                                X
2. Does the SET Demonstration increase the likelihood of
   self-employment?                                                         X
3. Does the SET program improve intermediate business
   development outcomes?                                              X     X
4. How does participation in the SET Demonstration affect
   economic well-being and participation in other programs?      X          X
5. Do program impacts differ for subgroups of participants
   defined by baseline characteristics?                          X          X
6. Through what programmatic mechanisms might the SET
   Demonstration’s program influence participant outcomes?                  X    X    X

Note: AP = Application Package; PPR = Program Participation Records; FS = Follow-Up Survey; SV = Site Visit Interviews; CS = Case Study Interviews.

2. How, by Whom, and for What Purpose the Information Is to Be Used

Clearance is being requested for five data collection efforts: (1) collecting applications to the SET Demonstration; (2) gathering program participation records from SET partner providers; (3) administering a follow-up survey; (4) collecting implementation data through interviews conducted with SET program staff and other providers during in-person visits to the study sites; and (5) conducting case study interviews with selected SET program participants. Each of these efforts is described in more detail in the following subsections. Figure A.1 summarizes visually the respondent groups affected by the evaluation’s data collection efforts and gives an overview of the associated burden.11

a. Application Package

Individuals interested in the SET Demonstration will receive detailed information about the program and associated evaluation during mandatory online orientation sessions to be hosted by the evaluation contractor and publicized by AJCs in each of the participating study sites. At these orientations, the program’s eligibility criteria and study participation requirements (for example, consenting to random assignment) will be carefully described. At the end of the orientations, individuals interested in applying to the SET Demonstration will be able to request a hard copy of the application (for their reference) from AJC staff and will be given directions to access the secure website hosting the online application (Appendix A). Thus, prospective applicants will have an opportunity to assess their likelihood of qualifying for the program’s services and choose whether to complete and submit the application package. As noted earlier, it is expected that approximately 4,000 applications will be collected to achieve the study’s target enrollment of 3,000 before random assignment. The application package has five components: (1) a consent form, (2) a dislocated worker screener, (3) a background information form, (4) a business idea form, and (5) a contact information form.

Figure A.1. Respondents Affected by SET Demonstration Data Collection Efforts

The consent form for the SET Demonstration explains the process of random assignment and that participation in the study is voluntary. The form also informs potential participants about the data that the study team will collect about them and notifies them that they will be required to provide adequate proof of their dislocated worker status to SET providers before being permitted to access program services.12 Applicants are also advised that they have the option to withdraw from the study at any point, and the form provides information on how to do so. By digitally signing the consent form using his or her Social Security number (SSN), each applicant acknowledges having read this study information and agrees to participate in the study.

The dislocated worker screener form will be administered to ensure that applicants qualify for the demonstration under one of the WIA-defined categories of dislocated worker. Applicants will be reminded that they will need to provide documentation supporting their responses to this form to SET providers prior to service receipt. Dislocated worker categories will be clearly explained to potential applicants in mandatory online orientations. Applicants will also be asked on this form whether they are a veteran or an “eligible spouse” of a veteran based on the definitions given in the Jobs for Veterans Act of 2002 (JVA). This information will be used to ensure that study enrollment procedures comply with the “priority of service” rule laid out in the JVA (38 U.S.C. 4215). The form also requests information about current military service of the applicant and spouse because DOL has determined that WIA eligibility criteria shall be applied differently for such individuals (Training and Employment Guidance Letter No. 22-04). The study team will also consider forming subgroups for analysis based on these dislocated worker categories.

The third component of the SET application package will be a background information form, which will collect information in four broad areas:

  1. Identifying and locating information. Identifying information includes the applicant’s complete name, sex, date of birth, mailing address, and Social Security number. This information is needed to ensure that each sample member is randomly assigned only once. This personal information can be combined with information on sample members’ telephone number(s), email address(es), and contact information for up to three relatives or friends who might know how to contact the member to yield locating information, which will facilitate locating study participants for the follow-up survey. Accurate and detailed locating information is essential for achieving high survey response rates.

  2. Demographic and socioeconomic characteristics. The intake form asks for information about the sample member’s demographic characteristics, including race/ethnicity, country of birth, languages spoken at home, marital status, and number of children. It also asks about educational attainment and whether the applicant has a disability. In addition, applicants will be asked to provide information on their household income; basic asset holdings and outstanding debts; credit history; and receipt of public benefits, such as welfare payments, Supplemental Security Income (SSI), food stamps or SNAP benefits, and the earned income tax credit (EITC). Intake data on the characteristics of sample members will be valuable for the study team to (1) describe the populations that apply for and are served by the demonstration, (2) define subgroups of interest for the impact analysis, and (3) monitor the random assignment process and verify that it has been implemented properly. Such data can also help improve the precision of impact estimates and adjust for potential bias that might arise from survey nonresponse, as described in greater detail in Part B of this clearance package.

  3. Psychological traits. Previous research has shown that individual attitudes and psychological characteristics can be an important predictor of entrepreneurial success. Specific dimensions that have been analyzed in the literature include (1) the “Big 5” personality traits, consisting of openness to new experiences, conscientiousness, extraversion, agreeableness, and neuroticism (Caliendo et al. 2011); (2) the “locus of control,” which measures perceptions of autonomy (Evans and Leighton 1989); and (3) tolerance for risk (Caliendo et al. 2010). The baseline form will include questions measuring each of these traits based on questionnaires that have well-established psychometric evidence of reliability. The resulting variables will be used to describe the study population and form subgroups for the impact analysis.

  4. History of self-employment and other work experience. The intake form will obtain information about applicants’ previous employment experiences. Data on this experience in both self-employment and wage and salary employment can be used by the study team to better assess the extent to which an applicant’s recent work experience is related to his or her business idea. In addition, this information will be used, like demographic and socioeconomic data, to (1) describe the populations served by the demonstration, (2) define subgroups of interest for the impact analysis, and (3) monitor the random assignment process and verify that it has been implemented properly. Information about collection of UI benefits (through regular state UI programs, the EB program, and/or the EUC program) will be used for similar purposes; UI recipients could be a subgroup of particular interest for this study in light of the findings from earlier ETA-funded self-employment demonstration projects described earlier. Work and UI collection histories will also be used in conjunction with information on the date of the separation from the last wage or salary job and the reason for the separation to determine eligibility for the study based on the criteria ETA has established for the dislocated workers and similar individuals for whom the demonstration has been designed.

The fourth element of the SET application package will be a business idea form that asks for details about the business that the applicant proposes to develop. Information will be collected about the applicant’s plans for the business and whether a formal business plan has been developed, which will be useful in describing the sample, for forming subgroups in the impact analysis based on the extent of development of the business idea at the time of application to the program, and for comparing the characteristics of members of the program and control groups. Applicants will be asked to provide a detailed description of how the proposed business idea relates to their prior work experiences; this information will be used to screen applications as described in Section A.1.d.

The fifth and final element of the SET application package is a contact information form. This form requests contact information for three friends or relatives who do not live with the applicant but are likely to know the applicant’s whereabouts. This information will be used to aid in locating study participants only if the evaluation contractor is unable to reach the participant directly.

b. Program Participation Records

Program participation records will be collected using three forms: (1) participant tracking forms that describe participant status and assessment at intake, the types and quantities of training and other assistance received by program group members, and participant progress towards reaching business development milestones, (2) a seed capital request form, which eligible participants will submit when requesting funds to cover qualifying business expenses, and (3) service termination forms, which will be filled out by SET providers in the event that a participant exits the program.

Self-employment advisors at the demonstration’s MDO partner providers will periodically complete participant tracking forms that they will submit electronically to the evaluation contractor as part of their regular operations while carrying out the SET program. At the outset, they will use these forms to provide the following information for each program group member: (1) when intake was conducted, (2) whether the individual was able to furnish adequate documentation proving their dislocated worker status, (3) the individual’s status in and/or readiness for establishing a business, and (4) the service plan recommended. Subsequently, advisors will draw on these forms to track on a monthly basis: (5) the number of times the individual was contacted or the reason(s) why contact did not occur, (6) the mode of each contact, and (7) the number of hours that the individual participated in classroom training, technical assistance, and peer support groups. They will also report on the individual’s progress towards completing various business development milestones. This information will be used by the evaluation contractor to ensure that program group members are receiving adequate support. Data on service receipt will also be used in the implementation study to better understand the patterns of self-employment assistance receipt in the program group. When a participant terminates participation in the SET program, advisors will indicate the service termination date on this form in addition to completing a service termination form (see below).
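As an illustration only, the monthly tracking data described above could be represented as a simple record structure; the field names below are hypothetical, not the actual form layout:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MonthlyTrackingRecord:
    """One month of advisor-reported data for one program group member
    (hypothetical sketch of the tracking form's contents)."""
    participant_id: str
    month: str                                              # e.g. "2013-05"
    contacts: int = 0                                       # number of advisor contacts
    contact_modes: List[str] = field(default_factory=list)  # e.g. ["phone", "in-person"]
    no_contact_reason: Optional[str] = None                 # recorded when contacts == 0
    hours_classroom_training: float = 0.0
    hours_technical_assistance: float = 0.0
    hours_peer_support: float = 0.0
    milestones_completed: List[str] = field(default_factory=list)

record = MonthlyTrackingRecord("P-0001", "2013-05", contacts=2,
                               contact_modes=["phone", "in-person"],
                               hours_classroom_training=4.0)
```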

Qualifying SET program participants will submit a written seed capital request form to their designated self-employment advisors to apply for microgrant funds. In their requests, applicants will describe (1) the amount of funds requested (not to exceed a cap of $1,000 per participant for the life of the program); (2) the purpose for which they will use the funds (for example, inventory, equipment, licensing, or other costs); and (3) how the request relates to the business being developed. The designated self-employment advisor will review the participant’s request and, if approved, forward it to the evaluation contractor via email or fax. This information will be used by the evaluation contractor to ensure that seed capital funds are disbursed to qualifying individuals for appropriate purposes. Data from the seed capital request forms will also be used in the implementation study to examine the types of business establishment investments made by SET program participants.
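The $1,000 lifetime cap lends itself to a simple check. The sketch below is hypothetical, not the evaluation contractor's actual disbursement logic:

```python
LIFETIME_CAP = 1000  # per-participant seed capital cap, from the program rules

def approvable_amount(already_disbursed, requested):
    """Largest portion of a request that can be approved without
    pushing the participant's lifetime total past the cap."""
    remaining = max(0, LIFETIME_CAP - already_disbursed)
    return min(requested, remaining)

print(approvable_amount(0, 600))    # -> 600
print(approvable_amount(600, 600))  # -> 400 (only $400 of cap remains)
```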

Service termination forms will be filled out by SET advisors in cases in which an advisor and a program group member determine jointly (before the end of the one-year SET program period) that self-employment is no longer the best avenue for the participant. In such cases, the self-employment advisor is expected to refer the participant back to an AJC for assistance searching for wage/salary employment. The forms include the reason for service termination, will be signed by the participant, and will be securely transmitted to the evaluation contractor. The forms permit advisors to provide an explanation for instances in which they are unable to obtain a signature from the participant despite their best efforts. The evaluation contractor will use the forms while the demonstration is running to ensure that MDO partner providers are not unilaterally terminating program group members and are providing adequate support and engagement before terminating SET services. Information from the service termination forms might also be used in the implementation study to shed light on the characteristics and experiences of individuals who are not able to persist in self-employment.

c. Follow-Up Survey

For both program and control group members, the follow-up survey (Appendix C) will provide data on study participants’ outcomes 18 months after random assignment. An advance letter (Appendix F) will be mailed to study members shortly before fielding of the survey begins; it will describe the content of the follow-up survey and the average administration time and explain how to access the web-based instrument. Based on prior experience conducting similar surveys, the response rate is expected to be about 80 percent (that is, approximately 2,400 SET study participants are expected to complete the follow-up survey).
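The expected number of completed surveys follows directly from the assumed response rate:

```python
STUDY_SAMPLE = 3000            # program and control group members combined
ASSUMED_RESPONSE_RATE = 0.80   # based on prior experience with similar surveys

expected_completes = round(STUDY_SAMPLE * ASSUMED_RESPONSE_RATE)
print(expected_completes)  # -> 2400
```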

Descriptions of the major content areas of the SET follow-up survey follow. Unless otherwise noted, the information collected in the follow-up survey will be collected for use by the study team as an outcome in the impact analysis. Table A.3 describes the data elements included in the follow-up survey and how they relate to information available from the application package and program participation records.

Screener. Before starting the survey, respondents will complete a screening section that seeks to verify their date of birth and the last four digits of their SSN. This ensures that the follow-up survey is completed only by individuals who went through random assignment.
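A check of the kind described might look like the following sketch; the function and field names are illustrative, not the survey system's actual interface:

```python
def screener_passes(entered_dob, entered_ssn4, enrollment_record):
    """Admit a respondent to the survey only when both the date of
    birth and the last four SSN digits match the enrollment record
    captured at random assignment (hypothetical field names)."""
    return (entered_dob == enrollment_record["dob"]
            and entered_ssn4 == enrollment_record["ssn_last4"])

enrollment_record = {"dob": "1970-03-15", "ssn_last4": "6789"}
print(screener_passes("1970-03-15", "6789", enrollment_record))  # True
print(screener_passes("1970-03-15", "0000", enrollment_record))  # False
```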

Section A: Current employment status. At the start of the survey, respondents will be asked whether they are currently self-employed and/or employed in a wage or salary job. Individuals with both forms of employment will be asked to indicate which form of employment they consider to be their primary work activity. Individuals who are not employed will be asked to describe any other recent work-related activities. This information will be used to construct a measure of respondents’ employment status at the time of the survey, which is one of the study’s primary outcomes, as well as a more general measure of labor force participation status that can be compared with existing data from the Current Population Survey.

Section B: Receipt of self-employment services. This section asks about self-employment assistance services accessed since random assignment. Sample members will answer a series of questions about their receipt of such services as intensive business development counseling from a self-employment advisor, entrepreneurial classes, one-on-one technical assistance, peer support, mentoring, and assistance accessing loans. The survey also asks questions that quantify the amount of these services each sample member received, identify the organization that provided key services, and assess the sample member’s overall satisfaction with the services and organizations. Sample members who did not access any self-employment services will be asked to specify their reasons for not doing so, which will supplement the information collected in the implementation study on the appropriateness of the SET program model. Finally, respondents will be asked about the extent to which the services they received addressed specific topics that are potentially important for aspiring small business owners, including achieving three milestones (completing a business plan, completing a marketing plan, and obtaining start-up financing) that are expected to be associated with entrepreneurial success.

Section C: Self-employment experiences. This section of the survey seeks information about the sample member’s self-employment and business development experiences since random assignment. It will include questions about the number of self-employment ventures attempted since

Table A.3. Data Elements in the Follow-Up Survey, by Content Area and Availability of Similar Measures in Other SET Data Sources

For each data element listed below, Table A.3 also indicates whether a similar baseline measure is available in the application package and whether a similar measure is available in program participation records(a). Lettered markers refer to the footnotes following the table.

Screening
  • Date of birth
  • Last four digits of Social Security number

Current Employment Status
  • Self-employment status
  • Wage and salary (W/S) employment status
  • Primary work activity (if both self-employed and employed in W/S job)
  • Reason for not working (if not employed)

Self-Employment Assistance Services
  For specific types of services:
  • Whether received any at all
  • Quantity received
  • Organizations providing services(b)
  • Out-of-pocket expenses
  • Satisfaction level(b)
  • Unmet needs
  For all types of services:
  • Reason for not participating in any services (if applicable)
  • Business development topics addressed
  • Overall satisfaction

Business Development Milestones
  • Started or updated a business plan
  • Started or updated a marketing plan
  • Applied for/received a business loan
  • Applied for/received a start-up grant(c)
  • Received assistance in achieving milestone

Self-Employment Experiences
  • Ever tried to start a business
  • Ever self-employed(d)
  • Number of business ventures started(d)(e)
  • Self-employment earnings over previous 12 months
  For current/most recent business venture:
  • Industry
  • Start and end dates
  • Reason(s) for ceasing operations and subsequent work trajectory (if applicable)
  • Whether business is incorporated(f)
  • Adoption of selected business development strategies and tools
  • Typical hours worked
  • Monthly sales and expenses
  • Whether received monetary compensation from business
  • Amount, nature, and frequency of compensation received
  • Number of employees
  • Ownership structure(f)
  • Sources and amounts of financial capital invested in the business
  • Challenges faced in attempting to become self-employed

W/S Employment Experience
  • Ever had a job(d)
  • Number of jobs currently held
  • Earnings over previous 12 months
  For current/most recent job:
  • Industry and occupation
  • Start and end dates
  • Reason(s) for leaving job (if applicable)
  • Subsequent work trajectory (if applicable)
  • Typical hours worked
  • Rate of pay

Job Satisfaction, Fringe Benefits, Health Insurance, and Program Participation
  • Level of satisfaction with current employment situation
  • Current availability of employment-related fringe benefits
  Health insurance:
  • Current availability and source
  • Gaps since random assignment
  Unemployment insurance (UI) benefits:
  • Number of weeks collected since random assignment
  • Current receipt
  • Whether exhausted all available benefits
  Program participation:
  • Received Trade Adjustment Assistance (TAA) benefits
  • Received job placement or career counseling services from AJC or state labor exchange
  • Received on-the-job training
  • Received adult basic education
  • Received supportive services, including assistance with child care or transportation

Household Composition, Marital Status, Income, and Economic Hardships
  Household composition and marital status:
  • Number of adults in household
  • Number of children in household
  • Current marital status
  Household income over last 12 months:
  • Total income
  • Number of other adults who worked for pay
  • Receipt of transfer income through selected government programs (e.g., welfare and food stamps)
  Economic hardships:
  • Missed or been late on mortgage payment or rent
  • Received a notice of mortgage default
  • Experienced foreclosure or eviction
  • Had utilities disconnected
  • Been at least 60 days delinquent on a monthly credit payment
  • Been charged late fees on monthly credit payments
  • Been required by a court order or lawsuit to make payments to a creditor
  • Declared bankruptcy
  • Delayed getting medical care
  • Visited emergency room

Contact Information



Note: Items from the follow-up survey other than screening and contact information will be used to measure outcomes for the impact analysis. Baseline measures from the application package will be used in the impact analysis to form subgroups, to construct covariates to be included in multivariate regressions, and to track changes in outcomes over time. Selected baseline measures will also be used to screen applications, to describe the characteristics of the SET study population, and to conduct survey nonresponse analyses. Program participation measures will be used in the implementation study to provide quantitative information about the experiences of program participants.

(a) Program participation records will only be available for members of the treatment group and might only cover a subset of their self-employment service experiences.

(b) Information about the organizations providing services and satisfaction levels will only be collected for periodic meetings with a designated self-employment advisor, the main innovation of the SET service model.

(c) Program participation records will focus on applications for and receipt of seed capital microgrants available through the SET program.

(d) For these items, the baseline information form uses a five-year window prior to application.

(e) Business ventures include owning a business and undertaking other self-employment business activity.

(f) The items are captured on the business idea form for the proposed business that the applicant wishes to start or expand through the SET demonstration.



applying to the SET Demonstration and earnings from self-employment over the previous year. For individuals who started a business or other self-employment venture, a series of detailed questions will be asked about the most recent business or venture, including its location and industry, its period of operation, whether it was formally registered, and the use of specific business practices that could be associated with long-run success, such as writing a business plan and/or a marketing plan. Respondents will also be asked about the hours they devoted to that business, the sources of financing and seed capital used to start it, its structure and ownership, the earnings they received as the owner, and, if applicable, any income they received from selling it. Each of these outcomes is interesting in its own right, and they may be used together to form a measure of the owner’s returns from business ownership. A measure of business profitability will be constructed to gauge the success of the business based on questions covering sales, expenses, and payroll amounts. To better understand trajectories out of self-employment, individuals who operated a business after the date of random assignment but were no longer doing so at the time of the follow-up survey will be asked why they ceased operations and what their major activity was afterward. Finally, to complement the findings from the implementation study, all individuals who have ever tried to become self-employed will be asked about the challenges they faced in their endeavors.
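The profitability measure mentioned above, derived from the survey's sales, expense, and payroll items, can be illustrated with a minimal sketch. The function name and its handling of missing items are hypothetical assumptions, not the evaluation's specified construction.

```python
def monthly_profit(sales, expenses, payroll=0.0):
    """Owner's monthly profit: sales minus operating expenses and payroll.

    Returns None when a component is missing (item nonresponse), so that
    incomplete reports are not mistaken for zero profit.
    """
    if sales is None or expenses is None or payroll is None:
        return None
    return sales - expenses - payroll
```

Treating item nonresponse as missing rather than zero matters here, because a business with unreported expenses would otherwise appear spuriously profitable.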

Section D: Wage and salary employment. The survey will gather information about whether the respondent was employed in any wage or salary job since random assignment. Data collected about wage or salary earnings over the 12 months before the survey will be used together with comparable information about self-employment earnings to construct measures of total earnings. This section of the survey will also ask for additional details about the current or most recent wage or salary job, such as the start date, the industry and occupation of employment, the number of hours worked per week, the number of weeks worked over the prior year, the pay rate, and, if applicable, the end date and reason for separation. This information will enable the evaluation to consider the extent to which receipt of demonstration services altered participants’ patterns of employment in the traditional wage and salary sector.

Section E: Job satisfaction, fringe benefits, health insurance, and UI receipt. Respondents will be asked to assess their satisfaction with their current employment situation at the time of the survey, which will provide qualitative information about the overall labor market impact of the SET Demonstration. Information about the availability of fringe benefits such as paid leave; health, life, and disability insurance; and pension benefits will be used to characterize the effects of the demonstration on the quality of employment. Sample members will also be asked whether they are currently covered by health insurance. In addition, this section will collect information about the duration of UI benefit receipt, as well as the amount of compensation received. Finally, the survey assesses whether respondents received services through other government-sponsored workforce programs, such as Trade Adjustment Assistance (TAA) benefits, job placement and career counseling services, job training, and supportive services. Gathering information about these outcomes for both the program and control groups will allow for a clearer understanding of the counterfactual condition and the extent to which SET services might crowd out or complement other services.

Section F: Household composition, marital status, income, and economic hardships. To learn about the demonstration’s impacts on general economic well-being, the survey will include a series of questions about household income and public assistance from sources such as SSI, TANF, and SNAP. Information collected about household composition can be used in conjunction with income data to create a measure of poverty status that accounts for the number of dependents. The evaluation will also consider the impact of participation in the demonstration on indicators of financial distress using data collected in this section of the survey, for example, whether respondents experienced delinquencies on credit, mortgage, and rent payments; foreclosures and evictions; personal bankruptcy; and delays obtaining medical care.
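As one illustration of how income and household composition data could be combined into a poverty indicator, the sketch below compares annual household income to the 2013 HHS poverty guideline for the 48 contiguous states ($11,490 for a one-person household plus $4,020 per additional person). This is a hypothetical construction; the evaluation's actual poverty measure and thresholds may differ.

```python
def below_poverty(household_income, household_size):
    """Flag whether annual household income falls below the 2013 HHS
    poverty guideline for the given household size (48 contiguous states)."""
    threshold = 11490 + 4020 * (household_size - 1)
    return household_income < threshold
```

Because the threshold scales with household size, two households with identical incomes can fall on opposite sides of the poverty line, which is why the composition items are collected alongside income.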

Section G: Updated contact information. Respondents will be asked to check and, if necessary, revise the contact information that was collected at baseline so that the data collection team can seek clarification about the respondents’ answers, as needed, and ensure proper delivery of incentive payments (discussed in Section A.9). Collecting updated contact information is also important to facilitate further follow-up of the SET Demonstration participants should ETA sponsor such an effort. Although the current evaluation plans specify a follow-up of study participants to be conducted 18 months after random assignment, previous studies of self-employment programs clearly illustrate the importance of longer-term tracking of outcomes for these types of interventions (Benus et al. 1995, 2009). The evaluation’s follow-up survey has been designed to facilitate the collection of comparable information and estimation of impacts on key outcomes at later points in time if longer-term follow-up is conducted.13

d. Site Visit Interviews

Two rounds of site visits will be conducted to provide data for the implementation study. The purposes of these visits are to examine fidelity to the program design, to determine the viability of offering the intensive services specified as part of the SET Demonstration, and to help capture variation in implementation across the study sites and over time. Site visits will focus on gauging program implementers’ actions, performance, and perceptions, rather than customer experiences and outcomes. (The latter will be captured primarily through the follow-up survey and case study interviews.) The first round of site visits, to occur within the first 6 to 9 months of program implementation, will provide opportunities for troubleshooting and for assisting sites to improve SET implementation. The second round of visits, to occur after the demonstration program has been operating for 15 to 18 months, will provide an opportunity to assess steady-state implementation. During the site visits, experienced members of the study team will use semistructured protocols (Appendix D) to systematically gather information from (1) self-employment advisors; (2) administrators and case managers at participating AJCs; and (3) additional staff at institutions providing support for the SET program, such as support staff at the demonstration’s MDO partner providers and field staff at other CBOs, microfinance providers, and community banks.

The site visits are structured to collect information on the following topics:

  • Context in which the SET Demonstration is implemented. Contextual factors that will be explored include demographic, socioeconomic, and geographic characteristics of the communities in which the demonstration is implemented. In addition, the site visitors will gather information about the existing microenterprise development infrastructure (including SBDCs and MDOs), how various existing service providers coordinate with one another, the extent to which microenterprise development services are utilized, and the local economic environment. Information on these contextual factors is necessary for understanding the overall supply of self-employment services in the study sites while the demonstration is implemented, as well as the factors shaping demand for such services. This information will provide insight into the entrepreneurial infrastructure that would be available to program participants in the absence of the intervention.

  • Program management. The site visitors will gather information about the managerial roles played by the workforce development systems, MDOs, and SBDCs delivering SET Demonstration services. This information will enable the study team to learn about how the program is structured and about variations in program management arrangements across sites. It will also highlight the partnerships that could be important to sustaining the SET Demonstration’s program model.

  • Program implementation. The site visits will focus on how the demonstration model is implemented in practice by examining whom the program serves and how program staff conduct outreach and direct potential clients to the SET Demonstration. This information will be used to examine the uptake of SET Demonstration services. In addition, the site visits will investigate how partner providers assess needs, conduct follow-up, and connect program participants to specific entrepreneurial services such as business development counseling, training, technical assistance, and support for accessing capital. This information will help elucidate the degree to which sites implemented the program as planned and will be used to explore whether variations in implementation fidelity correlate with differences in participants’ outcomes. Finally, site visit protocols explore reasons for variations in fidelity of implementation and lessons learned in implementing the program. This information will be used to determine the viability, strengths, and weaknesses of the program design and to provide guidance for a potential future scale-up of the SET Demonstration model or a similar approach to entrepreneurial support.

  • Community outcomes. During site visits, the evaluation team will investigate staff perceptions of the effect of the SET Demonstration on access to, and the quality of, microenterprise services; whether there are any consequences of the demonstration for the use of AJC services; and whether businesses created by demonstration participants provide goods and services not previously available in the community. This information will help to identify potentially relevant system-level effects that are unlikely to be captured by the individual-level data collection efforts. Site visits will also investigate whether there were other major changes in the community during the SET Demonstration (such as economic development initiatives and fluctuations in the level of economic activity) to gain insight into contextual factors that may magnify or dampen the effects of the program.

Data on these topics will be obtained through semistructured interviews during two rounds of site visits, one conducted during the early stages of implementation and the other when the program has reached a steady state of operations.

e. Case Study Interviews

To provide richer portraits of SET participants’ experiences, case study interviews will be conducted with four members of the program group in each study site (32 in total). Case study interview protocols (Appendix E) will be used to conduct in-depth telephone interviews with members of the program group. These study participants will consist of a mix of those with successful and unsuccessful self-employment outcomes in each site, with success defined as establishing a business and/or becoming self-employed.

The participants selected for this evaluation component will represent a purposively selected sample of SET treatment group members. Based on the program participation records submitted by SET service providers in each study site, evaluation staff will select a mix of treatment group members with different patterns of service receipt and overall program participation and invite them to participate in these in-depth interviews. Case study participants will include both individuals who engage strongly with the SET program and reach important participation milestones (such as completing a business plan and/or establishing a business) and individuals who decide not to pursue self-employment and focus instead on wage or salary employment. These same factors will be used to select replacements if any of the initially selected participants decline to be interviewed.

The case study interviews will cover the following broad areas:

  • Participants’ backgrounds. Case study interviews will gather qualitative information about participants’ reasons for applying to the demonstration, the self-employment venture they pursued, and how their prior work experiences helped or failed to prepare them for the business pursued. This information will shed light on factors that could be related to self-employment success or failure.

  • Participants’ experiences with the SET Demonstration program. Interviewers will seek to collect rich information about the services received as part of the SET Demonstration. Particular attention will be paid to gathering detailed information about the more innovative elements of the demonstration’s program model (for example, intensive and ongoing follow-up from a designated self-employment advisor or peer support, individualized technical assistance, and assistance accessing seed capital). This information will be used to understand the extent to which participants made use of the intensive counseling services offered through the program’s self-employment advisors and how helpful this might have been in connecting participants with other entrepreneurial services in the community appropriate to their needs.

  • Participants’ experiences with self-employment. Case study interviews will be used to understand participants’ activities in pursuit of self-employment, the barriers they face, the degree to which they succeed or fail in overcoming those barriers, and the role that SET services play in the process. This information will provide insight into participants’ needs and the determinants of self-employment success and failure.

Information on these topics will be used to construct a detailed profile for each participant, as described in Part B of this clearance package. These detailed profiles will serve as restricted-access working documents that will be kept in-house and used by the study team to identify salient themes regarding participants’ experiences with the SET program and with self-employment. The analysis will examine whether these broad themes vary across sites and whether they coincide with any observed variations in fidelity of implementation. To minimize burden on respondents, information from application and program participation records will be used to develop a preliminary profile before the interview. This preliminary profile will be used to customize the interview protocol to allow for more targeted questions. For example, information on service receipt from the program participation records can be used to better target questions about SET or other services accessed by the respondent.

3. Uses of Technology for Data Collection and to Reduce Burden

Advanced technology will be used in the evaluation’s large-scale individual-level data collection efforts (collection of application materials, collection of program participation records, and fielding of the follow-up survey) to reduce burden on study participants and on-site staff.

a. Application Package

Prospective SET Demonstration participants will complete their application packages online using a secure web interface. They may use computers at home or in AJC resource rooms to do so. All data entered into the application will reside on Mathematica’s secure server. To reduce the possibility that ineligible individuals end up completing applications, mandatory online orientations will explain the study’s eligibility requirements in detail and provide links to printable versions of the full application. It is also expected that AJC staff will assist individuals who have difficulty completing the online application because of a disability.

Relying on a web-based interface for application submission will eliminate the shipping of sensitive information that a pencil-and-paper application would require, while still allowing applicants the flexibility to complete the form at home or at an AJC facility. Electronic forms are expected to be faster and easier for applicants to fill out, and the skip pattern logic and checks for consistency and validity are expected to yield more accurate answers. In addition, whereas a hard-copy application would have to be submitted using a secure courier service, electronic submission will allow the contractor to more quickly review applications for completeness and make an eligibility determination.



b. Program Participation Records

Self-employment advisors will use a secure electronic system to submit participant tracking data on a monthly basis. Selected information will be pre-filled and the electronic interface will be optimized for fast and reliable data entry. SET program participants will have the option to fill out their seed capital request forms electronically. Similarly, self-employment advisors will be able to fill out the service termination forms electronically. However, these forms will need to be printed out in order to be approved by the self-employment advisors (for seed capital requests) and signed by the participant (upon termination of SET services). To facilitate rapid and easy transmittal of these forms, Mathematica will set up a secure file hosting service to receive scanned copies of seed capital request forms and service termination forms from self-employment advisors.

c. Follow-Up Survey

Electronic advance letters describing the follow-up survey will be sent out to individuals who provide an email address on the application form (in addition to hard copies delivered in the postal mail). The electronic advance letters will include a hyperlink to the survey website, which should reduce the effort and potential for error that could otherwise occur from typing the site address manually.

Like the application package, the follow-up survey will be conducted on the web to facilitate quick completion and submission. Data from web surveys are stored on secure servers, are immediately available, and are more accurate than data from self-administered paper-and-pencil questionnaires. Web surveys reduce the amount of interviewer labor necessary to complete data collection and enable respondents to complete the questionnaire on their own schedule, in multiple sittings if needed, and without having to return any forms by mail. If a respondent exits the survey before completing it, they will be given instructions on how to return to the survey website and resume from the point where they left off, without any loss of data. A unique login and password will be provided to protect their incomplete survey data. Individuals who do not fill out a web survey and express an interest in completing it over the telephone will be administered the survey using computer-assisted telephone interviewing (CATI). To comply with Section 508 of the Rehabilitation Act, sample members likely to have difficulty completing a web survey will be offered the option of completing the survey by telephone.

Both self-administration via the web and the interviewer-administered CATI system reduce respondent burden and costs compared with conducting in-person or paper-and-pencil interviews. Because the web survey is self-administered, it enables respondents to complete the survey on a schedule that is most convenient for them. Self-administration is also the most cost-efficient mode because interviewers are not required. The web survey programming includes skip pattern logic, response code validity checks, specification of acceptable ranges, and consistency checks. As much information as possible will be preloaded into the web survey to reduce respondent burden; for example, respondents will be able to choose from preloaded lists of local service providers. The web interface will be easy to navigate to encourage sample members who open the web survey to continue through completion.
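The kinds of range and consistency checks described above can be illustrated with a brief sketch. The specific rules and field names below are hypothetical stand-ins, not the instrument's actual edit specifications; dates are represented as ISO-formatted strings, which compare correctly in lexicographic order.

```python
def check_hours(hours_per_week):
    """Range check: weekly hours must be present and plausible (0-112)."""
    return hours_per_week is not None and 0 <= hours_per_week <= 112

def check_dates(start_date, end_date):
    """Consistency check: a job or venture cannot end before it starts.

    An end date of None means the job or venture is ongoing, which is valid.
    """
    return end_date is None or end_date >= start_date
```

Applying such checks at entry time lets the respondent (or a CATI interviewer) correct an implausible answer immediately, rather than requiring a costly call-back after data collection.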

CATI is also a logical choice as a method of administration for telephone interviews with large numbers of respondents. Any information preloaded in the web survey will also be preloaded into the CATI instrument to improve data accuracy and reduce respondent burden. CATI programs are efficient and accept only valid responses based on preprogrammed checks for logical consistency across answers. Interviewers are thus able to correct errors during the interview, eliminating the need for costly call-backs to respondents. Further, CATI’s flexibility allows for the scheduling of interview times that are convenient for the sample member.

Both versions of the survey are expected to take approximately 60 minutes to complete. Except for language necessary to accommodate self-administration versus being asked by an interviewer, the content of both survey versions will be identical.

4. Efforts to Identify Duplication

Strategies to identify and avoid duplication are discussed in two subsections. The first covers the SET application package, program participation records, and the follow-up survey; the second covers the site visit data collection effort and case study interviews.

a. Application Package, Program Participation Records, and Follow-Up Survey

The SET application package and follow-up survey will provide unique information about the characteristics and outcomes of study participants that is crucial for conducting the impact analysis. No other survey data collection effort has been conducted or has been planned to collect similar information.

Administrative data systems provide very little individual-level data that could be used to reliably estimate the effects of the program model on participant outcomes. There are two potential exceptions. First, earnings by employment status could be derived in some form from data held by the Internal Revenue Service (IRS) and/or Social Security Administration (SSA).14 Second, data on UI receipt could be obtained from state UI benefits records. However, collecting data on these measures using a survey is preferred for the following reasons:

  • Earnings data from IRS and SSA are generally available over periods corresponding to one calendar year that will not, in general, coincide with the 12-month period after random assignment. This imperfect temporal overlap would reduce the precision of the study’s primary impact estimates. Further, research using matched administrative and survey data suggests that earnings information collected from a survey can, in some circumstances, yield an outcome measure that allows impacts to be more accurately estimated, compared with earnings information collected from an administrative source (Kapteyn and Ypma 2007). Thus, the follow-up survey is expected to provide data that will enable the evaluation to obtain more precise, and potentially more accurate, estimates of the SET Demonstration’s impacts on earnings.

  • It is likely to be more cost effective to obtain information on UI benefit receipt via the application form and follow-up survey than by obtaining administrative UI benefit records from states. Substantial costs and burden would be incurred by seeking administrative UI benefits records for the 3,000 sample members included in the evaluation because the study team would have to negotiate with, and compensate, state UI agencies for the data. The data would also have to be thoroughly cleaned and checked to ensure that the records are properly matched to the sample members. Moreover, states may refuse to provide the data altogether. By contrast, including a few questions about UI benefit receipt will only slightly increase the burden of the follow-up survey, which is already being fielded to obtain other unique information about the sample members.
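The imperfect temporal overlap between calendar-year administrative earnings data and the 12-month post-random-assignment window noted above can be made concrete with a small sketch. The function is purely illustrative and is not part of the evaluation's analysis plan.

```python
from datetime import date

def calendar_year_overlap(ra_date, year):
    """Count how many months of the 12-month follow-up window beginning at
    `ra_date` (the random assignment date) fall within calendar year `year`."""
    # Enumerate the (year, month) pairs of the 12 months starting at ra_date.
    months = [(ra_date.year + (ra_date.month - 1 + k) // 12,
               (ra_date.month - 1 + k) % 12 + 1) for k in range(12)]
    return sum(1 for y, _ in months if y == year)
```

For example, a worker randomly assigned in March 2013 has only 10 of the 12 follow-up months covered by 2013 calendar-year earnings records, with the remaining 2 months falling in 2014; survey-based earnings measures avoid this misalignment.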

The application form and follow-up survey will provide information on other individual-level outcomes that are not measured in any existing data sources, including self-employment activities, fringe benefits, health insurance, economic self-sufficiency, and financial hardships. In addition, the follow-up survey will be the only source of data on utilization of self-employment services for both the program group and the control group.

The program participation records will provide the only source of ongoing, real-time data for use by the evaluation contractor to ensure that the SET Demonstration is properly implemented and that partner organizations are adhering to the demonstration’s procedures. Each component of the program participation data also provides unique information for use in the implementation study. There is some overlap between the participant tracking data and the follow-up survey in the information provided about the program group’s service utilization and contact with self-employment advisors. As already noted, the follow-up survey will provide comparable, summary information about service utilization for the demonstration’s treatment and control groups. In contrast, the tracking data will provide richer, higher-frequency, and more reliable data for the program group than could realistically be obtained in the follow-up survey. Because self-employment advisors will be submitting data for multiple participants sequentially, the costs and burden of obtaining this information are expected to be substantially lower than if the study team sought to include more detailed questions about service receipt on the follow-up survey. In addition, because the data are submitted monthly, the information in the participant tracking database is likely to be more reliable than retrospective answers by sample members covering the entire one-year follow-up period.

b. Site Visit and Case Study Interviews

Site visits and case study interviews will provide distinctive information about the novel program model to be implemented as part of the SET Demonstration. Although some CBOs currently offer intensive business development counseling services, such organizations are generally not well integrated with AJCs and do not focus their efforts closely on the target population of dislocated workers of interest to ETA. Thus, existing information about counseling services will not be sufficient for understanding how this model might operate within the context of the Federal workforce system. In addition to filling that informational gap, case study interviews and site visits will provide implementation data specific to the SET Demonstration that can be linked to participants’ outcomes. These data will, as a result, better enable ETA to gain a deeper insight into the factors that could mediate the program’s effects.

5. Methods to Minimize Burden on Small Businesses or Entities

Some sample members will become self-employed and establish small businesses. Because investigating whether and how the demonstration services influence the outcomes of individuals interested in starting a business is the primary goal of the evaluation, these individuals will be asked questions about their businesses and experiences receiving self-employment services. However, only the sample member and not other people in the business will be asked questions, and the extent of the questions will be limited to measures necessary to answer the main research questions of the SET Evaluation.

Some of the MDO partner providers and/or additional CBOs providing support to the SET Demonstration might also be small businesses. These organizations will be selected, in part, based on their commitment to evaluating the SET program model and willingness to provide information to assist with this effort. This selection will reduce the effective burden of collecting program participation records and conducting site visits. The evaluation team has also used advanced technologies (Section A.3) and developed recordkeeping forms (Appendix B) to minimize the burden on MDO partner providers of transmitting program participation records. In addition, the site visit protocols (Appendix D) are structured to efficiently gather targeted information that is needed to conduct the SET Evaluation.

6. Consequences of Not Collecting the Data

The SET Evaluation represents an important opportunity for ETA to learn about a novel program for delivering self-employment services to dislocated workers who seek to pursue self-employment as a re-employment strategy. If the information collection requested by this clearance package is not conducted, policymakers and providers of self-employment services will lack high-quality information on the impacts of the SET Demonstration and whether it represents a program model that can feasibly be scaled up or appropriately modified to serve customers of the Federal workforce system.

The application package obtains information that is critical to ensuring that the SET evaluation team can implement a randomized design in the study population of interest for this demonstration. Random assignment yields the most reliable, cost-effective, and externally valid estimates of program impacts compared with other evaluation methods (such as a comparison group or regression discontinuity design). By completing the consent form, applicants agree to participate in random assignment and allow the study team to collect follow-up data on their outcomes. The background information form yields data that are vital for monitoring random assignment and ensuring that the procedure is properly implemented. (These data can also be used in the impact analysis to examine how program impacts vary across subgroups.) Thus, if the consent form and background information form contained in the application package are not collected, it will not be possible to cost-effectively deliver credible estimates about the study’s impacts to policymakers and other stakeholders. The study team will use the business idea form (also contained in the application package) to identify applicants who are well positioned to take advantage of the services offered by the SET Demonstration. If this information is not collected, it would not be possible to concentrate SET services on participants likely to benefit from the program. This would result in smaller estimates of the program’s effects and reduce the likelihood that the estimates are statistically significant.

Without the program participation data, the contractor will not be able to effectively monitor the implementation of the SET Demonstration. The contractor would have insufficient information to determine whether partner organizations are adhering to study procedures. In addition, the contractor would have limited empirical information from which to provide technical assistance or feedback to participating AJCs and MDOs in the event that demonstration procedures need to be adjusted in the early implementation stages. Without data from the seed capital request forms, the contractor will not be able to ensure that seed capital microgrant funds are distributed according to the procedures specified by ETA. Thus, without these data, it would not be possible to ensure a high-fidelity implementation of the demonstration model. This could reduce the impact of the SET program and, therefore, the chances of finding a significant estimate of its effect. Lack of program participation data would also reduce the depth and scope of the implementation analysis.

The follow-up survey obtains information on a variety of important outcomes that could be affected by participation in the SET Demonstration. Many of these outcomes (for example, participation in self-employment services, specific self-employment activities, fringe benefits, health insurance, economic self-sufficiency, and financial hardships) cannot be measured in any existing data sources. The study’s primary outcomes (employment, earnings, and self-employment) are likely to be measured with lower accuracy in administrative data sources, as described in Section A.4. Thus, without the survey, the capacity of ETA to determine the impact of the SET Demonstration on participants’ outcomes would be severely limited.

Site visit interviews and case study interviews will provide information about the implementation of the SET Demonstration’s novel program model. Without these data collection efforts, ETA will have very little information with which to ascertain the feasibility of the SET program model, how this model might operate within the context of the Federal workforce system, or how the program’s effects are mediated by characteristics and programmatic features of the sites implementing the model.

7. Special Data Collection Circumstances

There are no special circumstances surrounding data collection. All data will be collected in a manner consistent with Federal guidelines. There are no plans to require respondents to report information more than quarterly, to prepare a written response to a collection of information within 30 days of receiving it, to submit more than one original and two copies of any document, to retain records, or to submit proprietary trade secrets. The application package and follow-up survey will produce valid and reliable results that can be generalized to the universe of the study and will include only statistical data classifications that have been reviewed and approved by OMB. Both will include a pledge to maintain respondent privacy to the extent that existing statutes and regulations permit; the underlying disclosure and data security policies used by the contractor and DOL (see Section A.10) are consistent with the pledge. Neither will unnecessarily impede sharing of data with other agencies for compatible use.

8. Federal Register Notice and Consultations Outside of the Agency

a. Federal Register Notice and Comments

As required by 5 CFR 1320.8 (d), a Federal Register notice, published on June 20, 2012 (FR, Vol. 77, No. 119, pp. 37070-37072), announced the SET Evaluation. The Federal Register announcement provided the public an opportunity to review and comment on the planned data collection and evaluation within 60 days of the publication, in accordance with the Paperwork Reduction Act of 1995. No comments were received from the public.

b. Consultations Outside of the Agency

The data collection instruments, research design, sample design, and analysis plan have been developed based on the expertise of DOL and the contractor. As described in Part B, the application and follow-up surveys have been piloted with up to nine individuals from nonparticipating sites with backgrounds similar to SET Demonstration participants. The instruments and protocols for gathering program participation data will be adjusted, as necessary, based on input from up to nine of the demonstration’s partner providers (once they have been selected). Also, if necessary, protocols for site visit data collection and case study interviews will be refined based on feedback obtained during the first site visit and from the first two interview respondents, respectively. Of course, any adjustments made to instruments and protocols will be implemented only after OMB approves the changes.

9. Respondent Payments

Participants will not be paid for completing the application package required to participate in the demonstration. No payments will be made to individuals completing site visit interviews or case study interviews. The evaluation team will conduct an incentive experiment to determine whether to offer sample members incentives for completing the follow-up survey; any incentive would be paid by check after the sample member completes the survey.

The offer of incentives could be important to efforts to gain cooperation from sample members; increase response rates; ensure the representativeness of the sample; and provide data that are complete, valid, reliable, and unbiased. ETA seeks to inform policy based on a data collection effort that meets high standards on these criteria, and offering incentives can help achieve that goal. Moreover, because response rates to telephone surveys have declined and the costs associated with achieving high response have increased, the use of incentives has become a more common practice for survey studies (Curtin et al. 2005). The current data collection plan for SET describes a mixed-mode design, in which sample members are asked to complete the survey online using a self-administered web application. Sample members who choose not to complete the survey online will be offered the opportunity to complete the survey by Computer-Assisted Telephone Interviewing (CATI). Achieving response rates above 80 percent in the absence of survey incentives could require an in-person field effort to convert initial refusals. However, such an in-person effort is likely to be very expensive. In the context of the SET Evaluation, response incentives are expected to represent a more cost-effective approach to achieving an acceptable response rate.

Substantial evidence on the benefits of offering incentives has become available. Incentives can help achieve high response rates by increasing sample members’ propensity to respond (Singer et al. 2000). Studies offering incentives show decreased refusal rates and increased contact and cooperation rates. Among sample members who initially refused to participate, incentives increased refusal-conversion rates. By increasing sample members’ propensity to respond, incentive payments have been found to significantly reduce the number of calls required to resolve a case and the number of interim refusals. Thus, incentive payments can help contain costs, redirecting some of the expense of conducting the survey to participants as a direct benefit rather than to additional survey operations.

In addition to increasing the overall response rate, incentives could also increase the likelihood of participation from subgroups with a lower propensity to cooperate with the survey request, helping to ensure the representativeness of the respondents and the quality of the data being collected. For example, Jäckle and Lynn (2007) found that incentives increased the participation of sample members more likely to be unemployed. There is also evidence that incentives bolster participation among those with lower interest in the survey topic (Jäckle and Lynn 2007; Kay 2001; Schwartz et al. 2006), resulting in data that are more complete. Furthermore, paying incentives might improve quality of the data obtained (such as item nonresponse or the distribution of responses) from groups that would otherwise be underrepresented in a survey (Singer et al. 2000).

Experimental evaluations of survey incentive payments in populations with similar characteristics as SET Demonstration participants have shown that payments of $30 to $50 could be effective for reducing nonresponse among randomly selected potential sample members. For example, Mack et al. (1998) found that $20 incentives (or $29.26 in 2012 dollars) reduced household, person, and item (gross wages) nonresponse rates among potential participants in the U.S. Survey of Income and Program Participation (SIPP) relative to a control group receiving no monetary incentive. By contrast, $10 incentives did not significantly reduce nonresponse rates in the Mack et al. (1998) study. In a 2008 experiment that Mathematica conducted on behalf of ETA as part of the National Evaluation of the Trade Adjustment Assistance (TAA) Program, a $50 incentive was found to be more cost-effective than $25 or $75 incentive payments for reducing survey nonresponse among displaced workers who were eligible for Trade Readjustment Allowance benefits under the TAA program.

Based on the accumulated evidence, the evaluation team expects an incentive payment of approximately $25 to be effective for the SET Demonstration’s follow-up survey. Although $25 is lower than what was found to be effective for the SIPP and TAA surveys, the sample members for this demonstration will have a strong and explicit existing relationship with the study and survey organization. This preexisting relationship, which was not present for the SIPP or TAA surveys, should raise response rates across the range of possible incentive payments. At the same time, it is anticipated that incentive payments for the SET Demonstration would need to be higher than what was offered for Project GATE when considering the survey response rates from among GATE sample members who could be identified as dislocated workers at baseline.15 Project GATE offered an incentive payment of $15 to such individuals and achieved response rates of 84.0 percent at the 6-month follow-up survey and 74.6 percent at the 18-month follow-up survey for this baseline subgroup. Adjusting for inflation, this suggests that an $18 incentive payment (in 2012 dollars) would yield a response rate of only 79.3 percent on an 18-month survey. The SET Demonstration seeks to achieve at least an 80 percent response rate, and so will offer a larger incentive payment. The SET respondent payment of $25 is also expected to narrow the gaps in response rates of four to seven percentage points that occurred in the GATE 6- and 18-month surveys between members of the baseline-dislocated subgroup who were assigned to the program group and those who were assigned to the control group.
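The inflation adjustment described above is a simple index-ratio calculation. As a minimal sketch, it can be expressed as follows; the index values shown are placeholders for illustration only, not official Consumer Price Index figures:

```python
def adjust_for_inflation(amount, index_base, index_target):
    """Convert a dollar amount from a base year into target-year dollars
    using the ratio of price index values for the two years."""
    return round(amount * index_target / index_base, 2)

# Illustrative only: a $15 payment in a base year with index 100 is
# equivalent to $18 in a year when the index has risen to 120.
print(adjust_for_inflation(15, 100, 120))
```

In practice, the evaluation team would substitute the official index values for the relevant base and target years.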

As part of the fielding of the SET follow-up survey, the evaluation plan specifies an incentive experiment to verify the appropriate incentive amount for the data collection. Specifically, during the first four months of the follow-up survey, sample members would be re-randomized with equal probability to three groups:16

  1. Group A would be offered a $50 incentive for completing the follow-up survey online within the first four weeks of the field period; thereafter, they would receive $25 for completing the survey regardless of mode of completion.

  2. Group B would be offered a $25 incentive for completing the follow-up survey regardless of mode or timing.

  3. Group C would not be offered any monetary incentive.
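As a sketch, the equal-probability re-randomization into the three incentive groups described above could be carried out along the following lines; the case IDs and random seed here are hypothetical, and the actual assignment procedure is defined in the evaluation plan:

```python
import random

def assign_incentive_groups(case_ids, seed=20120601):
    """Randomly assign each sample member to incentive group A, B, or C
    with equal probability; a fixed seed makes the draw reproducible."""
    rng = random.Random(seed)
    return {cid: rng.choice(["A", "B", "C"]) for cid in case_ids}

# Illustrative: assign 300 hypothetical case IDs to the three groups.
assignments = assign_incentive_groups(range(300))
```

With equal probability per draw, each group is expected to contain roughly one-third of the sample, though exact counts will vary by chance.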

The contractor’s survey staff will monitor data collection efforts to equalize the final response rates in these three incentive-payment groups.

At the end of the four-month incentive experiment period, the contractor will determine whether to offer incentive payments for the remaining sample based on the response rates and fielding costs in Groups A, B, and C. The analysis will compare the number of completes and the cost per complete for surveys completed within the $50 incentive window with those completed later in the field period. Such comparisons will be broken out by incentive condition, which will allow for difference-in-difference estimates of the effects of the higher incentive payments on timing and mode of completion. In addition, the analysis will examine the characteristics of respondents by mode and incentive condition to see if the $50 incentive draws in some subgroups that may otherwise be harder to reach. If incentives are shown to improve the response rate and/or lower the fielding costs, the project team will work with ETA to determine which incentive amount best fits the project’s needs.
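The difference-in-difference comparison described above can be illustrated with a minimal sketch; the cost-per-complete figures below are hypothetical and serve only to show the form of the calculation:

```python
def diff_in_diff(treat_early, treat_late, comp_early, comp_late):
    """Difference-in-differences: the early-vs-late change in an outcome
    (such as cost per complete) under one incentive condition, minus the
    same change under a comparison condition."""
    return (treat_early - treat_late) - (comp_early - comp_late)

# Hypothetical cost-per-complete figures (dollars):
# Group A inside/after the $50 window, Group B over the same periods.
effect = diff_in_diff(40.0, 60.0, 55.0, 58.0)
```

A negative estimate of this form would suggest the time-limited higher incentive lowered costs early in the field period relative to the comparison condition.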

The contractor will summarize the results of the auxiliary incentive payment experiment in a memo prepared for OMB. Hence, in addition to informing decisions about respondent payments for this demonstration, the results of the auxiliary experiment could be used by OMB to assess the potential effectiveness of respondent payments for data collection efforts conducted by other studies of similar populations.

10. Privacy

This section contains a discussion of the measures that the evaluation team will take to safeguard the data that are part of this clearance request. The first subsection describes the contractor’s general policies for protecting privacy. The second subsection describes the contractor’s electronic security systems. The third subsection provides additional detail on the treatment of data with personally identifying information (PII) collected for this evaluation.

a. General Policies to Protect Privacy

This subsection describes, in turn, the statements that will be made to study participants about privacy protection; the contractor’s staff training and clearance policies related to data security; and plans for a restricted-use data file for ETA (if one is produced as part of this study).

Statements about protecting respondent privacy. Sample members included in the follow-up survey, site-visit interviews, and case-study interviews will be assured of the privacy of their responses, as study researchers will implement administrative and security systems to prevent the unauthorized release of personal records. (These systems are discussed in detail in the following subsections.) The agency will also give the public notice of the planned evaluation through publication in the Federal Register (see Appendix G). All respondent materials will include assurances of privacy protection. These include letters sent to sample members and information posted on the web site for the SET follow-up survey. In addition, as part of the telephone interviewers’ introductory comments, sample members will be told that their responses are private and will have the opportunity to have their questions answered. Interviewers will be trained in procedures to maintain privacy and will be prepared to describe them in full detail, if needed, or to answer any related questions raised by participants. For example, the interviewer will explain that the individual’s answers will be combined with those of others and presented in summary form only.

Staff training and clearance policies. All data items that identify sample members will be kept only by the evaluation contractor, Mathematica, for use in assembling data and in conducting the follow-up survey and interviews. (As discussed in greater detail below, any data delivered to ETA will not contain personal identifiers, thus precluding individual identification.) It is the policy of Mathematica to protect private information and data in whatever medium it exists, in accordance with applicable Federal and state laws and contractual requirements. In conjunction with this policy, all Mathematica staff will do the following:

  1. Comply with a Mathematica pledge that is signed by all full-time, part-time, and hourly Mathematica staff, and with the Mathematica Security Manual procedures to prevent the improper disclosure, use, or alteration of private information. Staff may be subjected to disciplinary, civil, or criminal actions for knowingly and willfully allowing the improper disclosure or unauthorized use of private information.

  2. Access private and proprietary information only in performance of assigned duties.

  3. Notify their supervisor, the project director, and the Mathematica Incident Response Team if private information has been disclosed to an unauthorized individual, used in an improper manner, or altered in an improper manner. All attempts to contact Mathematica staff about any study or evaluation by individuals who are not authorized access to the private information will be reported immediately to both the cognizant Mathematica project director and the Mathematica security officer.

In addition, the evaluation team members working with the data for this study will have previously undergone background checks. These checks may include, for example, completing an SF-85 or SF-85P form, authorizing credit checks, and having fingerprints taken.

Restricted-Use Data Files. To facilitate external verification and replication of the study findings, as well as additional research, the evaluation team will consider producing restricted-use data files containing key analysis variables created for the SET evaluation at the end of the study. (Current study plans do not provide for creation of such restricted-use files.) If produced, the data files delivered to ETA will not contain personal identifiers, thus precluding individual identification. These data files will follow the relevant OMB checklist to ensure that they can be distributed to authorized researchers with appropriate restrictions. Steps would also be taken to ensure that sample members cannot be identified in indirect ways. For example, categories of a variable would be combined to remove the possibility of identification due to a respondent being one of a small group of people with a specific attribute. Variables that would be carefully scrutinized include age, race and ethnicity, household composition and location, dates pertaining to employment, household income, household assets, and others as appropriate. Variables would also be combined to provide summary measures that mask what otherwise would be identifiable information. Although it cannot be predicted which variables will have too few respondents in a category, the SET evaluation contractor would not report categories or responses that are based on cell sizes of fewer than five. If necessary, statistical methods would be used to add random variation within variables that would otherwise be impossible to mask. Finally, variables that could be linked to identifiers by secondary users would be removed or masked.
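As a minimal sketch, the fewer-than-five cell-size rule described above could be applied as follows; the category labels are illustrative, not actual study variables:

```python
from collections import Counter

def suppress_small_cells(values, min_cell=5, mask="suppressed"):
    """Replace any category observed fewer than `min_cell` times so that
    respondents in small cells cannot be identified."""
    counts = Counter(values)
    return [v if counts[v] >= min_cell else mask for v in values]

# Illustrative category labels: the two-person "retired" cell falls below
# the five-respondent threshold and is masked.
data = ["self-employed"] * 7 + ["wage-employed"] * 6 + ["retired"] * 2
print(suppress_small_cells(data))
```

In a real disclosure review, suppression would typically be accompanied by combining categories or adding random variation, as the paragraph above describes.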

b. Systems Security

Mathematica’s computer facilities include state-of-the-art hardware and software. The hardware and software configurations have been designed to facilitate the secure processing and management of both small- and large-scale data sets.

Facility. The doors to Mathematica’s office space and Survey Operations Center (SOC) are always locked and require a key card to gain entry. All SOC staff are required to display current photo identification while on the premises. Visitors are required to sign in and out and must wear temporary ID badges while on the premises. Any network server containing private data is located in a controlled, limited-access area. All authorized external access is through a server under strict password control. The SOC features lockable storage areas for sensitive documents, and controlled access to computerized files and systems.

Network. Sensitive data are stored in secure folders that reside on a Windows Server 2008 volume using the Microsoft NT File System (NTFS). BitLocker encryption software, configured to use a 256-bit advanced encryption standard (AES) key, encrypts data on the volume as they are stored. The encryption persists for the life of the volume. NTFS/BitLocker makes the data accessible only to users with authorized access, and makes data inaccessible to software that circumvents normal access control, in case the media are stolen. NTFS/BitLocker stores user data in an encrypted format on the volume, but it works transparently with most applications and backup utilities. All the rules of file system trustee assignments, trustee rights, ownership, sharing, visibility, locking, transactions, and space restrictions remain the same on the encrypted volume. Data in the “Secure_Data” folders are backed up using ArcServe 11.5, which encrypts the contents using the 3DES algorithm. These separate backups are overwritten every two months by backups of newer secure data, a process that enables compliance with secure data destruction requirements.

Access to all network features, such as software, files, printers, Internet, email, and peripherals, is controlled by user ID and password. Mathematica staff are required to change their passwords for computer access at least every three months, and passwords must adhere to the following standards: be at least eight characters long, contain at least one letter (upper or lower case), and contain at least one numeric or special character. All user IDs, passwords, and network access privileges are revoked within one working day for departing staff and immediately for terminated staff. All staff are required to log off the network before leaving for the day.
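For illustration only, the stated password standards could be checked with a small validator like the following; this is a sketch of the policy as written, not Mathematica’s actual tooling:

```python
import re

def meets_password_standard(pw):
    """Check the stated standards: at least eight characters, at least one
    letter (upper or lower case), and at least one numeric or special
    (that is, non-letter) character."""
    return (len(pw) >= 8
            and re.search(r"[A-Za-z]", pw) is not None
            and re.search(r"[^A-Za-z]", pw) is not None)
```

For example, a nine-character mix of letters and digits passes, while an all-letter or all-digit string of any length does not.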

Printers. Printer access is granted to all staff with a valid user ID and password. The physical hard disks on which the printer queues reside are subject to the same security and crash procedures that apply to the file servers. Staff have write-access to the printer queues but no read-access; that is, they cannot browse the contents of the printer queues. Printer stations are appropriately monitored according to the sensitivity of the printed output produced. No private or proprietary data or information can be directed to a printer outside Mathematica’s offices. Staff are instructed to avoid printing sensitive data when possible and to retrieve printouts containing such data immediately.

Electronic communication. Each of Mathematica’s locations has a site-specific local-area network (LAN). A combination of T1 and ethernet private line (EPL) lines links the site-specific LANs into a wide-area network (WAN) and supports cross-office communications. Although traffic on Mathematica’s internal network is not encrypted, it is secured by these links, all of which are private, point-to-point communication lines dedicated to Mathematica traffic and completely contained within Mathematica’s firewalls. Because each office is connected to other offices solely by these private point-to-point lines and not through the Internet, all WAN traffic is contained and protected within Mathematica’s firewalls; no WAN traffic is routed through the Internet.

c. Treatment of Data with Personal Identifying Information Collected for the Evaluation

All data containing PII—including SSN, name, home address, date of birth, and telephone number—are considered to be sensitive, or private data. The SET Evaluation is in compliance with the aforementioned company security policies. In this subsection, study procedures for storing and processing PII are described, followed by a discussion of additional considerations for the PII associated with specific data collection and management activities.

1) Procedures for Handling PII

Data files. When possible, electronic files for everyday use are created without personal identifiers. Data and sample files that must contain sensitive data are stored and analyzed on one of Mathematica’s secure hard-drives. Specifically, staff working on this project will be instructed to maintain all files with private data in project-specific, encrypted folders on the Mathematica network. Access control lists restrict access on a need-to-know basis and only to project staff who are specifically authorized to view the sample data (as designated by the project director or survey director) to select and process the sample or to process the data files. Sensitive data that are no longer needed in the performance of the project will be magnetically erased or overwritten using Hard Disk Scrubber or equivalent software, or otherwise destroyed.

Access. Electronic files with private data will be stored in restricted-access network directories. Access to restricted directories is limited through access control permissions, on a need-to-know basis to staff who have been assigned to and are currently working on the project. When temporarily away from their work areas, project staff are instructed to close files and applications and to lock their workstations using the CTRL-ALT-DEL command. Workstations automatically lock within a set number of minutes and a password must be used to regain access through the protected screen saver.

Electronic communication. For internal emails, staff are forbidden to transmit sensitive study information as a regular file attachment; they are instructed instead to use the “insert hyperlink” feature in Outlook to include a shortcut to the file. This enables the receiver to go to the file directly if authorized, but will not allow access to unauthorized individuals. In addition, staff are instructed to avoid including sample member names or any other PII in internal emails, so that there is no potential for these to be viewed by others.

Emails sent outside Mathematica are not automatically encrypted, and therefore neither the text nor attachments are secure. Before sending an email containing sensitive information, the sender is obligated to ensure that the recipient is approved to receive such data. When files must be sent as attachments outside Mathematica, staff are instructed to use WinZip 14.5 (256-bit AES encryption) to password-protect the file and transmit the password to the recipient using a separate form of communication, preferably via telephone. When a sample member’s name and contact information are sent outside Mathematica, the information is included in a secure attachment rather than in the text of the email.

Hard-copy printouts. Sensitive temporary work files used to create hard-copy printouts are stored on local hard drives and deleted on a periodic basis. Hard-copy output with private information is shredded or stored securely when no longer needed. Test printouts of data records carrying personal identifiers that are generated during file construction are shredded.

Incident response. Staff are instructed to report any incidents or potential incidents involving PII to Mathematica’s Incident Response Team immediately by email or using an internal reporting web site. When notified, the Incident Response Team determines whether an incident has occurred and has to be reported to ETA. If so, the Incident Response Team informs ETA of the incident within one hour of its discovery.

2) Additional Considerations for PII Associated with Specific Data Collection Activities

Application package, follow-up survey, and case identification numbers. Application materials and follow-up surveys will be submitted electronically. Applicants will include their SSN as a digital “signature” indicating consent. Sample members who initiate a follow-up survey will be asked to confirm the last four digits of their SSN. To protect this sensitive PII, applications and surveys will be completed and submitted using a secure web-based interface. Mathematica will process and store the results using secure servers consistent with the systems security policies described above. Each applicant will be assigned a unique case identification number that may be used to link all of the study data files together.

Sample management system. Some data elements from the SET application and relevant data will be entered into a sample management system (SMS) when conducting random assignment (see Part B). This is a SQL Server database housed on an encrypted server. A hierarchical architecture will be used to assign user rights to specific individuals, who will be able to access the system and enter information only at their own location. All activity in the system will be logged. Unless otherwise required by ETA, the information stored in the SMS and the electronic application files will be destroyed when no longer needed in the performance of the project.

Program participation records. Participant tracking data containing study-generated case identification numbers, instead of PII, will be transmitted by self-employment advisors electronically. Advisors will also have the option to submit scanned copies of the seed capital request forms and service termination forms, which will contain participant names and signatures, using a secure file hosting service established by the evaluation contractor. The advisors may also deliver hardcopies of the seed capital request forms and termination forms using a signature-confirmed courier service.

Telephone interviewing and locating for follow-up survey. Telephone interviewers for the SET follow-up survey will be seated in a common supervised area. As part of the process to verify that the correct sample members have been reached, interviewers will have access to respondents’ names and birthdates, as well as the last four digits of their SSNs. Birth date and the last four SSN digits will be displayed on the computer screen only temporarily, at the beginning of the survey, so that the interviewer can verify the sample member’s identity. Interviewing staff for this project receive training that includes general SOC security and privacy procedures, as well as project-specific training that includes explanation of the highly private nature of this information, instructions to not share it or any PII with anyone not on the project team, and warnings about the consequences of any violations. After receiving training, these staff sign privacy and nondisclosure agreements. Telephone interviews are recorded for educational and training purposes only, to aid SOC staff in improving their interviewing skills.

Staff who work on updating sample members’ contact information when the original contact is not successful must have access to key identifying information for short periods. These staff members receive training that includes general SOC security and privacy procedures, as well as project-specific privacy training that includes clear instructions on what data and databases can be accessed and what data are required and can be recorded. After receiving training, these staff sign privacy and nondisclosure agreements.

Locators may talk to a sample member’s family, relatives, or other references to obtain updated contact information. To protect the sample member, locators are given scripts on what they can and cannot say when using these sources to obtain information. For example, they will be instructed not to tell anyone that the sample member has been selected to participate in a study of people receiving self-employment training assistance. Rather, they will indicate that Mathematica is trying to reach the sample member for an important study sponsored by ETA. Postcards will describe the need to speak to the person who agreed to participate in the study.

In addition, locating staff keep only the minimum amount of printed personal information needed to perform assigned duties. Hard-copy materials (such as locating or calling contact sheets) containing data with any individual identifiers (for example, name and street address) are stored in a locked cabinet or desk when not being used. When in use, such materials are carefully monitored by a project supervisor and are never left unattended. At the conclusion of the project, a final disposition of all remaining sample will be made, and contact sheets and other associated materials will be destroyed.

Case study interview notes. Before conducting case study interviews, interviewers need access to participants’ application forms and program participation records in order to assemble preliminary participant profiles that will guide the interview. An authorized evaluation team member will remove PII from case study participants’ application forms and survey data and assign in-house identification codes before providing these data to case study interviewers. Any PII (such as name and contact details) that is necessary for conducting the interview will be saved on the secure data server. To limit risk, staff conducting the case study interviews will be thoroughly trained in Mathematica’s security procedures and privacy requirements. Interviewers will not include PII in any case study notes and profiles. When the analysis has been finalized, case study notes and participant profiles will be destroyed.

Site visit interview notes. As with case study interviews, any PII that is necessary for conducting the site visit interviews will be saved on the secure data server. Interviewers will not include PII in any site visit notes and site profiles, which will be destroyed when the analysis has been finalized.

11. Questions of a Sensitive Nature

The application package and follow-up survey contain some questions that could be considered sensitive. These questions are related to earnings, income, marital status, the need for health care, financial hardships, and the receipt of public assistance. The business idea form also requests that applicants provide specific details about their proposed business model. Depending on an individual’s circumstances, any of these questions could be perceived as sensitive. To encourage reporting, an attachment to the application package will include responses to questions that respondents might have about why these questions are being asked and how the responses will be used, as well as a reminder that their answers will be treated securely in a manner consistent with the guidelines described in A.10, and a toll-free number they can call for more information. All respondents will be informed that they can decline to answer any question they do not wish to answer.

All questions in the SET application package and follow-up survey, including those deemed potentially sensitive, will be pretested, and many have been used extensively in prior surveys with no evidence of harm. Questions about income, financial hardships, and receipt of public assistance are necessary to measure the economic well-being of study participants. Obtaining information about these potentially delicate topics is integral to addressing the research questions posed by the study, in order to describe the characteristics of SET participants, describe their outcomes, and assess the impact of the SET program.

12. Estimated Hour Burden of the Collection of Information

The hour burden estimate for the collection of information that is part of this clearance request consists of the burden from the application package, the follow-up survey, site visit data collection, and case study interviews. As shown in Table A.4, the total hour burden is anticipated to be 8,902 across all data collection efforts for the SET Evaluation. As indicated in Table A.5, this translates to an annualized number of burden hours equal to 5,344 and an annualized burden cost of $101,229.

The hour burden for the application package is estimated to be 60 minutes per applicant. Although eligibility criteria will be explicitly outlined in mandatory online orientation sessions, it is assumed that approximately one in four applications will be screened out. Thus, it is anticipated that 4,000 applications will be collected in order to enroll 3,000 study members. This implies a total hour burden of 4,000. Applications will be received over an intake period of 18 to 24 months, as noted previously. Assuming an 18-month intake period, this implies that the annualized number of burden hours will be 4,000 / (18/12) = 2,667. Applying an hourly wage rate of $16.63 yields an annualized burden cost of $44,352.
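
As a quick check, the arithmetic in this paragraph can be reproduced in a few lines of Python (a sketch; the counts and the $16.63 wage rate are taken from the text, and hours are rounded before costing to mirror the report’s figures):

```python
# Application-package burden arithmetic (figures from the paragraph above).
applications = 4_000   # collected in order to enroll 3,000 (1 in 4 screened out)
minutes_each = 60      # estimated time to complete the package
intake_years = 18 / 12 # conservative (shortest) intake period
wage = 16.63           # hourly rate assumed for applicants

total_hours = applications * minutes_each / 60    # 4000.0
annual_hours = round(total_hours / intake_years)  # 2667
annual_cost = round(annual_hours * wage)          # 44352
print(total_hours, annual_hours, annual_cost)
```

The same pattern (total hours, divided by the shortest collection period, priced at the assumed wage) underlies every row of Table A.5.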

Table A.4. Total Hour Burden Estimates for SET Evaluation Data Collection Efforts

Respondents | Number of Responses/Instances of Collection | Frequency of Collection | Average Time per Response | Total Hour Burden

Consent and Application Forms
Applicants to the program | 4,000a | Once | 60 minutes | 4,000

Program Participation Records
Participant Tracking Data
Self-employment advisors | 24 respondents with 1,500 total casesb | 10 responses per casec | 3 minutes | 750
Seed Capital Request Formsd
SET program participants | 1,425 | 1.5 times, on average | 20 minutes | 713
Self-employment advisors | 24 respondents with 1,425 total cases | 1.5 times per case, on average | 10 minutes | 356
Service Termination Information
Self-employment advisors | 24 respondents with 225 total casese | Once | 20 minutes | 75
Total for Program Participation Records | | | | 1,894

Follow-Up Survey
Successful applicants who went through random assignment | 2,400f | Once | 60 minutes | 2,400

Site Visitsg
Self-employment advisors, directors of MDO partners, and local workforce administrators | 72 respondents | Twice | 120 minutes | 288
Additional staff at MDOs, AJCs, and other organizations providing support for the demonstration | 96 respondents | Twice | 90 minutes | 288
Total for Site Visits | | | | 576

Case Study Interviews
Selected members of the program group completing follow-up surveys | 32 | Once | 60 minutes | 32

Total | | | | 8,902


aAlthough eligibility criteria will be explicitly outlined in publicity materials and orientation sessions for the program, it is assumed that approximately one in four applicants will be determined to be ineligible and, therefore, screened out. Thus, it is anticipated that 4,000 applications will be collected in order to enroll 3,000 study members.

bEach of the 1,500 members of program group will be tracked by one of 24 self-employment advisors.

cGiven a one-year service period, it is expected that 10 monthly tracking reports per case, on average, will be received based on the following assumptions: (1) 5 percent of the program group will drop out of the demonstration within the first month after random assignment, and (2) another 15 percent will have services terminated and be referred back to an AJC by SET self-employment advisors within the first four months after random assignment.

dIt is estimated, conservatively, that up to 95 percent of the SET program participants will complete at least one seed capital request form, and that the average number of requests will be 1.5. As described in the text, every request form will be validated and have summary information added by one of the demonstration’s 24 self-employment advisors.

eAn expected total of 225 service terminations will be initiated by one of 24 self-employment advisors.

fThis figure assumes that the follow-up survey will achieve a response rate of 80 percent.

gThe estimates for each site visit respondent include (1) time coordinating with the study team and preparing for the interview and (2) time participating in the on-site meeting.

Table A.5. Annualized Burden Cost Estimates for SET Evaluation Data Collection Efforts

Respondents | Total Hour Burden | Length of Collection Perioda | Annualized Number of Burden Hours | Average Hourly Costb | Annualized Dollar Cost of Burden

Consent and Application Forms
Applicants to the program | 4,000 | 18 months | 2,667 | $16.63 | $44,352

Program Participation Records
Participant Tracking Data
Self-employment advisors | 750 | 30 months | 300 | $33.87 | $10,161
Seed Capital Request Forms
SET program participants | 713 | 30 months | 285 | $16.63 | $4,740
Self-employment advisors | 356 | 30 months | 142 | $33.87 | $4,810
Service Termination Information
Self-employment advisors | 75 | 30 months | 30 | $33.87 | $1,016
Total for Program Participation Records | 1,894 | 30 months | 757 | | $20,727

Follow-Up Survey
Successful applicants who went through random assignment | 2,400 | 18 months | 1,600 | $16.63 | $26,608

Site Visits
Self-employment advisors, directors of MDO partners, and local workforce administrators | 288 | 24 months | 144 | $33.87 | $4,877
Additional staff at MDOs, AJCs, and other organizations providing support for the demonstration | 288 | 24 months | 144 | $28.70 | $4,133
Total for Site Visits | 576 | | 288 | | $9,010

Case Study Interviews
Selected members of the program group completing follow-up surveys | 32 | 12 months | 32 | $16.63 | $532

Total | 8,902 | | 5,344 | | $101,229


aThe numbers listed in this column represent the lower bounds of the duration for each collection period, as discussed in the main text, which implies that the table presents upper bounds on annualized burden hour and cost information.

bAs noted in the main text, burden cost calculations assume wage rates of (1) $16.63 per hour among potential applicants and participants in the SET Demonstration; (2) $33.87 per hour among self-employment advisors; and (3) $28.70 per hour among staff at local workforce agencies (including AJCs) and other staff at organizations providing support for the demonstration.

The burden across all collections of program participation records is anticipated to be 1,894 hours in total, which translates into up to 757 annualized burden hours and $20,727 in annualized burden cost. These figures are based on the following assumptions about the three types of program participation data:

  • Participant tracking data. Self-employment advisors will transmit the summary information from the participant tracking forms to the contractor on a monthly basis. Each transmittal will include updated participation data on every active program group member for whom a self-employment advisor is responsible. It is expected that most of the 1,500 program group members will remain active in the SET Demonstration for the entire one-year service period. However, some may become “inactive” before the end of the program. Specifically, it is assumed that 5 percent of the program group will drop out of the demonstration and another 15 percent will have their services terminated by the self-employment advisor, who will refer them back to an AJC. If dropouts occur within the first month and service terminations are evenly spread out over the first four months after random assignment, the average program group member will remain active for approximately 10 person-months. Assuming that it takes three minutes per month for SET advisors to record and transmit summary information on each active participant, this implies a total hour burden of 1,500 × 10 × (3/60) = 750. The program implementation period will be 30 to 36 months, as previously noted. This implies an annualized number of burden hours of up to 750 / (30/12) = 300. If the wage rate for self-employment advisors is $33.87, this implies an annualized burden cost of up to $10,161.

  • Seed capital request forms. It is assumed, conservatively, that all program group members will complete a seed capital request form, except for the 5 percent who drop out within the first month. As previously noted, participants may submit multiple requests, so long as they do not exceed their cap. It is expected that, on average, 1.5 requests will be filled out by program group members who submit an initial request form and that the form will take approximately 20 minutes to complete. At each submission, a SET advisor will complete a certification section and transmit the form to the contractor, a process that is expected to take approximately 10 minutes. Thus the total hour burden of the seed capital request forms for participants is anticipated to be 1,500 × 0.95 × 1.5 × (20/60) = 713. This corresponds to 713 / (30/12) = 285 annualized burden hours for an annualized burden cost of $16.63 × 285 = $4,740. Likewise, the total hour burden for SET advisors is calculated as 1,500 × 0.95 × 1.5 × (10/60) = 356, which implies an annualized number of burden hours of up to 356 / (30/12) = 142 and an annualized burden cost of $33.87 × 142 = $4,810.

  • Service termination forms. As already noted, approximately 15 percent of the 1,500 program group members are likely to receive a reverse referral from the SET program back to an AJC. It is anticipated that self-employment advisors at the MDO partner providers will take approximately 20 minutes to document such cases and transmit the documentation to the contractor’s study team. Hence, the total hour burden of collecting the service termination forms is 0.15 × 1,500 × (20/60) = 75. Although it is expected that most terminations occur within a participant’s first four months in the program, forms might be submitted throughout the implementation period. This implies an annualized number of burden hours of up to 75 / (30/12) = 30, which corresponds to an annualized burden cost of $33.87 × 30 = $1,016.
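
The three calculations above can be reproduced in a short script (a sketch; the percentages, times per form, and the round-half-up convention are taken from the bullets):

```python
# Program participation record burden (assumptions from the bullets above).
def half_up(x):
    """Round half up, matching the rounding used in the text (712.5 -> 713)."""
    return int(x + 0.5)

program_group = 1_500

# Participant tracking: ~10 active person-months per member, 3 minutes each.
tracking = program_group * 10 * (3 / 60)          # 750.0 hours

# Seed capital requests: 95% of members submit, 1.5 forms each on average;
# 20 minutes per participant form plus 10 minutes of advisor certification.
requests = program_group * 0.95 * 1.5             # 2,137.5 forms
participant = half_up(requests * (20 / 60))       # 713 hours
advisor = half_up(requests * (10 / 60))           # 356 hours

# Service terminations: 15% of the program group, 20 minutes each.
terminations = 0.15 * program_group * (20 / 60)   # 75.0 hours

total_hours = tracking + participant + advisor + terminations
print(total_hours)  # 1894.0
```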

The total hour burden of the follow-up survey is expected to be 2,400. This figure is based on the assumptions that (1) the follow-up survey will achieve a response rate of 80 percent from the initial sample of 3,000 study members; and (2) the survey will take, on average, 60 minutes to complete. Using these assumptions, the estimated total hour burden for this data collection effort is calculated as 0.80 × 3,000 × (60/60) = 2,400. The survey will be fielded to each respondent after a similar number of months have elapsed since random assignment. Hence the duration of the fielding period parallels the duration of the application period and is expected to range between 18 and 24 months. Conservatively assuming the smaller duration results in an annualized number of burden hours of 2,400 / (18/12) = 1,600, which implies an annualized burden cost for the follow-up survey of $16.63 × 1,600 = $26,608.
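
The survey figures can likewise be checked (a sketch using only the response rate, interview length, fielding period, and wage rate stated above):

```python
# Follow-up survey burden (assumptions from the paragraph above).
sample = 3_000
response_rate = 0.80
total_hours = response_rate * sample * (60 / 60)  # 60-minute interview -> 2400.0
annual_hours = total_hours / (18 / 12)            # shortest fielding period -> 1600.0
annual_cost = round(annual_hours * 16.63)         # applicant wage rate -> 26608
print(total_hours, annual_hours, annual_cost)
```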

It is anticipated that the total hour burden of the site visits will be 576 hours, which is expected to correspond to 288 annualized burden hours for an annualized burden cost of $9,010. For both rounds of site visits, which will occur over a two-year period, this figure includes 90 minutes spent on site meeting with each of three self-employment advisors, three directors of MDO partner organizations, and three administrators at partner workforce agencies (e.g., the workforce investment board and local UI office), as well as 60 minutes on site meeting with each of up to four field staff members at local AJCs and each of up to eight additional staff at the main MDO partner organization or other organizations in the community providing explicit support for the demonstration program model. The figure also assumes that each of the 21 respondents per site will spend, on average, 30 minutes coordinating with the study team and planning for each visit before the on-site meetings occur. Thus, with two rounds of site visits and eight study sites, the hour burden for self-employment advisors, MDO directors, and local workforce administrators is calculated as (2 rounds) × (8 sites) × (9 respondents per site) × (90 minute meeting + 30 minutes of preparation)/(60 minutes per hour) = 288 total hours. Given the two-year period over which the site visits will be conducted, this implies an annualized number of burden hours equal to 288 / (24/12) = 144. Applying, as above, an average wage rate of $33.87 per hour yields an annualized burden cost of $4,877. Similarly, the total hour burden for AJC field staff and additional staff at partner MDOs or other CBOs will be 2 × 8 × 12 × (60 + 30)/60 = 288, or 144 annualized burden hours. Assuming that the average wage of such staff is $28.70 per hour, this implies an annualized burden cost of $4,133.
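
The site visit arithmetic can be checked the same way (a sketch; respondent counts, meeting lengths, and wage rates are as stated in the paragraph above):

```python
# Site visit burden arithmetic (figures from the paragraph above).
rounds, sites = 2, 8

# Senior respondents: 3 advisors + 3 MDO directors + 3 workforce administrators,
# each with a 90-minute meeting plus 30 minutes of preparation.
senior = rounds * sites * 9 * (90 + 30) / 60   # 288.0 hours

# Other staff: up to 4 AJC field staff + 8 additional MDO/CBO staff,
# each with a 60-minute meeting plus 30 minutes of preparation.
other = rounds * sites * 12 * (60 + 30) / 60   # 288.0 hours

years = 24 / 12  # two-year site visit period
cost = round(senior / years * 33.87) + round(other / years * 28.70)
print(senior + other)  # 576.0 total hours
print(cost)            # 9010 annualized dollars
```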

Sixty-minute case study interviews will be conducted with 32 members selected from the program group. This corresponds to 32 × (60/60) = 32 total burden hours for the case study interviews. Because case study interviews will occur over a period of approximately 12 months, the annualized number of burden hours is also 32. Given an estimated wage rate of $16.63, this implies an annualized burden cost of $532.
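
Summing the component figures reported above reproduces the totals in Tables A.4 and A.5 (a sketch; every number is taken directly from the text):

```python
# Component burden figures from A.12, summed to check the table totals.
total_hours = {"applications": 4_000, "participation_records": 1_894,
               "follow_up_survey": 2_400, "site_visits": 576, "case_studies": 32}
annual_hours = {"applications": 2_667, "participation_records": 757,
                "follow_up_survey": 1_600, "site_visits": 288, "case_studies": 32}
annual_cost = {"applications": 44_352, "participation_records": 20_727,
               "follow_up_survey": 26_608, "site_visits": 9_010, "case_studies": 532}

print(sum(total_hours.values()))   # 8902   (Table A.4 total)
print(sum(annual_hours.values()))  # 5344   (Table A.5 total)
print(sum(annual_cost.values()))   # 101229 (Table A.5 total)
```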

13. Estimated Total Annual Cost Burden to Respondents and Record Keepers

There will be no start-up or ongoing financial costs incurred by respondents that result from the data collection efforts of the SET Evaluation. The proposed information collection plan will not require the respondents to purchase equipment or services or to establish new data retrieval mechanisms.

14. Estimated Annualized Cost to the Federal Government

The contractor will incur a cost of $3,200,000 when carrying out the study over a four-year period, for an annualized cost of $800,000. Of these expenses:

  • $342,495 is for development and conduct of the evaluation’s random assignment procedures, corresponding to an annualized cost of $85,623.75;

  • $56,508 is for development, testing, and maintenance of the management information system for program participation records, corresponding to an annualized cost of $14,127;

  • $396,664 is for the administration of the follow-up survey, corresponding to an annualized cost of $99,166; and

  • $250,727 is for conduct of the implementation study site visits and case study interviews, corresponding to an annualized cost of $62,681.75.

15. Changes in Burden

The data collection efforts for the SET Demonstration are new and will count as 8,902 total hours toward ETA’s information collection burden.

16. Publication Plans and Project Schedule

The data collection for which this Supporting Statement is seeking clearance will not result in publicly available records. However, data collected from the baseline applications and follow-up surveys may be made available by ETA to authorized researchers through restricted use data files, if such data files are produced at the conclusion of the study. Data and study progress will be documented internally throughout the project.

The evaluation plan includes a range of deliverables and reports. Table A.6 shows an outline of these deliverables, followed by a fuller explanation of each item.

Table A.6. Deliverable Time Line

Deliverable | Date
Demonstration Procedures Manual | January 2013
Design Report | March 2013
Issue Briefs (2) | June 2015
Final Report | December 2016

Design report. The SET Demonstration’s design report will detail the study team’s strategy for carrying out the demonstration’s activities. This report will specify the study’s conceptual framework and key research questions; detail random assignment procedures; describe procedures for monitoring random assignment, providing assistance to sites, and monitoring compliance; detail the data collection procedures for the study; detail procedures for carrying out the implementation study; describe the statistical methods that will be used to estimate impacts; and specify key project milestones and deliverables. The design report will also contain copies of all data collection instruments.

Demonstration procedures manual. As part of the study design activities, the contractor will also develop a procedures manual that describes how study sites will identify demonstration participants, obtain consent for study participation, conduct random assignment, and communicate random assignment results. The guide will also detail the menu of services that will be offered to demonstration participants, including business development counseling from a self-employment advisor and assistance gaining access to seed capital, and will communicate expectations about the intensity and quality of services, including the average level of contact between staff and participants and the duration of services. The SET procedures manual will serve as the basis for staff training in each study site. To ensure that this guide is readily available at each site, printed copies will be provided to all relevant site staff. In addition, each site will receive an electronic copy of the guide and its accompanying forms in portable document format (PDF), so staff can print as many additional copies as necessary.

Issue briefs. To provide timely information about SET implementation to ETA and other stakeholders, the contractor will prepare two issue briefs on evaluation findings. The first issue brief will be based on the cross-site implementation analysis and will identify factors or considerations that might help explain why the impacts of the SET program vary from one site to the next or why different participant subgroups experience differential impacts, should either of these scenarios emerge. Implementation study findings will also appear as a chapter in the evaluation’s final report.

The evaluation’s second issue brief will summarize findings from the case study analysis by identifying common themes in the experiences of successful and unsuccessful SET program group members. The brief will provide cross tabulations and other analyses to shed light on similarities and differences in the experiences of program participants according to their preliminary self-employment status. Baseline data from the intake/application form will round out the descriptive analyses. Vignettes drawn from the case study profiles will provide illustrations of the identified themes for the issue brief and final report. (Any vignettes developed would be carefully reviewed to safeguard the participants’ identities.) As with the implementation analysis, findings from the case study analysis will also appear as a chapter of the final report.

Final report. Findings from all the data collected and analyzed for this project, both qualitative and quantitative, will be included in the final report. This report will focus on the impact findings, identifying whether the outcomes of the study groups differ in the 18 months after intake into the study. Additionally, the final report will include data from the issue briefs in an effort to interpret why such differences (or lack thereof) might exist.

17. Reasons for Not Displaying Expiration Date of OMB Approval

The expiration date for OMB approval will be displayed on all forms distributed as part of the data collection.

18. Exceptions to the Certification Statement

Exception to the certification statement is not requested for the data collection.

REFERENCES

Baptista, Rui, Murat Karaöz, and Joana Mendonça. “Entrepreneurial Backgrounds, Human Capital, and Start-Up Success.” Jena Economic Research Papers. Jena, Germany: Friedrich-Schiller University and the Max Planck Institute of Economics, 2007.

Bellotti, Jeanne M., Sheena M. McConnell, and Jacob Benus. “Growing America Through Entrepreneurship: First Findings from Project GATE.” Princeton, NJ: Mathematica Policy Research, August 2006.

Benus, Jacob M., Terry R. Johnson, Michele Wood, Neelima Grover, and Theodore Shen. “Self-Employment Programs: A New Reemployment Strategy: Final Report on the UI Self-Employment Demonstration.” Unemployment Insurance Occasional Paper 95-4. Washington, DC: U.S. Department of Labor, Employment and Training Administration, 1995.

Benus, Jacob, Sheena M. McConnell, Jeanne M. Bellotti, Theodore Shen, Kenneth N. Fortson, and Daver Kahvecioglu. “Growing America Through Entrepreneurship: Findings from the Evaluation of Project GATE.” Final report submitted to the U.S. Department of Labor, Employment and Training Administration. Columbia, MD: IMPAQ International, LLC, May 2008.

Benus, Jacob, Theodore Shen, Sisi Zhang, Marc Chan, and Benjamin Hansen. “Growing America Through Entrepreneurship: Final Evaluation of Project GATE.” Final report submitted to the U.S. Department of Labor, Employment and Training Administration. Columbia, MD: IMPAQ International, LLC, December 2009.

Caliendo, Marco, Frank Fossen, and Alexander Kritikos. “The Impact of Risk Attitudes on Entrepreneurial Survival.” Journal of Economic Behavior and Organization, vol. 76, no. 1, October 2010, pp. 45–63.

Caliendo, Marco, Frank Fossen, and Alexander Kritikos. “Personality Characteristics and the Decision to Become and Stay Self-Employed.” IZA Discussion Paper 5566. Bonn, Germany: Institute for the Study of Labor (IZA), 2011.

Curtin, Richard, Stanley Presser, and Eleanor Singer. “Changes in Telephone Survey Nonresponse Over the Past Quarter Century.” Public Opinion Quarterly, vol. 69, no. 1, spring 2005, pp. 87–98.

Evans, David S., and Linda S. Leighton. “Some Empirical Aspects of Entrepreneurship.” American Economic Review, vol. 79, no. 3, June 1989, pp. 519–535.

Fairlie, Robert W., and Alicia Robb. Race and Entrepreneurial Success: Black-, Asian-, and White-Owned Businesses in the United States. Cambridge, MA: MIT Press, 2008.

Harada, Nobuyuki. “Who Succeeds as an Entrepreneur? An Analysis of the Post-Entry Performance of New Firms in Japan.” Japan and the World Economy, vol. 15, no. 2, April 2003, pp. 211–222.

Jäckle, Annette, and Peter Lynn. “Respondent Incentives in a Multi-Mode Panel Survey: Cumulative Effects on Nonresponse and Bias.” Working paper presented to the Institute for Social and Economic Research, University of Essex, Colchester, United Kingdom, 2007.

Kapteyn, Arie, and Jelmer Y. Ypma. “Measurement Error and Misclassification: A Comparison of Survey and Administrative Data.” Journal of Labor Economics, vol. 25, no. 3, July 2007, pp. 513–551.

Kay, Ward R. “The Use of Targeted Incentives to Reluctant Respondents on Response Rates and Data Quality.” Proceedings of the American Association for Public Opinion Research. Montreal, Canada: American Association for Public Opinion Research, 2001.

Kosanovich, William T., and Heather Fleck. “Comprehensive Assessment of Self-Employment Assistance Programs.” ETA Occasional Paper 2002-01. Washington, DC: U.S. Department of Labor, Employment and Training Administration, 2002.

Mack, Stephen, Vicki Huggins, Donald Keathley, and Mahdi Sundukchi. “Do Monetary Incentives Improve Response Rates in the Survey of Income and Program Participation?” Proceedings of the American Statistical Association, Survey Research Methods Section, 1998, pp. 529–534.

Schwartz, Lisa K., Lisbeth Goble, and Edward M. English. “Counterbalancing Topic Interest with Cell Quotas and Incentives: Examining Leverage-Salience Theory in the Context of the Poverty in America Survey.” Proceedings of the American Association for Public Opinion Research. Montreal, Canada: American Association for Public Opinion Research, 2006.

Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly, vol. 64, no. 2, summer 2000, pp. 171–188.



1 To receive training services under Title I of WIA, a dislocated worker is an individual who (1) (A) has been terminated or laid off or has received a notice of termination or layoff from employment, and (B) (a) is eligible for or has exhausted unemployment insurance or (b) has demonstrated an appropriate attachment to the workforce, but is not eligible for unemployment insurance, and (C) is unlikely to return to a previous industry or occupation; (2) has been terminated or laid off or received notification of termination or layoff from employment as a result of a permanent closure or substantial layoff, or is employed at a facility where the employer has made the general announcement that the facility will close within 180 days; (3) was self-employed but is unemployed as a result of general economic conditions in the community or because of a natural disaster; or (4) is a displaced homemaker who is no longer supported by another family member. Individuals will be considered eligible for the SET Demonstration if they meet any of these four qualifications, irrespective of whether they register for staff-assisted services with a WIA American Job Center.

2 Intake into the demonstration will proceed until the demonstration reaches its participation target (3,000 eligible applicants) across participating study sites.

3 In addition, the evaluation of the SET demonstration will request from the appropriate public agencies in the states where the demonstration is implemented administrative records data on study participants’ receipt of unemployment insurance (UI) benefits and their UI-covered employment and wages. These data are regularly collected by these agencies, rather than exclusively for the purposes of this demonstration.

4 The unemployment statistics cited in this paragraph are based on data maintained by the Bureau of Labor Statistics and available at http://data.bls.gov/timeseries/lns14000000.

5 Title I of WIA, Section 171(b) states that DOL shall “… through grants or contracts, carry out demonstration and pilot projects for the purpose of developing and implementing techniques and approaches, and demonstrating the effectiveness of specialized methods, in addressing employment and training needs.” Section 172 grants DOL the authority to evaluate the activities authorized under Section 171, and Section 172(c) specifies that the agency “shall utilize appropriate methodology and research designs, including the use of control groups chosen by scientific random assignment methodologies.”

6 SEA participants also receive an allowance of equal value in lieu of their UI benefit and are not subject to work-search requirements, as long as they are engaged full-time in qualifying activities related to starting their business. These SEA allowances are not diminished by any self-employment earnings. As described later, the evaluation team will explore whether it is possible to provide similar allowances in the sites ultimately selected to participate in the SET Demonstration.

7 In June 2008, ETA awarded GATE II grants to the state workforce agencies in Alabama, Minnesota, North Carolina, and Virginia to provide self-employment assistance to older workers and workers in rural areas. However, a discussion of the GATE II project is not included here because no findings have been released as of this writing.

8 Seed capital offered through the Washington demonstration project differed from the microgrants that will be offered to SET participants in three important ways. First, SEED offered these funds in a way that simulated a “cash out” of a participant’s UI entitlement. In the SET program, microgrants will be independent of an individual’s UI benefits receipt. Second, the average lump-sum payment received by SEED participants was $7,129 in 2012 dollars, which is substantially greater than the maximum allocation of $1,000 per SET participant. Third, in addition to requiring the completion of business milestones and program participation, the SEED program specified that lump-sum payments be given to participants who had already obtained “adequate financing” (Benus et al. 1995, page iv). The SET program, by contrast, will not require participants to have preexisting financing. Instead, a potential use of the seed capital microgrants could be to better position SET participants to attract subsequent loans, capital grants, or investments.

9 As such, the study population consists of a purposively selected quota sample recruited from a broader population of interested individuals who self-select into SET orientation sessions. There is no burden imposed on the broader population of individuals participating in orientations, since no information will be collected from them. However, as noted below in this section, it is anticipated that the burden associated with completing study application materials could be incurred by up to 4,000 applicants in order to meet the study’s enrollment target of 3,000 individuals going through random assignment. The implications of the study selection process for interpreting the study’s statistical findings are described in Part B of this package.

10 U.S. Small Business Administration. “Credit Factors.” Washington, DC: SBA, n.d. Available at http://www.sba.gov/category/navigation-structure/loans-grants/small-business-loans/application-process/credit-factors. Accessed November 10, 2011.

11 Respondent burden is discussed in more detail in Section A.12.

12 The consent form also indicates that ETA may sponsor additional follow-up surveys and collection of data about employment, earnings, and receipt of public benefits using administrative data sources. The current evaluation plans specify that a follow-up survey be conducted 18 months after random assignment, a period for which survey data are likely to be more advantageous than administrative data in measuring outcomes (see Section A.4). However, previous research (Benus et al. 1995, 2009) highlights the importance of examining the effects of self-employment training interventions over a longer period. Obtaining consent for additional rounds of follow-up and collection of administrative records data at intake would facilitate tracking the outcomes of sample members over a longer period. A separate OMB clearance request would be submitted for data collection instruments associated with such a longer-term evaluation.

13 For instance, Sections A, C, D, E, and F of the survey (or portions of these sections) could be slightly modified to reflect a longer follow-up period—for example, 3, 5, or 8 years after random assignment—to support the estimation of longer-term impacts of the SET program. A separate OMB clearance request would be submitted for any modified data collection instruments associated with a longer-term evaluation.

14 Wage records from state administrative UI systems do not cover self-employment earnings.

15 This baseline-dislocated subgroup of the GATE study sample consists of individuals who were (1) unemployed at the time of their application to the program and (2) had collected UI benefits in the 12 months prior to applying. Such individuals are likely to most closely resemble the pool of dislocated workers who will apply to the SET Demonstration. Public-use data files for Project GATE are available online from ETA at: http://www.doleta.gov/reports/projectgate/.

16 ETA is also open to any suggestions that OMB might have regarding alternative approaches to structuring the incentive payment experiment.

17 Hourly wage rates were calculated using the Project GATE public use dataset based on members of the control group meeting the baseline dislocation criteria described in Section A.9. At the six-month follow-up survey (the midpoint of which was April 2005), the average wage rate among employed members of this subgroup was $14.15. At the eighteen-month follow-up, this average was $14.62. Adjusting for inflation, these wage rates translate to $16.62 and $16.64, respectively, in 2012 dollars. Given the similarity between these figures, a wage rate of $16.63 per hour is used for potential SET Demonstration applicants and participants.

18 Based on the May 2011 National Occupational Employment and Wage Estimates maintained by the Bureau of Labor Statistics (http://www.bls.gov/oes/current/oes_nat.htm), the average hourly wage for “Business Operations Specialists, All Other” was $33.21, which corresponds to $33.87 in 2012 dollars.

19 The table and subsequent narrative do not include a restricted-use data file, since such a deliverable is outside the scope of the current plans for the study.
