Disability Employment Initiative Evaluation

OMB: 1230-0010

Paperwork Reduction Act Submission
Supporting Statement Part B for the Office of Disability Employment Policy (ODEP)



Evaluation of the Disability Employment Initiative



August 1, 2016

Table of Contents


Part B: Collection of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

B.1.1 Site Selection

B.1.2 Respondent Universe and Study Sample Size

B.2 Procedures for the Collection of Information

B.2.1 Power Analysis

B.3 Methods to Maximize Response Rates and Deal with Non-Response

B.3.1 Site Visits

B.3.2 Use of Existing Data Systems

B.3.2.1 Creation of a Participant Tracking System

B.3.2.2 New DEI Round 5 Data Collection Requirements

B.3.2.3 DEI Round 5 Indicator (DEIRS)

B.3.2.4 DEI Round 5 Career Pathways Indicator (DEIRS-CP)

B.3.2.5 DEI Round 5 Service Delivery Strategy Indicators

B.3.2.6 Creation and Implementation of a Participant Survey

B.4 Tests of Procedures or Methods to be Undertaken

B.5 Individuals Consulted on Statistical Aspects and/or Analyzing Data

References – Part B


List of Exhibits & Appendix


Exhibit B.1: DEI Round 5 Grantees and Expected Total and Career Pathways Enrollment (Over Grant Period)

Exhibit B.2: WDAs Included in the Selection Pool

Exhibit B.3: Number of WDAs Participating in the DEI and Grantee Estimates of the Number of Individuals Participating in the DEI Data Collection (Treatment Group Estimates)

Exhibit B.4: Power Calculations for a Cluster Analysis

Exhibit B.5: Power Calculations for a Non-Cluster Analysis


Appendix B.1: DEI Solicitation for Grant Applications (attached)

Appendix B.2: New Questions on Career Pathways for Site Visit/Interview Protocol


Part B: Collection of Information Employing Statistical Methods


Due to the Disability Employment Initiative’s (DEI) emphasis on systems change, the unit of analysis for the DEI Rounds 5 and 6 is the Workforce Development Area (WDA), and any impact analysis must be conducted at the WDA (or system) level. For the DEI Rounds 1-4 evaluation, which is currently underway, DOL elected to randomly assign WDAs to treatment and control groups. The purpose of this experimental design was to ensure that the impact of the DEI program could be estimated by comparing the participant outcomes in each group.


For the Rounds 5 and 6 DEI Evaluation, DOL has elected to use a quasi-experimental design to determine the impact of DEI interventions on participant outcomes. The Round 5 and 6 impact evaluation includes two study designs. The first is a matched comparison group design, with the treatment and comparison conditions established at the WDA level. WDAs are purposively assigned to the treatment or comparison group. The treatment WDAs were selected by the DEI Round 5 grantees as having the capacity to implement the DEI. The comparison WDAs were selected by Social Dynamics/Abt Associates based on their close proximity to the treatment WDAs and on their demographic and economic characteristics.


In this design, the matched comparison group of individuals with disabilities provides the counterfactual: matching is intended to eliminate systematic differences between the two groups of participants that might otherwise influence program outcomes, so that the overall difference in outcomes can be attributed to the intervention. For this design the unit of analysis is the WDA, and inference is at the WDA level. Because separate Local Workforce Investment Boards (LWIBs) administer the funding and programs delivered in the American Job Centers located in their respective WDAs, and because many of the DEI strategies are implemented at the WDA system level rather than at the individual level, the WDA must be the unit of analysis for the evaluation.


A second, additional design will evaluate the impact of the specific career pathways component of the R5FR. In this design, we will match similar participants within the Round 5 grantee treatment WDAs, with the primary difference being enrollment in the career pathways component. Inference is at the participant level, and this design allows us to determine the impact of career pathways on participant outcomes.



B.1 Respondent Universe and Sampling Methods


As with DEI Rounds 1-4, DEI R5FR will be implemented in selected states that received a DEI grant. DOL selected states to receive DEI grants based on the funding available and the merits of the DEI activities proposed in the states’ applications. In addition, all of the awarded states have had previous DEI Rounds 1-4 grants and Disability Program Navigator (DPN) grants. DOL solicited applications from states in Rounds 5 and 6, the first (Round 5) occurring in the summer of 2014 and the second (Round 6) in the summer of 2015. The selection criteria used by DOL were the same for both rounds: the state’s strategic approach; partnership commitments and resources; demonstrated experience providing services targeting people with disabilities; successful project management experience; and potential for achieving stated outcomes and sustainability after the grant period. R5FR will include career pathways and the development of job-driven, innovative, integrated, flexible, and universally-designed service delivery strategies that effectively increase the participation of individuals with disabilities. The career pathways approach complements strategies implemented under DEI Rounds 1-4 by focusing on:


  • Developing partnerships and collaboration across multiple service delivery systems;

  • Blending and braiding of funds to leverage resources;

  • Providing flexible opportunities and access to training and employment for persons, including low-income youth and adults with disabilities, and others facing multiple challenges to employment; and

  • Creating systemic change.


R5FR grantees are also required to implement six program requirements:


1. Hire a full-time state DEI Project Lead;

2. Hire one or more Disability Resource Coordinators (DRC);

3. Maintain American Job Center accessibility;

4. Participate in the Ticket to Work program as an Employment Network;

5. Plan for sustaining DEI activities after the three-year grant period; and

6. Implement career pathways as a DEI strategy.


In addition, Rounds 5 and 6 are funded to implement the following activities, all of which were included in DEI Rounds 1-4:


  1. Integrated resource teams;

  2. Integrated resources;

  3. Customized employment;

  4. Self-employment;

  5. Implementation of the Guideposts for Success;

  6. Asset development strategies;

  7. Fostering partnerships and collaboration;

  8. Using principles of universal design;

  9. Implementing customized employment;

  10. Aligning youth and adult career pathways programs with the use of individualized learning plans.


A primary goal of the DEI evaluation is to identify the impact of the DEI activities on employment-related and other outcomes for people with disabilities. To accomplish this, a quasi-experimental design with a matched comparison group will be used. The treatment group will consist of eligible participants in the DEI Round 5 WDAs, and the comparison group will consist of eligible participants in surrounding WDAs that are not participating in DEI Round 5 activities, located within the same state. All individuals receiving services at an American Job Center located in a treatment site and who self-identify as having a disability will be eligible to receive the DEI services, and will become members of the treatment group. Individuals receiving services at an American Job Center located in a comparison site and who self-identify as having a disability will receive traditional services and will become members of the comparison group. DEI services include one or more of the activities (1-10) listed above, in addition to case management services and career pathways programs. Comparison group individuals will have access to the traditional American Job Center services, including WIA-WIOA intensive and training services.


B.1.1 Site Selection


As noted above, DOL selected the states that would participate in the DEI based on the funding available and the merits of the grant applications submitted by states. In their applications, states were asked to identify the specific Workforce Development Areas (WDAs) that would be willing and able to implement the DEI strategies and also participate in the data collection activities associated with the evaluation. Because separate Local Workforce Investment Boards (LWIBs) administer the funding and programs delivered in the American Job Centers located in their respective WDAs, and because many of the DEI strategies are implemented at the WDA system level rather than at the individual level, the WDA is the unit of assignment for the primary evaluation. WDAs were selected by each grantee with support from DOL using a purposive sampling methodology.


Round 5 Sites. Six grantees were awarded in 2014: California, Kansas, Illinois, Massachusetts, Minnesota and South Dakota. Each grantee selected a number of treatment WDAs based on available resources and their capacity to implement the R5FR program. Comparison group participants (who enrolled in services in surrounding WDAs) will be selected based on the demographic characteristics of their participant populations, drawn from WIASRD data, and on the economic characteristics of the WDAs. The selection of potential comparison sites was completed in fall 2015.


To create a similar comparison group of participants, we will first take the WDAs that are surrounding the treatment WDAs, and determine the expected number of comparison participants from those surrounding WDAs. We will continue to add comparison WDAs until we reach an expected comparison group size that is approximately 25% to 30% larger than the number of treatment group participants within that state, since we assume that non-response to the survey on disability type will be higher for the comparison group. In addition, there will be some participants in the comparison group that are not appropriate matches for the treatment group participants, and will thus be dropped from the analysis sample.
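As a concrete illustration of the 25% to 30% oversizing rule described above, the target size range for the comparison pool can be computed with ceiling arithmetic. This is an illustrative sketch (the function name is ours, not part of the evaluation plan):

```python
def comparison_pool_target(n_treatment, pct_low=25, pct_high=30):
    """Return the (low, high) target size range for the comparison pool,
    oversized by pct_low to pct_high percent relative to the treatment
    group. Integer ceiling division avoids float rounding and ensures
    the pool is never undersized."""
    low = -(-n_treatment * (100 + pct_low) // 100)    # ceil(n * 1.25)
    high = -(-n_treatment * (100 + pct_high) // 100)  # ceil(n * 1.30)
    return low, high

# For the 1,230 expected Round 5 treatment participants, the pooled
# comparison WDAs would need roughly 1,538 to 1,599 eligible participants.
print(comparison_pool_target(1230))
```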


Selecting comparison WDAs by geographic proximity ensures that comparison group participants enrolled in services provided by nearby local AJCs. This helps control for differences in economic and demographic characteristics across WDAs by ensuring that treatment and comparison WDAs are located within the same region of the state.


Once the treatment and comparison WDAs are selected, we will then use a propensity score matching strategy to match comparison group participants with similar treatment group participants, using the demographic characteristics that we have through our data systems, participant tracking system, and the survey on disability type.
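The matching step can be sketched as greedy 1:1 nearest-neighbor matching on estimated propensity scores. This is a simplified illustration, not the evaluation team's actual specification: score estimation (e.g., a logistic regression on the demographic and disability covariates) is assumed to have happened upstream, and the caliper value is illustrative.

```python
def nearest_neighbor_match(treat_scores, comp_scores, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores,
    without replacement. Treatment cases with no comparison case
    within the caliper remain unmatched (and would be dropped from
    the analysis sample, as described above)."""
    available = set(range(len(comp_scores)))
    matches = {}
    for i, score in enumerate(treat_scores):
        if not available:
            break
        j = min(available, key=lambda c: abs(comp_scores[c] - score))
        if abs(comp_scores[j] - score) <= caliper:
            matches[i] = j        # treatment index -> comparison index
            available.remove(j)
    return matches

# Two treatment cases matched to their closest comparison cases:
print(nearest_neighbor_match([0.30, 0.60], [0.28, 0.61, 0.90]))
```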


Exhibit B.1 lists the expected number of treatment group participants in each state provided by the Round 5 grantees, along with the expected number of participants who will enroll in the career pathways component of the R5 intervention.


Exhibit B.1: DEI Round 5 Grantees and Expected Total and Career Pathways Enrollment in Treatment WDAs (Over Grant Period)

State | Target Population | Expected Enrollment (Total/CP) | WDAs
California | Adult | 375/245 | 3
Kansas | Adult | 140/120 | 3
Illinois | Youth | 320/195 | 2
Massachusetts | Adult | 165/140 | 3
Minnesota | Adult | 155/120 | 3
South Dakota | Adult | 75/75¹ | 1
TOTAL |  | 1,230/895 | 15



Exhibit B.2 lists the names and IDs of the treatment group WDAs, along with the anticipated comparison group WDAs that will be used in the impact evaluation.


Exhibit B.2: WDAs Included in the Selection Pool

State | WDA Code | WDA Name | Treatment/Comparison
Massachusetts | 25020 | North Central Mass | Treatment
Massachusetts | 25030 | Metro North | Treatment
Massachusetts | 25025 | Central Mass | Treatment
Massachusetts | 25035 | Brockton | Comparison
Massachusetts | 25070 | Metro South/West | Comparison
Massachusetts | 25055 | Merrimack Valley | Comparison
Minnesota | 27055 | Southwest Minnesota | Treatment
Minnesota | 27105 | Central Minnesota | Treatment
Minnesota | 27085 | Anoka County | Treatment
Minnesota | 27035 | North East Minnesota | Comparison
Minnesota | 27045 | North West Minnesota | Comparison
Minnesota | 27115 | Ramsey County | Comparison
Minnesota | 27040 | Rural Minnesota Concentrated Employment Program | Comparison
Minnesota | 27030 | South Central | Comparison
Minnesota | 27110 | Stearns-Benton | Comparison
Minnesota | 27100 | Washington County | Comparison
Minnesota | 27080 | Winona | Comparison
Kansas | 20005 | WIA-WIOA Administrative Entity #1 | Treatment
Kansas | 20015 | WIA-WIOA Administrative Entity #3 | Treatment
Kansas | 20020 | WIA-WIOA Administrative Entity #4 | Treatment
Kansas | 20010 | WIA-WIOA Administrative Entity #2 | Comparison
Kansas | 20025 | WIA-WIOA Administrative Entity #5 | Comparison
Illinois | 17035 | Cook County, Balance of – Area 7 | Treatment
Illinois | 17030 | Dupage County | Treatment
Illinois | 17015 | Boone/Winnebago Counties #3 | Comparison
Illinois | 17060 | Bureau/Lasalle/Lee/Putnam #12 | Comparison
Illinois | 17085 | Champaign Consortium | Comparison
Illinois | 17055 | Grundy/Kankakee/Livingston | Comparison
Illinois | 17025 | Kane/Dekalb/Kendall Counties | Comparison
Illinois | 17005 | Lake County | Comparison
Illinois | 17010 | McHenry County | Comparison
Illinois | 17050 | Will County | Comparison
South Dakota | N/A* | Central Area | Treatment
California | 6170 | Sacramento City/County | Treatment
California | 6090 | Merced County | Treatment
California | 6280 | SELACO—Southeast Los Angeles Consortium | Treatment
California | 6115 | Solano County | Comparison
California | 6175 | San Joaquin County | Comparison
California | 6210 | Yolo County | Comparison
California | 6125 | Stanislaus County | Comparison
California | 6160 | Fresno County | Comparison
California | 6220 | Madera County | Comparison
California | 6015 | Long Beach (city) | Comparison
California | 6020 | Los Angeles (city) | Comparison
California | 6275 | Balance of Los Angeles County | Comparison

Note: California comparison sites had not yet been selected; the California comparison WDAs listed are anticipated.


* The entire state of South Dakota is a single WDA.



B.1.2 Respondent Universe and Study Sample Size


The population of interest includes American Job Centers and their users in the DEI treatment and comparison WDAs who self-identify as having a disability. Five of the Round 5 grantees (California, Kansas, Massachusetts, Minnesota and South Dakota) will target adult participants with disabilities. One grantee, Illinois, will target youth participants with disabilities.


Individuals receiving American Job Center services in the treatment and comparison WDAs who self-disclose a disability will be included in the study sample.² A probability sample is not used for two reasons. First, administrative data are already collected from all individuals using American Job Center services through the WIASRD and Wagner-Peyser systems, so using these data creates a relatively small burden per person; a process that randomly samples people with disabilities would be cumbersome without substantially reducing overall participant burden. Second, many sites are likely to have only a small number of participants with disabilities, so collecting data from only a sample would result in low precision. Higher precision is needed for the impact analysis to detect the smallest effects that are economically meaningful. The precision of the impact estimates is determined primarily by the site selection process.


It is estimated that 910 adults will be enrolled in the 13 sites targeting adult clients, and 320 youth in the 2 sites targeting youth customers, over the three-year period of the grant.³ The total estimated sample of DEI participants is 1,230. Estimates of the impact analysis power are therefore based on a small number of individuals and sites. As future rounds of the DEI are implemented, individuals will be added to the evaluation. Exhibit B.3 shows the estimated number of WDAs and individuals that will participate in the DEI in treatment sites, disaggregated by whether the grantee will target adults or youth. In each of these sites, all individuals who self-identify as having a disability will participate in the data collection effort.


The study universe is American Job Center service users who self-identify as having disabilities and receive services at American Job Centers that have indicated a willingness and ability to implement DEI strategies to improve services to people with disabilities, in states sufficiently motivated to apply for DEI grant funding and whose applications were deemed superior by DOL among all applications received. There are important implications of the site selection process and respondent universe for estimating the impacts of the DEI and generalizing the results of the study, which we discuss below.


Exhibit B.3: Number of WDAs Participating in the DEI and Grantee Estimates of the Number of Individuals Participating in the DEI Data Collection (Treatment Group Estimates)

Group | Number of WDAs/Areas | Estimated Number of Participants | Total Number of Individuals with Disabilities in the Participating WDAs
All sites participating in DEI Round 5 | 15 | 1,230 | 7,881⁴
Five Round 5 Adult Grantees | 13 | 910 | 6,497
One Round 5 Youth Grantee | 2 | 320 | 1,384


States and WDAs Included in the Study. The fact that both the states and the WDAs included in the study were selected based on their willingness to seek DEI grants and DOL’s assessment of the quality of the state’s proposal suggests that the impact findings will not necessarily be representative of the experiences of all states and WDAs that might implement such strategies. The estimated impacts might be biased as estimates of impacts for a national program, in which grant funds are made available to all WDAs, but the direction of the bias is uncertain. By design, the estimates should be unbiased for the set of WDAs that agree to participate in the grants and whose states are successful in obtaining grants (internal validity). If grants are made indiscriminately to WDAs external to the study, impacts might be smaller because those WDAs are not motivated or capable of using the grants as intended. It is also possible, however, that impacts for the external WDAs will be larger because, in contrast to the study WDAs which are presumably already motivated to implement DEI and ready and able to use the funds well, the external WDAs might become motivated by the newly available funding and might learn how to use the funds well from technical assistance that would also be available. Ultimately, the impacts of a national program will depend on how DOL distributes grant funds, regulates their use, provides technical assistance, and monitors WDA performance.


While the impact estimates and other study findings may not necessarily be generalizable to all American Job Centers, this does not diminish the need to conduct a rigorous evaluation. DOL expects to learn in greater detail about the experiences of American Job Centers in implementing a variety of service strategies intended to improve the employment outcomes of participants with disabilities, a group that is disproportionately long-term unemployed, underemployed, and living in poverty (Employment and Training Administration, 2015).⁵ Selecting states and WDAs that are motivated to improve services to this population, rather than selecting WDAs at random, increases the likelihood that the DEI strategies will be implemented with integrity. The result will be a rigorous, internally valid test of the impacts of these strategies and a comprehensive assessment of American Job Center experiences in undertaking them that can inform how best to implement such services nationally.


We acknowledge that the WDAs selected for Rounds 5 and 6 may have greater capacity and more experience in providing services to individuals with disabilities seeking employment. The evaluation team will conduct site visits to collect information on WDA characteristics and the implementation of DEI to determine whether these factors differ significantly across treatment and comparison WDAs. In the regression analyses, we will address differences across the WDAs by adding WDA-level indicator variables. This will provide a multilevel model that helps control for inherent differences across the WDAs.



Individuals Who Self-Identify a Disability. Initially, there was a concern that focusing on individuals who self-identify their disability might affect the impact estimates if the DEI strategies implemented at the treatment sites increase the likelihood that individuals self-identify a disability, or change the composition of those who do. For example, one of the DEI strategies for all DEI rounds is to have American Job Centers increase their efforts to become Employment Networks under SSA’s Ticket to Work program and to provide services in that capacity. It was hypothesized that this activity, and the community outreach associated with it, might result in more Social Security disability beneficiaries seeking American Job Center services at DEI treatment sites. As those meeting the SSA disability criteria have, by definition, very severe and long-lasting disabilities, and most face significant barriers to employment, it is possible that having a larger share of these individuals represented in the DEI treatment sites would result in lower observed employment outcomes at the treatment sites. However, a recent report on DEI Rounds 1-2 found that treatment site employment rates, employment retention rates, and wages were higher than those of the comparison WDAs. These findings were statistically significant.


To further address this issue, and other potential issues associated with relying on participant identification of disabilities to identify the DEI target population, DOL plans to match the DEI data with SSA administrative records to identify American Job Center participants who have recently participated in the Social Security disability programs. Recent disability program participation will act as another independent measure of disability that can be used to assess the impact of the DEI on American Job Center service utilization and employment outcomes among people with disabilities. Further, the evaluation will be able to assess the extent to which the DEI affects American Job Center use by Social Security disability beneficiaries and the characteristics of beneficiary users, and how these effects contribute to the impact on mean outcomes for beneficiary users. The match of DEI data with SSA data will take place in 2017 for Rounds 1-4 and 2019 for Round 5.


B.2 Procedures for the Collection of Information


Outcomes of interest will be measured at the individual level and include measures of service utilization and employment outcomes. The key utilization measure is whether an individual registering for services in an American Job Center self-identifies as having a disability or, for some analyses, is identified as a current or recent participant in a Social Security disability program (as determined by matching the DEI data with SSA administrative data). Process evaluation outcomes include the following:


  • Number and percentage of participants who enter training in career pathways programs;

  • Number and percentage of participants who enter certification or degree programs;

  • Number and percentage of participants who enter core, WIA-WIOA staff-assisted, WIA-WIOA intensive services, and WIA-WIOA training programs;

  • Number and percentage of participants who complete training in career pathways programs;

  • Number and percentage of participants who attain credentials and the types of credentials;

  • Employment rate of participants by program enrollment (e.g., core, staff-assisted, intensive, training, career pathways, other certification and degree programs);

  • Employment retention rate by program enrollment (e.g., core, staff-assisted, intensive, training, career pathways, other certification and degree programs); and

  • Earnings by program enrollment (e.g., core, staff-assisted, intensive, training, career pathways, other certification and degree programs).

In addition, WIASRD and WP data will be extracted for 4 quarters after the completion of Round 5 to determine the extent to which Round 5 produces long-term employment:


    • Employment rate of participants by program enrollment (e.g., core, staff-assisted, intensive, training, career pathways, other certification and degree programs);

    • Employment retention rate by program enrollment (e.g., core, staff-assisted, intensive, training, career pathways, other certification and degree programs);

    • Earnings by program enrollment (e.g., core, staff-assisted, intensive, training, career pathways, other certification and degree programs).



The evaluation team will collect WIOA data on a quarterly basis. Outcomes are analyzed over the period from enrollment through exit and job placement. Once a participant has exited an AJC, s/he may re-enroll in services. Outcomes will be captured for each episode of enrollment-exit-job placement.

In order to analyze individual outcomes, the evaluation team will create a longitudinal file for each individual and participation date. The most recent record for the individual becomes the primary record for the analysis, as this contains the most up-to-date information on that individual.
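The de-duplication rule described above — keep the most recent record per individual as the primary analysis record — can be sketched in a few lines (the field names are illustrative, not the actual WIASRD layout):

```python
def primary_records(records):
    """Given enrollment records as dicts with a 'participant_id' and an
    ISO-format 'enrollment_date', keep the most recent record for each
    participant: sorting ascending by date lets later records overwrite
    earlier ones in the dict."""
    primary = {}
    for rec in sorted(records, key=lambda r: r["enrollment_date"]):
        primary[rec["participant_id"]] = rec
    return list(primary.values())

records = [
    {"participant_id": "A1", "enrollment_date": "2015-03-01", "status": "exited"},
    {"participant_id": "A1", "enrollment_date": "2016-01-15", "status": "enrolled"},
    {"participant_id": "B2", "enrollment_date": "2015-06-10", "status": "exited"},
]
# A1's 2016 record becomes A1's primary record for the analysis.
print(primary_records(records))
```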

We will explore the possibility of accessing longer-term wage and employment information using the National Directory of New Hires (NDNH) data on our sample of treatment and comparison participants. This will give us the ability to track participants’ job information for longer than four months after their completion of services at the AJC.


Methods of Analysis: Overall DEI Round 5 Impact. The quasi-experimental design (QED) should minimize systematic observable or unobservable differences between WDAs in the treatment and comparison groups, except for the availability of the DEI services. We will use a multilevel regression model to estimate the impact of the DEI Round 5 services, to account for the fact that inference is at the WDA level and to control for both WDA-level and participant-level characteristics.


The analysis will follow the model below:


(1) Y_ijk = α + β1·T_jk + β2·P_ijk + β3·X_ijk + γ·W_j + δ·S_k + μ_j + ν_k + ε_ijk


Where

Y_ijk = the outcome for participant i in WDA j in state k

α = covariate-adjusted mean participant outcome for comparison WDAs

β1 = covariate-adjusted impact of DEI Round 5 (i.e., the difference between the mean outcome for treatment WDAs and the mean outcome for comparison WDAs)

T_jk = 1 for treatment WDAs and 0 for comparison WDAs

β2 = parameter estimate for the effect of the participant-level baseline measure (e.g., pre-enrollment wages)

P_ijk = pre-test measure for each participant i in WDA j in state k

β3 = a vector of parameter estimates for the effects of participant-level covariates in WDA j in state k

X_ijk = a vector of covariates for each participant i in WDA j in state k. The set of planned covariates includes gender, race and ethnicity, prior employment (a binary variable indicating whether the participant was previously employed), prior earnings, education level, and disability type (from the participant survey).

γ = parameter estimate for the average effect of WDA-level covariates

W_j = WDA-level covariate for each WDA j

δ = parameter estimate for the average effect of state-level covariates

S_k = state-level covariate for each state k

μ_j = a random error term for WDA j

ν_k = a random error term for state k

ε_ijk = a random error term for participant i in WDA j in state k


Impacts will be estimated using a three-level model, with the treatment impact estimated at the WDA level.⁶ We account for inherent differences between the WDAs and states by controlling for their characteristics through a multilevel fixed and random effects model. The effect of the DEI Round 5 intervention is represented by the level-2 parameter estimate, β1, which quantifies the difference in the participant outcome for treatment WDAs compared to the outcome for “business-as-usual” comparison WDAs. If the p-value for the parameter estimate is less than 0.05, we will conclude that there is a statistically significant impact of the DEI Round 5 intervention on the given participant outcome.
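The intuition behind the treatment contrast β1 can be illustrated, setting aside the covariate adjustment and random effects of the full model, by aggregating participant outcomes to WDA means and differencing treatment and comparison means, with each WDA weighted equally as the unit of analysis. This is an unadjusted sketch, not the estimation procedure itself:

```python
from collections import defaultdict

def wda_level_contrast(outcomes):
    """outcomes: iterable of (wda_id, is_treatment, participant_outcome).
    Aggregates participants to WDA-level mean outcomes, then returns the
    unadjusted difference between the treatment and comparison WDA means."""
    totals = defaultdict(lambda: [0.0, 0])
    treated = {}
    for wda, is_treatment, y in outcomes:
        totals[wda][0] += y
        totals[wda][1] += 1
        treated[wda] = is_treatment
    means = {w: s / n for w, (s, n) in totals.items()}
    t = [m for w, m in means.items() if treated[w]]
    c = [m for w, m in means.items() if not treated[w]]
    return sum(t) / len(t) - sum(c) / len(c)

# Two treatment WDAs (means 0.6 and 0.4) vs. two comparison WDAs (0.3, 0.3):
data = [("T1", True, 0.6), ("T2", True, 0.4),
        ("C1", False, 0.3), ("C2", False, 0.3)]
print(wda_level_contrast(data))  # 0.5 minus 0.3
```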


In order to take full advantage of the individual-level characteristics available through the Workforce Investment Act Standardized Record Data (WIASRD) and Wagner-Peyser (W-P) data systems, we will use demographic and employment characteristics, in addition to information on disability type and activities of daily living, to help create the match between participants in the treatment WDAs and comparison WDAs. We anticipate that there will be enough variation at the individual level for the evaluation team to create fine-grained propensity scores. Matching participants across disparate regions (even within the same state, such as California) and across different states will not occur during the propensity score matching process. WDA information will be collected on all WDAs in the analytic sample.


The propensity score matching analysis will involve multiple iterations of the matching process to ensure that our final match across the treatment and comparison groups is optimal. In addition to the individual characteristics at our disposal, we will have WDA-level characteristics to contribute to the creation of the propensity score for each individual. After iterations of the matching process, we will determine whether the treatment and comparison groups are equivalent at baseline; that is, equivalent in their characteristics prior to the start of their enrollment in services at the AJC, within 0.25 standard deviations of each other. Among these characteristics, we are particularly interested in wage information prior to enrollment in services at the AJC, as this information captures meaningful baseline variation in employment outcomes.
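The baseline-equivalence check can be sketched as a standardized mean difference using the pooled standard deviation, with groups within 0.25 standard deviations treated as equivalent. This is a minimal sketch of the check, not the team's production code:

```python
import math

def standardized_mean_difference(treat, comp):
    """Cohen's-d-style SMD: difference in group means divided by the
    pooled standard deviation (sample variances, n-1 denominator)."""
    def mean(v):
        return sum(v) / len(v)
    def var(v):
        m = mean(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    pooled_sd = math.sqrt(
        ((len(treat) - 1) * var(treat) + (len(comp) - 1) * var(comp))
        / (len(treat) + len(comp) - 2)
    )
    return (mean(treat) - mean(comp)) / pooled_sd

def equivalent_at_baseline(treat, comp, threshold=0.25):
    """Apply the 0.25-standard-deviation equivalence criterion."""
    return abs(standardized_mean_difference(treat, comp)) <= threshold
```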


Prior to the propensity score matching analysis, the evaluation team will determine whether the treatment WDAs and the potential comparison WDAs within the same state have similar demographic and employment characteristics. In the following analysis, we observe the descriptive characteristics of 1) treatment WDAs, 2) potential comparison WDAs within the same state, and 3) all other WDAs within the same state. For Round 5 grantees, we compiled this information for the three largest states by population: California, Illinois, and Massachusetts. We used WIASRD data collected on all individuals who received services in WIOA programs and who enrolled at any point between October 2012 and March 2015. As some individuals had multiple enrollments, we looked at only their longest period of enrollment, thereby creating a dataset with unique individual-level observations.


California Summary Statistics

Variable | Treatment | Comparison | Non-Treatment/Non-Comparison
Demographic Characteristics | | |
Female | 57.4% | 52.2% | 52.4%
Individual identifies as Hispanic | 39.7% | 57.4% | 44.4%
Individual identifies as Asian | 13.7% | 7.3% | 9.9%
Individual identifies as White | 39.8% | 36.8% | 47.1%
Individual identifies as Black | 21.9% | 27.3% | 14.7%
Age at participation | 33.60 | 32.83 | 36.78
Disability Characteristics | | |
Individual has a disability | 4.9% | 6.3% | 7.5%
Individual has a physical impairment | 5.1% | 6.6% | 6.8%
Individual has a mental impairment | 7.1% | 12.3% | 12.2%
Individual has both physical and mental impairments | 2.3% | 1.8% | 1.9%
Individual did not disclose their disability type | 85.5% | 79.3% | 79.1%
Education and Wage Characteristics | | |
Individual's highest degree is high school diploma, GED or equivalent | 41.1% | 42.0% | 41.9%
Individual's highest degree is an associate's degree | 1.5% | 1.8% | 1.7%
Individual's highest degree is a bachelor's degree | 7.6% | 9.4% | 13.2%
Average Earnings (Adult) | $14,314.29 | $14,115.51 | $14,670.34
Sample Ns | 12,411 | 65,474 | 135,212


Illinois Summary Statistics

| Variable | Treatment | Comparison | Non-Treatment, Non-Comparison |
| --- | --- | --- | --- |
| Demographic Characteristics | | | |
| Female | 54.2% | 52.8% | 55.3% |
| Individual identifies as Hispanic | 19.6% | 17.0% | 6.8% |
| Individual identifies as Asian | 4.8% | 4.2% | 0.9% |
| Individual identifies as White | 31.0% | 62.7% | 78.5% |
| Individual identifies as Black | 64.2% | 33.6% | 20.8% |
| Age at participation | 34.17 | 34.85 | 33.91 |
| Disability Characteristics | | | |
| Individual has a disability | 6.7% | 6.5% | 6.2% |
| Individual has a physical impairment | 11.4% | 15.9% | 17.9% |
| Individual has a mental impairment | 82.9% | 73.5% | 71.1% |
| Individual has both physical and mental impairments | 3.6% | 6.8% | 5.6% |
| Individual did not disclose their disability type | 2.1% | 3.8% | 5.4% |
| Education and Wage Characteristics | | | |
| Individual's highest degree is high school diploma, GED or equivalent | 42.5% | 45.8% | 44.5% |
| Individual's highest degree is an associate's degree | 3.3% | 2.7% | 3.4% |
| Individual's highest degree is a bachelor's degree | 13.4% | 8.7% | 4.1% |
| Average Earnings (Adult) | 13,562.01 | 13,928.48 | 14,837.56 |
| Sample Ns | 25,000 | 10,986 | 11,862 |






Massachusetts Summary Statistics

| Variable | Treatment | Comparison | Non-Treatment, Non-Comparison |
| --- | --- | --- | --- |
| Demographic Characteristics | | | |
| Female | 61.5% | 52.7% | 59.9% |
| Individual identifies as Hispanic | 21.6% | 20.9% | 21.1% |
| Individual identifies as Asian | 7.5% | 6.0% | 7.1% |
| Individual identifies as White | 69.9% | 70.9% | 69.0% |
| Individual identifies as Black | 19.3% | 17.1% | 19.8% |
| Age at participation | 37.92 | 39.35 | 37.00 |
| Disability Characteristics | | | |
| Individual has a disability | 9.8% | 10.1% | 10.6% |
| Individual has a physical impairment | 0.0% | 0.0% | 0.0% |
| Individual has a mental impairment | 0.0% | 0.0% | 0.0% |
| Individual has both physical and mental impairments | 0.0% | 0.7% | 2.8% |
| Individual did not disclose their disability type | 100.0% | 99.3% | 97.2% |
| Education and Wage Characteristics | | | |
| Individual's highest degree is high school diploma, GED or equivalent | 38.8% | 39.5% | 41.8% |
| Individual's highest degree is an associate's degree | 7.6% | 8.1% | 6.6% |
| Individual's highest degree is a bachelor's degree | 14.5% | 16.5% | 10.4% |
| Average Earnings (Adult) | 11,425.95 | 12,244.64 | 11,569.10 |
| Sample Ns | 4,051 | 3,011 | 11,545 |



Overall, these raw demographic and employment characteristics reveal that the treatment WDAs and the proposed comparison WDAs are not substantially different from each other on a variety of key variables. Educational attainment is relatively similar across these three states, as are average adult earnings. The percentage of people who disclosed a disability is very similar across the treatment and comparison groups for Illinois and Massachusetts, and roughly similar for California. Age at participation is also similar across both groups in all three states.


It should be noted that Illinois' treatment WDAs had more participants than all other WDAs in the state. The two treatment WDAs comprise Chicago and a significant portion of the surrounding area, which account for a large proportion of the state's population. As such, the evaluators will first run an analysis without a matching strategy, in order to use all treatment participants in the sample. The evaluators will then run the same analysis with a matching strategy, which will drop some treatment participants from the analytic sample (i.e., those who are not matched to comparison group participants). This approach will allow us to compare the impact of DEI on all treated individuals with the impact on those treated individuals who match appropriately to comparison group individuals.


Methods of Analysis: Career Pathways Impact. An additional QED analysis will determine the impact of the career pathways component of the DEI Round 5 intervention. In this analysis, the treatment group consists of participants who enroll in the career pathways services provided by the DEI Round 5 WDA. The comparison group consists of participants in the same WDA who do not enroll in career pathways services.


The analysis will follow the model below:

(1)   Y_ijk = α + β1(CP_ijk) + β2(Pre_ijk) + B(X_ijk) + γ(W_j) + δ(S_k) + μ_j + ν_k + ε_ijk

Where

Y_ijk = the outcome for participant i in WDA j in state k

α = covariate-adjusted mean participant outcome for the comparison group

β1 = covariate-adjusted impact of career pathways (i.e., the difference between the mean outcome for treatment participants and the mean outcome for comparison participants)

CP_ijk = 1 for career pathways enrollment and 0 if not

β2 = parameter estimate for the effect of the participant-level baseline measure (e.g., pre-enrollment wages)

Pre_ijk = pre-test measure for each participant i in WDA j in state k

B = a vector of parameter estimates for the effects of participant-level covariates

X_ijk = a vector of covariates for each participant i in WDA j in state k. The set of planned covariates includes gender, race and ethnicity, prior employment (a binary variable indicating whether the participant was previously employed), prior earnings, education level, and disability type (from the participant survey).

γ = parameter estimate for the average effect of WDA-level covariates

W_j = WDA-level covariate for each WDA j

δ = parameter estimate for the average effect of state-level covariates

S_k = state-level covariate for each state k

μ_j = a random error term for WDA j

ν_k = a random error term for state k

ε_ijk = a random error term for participant i in WDA j in state k


Impacts will be estimated using a single-level model, with the treatment impact estimated at the participant level. The effect of the career pathways intervention is represented by β1, which quantifies the difference in the outcome for career pathways participants compared to the outcome for those who do not participate in career pathways services. If the p-value for this parameter estimate is less than 0.05, we will conclude that there is a statistically significant impact of the career pathways services on the given participant outcome.
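To illustrate how a model of this form yields the career pathways estimate, the sketch below simulates participant data and recovers the treatment coefficient by ordinary least squares. All values and variable names are invented for illustration; the production analysis would include the full covariate set and appropriate standard errors:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

cp = rng.integers(0, 2, n)            # career pathways enrollment (0/1)
pre_wage = rng.normal(14.0, 4.0, n)   # pre-enrollment wages ($1,000s), the pre-test
true_impact = 1.5                     # assumed career pathways impact ($1,000s)

# Outcome: post-program wages with a baseline relationship and noise.
y = 2.0 + true_impact * cp + 0.8 * pre_wage + rng.normal(0, 3.0, n)

# Design matrix: intercept (alpha), treatment (beta1), pre-test (beta2).
X = np.column_stack([np.ones(n), cp, pre_wage])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
alpha_hat, beta1_hat, beta2_hat = coef
print(f"estimated career pathways impact (beta1): {beta1_hat:.2f}")
```

With a large simulated sample, the estimated coefficient lands close to the assumed impact; the real analysis would additionally report a p-value for β1 against the 0.05 threshold.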

We acknowledge that participants who enroll in the career pathways component may differ systematically from those who do not, for example in academic preparation or motivation. Because the tuition-free career pathways component is available only to participants in treatment WDAs, and because we plan to capture career pathways enrollment through the Participant Tracking System, we will attempt to measure any differences between these two groups (i.e., career pathways enrollees vs. non-enrollees) using WIASRD and W-P data.


Enrollment in career pathways is determined by the AJC disability resource counselor or employment counselor. AJCs conduct assessments of the training needs of each DEI participant to determine if they are candidates for career pathways, WIOA services, and/or other services (e.g., core and staff-assisted core). The first QED analysis will measure the impact of all DEI services (i.e., career pathways, WIOA, core, and staff-assisted core), while the second QED analysis will measure the impact of career pathways versus non-career pathways services within treatment WDAs.


Intent-to-treat versus treatment-on-treated considerations.

The model specifications above allow for an intent-to-treat (ITT) analysis. Because we will receive information on the treatment group's level of participation in DEI services through the Participant Tracking System, we will also be able to run a treatment-on-the-treated (TOT) analysis for both sets of QED analyses. We treat exposure to the intervention as the first-stage outcome, and use this information in the second stage to determine the impact of exposure on participants' outcomes. This analysis will use an instrumental variables (IV) approach, with assignment to the intervention serving as the instrument for exposure. In the first QED analysis, exposure to the intervention is defined by whether or not a participant was enrolled in a treatment WDA. In the second QED analysis, exposure is defined by whether or not a participant was enrolled in the career pathways component.
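The IV logic can be sketched with a simple Wald estimator, in which assignment serves as the instrument for actual exposure. All data are simulated; a production analysis would use two-stage least squares with covariates:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20000

z = rng.integers(0, 2, n)             # instrument: assigned to a treatment WDA
ability = rng.normal(0, 1, n)         # unobserved confounder
# Exposure: only assigned participants can enroll, and enrollment
# depends partly on the unobserved confounder (self-selection).
d = ((z == 1) & (ability + rng.normal(0, 1, n) > -0.5)).astype(float)

true_tot = 2.0                        # assumed effect of actual exposure
y = true_tot * d + ability + rng.normal(0, 1, n)

# Naive contrast of exposed vs. unexposed is biased by self-selection.
naive = y[d == 1].mean() - y[d == 0].mean()

# Wald / IV estimate: ITT effect on the outcome divided by the
# ITT effect on exposure (the "first stage").
itt_y = y[z == 1].mean() - y[z == 0].mean()
itt_d = d[z == 1].mean() - d[z == 0].mean()
tot_hat = itt_y / itt_d
print(f"naive contrast: {naive:.2f}; IV treatment-on-treated estimate: {tot_hat:.2f}")
```

The naive contrast overstates the effect because enrollees differ on the unobserved confounder, while the instrumented estimate recovers the assumed exposure effect.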



B.2.1 Power Analysis


In this section, we calculate the smallest sample sizes of participants that will allow for meaningful impact estimates, using the pooled sample of all six grantees. To frame this analysis, we provide three levels of minimum detectable effect size (MDES), which give a range of the minimum effect sizes detectable with the given samples. A lower MDES is preferable, as it sets a lower threshold for an impact to be detected by our impact analysis. Statistical analysis of impacts is always subject to random sampling error, and thus the estimated sample size does not deliver the MDES with certainty.


We adopt several standard practices; for example, a plan to achieve an 80 percent likelihood of detecting the impact if the impact is as large as the MDES, and a 5 percent chance that we will identify an impact as being statistically significant even if there is no true impact. Rather than testing only whether the impact is larger than zero (a one-sided test), we will test for both positive and negative impacts (a two-sided test). For the sake of simplicity, we assume a 1:1 distribution of treatment and comparison participants. In addition, for the cluster analysis, we provide conservative inputs of WDA-level and participant-level R-squareds of 0.40 each (WDAs and participants alike account for 40% of the variation in outcomes) and an intra-class correlation coefficient (ICC) of 0.03 (the proportion of variance that is between clusters). The R-squareds are derived from Trutko and Barnow's (2010) report, "Implementing Efficiency Measures for Employment and Training Programs," which found R-squareds in similar evaluation analyses ranging from 0.41 to 0.50, depending on Ordinary Least Squares (OLS) and fixed effects models. The ICC is derived from Heinrich and Lynn's (1999) work on evaluating the Job Training Partnership Act (JTPA) program, which was a multi-site program designed to improve employment for disadvantaged workers.


In the first power calculation, we account for the fact that treatment and comparison conditions depend on which WDA a participant belongs to. We therefore treat this as a cluster analysis, where Level 1 consists of participants and Level 2 consists of WDAs. The power calculation for our first QED analysis, shown in Exhibit B.4, suggests that with our anticipated sample of 500 treatment participants and 500 comparison participants, we would be able to achieve an MDES of about 0.35. Although a lower MDES is preferred, an MDES of 0.35 would still allow us to detect moderate impacts with our sample.


Exhibit B.4: Power Calculations for a Cluster Analysis (QED 1)

| Parameters | Low MDES | Medium MDES | High MDES |
| --- | --- | --- | --- |
| Minimum Detectable Effect Size (MDES) | 0.19 | 0.26 | 0.32 |
| Total Number of Clusters | 30 | 25 | 20 |
| Average Cluster Size | 50 | 35 | 25 |
| Total Participant Sample Size | 1,500 | 875 | 500 |


In the next example, presented in Exhibit B.5, the treatment and comparison conditions are at the participant level, as in our additional QED analysis of the impact of career pathways on outcomes. Within each WDA, participants either select into the career pathways component or do not. The second power calculation demonstrates that, absent a clustered design, one can achieve a lower MDES with smaller sample sizes. We again assume a conservative participant-level R-squared of 0.40.




Exhibit B.5: Power Calculations for a Non-Cluster Analysis (QED 2)

| Parameters | Low MDES | Medium MDES | High MDES |
| --- | --- | --- | --- |
| Minimum Detectable Effect Size (MDES) | 0.15 | 0.25 | 0.35 |
| Total Participant Sample Size | 839 | 303 | 156 |


Although we specify a balanced (1:1) treatment and comparison distribution among participants in the power analysis, we acknowledge that the final analytic sample may be unbalanced. However, for our clustered QED design, we determined that a 1:2 (treatment:comparison) distribution with the same sample size increases the MDES only slightly. For example, in our power calculation for QED 1 with a medium MDES, a 1:1 sample with 875 participants has an MDES of 0.26, while a 1:2 sample with the same number of participants has an MDES of 0.28.
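These power calculations follow standard MDES formulas. The sketch below, assuming a multiplier of 2.8 (80 percent power, two-tailed alpha = .05) and the stated R-squared of 0.40, closely reproduces the non-clustered values in Exhibit B.5 and shows the modest inflation from a 1:2 allocation; exact figures depend on degrees-of-freedom conventions, so small discrepancies with the exhibits are expected:

```python
import math

M = 2.8  # approximate multiplier for 80% power, two-tailed alpha = .05

def mdes_simple(n, r2=0.40, p=0.5):
    """MDES for a non-clustered (participant-level) comparison.

    n  : total participant sample size
    r2 : proportion of outcome variance explained by covariates
    p  : proportion of the sample assigned to treatment
    """
    return M * math.sqrt((1 - r2) / (p * (1 - p) * n))

for n in (839, 303, 156):   # sample sizes from Exhibit B.5
    print(f"n = {n}: MDES = {mdes_simple(n):.2f}")

# Unbalanced 1:2 allocation: the MDES scales by sqrt[(1/4) / (p(1-p))],
# about a 6% increase at p = 1/3, so 0.26 rises to roughly 0.28.
inflation = mdes_simple(875, p=1/3) / mdes_simple(875, p=0.5)
print(f"1:2 allocation inflates the MDES by a factor of {inflation:.3f}")
```

The same p(1 - p) term appears in clustered MDES formulas, which is why the 1:1 to 1:2 shift moves the QED 1 medium MDES from 0.26 to about 0.28.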


B.3 Methods to Maximize Response Rates and Deal with Non-response


B.3.1 Site Visits


Conducting site visits and in-person interviews with DEI stakeholders (e.g., state officials, DEI State Leads, Workforce Investment Board staff, DRCs, American Job Center staff, service providers, public and private agency partners, employers, and participants) is an important component of the DEI evaluation because:7,8


  • Talking to respondents in person allows the interviewer to establish rapport;

  • Evaluators can get at the whole story through the totality of the interpersonal experience, such as observation of body language and other visual cues;

  • Evaluation staff can observe other activities occurring on-site adding to the “fullness” of the data;

  • Evaluation staff can get a tangible sense of the issues in a locality, which allows site visitors to have richer conversations with respondents because they have a better knowledge of their environment.


Thorough preparation for the site visits will minimize the risk of non-response from sites and participants. The Solicitation for Grant Applications requires participation in the evaluation. However, we recognize that securing participation from all stakeholders will require communication to reduce reluctance among partners, employers, and participants. Through early communication that emphasizes the importance of every stakeholder's participation in the study, and of gathering information and different perspectives from each individual respondent, the DEI evaluation team will reduce reluctance to participate fully in the DEI evaluation. The evaluation team will also identify and develop a trusting relationship with a local liaison who will advocate for the study, assist in assembling a site visit schedule that takes into account ease and convenience for the respondents, help convince additional respondents to cooperate, and be persistent in following up with participants. Additionally, the evaluation team has designed minimally intrusive data collection methods and tools to help reduce the burden on participants.


Each site will receive advance communication from the evaluation team about the study, as well as the expectations for participants. An Evaluation Liaison will be assigned to each grantee and will work with the DEI state leads and WDA staff to arrange the interviews with key stakeholders. The DEI grant requires all participating WDAs to support the evaluation. Once the interviews are scheduled, each participant will receive written confirmation of the scheduled interview and the topics to be covered, and the Evaluation Liaison will review the schedule to ensure that enough flexibility is built in to accommodate potential last-minute conflicts with interview participants. Should a respondent fail to keep a pre-scheduled interview appointment, the site visit team will work with the local liaison and the interview participant to reschedule the interview for an alternative time while the team is on site. If this proves too difficult, the site visit team will work with the site liaison and interview participant to schedule a phone interview or identify a substitute respondent. All site visits to treatment and comparison WDAs will include a focus group of participants, and focus group participants will be compensated for participating. All site visit staff and respondents will be asked to sign an informed consent and privacy form.


Similar data collections related to this current submission were approved in February 2012 by the Office of Management and Budget (OMB) under OMB Clearance number 0990-0346 for Rounds 1-4.

New Data Requests:

  1. Amend the fields collected in the DEI Participant Tracking System (PTS) by removing the existing fields and adding fields that identify DEI participants and their use of DEI service delivery strategies, a question on disability categories, and questions on activities of daily living;

  2. Add a survey of comparison group individuals to collect information on disability categories, activities of daily living, DEI service delivery strategies used, and related outcome measures, covering:

    a. DEI participant identifier;

    b. Type of disability;

    c. Activities of daily living;

    d. DEI service delivery strategies used;

    e. Receipt of training-related certifications and/or diplomas;

  3. Adjust the DEI Annual Site Visit and Focus Group Protocols to include questions on career pathways design, implementation, and utilization.


The evaluators fully anticipate that the information gathered from site visits to AJCs within the WDAs will not only strengthen the regression analysis, but will also shed light on the types of services being provided at the AJCs and the quality of those services. We will incorporate the site visit data into our quasi-experimental designs through scale variables that indicate the quality of services, indicators of the scope of systems change, and WDA capacity to effectively serve individuals with disabilities. We anticipate conducting site visits to all treatment WDAs and a sample of comparison WDAs to collect information through interviews and focus groups. This information will be used to provide measures of systems change and WDA capacity.


B.3.2 Use of Existing Data Systems


WIASRD and W-P data will be used to capture information on both process and outcomes. Response rates are expected to be relatively high, because participants who receive services are required to register at an American Job Center and the required data elements have consistently high response rates. For example, item non-response rates from the 2013 WIASRD file for key descriptive and outcome variables are as follows: Gender (7.9%); Race/Ethnic Category (6.0%); Employment (1.0%); Earnings (2.1%); Educational Level (7.4%); Disability Status (2.1%). Other data elements, including completion of a certification or diploma, have non-response rates of 8-10 percent. Social Dynamics and its partners have worked closely with participating DEI state and WDA personnel to ensure that staff members are trained on the DEI evaluation requirements, through webinars and site visits to WDAs.


As with the DEI Rounds 1-4 evaluation, Social Security Numbers will be included in the data submitted to the DEI Evaluation team for the DEI Rounds 5 and Future Rounds Evaluation so that WIASRD and W-P data can be linked with SSA data to capture specific information on disability status and revenue generated by SSA beneficiaries that enroll in Ticket to Work.


In order to minimize non-response for the information submitted through WIASRD and W-P, the DEI Evaluation Team will provide ongoing technical assistance to all participating American Job Centers, including treatment and comparison sites, using webinar technology and site visits. These technical assistance activities also will include a DEI Evaluation Web Page, a toll-free helpline and quarterly monitoring of incoming data for data quality and completion. In addition, the evaluation team will review on a quarterly basis WIASRD and W-P data submitted by each state. Other approaches to non-response will be considered at the conclusion of each Round of the evaluation.


B.3.2.1 Creation of a Participant Tracking System


For the purposes of tracking individual DEI Round 5 participants and collecting information that is not captured by WIASRD or W-P, the DEI Evaluation Team will use a participant tracking system that is independent of the WIASRD and W-P systems. Two such systems were developed: one in 2012 for Rounds 1-4 and one in 2014 for the state of Connecticut. These systems allow the evaluation team to directly access basic tracking information from the participating American Job Centers, such as participation in specific DEI Round 5 service delivery strategies. Most importantly, they will allow the DEI Evaluation Team to identify DEI participants from each state and WDA and to alert local staff to missing WIASRD and W-P data.


B.3.2.2 New DEI Round 5 Data Collection Requirements

Three administrative data collection requirements will be added for the R5FR evaluation. Rounds 1-4 targeted individuals with disabilities enrolled in WIA-WIOA services. The R5FR states target all individuals with disabilities, some of whom may not be enrolled in traditional American Job Center services. Therefore, the participant tracking system will include three new fields that "flag" DEI customers. Grantees may use the DEI participant tracking system or blank text fields that are available in their WIASRD and W-P systems.


B.3.2.3 DEI Round 5 Indicator (DEIR5)


DEI R5 grantees are required to add a DEI Round 5 customer indicator to their WIASRD and W-P systems or consolidated (co-enrollment) system or use the participant tracking system. This will allow the DEI evaluation team to identify which AJC customers are DEI participants.


B.3.2.4 DEI Round 5 Career Pathways Indicator (DEIR5-CP)


R5 grantees are required to indicate which AJC customers are DEI Round 5 Career Pathways Customers. This will allow the DEI evaluation team to identify which AJC customers are DEI participants and Career Pathways participants.


B.3.2.5 DEI Round 5 Service Delivery Strategy Indicators


DEI R5 grantees are required to add their selected service delivery strategies to their grant data collection activities and to indicate, for each DEI R5 customer, which service delivery strategies that customer received. This will allow the DEI evaluation team to identify which AJC customers are DEI participants and which received one or more of their state's selected service delivery strategies.

 

B.3.2.6 Creation and Implementation of a Participant Survey


We propose to create and implement a participant phone survey that will allow us to determine the type and severity of the disability or disabilities that a participant wishes to disclose (see Appendix B.3). For treatment group individuals, this information will also be available through the Participant Tracking System, with the telephone survey acting as a non-response follow-up tool. Comparison group individuals, who are not covered by the Participant Tracking System, will be surveyed only. This will not only provide a descriptive picture of the range of disabilities that participants disclose, but will also provide us with more accurate matches across the treatment and comparison groups in both impact analyses.


In addition to the type and severity of disability, we also propose to add questions that will provide more accurate information on outcomes, particularly academic outcomes that are currently difficult to access through existing administrative databases. For example, we will ask whether a participant completed a credential at a local community college or provider (and if so, the type of credential), and how long it took the participant to complete the credential.


The participant survey will be relatively short, with an anticipated completion time of approximately 12 minutes per participant. Surveys will be administered each calendar year, with each cohort of participants surveyed approximately 18 months after receipt of services at the AJC. The first survey would be administered in fall 2016, or 18 months after the start of grant services in spring 2015. We anticipate an analytic sample of approximately 1,000 participants across the treatment and comparison conditions. The anticipated response rate for a phone survey of this population is approximately 40%; as such, we expect to contact approximately 2,500 participants.


To maximize the response rate for the participant survey, we plan to provide a $15 incentive for participation. Interviewers will make 10 attempts to contact landline cases; more calls will be attempted if contact is made with an eligible participant but the interviewer is asked to call back later. Interviews will be conducted at various times of day, seven days a week, to increase the likelihood of finding the respondent at home, and respondents will have the option of scheduling the interview at a time that is convenient for them. For soft refusals, "interview converters" with extensive training in telephone interviewing and converting non-responders will be used to increase the response rate.


Abt will evaluate nonresponse in the participant survey in two ways: 1) a non-response follow-up survey (NRFU), and 2) a comparison across levels of difficulty in reaching participants.


Non-Response Follow-Up Survey. The NRFU will collect information on participants who fail to respond to the participant survey and provide insight into whether the non-respondents differ from the respondents on characteristics of interest (e.g., previous work experience), or whether the data appear to be missing at random, without an underlying pattern. Specifically, interviewers will call back a subsample of participants who declined the original survey and attempt to recruit them to complete a shortened interview with remuneration. In addition, all landline sample cases that can be matched to an address (through reverse lookup) will receive a letter encouraging them to cooperate with the interview.

Abt will compare the employment characteristics of the participant survey respondents with the characteristics of NRFU respondents. This analysis will provide insights about the direction and magnitude of possible nonresponse bias, and whether or not the nonresponse was at random. To determine whether the non-responders have characteristics that differ from those of the responders, we will use t-tests9 to test for statistically significant differences in characteristics between the two groups. To increase the robustness of this test, we will also determine whether the standardized differences between the two groups are larger than 0.25 standard deviations. If these tests indicate no significant differences in characteristics across the two groups, we can reasonably conclude that the data were most likely missing at random.
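The two checks, a t-test and a 0.25 standard deviation threshold on the standardized difference, can be sketched as follows. The data are simulated with no true difference between groups, the variable names are illustrative, and the 1.96 cutoff is a large-sample approximation to the t critical value:

```python
import math
import random

random.seed(1)

# Simulated prior earnings for main-survey respondents vs. NRFU respondents,
# drawn from the same distribution (i.e., nonresponse at random by design).
respondents = [random.gauss(14000, 4000) for _ in range(1000)]
nrfu = [random.gauss(14000, 4000) for _ in range(200)]

def mean_var(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return m, v

def welch_t(a, b):
    """Welch's t-statistic for unequal group sizes and variances."""
    ma, va = mean_var(a)
    mb, vb = mean_var(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def std_diff(a, b):
    """Standardized difference using the pooled standard deviation."""
    ma, va = mean_var(a)
    mb, vb = mean_var(b)
    return (ma - mb) / math.sqrt((va + vb) / 2)

t = welch_t(respondents, nrfu)
d = std_diff(respondents, nrfu)
print(f"t = {t:.2f}, standardized difference = {d:.3f}")
if abs(t) <= 1.96 and abs(d) <= 0.25:
    print("no evidence against missing-at-random on this characteristic")
```

In the actual analysis, this pair of checks would be run for each employment characteristic of interest.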


The NRFU estimates will be compared with both weighted and unweighted estimates from the main survey. Abt will investigate whether any differences remain after controlling for major weighting cells (e.g., within race/ethnicity and education level groupings). If controlling for weighting variables eliminates the differences, this suggests that weighting adjustments will reduce nonresponse bias in the final survey estimates. If, however, the differences persist after controlling for weighting variables, this would be evidence that the weighting may be less effective in reducing nonresponse bias.


Level of Recruitment Difficulty. The second technique that Abt will use to assess nonresponse bias is an analysis of the level of recruitment difficulty. This analysis will compare the unweighted demographic and employment characteristics of respondents who were easier to reach with respondents who were harder to reach. The level of difficulty in reaching a respondent will be defined in terms of the number of call attempts required to complete the interview and whether the case was a converted refusal. In some studies, this is described as an analysis of “early versus late” respondents, though Abt proposes to also explicitly incorporate refusal behavior. If the demographic and employment characteristics of the harder-to-reach cases are not significantly different from characteristics of the easy-to-reach cases, this would suggest that survey estimates may not be substantially undermined by nonresponse bias. The harder-to-reach cases serve as proxies for the non-respondents who never complete the interview. If the harder-to-reach respondents do not differ from the easy-to-reach ones, then presumably the sample members who were never reached would also not differ from those interviewed.


In the easy-to-reach versus hard-to-reach analysis, Abt will define the easy/hard dimension in three ways: (1) in terms of ease of contact, as defined by the number of calls required to complete the interview; (2) in terms of amenability, as defined by whether or not the case was a converted refusal; and (3) in terms of both ease of contact and amenability, as defined by a hybrid metric combining number of call attempts and converted refusal status. This analysis will provide evidence as to which, if either, of these two mechanisms may be leading to nonresponse bias in survey estimates.


Addressing Missing Survey Data and the Impact Analysis. As the expected response rate for our phone survey is 40%, we anticipate an analytic sample of approximately 1,000 participants. The approximately 1,500 contacted participants who do not respond to the survey will not be included in the analytic sample. Because the survey provides a key matching variable, disability type, we cannot keep participants in the analytic sample if they are missing this variable. This results in case-wise deletion in the regression analysis, in the sense that all units in the regression analysis will have complete survey information.
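The case-wise deletion rule amounts to a simple complete-case filter on the merged analytic file (field names are hypothetical):

```python
# Merged participant records; disability_type is None when the participant
# did not complete the survey.
merged = [
    {"id": 1, "disability_type": "physical", "earnings": 12500},
    {"id": 2, "disability_type": None,       "earnings": 9800},   # non-respondent
    {"id": 3, "disability_type": "mental",   "earnings": 15200},
    {"id": 4, "disability_type": None,       "earnings": 11000},  # non-respondent
]

# Keep only complete cases, so every unit in the regression has survey data.
analytic_sample = [r for r in merged if r["disability_type"] is not None]
print(f"{len(analytic_sample)} of {len(merged)} records retained")
```

Non-respondents drop out of the regression sample entirely, which is why the nonresponse-bias analyses above matter for interpreting the impact estimates.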




B.4 Tests of Procedures or Methods to be Undertaken


The evaluation team has tested the site visit instruments with individuals knowledgeable about the workforce system and employment issues for people with disabilities to ensure that question wording is clear, that the questions evoke the appropriate information, and that the overall process does not place an unreasonable burden on participants. These tests included interviews with stakeholders and a discussion of each component of the site visit instrument. Revisions were made to ensure that the questions collect information relevant to the study's research questions. These tests were designed to identify and eliminate problems, allowing the evaluation team to make corrective changes or adjustments before actually collecting the data. The DEI Evaluation Team reviewed the completed interviews to determine whether respondents interpreted the questions and probes as intended, analyzed the data, and modified the instruments based on the information gathered during the pilot test. Tests were completed in WDAs that are not part of the DEI Evaluation.


The qualitative data collected through the pilot test interviews indicated that the instruments collect the information they were designed to collect and that the resulting data were relevant to DEI and each stakeholder group. However, the pilot interviews also showed how the instruments could be improved: questions were reworded and reordered to improve the clarity and flow of the interview process, and several probes were added to the instruments. The DEI Evaluation Team will continue to test the site visit instruments during the OMB review and comment period to refine wording and confirm the estimates of burden.



B.5 Individuals Consulted on Statistical Aspects And/Or Analyzing Data10


  1. Ed Bein, Ph.D. Abt Associates. (DEI Evaluation Team Member. Consulted on statistical aspects of design; will collect information.)

[email protected]


  1. Robert Bleimann, Ph.D. Social Dynamics, LLC. (Director of Policy and Research. Consulted on statistical aspects of design; will collect information.)

[email protected]


  3. Sung-Woo Cho, Ph.D. Abt Associates. (DEI Evaluation Team Member – Abt Project Director. Consulted on statistical aspects of design; will collect information.)

[email protected]


  4. Carolyn Heinrich, Ph.D. University of Texas at Austin; Vanderbilt University. (Technical Working Group Member. Will consult on statistical aspects of design only.)

[email protected]


  5. Shanna Jaggars, Ph.D. Community College Research Center. (Technical Working Group Member. Will consult on statistical aspects of design only.)

[email protected]


  6. Douglas Klayman, Ph.D. Social Dynamics, LLC. (DEI Evaluation Project Director. Consulted on statistical aspects of design; will collect information.)

[email protected]


  7. Pamela J. Loprest, Ph.D. Urban Institute. (Technical Working Group Member. Will consult on statistical aspects of design only.)

[email protected]


  8. Amy Minzner, M.A., M.S. Abt Associates. (DEI Evaluation Team Member. Will collect information only.)

[email protected]


  9. Peter Mueser, Ph.D. University of Missouri-Columbia. (Technical Working Group Member. Will consult on statistical aspects of design only.)

[email protected]


  10. Louise Rothschild, M.P.P. Abt Associates. (DEI Evaluation Team Member. Will collect information only.)

[email protected]





References – Part B


Dunham, K., & Wiegand, A. (2008). The effort to implement the Youth Offender Demonstration Project (YODP) impact evaluation: Lessons and implications for further research. Oakland, CA: Social Policy Research Associates.

French, W. L., & Bell, C. H. (1995). Organization development: Behavioral science interventions for organization improvement.  5th Edition.  Englewood Cliffs, N.J.: Prentice-Hall.

Heinrich, C. J., & Lynn, L. E. (1999). Governance and performance: The influence of program structure and management on Job Training Partnership Act (JTPA) program outcomes. Evanston, IL: Joint Center for Poverty Research.

Livermore, G., & Coleman, S. (2010). Use of American Job Centers by Social Security Disability beneficiaries in four states implementing Disability Program Navigator initiatives. Washington, DC: Mathematica Policy Research.

Midgley, G. (Ed.) (2003). Systems thinking. London: Sage.

Miller, C., Bos, J., Porter, K., Tseng, F., & Abe, Y. (2005). The challenge of repeating success in a changing world: Final report on the Center for Employment Training replication sites. New York, NY: MDRC.

Schochet, P. Z., Burghardt, J., & Glazerman, S. (2000). National Job Corps study: The short-term impacts of Job Corps on participants’ employment and related outcomes. Princeton, NJ: Mathematica Policy Research.

Trutko, J., & Barnow, B. S. (2010). Implementing efficiency measures for employment and training programs. Prepared for the U.S. Department of Labor Employment and Training Administration.

U.S. Department of Labor. (n.d.). Wagner-Peyser Act employment services, state by state PY 2009 performance. Retrieved from http://www.doleta.gov/performance/results/wagner-peyser_act.cfm






Appendix B.2


New Questions on Career Pathways for Site Visit/Interview Protocol


Due to the emphasis on career pathways programming, it is necessary to add to the existing site visit/interview protocol questions that collect information on the implementation of career pathways programs. The following questions are based on information provided by DOL on the implementation, quality, and components of career pathways (CP) programs.11


1) Career Pathways Staff and Stakeholder Questions


  1. Collaboration


        • What collaborative relationships needed to be in place to design and implement the career pathways program?

        • In what ways is the existing career pathways system accessible?

        • In what ways is it inclusive?

        • In what ways does it accommodate people with disabilities?


How does it accommodate the following groups?

  • At the secondary level (such as local education agencies, high schools, alternative high schools, Job Corps programs, YouthBuild program, career academies, and secondary career technical education programs)

  • At the postsecondary level (such as occupational certificate programs offered by community colleges, Registered Apprenticeship programs, and associate’s and bachelor’s degree programs)

  • With workforce agencies, business and other community stakeholders

  • Workforce Investment Board(s) and local job centers

  • Secondary Education staff

  • Adult Basic Education providers

  • Transitional Assistance for Needy Families (TANF) providers and Human Service agencies

  • Economic Development agencies

  • Business/Employer representative(s)

  • Vocational Rehabilitation specialists

  • Other community-based organizations

  • State agencies:

  • State DOL

  • Adult Basic and Postsecondary Education

  • Economic Development

  • Human Services

  • Rehabilitation

  • Corrections/Juvenile Justice

  • Mental Health

  • Intellectual/DD

  • Other Stakeholders/Services

  • Medicaid

  • Social Security/Ticket to Work Employment Networks

  • Transportation

  • Housing

  • Registered apprenticeship programs

  • Asset development entities

  • Carl D. Perkins Act providers

  • Career and Technical Education (CTE) providers


  • Are there written agreements that clearly define the agreed upon roles and responsibilities of partnership members?


  • Has a leadership or steering committee for the DEI collaborative partnership been established to guide the process of making the existing career pathways system inclusive of people with disabilities?


  • Please describe the structure of this committee, or alternative governance structure, and its role in the development of the DEI career pathways initiative.


  • Who are the specific sector/industry partners for your DEI career pathways initiative?


  • How were they approached and what role(s) do they play in the project? (Programming development? Advisory? Placement/Work Experience?)


  • What role does the Disability Resource Coordinator(s) play in the context of the career pathway program in terms of design? What about in terms of local-level implementation?


  • What role does the DEI Project leadership play in the context of the career pathway strategic design at the state level?


  • What steps or supports are in place to follow the individual through the various components of a career pathway process (e.g., enrollment, support services, transportation, advisory services, counseling services, completion, and movement along career ladders (vertical movement) and lattices (horizontal movement))?


  • How are the stakeholders, including the American Job Centers, involved in supporting the progress of individual participants?


  • In what ways do they provide comprehensive support that leads to employment and career advancement?



  • Are there specific strategies or supports being used, such as:

    • Using the Integrated Resource Team (IRT) approach to integrate resources and services, blend and braid funds, and leverage resources across multiple service delivery systems;

    • Participating in the Social Security Administration's (SSA) Ticket to Work program to access training and employment resources;

    • Fostering partnerships and collaborations at the state and local levels;

    • Implementing the “Guideposts for Success”;

    • Implementing customized employment;

    • Hiring/designating a dedicated staff person, at the local level, with workplace and disability experience and expertise (Disability Resource Coordinators or Disability Program Navigators);

    • Using Universal Design Principles;

    • Aligning adult and youth career pathways programs with the use of Individualized Learning Plans (ILPs).


  2. Employer Engagement


  • Are there state industry organizations, business associations, or local employers involved in the design of curriculum and/or work-based learning opportunities (such as on-the-job training, summer youth employment, Registered Apprenticeships, internships, and other paid and unpaid work experiences)?


  • Please describe the role they have in any of these activities:

    • Determining which occupations within targeted industries and sectors should be included within the career pathways system.

    • Vetting the set of foundational academic, work readiness, and technical skills, abilities, and knowledge that are chosen as required for key occupations.

    • Vetting the certificates and credentials that are required for key occupations.

    • Collaborating with training institutions to design education and training programs.

    • Participating as instructors or training sites in the skill training programs.

    • Providing training funds for individuals through tuition reimbursement or class-size training projects.

    • Participating in the skill certification/credentialing process.

    • Serving as mentors.

    • Serving as a job shadowing site.

    • Providing paid or unpaid internship positions for students.

    • Hiring individuals who have obtained the required certificates and credentials.


  3. Professional Development

  • Are there professional development opportunities that support the design, implementation, and maintenance of CP, foster innovative teaching and learning strategies, and are available for administrators, teachers, faculty, and other education professionals?


  • Is career pathways service delivery to youth inclusive, integrated, and based upon and consistent with the “Guideposts for Success”?

    • Please describe how the Guidepost for Success, School-Based Preparatory Experiences, is integrated into your career pathway service delivery model for youth with disabilities.

    • Please describe how the Guidepost for Success, Career Preparation & Work-Based Learning Experiences, is integrated into your career pathway service delivery model for youth with disabilities.

    • Please describe how the Guidepost for Success, Youth Development & Leadership, is integrated into your career pathway service delivery model for youth with disabilities.

    • Please describe how the Guidepost for Success, Connecting Activities, is integrated into your career pathway service delivery model for youth with disabilities.

    • Please describe how the Guidepost for Success, Family Involvement & Supports, is integrated into your career pathway service delivery model for youth with disabilities.


  4. Universal Design for Learning (UDL) and Teaching & Learning Strategies


Is UDL used in the design of career pathways programs? What kinds of innovative and creative instructional approaches are being used that enable teachers to integrate academic and technical instruction?


  5. Secondary & Postsecondary Curriculum


  • Is the secondary and postsecondary curriculum sequenced so students do not duplicate coursework?


  • Are formal agreements in place between secondary and postsecondary systems that allow students to earn postsecondary academic or career & technical education (CTE) credit while in secondary school?


  • Guidance counseling & advising – Is there guidance support and academic advisement that assists students in planning for their careers by mapping a complete sequence of coursework that ensures secondary graduation and preparation for a postsecondary training/education program?


  • Is there a “disability coordinator” or office at the various post-secondary entities involved in the workforce or regional career pathway model?



  6. Wraparound Services


  • Does the program provide wraparound and support services (such as child care, transportation, case management, academic and career counseling, college adjustment and retention services, financial aid, employment assistance/job retention assistance)?

Services may also include such supportive services as:

    • Job coaching,

    • Social support,

    • Financial literacy training,

    • Drop-out prevention services,

    • Life skills

    • Financial capability counseling

    • Using the Integrated Resource Team (IRT) approach to integrate resources and services, blend and braid funds, and leverage resources across multiple service delivery systems;

    • Participating in the Social Security Administration's (SSA) Ticket to Work Program to access training and employment resources;

    • Fostering partnerships and collaborations at the state and local levels;

    • Implementing the “Guideposts for Success”;

    • Implementing customized employment;

    • Hiring/designating a dedicated staff person, at the local level, with workplace and disability experience and expertise (Disability Resource Coordinators or Disability Program Navigators);

    • Using Universal Design Principles;

    • Aligning adult and youth career pathways programs with the use of ILPs.


  • Please describe each component of the wraparound services that are being provided.

    • What role, if any, do the DRCs and integrated resource teams have in connecting the individual to career pathways services? What about the AJC and its partners more broadly?

    • Is active resource coordination being used in conjunction with Integrated Resource Teams to facilitate access to wraparound services?



  7. Academic/Technical Skills, Standards, and Assessment


  • Are content standards clearly defined and are assessments used to ensure students meet them?

    • Have accommodation procedures been incorporated into the assessment process?

    • What actions have been taken to assure that technical skill assessments do not “screen out” individuals with disabilities?


  8. Competency Model


  • Are there competency models that define successful performance in a defined work setting for the career pathways programs that are provided?

    • A competency model is a clear description of what a person needs to know and be able to do – the knowledge, skills, and abilities – to perform well in a specific job, occupation, or industry.

  • Have the education, training, and skill needs of employers in the state/region been analyzed and gaps identified?


  • Has apprenticeship been considered for the career pathway, industry sectors or employers identified in the DEI grant (or regional or local workforce area approach)?


  • Has work experience, on-the-job training, and internships been incorporated into the DEI (or career pathway) strategic design?

  • Are these used to assist individuals in identifying preferable career paths?

  • Do any of the career paths incorporate paid employment into the academic experience? Which ones?



  • Has a plan been put in place to support working with business associations and employers during various phases of the project (design, launch, operation, and evaluation)?


  9. Career Ladders (vertical movement between jobs) and Lattices (horizontal movement between jobs)


  • Are there career ladders and lattices that students can pursue after completing an initial certificate? Please describe the career lattices that are available through your career pathways program.


  • Are programs stackable, and do they articulate to progressively higher-level credentials or degrees?


  • Do the programs have multiple entry points and exit points (on-ramps and off-ramps)?


  • Are curricula “chunked” or organized in progressive modules, with each level clearly articulated to the next?


  • Please describe how contextualized learning and accelerated, integrated education and training strategies are being used to facilitate positive employment outcomes for individuals with disabilities. These strategies may include:

    • Compressed training

    • Awarding credit for prior learning

    • Dual enrollment

    • Hybrid learning approaches


  • Are individuals provided opportunities for self and career exploration prior to choosing a career pathway?


  • Please describe how individualized career plans or Individualized Learning Plans (ILPs) are used to support an individual through the pathway.

    • Who is responsible for developing them and supporting the individual in implementing them?


  • Describe the assessment tools used to determine placement and advancement including credit for prior learning.

    • Are accommodations available when using these assessments?


  10. Employer Validation


  • Have your career pathways programs been reviewed by employer and industry personnel who can validate the competencies and pathways for each of your programs? Please describe this process.


  11. Flexible Scheduling


  • Do your career pathways programs provide flexible scheduling or attendance via technologies such as video conferencing for students who may not be able to attend class on a regular basis? Please describe your flexible scheduling and/or attendance policies.


  • Are alternate methods of demonstrating proficiency in course/training materials available to students?


  • Please describe how course/syllabus modification techniques, such as universal design, alternate assessments, and individual learning and planning tools, are being used to effectively integrate and accommodate individuals with disabilities in the existing career pathways system.



2) Career Pathways Participant Focus Group Questions


  1. Enrollment and Onboarding


  • Is there guidance and support available for you from your school or training site? Is there guidance and support available for your family from the school or training site?


  • Is there help available for course work (e.g., tutoring, mentoring, etc.)? What help have you received? Who provided the help? Was it useful?


  • Did you get any kind of academic counseling on planning your career or selecting a career path? Tell me about the kinds of academic counseling you received. What help have you received? Who provided the help? Was it useful?


  • When you first registered for the career pathways program, did you take any kind of assessment to determine your academic skill level or career interests? Tell me about this assessment. Did you find it useful? Why or why not?


  • Does your program include any instruction related to developing soft skills?


  • Does your program include any instruction related to developing self-advocacy skills?


  • Did you have an opportunity to explore work environments through an internship, work experience, or on-the-job training? Did you find it helpful?


  • How do you feel about getting school or training site staff to listen to you when you have a concern about something? How responsive are the staff members to your needs?


  • Were you offered or did you request an accommodation and was it provided?


  • When you enrolled in your program, did you use an individualized career plan to help you in identifying your career path? If so, who helped you in developing your plan? Was self-exploration and career exploration part of the process?


  • Did you get services at your local American Job Center or through a rehabilitation counselor?

    • Did the American Job Center staff or vocational rehabilitation specialist stay in touch with you?


  • Are there other programs or services you are involved with in the community that are helping you progress on your career pathway?


  • Did somebody help you with this process? What help have you received? Who provided the help? Was it useful?



  • Why did you select this career pathway? What do you think about the career path you chose?


  • How hard are the courses you’re enrolled in? Are you working harder than you thought you would?

    • What types of services and supports being provided are most helpful to your being successful in completing your coursework/training program?

    • Are there other types of services and supports that you need that you are not receiving?


  2. Class Experience


  • Is the information you receive in class presented in a way that you understand? What would you change about the way the information is presented?


  • Do you have access to a person at the college who can work with you individually to help you with administrative issues and other issues, like managing your time and work, and getting any accommodations you may need?

    • How did you meet this person? Is this person a staff member at the school, a friend, or a colleague? If you do not have access to a person at the college, who can help you? What can you do to find out what resources are available?


  • Are the standards or expectations for your program clearly defined by the instructor?

    • Are the expectations reasonable? Why or why not?


  • Do you believe the experiences you are having in your program are adequately preparing you for the career you have chosen?

    • Are there additional experiences that you believe would enhance your preparation?


  • Does your program provide flexible scheduling, or means of attendance such as video conferencing or recorded lectures, so you can attend to other things such as work, family and errands? What happens if you miss a class or two? Can the work be made up?


  • Do you have difficulty getting transportation to your school or training site? If so, how has it been addressed?


  • Is the course material you receive provided in an accessible way? Can you access it as needed?


  • Once you receive your certificate, does the program provide ways of building on your existing skill level to higher levels of skills and credentials or degrees?


  • About how much reading and writing are you asked to do outside of class?


  • What do you think about the amount of school work you’re asked to do outside of class?


  • Do you ever contact your instructor to ask questions about the coursework?



  • Do you feel that you get the information you need from your instructor? Is it helpful?


  • Do you use email to communicate with your instructor?


  • Is the instructor responsive?


  • Has the American Job Center or educational institution assisted you with obtaining employment?


  • Have you been linked with any employers for mentoring, work experience, or other counseling?


  3. Outside of Class


  • Do you ever work with your classmates outside of class on course assignments?


  • What’s it like working with your classmates on course material?


  • Do you discuss ideas from readings or class discussions with other students or your instructors after class?


  • How do you feel about these discussions? Do you learn anything from other students? Do you find these discussions enjoyable? Is your instructor helpful when you talk with him/her outside of class?


  • Has it been difficult to juggle school and other things in your life?


  • What things have been difficult to juggle?


  • How long does it take you to get to campus?


  • Do you have your own transportation? If not, what kinds of transportation do you use? Is it reliable?


  • Do you have other barriers that may inhibit your ability to participate in your school or training program? These may include housing, funding, independent living, health challenges, etc.


  • Do you participate in extracurricular activities (sports, clubs, mentoring or peer mentoring, etc.) at your school or training site? If yes, what activities do you participate in?


  • Describe your level of confidence concerning whether you can perform as well as your non-disabled peers in your schooling or training.


  4. Disability



  • How, if at all, does your disability impact your participation and success in the career pathways program?

    • If so, what types of supports and strategies work best for you in these situations?

    • Are you able to access them as part of your training program?

    • Do you feel you are benefiting from your training program?

    • Are there additional supports that you believe would help you to obtain more benefit from your training program?


  • Was there a “disability counselor” or program at the academic institution to address accommodation requests and provide support services?


  • Did you request an accommodation? How did the educational entity handle your request?



Modification of the DEI Participant Tracking System


For the purposes of tracking individual DEI Round 5 participants and collecting information that is not captured by WIASRD or W-P, the DEI Evaluation Team will use the DEI Participant Tracking System (PTS), which is independent of the WIASRD and W-P systems. Two Participant Tracking Systems were designed for the DEI. In 2010, the first system was developed for Rounds 1-4 to collect data elements selected by DOL that were not included in the WIASRD and W-P systems. In 2014, a second PTS was created for the state of Connecticut because state legislation prohibits data with personally identifiable information from residing on servers with data from other states. Both systems are NIST-compatible and are linked to WIASRD and W-P via unique identification numbers or SSNs.


These systems allow the evaluation team to directly access DEI customer tracking information from participating AJCs, such as participation in specific DEI Round 5 service delivery strategies. Most importantly, they allow the DEI Evaluation Team to identify DEI participants from each state and WDA, alert local staff to missing WIASRD and W-P data, and provide a way for DEI grantees to collect information without modifying their WIASRD or W-P systems.
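In essence, this linkage is a join on the shared unique identifier, with unmatched records flagged for follow-up. The Python sketch below is purely illustrative; the field names and values are hypothetical and do not reflect the actual PTS, WIASRD, or W-P schemas.

```python
# Hypothetical PTS records (DEI strategy participation) and WIASRD records
# (employment outcomes), keyed by a shared unique participant identifier.
pts_records = [
    {"participant_id": "A001", "dei_strategy": "IRT"},
    {"participant_id": "A002", "dei_strategy": "Customized Employment"},
]
wiasrd_records = [
    {"participant_id": "A001", "exit_quarter_employed": True},
    {"participant_id": "A003", "exit_quarter_employed": False},
]

# Index one file by the unique identifier, then join the other against it.
wiasrd_by_id = {rec["participant_id"]: rec for rec in wiasrd_records}
linked, missing = [], []
for rec in pts_records:
    match = wiasrd_by_id.get(rec["participant_id"])
    if match:
        linked.append({**rec, **match})  # merged analysis record
    else:
        missing.append(rec["participant_id"])  # flag for local staff follow-up
```

The same pattern extends to W-P records, and the `missing` list is analogous to the alerts sent to local staff about incomplete WIASRD and W-P data.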




1 South Dakota is a single-WDA state. It will not be included in the impact evaluation because there is no comparison group that can be used to match treatment individuals with comparison individuals. It will be included in the process and outcome evaluations.

2 DOL has an MOU with the Social Security Administration (SSA) that will be used to match DEI data to SSA data for purposes of identifying American Job Center participants with disabilities based on their enrollment in SSDI, in addition to self-identification. Thus, a part of the analysis will be based on this subset of the client universe.

3 A site refers to a WDA or, in the case of South Dakota, which is a single-WDA state, an area within a WDA. To determine the number of participants with disabilities from whom data will be collected via the DEI Data System, estimates were provided by each Round 5 grantee.

4 We derived these estimates from the number of exiters with disabilities in PY 2013 in each of the six DEI states. Next, we multiplied the estimates by three, i.e., the number of implementation years, and subtracted out the estimated number of DEI participants identified by the states in the study WDAs.

5 Notice of Availability of Funds and Funding Opportunity Announcement for Disability Employment Initiative Cooperative Agreements. Announcement Type: Initial. Funding Opportunity Number: FOA-ETA-15-08. https://www.doleta.gov/grants/pdf/FOA-ETA-15-08.pdf, retrieved December 17, 2015.

6 The Stata command “xtmixed” will be used for all regression analyses to account for WDA and state fixed effects, as well as random effects.
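For readers without access to Stata, the fixed-effects portion of such a model can be illustrated with the within (demeaning) transformation. The Python sketch below uses invented toy data and is a simplified illustration under that assumption, not the analysis the team will run; xtmixed additionally estimates random effects, which are omitted here.

```python
from statistics import mean

# Toy data: outcome y and treatment indicator x for participants
# nested in three hypothetical WDAs.
wda = ["A", "A", "A", "B", "B", "B", "C", "C", "C"]
x   = [1,   0,   1,   0,   1,   0,   1,   0,   0]
y   = [10,  8,   11,  5,   8,   4,   12,  9,   10]

# Within transformation: subtract each WDA's mean from y and x,
# which sweeps out the WDA fixed effects; then run simple OLS.
gx = {g: mean(xi for xi, w in zip(x, wda) if w == g) for g in set(wda)}
gy = {g: mean(yi for yi, w in zip(y, wda) if w == g) for g in set(wda)}
xd = [xi - gx[w] for xi, w in zip(x, wda)]
yd = [yi - gy[w] for yi, w in zip(y, wda)]
beta = sum(a * b for a, b in zip(xd, yd)) / sum(a * a for a in xd)
```

Here `beta` is the fixed-effects estimate of the treatment coefficient, identified only from variation within each WDA.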



7 Site visits will include visits to each treatment and comparison WDA. Within each WDA, interviews will be conducted with LWIB staff, American Job Center staff, including DRCs and Employment Counselors, public and private agency partners and employers.

8 Given our experience with the DEI through four rounds of grantees, we expect the response rate for site visit interviews to be 100%. The DEI Evaluation Team provides reminders via email to each site visit interview respondent 24 hours in advance to ensure their participation.

9 A t-test is used to determine whether the means of two groups are significantly different from each other. Research Methods Knowledge Base. http://www.socialresearchmethods.net/kb/stat_t.php, retrieved December 17, 2015.
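For illustration, the t-statistic can be computed directly from two samples. The Python sketch below uses Welch's unequal-variance form with hypothetical outcome data; it is not drawn from the evaluation itself.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Two-sample t-statistic (Welch's unequal-variance form) and the
    Welch-Satterthwaite approximate degrees of freedom."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n-1)
    se2 = va / na + vb / nb
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical weekly-earnings outcomes for a treatment and comparison group
treatment = [310, 295, 330, 340, 305, 325]
comparison = [280, 300, 290, 275, 285, 295]
t_stat, df = welch_t(treatment, comparison)
```

A t-statistic well above about 2 in absolute value (given the degrees of freedom) would indicate a statistically significant difference between the group means.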

10 Individuals are listed in alphabetical order.

11 Adapted from the U.S. Department of Labor, Employment and Training Administration, August-September 2011
