
Site Visit Data Collection Request for American Recovery and Reinvestment Act funded Grants; Job Training Evaluations

OMB: 1205-0486


Part A: Supporting Statement for Paperwork Reduction Act Submission: Site Visit Data Collection

The U.S. Department of Labor’s Employment and Training Administration (ETA) is seeking Office of Management and Budget (OMB) approval to continue to collect site visit data from organizations that received grants under four Solicitations for Grant Applications (SGAs) issued under the American Recovery and Reinvestment Act (ARRA): Pathways Out of Poverty (POP), Energy Training Partnership (ETP), State Energy Sector Partnership (SESP), and the Health Care and Other High Growth and Emerging Industries Training grant initiative. POP, ETP, and SESP are all Green Jobs training programs. The overall aim of this evaluation is to determine the extent to which enrollees achieve increases in employment, earnings, and career advancement as a result of their participation in the training provided and to identify promising practices and strategies for replication.

Process Study Site Visits.

Implementation Evaluation Completed:

For the implementation evaluation, one round of site visits was conducted with 36 grantees and has been completed. That round of site visits was approved as part of the initial 1205-0486 request. Therefore, the burden hours and other details regarding the implementation evaluation site visits have been removed from this extension request, resulting in a decrease in burden hours and in the number of respondents and responses.


Impact Evaluation:

This research activity involves conducting two rounds of site visits to the four grantees in the impact evaluation for the purpose of documenting the program environment, participant flow through random assignment and program services, the nature and content of the training provided, the control group environment, and grantee perspectives on implementation challenges and intervention effects.

During the visits, site teams will interview key administrators and staff (including program partners and employers), using a semi-structured interview guide and will hold focus groups with participants (first round only).



1. Circumstances Necessitating the Site Visit Data Collection

As part of a comprehensive economic stimulus package funded under the 2009 ARRA, DOL funded a series of grant initiatives to promote training and employment in selected high-growth sectors of the economy. Individuals facing significant barriers to employment, as well as those who were recently displaced as a result of the economic downturn, are the high-priority labor pools targeted by these ARRA initiatives. High-growth and emerging industries are emphasized as part of the ARRA’s focus on labor demand, with a particular focus on emerging “green” sectors of the economy and pressing skill shortages in health care fields. These grant programs are consistent with ETA’s emphasis on more “customized” or “sector-based” labor market solutions, in which job seekers, including incumbent workers, who face significant barriers to economic self-sufficiency become a resource for targeted growth sectors facing skill shortages or anticipating hiring needs.

ARRA’s focus on the needs of high-growth and emerging industries to hire additional workers comes at a critical time. During periods of both recession and expansion, it is important that employers remain attentive to the challenge of building and maintaining a productive workforce to ensure their long-term competitiveness. This applies particularly in industries such as health care, education, and energy, in which the Bureau of Labor Statistics projects significant job growth over an extended time (Bureau of Labor Statistics 2010). However, several factors, including declines in educational attainment among American workers, an aging skilled workforce whose retiring members must be replaced, and continued immigration, are affecting workforce skill levels and employers’ ability to remain competitive and increase productivity (Dohm and Shniper 2007). Training programs like those funded by ARRA are designed either to provide these skills or to begin an entry-level career path toward acquiring them.

ETA’s grant programs represent an important step toward increasing postsecondary education and training in high-growth areas, particularly health and green jobs. They provide needed resources for training, encourage partnerships between different service delivery systems, feature strong employer involvement, and focus on the provision of innovative and promising training strategies. To learn about the impacts of this significant investment in training programs, ETA has funded a rigorous evaluation using a random assignment research design and a comprehensive implementation evaluation.

Previous research in the training field has provided insight into the educational and economic effects of training on participants. However, many of those studies did not use random assignment, which leaves them open to concerns about selection bias and makes it difficult to determine what outcomes would have been in the absence of the training services. Assessing the impacts of these training programs effectively requires a rigorous design and careful implementation of random assignment. The process study will supply the contextual knowledge needed to interpret the net impact findings.


Overview of the Impact Evaluation

The overriding goals of this evaluation are to determine the extent to which enrollees achieve increases in employment, earnings, and career advancement as a result of their participation in training provided by the Green Jobs and Health Care grantees and to identify promising practices and strategies for replication. The impact study will use an experimental design involving random assignment to measure the impact of the program, as well as a process study to examine implementation and operations. The random assignment study will be conducted in four grantee programs. ETA will select grantees based primarily on the perceived strength and scale of their intervention. Therefore, we will not supply estimates of the impact of the grant programs as a whole, but rather will provide results on interventions operated by selected grantees.

The evaluation will address the following research questions:

  • What is the impact of the programs on the receipt of education and training services, in terms of both the number who receive these services and the total hours of training received?

  • What is the impact of the programs on the completion of educational programs and the receipt of certificates and credentials from the training?

  • What is the impact of the program on the employment levels, earnings, and career advancement of participants?

  • To what extent do the programs result in any employment (regardless of sector)? To what extent do the programs result in employment in the specified sector in which the training was focused?

  • What features of the programs are associated with positive impacts, particularly in terms of target group, curricula and course design, and additional supports?

  • What are the lessons for future programs and practices?

For this evaluation, the treatment condition is defined as having the opportunity to enroll in training funded by either the Green Jobs or the Health Care grants. The treatment condition will vary from site to site depending on the grantees selected for the evaluation and the nature and context of the training programs those organizations choose to implement with their grant funds. The control condition, or counterfactual, is defined as not having the opportunity to enroll in training funded by Green Jobs or Health Care grants. However, control group members will not be prevented from enrolling in other locally available training programs or services in the community. We recognize that some people assigned to the control group will find opportunities to receive some form of training. This configuration, a comparison of access to the focal program’s services with other services available in the community, is a common design for random assignment studies of training programs. It is also one that answers the relevant policy question: Does adding the program services funded by the Green Jobs and Health Care grants to the configuration of training services already available in the community improve participant outcomes?

At each selected impact study site, individuals will be randomly assigned to a treatment or control group. A total of 4,000 sample members will be selected overall, with target sample sizes varying by site, as shown in Table A.1.

Table A.1. Sample Sizes for the GJ-HC Impact Evaluation

Site   | Treatment Group Members | Control Group Members | Total Sample
Site 1 | 600                     | 600                   | 1,200
Site 2 | 575                     | 475                   | 1,050
Site 3 | 600                     | 300                   | 900
Site 4 | 425                     | 425                   | 850
Total  | 2,200                   | 1,800                 | 4,000
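The mechanics of random assignment are not spelled out in this document, but for illustration, the sketch below shows a minimal site-level randomization routine that honors the Table A.1 targets. The function and variable names (assign_site, SITE_TARGETS) are hypothetical and not part of the study’s actual procedures.

```python
import random

# Target sample sizes by site, taken from Table A.1:
# (treatment group members, control group members).
SITE_TARGETS = {
    "Site 1": (600, 600),
    "Site 2": (575, 475),
    "Site 3": (600, 300),
    "Site 4": (425, 425),
}

def assign_site(site, applicant_ids, seed=12345):
    """Randomly assign one site's applicants to treatment or control,
    matching that site's target counts from Table A.1 exactly."""
    n_treat, n_control = SITE_TARGETS[site]
    if len(applicant_ids) != n_treat + n_control:
        raise ValueError(f"{site} expects {n_treat + n_control} sample members")
    statuses = ["treatment"] * n_treat + ["control"] * n_control
    random.Random(seed).shuffle(statuses)
    return dict(zip(applicant_ids, statuses))
```

For example, assign_site("Site 3", ids) would produce the 2:1 treatment-control split targeted for that site.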





Overview of Data Collection

Addressing the research questions adequately requires collecting detailed data from multiple sources across multiple points in time.

Site Visits for the Impact Study.

A rigorous random assignment evaluation requires clear and specific documentation of the services provided to treatment group members in each of the grantee sites and the services available to control group members. This qualitative information will enable the evaluation team to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication. The process study site visits will include semi-structured interviews and focus group discussions with various program stakeholders. Potential respondents and discussion topics are listed below.

  • Interviews with administrators and staff (including instructors and counselors) at each site will document the program services provided to the treatment group. These interviews will collect detailed information on a full range of process study topics including: economic and programmatic context; program and organizational structure; programmatic priorities; recruitment; service delivery systems; key partners; linkages with employers; nature and content of training and support services; funding resources; and the sustainability of the grant program after the grant period. Our overriding aim is to gain a complete understanding of the range of factors (programmatic, institutional, and economic) that serve to facilitate or inhibit the successful implementation and operation of the program. These interviews will also allow us to identify and obtain information on other programs and services that may be available in the communities in which grant services are offered.

  • Interviews with key program partners (e.g., One-Stop Career Centers, TANF agencies, community colleges) will help us understand the history of the partnership, the current relationships among the collaborating organizations, and the range of services provided. We also hope to interview partners that may provide services to control group members.

  • Interviews with employers (two to three key employers from relevant sectors) will help us understand the extent to which critical “demand side” considerations have been integrated into the program model. The interviews will include discussions of employers’ roles in the planning process, their roles in program and curricula design, and their experiences with placement, hiring, and post-program employment of participants.

  • Focus group interviews with treatment group students will be important to understanding service utilization, the reasons why services are or are not successful in achieving their goals, and insights on job advancement or job loss. To supplement the two follow-up surveys of participants, we will conduct informal group discussions with a small number of students in the treatment group as part of the first round of site visits. We will explore what treatment group members hear and know about programs and services; the reasons individuals use or do not use program services; particular challenges they may face in attending or completing school; participant knowledge of available resources; and perceptions about the likelihood of career advancement. A protocol for these discussions is presented in Appendix A. For the focus groups, site staff will assist in identifying approximately eight students who would be available to participate; this will be a convenience sample and is not intended to represent the broader group of participants.


To develop such documentation, a team of two experienced evaluators will visit each site at two points over the course of the evaluation, following a specific, detailed field visit protocol and interview guide. The first round of visits will be conducted immediately after OMB approval is received and will last three days. These visits will focus on documenting the initial implementation of the programs and will include interviews with administrators, staff, partners, and employers, as well as focus groups. The second round of site visits will occur 9 to 12 months after the start of random assignment, when programs have reached maturity. For the POP program, there may not be a second opportunity, which makes conducting the first site visit while those programs are still in operation all the more important.

The site visit data collection will follow a standard protocol for conducting semi-structured interviews with selected staff and administrators. Site teams will conduct interviews with individual staff and administrators in a private office or room on-site following established procedures for maintaining strict individual privacy. Notes from the interview will be handwritten or entered onto a laptop computer. After each visit, the field notes will be stripped of any identifying information to guard against any violations of privacy provisions. Notes will be stored in a secure computer or file cabinet at Abt Associates or its partner that can only be accessed by the evaluation team.

Table A.3. Program Dimensions Examined in Process Study Interviews

Local context: Broad community context in which the program operates and services are delivered
  • Socioeconomic and ethnic profile of the population
  • Unemployment rates, availability of jobs, characteristics of available jobs
  • Range of education and training opportunities in the community
  • Availability of public and financial supports
Key respondents: program administrators; program partners; employers

Organizational structure and institutional partners: Characteristics of the grantees, including organizational characteristics, staffing, and program partners
  • Organizational structure: size, operational structure, funding, history, leadership, linkages with other systems (local workforce boards, community colleges)
  • Staffing: number and roles of staff (planned and actual)
  • Partners: program services offered and delivered, how services are coordinated
Key respondents: program managers; program service delivery staff (teachers, counselors, other professionals); program partners

Program design/operations: Strategies used by the program to deliver curricula/services or organize activities
  • Outreach and recruitment strategies (planned and actual)
  • Assessment and case management
  • Course requirements and schedules
  • Instructional methods and curricula
  • Counseling and other support services
  • Location of services and activities
  • Role of employers
  • Changes over time
Key respondents: program managers and staff; program partners; employers

Service receipt/utilization:
  • Services received by treatment group members; number and type of services received; length of services
  • Other education, job training, and support service programs available to control group members
Key respondents: program managers and staff; participant focus groups

Participant perspective: Factors that affect use or non-use of services
  • How participants heard about services; messaging
  • Challenges and facilitators to using services
Key respondents: participant focus groups

Implementation accomplishments/challenges: Factors that facilitated or impeded the effective delivery of services to participants
Key respondents: program managers and staff; program partners; employers

2. How, by Whom, and for What Purposes Will the Information Be Used?



ETA requests clearance to conduct site visits. The data from the site visits will enable the team to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication.

The site visits for each study are described in detail below, along with how, by whom, and for what purposes the information collection will be used.

Site Visits for the Impact Study

The site visits will involve semi-structured interviews with administrators and staff, key program partners, employers, informal group discussions with students in the treatment group, and observations of program activities. The site visits are needed for the following purposes:

  • To describe the program design and operations in each site. Because the program as it is described “on paper” (in the grant application or other materials) may differ from the actual program being tested, researchers will collect data during the site visits that will enable them to describe the programs as implemented.

  • To examine the treatment-control differential to help interpret impact results. Impacts on employment patterns, job characteristics, earnings and other outcomes will presumably be driven by differences in the amount and/or types of training and other services received by members of the treatment and control groups. Because the control group can access training and other services outside the grant-funded program, during the site visits researchers will collect data that will enable them to describe and establish levels of service receipt for both treatment and control group members. For example, researchers will collect information on other sources of similar training (including those within the same institution) and sources of funding for training (e.g., other programs).

  • To identify lessons learned for use in replication efforts. Data collected during the site visits—considered within the context of the impact results—will be the key source for formulating recommendations about how to replicate successful programs or improve upon their results. These data will also be used to identify lessons learned about the relative effectiveness of particular training strategies. While it may not be possible to completely disentangle which factors are driving differences in impacts across sites, to the extent possible, the researchers will identify factors that appear to be linked to success, as well as those that are not.

Description of site visits

Two-person site teams will conduct two rounds of site visits. The teams will schedule their first visit shortly after clearance is received and will spend three days at each site. These visits will focus on documenting the initial implementation of the programs and will include semi-structured interviews with administrators, staff, partners, and employers, group interviews with students in the treatment group, and observations of grantees’ activities. Site teams will conduct the second round of site visits 9 to 12 months after the start of random assignment, when programs have reached maturity; these visits will focus on changes and developments in the provision of services as well as issues regarding the sustainability of the grant program. Given that we will already have a basic understanding of the program and its operation, these visits will be two days in length (vs. three days in the first round).

The site visit team will work closely with the sites to arrange the most convenient and expeditious time to visit their program. The evaluation team will also hold site visitor training for all staff involved in the visits. After each site visit, the data and information collected will be summarized and maintained in site-specific databases.

Site visit team members will use prepared discussion guides to conduct the semi-structured interviews and will be guided by a set of protocols covering the types of information needed to advance our understanding of the training programs. The guide (see Appendix A) provides an outline of key topics of interest with sample questions and probes. The semi-structured interview guide is deliberately designed to give site visitors maximum flexibility to tailor each discussion to the perspective of the respondent and the unique circumstances of each site, while still ensuring that all key topic areas of interest are addressed in every site visit. While we will try to capture as much detail as possible, the questions in the discussion guide will remain open-ended in style to give respondents the greatest freedom to answer in their own words.

3. Use of Improved Technology to Reduce Burden

The data collected through the site visits will be recorded electronically, which imposes no additional burden on the grantees or participants.

4. Efforts to Identify Duplication

The data to be collected during the site visits are not available from any other source. No other data source provides detailed information on the program context, program services, control group environment, and implementation challenges and successes. The first and second rounds of site visits will provide different sets of information about program operations, with the first round focusing on grant activities up to that point and the second round focusing on changes since the first visit.

5. Methods to Minimize Burden on Small Businesses or Entities

This data collection does not involve small businesses or other small entities.

6. Consequences of Not Collecting the Data

The information collected through the process study site visits will enable the team to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication. The consequence of not collecting the data from the field-based process analysis would be a lack of in-depth information about the nature of the strategies developed and employed at grantee sites to improve the educational, employment, and training outcomes of the students they serve. If the in-depth interviews and focus groups are not conducted, there will be no information regarding the context, design, implementation, operation, outcomes, and replicability and sustainability of the grant programs. Site visits will provide an opportunity to fully document the services being delivered to treatment group members and, for the impact study, the services potentially available to control group members. This documentation is an essential part of an experimental design for understanding, for example, whether employment outcomes at various points after random assignment are associated with differences in the services received by treatment and control group members. If there are positive net impacts for the treatment group, it will be vital to understand the specific intervention(s) received by treatment group members so that they can be replicated by other employment and training programs.

7. Special Data Collection Circumstances

This data collection effort does not involve any special circumstances.

8. Federal Register Notice and Consultations Outside the Agency

a. Federal Register Notice

The notice soliciting comments on the proposed collection was published in the Federal Register on Friday, April 20, 2012, Vol. 77, No. 77.

b. Consultations Outside the Agency

Consultations with experts in the field on the research design, sample design, and data needs are part of the study design phase of the evaluation. The purposes of these consultations are to ensure the technical soundness of the study and the relevance of its findings and to verify the importance, relevance, and accessibility of the information sought in the study.

Peer Review Panel Members Consulted for the Impact Evaluation

1. Maureen Conway, [email protected]

2. Harry J. Holzer, [email protected]

3. Robert J. LaLonde, [email protected]

4. Larry Orr, [email protected]

5. Burt S. Barnow, [email protected]

6. Mindy Feldbaum, [email protected]

9. Respondent Payments

No payments will be made to the administrators, staff, partners, or employers interviewed for the process study site visits. We do, however, plan to pay focus group participants $25 (provided as a gift card) for attending the focus groups.

10. Confidentiality

Impact Evaluation:

Abt Associates and Mathematica Policy Research have a strong set of methods in place to ensure the privacy and protection of all data collected from study participants. This includes policies and procedures related to privacy, physical and technical safeguards, and approaches to the treatment of personally identifiable information (PII).

  1. Privacy Policy

Abt and Mathematica are very cognizant of federal, state, and DOL data security requirements. All Abt and Mathematica study staff comply with relevant policies related to secure data collection, data storage and access, and data dissemination and analysis. All staff working with PII sign data security agreements. Abt’s and Mathematica’s security policies meet the legal requirements of the Privacy Act of 1974; the Family Educational Rights and Privacy Act of 1974 (the “Buckley Amendment”); the Freedom of Information Act; and related regulations, to assure and maintain the privacy of program participants to the maximum extent allowed by law.

  2. Privacy Safeguards

Process Study Site Visits. The administrators and staff interviewed by evaluators on-site will be assured that their responses will be combined with those from other sites for analysis, that individuals will not be identified in any reports, and that interview notes will not be shared with ETA. Individuals will be interviewed separately and in private offices. (See the protocol in Appendix A for the statement that will be used during site visits to assure respondents of privacy.) To preserve privacy, paper copies of interview notes will be secured in a locked file cabinet. Any notes recorded on laptop computers will be stored in a SQL Server database located in an access-controlled server room at Abt Associates.


11. Questions of a Sensitive Nature

No sensitive questions will be asked during the site visits.

12. Estimates of Annualized Burden Hours and Costs

The hour burden estimate for the collection of information that is part of this clearance request consists of the site visits for the process study.


The Burden Estimates for the Impact Evaluation:

We will interview an average of eight administrators and staff in each of the four sites included in the evaluation at two points over the course of the evaluation. In each site, we will also interview an average of three program partners and two employers, and we will conduct one participant focus group with an average of eight students (first round only). The estimated response rate is 100 percent, since evaluators will confirm scheduled interview times with key administrators and staff and set up the focus group in advance when arranging the site visits. The estimated response time is an average of 45 minutes for the individual interviews and 90 minutes for the focus group. Total estimated response burden for the site visits is 126 hours (see Table A.6).

Table A.6. Burden Estimates for Impact Study Site Visits

Respondents              | Number of Respondents/Instances of Collection | Frequency of Collection | Average Time per Response | Burden (Hours)
Administrators & staff   | 32 | Twice | 45 minutes | 48
Program partners         | 12 | Twice | 45 minutes | 18
Employers                |  8 | Twice | 45 minutes | 12
Treatment group students | 32 | Once  | 90 minutes | 48
Total for site visits    | 84 | --    | --         | 126
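As a check on Table A.6, the burden arithmetic (respondents × visits × average response time) can be reproduced directly. The sketch below only restates the table’s own figures:

```python
# Rows from Table A.6: (respondent group, respondents, visits, minutes per response).
ROWS = [
    ("Administrators & staff",   32, 2, 45),
    ("Program partners",         12, 2, 45),
    ("Employers",                 8, 2, 45),
    ("Treatment group students", 32, 1, 90),
]

total_hours = 0
for group, respondents, visits, minutes in ROWS:
    hours = respondents * visits * minutes / 60
    total_hours += hours
    print(f"{group}: {hours:.0f} hours")

print(f"Total burden: {total_hours:.0f} hours")  # 48 + 18 + 12 + 48 = 126
```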



The total annualized cost to staff for the process study visits is presented below in Table A.7. The total estimated cost for these data collection activities is $2,364. The average hourly wage used in that table, $18.76, is based on the BLS average hourly earnings of production and nonsupervisory employees on private, service-providing, nonfarm payrolls (September 2010 National Industry-Specific Occupational Employment and Wage Estimates, U.S. Department of Labor, Bureau of Labor Statistics, available on the department’s website).1

Table A.7. Total Annualized Cost Estimates for Site Visit Data Collection for the Impact Study

Data Collection Activity  | Total Burden Hours | Average Hourly Wage | Total Annualized Cost
Process Study Site Visits | 126                | $18.76              | $2,364
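The cost figure is simply the product of the total burden hours and the average hourly wage; a two-line sketch of the arithmetic:

```python
total_burden_hours = 126      # from Table A.6
average_hourly_wage = 18.76   # BLS average hourly earnings, September 2010

print(f"${total_burden_hours * average_hourly_wage:,.2f}")  # $2,363.76, reported as $2,364
```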



13. Estimates of Annualized Respondent Capital and Maintenance Costs


The proposed data collection for the on-site visits will not require respondents to purchase equipment or services or to establish new data retrieval mechanisms. There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection. The field-based process data collection involves semi-structured interviews covering staff and administrators’ descriptions of services and service delivery, along with their experiences, opinions, and factual knowledge. The only cost to respondents is therefore the time spent being interviewed; these costs are captured in the burden estimates provided in Item 12.


(a) We do not expect any total capital and start-up costs.

(b) We do not expect extensive time spent on generating, maintaining, and disclosing or providing the information.


14. Estimated Annualized Cost to the Federal Government

Following is the annual cost to the federal government for the entire Green Jobs and Health Care Impact Evaluation. There are two approaches to annualizing these costs over the next three years: divide the five-year total ($7,992,852) by five, for an average annual cost of $1,598,570, or sum the first three years and divide by three, for an average of $1,389,910. It is important to note that these figures are total costs for the entire evaluation, not just for the site visits that are the subject of this submission.

Table A.8. Annual Costs for GJ-HC Impact Evaluation

Year  | Dates     | Cost
1     | 2010-2011 | $1,466,492
2     | 2011-2012 | $1,012,994
3     | 2012-2013 | $1,690,244
4     | 2013-2014 | $1,997,998
5     | 2014-2015 | $1,825,124
Total |           | $7,992,852
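Both averaging approaches described above follow directly from the Table A.8 figures, as this short sketch verifies:

```python
# Annual evaluation costs from Table A.8, Years 1 through 5.
annual_costs = [1_466_492, 1_012_994, 1_690_244, 1_997_998, 1_825_124]

print(f"{sum(annual_costs) / 5:,.0f}")      # 1,598,570 (five-year average)
print(f"{sum(annual_costs[:3]) / 3:,.0f}")  # 1,389,910 (first-three-year average)
```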



15. Changes in Burden

The implementation study interviews have been completed; therefore, the burden estimate has been reduced to reflect only the impact study interviews. In the course of adjusting the Agency’s burden estimates for the current submission, one information collection (IC) was consolidated with the remaining two in order to best organize and reflect this reduction in burden.

16. Publication Plans and Project Schedule

Impact Evaluation:

Baseline data collection began in late spring 2011. Two major project reports will be prepared: (1) the interim report, which will draw on 18-month follow-up data to present the key short-run findings of the impact analysis; and (2) the final report, which will use 18- and 36-month follow-up data to present findings on long-run program impacts. At the conclusion of the study, the project will also create a public use data file stripped of personally identifiable information. Table A.9 gives the timeline for the deliverables.

Table A.9. Study Timeline

Time        | Activity
Summer 2011 | Baseline data collection began
Winter 2011 | First round of process study site visits conducted
Fall 2012   | Second round of process study site visits conducted
Winter 2013 | Baseline data collection ends; 18-month participant survey begins
Fall 2014   | Interim report summary published
Summer 2015 | Survey data collection ends; 36-month participant survey begins
Fall 2016   | Final report summary published; public use data file available



17. Reasons for Not Displaying Expiration Date of OMB Approval

The expiration date for OMB approval will be displayed on all forms associated with this data collection.

18. Exception to the Certification Statement

Exception to the certification statement is not requested for the data collection.


1U.S. Department of Labor, Bureau of Labor Statistics, Table B-8. Average hourly and weekly earnings of production and nonsupervisory employees on private nonfarm payrolls by industry sector, seasonally adjusted (accessed from the following website as of September 2010: http://www.bls.gov/webapps/legacy/cesbtab8.htm)
