H-1B 1205-0507 Supporting Statement_final 7 24 2013


H-1B Technical Skills Training Grants and H-1B Jobs and Innovation Accelerator Challenge Grants

OMB: 1205-0507


SUPPORTING STATEMENT for OMB Approval Number 1205-0507

H-1B Technical Skills Training (TST) and Jobs and Innovation Accelerator Challenge (JIAC) Grants Reporting and Recordkeeping Requirements, Change Request



A. JUSTIFICATION


This is a justification for revisions to the Department of Labor, Employment and Training Administration’s (ETA) request to implement 1) reporting, 2) recordkeeping, and 3) program evaluation requirements of both the H-1B Technical Skills Training Grants (H-1B) [SGA/DFA PY-10-13] Rounds 1-4 and the H-1B Jobs and Innovation Accelerator Challenge Grants (JIAC) [SGA/DFA PY-10-15] Rounds 1-2. This reporting structure features standardized data collection on program participants and quarterly narrative, performance, and Management Information System (MIS) report formats. All data collection and reporting will be done by grantee organizations (state or local government, not-for-profit, or faith-based and community organizations) or their sub-grantees.


Revisions to ETA’s original request are being made to specify the process of site evaluation originally described in the previous approval. In addition to the program evaluation data collection activities originally described (the baseline data collection), the evaluator will conduct data collection activities for the purposes of site selection and a process study. These data collection activities are described below in addition to the previously approved baseline data collection effort.


The revisions to the request are limited to the program evaluation data collection activities for the H-1B TST grants—no changes will be made to reporting or recordkeeping for the H-1B TST grants or the H-1B JIAC grants.


Performance Monitoring


ETA previously received approval for performance data collection from grantees using a file upload system that will validate participant records. This approach will address the Agency’s goal of minimizing reporting errors by using a data validation process.


Data validation will be required quarterly and consists of two parts:


  1. Data element validation must be completed within 45 days after the end of the performance reporting quarter, when participant records are due to ETA.

  2. Report validation should be performed prior to the submission of quarterly reports to ETA and must be submitted within 45 days after the quarter ends.


Data validation assesses the accuracy of data collected and reported to ETA on program activities and outcomes. To meet the Agency’s goals of collecting accurate and reliable data, ETA has utilized and enhanced existing data validation software (Enterprise Data Report Validation Software – E-DRVS). Performance monitoring baseline data will consist of participant demographic characteristics, types of services received, placements, outcomes, and follow-up status. Each requested data element can be reviewed in detail in the document H-1B Data Elements Performance Evaluation.


In conjunction with baseline data, ETA is requesting the collection of quarterly narrative reports that will provide a detailed account of program activities, accomplishments, and progress toward performance outcomes during the quarter.


Quarterly performance reports will be produced using the participant records submitted to ETA and will include an aggregate account of program activities each reporting quarter. Specifically, these reports reflect data on individuals who receive employment and training services, assessment services, skill upgrading, placement into unsubsidized employment, and other services essential to helping workers gain the skills and competencies needed to obtain or upgrade employment in high-growth industries and/or occupations for which employers are currently using H-1B visas to hire foreign workers.


The accuracy, reliability, and comparability of program reports submitted by grantees using Federal funds are fundamental elements of good public administration and are necessary tools for maintaining and demonstrating system integrity. The use of a standard set of data elements, definitions, and specifications at all levels of the workforce system helps improve the quality of performance information that is received by ETA.


Evaluation Collection


Through the original version of this ICR, ETA received approval to create a Participant Tracking System (PTS), which will be populated by information collected through the MIS. This baseline information will be collected by H-1B grantees providing technical skills training for all individuals who seek grantee services. The baseline data (as described in the document “H-1B Data Elements Performance Evaluation – H-1B Evaluation Data Elements”) covered by this ICR will enable the evaluation to describe the characteristics of study participants at the time they are randomly assigned to a treatment or control group, ensure that random assignment was conducted properly, create subgroups for the analysis, provide contact information to locate individuals for follow-up surveys, and improve the precision of the impact estimates. Such data will be collected on the basis that the evaluation of the technical skills training will consist of an experimental design employing random assignment of participants into treatment and control groups (JIAC grantees will be evaluated using an implementation study). The treatment for the TST grants is envisioned to be the technical skills training that would be provided. The design will include a treatment/no-treatment component in which half of the eligible participants, selected randomly, will not be provided the training. The random assignment component will be the subject of a future information collection request, and that feature will not be implemented without OMB authorization; in the interim, however, the agency has a need to obtain baseline data.
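
In procedural terms, the envisioned design is a 50/50 lottery among eligible, consenting applicants. A minimal sketch of such an assignment routine is shown below (illustrative only; the function name, identifiers, and seed are hypothetical, and the evaluator’s actual procedure may differ):

```python
import random

def assign_groups(participant_ids, seed=2013):
    """Randomly split eligible, consenting participants into equal
    treatment and control groups (a 50/50 lottery)."""
    rng = random.Random(seed)  # fixed seed keeps the assignment reproducible and auditable
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"treatment": ids[:half], "control": ids[half:]}

# 1,000 eligible applicants -> 500 offered training, 500 not
groups = assign_groups(range(1000))
```

Because assignment is random, any systematic difference in later outcomes between the two groups can be attributed to the training offer rather than to pre-existing differences.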


In the revisions to the ICR, ETA is requesting approval for two additional data collection activities for the H-1B TST evaluation: 1) Telephone interviews and site visits for site selection, and 2) Process study site visits. Both data collection efforts are described below.


Telephone interviews and site visits for site selection. This research activity involves conducting telephone interviews with program directors and other key staff for up to 24 grantees to identify possible sites for the random assignment study. From these interviews, the research team will identify 12 sites that appear to best meet the site selection criteria. The research team will then conduct 1- to 2-day site visits to these 12 sites to assess their capacity to participate in the study, including the nature of the services provided; the suitability of the enrollment process for random assignment; barriers to the implementation of a random assignment design; and the ability to over-recruit for the control group in order to meet sample size requirements. Based on the site visits, and in consultation with ETA, the research team will select six sites for the evaluation.


Process study site visits. This research activity involves conducting site visits to up to eight grantees for the purpose of documenting the program environment, participant flow through random assignment and program services, the nature and content of the training provided, and grantee perspectives on implementation successes and challenges. Site visits will be conducted for two categories of grantees—six grantees that are selected to participate in the random assignment study, and two grantees that are selected for an implementation study focused on special topics. During the visits site teams will interview key administrators and staff (including program partners and employers) using a semi-structured interview guide.


A.1. Circumstances Necessitating Data Collection


The H-1B Technical Skills Training and Jobs and Innovation Accelerator Challenge Grants are authorized under Section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998 (ACWIA), as amended (29 U.S.C. 2916a) and are designed to support applicants in providing education, training, and job placement assistance in occupations and/or industries that have high-growth potential for which employers are using H-1B visas to hire foreign workers, and the related activities necessary to support such education, training, and placement activities. In addition to reporting participant information and performance-related outcomes, H-1B grantees must agree to participate in a random assignment evaluation and must also demonstrate a selected training strategy.


29 CFR 95.51(b) (59 F.R. 38271, July 27, 1994), which governs monitoring and reporting program performance under grants and agreements with non-profit organizations, states that DOL shall prescribe the frequency with which performance reports shall be submitted, and that performance reports shall not be required more frequently than quarterly or less frequently than annually. If ETA does not comply with these requirements, funding for demonstration programs would be compromised. In applying for H-1B grants, grantees agree to meet ETA’s reporting requirements as indicated in the Solicitation for Grant Applications (SGA/DFA PY-10-13, SGA/DFA PY-10-15), which requires the submission of quarterly reports within 45 days after the end of the quarter.


Reports:

Three outcome measures will be used to measure success in the H-1B grants: entered employment rate, employment retention rate (this includes incumbent workers who retain their positions or advance into new positions and get wage gains after the program), and average six-month post-program earnings. All of these conform to the common performance measures implemented across DOL job training programs as of July 1, 2005. By standardizing the reporting and performance requirements of different programs, the common measures give ETA the ability to compare across programs the core goals of the workforce system – how many people enter jobs, how many stay employed, and how many successfully complete an educational or vocational training program. In addition to the three outcome measures, grantees will report on a number of leading indicators that serve as predictors of success. These include placement into unsubsidized jobs, attainment of degrees or certificates, placement into post-secondary education or vocational training, on-the-job training (OJT), classroom occupational training, contextualized learning, distance learning, and customized training, including incumbent worker training, and placement into high-growth industries and occupations.


In applying for the H-1B grants, grantees and their sub-grantees agree to submit participant-level data and aggregate reports on participant characteristics, services provided, placements, outcomes, and follow-up status. Grantees will collect and report quarterly H-1B performance data using a new enhanced web-based reporting application based on an existing ETA-provided Management Information System (MIS). Grantees will also collect participant Social Security Numbers which will allow ETA to match wage records for grantees and lessen the burden on grantees to track post-program outcomes.


The progress reports have two components: a performance report and narrative report. Both reports are submitted on a quarterly basis by all H-1B grantees. The quarterly performance reports include aggregate data on demographic characteristics, types of services received, outcomes, and retention and follow-up status. The quarterly narrative reports provide a detailed account of program activities, accomplishments, and progress toward performance outcomes during the quarter. The new web-based MIS system will collect participant files from grantees, validate the files and generate a quarterly performance report.


The American Competitiveness and Workforce Improvement Act of 1998, Title IV, Section 414 (c), as amended by the Consolidated Appropriations Act of 2005, Division J, Title IV, Subtitle B, Section 428 directs the Secretary to require grantees to report on the employment outcomes obtained by workers receiving training under this subsection using indicators of performance that are consistent with other indicators used for employment and training programs administered by the Secretary, such as entry into employment, retention in employment, and increases in earnings.


WIA section 185 broadly addresses reports, recordkeeping, and investigations across programs authorized under Title I of the Act. The provisions of section 185:


  • Require the Secretary to ensure that all elements of the information required for reports be defined and reported uniformly [section 185(d)(2)]

  • Direct each state, local board, and recipient (other than a sub-recipient, sub-grantee, or contractor of a recipient) to prescribe and maintain comparable management information systems, in accordance with the guidelines that shall be prescribed by the Secretary designed to facilitate the uniform compilation, cross tabulation, and analysis of programmatic, participant, and financial data, on statewide, local area, and other appropriate bases, necessary for reporting, monitoring, and evaluating purposes, including data necessary to comply with section 188 [section 185(c)(2)]

  • Require that recipients of funds under Title I shall maintain such records and submit such reports in such form and containing such information as the Secretary may require regarding the performance of programs and activities carried out under Title I [section 185(a)(2)]

  • Require that recipients of funds under Title I shall maintain standardized records for all individual participants and provide to the Secretary a sufficient number of such records to provide for an adequate analysis of the records [section 185(a)(3)]

  • Specify that the reports shall include information about programs and activities carried out under Title I pertaining to:

    • Relevant demographic characteristics (including race, ethnicity, sex, and age) and other related information regarding participants

    • Programs and activities in which participants are enrolled and the length of time that participants are engaged in such programs and activities

    • Outcomes of the programs and activities for participants, including the occupations of participants and placement for participants in nontraditional employment

    • Specified costs of the programs and activities

    • Information necessary to prepare reports to comply with section 188 and 29 CFR Part 37.37 [(a-b),(d-e)]

  • Require that all elements of the information required for the reports described in section 185(d)(1)(A-E) above are defined and uniformly reported


WIA section 189(d) requires the Secretary to prepare and submit to Congress an annual report regarding the programs and activities carried out under Title I. The report must include:


  • a summary of the achievements, failures and problems of the programs and activities in meeting the objectives of Title I

  • a summary of major findings from research, evaluations, pilot projects, and experiments conducted under Title I in the fiscal year prior to the submission of the report

  • recommendations for modifications in the programs and activities based on analysis of such findings

  • such other recommendations for legislative or administrative action as the Secretary determines to be appropriate


Data Collection for Program Evaluation:


ETA received approval for baseline data collection for the evaluation through the original version of this ICR. The baseline data collection is described below.


Baseline Data Collection. During the intake period, all persons who apply for the program and are determined to be eligible will be told about the study and asked to sign an Informed Consent Form (the form is attached) confirming that they have been informed about and understand the study. Program staff will then enter the person’s data into the web-based PTS. This information includes:

  • Identifying Information. This includes complete name, address, telephone number, email, birth date, gender, and social security number (SSN). Identifiers are necessary for tracking and locating sample members for follow-up surveys and for ensuring that this data can be accurately matched with administrative records on sample members.

  • Demographic and Socioeconomic Characteristics and Employment History. This information will be used to describe the study sample and to document differences in the populations served across the study sites. This information will allow conducting subsequent analyses of subgroups.

  • Barriers to and Attitudes Toward Work. The baseline data collection effort will also ask questions about specific barriers that study participants may face (health, child care, and transportation) and attitudes toward work. This information will be used (1) to describe the sample and understand issues that could affect their ability to participate in training and work, and (2) to conduct subsequent subgroup analyses (for example, to determine whether the program was effective for those with specific barriers).

  • Locating Information. Accurate locating information is crucial to achieving high survey response rates. As mentioned above, the PTS will capture each applicant’s landline and cellular telephone numbers and email address. The form will also capture the participant’s Facebook name and alternative contact information for up to three relatives or friends who might know how to contact the sample member.


An evaluator was procured in May 2013 for the H-1B TST evaluation; as a result, ETA is requesting approval for revisions to the original ICR for two additional data collection activities: (1) telephone interviews and site visits to select sites for the evaluation, and (2) process study site visits to examine program operations. Approval for the survey design, follow-up telephone surveys, and random assignment analysis will be requested in a separate, new information collection request in early spring 2014.

Site Selection. To inform the selection of six sites for the impact analysis, the evaluation team will conduct telephone interviews with and site visits to H-1B TST grantees. Specifically, the team will gather pertinent information about grantees’ ability to meet sample size requirements, their enrollment procedures, the nature and content of their training programs, and current and planned program enrollment. The team will also assess grantees’ receptivity to study participation and identify any potential barriers to participation.

To begin the site selection process, the evaluation team will conduct telephone interviews with the program directors and other key staff from 24 grantees that appear to have potential to be included in the evaluation based on a review of their grant applications and other information available to ETA. Each telephone interview will last approximately 60 minutes. The team will then identify up to 12 grantees that best meet the selection criteria for in-person site visits. These 1- to 2-day site visits will explore in detail the capacity of grantees to meet sample size requirements within the evaluation timeframe, the services provided, the suitability of the enrollment process for random assignment, the ability to over-recruit for a control group, and any issues that would need to be addressed in implementing a random assignment research design. Following these visits, the evaluation team will recommend study candidates. ETA will select six grantees for the impact evaluation. Discussion topics for the telephone interviews and site visits are included in the attached Site Selection Telephone Interview Protocol and the Site Selection Site Visit Interview Guide.

Process Study Site Visits. The evaluation team will conduct two types of process study site visits:

  • For the six impact study sites, the team will document services available to the treatment group members as well as the types of services likely accessed by the control group members. This information will be critical for describing the program design and operations in each site, interpreting impacts, and identifying lessons learned for future investments.

  • For the two special topics sites that are not involved in the random assignment impact evaluation, the site visits will provide in-depth information on specific issues related to the design, implementation, and operations of the H-1B TST grants. We will work with ETA to determine up to six special topics to be addressed in these two sites, as well as in the impact sites. Potential topics include early start-up experiences, recruitment strategies, the design and implementation of the OJT component, and program sustainability. Working with ETA, the evaluation team will select these two sites based on their fit in addressing the chosen topics, drawing on the telephone interviews discussed above and other information from ETA.

The process study site visits will include semi-structured interviews with a number of program stakeholders. Potential respondents are listed below, and discussion topics are presented in Table 1 and in the Process Study Interview Guide.

  • Program Administrators, Staff and Program Partners. Interviews with grantee administrators and staff as well as applicable key partners (training providers, employer associations, workforce systems) at each site will document the program design and services delivered. Through interviews, the evaluation team will collect detailed information on a full range of process study topics including program and organizational structure, the service delivery system, recruitment, nature and content of training and support services, key partners, linkages with employers, programmatic priorities, funding resources, sustainability of the grant program after the grant period, and economic and programmatic context. The aim is to gain a complete understanding of the range of factors (programmatic, institutional, and economic) that serve to facilitate or inhibit the program’s successful implementation and operation. The evaluation team will also inquire about the services and providers potentially available to control group members.

  • Interviews with Employers. Each of the grantees partnered with one or more employers (or employer organizations) as critical partners and stakeholders, particularly in the OJT-focused grants. Additional employers may not play an active role in the grantee partnership but can provide insight into the readiness and productivity of program completers. The evaluation team will conduct semi-structured interviews with two or three key employers in the grantee’s target sector and with all the key employers in OJT-focused interventions. For the “other” training sites, this activity will explore the extent to which critical “demand side” considerations have been integrated into the program model, including employers’ role in the planning process, their role in program and curricula design, and their experience with placement, hiring, and post-program employment. For the OJT sites, these interviews will focus on the nature of the positions, including their length; the amount and duration of the wage subsidy; supports and supervision; the type of skill building provided; and post-placement support.

For four of the impact sites, the research team will conduct two rounds of process study site visits. For two impact sites, the research team will conduct one round of site visits and one round of telephone interviews. The first round of visits, conducted approximately six months after the start of random assignment, will focus on documenting the initial implementation of the programs and will include interviews with administrators, staff, partners, and employers as well as focus groups. The second round of site visits/telephone interviews, 12 to 18 months after random assignment begins, will focus on changes and developments in the provision of services as well as issues regarding the grant program’s sustainability. For the special topic-only sites, the research team will conduct one round of site visits.

Table 1. Key Program Dimensions Examined in Process Study Interviews

Local context: Community context in which the grant-funded program operates

  • Socioeconomic, racial, and ethnic profile of the population

  • Unemployment rates, availability of jobs, characteristics of available jobs, employment in the sector of interest

  • Education and training providers in the community

  • Availability of public and financial supports

Key respondents: program administrators, program partners, employers

Organizational structure and institutional partners: Characteristics of the grantees

  • Organizational structure: size, operational structure, funding, history, leadership, linkages with other systems (local workforce boards, community colleges)

  • Staffing: number and roles of staff (planned and actual)

  • Partners: program services offered and delivered, how services are coordinated

Key respondents: program managers, program service delivery staff (teachers, counselors, other professionals), program partners

Program design/operations: Strategies used by the program to deliver curricula/services or organize activities

  • Outreach and recruitment strategies (planned and actual)

  • Assessment and case management

  • Course requirements and schedules

  • Instructional methods and curricula

  • Structure of OJT positions, including the wage subsidy

  • Counseling and other support services, including financial support

  • Location of services and activities

  • Role of employers

  • Changes over time

Key respondents: program managers, program service delivery staff, program partners, employers

Service receipt/utilization

  • Services received by treatment group members; number and type of services received; length of services

  • Other education, job training, and support service programs available to control group members

Key respondents: program managers, program service delivery staff, program partners

Implementation accomplishments/challenges: Factors that facilitated or impeded the effective delivery of services to participants

Key respondents: program managers, program service delivery staff, program partners, employers



A.2. How, by Whom, and For What Purpose the Information Is to Be Used


Quarterly Reports:

Grantees will be expected to implement the new recordkeeping and reporting requirements with grant funds. Because a government-procured MIS will be provided to all grantees, their implementation costs will be minimized. Grant funds may also be used to upgrade computer hardware and Internet access to enable projects to utilize the ETA MIS and to accurately track participants.


Grantees will track data on individuals who receive services through H-1B programs and their partnerships with Federal and other partner agencies, and will upload participant files to the ETA MIS. These data will be used by the Department and ETA to evaluate performance and delivery of H-1B program services. In addition to the required data elements, the MIS will allow grantees to collect additional participant data beyond those elements required by H-1B. Please refer to the document H-1B Data Elements Performance Evaluation, which contains a comprehensive list of the data elements grantees will be required to collect from enrolled participants to inform ETA on program progress and outcomes. Each data element is accompanied by a definition and an explanation of how to record it. Further, ETA will provide grantees with a comprehensive Reporting Handbook - Quarterly Program Reporting and Instructions guide.

ETA will use the data to track total participants, characteristics, services, training strategies and outcomes for employed, unemployed and long-term unemployed participants. Common measures will enhance ETA’s ability to assess the effectiveness of the H-1B program within the broader workforce investment system.


Within ETA, the data are used by the Offices of Workforce Investment, Policy Development and Research, Financial and Administrative Management, Information Systems and Technology, and Regional Management (including the regional offices). Other DOL users include the Offices of the Assistant Secretary for ETA and Assistant Secretary for Policy.


The reports and other analyses of the data will be made available to the public through publication and other appropriate methods and to the appropriate congressional committees through copies of such reports. In addition, information obtained through the MIS information and reporting system will be used at the national level during budget and allocation hearings for DOL compliance with the Government Performance and Results Act (GPRA) and other legislative requirements, and during legislative authorization proceedings.


Data Collection for Program Evaluation:

ETA previously received approval to collect baseline data that will allow participants to be located in the future and will inform the evaluation. The PTS and the consent forms are described in detail below, along with how, by whom, and for what purposes the information collection will be used.


In the revisions to this request, ETA also now specifies the site selection and process study data collection activities. These activities are described below, along with how, by whom, and for what purposes the information collection will be used.


Consent to Participate in the Study:

The informed consent form will be administered to all eligible individuals at the selected sites. The grantee staff will ask the person to read the form, or will read the form to the person, and will then answer any questions. The consent form ensures that the potential study participant has been fully informed about the study, including random assignment, data collection, and privacy of the data. It ensures that people are aware of the participation requirements of the study and know that they can decline to participate or drop out at any time.


Baseline Data Collection:

We will collect basic demographic, socioeconomic, and barrier-related characteristics on all consenting persons prior to the study. We will also collect the name, address, phone number, and email address of up to three of the participant’s close friends or relatives who would likely have knowledge of his or her whereabouts at the time of follow-up data collection.

Baseline and contact data are needed for the following purposes:

  • To Locate Participants for Surveys and Collect Administrative Data. The PTS will collect detailed identifying information for each participant, including people who may know the whereabouts of the participant. This information is crucial for locating study participants for the follow-up surveys and thereby increasing the response rates. The participant’s SSN is essential for obtaining administrative data, such as Unemployment Insurance quarterly wage records that will be used to measure employment outcomes, since agencies typically provide data by matching on SSN. The SSN is also helpful in locating participants for follow-up surveys, as many of the locating services search for current address and telephone number using the SSN.

  • To Describe the Sample. Baseline data will allow researchers to describe in more detail the population being served by each program.

  • To Define Subgroups for the Impact Analyses. Baseline data on the characteristics of sample members are essential to define subgroups for which impacts can be estimated. These include characteristics such as sex, race/ethnicity, health status, employment history, barriers to employment, and attitudes toward work.

  • To Increase the Precision of Impact Estimates. By including information on the baseline characteristics of study participants in regression models, the precision of impact estimates can be improved.
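For illustration only (and not part of the approved information collection), the precision gain from including baseline characteristics in the impact regressions can be sketched with simulated data; all variable names and values below are hypothetical.

```python
import random
import statistics

random.seed(7)

# Simulated experiment (hypothetical values): the outcome depends on a true
# treatment effect of 0.2, a baseline covariate such as prior earnings, and noise.
n = 2000
treat = [i % 2 for i in range(n)]
baseline = [random.gauss(0.0, 1.0) for _ in range(n)]
outcome = [0.2 * t + 0.8 * b + random.gauss(0.0, 1.0)
           for t, b in zip(treat, baseline)]

def impact_and_se(y, t):
    """Difference in mean outcomes between treatment and control, with its standard error."""
    y1 = [yi for yi, ti in zip(y, t) if ti == 1]
    y0 = [yi for yi, ti in zip(y, t) if ti == 0]
    diff = statistics.mean(y1) - statistics.mean(y0)
    se = (statistics.variance(y1) / len(y1) + statistics.variance(y0) / len(y0)) ** 0.5
    return diff, se

# Unadjusted impact estimate: simple treatment-control difference in means.
raw_impact, raw_se = impact_and_se(outcome, treat)

# Covariate-adjusted estimate: partial the baseline covariate out of the
# outcome, which is equivalent to adding it to the impact regression.
mb = statistics.mean(baseline)
my = statistics.mean(outcome)
beta = (sum((b - mb) * (y - my) for b, y in zip(baseline, outcome))
        / sum((b - mb) ** 2 for b in baseline))
adjusted = [y - beta * (b - mb) for y, b in zip(outcome, baseline)]
adj_impact, adj_se = impact_and_se(adjusted, treat)
# The adjusted standard error is smaller, so the impact is estimated more precisely.
```

Both estimators recover the same treatment effect; the adjustment simply removes outcome variation explained by the baseline covariate, shrinking the standard error.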


The baseline data will be collected from everyone who has been found eligible and has given signed consent to participate in the study. As with the consent forms, grantee staff will be available to answer questions. For people with low literacy or other reading barriers, the grantee’s staff can provide assistance. The baseline data elements are shown in Attachment A.


Site Selection Telephone Interviews and Site Visits

Site selection will begin with telephone interviews with the program directors and other key staff from a subset of 24 grantees. Each interview will gather detail on the nature of the training programs, current and planned program enrollment, and program operations. To further assess sites’ appropriateness for the evaluation, the evaluation team will conduct site visits with up to 12 grantees. The telephone interviews and site visits are needed for the following purposes:

  • To document program size and enrollment timing. Program size is a paramount consideration in site selection. To be able to detect differences in outcomes between the treatment and control groups, the evaluation team estimates grantees must have a minimum of 1,000 study participants equally distributed to the treatment and control groups. Because the evaluation seeks to generate site-specific impact estimates, larger sites would yield more precise estimates. Timing of enrollment is important in selection of sites because those that have already enrolled a large portion of their planned program participants under the grant would have difficulty reaching the sample size necessitated by the evaluation.

  • To assess grantees’ ability to create a control group. First, grantees must be able to “over-recruit” so that they can both fill program slots and create an equally sized control group. Second, the evaluation prefers sites where the community offers no training similar to the grant-funded training under study that the control group could access. Sites in which the treatment-control contrast is sharper provide better opportunities for learning about the effectiveness of program services.

  • To document type and strength of the intervention. In order to learn about the effectiveness of different approaches to training participants, the evaluation team will document the type and strength of each grantee’s intervention. In addition to an OJT model, this may include other types of promising training approaches such as those that integrate math, science, and language classes into the technical skills training. The evaluation team will also explore the extent to which grantees integrate supportive services (e.g., career and personal counseling), soft skills development, and financial assistance into their programs. Finally, the team will document the nature of the grantee’s connection to employers and jobs. This information will be important for selecting the two additional sites for the special topic process study.

  • To assess grantees’ research capacity. A successful experimental evaluation requires a working partnership between the evaluation team and participating grantees. At a minimum, sites selected for the study must be fundamentally supportive of the basic research mission and methodology of the study and able to address any staff concerns about random assignment. In addition, grantees must be able to accommodate the operational needs of such an evaluation.
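As a hedged illustration of the sample-size reasoning behind the 1,000-participant minimum cited above, the minimum detectable effect (MDE) can be computed from standard power-analysis constants; the 80 percent power, 5 percent significance level, and unit outcome variance are assumptions for illustration, not figures from the evaluation design.

```python
import math

# Rule-of-thumb minimum detectable effect (MDE), in standard-deviation units,
# for a two-sided 5% test with 80% power. The z-values are standard; treating
# the outcome standard deviation as 1.0 is an assumption for illustration.
Z_ALPHA = 1.96   # two-sided 5% significance
Z_POWER = 0.84   # 80% power

def mde(n_treatment, n_control, outcome_sd=1.0):
    """Smallest true effect detectable with 80% power at the 5% level."""
    se = outcome_sd * math.sqrt(1.0 / n_treatment + 1.0 / n_control)
    return (Z_ALPHA + Z_POWER) * se

# 1,000 participants split evenly supports detecting effects of roughly
# 0.18 standard deviations; halving the sample noticeably inflates the MDE.
mde_1000 = mde(500, 500)
mde_500 = mde(250, 250)
```

This is why larger sites yield more precise site-specific estimates: the MDE shrinks with the square root of the sample size.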


Process Study Site Visits

The site visits will involve semi-structured interviews with administrators, staff, key program partners, and employers; informal group discussions with students in the treatment group; and observations of program activities. The site visits will be used for the following purposes:

  • To describe the program design and operations in each site. Because the program as it is described “on paper” (in the grant application or other materials) may differ from the actual program being tested, evaluators will collect data during the site visits that will enable them to describe the programs as implemented.

  • To examine the treatment-control differential to help interpret impact results. Impacts on employment patterns, job characteristics, earnings and other outcomes will be driven by differences in the amount and/or types of training and other services received by members of the treatment and control groups. Because the control group can access training and other services outside the grant-funded program, during the site visits researchers will collect data that will enable them to describe and establish levels of service receipt for both treatment and control group members. For example, evaluators will collect information on other sources of similar training (including those within the same institution) and sources of funding for training (e.g., other programs). In addition, data from the process analysis will suggest possible factors that might explain differences in impacts across sites.

  • To identify lessons learned for future investments. The process analysis will serve as a “road map” to aid policy makers and program administrators interested in sustaining efforts to address critical skill shortages. Data from the process analysis—considered within the context of the impact results—will be the key source for formulating recommendations about how to replicate successful programs or improve upon their results. These data will also be used to identify lessons learned about the relative effectiveness of particular training strategies.

  • To address special topics of interest. The process study site visits will address special topics and take an in-depth look at particular implementation and operational issues of interest to ETA. The evaluators and ETA will determine up to six special topics to be addressed through these site visits. Candidates include early start-up experiences, recruitment strategies, the design and implementation of the OJT component, and program sustainability.



A.3. Use of Technology to Reduce Burden


Reports:

To comply with the Government Paperwork Elimination Act, ETA is streamlining the collection of participant data and the preparation of quarterly reports to the extent feasible by providing a Web-based MIS and by providing uniform data elements and data definitions to grantees across ETA programs. All H-1B data and reports will be submitted to ETA via an electronic reporting system (currently in development) consistent with this objective. Grantees will collect, retain, and report all information electronically through this system and will receive comprehensive training on how and when to upload all reporting information.


To further reduce grantee burden, ETA will track Adult Common Measures on behalf of grantees, requiring them to submit only the following information on participants: 1) Social Security Numbers; 2) Employment status at participation; 3) Date of exit; and 4) Reason for exit.


Data Collection for Program Evaluation:

The PTS is a Web-based system within the existing MIS for gathering background information on participants in the evaluation. Its drop-down menus and response categories minimize the data entry burden on grantee staff.


The revisions to this request for the program evaluation site selection and process study data collection activities will not involve the use of technology.


A.4. Efforts to Identify Duplication


Reports:

The Department holds grantees accountable by requiring them to identify and work toward comprehensive performance standards and by establishing quarterly reporting for competitive projects. The data items identified in Attachment B and Attachment C will be used to generate Quarterly Performance Reports.


ETA minimized the reporting burden by keeping the number of required data elements commensurate with the level of resources expended and services received. Data items collected through program reports and individual records are needed to: (1) account for the detailed services provided by multiple agencies to help participants prepare for job placement; (2) better identify overlapping and unproductive duplication of services; and (3) support the ongoing evaluation efforts in determining the effectiveness of the program model. Information provided through the H-1B management information and reporting system is not available through other data collection and reporting systems.


Data Collection for Program Evaluation:


Baseline Data Collection

The information being recorded in the PTS for the evaluation is not otherwise available in the format required in a manner that: (1) is systematic across sites, (2) maintains integrity of the process, and (3) ensures privacy of information. The research team reviewed the quarterly reports each grantee submits to ETA and found that they present financial information and minimum participant-level data. The reports fail to provide key identifiers and contact information for future data collection, to document the characteristics of the sample in terms of education and employment history and work-related barriers, or to supply key data needed to create meaningful subgroups.


Site Selection Telephone Interviews and Site Visits

The data to be collected during the site selection process are not available from any other source. No currently available data provide detailed information on the nature of the training programs, current and planned program enrollment, program operations, and suitability for study participation. While grantee applications and progress reports to ETA provide some information on program services, they do not provide the level of detail needed to address issues of concern to the evaluation. The telephone interviews will provide information on a broader set of grantees, and the site visits conducted with a subset of promising grantees will provide a greater level of detail on program design and considerations for the evaluation, including barriers to the implementation of a random assignment design and the ability to over-recruit for the control group.


Process Study Site Visits

The data to be collected during the site visits are not available from any other source. No other data source provides the detailed information needed on the program context, program services, control group environment, and implementation challenges and successes. The first and second rounds of the site visits (or telephone interviews) will provide different sets of information about program operations, with the first round of visits focusing on grant activities until that point and the second round of visits focusing on changes since the first visit. The special topics site visits will provide in-depth information regarding specific implementation and operational issues of interest to ETA that is not available from any other source.



A.5. Methods to Minimize Burden on Small Businesses


For reporting purposes, the involvement of small businesses or other small entities that are not grantees or sub-grantees is extremely limited. The only time contacting them may be required is during the provision of a service. Methods to minimize the burden on small entities that are grantees or sub-grantees are discussed in other sections of this supporting statement.


A.6. Consequences of Less-Frequent Data Collection


Reports:

29 CFR 95.51(b) (59 F.R. 38271, July 27, 1994), which governs monitoring and reporting program performance under grants and agreements with non-profit organizations, states that DOL shall prescribe the frequency with which performance reports shall be submitted, and that performance reports shall not be required more frequently than quarterly, or less frequently than annually. If ETA does not comply with these requirements, funding for demonstration programs would be compromised. In applying for H-1B grants, grantees agree to meet ETA’s reporting requirements as indicated in the Solicitations for Grant Applications (SGA/DFA PY-10-13, SGA/DFA PY-10-15), which require the submission of quarterly reports within 45 days after the end of the quarter.


Data Collection for Program Evaluation:


Baseline Data Collection

The lack of baseline information would limit the ability to describe the population of training program participants and would limit the analysis of impacts of the program on subgroups, hence limiting the ability to determine the groups for which the program is most effective. Without baseline data, impact estimates would be less precise (so that small impacts would be less likely to be detected), and adjustments for nonresponse to the follow-up surveys would have to be based on less-detailed administrative data. Finally, if it did not collect detailed contact information for study participants, the study would not be able to track them and administer the follow-up surveys. This would likely lead to a higher nonresponse rate and thus pose a greater risk to the quality of survey data and, in turn, the impact estimates.


Site Selection Telephone Interviews and Site Visits

The information collected through telephone interviews and site visits with grantees is necessary for the evaluation team to recommend appropriate grantees to ETA for the study. The team will assess grantee appropriateness on criteria that include program size and enrollment timing. Without accurate information on grantees’ potential to recruit the expected 1,000 study participants, the evaluation would risk having too small a sample to detect the impacts of the H-1B TST grants. Additional information to be gathered during these data collection activities, such as the type and strength of the intervention, the ability to create a control group, and research capacity, is critical to answering the study’s research questions. Existing grantee information is not sufficient to guide the team’s recommendations.


Process Study Site Visit Data Collection

The information collected through the process study site visits is necessary for the evaluation team to describe the program design and operations in each site in detail. The information will be used to interpret the impact analysis results, identify lessons learned for future investments, and address special topics of interest to ETA. Without these data, the evaluation team would lack in-depth information about the strategies developed and employed at grantee sites to improve the educational, employment, and training outcomes of the participants they serve, as well as an understanding of the control group service environment. That understanding is essential in an experimental design for determining, for example, whether employment outcomes at various points after random assignment are associated with differences in the services received by treatment and control group members. If there are positive net impacts for the treatment group, it will be vital to understand the specific intervention(s) received by treatment group members so that they can be replicated by other employment and training programs.



A.7. Special Circumstances for Data Collection


None of the data collection efforts involve any special circumstances.


A.8 Preclearance Notices and Responses


In accordance with the Paperwork Reduction Act of 1995, the public was given 60 days to review the original information collection request; a notice was published in the Federal Register on January 23, 2012 (vol. 77, p. 3284) (see comments in table below).



Comment Submitter: Jeffery Oleander

Comment Summary: Respondent is interested in the information collection from the sponsors of H-1B visas and the comparative data of U.S. hires in the same sectors.

ETA’s Response:


The H-1B Technical Skills Training Grants [SGA/DFA PY-10-13] and the H-1B Jobs and Innovation Accelerator Challenge Grants (JIAC) [SGA/DFA PY-10-15] are authorized under Section 414(c) of the American Competitiveness and Workforce Investment Act of 1998 (ACWIA), as amended (29 USC 2916a). The purpose of these grants is to provide education, training, and job placement assistance in the occupations and industries for which employers are using H-1B visas to hire foreign workers. H-1B training grants are financed by a user fee paid by employers who bring foreign workers into the United States under a nonimmigrant visa program.


Within ETA, the Office of Foreign Labor Certification (OFLC) manages the DOL certification process for issuing all Permanent, H-1B, H-1C, H-2A, H-2B, and D-1 visas.  All OFLC Case Disclosure data can be accessed at: http://www.foreignlaborcert.doleta.gov/quarterlydata.cfm.  In addition, you may find employer-specific data at: http://www.flcdatacenter.com/


Comment Submitter: New York State Department of Labor, Division of Employment and Workforce Solutions

Comment Summary:

  1. The survey asks invasive questions.

  2. The total annual cost burden for this program has been underestimated.

  3. The proposed control group will compromise our ability to effectively run these programs by making us unable to recruit enough participants to spend the funding appropriately.

  4. USDOL made no mention of this type of evaluation/reporting format prior to our submittal of the application.

ETA’s Response:

  1. The Baseline Information Form (BIF) will collect background information on participants who have consented to participate in this evaluation. Information on date of birth, address, and telephone numbers is needed to identify and contact participants. The BIF also collects information on characteristics of participants, such as sex, race/ethnicity, marital status, education level, employment history, and work-related barriers—data used to ensure that random assignment was conducted correctly, to create subgroups for the analysis, and to enhance the impact estimates. This type of information is generally collected as part of enrollment in most programs and is therefore not considered sensitive.

  2. The estimated annual burden was reexamined in light of the comments received, and ETA determined that no changes were necessary. The commenter postulated certain summary costs without providing additional information that would allow the Agency to replicate the calculations. The Agency’s calculations appear in items 12 through 14.

  3. The funding will be used both to recruit additional enrollees for the program and to provide services to those who are randomly selected to participate in the program. These additional recruitment activities are an allowable cost under this grant. The evaluation team will provide support in handling both recruitment and the random assignment process, which is expected to significantly reduce the burden on the grantee.

  4. ETA included the evaluation requirement as part of the original Solicitation for Grant Applications (SGA) that was published in the Federal Register on May 2, 2011. Section VI B.4i, page 26 of the SGA states “By accepting grant funds, grantees agree to participate in an evaluation should they be selected to participate. Grantees must make records on participants, employers and funding available and to provide access to program operating personnel and to participants, as specified by the evaluator(s) under the direction of ETA, including after the period of operation.” Further, the One Stop Operating System (OSOS) that NYSDOL references is a data collection system. The new H-1B grant reporting system is a web-based upload system similar to the current WISARD (WIA Reporting System), with enhancements. This system will allow grantees to upload participant files in csv, txt, or dat format, which the system will then validate. The system is being built to reduce burden on grantees and reduce duplicative data entry.


As part of the development of the Solicitation for Grant Applications for both the H-1B Technical Skills Training and JIAC grants, various Federal agency partners were consulted, including U.S. Department of Commerce, Economic Development Administration (EDA) and Small Business Administration (SBA) staff. Both provided feedback on data collection instruments, types of data collection, and the availability of data, as well as on the overall program design for the Jobs and Innovation Accelerator Challenge project and its reporting.


A.9 Payments to Respondents

There are no payments to respondents other than the funds provided under the grant agreement. No payments will be made to respondents interviewed for the process study or for purposes of site selection. Payments for participants who respond to follow-up surveys will be addressed in a separate data collection request.


A.10. Confidentiality Assurances


Reports:

While this information collection makes no express assurance of confidentiality, ETA is responsible for protecting the privacy of the H-1B participant and performance data and will maintain the data in accordance with all applicable Federal laws, with particular emphasis on compliance with the provisions of the Privacy and Freedom of Information Acts. This data is covered by a System of Records Notice, DOL/ETA-15, published April 8, 2002 (67 FR 16898 et seq). The Department is working diligently to ensure the highest level of security whenever personally identifiable information is stored or transmitted. All contractors that have access to individually identifying information are required to provide assurances that they will respect and protect the privacy of the data.

As added protection, Social Security numbers will be converted into a unique participant ID for each case record. The H-1B system will also include a statement that informs the individual where the information he/she has provided is being stored, the name and location of the system, and that the information is protected in accordance with the Privacy Act. Any information that is shared or made public is aggregated by grantee and does not reveal personal information on specific individuals.


Data Collection for Program Evaluation:


Baseline Data Collection

At baseline, all information on individual participants will be entered into the PTS. All sensitive data will be encrypted and protected by Secure Socket Layer (SSL). Logging and output files will not contain sensitive data and will be limited to generic system and error data. Furthermore, data extracts for use by the evaluators will be provided as encrypted files available to team members on Relyon’s File Transfer Protocol over SSL (FTPS) site.

The system will segregate user data into sensitive data, user-identifiable data, and project-specific data. Sensitive data such as SSNs will be stored in a separate database table containing a system-generated ID number with the SSN stored in encrypted form. Sensitive data will be entered into the system but will at no point be displayed or downloadable by users of the system. User-identifiable data will be stored separately from project-specific data and will be available for updating only by designated administrators on the research team. Baseline data from the study will be available to the project team in specific extracts and reports.


The PTS will be accessible only to staff who are currently working on the project. Staff access to participant-level data will be restricted. To access the PTS, users will first log on to their workstations and then to the database using a separate log-in prompt. The database will be removed and securely archived at the end of the baseline data collection period.


Site Selection Telephone Interviews and Site Visits & Process Study Site Visit Data Collection

The administrators and staff interviewed by evaluators on-site will be assured that their responses will be combined with those from other sites for analysis, that individuals will not be identified in any reports, and that interview notes will not be shared with ETA. To preserve privacy, paper copies of interview notes will be secured in a locked file cabinet, and electronic notes will be stored on a secured Abt Associates or MEF Associates laptop.


A.11. Additional Justification for Sensitive Questions


Reports:

While sensitive questions will be asked of participants in the proposed data collection, the privacy of participants will be protected as discussed in Section A.10. In addition, security will be built into the data collection system by the MIS contractor. As mentioned in Section A.10, ETA has strict protocols for protecting all Personally Identifiable Information (PII) and will provide a data collection system that has the capacity to encrypt any PII submitted to ETA. Performance reporting will collect only SSNs that will be encrypted and sent directly to the Common Reporting Interchange System (CRIS). CRIS uses the state and Federal Employment Data Exchange System (FEDES) and the Wage Record Interchange System (WRIS) in generating reports. CRIS provides common performance measures for the grant programs, which do not have the ability to collect the common measure outcomes (i.e., Entered Employment Rate, Retention Rate, and Average Earnings) on their own.

Participant responses to these sensitive questions will allow ETA and the evaluation contractors to comprehensively evaluate the effectiveness of the H-1B program.



Baseline Data Collection:

The PTS will collect background information on participants who have consented to participate in this evaluation. Information on date of birth, address, and telephone numbers is needed to identify and contact participants. The PTS also collects information on characteristics of participants, such as sex, race/ethnicity, marital status, education level, employment history, and work-related barriers—data used to create subgroups for the analysis, and to enhance the impact estimates. This type of information is generally collected as part of enrollment in government-funded training programs including the H-1B training program and is therefore not considered sensitive.


The PTS will also collect Social Security Numbers (SSNs). SSNs are needed for two important purposes. First, SSNs will be used to avoid duplication; as a unique form of identification, checking for matching SSNs is the only completely dependable method for ensuring that participants are not counted twice, either at the same site or at different sites. The PTS is designed to securely check for duplicate SSNs within and across sites without ever displaying the SSN to the user (once the SSN is entered into the system, it remains hidden from PTS users).
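A minimal sketch of this kind of duplicate check, assuming a keyed one-way digest in place of whatever mechanism the actual PTS uses (the key and names below are hypothetical placeholders):

```python
import hashlib
import hmac

# Sketch only; not the actual PTS design. Each SSN is replaced by a keyed
# one-way digest so that duplicates can be detected without storing or
# displaying the raw SSN. The key below is a hypothetical placeholder.
SECRET_KEY = b"hypothetical-system-key"  # in practice, a securely managed key

def ssn_fingerprint(ssn: str) -> str:
    """Return a keyed digest of the SSN; the raw value is never retained."""
    digits = ssn.replace("-", "")
    return hmac.new(SECRET_KEY, digits.encode(), hashlib.sha256).hexdigest()

class EnrollmentRegistry:
    """Tracks fingerprints within and across sites to flag duplicates."""

    def __init__(self):
        self._seen = set()

    def register(self, ssn: str) -> bool:
        """Return True if this participant is new, False if already enrolled."""
        fp = ssn_fingerprint(ssn)
        if fp in self._seen:
            return False
        self._seen.add(fp)
        return True

registry = EnrollmentRegistry()
first = registry.register("123-45-6789")   # new participant
second = registry.register("123456789")    # same SSN, different formatting
```

Because only digests are compared, the same participant is flagged regardless of how the SSN was formatted at entry, and no user-visible record ever contains the raw number.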


Second, SSNs will be used so that the researchers can collect critical administrative data—Unemployment Insurance (UI) records—that will be used to measure the primary outcome of interest to the evaluation: the impact of the training programs on employment and earnings. The wage records collected by state UI agencies consist of quarterly earnings, by employer, for all UI-covered employees in the state. The only way to accurately access an individual’s UI data is through his or her SSN; other identifiers such as name and date of birth are not unique enough to ensure that the correct data will be obtained. An advantage of UI data is that they can be collected for a large sample over a long period of time. Thus, they allow the researchers to estimate precisely, for the whole sample and for subgroups, the net impacts on employment and earnings over a three-year period. They are also fairly uniform across states and over time, a characteristic that facilitates a straightforward approach to analysis and allows for cross-site comparisons. Compared to survey data, these data have the advantage that they are not subject to the potential biases that can occur because of recall error and survey non-response.
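For illustration, matching and aggregating quarterly UI wage records by SSN might look like the following sketch; the record layout is hypothetical, real state files differ, and the SSN would be handled in protected form.

```python
from collections import defaultdict

# Hypothetical record layout: (ssn, year, quarter, employer_id, earnings).
# Real state UI extracts differ in format and field names.
wage_records = [
    ("123456789", 2013, 1, "EMP-A", 4500.00),
    ("123456789", 2013, 1, "EMP-B", 1200.00),  # second job, same quarter
    ("123456789", 2013, 2, "EMP-A", 4700.00),
    ("987654321", 2013, 1, "EMP-C", 5100.00),
]

def quarterly_earnings(records):
    """Sum earnings across employers for each person-quarter."""
    totals = defaultdict(float)
    for ssn, year, quarter, _employer, amount in records:
        totals[(ssn, year, quarter)] += amount
    return dict(totals)

totals = quarterly_earnings(wage_records)
# A participant counts as employed in a quarter if any covered earnings appear.
employed_q1 = {ssn for (ssn, year, quarter) in totals if (year, quarter) == (2013, 1)}
```

Summing across employers within each person-quarter is what makes the quarterly earnings outcome well defined even when a participant holds multiple covered jobs.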


Site Selection Telephone Interviews and Site Visits & Process Study Site Visit Data Collection

No sensitive questions will be asked during site selection or the process study site visits.


A.12. Estimates of the Burden of Data Collection


The annual national burden for H-1B data collection has two components: (1) the respondent data collection burden; and (2) the data collection cost burden. This response provides a separate burden for each of the two components. Grantee burden estimates include four rounds of H-1B Technical Skills Training Grants, and two rounds of H-1B Jobs and Innovation Accelerator Challenge Grants.


  1. Respondent Data Collection Burden

H-1B respondent data collection burden considers the amount of participant and performance-related information collected and reported on the participant case record that would not have to be collected by the grantees as part of their customary and usual burden to run the program. Thus, the burden reflects the information collected solely to comply with Federal reporting and evaluation requirements.


The data collection burden will vary by participant, based on the range and intensity of services provided by the grantee and its partners. For example, data collection may involve acquiring information from the various partner agencies regarding employment training and placement, education assistance, and on-the-job training, in addition to the collection of personal and demographic information and some evaluation data elements collected by the grantees themselves.


To arrive at the average annual figure of 2.66 hours per participant record, ETA assessed the time for entries based on scenarios postulating a variety of services possible for a range of anticipated participants. This information, in turn, was based on similar programs of this sort, including other employment and training programs. This figure is split between the data entry staff person (2.33 hours) and the participant orally providing data (0.33 hours).


Finally, ETA program managers consulted with grantees that have collected this sort of information over the past several years to verify that 2.66 hours is the average amount of time spent per record. Applying this average (which includes data entry into a participant tracking system) to the estimated 45,300 participants per program year across approximately 151 grantees, 2.66 hours represents the best combined estimate of time devoted to data entry for each participant, given the range of entries anticipated for each participant, as described above.
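The arithmetic behind these figures can be checked directly (for illustration only):

```python
# Worked check of the burden arithmetic: 35 minutes of staff data entry per
# participant per quarter, four quarters per year, plus 0.33 hours in which
# the participant orally provides data.
MINUTES_PER_RESPONSE = 35
RESPONSES_PER_YEAR = 4
PARTICIPANTS = 45_300

staff_hours_per_participant = MINUTES_PER_RESPONSE * RESPONSES_PER_YEAR / 60  # about 2.33
total_hours_per_record = staff_hours_per_participant + 0.33                   # about 2.66
annual_staff_burden_hours = PARTICIPANTS * staff_hours_per_participant        # 105,700
```

These are the same values that appear in the participant data collection row of Table 1.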


Table 1 – Data Collection Burden Hours


Activity | Estimated Number of Respondents | Annual Frequency of Response | Average Time per Response (minutes) | Burden Hours per Response | Annual Burden Hours per Respondent | Total Annual Burden Hours

Participant data collection (including evaluation data) | 45,300 | 4 | 35 | 0.58 | 2.33 | 105,700

Quarterly Performance Report | 151 | 4 | 600 | 10 | 40 | 6,040

Quarterly Narrative Report | 151 | 4 | 600 | 10 | 40 | 6,040

Site Selection Telephone Interviews | 72 | 1 | 60 | 1 | 1 | 72

Site Selection Site Visits | 60 | 1 | 480 | 8 | 8 | 480

Process Study Site Visits | 120 | 1-2 | 52.5 | 0.88 | 1.31 | 145

Total | | | | | | 118,477



Data collection burden hours are based on consultation with similar programs that collect the same ETA-required outcomes, along with modified elements that reflect specific H-1B training program outcomes.


Quarterly Performance and Narrative Progress Report Burden

The H-1B quarterly performance report burden assumes that all grantees will use ETA-provided MIS to generate quarterly performance reports. The MIS will be designed to apply edit checks to participant files and to generate facsimiles of the aggregate information on enrollee characteristics, services provided, placements, and outcomes in quarterly report format. The burden includes reviewing and correcting errors identified by the MIS in the participant-level data and generating, reviewing, and approving the aggregate quarterly reports.


The H-1B quarterly narrative progress report burden involves providing a detailed account of all activities undertaken during the quarter, including in-depth information on accomplishments, promising approaches, progress toward performance outcomes, and upcoming grant activities. ETA assumes each grantee will spend approximately ten hours per quarter preparing this report.


Program Evaluation: Site Selection and Process Study Burden

Site selection will begin with telephone interviews with 24 grantees that may potentially be included in the evaluation. For each grantee, the evaluation team will interview three persons (administrators and staff combined). The team will then conduct site visits to 12 of these grantees to further inform the site selection process, interviewing five persons per grantee (administrators and staff combined). The estimated response time is 60 minutes per telephone interview and eight hours per respondent for site visits. The total estimated response burden for site selection is 552 hours.


For the process study site visits, the evaluation team will interview eight administrators and staff members in each of the six sites included in the evaluation. Four of the impact sites will receive two rounds of process study site visits; the two remaining impact sites will receive one round of site visits and one round of telephone interviews. In each site, the evaluation team will also interview three program partners and two employers. For the special topics process study, the team will interview approximately eight respondents in each of the two selected sites. The estimated response time for individual interviews is 45-60 minutes, depending on the respondent. The total estimated response burden for the process study data collection is 145 hours.
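The 145-hour process study total can be cross-checked by summing the interview counts and durations described above. The sketch below is illustrative only; the counts mirror the process-study rows of Table 2.

```python
# Minimal arithmetic sketch (illustrative only): summing the process-study
# interview burden. Counts, rounds, and interview lengths mirror Table 2.
interviews = [
    # (respondents, rounds, minutes per interview)
    (32, 2, 60),  # administrators & staff, two-round impact sites
    (12, 2, 45),  # program partners, two-round impact sites
    (8,  2, 45),  # employers, two-round impact sites
    (16, 1, 45),  # administrators & staff, one-round impact sites
    (6,  1, 45),  # program partners, one-round impact sites
    (4,  1, 45),  # employers, one-round impact sites
    (16, 1, 45),  # special topics respondents
    (16, 1, 45),  # administrators & staff, telephone round
    (6,  1, 45),  # program partners, telephone round
    (4,  1, 45),  # employers, telephone round
]
total_hours = sum(n * rounds * minutes / 60 for n, rounds, minutes in interviews)
print(total_hours)  # 145.0
```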


The hour burden estimate for the collection of information covered by the revisions to this request is presented in Table 2 below.


Table 2. Burden Estimates for Site Selection and Process Study

Respondents | Number of Respondents | Frequency of Collection | Average Time per Response | Burden (Hours)

Site Selection Telephone Interviews
Administrators & staff | 72 (3 respondents per site, 24 sites) | Once | 60 minutes | 72

Site Selection Site Visits
Administrators & staff | 60 (5 respondents per site, 12 sites) | Once | 8 hours | 480

Total for Site Selection | 132 | -- | -- | 552

Process Study Site Visits: Impact Sites with Two Rounds
Administrators & staff | 32 (8 per site, 4 sites) | Twice | 60 minutes | 64
Program partners | 12 (3 per site, 4 sites) | Twice | 45 minutes | 18
Employers | 8 (2 per site, 4 sites) | Twice | 45 minutes | 12

Process Study Site Visits: Impact Sites with One Round
Administrators & staff | 16 (8 per site, 2 sites) | Once | 45 minutes | 12
Program partners | 6 (3 per site, 2 sites) | Once | 45 minutes | 4.5
Employers | 4 (2 per site, 2 sites) | Once | 45 minutes | 3

Process Study Site Visits: Special Topics Sites
Interview respondents at special topics sites | 16 (8 per site, 2 sites) | Once | 45 minutes | 12

Process Study Telephone Calls
Administrators & staff | 16 (8 per site, 2 sites) | Once | 45 minutes | 12
Program partners | 6 (3 per site, 2 sites) | Once | 45 minutes | 4.5
Employers | 4 (2 per site, 2 sites) | Once | 45 minutes | 3

Total for Process Study | 120 | -- | -- | 145

Total for Site Selection and Process Study Data Collection | 252 | -- | -- | 697




(2) Data Collection Cost Burden


Table 3 – Data Collection Cost Burden


Activity | Number of Respondents | Annual Burden Hours | Average Hourly Wage Rate* | Annual Cost Burden
Participant data collection (including evaluation data) | 45,300 | 105,700 | $14.40 | $1,522,080.00
Quarterly Performance Report | 151 | 6,040 | $14.40 | $86,976.00
Quarterly Narrative Report | 151 | 6,040 | $14.40 | $86,976.00
Total | -- | 117,780 | -- | $1,696,032.00

*Hourly rates used to calculate hourly costs depend upon the type of organization administering the program. The current minimum wage was used for participants’ opportunity costs.


Hourly rates used to calculate cost depend upon the type of organization administering the program. For private non-profit grantees, the hourly rate is the average hourly earnings in the Bureau of Labor Statistics' Social & Human Assistance industry category (2009-2011 National Compensation Survey, Bureau of Labor Statistics).


The Federal minimum wage of $7.25 has been used as an approximation of the value of participant time.
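Each cost figure in Table 3 is simply the activity's annual burden hours multiplied by the $14.40 average hourly wage. The sketch below is illustrative only and recomputes those rows.

```python
# Minimal arithmetic sketch (illustrative only): Table 3 cost figures are
# annual burden hours multiplied by the $14.40 average hourly wage.
WAGE = 14.40
hours = {
    "Participant data collection":  105_700,
    "Quarterly Performance Report": 6_040,
    "Quarterly Narrative Report":   6_040,
}
costs = {name: round(h * WAGE, 2) for name, h in hours.items()}
print(costs["Quarterly Performance Report"])  # 86976.0
print(round(sum(costs.values()), 2))          # 1696032.0
```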




Table 4. Total Annualized Cost Estimates for Baseline Data Collection and Process Study Visits


Data Collection Activity | Total Burden Hours | Average Hourly Wage | Total Annualized Cost
Site Selection | 552 | $19.74 | $10,896
Process Study Site Visits | 145 | $19.74 | $2,862
Total | 697 | -- | $13,758

The average hourly wage in Table 4 for the staff interviews and training is $19.74, based on the BLS average hourly earnings of production and nonsupervisory employees on private, service-providing, nonfarm payrolls (January 2013 National Industry-Specific Occupational Employment and Wage Estimates, U.S. Department of Labor, Bureau of Labor Statistics, available on the department's website).1
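The Table 4 figures follow from multiplying each activity's burden hours by the $19.74 wage and rounding to the nearest dollar; the sketch below is illustrative only.

```python
# Minimal arithmetic sketch (illustrative only): Table 4 annualized costs
# are total burden hours times the $19.74 average hourly wage, rounded to
# the nearest dollar.
WAGE = 19.74
site_selection = round(552 * WAGE)  # 10896
process_study = round(145 * WAGE)   # 2862
print(site_selection + process_study)  # 13758
```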




A.13. Estimated Cost to Respondents


(1) Start-Up/Capital Costs: There are no start-up costs, as ETA will provide grantees with a free, Web-based, data collection and reporting system that grantees will use to collect and maintain participant data, apply edit checks to the data, and generate quarterly performance reports.


(2) Annual Costs: There are no annual costs, as ETA will be responsible for the annual maintenance costs for the free, Web-based data collection and reporting system.


A.14. Estimates of Annualized Costs to Federal Government


ETA collects and maintains all quarterly reports through its Office of Information Security and Technology's on-line Enterprise Business Support System (EBSS). Since the electronic mechanisms for collecting and storing grantee performance data on a quarterly basis are already in place to support other ETA programs, the annualized cost to the Federal government (maintaining the quarterly and annual reports and records through EBSS, matching SIR data with state UI wage records and other Federal employment databases, generating quarterly performance reports, and any program close-out activities) for H-1B grant reports is minimal. The estimated cost of modifying and updating an existing ETA reporting system and adapting it to H-1B grant specifics is $200,000; spread over four years, the costs of maintaining the system and developing training and technical assistance guides are estimated at $50,000 annually.


Continuing operations costs to the Federal government include the staff and contractor costs required to maintain and manage data validation and review quarterly reports, as outlined in Table 5 – Cost Estimate of Data Validation and Report Review to Federal Government. The annual cost of contractor support to provide continuing technical support to grantees will total approximately $339,334 for PY 2013. Costs for ETA staff to manage data validation and review reports will be $50,587 for continuing operations throughout PY 2013.



Table 5 - Cost Estimate of Data Validation and Report Review to Federal Government


Continuing Operations – Cost Estimate of Data Validation and Report Review to Federal Government

Contractor Support | $339,334
ETA Staff Total | $50,587
  1 GS-12 (1/2 time) | $37,436
  1 GS-14 (1/8 time) | $13,151
Total Cost | $389,921


Note: Staff costs are based on Salary Table 2011-DCB (Step 1, incorporating the 1.5% general schedule increase and a locality payment of 24.22% for the locality pay area of Washington-Baltimore-Northern Virginia, DC-MD-VA-WV-PA), Department of Labor grade ranges as of January 2011.


The annual cost to the federal government of the entire H-1B Technical Skills Training Impact Evaluation is shown in Table 6. To estimate the average cost per year, the total five-year project budget ($6,975,587) is divided evenly across the five years. It is important to note that these figures are total costs for the entire evaluation, not just the data collection activities discussed in this submission.

Table 6. Annual Costs for H-1B TST Impact Evaluation

Year | Dates | Cost
1 | 2013-2014 | $1,395,117.40
2 | 2014-2015 | $1,395,117.40
3 | 2015-2016 | $1,395,117.40
4 | 2016-2017 | $1,395,117.40
5 | 2017-2018 | $1,395,117.40
Total | -- | $6,975,587



A.15. Changes in Burden


The changes in burden are the result of the revisions to the request for approval to include site selection and process study data collection activities involved in the H-1B TST evaluation.


A.16. Tabulation of Publication Plans and Time Schedules for the Project


Grantees will submit narrative progress and MIS performance reports on a quarterly basis to ETA within 45 days of the end of each quarter. Quarterly report data will be analyzed by ETA staff and used to evaluate performance outcomes and program effectiveness.


Each quarter, ETA issues the Quarterly Workforce System Results. Data contained in the H-1B system may be included in these reports. The data will also be used to prepare GPRA reports, management and budget reports, and other ad hoc reports, as needed.


Program Evaluation

Site selection will begin after receipt of OMB approval of a non-substantive change to 1205-0507, and baseline data collection (included in a previous data collection request) is anticipated to begin in early 2014. ETA will also seek OMB approval, with a new public comment period, for the random assignment procedures and participant survey design for the evaluation at that time. Two major project reports will be prepared: (1) an interim report, which will draw on 18-month follow-up data to present the key short-run findings of the impact analysis; and (2) a final report, which will use 18- and 30-month follow-up data to present findings on long-run program impacts. At the conclusion of the study, the project will also create a public use data file stripped of personally identifiable information. Table 7 gives the timeline for the deliverables.


Table 7. Study Timeline


Time | Activity
July 2013-August 2013 | Site selection phone calls to identified sites
September 2013-October 2013 | Site selection phone calls to identified priority sites; recommendations for site visits
November 2013-February 2014 | Site visits and selection of random assignment and initial process study sites
March 2014-May 2014 | Development of random assignment procedures and protocols; site training on random assignment
May 2014 | Baseline data collection begins in all sites
May 2014-October 2015 | Process study site visits
April 2015 | Baseline data collection ends
November 2015 | 18-month follow-up survey begins
October 2016 | 18-month follow-up survey ends
November 2016 | 30-month follow-up surveys begin
April 2017 | Interim report summary published
October 2017 | 30-month follow-up survey ends
May 2018 | Final report published
June 2018 | Public use data files available



A.17. Approval Not to Display OMB Expiration Date


The expiration date for the OMB approval will be displayed.


A.18. Exceptions to OMB Form 83-I


No exceptions are requested in the “Certification of Paperwork Reduction Act Submissions.”




1 U.S. Department of Labor, Bureau of Labor Statistics, Table B-8a. Average hourly and weekly earnings of production and nonsupervisory employees on private nonfarm payrolls by industry sector, seasonally adjusted (accessed from the following website as of April 2013: http://www.bls.gov/web/empsit/ceseeb8a.htm)

