SUPPORTING STATEMENT

H-1B Technical Skills Training (TST) and Jobs and Innovation Accelerator Challenge (JIAC) Grants

OMB Control No. 1205-0507

May 2016


A. JUSTIFICATION

The Department of Labor's Employment and Training Administration (ETA) requests an extension with revisions to OMB Control No. 1205-0507, which expires May 31, 2016, to allow ETA to support the 1) reporting, 2) recordkeeping, and 3) program evaluation requirements for the following H-1B funded grant programs currently reporting under the OMB-approved reporting package: the H-1B Technical Skills Training (TST) grant program; the Jobs and Innovation Accelerator Challenge (JIAC) grant program; and the Ready To Work (RTW) grant program (previously referred to as TST round 4).


The proposed changes to the collection consist of:

1) Formatting revisions to code values identified in the “Handbook to Performance Reporting” to support accurate reporting (e.g., instead of “select 1 for yes and 2 for no,” some fields were changed to accept a date entered as “YYYYMMDD”);

2) Adding technical assistance guidance to the Performance Reporting Handbook; these additions respond to grantee technical assistance needs and support more accurate reporting;

3) General formatting of the “Handbook to Performance Reporting” documents, which improves readability;

4) Removing duplicative data elements in the “Handbook to Performance Reporting” and on the ETA-9166 QPR Form; and

5) Revisions to an evaluation consent form and the addition of a baseline information form to support data collection.
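To illustrate the reformatted date fields, the sketch below validates a “YYYYMMDD” value the way a grantee's intake tooling might. The function name and the blank-means-not-applicable convention are illustrative assumptions, not part of the reporting specification.

```python
from datetime import datetime

def validate_yyyymmdd(value: str) -> bool:
    """Return True if the field holds a valid date in YYYYMMDD format
    (e.g., "20160527"). Treating a blank field as not applicable is an
    assumed convention for illustration only."""
    if value == "":
        return True
    try:
        datetime.strptime(value, "%Y%m%d")
        return True
    except ValueError:
        return False

print(validate_yyyymmdd("20160527"))    # True: well-formed date
print(validate_yyyymmdd("2016-05-27"))  # False: wrong format
print(validate_yyyymmdd("20161340"))    # False: no 13th month
```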


These changes do not increase the burden on grantees.


The H-1B Performance Reporting Handbooks provide detailed guidelines on the reporting structure, which features standardized data collection on program participants and quarterly narrative, performance, and Management Information System (MIS) report formats. The final versions of the handbooks are provided as Attachment A: H-1B TST JIAC Performance Reporting Handbook Final and Attachment B: H-1B RTW Performance Reporting Handbook Final. The edited versions of the performance handbooks reflecting any changes are provided as Attachment A1: H-1B TST JIAC Performance Reporting Handbook Edited Version and Attachment B1: H-1B RTW Performance Reporting Handbook Edited Version.


All data collection and reporting will be done by grantee organizations (e.g., state or local governments, not-for-profits, or faith-based and community organizations) or their sub-grantees. Grantees report through an ETA-provided, Web-based management system called the HUB Reporting System. The HUB system already exists and is currently in use by grantees.


The H-1B Quarterly Narrative Reports (QNRs) provide a detailed account of program activities, accomplishments, and progress toward performance outcomes during the quarter. There are no changes to these documents, which are provided as Attachment C: H-1B TST QNR Template, Attachment D: H-1B JIAC Integrated Work Plan QNR Template, and Attachment E: H-1B RTW QNR Template.


The H-1B Quarterly Performance Report (QPR) Forms include aggregate information on participants’ demographic characteristics, types of services received, placements, outcomes, and follow-up status.


Specifically, these reports reflect data on individuals who receive employment and training services, assessment services, skill upgrading, placement into unsubsidized employment, and other services essential to helping workers gain the skills and competencies needed to obtain or upgrade employment in high-growth industries and/or occupations for which employers are currently using H-1B visas to hire foreign workers. There are no changes to the TST QPR Form, provided as Attach F_H-1B TST JIAC QPR Form. There are minor changes to the RTW QPR Form ETA-9166. These documents are provided as Attach G_H-1B RTW QPR Form_Final and Attach G1_H-1B RTW QPR Form_Edited Version.


The information collection for program evaluation includes setting up a Participant Tracking System (PTS) through the MIS with baseline information similar to the quarterly reports but at the individual participant level. The baseline data covered by this clearance will enable the evaluation to describe the characteristics of study participants at the time they are randomly assigned to a treatment or control group, ensure that random assignment was conducted properly, create subgroups for the analysis, provide contact information to locate individuals for follow-up surveys, and improve the precision of the impact estimates. Such data will be collected on the basis that the evaluation will consist of an experimental design employing random assignment of participants into treatment and control groups. A web-based PTS will execute the random assignment procedures and compile baseline data on study sample members. This PTS will assure that participant data will be in a consistent format across sites.


A rigorous program evaluation also requires clear and specific documentation of the services provided to treatment group members in each of the grantee sites and the services available to control group members. This qualitative information will enable the evaluation to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication. The process study site visits will include semi-structured interviews and focus group discussions with various program stakeholders.

The accuracy, reliability, and comparability of program reports submitted by grantees using Federal funds are fundamental elements of good public administration and are necessary to maintain and demonstrate system integrity. The use of a standard set of data elements, definitions, and specifications at all levels of the workforce system helps improve the quality of performance information that is received by ETA. These programs will also be the focus of a random assignment evaluation, making access to high-quality performance data, qualitative data on program sites, and baseline data on program participants all the more important.



1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


H-1B TST, JIAC, and RTW grants are authorized under section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998 (ACWIA), as amended (29 U.S.C. § 3224a), which identifies performance accountability requirements for these grants. More specifically, 29 U.S.C. § 3224a(7) states, “The Secretary of Labor shall require grantees to report on the employment outcomes obtained by workers receiving training under this subsection using indicators of performance that are consistent with other indicators used for employment and training programs administered by the Secretary, such as entry into employment, retention in employment, and increases in earnings. The Secretary of Labor may also require grantees to participate in evaluations of projects carried out under this section.”


To support these legislative requirements, in applying for the H-1B TST, JIAC, and RTW grant programs, grantees agreed to submit participant-level data quarterly for individuals who receive services through these programs. The reports include aggregate data on demographic characteristics, types of services received, placements, outcomes, and follow-up status. Specifically, grantees summarize data on employment and training services, placement services, and other services essential to successful unsubsidized employment through H-1B programs.


ETA uses data collected to report to Congress and the public on the progress and success of these grants; hold grantees accountable for the Federal funds they receive; support oversight, management, and technical assistance efforts; and support information collection for an ongoing program evaluation where data is necessary to support random assignment for a rigorous impact evaluation to understand the effectiveness of the program strategies. The information collection for program evaluation includes setting up a Participant Tracking System (PTS) through the MIS to collect baseline information similar to the quarterly reports but at the individual participant level. The reporting and record keeping system, along with the PTS and site visit information incorporate each strategy component necessary for program evaluation.


Reports:

Three outcome measures will be used to measure success in the H-1B grants: entered employment rate, employment retention rate (this includes incumbent workers who retain their positions or advance into new positions and get wage gains after the program), and the average six-month post-program earnings. All of these conform to the common performance measures implemented across DOL job training programs as of July 1, 2005. By standardizing the reporting and performance requirements of different programs, the common measures give ETA the ability to compare across programs the core goals of the workforce system – how many people entered jobs; how many stay employed; and how many successfully completed an educational or vocational training program. In addition to the three outcome measures, grantees will report on a number of other common performance measures that serve as predictors of success. These include placement into unsubsidized jobs, attainment of degrees or certificates, placement into post-secondary education or vocational training, such as on-the-job training (OJT), classroom occupational training, contextualized learning, distance learning, customized training, incumbent worker training, and placement into high-growth industries and occupations.
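For illustration, the entered employment rate described above can be computed from participant records roughly as sketched below. The record layout and field names are hypothetical, and the actual common-measure specifications contain additional detail (exit cohorts, wage-record quarters, exclusions) that this sketch omits.

```python
# Hypothetical participant records; field names are illustrative assumptions.
participants = [
    {"exited": True, "employed_at_entry": False, "employed_q1_after_exit": True},
    {"exited": True, "employed_at_entry": False, "employed_q1_after_exit": False},
    {"exited": True, "employed_at_entry": True,  "employed_q1_after_exit": True},
]

def entered_employment_rate(records):
    """Of exiters who were not employed at program entry, the share
    employed in the first quarter after exit (common-measure style)."""
    base = [r for r in records if r["exited"] and not r["employed_at_entry"]]
    if not base:
        return 0.0
    return sum(r["employed_q1_after_exit"] for r in base) / len(base)

print(entered_employment_rate(participants))  # 0.5 for the sample records above
```

The third record is excluded from the denominator because that participant was an incumbent worker already employed at entry; such workers are tracked under the retention measure instead.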


In applying for the H-1B grants, grantees and their sub-grantees agree to submit participant-level data and aggregate reports on participant characteristics, services provided, placements, outcomes, and follow-up status. Grantees will collect and report quarterly H-1B performance data using the HUB Reporting System. Grantees will also collect participant Social Security Numbers, which will allow ETA to match wage records for grantees and lessen the burden on grantees to track post-program outcomes for the common performance measures.
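A minimal sketch of how SSN-based wage-record matching might work is shown below. The record layouts, SSN placeholders, and quarter labels are illustrative assumptions; actual Unemployment Insurance wage-record matching follows each state's data-sharing procedures.

```python
# Hypothetical UI quarterly wage records; values are illustrative placeholders.
wage_records = [
    {"ssn": "123456789", "quarter": "2016Q2", "earnings": 8500},
    {"ssn": "123456789", "quarter": "2016Q3", "earnings": 9100},
    {"ssn": "987654321", "quarter": "2016Q2", "earnings": 7200},
]

def post_exit_earnings(ssn, quarters):
    """Sum a participant's UI-reported earnings over the given
    post-exit quarters, matching on SSN."""
    return sum(w["earnings"] for w in wage_records
               if w["ssn"] == ssn and w["quarter"] in quarters)

print(post_exit_earnings("123456789", ("2016Q2", "2016Q3")))  # 17600
```

Because ETA performs this match centrally, grantees do not need to survey exiters themselves to report the employment and earnings measures.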

The progress reports have two components: a performance report and narrative report. Both reports are submitted on a quarterly basis by all H-1B grantees. The quarterly performance reports include aggregate data on demographic characteristics, types of services received, outcomes, and retention and follow-up status. The quarterly narrative reports provide a detailed account of program activities, accomplishments, and progress toward performance outcomes during the quarter. The HUB Reporting System collects participant files from grantees, validates the files, and generates a quarterly performance report.




Data Collection for Program Evaluation:

During the intake period, all persons who apply for the program and are determined to be eligible will be told about the study (including random assignment) and asked to confirm, by signing the Evaluation Consent Form, that they have been informed about and understand the study. The final version of this form is provided as Attachment H: H-1B Evaluation Consent Form Final, and the edited version showing tracked changes is provided as Attach H1_Evaluation Consent Form_Edited Version.

Everyone who consents to participate will be asked to complete the Baseline Information Form (BIF), provided as Attach M: Baseline Information Form. This new form identifies, for everyone who consents to participate, the evaluation data elements to be collected, which were originally submitted in the H-1B Data Elements for Performance and Evaluation document. No changes were made to the H-1B data elements, provided here as Attach L_H-1B Data Elements for Performance and Evaluation. Program staff will then enter the person’s data into the web-based Participant Tracking System (PTS). The PTS will randomly assign each participant to either the treatment or the control group. Staff will then notify the individual of his or her assignment. People who do not consent to participate in the study will not be randomized or served with grant funding, but they may obtain training and employment help from other sources on their own.

The evaluation data collection is limited to the baseline information that must be collected at the outset of the study for people who are randomly assigned as well as the qualitative data collected during the site visits for the process study. The discussion below addresses those elements in order; first, the baseline data and second, the process study site visits.


  • Identifying Information. This includes complete name, address, telephone number, email, birth date, gender, and social security number (SSN)—enough information to ensure that each individual is randomly assigned only once. Identifiers are also necessary for tracking and locating sample members for follow-up surveys. In addition, this information ensures that the data can be accurately matched with administrative records on sample members.

  • Demographic and Socioeconomic Characteristics and Employment History. Baseline data in these areas are required to ensure that the random assignment process was conducted properly (by confirming that the research groups have similar characteristics at baseline) and to monitor random assignment. This information will be used to describe the study sample and to document differences in the populations served across the study sites. Moreover, this information will allow subsequent analyses of subgroups to be conducted.

  • Barriers to and Attitudes Toward Work. The baseline data collection effort will also ask questions about specific barriers that study participants may face (e.g., health, child care, and transportation) and attitudes toward work. This information will be used (1) to describe the sample and understand issues that could affect participants’ ability to engage in training and work, and (2) to conduct subsequent subgroup analyses (for example, to determine whether the program was effective for those with specific barriers).


  • Locating Information. Accurate locating information is crucial to achieving high survey response rates. As mentioned above, the Baseline Information Form (BIF) will capture each applicant’s landline and cellular telephone numbers and email address. The form will also capture the applicant’s Facebook name and alternative contact information for up to three relatives or friends who might know how to contact the sample member.

A web-based PTS will execute the random assignment procedures and compile baseline data on study sample members. This PTS will ensure that participant data are in a consistent format across sites.


A separate package, to be submitted at a later date, will request clearance for an 18-month survey to collect follow-up data from Ready to Work study participants.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


Quarterly Reports:

Grantees will be expected to implement new recordkeeping and reporting requirements with grant funds. Because ETA provides the web-based HUB Reporting System to grantees, their implementation costs will be minimized. Grant funds may also be used to upgrade computer hardware and Internet access to enable projects to use the HUB Reporting System and grantees to accurately track participants.

Grantees will track data on individuals who receive services through H-1B programs and their partnerships with Federal and other partner agencies, and will upload participant files to the ETA HUB Reporting System. The H-1B Performance and Evaluation Data Collection Elements, provided as Attach L_H-1B Data Elements for Performance and Evaluation, will be used by the Department and ETA to evaluate performance and delivery of H-1B program services. In addition to the required data elements, the MIS will allow grantees to collect additional participant data beyond those elements required by H-1B. This document contains a list of the required data elements to be collected in the MIS and the purpose for collecting each item.


ETA will use the data to track total number of participants, participant characteristics, services, training strategies and outcomes for employed, unemployed, long-term unemployed, and incumbent worker participants. Common measures will enhance ETA’s ability to assess the effectiveness of the H-1B program within the broader workforce investment system.


Within ETA, the data are used by the Offices of Workforce Investment, Policy Development and Research, Financial and Administrative Management, Information Systems and Technology, and Regional Management (including the regional offices). Other DOL users include the Offices of the Assistant Secretary for ETA and Assistant Secretary for Policy.


The reports and other analyses of the data will be made available to the public through publication and other appropriate methods and to the appropriate congressional committees through copies of such reports, including the ETA’s Quarterly Workforce System Results Report. In addition, information obtained through the HUB Reporting System will be used at the national level during budget and allocation hearings for DOL compliance with the Government Performance and Results Act (GPRA) and other legislative requirements, and during legislative authorization proceedings.


Data Collection for Program Evaluation:

ETA requests clearance to collect baseline data and conduct site visits. The baseline data will be used to perform and monitor random assignment, locate participants in the future, and inform the later analysis. The data from the site visits will enable the evaluation to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication.

The BIF, consent form, and the site visits are described in detail below; along with how, by whom, and for what purposes the information collection will be used.


Baseline Consent to Participate in the Study:

The informed consent form will be administered to all eligible individuals at the selected sites. The grantee staff will ask the person to read the form, or will read the form to the person, and will then answer any questions. The consent form ensures that the potential study participant has been fully informed about the study, including random assignment, data collection, and privacy of the data. It ensures that people are aware of the participation requirements of the study and know that they can decline to participate or drop out at any time.

Baseline Information Form (BIF):

The BIF will collect basic demographic, socioeconomic, and barrier-related characteristics on all consenting persons prior to random assignment. It will also collect the name, address, phone number, and email address of up to three of the participant’s close friends or relatives who would likely have knowledge of his or her whereabouts at the time of follow-up data collection.

Baseline and contact data are needed for the following purposes:

  • To Conduct Random Assignment. Some basic identifying information (e.g., name, address, date of birth) is needed to conduct random assignment.

  • To Monitor Random Assignment. Baseline information will be used to ensure that no one goes through the random assignment process more than once. Grantee staff at each selected site will enter identifying information about individuals into the PTS, and the system will alert the staff if the person has already been randomly assigned. Checking for previous randomization ensures that people always remain in the same research group and that the research sample does not include duplicate cases. We will also use baseline data to detect differences in the characteristics of members of the treatment and control groups, since such differences would suggest a problem with the assignment process.

  • To Locate Participants for Surveys and Collect Administrative Data. The BIF will collect detailed identifying information for each participant, including people who may know the whereabouts of the participant. This information is crucial for locating study participants for the follow-up surveys and thereby increasing the response rates. The participant’s SSN is essential for obtaining administrative data, such as Unemployment Insurance quarterly wage records that will be used to measure employment outcomes, since agencies typically provide data by matching on SSN. The SSN is also helpful in locating participants for follow-up surveys, as many of the locating services search for current address and telephone number using the SSN.

  • To Describe the Sample at Random Assignment. Baseline data will allow researchers to describe in more detail the population being served by each program.


  • To Define Subgroups for the Impact Analyses. Baseline data on the characteristics of sample members are essential to define subgroups for which impacts can be estimated. These include characteristics such as sex, race/ethnicity, health status, employment history, barriers to employment, and attitudes toward work.

  • To Increase the Precision of Impact Estimates. By including information on the baseline characteristics of study participants in regression models, the precision of impact estimates can be improved.

  • To Adjust for Nonresponse Bias. With random assignment, simple differences in the mean outcomes between the treatment and control groups provide unbiased estimates of the impacts. However, systematic differences between the characteristics of members of the two groups might occur because of differential rates of survey nonresponse across the groups. To the extent that these characteristics are correlated with the outcome variables, this may lead to biased impact estimates. Baseline characteristics can be used to adjust for potential bias that may arise from survey nonresponse.
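As one hedged illustration of the monitoring purpose above, a standardized mean difference can be computed for each baseline characteristic to check that the treatment and control groups look similar. The values below are hypothetical, and a real analysis would typically rely on formal statistical tests or regression rather than this simplified statistic.

```python
import statistics

def standardized_difference(treatment, control):
    """Standardized mean difference for one baseline characteristic;
    values near zero are consistent with successful random assignment.
    Uses the population SD of the pooled sample for simplicity."""
    pooled_sd = statistics.pstdev(treatment + control)
    if pooled_sd == 0:
        return 0.0
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical baseline ages for the two research groups.
print(standardized_difference([34, 41, 29, 38], [33, 40, 31, 36]))
```

A large absolute value on any characteristic would flag a possible problem with the random assignment process and trigger further review.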

The BIF will be completed by everyone who has been found eligible and has given signed consent to participate in the study. As with the consent forms, grantee staff administering the BIF will be available to answer questions. For people with low literacy or other reading barriers, the grantee’s staff can provide assistance. The form will be translated into Spanish and other languages as necessary.


Participant Tracking System (PTS):

The grantee staff in the evaluation will use the PTS within the existing MIS to enter participant data at the time of intake to randomly assign participants into treatment and control groups; all baseline items for the study will be entered. The system will ensure that each site collects valid and comparable data required for random assignment, and once data are entered into the PTS, the system has an algorithm that will automatically randomly assign each entered individual into the control or treatment group. Hence, the PTS will ensure that rigorous and unbiased procedures are used to assign people according to the experimental design being used to evaluate net impacts.
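A minimal sketch of the duplicate check and assignment step is shown below, assuming a simple 50/50 assignment ratio and an in-memory store; the actual PTS is a web-based system whose algorithm and identifiers may differ.

```python
import random

class ParticipantTracker:
    """Minimal sketch of PTS-style random assignment with a duplicate
    check. The 50/50 ratio and identifier scheme are assumptions."""

    def __init__(self, seed=None):
        self._rng = random.Random(seed)
        self._assignments = {}  # keyed on a unique identifier such as SSN

    def assign(self, participant_id):
        # Duplicate check: a person already randomized keeps the same
        # group and is never assigned a second time.
        if participant_id in self._assignments:
            return self._assignments[participant_id]
        group = self._rng.choice(["treatment", "control"])
        self._assignments[participant_id] = group
        return group

pts = ParticipantTracker(seed=42)
first = pts.assign("participant-001")
print(first == pts.assign("participant-001"))  # True: never re-randomized
```

Centralizing the assignment in one system, as the PTS does, is what keeps the procedure uniform and tamper-resistant across sites.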

Site Visits:

The site visits will involve semi-structured interviews with administrators and staff, key program partners, and employers; informal group discussions with students in the treatment group; and observations of program activities. The site visits are needed for the following purposes:

  • To describe the program design and operations in each site. Because the program as it is described in the grant application or other materials may differ from the actual program being tested, researchers will collect data during the site visits that will enable them to describe the programs as implemented.

  • To examine the treatment-control differential to help interpret impact results. Impacts on employment patterns, job characteristics, earnings and other outcomes will presumably be driven by differences in the amount and/or types of training and other services received by members of the treatment and control groups. Because the control group can access training and other services outside the grant-funded program, during the site visits researchers will document other sources of similar training (including those within the same institution) and sources of funding for training (e.g., other programs).


  • To identify lessons learned for use in replication efforts. Data collected during the site visits—considered within the context of the impact results—will be the key source for formulating recommendations about how to replicate successful programs or improve upon their results. These data will also be used to identify lessons learned about the relative effectiveness of particular training strategies. While it may not be possible to completely disentangle which factors are driving differences in impacts across sites, to the extent possible, the researchers will identify factors that appear to be linked to success, as well as those that are not.

Description of Site Visits:

Two- or three-person site teams will conduct three rounds of site visits. The teams will schedule their first visit after clearance is received and approximately six to nine months after the start of random assignment. They will spend three days at each site. These visits will focus on documenting the initial implementation of the programs and will include semi-structured interviews with administrators, staff, and partners, and observations of grantees’ activities. The second and third round site visits will be scheduled approximately 18 and 30 months after the start of random assignment, respectively, when programs have reached maturity. These visits will focus on changes and developments in the provision of services over the course of the grant period as well as issues regarding the sustainability of the grant program. They will include interviews with respondents in the same roles as the first visit, with the addition of interviews with employers involved with grant-funded activities or that have hired program completers. Given that the site visit teams will already have a basic understanding of the program and its operation, these visits will be two days in length (vs. three days in the first round).

The site visit team will work closely with the sites to arrange the most convenient and expeditious time to visit their program. The evaluation team will also hold site visitor training for all staff involved in the visits. After each site visit, the data and information collected will be summarized and maintained in site-specific databases.


Site visit team members will conduct the semi-structured interviews using the Evaluation Process Study Interview Protocol Guides, prepared discussion guides built around a set of protocols that cover the types of information required to advance our understanding of the training programs. The information in these documents has not changed; they are provided here as Attach I_H-1B TST Evaluation Process Study Interview Protocol Guide and Attach J_H-1B RTW Evaluation Process Study Interview Protocol Guide.


The semi-structured nature of the interview guide is purposively designed to allow site visitors maximum flexibility in tailoring their discussions during specific interviews to the different perspectives of respondents and the unique circumstances that prevail at each site while still ensuring that all key topic areas of interest are addressed in each site visit. While the site visit team will try to capture as much detail as possible, the questions in the discussion guide will remain open-ended in style to allow for the greatest freedom for the respondent to answer in his or her own words.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


Reports:

To comply with the Government Paperwork Elimination Act, ETA is streamlining the collection of participant data and the preparation of quarterly reports to the extent feasible by providing the Web-based HUB Reporting System and by providing uniform data elements and data definitions to grantees across the program. All H-1B data and reports will be submitted to ETA via the HUB Reporting System, consistent with this objective. Grantees will collect, retain, and report all information electronically through this system and will receive comprehensive training on what reporting information to upload, and when and how to upload it.

To further reduce grantee burden, ETA will track Adult Common Measures on behalf of grantees, requesting only that grantees submit the following information on participants: 1) Social Security Number; 2) employment status at participation; 3) date of exit; and 4) reason for exit.


Data Collection for Program Evaluation:

The PTS is a web-based system within the existing MIS for gathering background information on participants as well as for executing random assignment for the evaluation. Its drop-down menus and response categories minimize the data entry burden on grantee staff, and the research team will train them all on how to use the system to randomly assign participants.

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


Reports:

The Department holds grantees accountable by requiring them to identify and work toward comprehensive performance standards and establishing quarterly reports for competitive projects. The data items identified in Attachment L_H-1B Data Elements for Performance and Evaluation will be used to generate QPRs.

ETA minimized the reporting burden by establishing the number of data elements required commensurate with the level of resources expended and services received. Data items collected by program reports and individual records are needed to: (1) account for the detailed services provided by multiple agencies to help participants prepare for job placement; (2) better identify overlapping and unproductive duplication of services; and (3) support the ongoing evaluation efforts in determining the effectiveness of the program model. Information provided through HUB Reporting System is not available through other data collection and report systems.


Data Collection for Program Evaluation:

BIF & PTS

The information being recorded in the PTS for the evaluation is not otherwise available in the format required for conducting accurate random assignment in a manner that: (1) is systematic across sites; (2) maintains the integrity of the process; and (3) ensures privacy of information. The research team reviewed the quarterly reports each grantee submits to ETA and found that they present financial information and minimal participant-level data. The reports do not provide key identifiers and contact information for future data collection, document the characteristics of the sample in terms of education, employment history, and work-related barriers, or supply key data needed to create meaningful subgroups.


Site Visits:

The data to be collected during the site visits are not available from any other source. No other data source provides detailed information on the program context, program services, the control group environment, implementation, and challenges and successes. The first round of site visits will focus on grant activities as implemented to that point, while the second and third rounds will document changes in activities over the course of the grant period and the sustainability of the program or services in the future.

5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


For reporting purposes, the involvement of small businesses or other small entities that are not grantees or sub-grantees is extremely limited. The only time contacting them may be required is during the provision of a service. Methods to minimize the burden on small entities that are grantees or sub-grantees are discussed in other sections of this supporting statement.


The data collection for program evaluation purposes does not involve small businesses or other small entities.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Reports:

29 CFR 95.51(b) (59 FR 38271, July 27, 1994), which governs monitoring and reporting program performance under grants and agreements with non-profit organizations, states that DOL shall prescribe the frequency with which performance reports are submitted, and that performance reports shall not be required more frequently than quarterly or less frequently than annually. If ETA did not comply with these requirements, funding for demonstration programs would be compromised. In applying for H-1B grants, grantees agree to meet ETA’s reporting requirements as indicated in the Solicitation for Grant Applications, which requires the submission of quarterly reports within 45 days after the end of the quarter.

Data Collection for Program Evaluation:

Without collecting baseline information on study participants, the study could not implement random assignment correctly or ensure that it had been conducted appropriately. The lack of baseline information would limit the ability to describe the population of training program participants and would limit the analysis of impacts of the program on subgroups, hence limiting the ability to determine the groups for which the program is most effective. Without baseline data, impact estimates would be less precise (so that small impacts would be less likely to be detected), and adjustments for nonresponse to the follow-up surveys would have to be based on less-detailed administrative data.

If the study did not collect detailed contact information for study participants, it would not be able to track them and administer the follow-up surveys. This would likely lead to a higher nonresponse rate and thus pose a greater risk to the quality of survey data and, in turn, the impact estimates.


The information collected through the process study site visits will enable the evaluation to describe the program design and operations in each site, interpret the impact analysis results, and identify lessons learned for purposes of program replication. Without data from the field-based implementation and process analysis, there would be a lack of in-depth information about the nature of the strategies developed and employed at grantee sites to improve the educational, employment, and training outcomes of the students they serve. Site visits will provide an opportunity to fully document the services being delivered to treatment group members and the potential services available to control group members. This is an essential part of an experimental design for understanding, for example, whether employment outcomes at various points after random assignment are associated with the varying services received by treatment and control group members. If there are positive net impacts for the treatment group, it will be vital to understand the specific intervention(s) received by treatment group members so that they could potentially be replicated by other employment and training programs.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:


None of the data collection efforts involve any special circumstances.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


In accordance with the Paperwork Reduction Act of 1995, the public was given 60 days to review the Federal Register Notice published on February 23, 2016 (81 FR 8992). No comments were received.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


There are no payments to respondents other than the funds provided under the grant agreement.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


Reports:

While this information collection makes no express assurance of confidentiality, ETA is responsible for protecting the privacy of the H-1B participant and performance data and will maintain the data in accordance with all applicable Federal laws, with particular emphasis on compliance with the provisions of the Privacy Act and the Freedom of Information Act. This data is covered by a System of Records Notice, DOL/ETA-15, published April 8, 2002 (67 FR 16898 et seq.). The Department is working diligently to ensure the highest level of security whenever personally identifiable information is stored or transmitted. All contractors that have access to individually identifying information are required to provide assurances that they will respect and protect the privacy of the data.

As added protection, Social Security numbers will be converted into a unique participant ID for each case record. The HUB Reporting System also includes a statement that informs the individual where the information he/she has provided is being stored, the name and location of the system, and that the information is protected in accordance with the Privacy Act. Any information that is shared or made public is aggregated by grantee and does not reveal personal information on specific individuals.


Data Collection for Program Evaluation:

Baseline Data Collection:

At baseline, all information on individual participants will be entered into the PTS. All sensitive data will be encrypted and protected by Secure Socket Layer (SSL). Logging and output files will not contain sensitive data and will be limited to generic system and error data. Furthermore, data extracts for use by the evaluators will be available in encrypted files on Relyon’s File Transfer Protocol over SSL (FTPS) site.


The system will segregate user data into sensitive data, user-identifiable data, and project-specific data. Sensitive data such as SSNs will be stored in a separate database table containing a system-generated ID number with the SSN stored in encrypted form. Sensitive data will be entered into the system but will at no point be displayed or downloadable by users of the system. User-identifiable data will be stored separately from project-specific data and will be available for updating only by designated administrators on the research team. Baseline data from the study will be available to the project team in specific extracts and reports.


The PTS will be accessible only to staff who are currently working on the project. Staff access to participant-level data will be restricted. To access the PTS, users will first log on to their workstations and then to the database using a separate log-in prompt. The database will be removed and securely archived at the end of the baseline data collection period.


Process Study Site Visits:

The administrators and staff interviewed by evaluators on-site will be assured that their responses will be combined with those from other sites for analysis, that no individual will be identified in any reports, and that interview notes will not be shared with ETA. Individuals will be interviewed separately and in private offices. Interview notes taken on laptop computers during the site visits will be stored on the evaluation contractor’s secure server, which requires a username and password to log in. If any interview notes are recorded on paper, they will be secured in a locked file cabinet or shredded after being typed or scanned and saved to the secure server.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

Reports:


While sensitive questions will be asked of participants in the proposed data collection, the privacy of participants will be protected as discussed in Section A.10. In addition, security is built into the HUB Reporting System, in accordance with ETA security requirements. Participant responses to these sensitive questions allow ETA and the evaluation contractors to comprehensively evaluate the effectiveness of the H-1B program.


Data Collection for Program Evaluation

Baseline Data Collection:

The BIF will collect background information on individuals who have consented to participate in this evaluation. Information on date of birth, address, and telephone numbers is needed to identify and contact participants. The BIF also collects information on characteristics of participants, such as sex, race/ethnicity, marital status, education level, employment history, and work-related barriers—data used to ensure that random assignment was conducted correctly, to create subgroups for the analysis, and to enhance the impact estimates. This type of information is generally collected as part of enrollment in most programs and is therefore not considered sensitive.

The BIF will also collect Social Security Numbers (SSNs). SSNs are needed for two important purposes. First, SSNs will be used to avoid duplication of random assignment; because the SSN is a unique identifier, checking for matching SSNs is the only completely dependable method for ensuring that participants are not randomly assigned twice, either in the same site or at different sites. The PTS is designed to securely check for duplicate SSNs within and across sites without ever displaying the SSN to the user (once the SSN is entered into the system, it remains hidden from PTS users).


Second, SSNs will be used so that the researchers can collect critical administrative data, namely Unemployment Insurance (UI) records, that will be used to measure the primary outcome of interest to the evaluation: the impact of the training programs on employment and earnings. The wage records collected by state UI agencies consist of quarterly earnings, by employer, for all UI-covered employees in the state. The only way to accurately access an individual’s UI data is through the SSN; other identifiers, such as name and date of birth, are not unique enough to ensure that the correct data will be obtained. An advantage of UI data is that they can be collected for a large sample over a long period of time. Thus, they allow the researchers to estimate precisely, for the whole sample and for subgroups, the net impacts on employment and earnings over a three-year period. They are also fairly uniform across states and over time, a characteristic that facilitates a straightforward approach to analysis and allows for cross-site comparisons. Compared to survey data, these data have the advantage of not being subject to the potential biases that can arise from recall error and survey non-response.


Process Study Site Visits


No sensitive questions will be asked during the site visits.


12. Provide estimates of the hour burden of the collection of information.


The annual national burden for the HUB Reporting System has three components: (1) the participant data collection burden; (2) the quarterly narrative progress report burden; and (3) the quarterly performance report burden. This response provides a separate burden for each of the three components.


  1. Participant Data Collection Burden


H-1B participant data collection burden considers the amount of participant and performance-related information collected and reported on the participant case record that would not have to be collected by the grantees as part of their customary and usual burden to run the program. Thus, the burden reflects the information collected solely to comply with Federal reporting and evaluation requirements.


The data collection burden will vary by participant based on the range and intensity of services provided by the grantee and its partners. For example, data collection may involve acquiring information from the various partner agencies regarding employment training and placement, education assistance, and on-the-job training, in addition to the collection of personal and demographic information and some evaluation data elements collected by the grantees themselves.


To arrive at the average annual figure of 2.66 hours per participant record, ETA assessed the time for entries based on scenarios postulating a variety of services possible for a range of anticipated participants. This information, in turn, was based on similar programs of this sort, including other employment and training programs. This figure is split between the data entry staff person (2.33 hours) and the participant orally providing data (0.33 hours).


Finally, ETA program managers consulted with grantees that have collected this sort of information over the past several years to verify that 2.66 hours is the average amount of time spent per record. Using this average figure (which includes data entry into a participant tracking system) for the estimated 25,000 participants per program year, averaged across approximately 85 grantees, 2.66 hours represents the best combined estimate of time devoted to data entry for each participant, given the range of entries anticipated for each participant, as described above.


ETA calculated the estimated program year national count based on the number of TST, JIAC, and RTW grants that are still active.



MIS Data Entry (Including Evaluation Data): Participant Record Burden

| Category | Average Hours per Record | Estimated Program Year National Count | Annual National Burden Hours | Hourly Rate for Data Entry Person | Annual National Burden Dollars |
| Grantee | 2.33 | 25,000 | 58,250 | $15.96 | $929,670.00 |
| Program participant | 0.33 | 25,000 | 8,250 | $7.25 | $59,812.50 |
| Total | 2.66 | 25,000 | 66,500 | | $989,482.50 |
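As an arithmetic cross-check on the table above, the totals follow directly from the per-record hours and hourly rates. The following is a minimal sketch (variable names are illustrative; all input figures are taken from this supporting statement):

```python
# Cross-check of the participant data collection burden table.
participants = 25_000                # estimated program year national count
grantee_hours_per_record = 2.33      # data entry staff time per record
participant_hours_per_record = 0.33  # participant time per record (oral responses)
grantee_rate = 15.96                 # BLS Social & Human Assistance hourly rate
participant_rate = 7.25              # Federal minimum wage

grantee_hours = grantee_hours_per_record * participants          # 58,250
participant_hours = participant_hours_per_record * participants  # 8,250
total_hours = grantee_hours + participant_hours                  # 66,500

grantee_cost = grantee_hours * grantee_rate              # $929,670.00
participant_cost = participant_hours * participant_rate  # $59,812.50
total_cost = grantee_cost + participant_cost             # $989,482.50

print(f"{total_hours:,.0f} hours; ${total_cost:,.2f}")
```

Under these inputs the computed totals match the table: 66,500 annual burden hours and $989,482.50 in annual national burden dollars.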



Hourly rates used to calculate hourly cost for H-1B programs depend upon the type of organization administering the program. For private non-profit grantees, the hourly rate of $15.96 is the average hourly earnings in the Bureau of Labor Statistics’ Social & Human Assistance industry category identified in the 2015 Occupational Employment Statistics Survey, Bureau of Labor Statistics (http://www.bls.gov/oes/current/oes211093.htm#st).


The Federal minimum wage of $7.25 has been used as an approximation of the value of participant time identified in the Economic Policy Institute website (http://www.epi.org/minimum-wage-tracker/).


  2. Quarterly Narrative Progress Report Burden


H-1B quarterly narrative progress report burden involves providing a detailed account of all activities undertaken during the quarter including in-depth information on accomplishments, promising approaches, progress toward performance outcomes, and upcoming grant activities. ETA assumes each grantee will spend approximately ten hours per quarter preparing this report.


ETA calculated the estimated number of grantees based on the number of TST, JIAC, and RTW grants that are still active.



| Report | Hours per Year per Grantee | Estimated Number of Grantees | Annual National Burden Hours | Applicable Hourly Rate | Annual National Burden Dollars |
| Quarterly Narrative Progress Report | 40 (10 hours per quarterly report x 4 quarters) | 85 | 3,400 | $15.96 | $54,264 |



  3. Quarterly Performance Report Burden


H-1B quarterly performance report burden assumes that all grantees will use the HUB Reporting System to generate quarterly performance reports, as is required by the grant. The HUB Reporting System is designed to apply edit checks to participant files and to generate facsimiles of the aggregate information on enrollee characteristics, services provided, placements, and outcomes in quarterly report format. The burden includes reviewing and correcting errors identified by the HUB Reporting System in the participant-level data and generating, reviewing, and approving the aggregate quarterly reports.


| Report | Hours per Year per Grantee | Number of Grantees | Annual National Burden Hours | Applicable Hourly Rate | Annual National Burden Dollars |
| Quarterly Performance Report | 40 (10 hours per quarterly report x 4 quarters) | 85 | 3,400 | $15.96 | $54,264 |





| Form/Activity | Total Annual Burden Hours | Estimated Total Respondents | Average Annual Hours/Respondent |
| Participant Data Collection | 66,500 | 85 grantees, 25,000 participants | 226 |
| Quarterly Narrative Progress Report | 3,400 | 85 grantees | 40 |
| Quarterly Performance Report | 3,400 | 85 grantees | 40 |
| Total | 73,300 | 85 grantees, 25,000 participants | |


  4. Site Visit Data Collection Burden


The site visit data collection burden is based on the assumption that evaluators will interview a total of 60 administrators and staff in grantee sites included in the evaluation, at three points over the course of the evaluation. The estimated response rate is 100 percent, since evaluators will confirm scheduled interview times with key administrators and staff in advance when arranging the site visits. The estimated response time for the individual interviews is an average of 60 minutes. The total estimated response burden for the site visits is 180 hours over the course of the evaluation.



| Form/Activity | Estimated Total Respondents | Frequency | Total Annual Responses | Average Time per Response (hours) | Total Annual Burden Hours | Applicable Hourly Rate | Total Annual Burden Cost |
| Site Visit Data Collection | 60 total staff | Three times (once per year) | 60 | 1 hour | 60 | $20.73 | $1,243.80 |


The average hourly wage in the table for site visit data collection is $20.73, based on the BLS average hourly earnings of production and nonsupervisory employees, total private, seasonally adjusted, identified in the 2014 Occupational Employment Statistics Survey, Bureau of Labor Statistics, BLS Data Viewer (http://beta.bls.gov/dataViewer/view/timeseries/CES0500000008).


Totals:

| Activity | Number of Respondents | Frequency | Total Annual Responses | Time Per Response | Total Annual Burden (Hours) | Hourly Rate* | Monetized Value of Respondent Time |
| Participant Data Collection – Grantee | 85 grantees, 25,000 participants | Continual | 25,000 | 2.33 hours | 58,250 | $15.96 | $929,670.00 |
| Participant Data Collection – Participant | 25,000 participants | Continual | 25,000 | 0.33 hours | 8,250 | $7.25 | $59,812.50 |
| Quarterly Narrative Report | 85 | Quarterly | 340 (85 grantees x 4) | 10 hours | 3,400 | $15.96 | $54,264.00 |
| Quarterly Performance Report | 85 | Quarterly | 340 (85 grantees x 4) | 10 hours | 3,400 | $15.96 | $54,264.00 |
| Site Visit Data Collection | 60 | Three times (once per year) | 60 | 1 hour | 60 | $20.73 | $1,243.80 |
| Unduplicated Totals | 25,315 | | 50,740 (annual) | | 73,360 | | $1,099,254.30 |
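The unduplicated totals can likewise be verified by summing the per-activity rows. A minimal sketch (hour and dollar figures are taken from this table; names are illustrative):

```python
# Sum the per-activity annual burden hours and monetized values
# to verify the unduplicated totals row of the table.
rows = {
    "Participant Data Collection - Grantee":     (58_250, 929_670.00),
    "Participant Data Collection - Participant":  (8_250,  59_812.50),
    "Quarterly Narrative Report":                 (3_400,  54_264.00),
    "Quarterly Performance Report":               (3_400,  54_264.00),
    "Site Visit Data Collection":                    (60,   1_243.80),
}

total_hours = sum(hours for hours, _ in rows.values())  # 73,360
total_cost = sum(cost for _, cost in rows.values())     # $1,099,254.30
print(f"{total_hours:,} hours; ${total_cost:,.2f}")
```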



*Hourly rates used to calculate hourly cost for H-1B programs depend upon the type of organization administering the program. For private non-profit grantees, the hourly rate of $15.96 is the average hourly earnings in the Bureau of Labor Statistics’ Social & Human Assistance industry category identified in the 2015 Occupational Employment Statistics Survey, Bureau of Labor Statistics (http://www.bls.gov/oes/current/oes211093.htm#st).


The average hourly wage in the table for site visit data collection is $20.73, based on the BLS average hourly earnings of production and nonsupervisory employees on private, service providing, nonfarm payrolls identified in the 2014 Occupational Employment Statistics Survey, Bureau of Labor Statistics, BLS Data Viewer (http://beta.bls.gov/dataViewer/view/timeseries/CES0500000008).


The Federal minimum wage of $7.25 has been used as an approximation of the value of participant time, identified in the Economic Policy Institute website (http://www.epi.org/minimum-wage-tracker).


13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).


a) Start-Up/Capital Costs: There are no start-up costs, as ETA provides grantees with a free, Web-based, data collection and reporting system – the HUB Reporting System – which grantees use to collect and maintain participant data, apply edit checks to the data, and generate quarterly performance reports.


b) Annual Costs: There are no annual costs, as ETA will be responsible for the annual maintenance costs for the HUB Reporting System.


14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours,

operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.


The annual cost of maintaining the system, estimated at $285,000, is borne by ETA. This is a reduction from previous costs, which included both the development and maintenance of the HUB Reporting System. Since the HUB Reporting System, the electronic mechanism for collecting and storing grantee performance data on a quarterly basis, is already in place, the annualized cost to the Federal government (maintaining the quarterly reports and records through HUB, matching SIR data with state UI wage records and other Federal employment databases, generating quarterly performance reports, and any program close-out activities) reflects maintenance costs only.


15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


The decrease in burden hours reflects (1) a decrease in participant data collection burden hours to account for respondents serving fewer participants annually (25,000 participants served, compared to 45,300 identified in the previous ICR); (2) removal of burden hours associated with Site Visit Interviews (previously accounting for 217 total annual response hours), as this collection has been completed; and (3) a reduction in the number of total respondents, frequency, and burden hours for Site Visit Data Collection (referred to as Process Study Visits in the previous ICR), accounting for 420 hours.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Grantees will submit narrative progress and performance reports on a quarterly basis to ETA through the HUB Reporting System, within 45 days of the end of each quarter. Quarterly report data will be analyzed by ETA staff and used to evaluate performance outcomes and program effectiveness.


Each quarter, ETA issues the Quarterly Workforce System Results. Data contained in the HUB Reporting System is included in these reports. The data is also used to prepare GPRA reports, management and budget reports, and other ad hoc reports, as needed.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The expiration date for the OMB approval will be displayed.


18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”


No exceptions are requested in the “Certification of Paperwork Reduction Act Submissions.”


B. Collections of Information Employing Statistical Methods


No statistical methods will be used.  All data collection will be based on a 100 percent sample of the inference population.  In all reports and other publications and statements resulting from this work, no attempt will be made to draw inferences to any population other than the set of units that responded to the data collection effort.


ATTACHMENTS


Below is the list of attachments included in this request. For each document, we have identified whether any changes were made to the version approved under the original OMB approval. If changes were made, we have included both a clean version and a version with edits in track changes.


The H-1B Performance Reporting Handbooks

  • Attachment A: H-1B TST-JIAC Performance Reporting Handbook (Final)

    • Attachment A1: H-1B TST-JIAC Performance Reporting Handbook (Edited Version with track changes to reflect revisions to original document)

  • Attachment B: H-1B RTW Performance Reporting Handbook (Final)

    • Attachment B1: H-1B RTW Performance Reporting Handbook (Edited Version with track changes to reflect revisions to original document)

The H-1B Quarterly Narrative Reports

  • Attachment C: H-1B TST Quarterly Narrative Report Template (No changes made to the original document)

  • Attachment D: H-1B JIAC Integrated Work Plan Quarterly Narrative Report Template (No changes made to the original document)

  • Attachment E: H-1B RTW Quarterly Narrative Report Template (No changes made to the original document)

The H-1B Quarterly Performance Report Forms

  • Attachment F: H-1B TST JIAC QPR Form (No changes made to the original document)

  • Attachment G: H-1B RTW QPR Form (Final)

    • Attachment G1: H-1B RTW QPR Form (Edited Version with track changes to reflect revisions to original document)


Evaluation Supporting Documents

  • Attachment H: H-1B Evaluation Consent Form (Final)

    • Attachment H1: H-1B Evaluation Consent Form (Edited Version with track changes to reflect revisions to original document)

  • Attachment I: H-1B TST Evaluation Process Study Interview Protocol Guide (No changes made to the original document)

  • Attachment J: H-1B RTW Evaluation Process Study Interview Protocol Guide (No changes made to the original document)


The H-1B Performance and Evaluation Data Collection Elements

  • Attachment L: H-1B Data Elements for Performance and Evaluation (No changes made to the original document)

  • Attachment M: Baseline Information Form (New document)


Federal Register Notice 2016 - 03682

  • Attachment N: Federal Register Notice 2016-03682

(02/23/2016, Vol. 81, No. 35, Pgs. 8992 –8993) (No changes made to the original document)
