SUPPORTING STATEMENT FOR REQUEST FOR OMB APPROVAL
UNDER THE PAPERWORK REDUCTION ACT
Employment and Training Data Validation Requirement
OMB Control Number 1205-0448
PART A – JUSTIFICATION
This is a justification for the Department of Labor’s request for approval to extend a currently approved data validation requirement for five Employment and Training Administration (ETA) programs. Data validation assesses the accuracy of data collected and reported to ETA on program activities and outcomes. The accuracy and reliability of program reports submitted by states and grantees using federal funds are fundamental elements of good public administration and necessary tools for maintaining and demonstrating system integrity. The data validation requirement for employment and training programs strengthens the workforce system by ensuring that accurate and reliable information on program activities and outcomes is available.

The following programs are subject to the data validation requirement: Workforce Investment Act (WIA) Title IB, Wagner-Peyser Act, Trade Adjustment Assistance (TAA), National Farmworker Jobs Program (NFJP), and the Senior Community Service Employment Program (SCSEP). All of these programs must conduct both report and data element validation; however, the specific processes and the data elements that must be validated are program specific. All program-specific information is discussed in the instructions for carrying out data validation for these programs. The Indian and Native American Program (INAP) is no longer part of this supporting statement (it was removed during a previous clearance cycle); INAP has integrated data validation software into its electronic system, “Bear Tracks,” accounted for under Office of Management and Budget (OMB) control number 1205-0422.
ETA is currently in the process of phasing in the software states use to conduct data validation, per the OMB Notice of Action on this collection in 2012. Prior to Program Year (PY) 2012, ETA provided states with standalone distributed software, the Data Reporting and Validation System (DRVS), which states had to download, install, and update on one or more of their own computers. Data validation results were then uploaded to ETA’s Enterprise Business Support System (EBSS), where they were stored. After completing data validation, states were required to upload separate individual record files to ETA for WIA, TAA, NFJP, and SCSEP reporting purposes covered under separate OMB control numbers. Wagner-Peyser individual records were not previously stored by ETA. As indicated in ETA’s most recent data validation collection renewal in 2012, ETA has begun to update the data validation software. Beginning in PY 2012 for Wagner-Peyser and in PY 2013 for WIA Title IB and TAA, ETA implemented an updated Enterprise Data Reporting and Validation System (EDRVS) that is web-based and consolidates the reporting and data validation processes into one system. While the software states and grantees use for conducting data validation has changed, the methodology (described in Part B) used to draw the samples and produce error rates is unchanged. As a result, the burden estimates (described in Item A.12) associated with this collection renewal request are unchanged. All changes in burden are associated with program reporting, covered under separate OMB control numbers (for example, the WIA Reporting System, OMB control number 1205-0240).
Per the OMB Notice of Action in 2012 approving the implementation of the EDRVS software, ETA was asked to collect and provide state feedback on the data validation components of the software. However, due to delays in the deployment of the software itself and the time lag between the completion of a program year and the date by which states must complete validating their records, states have only just begun to use the software. As a result, there has not been sufficient time to fully utilize EDRVS by the current expiration date for this data collection (May 31, 2014). ETA believes the software deployment can be successfully concluded by the end of 2015, and will at that time report to OMB on testing results, per the OMB Notice of Action in 2012.
1. Reasons for Data Collection
States and grantees receiving funding under WIA Title IB, Wagner-Peyser Act, TAA, and the Older Americans Act are required to maintain and report accurate program and financial information (WIA section 185 (29 U.S.C. 2935) and WIA Regulations 20 CFR 667.300(e)(2); Wagner-Peyser Act section 10 (29 U.S.C. 49i), Older Americans Act section 503(f)(3) and (4) (42 U.S.C. 3056a(f)(3) and (4)), and TAA Regulations 20 CFR 617.57). Further, all states and grantees receiving funding from ETA and the Veterans’ Employment and Training Service are required to submit reports or participant records and attest to the accuracy of these reports and records.
The Department’s Office of Inspector General (OIG) conducted an audit of WIA performance data oversight from July 2000 through October 2001. The audit, released in September 2002, found that, “Because of insufficient local, state, and Federal oversight, the Employment and Training Administration (ETA) has little assurance that the state-reported WIA performance outcomes data are either accurate or verifiable.” The OIG recommended that states should validate reported data using rigorous validation methodology. To address the concerns raised by the OIG and to meet the Agency’s goal for accurate and reliable data, ETA implemented a data validation requirement in order to ensure the accuracy of data collected and reported on program activities and outcomes.
ETA has developed a process for validating data submitted by states and grantees. Data validation consists of two parts:
1) Report validation evaluates the validity of aggregate reports submitted to ETA. Under EDRVS, report validation is implicit: states no longer generate their own reports. Instead, EDRVS automatically generates the state-level aggregate reports from the state’s certified individual record files submitted to EDRVS and the performance reporting specifications for the quarterly and annual reports.
2) Data element validation appraises the accuracy of participant data records and serves as a management tool. It is conducted by manually reviewing samples of participant records against their underlying source documentation in order to verify the accuracy of the data contained in the states’ and grantees’ management information systems and to affirm compliance with program-specific Federal definitions. The results of data element validation are used to identify areas on which to focus system resources in order to systematically improve program management over time.
This approach addresses the two fundamental sources of reporting errors within ETA program data: data entry errors and inaccurate computation of the required aggregate reports at the state and grantee level. If the data collected are systematically incorrect or data entry errors routinely occur, the outcomes information will not be accurate even though EDRVS is used to produce the aggregate reports. Data element validation addresses this issue by comparing performance-related data in each state’s participant record file to the original data in the source files and determining an error rate that indicates the degree of accuracy of each data element used in calculating the state’s performance results. In addition, EDRVS uses the state’s individual record file submissions to automatically generate the aggregate quarterly and annual reports; states must certify that these reports accurately reflect their program participants before the error rates are determined for each performance outcome reported by the state.
ETA maintains the software and requires that states use it for program reporting and validation. While the software is updated on a rolling basis to reflect changes to the reporting requirements and to fix software bugs, the mechanics of the system (the general data collection, the instructions for using the software, and the required data elements) remain the same as under the previous collection authorization.
WIA Title IB, Wagner-Peyser, TAA, NFJP and SCSEP program staff have been conducting data validation for several years. Program staff received training prior to the implementation of data validation and continue to receive ongoing training and technical assistance from ETA’s data validation contractor throughout the validation process.
Previous experience with data validation has indicated the following:
States and grantees are able to conduct data validation with a reasonable, but sustained, level of effort.
The validation process allows states and grantees to identify and address reporting errors.
The average staff requirement for a state to complete validation for the WIA Title IB, Wagner-Peyser, and TAA programs is about 792 hours per year (less than 1/2 of a staff year). There is no startup burden for these programs because it was incurred upon initial implementation. The average annual time estimate for NFJP and SCSEP grantees to complete validation is approximately 161 hours (less than 1/10 of a staff year).
Changes to the EDRVS software are continuous, resulting from changes in program legislation and required data elements as well as from improvements to the EDRVS itself.
On the basis of the significant benefits of data validation along with its minimal burden going forward, ETA seeks to extend the existing data validation requirement for employment and training programs.
Currently, data validation is required annually as follows:
Report validation is conducted automatically by EDRVS at every file upload, as the software generates the quarterly and annual reports.
Data element validation must be completed within 120 days after required annual reports or participant records are due at ETA. Exact deadlines for the completion of data validation vary by program.
States and grantees are required to send data element validation output reports to ETA within 120 days after they submit required annual reports or participant records.
States and grantees of the programs listed below use EDRVS to validate the reports and participant records shown in Table 1.
Table 1 – Reports and Records Validated Using EDRVS

Program | Report/Records | OMB Approval No.
Workforce Investment Act Title IB | ETA 9091 (annual report) | 1205-0420
Wagner-Peyser | ETA 9002, VETS 200 | 1205-0240
Trade Adjustment Assistance | TAPR | 1205-0392
National Farmworker Jobs Program | WIASPR | 1205-0425
Senior Community Service Employment Program | ETA 5140 (annual report) | 1205-0040
The user handbooks for each program provide a more detailed overview of the validation process. These are made available electronically within the EDRVS by clicking the help link. Training and Employment Guidance Letter No. 28-11 outlines ETA’s most recent data validation policy.
2. Purpose of Information Collection
ETA uses data validation results to evaluate the accuracy of data collected and reported to ETA on program activities and outcomes. This information collection enables ETA to assure its customers, partners, and stakeholders of the validity of performance data underlying the respective programs. Further, data validation ensures that performance information used for WIA accountability purposes and to meet Government Performance and Results Act (GPRA) responsibilities is accurate.
Data validation was also developed with the goal of assisting states and grantees in providing more accurate data. Validation allows states and grantees to detect and identify specific problems with their reporting processes, including software and data issues, and to enable them to correct the problems. In addition, the tools developed by ETA help states and grantees analyze the causes of performance successes and failures by displaying participant data organized by performance outcomes. These tools are available at no cost to states and grantees.
3. Technology and Obstacles Affecting Reporting Burden
ETA knows of no technical obstacles to implementing and continuing data validation. ETA has developed new web-based software that states and grantees must use to conduct data validation:
Software developed by ETA generates samples, worksheets, and reports on data accuracy. For report validation, the software automatically generates the aggregate reports that states or grantees must certify. For data element validation, the software generates a random sample of the participant records and data elements for the state or grantee to manually validate, and produces worksheets on which the validator records information after checking the source documentation in the sampled case files. The software calculates error rates for each data element, with confidence intervals of plus or minus 3.5 percentage points for large states/grantees and 4 percentage points for small states/grantees.
User handbooks are provided within the software under the help link and provide detailed information on using the software and completing data element validation. The handbooks also explain the validation methodology, including sampling specifications and data element validation instructions for each data element to be validated. The current handbooks will undergo an iterative revision process as the software is deployed and as states begin to utilize all of the various components.
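The error-rate calculation the software performs for each data element can be sketched as follows. This is an illustrative reconstruction of the general approach, not the EDRVS code, and the sample figures (500 records reviewed, 15 mismatches) are hypothetical.

```python
import math

def element_error_rate(sample_outcomes):
    """Share of sampled records whose reported value for a data element
    does not match the underlying source documentation."""
    errors = sum(1 for matches_source in sample_outcomes if not matches_source)
    return errors / len(sample_outcomes)

def ci_half_width(error_rate, n, z=1.96):
    """Half-width of a 95% normal-approximation confidence interval
    around the estimated error rate for a sample of n records."""
    return z * math.sqrt(error_rate * (1 - error_rate) / n)

# Hypothetical sample: 500 case files reviewed, 15 fail to match the source.
sample = [False] * 15 + [True] * 485
rate = element_error_rate(sample)
print(f"error rate {rate:.1%}, 95% CI +/- {ci_half_width(rate, len(sample)):.1%}")
```

At these hypothetical sample sizes the interval is comfortably narrower than the 3.5 or 4 point targets; the actual sample sizes ETA specifies in the handbooks are chosen to hit those targets exactly.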
Currently, all states and grantees use the software provided by ETA to conduct validation for the WIA Title IB, Wagner-Peyser, TAA, NFJP, and SCSEP programs. States and grantees can obtain technical assistance on validation procedures and the use of the validation tools from ETA’s data validation contractor.
The software can also be used to generate the aggregate information required in reports submitted to ETA. States and grantees that use the software provided by ETA to generate this aggregate information are not required to conduct report validation. However, states still must demonstrate that they used the validation software to calculate their aggregate reports.
For both report validation and data element validation, the ETA software uses the validation data provided by the states or grantees to produce validation summary reports which, in compliance with the Government Paperwork Elimination Act, are submitted via the system now used for electronic transmission of reports to ETA.
4. Duplication
The data validation requirement does not duplicate any existing ETA program.
5. Burden on Small Business or Other Small Entities
While data validation is conducted mostly by state governments and large, private, non-profit organizations, some small entities are required to conduct validation. Some of the grantees operating NFJP and SCSEP are small, private, non-profit organizations providing services to a small number of individuals. However, because of the low burden estimates associated with data validation for these programs, this information collection does not significantly impact these small entities.
The data element validation process allows states and grantees to randomly select validation samples from the complete data file, in order to compute statistically significant error rates, rather than requiring the validation of every participant case file. To reduce the relative burden on smaller states and grantees as much as possible, the sample size for smaller entities is smaller than for larger grantees and states. This leads to slightly wider confidence intervals of 4 percent for small states, compared to 3.5 percent for large states.
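A rough sketch of why smaller entities can be sampled less heavily, at the cost of slightly less precision, is below. This uses the standard sample-size formula for a proportion with an optional finite-population correction; it is illustrative only (ETA's actual sampling specifications live in the program handbooks), and the population of 400 exiters is a hypothetical small grantee.

```python
import math

def required_sample_size(half_width, p=0.5, z=1.96, population=None):
    """Records to sample so a 95% confidence interval on an error rate
    has at most the given half-width.  p=0.5 is the conservative
    worst-case variance; the optional finite-population correction
    shrinks the required sample for small grantees."""
    n = (z ** 2) * p * (1 - p) / half_width ** 2
    if population is not None:
        n /= 1 + (n - 1) / population  # finite-population correction
    return math.ceil(n)

# A +/- 4 point interval needs ~600 records from a large exiter pool,
# but far fewer once a small grantee's finite pool is accounted for.
print(required_sample_size(0.04))
print(required_sample_size(0.04, population=400))
```

The finite-population correction is why a small grantee's sample can be both smaller in absolute terms and a larger fraction of its caseload.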
6. Consequences of Failure to Collect Data
As mentioned in Item A.1, concerns were raised in the past about monitoring and the inability to consistently assure the validity of performance outcomes reported by states and grantees. ETA regional staff continue to conduct data quality reviews based on current data validation efforts to determine whether states are in compliance with data validation guidelines. The proposed continuation of the data validation requirement will allow ETA to continue to address these issues. If data validation is discontinued, ETA will not be able to ensure that critical data used for performance reports, for accountability purposes, to meet GPRA responsibilities, and for other management purposes are reliable.
7. Special Circumstances Involved in Collection of Data Validation Information
This request is consistent with 5 CFR 1320.5.
8. Pre-Clearance Notice and Responses
A Pre-clearance Notice soliciting sixty days of public comment was published in the Federal Register on December 24, 2013 (78 FR 77718). Comments are summarized below and paired with ETA’s responses.
SCSEP Grantee Comments/Issues and ETA Response

ETA Response: Employment status at participation is a critical data element. This data element is used to record an individual’s employment status at the point of program participation. This element is critical to program eligibility as well as to the entered employment outcome measure.

NFJP Comments/Issues and ETA Response

Comment: One burden grantees encounter is obtaining the wages for participants who have been placed, during the follow-up period. One suggestion was to give NFJP grantees access to the wage records in each state in which the program operates; grantees could enter into a Memorandum of Understanding with the state and retrieve the wages for their participants during the placement follow-up period, as the main WIA program does.

WIA/TAA/Wagner-Peyser Comments/Issues and ETA Responses

Comment: With regard to enhancing the quality, utility, and clarity of the information collected, DOL would have to advise the states how this new process enhances the quality, clarity, and utility of collected information. This state found the new EDRVS and the new DOL data element validation process to be a much greater burden than the previous software and processes, and hoped the new software will be better maintained, resulting in fewer technical issues than its predecessor. The state feels the extra time required for entering code values and for review outweighs the utility needs of the Department of Labor, particularly if the sample size is not going to be reduced in the new EDRVS. Should the Department of Labor consider reducing the sample sizes for WIA data element validation, the burden for the states could be reduced. Two additional comments also stated that the sample size should be reduced.

Comment: Another state did not concur with DOL’s statement that states and grantees are able to conduct data validation annually with a reasonable but sustained level of effort. The demands placed on staff to conduct data element validation activities are not reasonable, and the exercise requires effort and resources that exceed the value of the benefits. The state is enduring budget cuts for DOL-funded programs while travel expenses are increasing. For the most recent Title I data element validation process (PY 2012), the estimated cost, including staff time, travel, and other expenses, was $58,000. The state suggested that DOL consider making data validation a biennial rather than an annual activity.

ETA Response: ETA is considering alternative methods to verify UI wage record data. One possibility is modifying the methodology to verify the accuracy of the wage database rather than each individual wage record. However, ETA has not identified how that process could be integrated into the sampling procedures and error rate computations. Since the wage records are the most critical variables for computing performance outcomes, this modification must be made with careful attention. While respondents may consistently produce accurate data, ETA must ensure that every state and grantee reports accurate wage information in a consistent manner.

While there have been several improvements, the data element validation procedure remains essentially the same, because the process continues to rely on drawing a sample and verifying that the underlying information matches what was submitted in the state’s individual record file. There is little the new software could do to reduce the burden associated with that process. ETA is actively working to reduce the sample sizes in a manner that would not compromise precision in the error rate estimates.

Local area reporting capability has not yet been automated within the EDRVS; these reports will be built into the system in future versions. ETA has implemented a workaround whereby the local area reports are produced and provided in Excel format, and will continue to provide them in this manner until they are generated automatically by EDRVS.
9. Payments to Respondents
This information collection does not involve direct payments to respondents. ETA does provide administrative funding to the participating states and grantees, which are listed as the respondents for purposes of the Paperwork Reduction Act. The requirement to perform data validation derives from states’ and grantees’ responsibility to provide accurate information on program activities and outcomes to ETA. States and grantees are expected to provide resources from their administrative funds for the data validation effort. Validation of program performance is a basic responsibility of grantees, which are required to report program performance, under Department of Labor regulations (29 CFR 95.51 and 97.40).
10. Confidentiality
Participant record layouts used in data validation for the WIA Title IB, Wagner-Peyser, TAA, NFJP, and SCSEP programs use state-assigned individual identifiers rather than Social Security Numbers (SSNs). The SSNs needed to obtain wage record information are removed from the data files after the wage information is reported, and are replaced with state-specific unique identifiers before the files are uploaded to EDRVS. Data element validation, however, requires the state to access the underlying wage records by SSN in order to verify the accuracy of the wage information contained in the participant records submitted to ETA. The records that fall into the data element validation sample must therefore include an identifier the state can use to link back to the actual wage data reported in the individual record file, because validation works by comparing the information in the case file (or wage record file, in this case) against what was reported in the individual record file. To do this, the state uses the unique identifier associated with a sampled record to locate the relevant case file; the case file information is then used to link to the wage information for the purposes of data element validation.
To protect the privacy of program participants, the validation software never receives an SSN and includes functionality that allows program administrators to limit access to this information based on administrative clearance. One user name and password is issued to each state, and the program administrator is the only person with access to that password; no other means of access to these data is permitted. Privacy is not an issue for report validation, because that aspect of data validation simply verifies the accuracy of aggregate reports submitted to ETA, which contain no private information.
11. Questions of a Sensitive Nature
The data collection includes no questions of a sensitive nature.
12. Respondent Annual Burden
Data validation is estimated to require an annual burden of 62,174 hours for all five programs subject to this validation requirement.
Burden estimates for state programs – WIA Title IB, Wagner-Peyser, and TAA – are outlined in Item 12.A. Data validation is estimated to require a total annual burden of 41,970 hours, with an equivalent value of $1,079,888, for all state programs. Burden estimates for grantee programs – NFJP and SCSEP – are outlined in Item 12.B. Data validation is estimated to require a total annual burden of 20,204 hours for these programs, with an equivalent value of $354,580 (at the rate for private non-profit grantees and Federally-recognized tribes) or $519,849 (at the rate for state, county, and U.S. territory government grantees).
Table 2 provides an overview of the annual burden for the WIA Title IB, Wagner-Peyser, and TAA programs, including average hours and costs across states in all three programs. The estimated annual hours needed to conduct validation for these programs is 792 hours (rounded) on average per state and 41,970 hours for all states. The estimated annual cost of performing validation is $20,378 on average per state with an equivalent value of $1,079,888 for all states.
Table 2 - Calculation of Combined Annual Burden for WIA Title IB, Wagner-Peyser, and TAA Programs
State Size | No. of States (Respondents) | Reports per year per State | Total Annual Reports | Hours per Report | Hours per State | Total Hours | Rate in $/hr | Monetized Value
Large State | 18 | 4 | 72 | 301.5 | 1,206 | 21,708 | $25.73 | $558,547
Medium State | 18 | 4 | 72 | 186.5 | 746 | 13,428 | $25.73 | $345,502
Small State | 17 | 4 | 68 | 100.5 | 402 | 6,834 | $25.73 | $175,839
All States Total | 53 | 4 | 212 | Varies | Varies | 41,970 | $25.73 | $1,079,888
Average per State | 1 | 4 | 4 | 198 (rounded) | 792 | 792 (rounded) | $25.73 | $20,378
The calculation of the hours required to conduct validation reflects the sample size, the time for validators to review sampled case files (34 minutes per file), the travel time to local offices to review the files, and a supervisor’s time, estimated at 15 percent of the validators’ hours.
States have been divided into three categories – large, medium, and small – based on the number of participants that exit a state’s program in a year. The size of the state impacts the number of sampled case files that must be reviewed and the travel time to local offices.
The annual travel time per office is estimated as 8 hours for large states, 6 hours for medium states, and 3 hours for small states. This estimate is based on the assumption that states will conduct data element validation separately for the WIA Title IB and TAA programs. If states conduct data element validation for both programs at the same time, the travel time required to perform validation will decrease.
The hourly rate is the estimated average hourly earnings in the “administration of economic programs” industry, North American Industry Classification System (NAICS) code 926110 (Calendar Year (CY) 2013, Quarterly Census of Employment and Wages, Bureau of Labor Statistics, http://data.bls.gov/pdq/querytool.jsp?survey=en).
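The burden build-up just described can be sketched as follows. The 34-minute review time and the 15 percent supervisor share come from the text above; the file count and office figures in the example are hypothetical and do not correspond to any particular state category in Table 2.

```python
def annual_validation_hours(files_reviewed, minutes_per_file,
                            offices_visited, travel_hours_per_office,
                            supervisor_share=0.15):
    """Illustrative build-up of a state's annual validation burden:
    validator review time plus travel to local offices, with supervisory
    oversight counted as 15 percent of the validators' hours."""
    review_hours = files_reviewed * minutes_per_file / 60
    travel_hours = offices_visited * travel_hours_per_office
    validator_hours = review_hours + travel_hours
    return validator_hours * (1 + supervisor_share)

# Hypothetical state: 1,800 sampled files at 34 minutes each, plus
# travel to 40 local offices at 8 hours per office.
print(round(annual_validation_hours(1800, 34, 40, 8)))
```

Validating WIA Title IB and TAA case files in a single trip per office would reduce only the travel term, which is why combined validation lowers the total burden.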
Table 3 provides an overview of the annual burden for the NFJP and SCSEP including average hours and cost across grantees in this program. The estimated annual hours needed to conduct validation for this program is 161 hours (rounded) on average per grantee and 20,204 hours for all grantees. The estimated monetary equivalency of the burden hours to conduct validation is $2,814 (for private non-profit grantees and Federally-recognized tribes)/$4,126 (for state, county, and U.S. territory government grantees) on average per grantee and $354,580/$519,849, respectively, for all grantees.
Table 3 – Summary Calculation of Annual Burden for NFJP and SCSEP Grantees
Grant Program | No. of Grantees (Respondents) | Reports per year per Grantee | Total Annual Reports | Hours per Report | Hours per Grantee | Total Hours | Rate in $/hr | Monetized Value
NFJP | 52 | 4 | 208 | 39.5 | 158 | 8,216 | $17.55/$25.73 | $144,191/$211,398
SCSEP | 74 | 4 | 296 | 40.5 | 162 | 11,988 | $17.55/$25.73 | $210,389/$308,451
All Grantees Total | 126 | 4 | 504 | Varies | Varies | 20,204 | $17.55/$25.73 | $354,580/$519,849
Average per Grantee | NA | 4 | 4 | 40 (rounded) | 160.35 | 160.35 | $17.55/$25.73 | $2,814/$4,126
The calculation of the hours required to conduct validation includes the time for validators to review sampled case files (40 minutes per file) and 15 percent of a supervisor’s time. (Travel is not required for grantees to conduct validation).
The hourly rate used to calculate cost depends upon the type of organization receiving the grant. For private non-profit grantees and Federally-recognized tribes, the hourly rate is the average hourly earnings in the civic and social organizations industry, Current Employment Statistics (CES) code 80313400 (March 2014, CES survey, Bureau of Labor Statistics, http://data.bls.gov/PDQ/outside.jsp?survey=ce). For state, county, and U.S. territory government grantees, the hourly rate is the estimated average hourly earnings for employees in the administration of economic programs industry, NAICS code 926110 (CY 2013, Quarterly Census of Employment and Wages, Bureau of Labor Statistics, http://data.bls.gov/pdq/querytool.jsp?survey=en).
Tables 4 through 6 provide a more detailed account of the annual burden estimates for each grantee program.
Table 4 – Disaggregated Summary Calculation of Annual Burden for NFJP and SCSEP Grantees
Grant Program | No. of Grantees (Respondents) | Reports per year per Grantee | Total Annual Reports | Hours per Report | Hours per Grantee | Total Hours | Rate in $/hr | Monetized Value
NFJP Total | 52 | 4 | 208 | 39.5 | 158 | 8,216 | $17.55/$25.73 | $144,191/$211,398
Private Sector | 50 | 4 | 200 | 39.5 | 158 | 7,900 | $17.55 | $138,645
State, Local, or Tribal Government | 2 | 4 | 8 | 39.5 | 158 | 316 | $25.73 | $8,131
SCSEP Total | 74 | 4 | 296 | 40.5 | 162 | 11,988 | $17.55/$25.73 | $210,389/$308,451
Private Sector | 18 | 4 | 72 | 40.5 | 162 | 2,916 | $17.55 | $51,175
State, Local, or Tribal Government | 56 | 4 | 224 | 40.5 | 162 | 9,072 | $25.73 | $233,423
All Grantees Total | 126 | 4 | 504 | NA | NA | 20,204 | $17.55/$25.73 | $354,580/$519,849
Table 5 - Calculation of Annual Burden for NFJP
Type of Grantee | No. of Grantees | Hours | Rate in $/hr | Monetized Value
Private Sector (Non-Profits) | 50 | 158 (per grantee) | $17.55 | $2,773 (per grantee)
State, Local, or Tribal Government | 2 | 158 (per grantee) | $25.73 | $4,065 (per grantee)
All Grantees | 52 | 8,216 | -- | $132,660
Avg. per Grantee | -- | 158 | -- | $2,551
Note: The hourly rate used to calculate cost depends upon the type of organization receiving the grant. For private non-profit grantees and Federally-recognized tribes, the hourly rate is the average hourly earnings in the civic and social organizations industry, CES code 80313400 (March 2014, CES survey, Bureau of Labor Statistics, http://data.bls.gov/PDQ/outside.jsp?survey=ce). For state, county, and U.S. territory government grantees, the hourly rate is the estimated average hourly earnings for employees in the administration of economic programs industry, NAICS code 926110 (CY 2013, Quarterly Census of Employment and Wages, Bureau of Labor Statistics, http://data.bls.gov/pdq/querytool.jsp?survey=en).
Table 6 - Calculation of Annual Burden for SCSEP
Type of Grantee | No. of Grantees | Hours | Rate in $/hr | Cost
Private Sector (Non-Profits) | 18 | 162 (per grantee) | $17.55 | $2,843 (per grantee)
State, Local, or Tribal Government | 56 | 162 (per grantee) | $25.73 | $4,168 (per grantee)
All Grantees | 74 | 11,988 | -- | $414,435
Avg. per Grantee | -- | 162 | -- | $5,600
Note: For private non-profit grantees and Federally-recognized tribes, the hourly rate is the average hourly earnings in the civic and social organizations industry, CES code 80313400 (March 2014, CES survey, Bureau of Labor Statistics, http://data.bls.gov/PDQ/outside.jsp?survey=ce). For state, county, and U.S. territory government grantees, the hourly rate is the estimated average hourly earnings for employees in the administration of economic programs industry, NAICS code 926110 (CY 2013, Quarterly Census of Employment and Wages, Bureau of Labor Statistics, http://data.bls.gov/pdq/querytool.jsp?survey=en).
Table 7 provides the total burden for this information collection, broken out by type of respondent.
Table 7 – Disaggregated Summary Calculation of Annual Burden for NFJP and SCSEP Grantees
Respondent Type and Grant Program | No. of Grantees (Respondents) | Reports per Year per Grantee | Total Annual Reports | Hours per Report | Hours per Grantee | Total Hours | Rate in $/hr | Monetized Value
State, Local, and Tribal Governments | 56 | 4 | 444 | Varies | Varies | 51,358 | $25.73 | $1,321,441
  WIA Title IB, Wagner-Peyser, and TAA | 53 | 4 | 212 | Varies | Varies | 41,970 | $25.73 | $1,079,888
  NFJP | 2 | 4 | 8 | 39.5 | 158 | 316 | $25.73 | $8,131
  SCSEP | 56 | 4 | 224 | 40.5 | 162 | 9,072 | $25.73 | $233,423
Private Sector (Non-Profits) | 68 | 4 | 272 | Varies | Varies | 10,816 | $17.55 | $189,821
  NFJP | 50 | 4 | 200 | 39.5 | 158 | 7,900 | $17.55 | $138,645
  SCSEP | 18 | 4 | 72 | 40.5 | 162 | 2,916 | $17.55 | $51,176
Unduplicated Totals | 121 | 4 | 716 | Varies | Varies | 62,174 | Varies | $1,511,262
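The Table 7 roll-ups are internally consistent: each respondent type's total hours are the sum of its program sub-rows, and each monetized value is total hours multiplied by the applicable hourly rate, rounded to the nearest dollar. A quick verification sketch, using only the figures in the table:

```python
# Table 7 consistency check: program hours roll up to respondent-type
# subtotals, and monetized value = total hours x hourly rate.
gov_hours = 41_970 + 316 + 9_072   # WIA IB/Wagner-Peyser/TAA + NFJP + SCSEP
nonprofit_hours = 7_900 + 2_916    # NFJP + SCSEP

assert gov_hours == 51_358
assert nonprofit_hours == 10_816
assert gov_hours + nonprofit_hours == 62_174        # unduplicated total hours

assert round(gov_hours * 25.73) == 1_321_441        # $1,321,441
assert round(nonprofit_hours * 17.55) == 189_821    # $189,821
assert 1_321_441 + 189_821 == 1_511_262             # unduplicated monetized total
```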
13. Estimated Cost to Respondents
The Agency associates no burden with this collection beyond the value of respondent time.
14. Cost to Federal Government
Federal costs are the staff and contractor costs required to maintain and manage data validation as outlined in Table 8 below. The annual cost of contractor support to provide continual technical support to grantees and states and any needed updates to validation tools for WIA Title IB, Wagner-Peyser, TAA, NFJP and SCSEP will total approximately $775,000 per PY. Costs for ETA staff to manage the data validation program will be $72,164 for continuing operations.
Table 8 - Cost of Data Validation to Federal Government
Continuing Operations (WIA Title IB, Wagner-Peyser, TAA, NFJP, and SCSEP – per year) |
Contractor Support | $775,000
  Maintenance and Upgrades | $500,000
  Technical Assistance | $275,000
ETA Staff Total | $72,164
  1 GS-15 (1/8 time) | $15,624
  1 GS-14 (1/4 time) | $26,566
  1 GS-13 (1/3 time) | $29,974
Total Cost | $847,164
Note: Staff costs are based on Salary Table 2014-DCB (Step 1, including a locality payment of 24.22% for the Washington-Baltimore-Northern Virginia, DC-VA-WV-PA locality pay area); Department of Labor grade ranges are as of January 2014. See http://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2014/DCB.pdf.
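The Table 8 totals roll up in two steps: contractor support is the sum of its two line items, staff cost is the sum of the three prorated positions, and total federal cost is the sum of the two subtotals. A quick check of that arithmetic, using the table's line items:

```python
# Table 8 roll-up: annual federal cost = contractor support + ETA staff time.
contractor = 500_000 + 275_000      # maintenance/upgrades + technical assistance
staff = 15_624 + 26_566 + 29_974    # GS-15 (1/8), GS-14 (1/4), GS-13 (1/3)

assert contractor == 775_000        # contractor support subtotal
assert staff == 72_164              # ETA staff subtotal
assert contractor + staff == 847_164  # total annual federal cost
```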
15. Reasons for Program Change and Change in Burden
There are no program changes from the previously approved submission. The Department has disaggregated burden according to respondent type, in accordance with OMB guidance. Correcting an earlier inadvertent error in the reporting frequency recorded in the reginfo.gov database (annual rather than quarterly) has resulted in an adjustment of 398 responses; the other summary burden information was correct. Because states have only recently begun using the new software, there has not been sufficient time to complete its evaluation by the current expiration date for this data collection (May 31, 2014). ETA expects the software deployment to be completed by the end of 2015 and will report testing results to OMB at that time, per the 2012 OMB Notice of Action approving the software's implementation.
16. Publication Information
ETA publishes the results of data validation in an annual validation report.
17. Reasons for Not Displaying Date OMB Approval Expires
ETA, as part of building its system, will display the OMB approval number and expiration date on the validation reports. Currently, that information is aligned with the approved WIA reports alongside which data validation occurs.
18. Exceptions to Certification
There are no exceptions to the certification statement.