

SUPPORTING STATEMENT FOR REQUEST FOR OMB APPROVAL

UNDER THE PAPERWORK REDUCTION ACT

Employment and Training Data Validation Requirement

OMB Control Number 1205-0448


PART A – JUSTIFICATION




This is a justification for the Department of Labor's request for approval to extend a currently approved data validation requirement for five Employment and Training Administration (ETA) programs. Data validation assesses the accuracy of data collected and reported to ETA on program activities and outcomes. The accuracy and reliability of program reports submitted by states and grantees using federal funds are fundamental elements of good public administration and are necessary tools for maintaining and demonstrating system integrity. The data validation requirement for employment and training programs strengthens the workforce system by ensuring that accurate and reliable information on program activities and outcomes is available. The following programs are subject to the data validation requirement: Workforce Investment Act (WIA) Title IB, Wagner-Peyser Act, Trade Adjustment Assistance (TAA), the National Farmworker Jobs Program (NFJP), and the Senior Community Service Employment Program (SCSEP). All of these programs must conduct both report and data element validation; however, the specific processes and the required data elements that must be validated are program specific and are discussed in the instructions for carrying out data validation for each program. The Indian and Native American Program (INAP) is no longer part of this supporting statement (it was removed during a previous clearance cycle); INAP has integrated data validation software into its electronic system, “Bear Tracks,” which is accounted for under Office of Management and Budget (OMB) control number 1205-0422.


ETA is currently in the process of phasing in the software states use to conduct data validation, per the OMB Notice of Action on this collection in 2012. Prior to Program Year (PY) 2012, ETA provided states with standalone distributed software, the Data Reporting and Validation System (DRVS), which states had to download, install, and update on one or more of their own computers. Data validation results were then uploaded to ETA’s Enterprise Business Support System (EBSS), where they were stored. After completing data validation, states were required to upload separate individual record files to ETA for WIA, TAA, NFJP, and SCSEP reporting purposes covered under separate OMB control numbers; Wagner-Peyser individual records were not previously stored by ETA. As indicated in ETA’s most recent data validation collection renewal in 2012, ETA has begun to update the data validation software. Beginning in PY 2012 for Wagner-Peyser and in PY 2013 for WIA Title IB and TAA, ETA implemented an updated Enterprise Data Reporting and Validation System (EDRVS) that is web-based and consolidates the reporting and data validation processes into one system. While the software states and grantees use for conducting data validation has changed, the methodology (described in Part B) used to draw the samples and produce error rates is unchanged. As a result, the burden estimates (described in Item A.12) associated with this collection renewal request are unchanged. All changes in burden are associated with program reporting, covered under separate OMB control numbers (for example, the WIA Reporting System, OMB control number 1205-0240).

Per the OMB Notice of Action in 2012 approving the implementation of the EDRVS software, ETA was asked to collect and provide state feedback on the data validation components of the software. However, due to delays in the deployment of the software itself and the time lag between the completion of a program year and when states must have completed validating their records, states have just begun to use the software. As a result, there has not been sufficient time to fully utilize EDRVS by the current expiration date for this data collection (May 31, 2014). ETA believes the software deployment can be successfully concluded by the end of 2015 and will at that time report to OMB on testing results, per the OMB Notice of Action in 2012 approving the implementation of the software.


1. Reasons for Data Collection


States and grantees receiving funding under WIA Title IB, the Wagner-Peyser Act, TAA, and the Older Americans Act are required to maintain and report accurate program and financial information (WIA section 185 (29 U.S.C. 2935) and WIA regulations at 20 CFR 667.300(e)(2); Wagner-Peyser Act section 10 (29 U.S.C. 49i); Older Americans Act section 503(f)(3) and (4) (42 U.S.C. 3056a(f)(3) and (4)); and TAA regulations at 20 CFR 617.57). Further, all states and grantees receiving funding from ETA and the Veterans’ Employment and Training Service are required to submit reports or participant records and attest to the accuracy of these reports and records.


The Department’s Office of Inspector General (OIG) conducted an audit of WIA performance data oversight from July 2000 through October 2001. The audit, released in September 2002, found that, “Because of insufficient local, state, and Federal oversight, the Employment and Training Administration (ETA) has little assurance that the state-reported WIA performance outcomes data are either accurate or verifiable.” The OIG recommended that states should validate reported data using rigorous validation methodology. To address the concerns raised by the OIG and to meet the Agency’s goal for accurate and reliable data, ETA implemented a data validation requirement in order to ensure the accuracy of data collected and reported on program activities and outcomes.


ETA has developed a process for validating data submitted by states and grantees. Data validation consists of two parts:


1) Report validation evaluates the validity of aggregate reports submitted to ETA. EDRVS automatically generates the state-level aggregate reports from the state’s certified individual record files and the performance reporting specifications for the quarterly and annual reports. Report validation under EDRVS is therefore implicit: states no longer generate their own reports; EDRVS does so automatically, based on state-certified individual record file submissions.


2) Data element validation serves as a management tool that assesses the accuracy of participant data records. Data element validation is conducted by manually reviewing samples of participant records against their underlying source documentation in order to (1) verify the accuracy of the data contained in the states’ and grantees’ management information systems and (2) confirm compliance with program-specific Federal definitions. The results of data element validation are used to identify areas on which to focus system resources in order to systematically improve program management over time.


This approach addresses the two fundamental sources of reporting errors in ETA program data: data entry errors and inaccurate computation of the required aggregate reports at the state and grantee level. If the data collected are systematically incorrect or data entry errors routinely occur, then the outcomes information will not be accurate even though EDRVS is used to produce the aggregate reports. Data element validation addresses this issue by comparing performance-related data in each state’s participant record file to the original data in the source files and determining an error rate that indicates the degree of accuracy of each data element used in calculating the state’s performance results. In addition, EDRVS uses the state’s individual record file submissions to automatically generate the aggregate quarterly and annual reports. States must certify that the reports accurately reflect their program participants before the error rates are determined for each performance outcome reported by the state.


ETA maintains the software and requires that states use it for program reporting and validation. While the software is updated on a rolling basis to reflect changes in reporting requirements and to fix software bugs, the mechanics of the system with regard to general data collection, the instructions for using the software, and the required data elements remain the same as they were under the previous collection authorization.


WIA Title IB, Wagner-Peyser, TAA, NFJP and SCSEP program staff have been conducting data validation for several years. Program staff received training prior to the implementation of data validation and continue to receive ongoing training and technical assistance from ETA’s data validation contractor throughout the validation process.


Previous experience with data validation has indicated the following:

  • States and grantees are able to conduct data validation with a reasonable, but sustained, level of effort.

  • The validation process allows states and grantees to identify and address reporting errors.

  • The average staff requirement for a state to complete validation for the WIA Title IB, Wagner-Peyser, and TAA programs is about 792 hours per year (less than 1/2 of a staff year). There is no startup burden for these programs because it was incurred upon initial implementation. The average annual time estimate for NFJP and SCSEP grantees to complete validation is approximately 161 hours (roughly 1/13 of a staff year).

  • Changes to the EDRVS software are continuous, resulting from changes in program legislation and required data elements as well as from changes in the EDRVS itself.


On the basis of the significant benefits of data validation along with its minimal burden going forward, ETA seeks to extend the existing data validation requirement for employment and training programs.


Currently, data validation is required annually as follows:

  • Report validation is conducted automatically by EDRVS at every file upload, through the software’s generation of the quarterly and annual reports.

  • Data element validation must be completed within 120 days after required annual reports or participant records are due at ETA. Exact deadlines for the completion of data validation vary by program.


  • States and grantees are required to send data element validation output reports to ETA within 120 days after they submit required annual reports or participant records.


States and grantees use EDRVS to validate the reports and participant records for the programs shown in Table 1 below.


Table 1 – Reports and Participant Record Files Validated


| Program | Report/Records | OMB Approval No. |
|---|---|---|
| Workforce Investment Act Title IB | ETA 9091 (annual report) | 1205-0420 |
| Wagner-Peyser | ETA 9002, VETS 200 | 1205-0240 |
| Trade Adjustment Assistance | TAPR | 1205-0392 |
| National Farmworker Jobs Program | WIASPR | 1205-0425 |
| Senior Community Service Employment Program | ETA 5140 (annual report) | 1205-0040 |


The user handbooks for each program provide a more detailed overview of the validation process. These are made available electronically within the EDRVS by clicking the help link. Training and Employment Guidance Letter No. 28-11 outlines ETA’s most recent data validation policy.


2. Purpose of Information Collection


ETA uses data validation results to evaluate the accuracy of data collected and reported to ETA on program activities and outcomes. This information collection enables ETA to assure its customers, partners, and stakeholders of the validity of the performance data underlying the respective programs. Further, data validation ensures that the performance information used for WIA accountability purposes and to meet Government Performance and Results Act (GPRA) responsibilities is accurate.


Data validation was also developed with the goal of assisting states and grantees in providing more accurate data. Validation allows states and grantees to detect and identify specific problems with their reporting processes, including software and data issues, and to enable them to correct the problems. In addition, the tools developed by ETA help states and grantees analyze the causes of performance successes and failures by displaying participant data organized by performance outcomes. These tools are available at no cost to states and grantees.




3. Technology and Obstacles Affecting Reporting Burden


ETA knows of no technical obstacles to implementing and continuing data validation. ETA has developed new web-based software that states and grantees must use to conduct data validation:


  • Software developed by ETA generates samples, worksheets, and reports on data accuracy. For report validation, the software is used to automatically generate the aggregate reports that states or grantees must certify. For data element validation, the software generates a random sample of the participant records and data elements for the state or grantee to manually validate. The software produces worksheets on which the validator records information after checking the source documentation in the sampled case files. The software then calculates error rates for each data element, with confidence intervals of 3.5 percent for large states/grantees and 4 percent for small states/grantees (an illustrative sketch of this error-rate arithmetic follows this list).


  • User handbooks are provided within the software under the help link and provide detailed information on using the software and completing data element validation. The handbooks also explain the validation methodology, including sampling specifications and data element validation instructions for each data element to be validated. The current handbooks will undergo an iterative revision process as the software is deployed and as states begin to utilize all of the various components.
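For illustration only, the sketch below shows the general arithmetic behind the data element error rates and confidence intervals referenced in the first bullet above. It is not the EDRVS implementation; all function names, inputs, and figures are hypothetical.

```python
# Illustrative sketch only -- not ETA's EDRVS code. It shows the general
# arithmetic behind a data element error rate and a simple normal-
# approximation confidence half-width. All names and figures are hypothetical.
import math


def element_error_rate(failed_flags):
    """failed_flags: one boolean per sampled record, True when the record
    failed validation for a given data element against source documentation."""
    return sum(failed_flags) / len(failed_flags)


def half_width(p, n, z=1.96):
    """Approximate 95 percent confidence half-width for an error rate p
    estimated from a simple random sample of size n (finite-population
    corrections, which the actual methodology may apply, are ignored here)."""
    return z * math.sqrt(p * (1 - p) / n)


# Hypothetical example: 12 of 400 sampled records fail for one data element.
flags = [True] * 12 + [False] * 388
p = element_error_rate(flags)  # 0.03, i.e., a 3 percent error rate
print(f"error rate {p:.1%} +/- {half_width(p, len(flags)):.1%}")
```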


Currently, all states and grantees use the software provided by ETA to conduct validation for WIA Title IB, Wagner-Peyser, TAA, NFJP, and SCSEP. States and grantees can obtain technical assistance on validation procedures and the use of the validation tools from ETA’s data validation contractor.


The software can also be used to generate the aggregate information required in reports submitted to ETA. States and grantees that use the software provided by ETA to generate this aggregate information are not required to conduct report validation. However, states still must demonstrate that they used the validation software to calculate their aggregate reports.


For both report validation and data element validation, the ETA software uses the validation data provided by the states or grantees to produce validation summary reports which, in compliance with the Government Paperwork Elimination Act, are submitted via the system now used for electronic transmission of reports to ETA.


4. Duplication


The data validation requirement does not duplicate any existing ETA program.


5. Burden on Small Business or Other Small Entities


While data validation is conducted mostly by state governments and large, private, non-profit organizations, some small entities are required to conduct validation. Some of the grantees operating NFJP and SCSEP are small, private, non-profit organizations providing services to a small number of individuals. However, because of the low burden estimates associated with data validation for these programs, this information collection does not significantly impact these small entities.


The data element validation process allows states and grantees to randomly select validation samples from the complete data file, in order to compute statistically reliable error rates, rather than requiring the validation of every participant case file. To reduce the relative burden on smaller states and grantees as much as possible, the sample size for smaller entities is smaller than for larger grantees and states; this leads to the slightly wider confidence intervals of 4 percent for small states compared to 3.5 percent for large states.
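The precise sampling specifications appear in Part B and the program handbooks; the following is only a simplified illustration, under a simple random sampling assumption with no finite-population correction, of why a modestly wider interval reduces the required sample size. The half-width $h$ of an approximate 95 percent confidence interval for an error rate $p$ implies a required sample size of

$$n = \frac{z^2\, p(1-p)}{h^2},$$

so at the conservative value $p = 0.5$ with $z = 1.96$, a half-width of $h = 0.035$ requires roughly $n \approx 784$ records, while $h = 0.04$ requires only $n \approx 600$.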


6. Consequences of Failure to Collect Data


As mentioned in Item A.1, concerns were raised in the past about monitoring and about the inability to consistently assure the validity of performance outcomes reported by states and grantees. ETA regional staff continue to conduct data quality reviews based on current data validation efforts to determine if states are in compliance with data validation guidelines. The proposed continuation of the data validation requirement will allow ETA to continue to address these issues. If data validation is discontinued, ETA will not be able to ensure that critical data used for performance reports, for accountability purposes, to meet GPRA responsibilities, and for other management purposes are reliable.


7. Special Circumstances Involved in Collection of Data Validation Information


This request is consistent with 5 CFR 1320.5.


8. Pre-Clearance Notice and Responses


A pre-clearance notice soliciting sixty days of public comment was published in the Federal Register on December 24, 2013 (78 FR 77718). The comments received are summarized below and paired with ETA’s responses.


SCSEP Grantee Comments/Issues

  1. The eligibility element, “employed prior to participation,” could be eliminated, since only unemployed individuals are eligible to participate in the SCSEP program. Furthermore, it is unclear what the element means because it does not specify any time frame. Clearly, someone who is unemployed would have been employed at some point prior to applying for the services.


  2. The commenter states that the element “reason for approved break in participation” does not seem to be needed if the reason for exiting the program is not unsubsidized employment, the other reason for exiting. With the exception of documenting the exclusion, there is no reason to validate a termination other than unsubsidized employment when the performance element is unsubsidized employment. Therefore, the commenter believes this section could be eliminated.


  3. The commenter recommends that a maximum sample size of 10 percent would be adequate for purposes of conducting a data validation study. This would reduce the burden on state programs as well as smaller and medium-size grantees.


  4. Felony/Ex-offender – The commenter believes this element should be added to the list of elements. While this may be an issue for reauthorization, it should be included in the hard-to-serve category. If this omission could be remedied through an update to the data validation process, the commenter urges the Department of Labor to add this element.

ETA Responses

  1. Data validation is a critical process for monitoring data collection, reporting, and performance measurement. DV seeks to validate the most critical data elements to ensure consistent compliance with federal definitions, legislation, and program outcome measures.


Employment status at participation is a critical data element. This data element is used to record an individual’s employment status at the point of program participation. This element is critical to program eligibility as well as the entered employment outcome measure.


  2. Approved breaks in participation and exit exclusions are important for justifying exclusion from the outcome measures. A 90-day gap in services results in program exit and, hence, inclusion in program outcome measures. If a gap in service receipt in excess of 90 days is an approved exclusion, the individual is not included in program outcomes. This element is validated to ensure that those exclusions were appropriate.


  3. Sample sizes are determined according to the statistical methods underlying the data validation procedures. Furthermore, in several states, 10 percent samples would be significantly larger than those drawn under the existing method. There is also a tradeoff between decreased sample size and the level of precision associated with the error rate estimates: decreasing the sample sizes would generally increase the variance of the error rate estimates. ETA continues to explore several methods to reduce the sample sizes for data validation while remaining cognizant of concerns regarding precision and travel cost minimization. As yet, ETA has not been able to reduce sample sizes without decreasing precision below the established threshold or increasing the travel costs associated with validation of the documentation.


  4. While ex-offender status is an important data element for program management, it does not affect eligibility or the program outcome measures. For these reasons, this data element is not included in the data element validation process.




NFJP Comments/Issues

  1. One grantee commented that their current data validation process is smooth. They implemented data validation at the beginning, at the intake level: once field staff submit the eligibility documentation to the central office, the information is reviewed and validated and the participant is enrolled in the NFJP program.

One of the burdens they encounter is obtaining wages for participants placed during the follow-up period. One suggestion they provided is to have access to the wage records in each state in which they operate the NFJP program. They could establish a Memorandum of Understanding with the state and retrieve the wages for their participants during the placement follow-up period, the way the main WIA program does.

ETA Response

  1. While ETA understands the efficiency of obtaining wage records through the Wage Record Interchange System (WRIS), the establishment of such a process is well beyond the scope of the data validation collection.




WIA/TAA/Wagner-Peyser Comments/Issues

  1. One state commented that its only experience with using the new data collection software for data validation was with the TAA program, which was recently completed. With this limited experience in mind, it is the commenter’s opinion that the proposed data collection burden would be at least as great as or greater than under the previous methodology. Under the new system, blank fields are to be completed by researching participant files for specific data that have already been reported. In some cases the data validation monitor could have several data values to choose from to enter in a particular field; e.g., Date Entered Training might have dates in case notes, school transcripts, certificates, or attendance reports. It is much easier to find an already reported date in one of these documents than to guess which date was reported in the database. Also, since UI wage data would have to be looked up and entered, it will take considerably longer to look up eleven data items in the UI wage databank. If the number of data elements to be verified remains the same as in the past, this new process will take longer; how much longer depends on several variables: experience of the reviewers, internet speed, size of sample, program knowledge, etc. One suggestion would be to eliminate verifying UI wage data from data validation because it is always 99.99 percent accurate in this state. Three additional comments were received on the issue of having to enter the data element validation values into the EDRVS system for TAA and Wagner-Peyser.


  2. Data validation is perceived as a laborious, monotonous task that is only somewhat helpful, and states cannot always be sure of how well they are doing relative to other states. Three states commented that they have nothing with which to compare their results (other than the previous year’s performance) and receive no feedback from USDOL. Is there an acceptable error rate? What elements have the highest failure rate? Is the clarity or understanding of data elements consistently applied throughout the state, let alone by 179 respondents nationwide? How consistent is the data validation process among all the various respondents? How independent are the data reviewers from the program operators? How is this information used to improve program effectiveness? These are a few of the questions that are relevant to issues of quality, utility, and clarity of the data. The answers to these questions are unique to each respondent and will probably vary from state to state. At a minimum, one commenting state provides annual feedback to each individual region on the results of its data validation review, conducts exit conferences to present preliminary results, and updates policy guidance letters annually to provide instruction and policy on how to successfully pass data validation elements.


  3. Through this year (PY 2012), the data validation process has been a manual process. The move to an internet-based system will eliminate some of the copying and collating that currently exists, and it will no longer be necessary to do the end-of-project data entry, which currently takes up to two days to complete. Other than this, it does not appear that the new system will take any less time to complete. One area that could reduce the burden would be a smaller sample size. Over the years, the state’s sample has ranged from 1,100 to 1,300 records, with gradually improving results. Also, for some data elements, the error rate is consistently low, e.g., Date of Birth. If a state continues to show improvement in its overall error rate, or demonstrates a consistently low error rate for individual data elements, then the sample size should be reduced and some data elements eliminated entirely. This would be a step in the right direction toward minimizing the burden of data validation.


  4. The new WIASRD in particular has 57 new elements and 17 new dates, with edit checks and new rejects surrounding items that previously were yes/no values. This has resulted in more work for states, both in programming and in the changes and challenges of date edit checks, research, analysis, and correction needed to complete submissions. It will add even greater difficulty to the Data Element Validation (DEV) system, file review, and submission.


  5. One state commented that its experience with the new EDRVS (Wagner-Peyser and TAA programs) has proved more time consuming. This state estimates the process for Wagner-Peyser took four times longer due to the requirement that the state enter the data validation values manually rather than being shown the value and checking pass or fail with regard to whether it matches the paper trail. The state reports that validating the 25 Wagner-Peyser records took 40 hours of staff time (10 hours each for 4 staff). The state stated that, with the WIA DEV sampling being 25 times the sample size for Wagner-Peyser, it estimates the WIA process will take 1,000 hours. The hands-on file review is required on location at multiple providers, and the state reports that it took 352 hours in the past under the old DRVS. Two other related comments suggested that the flexibility to conduct data element validation from a centralized location would reduce burden.


With regard to enhancing the quality, utility, and clarity of the information collected, DOL would have to advise the states how this new process enhances the quality, clarity, and utility of collected information. This state found the new EDRVS and the new DOL DEV process to be a much greater burden than the previous software and processes. Hopefully the new software will be better maintained, which should result in fewer technical issues than its predecessor. The state feels the extra time required for entering code values and review time outweighs the utility needs of the Department of Labor, particularly if the sample size is not going to be reduced in the new EDRVS system. Should the Department of Labor consider reducing the sample sizes for WIA DEV, the burden for the states could be reduced. Two additional comments stated that the sample size should be reduced.


  6. One state commented that each element currently requiring data collection has been reviewed and is believed to be directly related to performance, eligibility, and/or outcomes for WIA. For TAA, this state commented that the source documentation options are unclear, particularly the Petition Number and the Waiver from Training requirement. This state estimated that the staff time required for data validation exceeded the estimated average time per response given in the Federal Register notice for data validation (average time per response = 218 hours); it estimates that it takes 792 hours to conduct WIA data validation each year. Additional burden costs for respondents total thousands of dollars due to staff travel required to conduct on-site reviews and technical assistance training. However, the state reported no issues with the TAA burden estimates.


Another state commented that it did not concur with DOL’s statement that states and grantees are able to conduct data validation annually with a reasonable but sustained level of effort. The demands that are placed on staff to conduct the data element validation activities are not reasonable; the exercise requires effort and resources that exceed the value of the benefits. The state is enduring budget cuts for DOL-funded programs while travel expenses are increasing. For the most recent Title I data element validation process (PY 2012), the estimated cost, including staff time, travel, and other expenses, was $58,000. The state suggested that DOL consider making data validation a biennial rather than annual activity.


  7. Three states commented that requiring identical date matches for things like training and exit dates is not necessary and does not have practical utility. For example, requiring identical date matches for dates of training does not relate to program performance. In most cases, it matters little, if at all, if the dates are only a day or two off.


  8. One state commented that each year the software is released in numerous versions and patches. This development and repair cycle is difficult to contend with due to the state’s own information technology security requirements and policies. This state also commented that the data element validation worksheets do not contain identifying information on every page when printed, and that for TAA the worksheets do not contain enough information to easily identify the individuals in their system; the state suggested that ETA include participant names on the worksheets for TAA. This state also commented that the software does not provide the state with the ability to produce local-level or custom reports. One other state also commented that local area reports should be made available.


ETA Responses

  1. The first program built into the new data validation software (EDRVS) was Wagner-Peyser. This portal required the state to manually enter the values into the software. These values were then compared to those in the data uploaded by states or grantees when conducting data validation. The thinking was that by having states enter the values, the system would provide a more robust check on the underlying data, in that it required the validators to actively report the data rather than passively confirm that what was reported was correct. Since Wagner-Peyser validation is based on only 25 records, this process was not considered time consuming. ETA and the TAA program office believe that the methodology of entering values is a necessary method for validating whether the values entered by the state in reporting are valid. States should develop a consensus on what date (among the available options) is entered for fields such as “Date Entered Training.” Full data entry thereby not only provides data element validation, but also provides states with an opportunity to validate whether the original date was the correct one to report among the available options.


ETA is considering alternative methods to verify UI wage record data. One possibility is modifying the methodology to verify the accuracy of the wage database rather than each individual wage record. However, ETA has not identified how that process could be integrated into the sampling procedures and error rate computations. Since the wage records are the most critical variables for computing performance outcomes, this modification must be done with careful attention. While respondents may consistently produce accurate data, ETA must ensure that every state and grantee reports accurate wage information in a consistent manner.


  2. Data validation is essential for ensuring consistent and properly calculated performance outcomes. Report validation (now done automatically under EDRVS) is necessary for ensuring the performance outcomes are calculated in a consistent and accurate manner across the states and grantees. Data element validation (the focus of this comment) is necessary for ensuring that the underlying data are accurate and that they are consistent with program eligibility requirements and outcome calculations.

Data element validation is meant to be a management tool for respondents: it informs them of the underlying problems in their data and provides information they can use to target and prioritize issues with their systems, databases, and documentation processes. There is no error rate threshold for data element validation. Each state or grantee generates its own DEV results and uploads them to ETA through EDRVS. The results should be used to identify reporting issues within the state or grantee. The results are also used by the Regional Offices for their data validation reviews; problems are identified and documented through these audits. The DEV process helps improve program effectiveness indirectly by ensuring that program data are consistent and accurate across the states and grantees. Consistent and accurate program reporting is necessary for effective program management and evaluation.

States and grantees receive the data validation results directly from the system. States then certify those results, which makes them available to ETA. As a result, states and grantees are fully aware of the DEV results and, in fact, are the ones who report them to ETA. The results include overall and reported error rates for each individual element. ETA does not provide states and grantees with the results from other areas but addresses the problems within each state through the Regional Offices and the audit processes. Data validation is applied consistently, using the same methodology for each program.


  3. The new data validation software (EDRVS) reduces burden in a number of ways. First, quarterly program reporting and the data validation processes have been integrated: under EDRVS, states upload one individual record file rather than uploading multiple files to multiple data systems. Second, the system automatically generates the quarterly and annual reports for the state, freeing the state from that significant burden. Third, states are no longer required to maintain the software in a desktop environment in their offices. Using the previous software, states were responsible for downloading and installing updates on an annual basis, in addition to maintaining a desktop environment capable of running the software.


While there have been several improvements, the data element validation procedure remains essentially the same, because the process continues to rely on drawing a sample and verifying that the underlying information matches what was submitted in the state’s individual record file submission. There is little that the new software could do to reduce the burden associated with that process. ETA is actively working to reduce the sample sizes in a manner that would not compromise precision in the error rate estimates.


  4. The new elements and updated edit checks were approved in May 2013 under OMB Control Number 1205-0420; the burden associated with those modifications was accounted for in that collection. In the past, states had to upload a separate dataset to the old DRVS for purposes of conducting data validation; the old DRVS was a separate system that was not integrated with the WIA reporting system. The current WIASRD and data validation system (EDRVS) have been integrated so that states and grantees only have to upload the file once each quarter. Data validation no longer requires its own upload and edit check processes. This integration has resulted in less upload and edit check burden placed on states.


  5. The WIA DEV process using EDRVS will not require that states manually enter the data validation values. The EDRVS for WIA will use the same process as the old DRVS: it will generate those values automatically and only require the state to check pass or fail, depending on whether or not those values match their paper files. In addition, no additional data elements will be required for validation. Due to the consolidation of the WIASRD upload and data validation procedures and the automation of the report validation process, the increases and decreases in burden will cancel out. Per Item 12.A of this supporting statement, ETA estimates an annual burden of 402 hours for small states conducting WIA, TAA, and Wagner-Peyser data validation; for medium and large states, the estimates are 746 and 1,206 burden hours, respectively. The state’s estimate of 352 hours for conducting data validation under the old system, plus the additional time for entering the values manually into EDRVS for TAA and Wagner-Peyser, is still well below ETA’s burden estimates. Lastly, the file reviews for data validation are not required on location. It is perfectly allowable for states to minimize travel expenditures by securely transferring (physically or electronically) the paper documentation to a central location for review. In fact, a couple of states are currently working with the ETA Regional Offices to conduct data validation and data validation reviews entirely remotely via electronic means. States and grantees may contact their ETA Regional Offices for additional information.


  6. ETA will work to clarify the source documentation requirements for TAA, particularly the Petition Number and Training Waiver requirements. The state estimated that it took 792 hours to conduct data validation for WIA and that the estimate for TAA data validation was correct. The state interpreted the average time per response of 218 hours to mean that ETA estimates it will take a state 218 hours to conduct data validation. In fact, a state submits 5 responses for TAA, Wagner-Peyser, and WIA (WIA Adult, Dislocated Worker, and Youth). The 218 hours cited by the state is an average time per response across all of the responses, including SCSEP and NFJP, and abstracting from the differences in the sizes of states. ETA estimates that large states (this state is one of the largest in terms of number of program participants and exiters) will require approximately 1,206 hours and $49,217 to conduct data validation for these 5 programs. The burden hours and costs estimated by both states are in line with ETA’s burden estimates contained in Item 12.A. Again, states are encouraged to conduct data validation from a central location. ETA will consider the option of conducting data validation biennially as part of a possible future ICR; however, the tradeoff associated with biennial reviews is that data problems would endure for twice as long.


  7. Exact date matching is difficult. In general, the point of data validation is to ensure that the individual record data on participants are accurate and that the performance outcomes are calculated consistently. If a training date does not match what is reported in the individual record submission, it is technically incorrect. In the vast majority of cases, a difference of a couple of days would not affect performance or eligibility; however, it certainly could, and the point of the process is to highlight cases where the data do not match. Since data element validation is a management tool, it is designed to identify all discrepancies.


  8. Prior to EDRVS, the old DRVS required states and grantees to download the DRVS software and install it on a local machine. As a result, annual installation patches and updates had to be incorporated every year in every state and grantee. This was a considerable obstacle and one that caused ETA to expend considerable resources on technical assistance and training. Migration to EDRVS has reduced this burden by moving the software to an online environment, where EDRVS can be updated once from a central location, freeing states and grantees from the burden of having to maintain the software in house. Substantial upgrades to EDRVS that make it easier to identify the records being validated have already been put into production. While, for security purposes, ETA does not use participant names or other personally identifiable information, the identification of records for state management purposes has been made much more manageable in the most recent versions of the software.


Local area reporting capability has not yet been automated within the EDRVS; these reports will be built into the system in future versions. In the meantime, ETA has implemented a workaround whereby the local area reports are produced and provided in Excel format. ETA will continue to provide local area reports in this manner until they are generated automatically by EDRVS.


9. Payments to Respondents


This information collection does not involve direct payments to respondents. ETA does provide administrative funding to the participating states and grantees, which are listed as the respondents for purposes of the Paperwork Reduction Act. The requirement to perform data validation derives from states’ and grantees’ responsibility to provide accurate information on program activities and outcomes to ETA. States and grantees are expected to provide resources from their administrative funds for the data validation effort. Validation of program performance is a basic responsibility of grantees, which are required to report program performance under Department of Labor regulations (29 CFR 95.51 and 97.40).


10. Confidentiality


Participant record layouts used in data validation for the WIA Title IB, Wagner-Peyser, TAA, NFJP, and SCSEP programs use state-assigned individual identifiers rather than Social Security Numbers (SSNs). This means that, after the wage information is reported, the SSNs needed to obtain wage record information are removed from the data files and replaced with state-specific unique identifiers before the files are uploaded to EDRVS. However, data element validation necessarily involves the state accessing the underlying wage record information by SSN in order to verify the accuracy of wage information contained in the participant records submitted to ETA. The records that end up in the data element validation sample must therefore include an identifier that states can use to link back to the actual wage data reported in the individual record file, because validation works by comparing the information in the case file (or, in this case, the wage record file) against what was reported in the individual record file. To do this, the state uses the unique identifier associated with a particular record to identify the relevant case file, and the case file information is then used to link to the wage information for purposes of data element validation.


To protect the privacy of program participants, the validation software never receives an SSN, and it includes user functionality that allows program administrators to limit access to this information based on administrative clearance. The program administrator is the only person with access to the password required to use the software, as one user name and password is issued to each state. No other means of access to these data is permitted. Privacy is not an issue with report validation because that aspect of data validation simply involves verifying the accuracy of aggregate reports submitted to ETA, which contain no private information.


11. Questions of a Sensitive Nature


The data collection includes no questions of a sensitive nature.


12. Respondent Annual Burden


Data validation is estimated to require an annual burden of 62,174 hours for all five programs subject to this validation requirement.


Burden estimates for state programs – WIA Title IB, Wagner-Peyser, and TAA – are outlined in Item 12.A. Data validation is estimated to require a total annual burden of 41,970 hours, with an equivalent value of $1,079,888, for all state programs. Burden estimates for grantee programs – NFJP and SCSEP – are outlined in Item 12.B. Data validation is estimated to require a total annual burden of 20,204 hours for these programs, with an equivalent value of $354,580 when hours are valued at the rate for private non-profit grantees and Federally-recognized tribes, or $519,849 when valued at the rate for state, county, and U.S. territory government grantees.


A. State Programs: WIA Title IB, Wagner-Peyser, and TAA


Table 2 provides an overview of the annual burden for the WIA Title IB, Wagner-Peyser, and TAA programs, including average hours and costs across states for all three programs. The estimated annual hours needed to conduct validation for these programs are 792 hours (rounded) on average per state and 41,970 hours for all states. The estimated annual monetized value of performing validation is $20,378 on average per state and $1,079,888 for all states.


Table 2 - Calculation of Combined Annual Burden for WIA Title IB, Wagner-Peyser, and TAA Programs


| State Size | No. of States (Respondents) | Reports per Year per State | Total Annual Reports | Hours per Report | Hours per State | Total Hours | Rate in $/hr | Monetized Value |
|---|---|---|---|---|---|---|---|---|
| Large State | 18 | 4 | 72 | 301.5 | 1,206 | 21,708 | $25.73 | $558,547 |
| Medium State | 18 | 4 | 72 | 186.5 | 746 | 13,428 | $25.73 | $345,502 |
| Small State | 17 | 4 | 68 | 100.5 | 402 | 6,834 | $25.73 | $175,839 |
| All States Total | 53 | 4 | 212 | Varies | Varies | 41,970 | $25.73 | $1,079,888 |
| Average per State | 1 | 4 | 4 | 198 (rounded) | 792 (rounded) | 792 (rounded) | $25.73 | $20,378 |


  • The calculation of the hours required to conduct validation reflects the sample size, the time for validators to review sampled case files (34 minutes per file), the travel time to local offices to review the files, and 15 percent of a supervisor’s time. (A worked example of the table’s arithmetic follows these notes.)

  • States have been divided into three categories – large, medium, and small – based on the number of participants that exit a state’s program in a year. The size of the state impacts the number of sampled case files that must be reviewed and the travel time to local offices.

  • The annual travel time per office is estimated as 8 hours for large states, 6 hours for medium states, and 3 hours for small states. This estimate is based on the assumption that states will conduct data element validation separately for the WIA Title IB and TAA programs. If states conduct data element validation for both programs at the same time, the travel time required to perform validation will decrease.

  • The hourly rate is the estimated average hourly earnings in the “administration of economic programs” industry, North American Industry Classification System (NAICS) code 926110 (Calendar Year (CY) 2013, Quarterly Census of Employment and Wages, Bureau of Labor Statistics, http://data.bls.gov/pdq/querytool.jsp?survey=en).
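As a worked example of the table’s arithmetic (referenced in the first note above), the large-state row combines as follows:

$$301.5 \text{ hours/report} \times 4 \text{ reports} = 1{,}206 \text{ hours per state};$$

$$18 \text{ states} \times 1{,}206 \text{ hours} = 21{,}708 \text{ hours}; \qquad 21{,}708 \times \$25.73 \approx \$558{,}547.$$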



B. Grantee Programs: NFJP and SCSEP


Table 3 provides an overview of the annual burden for NFJP and SCSEP, including average hours and costs across grantees in these programs. The estimated annual hours needed to conduct validation for these programs are 161 hours (rounded) on average per grantee and 20,204 hours for all grantees. The estimated monetary equivalent of the burden hours is $2,814 (for private non-profit grantees and Federally-recognized tribes) or $4,126 (for state, county, and U.S. territory government grantees) on average per grantee, and $354,580 or $519,849, respectively, for all grantees.


Table 3 – Summary Calculation of Annual Burden for NFJP and SCSEP Grantees


| Grant Program | No. of Grantees (Respondents) | Reports per Year per Grantee | Total Annual Reports | Hours per Report | Hours per Grantee | Total Hours | Rate in $/hr | Monetized Value |
|---|---|---|---|---|---|---|---|---|
| NFJP | 52 | 4 | 208 | 39.5 | 158 | 8,216 | $17.55/$25.73 | $144,191/$211,398 |
| SCSEP | 74 | 4 | 296 | 40.5 | 162 | 11,988 | $17.55/$25.73 | $210,389/$308,451 |
| All Grantees Total | 126 | 4 | 504 | Varies | Varies | 20,204 | $17.55/$25.73 | $354,580/$519,849 |
| Average per Grantee | NA | 4 | 4 | 40 (rounded) | 160.35 | 160.35 | $17.55/$25.73 | $2,814/$4,126 |


  • The calculation of the hours required to conduct validation includes the time for validators to review sampled case files (40 minutes per file) and 15 percent of a supervisor’s time. (Travel is not required for grantees to conduct validation).

  • The hourly rate used to calculate cost depends upon the type of organization receiving the grant. For private non-profit grantees and Federally-recognized tribes, the hourly rate is the average hourly earnings in the civic and social organizations industry, Current Employment Statistics (CES) code 80313400 (March 2014, CES survey, Bureau of Labor Statistics, http://data.bls.gov/PDQ/outside.jsp?survey=ce). For state, county, and U.S. territory government grantees, the hourly rate is the estimated average hourly earnings for employees in the administration of economic programs industry, NAICS code 926110 (CY 2013, Quarterly Census of Employment and Wages, Bureau of Labor Statistics, http://data.bls.gov/pdq/querytool.jsp?survey=en). (A worked example of the table’s arithmetic follows these notes.)
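As a worked example of the grantee tables’ arithmetic (referenced in the final note above), the SCSEP government-grantee figures shown in Table 4 below combine as follows:

$$40.5 \text{ hours/report} \times 4 \text{ reports} = 162 \text{ hours per grantee};$$

$$56 \text{ grantees} \times 162 \text{ hours} = 9{,}072 \text{ hours}; \qquad 9{,}072 \times \$25.73 \approx \$233{,}423.$$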


Tables 4 through 6 provide a more detailed account of the annual burden estimates for each grantee program.







Table 4 – Disaggregated Summary Calculation of Annual Burden for NFJP and SCSEP Grantees


| Grant Program | No. of Grantees (Respondents) | Reports per Year per Grantee | Total Annual Reports | Hours per Report | Hours per Grantee | Total Hours | Rate in $/hr | Monetized Value |
|---|---|---|---|---|---|---|---|---|
| NFJP Total | 52 | 4 | 208 | 39.5 | 158 | 8,216 | $17.55/$25.73 | $144,191/$211,398 |
| NFJP: Private Sector | 50 | 4 | 200 | 39.5 | 158 | 7,900 | $17.55 | $138,645 |
| NFJP: State, Local, or Tribal Government | 2 | 4 | 8 | 39.5 | 158 | 316 | $25.73 | $8,131 |
| SCSEP Total | 74 | 4 | 296 | 40.5 | 162 | 11,988 | $17.55/$25.73 | $210,389/$308,451 |
| SCSEP: Private Sector | 18 | 4 | 72 | 40.5 | 162 | 2,916 | $17.55 | $51,175 |
| SCSEP: State, Local, or Tribal Government | 56 | 4 | 224 | 40.5 | 162 | 9,072 | $25.73 | $233,423 |
| All Grantees Total | 126 | 4 | 504 | NA | NA | 20,204 | $17.55/$25.73 | $354,580/$519,849 |


Table 5 - Calculation of Annual Burden for NFJP


| Type of Grantee | No. of Grantees | Hours | Rate in $/hr | Monetized Value |
|---|---|---|---|---|
| Private Sector (Non-Profits) | 50 | 158 (per grantee) | $17.55 | $2,773 (per grantee) |
| State, Local, or Tribal Government | 2 | 158 (per grantee) | $25.73 | $4,065 (per grantee) |
| All Grantees | 52 | 8,216 | -- | $132,660 |
| Avg. per Grantee | -- | 158 | -- | $2,551 |


Note: The hourly rate used to calculate cost depends upon the type of organization receiving the grant. For private non-profit grantees and Federally-recognized tribes, the hourly rate is the average hourly earnings in the civic and social organizations industry, CES code 80313400 (March 2014, CES survey, Bureau of Labor Statistics, http://data.bls.gov/PDQ/outside.jsp?survey=ce). For state, county, and U.S. territory government grantees, the hourly rate is the estimated average hourly earnings for employees in the administration of economic programs industry, NAICS code 926110 (CY 2013, Quarterly Census of Employment and Wages, Bureau of Labor Statistics, http://data.bls.gov/pdq/querytool.jsp?survey=en).








Table 6 - Calculation of Annual Burden for SCSEP


| Type of Grantee | No. of Grantees | Hours | Rate in $/hr | Cost |
|---|---|---|---|---|
| Private Sector (Non-Profits) | 18 | 162 (per grantee) | $17.55 | $2,843 (per grantee) |
| State, Local, or Tribal Government | 56 | 162 (per grantee) | $25.73 | $4,168 (per grantee) |
| All Grantees | 74 | 11,988 | -- | $414,435 |
| Avg. per Grantee | -- | 162 | -- | $5,600 |


Note: For private non-profit grantees and Federally-recognized tribes, the hourly rate is the average hourly earnings in the civic and social organizations industry, CES code 80313400 (March 2014, CES survey, Bureau of Labor Statistics, http://data.bls.gov/PDQ/outside.jsp?survey=ce). For state, county, and U.S. territory government grantees, the hourly rate is the estimated average hourly earnings for employees in the administration of economic programs industry, NAICS code 926110 (CY 2013, Quarterly Census of Employment and Wages, Bureau of Labor Statistics, http://data.bls.gov/pdq/querytool.jsp?survey=en).


Table 7 provides the total burden for this information collection, broken out by type of respondent.


Table 7 – Total Annual Burden for All Programs by Respondent Type


| Respondent Type and Grant Program | No. of Grantees (Respondents) | Reports per Year per Grantee | Total Annual Reports | Hours per Report | Hours per Grantee | Total Hours | Rate in $/hr | Monetized Value |
|---|---|---|---|---|---|---|---|---|
| State, Local, and Tribal Governments | 56 | 4 | 444 | Varies | Varies | 51,358 | $25.73 | $1,321,441 |
| WIA Title IB, Wagner-Peyser, and TAA | 53 | 4 | 212 | Varies | Varies | 41,970 | $25.73 | $1,079,888 |
| NFJP | 2 | 4 | 8 | 39.5 | 158 | 316 | $25.73 | $8,131 |
| SCSEP | 56 | 4 | 224 | 40.5 | 162 | 9,072 | $25.73 | $233,423 |
| Private Sector (Non-Profits) | 68 | 4 | 272 | Varies | Varies | 10,816 | $17.55 | $189,821 |
| NFJP | 50 | 4 | 200 | 39.5 | 158 | 7,900 | $17.55 | $138,645 |
| SCSEP | 18 | 4 | 72 | 40.5 | 162 | 2,916 | $17.55 | $51,176 |
| Unduplicated Totals | 121 | 4 | 716 | Varies | Varies | 62,174 | Varies | $1,511,262 |


13. Estimated Cost to Respondents


The Agency associates no cost with this collection beyond the value of respondent time.





14. Cost to Federal Government


Federal costs are the staff and contractor costs required to maintain and manage data validation as outlined in Table 8 below. The annual cost of contractor support to provide continual technical support to grantees and states and any needed updates to validation tools for WIA Title IB, Wagner-Peyser, TAA, NFJP and SCSEP will total approximately $775,000 per PY. Costs for ETA staff to manage the data validation program will be $72,164 for continuing operations.


Table 8 – Cost of Data Validation to Federal Government


| Category | Continuing Operations (WIA Title IB, Wagner-Peyser, TAA, NFJP, and SCSEP; per year) |
|---|---|
| Contractor Support | $775,000 |
| Maintenance and Upgrades | $500,000 |
| Technical Assistance | $275,000 |
| ETA Staff Total | $72,164 |
| 1 GS-15 (1/8 time) | $15,624 |
| 1 GS-14 (1/4 time) | $26,566 |
| 1 GS-13 (1/3 time) | $29,974 |
| Total Cost | $847,164 |


Note: Staff costs are based on Salary Table 2014-DCB (Step 1, and a locality payment of 24.22% for the locality pay area of Washington-Baltimore-Northern Virginia, DC-VA-WV-PA), Department of Labor grade ranges are as of January 2014. See http://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2014/DCB.pdf.


15. Reasons for Program Change and Change in Burden


There are no program changes from the previously approved submission. The Department has disaggregated burden according to respondent type, in accordance with OMB guidance. Correcting an earlier inadvertent error in the reporting frequency recorded in the reginfo.gov database (annual rather than quarterly) has resulted in an adjustment of 398 responses; other summary burden information was correct. Note that states have just begun to use the software, so there has not been sufficient time to conclude evaluating it by the current expiration date for this data collection (May 31, 2014). ETA believes the software deployment can be successfully concluded by the end of 2015 and will at that time report to OMB on testing results, per the OMB Notice of Action in 2012 approving the implementation of the software.


16. Publication Information


ETA publishes the results of data validation in an annual validation report.

17. Reasons for Not Displaying Date OMB Approval Expires


ETA, as part of building its system, will display OMB approval and expiration information on the validation reports. Currently that information is aligned with the approved WIA reports with which the data validation occurs.


18. Exceptions to Certification


There are no exceptions to the certification statement.
