
Unemployment Insurance Data Validation Program (DV)

OMB Control No. 1205-0431

April 2016


SUPPORTING STATEMENT

Unemployment Insurance Data Validation (DV)

OMB Control No. 1205-0431


A. JUSTIFICATION


This is a justification for the Department of Labor's request for approval to extend, with revisions, the Unemployment Insurance Data Validation (UI DV) program. The DV program assesses the accuracy of the reports data that States submit to the Department monthly, quarterly, or annually pursuant to the Department's authority under Section 303(a)(6) of the Social Security Act. UI DV is operated pursuant to authority granted under OMB Control No. 1205-0431; the collection authority expires on May 31, 2016.


The UI DV program validates 332 report elements from 114 UI required reports to ensure that these key data, used for performance management purposes, are reliable. Appendix A-1 contains the relevant sections of the Social Security Act on which the reporting and validation activity is based.


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

Section 303(a)(6) of the Social Security Act specifies that the Secretary of Labor will not certify State unemployment compensation programs to receive administrative grants unless the State’s law includes provision for: “...making of such reports...as the Secretary of Labor may from time to time require, and compliance with such provisions as the Secretary may from time to time find necessary to assure the correctness and verification of such reports.”


In the 1970s the Department discovered that the UI required reports data it was using to determine States’ administrative budget allocations were often reported inaccurately. In response, it developed the Workload Validation (WV) program to assess the accuracy of eleven key workload items used to allocate funds for UI administration among States. Under the WV program, States were required to validate 29 elements from four UI required reports. This validation activity was conducted using the instructions in the UI Workload Validation Handbook No. 361 approved for use through 12/31/2000 (OMB Number 1205-0055).

The 29 elements validated through the WV program constituted slightly over 1% of the approximately 2,400 elements States report on 47 UI required reports, and included only a few Government Performance and Results Act (GPRA) elements. In light of this, the General Accounting Office and the Department of Labor's Office of Inspector General (OIG) recommended that the program be revised to cover more data elements associated with performance measures.


In the early 1990s, as part of a project to develop and test new measures for the timeliness and quality of benefit payment operations, called the Performance Measurement Review (PMR), the Department asked the technical support contractor, Mathematica Policy Research (MPR), to develop a method for validating benefits data. MPR developed an approach based on WV concepts but more amenable to automation. It included nearly 1,200 data elements from 13 benefits reports. The methodology was subsequently extended to include the data reported on the primary tax performance report, ETA 581, "Contribution Operations" (OMB Control No. 1205-0178, expires 06/30/2018). After the method was tested in three states with satisfactory results, the Department had MPR develop software that all states could use to perform validation operations. States installed this software on individual state PCs or SQL Server computers. After completing data validation, results could be exported from the software in different formats (e.g., text, Excel, Word, PDF files) and sent to the National Office via email.


Although the MPR software represented an improvement over the original DV methodology, DOL recognized that it could be enhanced and, in response, began developing its own version in 2003. The DOL software, released in 2005, is the software currently used to validate reported data. It has the advantage that it resides on the Sun-Unix servers that states use for UI reporting operations, which facilitates the deployment of new software versions, allows transmission of validation results directly to the National Office UI database, and allows automatic retrieval of reported counts from the UI database instead of requiring validators to enter those data as in the previous software. The latest version, Version 5.3.0, was released in February 2015.


The Department required states to implement UI DV. It set July 31, 2003 as the target for states to install the common DV software and develop the extracts of benefits and tax transactions the software processes; the states were to submit reports containing DV results by September 30, 2003. As of June 2015, reports had been received from 52 of the 53 states and territories that operate UI programs. It is expected that all states and territories will complete data validation by the next reporting deadline, June 10, 2016.


Unemployment Insurance Program Letter (UIPL) 22-05, available on the DV Web site’s Advisories Archive at http://www.oui.doleta.gov/dv/archive_advisor.asp, summarizes the UI DV policy framework. Data Validation is administered on a “Validation Year” basis. The Validation Year (VY) coincides with the State Quality Service Plan (SQSP) performance year, which runs from April 1 through the following March 31. For example, a validation of any report for UI activity that occurs between April 1, 2014 and March 31, 2015 is part of VY 2015 and is to be submitted by June 10, 2015. The timing allows validation results to be part of the SQSP process (see A-2, below).


The DV program operates on a three-year cycle. All but three groups of transactions, called populations, that pass validation must be validated every third year. The other three populations, which validate the report cells from which GPRA indicators are formulated, must be validated annually; populations that fail validation must be revalidated the next year. UI DV has also adopted the WV parameters for validity: if report counts are within ± 2% of reconstructed (validation) counts, and the random samples show that no more than 5% of transactions in the extract file universe contain invalid data elements, the reported counts are considered valid. (The GPRA report counts are held to a ± 1% standard instead of ± 2%.) The rules for passing validation are also contained in UIPL 22-05. Over half of all validation items due as of June 10, 2015, passed validation; 28% of items due were not submitted; about three quarters of the items that states did submit passed validation, for both Benefits and Tax.
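The two-part validity test described above can be sketched in code. This is an illustrative restatement of the stated thresholds (± 2% report-count tolerance, ± 1% for GPRA cells, 5% sample error limit), not the DV software's actual algorithm; the function names are hypothetical.

```python
def counts_pass(reported: int, reconstructed: int, is_gpra: bool = False) -> bool:
    """Report-count test: the reported count must be within +/-2%
    (+/-1% for GPRA cells) of the reconstructed validation count."""
    tolerance = 0.01 if is_gpra else 0.02
    if reconstructed == 0:
        return reported == 0
    return abs(reported - reconstructed) / reconstructed <= tolerance

def sample_passes(invalid_in_sample: int, sample_size: int) -> bool:
    """Data-element test: no more than 5% of sampled transactions
    may contain invalid data elements."""
    return invalid_in_sample / sample_size <= 0.05

def population_valid(reported: int, reconstructed: int,
                     invalid_in_sample: int, sample_size: int,
                     is_gpra: bool = False) -> bool:
    """A population passes only if both tests pass."""
    return (counts_pass(reported, reconstructed, is_gpra)
            and sample_passes(invalid_in_sample, sample_size))
```

For example, a reported count of 1,010 against a reconstructed count of 1,000 (a 1% difference) passes the non-GPRA count test, but would fail if the cell were held to the GPRA ± 1% standard only if the difference exceeded 1%.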

Both Benefits and Tax portions of the Data Validation handbook are available on the ETA Office of Workforce Security web site at http://ows.doleta.gov/dv/.


The data validation handbook and software show the following approval information:


OMB No.: 1205-0431

OMB Expiration Date: 05/31/2016


OMB Burden Statements:  SWA response time for this collection of information is estimated to average 500 hours per response (this is the average of a full validation every third year with an estimated burden of 730 hours, and partial validations in the two intervening years), including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden to the U. S. Department of Labor, Employment and Training Administration, Office of Unemployment Insurance (Attn: Rachel Beistel), 200 Constitution Avenue, NW, Room S-4519, Washington, D.C. 20210 (Paperwork Reduction Project 1205-0431).

2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

a. Users, Uses, and Purposes. Accurately reported data are essential for properly assessing State performance and for ensuring that States are treated equitably when administrative funds are distributed. The Department will use UI DV results to evaluate the accuracy of the data States report on selected UI required reports. This will enable it to assure its customers, partners, and stakeholders of the accuracy of the performance measures that form the basis of the UI Performs system; allow the Congress to determine the extent to which the UI system is accomplishing the goals set in the GPRA Strategic and Annual Performance Plans; and ensure that UI administrative funds are allocated properly to the 53 State UI programs.


The Department will use the UI DV information as a performance measure in the State Quality Service Plan (SQSP) process. SQSP is a permanent framework for the annual planning and budget process in each State UI agency. As part of the SQSP process, states that submit inaccurate data on validated reports (on which, among other things, key UI Performs and GPRA measures are based) will be required to take corrective action to ensure accurate reporting.


Because the performance measures are intended to guide and assist States in improving their performance and service to their ultimate customers (workers and employers), States also have an interest in ensuring that the measures are correct. The DV system includes detailed guidance to State programmers for correctly constructing various report elements. This not only ensures they will construct the DV extract files correctly, but also will simplify their tasks when new tax or benefit system software is installed.


b. Consequences of Failure to Collect Data. As noted in A-1 above, the Department has been criticized for failing to validate the data on which key measures are based and for failing to increase the extent of the validation of the reports data it receives from States. The UI DV system allows the Department to respond positively to those criticisms and to ensure the accuracy of the reports data on which it relies to discharge its oversight responsibilities.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.


In compliance with the Government Paperwork Elimination Act, since April 1, 2005, the UI DV program has had States submit UI DV findings via the system now used for electronic transmission of UI reports.


The Department has provided each State UI agency with an automated menu-driven system to input, store, and transmit reports data. The UI DV results are entered electronically as a report and stored in the UI Database.


The UI DV system is highly automated. The State-specific UI DV handbooks give States detailed specifications for developing data extract files, based on their own State management information systems. The States also enter or download these extract files to the automated system where a software module manipulates the input files for validation by producing validation displays, sorts, reconstruction counts, duplicate detection tests, and drawing various samples. The software module extracts from the state’s report database the report elements being validated so that validators do not need to enter those counts. The software also provides input interfaces where validators enter the findings of their reviews of quality samples, known as Data Element Validation. The results of completed validations are reported electronically to the Department using the overnight pick-up mechanism used to transmit other UI reports data.


In the last few years the Department has increased DV automation. DV summary results from the DOL database are available to Department and state users through a Web-based system. A separate Web-based system allows state users access to a database to maintain Module 3 of the benefits and tax handbooks. Module 3 is the “map” that enables the states to relate their key data to Federal reporting definitions. It gives Federal definitions from reporting and validation requirements; and based on those requirements, states note where these elements are to be found in their management information systems. State programmers use Module 3 to find the data elements needed to build DV extract files; validators use it to test the extract files when they examine sampled records during the data element validation phase.


The Department knows of no technical obstacles to implementing and operating UI DV.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


The proposed UI DV program does not duplicate any existing Department program. The Department believes that there is no satisfactory alternative to the UI DV methodology for this purpose.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


There will be no impact on small businesses or any entities other than State UI agencies.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


The Department requires that States conduct a complete validation every third year, and a partial validation every year of those items that a previous year's validation found to be reported inaccurately or that are used for GPRA indicators. A validation is also required within a year after a State changes its report-generating software (for example, installs a new state management information system) or after the Department changes reporting requirements.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:


This request is consistent with 5 CFR 1320.5.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years—even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


In accordance with the Paperwork Reduction Act of 1995, the public was allowed 60 days to review and comment through a Federal Register Notice published on October 23, 2015 (80 FR 64450). No comments were received.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


This program does not involve payments per se to respondents. However, because UI DV is a mandated data collection, the Department provides funding to the participating States, which are technically listed as the "respondents" for purposes of the Paperwork Reduction Act. States fund validation activities out of their UI administrative grants, the same process used for the predecessor Workload Validation program.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


Keeping data private is not an issue with this program, which simply involves verifying the accuracy of aggregated data counts submitted earlier on required reports. States submit no individual records with personal (e.g., Social Security Number) or business (e.g., Employer Account Number) identifiers.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


The data collection includes no questions of a sensitive nature.


12. Provide estimates of the hour burden of the collection of information.


1. Annual Hour Burden: 23,644 hours

Old Burden: 34,550 hours

Change: -10,906 hours (decrease)

  • Number of respondents: 53 State Employment Security Agencies

  • Burden for a full validation: 730 hours

    • Based on reassessed estimates received from three states that have conducted validations; they assumed they were conducting routine validations for a normal validation year using the validation software.

  • Annual Estimate Assumes:

    • States conduct a full validation in year 1;

    • In year 2, they validate half of report cells (i.e., GPRA elements and some failures from full validation, based on initial validation results that show at least half of report items failed);

    • In year 3, they validate one third of report cells (i.e., the GPRA populations and some continued failures from the year 2 validation);

    • These assumptions may be conservative; degree of improvement from one validation to another is not known. Burden is expected to decline in the future as report validity improves due to fewer report cells to be revalidated and more random samples passing at the first stage.

  • Calculation of Average Annual Burden

    • Full validation = 53 * 730 = 38,690 hours (Year 1)

    • Half validation = 53 * 730 / 2 = 19,345 hours (Year 2)

    • Third validation = 53 * 730 / 3 = 12,897 hours (Year 3)

    • 3-Year Average = (38,690 + 19,345 + 12,897) / 3 = 23,644 hours



2. Cost of Annual Hour Burden to Respondents: $1,115,997

  • Annual Burden = 23,644 hours

  • Average Wage Rate for State UI agencies: $47.20/hour (the average salary used for UI budget formulation purposes for FY 2016, $80,712, divided by the 1,710 average yearly work hours used)

  • Annual Cost = 23,644 hours * $47.20/hour = $1,115,997

  • Change: $-129,320
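The burden and cost arithmetic above can be reproduced directly. This is a minimal sketch of the calculation for verification purposes, not part of the official methodology; the variable names are illustrative.

```python
STATES = 53                      # State Employment Security Agencies
FULL_VALIDATION_HOURS = 730      # estimated burden for one full validation
AVG_SALARY = 80_712              # FY 2016 average salary for UI budget formulation
WORK_HOURS_PER_YEAR = 1_710      # average yearly work hours used

# Hourly rate used to monetize respondent time
hourly_rate = round(AVG_SALARY / WORK_HOURS_PER_YEAR, 2)    # 47.20

# Three-year cycle: full, half, and one-third validations
year1 = STATES * FULL_VALIDATION_HOURS                      # 38,690 hours
year2 = round(STATES * FULL_VALIDATION_HOURS / 2)           # 19,345 hours
year3 = round(STATES * FULL_VALIDATION_HOURS / 3)           # 12,897 hours

avg_annual_burden = round((year1 + year2 + year3) / 3)      # 23,644 hours
annual_cost = round(avg_annual_burden * hourly_rate)        # $1,115,997
```

These values match the figures shown in items 1 and 2 above.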


3. Startup Burden to Respondents: N/A

All states have implemented DV.


The following table can be used as a guide to calculate the total burden of the information collection. Based on anticipated passing rates and on the fact that most DV results remain valid for three years, the average burden and annual cost estimates below rest on these assumptions:

  • Year 1 states: most benefits and tax populations are due

  • Year 2 states: about half of all benefits and tax populations are due

  • Year 3 states: about one-third of all benefits and tax populations are due


Activity            Number of     Frequency       Total Annual   Time Per    Total Annual
                    Respondents                   Responses      Response    Burden (Hours)
DV State 1 Year 1   1             Once annually   16 or more     72          1,150
DV State 2 Year 2   1             Once annually   About 10       70          700
DV State 3 Year 3   1             Once annually   About 5        70          350
Average Burden                                                               730


Activity           Number of     Average Annual    Total Annual     Hourly    Monetized Value of
                   Respondents   Burden (Hours)    Burden (Hours)   Rate*     Respondent Time
DV States Year 1   53            730               38,690           47.20     $1,826,168
DV States Year 2   53            730/2             19,345           47.20     $913,084
DV States Year 3   53            730/3             12,897           47.20     $608,738
Annual Cost                                        23,644                     $1,115,997


* Hourly rate based on the average wage rate for State UI agencies. Data concerning the DV program and state results can be found online at http://ows.doleta.gov/dv/.

13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).


The States should not incur any capital costs to be added to the staff costs reported in A-12. The Department provides States with the computer equipment necessary for retrieving, manipulating, storing, and reporting the validation results.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.


Federal costs are the staff and contractor costs required to maintain and manage UI DV. These include the costs of providing technical assistance to states; monitoring the validation process in the states; maintaining the DV handbooks and the validation software to reflect changes in reporting; improving the functionality of the UI DV state software and programming the Federal software used to produce reports on DV results; and analyzing DV results, using them to assess state reporting operations, and ensuring that states continue to improve reporting accuracy. These activities are expected to cost approximately $345,000 per year over the 2016-2019 period. Federal allocations of funds for State UI administration will also cover the costs in A-12.



Category                                                          Estimated Costs
Staff:
  DOL ETA National Office, GS-12 Step 5 (2.0 SY)                  $134,880
  DOL ETA Regional Office, GS-12 Step 5 (1.2 SY)                  $80,928
Programming/ADP Support                                           $120,000
Contractor Technical Validation Support (3 Webinars at $3,000)    $9,000



15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.


A Summary of Data Validation Changes in Response to Revisions to the ETA 227 Report

Overview: The revisions to the ETA 227 report released in August 2015 raised the high dollar overpayment threshold captured on that report to $25,000. These changes were made to report UI overpayments exceeding $25,000 made to a single individual in his/her current claim series (High Dollar Overpayments); validating this change required modifications to DV Population 12. The change is outlined below.


Documentation. The key documents detailing the changes in DV will be incorporated in ETA Handbook 361 (Benefits) or ETA Handbook 411 (DV Operations Guide).


Outline of Changes by Population:

  1. Population 12. This population validates Section A of the ETA 227 report, Overpayments Established by Cause.

For purposes of completing the ETA 227, High Dollar Overpayments occur when total overpayments to an individual on a claim exceed $25,000. Overpayments may be a single overpayment or multiple overpayments established during or prior to the reporting quarter; however, the high dollar overpayment is reported for the quarter in which the cumulative amount overpaid to an individual on a claim exceeds $25,000.


The changes to the instructions for lines 112 and 113 on the ETA 227 reflect the change in the definition of High Dollar Overpayments.


Line 112. High Dollar Fraud Overpayment. Report those fraud cases and dollar amounts of overpayments for a claim that exceeds $25,000 during the reporting quarter. Overpayments may be for a single overpayment or for multiple overpayments established during or prior to the reporting quarter. The high dollar overpayment is reported for the quarter in which the cumulative amount overpaid exceeds $25,000.


If an overpayment exceeds $25,000, but part is fraud and part is non-fraud, report the fraud portion on line 112 and the non-fraud portion on line 113. A case count will be reported on the line with the higher dollar amount. If the amounts are equal, a case count will be reported on the fraud line (112).


Line 113 High Dollar Non-Fraud Overpayment. Report those non-fraud cases and dollar amounts of overpayments for a claim that exceeds $25,000 during the reporting quarter. Overpayments may be for a single overpayment or for multiple overpayments established during or prior to the reporting quarter. The high dollar overpayment is reported for the quarter in which the cumulative amount overpaid exceeds $25,000.


If an overpayment exceeds $25,000, but part is fraud and part is non-fraud, report the fraud portion on line 112 and the non-fraud portion on line 113. A case count will be reported on the line with the higher dollar amount. If the amounts are equal, a case count will be reported on the fraud line (112). Reporting accurate data in lines 112 and 113 is important to meeting the requirements of Executive Order 13520, Reducing Improper Payments and Reducing Waste in Federal Programs.
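The allocation rule for split fraud/non-fraud high dollar overpayments can be sketched as follows. This is an illustrative restatement of the line 112/113 instructions above, not the reporting software itself; the function name and data structure are hypothetical.

```python
def high_dollar_lines(fraud_amount: float, nonfraud_amount: float):
    """Apply the line 112/113 rule for a claim: if the cumulative
    overpayment exceeds $25,000, dollars go to their respective lines
    (fraud -> 112, non-fraud -> 113) and the single case count goes to
    the line with the higher dollar amount (ties go to line 112)."""
    total = fraud_amount + nonfraud_amount
    if total <= 25_000:
        return None  # not a High Dollar Overpayment
    line112 = {"cases": 0, "dollars": fraud_amount}     # fraud
    line113 = {"cases": 0, "dollars": nonfraud_amount}  # non-fraud
    if fraud_amount >= nonfraud_amount:
        line112["cases"] = 1  # higher amount, or an equal split
    else:
        line113["cases"] = 1
    return {"line_112": line112, "line_113": line113}
```

For example, a $30,000 overpayment split $20,000 fraud / $10,000 non-fraud reports its dollars on both lines, with the single case count on line 112.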


Due to the high level of automation and the routine nature of the DV program, this change is absorbed into normal state procedures and is already accounted for in the hour burden figures.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


The Department publishes results of validations on its Data Validation Web site at http://ows.doleta.gov/dv/.


17. If seeking approval not to display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The Department will display approval information in the ETA Handbook 361.


18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions,”


There are no exceptions.



B. Collections of Information Employing Statistical Methods


This information collection does employ statistical methods.



Author: Burman Skrable