
SUPPORTING STATEMENT FOR REQUEST FOR OMB APPROVAL

UNDER THE PAPERWORK REDUCTION ACT

Employment and Training Data Validation Requirement

OMB Control Number 1205-0448


PART A – JUSTIFICATION




This is a justification for the Department of Labor’s request for approval to extend a currently approved data validation requirement for five Employment and Training Administration (ETA) programs. Data validation assesses the accuracy of data collected and reported to ETA on program activities and outcomes. The accuracy and reliability of program reports submitted by states and grantees using federal funds are fundamental elements of good public administration and necessary tools for maintaining and demonstrating system integrity. The data validation requirement for employment and training programs strengthens the workforce system by ensuring that accurate and reliable information on program activities and outcomes is available. The following programs are subject to the data validation requirement: Workforce Investment Act (WIA) Title IB, Wagner-Peyser Act, Trade Adjustment Assistance (TAA), the National Farmworker Jobs Program (NFJP), and the Senior Community Service Employment Program (SCSEP). All of these programs must conduct both report and data element validation; however, the specific processes and required data elements that must be validated are program specific. All program-specific information is discussed in the instructions for carrying out data validation for these programs. The Indian and Native American Program (INAP) is no longer part of this supporting statement; INAP has integrated data validation software into its electronic system, “Bear Tracks,” which is accounted for under Office of Management and Budget (OMB) control number 1205-0422.


1. Reasons for Data Collection


States and grantees receiving funding under WIA Title IB, Wagner-Peyser Act, TAA, and the Older Americans Act are required to maintain and report accurate program and financial information (WIA section 185 (29 U.S.C. 2935) and WIA Regulations 20 CFR 667.300(e)(2); Wagner-Peyser Act section 10 (29 U.S.C. 49i), Older Americans Act section 503(f)(3) and (4) (42 U.S.C. 3056a(f)(3) and (4)), and TAA Regulations 20 CFR 617.57). Further, all states and grantees receiving funding from ETA and the Veterans’ Employment and Training Service are required to submit reports or participant records and attest to the accuracy of these reports and records.


The Department’s Office of Inspector General (OIG) conducted an audit of WIA performance data oversight from July 2000 through October 2001. The audit, released in September 2002, found that, “Because of insufficient local, state, and Federal oversight, the Employment and Training Administration (ETA) has little assurance that the state-reported WIA performance outcomes data are either accurate or verifiable.” The OIG recommended that states should validate reported data using rigorous validation methodology. To address the concerns raised by the OIG and to meet the Agency’s goal for accurate and reliable data, ETA implemented a data validation requirement in order to ensure the accuracy of data collected and reported on program activities and outcomes.


ETA has developed a process for validating data submitted by states and grantees. Data validation consists of two parts:


1) Report validation evaluates the validity of aggregate reports submitted to ETA by checking the accuracy of the reporting software used to calculate the reports. Report validation is conducted by externally processing a complete file of state-level participant records and comparing the results to those reported by the state or grantee.


2) Data element validation appraises the accuracy of participant data records and serves as a management tool. It is conducted by manually reviewing samples of participant records against their underlying source documentation in order to (1) verify the accuracy of the data contained in the states’ and grantees’ management information systems and (2) confirm compliance with program-specific Federal definitions. The results of data element validation are used to identify areas on which to focus system resources in order to systematically improve program management over time.


This approach addresses the two fundamental sources of reporting errors within ETA program data: data entry error and inaccurate computation of the required aggregate reports at the state and grantee level. If the data collected are systematically incorrect or data entry errors routinely occur, then the outcomes information will not be accurate even if the states’ and grantees’ reporting systems are functioning properly. Data element validation addresses this issue by comparing performance-related data in each state’s participant record file to the original data in the source files and determining an error rate that indicates the degree of accuracy of each data element used in calculating the state’s performance results. Similarly, if the states’ and grantees’ reporting systems do not meet Federal standards, they could calculate the performance outcomes incorrectly at the aggregate level. Report validation addresses this issue by externally calculating performance results with respect to each state’s participant record file and comparing those results to the values reported by the state. Error rates are determined for each performance outcome reported by the state.
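For illustration, the two checks described above can be sketched in a few lines of Python. This is a hypothetical example, not the DRVS software itself; the record layout, field names, and values below are invented.

```python
# Illustrative sketch of the two validation checks described above.
# All field names and data are hypothetical; the actual DRVS software
# implements program-specific record layouts and report specifications.

# Report validation: recompute an aggregate count from the complete
# participant record file and compare it to the state-reported value.
participant_records = [
    {"id": "A1", "exited": True, "employed_after_exit": True},
    {"id": "A2", "exited": True, "employed_after_exit": False},
    {"id": "A3", "exited": False, "employed_after_exit": False},
]
recomputed = sum(1 for r in participant_records
                 if r["exited"] and r["employed_after_exit"])
state_reported = 2  # value from the state's submitted aggregate report
report_error_rate = abs(recomputed - state_reported) / max(recomputed, 1)

# Data element validation: compare sampled fields against values a
# validator transcribed from source documents, yielding a per-element
# error rate.
sampled = [
    {"id": "A1", "date_of_birth": "1960-01-15"},
    {"id": "A2", "date_of_birth": "1955-07-02"},
]
source_documents = {"A1": "1960-01-15", "A2": "1955-07-20"}  # from case files
errors = sum(1 for r in sampled
             if source_documents[r["id"]] != r["date_of_birth"])
element_error_rate = errors / len(sampled)

print(f"report validation error rate: {report_error_rate:.1%}")
print(f"date_of_birth element error rate: {element_error_rate:.1%}")
```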


ETA maintains a set of validation tools – software and instructional user handbooks – that states and grantees can use to validate data. While the software is updated on a rolling basis (for example, to change acceptable program exit dates to facilitate reporting in future years or to fix software bugs), the general data collection mechanics, the instructions for using the software, and the required data elements remain exactly the same as they were under the previous collection authorization.


WIA Title IB, Wagner-Peyser, TAA, NFJP and SCSEP program staff have been conducting data validation for several years. Program staff received training prior to the implementation of data validation and continue to receive ongoing training and technical assistance from ETA’s data validation contractor throughout the validation process.


Previous experience with data validation has indicated the following:

  • States and grantees are able to conduct data validation with a reasonable, but sustained, level of effort.

  • The validation process allows states and grantees to identify and address reporting errors.

  • The average staff requirements for a state to complete validation for the WIA Title IB, Wagner-Peyser, and TAA programs are about 792 hours per year (or less than 1/2 of a staff year). There is no startup burden for these programs because it was incurred upon initial implementation. The average annual time estimate for NFJP and SCSEP grantees to complete validation is approximately 161 hours (approximately 1/20 of a staff year).

  • Changes to the Data Reporting and Validation System (DRVS) software are continually required, as a result of continuous improvements in the Workforce Investment Act Standardized Record Data (WIASRD), in state and grantee participant files, and in the DRVS itself.


On the basis of the significant benefits of data validation along with its minimal burden going forward, ETA seeks to extend the existing data validation requirement for employment and training programs.


ETA is considering redesigning the DRVS system over the next year. Because DRVS is constantly evolving, ETA believes that moving toward an integrated program reporting and validation system would decrease the cost and burden of maintaining the system from year to year, since current modifications require patches to every copy of DRVS used by each state and grantee. Establishing one consolidated DRVS that is accessible to states and grantees would make year-to-year transitions much less complicated and would facilitate the integration of ETA databases with DRVS.


Currently, data validation is required annually as follows:


  • Report validation should be performed before submission of annual reports.


  • Data element validation must be completed within 120 days after required annual reports or participant records are due at ETA. Exact deadlines for the completion of data validation vary by program.


  • States and grantees are required to send data element validation output reports to ETA within 120 days after they submit required annual reports or participant records.


States and the following grantees use DRVS to validate the reports and participant records shown in Table 1 below.








Table 1 – Reports and Participant Record Files Validated

Program                                     | Report/Records           | OMB Approval No.
--------------------------------------------|--------------------------|-----------------
Workforce Investment Act Title IB           | ETA 9091 (annual report) | 1205-0420
Wagner-Peyser                               | ETA 9002, VETS 200       | 1205-0240
Trade Adjustment Assistance                 | TAPR                     | 1205-0392
National Farmworker Jobs Program            | WIASPR                   | 1205-0425
Senior Community Service Employment Program | ETA 5140 (annual report) | 1205-0040


The user handbooks for each program provide a more detailed overview of the validation process. These are available on ETA’s validation tools web site at <http://www.doleta.gov/performance/reporting/tools_datavalidation.cfm>. Training and Employment Guidance Letter No. 17-09 outlines ETA’s data validation policy.


2. Purpose of Information Collection


ETA uses data validation results to evaluate the accuracy of data collected and reported to ETA on program activities and outcomes. This information collection enables ETA to assure its customers, partners, and stakeholders of the validity of performance data underlying the respective programs. Further, data validation ensures that performance information used for WIA accountability purposes and to meet Government Performance and Results Act (GPRA) responsibilities is accurate.


Data validation was also developed with the goal of assisting states and grantees in providing more accurate data. Validation allows states and grantees to detect and identify specific problems with their reporting processes, including software and data issues, and to enable them to correct the problems. In addition, the tools developed by ETA help states and grantees analyze the causes of performance successes and failures by displaying participant data organized by performance outcomes. These tools are available at no cost to states and grantees.


3. Technology and Obstacles Affecting Reporting Burden


ETA knows of no technical obstacles to implementing and continuing data validation. ETA has developed standardized software and user handbooks that states and grantees can use to conduct data validation:


  • Software developed by ETA generates samples, worksheets, and reports on data accuracy. For report validation, the software validates the accuracy of aggregate reports that are generated by the state's or grantee's reporting software and produces an error rate for each reported count. For data element validation, the software generates a random sample of the participant records and data elements for the state or grantee to manually validate. The software produces worksheets on which the validator records information after checking the source documentation in the sampled case files. The software calculates error rates for each data element, with confidence intervals of 3.5 percent for large states/grantees and 4 percent for small states/grantees.


  • User handbooks provide detailed information on software installation, building and importing a validation file, and completing report and data element validation. The handbooks also explain the validation methodology, including sampling specifications and data element validation instructions for each data element to be validated.


Currently, all states and grantees use the software provided by ETA to conduct validation for the WIA Title IB, Wagner-Peyser, TAA, NFJP, and SCSEP programs. States and grantees can obtain technical assistance on validation procedures and the use of the validation tools from ETA’s data validation contractor.


The software can also be used to generate the aggregate information required in reports submitted to ETA. States and grantees that use the software provided by ETA to generate this aggregate information are not required to conduct report validation. However, states still must demonstrate that they used the validation software to calculate their aggregate reports.


For both report validation and data element validation, the ETA software uses the validation data provided by the states or grantees to produce validation summary reports which, in compliance with the Government Paperwork Elimination Act, are submitted via the system now used for electronic transmission of reports to ETA.


4. Duplication


The data validation requirement does not duplicate any other existing ETA information collection.


5. Burden on Small Business or Other Small Entities


While data validation is conducted mostly by state governments and large, private, non-profit organizations, some small entities are required to conduct validation. Some of the grantees operating NFJP and SCSEP are small, private, non-profit organizations serving a small number of individuals. However, because of the low burden estimates associated with data validation for these programs, this information collection does not significantly impact these small entities.


The data element validation process allows states and grantees to randomly select validation samples from the complete data file, in order to compute statistically significant error rates, rather than requiring the validation of every participant case file. To reduce the relative burden on smaller states and grantees as much as possible, the sample size for smaller entities is less than for larger grantees and states. This leads to the slightly wider confidence intervals of 4 percent for small states compared to 3.5 percent for large states.
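As an illustration of this size-based tradeoff (the actual DRVS sampling specifications are described in Part B and the user handbooks), the standard normal-approximation formula for a proportion shows why accepting a wider confidence interval permits a smaller sample:

```python
# Illustrative only: normal-approximation sample size for estimating an
# error rate (a proportion) to a given confidence-interval half-width.
# This is not the DRVS sampling specification; see Part B and the
# program user handbooks for the actual methodology.
import math

def sample_size(half_width: float, p: float = 0.5, z: float = 1.96) -> int:
    """Records needed so a 95% CI on an error rate is +/- half_width."""
    return math.ceil((z ** 2) * p * (1 - p) / half_width ** 2)

# Widening the interval from +/- 3.5 percent (large states) to
# +/- 4 percent (small states) noticeably shrinks the required sample.
print(sample_size(0.035))  # 784 records
print(sample_size(0.04))   # 601 records
```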




6. Consequences of Failure to Collect Data


As mentioned in Part A.1, ETA was criticized in the past for a lack of monitoring and a consequent inability to assure the validity of performance outcomes reported by states and grantees. ETA regional staff continues to conduct data quality reviews based on current data validation efforts to determine if states are in compliance with data validation guidelines. The proposed continuation of the data validation requirement will allow ETA to continue to address these issues. If data validation is discontinued, ETA will not be able to ensure that critical data used for performance reports and accountability purposes, to meet GPRA responsibilities, and for other management purposes, are reliable.


7. Special Circumstances Involved in Collection of Data Validation Information


This request is consistent with 5 CFR 1320.5.


8. Pre-Clearance Notice and Responses


A Pre-clearance Notice soliciting public comment for sixty days was published in the Federal Register on September 27, 2010 (75 FR 59294).


The comments and issues received, along with ETA’s responses, are presented below.

Comment: Six SCSEP grantees commented that the Data Validation (DV) process is too time consuming and that the ETA estimated burden time is understated.

ETA Response: Estimating the burden of any data collection does present a challenge because the burden will vary across grantees. The estimate of the burden that DV imposes on grantees is an estimate of the average burden of all grantees. The burden on some grantees will be higher than the estimate, and the burden on other grantees will be lower than the estimate, depending on the number of records. The average time allotted of 1 hour per record is sufficient to conduct validation for the average record if the case files are properly organized in advance of the validation. Grantees have the legal responsibility and authority to properly organize and maintain files.


In addition, it is important to note that data validation emerged as a response to a DOL OIG audit that concluded that ETA could not verify the accuracy of the data underlying programmatic results reported to Congress and other stakeholders. Without comparing the system data to actual documentation, there is no way to validate the system data and ensure that it is correct. We realize that collecting this information is a burden. However, in order to prove that a person a state purports to have served is indeed a real person who was eligible to participate in the program, certain documents must be provided.


While traveling distances in rural areas of large states may be greater than in more urbanized, smaller states, ETA does not require that the documents be physically brought to a central location. In fact, ETA encourages states to create electronic copies of the records, which can be sent electronically to any location. Furthermore, ETA has strategically sought to minimize the travel burden on both state and federal personnel involved in the data validation process by utilizing a stratified clustered sampling methodology, which reduces the number of individual locations from which the sample is drawn. For a more detailed description, see Part B, section 2(A).

Comment: Six SCSEP grantees recommended reducing or streamlining the validation documentation requirements for data elements.

ETA Response: Data validation is an efficient method for monitoring data collection, reporting, and performance. Using standardized documentation requirements helps ensure consistent compliance with federal definitions, legislation, and program requirements. DV seeks to validate critical data elements with the most appropriate and acceptable documentation. SCSEP DV, however, also attempts to balance the need for formal documentation with less formal, yet wholly acceptable, forms of support such as attestation (self- and third-party) and case notes. Of the 42 data elements included in SCSEP DV, 19 allow for attestation, while 27 allow for case notes.

Comment: Three SCSEP grantees and one State Official suggested that the U.S. Department of Labor should hold a meeting and/or annual DV training.

ETA Response: Basic DV training and other assistance are always accessible via the technical assistance providers. Grantees are also encouraged to request specific DV training that can most effectively address their needs. In the future, DOL will explore options to provide routine training at any DOL-sponsored conferences (if held). DOL also plans to review the results of DV across all grantees to identify specific areas where training may prove beneficial to all grantees. Specific tutorials may be created or other training materials provided that address issues affecting multiple grantees.

Comment: Two SCSEP grantees and one State Official recommended that the Data Validation handbook should be further reviewed and revised.

ETA Response: The DV handbook is reviewed and revised on an annual basis. DOL receives input from grantees and sub-grantees throughout the year regarding DV in general and the handbook specifically. These comments are very helpful in identifying where additional clarification is needed and where corrections should be made in the handbook. In addition, DOL staff continually review the document to identify other areas for improvement, as well as any changes to DV that are needed as a result of changes made to SCSEP and/or SPARQ during the year. The revised DV handbook is released on or about the same time that the new DV samples are drawn each year.

Comment: Three SCSEP grantees would like to continue the "pilot" phase of the project for an additional year.

ETA Response: The first few years of DV implementation were designated as pilots because grantees were validating records that had been populated with data prior to the publication of the official DV documentation rules and instructions. Because the data elements in both the eligibility and performance samples are final and the rules regarding them have been in place long enough, records should conform to those rules. Therefore, there is no need for another pilot year.

Comment: Three SCSEP grantees noted that secondary validation (e.g., a doctor's note) is not feasible for many data elements.

ETA Response: Only a few data elements require medical documentation: disability, severe disability, frailness (in some cases), and exclusions from the common measures due to the participant’s serious illness or the need to care for a severely ill family member. Because of the statutory and regulatory definitions for these data elements, grantee staff are not qualified to make the determination and the opinion of a medical expert is required. If grantee staff were to decide what constitutes severe disability, for example, there would be no consistency among the approximately 900 SCSEP sub-grantees, and the data in SPARQ would be unreliable. In all cases, some form of medical documentation other than a statement from the participant’s doctor is permitted. Please note that medical documentation has always been required when a participant claims the status of a family of one due to disability; this requirement predates DV.

Comment: Two SCSEP grantees felt that SCSEP validation requirements should be equivalent to the WIA requirements.

ETA Response: Data validation requirements are developed by the program office based on a variety of factors, including the structure of the program, reporting and legislative requirements, internal and external audits, budgetary considerations, and program monitoring. Because such factors differ between SCSEP and WIA, the programs necessarily have different DV requirements.

Comment: Two SCSEP grantees commented that SCSEP Data Validation requirements are too invasive and are actually forcing potential participants away.

ETA Response: Data validation adds no burden on participants beyond what is already required by legislation and program regulations. DV has simply identified the most critical elements related to SCSEP eligibility and relies on existing sources of documentation, which are required to substantiate eligibility or for the state to claim credit for the participant in its performance measures. Additionally, eligibility determination is not the only reason to collect supporting documentation; determining priority of service for certain participants and the outcomes of performance measures must also be verified. In the case of low-income status, this is not a requirement of eligibility in any particular program, but it is necessary to determine priority of service. Section 134(d)(4)(E) of the Workforce Investment Act of 1998 states:


PRIORITY.—In the event that funds allocated to a local area for adult employment and training activities under paragraph (2)(A) or (3) of section 133(b) are limited, priority shall be given to recipients of public assistance and other low-income individuals for intensive services and training services. The appropriate local board and the Governor shall direct the one-stop operators in the local area with regard to making determinations related to such priority.


Long before DV, SCSEP program rules required grantees to document all aspects of eligibility by obtaining copies of source documents signed by the applicant and the case worker. No eligible applicant can be or has been denied enrollment in SCSEP because of DV requirements, nor should any enrollment be delayed while the participant obtains documentation of elements required solely for DV. (As set forth above, program rules have always required the documentation of eligibility prior to enrollment.) The only consequence of the inability or failure of a participant to provide DV documentation is that the grantee will be found to have claimed performance credit improperly. Since there are as yet no sanctions for failing DV, there are no consequences to any grantee for these violations. However, DV does provide grantees an opportunity to discover when sub-grantee staff do not understand or have failed to apply SCSEP rules regarding eligibility and performance. This enables grantees to provide needed training to ensure both the quality of SCSEP data and compliance with SCSEP rules.

Comment: One SCSEP grantee noted that Data Validation requirements may conflict with some states' HIPAA rules.

ETA Response: When a doctor's note is required as proof of health status (disabled, frail, etc.), written authorization from the participant to release his or her health status information makes the disclosure permissible under HIPAA.

Comment: Two SCSEP grantees suggested that Data Validation negatively impacts the ability to assist hard-to-serve individuals due to the many validation requirements.

ETA Response: The intent of data validation is clearly not to encumber the ability to serve individuals, particularly those with barriers to job entry. However, in the case of SCSEP, there are strict eligibility requirements, and it is necessary to collect certain identifiable items to aid in the validation process. ETA hopes that potential participants will view submitting the required documentation as a small tradeoff for free employment services.

Comment: One State Official stated that Data Validation software, user guides, and report submission processes are the weakest point of the DV effort. This is due to the National Office continuously changing the performance standards and reporting requirements faster than most states can handle.

ETA Response: This comment likely had merit a few years ago, when the Department was undertaking significant efforts to reform its programs’ reporting systems to capture performance outcomes and then made further revisions to implement the common performance measures. However, the last major changes to performance calculations and data elements occurred at the beginning of Program Year (PY) 2005 (July 1, 2005), specifically to implement the new common performance measures policy. There have been no significant revisions to the requirements/policy for report validation or data element validation for more than five years.


The software in its current state is completely functional. Usability will vary by individual user. We have provided detailed data validation handbooks that discuss the process as well as the functionality and usability of the software itself. If problems with the software occur, technical assistance is always available, whether by email, phone, or actual training sessions. That being said, we acknowledge that the software’s technology is aged. To address this concern, the implementation of a new web-based software system has been approved and is underway.


Eliminating photocopies of necessary documents would negate the premise of the data validation process. State and Federal reviewers need a copy of certain documents to verify their accuracy; if these copied documents did not exist, there would be nothing to validate against, and thus no way to determine data accuracy. In terms of reducing paperwork, we feel strongly that a “paperless” system (where all necessary documents would be scanned and stored electronically) would be advantageous on numerous fronts, and is very much encouraged. Regarding which documents can be copied: all documents may be copied, as there is no restriction on which documents may be replicated.


Comment: Four SCSEP grantees commented that the Data Validation process involves too much paperwork and should be made paperless.

ETA Response: While the Department supports efforts to reduce the amount of hard copy documents and paperwork required to effectively administer its programs, a system with no tangible source documentation defeats the intent of data validation and makes a state’s results, both positive and negative, unverifiable. States may scan or make digital copies of the required documentation and append them to electronic case files to eliminate the need for paper copies and make the system “paperless.”

Comment: Two SCSEP grantees and one State Official noted that State agencies would like feedback regarding DV performance.

ETA Response: Feedback is given during the data validation monitoring process (i.e., site visits). However, the Department will examine the possibility of an annual error rate report.


9. Payments to Respondents


This information collection does not involve direct payments to respondents. ETA does provide administrative funding to the participating states and grantees, which are listed as the respondents for purposes of the Paperwork Reduction Act. The requirement to perform data validation derives from states’ and grantees’ responsibility to provide accurate information on program activities and outcomes to ETA. States and grantees are expected to provide resources from their administrative funds for the data validation effort. Validation of program performance is a basic responsibility of grantees, which are required to report program performance, under Department of Labor regulations (29 CFR 95.51 and 97.40).


10. Confidentiality


Participant record layouts used in data validation for the WIA Title IB, Wagner-Peyser, TAA, NFJP, and SCSEP programs have been revised to replace Social Security number fields with state-assigned individual identifiers. The Social Security numbers needed to obtain wage record information are removed from the data files after the wage information is reported, and the fields are replaced with state-specific unique identifiers before the file is uploaded to DRVS. However, data element validation necessarily involves accessing the underlying wage record information by Social Security number in order to verify the accuracy of wage information in the participant records submitted to ETA. Records that fall into the data element validation sample must therefore remain linkable to a Social Security number so that actual wage data can be validated against what is reported in the individual record file: the state uses the unique identifier associated with a sampled record to identify the relevant case file, and the case file information is then used to link to the wage information for purposes of data element validation.
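In simplified form, this identifier handling can be sketched as follows. This is a hypothetical illustration, not the actual DRVS implementation or file layout.

```python
# Hypothetical sketch of the identifier handling described above; the
# actual DRVS file layouts and procedures are defined in the handbooks.
import uuid

# Hypothetical wage record data keyed by Social Security number.
wage_records = {"123-45-6789": 5200.00}

record = {"ssn": "123-45-6789", "reported_wages": 5200.00}

# Before upload: replace the SSN field with a state-assigned identifier,
# keeping the crosswalk under restricted administrative access.
crosswalk = {}
state_id = str(uuid.uuid4())
crosswalk[state_id] = record.pop("ssn")
record["state_id"] = state_id

# During data element validation of a sampled record: use the crosswalk
# to recover the SSN and compare reported wages to the wage record.
ssn = crosswalk[record["state_id"]]
print("wage element valid:", wage_records[ssn] == record["reported_wages"])
```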


To protect the confidentiality of program participants, the validation software includes functionality that allows program administrators to limit access to this information based on administrative clearance. The program administrator is the only person with access to the required password, as one user name and password is issued to each state. No other means of access to these data is permitted. Confidentiality is not an issue with report validation, because that aspect of data validation simply involves verifying the accuracy of aggregate reports submitted to ETA, which contain no confidential information.


11. Questions of a Sensitive Nature


The data collection includes no questions of a sensitive nature.




12. Respondent Annual Burden


Data validation is estimated to require an annual burden of 62,174 hours for all five programs subject to this validation requirement.


Burden estimates for state programs – WIA Title IB, Wagner-Peyser, and TAA – are outlined in Part 12.A. Data validation is estimated to require a total annual burden of 41,970 hours and $1,705,450 for all state programs. Burden estimates for grantee programs – NFJP and SCSEP – are outlined in Part 12.B. Data validation is estimated to require a total annual burden of 20,204 hours and $547,095 for all grantee programs.


A. State Programs: WIA Title IB, Wagner-Peyser, and TAA


Table 2 provides an overview of the annual burden for the WIA Title IB, Wagner-Peyser, and TAA programs, including average hours and costs across states in all three programs. The estimated annual hours needed to conduct validation for these programs is 792 hours (rounded) on average per state and 41,970 hours for all states. The estimated annual cost of performing validation is $32,322 on average per state and $1,705,450 for all states.


Table 2 - Calculation of Combined Annual Burden for WIA Title IB,
Wagner-Peyser, and TAA Programs

                  | No. of States | Hours per State | Total Hours | Rate in $/hr | Total Cost
------------------|---------------|-----------------|-------------|--------------|-----------
Large State       | 18            | 1,206           | 21,708      | $40.81       | $885,903
Medium State      | 18            | 746             | 13,428      | $40.81       | $547,997
Small State       | 17            | 402             | 6,834       | $40.81       | $278,896
All States Total  | 53            | --              | 41,970      | $40.81       | $1,705,450
Average per State | --            | 792             | --          | $40.81       | $32,322


  • The calculation of the hours required to conduct validation reflects the sample size, the time for validators to review sampled case files (34 minutes per file), the travel time to local offices to review the files, and 15% of a supervisor’s time. (An illustrative calculation follows this list.)

  • States have been divided into three categories – large, medium, and small – based on the number of participants that exit a state’s program in a year. The size of the state impacts the number of sampled case files that must be reviewed and the travel time to local offices.

  • The annual travel time per office is estimated as 8 hours for large states, 6 hours for medium states, and 3 hours for small states. This estimate is based on the assumption that states will conduct data element validation separately for the WIA Title IB and TAA programs. If states conduct data element validation for both programs at the same time, the travel time required to perform validation will decrease.

  • The hourly rate is the estimated average hourly earnings for employees in state Unemployment Insurance (UI) agencies in FY 2011 (as used for FY 2011 UI budget formulation purposes).
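For illustration only, the arithmetic implied by these bullets can be sketched as follows. The per-file review time, per-office travel hours, and supervisor factor come from the bullets above; the sample size and office count are assumptions, not ETA figures.

```python
# Illustrative reconstruction of the burden arithmetic in the bullets
# above. Only the 34-minutes-per-file review time, the per-office travel
# hours, and the 15% supervisor factor come from the text; the sample
# size and office count below are hypothetical.
REVIEW_MINUTES_PER_FILE = 34
SUPERVISOR_FACTOR = 0.15
TRAVEL_HOURS_PER_OFFICE = {"large": 8, "medium": 6, "small": 3}

def annual_validation_hours(sampled_files: int, offices: int, size: str) -> float:
    """Estimated annual data element validation hours for one state."""
    review_hours = sampled_files * REVIEW_MINUTES_PER_FILE / 60
    travel_hours = offices * TRAVEL_HOURS_PER_OFFICE[size]
    return (review_hours + travel_hours) * (1 + SUPERVISOR_FACTOR)

# A hypothetical medium state sampling 950 case files across 18 local
# offices lands near the 746-hour medium-state figure in Table 2.
print(round(annual_validation_hours(950, 18, "medium")))  # 743
```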


B. Grantee Programs: NFJP and SCSEP


Table 3 provides an overview of the annual burden for the NFJP and SCSEP, including average hours and costs across grantees in these programs. The estimated annual hours needed to conduct validation for these programs are 161 hours (rounded) on average per grantee and 20,204 hours for all grantees. The estimated annual cost of conducting validation is $4,342 on average per grantee and $547,095 for all grantees.


Table 3 - Calculation of Annual Burden for
NFJP and SCSEP Grantees

                    | No. of Grantees | Hours per Grantee | Total Hours | Rate in $/hr  | Avg. Cost per Grantee | Total Cost
--------------------|-----------------|-------------------|-------------|---------------|-----------------------|-----------
NFJP                | 52              | 158               | 8,216       | $15.16/$40.81 | $2,551                | $132,660
SCSEP               | 74              | 162               | 11,988      | $15.16/$40.81 | $5,600                | $414,435
All Grantees        | 126             | --                | 20,204      | $15.16/$40.81 | --                    | $547,095
Average per Grantee | --              | 161               | --          | --            | $4,342                | --


  • The calculation of the hours required to conduct validation includes the time for validators to review sampled case files (40 minutes per file) and 15% of a supervisor’s time. (Travel is not required for grantees to conduct validation.)

  • The hourly rate used to calculate cost depends upon the type of organization receiving the grant. For state, county, and U.S. territory government grantees, the hourly rate is the estimated average hourly earnings for employees in state UI agencies in FY 2011 (as used for FY 2011 UI budget formulation purposes). For private non-profit grantees and Federally-recognized tribes, the hourly rate is the average hourly earnings in the civic and social organizations industry (CY 2009, Current Employment Statistics survey, U.S. Bureau of Labor Statistics, http://data.bls.gov/PDQ/outside.jsp?survey=ce).


Tables 4 and 5 provide a more detailed account of the annual burden estimates for each grantee program.


Table 4 - Calculation of Annual Burden for NFJP

Type of Grantee            | No. of Grantees | Hours             | Rate in $/hr | Cost
---------------------------|-----------------|-------------------|--------------|---------------------
Private Non-Profit         | 50              | 158 (per grantee) | $15.16       | $2,395 (per grantee)
State or County Government | 2               | 158 (per grantee) | $40.81       | $6,448 (per grantee)
All Grantees               | 52              | 8,216             | --           | $132,660
Avg. per Grantee           | --              | 158               | --           | $2,551


Note: The hourly rate used to calculate cost depends upon the type of organization receiving the grant. For state and county government grantees, the hourly rate is the estimated average hourly earnings for employees in state UI agencies in FY 2011 (as used for FY 2011 UI budget formulation purposes). For private non-profit grantees, the hourly rate is the average hourly earnings in the social assistance industry (CY 2009, Current Employment Statistics survey, U.S. Bureau of Labor Statistics).


Table 5 - Calculation of Annual Burden for SCSEP

Type of Grantee                    | No. of Grantees | Hours             | Rate in $/hr | Cost
-----------------------------------|-----------------|-------------------|--------------|---------------------
Private Non-Profit                 | 18              | 162 (per grantee) | $15.16       | $2,456 (per grantee)
State or U.S. Territory Government | 56              | 162 (per grantee) | $40.81       | $6,611 (per grantee)
All Grantees                       | 74              | 11,988            | --           | $414,435
Avg. per Grantee                   | --              | 162               | --           | $5,600


Note: The hourly rate used to calculate cost depends upon the type of organization receiving the grant. For state and U.S. territory government grantees, the hourly rate is the estimated average hourly earnings for employees in state UI agencies in FY 2011 (as used for FY 2011 UI budget formulation purposes). For private non-profit grantees, the hourly rate is the average hourly earnings in the social assistance industry (CY 2009, Current Employment Statistics survey, U.S. Bureau of Labor Statistics).


13. Estimated Cost to Respondents


The total annual cost burden for conducting data validation is estimated to be $2,252,545 for all five programs, as described in Part A.12 above. As the WIA Title IB, Wagner-Peyser, TAA, NFJP and SCSEP have already implemented data validation, there is no startup burden for these programs.


14. Cost to Federal Government


Federal costs are the staff and contractor costs required to maintain and manage data validation as outlined in Table 6 below. The annual cost of contractor support to provide continual technical support to grantees and states and any needed updates to validation tools for WIA Title IB, Wagner-Peyser, TAA, NFJP and SCSEP will total approximately $645,000 for PY 2010. Costs for ETA staff to manage the data validation program will be $73,006 for continuing operations throughout PY 2010.










Table 6 - Cost of Data Validation to Federal Government

Continuing Operations (WIA Title IB, Wagner-Peyser, TAA, and NFJP) | Cost
-------------------------------------------------------------------|---------
Contractor Support                                                 | $645,000
ETA Staff Total                                                    | $73,006
  1 GS-15 (1/8 time)                                               | $15,470
  1 GS-14 (1/4 time)                                               | $26,303
  1 GS-11 (1/2 time)                                               | $31,234
Total Cost                                                         | $718,006


Note: Staff costs are based on Salary Table 2010-DCB (Step 1, incorporating the 1.5% general schedule increase and a locality payment of 24.22% for the locality pay area of Washington-Baltimore-Northern Virginia, DC-VA-WV-PA), Department of Labor grade ranges as of January 2010.


15. Reasons for Program Change and Change in Burden


There are no program changes or proposed changes to the data validation software and handbooks from the previously approved submission in 2007. Changes in hour and cost burden in this current request for extension represent 1) an increase in the hourly cost of conducting data validation, from $32.50 per hour to $40.81 per hour, due to the updating of pay grade salary tables, and 2) a net decrease in hours, due to a decrease in the number of grantees conducting data validation, since one program previously covered by this information collection (INAP) is now covered under OMB Control Number 1205-0422.


16. Publication Information


ETA publishes the results of data validation in an annual validation report.


17. Reasons for Not Displaying Date OMB Approval Expires


ETA displays OMB approval and expiration information on the validation reports.


18. Exceptions to Certification


There are no exceptions to the certification statement.


