OSHA Data Initiative Collection Quality Control:
Analysis of Audits on CY 2003 Employer
Injury and Illness Recordkeeping
—
REPORTING YEAR ANALYSIS
in Multi-Year Reporting Cycle

Task Order No. 5
Option Year 2
Contract No. J-9-F-3-0015

FINAL REPORT
August 31, 2006

Prepared for:
Office of Statistical Analysis
Occupational Safety and Health Administration
Washington, DC
Prepared by:
ERG
Lexington, MA
&

National Opinion Research Center
Chicago, IL

CONTENTS

EXECUTIVE SUMMARY

INTRODUCTION

AUDIT OBJECTIVES

AUDIT METHODOLOGY AND ANALYTICAL APPROACH
     State Plan State Participation
     Sampling Universe
     Sample Selection of Establishments
     Audit Protocol and Sampling of Employees within Establishments
     Analysis

FINDINGS
     Universe Estimates for CY 2003 Recordkeeping
     Case Analysis
     Submission Comparison Analysis

SUMMARY AND RECOMMENDATIONS

APPENDICES
     A. List of OSHA Data Initiative Collection Quality Reports and Related Studies
     B. Background on the OSHA Injury and Illness Recordkeeping Audit Program
     C. Summary of Major Changes in OSHA Injury and Illness Recordkeeping Under the Revised Rule
     D. Tracking Status Codes Used in Processing CY 2003 ODI Submissions
     E. OSHA Directive: Audit and Verification Program of 2003 Occupational Injury and Illness Records


EXECUTIVE SUMMARY
This report presents findings on the analysis of audits on calendar year (CY) 2003
employer injury/illness recordkeeping. It is the eighth audit program analysis and the second
assessment of records under OSHA’s revised recordkeeping rule, which came into effect as of
January 2002. Last year, a preliminary review of injury/illness recordkeeping in non-construction
establishments under the revised rule found a reasonable level of accuracy that was not
significantly different than the results found in past years under the old rule.
Background
In 1995, the Occupational Safety and Health Administration (OSHA) established its Data
Initiative Collection System (ODI) to gather and compile occupational injury and acute illness
information from some 80,000 establishments in high-hazard industries. At the same time, the
Agency developed mechanisms to ensure the accuracy of the collected ODI data for OSHA’s
use—in combination with other data sources—in targeting enforcement and compliance
assistance interventions and for measuring Agency performance in reducing workplace injuries
and illnesses. OSHA’s ongoing data quality efforts address both the data collection process and
the source records (i.e., employer recordkeeping on the OSHA 300 Log) as an integral part of the
ODI.
OSHA established the audit program with its onsite audits of employer injury and illness
records to annually assess and monitor the quality of employer injury/illness recordkeeping
nationwide. The audit program has focused only on non-construction establishments, with the
exception of the sixth year of the program when OSHA conducted a pilot of the audit
methodology in a sample of construction establishments. Budget constraints have precluded
implementation of the audit program in construction establishments.
OSHA considers onsite audits of employer injury and illness records a key method of
verifying the accuracy of data submitted for the ODI and for estimating the extent of employer
compliance with OSHA recordkeeping requirements defined in 29 CFR 1904. In order to
implement this quality control component, OSHA developed a protocol for reviewing a sample
of employee injury/illness records within a sample of establishments as well as software to
streamline a process that was otherwise too resource intensive for widespread use.
Objectives
The primary objectives for OSHA in the eighth year of the audit program were to:
•  Execute the established protocol in conducting recordkeeping audits in a sample of
   non-construction establishments drawn from the standard ODI universe to estimate
   the accuracy of employer injury and illness recordkeeping nationwide based on the
   ODI universe.

•  Assess employer injury/illness recordkeeping accuracy in the second year of
   recordkeeping under the revised rule.


Audit Methodology and Analytical Approach
For the eighth-year audit program, OSHA selected a sample of audit establishments from
a standard ODI universe, using the same approach as used in recent years of the audit program.
For each year, OSHA compiles the standard ODI universe using a file from Dun & Bradstreet
that provides the most currently available industry, employment, and location information on
establishments. OSHA’s objective in defining a standard ODI universe was to be able to
generalize the annual estimates of overall accuracy for employer injury and illness recordkeeping
to ODI establishments nationwide and to facilitate year-to-year comparisons. Prior to the CY
1999 audit program, the sample was selected from a universe of establishments participating in
the ODI in a specific year.
For this year of the program, OSHA again selected establishments from a universe that
covered all years of the ODI. More specifically, OSHA used a standard ODI universe that
included approximately 120,000 establishments nationwide that met the following criteria:
•  Establishment is located in one of the States participating in the ODI (i.e., either in
   the Federal OSHA jurisdiction or in one of the participating State Plan States).

•  Establishment has total employment of 40 or more.

•  Establishment is in any of the Standard Industrial Classification (SIC) codes selected
   for one of the annual ODI collections except SIC 53 (General Merchandise Stores)
   and SIC 806 (Hospitals), which are only selectively included in the ODI.

To select a sample of audit establishments from a standard ODI universe and to increase
the likelihood of having 250 completed audits available for the analysis, OSHA implemented the
following steps:
1. Draw an initial sample of 398 establishments from the standard ODI universe of
124,989 establishments. Before making this initial selection, OSHA sorted
establishments in the sampling frame by industry code, region, and employment size,
resulting in an implicit stratification. OSHA then drew the sample of establishments
using a systematic selection procedure.
2. Include all establishments selected for the initial sample in the ODI universe for the
CY 2003 collection year.
3. At completion of the ODI data collection cycle for CY 2003, eliminate from the
sample any establishments that did not meet audit program requirements. To be
eligible for an audit, establishments had to be under Federal OSHA jurisdiction or in a
State Plan State that had chosen to participate in the audit program, and their ODI
submissions for CY 2003 had to be OK-verified.
4. Assign the remaining sample of 272 establishments for an audit.


5. Eliminate any completed audits that diverged from audit procedures in the protocol.
For the eighth year of the audit program, OSHA again committed to conducting 250
audits. Previous analyses have established that selecting and assigning a sample of exactly 250
audits at the outset is unlikely to yield the optimum number of completed audits for the analysis.
A shortfall can result because in some instances audits are not conducted due to constraints on
resources.
The target sample size is based on a National Opinion Research Center (NORC)
determination that this approximate number of audits would provide an acceptable level of
power for detecting overall accuracy of employer recordkeeping at-or-above a 95 percent
threshold. This also would enable OSHA to provide reasonable estimates of accuracy for the
universe of establishments. As established for the first-year audit program analysis, at lower
level break-outs, such as at the industry level, universe estimates would be considered unstable
because of the relatively small number of establishments that might occur in the subcategories of
the sample. (See National Opinion Research Center, Final Report: Sample Design for a
Statistically Valid Evaluation of Accuracy and Completeness of an Establishment’s OSHA-Mandated Employee Records, 1996.)
OSHA implemented the same general approach for analyzing the results of the
establishment audits as was used in past years of the program. The analysis approach addressed
two general areas:
Methodology for Implementing the Audit Cycle
•  Reviewing the documentation on the audits for completeness and adherence to the
   established protocol.

•  Comparing the characteristics of the sample of establishments audited to those of
   establishments in the standard ODI universe.

Results Related to the Accuracy of Employer Injury/Illness Recordkeeping
•  Calculating universe estimates of the overall accuracy of employer injury and illness
   recordkeeping based on the results of the audits and the sample design.

•  Comparing recordkeeping accuracy estimates from the eighth-year audit program
   with results from the seventh year (i.e., first year of recordkeeping under the revised
   rule) and sixth year (i.e., the last year of recordkeeping under the old rule).

•  Performing a case-level analysis that describes the types of recordable cases the
   auditors identified in the sample and details the recording errors they discovered.

•  Comparing the employers’ Log Summary and employment and hours worked data at
   the establishment at the time of the audit with the data submitted to OSHA in
   response to the CY 2003 ODI collection request.


Three principal size group categories based on average employment were used—“all
small” (40-99 employees), medium (100-249 employees), and large (250 or more employees). Also, as
with the past five audit program analyses, a small establishments subcategory of 40-49
employees was used to further assess any effect of the inclusion of smaller establishments in the
ODI.
The universe estimate analysis focused on the types of recording errors that affect an
employer’s injury and illness rate, including:
•  Underrecording of total recordable cases—The employer does not record an injury or
   illness that should have been entered on the Log.

•  Underrecording or misrecording of DART cases (days away from work, restriction,
   or transfer injury/illness cases)—Either the case is not recorded on the Log or the case
   is recorded as a non-DART case.

Recording and correctly classifying DART cases affects the accuracy of an establishment’s
combined DART injury and illness rate, which is a rate that OSHA uses for targeting purposes.
(In recent years, OSHA also has been using the establishment’s days-away-from-work case rate
in conjunction with the DART rate for targeting.) Other types of recording errors, such as
incorrect day counts or an injury recorded as an illness, were not analyzed because they do not
affect the calculation for either the DART injury and illness rate or the days-away-from-work
rate. Because the auditors did not find any cases of underrecorded or misrecorded fatalities in the
sample, no analysis was required for this type of case.
OSHA examined the overrecording of cases in regard to the universe estimates as a
separate step. Overrecorded cases are those cases found on the employer’s Log that the auditor
has determined are non-recordable based on a review of employee records during the audit.
A case-level analysis looked at the number and percent of establishments with particular
types of injury and illness case recording results. The types of underrecording errors for total
recordable and DART cases reconstructed in the sample were also determined. The numbers in
the case-level analysis are unweighted and are not intended for conclusions about the universe of
establishments. The information suggests relative distributions of the types of recording errors,
but additional study or a redesigned, larger sample in future audits would be required to fully
interpret their significance.
Summary of Findings
Overall Accuracy of Employer Recordkeeping. The percent of establishments classified
with accurate recordkeeping (at-or-above the 95 percent threshold) is above 90 percent for both
total recordable and DART injury and illness cases. Based on 95 percent confidence intervals for
the two estimates, the percentages of 92.43 percent for total recordable cases and 90.04 percent
for DART cases are not statistically different. Overall, the universe estimates for this year are
consistent with the high level of accuracy found (i.e., above 90 percent) for employer injury and
illness recordkeeping over previous years of the audit program. Among manufacturing and
non-manufacturing establishments, the overall percent of establishments below the threshold of
accuracy was very similar for total recordable cases. For DART cases, the overall percent of
establishments below the threshold was slightly higher for non-manufacturing. Medium-sized
manufacturing establishments had the highest percent of any individual size group.
Case Analysis. In the sample of establishments, non-DART cases were the cases most
frequently not recorded on the Log for both injuries and illnesses. For injuries, this was followed
by cases only involving days away from work (DAFW). For illnesses, all of the unrecorded cases
found by auditors (7 in total) were non-DART cases.
Submission Comparison Analysis. DART cases had the highest percent of establishments
with exactly the same data submitted to OSHA for the ODI and provided by the employer at the
time of the audit. For hours worked, the percent of establishments with exactly the same data was
similar to the percents for establishments with exactly the same data for total recordable and
DART cases.
Summary and Recommendations
Summary. This analysis represents the eighth year of OSHA’s audit program on
employer injury and illness recordkeeping and the second year under the revised rule.
The audit program is well established and the protocol operates efficiently. This year,
OSHA exceeded its target of obtaining 250 audits for the analysis: A total of 251 audits was
usable for the universe estimates, the case-level analysis, and the comparison between submitted
data and data on employer Logs. Only one audit was excluded based on OSHA’s review of
auditor documentation to evaluate compliance with the protocol.
Across all of the years of the program, a number of findings remain consistent:
•  Based on the estimates of the accuracy of employer injury and illness recordkeeping,
   the OSHA Log and employment data collected through the ODI represent reasonable
   quality for OSHA’s targeting and performance measurement purposes.

•  Both some overrecording and underrecording are observed.

•  Underrecording errors are not widely distributed across the sample of establishments.
   A small number of establishments account for most of the underrecorded cases.

•  Differences found in comparing the audit data with the data submitted to OSHA
   result in very few changes of the inspection targeting category status of
   establishments.

Findings this year on the CY 2003 employer injury and illness recordkeeping suggest:
•  A higher percentage of the audit data for hours worked are the same as the data
   submitted to OSHA than under the old recordkeeping rule. The revised Summary
   Form 300A requires employers to record the number of hours worked, so this
   information is more accessible. More accurate and readily available data on hours
   worked supports more accurate injury/illness rates.
•  Although a high level of accuracy continues for both total recordable and DART
   cases, the level of accuracy for DART cases was lower for the CY 2003
   recordkeeping audits than for CY 2002. (For total recordable and DART cases in CY
   2003, however, OSHA found that the accuracy estimates are not statistically
   different.) While the finding from the year-to-year comparison may not indicate a
   downward trend, OSHA will need to monitor accuracy for the recording of DART
   cases in subsequent audits. In the meantime, some proactive outreach would be
   helpful in addressing a potential issue. (See Recommendation 4.)

Recommendations
1. OSHA should continue the audit program as a quality control mechanism to ensure
that an acceptable level of accuracy in employer injury/illness recordkeeping for the
ODI data collection is maintained.
2. The ongoing audit program should maintain the refinements that have established a
credible audit process. These improvements include features such as the creation of a
standard universe and emphasis in the auditor training on adherence to the sampling
protocol.
3. Although the audit program is well established and the protocol operates efficiently,
OSHA should examine the effect(s) of any changes in the assumptions initially used
for the sampling parameters (e.g., the establishment non-compliant rate and the
DART incidence rate) to determine whether any refinements or updates to the
sampling methodology are needed. Also, the potential effect, if any, of the shift from
SIC to NAICS should be reviewed.
4. OSHA should continue to use the information from the audit analysis in outreach
efforts to promote improvements in employer injury and illness recordkeeping under
the revised rule. Specifically, this report and/or summaries of the findings should be
made available to OSHA Compliance Assistance Specialists, ODI data collection
agencies, and the compliance officers who conduct the audits. The information should
also be posted on the OSHA Web site. The correct recording of DART cases should
be emphasized.


INTRODUCTION
In 1995, the Occupational Safety and Health Administration (OSHA) established its Data
Initiative Collection System (ODI) to gather and compile occupational injury and acute illness
information from some 80,000 establishments in high-hazard industries. At the same time, the
Agency developed mechanisms to ensure the accuracy of the collected ODI data for OSHA’s
use—in combination with other data sources—in targeting enforcement and compliance
assistance interventions and for measuring Agency performance in reducing workplace injuries
and illnesses. OSHA’s ongoing data quality efforts address both the data collection process and
the source records (i.e., employer recordkeeping on the OSHA 300 Log1) as an integral part of
the ODI. (Appendix A lists audit program analyses, data validation study reports, and related
studies conducted to date.)

1 Under the revised rule, this simplified version of the predecessor Form 200 (Log and Summary of
Occupational Injuries and Illnesses) came into use as of January 2002.
OSHA established the audit program with its onsite audits of employer injury and illness
records to annually assess and monitor the quality of employer injury/illness recordkeeping
nationwide. (Appendix B describes OSHA’s initial quality control efforts and provides
background on the development of the audit program.) The audit program has focused only on
non-construction establishments, with the exception of the sixth year of the program when
OSHA conducted a pilot of the audit methodology in a sample of construction establishments.
Budget constraints have precluded implementation of the audit program in construction
establishments.
OSHA considers onsite audits of employer injury and illness records a key method of
verifying the accuracy of data submitted for the ODI and for estimating the extent of employer
compliance with OSHA recordkeeping requirements defined in 29 CFR 1904. In order to
implement this quality control component, OSHA developed a protocol for reviewing a sample
of employee injury/illness records within a sample of establishments (see Appendix E) as well as
software to streamline a process that was otherwise too resource intensive for widespread use.
This report presents findings on the analysis of audits on calendar year (CY) 2003
employer injury/illness recordkeeping. It is the eighth audit program analysis and the second
assessment of records under OSHA’s revised recordkeeping rule, which came into effect as of
January 2002. Last year, a preliminary review of injury/illness recordkeeping in non-construction
establishments under the revised rule found a reasonable level of accuracy that was not
significantly different than the results found in past years under the old rule. Appendix C
summarizes the major changes between the old and the revised rule.
AUDIT OBJECTIVES
The primary objectives for OSHA in the eighth year of the audit program were to:
•  Execute the established protocol in conducting recordkeeping audits in a sample of
   non-construction establishments drawn from the standard ODI universe to estimate
   the accuracy of employer injury and illness recordkeeping nationwide based on the
   ODI universe.

•  Assess employer injury/illness recordkeeping accuracy in the second year of
   recordkeeping under the revised rule.

In the sections that follow, OSHA presents its methodology, analytical approach, and
findings in regard to these objectives using the information gathered during audits on CY 2003
recordkeeping. The final section of the report provides a summary of findings and
recommendations based on the study.
AUDIT METHODOLOGY AND ANALYTICAL APPROACH
The methodology for the analysis covers efforts to maintain the current level of audit
program participation, the implementation of sample selection from a standard ODI universe for
a fifth year that allows for generalizing the estimate of overall recordkeeping accuracy to ODI
establishments nationwide and facilitates year-to-year comparisons, and the continued emphasis
on adherence to the protocol’s procedures for conducting the audits.
State Plan State Participation
OSHA invites State Plan States to participate in the audit program on a voluntary basis.
Based on audit program experience, OSHA assumes that the number of States able to participate
in a particular year will continue at about the current level, despite some variation in
participating States. This year, 11 States participated in the program, one more than last year.
Three of the 11 (Arizona, California, and Hawaii) did not participate last year, and two of last
year’s participants (North Carolina and Vermont) opted out this year.
Of the 23 State Plan States overall, 4 have decided not to participate in the ODI.
Also, the Commonwealth of Puerto Rico and the Virgin Islands (a U.S. Territory) are considered
ineligible for participation in the ODI. Another 6 State Plan States chose not to participate in this
year’s audit program.
The remaining 11 State Plan States that did participate in the eighth-year audit program
are: Arizona, California, Hawaii, Indiana, Kentucky, Maryland, Minnesota, Nevada, New
Mexico, Utah, and Virginia.
Sampling Universe
For the eighth-year audit program, OSHA selected a sample of audit establishments from
a standard ODI universe, using the same approach as used in recent years of the audit program.
For each year, OSHA compiles the standard ODI universe using a file from Dun & Bradstreet
that provides the most currently available industry, employment, and location information on
establishments. OSHA’s objective in defining a standard ODI universe was to be able to

generalize the annual estimates of overall accuracy for employer injury and illness recordkeeping
to ODI establishments nationwide and to facilitate year-to-year comparisons. Prior to the CY
1999 audit program, the sample was selected from a universe of establishments participating in
the ODI in a specific year.
For this year of the program, OSHA again selected establishments from a universe that
covered all years of the ODI. More specifically, OSHA used a standard ODI universe that
included approximately 120,000 establishments nationwide that met the following criteria:
•  Establishment is located in one of the States participating in the ODI (i.e., either in
   the Federal OSHA jurisdiction or in one of the participating State Plan States).

•  Establishment has total employment of 40 or more.

A third criterion when the standard ODI universe was first implemented was to include
establishments in any of the Standard Industrial Classification (SIC) codes selected for one of the
annual ODI collections. Several program cycles ago, OSHA modified the criteria somewhat by
including establishments in SIC codes from any of the ODI collections except SIC 53 (General
Merchandise Stores) and SIC 806 (Hospitals). OSHA made this refinement to the definition of
the standard ODI universe to address the possibility that the number of establishments in these
large industry sectors, which are only selectively included in the ODI, could affect the overall
representativeness of the audit sample selection.
The objective of using the standard ODI universe is to address analytical limitations
associated with selecting a sample from the collection year-specific ODI universe, which is
subject to shifting characteristics. These shifts across collection years result from the
Agency’s slightly different data collection needs related to intervention targeting and
performance measurement.
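To make the universe definition concrete, the filter can be sketched as follows. This is a minimal illustration only, not OSHA’s production process; the field names, example records, and the state and SIC sets are hypothetical placeholders for the Dun & Bradstreet-derived frame described above.

```python
# Minimal sketch of the standard ODI universe filter described above.
# Field names, example records, and the state/SIC sets are hypothetical.
PARTICIPATING_STATES = {"TX", "GA", "AZ", "CA", "HI", "IN", "KY", "MD", "MN", "NV", "NM", "UT", "VA"}
ODI_SIC_PREFIXES = {"80", "42", "35", "34", "20"}   # illustrative subset of SICs ever selected for the ODI
EXCLUDED_SIC_PREFIXES = {"53", "806"}               # only selectively collected, so excluded

def in_standard_odi_universe(est):
    """True if an establishment meets the standard ODI universe criteria."""
    sic = est["sic"]
    return (
        est["state"] in PARTICIPATING_STATES
        and est["total_employment"] >= 40
        and (sic[:2] in ODI_SIC_PREFIXES or sic[:3] in ODI_SIC_PREFIXES)
        and sic[:2] not in EXCLUDED_SIC_PREFIXES
        and sic[:3] not in EXCLUDED_SIC_PREFIXES
    )

frame = [
    {"id": 1, "state": "CA", "sic": "3541", "total_employment": 120},
    {"id": 2, "state": "CA", "sic": "8062", "total_employment": 300},  # SIC 806 (Hospitals): excluded
    {"id": 3, "state": "MN", "sic": "2011", "total_employment": 35},   # under 40 employees: excluded
]
universe = [e for e in frame if in_standard_odi_universe(e)]
print([e["id"] for e in universe])   # -> [1]
```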
Sample Selection of Establishments
For the eighth year of the audit program, OSHA again committed to conducting 250
audits. Previous analyses have established that selecting and assigning a sample of exactly 250
audits at the outset is unlikely to yield the optimum number of completed audits for the analysis.
A shortfall can result because in some instances audits are not conducted due to constraints on
resources.
The target sample size is based on a National Opinion Research Center (NORC)
determination that this approximate number of audits would provide an acceptable level of
power for detecting overall accuracy of employer recordkeeping at-or-above a 95 percent
threshold. This also would enable OSHA to provide reasonable estimates of accuracy for the
universe of establishments. As established for the first-year audit program analysis, at lower
level break-outs, such as at the industry level, universe estimates would be considered unstable
because of the relatively small number of establishments that might occur in the subcategories of
the sample. (See National Opinion Research Center, Final Report: Sample Design for a
Statistically Valid Evaluation of Accuracy and Completeness of an Establishment’s OSHA-Mandated Employee Records, 1996.)
To select a sample of audit establishments from a standard ODI universe and to increase
the likelihood of having 250 completed audits available for the analysis, OSHA implemented the
following steps:
Step 1. Select an initial sample of establishments from the standard ODI universe.
Before defining the ODI universe for the CY 2003 collection year, OSHA made an initial
selection of 398 establishments from a universe file that was compiled from a Dun &
Bradstreet establishment file. The compiled sample selection file included all 124,989
establishments that met the criteria established for the audit program’s standard ODI
universe. Before making this initial selection, OSHA sorted establishments in the
sampling frame by industry code, region, and employment size, resulting in an implicit
stratification. OSHA then drew the sample of establishments using a systematic selection
procedure.
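The sorting and systematic selection described in Step 1 can be illustrated with a short sketch. The sort keys and record fields are hypothetical; the point is that sorting the frame by industry code, region, and employment size before taking every k-th record spreads the sample across those dimensions (the implicit stratification noted above).

```python
import random

def systematic_sample(frame, n):
    """Sort on the implicit-stratification keys, then take every k-th record from a random start."""
    ordered = sorted(frame, key=lambda e: (e["sic"], e["region"], e["employment"]))
    k = len(ordered) / n                       # sampling interval
    start = random.uniform(0, k)
    return [ordered[int(start + i * k)] for i in range(n)]

# Demonstration with a synthetic 1,000-record frame; the real draw took 398 of 124,989.
demo_frame = [{"sic": f"{20 + i % 15:02d}", "region": i % 10, "employment": 40 + i % 500}
              for i in range(1000)]
print(len(systematic_sample(demo_frame, 40)))   # -> 40
```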
Step 2. Include all establishments selected for the initial sample in the ODI universe
for the CY 2003 collection year.
OSHA included all 398 establishments selected from the standard ODI universe in the
CY 2003 ODI collection universe.
Step 3. At completion of the ODI data collection cycle for CY 2003, eliminate from
the sample any establishments that do not meet audit program requirements.
After the CY 2003 ODI collection cycle was completed, OSHA screened from the sample
any establishments located in State Plan States that had chosen not to participate in the
audit program. From those that remained, any establishments for which OSHA did not
have an OK-verified submission from the CY 2003 collection were screened out. (OSHA
submission tracking codes that indicate the data are OK verified are: OK, OKPD, and
ECRG. See Appendix D for a glossary of tracking codes.) As a result, 126 establishments
were eliminated from the sample in this step.
Step 4. Assign the remaining sample establishments for an audit.
OSHA assigned 272 establishments for an audit. When any of the original audit
establishment selections could not be audited (e.g., when found to be out-of-business or
to be a headquarters location), replacement establishments were selected from the
collection year CY 2003 ODI universe. An establishment could be selected as a
replacement if it was in the same jurisdiction as the original selection, it matched on the
2-digit SIC code, and the average number of employees was the same or similar.
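The replacement rule in Step 4 can be sketched as a simple matching filter. Field names are hypothetical, and the tolerance used to approximate "the same or similar" average employment is an assumption for illustration rather than OSHA’s documented criterion.

```python
def candidate_replacements(original, universe, employment_tolerance=0.25):
    """Establishments eligible to replace an original selection that could not be audited:
    same jurisdiction, same 2-digit SIC, and similar average employment (tolerance assumed)."""
    return [
        est for est in universe
        if est["id"] != original["id"]
        and est["jurisdiction"] == original["jurisdiction"]
        and est["sic"][:2] == original["sic"][:2]
        and abs(est["avg_employment"] - original["avg_employment"])
            <= employment_tolerance * original["avg_employment"]
    ]

original = {"id": 101, "jurisdiction": "Federal", "sic": "3441", "avg_employment": 120}
pool = [
    {"id": 102, "jurisdiction": "Federal", "sic": "3462", "avg_employment": 135},
    {"id": 103, "jurisdiction": "CA", "sic": "3441", "avg_employment": 120},
]
print([e["id"] for e in candidate_replacements(original, pool)])   # -> [102]
```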
Step 5. Eliminate any completed audits that were not properly conducted.
As completed audit files were submitted, OSHA reviewed them and determined which
ones followed the requirements in the recordkeeping protocol (see Appendix E). Based on
this review, OSHA eliminated 1 audit due to divergence from the protocol.
Audit Protocol and Sampling of Employees within Establishments
The same approach to sampling employees within establishments and essentially the
same protocol were used this time as in past years of the audit program. (Appendix E presents
OSHA’s compliance directive on recordkeeping audits.) Furthermore, OSHA maintained an
emphasis on adherence to the protocol in its training for staff conducting the audits.
In analyzing the recordkeeping audit program, OSHA has found that the audit protocol
establishes an efficient approach for conducting and documenting recordkeeping audits.
Adherence to the protocol and use of the ORAA software system provide auditors with an
efficient process that allows the Agency to feasibly monitor the quality of employer injury and
illness recordkeeping.
An important feature of the ORAA software is the built-in function that enables the
auditor to determine the number of employees to be sampled at each establishment. After the
auditor enters the number of employees at the establishment and the number of cases on the
employer’s OSHA 300 Log, the software calculates the number of employees to be sampled.
This sample is based on certain assumptions about the occurrence of recordable injuries and
illnesses, the level of recording accuracy, and the likelihood of detecting errors in recording.
Statistical assumptions that were established to determine the sample size included a threshold of
accuracy of 95 percent, an alpha level of 0.05, and a power of 75 percent. (A full discussion of
the statistical power analysis can be found in the National Opinion Research Center Final
Report: Sample Design for a Statistically Valid Evaluation of Accuracy and Completeness of an
Establishment’s OSHA-Mandated Employee Records—see especially pp.4-6.)
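For readers interested in the mechanics, the sketch below shows a standard normal-approximation sample-size calculation for a one-sided test of a proportion under the stated parameters (95 percent accuracy threshold, alpha of 0.05, power of 75 percent). The alternative accuracy level of 85 percent is an assumption chosen only for illustration; the actual ORAA calculation is specified in the NORC report and also depends on the number of employees and cases at the establishment, so this is not the software’s algorithm.

```python
from math import ceil, sqrt
from scipy.stats import norm

def sample_size_one_sided(p0=0.95, p1=0.85, alpha=0.05, power=0.75):
    """Normal-approximation sample size for a one-sided test of H0: p = p0
    against the alternative p = p1 (p1 < p0) at the given alpha and power."""
    z_alpha = norm.ppf(1 - alpha)   # one-sided critical value
    z_beta = norm.ppf(power)
    numerator = z_alpha * sqrt(p0 * (1 - p0)) + z_beta * sqrt(p1 * (1 - p1))
    return ceil((numerator / (p0 - p1)) ** 2)

print(sample_size_one_sided())   # records to review under these illustrative assumptions
```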
Analysis
OSHA implemented the same general approach for analyzing the results of the
establishment audits as was used in past years of the program. The analysis approach addressed
two general areas:
Methodology for Implementing the Audit Cycle
•  Reviewing the documentation on the audits for completeness and adherence to the
   established protocol.

•  Comparing the characteristics of the sample of establishments audited to those of
   establishments in the standard ODI universe.


Results Related to the Accuracy of Employer Injury/Illness Recordkeeping
•  Calculating universe estimates of the overall accuracy of employer injury and illness
   recordkeeping based on the results of the audits and the sample design.

•  Comparing recordkeeping accuracy estimates from the eighth-year audit program
   with results from the seventh year (i.e., first year of recordkeeping under the revised
   rule) and sixth year (i.e., the last year of recordkeeping under the old rule).

•  Performing a case-level analysis that describes the types of recordable cases the
   auditors identified in the sample and details the recording errors they discovered.

•  Comparing the employers’ Log Summary and employment and hours worked data at
   the establishment at the time of the audit with the data submitted to OSHA in
   response to the CY 2003 ODI collection request.

Approach for Analysis of the Implementation of the Audit Cycle
The compliance officers’ documentation of the audits was carefully reviewed to
confirm the procedures used in the audit. A total of 251 audits was usable for the universe
estimates, the case-level analysis, and the comparisons made between data on the Log and data
submitted to OSHA for the total recordable cases, DART cases, and hours worked. (This number
of establishment audits available for the analysis is consistent with last year’s analysis. In prior
years, OSHA approached but did not reach the methodology’s target of 250 audits available for
the analysis.) As in the past, the primary reason for not conducting some of the audits
was resource constraints. Of the audits that were conducted, 1 was excluded based on OSHA’s
review of the documentation for each audit to determine whether auditors had fully followed the
protocol or if an audit should be eliminated for any other reason, compared to 5 last year and 21
in the first year of the audit program. (Appendix E presents OSHA’s procedures for
recordkeeping audits.)
The sample of establishments audited was compared to the standard ODI universe
of establishments by size and industry to determine the representativeness of the sample.
Three principal size group categories based on average employment were used—“all small” (40-99 employees), medium (100-249 employees), and large (250 or more employees). Also, as with the
past five audit program analyses, a small establishments subcategory of 40-49 employees was
used to further assess any effect of the inclusion of smaller establishments in the ODI.
For industry matching, the sample and universe were compared at the 2-digit SIC level.
Also, comparisons were developed for all manufacturing and non-manufacturing establishments.
Table 1 provides the distribution of audited establishments by size group based on
average employment compared to the standard ODI universe. Sample establishments were
selected from this universe and assigned for an audit if the establishment was in the Federal
OSHA jurisdiction or in one of the 11 State Plan States participating in this year’s audit program,
and if the establishment’s ODI submission for CY 2003 was OK verified.
Table 1
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Number and Percent of Establishments in the Recordkeeping Audit Sample and
the Standard ODI Universe by Establishment Size Group

                                        Audit Sample (a)             Standard ODI Universe (b)
Establishment Size Group                Establishments               Establishments
(average number of employees)         Number    Percent (c)        Number     Percent (c)
                                                 of Sample                     of Universe

All Small (40-99) (d)                    114       45.42             68,489      54.80
  Small (40-49) (e)                       24        9.56             19,998      16.00
Medium (100-249)                         106       42.23             39,343      31.48
Large (250 or more)                       31       12.35             17,157      13.73
All Sizes                                251      100               124,989     100

Note on representativeness of sample: Overall, the sample of audited establishments is similar to the distribution of
establishments in the standard ODI universe by size group, reflecting the effect of implicit stratification. OSHA did
not assess further this apparent relationship, noting (as pointed out by Hays, W. L. [1994]. Statistics [5th ed.].
Orlando, FL: Harcourt Brace & Co.) that Pearson’s Chi-Square test would not provide a reliable assessment of
goodness of fit, given that only three size categories are available (i.e., the establishment size grouping of “all
small,” medium, and large). This test provides a reasonable approximation only when the number of categories
available for conducting the comparison—size or industry categories in this analysis—is reasonably large.
a. The audit sample is limited to establishment audits that OSHA assigned from the original sample of
establishments, as drawn from the standard ODI universe, and that OSHA determined were usable for the analysis
after confirming that the audits were conducted according to established recordkeeping audit procedures (see CPL 2
in Appendix E). Establishments in the original sample were assigned for an audit if they were under the OSHA
Federal jurisdiction or in one of the 11 State Plan States that voluntarily participated in the audit program, and if
their CY 2003 OSHA Data Initiative (ODI) submission was OK verified. For the comparison in Table 1,
establishment size group information for the audit sample establishments was derived from the employer-submitted
2003 ODI data.
b. The standard ODI universe includes all establishments that are in States participating in the ODI, have 40 or more
employees, and are in one of the SICs selected for any of the ODI collections—except SIC 53 (General Merchandise
Stores) and SIC 806 (Hospitals). Because OSHA has not collected ODI data from all establishments in the standard
ODI universe, for the comparison in Table 1, establishment size group information for establishments in the
standard ODI universe was derived from Dun & Bradstreet data.
c. Because of rounding, percentages may not add to 100.
d. The “all small” (40-99) and “small” (40-49) establishment size groups include 1 establishment with an average
employment of less than 40 employees. This establishment had total employment of 46 for the year, but average
employment during 2003 was determined to be 39.
e. The “small” size group is a subset of the “all small” size group.


The same group of audited establishments presented in Table 1 is compared to the
universe by industry in Table 2 at the 2-digit SIC level. Table 2 also presents the comparison of
all manufacturing and non-manufacturing establishments.
Table 2
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Number and Percent of Establishments in the Recordkeeping Audit Sample and
the Standard ODI Universe by Industry (2-digit SIC) Sorted by
Number of Establishments in the Universe

                                                  Audit Sample (a)          Standard ODI Universe (b)
SIC Code (2-digit level)                          Establishments            Establishments
and Industry                                    Number   Percent (c)       Number    Percent (c)
                                                         of Sample                   of Universe

80  Health Services                               30       11.95            13,959     11.17
42  Trucking and Warehousing                      15        5.98            10,289      8.23
35  Machinery, Except Electrical                  23        9.16             9,728      7.78
34  Fabricated Metal Products                     20        7.97             8,594      6.88
27  Printing and Publishing                       12        4.78             7,418      5.93
36  Electric and Electronic Equipment             11        4.38             7,006      5.61
20  Food and Kindred Products                     22        8.76             6,722      5.38
30  Rubber and Misc. Plastic Products             14        5.58             5,757      4.61
51  Wholesale Trade—Nondurable Goods               8        3.19             5,001      4.00
28  Chemicals and Allied Products                  9        3.59             4,910      3.93
50  Wholesale Trade—Durable Goods                  5        1.99             4,861      3.89
37  Transportation Equipment                       8        3.19             4,003      3.20
38  Instruments and Related Products               8        3.19             3,779      3.02
24  Lumber and Wood Products                       7        2.79             3,586      2.87
52  Building Materials and Garden Supplies         9        3.59             3,506      2.81
26  Paper and Allied Products                      4        1.59             3,503      2.80
33  Primary Metal Industries                       8        3.19             3,156      2.53
32  Stone, Clay, and Glass Products                5        1.99             3,096      2.48
23  Apparel and Other Textile Products             3        1.20             2,594      2.08
25  Furniture and Fixtures                         7        2.79             2,467      1.97
39  Misc. Manufacturing Industries                 4        1.59             2,380      1.90
22  Textile Mill Products                          4        1.59             2,045      1.64
45  Transportation by Air                          6        2.39             1,869      1.50
49  Electric, Gas, and Sanitary Services           3        1.20             1,634      1.31
29  Petroleum and Coal Products                    1        0.40               612      0.49
01  Agricultural Production—Crops                  2        0.80               546      0.44
02  Agricultural Production—Livestock              1        0.40               490      0.39
44  Water Transportation                           0        0                  429      0.34
31  Leather and Leather Products                   1        0.40               344      0.28
43  U.S. Postal Service                            0        0                  297      0.24
07  Agricultural Services                          1        0.40               185      0.15
47  Transportation Services                        0        0                  138      0.11
21  Tobacco Manufactures                           0        0                   85      0.07
All Manufacturing SICs                           171       68.13            75,063     60.06
All Non-Manufacturing SICs                        80       31.87            49,926     39.94
All SICs                                         251      100              124,989    100

Note on representativeness of sample: Overall, the sample of audited establishments appears representative of the
standard ODI universe by industry, reflecting the effect of implicit stratification. OSHA further evaluated and
supported this finding with Pearson’s Chi-Square test for goodness of fit, using the many more categories available
for this comparison than for the size category comparison. In applying the test, no significant deviations from fit
were observed (Chi-Square = 21.485, df = 30, n.s.).
a. The audit sample is limited to establishment audits that OSHA assigned from the original sample of
establishments, as drawn from the standard ODI universe, and that OSHA determined were usable for the analysis
after confirming that the audits were conducted according to established recordkeeping audit procedures (see CPL 2
in Appendix E). Establishments in the original sample were assigned for an audit if they were under the OSHA
Federal jurisdiction or in one of the 11 State Plan States that voluntarily participated in the audit program, and if
their CY 2003 OSHA Data Initiative (ODI) submission was OK verified. For the comparison in Table 2,
establishment industry information for the audit sample establishments was derived from the employer-submitted
2003 ODI data.
b. The standard ODI universe includes all establishments that are in States participating in the ODI, have 40 or more
employees, and are in one of the SICs selected for any of the ODI collections—except SIC 53 (General Merchandise
Stores) and SIC 806 (Hospitals). Because OSHA has not collected ODI data from all establishments in the standard
ODI universe, for the comparison in Table 2, industry information for establishments in the standard ODI universe
was derived from Dun & Bradstreet data.
c. Because of rounding, percentages may not add to 100.
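The goodness-of-fit comparison summarized in the note to Table 2 can be outlined as follows. This is an illustrative sketch only: it uses just the first five Table 2 industry categories, rescaled so that observed and expected totals match, whereas the report’s own test (Chi-Square = 21.485, df = 30) was computed over the full set of categories.

```python
import numpy as np
from scipy.stats import chi2

# Observed audit-sample counts for the first five Table 2 categories and the
# corresponding shares of the standard ODI universe (11.17%, 8.23%, ...).
observed = np.array([30, 15, 23, 20, 12])
universe_share = np.array([0.1117, 0.0823, 0.0778, 0.0688, 0.0593])

# Expected counts if the sample mirrored the universe distribution, rescaled so the
# observed and expected totals match for this illustrative subset of categories.
expected = universe_share / universe_share.sum() * observed.sum()

chi_square = ((observed - expected) ** 2 / expected).sum()
df = len(observed) - 1
p_value = chi2.sf(chi_square, df)
print(f"chi-square = {chi_square:.3f}, df = {df}, p = {p_value:.3f}")
```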

Approach for Analysis of Results Related to the Accuracy of Injury/Illness Recordkeeping
The universe estimate analysis focused on the types of recording errors that affect
an employer’s injury and illness rate, including:
•  Underrecording of total recordable cases—The employer does not record an injury or
   illness that should have been entered on the Log.

•  Underrecording or misrecording of DART cases (days away from work, restriction,
   or transfer injury/illness cases)—Either the case is not recorded on the Log or the case
   is recorded as a non-DART case.

Recording and correctly classifying DART cases affects the accuracy of an establishment’s
combined DART injury and illness rate, which is a rate that OSHA uses for targeting purposes.
(In recent years, OSHA also has been using the establishment’s days-away-from-work case rate
in conjunction with the DART rate for targeting.) Other types of recording errors, such as
incorrect day counts or an injury recorded as an illness, were not analyzed because they do not
affect the calculation for either the DART injury and illness rate or the days-away-from-work
rate. Because the auditors did not find any cases of underrecorded or misrecorded fatalities in the
sample, no analysis was required for this type of case.
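For context, incidence rates such as the DART rate and the days-away-from-work rate are computed on a base of 200,000 hours worked (the hours of 100 full-time employees over one year); these rates count cases, not days, which is why day counts do not enter the calculation. A minimal sketch:

```python
def incidence_rate(case_count, hours_worked):
    """OSHA-style incidence rate: cases per 200,000 hours worked
    (the hours of 100 full-time employees over one year)."""
    return case_count * 200_000 / hours_worked

# Example: 6 DART cases at an establishment reporting 350,000 hours worked for the year.
print(round(incidence_rate(6, 350_000), 2))   # -> 3.43
```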
OSHA examined the overrecording of cases in regard to the universe estimates as a
separate step. Overrecorded cases are those cases found on the employer’s Log that the auditor
has determined are non-recordable based on a review of employee records during the audit.
The same steps used in past years’ analyses were involved in classifying an establishment
as accurate in the recording of total recordable cases and the recording of DART cases on the
Log. Estimates of the percent of establishments with accurate recording of these cases are based
on the sample design for both the selection of establishments and the sampling of employees
within establishments. The steps are as follows:
Step 1.  A significance test was applied to the results of the sample of employee records
         reviewed for each audit to determine whether an establishment should be
         classified as at-or-above a 95 percent threshold of accuracy. (See National
         Opinion Research Center, Final Report: Sample Design for a Statistically Valid
         Evaluation of Accuracy and Completeness of an Establishment’s OSHA-Mandated
         Employee Records, 1996, page 5 for an explanation of the threshold of accuracy.)

Step 2.  The percent of sample establishments at-or-above the 95 percent threshold of
         accuracy was calculated. The sample percent provides an estimate of the
         proportion of establishments at-or-above the 95 percent threshold of accuracy in
         the standard ODI universe. The projection to this universe is valid because of
         the implicit stratified sample design of the sample of establishments.

Step 3.  A standard error of the percent estimate was calculated using the simple random
         sampling variance estimator.
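The computations in Steps 2 and 3 can be written out directly. The sketch below takes the per-establishment pass/fail classification from Step 1 as given (the exact form of that significance test is specified in the NORC report) and applies the simple random sampling variance estimator named above. With 232 of 251 establishments at or above the threshold it returns an estimate of about 92.4 percent with a standard error of about 1.7 percent, in line with Table 3, and the overlapping confidence intervals it produces for the two CY 2003 estimates are consistent with the finding that they are not statistically different.

```python
from math import sqrt

def universe_estimate(num_accurate, n):
    """Percent of establishments classified at-or-above the accuracy threshold, with the
    simple random sampling standard error and a normal-approximation 95% confidence interval."""
    p = num_accurate / n
    se = sqrt(p * (1 - p) / n)
    return p, se, (p - 1.96 * se, p + 1.96 * se)

for label, accurate, n in [("Total Recordable", 232, 251), ("DART", 226, 251)]:
    p, se, (lo, hi) = universe_estimate(accurate, n)
    print(f"{label}: {p:.2%} (SE {se:.2%}), 95% CI {lo:.2%} to {hi:.2%}")
```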

Universe estimates for any given year, however, cannot be generalized to all of the
nation’s workplaces for the following reasons:
•  The ODI focuses on selected high-rate industries and excludes establishments with
   fewer than 40 employees.

•  Not all State Plan States participate in the ODI or the audit program.

Additional analyses would need to be conducted before such use of the estimates could be
supported.
The extent of overrecorded cases was also calculated. Overrecording occurs when the
employer enters a case on the Log that does not meet the criteria for recordability. For example,
an injury occurred but only required first aid.
See the Findings section for the results of the universe estimates analysis.
A case-level analysis looked at the number and percent of establishments with
particular types of injury and illness case recording results. The types of underrecording
errors for total recordable and DART cases reconstructed in the sample were also determined.
The numbers in the case-level analysis are unweighted and are not intended for
conclusions about the universe of establishments. The information suggests relative distributions
of the types of recording errors, but additional study or a redesigned, larger sample in future
audits would be required to fully interpret their significance. See the Findings section for the results of
this analysis.
The employer’s Log Summary at the establishment was compared with the data
submitted to OSHA. Comparisons were made between data on the Log and submitted data for
the total recordable cases, DART cases, and hours worked data by size group and by
manufacturing versus non-manufacturing establishments in the universe. The analysis also
looked at the reasons for the differences between data on the Log and submitted data. The
ORAA software includes a pick-list of reasons provided by establishment recordkeepers and the
capability to distinguish between primary and secondary reasons for differences.
This component of the study used the same 251 audits that were available for use in the
universe estimate and the case-level analyses. See the Findings section for the results of this
analysis.
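The submission comparison can be sketched as a field-by-field match of the values submitted for the ODI against the values taken from the employer’s records at the time of the audit. The field names and example records below are hypothetical placeholders; the actual comparison and the pick-list of reasons for differences are handled within the ORAA software.

```python
FIELDS = ("total_recordable", "dart_cases", "hours_worked")

def exact_matches(submitted, on_site, fields=FIELDS):
    """For each field, True if the ODI-submitted value equals the value on the employer's records."""
    return {f: submitted[f] == on_site[f] for f in fields}

def percent_exact_match(audits, field):
    """Percent of audited establishments with an exact match on one field."""
    hits = sum(exact_matches(a["submitted"], a["on_site"])[field] for a in audits)
    return 100 * hits / len(audits)

# Usage sketch with two hypothetical audits.
audits = [
    {"submitted": {"total_recordable": 12, "dart_cases": 5, "hours_worked": 410_000},
     "on_site":   {"total_recordable": 12, "dart_cases": 5, "hours_worked": 402_500}},
    {"submitted": {"total_recordable": 3, "dart_cases": 1, "hours_worked": 96_000},
     "on_site":   {"total_recordable": 3, "dart_cases": 1, "hours_worked": 96_000}},
]
print({f: percent_exact_match(audits, f) for f in FIELDS})
```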


FINDINGS
This section presents the results related to the accuracy of employer injury and illness
recordkeeping. The assessment includes summary indicators for the universe of establishments,
the types of recordkeeping errors that auditors identified in the sample, and a comparison of the
injury/illness and employment data submitted for the ODI collection with that maintained at the
establishment.
Universe Estimates for CY 2003 Recordkeeping
The primary objective of the audits is to derive estimates of the overall accuracy of
employer injury and illness recordkeeping (as previously defined). In the first three years of the
audit program, the sample results could be applied only to the sampling universe made up of
establishments that were in the ODI universe for the specific collection year and that were
participating in the audit program.
This year is OSHA’s fifth year of sample selection from a universe that is representative
of nearly all establishments nationwide included in the ODI. An exception to the sample’s
representativeness of all ODI establishments results from a refinement OSHA made to the
standard universe a number of years ago. The change involved excluding two industries—
SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals)—for which OSHA collects Log
summary data and employment information from only a portion of the population of
establishments. For other industries, OSHA collects data from the entire population of
establishments that meet ODI criteria. OSHA made the adjustment to consider the possibility that
the population size of these industry sectors (about 10,000 establishments each) could affect the
overall representativeness of the audit sample selection.
As a result of using the standard universe, again the overall accuracy estimates from the
sample of audits can be generalized to all ODI establishments nationwide, except those in SICs
53 and 806.
Universe estimates for any given year cannot be generalized to all of the nation’s
workplaces because the ODI focuses on selected high-rate industries and excludes
establishments with fewer than 40 employees. Also, not all State Plan States participate in the
ODI or the audit program. Additional analyses would need to be conducted before such use of
the estimates could be supported.
The sample of establishments and the sample of employees within establishments were
designed to allow a reasonable estimation of the extent to which employers enter recordable
cases on their Logs (the extent to which cases are not underrecorded) or correctly classify DART
cases. Two years ago, a first fatality case was identified by an auditor in the sample of
establishments; no fatality cases have been identified since.
Table 3 provides the results of the universe estimates analysis for CY 2003
recordkeeping. Generalizing from the sample of audit establishments, the percent of
establishments classified with accurate recordkeeping (at-or-above the 95 percent threshold) is
above 90 percent for both total recordable and DART injury and illness cases. Based on 95
percent confidence intervals for the two estimates, the percentages of 92.43 percent for total
recordable cases and 90.04 percent for DART cases are not statistically different.
Overall, the universe estimates for this year are consistent with the high level of accuracy
found (i.e., above 90 percent) for employer injury and illness recordkeeping over previous years
of the audit program.
Table 4 presents universe estimates for the past two years of the audit program—the last
year under the old recordkeeping rule and the first year under the revised rule—which OSHA
found in an analysis conducted last year to be consistent with the level of recording accuracy
observed previously.
For this year’s analysis, OSHA also applied a statistical test to the accuracy estimates for
the first two years of recordkeeping under the new rule (i.e., the second-year results shown in
Table 3 and the first-year results shown in the right-hand portion of Table 4). The test found no
significant difference in the means for total recordable cases. A significant difference overall was
indicated, however, for DART cases. OSHA considers the drop in overall accuracy for the
recording of DART cases in the second year of recordkeeping under the revised rule to be a
preliminary finding rather than a trend.


Table 3
Universe Estimates for OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Number and Percent* of Establishments Classified as Accurate in Recording the Number of
Total Recordable and Days Away, Restriction, or Transfer (DART) Injury and Illness Cases
with the Standard Error of the Estimate

                             2003 AUDIT RESULTS
                        Number of establishments          Percent of establishments        Standard error of the
                        classified with accurate          classified with accurate         estimate (percent)
Type of Case            recording (at-or-above the        recording (at-or-above the
                        95% threshold of accuracy)        95% threshold of accuracy)

Total Recordable              232 / 251                          92.43%                          1.66%
                              (19 below)

DART                          226 / 251                          90.04%                          1.88%
                              (25 below)

* The percent of establishments “at or above” the 95% threshold of accuracy calculated from the sample also provides an estimate that can be extrapolated to the
standard ODI universe (i.e., establishments nationwide that are in States participating in the ODI, have 40 or more employees, and are in one of the SICs selected
for any of the ODI collections—except SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals)).
Note: The standard error of the estimate was calculated using the simple random sampling variance estimator.


Table 4
Universe Estimates for OSHA Audits on CY 2001 and CY 2002 Recordkeeping:
Number and Percent* of Establishments Classified as Accurate in Recording the Number of
Total Recordable and Lost Workday (LWD) or Days Away, Restriction, or Transfer (DART) Injury and Illness Cases
with the Standard Error of the Estimate

2001 AUDIT RESULTS** (Last year under old recordkeeping rule.)

                        Number of establishments          Percent of establishments        Standard error of the
                        classified with accurate          classified with accurate         estimate (percent)
Type of Case            recording (at-or-above the        recording (at-or-above the
                        95% threshold of accuracy)        95% threshold of accuracy)

Total Recordable              235 / 246                          95.53%                          1.31%
                              (11 below)

LWD*** or DART                231 / 246                          93.90%                          1.52%
                              (15 below)

2002 AUDIT RESULTS (First year under revised recordkeeping rule.)

                        Number of establishments          Percent of establishments        Standard error of the
                        classified with accurate          classified with accurate         estimate (percent)
Type of Case            recording (at-or-above the        recording (at-or-above the
                        95% threshold of accuracy)        95% threshold of accuracy)

Total Recordable              239 / 252                          94.84%                          1.39%
                              (13 below)

LWD*** or DART                236 / 252                          93.65%                          1.53%
                              (16 below)

* The percent of establishments “at or above” the 95% threshold of accuracy calculated from the sample also provides an estimate that can be extrapolated to the
standard ODI universe (i.e., establishments nationwide that are in States participating in the ODI, have 40 or more employees, and are in one of the SICs selected
for any of the ODI collections—except SIC 53 (General Merchandise Stores) and SIC 806 (Hospitals)).
** The standard ODI universe was further refined as of the CY 2001 program by adding SIC 43 (U.S. Postal Service), with 297 establishments.
*** For calendar years before 2002, “accuracy” refers to recordable cases recorded on the Log 200 or lost workday cases recorded on the Log as lost workday
cases. As of CY 2002, with implementation of the revised recordkeeping rule, “accuracy” refers to recordable cases recorded on the Log 300 or DART cases
recorded on the Log as DART cases.
Note: The standard error of the estimate was calculated using the simple random sampling variance estimator.


Tables 5 and 6 show the distribution of establishments that fell below the 95 percent
threshold of accuracy by establishment size and industry category for total recordable and DART
cases, respectively. For total recordable cases, the overall percent of establishments below the
threshold of accuracy was very similar among manufacturing and non-manufacturing
establishments. Of the individual size groups, large manufacturing establishments had the
highest percent below the threshold for total recordable cases.
For DART cases, the overall percent of establishments below the threshold was slightly
higher for non-manufacturing. However, medium-sized manufacturing establishments had the
highest percent of any individual size group.
For both total recordable and DART cases, no non-manufacturing establishments in the
large size group were below the threshold of accuracy.
Compared with this year's results, both manufacturing and non-manufacturing establishments did better in the audits on CY 2002 recordkeeping for both total recordable and DART cases. In particular, in the CY 2002 audits no manufacturing establishments in the large size group were below the threshold of accuracy for either total recordable or DART cases; this year, the percent of large manufacturing establishments below the threshold was 16.67 and 12.50, respectively. The opposite occurred with large non-manufacturing establishments, none of which were below the threshold of accuracy this year for either total recordable or DART cases.


Table 5
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Number and Percent of Establishments Below the Threshold of Accuracy for
Total Recordable Cases by Establishment Size Group
and Manufacturing vs. Non-Manufacturing
Industry Category
                                      Manufacturing                    Non-Manufacturing
Establishment Size Category
(average number of employees)        Number            Percent        Number           Percent

All Small (40-99)*                   2 / 83            2.41           2 / 31           6.45

Small (40-49)**                      1 / 18            5.56           0 / 6            0

Medium (100-249)                     7 / 64            10.94          4 / 42           9.52

Large (250 or more)                  4 / 24            16.67          0 / 7            0

Total                                13 / 171          7.60           6 / 80           7.50
                                     (158 pass / 171)                 (74 pass / 80)

* The “all small” (40-99) and “small” (40-49) establishment size groups include 1 establishment with an average employment of less than 40 employees. This establishment had total employment of 46 for the year, but average employment during 2003 was determined to be 39.
** The “small” size group is a subset of the “all small” size group.


Table 6
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Number and Percent of Establishments Below the Threshold of Accuracy for
Days Away, Restriction, or Transfer (DART) Injury and Illness Cases
by Establishment Size Group
and Manufacturing vs. Non-Manufacturing
Industry Category
                                      Manufacturing                    Non-Manufacturing
Establishment Size Category
(average number of employees)        Number            Percent        Number           Percent

All Small (40-99)*                   2 / 83            2.41           4 / 31           12.90

Small (40-49)**                      1 / 18            5.56           0 / 6            0

Medium (100-249)                     11 / 64           17.19          5 / 42           11.90

Large (250 or more)                  3 / 24            12.50          0 / 7            0

Total                                16 / 171          9.36           9 / 80           11.25
                                     (155 pass / 171)                 (71 pass / 80)

* The “all small” (40-99) and “small” (40-49) establishment size groups include 1 establishment with an average employment of less than 40 employees. This establishment had total employment of 46 for the year, but average employment during 2003 was determined to be 39.
** The “small” size group is a subset of the “all small” size group.

In examining the overrecording of cases (i.e., entries found by the auditor on the Log for incidents that are not OSHA-recordable) in regard to the universe estimates, OSHA found the following:

•  Overall. A total of 86 entries (84 injuries and 2 illnesses) were found on employers’ Logs for incidents that are not considered OSHA-recordable cases. These overrecorded cases were distributed across 44 establishments. At 24 of these 44 establishments, only one instance of overrecording was found. Only 8 of these 86 overrecorded cases were classified as DART cases by employers; these 8 overrecorded DART cases were distributed across 6 establishments.

•  Total recordable cases. Overall, 241 of 251 (96.02%) establishments were at-or-above the 95 percent threshold of accuracy with respect to overrecording. Of the 232 establishments at-or-above the 95 percent threshold of accuracy with respect to underrecording of recordable cases, 223 (96.12%) were found to be at-or-above the threshold with respect to overrecording. One of the 19 establishments below the 95 percent threshold of accuracy with respect to underrecording tested below the 95 percent threshold of accuracy for overrecording (94.74% accurate).

•  DART cases. Overall, 249 of 251 (99.20%) establishments were at-or-above the 95 percent threshold of accuracy with respect to overrecording. Of the 226 establishments at-or-above the 95 percent threshold of accuracy with respect to underrecording of DART cases, 224 (99.12%) were found to be at-or-above the threshold with respect to overrecording. None of the 25 establishments below the 95 percent threshold of accuracy for DART underrecording tested below the 95 percent threshold of accuracy for overrecording.
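The classification discussed above rests on a per-establishment percentage of correctly handled cases compared against the 95 percent threshold. The following is a simplified sketch of that comparison only; the counts and the helper names are hypothetical, and the audit protocol's actual case reconstruction and weighting rules are not reproduced here.

```python
def accuracy_percent(correctly_recorded: int, total_reconstructed: int) -> float:
    """Percent of an establishment's reconstructed cases found correctly recorded."""
    if total_reconstructed == 0:
        return 100.0  # no recordable cases found; treated here as fully accurate
    return 100.0 * correctly_recorded / total_reconstructed

def meets_threshold(percent: float, threshold: float = 95.0) -> bool:
    """True if the establishment is at or above the accuracy threshold."""
    return percent >= threshold

if __name__ == "__main__":
    # Hypothetical establishment: 19 reconstructed cases, 18 of them handled
    # correctly on the Log (94.74%), which falls just below the 95 percent
    # threshold, much like the overrecording example cited above.
    pct = accuracy_percent(18, 19)
    print(f"{pct:.2f}% accurate -> at or above threshold: {meets_threshold(pct)}")
```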

Case Analysis
The distribution of cases was analyzed to provide descriptive information about the auditors’ findings in the sample of establishments. The data are raw frequencies of the reconstructed cases from the audits. The analysis of cases by establishment differs from the determination of the universe estimates in that the sample size and design did not provide for estimates at this level of detail. The breakdown of the different types of cases identified by the auditors is not weighted by their respective contribution to the sample. As a result, broad conclusions cannot be drawn about the universe from these findings.
Table 7 indicates the type of recordkeeping errors that the auditors identified in the
discovered cases. In the sample of establishments, the percentage of cases not recorded at all was
higher than the percentage of errors involving either DART cases recorded as non-DART cases
or non-DART cases recorded as DART cases. More DART cases recorded as non-DART cases
were found than non-DART cases recorded as DART cases. The analysis found, however, that
these recordkeeping errors are not widely distributed across the audit sample. For instance, 4
establishments (with a total of 17 cases) accounted for over 27 percent of the 62 underrecorded
DART cases found by auditors. Similarly, for the approximately 11 percent of cases that were not recorded at all, 4 establishments (with a total of 20 cases) accounted for almost 25 percent of the 81 cases found by auditors that were not recorded on the employer Logs.
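Table 7 reports raw frequencies by error type. The sketch below shows one way such a tally could be produced from per-case audit determinations; the case records and field names are hypothetical and do not reflect the ORAA audit software's actual data structures.

```python
from collections import Counter

# Hypothetical per-case audit determinations; each reconstructed case is tagged
# with how it appeared on the employer's Log. These records are illustrative
# only and are not drawn from the actual audit files.
cases = [
    {"on_log": False, "is_dart": True,  "logged_as_dart": None},
    {"on_log": True,  "is_dart": True,  "logged_as_dart": False},
    {"on_log": True,  "is_dart": False, "logged_as_dart": True},
    {"on_log": True,  "is_dart": True,  "logged_as_dart": True},
]

def classify(case: dict) -> str:
    """Assign a reconstructed case to one of the Table 7 error categories."""
    if not case["on_log"]:
        return "Not Recorded"
    if case["is_dart"] and not case["logged_as_dart"]:
        return "DART Recorded as Non-DART"
    if not case["is_dart"] and case["logged_as_dart"]:
        return "Non-DART Recorded as DART"
    return "No Error"

tally = Counter(classify(c) for c in cases)
total = len(cases)
for category, count in tally.items():
    print(f"{category}: {count} / {total} ({100.0 * count / total:.2f}%)")
```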
Table 8 shows the types of injury and illness cases identified by the auditors that were not
recorded on the employer Logs. In the sample of establishments, non-DART cases were the
cases most frequently not recorded on the Log for both injuries and illnesses. For injuries, this
was followed by cases only involving days away from work (DAFW). For illnesses, all of the
unrecorded cases found by auditors were non-DART cases, of which the auditors found 7.
Table 9 presents the categories of misrecording of DART cases identified by the auditors.
In the sample of establishments, injury cases only involving restricted work activity or transfer
(RWA) were the type of cases most often misrecorded on the Log as non-DART cases. For
illnesses, all of the misrecorded cases identified by auditors involved RWA only, of which there
were 4.

Table 7
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Number and Percent of Recordable Injury and Illness Cases Identified by Auditors
by Type of Recordkeeping Errors*
                                            Recordable Cases
Type of Recording Error                     Number           Percent**

Not Recorded                                81 / 747         10.84

DART Recorded as Non-DART                   62 / 747         8.30

Non-DART Recorded as DART                   4 / 747          0.54

Total Recording Errors (above)              147 / 747        19.68

Total Cases with None of the Above Errors   600 / 747        80.32

Total                                       747              100

* The frequencies in this table are unweighted and should not be used to draw broad conclusions about the recordkeeping audit universe.
** Because of rounding, percentages might not add to 100.


Table 8
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Number and Percent of Recordable Injury and Illness Cases Identified by Auditors
and Not Recorded on the Employer’s Log*
Injury/Illness       Type of Case                                   Number of Cases    Number of Cases          Percent of Category    Percent of All Cases
Category                                                            Not Recorded       Discovered by Auditor    Not Recorded           Not Recorded

Injuries             Non-Days Away, Restriction, or
                     Transfer (DART) Cases                          37                 189                      19.58                  45.68
                     Days Away From Work (DAFW) Only                13                 148                      8.78                   16.05
                     Restricted Work Activity or
                     Transfer (RWA) Only                            20                 252                      7.94                   24.69
                     DAFW and RWA                                   4                  96                       4.17                   4.94
                     All Types for Injuries (Total)                 74                 685                      10.80                  91.36

Illnesses            Non-DART Cases                                 7                  29                       24.14                  8.64
                     DAFW Only                                      0                  4                        0                      0
                     RWA Only                                       0                  19                       0                      0
                     DAFW and RWA                                   0                  10                       0                      0
                     All Types for Illnesses (Total)                7                  62                       11.29                  8.64

Injuries and         Non-DART Cases                                 44                 218                      20.18                  54.32
Illnesses Combined   DAFW Only                                      13                 152                      8.55                   16.05
                     RWA Only                                       20                 271                      7.38                   24.69
                     DAFW and RWA                                   4                  106                      3.77                   4.94
                     All Types (Total)                              81                 747                      10.84                  100

* The frequencies in this table are unweighted and should not be used to draw broad conclusions about the recordkeeping audit universe.


Table 9
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Number and Percent of Recordable Days Away, Restriction, or Transfer (DART) Injury and Illness Cases Identified by Auditors
and Recorded on the Employer’s Log as Non-DART Cases*
Injury/Illness       Type of Case                                   Number of Cases    Number of Cases          Percent of Category      Percent of All DART
Category                                                            Recorded as        Discovered by Auditor    Not Recorded as          Cases Recorded as
                                                                    Non-DART Cases                              DART Case                Non-DART Cases

Injuries             Days Away from Work (DAFW) Only                16                 148                      10.81                    25.81
                     Restricted Work Activity or
                     Transfer (RWA) Only                            36                 252                      14.29                    58.06
                     DAFW and RWA                                   6                  96                       6.25                     9.68
                     All Types for Injuries (Total)                 58                 496                      11.69                    93.55

Illnesses            DAFW Only                                      0                  4                        0                        0
                     RWA Only                                       4                  19                       21.05                    6.45
                     DAFW and RWA                                   0                  10                       0                        0
                     All Types for Illnesses (Total)                4                  33                       12.12                    6.45

Injuries and         DAFW Only                                      16                 152                      10.53                    25.81
Illnesses Combined   RWA Only                                       40                 271                      14.76                    64.52
                     DAFW and RWA                                   6                  106                      5.66                     9.68
                     All Types (Total)                              62                 529                      11.72                    100

* The frequencies in this table are unweighted and should not be used to draw broad conclusions about the recordkeeping audit universe.


Submission Comparison Analysis
Stringent criteria were used for the submission comparison: submitted and audited values were counted as matching only when they were exactly the same. The analysis considered the auditors’ comparison of the employers’ injury/illness and hours worked data submitted for the ODI with the injury and illness data on the Log and the hours worked provided by the employer at the time of the audit. For this analysis, OSHA used all 251 audits available for the universe estimate and case-level analysis.
As shown in Table 10, DART cases had the highest percent of establishments with
exactly the same data. For the establishments with data that differed, the audit data were both
more and less than the ODI collection submission for all categories (i.e., there was no pattern to
the differences). For both DART and total recordable cases, the audits produced a higher
percentage of additional cases for large size firms than for other establishment size categories.
The “all small” category had the highest percentages of establishments with the same number of
total recordable and DART cases for both the ODI submission and the onsite Log. (Note,
however, that for DART cases, the “small” subcategory had the same percentage as the “all
small” category.)
As shown in Table 11, the percent for all establishments with the same data for hours
worked—submitted for the ODI and on the employer’s Log—was similar to the results in the
comparison on type of cases. The audits produced more hours worked for firms in the “small”
subcategory than for the other size groups.
Table 12 indicates that a higher percentage of manufacturing establishments than non-manufacturing establishments had data that matched exactly for both total recordable and DART cases. There was little difference, however, for hours worked, as shown in Table 13.
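The comparisons in Tables 10 through 13 classify each establishment as "audit less," "audit same," or "audit more" according to whether the value found during the audit is lower than, equal to, or higher than the value submitted for the ODI. A minimal sketch of that classification follows; the establishment values are hypothetical.

```python
def compare(submitted: int, audited: int) -> str:
    """Classify an establishment by comparing its ODI submission with the audit value."""
    if audited < submitted:
        return "Audit Less"
    if audited > submitted:
        return "Audit More"
    return "Audit Same"

if __name__ == "__main__":
    # Hypothetical establishments: (submitted total recordable cases, cases on Log at audit).
    examples = [(12, 12), (8, 10), (15, 14)]
    for submitted, audited in examples:
        print(f"submitted={submitted}, audit={audited} -> {compare(submitted, audited)}")
```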


Table 10
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Results of the Comparison of Total Recordable Injury and Illness Cases and
Days Away, Restriction, or Transfer (DART) Injury and Illness Cases Submitted to OSHA for the Data Collection
with Data on the Employer’s Log as Found During Audits by Establishment Size
Establishment Comparison Results

                               Total Recordable Injury and Illness Cases                DART Injury and Illness Cases
Establishment Size Group       Audit Less        Audit Same        Audit More           Audit Less        Audit Same        Audit More
(average number of employees)  Number  Percent   Number  Percent   Number  Percent      Number  Percent   Number  Percent   Number  Percent

All Small (40-99)*             12      10.53     87      76.32     15      13.16        6       5.26      95      83.33     13      11.40
(114 establishments)

Small (40-49)**                4       16.67     18      75.00     2       8.33         3       12.50     20      83.33     1       4.17
(24 establishments)

Medium (100-249)               14      13.21     76      71.70     16      15.09        10      9.43      79      74.53     17      16.04
(106 establishments)

Large (250 or more)            1       3.23      22      70.97     8       25.81        1       3.23      21      67.74     9       29.03
(31 establishments)

ALL SIZES                      27      10.76     185     73.71     39      15.54        17      6.77      195     77.69     39      15.54
(251 establishments)

* The “all small” (40-99) and “small” (40-49) establishment size groups include 1 establishment with an average employment of less than 40 employees. This establishment had total employment of 46 for the year, but average employment during 2003 was determined to be 39.
** The “small” size group is a subset of the “all small” size group.


Table 11
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Results of the Comparison of Hours Worked Data Submitted to OSHA for the Data Collection with
Hours Worked Provided During Recordkeeping Audits by Establishment Size
Establishment Comparison Results

                               Hours Worked
Establishment Size Group       Audit Less        Audit Same        Audit More
(average number of employees)  Number  Percent   Number  Percent   Number  Percent

All Small (40-99)*             15      13.16     84      73.68     15      13.16
(114 establishments)

Small (40-49)**                6       25.00     14      58.33     4       16.67
(24 establishments)

Medium (100-249)               10      9.43      80      75.47     16      15.09
(106 establishments)

Large (250 or more)            3       9.68      24      77.42     4       12.90
(31 establishments)

ALL SIZES                      28      11.16     188     74.90     35      13.94
(251 establishments)

* The “all small” (40-99) and “small” (40-49) establishment size groups include 1 establishment with an average employment of less than 40 employees. This establishment had total employment of 46 for the year, but average employment during 2003 was determined to be 39.
** The “small” size group is a subset of the “all small” size group.


Table 12
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Results of the Comparison of Total Recordable Injury and Illness Cases and
Days Away, Restriction, or Transfer (DART) Injury and Illness Cases Submitted to OSHA for the Data Collection with
Data on the Employer’s Log as Found During Recordkeeping Audits
by Industry Type (Manufacturing vs. Non-Manufacturing)
Establishment Comparison Results

                               Total Recordable Injury and Illness Cases                DART Injury and Illness Cases
Industry Type                  Audit Less        Audit Same        Audit More           Audit Less        Audit Same        Audit More
                               Number  Percent   Number  Percent   Number  Percent      Number  Percent   Number  Percent   Number  Percent

All Manufacturing SICs         14      8.19      134     78.36     23      13.45        11      6.43      141     82.46     19      11.11
(171 establishments)

All Non-Manufacturing SICs     13      16.25     51      63.75     16      20.00        6       7.50      54      67.50     20      25.00
(80 establishments)

All SICs                       27      10.76     185     73.71     39      15.54        17      6.77      195     77.69     39      15.54
(251 establishments)


Table 13
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Results of the Comparison of Hours Worked Data Submitted to OSHA for the Data Collection with
Hours Worked Provided During Recordkeeping Audits
by Industry Type (Manufacturing vs. Non-Manufacturing)
Establishment Comparison Results

                               Hours Worked
Industry Type                  Audit Less        Audit Same        Audit More
                               Number  Percent   Number  Percent   Number  Percent

All Manufacturing SICs         21      12.28     127     74.27     23      13.45
(171 establishments)

All Non-Manufacturing SICs     7       8.75      61      76.25     12      15.00
(80 establishments)

All SICs                       28      11.16     188     74.90     35      13.94
(251 establishments)


As found in past analyses, there are a variety of reasons why the two datasets may differ.
Tables 14 and 15 display the reasons for differences in case counts and hours worked,
respectively. Changes or corrections to the Log after submission to the ODI accounted for the differences in case counts at almost 40 percent of the establishments that gave a primary reason. Checkmark errors accounted for roughly another 14 percent (11 establishments). Differences of these types do not necessarily indicate inaccuracy of the data maintained by the employer or submitted to the Agency. No reason could be established for the difference at only 6 establishments (about 8 percent). This relatively low percentage of unexplained differences is consistent with recent years of the audit program analysis and may be attributable to enhancement of the audit software that made it easier to capture and categorize recordkeeper responses by primary and secondary reasons.
For hours worked, the primary reasons provided to explain differences were: (1) the
number of hours was estimated rather than calculated for the submission, and (2) the submission
included errors associated with omitting hours worked by certain employee groupings (e.g.,
temporary labor or salaried employees).
Many of the differences observed were fairly small. Taking into account all of the
differences, 4 establishments would have changed targeting category relative to the primary
inspection list for OSHA’s Site-Specific Targeting (SST) Program, which is based on either the
DART injury and illness rate or the days away from work (DAFW) injury and illness rate of
establishments as calculated from the ODI data. Specifically, 2 establishments would have
moved onto the primary list for the high-rate targeting program and 2 would have dropped off
the primary list. (OSHA also maintains a secondary inspection list for establishments that are
considered a lesser priority based on lower thresholds for these rates.)
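The SST targeting categories mentioned above turn on establishment DART and DAFW rates computed from the case counts and hours worked. The conventional OSHA incidence-rate formula normalizes cases to 200,000 hours worked (100 full-time employees working a full year); the sketch below applies that formula as an illustration only, with hypothetical example values, since the specific SST rate cutoffs are not given in this report.

```python
def incidence_rate(cases: float, hours_worked: float) -> float:
    """Incidence rate per 100 full-time employees (200,000 hours worked)."""
    if hours_worked <= 0:
        raise ValueError("hours_worked must be positive")
    return cases * 200_000 / hours_worked

if __name__ == "__main__":
    # Hypothetical establishment: 6 DART cases and 4 DAFW cases over 310,000 hours.
    hours = 310_000
    dart_rate = incidence_rate(6, hours)
    dafw_rate = incidence_rate(4, hours)
    print(f"DART rate: {dart_rate:.2f}, DAFW rate: {dafw_rate:.2f}")
    # A change in either rate could move an establishment across an SST cutoff;
    # the actual cutoffs vary by program year and are not reproduced here.
```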


Table 14
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Primary Reasons for Differences Between the Injury and Illness Data
Submitted to OSHA for the Data Collection and Injury and Illness Data
on the Employer’s Log Provided During the Recordkeeping Audits

                                                                                       Primary Reason for Difference*
Reason(s) Given for Difference(s) in Injury and Illness Data                           Number          Percent**

Log change(s) or correction(s) made after the data were submitted, reflecting new
information brought to the attention of recordkeeper(s) pertaining to cases on the
Log                                                                                    31              39.24

Checkmark error(s)                                                                     11              13.92

Other reasons                                                                          10              12.66

Clerical error(s) (e.g., typo or transposition)                                        8               10.13

Blank or auditor could not determine reason                                            6               7.59

Survey processing edit(s) (employer’s Log was otherwise the same as the submitted
data)                                                                                  5               6.33

Error(s) associated with reporting data from the wrong facility or facilities          4               5.06

Addition error(s)                                                                      3               3.80

Error(s) associated with omitting reporting components (e.g., temporary labor,
salaried employees)                                                                    1               1.27

Establishment Totals***                                                                79              100

* The audit software also provides fields for noting any secondary reasons given to explain the differences. This analysis considers only the primary reasons.
** Because of rounding, percentages might not add to 100.
*** Although 79 establishments provided a primary reason for a difference (as noted in this table), the difference resulted in a change in total recordable injury and illness case counts for only 66 establishments (see the total of Audit Less and Audit More for Total Recordable Injury and Illness Cases in Table 10). In the 13 instances where there was no impact on the total case count, Log column differences in effect canceled each other out.


Table 15
OSHA Audits on CY 2003 Injury and Illness Recordkeeping:
Primary Reasons for Differences Between the Data on Hours Worked
Submitted to OSHA for the Data Collection and Data on Hours Worked
Provided During the Recordkeeping Audits

                                                                                       Primary Reason for Difference*
Reason(s) Given for Difference(s) in Hours Worked Data                                 Number          Percent**

Estimated value instead of actual value                                                27              42.86

Error(s) associated with omitting reporting components (e.g., temporary labor,
salaried employees)                                                                    20              31.75

Other reasons                                                                          7               11.11

Error(s) associated with reporting from wrong facility or facilities                   5               7.94

Blank or auditor could not determine reason                                            4               6.35

Establishment Totals                                                                   63              100

* The audit software also provides fields for noting any secondary reasons given to explain the differences. This analysis considers only the primary reasons.
** Because of rounding, percentages might not add to 100.


SUMMARY AND RECOMMENDATIONS
Summary
This analysis represents the eighth year of OSHA’s audit program on employer injury and
illness recordkeeping and the second year under the revised rule, which went into effect on January
1, 2002.
The audit program is well established and the protocol operates efficiently. This year, OSHA
exceeded its target of obtaining 250 audits for the analysis: A total of 251 audits was usable for the
universe estimates, the case-level analysis, and the comparison between submitted data and data on
employer Logs. Only one audit was excluded based on OSHA’s review of auditor documentation to
evaluate compliance with the protocol.
Across all of the years of the program, a number of findings remain consistent:
•  Based on the estimates of the accuracy of employer injury and illness recordkeeping, the OSHA Log and employment data collected through the ODI represent reasonable quality for OSHA’s targeting and performance measurement purposes.

•  Both some overrecording and underrecording are observed.

•  Underrecording errors are not widely distributed across the sample of establishments. A small number of establishments account for most of the underrecorded cases.

•  Differences found in comparing the audit data with the data submitted to OSHA result in very few changes of the inspection targeting category status of establishments.

Findings this year on the CY 2003 employer injury and illness recordkeeping suggest:
•  A higher percentage of the audit data for hours worked matched the data submitted to OSHA than under the old recordkeeping rule. The revised Summary Form 300A requires employers to record the number of hours worked, so this information is more accessible. More accurate and readily available data on hours worked support more accurate injury/illness rates.

•  Although a high level of accuracy continues for both total recordable and DART cases, the level of accuracy for DART cases was lower for the CY 2003 recordkeeping audits than for CY 2002. (For total recordable and DART cases in CY 2003, however, OSHA found that the accuracy estimates are not statistically different.) While the finding from the year-to-year comparison may not indicate a downward trend (an illustrative comparison sketch follows this list), OSHA will need to monitor accuracy for the recording of DART cases in subsequent audits. In the meantime, some proactive outreach would be helpful in addressing a potential issue. (See Recommendation 4.)
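The report does not specify the statistical test behind the comparisons noted above. One common choice for comparing two estimated accuracy percentages drawn from independent samples is a two-proportion z-test; the sketch below applies that test to the published CY 2002 and CY 2003 DART counts from the universe-estimate tables purely as an illustration, not as a reproduction of OSHA's analysis.

```python
from math import sqrt, erf

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test using the pooled-variance form."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

if __name__ == "__main__":
    # DART accuracy: 236 of 252 establishments in the CY 2002 audits versus
    # 226 of 251 in the CY 2003 audits (counts from the tables earlier in this report).
    z, p = two_proportion_z(236, 252, 226, 251)
    print(f"z = {z:.2f}, two-sided p = {p:.3f}")
```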


Recommendations
1. OSHA should continue the audit program as a quality control mechanism to ensure
that an acceptable level of accuracy in employer injury/illness recordkeeping for the
ODI data collection is maintained.
2. The ongoing audit program should maintain the refinements that have established a
credible audit process. These improvements include features such as the creation of a
standard universe and emphasis in the auditor training on adherence to the sampling
protocol.
3. Although the audit program is well established and the protocol operates efficiently,
OSHA should examine the effect(s) of any changes in the assumptions initially used
for the sampling parameters (e.g., the establishment non-compliant rate and the DART
incidence rate) to determine whether any refinements or updates to the sampling
methodology are needed. Also, the potential effect, if any, of the shift from SIC to
NAICS should be reviewed.
4. OSHA should continue to use the information from the audit analysis in outreach
efforts to promote improvements in employer injury and illness recordkeeping under
the revised rule. Specifically, this report and/or summaries of the findings should be
made available to OSHA Compliance Assistance Specialists, ODI data collection
agencies, and the compliance officers who conduct the audits. The information should
also be posted on the OSHA Web site. The correct recording of DART cases should be
emphasized.


Appendix A
List of OSHA Data Initiative Collection Quality Reports
and Related Studies
The following analyses have been conducted on OSHA’s audit program:
•  OSHA Data Collection Validation Study: Pilot Test on the Data Collection Quality and Verification of Employer Injury and Illness Records. September 12, 1997 (Final Report). Eastern Research Group, Inc. (Contract No. J-9-F-3-0043: Task Order No. 5, Option Year Two.)

•  OSHA Data Initiative Collection Quality Control: Analysis of Audits on 1996 Employer Injury and Illness Recordkeeping. September 17, 1998 (Final Report). The Lexington Group, Eastern Research Group, Inc., and the National Opinion Research Center. (Contract No. J-9-F-7-0043: Task Order No. 7, Base Year.)

•  OSHA Data Initiative Collection Quality Control: Analysis of Audits on 1997 Employer Injury and Illness Recordkeeping. August 23, 1999 (Final Report). Eastern Research Group, Inc. and the National Opinion Research Center. (Contract No. J-9-F-7-0053: Task Order No. 7, Option Year One.)

•  OSHA Data Initiative Collection Quality Control: Analysis of Audits on 1998 Employer Injury and Illness Recordkeeping. September 29, 2000 (Final Report). Eastern Research Group, Inc. and the National Opinion Research Center. (Contract No. J-9-F-7-0053: Task Order No. 17, Option Year Two.)

•  OSHA Data Initiative Collection Quality Control: Analysis of Audits on 1999 Employer Injury and Illness Recordkeeping. September 28, 2001 (Final Report). Eastern Research Group, Inc. and the National Opinion Research Center. (Contract No. J-9-F-7-0053: Task Order No. 24, Option Year Three.)

•  OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2000 Employer Injury and Illness Recordkeeping. September 27, 2002 (Final Report). Eastern Research Group, Inc. and the National Opinion Research Center. (Contract No. J-9-F-7-0053: Task Order No. 33, Option Year Four.)

•  OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2001 Employer Injury and Illness Recordkeeping. December 5, 2003 (Final Report). Eastern Research Group, Inc. and the National Opinion Research Center. (Contract No. J-9-F-3-0015: Task Order No. 1, Base Year.)

•  OSHA Data Initiative Collection Quality Control: Analysis of Audits on 2002 Employer Injury and Illness Recordkeeping—Interim Year Analysis in Multi-Year Reporting Cycle. September 30, 2006 (Final Report). Eastern Research Group, Inc. and the National Opinion Research Center. (Contract No. J-9-F-3-0015: Task Order No. 2, Option Year 1.)
Studies related to ODI collection quality include the following:
•  Sample Design for a Statistically Valid Evaluation of Accuracy and Completeness of an Establishment’s OSHA-Mandated Employee Records. 1996. The National Opinion Research Center.

•  OSHA Data Collection Validation Study: Initial Assessment of the Accuracy of the OSHA-Collected Data—An Analysis of the Data Edit Reports and a Review of State Agency Impressions. February 1997 (Final Report). Eastern Research Group, Inc. (Contract No. J-9-F-3-0043: Task Order No. 5, Option Year Two.)

•  OSHA Data Collection Validation Study: Descriptive Characteristics of the 1995 OSHA-Collected Data and Comparison with the Bureau of Labor Statistics’ Annual Survey on Occupational Injuries and Illnesses. September 12, 1997 (Final Report). Eastern Research Group, Inc. (Contract No. J-9-F-3-0043: Task Order No. 5, Option Year Two.)

•  OSHA Data Collection Validation Study: Issues with Creating a Matched File for Comparing the OSHA 200 Log Data Collected by Compliance Officers During Onsite Interventions with the Injury/Illness Data from the OSHA Log Data Collection. September 12, 1997 (Final Report). Eastern Research Group, Inc. (Contract No. J-9-F-3-0043: Task Order No. 5, Option Year Two.)

•  A Summary of Findings on the Correlation of Establishment Injury/Illness Rate Data from the OSHA Data Initiative and the IMIS Log Data. September 25, 2000 (Final Report). The Lexington Group, Eastern Research Group, Inc., and Dr. Wayne Gray. (Contract No. J-9-F-7-0043: Task Order No. 23, Subtask 1, Option Year Two.)

•  A Summary of Findings on the Correlation of Establishment Injury/Illness Rate Data from the OSHA Data Initiative and the BLS Annual Survey. September 25, 2000 (Final Report). The Lexington Group, Eastern Research Group, Inc., and Dr. Wayne Gray. (Contract No. J-9-F-7-0043: Task Order No. 23, Subtask 2, Option Year Two.)


Appendix B
Background on the OSHA Injury and Illness Recordkeeping Audit Program
Program-Related Analyses and Key Findings
As an initial step in assessing the quality of information compiled by OSHA’s Data
Initiative (ODI) collection system, the Agency conducted two data validation studies in 1996:
•  An analysis of the data collection system’s edit criteria results and commentary on data quality from State agencies assisting in the collection effort.

•  Calculation of descriptive statistics on the collected data and comparison of the data with injury and illness data from the BLS Annual Survey.

Findings from the studies indicated that OSHA had implemented a credible system to provide the
Agency with useful, establishment-specific data on occupational injuries and acute illnesses.
At the same time, the studies underscored the need for OSHA to continue efforts to
ensure the quality of the OSHA-collected data. Under the audit program, OSHA conducts onsite
audits of employer injury and illness records to verify the overall accuracy of source records,
estimate the extent of employer compliance with the OSHA recordkeeping requirements defined
in 29 CFR 1904, and assess the consistency between data on the employer’s Log and data
submitted to the Agency under the ODI.
In 1997, OSHA conducted an audit pilot program in nine establishments to test the
Agency’s protocol designed for efficient use of resources in performing recordkeeping audits.
The protocol is designed to save auditors time through the review of records for a statistical
sampling of employees within an establishment and through use of the OSHA Recordkeeping
Audit Assistant (ORAA) software system for streamlining the process of conducting,
documenting, tracking, and analyzing the establishment audit.
Overall, OSHA’s analysis of the pilot test, which reviewed calendar year (CY) 1995
records, demonstrated the feasibility of the protocol for use in a larger audit program. In 1998,
based on its experience with the pilot test, the Agency modified the protocol slightly for use in
the first full-scale program for auditing employer injury and illness records. (That first year
involved audits on CY 1996 records.) Similarly, for the next five years of the audit program,
OSHA drew upon its earlier experience and made minor adjustments in implementation of the
program for audits on establishments’ CY 1997, 1998, 1999, 2000, and 2001 records,
respectively.


In summary, OSHA’s analyses of the first six years of the audit program found the
following:
•  The sample of establishments audited was representative of the sampling universe.

•  The audit protocol, including sampling of employees within establishments, appears to provide OSHA with a feasible process to monitor the quality of employer injury and illness recordkeeping.

•  The estimates of overall accuracy for total recordable and lost workday cases (i.e., establishments at-or-above the 95 percent threshold) suggest that the ODI collection currently provides reasonably accurate data that OSHA can use to help meet its program and performance measurement data needs. Related findings include:

   o  The percent of establishments with injury/illness recordkeeping determined to be at-or-above the threshold of accuracy has increased.

   o  Errors are not widely distributed across the sample establishments. A small number of establishments account for most of the underrecorded cases.

   o  Both overrecording and underrecording are observed.

   o  Differences found in comparing the audit data with the data submitted to OSHA result in very few changes of the targeting category status of establishments for inspections.

   o  There is no evidence that small establishments have less accurate injury/illness records than medium or large size establishments.

The sixth year of the audit program marked the last analysis of injury/illness
recordkeeping under the old version of 29 CFR 1904. Subsequent annual audit program cycles
focus on records maintained by employers under the revised rule, which went into effect on
January 1, 2002. The intention of the revisions made to the recordkeeping requirements is to
simplify injury/illness recordkeeping for employers and contribute to the quality of establishment
injury/illness data.
The seventh year of the audit program focused on CY 2002 injury/illness recordkeeping
and provided a preliminary review of accuracy in non-construction establishments under the first
year of the revised recordkeeping rule. The annual analysis indicated that recordkeeping
accuracy was not significantly different than the results found in past years under the old rule.


Highlights of Annual Recordkeeping Audits and Analyses
Second Year of Program (Audits on CY 1997 Recordkeeping). Notable differences in
implementation of the second-year audit program included expanding the audit universe beyond
the Federal OSHA jurisdiction to include establishments in State Plan States. Also, before
selecting a sample of audit establishments, OSHA implemented implicit stratification of the
universe by first sorting establishments on Standard Industrial Classification (SIC) code,
followed by OSHA Region, and last by employment size. This approach is designed to provide
sample establishments in similar proportions to their SIC, geographic, and size distribution in the
universe. Compared to a simple random sampling approach, implicit stratification distributes the
audit workload among the OSHA Regions better and balances the industry (manufacturing vs.
non-manufacturing/non-construction) and establishment size distributions for the analysis.
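Implicit stratification as described above sorts the sampling frame before selection so that a systematic draw lands in rough proportion to the sort variables. The following is a minimal sketch of that sort-then-systematic-sample idea; the record layout, field names, and the fixed sampling interval are assumptions for illustration and do not reproduce OSHA's actual selection program.

```python
import random

def implicit_stratified_sample(frame: list[dict], sample_size: int, seed: int = 0) -> list[dict]:
    """Sort the frame on SIC, then OSHA Region, then employment size, and take a
    systematic sample with a random start, so selections roughly mirror the
    frame's distribution on those variables."""
    ordered = sorted(frame, key=lambda e: (e["sic"], e["region"], e["employment"]))
    interval = len(ordered) / sample_size
    start = random.Random(seed).uniform(0, interval)
    picks = [int(start + i * interval) for i in range(sample_size)]
    return [ordered[i] for i in picks]

if __name__ == "__main__":
    # Hypothetical frame of establishments (fields are illustrative only).
    rng = random.Random(42)
    frame = [{"sic": rng.choice(["2011", "3714", "4213"]),
              "region": rng.randint(1, 10),
              "employment": rng.randint(40, 900)}
             for _ in range(1000)]
    sample = implicit_stratified_sample(frame, sample_size=25)
    print(sample[:3])
```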
Third Year of Program (Audits on CY 1998 Recordkeeping). In the third year of the
audit program, OSHA began to explore the use of a standard sampling universe to facilitate
comparison of year-to-year estimates. OSHA also increased the number of establishments in the
audit sample and the number of assigned audits in order to increase the likelihood that the
number of audits available for analysis would be closer to the approximate target of 250.
Additionally, the third-year audit program’s coverage was expanded by including establishments
with an average employment between 40 and 49 (compared to the previous cut-off at 50 in 1997
and 60 in 1995 and 1996) and by encouraging a greater number of State Plan States to
participate.
Fourth Year of Program (Audits on CY 1999 Recordkeeping). For the fourth-year audit
program, OSHA modified its approach for selecting audit establishments from a universe of
establishments participating in the ODI in a specific year. Instead, OSHA selected a sample from
a standard ODI universe that covered all years of the ODI. OSHA’s objective in sampling from a
standard ODI universe was to establish a credible basis for generalizing the estimate of overall
accuracy for an individual year’s employer injury and illness recordkeeping to ODI
establishments nationwide. Additionally, use of a standard universe would anticipate the benefit
of conducting year-to-year comparisons to assess recordkeeping under the new rule.
In the first four years of the program, the analysis found that about 90 percent of
establishments in the sampling universe for the specific year were estimated as having accurately
recorded the number of total recordable cases; about 88 percent of establishments were found to
be accurate in recording lost workday cases.
Fifth Year of Program (Audits on CY 2000 Recordkeeping). For the fifth-year audit
program, OSHA selected a sample for a second time from a standard ODI universe that covered
all years of the ODI. This enabled OSHA to include a preliminary comparison of recordkeeping
accuracy estimates for ODI establishments nationwide. The comparison indicated consistency in
recordkeeping accuracy estimates across the fourth and fifth years of the audit program.
Interpretation of the comparison was limited somewhat because OSHA had further refined the
definition of the standard ODI universe in the fifth year by excluding two industries that are only
selectively included in the ODI. Nonetheless, the preliminary comparison provided potential
baseline data for using such comparisons to assess recordkeeping under the new rule.

Sixth Year of Program (Audits on CY 2001 Recordkeeping). For the sixth-year audit
program, OSHA again selected a sample from a standard ODI universe that covered all years of
the ODI. The analysis found consistency between CY 2000 and CY 2001 recordkeeping
accuracy estimates for ODI establishments nationwide. Further, in applying a statistical test to
the comparison of accuracy estimates, OSHA found no significant difference in the means for
the two years, suggesting overall recordkeeping improvement. (An additional, minor refinement
to the standard ODI universe should be noted regarding this year-to-year comparison; i.e., for the
sixth year program, the Agency included SIC 43 (U.S. Postal Service)—now under OSHA
jurisdiction—in the universe, which added 297 facilities.) In the fifth and sixth years of the
program, the analysis found that about 95 percent of establishments in the sampling universe
were estimated as having accurately recorded the number of total recordable cases; about 93
percent of establishments were found to be accurate in recording lost workday cases.
Also for the sixth year’s audit program, OSHA conducted a pilot test of audits at a
sample of construction firms using a protocol that addressed issues specific to the construction
industry and its operation of “short-term establishments.” OSHA selected a sample of
construction audit establishments from a universe of about 9,000 establishments that had
submitted complete data for the CY 2001 ODI collection and met relevant criteria (e.g., operate
under one of the three 2-digit construction SIC codes).
In analyzing the results of pilot audits on establishments in construction industries,
OSHA implemented the same general approach used for audits at non-construction
establishments. Overall, the analysis found a slightly lower percent of construction
establishments at-or-above the threshold of accuracy for both total recordable and lost workday
cases in comparison to the accuracy estimates for non-construction establishments. While the construction pilot findings indicate that the audit methodology developed for non-construction establishments can be implemented in construction establishments, unique aspects of construction operations require allowances for flexibility in maintaining records, which yields a mix of recordkeeping audits that vary in establishment scope. Because of fundamental differences in the recordkeeping procedures between the construction and non-construction industries, it is recommended that, if OSHA continues collecting data from construction SICs, the ODI construction universe and audit analysis remain separate from the non-construction analysis.
Seventh Year of Program (Audits on CY 2002 Recordkeeping). For the seventh-year
audit program, OSHA again selected a sample from a standard ODI universe. This analysis on
CY 2002 recordkeeping provided a preliminary review of injury/illness recordkeeping accuracy
in non-construction establishments under the first year of employer implementation of OSHA’s
revised recordkeeping rule. The study indicated that recordkeeping accuracy in the first year
under the revised recordkeeping rule is not significantly different than the results found in past
years under the old rule. (Note that for calendar years before 2002, “accuracy” refers to
recordable cases recorded on the Log 200 or lost workday cases recorded on the Log as lost
workday cases. As of CY 2002, with implementation of the revised recordkeeping rule,
“accuracy” refers to recordable cases recorded on the Log 300 or DART cases recorded on the
Log as DART cases.)


Also in the seventh year of the program, OSHA established a multi-year analysis cycle
for audits on employer recordkeeping that includes interim and comprehensive analyses. OSHA
is no longer required to report annually on its monitoring of ODI data quality to OMB. Although
OSHA will continue to conduct annual recordkeeping audits, it will now report every third year
to OMB in conjunction with the Agency’s request for clearance to continue the annual ODI data
collection. For the non-reporting year(s) of a multi-year cycle, OSHA will conduct only a summary
analysis of the annual audit program. The analysis on CY 2002 records addressed an interim year
of the OMB reporting cycle.


Appendix C
Summary of Major Changes in OSHA Injury and Illness Recordkeeping
Under the Revised Rule


MAJOR CHANGES TO OSHA'S RECORDKEEPING RULE
(from OSHA Web site: http://www.osha.gov/recordkeeping/RKmajorchanges.html)

This document provides a list of the major changes from OSHA's old 1904 recordkeeping rule to the new rule that employers began using in 2002. The list summarizes the major differences between the old and new recordkeeping rules to help people who are familiar with the old rule learn the new rule quickly.
Scope
The list of service and retail industries that are partially exempt from the rule has been updated.
Some establishments that were covered under the old rule will not be required to keep OSHA
records under the new rule and some formerly exempted establishments will now have to keep
records. (§1904.2)
The new rule continues to provide a partial exemption for employers who had 10 or fewer
workers at all times in the previous calendar year. (§1904.1)
Forms
The new OSHA Form 300 (Log of Work-Related Injuries and Illnesses) has been simplified and
can be printed on smaller legal-sized paper.
The new OSHA Form 301 (Injury and Illness Incident Report) includes more data about how the
injury or illness occurred.
The new OSHA Form 300A (Summary of Work-Related Injuries and Illnesses) provides
additional data to make it easier for employers to calculate incidence rates.
Maximum flexibility has been provided so employers can keep all the information on computers,
at a central location, or on alternative forms, as long as the information is compatible and the
data can be produced when needed. (§1904.29 and §1904.30)
Work related
A "significant" degree of aggravation is required before a preexisting injury or illness becomes
work-related. (§1904.5(a))
Additional exceptions have been added to the geographic presumption of work relationship;
cases arising from eating and drinking of food and beverages, blood donations, exercise
programs, etc. no longer need to be recorded. Common cold and flu cases also no longer need to
be recorded. (§1904.5(b)(2))
Criteria for deciding when mental illnesses are considered work-related have been added.
(§1904.5(b)(2))

Sections have been added clarifying work relationship when employees travel or work out of
their home. (§1904.5(b)(6) and §1904.5(b)(7))
Recording criteria
Different criteria for recording work-related injuries and work-related illnesses are eliminated;
one set of criteria is used for both. (The former rule required employers to record all illnesses,
regardless of severity). (§1904.4)
Employers are required to record work-related injuries or illnesses if they result in one of the
following: death; days away from work; restricted work or transfer to another job; medical
treatment beyond first aid; loss of consciousness; or diagnosis of a significant injury/illness by a
physician or other licensed health care professional. (§1904.7(a))
New definitions are included for medical treatment and first aid. First aid is defined by
treatments on a finite list. All treatment not on this list is medical treatment. (§1904.7(b)(5))
The recording of "light duty" or restricted work cases is clarified. Employers are required to
record cases as restricted work cases when the injured or ill employee only works partial days or
is restricted from performing their "routine job functions" (defined as work activities the
employee regularly performs at least once weekly). (§1904.7(b)(4))
Employers are required to record all needlestick and sharps injuries involving contamination by
another person's blood or other potentially infectious material. (§1904.8)
Musculoskeletal disorders (MSDs) are treated like all other injuries or illnesses: they must be
recorded if they result in days away, restricted work, transfer to another job, or medical treatment
beyond first aid. (§1904.12)
Special recording criteria are included for cases involving the work-related transmission of
tuberculosis or medical removal under OSHA standards. (§1904.9 and §1904.11)
Day counts
The term "lost workdays" is eliminated and the rule requires recording of days away, days of
restricted work, or transfer to another job. Also, new rules for counting that rely on calendar days
instead of workdays are included. (§1904.7(b)(3))
Employers are no longer required to count days away or days of restriction beyond 180 days.
(§1904.7(b)(3))
The day on which the injury or illness occurs is not counted as a day away from work or a day of
restricted work. (§1904.7(b)(3) and §1904.7(b)(4))
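The calendar-day counting rules above (count calendar days rather than workdays, do not count the day the injury or illness occurs, and stop counting at 180 days) can be illustrated with a short sketch. The dates and the assumption that the return date itself is not a day away are illustrative only.

```python
from datetime import date

def count_days_away(injury_date: date, return_date: date, cap: int = 180) -> int:
    """Count calendar days away from work: the day of injury is not counted,
    the day of return is not a day away, and the count is capped at 180 days."""
    if return_date <= injury_date:
        return 0
    days = (return_date - injury_date).days - 1  # exclude the day of injury
    return min(max(days, 0), cap)

if __name__ == "__main__":
    # Hypothetical case: injured June 1, back at work June 10 -> 8 calendar days away.
    print(count_days_away(date(2003, 6, 1), date(2003, 6, 10)))
    # A very long absence is capped at 180 days.
    print(count_days_away(date(2003, 1, 2), date(2003, 12, 31)))
```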


Annual summary
Employers must review the 300 Log information before it is summarized on the 300A form.
(§1904.32(a))
The new rule includes hours worked data to make it easier for employers to calculate incidence
rates. (§1904.32(b)(2))
A company executive is required to certify the accuracy of the summary. (§1904.32(b)(3))
The annual summary must be posted for three months instead of one. (§1904.32(b)(6))
Employee involvement
Employers are required to establish a procedure for employees to report injuries and illnesses and
to tell their employees how to report. (§1904.35(a))
The new rule informs employers that the OSH Act prohibits employers from discriminating
against employees who do report. (§1904.36)
Employees are allowed to access the 301 forms to review records of their own injuries and
illnesses. (§1904.35(b)(2))
Employee representatives are allowed to access those parts of the OSHA 301 form relevant to
workplace safety and health. (§1904.35(b)(2))
Protecting privacy
Employers are required to protect employees' privacy by withholding an individual's name on
Form 300 for certain types of sensitive injuries/illnesses (e.g., sexual assaults, HIV infections,
mental illnesses, etc.). (§1904.29(b)(6) to §1904.29(b)(8))
Employers are allowed to withhold descriptive information about sensitive injuries in cases
where not doing so would disclose the employee's identity. (§1904.29(b)(9))
Employee representatives are given access only to the portion of Form 301 that contains
information about the injury or illness, while personal information about the employee and his or
her health care provider is withheld. (§1904.35(b)(2))
Employers are required to remove employees' names before providing injury and illness data to
persons who do not have access rights under the rule. (§1904.29(b)(10))
Reporting information to the government
Employers must call in all fatal heart attacks occurring in the work environment.
(§1904.39(b)(5))

Employers do not need to call in public street motor vehicle accidents except those in a
construction work zone. (§1904.39(b)(3))
Employers do not need to call in commercial airplane, train, subway or bus accidents.
(§1904.39(b)(4))
Employers must provide records to an OSHA compliance officer who requests them within 4
hours. (§1904.40(a))


Appendix D
Tracking Status Codes Used in
Processing CY 2003 ODI Submissions
Distribution and Collection Status Codes
BLANK   Establishment record (address information only) in the database
ML      Mailed form
CI      Checked in form returned from establishment
ES      Electronically submitted data by establishment
NRM     Nonresponse form mailed
NRC     Nonresponse telephone call made
OTM     Optional third mailing of form
PO      Post office return
PRM     Remailed form to corrected address

Processing Status Codes
DE1     Primary data entry
COMP    Secondary data entry and data compared
ECRG    Edit condition report generated

Final Status Codes
OK      Data are complete and accurate
FD      Final data for business that has ceased operations
UNR     State determined information is unreliable
NC      Noncompliant establishment
DU      Duplicate form
OB      Out of business
OS      Out of scope
OO      Only office/sales staff at establishment
OKOS    Data are complete and accurate but out of scope
OKPD    Data are complete and accurate—partial year data
PHD     Phone disconnected
RU      Records unavailable
UM      Unmailable, no new address found


Appendix E
OSHA Directive: Audit and Verification Program of
2003 Occupational Injury and Illness Records
Directive Number: 05-01 (CPL 02)


[insert compliance directive]


