
HDCI 2 Survey of Group Health Plans

OMB Number 1210-NEW

February 2007

SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT 1995 SUBMISSIONS

A. Justification


  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


In 1996, Congress amended the Employee Retirement Income Security Act of 1974 (ERISA) to add Part 7, creating new protections for employees and their families under group health plans and imposing new requirements on group health plans. The Part 7 requirements arise out of a series of laws affecting group health plans, specifically the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the Newborns’ and Mothers’ Health Protection Act of 1996, the Mental Health Parity Act of 1996, and the Women’s Health and Cancer Rights Act of 1998. These laws impose specific limitations on group health plans and create specific rights concerning group health coverage. The Employee Benefits Security Administration (EBSA), which exercises delegated authority under ERISA to protect workers’ pensions and group health benefits, issued regulations under Part 7, codified at 29 C.F.R. 2590.701-1 et seq., to effectuate these rights and provide guidance to affected group health plans.


Carrying out the Department of Labor’s (Department’s) commitment to assist employers and workers in understanding how to comply with the federal employment laws under its jurisdiction,1 EBSA launched an extensive education campaign beginning immediately after passage of the new laws. EBSA undertook outreach and educational programs, developed compliance assistance publications, and, in 1999, conducted a pilot program, called Health Disclosure and Claims Issues, under which approximately 200 group health plans were investigated for compliance with Part 7. Based on its experiences in connection with the pilot program, in 2001 EBSA initiated the Health Disclosure and Claims Issues: Fiscal Year 2001 Compliance Project (HDCI), which sought to increase compliance with these health care provisions through investigations and improve EBSA’s ability to provide effective compliance assistance by assessing more comprehensively the extent and nature of compliance with Part 7 among group health plans.


Under HDCI, in 2001 EBSA conducted investigations of a large number of group health plans and published a report summarizing its findings. The report, entitled Health Disclosure and Claims Issues: Fiscal Year 2001 Compliance Project Report (HDCI Report) can be viewed on EBSA’s public website at http://www.dol.gov/ebsa/publications/hdci.html. EBSA relied on the HDCI Report results to pinpoint areas in which the regulated public misunderstood the regulatory requirements under Part 7. As a result of its findings, EBSA initiated the HIPAA Compliance Assistance Program (H-CAP), a multifaceted program to improve compliance with the provisions of Part 7 through a combination of publications, outreach, self-audit materials, and other compliance tools.


EBSA is now planning to conduct a second round of investigations among group health plans (HDCI 2), scheduled to begin in mid-year 2007, in order to continue its efforts to improve its compliance assistance programs. EBSA plans to examine a number and variety of group health plans that is sufficient to constitute a representative sample of existing plans from which EBSA can extrapolate compliance rates for group health plans in general, as further described in Part B of this submission. The investigations will bring about increased compliance by the plans subject to investigation and also permit EBSA to evaluate the impact of its compliance assistance programs by comparing the current compliance rates with the compliance rates obtained from the initial round of investigations in 2001.


The HDCI 2 project is designed to assess compliance for three basic types of group health plans, following the approach taken in 2001: plans sponsored by firms having 100 or more employees, plans sponsored by firms having 3-99 employees, and multiemployer plans. The first two groups of plans are single-employer plans. EBSA intends to identify an appropriate group of multiemployer group health plans to investigate from data included in the annual report (Form 5500) filings of multiemployer plans for recent years. Because all multiemployer plans must file such annual reports, the Form 5500 data, which is already available to EBSA, is sufficient for the purposes of identifying multiemployer group health plans; there is no need to conduct any information collection to derive an appropriate list.


However, EBSA does not have any established source of data from which it can derive a list of appropriate ERISA-covered single-employer group health plans that would be representative of the entire universe of existing single-employer group health plans. The gap in EBSA’s information arises because EBSA’s regulations provide exemptions from the annual report filing requirements for a large number of single-employer group health plans. Many ERISA-covered single-employer group health plans, therefore, do not file any reports with the government, and EBSA is not able to identify a comprehensive list of ERISA-covered single-employer group health plans through any existing records that could serve as a sampling frame.


EBSA has determined that it can obtain the names and contact information for a large number of firms, of different sizes, from a commercial enterprise that specializes in providing business information. This company does not obtain or make available information about employee benefits provided by the firms. In order to derive an appropriate list of ERISA-covered single-employer group health plans, both small and large, EBSA intends to conduct a narrow scope telephone survey, described in detail in Part B of this submission, to contact a random sample of the listed firms to determine, by asking a limited number of questions, whether the firm sponsors a group health plan that is subject to ERISA.

EBSA believes that conducting the telephone survey is the simplest, least burdensome method of identifying an appropriate group of representative ERISA-covered single-employer group health plans. This information collection request (ICR) therefore seeks approval of the telephone survey that EBSA needs to conduct in order to identify single-employer group health plans.


  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


This is a new collection of information. The purpose of the information collection is to identify ERISA-covered single-employer group health plans in two groups: (1) plans sponsored by firms with 3-99 employees, and (2) plans sponsored by firms with 100 or more employees. EBSA field-office staff making the telephone survey calls will enter the responses into an electronic database. Group health plans that are identified through the telephone survey will be subject to separate investigation for compliance with the requirements of Part 7 of ERISA. Investigations will be conducted by EBSA investigators, who will use a checklist that is essentially the same as the HIPAA Compliance Checklist relied upon in the 2001 project to determine a group health plan’s level of compliance. Plans cited with violations will be required to make changes to plan documents and benefits administration necessary for compliance. EBSA also will work with health insurance companies, HMOs, and third-party administrators that offer model policies, plan documents, and centralized claims processing systems to bring about required changes in these documents and services affecting other ERISA-covered group health plans. In addition, the compliance results will be used to assess the effectiveness of EBSA’s H-CAP program, to guide EBSA’s future enforcement targeting efforts, and to assist EBSA in developing additional compliance assistance programs. EBSA needs the information collection that will be provided through the telephone survey in order to establish and use verifiable program evaluation measures to meet its goals under the Secretary of Labor’s Strategic Plan and the President’s Performance Assessment Rating Tool (PART).


  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration for using information technology to reduce burden.


To reduce burden, EBSA staff will be encouraged to conduct an internet search, before conducting any telephone surveys, to determine whether a particular firm indicates on its website that it offers group health coverage to its employees. When this method cannot be employed with respect to a particular named firm, the staff will contact the firm by telephone and conduct the survey as specified in the attached script and instructions. In all cases in which a telephone survey is conducted, the telephone survey will be limited both in time and in the content of information gathered, with each telephone call anticipated not to take more than five minutes. In many cases, the call will take less than that amount of time.


  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


As described above, EBSA wishes to conduct this information collection because no comprehensive source of data for identifying appropriate ERISA-covered single-employer group health plans exists. There is, therefore, no duplication of effort or of data collection in this ICR.


  5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.


The telephone survey will seek to identify both large group health plans (plans sponsored by firms with 100 or more employees) and small group health plans (plans sponsored by firms with 3-99 employees). Therefore, EBSA expects to contact via telephone firms that are both large and small. Although small entities, principally small firms, will be included in the firms contacted through the telephone survey in order to ensure that small group health plans are represented among the plans identified for later investigation, the impact on small entities will be minimal because of the limited nature and extent of the survey itself.


  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Undertaking the telephone survey is necessary to identify plans for investigation in order to increase compliance with the health care provisions of Part 7 of ERISA. In the 2001 project, more than 45 percent of all plans (and more than 60 percent of multiemployer plans) were cited with at least one violation of ERISA Part 7. EBSA investigators were able to work with plan representatives and service providers to plans to address violations and bring about increased compliance with the law, both for the specific plans investigated and for other ERISA plans using the same service providers.


In addition, the collection of this information is crucial to EBSA’s successful completion of a second round of investigations under HDCI. Unless it is possible to identify a representative sample of ERISA-covered group health plans, EBSA will not be able to assess the effectiveness of its H-CAP compliance assistance program and will not be able to satisfy requirements under the President’s Performance Assessment Rating Tool (PART) and the Secretary of Labor’s Strategic Plan. The second round of investigations is essential to completing the work summarized in the HDCI Report, which established a baseline for compliance with Part 7. Comparing the baseline with newly obtained compliance results would allow EBSA to evaluate its current programs and improve its effectiveness in future efforts.


  7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

requiring respondents to report information to the agency more often than quarterly;

requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

requiring respondents to submit more than an original and two copies of any document;

requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;

in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.

None.


  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years -- even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


The Department’s Federal Register notice was published on December 8, 2006 (71 FR 71190). The notice solicited public comment on the proposed information collection and provided 60 days within which to submit comments, as required by 5 CFR 1320.8(d). No comments were received.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


No payments or gifts will be provided to respondents.


  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


As specified in the telephone script (attached to this ICR as a collection instrument) that will be used in conducting the telephone survey, respondents will be informed that a report prepared in connection with this survey will summarize findings across the sample and will not associate responses with a specific firm or individual and that information identifying the respondent and/or the firm will not be provided to anyone outside the agency except as required by law.


  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


The information collection will not involve any questions of a sensitive nature.


  12. Provide estimates of the hour burden of the collection of information. The statement should:

Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.

Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.


The hour burden arising from this information collection derives solely from the time that an individual at a contacted firm will spend answering a telephone query from EBSA field-office staff conducting the telephone survey. Because the information being sought is very simple (e.g., whether or not the firm sponsors a group health plan for its employees), no additional time will be needed to gather or analyze information in order to respond. As described in Part B, below, EBSA anticipates that field-office staff will need to contact a maximum of approximately 5,000 firms in order to obtain an adequate sample of ERISA-covered single-employer plans for its investigations.


In order to estimate the hour burden of this information collection, the Department analyzed the likely length of the telephone calls that EBSA will make, based on a variety of factors, including the number of calls that will go unanswered, the number of calls that will be terminated earlier than the end of the telephone script, and the number of calls that will require additional time for explanation and to answer questions posed by the contacted firm personnel. Based on this analysis, the Department estimates an average of five minutes for each call, resulting in a total hour burden of approximately 417 hours (5000 calls x 5 min. / 60 = 416.666 hours).


  13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information.


There is no additional cost burden to respondents or recordkeepers resulting from this collection of information.


  14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.


The Department estimates a total cost to the Federal government from the conduct of the telephone survey of approximately $97,000. The following table describes how that estimate was derived:



HDCI - Supporting Statement A14. Cost to the Federal Government

| Office               | Hours | Average Grade Level | Total Cost Per Hour | Total Cost |
|----------------------|-------|---------------------|---------------------|------------|
| National Office (NO) | 200   | 14                  | $70.47              | $14,094    |
| Training/1           | 76    | 12                  | $50.14              | $3,811     |
| Calls/2              | 660   | 11                  | $41.84              | $27,618    |
| BMDS/3               | n/a   | n/a                 | n/a                 | $51,600    |
| Total                |       |                     |                     | $97,122    |




1/ Net Meeting/Training: (10 field offices × 3 staff per office) + 8 NO staff = 38 participants

2/ Call hours: 263 respondent hours × 2.5 = 657.5

3/ EBSA Information Management personnel provided an estimate of the total (past and future) cost of developing a tracking system for the telephone survey calls. This estimate was $64,500, which has been reduced by 20% because a small portion of the information that the system will collect is unrelated to this ICR.


  15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


This is a new collection of information.

  16. For collections of information whose results will be published, outline plans for tabulation, and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


The results of the telephone survey will not be published. ERISA-covered group health plans identified through the telephone survey will be subject to investigations, conducted by EBSA investigators, that will assess the plan’s level of compliance with Part 7 of ERISA. It is anticipated that the results of all of the HDCI investigations will be aggregated and analyzed, and that EBSA will extrapolate from those results certain conclusions about group health plans’ compliance with Part 7 in general. It is possible that EBSA may ultimately publish a report on the results of these investigations as a follow up to the HDCI Report described in the answer to item 1, above.


  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The OMB expiration date will be published in the Federal Register following OMB approval.

  18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submission," of OMB 83-I.


Not applicable. There are no exceptions to the certification statement.


B. Collections of Information Employing Statistical Methods


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The universe for this project consists of health plans sponsored by firms having three or more employees that are subject to Part 7 of ERISA. An estimate based on data collected from the 2001 project was that this universe consisted of 1.3 million health plans.


This universe is divided into three parts: plans sponsored by firms having 100 or more employees, plans sponsored by firms having 3-99 employees, and multiemployer plans. Because only the samples of plans sponsored by large and small firms require surveys, the discussion below focuses on these two samples. The size of the 2007 universe is estimated through two adjustments to the 2001 universe size estimates – one factor that accounts for changes in the number of firms and a second factor that accounts for changes in the percentage of firms that sponsor in-scope plans.


First we consider the changes in the sponsorship rate. Health plan sponsorship rates and response rates from the MEPS-IC and from the Employer Health Benefits Surveys of the Kaiser Family Foundation (KFF) provide no clear basis for forecasting the sponsorship rate that EBSA will find in its 2007 HDCI project.


In the table below, the first two columns show the percentage of private sector establishments that offer health insurance/1 (MEPS-IC), the next two show the percentage of firms offering health benefits/2 (KFF), and the last two show the percentage of firms offering coverage subject to Part 7 of ERISA/3 (HDCI).

| Year               | MEPS-IC: Fewer than 50 employees | MEPS-IC: 50 or more employees | KFF: 3-199 employees | KFF: 200 or more employees | HDCI: 3-99 employees | HDCI: 100 or more employees |
|--------------------|------|------|------|-----|------|-------|
| 1996               |      |      | 59   | 99  |      |       |
| 1997               | 40.4 | 95.6 |      |     |      |       |
| 1998               | 43.7 | 96.3 | 54   | 100 |      |       |
| 1999               | 47.1 | 96.9 | 65   | 99  |      |       |
| 2000               | 47.2 | 96.8 | 68   | 99  |      |       |
| 2001               | 46.0 | 96.9 | 67   | 98  | 35.1 | 88.1  |
| 2002               | 44.5 | 96.5 | 66   | 98  |      |       |
| 2003               | 43.2 | 95.4 |      |     |      |       |
| 2004               | 41.9 | 96.3 | 63   | 99  |      |       |
| 2005               | 40.5 | 96.3 | 59   | 98  |      |       |
| 2006               | 39.2 | 96.3 | 58.5 | 98  |      |       |
| 2007/4             | 37.8 | 96.3 | 56.8 | 98  | 31.1 | 88.1  |
| Decline: 2001-2007 | 8.2  | 0.6  | 10.2 | 0.0 | 4    | 0     |
| Change factor/5    |      |      |      |     | 0.886| 1.000 |

Note: Bold indicates straight line interpolation or extrapolation

1/ Source: Agency for Healthcare Research and Quality, Center for Financing, Access and Cost Trends. 2003 Medical Expenditure Panel Survey-Insurance Component

2/ Source: Kaiser Family Foundation, Employer Health Benefits 2005 Annual Survey, http://www.kff.org/insurance/7315/sections/ehbs05-2-2.cfm

3/ EBSA 2001 Health Disclosure and Claims Issues FY 2001 Compliance Project

4/ The 2007 HDCI offer rate was computed using the 2001-2007 change, which was judgmentally assigned as discussed.

5/ Projected 2007 offer rate divided by 2001 offer rate


The health plan sponsorship rate found by the KFF in 2000/2002 for firms having 3-199 employees was 32 percentage points higher than that found by the 2001 HDCI for firms having 3-99 employees, despite the fact that both surveys used a Dun and Bradstreet sampling frame. Certainly some portion of the difference between the results could be explained by the difference in the size groups used, but even the KFF sponsorship rates for firms having 3-9 workers (about 58%) were much higher than those observed in the HDCI (35%) for firms having 3-99 workers. The HDCI achieved a higher response rate than the KFF, even when compared to the KFF rate of response solely for the health plan sponsorship question. Even if the higher level of response achieved by the HDCI had been achieved exclusively by adding responses from firms without plans, that fact would explain only about nine percentage points of the observed differences.


| Survey  | Sponsorship Rate | Response | Adj. Sponsorship Rate* | Size of Adj. | Sponsorship Dif. from HDCI | % of Dif. Explained by Adj. |
|---------|------------------|----------|------------------------|--------------|----------------------------|-----------------------------|
| HDCI    | 35%              | 87%      | 35%                    |              |                            |                             |
| MEPS-IC | 46%              | 78%      | 42%                    | 4%           | 11%                        | 35%                         |
| Kaiser  | 67%              | 71%      | 58%                    | 9%           | 32%                        | 29%                         |

* Adjusted to the 87% response rate of the 2001 HDCI survey using the assumption that, among the added firms, none actually sponsored plans. Thus the assumption is that the MEPS-IC adds 9% of firms that do not sponsor plans as respondents and the KFF survey adds 16% of firms that do not sponsor plans as respondents.


The remaining possible explanations for the differences in sponsorship rates relate to survey methods. Although the KFF had extensive follow-up to its sponsorship question, the response rate (71%) and sponsorship rate (67%) cited are solely for the sponsorship question. It is possible that many of the 20+ percent of firms that responded to the sponsorship question, but not to the core survey, did not really sponsor plans. EBSA investigators encountered this problem when, upon initial contact, firms indicated that they sponsored plans, but the subsequent investigation proved otherwise. Although we know that this problem occurred more than once in the 2001 HDCI, we did not keep records on the problem – firms were recorded as not sponsoring plans regardless of whether that fact was determined on initial contact or at the investigative phase. For the 2007 survey, EBSA plans to distinguish between the two stages at which this determination could be made, to permit assessment of the importance of this explanation.


The health plan sponsorship rates observed on the MEPS-IC (46%) and the HDCI (35%) are not comparable in that: (1) the upper bound of the MEPS-IC size interval is 50, while it is 99 for the HDCI; (2) the MEPS-IC size category has no lower bound, while the lower bound for the HDCI was 3 employees; and (3) the unit of measurement for the MEPS-IC is the establishment, while for the HDCI it is the firm (or subsidiary). The first two differences would tend to make MEPS-IC sponsorship rates lower compared to the HDCI, because the size category bounds are lower and smaller establishments have lower sponsorship rates. The third difference would tend to make MEPS-IC sponsorship rates higher, because some small establishments offer health plan coverage sponsored by a headquarters, which may be a large firm even using the HDCI threshold of 100 employees. It is hard to estimate the magnitude, or even the direction, of the net effect of these differences.

One could argue that, because the MEPS-IC and the KFF surveys found higher sponsorship rates than did the 2001 HDCI, the 2007 HDCI is likely to find a higher sponsorship rate that more closely resembles those surveys. This argument lacks a causal explanation. Because the standard error of the small firm sponsorship rate estimated from the 2001 HDCI was only 1.4 percentage points, the observed differences in health plan sponsorship rates between surveys were almost certainly not a mere statistical fluke. Thus, if the HDCI survey were repeated in 2007 using the same survey methods that were used in 2001 and actual rates of health plan sponsorship in the U.S. were unchanged, there is a negligible chance that the observed gap in sponsorship rates would narrow significantly.

One could also argue that, because sponsorship rates observed in both the MEPS-IC and the KFF surveys have been declining, the sponsorship rate observed in the 2007 HDCI is likely to decline as well. Because this argument offers a causal explanation – declining sponsorship rates – we assume that this effect will predominate. Because the first argument appears to have some merit, and because smaller rates are presumably less subject to decline than higher rates are, we assume that the HDCI sponsorship rate will decline from 2001 to 2007 by a smaller amount than we project among small entities for the other two surveys (8.2 and 10.2 percentage points). The assumed decline in the sponsorship rate among small firms is 4 percentage points. On that basis, we estimate that the small firm sponsorship rate will have declined by a factor of 0.886 (11.4%), from 35.1 to 31.1 percent, over the period from 2001 to 2007.

The second of the two factors that enters into estimating the size of the projected universe is changes in the number of firms. The U.S. Census Bureau has published the number of firms for each year from 1991 to 2004 (http://www.census.gov/epcd/susb/2003/us/US--.HTM). The table below shows those estimates, EBSA projection of those estimates through 2007, and the estimates of the number of in-scope plans derived by adjusting universe estimates from the 2001 HDCI study for both growth in the number of firms and changes in the rates at which firms sponsor health plans.


| Year  | Total Firms | Small Firms (fewer than 100 employees): Number | Percent | Large Firms (100 or more employees): Number | Percent |
|-------|-------------|-----------|--------|---------|-------|
| 1991  | 5,051,025   | 4,970,209 | 98.40% | 80,816  | 1.60% |
| 1992  | 5,095,356   | 5,013,830 | 98.40% | 81,526  | 1.60% |
| 1993  | 5,193,642   | 5,105,350 | 98.30% | 88,292  | 1.70% |
| 1994  | 5,276,964   | 5,187,256 | 98.30% | 89,708  | 1.70% |
| 1995  | 5,369,068   | 5,277,794 | 98.30% | 91,274  | 1.70% |
| 1996  | 5,478,047   | 5,384,920 | 98.30% | 93,127  | 1.70% |
| 1997  | 5,541,918   | 5,447,705 | 98.30% | 94,213  | 1.70% |
| 1998  | 5,579,177   | 5,484,331 | 98.30% | 94,846  | 1.70% |
| 1999  | 5,607,743   | 5,512,411 | 98.30% | 95,332  | 1.70% |
| 2000  | 5,652,544   | 5,550,798 | 98.20% | 101,746 | 1.80% |
| 2001  | 5,657,774   | 5,555,934 | 98.20% | 101,840 | 1.80% |
| 2002  | 5,697,759   | 5,600,897 | 98.30% | 96,862  | 1.70% |
| 2003  | 5,767,127   | 5,663,319 | 98.20% | 103,808 | 1.80% |
| 2004  | 5,885,784   | 5,779,840 | 98.20% | 105,944 | 1.80% |
| 2005p | 5,939,365   | 5,832,049 | 98.19% | 107,317 | 1.81% |
| 2006p | 5,999,338   | 5,890,278 | 98.18% | 109,060 | 1.82% |
| 2007p | 6,059,312   | 5,948,507 | 98.17% | 110,804 | 1.83% |

|                                          | Total     | Small Firms | Large Firms |
|------------------------------------------|-----------|-------------|-------------|
| Plans subject to Part 7, 2001            | 1,309,266 | 1,217,807   | 91,459      |
| Growth in number of firms (2001-2007)    |           | 1.071       | 1.088       |
| Sponsorship growth factor (2001-2007)*   |           | 0.886       | 1           |
| Estimated plans subject to Part 7, 2007  | 1,255,242 | 1,155,139   | 100,103     |
| % change from 2001                       | -4.1%     | -5.1%       | 9.5%        |

p = based on separate straight-line projections for large and small firms.

* See discussion above.

The above estimates will be used solely for planning purposes. Final estimates will rely on counts that Dun and Bradstreet will produce as a byproduct of the sample selection process.

Sample sizes for the large and small firm sample are presented after discussion of the sample size formulas in response to question 2.

The table below summarizes results of the 2001 surveys of large and small firms in the manner that OMB recommends for calculation of response rates. Because in 2001, EBSA experienced very little non-response among reachable firms, and because the extent to which unreachable firms sponsor in-scope plans is unknown, the response rates for 2001 have been calculated in three ways depending on whether they assume that unreachable firms are:


  1. In-scope to the same extent that reachable firms are in-scope (OMB recommended)

  2. Always in-scope (minimum)

  3. Always out-of-scope (maximum)


| Response Category                                | Large firm sample | Small firm sample |
|--------------------------------------------------|-------------------|-------------------|
| Total sample                                     | 623               | 1,604             |
| Investigations                                   | 469               | 863               |
| In-scope, not investigated                       | 8                 | 10                |
| Out-of-scope                                     | 106               | 1,016             |
| Unreachable                                      | 40                | 338               |
| Percentage of reachable firms that were in-scope | 81.8%             | 46.2%             |
| Response rate: OMB recommended/2                 | 92.0%             | 83.9%             |
| Response rate: Minimum                           | 90.7%             | 71.3%             |
| Response rate: Maximum                           | 98.3%             | 98.9%             |

Because unreachable firms are judged to be often out-of-business and rarely to sponsor health plans, the maximum response rate is judged to be more accurate for this survey than the OMB-recommended rate.
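
To make the three calculations concrete, the sketch below shows how each response-rate variant in the table can be computed from the screening outcome counts. It is an illustrative sketch only; the function and variable names are not taken from the project's own systems.

```python
# Illustrative sketch: the three response-rate variants shown in the table above,
# computed from screening outcome counts (large firm sample used as the example).

def response_rates(investigated, in_scope_not_investigated, out_of_scope, unreachable):
    reachable = investigated + in_scope_not_investigated + out_of_scope
    in_scope_rate = (investigated + in_scope_not_investigated) / reachable

    # OMB recommended: unreachable firms assumed in-scope at the same rate as reachable firms
    omb = investigated / (investigated + in_scope_not_investigated + unreachable * in_scope_rate)
    # Minimum: every unreachable firm assumed in-scope
    minimum = investigated / (investigated + in_scope_not_investigated + unreachable)
    # Maximum: every unreachable firm assumed out-of-scope
    maximum = investigated / (investigated + in_scope_not_investigated)
    return in_scope_rate, omb, minimum, maximum

# Large firm counts from the table: yields roughly 81.8%, 92.0%, 90.7%, and 98.3%
print(response_rates(469, 8, 106, 40))
```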


  2. Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection

  • Estimation procedure

  • Degree of accuracy needed for the purpose described in the justification

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The Dun and Bradstreet (D&B) database from which the samples of large and small firms were selected for the 2001 compliance project, and from which the samples of large and small firms will be selected for the 2007 project, distinguishes subsidiaries from parent companies. EBSA has defined the universes for the three samples in such a way as to partition the universe of ERISA-covered plans into three all-inclusive pieces. Inclusion of subsidiaries in the large and small firm universes was judged necessary to achieve the desired inclusiveness. Excluding subsidiaries from the universe would either result in the exclusion of plans sponsored by subsidiaries, or would require EBSA investigators to change their normal procedures, which limit the investigation to the plan or firm that is the subject of the complaint or referral. Including subsidiaries, which was the chosen approach, introduced a complication. Any health plan that covers employees of more than one subsidiary has more than one chance of inclusion in the project. The statistical weights for these plans must therefore be adjusted downward to correct for their higher probability of selection. In the 2001 project this complication did not arise at all in the sample of small firms, but it affected about 1 in 6 large firms. Statistical weights for the sample of large firms must reflect the probability of each plan being selected for the large firm sample, which depends on the number of large subsidiaries whose employees are covered under the plan. In particular, the weight for plan i sponsored by a large firm or subsidiary will be $1/P_i$, where $P_i$ is the probability that one or more of the firms or subsidiaries whose employees are covered by the plan is selected for the sample, and $P_i$ is calculated as:

$P_i = 1 - (1 - P_L)^{L_i}$

where $P_L$ is the (uniform) probability of selection in the large firm sample and $L_i$ is the number of large firms or subsidiaries whose employees are covered under the plan. Because some of these plans also cover employees of small subsidiaries, there will be some overlap between the set of plans identified through the large firm sample and the set of plans identified through the small firm sample. A second set of weights will be used to estimate violation rates for all health plans, regardless of the sample through which the plan was identified. Such estimates are not the primary goal of the survey. (If they were, a stratified sample design might have been appropriate.) They are, nevertheless, of interest, even though the fact that plans of small firms constitute about 98% of all plans means that the overall estimates will be virtually indistinguishable from the estimates for plans of small firms. These overall weights will be the reciprocals of the probabilities of selection $Q_i$, where

$Q_i = 1 - (1 - P_L)^{L_i}(1 - P_S)^{S_i}$

where $P_S$ and $S_i$ are the small firm analogs of $P_L$ and $L_i$ defined above.
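
As a concrete illustration of these selection probabilities and weights, the sketch below computes $1/P_i$ and $1/Q_i$ for a hypothetical plan; the selection probabilities and subsidiary counts are invented for the example and are not the project's actual values.

```python
# Illustrative sketch of the plan-level selection probabilities and weights described above.
# P_L and P_S are the uniform selection probabilities in the large and small firm samples;
# the values used below are hypothetical.

def large_sample_weight(p_large, large_subsidiaries):
    # P_i = 1 - (1 - P_L)^L_i: chance that at least one covered large firm/subsidiary is drawn.
    p_i = 1.0 - (1.0 - p_large) ** large_subsidiaries
    return 1.0 / p_i

def overall_weight(p_large, large_subsidiaries, p_small, small_subsidiaries):
    # Q_i: chance of selection through either the large or the small firm sample.
    q_i = 1.0 - (1.0 - p_large) ** large_subsidiaries * (1.0 - p_small) ** small_subsidiaries
    return 1.0 / q_i

# A plan covering employees of 3 large subsidiaries, each with a 1-in-150 selection chance,
# gets a weight of roughly 50 rather than 150.
print(large_sample_weight(1 / 150, 3))
print(overall_weight(1 / 150, 3, 1 / 2000, 2))
```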


The key variables to be estimated are the overall rates at which plans violate one or more applicable provisions of Part 7 of ERISA. For each reachable, in-scope sample plan i, investigators will record this determination as $V_i$, where $V_i = 1$ if plan i is determined to be in violation and 0 otherwise. Let the weight assigned to plan i be $w_i$. The overall violation rate for each sample in 2007 (HDCI year 2) is $V_2$ and will be estimated as the weighted mean of the $V_i$:

$V_2 = \dfrac{\sum_i w_i V_i}{\sum_i w_i}$

Kish refers to this estimator as "generally the preferred estimate" [Kish, p. 67]. It is a form of the Horvitz-Thompson estimator that "is a mainstay of the design-based approach" and is "unbiased under very mild conditions" [Little, p. 549]. The fact that the sample design is random rather than systematic reduces concerns about selection of a variance estimator, which Little points to as the first potential major deficiency of this estimator. The dichotomous nature of the dependent variable $V_i$ renders it immune to the problem of distortion by outliers that Little cites as the second of the estimator's two major deficiencies.


And the variance of V2 will be estimated as:



where n is the sample size, N is the estimated number of plans in the universe3, the summations run over all sample plans, and wi is the weight for the ith plan. This formula is derived from a more general formula for the variance of a mean estimated from a complex survey that appears in the online documentation: (\core\help\statug.chm::/statug.hlp/surveymeans_sect19) for the SAS Surveymeans procedure Version 9.1.3.4
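
A minimal sketch of the weighted violation-rate point estimate follows; the violation indicators and weights below are invented for illustration and do not come from the project.

```python
# Minimal sketch of the weighted (design-based) violation rate described above:
# the weighted mean of the 0/1 violation indicators V_i. Data below are invented.

def weighted_violation_rate(violations, weights):
    # violations: 0/1 indicators per investigated plan; weights: matching plan weights (1/P_i)
    return sum(w * v for v, w in zip(violations, weights)) / sum(weights)

violations = [1, 0, 0, 1, 0]
weights = [50.2, 150.0, 150.0, 75.1, 150.0]
print(weighted_violation_rate(violations, weights))   # about 0.22 for these made-up data
```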


For 2007 we assume that this complication will, again, not arise in the small firm sample, and that it will affect the large firm sample to the same extent that it did in the 2001 project. The necessary weighting in the large firm sample increased the variance of estimated violation rates by a factor of 1.074 compared to the variance that would have arisen from a simple random sample of the same size. This factor, called the design effect (D), was, by definition, equal to one for the samples of small firms and multiemployer plans, which were simple random samples.


Let the violation rate, estimated from a simple random sample of size n1 in year 1, be p1, and the violation rate, estimated from a simple random sample of size n2 in year 2, be p2.


The variances of these estimates are $V_1$ and $V_2$, given by:

$V_1 = \dfrac{p_1(1 - p_1)}{n_1}$

and

$V_2 = \dfrac{p_2(1 - p_2)}{n_2}$

The change in the violation rate is p2 - p1 and, because V1 and V2 arise from independent surveys having covariance of zero, the standard error of this difference is:


$SE(p_2 - p_1) = \sqrt{V_1 + V_2}$     (1)



A test of the null hypothesis that the violation rate did not change from 2001 to 2007, against the alternative hypothesis that the rate of compliance changed by at least 10 percentage points, uses a critical value, X, that is as yet unknown. A Z-statistic with an absolute value less than X will lead to the conclusion that the rate of compliance did not significantly change, and a Z-statistic with an absolute value greater than X will lead to the conclusion that the rate of compliance changed (either upward or downward, depending on the sign of the Z-statistic). The critical value X must satisfy the conditions imposed by the ceilings we impose on type I and type II errors. Specifically, the probability of type I error must not exceed .05 if the null hypothesis (no change in compliance rate) is true. Because type I error under a two-tailed test can occur if the Z-statistic is either too high or too low, we can accept only a 2.5% probability that the Z-statistic will be too low. Thus the critical value must fall at Z.025 = -1.96 on the null hypothesis curve. Under the null hypothesis, the expected difference in means is 0, so an actual difference of X would depart from this expectation by X – 0 = X. The standard error of this difference would be given by the equation above with p1 = p2. Thus,


(2)


The critical value X must also limit the probability of type II error to 10%. If the decrease in the violation rate from 2001 to 2007 is found to be X, then we must ensure that the probability that the Z-statistic exceeds X under the alternate hypothesis is no more than 10%. We have this assurance if X lies at Z0.90 = 1.282 standard errors above the difference expected under the alternate hypothesis. Substituting into this equation a computation of the Z-score for X under the alternate hypothesis, we have:


(3)


Solving the last two equations for X yields:


(4)


And



(5)


Equating these two expressions (4 and 5) for X yields:



(6)


With X now eliminated, p1, n1, and D known from the 2001 survey, and p2 set by the alternate hypothesis, we have one equation in one unknown, namely n2, the sample size needed in 2007. This equation is not algebraically solvable, so we use an iterative computation method to solve for n2, specifically the SAS NLIN procedure.
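
For illustration, the sketch below solves the sample-size condition numerically with simple bisection. It assumes the variances take the textbook binomial form $p(1-p)/n$, inflated by the design effect D; that assumption follows the description above but may differ in detail from the project's actual SAS NLIN setup, and the numeric inputs in the example are invented.

```python
# Hedged sketch of the iterative sample-size computation described above.
# Assumption: variances of the form D * p * (1 - p) / n; inputs below are illustrative only.
import math

def required_n2(p1, n1, p2, D=1.0, z_alpha=1.96, z_beta=1.282):
    def gap(n2):
        se_null = math.sqrt(D * p1 * (1 - p1) * (1 / n1 + 1 / n2))
        se_alt = math.sqrt(D * (p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2))
        # The type I condition and the type II condition each imply a critical value;
        # the required n2 makes the two coincide, i.e. drives this gap to zero.
        return abs(p1 - p2) - (z_alpha * se_null + z_beta * se_alt)

    lo, hi = 2.0, 1e7
    for _ in range(200):              # bisection on the 2007 sample size
        mid = (lo + hi) / 2
        if gap(mid) > 0:
            hi = mid
        else:
            lo = mid
    return math.ceil(hi)

# Invented example: 2001 rate 45% on n1 = 471, alternative 2007 rate 35%, design effect 1.074
print(required_n2(0.45, 471, 0.35, D=1.074))   # prints a value in the mid-600s
```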


This formula assumes that the universe from which each sample will be drawn is infinite. If application of the formula results in a computed 2007 sample of size $n_\infty$, then given a universe of size N, the same variance can be achieved using a smaller, finite-population-corrected sample size of n [see Kish, p. 45], where

$n = \dfrac{n_\infty}{1 + n_\infty / N}$     (7)


Conversely, each 2001 sample size n achieves the same variance as an infinite-population-corrected sample size of $n_\infty$, derived by reversing the 2001 finite population correction:

$n_\infty = \dfrac{n}{1 - n / N}$     (8)


Thus the 2001 sample sizes n1 are input into equation (6) after upward adjustment using (8) and the computed 2007 sample sizes are adjusted downward using equation (7).
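
The two corrections can be checked against the multiemployer figures in the table that follows; the short sketch below applies equations (7) and (8) directly.

```python
# Sketch of the finite population corrections in equations (7) and (8),
# checked against the multiemployer figures from the table below.

def fpc_down(n_inf, N):
    # Equation (7): finite-population-corrected sample size
    return n_inf / (1 + n_inf / N)

def fpc_up(n, N):
    # Equation (8): equivalent infinite-population sample size
    return n / (1 - n / N)

print(round(fpc_up(416, 1759)))    # 2001 multiemployer sample: 416 -> about 545
print(round(fpc_down(476, 1759)))  # 2007 multiemployer sample: 476 -> about 375
```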


The results of applying (6), (7), and (8) to the three samples are:


| Sample        | 2001: Actual Sample Size (n) | 2001: Universe | 2001: Sample Size for Infinite Population (n∞) | 2007: Sample Size for Infinite Population (n∞) | 2007: Estimated Universe | 2007: Required Sample Size (n) |
|---------------|------------------------------|----------------|------|------|-----------|-------|
| Large firm    | 469                          | 91,459         | 471  | 621  | 100,000   | 617   |
| Small firm    | 394                          | 1,217,807      | 394  | 739  | 1,155,000 | 739   |
| Multiemployer | 416                          | 1,759          | 545  | 476  | 1,759     | 375   |
| Total         | 1,279                        |                |      |      |           | 1,731 |


EBSA plans to contact firms identified by Dun and Bradstreet in random order, and terminate the screening calls as near as possible to the point when the target number of firms that sponsor in-scope plans has been located in each sample. Because the rate at which sample firms will be found to sponsor plans is not known precisely, the total number of contacts required is not known precisely. The estimate below of the number of records that EBSA plans to purchase from Dun and Bradstreet therefore allows for a 20% margin of error.


Derivation of Expected Number of Large and Small Firm Contacts Required for HDCI, 2007




|                                                                                          | Single Employer Samples: Small (3-99) | Single Employer Samples: Large (100+) | Total |
|------------------------------------------------------------------------------------------|-------|-------|-------|
| 1) Plan Sponsors Needed for FY 2007                                                      | 739   | 616   | 1,355 |
| 2) Among Firms Providing Coverage, Percentage Sponsoring Own Plan/1                      | 98%   | 99%   | 98%   |
| 3) Number of Firms Providing Coverage Needed (#1 / #2)                                   | 754   | 622   | 1,376 |
| 4) Predicted Percentage of In-scope Firms Offering (Single or Multiemployer) Coverage/2  | 31.1% | 88.1% | 44%   |
| 5) Number of In-scope Firms Needed (#3 / #4)                                             | 2,425 | 706   | 3,131 |
| 6) Percentage of Firms on List Predicted to be In-scope/3                                | 72%   | 88%   | 75%   |
| 7) Estimated Number of Firm Contacts Needed (#5 / #6)                                    | 3,368 | 802   | 4,170 |
| 8) Plus 20% Margin of Error (#7 × 1.2)                                                   | 4,041 | 962   | 5,003 |


1/ Calculated from Table 1 of the 2001 HDCI Report and assumed unchanged for 2007. These percentages are 1 minus the percentage of firms that provide coverage solely through multiemployer plans.

2/ From the estimates of the expected universe presented above.

3/ Calculated from Table 1 of the 2001 HDCI Report and assumed unchanged for 2007.
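
The arithmetic of the table can be traced step by step; the sketch below walks the small firm column, with each step dividing by the rate expected to survive that screen.

```python
# Sketch of the contact-count derivation in the table above (small firm column).

sponsors_needed = 739        # row 1: plan sponsors needed for FY 2007
own_plan_rate = 0.98         # row 2: share of covering firms that sponsor their own plan
offer_rate = 0.311           # row 4: predicted share of in-scope firms offering coverage
in_scope_rate = 0.72         # row 6: share of listed firms predicted to be in-scope
margin = 1.2                 # row 8: 20% margin of error

firms_offering = sponsors_needed / own_plan_rate    # row 3: about 754
in_scope_firms = firms_offering / offer_rate        # row 5: about 2,425
contacts = in_scope_firms / in_scope_rate           # row 7: about 3,368
records_to_buy = contacts * margin                  # row 8: about 4,041
print(round(firms_offering), round(in_scope_firms), round(contacts), round(records_to_buy))
```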


Validation of Sample Sizes through Simulation


Although the critical value X was eliminated from equation (6), that equation effectively solves for both n2 and X, because once n2 is known, X can be derived by substituting n2 into either equation (4) or equation (5). The goal of the simulation is to determine, for each sample, whether the calculated pair of values, n2 and X, achieve the target probabilities for type I error (5%) and type II error (10%). The inputs to the simulation are the calculated sample size and critical value for each sample, the 2001 overall violation rates as published in the 2001 HDCI report, the standard error of these violation rates, and the data in the table above.


Each run of the simulation program simulates selection of 5,000 samples for 2007, each with the sample size shown in the table above. For each sample plan, a violation status is randomly assigned assuming that the true universe violation rate decreased by 10 percentage points from its 2001 level in accordance with the assumption for type II error. For the large firm sample, a weight is also randomly selected from the 2001 distribution of weights. A weight of 1 is assigned for the other two samples, which are simple random samples. A weighted violation rate is then computed for each sample. To recognize the uncertainty associated with the point estimate of the 2001 violation rate, we replace each published point estimate with a violation rate randomly selected from a normal distribution having a mean equal to the 2001 point estimate, and standard deviation equal to the standard error of that estimate. For each sample, the change in simulated violation rate is then computed, and changes having an absolute value greater than X are deemed significant. Simulated levels of type I error are all .05 as targeted. Type II error occurs in a sample if the simulated change in violation rate is insignificant despite the assumed decrease in violation rate of 10 percentage points. Simulated levels of type II error depart insignificantly from their targeted level of 10% as summarized below:



Simulated type II error rates (percent), by sample:

| Simulation number | Large Firms | Small Firms | Multiemployer Plans |
|-------------------|-------------|-------------|---------------------|
| #1                | 9.78        | 10.46       | 10.22               |
| #2                | 10.10       | 9.9         | 10.46               |
| #3                | 10.94       | 10.08       | 10.12               |
| #4                | 9.74        | 9.48        | 10.54               |
| #5                | 10.04       | 9.34        | 10.54               |
| #6                | 9.86        | 10.20       | 10.04               |
| #7                | 9.70        | 10.56       | 9.74                |
| #8                | 10.54       | 9.88        | 9.78                |
| #9                | 9.88        | 10.00       | 9.72                |
| #10               | 10.18       | 9.88        | 9.44                |
| Mean              | 10.08       | 9.98        | 10.06               |
| Standard error    | 0.125       | 0.121       | 0.122               |
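
The logic of one simulation run can be sketched compactly for the simple-random-sample case (weights of 1); the sketch below is illustrative only, its inputs are invented, and the project's actual program also drew weights for the large firm sample.

```python
# Compressed sketch of the type II error simulation described above, for the
# simple-random-sample case with weights of 1. All inputs are illustrative only.
import random

def simulate_type2(p_2001, se_2001, n_2007, crit_x, drop=0.10, reps=5000):
    misses = 0
    for _ in range(reps):
        # randomize the 2001 baseline to reflect its sampling uncertainty
        p1 = random.gauss(p_2001, se_2001)
        true_2007 = p_2001 - drop                      # alternate hypothesis: 10-point drop
        # simulate the observed 2007 violation rate for a sample of n_2007 plans
        p2 = sum(random.random() < true_2007 for _ in range(n_2007)) / n_2007
        if abs(p2 - p1) <= crit_x:                     # change judged insignificant
            misses += 1                                # a type II error
    return misses / reps

# With these made-up inputs the simulated type II error rate comes out near 0.10
print(simulate_type2(p_2001=0.45, se_2001=0.014, n_2007=739, crit_x=0.071))
```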


Although the simulations cannot validate the conceptual framework for the sample size calculations, because certain assumptions are common to both, they do, at least approximately, validate all computational aspects other than the finite population correction.


  3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


EBSA plans to contact sample firms by telephone, a method known to produce substantially higher response rates than mail surveys. Field staff will be instructed to make three attempts to contact each sample firm. A tracking system will be used to record the time and day of week of the calls, and EBSA staff will be instructed to vary the times and days after unsuccessful attempts to maximize the chances of finding a representative of the firm at work. Field staff will ask the screening questions in the order that will most expeditiously eliminate out-of-scope firms and thereby minimize the average number of answers required per respondent and the number of incomplete surveys.


Based on experience with the 2001 project, non-response is expected to arise from only two sources: unreachable firms and ERISA section 504(b) restrictions5. In 2001, the percentage of plans that could not be investigated due to 504(b) restrictions was 3% for large plans and 1% overall. In 2007 this restriction may matter even less, because the project will be in the field for more than 12 months, which may make it possible to defer investigations until the 504(b) window of restriction has expired.


Unreachable firms are the more important source of potential non-response. The response rate formula recommended by OMB6 (and by the American Association for Public Opinion Research) assumes that unreachable potential respondents are out-of-scope to the same extent as reachable respondents. Of course, the true extent to which unreachable respondents are out-of-scope could, in theory, range from all to none. The response rate table shown in the response to question 1 above therefore presents minimum and maximum response rates in addition to the OMB-recommended rates. Because unreachable firms are believed to often be out-of-business and rarely to sponsor health plans, estimates are not adjusted for non-response. This decision results from a judgment that the maximum response rates shown above are reasonable and that there is no credible basis for estimating violation rates among the very small number of plans precluded from investigation due to 504(b) restrictions.


  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The project will be guided by experience gained from the 2001 HDCI project.


  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


David McCarthy of EBSA’s Office of Policy and Research (202.693.8430) did the statistical design for this project. The statistical methodology described in this supporting statement was reviewed and approved by the Department’s Bureau of Labor Statistics, Office of Survey Methods Research (Steven Cohen, 202.691.7400). The data will be collected by EBSA field offices. Jeffrey Monhart (202.693.8454) is chief of the Division of Field Operations in EBSA.


References


Kish, L. (1965), Survey Sampling, New York: Wiley


Little, Roderick J. (2004), “To Model or Not To Model? Competing Modes of Inference for Finite Population Sampling,” Journal of the American Statistical Association, 99, 546-556

1 Outcome Goals 2 and 3 of the Department’s FY 2003-2008 Strategic Plan describe the Department’s commitment to promoting voluntary compliance, and thereby reducing violations, by helping the public understand the federal employment laws the Department administers and enforces.

2 See response to question #63 in memorandum from OMB Administrator John D. Graham to the President’s Management Council dated January 20, 2006 – subject: “Guidance on Agency Survey and Statistical Information Collections.”


3 Although this formula strictly applies to a known, rather than estimated universe size, given the relative magnitudes of n and N, the variance is extremely insensitive to error in the estimation of N.

4 An email from SAS Technical Support to David McCarthy dated September 5, 2006 provided this derivation.


5 ERISA section 504(a) grants investigative authority to the Secretary of Labor and section 504(b) limits that authority by stating: “The Secretary may not under the authority of this section require any plan to submit to the Secretary any books or records of the plan more than once in any 12 month period, unless the Secretary has reasonable cause to believe there may exist a violation of this subchapter or any regulation or order thereunder.”

6 See response to question #63 in memorandum from OMB Administrator John D. Graham to the President’s Management Council dated January 20, 2006 – subject: “Guidance on Agency Survey and Statistical Information Collections.”




