
National Compensation Survey (NCS)

1220-0164

March 2018


Supporting Statement Part B: National Compensation Survey (NCS)



B. Collection of Information Employing Statistical Methods



For detailed technical materials on the sample allocation, selection, and estimation methods, as well as other related statistical procedures, see the BLS Handbook of Methods, BLS technical reports, and the ASA papers listed in the references section. The following is a brief summary of the primary statistical features of the NCS.


As described in Sections 1 – 3 of this document, the NCS sample is selected using a 2-stage stratified design with probability proportional to employment sampling at each stage.  The first stage of sample selection is a probability sample of establishments within pre-defined geographic areas, industry, and ownership type (privately-owned and State and local government), and the second stage of sample selection is a probability sample of jobs within sampled establishments.  The NCS uses 24 geographic areas, one for each of the 15 largest metropolitan areas by employment and one for the remainder of each of the nine Census Divisions. Data from all sampled establishments are used to produce the cost and benefit products.



1a. Universe


The NCS measures employee compensation in the form of wages and benefits for detailed geographic areas, industries, and occupations, as well as national-level estimates by industry and occupation. The universe for this survey consists of the Quarterly Contribution Reports (QCRs) filed by employers subject to State Unemployment Insurance (UI) laws. The BLS receives these QCRs for the Quarterly Census of Employment and Wages (QCEW) Program from the 50 States, the District of Columbia, Puerto Rico, and the U.S. Virgin Islands. The QCEW data, which are compiled for each calendar quarter, provide a comprehensive business name and address file with employment, wage, detailed geography (e.g., county), and industry information at the six-digit North American Industry Classification System (NAICS) level. This information is provided for over nine million business establishments required to pay unemployment insurance taxes, most of which are within the scope of this survey.


1b. Sample


Stratification, Sample Allocation, and Sample Selection


In FY 2012, the NCS began selecting samples using a 2-stage stratified design with probability proportional to employment sampling at each stage. The first stage of sample selection is a probability sample of establishments within pre-defined geographic areas, industry, and ownership of interest, and the second stage of sample selection is a probability sample of jobs within sampled establishments. For more information on the sample design for private industry establishments, see the ASA paper by Ferguson et al. titled, “Update on the Evaluation of Sample Design Issues in the National Compensation Survey” (Attachment 1). For more information on the sample design for State and local government businesses, see the ASA paper by Ferguson et al. titled, “State and Local Government Sample Design for the National Compensation Survey” (Attachment 2).

Each sample of establishments is drawn by first stratifying the establishment sampling frame by defined geographic area of interest, industry, and ownership. The industry strata for the State and local government and private industry samples, defined using the North American Industry Classification System (NAICS), are shown in the first two tables below. The 24 geographic areas of interest for the NCS appear in the third table.


NCS Industry Stratification for State and Local Government Samples


Aggregate Industry | Detailed Industry | Included NAICS Codes | Establishments in Frame | Sample Size
Education | Elementary and Secondary Education | 6111 | 61,939 | 631
Education | Colleges and Universities | 6112, 6113 | 7,457 | 205
Education | Rest of Education | 61 excl 6111-6113 | 1,256 | 6
Financial Activities | Other Service-providing - Part A | 52-53 | 244 | 24
Goods Producing | Goods-Producing | 21, 23, 31-33 | 6,349 | 25
Health Care, including Hospitals and Nursing Care | Hospitals | 622 | 2,571 | 83
Health Care, including Hospitals and Nursing Care | Nursing Homes | 623 | 2,100 | 16
Health Care, including Hospitals and Nursing Care | Rest of Health | 62 excl 622-623 | 8,490 | 28
Service Providing | Trade, Transportation, and Utilities | 42, 44-45, 48-49, 22 | 12,847 | 49
Service Providing | Public Administration | 92 excl 928 | 108,358 | 472
Service Providing | Other Service-providing - Part B | 51, 54-56, 71-72, 81 excl 814 | 20,430 | 57


NCS Industry Stratification for Private Establishment Samples


Aggregate Industry | Detailed Industry | Included NAICS Codes | Establishments in Frame | Sample Size*
Education | Educational Services (Rest of) | 61 (excl 6111-6113) | 78,008 | 60
Education | Elementary and Secondary Schools | 6111 | 16,899 | 86
Education | Junior Colleges, Colleges and Universities | 6112, 6113 | 8,023 | 274
Finance, Insurance and Real Estate | Finance (Rest of) | 52 (excl 524) | 279,462 | 950
Finance, Insurance and Real Estate | Insurance | 524 | 185,133 | 581
Finance, Insurance and Real Estate | Real Estate, Renting, Leasing | 53 | 349,578 | 213
Goods Producing | Mining | 21 | 34,579 | 84
Goods Producing | Construction | 23 | 744,370 | 913
Goods Producing | Manufacturing | 31-33 | 334,610 | 1,074
Health Care, including Hospitals and Nursing Care | Healthcare, Social Assistance (Rest of) | 62 (excl 622, 623) | 1,230,175 | 186
Health Care, including Hospitals and Nursing Care | Hospitals | 622 | 8,419 | 254
Health Care, including Hospitals and Nursing Care | Nursing and Residential Care Facilities | 623 | 72,659 | 387
Service Providing | Utilities | 22 | 17,130 | 122
Service Providing | Wholesale Trade | 42 | 619,782 | 703
Service Providing | Retail Trade | 44-45 | 1,031,277 | 1,446
Service Providing | Transportation and Warehousing | 48-49 | 225,026 | 316
Service Providing | Information | 51 | 143,541 | 367
Service Providing | Professional, Scientific, Technical | 54 | 1,075,999 | 411
Service Providing | Management of Companies and Enterprises | 55 | 58,245 | 71
Service Providing | Admin., Support, Waste Management | 56 | 485,943 | 455
Service Providing | Arts, Entertainment, Recreation | 71 | 127,658 | 105
Service Providing | Accommodation and Food Services | 72 | 647,059 | 394
Service Providing | Other Services (excl Public Administration) | 81 (excl 814) | 563,765 | 352


* The sample size is the estimated total three-year sample size allocated to each industry. Approximately one-third of these units will be initiated each year.


24 Geographic Areas for the NCS Sample Design


Atlanta--Athens-Clarke County--Sandy Springs, GA CSA


Boston-Worcester-Providence, MA-RI-NH-CT CSA


Chicago-Naperville, IL-IN-WI CSA


Dallas-Fort Worth, TX-OK CSA


Detroit-Warren-Ann Arbor, MI CSA


Houston-The Woodlands, TX CSA


Los Angeles-Long Beach, CA CSA


Minneapolis-St. Paul, MN-WI CSA


New York-Newark, NY-NJ-CT-PA CSA


Philadelphia-Reading-Camden, PA-NJ-DE-MD CSA


San Jose-San Francisco-Oakland, CA CSA

Seattle-Tacoma, WA CSA


Washington-Baltimore-Arlington, DC-MD-VA-WV-PA CSA


Miami-Fort Lauderdale-Port St. Lucie, FL CSA


Phoenix-Mesa-Scottsdale, AZ MSA

Rest of New England Census Division (excl. areas above)

Rest of Middle Atlantic Census Division (excl. areas above)

Rest of East South Central Census Division (excl. areas above)

Rest of South Atlantic Census Division (excl. areas above)

Rest of East North Central Census Division (excl. areas above)

Rest of West North Central Census Division (excl. areas above)

Rest of West South Central Census Division (excl. areas above)

Rest of Mountain Census Division (excl. areas above)

Rest of Pacific Census Division (excl. areas above)


* The above area definitions are based on 2010 Decennial Census data, which were released by the Census Bureau in February 2013. The NCS began using these definitions with the State and local government sample selected in FY 2015, which began collection in June 2015. The first private industry sample using these definitions was selected in FY 2016 and began collection in June 2016. NCS samples prior to FY 2015 were selected using the area definitions released by the Census Bureau in 2009.


After the sample of establishments is drawn, jobs are selected within each sampled establishment. The number of jobs selected ranges from 4 to 8 in private establishments (other than aircraft manufacturing) and from 4 to 20 in government establishments, depending on the total number of employees in the establishment. In aircraft manufacturing, the number of jobs selected ranges from 4 in establishments with fewer than 50 workers to 32 in establishments with 10,000 or more workers. In any establishment with fewer than 4 workers, all jobs are selected, so the number of jobs equals the number of workers. The probability of a job being selected is proportional to its employment within the establishment, except in cases where all jobs in the establishment are selected.
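
Expressed as a formula, with notation introduced here only for illustration, the selection probability of job $j$ within a sampled establishment under this probability-proportional-to-employment rule is approximately

$$P(\text{job } j \text{ selected}) \;=\; \min\!\left(1,\; \frac{n\, e_j}{\sum_{k} e_k}\right),$$

where $e_j$ is the employment of job $j$, the sum runs over all jobs in the establishment, and $n$ is the number of jobs to be selected. The min(1, ·) cap reflects the take-all cases described above; it is an illustrative simplification rather than the exact production rule.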

Scope - The NCS sample is selected from the populations as defined above.


Sample Allocation — The total NCS sample consists of approximately 11,400 establishments. The private portion of this sample, approximately 9,800 establishments, is allocated based on strata defined by the survey’s 24 geographic areas and five aggregate industries. The government portion of this sample, approximately 1,600 establishments, is also allocated based on strata defined by the survey’s 24 geographic areas and five aggregate industries, although the detailed industries for government are different from those for private industry. Self-representing (certainty) establishments are assigned a sampling weight of 1.00, and non-certainty establishments are assigned a sampling weight equal to the inverse of their selection probability.


For private industry samples, NCS computes detailed allocations and identifies multi-year certainty establishments once every three years under the three-year rotation cycle. If budget or resource levels change significantly, allocation will be recomputed, multi-year certainties will be re-selected, and a new three-year rotation will begin.

The sample allocation process starts with a total budgeted sample size. For private industry samples, the NCS uses targeted percentages across industries, along with the frame data, to determine how to distribute the sample units among the sampling cells. Because the NCS sample for private non-aircraft manufacturing is relatively small, the allocation uses five aggregate-industry strata together with a modified measure of size within each of the 23 detailed industries. This adjustment to the measure of size (MOS) allows fewer strata while controlling the number of units needed in the 23 detailed industries for which the NCS wants to publish private industry estimates.


The total three-year NCS private non-aircraft manufacturing sample size is first allocated to the five aggregate-industry strata. The sample size of each stratum is calculated so that the distribution of the new sample mirrors the desired distribution of the full sample in order to maximize the ability to meet publication goals. Next, each of the five aggregate stratum allocations is divided among the 24 geographic areas in proportion to the total adjusted employment of the frame units in the areas, resulting in 120 initial area-industry cell allocations. The MOS of a frame unit is the product of the unit’s employment and an adjustment factor that is used to maintain the current distribution of the sample among the 23 detailed NCS industries.
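
The proportional division of a stratum allocation among areas, and the adjusted MOS, can be sketched as follows. This is an illustration only; the function names, the largest-remainder rounding rule, and the data layout are assumptions, not the production NCS procedure.

import math

def adjusted_mos(employment, adjustment_factor):
    # Measure of size: the unit's employment times the detailed-industry
    # adjustment factor used to maintain the detailed-industry mix.
    return employment * adjustment_factor

def allocate_to_areas(stratum_sample_size, area_adjusted_employment):
    # Divide one aggregate-industry stratum allocation among geographic areas
    # in proportion to total adjusted employment; largest-remainder rounding
    # keeps the integer allocations summing to the stratum total.
    total = sum(area_adjusted_employment.values())
    raw = {area: stratum_sample_size * emp / total
           for area, emp in area_adjusted_employment.items()}
    alloc = {area: math.floor(r) for area, r in raw.items()}
    shortfall = stratum_sample_size - sum(alloc.values())
    for area in sorted(raw, key=lambda a: raw[a] - alloc[a], reverse=True)[:shortfall]:
        alloc[area] += 1
    return alloc

For example, a stratum allocation of 300 units spread over the 24 areas in proportion to their adjusted employment would yield integer area-cell allocations summing to 300.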


Multi-year private industry certainty units are identified using the initial cell allocations and the adjusted MOS. Each initial area-industry cell allocation is then reduced by the number of certainty units in the cell to create 120 non-certainty area-industry cell allocations. The MOS adjustment factors are recalculated to exclude the certainty units. Finally, the non-certainty allocations are divided among the three years of the sample design: each cell allocation is divided by three, the integer portion is assigned to each of the three years, and the remainder is assigned one establishment at a time in a manner that keeps each annual selected sample the same size and lets the size of each sampling cell vary by no more than one from year to year.
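
A minimal sketch of the year-split step for a single cell, assuming the remainder units are rotated across years by an offset (the offset mechanism is an illustrative assumption used to balance the annual totals across cells):

def split_across_years(cell_allocation, start_year_offset=0):
    # Each of the three sample years receives the integer part of
    # cell_allocation / 3; the remainder (0, 1, or 2 units) is assigned
    # one establishment at a time, starting at start_year_offset.
    base, remainder = divmod(cell_allocation, 3)
    yearly = [base, base, base]
    for i in range(remainder):
        yearly[(start_year_offset + i) % 3] += 1
    return yearly

# Example: a cell allocation of 10 yields [4, 3, 3]; staggering the offset
# across cells keeps the three annual totals equal.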


Under normal processing, private industry sample allocation and identification of multi-year private certainty units are executed once every three years. During years when prior-year allocations and multi-year certainties are being used, the most recently identified set of multi-year certainty establishments is removed from the frame for operational purposes and added to the final selected sample. This ensures that each sample group represents the entire frame. The most recent non-certainty allocations and the sample frame without the multi-year certainties are used to compute the final measure-of-size adjustment factor and to set the non-certainty sample size for each of the 120 area-industry sampling cells.


A sample of private aircraft manufacturing establishments is selected once every three years at the same time that we begin a new three-year rotation for the rest of private industry. Measures of size are set for these establishments in the same manner as is done for the other private industry establishments, except that there is no adjustment for detailed industries within this group.


A sample of State and local government establishments (also referred to as the public sector) is selected once every ten years. The public sector allocation process starts with the total budgeted sample size. First, one establishment is allocated to each of the 120 sampling cells. The remaining total sample size is then allocated to the five aggregate industry strata in proportion to the total employment within each industry. Finally, each of the five aggregate stratum allocations is divided among the 24 geographic areas in proportion to the total employment of the frame units in the areas, resulting in 120 area-industry cell allocations.


State and local government certainty units are identified using the initial cell allocations and the establishment MOS. When an establishment MOS is greater than the total employment in the cell divided by the initial cell allocation for the cell, the public sector establishment is flagged as a certainty unit and the remaining cell allocation is reduced by the number of certainty selections. This identification process is repeated until no more certainty units exist in any cell. Each initial area-industry cell allocation is then reduced by the number of certainty units in the cell to create 120 non-certainty area-industry cell allocations.
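
The iterative certainty rule just described can be sketched for a single sampling cell as follows. The function name and the use of the remaining (rather than initial) employment and allocation on later passes are illustrative assumptions; the production rule may differ in detail.

def identify_certainty_units(units, cell_allocation):
    # units: dict mapping establishment id -> measure of size (employment).
    # Flag units whose MOS exceeds the cell employment divided by the cell
    # allocation; reduce the allocation and repeat until no new certainties.
    certainties = set()
    allocation = cell_allocation
    while allocation > 0:
        remaining = {u: mos for u, mos in units.items() if u not in certainties}
        if not remaining:
            break
        threshold = sum(remaining.values()) / allocation
        new = {u for u, mos in remaining.items() if mos > threshold}
        if not new:
            break  # no more certainty units in this cell
        certainties |= new
        allocation -= len(new)
    # Return certainty ids and the allocation left for non-certainty sampling.
    return certainties, max(allocation, 0)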


Sample Selection - Under this design, NCS selects an independent non-certainty sample of establishments every year within each of the five aggregate industry and 24 geographic area sampling cells. Within each of the sampling cells, units are sorted by detailed industry (23 for private and 11 for public sector), final adjusted MOS, and establishment identification number. The selection process follows a systematic Probability Proportionate to Size (PPS) approach where the measure of size includes the adjustment factor as defined in the allocation section. The private industry multi-year certainty units identified in the previous step are added to each private industry non-certainty sample to form the entire establishment sample.


Sample weights are assigned to each of the selected non-certainty establishments in the sample to represent the non-certainty portion of the frame. Units selected as certainty are self-representing and will carry a sample weight of one. The sample weight for the non-certainty units is the inverse of the probability of selection.
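
A simplified sketch of the systematic PPS selection and the resulting weights for a single sampling cell is shown below. The function and variable names are illustrative, and the sketch assumes no unit's adjusted MOS exceeds the sampling interval (such a unit would instead be taken with certainty).

import random

def systematic_pps_sample(frame, n, seed=None):
    # frame: list of (establishment_id, adjusted_mos) pairs, already sorted
    # by detailed industry, adjusted MOS, and establishment id.
    # Returns a dict mapping each selected id to its sample weight.
    total_mos = sum(mos for _, mos in frame)
    interval = total_mos / n  # systematic sampling interval
    start = random.Random(seed).uniform(0, interval)
    targets = iter(start + i * interval for i in range(n))
    next_target = next(targets)
    sample, cumulative = {}, 0.0
    for est_id, mos in frame:
        cumulative += mos
        if next_target is not None and next_target < cumulative:
            # The inclusion probability under PPS is n * mos / total_mos,
            # so the design weight is its inverse; certainties get weight 1.
            sample[est_id] = total_mos / (n * mos)
            next_target = next(targets, None)
    return sample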



2a. Sample Design


Sample Rotation - Collection of the first private industry sample (including a sample of aircraft manufacturing establishments) under the national-based design began in the spring of 2012. Starting in the spring of 2013, NCS began initiating the second private industry sample. The third private industry sample began initiation in the spring of 2014, completing the first rotation of private industry establishments under this sample design. For the State and local government sectors, a full sample is selected approximately every 10 years. The first State and local government sector sample under this new national design began initiation in the spring of 2015.


All sample units, both private and State and local government, are assigned to one of four collection panels for initiation. Once a sample of establishments is selected and collection panels have been assigned, BLS Regional Office employees review and refine the sample before collection begins. As part of this refinement process, establishments may be moved from one collection panel to another to coordinate initiation across establishments and/or to reduce the travel costs associated with initiation efforts. Establishments are initiated over a fifteen-month period, with one collection panel required to be completed every three months after an initial three-month detailed refinement period. Once initiated, a unit is updated quarterly until it rotates out of the design. No newly initiated establishment is used in the NCS estimates until the entire sample has been initiated and updated for a common base quarter. During this base quarter, data for the newly initiated sample as well as the prior sample are updated. After the base quarter is completed, the entire sample is added to the data available for estimation, while the oldest sample in estimation is dropped from further updates and inclusion in the estimates.



Sample Replacement Scheme


Each year, the NCS selects a new sample of private sector establishments from the most recent available frame data. A new sample of jobs is selected within each private establishment at least once every three years (every ten years for government establishments) as the establishment is initiated into the survey process. The entire private NCS sample is completely replaced over a three-year period on a rotational basis.


The primary objectives of the replacement scheme are to reduce the reporting burden on individual establishments by rotating units out of the sample and to ensure that the establishment sample remains representative of the universe it is designed to cover over time. Because samples are selected independently from year to year, an establishment may be selected in a future replacement sample regardless of whether it has already rotated out of the current sample.


2b. Estimation Procedure


The National Compensation Survey (NCS) produces indexes measuring change over time in labor costs through the Employment Cost Index (ECI) and the level of average costs per hour worked through the Employer Costs for Employee Compensation (ECEC). The NCS also provides benefits incidence estimates: the percentage of workers with access to and participating in employer-sponsored benefit plans. The survey covers a broad range of benefits, including holidays and vacations, sick leave, health and life insurance, and retirement plans. Details of employer-provided health and retirement plan provisions are also available.

The ECI is a measure of the change in the employer costs of labor, independent of the influence of employment shifts among occupations and industry categories. The total compensation series include changes in wages and salaries and in employer costs for employee benefits. The ECI calculates indexes of total compensation, wages and salaries, and benefits separately for all civilian workers in the United States (as defined by the NCS), for private industry workers, and for workers in state and local government. For all of these categories, the ECI calculates the same measures by occupational group, industry group, and worker and establishment characteristics. Seasonally adjusted series are calculated as well.


The ECI is a modified Laspeyres index (i.e., an index reflecting the change in labor costs over time). The basic computational framework for the ECI is the standard formula for an index number with fixed weights, modified for special statistical conditions and to account for the sampling methodology.
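
In simplified form, suppressing the quarter-to-quarter chaining of cost relatives and the sampling and nonresponse adjustments used in production, the fixed-weight Laspeyres structure can be written as

$$I_t \;=\; 100 \times \frac{\sum_{i} E_{i}^{0}\, \bar{c}_{i}^{\,t}}{\sum_{i} E_{i}^{0}\, \bar{c}_{i}^{\,0}},$$

where $E_{i}^{0}$ is the fixed base-period employment weight for estimation cell $i$, $\bar{c}_{i}^{\,t}$ is the mean compensation cost per hour worked in cell $i$ in quarter $t$, and $\bar{c}_{i}^{\,0}$ is the corresponding base-period cost. The notation is introduced here for illustration; the exact production formula appears in the BLS Handbook of Methods chapter cited below.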


The ECEC series measures the average cost to employers for wages and salaries, and benefits, per employee hour worked. The series provides data on employer costs per hour worked for total compensation, wages and salaries, total benefits, and the following benefits: paid leave—vacations, holidays, sick leave, and personal leave; supplemental pay—premium pay (such as overtime, weekend, and holiday pay) for work in addition to the regular work schedule, shift differentials, and nonproduction bonuses (such as yearend, referral, and attendance bonuses); insurance benefits—life, health, short-term disability, and long-term disability insurance; retirement and savings benefits—defined benefit and defined contribution plans; and legally required benefits—Social Security (the Old-Age, Survivors, and Disability Insurance (OASDI) program), Medicare, federal and state unemployment insurance, and workers’ compensation. Cost data are presented both in dollar amounts and as percentages of total compensation, and are published quarterly.
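
Conceptually, and with notation introduced here only for illustration, the ECEC for a given compensation component is a weighted cost per hour worked:

$$\text{ECEC} \;=\; \frac{\sum_{j} w_j\, C_j}{\sum_{j} w_j\, H_j},$$

where, for sampled job quote $j$, $w_j$ is the final sample weight, $C_j$ is the component's cost for the reference period, and $H_j$ is the corresponding hours worked. The production estimator includes additional adjustments described in the BLS Handbook of Methods.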


The NCS collects and publishes data annually on the incidence of employer-provided benefits and on the key provisions (terms) of employee benefit plans, for civilian workers, workers in private industry, and State and local government workers. In addition, the NCS publishes data on detailed provisions of coverage in two major benefit areas: health insurance and retirement plans. Health data include medical plan provisions, such as deductibles, coinsurance, and out-of-pocket maximums, as well as details of dental, vision, and prescription drug benefits. Provisions of defined benefit and defined contribution retirement plans, such as eligibility requirements and benefit formulas, are also published. Detailed provision estimates are produced based on the initiation year (i.e., the first year of participation in the NCS for the sampled establishment) of each sample group, using data collected via Summary Plan Descriptions (SPDs), plan summary sheets, and Summaries of Benefits and Coverage (SBCs).

For more detailed information on the procedures used in the calculation of the NCS estimates see Chapter 8 of the BLS Handbook of Methods (available on the BLS Internet at http://www.bls.gov/opub/hom/pdf/homch8.pdf).



2c. Reliability


The estimation of sample variances for the NCS is accomplished through the method of Balanced Repeated Replication (BRR). This replication technique uses half-samples of the original sample and calculates estimates using those sub-samples. The replicate weights in both half-samples are modified using Fay’s method of perturbation. The sample variance is calculated by measuring the variability of the estimates across these sub-samples. For a mathematical presentation of this method, see the BLS Handbook of Methods listed in the references.


Before estimates of these characteristics are released to the public, they are first screened to ensure that they do not violate the BLS confidentiality pledge. BLS promises each private industry respondent, and each government sector respondent that requests confidentiality, that it will not release the reported data to the public in a manner that would allow others to identify the establishment, firm, or enterprise.


Measuring the Quality of the Estimates


The two basic sources of error in the estimates are bias and variance. Bias is the amount by which estimates systematically do not reflect the characteristics of the entire population. Many of the components of bias can be categorized as either response or non-response bias.


Response bias occurs when respondents’ answers systematically differ in the same direction from the correct values. For example, this occurs when respondents incorrectly indicate no change in benefits costs when benefits costs actually increased. Response bias can also arise when data are collected for a unit other than the sampled unit. Response bias can be measured by using a re-interview survey. If properly designed and implemented, re-interview surveys can also indicate where improvements are needed and how to make them. The NCS has a Technical Re-interview Program (TRP) that performs a records check on a sample of each field economist’s schedules of collected data. TRP is part of the overall review process. TRP verifies directly with respondents a sample of elements originally collected by the field economist. The results are reviewed for adherence to NCS collection procedures. Although not explicitly used to measure bias, this program allows the NCS to identify procedures that are being misunderstood and to make improvements in the NCS Data Collection Manual and training program.


Non-response bias is the amount by which estimates obtained do not properly reflect the characteristics of non-respondents. This bias occurs when non-responding establishments have earnings and benefit levels and movements that are different from those of responding establishments. Non-response bias is being addressed by continuous efforts to reduce the amount of non-response. The results from initial analysis were documented in the 2006 ASA Proceedings of Survey Research Methods Section1. A follow-up study from 2008 is also listed in the references. NCS continues to monitor response rates to ensure that the potential for non-response is kept as low as possible. Details regarding adjustment for non-response and current non-response bias research are provided in Section 3 below.


Another source of error in the estimates is sampling variance. Sampling variance is a measure of the fluctuation between estimates from different hypothetical samples using the same sample design. Sampling variance in the NCS is calculated using a technique called balanced half-sample replication. For national estimates, this is done by forming 120 different re-groupings of half of the sample units. For each half-sample, a "replicate" estimate is computed with the same formula as the regular or "full-sample" estimate, except that the final weights are adjusted: if a unit is in the half-sample, its weight is multiplied by (2-k); if not, its weight is multiplied by k. For all NCS publications, k = 0.5, so the multipliers are 1.5 and 0.5. The sampling variance computed using this approach is the sum of the squared differences between each replicate estimate and the full-sample estimate, averaged over the number of replicates and adjusted by the factor 1/(1-k)² to account for the adjustment to the final weights. For more details, see the NCS chapter of the BLS Handbook of Methods. The standard error, which is the square root of the variance, for primary aggregate estimates of the index of quarterly change is typically less than 0.5 percent. The relative standard error, which is the standard error divided by the estimate, for aggregate estimates of compensation, wage, or benefit levels is typically less than 5 percent. The standard errors or relative standard errors are included with published NCS reports at the following website: http://www.bls.gov/ncs/ect/ectvar.htm.
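
Written as a formula, the balanced half-sample (Fay) variance estimator described above is

$$\widehat{V}\left(\hat{\theta}\right) \;=\; \frac{1}{R\,(1-k)^2}\sum_{r=1}^{R}\left(\hat{\theta}_r - \hat{\theta}\right)^2, \qquad R = 120,\; k = 0.5,$$

where $\hat{\theta}$ is the full-sample estimate and $\hat{\theta}_r$ is the $r$-th replicate estimate computed with the final weights multiplied by $(2-k)$ for units in the half-sample and by $k$ otherwise.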


Variance estimation also serves another purpose. It identifies industries and occupations that contribute substantial portions of the sampling variance. Allocating more sample units to these domains often improves the efficiency of the sample. These variances will be considered in allocation and selection of the future replacement samples.



2d. Data Collection Cycles


NCS data are collected quarterly for all schedules.


3. Non-Response


There are three types of non-response: permanent non-response, temporary non-response, and partial non-response. The non-responses can occur at the establishment level, occupation level, or benefit item level. The assumption for all non-response adjustments is that non-respondents are similar to respondents, within certain cells formed by cross-classifying establishment characteristics.


To adjust for permanent establishment or occupation non-response at the initial interview, the weights of responding units or occupations that are deemed to be similar are adjusted appropriately. Establishments are considered similar if they are in the same ownership, 2-digit NAICS industry, and establishment employment size class. If there are insufficient data at this level, then a broader level of aggregation is considered.


For temporary and partial non-response, a replacement value is imputed based on information provided by establishments with similar characteristics. Imputation is done separately for each benefit both in the initial period and in subsequent update periods. Imputation is also done for each missing wage estimate after the initial period. In the rare event that the BLS cannot determine whether or not a benefit practice exists for a non-respondent, the average cost is imputed based on data from all responding establishments (including those with no plans and plans with zero costs).


There is a continuous effort to maximize response rates. We are developing and providing respondents with new and useful products to demonstrate the importance of NCS data and their participation. Examples include the Beyond the Numbers publications (http://www.bls.gov/opub/btn/) and industry briefs on selected industries that are provided to field economists to help them identify industry-specific collection challenges. We are continually exploring alternative methods for respondents to report their data.


For June 2017, the unweighted response rates for the NCS were 73% for initiation establishments and 90% during update collection. The corresponding weighted response rates were 71% for initiation establishments and 89% for update establishments during the June 2017 quarter.

3a. Maximize Response Rates


To maximize the response rate for this survey, field economists initially refine addresses to ensure appropriate contact with the employer. Then, employers are mailed a letter explaining the importance of the survey and the need for voluntary cooperation. The letter also includes the Bureau’s pledge of confidentiality. A field economist calls the establishment after the package is sent and attempts to enroll it in the survey. Non-respondents and establishments that are reluctant to participate are re-contacted by a field economist specially trained in refusal aversion and conversion. Additionally, respondents are offered a variety of methods, including telephone, fax, email, and the internet, through which they can provide data.


3b. Non-Response Adjustment


As with other surveys, despite all efforts to maximize response rates, NCS experiences non-response. To adjust for the non-response, NCS divides non-response into two groups: 1) unit non-respondents and 2) item non-respondents. Unit non-respondents are the establishments or jobs that do not report any compensation data, whereas item non-respondents are the establishments that report only a portion of the requested compensation data, such as wages for a subset of sampled jobs.


Unit non-response is treated using a Non-Response Adjustment Factor (NRAF). Item non-response is addressed by item imputation, that is, by filling in missing data with plausible values approximating what would have been reported had the data actually been collected. Within each sampling cell, NRAFs are calculated each year based on the ratio of the number of viable establishments to the number of usable respondents. The details regarding the NRAF procedure are given in Chapter 8 of the Bureau of Labor Statistics’ Handbook of Methods (http://www.bls.gov/opub/hom/pdf/homch8.pdf).
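
In schematic form, for sampling cell $c$ the adjustment factor is

$$\text{NRAF}_c \;=\; \frac{\text{number of viable establishments in cell } c}{\text{number of usable respondents in cell } c},$$

and, as is standard for this type of weighting adjustment (the exact NCS application is described in the Handbook chapter cited above), the weights of the usable respondents in the cell are multiplied by $\text{NRAF}_c$ so that they also represent the non-respondents.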


The method used to adjust for item non-response at the establishment and quote level is a cell-mean-weighted procedure. Details of this procedure are available in the BLS Handbook of Methods (http://www.bls.gov/opub/hom/pdf/homch8.pdf). Other techniques are used to impute for item non-response for benefit estimates; these are described in the CWC article “Recent Modifications of Imputation Methods for National Compensation Survey Benefits Data,” available at http://www.bls.gov/opub/mlr/cwc/recent-modification-of-imputation-methods-for-national-compensation-survey-benefits-data.pdf.
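
A weighted cell mean of the following general form is the usual shape of such a procedure; the specific imputation cells and weights used by the NCS are those described in the Handbook chapter cited above:

$$\tilde{y} \;=\; \frac{\sum_{i \in R_c} w_i\, y_i}{\sum_{i \in R_c} w_i},$$

where $\tilde{y}$ is the replacement value for a missing item in imputation cell $c$, $R_c$ is the set of respondents in the cell, and $w_i$ and $y_i$ are their weights and reported values.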


3c. Non-Response Bias Research


Research has been completed to assess whether the non-respondents to the NCS survey differ systematically in some important respect from the respondents of the survey and would thus bias NCS estimates. Details of this study are described in the two papers by Ponikowski, McNulty, and Crockett referenced in Section 6.



4. Testing Procedures


4a. Tests of Collection Procedures


The NCS has identified two projects to explore alternative methods of collecting data elements that are currently obtained from respondents. The major goal of the first project is to reduce burden for private sector respondents. Presently, respondents are asked for their expenditures for State Unemployment Insurance (SUI) in order to calculate SUI costs per hour worked. The Quarterly Census of Employment and Wages (QCEW) collects, with a lag, information on an establishment’s quarterly contributions to SUI, along with the taxable wages paid quarterly for SUI purposes. The ratio of the two can be used, in combination with other information already collected from the respondent, to form an alternative measure of SUI costs per hour worked. Analysis is underway to determine whether this alternative measure is of sufficient quality to be used instead of direct collection.
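
A minimal sketch of the idea, with the combination formula and argument names chosen here purely for illustration (the actual measure under evaluation may combine the QCEW ratio with respondent data differently):

def alt_sui_cost_per_hour(qcew_sui_contributions, qcew_taxable_wages,
                          ncs_taxable_wages_per_hour_worked):
    # Effective SUI tax rate implied by the QCEW administrative data,
    # applied to wage information already collected by the NCS.
    effective_rate = qcew_sui_contributions / qcew_taxable_wages
    return effective_rate * ncs_taxable_wages_per_hour_worked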


The second project under the NCS is a test to perform text analysis on a sample of Summaries of Benefits and Coverage (SBCs) collected from NCS respondents. Production of SBCs is required by the Affordable Care Act (ACA). In order to facilitate comparisons between plans, SBCs use a standardized template, which may be customized by respondents to capture plan particularities. This standardization makes them more readily amenable to automated analysis through text analysis techniques. For this project, the following steps must be completed:

  • Select a sample of SBCs whose plan provisions were coded into the NCS database by NCS field economists; this is the training sample from which the text analysis program is developed.

  • Convert the SBCs in the training sample to readable text using optical character recognition software.

  • Using the readable text, develop a text analysis program to extract and code a small set of NCS data elements.

  • Select a new sample of SBCs (the validation sample) and their coded provisions to test the output of the previous step for accuracy.

A preliminary run of the test has been completed for two provisions. The goal is to continue working in FY 2018 to increase the accuracy of the text analysis and expand the number of provisions extracted.
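
One possible shape of the OCR and extraction steps is sketched below. The libraries (pdf2image, pytesseract) and the single regular-expression rule are illustrative choices, not the tools or rules actually used by BLS.

import re
from pdf2image import convert_from_path  # illustrative third-party OCR toolchain
import pytesseract

def sbc_to_text(pdf_path):
    # Convert a scanned SBC to machine-readable text via OCR (second step above).
    pages = convert_from_path(pdf_path)
    return "\n".join(pytesseract.image_to_string(page) for page in pages)

def extract_individual_deductible(text):
    # Toy extraction rule (third step above): pull one provision, the overall
    # deductible, from the standardized "What is the overall deductible?" row
    # of the SBC template.
    match = re.search(r"overall deductible\?.*?\$([\d,]+)",
                      text, flags=re.IGNORECASE | re.DOTALL)
    return int(match.group(1).replace(",", "")) if match else None

Values extracted this way would then be compared with the provisions coded by field economists for the validation sample.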


4b. Tests of Survey Design Procedures


Over the last several decades, the NCS has undergone many changes leading up to the survey design previously in operation, which had been used by the NCS since the mid-1990s. In February 2011, the Bureau of Labor Statistics began implementing a change to the Locality Pay Survey (LPS) component of the NCS, which is used to produce annual occupational earnings data for the nation, each Census Division, and selected geographic areas. Under this change, occupational earnings estimates are now produced using a modeling technique that combines the national data from the NCS with the locality data from the Occupational Employment Statistics (OES) survey (BLS Handbook of Methods, Chapter 3). All other estimates computed using NCS data, including the Employment Cost Index, Employer Costs for Employee Compensation, and various measures of access to and participation in employer-provided benefits, continue to be produced. With the elimination of the need to produce locality estimates directly from the NCS sample, the remaining NCS outputs could follow a more efficient national-based sample design.


The new design, now in full production, is a national design with a 3-year sample rotation cycle. Prior to the first production sample, NCS completed the evaluation and testing for this change as detailed in this section.


The BLS staff examined potential changes to the NCS sample design that included the following options:


  • Moving from an area-based sample design to a national design, thus eliminating the first stage of sampling to select areas

  • Implementing a new allocation methodology to correspond with the non-area-based sampling

  • Moving from a five-year rotation to a three-year rotation for private industry establishments


For each of these options, NCS tested the proposed change using the general scheme described below.


  • Obtain a full frame of data,

  • Use establishment total wage data from the frame to compute average monthly wages across all establishments,

  • Implement the proposed change using the full frame of data,

  • Select multiple (100 or more) simulated samples using the proposed methodology,

  • Compute estimates of the average monthly wages using the weighted data from each of the simulated samples,

  • Compute the mean and standard error of the average monthly wages across all the simulated samples, and

  • Compare the estimated average monthly wages across the simulated samples to those from the frame.
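
A minimal sketch of this evaluation loop, under illustrative assumptions: the frame record layout, the definition of the average monthly wage (an employment-weighted mean), and the interface of the candidate selection routine are stand-ins for the actual test setup rather than the production procedure.

import random
import statistics

def evaluate_design(frame, select_sample, n_sims=100, seed=0):
    # frame: list of (employment, average_monthly_wage) records for all
    # establishments; select_sample: a function implementing the proposed
    # design, returning [(weight, employment, average_monthly_wage), ...].
    frame_avg = (sum(emp * wage for emp, wage in frame) /
                 sum(emp for emp, _ in frame))
    estimates = []
    for i in range(n_sims):
        sample = select_sample(frame, random.Random(seed + i))
        est = (sum(w * emp * wage for w, emp, wage in sample) /
               sum(w * emp for w, emp, _ in sample))
        estimates.append(est)
    return {
        "frame_average_wage": frame_avg,  # target value computed from the frame
        "mean_of_estimates": statistics.mean(estimates),
        "standard_error": statistics.stdev(estimates),
    }

Comparing the mean of the simulated estimates with the frame value indicates bias, and the spread across simulated samples indicates the reliability of the proposed design.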


In addition to analyzing the potential effect of the redesign on the reliability of the estimates, the effect of any redesign on response rates and bias was studied with details described in the attached paper, "Update on the Evaluation of Sample Design Issues in the National Compensation Survey", by Ferguson et al.


The NCS also evaluated options for the transition to a national design for State and local government samples. This research and evaluation were done using similar methods to those used for the private industry sample design evaluation. This research is documented in the paper “State and Local Government Sample Design for the National Compensation Survey” by Ferguson et al.


The sample design for State and local government was tested using sample simulations for which we obtained a complete sample frame from the second quarter of 2011, assigned measures of size, executed the allocation process, and selected certainty establishments. We then selected 100 non-certainty samples and evaluated the resulting samples to ensure that the total weighted employment for the samples matched the frame employment and that the desired sample sizes were obtained.


With the December 2016 ECI release, the NCS began producing outputs (ECI, ECEC, and benefits incidence and key provisions) with a total sample size of approximately 11,400 establishments (private and State and local government) sampled and collected under the new national sample design. The number of detailed estimates for these products has not changed, and the measures of reliability are comparable to estimates produced under the previous locality-based design. Estimates in the NCS detailed benefits product line are produced from the most recent single initiation sample. Despite budget cuts resulting in an overall sample reduction, the move to a three-year rotation makes each initiation sample larger than the previous five-year rotation would have yielded, resulting in estimates that are of the same quality and level of detail as prior publications for this product.


Testing of Modeled Wage Estimates


NCS implemented a model-based estimation approach to continue to produce data for the President’s Pay Agent that was previously available through the Locality Pay Survey. This model-based estimation approach allows BLS to continue to produce wage estimates by worker characteristic such as full-time vs. part-time or union vs. non-union.


NCS completed the evaluation and test phases for producing estimates of worker wages by worker characteristics using data from both the NCS and the Occupational Employment Statistics (OES) survey. These estimates are now in production using a wage model that combines the large quantity of wage data available from the OES with the detailed worker characteristics from the NCS. The model is described in the August 2013 Monthly Labor Review article titled “Wage estimates by job characteristic: NCS and OES program data” by Lettau and Zamora. In a follow-up article titled “Revisiting the dilemma of review for modeled wage estimates by job characteristic,” also by Lettau and Zamora, the estimates obtained from the model and the criteria for publication of an estimate are reviewed. This second article also reports a more extensive set of experimental wage estimates by area, occupation, and job characteristic.



5. Statistical and Analytical Responsibility


Jeffrey Gonzalez, Chief, Statistical Methods Group of the Office of Compensation and Working Conditions, is responsible for the statistical aspects of the NCS program. Jeffrey Gonzalez can be reached at 202-691-7517. BLS seeks consultation with other outside experts on an as-needed basis.



6. References


Attachment 1 - Ferguson, Gwyn, Coleman, Joan L., Ponikowski, Chester, (August 2011), “Update on the Evaluation of Sample Design Issues in the National Compensation Survey,” ASA Papers and Proceedings, http://www.bls.gov/osmr/abstract/st/st110230.htm


Attachment 2 - Ferguson, Gwyn R., Ponikowski, Chester H., McNulty, Erin, and Coleman, Joan L., (September 2012), “State and Local Government Sample Design for the National Compensation Survey”, ASA Papers and Proceedings, https://www.bls.gov/osmr/abstract/st/st120280.htm


Additional References:

Crockett, Jackson, McNulty, Erin, Ponikowski, Chester H., (October 2008), “Update on Use of Administrative Data to Explore Effect of Establishment Non-response Adjustment on the National Compensation Survey Estimates,” ASA Papers and Proceedings, http://www.bls.gov/osmr/pdf/st080190.pdf


Cochran, William, G., (1977), Sampling Techniques 3rd Ed., New York, Wiley and Sons, 98, 259-261.


Federal Committee on Statistical Methodology, Subcommittee on Disclosure Limitation Methodology, "Statistical Policy Working Paper 22," http://www.fcsm.gov/working-papers/SPWP22_rev.pdf


Ferguson, Gwyn, Ponikowski, Chester, Coleman, Joan, (August 2010), “Evaluating Sample Design Issues in the National Compensation Survey,” ASA Papers and Proceedings, www.bls.gov/osmr/abstract/st/st100220.htm


Lettau, Michael K. and Zamora, Dee A., Monthly Labor Review (August 2013), “Wage estimates by job characteristic: NCS and OES program data”, http://www.bls.gov/opub/mlr/2013/article/lettau-zamora-1.htm


Lettau, Michael K. and Zamora, Dee A., Monthly Labor Review (August 2015), “Revisiting the dilemma of review for modeled wage estimates by job characteristic”, https://www.bls.gov/opub/mlr/2015/article/revisiting-the-dilemma-of-review-for-modeled-wage-estimates-by-job-characteristic-1.htm


Ponikowski, Chester H., and McNulty, Erin E., (December 2006), “Use of Administrative Data to Explore Effect of Establishment Nonresponse Adjustment on the National Compensation Survey Estimates,” ASA Papers and Proceedings, http://www.bls.gov/osmr/pdf/st060050.pdf


Stafira, Sarah, (August 2009), “Recent Modification of Imputation Methods for National Compensation Survey Benefits Data,” U.S. Bureau of Labor Statistics, Compensation and Working Conditions Online, http://www.bls.gov/opub/mlr/cwc/recent-modification-of-imputation-methods-for-national-compensation-survey-benefits-data.pdf


US Bureau of Labor Statistics, “BLS Handbook of Methods: Chapter 8 National Compensation Methods,” http://www.bls.gov/opub/hom/pdf/homch8.pdf


US Bureau of Labor Statistics, “National Compensation Survey: Guide for Evaluating Your Firm’s Jobs and Pay,” https://www.bls.gov/ncs/ocs/sp/ncbr0004.pdf



1 Ponikowski, Chester H. and McNulty, Erin E., “Use of Administrative Data to Explore Effect of Establishment Nonresponse Adjustment on the National Compensation Survey,” 2006 Proceedings of the American Statistical Association, Section on Survey Research Methods [CD-ROM], American Statistical Association, 2006, http://www.bls.gov/ore/abstract/st/st060050.htm



