
Occupational Employment and Wages

1220-0042

October 2021




Supporting Statement


Report on Occupational Employment and Wages


B. Collection of Information Employing Statistical Methods


1(a) Respondent Universe


The universe for this survey consists of the Quarterly Contribution Reports (QCRs) filed by employers subject to State Unemployment Insurance (UI) laws. The U.S. Bureau of Labor Statistics (BLS) receives these QCRs for the Quarterly Census of Employment and Wages (QCEW) program from the 50 States, the District of Columbia, Puerto Rico, and the U.S. Virgin Islands. The QCEW data, which are compiled for each calendar quarter, provide a comprehensive business name and address file with employment, wage, detailed geography (i.e., county), and industry information at the six-digit North American Industry Classification System (NAICS) level. This information is provided for nearly ten million business establishments, of which about 7.7 million are in the scope of this survey. The final data are stored in a Longitudinal Data Base (LDB), which is then used as the sampling frame for sample selection. Data for Federal Government employees covered by the Unemployment Compensation for Federal Employees (UCFE) program, the U.S. Postal Service, and the Tennessee Valley Authority are also included. Other data used for sampling include the universe of railroad establishments obtained from the Federal Railroad Administration and a universe of establishments in the U.S. territory of Guam obtained from the Government of Guam, Department of Labor.


1(b) Sample


Scope--The OEWS measures occupational employment and wage rates of wage and salary workers in nonfarm establishments in the 50 States and the District of Columbia. Guam, Puerto Rico, and the Virgin Islands are also surveyed, but their data are not included in national estimates. The survey covers the following NAICS industry sectors:


11 Logging (1133), support activities for crop production (1151), and support activities for animal production (1152) only

21 Mining

22 Utilities

23 Construction

31-33 Manufacturing

42 Wholesale Trade

44-45 Retail Trade

48-49 Transportation and warehousing

51 Information

52 Finance and insurance

53 Real estate and rental and leasing

54 Professional, scientific, and technical services

55 Management of companies and enterprises

56 Administrative and support and waste management and remediation services

61 Educational services

62 Health care and social assistance

71 Arts, entertainment, and recreation

72 Accommodation and food services

81 Other services, except public administration [private households (814) are excluded]

99 Federal, State, and local government (OEWS designation)


Sample Size--The sample size is approximately 1.1 million establishments over a 3-year period. The sample is divided into six panels over three years with two semi-annual samples of about 180,000 establishments selected each year. The following table shows the estimated number of universe units, sampled units, and responding units for all in-scope NAICS by Fiscal Year for the regular OEWS survey:


Table 1: Universe and Sample Size Summary

Survey | NAICS Coverage | Responding Units | Sample Units | Universe Units
FY 2016 (November 2015 & May 2016 panels) | All in-scope NAICS | 277,184 | 404,219 | 7,522,534
FY 2017 (November 2016 & May 2017 panels) | All in-scope NAICS | 270,949 | 397,069 | 7,636,124
FY 2018 (November 2017 & May 2018 panels) | All in-scope NAICS | 260,597 | 371,575 | 7,738,856
6-panel Totals | | 808,730 | 1,172,863 | 7,738,856



Stratification--Units on the sampling frame are stratified by State/Metropolitan Statistical Area (MSA) and Balance of State, and by three-, four-, five-, or six-digit NAICS industry code. The frame is further stratified into certainty and non-certainty portions for sample selection. Certainty units include Federal and State governments, hospitals, railroads, and large establishments. These are sampled with probability equal to one every 3-year cycle because of their occupational employment coverage and economic significance. All remaining establishments are non-certainty establishments and are selected with probability less than one but greater than zero.


2(a) Sample Design


Allocation method--A variation of the Neyman allocation procedure called a power allocation (Bankier, 1988)i is used to allocate the non-certainty sample to each State/area/3-4-5-6-digit NAICS stratum. The allocation methodology balances the employment size of areas and industries against the variability of occupational employment in each industry. The power allocation shifts sample away from very large areas and industries toward medium and smaller ones, allowing more comparable estimates for smaller domains. The power allocation is calculated as follows:

nh = n × (Xh^q × Sh) / Σh (Xh^q × Sh)

Where,

h = the stratum, defined as a State/area/3-4-5-6 digit NAICS industry

nh = non-certainty sample allocated to stratum h

n = national sample size less the number of certainty units

Xh = non-certainty frame employment in stratum h

Sh = average occupational variability within stratum h

q = the allocation power, a value between 0 and 1 that shifts sample toward smaller strata


Additionally, OEWS ensures that a minimum sample size is allocated to each sample stratum, so that the final sample allocation for each stratum is equal to the maximum of the minimum allocation and the power allocation. Further discussion of the strengths and weaknesses of this approach is presented in Lawley et al., 2007ii and Piccone, 2009iii.
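The allocation with a minimum-size floor can be sketched as follows. The strata, the power q = 0.5, and the floor of 2 units are illustrative assumptions for this sketch, not published OEWS parameters:

```python
# Sketch of a power allocation with a minimum-size floor, per stratum.
# Stratum data (frame employment X, occupational variability S), the power
# q, and the floor are made up for illustration.

def power_allocation(strata, n_total, q=0.5, min_alloc=2):
    """Allocate n_total non-certainty sample units across strata."""
    weights = [(s["X"] ** q) * s["S"] for s in strata]
    total_w = sum(weights)
    alloc = []
    for w in weights:
        power_share = n_total * w / total_w
        # Final allocation is the max of the floor and the power allocation.
        alloc.append(max(min_alloc, round(power_share)))
    return alloc

strata = [
    {"X": 500_000, "S": 1.2},  # large area/industry stratum
    {"X": 20_000, "S": 1.5},   # medium stratum
    {"X": 800, "S": 2.0},      # small stratum
]
alloc = power_allocation(strata, n_total=1_000)
print(alloc)
```

Relative to a proportional allocation, the exponent q < 1 compresses the influence of stratum size, which is what shifts sample toward smaller areas and industries.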


Sample Selection--Within each stratum, the sample is selected using probability proportional to estimated employment size, with large units being selected with certainty. Each semi-annual panel sample is designed to represent the frame. Every attempt is made to ensure that private and local government establishments are selected at most once every three years. Each sampled establishment is assigned a sampling weight equal to the reciprocal of its probability of selection in the sample. Note: Censuses of federal and state government are collected annually.
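A simplified sketch of probability-proportional-to-size selection with a certainty cutoff and reciprocal-probability weights follows; the frame employments and sample size are made up, and the one-pass systematic scheme is a stand-in for the program's actual selection procedure:

```python
import random

# Illustrative PPS selection: units at least as large as the sampling
# interval are certainties (probability one); the rest are selected
# systematically with probability proportional to employment.
random.seed(7)

def pps_select(frame, n):
    """Return (sampled indices, sampling weights) for a PPS sample of n."""
    total = sum(frame)
    interval = total / n
    certainty = [i for i, e in enumerate(frame) if e >= interval]
    rest = [(i, e) for i, e in enumerate(frame) if e < interval]
    m = n - len(certainty)
    rest_total = sum(e for _, e in rest)
    step = rest_total / m
    start = random.uniform(0, step)
    indices = list(certainty)
    weights = [1.0] * len(certainty)  # certainty units have weight 1
    cum, k = 0.0, 0
    for i, e in rest:
        cum += e
        while k < m and start + k * step < cum:
            # Selection probability is m*e/rest_total; the sampling
            # weight is its reciprocal.
            indices.append(i)
            weights.append(rest_total / (m * e))
            k += 1
    return indices, weights

frame = [5000, 120, 80, 60, 40, 30, 25, 20, 15, 10]  # employment per unit
indices, weights = pps_select(frame, n=4)
# Horvitz-Thompson check: weighted sample employment reproduces the total.
print(sum(w * frame[i] for i, w in zip(indices, weights)))
```

The weighted total equals the frame total, which is the property that makes reciprocal-of-probability weights appropriate for estimation.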


Occupational employment data from prior survey rounds are used by BLS-Washington to produce sample allocations that result in relative standard errors (RSE) on mean wages of 10 to 20 percent for the typical occupations in each MSA/ three-, four-, five-, or six-digit industry cell. Mean wage estimates for typical occupations at higher aggregate levels of area and industry will have substantially smaller relative standard errors.


Frequency of Sampling--Each year, semiannual panels of about 180,000 to 190,000 establishments each are selected for the May and November reference periods.


Sampling Issues--Sampling procedures introduced for collection efficiencies may result in a small downward bias in employment estimates in some industries for some areas. This bias is estimated to be between 0.1 and 0.2 percent of total employment. This may occur in cases where the single panel allocation can be rounded to either 0 or 1. In the case when the allocation is 0 for all 6 panels contributing to the 6-panel estimate, the estimated employment for the cell is 0, while in all other cases, the expected employment is correct. This bias is mitigated at higher levels of aggregation through benchmarking.


More detailed information about OEWS sample allocation and selection procedures can be found in the 2018 OEWS Technical Noteiv and Survey Methods and Reliability Statement for May 2018v.

2(b) Estimation Procedures


Annual estimates of occupational employment and wage rates are produced using data from the current May semiannual panel and the five semiannual panels preceding it (a total sample of about 1,080,000 establishments). Data from the six panels are combined to increase sample counts and thereby reduce the sampling error of the estimates at detailed levels (MSA by 3-4-5-6 digit NAICS).


Model-Based Estimation


Model-Based Estimation using 3 years of data (MB3), the product of a long-term research project, will be used for the first official production run with the 2021 OEWS estimates. Testing indicates that the accuracy and reliability of the MB3 estimates are improved over the former approach. In September 2019, BLS published 2016 data using the new estimation method as a research series, along with a Monthly Labor Review article describing the method and comparing it with the old methods.


The MB3 method takes advantage of the fact that BLS observes key determinants of occupational staffing patterns and wages for all units in a target population. In particular, the QCEW provides data on the detailed industry, ownership status, geographic location, and size for every establishment whose workers are covered by state unemployment insurance laws. OEWS sample information is used to model wage distributions and industry/area/size/ownership/time wage adjustments. The estimation system includes redesigned components for model fitting, unit matching, and variance estimation. Further details of the method are presented in the Monthly Labor Review article, Model-based estimates for the Occupational Employment Statistics Programvi and the Survey Methods and Reliability Statement for MB3 Research Estimates of the Occupational Employment and Wage Statistics Survey (MB3 Survey Methods Statement)vii.


Estimates


Occupational employment and wage estimates are computed using observed data and predicted data for the population of about 8 million units. Predicted data are created for each unobserved unit of the population, so estimates are computed using full-population expressions.


Occupational Employment Estimates


Estimates of occupational employment totals are computed by summing all employment counts of a given occupation over the modeled population data. Estimates are made over area, industry, and ownership. For occupation o, where unit i is any establishment in cell c, the occupational employment estimate is:

Ê(o,c) = Σ over i in c of E(o,i)

where E(o,i) is the observed or predicted employment of occupation o at establishment i.

Hourly wage rate estimates


Mean hourly wage is calculated as the total hourly wages for an occupation divided by its total modeled population employment. Wage rate information is available for every individual federal employee and for some state and private sector employees. Other wage data are reported in wage intervals and are converted to local hourly wages for each employee; these local hourly wages are predicted using adjusted estimates of local interval means and then treated as point data. Mean wage is calculated as the sum of the hourly wages of all employees in a cell divided by the total number of employees in the cell. Employees E in a given occupation and wage interval at a single establishment all have the same predicted wage w. For establishments i, wage ranges r, and occupation o in cell c, the computation is as follows:

mean wage(o,c) = [ Σ over i in c Σ over r of E(o,i,r) × w(o,i,r) ] / [ Σ over i in c Σ over r of E(o,i,r) ]

Percentile wage rate estimates are computed directly from the predicted population using the empirical distribution function with averaging, which is implemented in many statistical packages.
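To illustrate these computations (with made-up records, not OEWS data), the following sketch expands each predicted interval wage to one value per employee and computes a cell's mean and median hourly wage; the median stands in here for the percentile computation via the empirical distribution function with averaging:

```python
import statistics

# Illustrative records for one occupation in one cell: each tuple is
# (employment E, predicted hourly wage w) for an establishment/interval
# combination. All values are made up for this sketch.
records = [
    (10, 15.50),
    (4, 22.00),
    (6, 31.25),
]

# Every employee in an interval gets the same predicted wage, so expand
# to one wage value per employee and treat them as point data.
wages = [w for emp, w in records for _ in range(emp)]

total_emp = len(wages)
mean_wage = sum(wages) / total_emp      # total wages / total employment
median_wage = statistics.median(wages)  # averages the two middle values
print(total_emp, mean_wage, median_wage)
```

`statistics.median` averages the two middle values for an even count, matching the "with averaging" convention; other percentiles follow the same idea applied at different points of the sorted wage list.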


Annual wage rate estimates


These estimates are calculated by multiplying mean or percentile hourly wage rate estimates by a “year-round, full time” figure of 2,080 hours (52 weeks x 40 hours) per year for most occupations. These estimates, however, may not represent mean annual pay should the workers work more or less than 2,080 hours per year.
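The annualization step is a simple multiplication; the hourly rate below is an illustrative value:

```python
HOURS_PER_YEAR = 52 * 40  # "year-round, full-time" = 2,080 hours

def annualize(hourly_rate):
    """Convert an hourly wage estimate to an annual wage estimate."""
    return hourly_rate * HOURS_PER_YEAR

print(annualize(21.50))  # → 44720.0
```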


Alternatively, some workers are paid on an annual basis but do not work the usual 2,080 hours per year. For these workers, survey respondents report annual wages. Because the survey does not collect the actual number of hours worked, hourly wage rates cannot be derived from annual wage rates with any reasonable degree of confidence. Only annual wages are reported for some occupations.


More information on MB3 can be found in the MB3 Survey Methods Statement.


2(c) Reliability


The OEWS survey uses a probability-based sample design. This design allows the Bureau to control and measure the sampling error of the occupational employment and wage rate estimates. Relative standard error estimates are used to measure sampling error. Variances for both mean wage estimates and occupational employment estimates are computed using the "bootstrap" replication technique. Background on the variance estimator used in OEWS can be found in the MB3 Survey Methods Statement.
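A bootstrap relative-standard-error computation can be sketched as follows; the wage values, replicate count, and simple resampling scheme are illustrative stand-ins for the production estimator:

```python
import random
import statistics

# Minimal sketch of a bootstrap relative standard error (RSE) for a
# mean-wage estimate; the data and replicate count are made up.
random.seed(12345)

wages = [14.0, 15.5, 16.0, 18.25, 22.0, 22.0, 25.5, 31.0, 40.0, 55.0]
point_estimate = statistics.mean(wages)

replicates = []
for _ in range(500):
    resample = random.choices(wages, k=len(wages))  # sample with replacement
    replicates.append(statistics.mean(resample))

std_error = statistics.stdev(replicates)          # spread of the replicates
rse_percent = 100 * std_error / point_estimate    # RSE as a percentage
print(round(rse_percent, 1))
```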


Nonsampling errors, unlike sampling error, are not easily measurable but can be controlled through various quality control procedures. One such procedure is the requirement that all States use data processing software provided by the BLS national office. This standardization and automation of survey operations should reduce several sources of nonsampling error. State and BLS staff use automated and manual data screening procedures that help identify data that were misreported or keyed incorrectly.


2(d) Special Procedures


In order to produce wage rate and employment estimates at detailed geographic levels, the OEWS combines data across a three-year time period (six semiannual panels). Special sampling procedures are in place to allocate the sample, to limit the inclusion of units to once in a three-year time period, and to combine the data to produce estimates. Collecting all of the certainty units each year would allow these data to be used in the validation of the updating process; we are evaluating several collection options with respect to these units.


2(e) Data Collection Cycles


Occupational employment and wage-range data are collected for all nonagricultural industries over a six-panel semiannual cycle, with data collected for reference periods of May and November of each year. In each panel, one-sixth of the 1,080,000-establishment sample is selected for nonagricultural industries, with minimal overlap with the prior five panels' samples. Thus, establishments are included in the sample at most once every three years.


3(a) Maximizing Response


A goal of the OEWS survey is that each State achieves an 80 percent response rate. The overall response rate for the 2018 survey was approximately 71.2 percent based on units.

Each State is responsible for collecting the questionnaire forms mailed to the sample units selected in their State. Every effort is made to maximize response rates to achieve the 80-percent goal by:


  • Surveying sampled units at most once every three years (once every six panels). With research under way to improve the time series capabilities of the OEWS survey, this constraint may be relaxed for some employers.

  • Conducting extensive address refinement to ensure that the survey form reaches the correct establishment in a timely manner.

  • Providing each sampled unit with a cover letter explaining the importance of the survey and the need for voluntary cooperation.

  • Giving each private sector sample unit the Bureau’s pledge of confidentiality.

  • Sending pre-notification letters to establishments before they are contacted to provide data.

  • Sending each nonresponding unit two to three additional mailings after the initial mail-out (if necessary); the BLS also recommends that the States obtain specific contact names for each sampled firm.

  • Contacting key nonresponding units in each MSA/three-, four-, or five-digit industry cell by telephone.

  • Contacting critical employers through personal visits (if necessary).

  • Including fact sheets that explain the many uses of the OEWS data.

  • Using status reports and control files to identify MSA/industry cells with low response rates.

  • Stressing to respondents that assistance is available to help them complete the survey form.

  • Providing a link on the survey form to OEWS data on the BLS Internet website to demonstrate program usefulness.

  • Using a respondent web page that provides detailed information about responding to the OEWS survey, including state contact information for those needing assistance.

  • Increasing the use of electronic and telephone collection in order to allow the respondent to provide information in a way that is most convenient to them.

  • Providing email and online data submission options.

  • Advertising electronic submission options.

  • Coordinating collection for multi-unit establishments.

  • Sending initial and follow-up email blasts to units that provided an email address.

  • Conducting periodic Response Analysis or Nonresponse Analysis surveys to learn what motivates employers to respond.


Response Burden--The following table shows the estimated response burden by Mandatory/Voluntary and Ownership:



Table 2: Response Burden

Survey year | Mandatory / voluntary | Ownership | NAICS Coverage | Estimated Responding Units | Estimated burden hours
FY 2020 | Voluntary | private | 1133, 1151, 1152, 21-81 (exc. 814) | 186,586 | 93,293
FY 2020 | Voluntary | government | State and local government | 8,356 | 4,173
FY 2020 | Mandatory | private | 1133, 1151, 1152, 21-81 (exc. 814) | 59,009 | 29,505
FY 2020 | Mandatory | government | State and local government | 2,817 | 1,408
FY 2020 | Total | | | 256,768 | 128,384
FY 2021 | Voluntary | private | 1133, 1151, 1152, 21-81 (exc. 814) | 193,683 | 96,842
FY 2021 | Voluntary | government | State and local government | 8,648 | 4,324
FY 2021 | Mandatory | private | 1133, 1151, 1152, 21-81 (exc. 814) | 61,244 | 30,622
FY 2021 | Mandatory | government | State and local government | 2,914 | 1,457
FY 2021 | Total | | | 266,489 | 133,245
FY 2022 | Voluntary | private | 1133, 1151, 1152, 21-81 (exc. 814) | 193,683 | 96,842
FY 2022 | Voluntary | government | State and local government | 8,648 | 4,324
FY 2022 | Mandatory | private | 1133, 1151, 1152, 21-81 (exc. 814) | 61,244 | 30,622
FY 2022 | Mandatory | government | State and local government | 2,914 | 1,457
FY 2022 | Total | | | 266,489 | 133,245

3(b) Nonresponse Adjustment


Under MB3, there are three types of unobserved units: nonrespondents, non-sampled units, and sampled units with an "unstable" response. An "imputation," or prediction, procedure is used to impute for all unobserved units, including nonrespondents. A staffing pattern and wage distribution are imputed to all three types of unobserved units using up to 10 "nearest neighbor" responses within panel/State/MSA/employment-size/6-digit industry cells. Within each cell, the 10 nearest donors (i.e., responding units) are selected for each nonrespondent; the nonrespondent's sampling frame employment and the donors' summed total employment are used to match donors with nonrespondents. Once donors and nonrespondents are matched, the occupational employment totals of the nonrespondent are computed using the proportional distribution of the donors' occupational employment totals.
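The donor-matching and proportional-distribution steps can be sketched as follows. The donor records, occupation names, and choice of k are illustrative, and the simple employment-distance match stands in for the program's actual matching procedure:

```python
# Sketch of donor-based imputation: within a cell, the nonrespondent's
# frame employment is distributed across occupations in proportion to
# the pooled staffing pattern of its nearest donors. All data are made up.

def impute_staffing(donors, nonresp_employment, k=10):
    """donors: list of (frame_employment, {occupation: employment}) tuples,
    assumed to come from the same panel/State/MSA/size/industry cell."""
    # "Nearest" donors by frame employment relative to the nonrespondent.
    nearest = sorted(donors, key=lambda d: abs(d[0] - nonresp_employment))[:k]
    pooled = {}
    for _, staffing in nearest:
        for occ, emp in staffing.items():
            pooled[occ] = pooled.get(occ, 0) + emp
    pooled_total = sum(pooled.values())
    # Proportional distribution of the nonrespondent's employment.
    return {occ: nonresp_employment * emp / pooled_total
            for occ, emp in pooled.items()}

donors = [
    (50, {"cashier": 30, "manager": 5, "stocker": 15}),
    (60, {"cashier": 40, "manager": 6, "stocker": 14}),
    (400, {"cashier": 250, "manager": 30, "stocker": 120}),
]
imputed = impute_staffing(donors, nonresp_employment=55, k=2)
print(imputed)
```

The imputed occupational totals sum to the nonrespondent's frame employment, so imputation preserves total employment while borrowing the donors' staffing pattern.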



3(c) Non-Response Bias Research


Extensive research has previously been done to assess whether nonrespondents to the OEWS survey differ systematically in some important respect from respondents and would thus bias OEWS estimates. One study compared average hourly occupational wages from the OEWS survey to those of another Bureau program, the National Compensation Survey (NCS), which is used for the Employment Cost Index and by the President's Pay Agent in setting locality pay. The comparisons were done at the national level and at various MSA levels. This research showed that the mean hourly wages for 70% of the occupations were not statistically different between the two surveys, and an additional 10% were not economically different; that is, the two mean occupational wages were within 10% of each other. Many of the remaining 20% of the occupations had wage discrepancies that could be explained by conceptual differences, coding inconsistencies, and other differences between the two programs. An important finding from this report was that the differences in mean occupational wages between the two surveys went in both directions, with a 40-60% split. These findings were the same at the national and MSA levels (Barkume, 2006)viii. The results from this study show that the nearest neighbor imputation method and the wage imputation method described in this document perform a reasonable adjustment for missing data. Future plans include repeating this and similar analyses of nonresponse bias, resources permitting.


3(d) Confidentiality


Before occupational employment estimates are released to the public, they must first be screened to ensure that they do not violate the Bureau of Labor Statistics’ (BLS) confidentiality pledge. A promise is made by the Bureau to each private sector sample unit that the BLS will not release its employment data to the public in a manner that would allow others to identify the unit. If an occupational employment or wage estimate fails confidentiality screening, the estimate is suppressed.


3(e) Publishability


After confidentiality screening, the estimates (including confidentiality-collapsed estimates) are screened a second time to ensure that they satisfy the Bureau's publishability standards. Among these standards, employment estimates must have an associated relative standard error of 50 percent or less, there must be at least two non-imputed responses for the occupation, and the employment estimate must be 30 or greater for BLS publication. Records with employment estimates between 10 and 30 are released to states. For wage estimates, the relative standard errors must be 30 percent or less.
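A minimal sketch of this screening logic follows; the field names are hypothetical, and combining the employment and wage criteria into one check is a simplification for illustration:

```python
# Sketch of the publishability screen applied to one estimate record.
# Field names are illustrative, not the program's actual schema.

def publishable(est):
    """Return True if an estimate meets the publication standards above."""
    if est["employment_rse"] > 50:   # employment RSE must be <= 50 percent
        return False
    if est["responses"] < 2:         # at least two non-imputed responses
        return False
    if est["employment"] < 30:       # employment estimate must be >= 30
        return False
    if est["wage_rse"] > 30:         # wage RSE must be <= 30 percent
        return False
    return True

print(publishable({"employment_rse": 12.0, "responses": 5,
                   "employment": 140, "wage_rse": 8.5}))   # → True
print(publishable({"employment_rse": 12.0, "responses": 1,
                   "employment": 140, "wage_rse": 8.5}))   # → False
```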


4. Developmental Tests


The OEWS program regularly evaluates its processes and methods and occasionally updates its collection materials as a result of this ongoing research. Current and future plans are outlined below. Additional details regarding testing or improvements will be submitted through the non-substantive change request process.


Test of Email Timing and Format


OEWS uses a combination of emails and letters for nonresponse follow-up. Currently, the mailings are sent on a fixed schedule, every four weeks. BLS would like to explore the effectiveness of varying this schedule to see whether sending the email closer to the letter mailing leads to improved response rates or electronic reporting. The content of the emails would not vary from the OMB-approved materials, but the frequency or gap between contacts would change depending on the treatment group. For states that volunteer to participate, we would randomly assign respondents to treatment groups, and the groups would receive the emails on different schedules. The final response rates for each group would then be analyzed to identify the most effective schedule to follow in the future.


In conjunction with timing, email blast formats must be periodically evaluated to ensure they make effective use of limited character counts and other factors affecting response. Based on evaluation and comments from BLS staff, state office staff, and respondents, OEWS may alter the email blast templates accordingly. A test plan and proposed template will be submitted by BLS through the non-substantive change process.


Current Research


Using wage rate data in combination with interval data


Currently, some employers provide electronic files listing employees with actual wage rates rather than wages assigned to intervals, because doing so requires less time and effort; BLS or the SWAs then convert the reported data to intervals. We would like to give employers the option of completing a survey form with wage rates or of providing their data in Weblite by wage rate. In addition to accommodating respondents' preferred method of reporting, these data are valuable for research on methodology improvements. As a result of future research, OEWS may alter its collection forms. If methodology testing is conducted and forms must be changed, OEWS will submit the test plan and updated form template through the non-substantive change process.


Examination of wage rate data already provided by respondents on a voluntary basis suggests that methodological improvements can be obtained by capturing wage rate data from employers. This is especially true when wages are clustered, such as for minimum wage workers or for union workers within a particular industry or area. In addition, wages for highly paid occupations are affected by assigning them to predefined intervals. In order to study this more fully and incorporate it into our methodology, BLS is exploring ways to incorporate the data from employers that choose to report wage rates.


Future Research


Adjusting solicitation materials


OEWS solicitation materials are designed to be dynamic. Solicitation materials are under constant evaluation to ensure the proper information is being conveyed to respondents, to ensure that respondents are properly assisted with submitting their data, and to ensure that the data being collected is that of the highest quality. To fulfill our promise of quality data and to control respondent burden, OEWS may need to make periodic changes to our solicitation materials. All changes will be subject to approval through the non-substantive process. OEWS will submit detailed test and change plans, which will include templates and design, to OMB when necessary.


Collecting actual wage rates


Subject to funding availability, OEWS would like to conduct tests to investigate the viability of collecting additional information from employers, which could potentially improve the quality of occupational wages, reduce response burden on employers, allow for further analysis of factors affecting occupational wages, and lead to expansion of OEWS data products. BLS is planning to develop more detailed proposals on testing the collection of actual wage rates from more employers, additional data items, and additional establishments such as larger employers and multi-unit establishments. This research may use a split panel design, and may lead to reduced burden hours. Sample designs, survey instruments, and survey methods for this research will be detailed later. As with all research and tests, OEWS will submit detailed test plans through the non-substantive process to receive approval.

Asking employers for additional data elements


BLS would like to ask employers to report data items that many already report without solicitation. Employers often include additional data items, such as the number of hours for which a person is paid, and many already provide data elements in their electronic OEWS report that we do not ask for. These data elements include information that is requested by customers but cannot be provided by OEWS or other BLS surveys. For example, establishments report data items along with the occupation and wages such as part-time or full-time status, hours, whether or not employees are exempt from the Fair Labor Standards Act, gender, age, EEO category, union status, specific job title, department, and others. While some of these occupational characteristics are available from other BLS sources, none are available for states and all areas; in the case of demographic data, they cannot be associated with a particular employer's industry or size and are not available for many occupations. A small-scale test showed that extra data elements can be collected from respondents; while the test was limited in time and scope, its response rates mirrored those of regular OEWS data collection. Also, a Response Analysis Survey (RAS) conducted in 2011-12 showed that most employers are willing to provide additional data like hours worked and part-time/full-time status. BLS would like to continue this research.


Asking employers to report for all their establishments instead of randomly selected establishments


Multiunit companies often report occupational wage data for all of their establishments rather than only the sampled establishments, providing comprehensive electronic data files containing payroll data for every establishment each year. The OEWS analyst sorts through these reports, matches them to the sampled units (saving the respondent the burden of doing so), and ignores the unsolicited establishments. Since so many employers provide this information without being asked, BLS would like to explore how employers would respond if we specifically asked them to provide these data. Some of the volunteered establishments might be included in the OEWS sample in a different year, when the newer data would be solicited. For units that are not in the 6-panel sample used for estimates, their inclusion might improve local area estimates, and capturing newer data for units in older panels might improve the currency of the data. While OEWS is not a time series, many customers would like to use it that way, and capturing data every panel for some employers that report electronically might strengthen the time series qualities of OEWS data. BLS is interested in testing ways to improve the time series; asking more multi-unit reporters to report all of their data, rather than only the selected sample units, is one way to do so.


Two categories of data included in these data dumps have already proven useful in quality control. For example, hours-worked data provided by some airlines helped to improve wage estimates for pilots and flight attendants. Wage rate data have shown the need to use wage rates rather than intervals for the U.S. Postal Service, where occupational wages are clustered even nationwide. The job titles provided in the data dumps have helped to find job titles that are coded in the wrong occupations, or paid employees, such as students, who should not be in the scope of the OEWS survey. We would like to explore the possibility of asking selected employers to provide these data in their OEWS report, to address any bias that may result from self-selection in reporting them.


5. Statistical and Analytical Responsibility


Mr. Edwin Robison, Chief, Statistical Methods Division of the Office of Employment and Unemployment Statistics, is responsible for the statistical aspects of the OEWS program. Additionally, BLS seeks consultation with other outside experts on an as needed basis.




6. References


i Bankier, Michael D. (1988). Power Allocations: Determining Sample Sizes for Subnational Areas. American Statistician, Vol. 42, pp. 174-177.

ii Lawley, Ernest; Stetser, Marie; and Valaitis, Eduardas, (2007). Alternative Allocation Designs for a Highly Stratified Establishment Survey, 2007 Joint Statistical Meetings.

iii Piccone, David and Stetser, Marie, (2009). National Sample Reallocation for the Occupational Employment Statistics Survey, 2009 Joint Statistical Meetings.

iv Technical Notes for May 2018 OES Estimates, Bureau of Labor Statistics, 2018 (http://www.bls.gov/oes/current/oes_tec.htm).

v Survey Methods and Reliability Statement for May 2018 (BLS Handbook of Methods, Ch. 3).

vi Dey, Matthew; Piccone, David; and Miller, Stephen (August 2019). Model-based estimates for the Occupational Employment Statistics program. Monthly Labor Review (https://www.bls.gov/oes/mb3-methods.pdf).

vii Survey Methods and Reliability Statement for MB3 Research Estimates of the Occupational Employment and Wage Statistics Survey (https://www.bls.gov/oes/mb3-methods.pdf).

viii Barkume, Tony; Dey, Matthew; Ernst, Larry; Gittleman, Maury; and Polivka, Anne (November 2006). Comparing OES and NCS Wage Estimates. BLS internal report. (final report).
