2020 OES Supporting Statement B (1220-0042)


Report on Occupational Employment and Wages

OMB: 1220-0042


Occupational Employment and Wages

1220-0042

March 2020




Supporting Statement


Report on Occupational Employment and Wages


B. Collection of Information Employing Statistical Methods


1(a) Respondent Universe


The universe for this survey consists of the Quarterly Contribution Reports (QCRs) filed by employers subject to State Unemployment Insurance (UI) laws. The U.S. Bureau of Labor Statistics (BLS) receives these QCRs for the Quarterly Census of Employment and Wages (QCEW) program from the 50 States, the District of Columbia, Puerto Rico, and the U.S. Virgin Islands. The QCEW data, which are compiled for each calendar quarter, provide a comprehensive business name and address file with employment, wage, detailed geography (i.e., county), and industry information at the six-digit North American Industry Classification System (NAICS) level. This information is provided for nearly ten million business establishments, of which about 7.7 million are in the scope of this survey. The final data are stored in a Longitudinal Data Base (LDB), which is then used as the sampling frame for sample selection. Data for Federal Government employees covered by the Unemployment Compensation for Federal Employees (UCFE) program, the U.S. Postal Service, and the Tennessee Valley Authority are also included. Other data used for sampling include the universe of railroad establishments obtained from the Federal Railroad Administration and a universe of establishments in the U.S. territory of Guam obtained from the Government of Guam, Department of Labor.


1(b) Sample


Scope--The OES measures occupational employment and wage rates of wage and salary workers in nonfarm establishments in the 50 States and the District of Columbia. Guam, Puerto Rico, and the Virgin Islands are also surveyed, but their data are not included in national estimates. The survey covers the following NAICS industry sectors:


11 Logging (1133), support activities for crop production (1151), and support activities for animal production (1152) only

21 Mining

22 Utilities

23 Construction

31-33 Manufacturing

42 Wholesale Trade

44-45 Retail Trade

48-49 Transportation and warehousing

51 Information

52 Finance and insurance

53 Real estate and rental and leasing

54 Professional, scientific, and technical services

55 Management of companies and enterprises

56 Administrative and support and waste management and remediation services

61 Educational services

62 Health care and social assistance

71 Arts, entertainment, and recreation

72 Accommodation and food services

81 Other services, except public administration [private households (814) are excluded]

99 Federal, State, and local government (OES designation)


Sample Size--The sample size is approximately 1.1 million establishments over a 3-year period. The sample is divided into six panels over three years with two semi-annual samples of about 180,000 establishments selected each year. The following table shows the estimated number of universe units, sampled units, and responding units for all in-scope NAICS by Fiscal Year for the regular OES survey:


Table 1: Universe and Sample Size Summary

Survey                                    | NAICS Coverage     | Responding Units | Sample Units | Universe Units
FY 2016 (November 2015 & May 2016 panels) | All in-scope NAICS | 277,184          | 404,219      | 7,522,534
FY 2017 (November 2016 & May 2017 panels) | All in-scope NAICS | 270,949          | 397,069      | 7,636,124
FY 2018 (November 2017 & May 2018 panels) | All in-scope NAICS | 260,597          | 371,575      | 7,738,856
6-panel Totals                            |                    | 808,730          | 1,172,863    | 7,738,856


Stratification--Units on the sampling frame are stratified by State/Metropolitan Statistical Area (MSA) and Balance of State, and by three-, four-, five-, or six-digit NAICS industry code. The frame is further stratified into certainty and non-certainty portions for sample selection. Certainty units include Federal and State governments, hospitals, railroads, and large establishments. These are sampled with probability equal to one every 3-year cycle because of their occupational employment coverage and economic significance. All remaining establishments are non-certainty establishments and are selected with probability less than one but greater than zero.


2(a) Sample Design


Allocation method--A variation of the Neyman allocation procedure called a power allocation (Bankier, 1988)i is used to allocate the non-certainty sample to each State/area/3-4-5-6-digit NAICS stratum. The allocation methodology balances the employment size of areas and industries with the variability of occupational employment in each industry. The power allocation shifts sample away from very large areas and industries to medium and smaller ones, allowing more comparable estimates for smaller domains. The power allocation is calculated as follows:



n_h = n × (X_h^q × S_h) / Σ_h' (X_h'^q × S_h')

Where,


h = the stratum, defined as State/area/3-4-5-6 digit NAICS industry

n_h = non-certainty sample allocated to stratum h

n = national sample size less the number of certainty units

X_h = non-certainty frame employment in stratum h

S_h = average occupational variability within stratum h

q = the allocation power, 0 < q < 1


Additionally, OES ensures that a minimum sample size is allocated to each sample stratum, such that the final sample allocation for each stratum is the maximum of the minimum allocation and the power allocation. Further discussion of the strengths and weaknesses of this approach is presented in Lawley et al., 2007ii and Piccone, 2009iii.
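As an illustrative sketch of the allocation rule described above, the following Python fragment computes a power allocation with a minimum-size floor. The exponent q=0.5, the floor of 2, and all strata and figures are invented for demonstration, not OES production values:

```python
# Sketch of power allocation with a minimum-size floor (illustrative values).
def power_allocation(frame_emp, occ_var, n_total, q=0.5, n_min=2):
    """Allocate n_total non-certainty sample units across strata h.

    frame_emp[h] = non-certainty frame employment X_h
    occ_var[h]   = average occupational variability S_h
    """
    weights = {h: (frame_emp[h] ** q) * occ_var[h] for h in frame_emp}
    total_w = sum(weights.values())
    alloc = {h: round(n_total * w / total_w) for h, w in weights.items()}
    # Final allocation = max(minimum allocation, power allocation).
    return {h: max(n_min, n_h) for h, n_h in alloc.items()}

X = {"A": 10000, "B": 1000, "C": 100}   # hypothetical frame employment
S = {"A": 1.0, "B": 1.2, "C": 1.5}      # hypothetical occupational variability
alloc = power_allocation(X, S, n_total=100)
```

Note how the power q < 1 shifts sample away from the very large stratum A toward the smaller strata, as the text describes.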


Sample Selection--Within each stratum, the sample is selected using probability proportional to estimated employment size, with large units selected with certainty. Each semi-annual panel sample is designed to represent the frame. Every attempt is made to ensure that private and local government establishments are selected at most once every three years. Each sampled establishment is assigned a sampling weight equal to the reciprocal of its probability of selection in the sample. Note: Censuses of federal and state government are collected annually.
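A simplified sketch of how inclusion probabilities and design weights arise under probability-proportional-to-size sampling with certainty units follows; the iterative capping and all figures are illustrative, not the production selection algorithm:

```python
# Sketch: PPS inclusion probabilities and design weights in one stratum.
def pps_probs(emp, n):
    """Return per-unit inclusion probability and design weight (1/prob).

    Units whose proportional probability would reach 1 are taken with
    certainty; the remaining sample is re-spread over the other units.
    """
    probs = {u: 1.0 for u in emp}
    noncert = dict(emp)
    n_left = n
    while True:
        total = sum(noncert.values())
        big = [u for u, e in noncert.items() if n_left * e / total >= 1]
        if not big:
            break
        for u in big:          # certainty units: probability fixed at 1
            noncert.pop(u)
        n_left -= len(big)
    for u, e in noncert.items():
        probs[u] = n_left * e / total
    weights = {u: 1.0 / p for u, p in probs.items()}
    return probs, weights

# Hypothetical stratum: unit A is large enough to be a certainty selection.
probs, weights = pps_probs({"A": 600, "B": 200, "C": 100, "D": 100}, n=2)
```

The design weights are the reciprocals of the selection probabilities, matching the weighting rule stated above.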


Occupational employment data from prior survey rounds are used by BLS-Washington to produce sample allocations that result in relative standard errors (RSE) on mean wages of 10 to 20 percent for the typical occupations in each MSA/ three-, four-, five-, or six-digit industry cell. Mean wage estimates for typical occupations at higher aggregate levels of area and industry will have substantially smaller relative standard errors.


Frequency of Sampling--Each year, semiannual panels of about 180,000 to 190,000 establishments each are selected for the May and November reference periods.


Sampling Issues--Sampling procedures introduced for collection efficiencies may result in a small downward bias in employment estimates in some industries for some areas. This bias is estimated to be between 0.1 and 0.2 percent of total employment. This may occur in cases where the single panel allocation can be rounded to either 0 or 1. In the case when the allocation is 0 for all 6 panels contributing to the 6-panel estimate, the estimated employment for the cell is 0, while in all other cases, the expected employment is correct. This bias is mitigated at higher levels of aggregation through benchmarking.


More detailed information about OES sample allocation and selection procedures can be found in the 2018 OES Technical Noteiv and Survey Methods and Reliability Statement for May 2018v.

2(b) Estimation Procedures


Annual estimates of occupational employment and wage rates are produced using data from the current May semiannual panel’s survey and survey data from the five semiannual panels prior to the current panel (a total sample of about 1,080,000 establishments). Data from several panels are combined in order to reduce the sampling error of the estimates at detailed levels (MSA by 3-4-5-6 digit NAICS). Combining samples from six panels would increase the weighted sample population counts at the total level up to sixfold; however, not all cells have sample in each panel, because many of the detailed cells are sparsely populated. To avoid a multiple count of up to sixfold, the sampling weight for each establishment in a “State-MSA/3-4-5-6 digit sampling cell” is multiplied by a factor 1/d, where d is the number of panels (1 through 6) that contribute establishments to the sampling cell. The six-panel combined sample weight is used to calculate the employment and wage estimates.
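The 1/d adjustment described above can be sketched as follows; the cell and panel identifiers and the weights are invented for illustration:

```python
# Sketch of the six-panel weight combination: divide each unit's sampling
# weight by d, the number of panels (1-6) contributing units to its cell.
def combine_panels(records):
    """records: list of (cell, panel, unit, sample_weight) tuples."""
    panels_per_cell = {}
    for cell, panel, _, _ in records:
        panels_per_cell.setdefault(cell, set()).add(panel)
    return {
        unit: w / len(panels_per_cell[cell])   # d = panels present in cell
        for cell, panel, unit, w in records
    }

records = [
    ("MSA1-5411", "Nov17", "est1", 12.0),
    ("MSA1-5411", "May18", "est2", 12.0),
    ("MSA1-5411", "Nov19", "est3", 6.0),
    ("MSA2-7225", "May19", "est4", 20.0),   # only one panel in this cell
]
weights = combine_panels(records)
```

A cell populated by three panels gets d = 3, so its weights are cut to a third; a cell with sample from only one panel keeps its full weight, so the cell is still represented once rather than being counted multiple times.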


Employment Estimation


The OES survey publishes estimates of occupational employment for three-, four-, five-, or six-digit industry cells within States and the Nation and cross-industry cells for MSAs. The estimation process begins with an edit procedure to identify and correct inconsistent or incomplete data on the file. The procedure also identifies and makes adjustments to atypical reporting units. Afterwards, a hot-deck nearest-neighbor imputation is used to impute occupational staffing patterns for the nonresponding units. Next, a mean of cell imputation procedure is used to impute for missing occupational wage distributions. After the data are edited and imputed, the d-weighting described in the previous paragraph is applied. Finally, the weighted sampled employment totals are ratio adjusted, or benchmarked, to known employment totals. These known employment totals are extracted from the Bureau’s Enhanced Quarterly Unemployment Insurance (EQUI) files, which are the raw state micro-data used to create the longitudinal links for the QCEW. The sampling weight of each unit is multiplied by the benchmark factor to produce a final weight value for the unit. The following equation is used to calculate occupational employment estimates for an estimation cell defined by geographic area, industry group, and benchmark size class:


X̂oh = Σi wi × BMFi × xio

Where,

o = occupation;

h = estimation cell;

wi = six-panel combined sample weight for establishment i;

BMFi = final benchmark factor for establishment i;

xio = reported employment for occupation o in establishment i;

X̂oh = estimated employment for occupation o in cell h


Each occupational employment estimate has a standard error that is calculated using a random group jackknife variance estimator.
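As an illustrative sketch, the benchmarked employment estimator described above (each unit's combined weight, times its benchmark factor, times its reported occupational employment, summed over units) can be written out as follows. The benchmark factor is computed as the known QCEW cell employment divided by the weighted sample employment, per the ratio adjustment described in the text; all figures are invented:

```python
# Sketch of the benchmarked occupational employment estimator.
def benchmark_factor(sample, known_total):
    """sample: list of (weight, total establishment employment)."""
    weighted = sum(w * emp for w, emp in sample)
    return known_total / weighted           # ratio adjustment to the benchmark

def occ_employment(units, bmf):
    """units: list of (weight, {occupation: reported employment})."""
    est = {}
    for w, occs in units:
        for occ, x in occs.items():
            est[occ] = est.get(occ, 0.0) + w * bmf * x   # w_i * BMF_i * x_io
    return est

units = [
    (4.0, {"cashiers": 10, "managers": 2}),
    (2.0, {"cashiers": 25, "managers": 5}),
]
known_cell_emp = 120.0   # hypothetical QCEW/EQUI benchmark for the cell
bmf = benchmark_factor([(4.0, 12), (2.0, 30)], known_cell_emp)
est = occ_employment(units, bmf)
```

After benchmarking, the estimated occupational employment in the cell sums to the known QCEW total.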


Wage Estimation


Mean wage and median wage estimates are calculated for each occupation within an MSA/three-, four-, five-, or six-digit industry cell (and these are summed to higher levels). Traditionally OES wage rate data are collected in broad wage bands instead of exact data points. In 2009, OES began incorporating exact point data for federal and state governments for wage and percentile estimates.


For data collected in wage bands, the mean wage rate for each band is obtained externally from the Bureau's National Compensation Survey (NCS). An inflation factor from the Employment Cost Index is used to update wage data collected in past panels for use in current wage estimates. Because most wage rate data are collected in broad wage bands instead of exact data points, the standard error for each mean wage rate estimate is calculated using a components model that accounts for the variability of both the observed and unobserved components of wage rate data. A traditional ratio variance estimator accounts for the variability observed in the collected OES wage data; because the mean wage rate for each band comes from the NCS rather than from the sampled unit, there are unobserved components of wage rate variance that must be modeled, and detailed wage data collected by the NCS are used to estimate their variability.
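The banded mean-wage computation described above can be sketched as follows. Each reported worker falls in a wage interval, and an externally supplied interval mean stands in for the exact wage (in production this comes from the NCS; the interval labels and dollar figures here are invented):

```python
# Sketch of a mean-wage estimate from interval (banded) wage data.
interval_means = {"A": 9.50, "B": 13.25, "C": 19.80}  # hypothetical NCS means

def mean_wage(band_counts, means):
    """band_counts: {interval: weighted employment reported in that interval}."""
    emp = sum(band_counts.values())
    dollars = sum(n * means[band] for band, n in band_counts.items())
    return dollars / emp

# Hypothetical occupation: 50 weighted workers spread across three bands.
w = mean_wage({"A": 10, "B": 30, "C": 10}, interval_means)
```

Because the within-band wage is not observed, the variance of this estimate has a modeled component in addition to the usual sampling variance, as the text explains.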


As mentioned above, OES began incorporating federal and state point data for both wage and percentile estimates in 2009. State governments report some of their data as exact wage rates and some in interval bands. OES receives a census of occupational employment and exact wage information from these reporters each year, so these data do not contribute to the sampling error of the survey. Furthermore, these reporters are currently out of scope for the NCS, so using exact wages in the estimator instead of the NCS means does not introduce bias.


For more detailed information about estimation procedures for employment and wages, please see the Bureau of Labor Statistics’ Handbook of Methods, Chapter 3, and Appendix M of the OES State Operations Manualvi.


Model-Based Estimation Development


Model-Based Estimation using 3 years of data (MB3), the product of a long-term research project, is under development with potential for use in production. Testing indicates that the accuracy and reliability of the estimates improved over the current approach. This system is one of the components necessary to produce future estimates appropriate for year-to-year comparisons. In September 2019, BLS published 2016 data using the new estimation method as a research series, along with a Monthly Labor Review article describing the method and comparing it to the current method.


The MB3 system takes advantage of the fact that BLS observes key determinants of occupational staffing patterns and wages for all units in a target population. In particular, the QCEW provides data on the detailed industry, ownership status, geographic location, and size for every establishment whose workers are covered by state unemployment insurance laws. Sample information is used to model wage distributions and industry/area/size/ownership/time wage adjustments. The estimation system includes redesigned components for model fitting, unit matching, and variance estimation. Further details of the system are presented in the Monthly Labor Review article, Model-based estimates for the Occupational Employment Statistics Programvii.


2(c) Reliability


A probability-based sample design is used for the OES survey. This design allows the Bureau to control and measure the sampling error of the occupational employment and wage rate estimates. Relative standard error estimates are used to measure sampling error. A random group jackknife variance estimator is used to estimate the relative standard errors for the occupational employment estimates. A variation of the stratified ratio estimator is used to estimate the relative standard errors for the mean wage estimates. Background on the variance estimator used in OES can be found in an ASA conference paperviii.


Nonsampling errors, unlike sampling error, are not easily measurable but can be controlled through various quality control procedures. One such procedure is the requirement that all States use data processing software provided by the BLS national office. This standardization and automation of survey operations should reduce several sources of nonsampling error. State and BLS staff use automated and manual data screening procedures that help identify data that were misreported or keyed incorrectly.


2(d) Special Procedures


In order to produce wage rate and employment estimates at detailed geographic levels, the OES combines data across a three-year time period (six semiannual panels). Special sampling procedures are in place to allocate the sample, to limit the inclusion of units to once in a three-year time period, and to combine the data to produce estimates. Among these special procedures are methods to update, or “age” the previous years’ wage data to reflect the current time period. Background on procedures for wage updating can be found in a BLS research publicationix. BLS continues to conduct research to evaluate the effectiveness of the updating process, and to improve it where possible. Collecting all of the certainty units each year would allow these data to be used in the validation of the updating process; we are evaluating several collection options with respect to these units.


2(e) Data Collection Cycles


Occupational employment and wage-range data are collected for all nonagricultural industries over a six-panel semiannual cycle, with data collected for a reference period of May and November of each year. In each panel, one-sixth of a 1,080,000-establishment sample is selected for nonagricultural industries, with minimal overlap with the prior five panels’ samples. Thus, establishments are included in the sample at most once every three years.


3(a) Maximizing Response


A goal of the OES survey is that each State achieves an 80 percent response rate. The overall response rate for the 2018 survey was approximately 71.2 percent based on units.

Each State is responsible for collecting the questionnaire forms mailed to the sample units selected in their State. Every effort is made to maximize response rates to achieve the 80-percent goal by:


  • Surveying sampled units at most once every three years (once every six panels). With the research to improve time series capabilities of the OES survey this constraint may be relaxed for some employers.

  • Conducting extensive address refinement to ensure that the survey form reaches the correct establishment in a timely manner.

  • Providing each sampled unit with a cover letter explaining the importance of the survey and the need for voluntary cooperation.

  • Giving each private sector sample unit the Bureau’s pledge of confidentiality.

  • Sending pre-notification letters to establishments before they are contacted to provide data.

  • Sending each nonresponding unit two to three additional mailings after the initial mail-out (if necessary); the BLS also recommends that the States obtain specific contact names for each sampled firm.

  • Contacting key nonresponding units in each MSA/three-, four-, or five-digit industry cell by telephone.

  • Contacting critical employers through personal visits (if necessary).

  • Including fact sheets that explain the many uses of the OES data.

  • Using status reports and control files to identify MSA/industry cells with low response rates.

  • Stressing to respondents that assistance is available to help them complete the survey form.

  • Providing a link on the survey form to OES data on the BLS Internet website to demonstrate program usefulness.

  • Using a respondent web page that provides detailed information about responding to the OES survey, including state contact information for those needing assistance.

  • Increasing the use of electronic and telephone collection in order to allow the respondent to provide information in a way that is most convenient to them.

  • Providing email and online data submission options.

  • Advertising electronic submission options.

  • Coordinating collection for multi-unit establishments.

  • Sending initial and follow-up email blasts to units that provided an email address.

  • Conducting periodic Response Analysis or Nonresponse Analysis surveys to learn what motivates employers to respond.


Response Burden--The following table shows the estimated response burden by Mandatory/Voluntary and Ownership:


Table 2: Response Burden

Survey year | Mandatory / voluntary | Ownership  | NAICS Coverage                     | Estimated Responding Units | Estimated burden hours
FY 2020     | Voluntary             | private    | 1133, 1151, 1152, 21-81 (exc. 814) | 186,586                    | 93,293
FY 2020     | Voluntary             | government | State and local government         | 8,356                      | 4,173
FY 2020     | Mandatory             | private    | 1133, 1151, 1152, 21-81 (exc. 814) | 59,009                     | 29,505
FY 2020     | Mandatory             | government | State and local government         | 2,817                      | 1,408
FY 2020     | Total                 |            |                                    | 256,768                    | 128,384
FY 2021     | Voluntary             | private    | 1133, 1151, 1152, 21-81 (exc. 814) | 193,683                    | 96,842
FY 2021     | Voluntary             | government | State and local government         | 8,648                      | 4,324
FY 2021     | Mandatory             | private    | 1133, 1151, 1152, 21-81 (exc. 814) | 61,244                     | 30,622
FY 2021     | Mandatory             | government | State and local government         | 2,914                      | 1,457
FY 2021     | Total                 |            |                                    | 266,489                    | 133,245
FY 2022     | Voluntary             | private    | 1133, 1151, 1152, 21-81 (exc. 814) | 193,683                    | 96,842
FY 2022     | Voluntary             | government | State and local government         | 8,648                      | 4,324
FY 2022     | Mandatory             | private    | 1133, 1151, 1152, 21-81 (exc. 814) | 61,244                     | 30,622
FY 2022     | Mandatory             | government | State and local government         | 2,914                      | 1,457
FY 2022     | Total                 |            |                                    | 266,489                    | 133,245



3(b) Nonresponse Adjustment


A hot deck (nearest neighbor) imputation procedure is used to impute for unit nonresponse, which occurs when a unit reports no employment data. In this procedure, units in the sample are stratified into ‘Year / State / 5 or 6-digit industry’ cells. Within each cell, the “nearest” donor (i.e., responding unit) is selected for each nonrespondent, using an algorithm that minimizes the possibility of selecting the same donor twice. The nonrespondent’s sampling frame employment and the donor’s summed total employment are used within a cell to match donors with nonrespondents. Once a donor and nonrespondent are matched, the occupational employment totals of the nonrespondent are computed by applying the proportional distribution of the donor’s occupational employment totals. If a donor is not available at the ‘Year / State / 5 or 6-digit industry’ level, the procedure advances to successively higher-level cells until a donor is found. Background on nearest neighbor imputation procedures for OES can be found in Robertson, 1995x.
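A minimal sketch of the nearest-neighbor hot-deck step described above: the donor is the respondent whose total employment is closest to the nonrespondent's frame employment, and the donor's staffing pattern is scaled to the nonrespondent's frame employment. The data, and the single-cell simplification, are illustrative only:

```python
# Sketch of nearest-neighbor hot-deck imputation for unit nonresponse.
def impute_unit(frame_emp, donors):
    """donors: list of (total employment, {occupation: employment}) tuples
    from responding units in the same cell."""
    # Nearest donor by absolute distance in total employment.
    total, pattern = min(donors, key=lambda d: abs(d[0] - frame_emp))
    # Apply the donor's proportional staffing pattern to the frame employment.
    return {occ: frame_emp * emp / total for occ, emp in pattern.items()}

donors = [
    (100, {"cooks": 40, "servers": 60}),
    (20,  {"cooks": 5,  "servers": 15}),
]
imputed = impute_unit(90, donors)   # nonrespondent with frame employment 90
```

The production procedure also guards against reusing the same donor and falls back to higher-level cells when no donor exists; those refinements are omitted here.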


A variation of the mean imputation procedure is used to impute for item nonresponse, which occurs when a unit reports the total employment for an occupation but not the corresponding wage data. Units whose occupational employment was imputed in the previous step are also treated as nonrespondents in this procedure. Units in the sample are stratified into ‘Year / MSA / 4-digit industry / Size class’ cells. A wage-employment distribution is then calculated for occupations missing wage information, based on the usable data in the cell. Missing wage employment is then imputed by using that distribution to prorate the reported total employment across the wage intervals.
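The proration step described above can be sketched as follows; the interval labels and counts are invented for illustration:

```python
# Sketch of item-nonresponse imputation: prorate a unit's reported
# occupational employment across wage intervals using the cell's
# observed wage-employment distribution.
def prorate(total_emp, cell_band_emp):
    """cell_band_emp: {wage interval: usable weighted employment in cell}."""
    cell_total = sum(cell_band_emp.values())
    return {b: total_emp * e / cell_total for b, e in cell_band_emp.items()}

cell = {"A": 50, "B": 30, "C": 20}   # hypothetical cell wage distribution
imputed = prorate(8, cell)           # unit reported 8 workers, no wages
```

The unit's 8 workers are spread over the intervals in the same 50/30/20 proportions observed in the cell.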


3(c) Non-Response Bias Research


Extensive research has previously been done to assess whether nonrespondents to the OES survey differ systematically in some important respect from respondents and would thus bias OES estimates. One study compared average hourly occupational wages from the OES survey to those of another Bureau program, the National Compensation Survey (NCS), which is used for the Employment Cost Index and by the President’s Pay Agent in setting locality pay. The comparisons were done at the national level and at various MSA levels. This research showed that the mean hourly wages for 70% of the occupations were not statistically different between the two surveys, and an additional 10% were not economically different; that is, the two mean occupational wages were within 10% of each other. Many of the remaining 20% of the occupations had wage discrepancies that could be explained by conceptual differences, coding inconsistencies, and other differences between the two programs. An important finding was that the differences in mean occupational wages between the two surveys ran in both directions, with a 40-60% split; these findings held at both the national and MSA levels (Barkume, 2006)xi. The results suggest that the nearest neighbor imputation method and the wage imputation method described in this document perform a reasonable adjustment for missing data. Future plans include repeating this and similar analyses of nonresponse bias, resources permitting.


3(d) Confidentiality


Before occupational employment estimates are released to the public, they must first be screened to ensure that they do not violate the Bureau of Labor Statistics’ (BLS) confidentiality pledge. A promise is made by the Bureau to each private sector sample unit that the BLS will not release its employment data to the public in a manner that would allow others to identify the unit. If an occupational employment or wage estimate fails confidentiality screening, the estimate is suppressed.


3(e) Publishability


After confidentiality screening, the estimates (including confidentiality-collapsed estimates) are screened a second time to ensure that they satisfy the Bureau’s publishability standards. Among these standards: employment estimates must have an associated relative standard error of 50 percent or less, there must be at least two non-imputed responses for the occupation, and the employment estimate must be 10 or greater. For wage estimates, the relative standard error must be 30 percent or less.
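The publishability rules above can be expressed as a simple check; the function and field names are hypothetical, not the production schema:

```python
# Sketch of the section 3(e) publishability screen.
def publishable(emp, emp_rse, n_responses, wage_rse=None):
    """Return True if an estimate passes the publishability standards:
    employment RSE <= 50%, at least two non-imputed responses, estimated
    employment >= 10, and (when a wage is published) wage RSE <= 30%."""
    if emp_rse > 0.50 or n_responses < 2 or emp < 10:
        return False
    if wage_rse is not None and wage_rse > 0.30:
        return False
    return True
```

For example, an estimate of 150 jobs with a 12% employment RSE, 8 responses, and a 5% wage RSE passes, while the same estimate with a 35% wage RSE, or one based on a single response, is suppressed.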


4. Developmental Tests


The OES program regularly evaluates its processes and methods and occasionally updates its collection materials as a result of this ongoing research. Current and future plans are outlined below. Additional details regarding testing or improvements will be submitted through the non-substantive change request process.



Test of Email Timing and Format


OES uses a combination of emails and letters for nonresponse follow-up. Currently, the mailings are sent out on a fixed schedule, every four weeks. BLS would like to explore the effectiveness of varying this schedule, to see if sending the email closer to the letter mailing leads to improved response rates or electronic reporting. The content of the emails would not vary from the OMB-approved materials, but the frequency or gap between contacts would change depending on the treatment group. For states that volunteer to participate, we would randomly assign respondents to treatment groups, and the groups would receive the emails on different schedules. The final response rates for each group would then be analyzed to identify the most effective schedule to follow in the future.


In conjunction with timing, email blast formats must be periodically evaluated to ensure they make effective use of character counts and other factors that affect response. Based on evaluation and comments from BLS staff, state office staff, and respondents, OES may decide to alter the email blast templates accordingly. A test plan and proposed template would be submitted by BLS through the non-substantive change process.


Current Research


Using wage rate data in combination with interval data


Currently, some employers provide electronic files of employees with actual wage rates rather than wages reported in intervals, because doing so requires less time and effort than converting their data to wage intervals; BLS or the SWAs then convert the reported data to intervals. We would like to give employers the option of filling out a survey form with wage rates, or providing their data in Weblite by wage rate. In addition to accommodating respondents’ preferred method of reporting, these data are valuable for research on methodology improvements. As a result of future research, OES may alter its collection forms. In the event that methodology testing is conducted and forms must be changed, OES will submit the test plan and updated form template through the non-substantive change process.


Examination of wage rate data already provided by respondents on a volunteer basis has suggested that improvements in methodology can be obtained by capturing the wage rate data from employers. This is especially true when wages are clustered, such as for minimum wage workers, or for union workers within a particular industry or area. In addition, wages for highly paid occupations are affected by assigning them to predefined intervals. In order to study this more fully, and incorporate it into our methodology, BLS is exploring ways to incorporate the data from employers that choose to report wage rates.


Evaluating the use of model based estimation

(See Section 2(b) above.)







Future Research


Adjusting solicitation materials


OES solicitation materials are designed to be dynamic. Solicitation materials are under constant evaluation to ensure the proper information is being conveyed to respondents, to ensure that respondents are properly assisted with submitting their data, and to ensure that the data being collected is that of the highest quality. To fulfill our promise of quality data and to control respondent burden, OES may need to make periodic changes to our solicitation materials. All changes will be subject to approval through the non-substantive process. OES will submit detailed test and change plans, which will include templates and design, to OMB when necessary.


Collecting actual wage rates


Subject to funding availability, OES would like to conduct tests to investigate the viability of collecting additional information from employers, which could potentially improve the quality of occupational wages, reduce response burden on employers, allow for further analysis of factors affecting occupational wages, and lead to expansion of OES data products. BLS is planning to develop more detailed proposals on testing the collection of actual wage rates from more employers, additional data items, and additional establishments such as larger employers and multi-unit establishments. This research may use a split panel design, and may lead to reduced burden hours. Sample designs, survey instruments, and survey methods for this research will be detailed later. As with all research and tests, OES will submit detailed test plans through the non-substantive process to receive approval.


Asking employers for additional data elements


BLS would like to ask employers to report data items that many already report without solicitation. Employers often include additional data elements in their electronic OES reports that we do not ask for, such as the number of hours each employee is paid for. These elements include information requested by customers that cannot be provided by OES or other BLS surveys. For example, establishments report, along with occupation and wages, items such as part-time or full-time status, hours, whether or not employees are exempt from the Fair Labor Standards Act, gender, age, EEO category, union status, specific job title, department, and others. While some of these occupational characteristics are available from other BLS sources, none are available for all states and areas; in the case of demographic data, they cannot be associated with a particular employer’s industry or size and are not available for many occupations. A small-scale test successfully collected extra data elements, showing that they can be collected from respondents; while the test was limited in time and scope, its response rates mirrored those of regular OES data collection. Also, a Response Analysis Survey (RAS) conducted in 2011-12 showed that most employers are willing to provide additional data such as hours worked and part-time/full-time status. BLS would like to continue this research.


Asking employers to report for all their establishments instead of randomly selected establishments


Multiunit companies often report occupational wage data for all of their establishments every year rather than just the sampled establishments, providing comprehensive electronic data files, or “data dumps,” of payroll data. The OES analyst sorts through these reports, matches them to the sampled units (saving the respondent the burden of doing so), and ignores the unsolicited establishments. Since so many employers provide this information without being asked, BLS would like to explore how employers would respond if we specifically asked them to provide these data. Some of the volunteered establishments might be included in the OES sample in a different year, when the newer data will be solicited. For units that are not in the 6-panel sample used for estimates, their inclusion might help local area estimates, and capturing newer data for units in older panels might improve the currency of the data. While OES is not a time series, many customers would like to use it as one; capturing data each panel for some employers that report electronically might facilitate the time series qualities of OES data. BLS is interested in testing ways to improve time series, and asking more multi-unit reporters to report all of their data, rather than only selected sample units, is one way to do so.


Data elements included in the data dumps have already proven useful for quality control. For example, hours-worked data provided by some airlines helped to improve wage estimates for pilots and flight attendants. Wage rate data for the U.S. Postal Service, where occupational wages are tightly clustered even nationwide, showed the need to collect exact wage rates rather than wage intervals. Job titles provided in the data dumps have helped to identify titles coded to the wrong occupations, as well as paid employees, such as students, who are out of scope for the OES survey. BLS would like to explore asking selected employers to provide these data in their OES reports, to address any bias that may result from self-selection in reporting them.


5. Statistical and Analytical Responsibility


Mr. Edwin Robison, Chief, Statistical Methods Division of the Office of Employment and Unemployment Statistics, is responsible for the statistical aspects of the OES program. Additionally, BLS consults with outside experts on an as-needed basis.


6. References


i Bankier, Michael D. (1988) Power Allocations: Determining Sample Sizes for Subnational Areas. The American Statistician, Vol. 42, pp. 174-177.


ii Lawley, Ernest; Stetser, Marie; and Valaitis, Eduardas, (2007) Alternative Allocation Designs for a Highly Stratified Establishment Survey, 2007 Joint Statistical Meetings.


iii Piccone, David and Stetser, Marie, (2009) National Sample Reallocation for the Occupational Employment Statistics Survey, 2009 Joint Statistical Meetings.


iv Technical Notes for May 2018 OES Estimates, Bureau of Labor Statistics, 2018 (http://www.bls.gov/oes/current/oes_tec.htm).


v Survey Methods and Reliability Statement for May 2018 (PDF)


vi Occupational Employment Statistics State Operations Manual, Appendix M, Bureau of Labor Statistics, 2014 (http://199.221.111.170/Programs/OES/docs/documentation/OES%20State%20Operations%20Manual_2014.zip).


vii Dey, Matthew; Piccone, David; and Miller, Stephen, Model-based estimates for the Occupational Employment Statistics program. Monthly Labor Review, August 2019 (https://www.bls.gov/opub/mlr/2019/article/model-based-estimates-for-the-occupational-employment-statistics-program.htm).


viii Robertson, Kenneth; Tou, Albert; and Huff, Larry, Developing an Estimator to Estimate the Variance of Mean Wage Rates Computed from Grouped Data in the Occupational Employment Statistics Survey, ASA Proceedings, 1997 (http://www.asasrms.org/Proceedings/papers/1997_081.pdf).


ix Butani, Shail; Robertson, Kenneth; and Werking, George, Alternative Methods for Updating Wage Data from the Occupational Employment Statistics Survey, BLS Research Paper, 2000 (https://www.bls.gov/osmr/research-papers/2000/pdf/st000080.pdf).


x Robertson, Kenneth; Tou, Albert; and Huff, Larry, Study of Donor Pools and Imputation Methods for Missing Employment Data, BLS Internal Research Paper, 1995 (https://www.bls.gov/osmr/research-papers/1995/pdf/st950180.pdf).


xi Barkume, Tony; Dey, Matthew; Ernst, Larry; Gittleman, Maury; and Polivka, Anne, Comparing OES and NCS Wage Estimates. BLS internal report. November 2006 (final report).



xii Response Analysis Survey Results for the Occupational Employment Statistics Survey, Bureau of Labor Statistics, Internal Document, 2013.



