U.S. DEPARTMENT OF LABOR
BUREAU OF LABOR STATISTICS
Occupational
Requirements
Survey
Phase 3 Summary Report
Executive Summary
The National Compensation Survey (NCS) recently completed Phase 3 of the Occupational Requirements Survey (ORS)
tests conducted in cooperation with the Social Security Administration (SSA). The main objective of Phase 3 was to test
whether ORS field economists from across the country could collect all of the ORS data elements and occupational
wages and leveling information in a uniform and efficient manner. Phase 3 also included supplemental tests to assess
the feasibility of Central Office Collection (COC), joint collection of ORS and Employment Cost Index (ECI) elements, and
conducting “efficiency” interviews. The results of the Phase 3 test demonstrate the effectiveness of the revised
materials and procedures and the continued viability of BLS collection of data relevant to the SSA’s disability program.
Respondents agreed to participate in the test, BLS field economists were able to capture the required data from
traditional NCS respondents, and individual data element response rates were very high.
Field collection for Phase 3 testing was conducted in areas surrounding six cities: Nashville, TN; Providence, RI;
Cincinnati, OH; Kansas City, MO; Pittsburgh, PA; and Orange, CA. Tests ran concurrently in all of these cities between
April 22, 2013 and July 19, 2013. Establishments were selected from the current NCS sampling frame for businesses in
the test cities, excluding all establishments currently active in any NCS sample. Probability Selection of Occupations
(PSO) was used to determine the occupations selected for collection. Experienced field economists from each BLS
regional office collected the data and more than a third of the interviews were observed by BLS or SSA staff.
Upon completion of the ORS collection, respondents were asked questions to gauge their reactions to the survey, and
BLS field economists and observers completed a post‐interview debriefing. Throughout the fielding period, regular
debriefing sessions were held with BLS field economists, observers, and other staff for the purposes of discussing
interviewing experiences, identifying potential issues with the materials and procedures, and sharing lessons learned.
City debriefing sessions were held for each test city and an overall Phase 3 debriefing summarized the major findings
identified during testing and allowed for expanded discussions of these and other issues between BLS and SSA staffs.
Key Findings
The results of the Phase 3 test overall were very promising. BLS field economists completed interviews with 638
companies representing a total of 667 establishments. Personal visit interviews were conducted for 92.3% of the
schedules, 6.3% were conducted by telephone, and 1.2% involved some combination of phone and personal visit
collection. During these interviews, field economists collected ORS and NCS data elements for a total of 3,259 sampled
occupations, representing 496 eight‐digit classification levels within the SOC structure. On average, data collection took
68 minutes per schedule. Each schedule had an average of 5.2 quotes, with an overall average of 13.2 minutes per
quote.
The field economists were very positive about the set of materials and resources that were developed or refined for
Phase 3 to explain the background, purpose, and methods of the test and to secure respondent participation. In
particular, a two‐sided ORS flyer with an endorsement letter from SSA on the back was widely used, as were area
economic summary sheets and an introductory letter tailored to industry sectors and BLS region.
Phase 3 training activities were effective in conveying the key ORS concepts and procedures. Field economists
particularly valued the pre‐collection ‘mock’ interviews, the debriefing following the mock interview, and the personal‐
visit mentoring appointments. The daily and weekly debriefing sessions continued to be an excellent training tool as
well, offering field economists the opportunity to exchange information, clarify their understanding of materials, and
share suggestions about collection issues. The Phase 3 mentoring process also was well received, though field
economists recommended that in future rounds of testing mentees should observe schedules before training and have
additional time to write up their schedules and have them reviewed before independent collection. Field economists
recommended that ORS expand training content and explore different delivery options to maximize engagement and
decrease the time needed to independently collect ORS.
The Multiple Quote tool – which collected information by element for all jobs simultaneously – was the most popular
choice among the field economists collecting data in Phase 3, and seemed to provide a good balance between collecting
quality data and reducing respondent burden. Field economists would have liked additional space on the tool to record
explanatory notes. As in previous rounds of tests, certain words and concepts continued to be unclear or confusing to
respondents (e.g., required; time to adequate performance). Field economists relied on their professional understanding
and professional judgment when interpreting respondent‐provided information and administering follow‐up probes.
Applications of professional judgment were most common when there were apparent inconsistencies among elements,
with combination jobs and combined tasks, and when respondents reported higher‐than‐expected frequency estimates
(e.g., especially for speaking and reaching). Field economists emphasized that the use of professional judgment is
essential when collecting ORS information and recommended additional training be developed on this topic to ensure
that it is applied consistently and documented with sufficient detail. They noted the need for additional refinements
to question wording, probing guidelines, and development of better examples for each physical demand and
environmental factor (possibly tailored to specific industries or establishment types). Despite these areas for
improvement, respondents reported quite positive reactions to the ORS initiative and experience, and there was
consensus among field economists that the ORS materials and procedures significantly improved between test phases
and throughout Phase 3.
The results of the small‐scale supplemental tests conducted in Phase 3 also were very promising. For example, although
only 60 schedules were completed in the efficiency burden test, the results point to the potential benefits of asking
questions about combination elements (i.e., driving, keyboarding, and writing) at the start of the interview to provide
better context for the subsequent elements and improve the application of professional judgment. Together, the
findings from these supplemental tests underscore the importance of continuing to develop strategies for increasing the
efficiency of ORS element collection, collecting high‐quality NCS and ORS data simultaneously, and securing ORS
elements from large employers with multiple locations.
The introduction of a new web‐based system for data capture in Phase 3 (Phases 1 and 2 relied on a spreadsheet) also
yielded valuable information. The system was designed to permit data entry by field economists, review of the captured
data, and tabulation of the results. Overall, field economists and reviewers liked the web‐based system, but cited the
inability to use the system without on‐line access as a shortcoming. Some field economists expressed the need for a
wage‐import feature for bigger establishments and the ability to enter compound wages. Field economists and
reviewers also recommended that additional development and testing be carried out in fiscal year 2014 to better
integrate the data capture system with other components (e.g., the appointment calendar, review questions, time‐
reporting spreadsheets, etc.).
The consensus opinion of BLS staff is that the Phase 3 objectives were successfully met and that these activities lay a
strong foundation for future tests. The body of this report provides additional details about these and other issues that
arose during Phase 3 testing and identifies considerations and suggestions for fiscal year 2014 testing.
Phase 3 ‐ Background and Test Overview
In April 2012, the Bureau of Labor Statistics (BLS) signed an interagency agreement with the Social Security
Administration (SSA) for the purpose of designing, developing, and carrying out a series of tests to assess the feasibility
of using the National Compensation Survey (NCS) platform as a means to accurately and reliably capture data relevant to
the SSA’s disability program purposes. The resulting initiative – the Occupational Requirements Survey (ORS) Project –
recently completed the third of three phases of testing planned for fiscal year 2013 as outlined in the Interagency
Agreement Deliverable sent to SSA on September 28, 2012. That document outlines the work the BLS will perform, key
objectives, and a detailed test plan:
In fiscal year 2013, the BLS performed work to meet the following objectives:
1. Evaluate survey design options and begin developing the protocols, aids, and final design to meet SSA data
needs;
2. Collect data to test and refine the protocols and aids; and
3. Provide documentation to the SSA summarizing the work performed by the BLS, conclusions drawn, and
recommendations for future data collection.
In order to accomplish these objectives, the BLS conducted a series of field tests with general test goals as described
below:
Phase 1 – Initial Proof of Concept Testing: The primary goal of this phase of testing was to ensure that the BLS field
economists knew how to describe the survey and ask for the new data elements. In addition, the BLS created and
tested an initial set of data‐collection protocols and a preliminary set of data collection aids.
Phase 2 – Collection Protocol Testing: The primary goal of this phase of testing was to test collection of the new data
elements while performing a selection of occupations from each respondent. In addition, the BLS refined the
collection protocols and aids based on an analysis of the test results.
Phase 3 – Broad‐Scale Testing: The primary goal of this phase of testing was to evaluate the BLS’s ability to select a
sample of occupations within each establishment, collect the new data elements needed by SSA, and collect other
NCS data elements that are of research interest to SSA such as wages and job‐leveling information. A second
objective of Phase 3 testing was to assess the feasibility of collecting the data needed by SSA in addition to all of the
NCS data elements needed to produce the Employment Cost Index, Employer Costs for Employee Compensation,
and various benefits products.
Phase 1 testing was successfully conducted in the Washington, D.C. consolidated metropolitan statistical area (CMSA)
from November 28, 2012 through December 6, 2012. BLS staff completed interviews with 27 establishments, collecting
detailed information on SSA data elements for 104 occupations. Phase 2 testing was successfully conducted in the
Indianapolis ‐ Anderson ‐ Columbus, IN and Portland ‐ Vancouver ‐ Hillsboro, OR‐WA metropolitan areas from January
28, 2013 through March 21, 2013. BLS field economists completed 227 interviews representing a total of 240
establishments, collecting detailed job information for 1,094 occupations. Probability selection of occupations was
successfully implemented in over 90 percent of interviews.
Phase 3 ‐ Test Objectives
In addition to the primary objectives of the Broad‐Scale Testing described above, Phase 3 activities also were designed
to test:
1. Collecting SSA data along with occupational wage data and job leveling;
2. The feasibility of Central Office Collection (COC), joint collection of ORS and Employment Cost Index
(ECI) elements, and conducting “efficiency” interviews (collectively, these are referred to as the
supplemental tests);
3. A new data capture/write‐up system; and
4. New schedule‐review procedures.
The goals of the supplemental tests were as follows:
Central Office Collection:
1. Assess the extent to which remote respondents could provide ORS data;
2. Determine whether current NCS Central Office Authorization (COA) and COC collection procedures will
need to be changed to accommodate ORS;
3. Develop an initial list of best practices for handling COC/COA establishments; and
4. Document situations where ORS data are not available through central office respondents.
Employment Cost Index Collection:
1. Develop an initial list of best practices; and
2. Evaluate the feasibility of incorporating fully integrated benefits collection in future testing.
Efficiency Burden Testing:
1. Identify collection best practices that produce high quality data but are more efficient than the current
structured interview; and
2. Determine the impact of modified collection protocols on data quality.
Cross‐cutting all Phase 3 test activities were the goals of further refining the collection methods and tools, attempting to
achieve collection results in a uniform and efficient manner, and ensuring that the data collected would meet SSA’s
requirements.
Phase 3 ‐ Test Methods and Materials
Field collection for Phase 3 testing was conducted in areas surrounding six cities: Nashville, TN; Providence, RI;
Cincinnati, OH; Kansas City, MO; Pittsburgh, PA; and Orange, CA. Tests ran concurrently in all of these cities between
April 22, 2013 and July 19, 2013. In addition, some data collection was done in Baltimore, MD in order to provide SSA
staff the opportunity to observe interviews.
Selected Establishments
Establishments were selected from the current NCS sampling frame for businesses in the test cities, excluding all
establishments currently active in any NCS sample but including some units recently rotated out of NCS production. The
target number of completed interviews for Phase 3 testing was 690 establishments (115 per test city) representing as
broad a mix of industries as possible given the size and time constraints of the test. At the conclusion of the testing
period, BLS completed 667 interviews across all test cities (see the Results section for more details about participating
establishments).
Field Economists
Phase 3 data were collected by experienced NCS field economists from BLS national and each regional office. Field
economists who had participated in previous ORS phases served as field data collectors in Phase 3 and as mentors to the
field economists new to ORS. Interviewer training for Phase 3 collection followed regular BLS practices and consisted of
five components:
Self study – Field economists were provided the SSA background materials, ORS high‐level test plans, element
definitions, and pre‐course exercises.
Webinar – In the weeks prior to the start of test data collection, field economists participated in web‐based
training sessions in which they were given an overview of the SSA disability process, practice with the data
capture system, and information about the ORS test objectives, procedures, and materials.
In‐person training – This occurred on April 30, 2013 and May 1, 2013. In‐person training consisted of: review of
the technical memorandum; introduction to ORS elements; an overview of instrument edits and data analysis
objectives; a calibration exercise; mock interviews; and a mock‐interview debrief session.
Mentoring ‐ In addition to formal training, each field economist new to ORS was assigned a mentor who had
collected in prior ORS testing. The mentor served as the mentee’s primary resource for technical advice.
Mentees observed mentors conducting two interviews prior to conducting their own interviews, and then were
themselves observed by their mentor on their first two interviews. Mentors also reviewed the mentees’ write‐
up of their initial schedules. Once certification requirements were met, mentees were moved to independent
collection with schedules subject to Regional Schedule Review and ORS Data Analysis Review.
On‐the‐job training – Throughout the Phase 3 Test fielding period, field economists engaged in a number of
activities designed to reinforce formal ORS training concepts. During data collection, on‐the‐job training was
provided through daily debriefing sessions and formal technical guidance. In addition, informal conversations
between field economists and project staff helped to identify challenging issues, lessons learned, and best
practices.
Refinement Procedures
Field economists were provided with their collection assignments prior to their arrival in the test city, and immediately
began the refinement process (e.g., using Internet searches, phone calls) to validate the assigned establishments. Only
establishments that could provide data locally (i.e., within each test city’s designated area or within a reasonable
distance of the test city with the test coordinator’s approval) were considered in scope. Field economists were
instructed to proceed to the next establishment if data had to be procured from outside the test area (e.g., from a
corporate office in another city), or when refinement proved problematic after reasonable effort had been made.
To secure appointments, field economists contacted potential respondents and attempted to speak with company
personnel with the same job titles that NCS traditionally works with (e.g., HR Directors, Personnel Managers, and Hiring
Officials). If those individuals were not available, the field economist worked with the establishment contact to identify
the person most familiar with the target data elements to interview. When the appropriate company contact was
located, the field economist used a standardized script or “spiel” to explain the purpose and importance of the survey,
and attempted to schedule an appointment. Field economists also were provided with standard answers to commonly
asked questions to aid this effort. Some of the field economists sent e‐mails to establishment contacts to provide
additional background information about the purpose of and procedures for the interview. In some cases, the
appropriate contact could not be found so field economists conducted an unscheduled personal visit to try to secure
participation. All potential respondents were informed that the data collection effort was part of a test being done at
the request of the SSA and that participation was voluntary.
When contact could not be made with listed establishments or the establishment points of contact expressed reluctance
or indicated that they were unavailable during the test field period, the field economists were instructed to forego the
normal NCS conversion attempts and simply advance to the next establishment on their assignment list. This process
continued until each field economist had secured their allotted appointments.
Occupational Selection for Responding Units
The Phase 3 tests continued to evaluate the use of NCS Probability Selection of Occupations (PSO) procedures as an
occupational selection tool. Field economists were instructed to use standard NCS coverage and procedures for PSO as
much as possible. They attempted to obtain an employee list, refine the list based on NCS inclusion/exclusion criteria,
and then select the appropriate number of occupations for collection. Field economists selected up to 8 occupations for
the largest establishments (i.e., those with more than 250 employees), and fewer occupations for smaller
establishments (e.g., 4 occupations for companies with fewer than 50 employees). If the respondent was unwilling or
unable to do formal PSO, a fall‐back selection procedure was used. Field economists attempted to identify the most
highly populated occupations in the establishment and then select between four and eight of those that also spanned
different major SOC classifications. As a last resort, the selection of occupations could be accomplished by selecting the
occupations most convenient for the respondent. PSO could be done at the start or in advance of the interview.
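To make the selection mechanics concrete, the sketch below shows one way probability‐proportional‐to‐employment occupation selection could work, using random decimals in place of the paper Random Decimal Table described later in this report. Only the 4‐quote and 8‐quote size rules come from the text above; the function names and the quote count assumed for mid‐sized establishments are illustrative, not the actual NCS PSO specification.

```python
import random

def quote_count(total_employment: int) -> int:
    """Number of occupations to select under the Phase 3 size rules:
    4 for establishments with fewer than 50 employees, up to 8 for those
    with more than 250. The middle tier (6) is an assumed value."""
    if total_employment < 50:
        return 4
    if total_employment > 250:
        return 8
    return 6  # assumed middle tier

def select_occupations(occupations: dict[str, int]) -> list[str]:
    """Select occupations with probability proportional to employment,
    without replacement, using random decimal draws."""
    total = sum(occupations.values())
    n = min(quote_count(total), len(occupations))
    pool = dict(occupations)
    selected = []
    for _ in range(n):
        r = random.random() * sum(pool.values())  # random decimal draw
        cumulative = 0
        for occ, emp in pool.items():
            cumulative += emp
            if r < cumulative:
                selected.append(occ)
                del pool[occ]  # sample without replacement
                break
    return selected

# Example: a small establishment (46 employees) yields 4 quotes, with
# larger occupations more likely to be drawn.
print(select_occupations({"Cashier": 30, "Cook": 10, "Manager": 3,
                          "Janitor": 2, "Bookkeeper": 1}))
```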
Standard NCS worker characteristics (e.g., full‐time/part‐time, union/non‐union, time/incentive, supervisor/lead worker
information, etc.) were collected for all selected occupations. These characteristics were used to help identify unique
occupations. In some cases, jobs uniquely defined under NCS required further refinement in order to collect the ORS
elements accurately.
Procedures for Phase 3 Supplemental Tests
As another component of the Broad‐Scale Testing, BLS also conducted three small‐scale supplemental tests in Phase 3,
carried out by a subset of Phase 3 field economists.
Central Office Collection (COC) Test
One to two COC establishments were selected within each of the six test areas, with no restrictions on industry, size, or
class (but no more than one state or local government COC nationwide). In the event that a contacted COC respondent
was unable to provide ORS data, field economists attempted to collect the data locally at the sampled location; if they
could not, they documented the situation and no replacement was selected. If a contacted COC establishment refused
to participate entirely, field economists attempted to identify and then interview a suitable replacement establishment
within their regional boundary.
Employment Cost Index (ECI) Test
Two establishments from pre‐determined industry groups were selected within each test city from among the list of
current ECI units scheduled for initiation during the Phase 3 test period. In the event that no current initiating ECI units
could be recruited, field economists could substitute private establishments in the same industry from the ORS sample
or, as a last resort, ECI establishments in the same industry but outside the six test areas.
Efficiency Burden Test
Experienced ORS field economists attempted to conduct 10 efficiency burden interviews in each test city. Other
experienced ORS field economists served as observers for this test, accompanied field economists on each efficiency
burden interview, and independently coded the interview based on the questions asked and responses obtained. As part of the
dual‐coding evaluation, observers and interviewers were instructed not to discuss the collected data once the
appointment began.
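The report does not specify how the dual‐coded interviews were compared. One plausible evaluation is a simple percent‐agreement calculation of the following sort; the element names and frequency codes below are illustrative, not the actual Phase 3 coding scheme.

```python
def percent_agreement(interviewer: dict[str, str],
                      observer: dict[str, str]) -> float:
    """Share of elements that the interviewer and the independent
    observer coded identically for one efficiency burden interview."""
    shared = interviewer.keys() & observer.keys()  # elements coded by both
    matches = sum(interviewer[e] == observer[e] for e in shared)
    return matches / len(shared)

# Hypothetical codes for a single quote, dual-coded without discussion.
fe_codes  = {"reaching": "frequently", "stooping": "occasionally", "noise": "moderate"}
obs_codes = {"reaching": "frequently", "stooping": "seldom",       "noise": "moderate"}
print(f"{percent_agreement(fe_codes, obs_codes):.0%}")  # 2 of 3 match -> 67%
```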
Data Collection Tools and Protocols
All of the following data elements collected in prior phases of ORS testing continued to be collected in Phase 3:
Description of establishment operations for the purpose of assigning the proper industry
Name and title of all respondents
Total employment
PSO employment
Occupational employment
Work setting
Traditional NCS worker characteristics (Full‐time/Part‐time, Union/Non‐union, and Time/Incentive)
Occupational work schedules
Non‐supervisory/Supervisory/Lead Worker designations
Job duties to code the eight‐digit SOC code using O*NET
Job duties to code the nine‐digit Dictionary of Occupational Titles (DOT) code
Specific Vocational Preparation (SVP), Physical Demands, and Work Environment
Any other useful documents (e.g., organizational charts, job descriptions)
In addition, Phase 3 collected individual wage rates for occupations and occupational job leveling information.
Basic information about the establishment (e.g., company name, total employment, description of products and the
facility, etc.) was recorded on the Establishment Information Sheet. Field economists’ use of the PSO and Random
Decimal Table ensured random selection of positions at the company.
For the Phase 3 Broad‐Scale test, field economists used separate tools to collect (a) SVP and leveling‐related information
and (b) information on Physical Demands and Environmental Conditions. A two‐sided, paper Quote Info Leveling and
SVP tool collected information about education, experience, training, and core function requirements for a specific job,
and captured job characteristics such as union/non‐union, full time/part time, supervisory duties, number of incumbents
and work setting. On the back of the tool were fields to capture the amount of time the incumbent spent driving, the
type of vehicle driven, and information regarding Generic Leveling as captured in NCS. (A separate Leveling Guide was
available to field economists for quick reference during the interview when coding leveling information).
To capture the Physical Demands and Environmental Conditions elements, field economists had three paper data
collection tool options: a Single Quote tool, a Multiple Quote tool, and an Advanced Multiple Quote tool. Each tool was
designed to collect all of the required data elements, but they differed in their visual format and collection method. The
Single Quote tool was configured to collect the occupational requirements one occupation at a time. The Multiple
Quote tool was designed to collect this information by element for all selected jobs simultaneously. The Advanced
Multiple Quote tool was designed to allow field economists to first determine whether each job required certain
elements or not (e.g., does the job require the worker to be exposed to extreme heat or not?), and then to go back and
administer follow‐up questions only for eligible occupations (e.g., how often does the job require the worker to be
exposed to extreme heat?).
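The two‐pass logic of the Advanced Multiple Quote tool can be sketched as follows: a yes/no screening pass across all selected jobs, then frequency follow‐ups only for the pairs that screened in. The jobs, elements, and answers below are hypothetical.

```python
# Pass 1 answers: does each (job, element) pair apply at all? (yes/no screen)
screens = {
    ("Registered Nurse", "extreme heat"): False,
    ("Registered Nurse", "reaching"):     True,
    ("Cook",             "extreme heat"): True,
    ("Cook",             "reaching"):     True,
}

# Pass 2 answers: frequency follow-ups, asked only where the screen was "yes".
frequencies = {
    ("Registered Nurse", "reaching"):     "frequently",
    ("Cook",             "extreme heat"): "occasionally",
    ("Cook",             "reaching"):     "constantly",
}

def advanced_multiple_quote(screens, frequencies):
    """Two-pass collection: screened-out pairs are coded 'not present'
    without a follow-up question, reducing the number of items asked."""
    return {pair: (frequencies[pair] if required else "not present")
            for pair, required in screens.items()}

for (job, element), code in advanced_multiple_quote(screens, frequencies).items():
    print(f"{job:16s} {element:12s} {code}")
```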
Use of the Advanced Multiple Quote tool was restricted to field economists who collected in prior ORS test phases. For
all remaining Phase 3 Broad Scale test interviews, the decision of whether to use the Single Quote tool or Multiple Quote
tool (or combination thereof) was left up to the individual field economists. Regardless of the tool selected, field
economists were instructed to read the questions as worded on the tool for testing consistency, but were permitted to
reorder the elements, provide additional explanations, and ask follow‐up questions as necessary.
For the Efficiency Burden Test, field economists were given the latitude to deviate from structured collection protocols.
Some field economists experimented with adapting the Advanced Multiple Quote tool; others created their own
collection tools, respondent aids, and procedures.
In addition to Phase 3 collection tools, a number of aids were developed to help clarify the survey task and definitions. A
Respondent Visual Aid provided frequency definitions. The Field Economist Reference Guide provided definitions and
clarifying examples of ORS elements. Field economists were encouraged to consult these aids during the interview and
to share them with respondents as necessary.
Both in‐person and remote data collection were tested in Phase 3. The goal was to collect data through personal visit
for 85% of the schedules and through telephone for the remaining 15% of schedules. In the event that a respondent
refused or was unable to schedule an in‐person appointment, additional phone interviews were permissible. Phone
collection protocols were very similar to those for in‐person collection. Field economists were required to use the
collection tools, collect all of the ORS Phase 3 elements, ask questions as worded, and probe unclear answers or
situations in which respondents’ answers did not match expected job patterns.
Data Capture and Review
Field economists entered data from the collection tools into a data capture system on a flow basis. The data capture
tool was designed to permit easy data entry by field economists, review of the captured data, and tabulation of results.
Review parameters were developed for Phase 3 and were used to evaluate data elements for internal
consistency. Specifically, three internal consistency review types (i.e., data capture system edits; secondary review
edits; cross‐schedule edits) were used in the data review and analysis. The parameters identified expected relationships
within individual ORS data elements and compared these against DOT coding.
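The review parameters themselves are not reproduced in this report. The sketch below illustrates the general idea of an internal‐consistency edit, using an assumed rule (the durations of mutually exclusive postures cannot exceed the workday) and assumed frequency midpoints rather than the actual Phase 3 parameters.

```python
# Assumed midpoint share of the workday for each frequency category.
FREQ_SHARE = {"never": 0.0, "seldom": 0.02, "occasionally": 0.25,
              "frequently": 0.6, "constantly": 0.9}

def consistency_flags(quote: dict[str, str]) -> list[str]:
    """Flag element combinations whose reported durations are mutually
    impossible, e.g. a job coded as sitting nearly all day cannot also
    stand, walk, or climb at high levels of duration."""
    flags = []
    sitting = FREQ_SHARE[quote.get("sitting", "never")]
    for element in ("standing", "walking", "climbing"):
        share = FREQ_SHARE[quote.get(element, "never")]
        if sitting + share > 1.0:
            flags.append(f"sitting + {element} exceed the workday")
    return flags

print(consistency_flags({"sitting": "constantly", "standing": "frequently"}))
# -> ['sitting + standing exceed the workday']
```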
Observers
To help garner feedback about the interview and data collection processes, an observer accompanied the field
economists on approximately one‐third of the data collection appointments. The observers represented a mix of roles
and stakeholders within the ORS Project (e.g., NCS/ORS management; BLS Assistant Regional Commissioners and Branch
Chiefs; ORS mentors; BLS survey methods and operations staff; BLS senior executives; SSA officials).
The observer did not participate directly in ORS collection, but was responsible for taking notes about the collection
process using the Observer tool. They noted the duration of the interview (and the time needed to administer sections
within the interview), and their observations about the ORS elements and the interview as a whole.
Other Debriefing Activities
In addition to the Observer tool, the following debriefing activities were conducted to assess the effectiveness of Phase
3 ORS materials and procedures:
Respondent and Field Economist Debriefing – at least 10 minutes were set aside at the end of each ORS
interview to ask respondents about their interview preparation and experiences. This debrief was
administered either by the field economist or the observer and consisted of a brief set of questions targeting
potential comprehension issues, perceptions of burden, etc. In addition, field economists were asked to
complete a post‐interview questionnaire about their impressions of respondent comprehension issues and the
effectiveness of the interview process and materials.
Daily Collection Debriefs – from the first day that a mentoring appointment was conducted in the test city
through May 23, 2013, a daily in‐person debriefing session was held as time permitted to assess the day’s
interviews, discuss problems or questions that arose, and share ideas for best‐practice solutions. All available
field economists and observers who had participated in that day’s collection interviews attended these daily
debrief meetings. Others involved in the ORS project attended, as available.
Weekly Collection Debriefs – From May 28, 2013 through July 1, 2013, weekly debriefing sessions were held.
These sessions were more formalized than the daily debriefs, with targeted topics and questions to cover the
range of items being tested in Phase 3 (e.g., training, data write up and review, use of professional judgment,
etc.). All available field economists and observers attended these weekly debrief meetings.
Supplemental Test Debriefs – Special debriefing sessions were held for the Phase 3 supplemental tests. All
participating field economists and observers attended these debrief meetings.
City Debriefs – Individual city debriefs were held in each test city between June 4, 2013 and June 13, 2013. The
purpose of these sessions was to summarize key findings from the Phase 3 tests in the respective cities based
on all the information collected to date through the various assessment activities, to solicit additional feedback
about components of the test, and begin to identify issues and recommendations relevant to future ORS test
collections. These meetings were attended by ORS project staff, as available.
End‐of‐Phase Debriefing – held on July 10, 2013. The purpose of this session was to summarize key findings from
the Phase 3 tests in the respective cities and the supplemental tests based on all the information collected to
date through the various assessment activities, to solicit additional feedback about components of the test, and
to identify issues and recommendations relevant to future ORS test collections. This meeting was attended by
ORS project staff and SSA sponsoring officials, as available.
Phase 3 ‐ Test Results
This section and those that follow provide information on not only the aggregated results but also specific feedback on
the various aspects of Phase 3 testing. Combined, they reflect information obtained from each debriefing component
including feedback collected from observers, respondents and field economists through questionnaires and debriefing
sessions. The information garnered from each of these components contributed significantly to the overall Phase 3
findings and conclusions and is organized throughout the various results sections as appropriate rather than individually
by source. This section begins with a broad assessment of the feasibility of collecting the data needed by SSA through
the ORS, followed by an overview of the collection effort.
Feasibility
The main objective of the work that was completed as part of the ORS project in fiscal year 2013 was the assessment of
whether it is feasible for BLS to collect data relevant to the SSA’s disability determination program. The results of this
broad‐scale test suggest that the collection of the ORS data elements using a probability selection of occupations in
conjunction with selected NCS data elements is viable.
In Phase 3, BLS field economists completed interviews with 638 companies representing a total of 667 establishments.
Personal visit interviews were conducted for 92.3% of the schedules, 6.3% were conducted by telephone, and 1.2%
involved some combination of phone and personal visit collection. During these interviews, field economists collected
ORS and NCS data elements for a total of 3,259 sampled occupations, representing 496 eight‐digit classification levels
within the SOC structure. Table 1 provides the 10 most frequently collected SOC classifications.
Table 1. Most Frequently Collected SOC Classifications

SOC Code       SOC Title                                                                         Number Collected
43‐4051.00     Customer Service Representatives                                                  68
43‐3031.00     Bookkeeping, Accounting and Auditing Clerks                                       65
29‐1141.00     Registered Nurses                                                                 59
43‐9061.00     Office Clerks, General                                                            57
25‐2021.00     Elementary School Teachers, Except Special Education                              53
35‐3031.00     Waiters and Waitresses                                                            50
43‐6014.00     Secretaries and Administrative Assistants, Except Legal, Medical and Executive    48
31‐1014.00     Nursing Assistants                                                                47
37‐2011.00     Janitors and Cleaners, Except Maids and Housekeeping Cleaners                     47
39‐9021.00     Personal Care Aides                                                               45
Standard probability selection of occupations was completed in 505 units (75.7%); the balance of the units (162, 24.3%)
selected occupations using one of the fallback protocols. Wages were collected for approximately 59.1% of the
sampled occupations, reflecting workload constraints that did not permit the extensive follow‐up activities that typically
occur when collecting wages in NCS. Information to level occupations was collected for approximately 89.1% of the
sampled occupations.
As was true in Phases 1 and 2 testing, a number of issues arose in Phase 3 (e.g., with certain data elements, aspects of
the collection and assessment, etc.) that will be addressed and evaluated in subsequent phases of testing; many of those
are reflected in the sections that follow. However, the consensus opinion of BLS staff is that Phase 3 objectives were
successfully met and these activities lay a strong foundation for future tests.
Phase 3 Ownership and Industry Targets
Ownership and industry targets for Phase 3 testing were established for each test city. In prior testing phases, the
ownership and industry targets were the same for all ownership‐industry pairings. In Phase 3, the established targets
were consistent across the test cities but varied by ownership‐industry pairing. Table 2 below contains the targets for
Phase 3 testing:
Table 2. Ownership‐Industry Targets for Each Phase 3 Test City

Ownership                                  Industry                 Target
State & Local Government (Overall = 18)    Education                     4
                                           Financial Activities          2
                                           Goods Producing               2
                                           Healthcare                    4
                                           Service Providing             6
Private Industry (Overall = 97)            Education                    11
                                           Financial Activities         14
                                           Goods Producing              14
                                           Healthcare                   12
                                           Service Providing            46
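As a quick check of these targets against the overall goal stated earlier, the per‐city targets sum to 115 establishments, and the six test cities together give the 690‐interview target:

```python
state_local_targets = [4, 2, 2, 4, 6]      # Education ... Service Providing
private_targets     = [11, 14, 14, 12, 46]

per_city = sum(state_local_targets) + sum(private_targets)
print(per_city)      # 18 + 97 = 115 targets per test city
print(per_city * 6)  # 690 completed interviews targeted overall
```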
Participating Establishments
Table 3 provides additional details about the establishments that participated in Phase 3 and the occupations for which
ORS data were collected. As can be seen, BLS staff continued their success in securing interviews from a variety of
industry groups and collecting data for a range of occupations in a tight fielding period.
Table 3. Description of Establishments and Selected Occupations for Phase 3 Testing

Private Industry

Education
  Participating units: 65 (Providence 11; Pittsburgh 12; Nashville 11; Cincinnati 8; Kansas City 11; Orange County 12)
  Employment size (range): 4 ‐ 455
  Jobs collected: 346
  Selected jobs included: Elementary School Teachers; Middle School Teachers; Secondary School Teachers; Education Counselors; Self‐Enrichment Education Teachers

Financial Activities
  Participating units: 89 (Providence 16; Pittsburgh 16; Nashville 14; Cincinnati 14; Kansas City 15; Orange County 14)
  Employment size (range): 2 ‐ 1769
  Jobs collected: 435
  Selected jobs included: Financial Managers; Real Estate Managers; Insurance Underwriters; Insurance Sales Agents; Billing Clerks

Goods Producing
  Participating units: 83 (Providence 14; Pittsburgh 14; Nashville 14; Cincinnati 13; Kansas City 14; Orange County 14)
  Employment size (range): 2 ‐ 2536
  Jobs collected: 423
  Selected jobs included: Accounting Clerks; Customer Service Representatives; Construction Supervisors; Construction Laborers; Industrial Machinery Mechanics

Healthcare
  Participating units: 74 (Providence 13; Pittsburgh 11; Nashville 12; Cincinnati 12; Kansas City 12; Orange County 14)
  Employment size (range): 2 ‐ 2200
  Jobs collected: 421
  Selected jobs included: Registered Nurses; Licensed Vocational Nurses; Nursing Assistants; Cooks; Food servers

Service Producing
  Participating units: 248 (Providence 46; Pittsburgh 38; Nashville 46; Cincinnati 32; Kansas City 42; Orange County 44)
  Employment size (range): 2 ‐ 3200
  Jobs collected: 1240
  Selected jobs included: Cooks; Bartenders; Janitors; Ground Maintenance Workers; Cashiers

Private Industry Total
  Participating units: 559 (Providence 100; Pittsburgh 91; Nashville 97; Cincinnati 79; Kansas City 94; Orange County 98)
  Employment size (range): 2 ‐ 3200
  Jobs collected: 2865

State and Local Government

Education
  Participating units: 26 (Providence 4; Pittsburgh 4; Nashville 5; Cincinnati 5; Kansas City 4; Orange County 4)
  Employment size (range): 6 ‐ 6175
  Jobs collected: 170
  Selected jobs included: Teacher Assistants; Elementary School Teachers; Janitors; Secondary School Teachers; Secretaries

Financial Activities
  Participating units: 12 (2 per test city)
  Employment size (range): 5 ‐ 302
  Jobs collected: 56
  Selected jobs included: Correspondence Clerks; Librarians; Library Technicians; Janitors; Secretaries

Goods Producing
  Participating units: 12 (2 per test city)
  Employment size (range): 7 ‐ 199
  Jobs collected: 54
  Selected jobs included: Highway Maintenance Workers; Construction Supervisors; Construction Laborers; Construction Equipment Operators; Maintenance Workers

Healthcare
  Participating units: 22 (Providence 3; Pittsburgh 4; Nashville 4; Cincinnati 5; Kansas City 4; Orange County 2)
  Employment size (range): 2 ‐ 762
  Jobs collected: 122
  Selected jobs included: Registered Nurse; Licensed Vocational Nurse; Nursing Assistants; Medical Secretaries; Mental Health Social Worker

Service Producing
  Participating units: 36 (Providence 7; Pittsburgh 6; Nashville 6; Cincinnati 5; Kansas City 6; Orange County 6)
  Employment size (range): 13 ‐ 2974
  Jobs collected: 218
  Selected jobs included: Police Officer; Recreation Officer; Lifeguards; Construction Inspectors; Wastewater Treatment Plant Operators

State and Local Government Total
  Participating units: 108 (Providence 18; Pittsburgh 18; Nashville 19; Cincinnati 19; Kansas City 18; Orange County 16)
  Employment size (range): 2 ‐ 6175
  Jobs collected: 620
Marketing
Field economists availed themselves of many of the marketing resources developed for Phase 3 by the BLS Marketing
Steering Group. The items field economists most frequently reported providing to respondents were:
The SSA endorsement letter signed by Acting Commissioner Colvin;
The two‐sided ORS handout (with the SSA letter or explanatory text on back);
The one‐page introductory letter tailored to industry sector (e.g., private, public) and BLS region, signed by the
Regional Commissioners.
Area Economic Summary Flyers prepared by the regional Economic Analysis and Information (EA&I) offices.
Each summary presented a sampling of economic information for the area; supplemental data were
provided for regions and the nation. Subjects included unemployment, employment, wages, prices, spending,
and benefits.
Other commonly used materials included:
Occupational Employment Statistics area publications
NCS publications (e.g., ECI, ECEC, and EBS News Releases)
BLS Customer Service Guide
BLS – How We Serve the Nation and You
Zooming in on Compensation Data Booklet
Benefit Cost Levels
Respondent Customized Application for Publications (RECAP) created documents
Additionally, field economists used the SSA Occupational Information System Project (OIS) and the BLS Occupational
Requirements Survey websites to inform respondents about the ORS collection efforts, and employed regular NCS
marketing methods (e.g., discussing other BLS programs in which the respondent may have an interest, offering lists of
BLS resources for information on BLS products and data, etc.) to aid respondent understanding and encourage
participation.
Just prior to the start of Phase 3, the BLS Marketing Steering Group also conducted an e‐mail blast in each of the test
cities through the regional EA&I offices. The Group identified organizations and groups thought to have an interest in
the ORS and located e‐mail contacts within them. Each EA&I office then added these contacts to its standard e‐mail
list. A statement describing the ORS and asking for support was then written and sent to all contacts on these lists. This
effort resulted in at least two media stories about ORS: an article published in the Kansas City Star (“Disability
information will be sought in KC area”); and a radio and print news piece (“Bureau of Labor Statistics Seeks More Specific
Job Descriptions”) produced by WESA‐FM (PBS, Pittsburgh). In addition, this effort produced an ORS Proposed
Stakeholders List which contains information on individuals or organizations that may have an interest in the ORS and is
intended to be a living document to which stakeholder information can be added or revised over time.
Interview Duration
Based on the approved testing plan submitted to the Office of Management and Budget (OMB), field economists were
allotted 1.5 to 2 hours to collect the ORS and NCS data and conduct the respondent debriefing in each establishment.
On average, data collection took 68 minutes per schedule. Each schedule had an average of 5.2 quotes, which gives an
overall average of 13.2 minutes per quote. Occupation selection took an average of 5 minutes to complete when not
done in advance. SVP and leveling took an average of 18 minutes to complete. Physical Demands and Environmental
Conditions together took 33 minutes to complete. The post‐interview, respondent debrief questions took an average of
6.4 minutes to administer. Pre‐collection activities such as setting up appointments took an average of 29 minutes to
complete. Post collection activities including write‐up took an average of 4.6 hours, or slightly less than one hour per
quote.
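As a rough consistency check on these timing figures (the per‐quote collection mean was presumably averaged across schedules, since the two rounded means alone give a slightly lower number):

```python
minutes_per_schedule = 68
quotes_per_schedule  = 5.2
print(minutes_per_schedule / quotes_per_schedule)  # ~13.1 min per quote in collection

writeup_hours = 4.6
print(writeup_hours * 60 / quotes_per_schedule)    # ~53 min per quote in write-up
```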
Collection Materials
A variety of resources were developed and/or refined and made available to field economists in Phase 3 to aid collection
(e.g., the Collection Guide, Field Economist Reference Guide, Technical Memorandum, collection tools, etc.). Reactions
to these materials were generally very positive: only about 12 percent of field economists’ debrief reports indicated any
issue with one or more of these resources (i.e., 88% reported no issues). Table 4 lists the types of issues that were
experienced in this small number of cases.
Table 4. Distribution of Reported Issues with Phase 3 Collection Resources

Issue                                                                          %
Collection Tool – General Issues (too many pages, no room for notes, etc.)     33.3
DOT Coding Process and Resources                                               27.8
Issues with Questions – Wording, etc.                                          22.2
Issues Related to Respondent                                                    8.3
Issues with Questions – Order of Questions                                      2.8
Blank – Checked “Yes” with no text                                              5.6
The Collection Tools
Field economists identified a range of issues that arose during Phase 3 collection and were applicable to one or more of
the collection tools.
A number of field economists reported that there were too many questions and that greater effort should be
made to consolidate and streamline questions where possible.
o For example, field economists said that there were too many questions for the carrying/lifting and the
low‐to‐the‐ground elements (e.g., stoop, crouch, and kneel), and that it was unclear when an item
needed to be asked and when it could be skipped.
Field economists noted that some items would benefit from question‐wording improvements:
o For example, it continued to be difficult to convey the meaning and objective of the lift/carry questions
to respondents and difficult for respondents to provide answers that could be coded.
o A number of field economists thought that the alternate sit/stand question was awkward and needed to
be phrased more succinctly.
In the SVP section, some respondents appeared to interpret the word “degree” to mean a college‐level diploma
and, as a result, failed to include a high school degree requirement.
The wording of the time to adequate performance question seemed to prompt some respondents to give an
answer when perhaps there was not really any time required to reach adequate performance beyond that
required to complete basic orientation activities. In addition, many field economists reported that respondents
misinterpreted this item to mean “How long does it take the incumbent to work at a fully functional level?”
A number of field economists said that they felt that the driving questions disrupted the flow between SVP and
leveling elements.
Many respondents and field economists found the transitions from yes/no questions to frequency questions
jarring and unintuitive.
A number of field economists commented on the likely relationship between the order of the questions and
data quality – asking the difficult questions early on when the respondent is fresh vs. saving the hard questions
for the end.
None of the Phase 3 collection tools had an area to record wages, though there was room for leveling
documentation (which went largely unused).
Other Collection Resources
Field economists reported that the spiel and pre‐amble were effective in conveying relevant information about
ORS and encouraging participation in the Phase 3 test, but several issues were identified in the various
debriefing activities.
o Some respondents seemed surprised when they were asked to provide wage information. Although the
written spiel did indicate that the survey would collect wages associated with the selected jobs, this
topic was not among the spiel’s highlighted text, and this led to some variability across field economists
in the extent to which this point was emphasized. Field economists felt that this issue could be easily
addressed through minor changes to the spiel’s formatting and interviewer training.
o Field economists who participated in the supplemental tests also reported that they would have liked to
have spiels tailored to the content and procedures for this test.
o In all three phases of testing, there were a small number of respondents who expressed reluctance
because they were concerned that ORS information might be used for audit purposes. Field economists
generally were successful in explaining the purpose of ORS, but agreed that it would help them to have
talking points to address those concerns.
A number of Phase 3 field economists indicated that they wound up not using the respondent aid because
respondents tended to gravitate to the frequency categories (which are less desirable) rather than reporting
specific percentages or times. Instead, these field economists attempted to get respondents to report the times
associated with the jobs’ tasks and only showed the Respondent Visual Aid if the respondent could not provide
specific timing information.
Field economists agreed that the Reference Guide was a valuable resource but requested the development of
additional examples of the conditions and factors that would be encountered in a modern work environment.
Field Economist and Observer Training
Phase 3 test collections allowed for further development and refinement of the training procedures for ORS staff. As the
training staff gained greater insight into the ORS data collection process and ORS concepts, they were able to create
training materials that helped get staff new to ORS up to speed more efficiently.
Field economists and observers were largely positive about their experiences in a number of the individual components
of Phase 3 training. They reported that the pre‐collection practice (or ‘mock’) interviews gave them the opportunity to
rehearse using the collection and observer tools and gave them confidence that they could ask the questions as
intended and collect high‐quality data pertinent to the ORS elements. The daily debriefing sessions continued to be an
excellent training tool, as well. The semi‐structured setting once again provided an exchange of information which
allowed for training and clarifications of materials already provided. Finally, field economists and observers emphasized
the value of other informal, on‐the‐job learning opportunities. These on‐going conversations helped to solidify lessons
learned, best practices, and to identify areas in need of further development.
Data Collection
Respondent Reactions
Data from respondents’ answers to the post‐survey assessment questions were collected on an observer tool and were
analyzed to gauge respondent reactions to the survey. Of the 667 interviews conducted in the six test cities, 223 had
observers; of those 223 observations, 168 (75%) resulted in a completed observer tool being submitted. Most
respondents reported that the time needed to complete the survey was appropriate, but about 14 percent said that it
took too long. This number is consistent with the information that came out of the field economist debriefings; it
suggests that the duration of the appointment itself (i.e., a little over an hour on average) is not currently problematic
and that other factors (e.g., question difficulty, amount of effort required, etc.) may be key drivers of respondents’
perceptions of burden in ORS moving forward. For example, 27 percent of respondents indicated that the survey was
“very easy,” about 70 percent fell into the middle two categories (“somewhat easy” and “somewhat difficult”), and only
2.2 percent admitted to finding the survey very difficult. More than half of the respondents (56.1%) said that they put
“a lot” or “a moderate amount” of effort into the survey (i.e., in preparation for the interview and in the collection effort itself).
Nearly 90 percent of respondents expressed some interest in the survey, but about 10 percent were willing to tell their
interviewing field economist that they felt the survey was “not at all interesting.” The sensitivity of the items does not
seem to be an issue for ORS at present: 81.1 percent of respondents said the items were “not at all sensitive,” about 15
percent reported some small to moderate amount of sensitivity, and only 2.9 percent found the questions to be “very
sensitive.” When asked directly how burdensome they found the survey to be, about 90 percent of respondents said
that it was “not at all” or only “a little” burdensome. There was not sufficient sample size or time to examine sub‐group
comparisons on these burden‐related items, but such an approach could yield valuable insights in future rounds of
testing. In addition, analysis of these types of post‐survey assessment questions may help to evaluate the impact of
design changes and inform design decisions moving forward.
In addition to questions related to respondent burden, Phase 3 respondents also were asked if they thought collecting
ORS data through alternate methods of collection (phone or email) would affect the quality of the information collected.
If we had tried to collect this information by phone or email, do you think the quality
of the information would have been better, worse, or about the same?

Better               2.7%
Worse               67.3%
About the same      30.0%
When respondents who stated the quality would be worse were asked why, they cited the following reasons:
They would not have taken the time to gather the requested information and reply back to the field economist.
The face‐to‐face interview allowed for clarifications and further guidance.
They could not have seen physical demonstrations of the elements.
It is important for the field economist to see the respondent and gauge whether they are clear on the concepts
or struggling to understand.
Phone would be preferable to email, but in‐person is still the best method.
Many respondents indicated they would have refused to participate in a phone survey that lasted an hour or
more.
Multiple respondents said that on the phone they would be multi‐tasking and not fully engaged in the
discussion.
There would be more interruptions during a phone call than a face‐to‐face interview.
The accuracy would suffer because the information being requested is very detailed.
Some respondents indicated they would not have put as much effort into getting the information for a faceless
person on the phone.
When the respondents who stated that email or phone would result in better information were asked why, they cited
the following reasons:
It would allow more time to compile information and gather/create job descriptions for the selected jobs.
It would allow them to further understand what is being asked and which colleagues to consult to get the
correct information.
Field Economists’ Professional Understanding and Judgment
In addition to direct respondent feedback, Phase 3 afforded the opportunity to incorporate additional experienced NCS
field economists into the ORS testing process. This infusion of knowledge and experience resulted in a greater insight
into the survey process and the collected data elements from people who were new to ORS testing, and it improved the
application of professional judgment. Professional judgment was used throughout the testing process – within the
interview to clarify or challenge respondent‐provided answers; in write‐up when uncovering inconsistencies in
respondent‐provided answers; in review when addressing edits or reviewer questions.
Reasons for Applying Professional Judgment
Field economists primarily cited using professional judgment when reported frequencies did not add up.
The presence or lack of an element was generally determined by the respondent’s knowledge and not subject to
professional judgment.
Whenever a field economist was able to observe a job in action or the work environment, they were better able
to apply professional judgment when coding the elements.
Professional judgment is also required when determining how much to probe. Too many probing questions can
make a respondent feel defensive or even offended. It was important to read a respondent’s reactions and
demeanor and tailor the interview accordingly.
In many situations, the amount of professional judgment required was related to the respondent’s knowledge of
the selected job. The less a respondent knew about the job, the more judgment was required.
Professional judgment was less effective in situations pertaining to a specific establishment’s hiring
requirements. Although such requirements may be unusual or unexpected, that is what the company requires
and no amount of probing will change that.
Instances Where Professional Judgment Was Applied
Many respondents tended to overestimate the amount of reaching required. This caused the field economist to
probe further to get a better idea of the kind of reaching and the amount of reaching required.
Professional judgment played a very large role in looking for consistency within the elements. For instance, a job
that involves sitting all day cannot be coded for certain physical demands at high levels of duration.
Several instances were cited where a field economist used professional judgment to go back and revisit a
previous element when the respondent gave an answer that conflicted with a previous answer.
There were multiple combination jobs with a mix of workers doing different things under the same job title. This
required professional judgment to determine which tasks were required as opposed to incidental.
Occupations where job duties and functions varied day‐by‐day depending on different situations also required
professional judgment when coding frequency.
Field Economist Best Practices
Getting the job descriptions and reviewing them in advance allowed field economists to become more familiar
with the occupations before asking the ORS element questions and offered clues about where they should
probe further with the respondent.
Field economists recommended pursuing parallel collection techniques similar to those developed for ECI
collection (e.g., pre‐collection of some data by phone or email).
Field economists found it very helpful when they could provide respondents with examples that pertained to the
specific jobs, and when possible, to observe how and where the job is being done.
Issues with Data Collection Concepts and Elements
As was true in previous rounds of testing, a number of issues arose in Phase 3 that will be addressed and evaluated in
subsequent phases of testing. BLS communicated these issues to SSA on a flow basis throughout Phase 3 testing and
during the city and phase debriefing sessions. A selection of those issues included the following:
Field economists and respondents continued to struggle to distinguish “required” job functions from incidental functions.
Coding frequency was more difficult when job tasks varied by person or rotated among the group (e.g., road
crews), or when the selected job performed different tasks more often during certain periods of time than
others.
The application of thresholds continued to be problematic for some elements (e.g., pushing/pulling; exposure to
wetness; climbing; etc.).
A number of respondents confused the job’s probationary period with the time to adequate performance.
Worker choice (e.g., in whether workers sit or stand, or stoop/crouch/kneel) continued to cause some issues.
Field economists indicated that it would be helpful to have examples specific to industry (e.g., in use of
hand/arm controls, lifting, field of vision, etc.).
When reporting frequency of speaking, respondents sometimes wanted to include all speaking during the day,
even incidental small talk, and this required field economists to ask follow‐up questions to ensure collection of
speaking related only to the core function of the job.
Coding time spent working outdoors was difficult to collect for restaurants with indoor/outdoor seating.
Field economists recommended that guidance continue to be refined on how to handle combined elements
(e.g., driving, keyboarding) to ensure that they are coded accurately and consistently, and to streamline the
interview.
Guidance on Phase 3 procedural questions was communicated to field economists throughout the test phase
through regular procedural updates.
Phase 3 Supplemental Test Results
Phase 3 testing included three supplemental tests: 1) efficiency burden, 2) COC/COA, and 3) ECI initiation. The results of each of these supplemental tests follow.
Efficiency Burden
During testing, the interviewing field economists tried several strategies designed to produce more efficient collection. These included the following (the first strategy is illustrated in the sketch after this list):
Using branching questions and skip patterns to eliminate the need to ask about certain elements,
Engaging the respondent in a conversation about the occupations by focusing on the job duties and functions
and then determining the elements that apply and the duration that they are performed or experienced,
Reordering elements to establish the existence of elements that have a correlation to other elements first,
Changing or eliminating questions based on occupational tasks and functions or in an effort to improve
respondent understanding, and/or
Creating or amending the collection tools to help field economists apply the strategies that they were using.
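To make the first strategy concrete, the following is a minimal sketch, assuming a simple console interview, of how branching questions and skip patterns can eliminate the need to ask about certain elements. The element names and the skip rule here are hypothetical illustrations, not the actual ORS collection logic.

    def ask_yes_no(question):
        # Returns True for any answer beginning with "y".
        return input(question + " (y/n) ").strip().lower().startswith("y")

    def collect_physical_demands():
        answers = {}
        # Branching question: establish a broad fact about the job first.
        sedentary = ask_yes_no("Is the job performed seated nearly all day?")
        answers["sedentary"] = sedentary
        for element in ("climbing", "crawling"):
            if sedentary:
                # Skip pattern: these demands are implied absent, so no question is asked.
                answers[element] = {"present": False, "duration": None}
                continue
            present = ask_yes_no("Does the job require " + element + "?")
            # The duration follow-up is asked only when the element exists.
            duration = input("How much of the day involves " + element + "? ") if present else None
            answers[element] = {"present": present, "duration": duration}
        return answers

The design point is that one early answer can clear several later questions at once, which is where the interview time savings come from.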
Although only 60 schedules were completed as part of this test, many insights into ways to make ORS collection more
efficient were gained. Overall, the field economists participating in this test, whether interviewing or observing, concluded that it is possible to develop more efficient strategies for collecting ORS data. However, in
addition to developing strategies and best practices, they also stated that some efficiency will come as field economists
become more comfortable with the conceptual framework behind the ORS elements and have a stronger fundamental
understanding of them.
Field economists participating in this test also reported that respondent fatigue was reduced when the interview was
conducted by having a conversation with the respondent about the occupation’s job functions, the ORS elements
involved, and the duration spent on the tasks, rather than by following the structured interview protocol. It was also noted that burden is dictated in part by the respondent rather than by the approach used, as some respondents like to ponder the questions while others simply answer them. Despite the challenges encountered in this test, the field economists thought that the quality of the data collected was as good as or better than that gathered using the structured interview protocol.
Central Office Collection (COC)/Central Office Authorization (COA)
COC testing was limited in Phase 3, with the established target set at 1‐2 companies in each test area. Field economists collected 21 schedules from 9 companies. Results show that the test was successful in meeting many of its goals.
However, with the small number of units contacted, it is not clear yet if our ‘normal NCS respondents’ in these COCs will
be the ORS respondents, and if so, whether they will have enough knowledge about job tasks related to ORS elements
(e.g., specific motions and their durations). More remains to be learned about collecting ORS data from these COC firms.
With regard to how collecting ORS data compared with collecting NCS data from these COC firms, the field economists’
experiences were mixed. Most expected that the collection of ORS data would be easy. For some, that expectation
proved true, especially in medium‐size COC firms where the typical NCS respondent could provide the ORS information; for others, it did not. One firm appeared to have conducted a similar type of study for its own purposes, so collecting the ORS data was relatively easy. The specific challenges encountered included:
Inability to collect information on certain employment groups (e.g., unionized occupations, primarily the result
of political issues and limited timeframe for the test).
Trying to “do no harm” to established NCS relationships when collecting ORS data in addition to NCS data. Time
spent collecting ORS elements in addition to NCS data could make respondents think twice about providing
future data.
Determining the need to work with multiple respondents to secure the ORS data and being unable to make a
connection with all of them during the testing period. This is similar to NCS collection from these firms in
relation to the fact that there are usually different respondents; but unlike NCS, the respondents were not all
working in the same general department.
Securing information in the temporary help industry where specific job details are proprietary to clients and the
temporary help respondent, especially at the corporate office level, has limited knowledge of the requirements
within the client.
Encountering respondents’ reliance on job descriptions in very large COC firms with hundreds of occupations, even though the job descriptions were very limited with regard to ORS elements. For a couple of the occupations, the
respondent was unable to provide the ORS information due to limited knowledge about the requirements of the
job beyond the written job description.
Obtaining clearance to participate in the test took longer than anticipated (approximately one month) due to
the various corporate layers that needed to be contacted to receive authorization.
Needing to secure a personal visit to gather the ORS data when the existing COC relationship to collect NCS data
is managed via telephone and e‐mail.
Gathering ORS data was more problematic in some cases. In one instance, the field economist surmised that this was due to the test environment, which does not carry the same perceived level of importance to the respondent as regular NCS collection.
Using job descriptions to code ORS data elements may be problematic, as they often do not include information on the types of movements that the tasks require or the duration for which tasks are performed. However, they are potentially a viable source for discerning the tasks of the occupations, making ORS element collection more focused and efficient.
One of the specific goals of this test was to determine whether ORS data would be available at the corporate
level or would necessitate collection at the local level. The field economists’ experiences in this area were mixed. Some
of the typical corporate headquarter respondents who provide NCS data were able to provide ORS data while others
weren’t. Specific issues arising with regard to collecting at the corporate vs. local level included:
When asked whether it would be possible to collect data at the local level, some corporate respondents indicated that doing so would be more difficult.
Respondents at some corporate locations may not have knowledge beyond the written job descriptions with
regard to the ORS elements, which may necessitate contacting the local levels to collect the data.
In some cases, it was difficult to determine the proper respondent for the ORS data, particularly given the
compressed testing period. As is often the case with NCS collection, it may take more time in these COC firms to
track down the respondent who can best provide ORS data.
Even in instances where COC collection under this test didn’t present any problems, field economists acknowledged that collecting ORS data from COC firms could become complicated in the future, as additional contacts and efforts to determine the best respondent for ORS data could create problems.
It may be difficult to use a consistent strategy when collecting from COC firms, as the need to collect data at the corporate or local level may vary from firm to firm and location to location based on respondent knowledge and familiarity with the occupations selected.
Regardless of the level at which the data is gathered (corporate vs. local), field economists questioned whether respondents would agree to continue answering ORS questions on different occupations on a recurring basis (e.g., as new schedules are sampled), as this would be very time consuming (e.g., multiple locations may be selected for ORS each quarter).
Employment Cost Index (ECI) Collection
ECI testing was limited in Phase 3, with the established target set at 2 establishments in each of the six test areas (12 in total). Field economists collected 10 schedules (83.3% of the established target); two of the units approached refused to participate. Results of
this test show that progress was made toward meeting the stated test goals. However, with the small number of units
contacted, it is not clear whether it is possible to collect both ORS and NCS data from all types of firms or the extent to
which overlapping samples is feasible, so additional testing is needed to fully address the test goals. Even if the decision is made, based on the compilation of ORS testing, to select distinct samples for ORS and NCS, some units could still overlap between the two samples, so it will be important for future ECI testing to build on the lessons learned in this test and continue finding the best methods and strategies for collecting both NCS and ORS data from a single establishment.
In general, field economists indicated that ORS is easier to “sell” than the ECI, as has been experienced in other ORS
testing and NCS collection, and that the reluctance encountered in this test would have been similar if only NCS data were being collected, so ORS did not seem to detract from initially obtaining cooperation. Additionally, some presented the combined collection effort as a way BLS is saving taxpayer dollars, which seemed to appeal to respondents. Other results of
their collection experiences are detailed below.
Field economists were asked about the time that it took to complete the interview, the amount of data for ORS and NCS
that they were able to collect, the respondents who provided the data, and if anything was different in their approach.
For most of the field economists, actual collection time ranged from 25 minutes (using the advanced multiple
quote tool) to 1.5 hours. The ORS elements seemed to take approximately half of the interview duration in
most cases. Most field economists indicated that respondent burden was not too much of a factor in most of
these schedules; in others, respondents showed frustration with some of the ORS elements similar to that seen in standard ORS collection.
As for the amount of data collected, the results were mixed. Some field economists indicated that they were
able to get all of the NCS and ORS data in the time allotted by the respondent while others were only able to
collect the ORS data or limited wage/benefit data for NCS either due to time constraints or respondent
reluctance. For those where limited NCS data was collected during the appointment, some were able to secure
the information during subsequent exchanges via e‐mail, similar to standard NCS collection where data is obtained over time.
In terms of respondents, all of the field economists indicated that the typical NCS respondent was able to
provide the data for ORS in the participating units.
Most field economists indicated that their approach for this test wasn’t any different than how they would
approach ECI‐only collection. The lone exception was that one field economist indicated the respondent wanted
to conduct the survey via e‐mail; however, the field economist insisted that the ORS portion be conducted via
phone.
Field economists participating in this test were asked about the most challenging aspects of collecting wage, benefit,
leveling and ORS data at the same time. Their views included the following:
Gathering the amount of information required for both NCS and ORS. Most field economists started with the
ORS data first and were, in some cases, unable to collect benefits as the time allotted by respondents expired.
Switching between ORS and NCS elements is awkward with regard to survey content. It is difficult to switch
from ORS data that is more interpretative to NCS data that is more concrete.
Organizing collection materials and determining strategy prior to the appointment to maximize the amount of
data collected while keeping respondent burden to a minimum.
Attempting to collect all of the ORS and NCS data in the initial PV appointment. Often, the collection of NCS
data is iterative, and initial contacts are made to set up future collection (typically via e‐mail) that allows for
“growing” the data over time rather than needing to get it all in one sitting.
Securing cooperation for all requested elements. Some field economists indicated that they had little difficulty
obtaining ORS data but that NCS data was not provided because it was perceived as more sensitive. For these
field economists, their perception was that NCS collection would have been difficult anyway and that ORS was
not the primary reason for the firm’s reluctance.
Having less time in the testing environment than would typically be provided in NCS production. Some field
economists thought that the compressed timeframe for conducting the test impacted their ability to collect all
the NCS elements. It was their sense that given a longer timeframe more NCS data would have been collected
as well as better quality data for ORS. Additionally, it would have afforded more time to build a relationship
with the respondent via e‐mail and when setting up the appointments.
Debriefing Activities
Each of the Phase 3 debriefing components remained useful. The daily and targeted debriefing sessions were the most
crucial assessment elements, offering focused insights into issues that will need to be addressed and further evaluated
in fiscal year 2014 (e.g., data capture system; streamlining of collection methods and procedures; training effectiveness;
etc.).
The daily debriefing sessions in the first few weeks were characterized by the following attributes:
They were helpful in getting the field economists new to ORS up to speed with the process.
Each city designated a facilitator for the debriefings.
Debriefing sessions were structured around collection appointments, with field economists and observers
sharing their experiences.
These sessions identified potentially problematic items or procedures and let field economists exchange
approaches that they found helpful in clarifying meaning and securing collection goals.
When the debriefing schedule changed to a weekly format, the sessions took on the following characteristics:
Began focusing on a specific topic each week, in addition to collection appointments.
Allowed collection of ideas and thoughts on topics that other ORS teams could specifically use.
Gave other ORS teams the opportunity to give their input as to what information they wanted to get out of the
debriefing discussions.
The city debriefings provided the opportunity for field economists to summarize their Phase 3 findings, as well as
anything else that had occurred. The phase debriefing provided the opportunity to present a large amount of data that
had been collected in Phase 3 to BLS and SSA staff and take questions from those groups. As with all the previous phases, significant problems were identified, as well as smaller issues that need adjustment. The tools and procedures are continuously being updated, and these sessions provided considerable feedback leading into fiscal year 2014.
The Phase 3 observer tool was changed to provide one large space for note taking, not separated by question. The booklet format for the tool, an idea that came out of Phase 2, proved to be simple and useful for observers.
Data Capture
In place of a spreadsheet to capture the test data, Phase 3 implemented a web‐based system for data capture. The system was designed to permit data entry by field economists, review of the captured data, and tabulation of results.
The data capture system received mixed reviews from the field economists when assessing its usability and its
perceived value. Overall, field economists and reviewers liked the web‐based system, but cited the inability to use the
system without on‐line access as a shortcoming (i.e., there was no way to work locally). Some field economists
expressed the need for a wage‐import feature for bigger establishments and the ability to enter compound wages. Field
economists and reviewers also noted that there was a lot of back and forth between the SharePoint calendar, the data
capture system, review questions, time reporting spreadsheets, etc., and they would prefer to have a more integrated
system to cover these components.
Data Analysis
BLS was tasked with developing procedures for review to determine the validity and accuracy of the data gathered in
Phase 3 as well as creating calibration activities to help ensure inter‐coder reliability in data collection.
Calibration Activities
A calibration activity was conducted in each of the six test cities prior to the beginning of collection to help ensure that
different field economists coded the same specific vocational preparation, physical demands, and environmental
conditions for a job the same way during a collection interview. This also served to augment the training provided to the
field economists by emphasizing best practices developed throughout the previous test phase.
Each field economist participating in collection for the city was given a Single Quote collection tool and viewed a video of
a scripted interview produced by the regional trainers. Field economists were instructed to code all elements for which they felt there was sufficient information in the video. They also were asked to make notes where they
thought there was enough information to code the data element but would be more comfortable if they could ask
additional questions of the respondent. In such instances, the field economists also were asked to note the specific
questions they would ask. Additionally, field economists were asked to note instances in which they could determine
the existence of a required element but would be unable to determine the frequency.
Once all participants completed their coding based upon the video, group discussion was facilitated by the regional
trainers. This discussion covered each question on the collection tool, the answers the field economists arrived at, and why they gave each particular answer. Comparisons of the field economists’ collection tools were then made to analyze the consistency of coding and identify areas for further training and development. Analysis of these comparisons is currently underway.
Data Review Procedures
BLS developed additional review procedures for Phase 3 to improve data quality and consistency. The edits used in
Phase 2 were refined and incorporated into the Data Capture system for use by field economists during schedule write‐
up and self‐review. When an edit was flagged in the Data Capture system (e.g., based on a conflict or expected
relationship between two or more elements), the field economist was alerted and required to either verify that the data
in question was correct and add documentation or change the data to clear the edit before being able to mark a
schedule complete.
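The following is a minimal sketch, assuming hypothetical element names and duration codes, of how a cross-element edit of the kind described above might work; it is an illustration, not the actual Data Capture system's edit logic.

    def sitting_standing_edit(quote):
        # Flags a quote coded as both sitting and standing "constantly",
        # an inconsistent combination of the kind the edits were built to catch.
        if quote.get("sitting") == "constantly" and quote.get("standing") == "constantly":
            return "Sitting and standing cannot both be constant; verify and document, or correct."
        return None

    def can_mark_complete(quote, edits=(sitting_standing_edit,)):
        # A schedule may be marked complete only when every edit is clear or the
        # flagged data has been verified and supporting documentation added.
        for edit in edits:
            message = edit(quote)
            if message and not quote.get("documentation"):
                return False, message
        return True, None

    ok, message = can_mark_complete({"sitting": "constantly", "standing": "constantly"})
    # ok stays False until the field economist corrects the data or adds documentation.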
Once a schedule was marked as “complete” by the field economist, it was randomly assigned to one of four review
types. Two types – mentor review and regional review – were performed by regional staff; the other two were targeted review and secondary review. Mentor review involved experienced field economists who had
collected in Phase 1 or 2 observing and reviewing the first two schedules collected by a field economist new to ORS as
part of their training process. Regional review was also performed by experienced field economists and consisted of a
full review of the first two schedules independently collected by field economists to help ensure data quality and
consistency. Additionally, 20 percent of all schedules were randomly assigned to regional review. Targeted review was
performed on a randomly selected set of schedules not marked for mentor or regional review, equal to approximately
20 percent of the total. Targeted review focused only on specific elements identified in previous phases as likely to
generate review questions (NAICS, SOC, SVP, Sitting and Standing) that were checked in every schedule, plus two
elements chosen at random. Secondary review was performed on the remaining schedules exclusively using the review
tools in a batch‐review type system.
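One way to express the assignment rules above is sketched below; the function names and bookkeeping are hypothetical, while the 20 percent shares and the fixed targeted-review elements follow the description in the text.

    import random

    ALWAYS_CHECKED = ["NAICS", "SOC", "SVP", "sitting", "standing"]

    def assign_review_type(fe_schedule_count, fe_is_new_to_ors):
        # The first two schedules collected by a field economist receive
        # mentor review (if new to ORS) or regional review.
        if fe_schedule_count <= 2:
            return "mentor" if fe_is_new_to_ors else "regional"
        draw = random.random()
        if draw < 0.20:
            return "regional"      # 20 percent of all schedules
        if draw < 0.40:
            return "targeted"      # approximately 20 percent more
        return "secondary"         # remainder, handled in batch review

    def targeted_elements(all_elements):
        # Targeted review checks five fixed elements plus two chosen at random.
        pool = [e for e in all_elements if e not in ALWAYS_CHECKED]
        return ALWAYS_CHECKED + random.sample(pool, 2)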
BLS also developed a set of tools designed to flag anomalous or outlying values, including queries comparing ORS data to
similar data on the DOT and the O*NET, as well as distribution analysis and edits to check for additional expected
relationships between elements. These edits and queries were used on all schedules to supplement review.
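As an illustration of the distribution analysis described above, the sketch below flags schedules whose value for an element sits far from the rest of its SOC code's distribution; the z-score threshold and the data are hypothetical, not the actual edit tools.

    from statistics import mean, stdev

    def flag_outliers(values_by_schedule, z_threshold=2.5):
        # values_by_schedule maps schedule IDs to one numeric element value
        # (e.g., pounds lifted) for a single SOC code.
        values = list(values_by_schedule.values())
        if len(values) < 3:
            return []
        mu, sigma = mean(values), stdev(values)
        if sigma == 0:
            return []
        return [sid for sid, v in values_by_schedule.items()
                if abs(v - mu) / sigma > z_threshold]

    weights = {"S01": 20, "S02": 21, "S03": 22, "S04": 23, "S05": 24,
               "S06": 25, "S07": 20, "S08": 21, "S09": 22, "S10": 95}
    print(flag_outliers(weights))   # ["S10"]: a higher-than-expected weight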
For all types of review done, reviewers formulated questions that were sent to the field economist who collected the
schedule in question to check the values and coding for certain elements that were flagged as potentially anomalous or
inconsistent. Field economists responded by adding supporting documentation to explain the data, changing the data,
or both. Once the field economist and the reviewer were satisfied regarding the validity of the data in question, the
schedule was marked complete and the final data was saved to the ORS database.
Efficiency Test Analysis
Schedules from the efficiency test were compared and analyzed to investigate issues of inter‐coder reliability and the
effect of the review process itself on the data. During each interview, two schedules were coded by two different field
economists at the same time. These schedules were compared with each other both before and after review had been
completed on them, with any differences in coding being noted. Elements such as reaching, speaking, lifting and
carrying showed the most variation in coding between paired schedules and will be the subject of future analysis. After
going through review, schedules were also compared with the pre‐review version to identify areas where data was
changed and improve both the edits system and the review process.
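A minimal sketch of the paired-schedule comparison is shown below; the element codings are hypothetical examples, and the function simply reports the share of elements on which the two field economists agreed.

    def coding_agreement(schedule_a, schedule_b):
        # Compare two codings of the same interview, element by element.
        elements = sorted(set(schedule_a) | set(schedule_b))
        disagreements = [e for e in elements if schedule_a.get(e) != schedule_b.get(e)]
        return 1 - len(disagreements) / len(elements), disagreements

    rate, diffs = coding_agreement(
        {"reaching": "frequently", "speaking": "constantly", "lifting": "25 lbs"},
        {"reaching": "occasionally", "speaking": "constantly", "lifting": "25 lbs"},
    )
    # rate is 2/3 and diffs is ["reaching"], one of the elements noted above
    # as showing the most variation between paired schedules.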
Data Analysis – Phase 3 Report Summary
In analyzing the microdata, BLS compared the values for elements for each SOC code with the values given in the DOT to
determine consistency and understand trends in the changing occupational requirements of certain jobs. Project staff
also compared the values for elements within each SOC code and family to determine internal consistency and improve
the ability of the edits system to identify outliers or regional variations in the data. Certain elements in particular SOC
codes were flagged for closer inspection to investigate the driving factors behind unusual coding combinations, such as
higher than expected weights being lifted and carried by nurses and hospital aides. Data mining, including text comment
searches, was carried out to identify and analyze the driving factors behind particular coding combinations such as high
SVP levels in occupations with low educational degree requirements. In addition, item non‐response rates, the
distribution of element coding within occupations that experienced procedural changes during Phase 3, and the frequency of edits and actions by field economists were analyzed. Further analysis of variations in coding by
industry, size class, ownership, and work setting was also performed.
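The sketch below illustrates the kind of ORS-to-DOT comparison described above, assuming a hypothetical common duration scale and a simple one-to-one match on SOC code; the real crosswalk between SOC and DOT codes and the element definitions are more involved.

    from statistics import median

    DURATION_SCALE = {"never": 0, "seldom": 1, "occasionally": 2,
                      "frequently": 3, "constantly": 4}

    def compare_to_dot(ors_codings_by_soc, dot_values_by_soc, element):
        # Reports SOC codes whose median ORS coding for the element differs
        # from the DOT value by a (hypothetical) tolerance of two scale steps.
        mismatches = {}
        for soc, codings in ors_codings_by_soc.items():
            ors_median = median(DURATION_SCALE[c[element]] for c in codings)
            dot_value = DURATION_SCALE[dot_values_by_soc[soc][element]]
            if abs(ors_median - dot_value) >= 2:
                mismatches[soc] = (ors_median, dot_value)
        return mismatches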
Analysis showed that of the 47 major ORS elements, 12 elements accounted for nearly 64% of the total questions asked
in review, with SVP generating the most questions. In our SOC analysis, of the 10 SOC codes that produced the most
questions, 5 of the 10 were in Major Occupational Group (MOG) 43 – Office and Administrative Support. This may indicate that qualifications for clerical jobs are harder to quantify, or that the edits in the data capture system may
need to be improved. Analysis of the efficiency burden test indicated that differences in coding occurred between the
field economist and the observer, but were not necessarily due to the collection process.
Phase 3 Test Coordination
For Phase 1 and 2 testing, BLS primarily used one test coordinator who was responsible for overseeing testing efforts
within the test city. The coordinator was responsible for determining field economist assignments; tracking progress
against testing targets; ensuring that milestones were achieved; serving as a liaison and facilitating communication;
seeking resolution to issues occurring during testing; etc.
Given that Phase 3 data collection occurred simultaneously within six test cities from across the nation, it was necessary
to alter the means by which data collection activities were coordinated in Phase 3. Rather than continuing to use one
test coordinator, BLS drew upon its existing regional managerial structure by having one Branch Chief or a regional
designee serve as test coordinator in each test city. Although the responsibility for coordinating Phase 3 testing was
assigned to six regional test coordinators, the coordination activities were similar to prior testing phases.
Training
In April 2013, training was conducted for the staff assigned as test coordinators for Phase 3 testing. The training was
conducted via webinar and covered the following topics: refining the establishment list, determining resources (staffing
and budgetary), securing logistics (i.e. space, materials/supplies, technology, telecommunications), making assignments
(collection, mentoring, and supplemental tests), facilitating communication (with field economists, observers, and
national managers), coordinating data collection activities including observational visits, tracking progress (mentoring,
collection and review), conducting debriefing sessions, and summarizing collection results.
Overall, the training was well received. Participants indicated that they received the information necessary to coordinate data collection activities within the test cities.
Data Collection Assignments and Appointments
Regional test coordinators received the list of establishments on April 3, 2013. Field economist collection assignments
were completed for all six test cities by April 11, 2013. Establishment screening sheets were prepared and delivered to
the test coordinators on April 15, 2013. The timing of these activities was planned in response to a Phase 2 debriefing finding that field economists did not have adequate time to make appointments prior to arriving in the test cities. This timing provided most field economists between two and three weeks to set appointments prior to arrival in the test cities. Some field economists made, and even collected, some appointments prior to the in‐person training for the test cities, which occurred the week of April 29; those field economists had only approximately one week to make appointments.
Several lessons learned from Phase 2 testing were provided during the test coordinator training in regard to making
assignments. These included setting individual targets for field economists, organizing assignments geographically,
giving one or two field economists all units within those ownership/industry sectors that were limited in number to
avoid the need to switch assignments as sector targets were met, keeping some units in reserve that could be assigned
to field economists who needed additional units to reach their individual targets, and striving to maximize the mix of
industries within each field economist’s assignment. Given that most of the test coordinators have experience managing the NCS, many drew upon that experience in determining how best to make assignments for Phase 3 testing, so the adoption of these lessons learned from prior ORS testing may not have occurred
in every test city. Regardless of the means by which assignments were made, all test coordinators met the deadline for
making assignments and continued to manage them throughout the testing period to work to ensure that as many
ownership‐industry targets as possible were achieved.
Most of the test coordinators adopted the best practices from earlier testing and retained a small pool of units that
could be assigned to field economists as needed to help them achieve their individual targets. However, it was noted
that the timing of the release of these additional units did not allow ample time to pursue them adequately and gain their cooperation in the test.
Test Coordinator Resources
During Phase 3 testing, several resources were available to the test coordinators to help them track progress. These
included:
Appointment Calendar – This SharePoint list was used by field economists to enter the appointments that they made. For each appointment, details including the schedule number, date and time, use for mentoring or a supplemental test, etc., were entered into the calendar. Views were established for the test
coordinators to see all appointments entered for their test city as well as a summary of appointments to
ownership‐industry targets.
Collection Portal – For each test city, a portal was established to help the test coordinators manage data
collection activities. The portal included the following information: collection and ORS announcements,
appointment calendar, links to Phase 3 documents/materials/resources, link to the data capture system,
ownership‐industry targets, current status of appointments, and the ORS calendar.
Review Portal – For each test city, a portal was established to help the test coordinators manage regional review
activities. This portal included the following: appointment calendar, ownership‐industry targets, current status
of appointments, review type and assignments, and questions/answers about collected schedules.
Progress Spreadsheets – These were provided weekly to the test coordinators to help with tracking progress of
collection (making and completing appointments), mentoring, supplemental tests, data entry and review. Such
progress was updated manually.
Progress Meetings – These were held bi‐weekly to discuss testing progress, share information about testing,
discover and address testing issues and questions, etc.
Test Coordinator Experiences
A debriefing was held on July 23, 2013 with the staff who served as regional test coordinators to gather their experiences, lessons learned, and suggestions for future testing. Overall, the regional test coordinators indicated that they enjoyed this role and project and that it was nice to work on something new and different. They also noted that while managing the test was not the same as managing a production survey, there were similarities. In addition to sharing that overall they liked the test coordinator role, they also described challenges encountered in it.
These included:
Having more than one observer at a time made coordination difficult as it was necessary to ensure that there
were enough scheduled appointments to support them.
Scheduled appointments were not always collected by the same field economist. Since the appointment
calendar tracked each appointment based on the field economist who entered it, this made it more difficult to track individual field economist progress and to ensure the correct review type was assigned to schedules, especially those that were part of the mentoring process.
Limiting appointments to one unit per company made it more difficult to achieve ownership‐industry targets in
some cases, particularly in the private industry services sector.
The compressed time frame for ORS testing, due to overlap with the NCS’ Employment Cost Index collection and the use of staffing resources for both efforts, did not create ideal conditions for mentoring staff new to ORS in Phase 3 or for follow‐up work to ensure that overall ownership‐industry targets were achieved.
It was hard to track the submittal of all of the supplemental feedback documents (i.e., respondent debrief, field economist post‐interview debrief, observer tool) and to follow up with staff participating in ORS Phase 3 testing as necessary.
It was more difficult to coordinate testing (monitoring progress, checking in with staff, having a central location for coordination, etc.) when the staff working on the test were not all located in the test city.
ORS testing required more staff travel, which added a level of complexity to ORS testing not typically
experienced in NCS collection.
Despite the challenges, the test coordinators thought Phase 3 testing was successful and indicated that there were best practices to carry into future testing, as well as lessons learned about changes that should be made.