Factors Affecting Response to the Occupational Employment
Statistics Survey
Presented at the 2007 Federal Committee on Statistical Methodology Conference
Arlington, VA, November 5, 2007

Polly A. Phipps and Carrie K. Jones
Bureau of Labor Statistics
[email protected]
[email protected]

Polly Phipps
Polly Phipps is a research statistician in the Office of Survey Methods Research at the Bureau of Labor
Statistics in Washington, DC. She has worked on a number of OES projects, including the brochure
experiment in 2006, OES program review in 2005, and response analysis surveys in 1988 and 1989 to
evaluate the first wage survey questionnaires. She conducts research on survey measurement error, data
quality measures, questionnaire design, and nonresponse.
Carrie Jones
Carrie Jones is an economist at the Bureau of Labor Statistics national office in Washington, DC. She
has worked in the Occupational Employment Statistics office for 11 years. She works on OES State
operations procedures, central printing, and operations research. Ms. Jones is currently the State
Operations and Analysis branch chief.


Introduction
Response rates are increasingly a source of concern for government agencies and other organizations conducting
establishment surveys. While there has not been a consistent pattern of increasing nonresponse, in the past
several decades it has become more difficult to achieve and maintain high response rates (Christianson and
Tortora, 1995; Interagency Group on Establishment Nonresponse, 1998). Although establishment surveys conducted
by the government have higher response rates on average than those of other organizations, government agencies are not
exempt from low response rates or declines in response to ongoing surveys. Given the concern with response
rates, one would expect to find a substantial literature on who participates in establishment surveys and who does
not. However, as many have noted, the literature on establishment survey participation is fairly small, and
quantitative studies are even less common. Our interest is in contributing to the literature by exploring survey
participation through an analysis of establishment characteristics and survey design and administration factors.
Ultimately, we hope that understanding these factors will provide direction on how to address nonresponse and
improve the quality of survey estimates.
For our analyses, we use the Occupational Employment Statistics (OES) survey, a semiannual establishment survey
measuring occupational employment and wage rates for wage and salary workers by industry for the U.S., States,
certain U.S. Territories, and Metropolitan Statistical Areas within States. This voluntary survey of establishments
with one or more employees is conducted by State workforce agencies in cooperation with the
Bureau of Labor Statistics. While the response rate achieved by OES is quite high at approximately 76 percent,
there is considerable variation by state, industry, and establishment size (Jones, 1999).
We first review the literature on establishment survey response and nonresponse, and then provide background on
the OES sample, data collection procedures, and state survey administration. Next, we describe OES response
rates over time by major areas of interest, including establishment size, size of metropolitan area, and industry
groups, and we describe patterns of nonresponse, including survey refusals. Finally, using data from the May
2006 OES survey, the Quarterly Census of Employment and Wages, and a survey of state administrative
practices, we use logistic regression models to predict the likelihood of survey response. We test the effect of a
number of conceptual factors on response to the OES survey, including establishment characteristics, such as
establishment age, multi-establishment firm status, industry, size, location; and survey design and administration
factors, including survey form type, nonresponse followup strategies, State staff composition, experience, and
turnover, and selection into other BLS surveys. A small percentage of OES data are collected centrally by the
BLS national and regional offices. Given the differences in collection methods, we do not use these data in this
paper, and for the same reason we exclude U.S. Territories.
Establishment Survey Response and Nonresponse
There is a large literature on household survey response rates, and while the corresponding establishment survey
literature is not as extensive, it covers many of the same topics. These include nonresponse standards
(Hidiroglou, Drew, and Gray, 1993), nonresponse trends (Interagency Group on Establishment Nonresponse,
1998), procedures or experiments designed to increase response (Moore and Baxter, 1993), and post-survey
adjustments for nonresponse (Sommers, Riesz, and Kashihara 2004), among others. The establishment survey
literature is limited in the number of studies that analyze the likelihood of participation using establishment and
survey administration characteristics. An excellent example of this type of research in household surveys is
Groves and Couper's (1998) analysis of factors that influence participation using six household surveys and
decennial census records. As a conceptual framework, the authors use features of the study population that are
not under the control of the survey researcher (social environment and household characteristics), and features
under researcher control (survey design and interviewer characteristics) to explore survey response.
Several studies have set out a theoretical framework and proposed factors that influence the likelihood that
establishments will respond to a survey request (Tomaskovic-Devey, Leiter, and Thompson, 1994; Willimack,
Nichols, and Sudman, 2002). Willimack, Nichols, and Sudman (2002) have proposed a theory for establishment
survey response that includes factors affecting the external environment, the business, the respondent, and the
survey design; the components are shown in Exhibit 1. Using Groves and Couper’s (1998) conceptual framework
they identify factors that are and are not under the control of the survey researcher. Those not under researcher
control include the external environment, the business, and respondent characteristics. Survey research
organizations can control sample and instrument design, other types of survey materials (e.g., instructions),
contact strategies, mode of administration, and timing, among others. Willimack et al. base their theory on
qualitative research conducted by the U.S. Census Bureau to study the reporting process in large multi-unit firms.
Willimack and colleagues propose under the external environment that weak economic conditions can affect
participation, since fewer staff may be available to complete a survey, and businesses may be more reluctant to
disclose information. Survey climate, i.e., the number of survey requests a business receives, may affect
response, as well as other reporting requirements that are a higher priority than survey participation. They
consider data availability a strong component of response, and related to a number of factors, including business
characteristics such as size, type, industry, ownership, and the availability of staff to respond to mandatory and
voluntary surveys. Respondent characteristics include having authority to provide data or delegate the task,
capacity or knowledge of the data, and motivation to attend to the task. Among the survey design characteristics,
the authors find that mode of administration, contact during high-workload time periods, and contact strategies to
prenotify or identify respondents are particularly important for unit response rates. Overall, Willimack and
colleagues propose that businesses weigh the burden of the survey response against business goals in their
decision on whether to participate, and that the external environment, business, respondent, and survey design are
factors in the weighing of survey burden and business goals.
Exhibit 1. Business Survey Participation (Willimack et al., 2002)

Out of researcher control:
  External environment: economic conditions; survey climate; legal/regulatory requirements; environmental dependence
  Business: data availability (business characteristics, organizational structure, management needs, regulatory reporting); company policy; resource availability
  Respondent: authority; capacity; motivation

Under researcher control:
  Survey design: sample; survey topic; instrument design; mode of administration; time schedules; contact strategies; respondent identification; legal authority; survey sponsor; confidentiality

Tomaskovic-Devey and colleagues (1994) propose that organization complexity is related to authority and
capacity to respond to a survey, and the organizational environment influences the capacity and motive to
respond. Their view of authority, capacity, and motivation is associated with the larger organization, rather than
the respondent. Authority to respond can be formal or informal, organizational capacity refers to practices and
processes tied to assembling the requested information, and there can be individual or organizational motives
regarding information disclosure. They test their theory using survey data from a North Carolina employment and
health survey and establishment public records. The authors find that establishments that are subsidiaries, are large,
have higher profits, and operate in markets with greater sales concentration are less likely to respond, while
establishments with high R&D intensity and those in price-regulated, safety-regulated, and publicly traded industries
are more likely to respond. They did not find industry significantly associated with response after controlling for
organizational factors. They conclude that motive measures are most important in explaining response:
establishments in profitable and concentrated markets are more independent of their environment and less likely
to respond to a survey request, while price-regulated, safety-regulated, and publicly traded industries have a
greater motive to cooperate and shape public
opinion. For capacity measures, they argue that increased establishment size is a reflection of dispersal of
information and less capacity to respond.
The conceptual frameworks for establishment survey response discussed above have had very limited testing.
One of the reasons for the lack of empirical studies is likely a limited number of explanatory variables available in
the survey data. An exception to this is a study by Potter (2000) analyzing nonresponse characteristics using the
1996 Nursing Home component of the Medical Expenditure Panel Survey. In this study, market, establishment,
and survey administration characteristics were tested. Market characteristics included a state-level Medicaid
reimbursement measure, and county-level data items for the establishment location: rural/urban, market
environment (hospital and nursing home beds per capita and percent population 75 and above, percent for profit
nursing home beds), and county health status (mortality rate). Characteristics of the nursing home establishment
included type of ownership, number of beds and residents, and federal certification for reimbursement under
Medicare. Survey design characteristics were twofold: endorsement by the state nursing home association and
interviewing field cost strata. Interviewer characteristics included demographics, work experience and caseload.
Significant predictors of nonresponse included two market measures: a flat-rate Medicaid reimbursement, as
opposed to a more generous reimbursement method, and location in counties with a lower supply of hospital beds.
Related to the market measures, whether the nursing home was hospital based increased the likelihood of response,
as did location in the Midwest, compared to the Northeast, South, or West. One survey design characteristic, areas
requiring an overnight stay to collect data compared to larger clusters of cases, increased response. And a
number of interviewer characteristics were associated with lower nonresponse, including white interviewers,
interviewers with some college, and greater interviewer experience.
An analysis of the Schools and Staffing Survey, an establishment survey sponsored by the National Center for
Education Statistics (1997), used logistic regression analysis to predict response for public schools. Univariate
analyses showed that minority enrollment, region, urban/rural location, school level, size, and type significantly
affected response. However, the multivariate analysis found that only three factors (school level, size, and type)
had a significant effect on nonresponse. The researchers found that secondary schools were more likely to respond
than combined secondary and elementary schools and elementary-only schools; small schools were more likely to
respond than larger schools; and schools offering regular instruction, as opposed to special instruction in
vocational, special, or alternative courses, were more likely to respond.
Several other studies have focused on establishment nonresponse. Sommers and colleagues (Sommers, Riesz, and
Kashihara, 2004) found that establishment employment, state, industry, age of firm, single or multi-unit firm,
urban/rural county, and average wage were significant in predicting response. Tulp, Hoy, Kusch and Cole (198)
used an experimental design to test the effect of mandatory and voluntary reporting. They found that mandatory
reporting was more effective in obtaining higher response for establishments overall, for establishments new to
the survey, and establishments who had previous survey exposure under mandatory conditions. Respondent
identification has been explored by Moore and Baxter (1993), who found mixed results for the use of a contact
name: small businesses with a contact name had higher response, particularly in the wholesale, finance/real
estate/insurance, and small service sectors, while having or not having a contact name did not affect large business
response.
While conceptual frameworks have been offered and some empirical studies have explored participation in
establishment surveys, many more studies are necessary to understand the dimensions of establishment survey
participation. In fact, a 1998 Interagency Group on Establishment Nonresponse listed research on the
characteristics and correlates of nonresponse as an area in need of development. This analysis is the first step in
exploring and attempting to model OES survey participation.
OES Background
The Occupational Employment Statistics (OES) survey is primarily a mail survey. Data are collected by analysts
in State workforce agencies, in cooperation with the Bureau of Labor Statistics. For survey administration
purposes the State OES offices are grouped into six regions. Each region has a BLS office, and BLS personnel
are assigned to guide, monitor, and assist the State OES offices.
Respondents report the number of employees by occupation and wage ranges. The occupational employment and
wage data from sampled establishments are used to calculate employment estimates for nearly 800 occupations
annually for the 50 States, the District of Columbia, Puerto Rico, the US Virgin Islands, and Guam, as well as the
nation as a whole. OES also produces employment and wage estimates for Metropolitan Statistical Areas (MSAs)
and specific industries. Occupations are classified using the Standard Occupational Classification (SOC) system
while industries are classified using the North American Industry Classification System (NAICS).
The OES Sample
The survey is conducted over a rolling cycle of six semiannual panels (3 years). Each panel's sample contains
approximately 200,000 establishments, so approximately 1.2 million establishments are sampled over the course
of a full 6-panel cycle. When possible, non-government establishments are sampled only once every six
panels. A census of the Federal government, executive branch only, is taken for every panel. A census of State
government units is taken every November.
The sample is drawn from a universe of about 6.5 million establishments across all non-farm industries. The
sample is stratified by geography, industry, and employment size. The sample frame comes from Unemployment
Insurance (UI) reports filed by almost all establishments. Only establishments in Guam as well as the railroad
industry are exempt from mandatory UI filing; the frame for those units is obtained elsewhere.
Data Collection
The OES survey collection instrument consists of 97 industry-specific survey forms used for medium and large
sized firms and one open-ended survey form used for smaller firms. Respondents report employment data by
occupation across 12 wage ranges, using a matrix format. The industry-specific forms have occupations already
printed on the form and range in length from 16 to 24 pages, as shown in Exhibit 2. In addition, there is one
32-page form for colleges and universities and a 44-page form for government units. The occupations on each form
are selected based on industry staffing patterns derived from previously collected data. Most survey forms cover
a 3-digit NAICS industry; however, some forms cover only a 4-digit NAICS industry due to heterogeneous staffing
patterns. The 4-page open-ended form, in Exhibit 3, has space for respondents to write in the occupations found
in their firms. This form is used primarily for small establishments, and each state defines its own value for
"small"; the top value ranges from 9 to 99 employees, depending on the state.
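As a concrete illustration of this form-assignment rule, here is a minimal Python sketch; the function name, parameters, and NAICS truncation are hypothetical illustrations, not OES production logic.

```python
# Hypothetical sketch of the form-assignment rule described above.
def select_form(employment: int, state_small_cutoff: int, naics: str) -> str:
    """Pick a survey form for a sampled establishment.

    state_small_cutoff is the state-defined top of the "small" range,
    which varies from 9 to 99 employees depending on the state.
    """
    if employment <= state_small_cutoff:
        # Small establishments receive the 4-page open-ended write-in form.
        return "open-ended form"
    # Larger establishments receive a form with occupations pre-printed for
    # their industry; most forms cover a 3-digit NAICS industry.
    return f"industry-specific form (NAICS {naics[:3]})"

print(select_form(12, 19, "311811"))   # -> open-ended form
print(select_form(250, 19, "311811"))  # -> industry-specific form (NAICS 311)
```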


Exhibit 2. Example of occupation found on an industry-specific form

Exhibit 3. Example of space found on the open-ended write-in form

The OES survey is initially mailed to almost all establishments in the sample. The initial mailing is done by a
central mail facility and occurs as close to the survey reference date as possible: either November 12th or May
12th. Three follow-up mailings are sent to nonrespondents at approximately 3-4 week intervals. The initial
mailing as well as the first two follow-up mailings use a mix of industry-specific survey forms with occupations
already printed on them for the larger firms as well as the open-ended form for the smaller establishments. The
last mailing uses only the open-ended survey form regardless of establishment size. Telephone follow-up calls
are made to nonrespondents. Some data for larger establishments are collected via personal visits. Other modes
of collection include email, phone-in, facsimile, and electronic media such as disc or tape. The percentage of total
responses returned via each collection mode for the May 2006 panel is shown in Exhibit 4.
Exhibit 4. Respondent Collection Mode, May 2006

Collection Mode          Percent
Mail                     71.9%
Phone Call               11.8%
E-mail                    7.1%
Fax                       3.9%
Electronic unspecified    3.6%
Hard copy printout        0.9%
Diskette, CD, DVD         0.6%
Personal visit            0.2%

State Survey Administration
State agencies follow general federal guidelines in conducting the OES survey, but states are allowed flexibility
and in turn, utilize different practices and procedures. In addition, state sample sizes vary dramatically. For
example, Wyoming, with a sample of 743 establishments, accounts for 0.4 percent of the OES sample, while
California, with 15,691 establishments in the sample, accounts for 8.8 percent (see Appendix 1, Table 1). Since
states vary in size and practice, we gathered information about states and state survey administration. These data
were provided by the BLS regional offices and included information on staff composition, staff vacancies, size of
the staff, management structure, and procedures used during the May 2006 survey panel. The full set of results is
shown in Appendix 1, Table 2, and we highlight some results below.
Staffing is an important part of survey administration. In May 2006, the number of full-time equivalent
positions in States funded by BLS ranged from 1.3 to 18, with an average of 5.1 positions. On average, about
three out of five positions were managerial or professional positions. About 60 percent of state personnel in
management positions had over six years of OES experience, while approximately 32 percent of non-management
staff had over six years of experience. Approximately 35 percent of states had some unfilled positions during the
May 2006 panel, and 22 percent used staff from other programs, while only six percent hired temporary staff.
As discussed, states can utilize different survey procedures. The timing of telephone nonresponse followup varies
by state: approximately 57 percent of states begin telephone followup calls after the first survey mailing, 24
percent after the second mailing, and 20 percent after the third or fourth mailing. Over 40 percent of states mail a
nonresponse follow up letter to potential respondents at some point in survey administration – about 18 percent of
states mail it to all nonrespondents, while 25 percent of states target specific firms or industries for the letter.
Over 75 percent of states did not experience mail or other major survey administrative problems in the May 2006
panel.
Historical OES Response Rates
OES response rates are quite high and fairly consistent over time, as shown in Exhibit 5. For most years the
response rate for the November panel is slightly higher when compared to the May panel. This boost is due in
part to the inclusion of State government data in November panels. State government units are often quite large,
and their data are easier for the State office to collect from co-workers in the State's personnel office. Response
rates for
the May panel show a small decline from 78.4 to 76.5 percent from 2003 to 2006.
Exhibit 5. OES Response Rates for Recent Panels

Panel       Response Rate
Nov 2002    78.1%
May 2003    78.4%
Nov 2003    77.7%
May 2004    78.1%
Nov 2004    77.9%
May 2005    77.9%
Nov 2005    76.8%
May 2006    76.5%

Response Rates by Employment Size
Response rates grouped by establishment size show that small establishments have much higher rates than large
establishments, with differences of up to 30 percentage points. Exhibit 6 also shows small declines in response
rates over time for establishments with five to 49 employees, but a less consistent trend in larger firms.
In fact, firms with 250 to over 1,000 employees show some increases in the response rates over time. It is
assumed that larger firms are more likely to have the technology to provide data by means of electronic filing and
they are more likely to use it when completing the OES survey. In addition, many of the establishments in the
larger size classes have staff dedicated to completing government forms and surveys (Willimack et al., 2002).
Also, many State offices have diligent analysts who seek out a contact person in large establishments and work at
creating and maintaining a cooperative relationship and rapport with the contact in order to facilitate data
collection.
Exhibit 6. OES Response Rates by Establishment Size Class (chart: response rates for the Nov 2002 through May
2006 panels by size class: 1-4, 5-9, 10-19, 20-49, 50-99, 100-249, 250-499, 500-999, 1000+)

Response rates for State offices collecting OES data range between 57 percent and 91 percent, as shown in Exhibit 7
and in Table 1 in the Appendix. State partners that collect the data are required to meet a 75 percent response
rate in each panel, in either establishments or employment, as well as in each Metropolitan Statistical Area
(MSA). In Oklahoma, North Carolina and South Carolina, responding to the OES survey is mandatory. OES
response rates mapped out across the nation do not reveal any geographic pattern or indication of survey
administration differences. Looking at the six regional office territories (Boston, Philadelphia, Atlanta, Chicago,
Dallas, and San Francisco) also does not reveal any clear pattern that might indicate survey administration
differences. However, the Atlanta and Dallas regional offices have the highest response rates, with Chicago third
overall (Appendix 1, Table 3), which could indicate survey administration differences or regional differences,
as the South and Midwest often have higher response rates than the Northeast and West.
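Because the 75 percent requirement can be met in either establishments or employment, the two rates can diverge sharply when large units do not respond. The following sketch, using made-up data and hypothetical column names, shows both calculations.

```python
# Hypothetical illustration of unit versus employment-weighted response rates.
import pandas as pd

panel = pd.DataFrame({
    "responded":  [1, 0, 1, 1, 0],
    "employment": [8, 1200, 45, 300, 15],
})

# Unit rate: share of sampled establishments that responded.
unit_rate = panel["responded"].mean()

# Employment rate: share of sampled employment in responding units.
employment_rate = (
    panel.loc[panel["responded"] == 1, "employment"].sum()
    / panel["employment"].sum()
)
print(f"unit response rate:       {unit_rate:.1%}")        # 60.0%
print(f"employment response rate: {employment_rate:.1%}")  # 22.5%
```

A single large nonrespondent (the 1,200-employee unit in this made-up panel) can pull the employment-based rate far below the unit rate.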


Exhibit 7. State OES Response Rates – May 2006 (map; legend categories: Over 85%, 78%-85%, 75%-77.9%,
70%-74.9%, Under 70%)

Response Rates by MSA
State analysts suggest that the larger the Metropolitan Statistical Area (MSA), the harder it is to collect data.
They indicate that establishments in larger MSAs are less likely to respond by mail and are also difficult to reach
during telephone follow-ups. Response rates by MSA, shown in Exhibit 8, indicate this to be true; only 66
percent of respondents in MSAs of one million or more population reported by mail in the May 2006 panel,
compared to 76.5 percent of all respondents (see Exhibit 5). Response rates for non-MSAs and smaller MSAs are
in the lower 80s, while the larger MSAs are in the lower 70s.
State analysts report that contacts in firms in the larger MSAs often complain that they are too busy to respond.
The environment of firms found in larger MSAs or perhaps the environment of the larger MSAs themselves
seems to influence whether or not a firm participates in the survey. State analysts report that it takes many
phone calls and a great deal of persuasion to collect data from these firms.


Exhibit 8. OES Response Rates by Size of MSA (chart: response rates for the Nov 2002 through May 2006 panels
by MSA size category: Non-MSA, Small, Small-Medium, Medium-Large, Large, Very Large)

Response Rates by Industry
Response rates by industry group show some differences (Exhibit 9), but not nearly the differences seen by size
class or MSA. The information services industry has the lowest response rates, while other services and
government show the highest. In recent November panels the response rates for information services and finance,
insurance, and real estate (FIRE) have been slightly higher than in May panels. Informal interviews suggest that
this could be attributed to good timing; those industries, especially FIRE, tend to be involved in closing out their
fiscal year accounting during the November collection period and find it easier to submit data they are already
working on.
Exhibit 9. OES Response Rates by Industry (chart: response rates for the Nov 2002 through May 2006 panels by
industry group: natural resources and mining; construction; manufacturing; trade, transportation, and utilities;
information; FIRE; professional and business services; education and health services; leisure and hospitality;
other services; government)

Nonresponse Rates
Nonrespondents to the OES include establishments that do not mail back the survey form, those that communicate
their non-participation (refusals), those that do not return phone calls, and those that submit incomplete
employment data. Nonresponse rates have been consistent over time; they have varied between 21.6 percent and
23.5 percent.
State OES nonresponse rates vary a great deal, between 9 percent and 43 percent. Anecdotally, State analysts
attribute the differences in the level of nonresponse to the size of the sample the State must collect, the number of
larger MSAs the State has (which State analysts believe negatively impacts the likelihood an establishment will
agree to participate), and the number of larger units in the State samples. Regional personnel also cite varying
levels of expertise, different State operational practices, and personnel shortages and issues as additional factors
that affect nonresponse.
Nonresponse is lowest for firms with fewer than five employees. Informal interviews with State analysts suggest
that this is at least partially attributable to State analysts' preference for smaller firms. Contacting appropriate
payroll personnel in these units is often easier, and these firms have smaller amounts of data, so they are easier
to code into the OES system. During a push to meet the required 75 percent response rate, States will often
concentrate on collecting data from the smallest establishments. Nonresponse peaks for larger establishments,
those with 250 to 999 employees, and subsides slightly for the largest establishments, those with more than
1,000 employees.

Refusal Rates
Establishments that communicate their desire to not participate in the OES Survey are classified as refusals. In
recent panels refusal rates range between 2.7 percent and 3.7 percent of the sample. As a portion of the overall
nonresponse rate, refusals have ranged between 12.5 percent and 16.6 percent.
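The two refusal figures are linked by simple division. Using the May 2006 values as an illustration, refusals of 3.5 percent of the sample and an implied nonresponse rate of about 23.2 percent (consistent with the 21.6 to 23.5 percent range noted above) give

\[
\frac{3.5\ \text{percent of sample}}{23.2\ \text{percent nonresponse}} \approx 15.1\ \text{percent of nonresponse}.
\]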
Exhibit 10. Refusal Rates

Panel/Year   % of Sample   % of Nonresponse
Nov 2002     3.0%          13.7%
May 2003     3.3%          15.5%
Nov 2003     3.4%          15.3%
May 2004     3.5%          15.7%
Nov 2004     2.7%          12.5%
May 2005     3.5%          15.0%
Nov 2005     3.7%          16.6%
May 2006     3.5%          15.1%

State refusal rates, shown in Exhibit 11, range between 0.0 percent and 11.8 percent. There is substantial
anecdotal evidence that lower refusal rates, and lower nonresponse rates in general, are tied to the expertise and
"people skills" of individual State analysts.
Refusal rates mapped out across the nation again do not reveal any strong geographic or regional pattern, as
shown below.


Exhibit 11. State Refusal Rates – May 2006 (map; legend categories: Less than 1%, 1%-2.9%, 3%-5.9%,
More than 6%, not available)

Refusal rates by establishment employment size class for the May 2006 panel range between two and seven
percent, as shown in Exhibit 12, and generally rise with establishment size. Mirroring the pattern observed with
overall response rates, refusal rates decrease slightly for the largest establishments. This may be due to larger
establishments having personnel assigned to complete government paperwork and reports.
Exhibit 12. Percentages of OES Sample Resulting in Refusals by Establishment Size Class (chart: refusal rates
for the Nov 2002 through May 2006 panels by size class, 1-4 through 1000+)

Multivariate Data and Analysis
To analyze survey participation, we use data from the May 2006 OES panel. We include establishments in the
sample that are collected by BLS partners in the United States, which covers establishments in the 50 states and
the District of Columbia (N=179,000 establishments). We exclude one industry and one group of
multi-establishment firms, due to different data collection procedures: the federal government, which is centrally
collected by the BLS national office, and establishments whose data are centrally collected by regional offices
through a special arrangement with some multi-establishment firms.
We organize our analyses using the framework outlined by Willimack and colleagues, that of establishment or
business, survey administration, and external environment characteristics. We are not able to include respondent
characteristics, but hope to do so in future analyses. Respondent characteristics, such as contact name, title, and
department, are in overlapping OES text data fields and are difficult to separate. Exhibit 13 includes the variables
that we have available to test under each area.
In addition to the OES establishment characteristics discussed earlier (employment size, industry, metropolitan
statistical area size), we include in the analysis five additional establishment characteristics, listed in the first
column of Exhibit 13, from the BLS Quarterly Census of Employment and Wages (QCEW) establishment frame
tied to the collection of state unemployment insurance tax data. The additional items include whether an
establishment is part of a multi-establishment firm that crosses states and/or is part of a multi-establishment firm
within the state, how many state unemployment insurance accounts are attached to the multi-establishment firm,
whether the establishment provides support services to other establishments in a firm, and the age of the firm
(measured by the first unemployment insurance liability date). The response rates for these data items in the May
2006 panel are reported in Table 3 in the appendix. To summarize: lower response rates are found for
multi-establishment firms, either across states or within a state, and for establishments providing support services
for a firm, and higher response is observed as an establishment increases in age.
The survey administration characteristics listed in column 2 of Exhibit 13 originate in the OES data and in the
May 2006 state questionnaire discussed earlier. From the OES data, we use BLS region, state sample size,
whether the survey is mandatory in a state, and whether an unstructured or industry-specific form was sent
by the state partner. From the QCEW, we use a data item that indicates whether any establishment wage or
employment data were missing, imputed, or of problematic quality, to indicate a pattern of problem reporting for
the establishment. Also from the QCEW, we are able to determine whether the establishment was in the sample
for another BLS survey, the monthly Current Employment Statistics (CES) survey, to assess potential burden.
Response rates for these data items are listed in Table 3 of the appendix and show that being in the CES sample,
having missing or imputed UI/QCEW data, and receiving an industry-specific form (the latter is highly associated
with employment size) reduce response rates, while mandatory state surveys have higher response rates. From
the state questionnaire, we use data items on staffing, data collection practices and problems, and state
government events, listed in Table 2 in the appendix.


Exhibit 13

Establishment characteristics: employment size; industry; MSA size; State multi-establishment; number of State UI accounts; U.S. multi-establishment; provides auxiliary support services to other company establishments; establishment age

Survey administration/design: BLS region; state sample size; state mandatory survey; survey form type; missing or imputed UI wage or employment data; in CES sample; staff number, composition, experience; staff reductions, use of non-regular staff; data collection practices; nonresponse followup timing; survey administration problems; state govt/agency events

External environment: in CES sample; employment size; MSA size; significant state economic change; state population change 05-06; state revenue change 05-06

We have few variables to measure the external environment of the establishment: whether the establishment is in
the CES sample, to measure the survey environment; employment size; MSA; whether there were significant
state economic events, as measured by the state questionnaire; the state population change from 2005-06, from
Bureau of the Census data; and the 2005-06 state general fund revenue change, from the Association of State
Budget Officers.
We use logistic regression models to fit the response outcomes, predicting the probability that an establishment
responded. We provide the chi-square statistics and significance levels for each data item, and the exponentiated
value of the coefficient, which can be interpreted as an odds ratio (values greater than one indicate an
improvement in response). Finally, we compare different models using the rescaled R2 for variance explained
and the likelihood ratio.
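As a minimal sketch of this approach (not the authors' actual code; the "max-rescaled R square" label in Exhibit 22 suggests SAS PROC LOGISTIC output), the odds ratios and fit statistics could be computed as follows, here with synthetic data and hypothetical variable names.

```python
# Minimal sketch of the modeling approach; data and variable names are
# illustrative, not the OES analysis file.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "size_class": rng.choice(["1-9", "10-49", "50-99", "1000+"], size=n),
    "multi_state": rng.integers(0, 2, size=n),
    "est_age": rng.uniform(1, 30, size=n),
})
# Synthetic response indicator: smaller, older, single-state units respond more.
logit_p = (0.5 + 1.2 * (df["size_class"] == "1-9") - 0.8 * df["multi_state"]
           + 0.02 * df["est_age"])
df["responded"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))).astype(int)

result = smf.logit("responded ~ C(size_class) + multi_state + est_age", df).fit()
print(np.exp(result.params))  # odds ratios: values > 1 favor response

# Fit statistics like those in Exhibit 22.
lr = 2 * (result.llf - result.llnull)                           # likelihood ratio
r2_cs = 1 - np.exp(-lr / result.nobs)                           # Cox-Snell R-square
r2_max = r2_cs / (1 - np.exp(2 * result.llnull / result.nobs))  # max-rescaled
print(lr, r2_max)
```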
Establishment Model Results
Exhibit 14 provides the chi-square statistics and significance levels of the variables in the establishment model.
Employment size, followed by industry and whether or not the establishment is part of a multi-state firm, have
the largest chi-square values, with employment size much larger than the other variables. These are followed by
the age of the firm, MSA size, number of state UI accounts, whether the establishment is part of a state multi-unit
firm, and auxiliary status. The odds ratios, shown in Exhibit 15, indicate that having an employment size of less
than 100 increases the likelihood of response. Being outside an MSA or in an MSA with a lower population
increases the probability of response compared to the most populous MSA category; however, the trend is not
linear, due to a higher likelihood of response from establishments in MSAs with 500,000-999,999 persons. The
results for industry show that information and finance have a lower probability of response than local government
establishments. One can speculate that many of the white-collar industries within finance and information are
likely to have fairly well developed records systems for reporting data, so this finding is contrary to expectation.
While all industries have a lower probability of response than local government, a number of service industries
(education and health, leisure and hospitality, and all other services) have a higher probability of responding
compared to manufacturing. Manufacturing is an industry considered to have a history of strong record-keeping
practices, so this finding is also contrary to expectation. Establishment age, perhaps associated with better
reporting capabilities and more established staffing, increases the likelihood of response, as does the number of
state UI accounts associated with an establishment. However, being part of a multi-unit firm, either across states
or within a state, reduces the likelihood of response compared to single-unit establishments, as does providing
support services to other establishments in a firm.
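To make the odds ratios concrete: taking the 4.25 odds ratio for establishments with 1-9 employees from Exhibit 15 and assuming, purely for illustration, a baseline response probability of 0.60 for the reference group (1,000 or more employees), the implied probability of response for a 1-9 employee establishment is

\[
\mathrm{odds}_{\mathrm{ref}} = \frac{0.60}{0.40} = 1.5, \qquad
\mathrm{odds}_{1\text{-}9} = 4.25 \times 1.5 = 6.375, \qquad
p_{1\text{-}9} = \frac{6.375}{1 + 6.375} \approx 0.86.
\]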


Exhibit 14. Establishment Model

Variable                       Degrees of Freedom   Chi-Square   Pr > ChiSq
Employment Size                       5               5302.71      <.0001
MSA                                   5                149.70      <.0001
Industry                             10               1418.57      <.0001
Multi-State unit                      1               1272.05      <.0001
State Multi-unit                      1                 55.80      <.0001
No. of state UI accounts              1                 91.64      <.0001
Auxiliary status                      1                 12.83      0.0003
Establishment age                     1                348.15      <.0001

Exhibit 15. Establishment Model, Odds Ratios

Parameter                        Pr > ChiSq   Odds Ratio
Intercept                          .0879         1.17
Employment Size
  1-9                              <.0001        4.25
  10-49                            <.0001        2.01
  50-99                            <.0001        1.38
  100-249                          .1268         1.10
  250-999                          .7570         1.02
  1000+ (reference)                              1.0
MSA
  Not MSA                          <.0001        1.43
  50-149,999                       <.0001        1.31
  150-249,999                      <.0001        1.29
  250-499,999                      <.0001        1.25
  500-999,999                      <.0001        1.33
  1,000,000+ (reference)                         1.0
Industry
  Nat res, mining                  <.0001        0.47
  Construction                     <.0001        0.58
  Mfg                              <.0001        0.55
  Trade, trans, utility            <.0001        0.56
  Information                      <.0001        0.32
  Finance                          <.0001        0.42
  Prof & bus                       <.0001        0.50
  Educ, health                     <.0001        0.75
  Leisure, hospitality             <.0001        0.64
  Other services                   <.0001        0.63
  Local government (reference)                   1.0
Establishment age                  <.0001        1.01
No. state UI accounts              <.0001        1.01
Auxiliary status
  Not auxiliary                    0.0003        1.20
  Auxiliary (reference)                          1.0
US multi-unit
  Single unit                      <.0001        1.85
  Multi unit (reference)                         1.0
State multi-unit
  Single unit                      <.0001        1.12
  Multi unit (reference)                         1.0

Survey Administration Model Results
Exhibits 16 and 17 display the results of the survey administration model. Survey form type, which is highly
associated with the employment size of the firm, has a very high chi-square value, with the longer industry form
having a much lower probability of response, controlling for all other survey administration variables.
Establishments in states that do not have a mandatory survey have a lower probability of response, as would be
expected. Whether or not an establishment is in the CES sample is not significant in this model. The BLS regional
results may reflect administrative practices, but could also reflect geographic differences in responding. The data
show that establishments in the Atlanta, Philadelphia, and Chicago regions have a higher probability of response,
and Boston and Dallas a lower probability, compared to San Francisco. Having a higher percent of managerial
staff and no decrease in staff positions increases the likelihood of response, but other staffing variables do not
show the same pattern. The number of positions, having staff and managers with greater than four years of
experience, unfilled positions, and using temporary staff or staff from other programs lower the probability of
response. One can speculate that staff and managers with the greatest tenure and not having unfilled positions
could be associated with burnout, and that using staff outside of regular staffing could be associated with
inexperience. However, more testing of staffing variables using different cut-offs is important to understanding
the patterns. The timing of first telephone followup calls indicates that calling after the first mailing is most
important in predicting response. Other survey administration variables are difficult to assess; for example, the
greater the number of data collection practices utilized, the lower the probability of response. It may be that states
with more difficulty in reaching higher response rates utilize more of the practices. Also, establishments in states
reporting no administrative problems and no state government or agency events have a lower probability of
response. It may be that states reporting those problems, particularly administrative problems, have efficiently
identified problems and are comfortable relating them to regional personnel (who collected the state questionnaire
data).

Exhibit 16. Survey Administration Model

Variable                            Degrees of Freedom   Chi-Square   Pr > ChiSq
BLS Region                                 5               430.37       <.0001
State sample size                          1                95.84       <.0001
State mandatory survey                     1               444.00       <.0001
Survey form type                           1              5099.04       <.0001
Missing/imputed UI data                    1                23.45       <.0001
CES sample                                 1                 1.29       0.2566
Staff FTE positions                        1               165.59       <.0001
Percent managerial                         1               348.15       <.0001
Has staff with 4+ yrs exp                  1               112.58       <.0001
Has managers with 4+ yrs exp               1                62.32       <.0001
Unfilled positions                         1               176.04       <.0001
Used non-regular staff                     1               132.18       <.0001
Had decrease in FTE                        1                74.32       <.0001
No. of data collection practices           1                27.56       <.0001
Timing of telephone followup               3               267.62       <.0001
Survey admin problems                      1                18.68       <.0001
State govt/agency events                   1                52.84       <.0001

Exhibit 17. Survey Administration Model, Odds Ratios

Parameter                               Pr > ChiSq   Odds Ratio
Intercept                                 <.0001        11.57
BLS Region
  Boston                                  <.0001          .83
  Philadelphia                            <.0001         1.28
  Atlanta                                 <.0001         1.62
  Chicago                                 <.0001         1.28
  Dallas                                  .2928           .97
  San Francisco (reference)                              1.0
State sample size                         <.0001         1.0
Mandatory survey
  Not mandatory state                     <.0001          .38
  Mandatory state (reference)                            1.0
Survey form type
  Industry form                           <.0001          .42
  Unstructured form (reference)                          1.0
UI data quality
  No missing/imputed UI data              <.0001         1.13
  Missing/imputed (reference)                            1.0
CES sample
  Not in CES sample                       <.0001         1.02
  In CES sample (reference)                              1.0
No. FTE positions                         <.0001          .91
Percent managerial                        <.0001         1.48
Staff experience
  No staff with 4+ yrs exp                <.0001         1.27
  Staff with 4+ yrs exp (reference)                      1.0
Manager experience
  No managers with 4+ yrs exp             <.0001         1.14
  Managers with 4+ yrs exp (reference)                   1.0
Unfilled positions
  No unfilled positions                   <.0001          .72
  Unfilled positions (reference)                         1.0
Non-regular staff
  No non-regular staff                    <.0001          .77
  Used non-regular staff (reference)                     1.0
Decrease in FTE
  No decrease in FTE                      <.0001         1.29
  Had decrease in FTE (reference)                        1.0
No. of data collection practices          <.0001          .95
Phone followup begins after
  1st mailing                             <.0001         1.34
  2nd mailing                             <.0001          .82
  3rd mailing                             <.0001         1.08
  4th mailing (reference)                                1.0
Admin problems
  No admin problems                       <.0001          .90
  Admin problems (reference)                             1.0
State govt/agency events
  No state gov/agency events              <.0001          .84
  State gov/agency events (reference)                    1.0

External Environment Model Results
The external environment model shows that employment size has the greatest impact, as measured by the
chi-square values shown in Exhibit 18. Employment size in one form or another has the greatest impact overall in
the models, and we hope to investigate the role of size further in future analyses, using interactions. Clearly, the
size of the establishment affects the reporting environment, with larger establishments having more government
reporting, and perhaps less commitment to completing voluntary surveys. Increasing MSA size decreases the
probability of response, again with the exception of the 500,000-999,999 category. Establishments in larger MSAs
may operate in a cultural environment that reduces the likelihood of response, although one can speculate that
establishments in smaller MSAs might face more difficult economic conditions that could discourage response.
We are exploring adding other economic survey data items that might measure market competitiveness by
industry and MSA, which could improve this model. Significant state economic changes reduced the likelihood
of response, state population change had a small effect, and state revenue changes were not significant. Finally,
not being in the CES sample lowers the probability of response, an item that was insignificant in the
survey administration model. This is somewhat contrary to the expectation that establishments operating in a
survey environment that is demanding in reporting requirements might be less likely to participate. However, one
can argue that greater reporting requirements are handled best by those that face the greatest burden, given
staffing and records-keeping capabilities.
Exhibit 18. External Environment Model

Variable                            Degrees of Freedom   Chi-Square   Pr > ChiSq
In CES sample                              1                43.08       <.0001
Employment size                            5              6297.28       <.0001
MSA size                                   5               745.18       <.0001
Significant state economic change          1               345.01       <.0001
State population change 05-06              1                74.94       <.0001
State revenue change 05-06                 1                 1.59       0.2076

Exhibit 19. External Environment Model, Odds Ratios

Parameter                                  Pr > ChiSq   Odds Ratio
Intercept                                    .3428          .94
CES sample
  Not in CES                                 <.0001         .89
  In CES sample (reference)                                 1.0
Employment Size
  1-9                                        <.0001        3.93
  10-49                                      <.0001        1.89
  50-99                                      <.0001        1.28
  100-249                                    .8933          .99
  250-999                                    .0502          .89
  1000+ (reference)                                        1.0
MSA
  Not MSA                                    <.0001        1.51
  50-149,999                                 <.0001        1.34
  150-249,999                                <.0001        1.32
  250-499,999                                <.0001        1.27
  500-999,999                                <.0001        1.32
  1,000,000+ (reference)                                   1.0
Economic change
  No significant economic change             <.0001        1.65
  Significant economic change (reference)                  1.0
State pop change                             <.0001        1.0
State revenue change                         0.2076        1.0

Full Model Results
Exhibits 20 and 21 show the results of including all variables from the three models. The direction of nearly
all the variables remains the same, with a few minor exceptions. Employment size has the largest chi-square
value; although smaller than before, the trend is consistent in that establishments with fewer than 100 employees
are more likely to respond. Other items with large values tied to the establishment include whether the
establishment is part of a firm that crosses states, industry type, establishment age, and MSA. These variables are
in the same direction as in the establishment model, in that multi-state status decreases, but age increases, the
probability of response. Information and finance have the lowest probabilities of response, while many of the
services are more likely to respond than the mining, construction, manufacturing, and transportation industries;
overall, all industries have a lower response than local government. MSA results are mixed, with not being
located in an MSA associated with the highest response. Survey administration results are very similar, with
region, mandatory survey, form type, telephone followup timing, unfilled positions, and use of non-regular staff
having a large impact. Telephone followup timing shows a trend of the greatest probability of response after the
first mailing, with the second and third mailings also increasing response, compared to followup after the final
mailing. External environment variables are in the same direction, with significant state economic factors having
a higher chi-square value.


Exhibit 20. Full Model

Variable                              Degrees of Freedom   Chi-Square   Pr > ChiSq
Establishment
  Employment Size                            5              2034.35       <.0001
  MSA                                        5               218.85       <.0001
  Industry                                  10              1097.05       <.0001
  Multi-State unit                           1              1303.82       <.0001
  State Multi-unit                           1                39.87       <.0001
  No. of state UI accounts                   1                84.99       <.0001
  Auxiliary status                           1                11.48       0.0007
  Establishment age                          1               363.05       <.0001
Survey administration
  BLS Region                                 5               382.58       <.0001
  State sample size                          1                 4.58       .0323
  State mandatory survey                     1               153.58       <.0001
  Survey form type                           1               123.50       <.0001
  Missing/imputed UI data                    1                39.99       <.0001
  CES sample                                 1                31.95       <.0001
  Staff FTE positions                        1                17.46       <.0001
  Percent managerial                         1                35.31       <.0001
  Has staff with 4+ yrs exp                  1                99.83       <.0001
  Has managers with 4+ yrs exp               1                52.28       <.0001
  Unfilled positions                         1               176.04       <.0001
  Used non-regular staff                     1               112.53       <.0001
  Had decrease in FTE                        1                15.51       <.0001
  No. of data collection practices           1                36.32       <.0001
  Timing of telephone followup               3                41.19       <.0001
  Survey admin problems                      1                16.80       <.0001
External environment
  State govt/agency events                   1                13.69       .0002
  Significant state economic change          1               148.41       <.0001
  State population change 05-06              1                70.01       <.0001
  State revenue change 05-06                 1                81.28       <.0001

Exhibit 21. Full Model, Odds Ratios

Parameter                               Pr > ChiSq   Odds Ratio
Intercept                                 .3642         1.13
Employment Size
  1-9                                     <.0001        3.67
  10-49                                   <.0001        1.91
  50-99                                   <.0001        1.40
  100-249                                 .0643         1.12
  250-999                                 .7026         1.03
  1000+ (reference)                                     1.0
MSA
  Not MSA                                 <.0001        1.25
  50-149,999                              <.0001        1.18
  150-249,999                             <.0001        1.22
  250-499,999                             <.0001        1.12
  500-999,999                             <.0001        1.22
  1,000,000+ (reference)                                1.0
Industry
  Nat res, mining                         <.0001         .47
  Construction                            <.0001         .57
  Mfg                                     <.0001         .54
  Trade, trans, utility                   <.0001         .56
  Information                             <.0001         .32
  Finance                                 <.0001         .42
  Prof & bus                              <.0001         .50
  Educ, health                            <.0001         .77
  Leisure, hospitality                    <.0001         .64
  Other services                          <.0001         .62
  Local government (reference)                          1.0
Establishment age                         <.0001        1.01
No. state UI accounts                     <.0001        1.01
Auxiliary status
  Not auxiliary unit                      .0007         1.19
  Auxiliary unit (reference)                            1.0
US multi-unit
  Single-unit                             <.0001        1.88
  Multi unit (reference)                                1.0
State multi-unit
  Single-unit                             <.0001        1.11
  Multi unit (reference)                                1.0
Economic change
  No significant economic change          <.0001        1.8
  Significant economic change (reference)               1.0
BLS Region
  Boston                                  <.0001        1.22
  Philadelphia                            <.0001        1.80
  Atlanta                                 <.0001        1.99
  Chicago                                 <.0001        1.91
  Dallas                                  <.0001        1.27
  San Francisco (reference)                             1.0
State sample size                         <.0001        1.0
Mandatory survey
  Not mandatory                           <.0001         .54
  Mandatory (reference)                                 1.0
Survey form type
  Industry form                           <.0001         .81
  Unstructured form (reference)                         1.0
UI data quality
  No missing/imputed UI data              <.0001        1.19
  Missing/imputed (reference)                           1.0
CES sample
  Not in CES sample                       <.0001         .90
  CES sample (reference)                                1.0
No. FTE positions                         <.0001         .97
Percent managerial                        <.0001        1.25
Staff experience
  No staff with 4+ yrs exp                <.0001        1.28
  Staff with 4+ yrs exp (reference)                     1.0
Manager experience
  No managers with 4+ yrs exp             <.0001        1.13
  Managers with 4+ yrs exp (reference)                  1.0
Unfilled positions
  No unfilled positions                   <.0001         .73
  Unfilled positions (reference)                        1.0
Non-regular staff
  No non-regular staff                    <.0001         .74
  Used non-regular staff (reference)                    1.0
Decrease in FTE
  No decrease in FTE                      <.0001        1.13
  Had decrease in FTE (reference)                       1.0
No. of data collection practices          <.0001         .94
Phone followup begins after
  1st mailing                             <.0001        1.37
  2nd mailing                             <.0001        1.18
  3rd mailing                             <.0001        1.18
  4th mailing (reference)                               1.0
Admin problems
  No admin problems                       <.0001         .90
  Admin problems (reference)                            1.0
State govt/agency events
  No state gov/agency event               .0002          .90
  State gov/agency event (reference)                    1.0
State pop change                          <.0001        1.0
State revenue change                      <.0001        1.0

Model Testing
In comparing the different models (Exhibit 22), we see that the establishment model explains more of the
variation and has a better fit than the survey administration or external environment models. The establishment
model includes data items theoretically not under the control of the survey organization, thus its strength poses a
dilemma in how to proceed to address nonresponse. All of the models are heavily influenced by the effect of size,
either through employment size itself or through survey form type, in the case of the survey administration model.
We hope to disentangle the meaning of employment size through future analysis of interactions with industry,
MSA, establishment age, and multi-establishment status. There may be proxy variables at an industry level that
measure establishment characteristics such as data availability and staffing resource patterns we could utilize; the
latter data are likely to be found in the OES. We also can explore and reconceptualize variables in the survey
administration model, include additional information we are gathering on the modes that are more likely to be
offered to respondents by each state (such as email reporting), and conduct further analysis of form type through
interactions with establishment and survey administration variables. In addition, we plan further analysis of state
sample size, which was significant in the survey administration model, and its interactions with industry, MSA,
and employment size. Since survey design and administration is theoretically under the control of the survey
researcher, it is a critical area for further research. For the external environment model, we are investigating
additional data items that might better capture economic conditions and the legal and regulatory climate
associated with states, MSAs, and industries.
The full model reduces the effect of size somewhat and, with all variables from the other models included,
explains about 12 percent of the variance, a substantial increase over the models focusing on only one conceptual
area of survey participation. While we have much more to explore in model testing and alternative variable
construction, it appears that each conceptual area (the establishment, survey administration, and the external
environment) is important in explaining participation in the OES survey.
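For reference, the likelihood ratio and max-rescaled R-square reported in Exhibit 22 are standard logistic regression fit measures; assuming the usual definitions (the likelihood ratio chi-square and Nagelkerke's rescaling of the Cox-Snell R-square), they are

\[
G^2 = -2\,(\ln L_0 - \ln L_1), \qquad
R^2_{\mathrm{CS}} = 1 - \left(\frac{L_0}{L_1}\right)^{2/n}, \qquad
R^2_{\mathrm{max}} = \frac{R^2_{\mathrm{CS}}}{1 - L_0^{2/n}},
\]

where \(L_0\) is the likelihood of an intercept-only model, \(L_1\) the likelihood of the fitted model, and \(n\) the number of establishments.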

Exhibit 22. Model Tests

Model                   Max-rescaled R-Square   Likelihood Ratio   Pr > ChiSq
Establishment                  .1042                12678.03         <.0001
Survey Administration          .0649                 7882.13         <.0001
External Environment           .0775                 9458.22         <.0001
Full Model                     .1229                15048.14         <.0001

APPENDIX
Table 1. Sample Distribution and Response Rate by State, May 2006 panel

State                   FIPS      N     Sample %   Response Rate
Alabama                  01     3619      2.0          82.5
Alaska                   02      691       .4          73.7
Arizona                  04     2529      1.4          78.1
Arkansas                 05     2323      1.3          80.6
California               06    15691      8.8          74.4
Colorado                 08     3352      1.9          77.1
Connecticut              09     3097      1.7          70.8
Delaware                 10      825       .5          76.5
District of Columbia     11      453       .3          73.7
Florida                  12     8943      5.0          77.1
Georgia                  13     4321      2.4          82.9
Hawaii                   15      949       .5          78.1
Idaho                    16     1183       .7          75.8
Illinois                 17     5545      3.1          71.2
Indiana                  18     4842      2.7          80.8
Iowa                     19     2912      1.6          82.6
Kansas                   20     2276      1.3          77.9
Kentucky                 21     2855      1.6          75.7
Louisiana                22     3283      1.8          75.0
Maine                    23     1408       .8          78.3
Maryland                 24     2902      1.6          74.3
Massachusetts            25     4460      2.5          74.0
Michigan                 26     5298      3.0          65.6
Minnesota                27     3540      2.0          77.9
Mississippi              28     1888      1.1          89.5
Missouri                 29     4096      2.3          80.7
Montana                  30     1023       .6          84.0
Nebraska                 31     1647       .9          91.3
Nevada                   32     1453       .8          73.1
New Hampshire            33     1641       .9          80.2
New Jersey               34     5260      2.9          56.6
New Mexico               35     1408       .8          89.9
New York                 36     8513      4.8          74.8
North Carolina           37     5401      3.0          91.4
North Dakota             38      926       .5          77.4
Ohio                     39     7671      4.3          74.6
Oklahoma                 40     2431      1.4          79.1
Oregon                   41     2876      1.6          76.9
Pennsylvania             42     7679      4.3          74.5
Rhode Island             44      954       .5          75.4
South Carolina           45     2975      1.7          84.7
South Dakota             46      966       .5          91.1
Tennessee                47     3518      2.0          75.0
Texas                    48    11379      6.7          74.8
Utah                     49     1718      1.0          79.0
Vermont                  50      839       .5          86.3
Virginia                 51     4563      2.6          73.5
Washington               53     3858      2.2          76.6
West Virginia            54     1692      1.0          74.9
Wisconsin                55     4585      2.6          76.0
Wyoming                  56      743       .4          89.6

Table 2. State and District Survey Results, May 2006 (N=51)

BLS funded FTE positions: mean 5.1; standard deviation 3.3; range 1.3-18.0; sum 259
BLS funded FTE, management/professional staff: mean 3.4; standard deviation 3.0; range .5-13.0; sum 171.5
Used staff from other programs: 22% (missing: 1)
Hired temporary staff: 6% (missing: 2)
Change in FTE positions from November 2005 panel: increase 4%; decrease 22%; about the same 75%
Had unfilled positions: 35% (mean .47; standard deviation .78; range .3-3.4)
Managers' OES experience: <1-3 years 23%; 4-6 years 17%; >6 years 59% (total 100%)
Staff OES experience: <1-3 years 43%; 4-6 years 25%; >6 years 32% (total 100%)
Data collection practices: used BLS spreadsheet for address refinement 57%; used address refinement postcards 31%; used email for data collection 98%; collected multi-units separately from centralized mailings 43%; mailed nonresponse letter to all nonrespondents 18%; mailed nonresponse letter to some nonrespondents 25%
Nonresponse telephone followup: after 1st mailing 57%; after 2nd mailing 24%; after 3rd mailing 16%; after 4th mailing 4%
Survey administration problems: late mail delivery 5%; other mail problems 12%; other survey admin problems 6%; none of the above 76%
State events: significant economic changes 4%; significant administrative changes 12%; agency restructuring 8%; other agency transitions or moves 18%

Table 3. Distribution of Variables and Final Response Rate (N=179,000)

Variable                      N        % of Total   Response Rate
All States/DC                 179000      100.0         76.5

Emp Size Class
  1-4                          28925       16.2         89.5
  5-9                          29826       16.7         84.9
  10-19                        32405       18.1         79.5
  20-49                        39236       21.9         73.1
  50-99                        23718       13.3         67.6
  100-249                      15411        8.6         61.8
  250-499                       5722        3.2         60.3
  500-999                       2328        1.3         59.4
  1000+                         1429         .8         62.6

MSA
  Not MSA                      36833       20.6         81.9
  50-149,999                   14105        7.9         81.7
  150-249,999                  13808        7.7         80.7
  250-499,999                  20712       11.6         78.6
  500-999,999                  19113       10.7         78.0
  1,000,000+                   74427       41.6         71.1

Regional Offices
  Boston                       21792       11.7         75.2
  Philadelphia                 24510       13.1         70.3
  Atlanta                      35253       18.7         81.7
  Chicago                      39427       21.2         76.0
  Dallas                       35444       19.0         78.1
  San Francisco                30689       16.3         75.3

Estab. Age                    N        Years        Response Rate
  Mean                         44750       13.2         74.5
  Quartile 1                   44750        4.1         74.9
  Median                       44750        9.2         77.2
  Quartile 3                   44750       18.9         79.5

Industry
  Nat res, mine                 1803        1.0         76.4
  Constr.                      14768        8.3         79.7
  Mfg                          17129        9.6         74.9
  Trade, trans, utility        41469       23.2         77.1
  Information                   3978        2.2         64.6
  Finance                      12210        6.8         72.9
  Prof & bus                   25429       14.2         72.5
  Educ, health                 27485       15.4         78.6
  Leisure, hospitality         21265       11.9         76.6
  Other serv                    9584        5.4         83.8
  Local govt                    3880        2.2         82.1

Multi-establishment status
  Multiple state unit          51665       24.4         65.0
  Single state unit           127335       75.6         80.2
  State multiple unit          49878       27.9         70.3
  State single unit           129122       72.1         78.9

UI accounts
  State multiple UI acct       29530       16.5         69.4
  State single UI accts       149470       83.5         77.9

Auxiliary status
  Auxiliary service estab       1958        1.1         61.8
  Non-aux est.                177042       98.9         76.7

CES sample
  CES sample overlap           18611       10.4         72.2
  No overlap                  160389       89.6         77.0

Mandatory reporting
  Mandatory state              10807        6.0         86.8
  Nonmandatory                168193       94.0         75.9

UI data quality
  Missing/imputed UI data       9667        5.4         73.1
  No miss/imp UI data         169333       94.6         76.7

Form type
  Industry form                87304       48.7         68.9
  Unstructured form            91696       51.2         83.8

Bibliography

1995 Christianson, Anders and Robert D. Tortora. "Issues in Surveying Businesses: An International Survey."
Pp. 237-256 in Cox, Brenda G., David A. Binder, B. Nanjamma Chinnappa, Anders Christianson, Michael
J. Colledge, and Phillip S. Kott (eds.), Business Survey Methods. New York: John Wiley and Sons.

1998 Interagency Group on Establishment Nonresponse. "Revisiting the Issues and Looking to the Future."
Presented at the 1998 Conference of the Council of Professional Associations on Federal Statistics
(November).

1998 Groves, Robert M. and Mick P. Couper. Nonresponse in Household Interview Surveys. New York: John
Wiley and Sons.

1993 Hidiroglou, M.A., J.D. Drew, and G.B. Gray. "A Framework for Measuring and Reducing Nonresponse in
Surveys." Survey Methodology 19: 81-94.

1999 Jones, Carrie K. "Response Rates Analysis: An Analysis of the 1996/1997 Occupational Employment
Statistics Survey Response Rates." Bureau of Labor Statistics.

1993 Moore, Danna L. and Rodney K. Baxter. "Increasing Mail Questionnaire Response Rates for Surveys of
Businesses: A Telephone Follow-up Procedure as an Element of the Total Design Method." Presented at
the International Conference on Establishment Surveys, Buffalo, NY (June).

1997 National Center for Education Statistics. An Analysis of Total Nonresponse in the 1993-94 Schools and
Staffing Survey. Washington, DC (NCES 98-243).

2000 Potter, D.E.B. "Characteristics of Nonrespondents in a National Longitudinal Survey of Nursing Homes."
Proceedings of the Second International Conference on Establishment Surveys, pp. 1510-1515.

2004 Sommers, John, Steven Riesz, and David Kashihara. "Response Propensity Weighting for the Medical
Expenditure Panel Survey – Insurance Component (MEPS-IC)." ASA Section on Survey Research
Methods, pp. 4410-4417.

1994 Tomaskovic-Devey, Donald, Jeffrey Leiter, and Shealy Thompson. "Organizational Survey Nonresponse."
Administrative Science Quarterly 39: 439-457.

198 Tulp, Daniel R., C. Easley Hoy, Gary L. Kusch, and Stacey J. Cole. "Nonresponse under Mandatory vs.
Voluntary Reporting in the 1989 Survey of Pollution Abatement Costs and Expenditures (PACE)."
Industrial Statistics Working Paper, U.S. Census Bureau.

2002 Willimack, D.K., E. Nichols, and S. Sudman. "Understanding Unit and Item Nonresponse in Business
Surveys." In R. Groves et al. (eds.), Survey Nonresponse. New York: Wiley.
