
Paperwork Reduction Act Submission:
Supporting Statement for the Office of Disability Employment Policy (ODEP)

Survey of Employer Policies on the Employment of People with Disabilities, OMB No. 1230-0NEW

January 2018



OMB SUPPORTING STATEMENT PRA PART B

COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

In this document, we discuss the statistical methods to be used in the data collection activities for the Survey of Employer Policies on the Employment of People with Disabilities. This study is sponsored by the Chief Evaluation Office (CEO) and the Office of Disability Employment Policy (ODEP) in the U.S. Department of Labor. The purpose of the study is to develop and conduct a survey of employers to obtain information on their practices and attitudes regarding employees and job seekers with disabilities.

This request for clearance includes the following data collection activities:

  • Telephone survey with HR Managers

  • Qualitative Interviews with HR Managers

  • Case Study Interviews with representatives from six companies



B.1 Respondent Universe and Samples

Telephone Survey with HR Managers

Westat will conduct a nationally representative survey of about 4,800 employers from 12 industry sectors and four company size groups (5–14 employees, 15–249 employees, 250–999 employees, and 1000+ employees). Table B-1 shows the number of employers in the universe in each stratum. The industry sectors were chosen based on projected employment growth rates for industries by 2024 (Henderson, 2015). The target population for this survey will include all U.S. employers with at least five employees in the 12 industries. Employers with fewer than five employees, which are often one-person or family-based businesses that do not hire employees, will be excluded from the target population, as will Federal Government agencies.

Qualitative Interviews with HR Managers

We will conduct qualitative interviews with a subsample of HR managers who completed the telephone survey. From the employers that reported significant experience with disability employment, we will randomly select 90 companies plus 15 alternates and interview mid-level managers from those companies to collect more in-depth information.

Case Studies

We will select six companies for case studies and interview approximately 20 individuals at each company, which will yield a total of approximately 120 interviews.



B.2. Procedures for the Collection of Information

B.2.1. Statistical Methodology for Stratification and Sample Selection

Telephone Survey with HR Managers

The sampling frame for the survey will be the Duns Market Identifiers (DMI) register maintained by Dun & Bradstreet (D&B). Disability employment policies and practices may vary within large firms: some are highly centralized, whereas others maintain separate policies in different branches. DMI provides the option of choosing alternative organizational levels and includes both headquarters and branch-level records. DMI defines a headquarters as a business establishment that has branches or divisions reporting to it and is financially responsible for those branches or divisions. We will include only the headquarters record for companies with multiple branches. The sampling units will therefore be single-location companies (business establishments with no branches or subsidiaries reporting to them) and the headquarters of companies with multiple branches. Another corporate family linkage DMI provides is the subsidiary-to-parent linkage. According to the DOL, a subsidiary organization is any separate organization of which the ownership is wholly vested in the reporting labor organization or its officers or its membership, which is governed or controlled by the officers, employees, or members of the reporting labor organization, and which is wholly financed by the reporting labor organization (Department of Labor, 2010). Subsidiaries and parent companies will be included as separate sampling units.
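
To make the frame-construction rules concrete, the sketch below (in Python, with hypothetical field names; the actual D&B record layout will differ) illustrates how single-location and headquarters records would be retained as sampling units while branch records are dropped.

```python
# Illustrative sketch of building the sampling frame from a DMI extract.
# Field names (status_code, employee_count) are hypothetical placeholders.
import pandas as pd

def build_frame(dmi: pd.DataFrame) -> pd.DataFrame:
    """Keep single-location companies and headquarters; drop branch records."""
    # Hypothetical status_code values: "single_location", "headquarters", "branch".
    keep = dmi["status_code"].isin(["single_location", "headquarters"])
    frame = dmi.loc[keep].copy()
    # Subsidiaries and parents carry their own records, so they remain
    # separate sampling units.
    # Drop companies known to have fewer than five employees; records with an
    # unknown employee size are retained (they fall into the 5-14 size class).
    frame = frame[(frame["employee_count"] >= 5) | frame["employee_count"].isna()]
    return frame
```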

Table B-1 shows the number of company records in the DMI file by major industry sector and company employee size class. Single-location companies and headquarters of companies with multiple branches were used in the tabulation. The number of employees includes full-time and part-time employees as well as the owners/proprietors.

Table B-1. Number of companies by major industry sector and company size sampling strata

                                              Number of employees1
Industry sector                               5–14        15–249      250–999     1000+      Total
Construction                                  185,182     89,362      2,290       543        277,377
Manufacturing                                 119,759     103,394     8,310       4,168      235,631
Wholesale Trade, Transportation, Warehousing  163,685     92,590      4,347       1,577      262,199
Retail Trade                                  222,785     77,806      2,204       1,258      304,053
Information                                   44,225      22,048      1,555       1,006      68,834
Financial Activities                          138,577     64,732      4,157       2,228      209,694
Professional & Business Services              335,134     144,248     6,888       3,084      489,354
Education                                     77,110      38,646      6,322       2,234      124,312
Health Services                               329,011     115,459     6,859       3,062      454,391
Leisure & Hospitality                         420,785     160,092     3,448       1,438      585,763
Other Services                                213,811     62,625      1,549       446        278,431
Public Administration                         32,958      35,137      3,255       1,588      72,938
Total                                         2,283,022   1,006,139   51,184      22,632     3,362,977

1 About 0.2 percent of the total number of companies had an unknown employee size. The companies with an unknown employee size are included in the 5–14 size class.



Qualitative Interviews with HR Managers

Because of the small number of HR managers with whom in-depth interviews will be completed, we will make no attempt to generalize to the population of HR managers. From the employers that reported significant experience with disability employment, we will randomly select 90 companies plus 15 alternates and interview mid-level managers from those companies to collect more in-depth information. As shown in Table B-3, we will collapse the 12 industry sectors into three broader categories (goods-producing industries, service-producing industries, and public administration) and select seven or eight companies from each broad industry group within each of the four size classes. We expect to complete a total of 90 interviews and will increase the number of alternates if necessary to meet that goal.

Table B-3. Stratified sample of mid-level managers for in-depth interviews

                                  Number of employees
Industry sector                5–14    15–249    250–999    1000+    Total
Goods-Producing Industries       7       8          8         7       30
Service-Producing Industries     7       8          8         7       30
Public Administration            7       8          8         7       30
Total                           21      24         24        21       90



Case Studies

For the case study site visits, two companies will be chosen from each of three broad industry groups: goods-producing, service-producing, and public administration. Based on the telephone survey responses of HR managers, we will select companies that reported high levels of experience with disability employment across the life cycle. We aim to focus three case studies on large employers (250 or more employees) and three on mid-sized companies (15–249 employees). We will not conduct case studies of small companies, since we expect they will not have sufficient recent experience with disability hiring to justify a two-day site visit. No statistical sampling will be used to select employers for the case studies.

B.2.2 Estimation Procedures

Telephone Survey with HR Managers

For the most part, the analysis will be descriptive. We will start with descriptive statistics (e.g., percentages, means, medians, and standard deviations, as appropriate), cross-tabulations, and graphical summaries. We will examine these measures in the aggregate as well as explore subgroup differences (e.g., by industry, size class). We will use standard bivariate tests (t tests and chi-square tests) to compare groups. To help interpret the responses regarding attitudes, we will conduct exploratory factor analysis. Many attitude items may reflect the same underlying phenomena; in such cases, factor analysis can be a useful data reduction method in which correlated observed variables are grouped together and separated from other variables with low or no correlation. We will also use regression to explore the relationship between employer characteristics, practices, and attitudes and disability employment. The main outcomes we will examine are whether an employer has any employees with disabilities and whether it hired any employees with disabilities in the past month.
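
For illustration, a minimal sketch of these analysis steps is shown below (Python; the file and variable names are hypothetical, and the final analysis would incorporate the survey weights described in the next paragraph).

```python
# Sketch of the planned analysis steps on a cleaned survey file; all column
# names are hypothetical placeholders, not the actual questionnaire items.
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("employer_survey.csv")          # hypothetical file name

# Exploratory factor analysis of the attitude items (Likert-type columns).
attitude_items = [c for c in df.columns if c.startswith("att_")]
fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(df[attitude_items].fillna(df[attitude_items].mean()))

# Regression of a disability-employment outcome on employer characteristics
# and the attitude factor scores.
X = pd.concat(
    [pd.get_dummies(df[["industry", "size_class"]], drop_first=True),
     pd.DataFrame(scores, columns=["factor1", "factor2", "factor3"])],
    axis=1,
)
y = df["has_employees_with_disabilities"]        # hypothetical 0/1 outcome
model = LogisticRegression(max_iter=1000).fit(X, y)
print(dict(zip(X.columns, model.coef_[0])))
```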

We will assign a base weight to each sample company record as the reciprocal of its probability of selection. The base weights will then be adjusted for nonresponse in order to reduce potential biases resulting from not obtaining an interview with every company in the sample. The estimators in this survey are in the form of totals, means, proportions, and general ratios of the weighted estimates. Standard errors will be estimated using the Taylor linearization procedure, which is appropriate for these types of nonlinear estimators, such as ratios. Finite population correction factors will be used to properly reflect the effects of without-replacement sampling on the standard errors.
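
The sketch below outlines the base-weighting and Taylor linearization logic under the design described above. It is a simplified illustration (single-stage stratified simple random sampling, hypothetical column names), not the production estimation code.

```python
# Base weights and a Taylor-linearized standard error for a ratio estimator,
# with finite population correction; `frame_counts` maps stratum -> N_h.
import numpy as np
import pandas as pd

def add_base_weights(sample: pd.DataFrame, frame_counts: dict) -> pd.DataFrame:
    """Base weight = N_h / n_h, the reciprocal of the selection probability."""
    n_h = sample.groupby("stratum")["y"].transform("size")
    N_h = sample["stratum"].map(frame_counts)
    out = sample.copy()
    out["base_wt"] = N_h / n_h
    return out

def ratio_and_se(sample: pd.DataFrame, frame_counts: dict, wt: str = "base_wt"):
    """Estimate R = sum(w*y) / sum(w*x) and its linearized standard error."""
    X_hat = (sample[wt] * sample["x"]).sum()
    R = (sample[wt] * sample["y"]).sum() / X_hat
    var = 0.0
    for h, g in sample.groupby("stratum"):
        n_h, N_h = len(g), frame_counts[h]
        z = (g["y"] - R * g["x"]) / X_hat            # linearized variable
        var += (N_h ** 2) * (1 - n_h / N_h) * z.var(ddof=1) / n_h
    return R, np.sqrt(var)
```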

Qualitative Interviews with HR Managers

Data collected from the qualitative interviews with HR managers will provide in-depth qualitative information about disability employment from the perspective of the HR manager; no estimation procedures will be used. Analysis will be descriptive.

Case Studies

Data collected from the case study site visits will provide in-depth qualitative information about disability employment at the six selected sites; no estimation procedures will be used. Analysis will be descriptive.

B.2.3. Degree of Accuracy Needed

The sample sizes in each size class within each major industry sector will be large enough to provide a sufficient number of completed interviews to obtain estimates with reasonable precision. The population parameters of interest are mainly in the form of proportions. These include, within each company size class and industry sector, the proportion of companies with employees who have a disability, the proportion of companies that hired any person with a disability within the past 12 months, the proportion of companies that proactively recruit job applicants who are persons with disabilities, and so on.

Table B-2 summarizes the target sample size and associated precision for the sample. We assume 100 completed interviews in each of the 48 strata. The precision estimates are for strata defined by combinations of industry sector and company size. The maximum percent error for estimates of percentages obtained from a simple random sample yielding 100 completed interviews will not exceed 10 percent, 95 percent of the time. The percent error is largest for a 50 percent proportion and decreases as the proportion moves further away from a 50/50 split. For example, for an 80/20 percent split, the error is 8 percent. Thus, 100 completed interviews in each size-by-industry stratum should provide an adequate precision level for estimates of percentages.

There is also interest in comparing proportions across strata defined by combinations of industry sector and company size. The sample sizes should be large enough to provide more than 80 percent power to detect reasonable differences in proportions. The power of a test is the probability of rejecting the null hypothesis of no difference between two proportions when the null hypothesis is false and the alternative hypothesis is true. If the power of the test is inadequate and the null hypothesis of no difference is not rejected, we cannot conclude with reasonable confidence that there is no difference between the proportions, because the sample size may simply be too small to detect the difference. A power of 80 percent is generally considered adequate. Given a certain power level, larger sample sizes are needed to detect smaller differences. With a sample size of 100 in each stratum, we will be able to detect differences of about 20 percent or larger.

We want to emphasize that the sampling design is conservative because it is intended to produce a high level of precision for estimates within strata defined by combinations of industry sector and size class. In all likelihood, however, analysts will be more interested in making estimates for specific industry sectors or size classes, and the level of precision for these types of "marginal" comparisons is even higher. For example, for an 80/20 percent split, the error is 6 percent within industry sector; within size class, the error is 3 percent. The differences that can be detected between industry sectors or size classes are similarly smaller. For example, with 80 percent power, one can detect a 12 percent difference between industry sectors and a 7 percent difference between size classes.
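
As a rough check on the within-stratum figures cited above, the sketch below reproduces the approximately 10 percent and 8 percent margins of error and the roughly 20 percent minimum detectable difference for 100 completes per stratum using standard normal-approximation formulas. The sector-level and size-class-level figures in Table B-2 reflect the full design and are not reproduced by these simple per-stratum formulas.

```python
# Back-of-the-envelope check of the per-stratum precision figures
# (n = 100 completes per industry-by-size stratum, simple random sampling).
from scipy.stats import norm

def margin_of_error(p: float, n: int, conf: float = 0.95) -> float:
    """Half-width of the normal-approximation confidence interval for a proportion."""
    z = norm.ppf(1 - (1 - conf) / 2)
    return z * (p * (1 - p) / n) ** 0.5

def min_detectable_diff(p: float, n: int, power: float = 0.80, alpha: float = 0.05) -> float:
    """Approximate detectable difference between two proportions near p, n per group."""
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return (z_a + z_b) * (2 * p * (1 - p) / n) ** 0.5

print(round(margin_of_error(0.5, 100), 3))        # ~0.098, i.e., about 10 percent
print(round(margin_of_error(0.2, 100), 3))        # ~0.078, i.e., about 8 percent
print(round(min_detectable_diff(0.5, 100), 3))    # ~0.198, i.e., about 20 percent
```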


Table B-2. Sample size and precision estimates

Sample size
  Total number of completed interviews                          4,800
  Number of completed interviews for each stratum               100
  Percentage of companies with fewer than 5 employees
    5–14 employees                                              20%
    15–49 employees                                             10%
    50–249 employees                                            5%
    250 or more employees                                       2%
  Percentage of companies found to be out-of-business
    5–14 employees                                              20%
    15–49 employees                                             10%
    50–249 employees                                            10%
    250 or more employees                                       10%
  Overall target response rate for Employer Survey              50%
  Initial sample size                                           12,240

Precision estimates
  Industry sector and size class
    Maximum percent error for 50/50 split                       10%
    Maximum percent error for 80/20 split                       8%
    Minimum detectable difference with 80 percent power         20%
  Industry sector
    Maximum percent error for 50/50 split                       8%
    Maximum percent error for 80/20 split                       6%
    Minimum detectable difference with 80 percent power         12%
  Size class
    Maximum percent error for 50/50 split                       4%
    Maximum percent error for 80/20 split                       3%
    Minimum detectable difference with 80 percent power         7%



The overall target response rate for the survey is 50 percent. Therefore, to obtain 4,800 completed interviews at a 50 percent response rate, we need to contact at least 9,600 eligible companies. As shown in Table B-2, we assume varying eligibility rates across size classes. We assume 20, 10, 5, and 2 percent of companies selected from the very small, small, medium, and large strata, respectively, will be ineligible because they have fewer than five employees. We also assume 20 percent of companies selected from the very small size stratum and 10 percent from the remaining size strata will be found to be out of business. Note that it is not possible to identify and exclude Federal Government agencies from D&B's sampling frame; this will be done after the sample is selected, through screening at the beginning of the interview. We will increase the sample size for the public administration sector by about 10 percent to allow for screening out Federal Government agencies. Under these assumptions, we will need an initial sample size of 12,240 records to yield 4,800 completed interviews (i.e., 100 in each stratum).
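
As an illustration of this inflation arithmetic, the sketch below computes the number of records that would need to be released for a single stratum under the rates shown in Table B-2; the published initial sample of 12,240 aggregates this type of calculation across all 48 strata, the public administration oversample, and rounding within strata.

```python
# Rough illustration of the sample-size inflation arithmetic for one stratum,
# using the assumed response, ineligibility, and out-of-business rates.
def released_sample(completes: int, response_rate: float,
                    pct_ineligible: float, pct_out_of_business: float) -> float:
    """Records to release so that the expected number of completes is met."""
    eligible_in_business = (1 - pct_ineligible) * (1 - pct_out_of_business)
    return completes / (response_rate * eligible_in_business)

# Very small stratum (5-14 employees), 100 completes per industry-by-size cell:
print(released_sample(100, 0.50, 0.20, 0.20))   # 312.5 records
# Large stratum (250 or more employees):
print(released_sample(100, 0.50, 0.02, 0.10))   # about 226.8 records
```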

A 50 percent response rate is in line with response rates obtained for other recent establishment surveys based on national probability samples. It is well known that response rates to government surveys of both households and establishments have been declining in recent decades. It has also been well documented that establishment survey response rates are lower than response rates obtained in household surveys (Baruch and Holtom, 2008). Reasons for nonresponse that are unique to establishment surveys include that the respondent is too busy, that the topic of the survey is not relevant, or that the company has a policy against completing surveys (Fenton O'Creevy, 1996). The 2008 ODEP Survey on the Employment of People with Disabilities, which was conducted by telephone, achieved a 54 percent response rate. While comparing response rates across establishment surveys is difficult because of the lack of standard response rate definitions, response rates have been below 80 percent for the vast majority of other recent establishment surveys. For example, DOL's 2012 Family and Medical Leave Act (FMLA) Survey, conducted by web with telephone follow-up, achieved a 51 percent response rate to the screener interview and a 31 percent response rate to the extended interview, for a 21 percent overall response rate (Daley et al., 2012). DOL's Survey of Employers' Views of Short-Time Compensation received a response rate of 35 percent via web and telephone, and the 2011 National Assessment of the Occupational Safety and Health Workforce survey of employers achieved a 35 percent response rate using web with telephone follow-up (McAdams et al., 2011). We propose a multimodal contact protocol to increase response rates. We believe that targeting a higher response rate would require more follow-up effort and cost than would be worthwhile given the small increase in response rate that would be likely to occur. Instead, to maximize precision, we have chosen to include a reserve sample (discussed in more detail in Section B.3.1) in the event of a shortfall. In addition, we propose a nonresponse bias analysis to examine potential bias that would arise if companies with a greater interest in disability employment policy are more likely to respond to the survey. This is discussed in Section B.3.2.

B.2.4. Data Collection Methods

Telephone Survey with HR Managers

We will mail employers a pre-notification letter (Appendix A) signed by a senior DOL executive one week before data collection begins. The D&B sample frame provides the name of the HR manager for 7 percent of companies overall and for 39 percent of large companies. The pre-notification letter will be addressed to the HR manager if this information is available; otherwise, it will be addressed to the company president or another senior executive. The pre-notification letter will describe the purpose of the survey, explain respondent rights and privacy, and alert the respondent that someone will be calling soon to conduct an interview. The letter will also provide the respondent with a toll-free number that he or she can call with any questions or to set up an interview appointment.

Seven days after sending the letter, telephone interviewers will call the employer and attempt to complete the survey using the screening questions on the telephone survey (Appendix B). For medium and small-sized companies, interviewers will begin with screening question SC4 and attempt to identify the appropriate person for the interview. The interviewers will use a script, identify themselves as calling on behalf of DOL, and ask for the executive to whom the letter was mailed. If that person is available, the survey will begin. If that person is no longer with the company, the interviewer will ask to speak to the HR manager. The interviewer will have a list of the types of people to ask for should the company not have an HR manager (e.g., president, vice president, director). The intent is for the interviewer to identify an alternate contact for the company and complete a survey with that person.

Because larger companies typically have administrative assistants screening calls on behalf of their presidents and other senior executives, those companies need additional contact procedures in the event that HR manager contact information is not available. We will call the larger businesses before sending the pre-notification letter to identify the most appropriate knowledgeable respondent and to confirm the mailing address. When interviewers call the large companies, they will use screening questions SC1-3 (Appendix B) to identify the person to whom the letter should be mailed. Westat will call the larger companies up to 10 times to try to identify the name and contact information of the appropriate person for the survey. We will then send the pre-notification letter to that identified individual.

In both cases, once a good telephone number is identified for the potential respondent, Westat will make up to 15 telephone attempts to reach that person. Calls will be made on various days and at a range of times during business hours. To help us reach respondents when they are available and gain their cooperation, we plan to leave voicemail messages explaining the survey and how to reach us. Once Westat has called the potential respondent 10 times, the No-Contact Letter (Appendix K) will be sent to the potential respondent. This letter reminds the potential respondent of the importance of the survey and asks the individual to respond to the telephone calls from the Westat interviewer. Individuals who refuse to participate in the survey will receive the Refusal Conversion Letter (Appendix L). Similar to the No-Contact Letter, the Refusal Conversion Letter informs the individual of the importance and usefulness of the survey data and asks the individual to reconsider participating. Refusal cases will be re-released for contact by expert refusal conversion interviewers about one week after the refusal conversion letter is sent.

Qualitative Interviews with HR Managers

We will contact each sampled HR manager to invite him/her to participate in the qualitative interviews. HR managers will receive an email from Westat that describes the study and the intent of the qualitative interviews. Westat interviewers will work with the HR managers to select a time to conduct the interview. At the designated time, Westat interviewers will conduct the interviews, making sure to obtain consent to audio-record the discussion.

Case Studies

Before the site visit, Westat will contact the HR manager who responded to the telephone survey to obtain permission to conduct the site visit. Once we receive permission, we will ask the HR manager to recruit appropriate senior leadership, hiring managers, HR managers, employees with a disability, and colleagues of employees with a disability to participate in interviews. We will provide the HR manager with a fact sheet describing the study and ask that he or she distribute it to potential interviewees. We will also ask the HR manager to provide a tentative schedule of interviews for our site visit. See Table B-4 for a sample site visit schedule.

Table B-4. Sample Site Visit Schedule

                       Day 1                                                                  Day 2
Time                   Interviewer 1                     Interviewer 2                        Interviewer 1       Interviewer 2
8:00 AM – 9:00 AM      HR manager 1                      Demo of Accommodation Technology
9:00 AM – 10:00 AM     Diversity and Inclusion Officer   Employee 6                           HR manager 3
10:00 AM – 11:00 AM    Employee 1                        Hiring Manager 1                     Hiring Manager 6    Colleague 3
11:00 AM – 12:00 PM    Hiring Manager 2                  Colleague 1                          Employee 7          Hiring Manager 7
12:00 PM – 1:00 PM     Lunch
1:00 PM – 2:00 PM      Employee 2                        Hiring Manager 2
2:00 PM – 3:00 PM      Employee 3                        Hiring Manager 3
3:00 PM – 4:00 PM      Colleague 2                       Employee 4
4:00 PM – 5:00 PM      Hiring Manager 5                  Employee 5



Two Westat data collectors will undertake a site visit to each company to conduct interviews and observe the company culture and climate pertaining to disability employment. Each site visit will last no longer than 2 days. With interviewees’ permission, we will audio-record interviews. Each interview will last 20-30 minutes.

B.2.5 Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

This study is a one-time data collection, so periodic data collection cycles do not apply; there will not be a second cycle of data collection.

B.3 Methods to Maximize Response Rates and to Deal with Issues of Nonresponse

B.3.1 Methods to Maximize Response Rates

Telephone Survey with HR Managers

As discussed in Section B.2.3, we expect a 50 percent response rate to the HR manager survey based on response rates to previous employer surveys. Taking into account out-of-scope and ineligible businesses, we will have to sample 12,240 employers to obtain 4,800 completed interviews. Response rates for establishment surveys, such as this telephone survey with HR managers, are typically lower than response rates for household surveys. To maximize response rates to the telephone survey, Westat will employ a multimodal contact approach and appropriate telephone interviewing methods to ensure the cooperation of senior executives. Specifically, we will use the following procedures:

Advance Letter (Appendix A). An introductory letter will be sent to sampled businesses. The letter will be on ODEP letterhead and signed by an ODEP official. The goals of this letter are to introduce the study, emphasize privacy, explain respondents' rights, and alert respondents that an interviewer will be calling. A toll-free number will be included so that respondents can call to verify the legitimacy of the study, ask questions, or set up an appointment for an interview.

Contacting the most appropriate respondent. Westat will send all small and medium-sized businesses the advance letter prior to the interviewer's call. Large businesses will be called to obtain the name of the most senior knowledgeable respondent, and that respondent will be sent the advance letter. Once the letter has been sent, an interviewer will call to complete the interview. If we are not able to speak with that respondent, we will determine the name of another knowledgeable respondent.

Contacting the corporate headquarters. Westat will contact the business’ corporate headquarters, if applicable, and try to interview a respondent at the corporate office. If this is not possible, Westat will then conduct the interview with a senior knowledgeable respondent at one of the company’s locations.

Experienced executive interviewers. Westat has a dedicated staff of experienced, executive interviewers whose job it is to conduct interviews with senior level business executives.

Interviewers’ ability to obtain cooperation. Westat uses experienced and well trained interviewers. All interviewers will be monitored, evaluated, and provided with instant feedback on their performance to eliminate interaction patterns or telephone demeanor that might be detrimental to achieving cooperation. (Newer interviewers will be monitored at a higher rate than experienced interviewers.)

Flexibility in scheduling interviews. Being available to speak with people when it is most convenient for them is sometimes overlooked as a factor that can tip the balance in favor of cooperation for an individual who has doubts about participating. Interviewing activities for the survey will be scheduled to coincide with the hours people are most likely to be at work. In the event the respondents need to schedule interviews for a particular time, the CATI system will accommodate their needs. Special arrangements will be made for those respondents available to be interviewed only on a weekend or in the evening.

Well-worded introductory statement. Our telephone interviewing experience has shown that one of the main reasons for nonresponse is that respondents hang up before the interviewer has a chance to explain the study. Westat will immediately reassure the person answering the telephone that the interviewer is not a salesperson and that the study is being done for the Department of Labor.

Refusal avoidance and refusal conversion. Westat interviewers will be well trained in encouraging respondent participation, especially in making efforts to convert refusals to final cooperation. If a respondent refuses to participate, the interviewer will complete a Non-Interview Response Form (NIRF). The form captures information about key characteristics of the refusing respondent and the stated reason(s) for refusing to participate. Special interviewer training sessions will be led by highly experienced supervisors for a select group of interviewers. The sessions will include analysis of survey-specific and generic reasons for refusal; preparation of answers and statements that are responsive to those objections; effective use of voice and manner on the telephone; and role-playing of different situations. This team of customer cooperation interviewers will re-contact the reluctant respondents.

Penetrating companies with difficult access. Interviewers will be trained in various ways to reach wanted respondents when an IVR system allows no access without an individual's name and/or extension, or when company policy prohibits the operator from transferring a call without a name or extension. This will require use of the Internet, specifically searching for the company or consulting a general phone directory site, which will sometimes include key employee names and direct telephone numbers. The training will include periodic sharing of the verbal phrases that produced the best results in breaking through these company barriers.

Refusal Conversion and No-Contact Letters. Westat will send a Refusal Conversion Letter (Appendix L) to individuals who refuse to participate in the survey and a No-Contact Letter (Appendix K) to those individuals who have been difficult to contact. Both letters provide the recipient with information about how important the survey is and the positive impact the data collected could have on the disability community. The letters ask the recipients to consider participating in the survey, let them know that a Westat interviewer will be calling again, and provide a toll-free phone number that recipients can use to reach Westat if they have questions about the survey or wish to make an appointment to complete the survey over the telephone.

We are keenly aware that several factors have affected response rates for establishment telephone surveys in recent years, particularly an increased use of interactive voice response (IVR) systems instead of operators by companies. We will select an additional reserve sample to be used should the response rate be too low to yield the desired number of completed interviews. The reserve sample will increase the total sample size by 20 percent. We will use the reserve sample to obtain the number of completes needed in each stratum for the desired precision levels.

Qualitative Interviews with HR Managers

Westat expects to complete qualitative interviews with a full set of 90 respondents. During sample selection, 15 alternate respondents will be selected. If any of the original 90 respondents selected are unavailable or unwilling to participate, Westat will move on to the alternate respondents until we reach a full sample of 90 completed interviews.

Case Studies

We do not anticipate any difficulty completing the case studies with a full set of six sites.

B.3.2 Nonresponse Bias Analysis

Even though intensive methods will be used to increase response rates and convert non-responders to the survey, non-response bias is still a concern. We plan to carry out analyses of the survey’s nonresponse properties by using the data provided on the sample frame by D&B for all employers in the sample. The first analysis will compare the survey response rates for the different levels of the categorical variables, including company employee size classes, industry sector, Census region, MSA/non-MSA status, and single location company or headquarters identifier for the company. A second analysis will compare the distributions of these variables computed from the sample frame data associated with (1) all employers in the sample frame, computed with or without weights (with an equal probability sample these distributions are the same); (2) all employers sampled, computed with base weights; (3) all employers responding to the survey, computed with base weights; and (4) all employers responding to the survey, computed with adjusted weights.

Initially, we will assign a base weight to each sample company record as the reciprocal of its probability of selection. The base weights will then be adjusted for nonresponse in order to reduce potential biases resulting from not obtaining an interview with every company in the sample. These adjustments will be made by redistributing the weights of nonresponding companies to responding companies with similar propensities for nonresponse. A predictive model for response propensity will be developed to identify subgroups of population with differential response rates. These subgroups will then be used as nonresponse adjustment cells, and a separate weight adjustment will be applied in each cell. The potential predictors that can be used in this modeling effort have to be known for both respondents and nonrespondents. These include company employee size classes, industry sector, Census region, MSA/non-MSA status, and single location company or headquarters identifier for the company.

If response propensity is independent of the survey estimates within nonresponse adjustment cells, then nonresponse-adjusted weights yield unbiased estimates. There are several alternative methods of forming nonresponse adjustment cells to achieve this result. We plan to use Chi-Square Automatic Interaction Detector (CHAID) software (SI-CHAID) to guide us in forming the cells (Magidson, 2005). CHAID partitions the data into subsets that are homogeneous with respect to response propensity. To accomplish this, it first merges values of the individual predictors that are statistically homogeneous with respect to response propensity and maintains all other heterogeneous values. It then selects the most significant predictor (with the smallest p-value) as the best predictor of response propensity, which forms the first branch in the decision tree. It continues applying the same process within the subgroups (nodes) defined by the "best" predictor chosen in the preceding step. This process continues until no significant predictor is found (based on a standard p-value of less than .05) or a specified minimum node size (about 20) is reached. The procedure is stepwise and creates a hierarchical tree-like structure. Potential predictors of response propensity have to be known for both respondents and nonrespondents, and only a limited number of characteristics of nonrespondents are known and can be used as independent variables. We do not expect correlation between the independent variables to be an issue. We will monitor the survey responses by major industry and size class and increase efforts in those subgroups with a lower than expected number of completes.
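
The cells themselves will be formed with SI-CHAID, as noted above. Purely as an illustration of the cell-building idea, the sketch below uses a CART-style classification tree from scikit-learn as a stand-in, partitioning the sample into leaves (cells) of at least 20 cases using frame variables that are available for respondents and nonrespondents alike; the file and column names are hypothetical.

```python
# Illustrative stand-in for the CHAID cell-building step (SI-CHAID, not this
# code, is the tool named in the text). A classification tree groups cases
# into leaves with similar response propensity; each leaf becomes a cell.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

sample = pd.read_csv("sample_with_outcomes.csv")     # hypothetical file
predictors = ["size_class", "industry", "census_region", "msa", "hq_flag"]
X = pd.get_dummies(sample[predictors], drop_first=True)
y = sample["responded"]                              # 1 = completed interview

tree = DecisionTreeClassifier(min_samples_leaf=20, random_state=0)
tree.fit(X, y)
sample["nr_cell"] = tree.apply(X)                    # leaf id = adjustment cell
```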

Although nonresponse adjustment can reduce bias, it may at the same time increase the variance of the estimates. Small adjustment cells and/or low response rates (and hence large nonresponse adjustment factors) may increase the variance and give rise to unstable estimates. To prevent an undue increase in variance, and thereby an adverse effect on the mean square error of the estimates, we plan to enforce a minimum cell size and avoid large adjustment factors. All sample companies will be classified into five major survey response categories based on the outcome of the survey. These five categories will be:

  • Respondent, interview completed;

  • Nonrespondent, identified as inscope (in business) but eligibility (based on the interview) could not be determined (the company name and in-business status were verified, but the interview could not be conducted);

  • Identified as inscope (in business) but determined to be ineligible in the interview;

  • Inscope (in business) status could not be verified (mainly nonlocatable cases);

  • Out-of-scope (company is no longer in business).

Note that we refer to cases that are identified as being no longer in business as out of scope. Some companies that are in business (which we refer to as inscope) are later identified as ineligible during the interview, for reasons such as having fewer than five employees or being a Federal Government agency.

We will develop separate models for the nonresponding companies with unknown inscope status (nonlocatables) and for the nonresponding inscope companies. After forming two separate sets of adjustment cells, we will first adjust the weights to compensate for those nonresponding companies with unknown inscope status. This weight adjustment factor will be computed within each adjustment cell, as the ratio of the weighted (by the base weight) total number of sampled companies to the weighted number of companies whose inscope status could be determined. In the second step, we will adjust the weights to compensate for nonresponding inscope companies. This nonresponse adjustment factor will be computed as the ratio of the weighted (after adjusting for nonlocatables) number of all inscope companies (including those identified as ineligible in the interview) to the weighted number of companies whose eligibility could be determined (the companies with a completed interview plus those that are identified as ineligible in the interview) within each nonresponse adjustment cell.
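
A compact sketch of this two-step weight adjustment is shown below; the status codes, cell identifiers, and column names are hypothetical, and the actual adjustment will follow the cell definitions produced by the propensity modeling described earlier.

```python
# Sketch of the two-step nonresponse weight adjustment described above.
# `sample` is assumed to carry a base weight, two sets of adjustment-cell ids
# (locate_cell, nr_cell), and a status column with hypothetical codes that
# correspond to the five outcome categories listed earlier.
import pandas as pd

STATUS_DETERMINED = {"complete", "inscope_elig_unknown", "inscope_ineligible", "out_of_scope"}
ELIG_DETERMINED = {"complete", "inscope_ineligible"}

def adjust_weights(sample: pd.DataFrame) -> pd.DataFrame:
    s = sample.copy()
    # Step 1: compensate for cases whose inscope status could not be verified.
    for _, g in s.groupby("locate_cell"):
        det = g[g["status"].isin(STATUS_DETERMINED)]
        factor = g["base_wt"].sum() / det["base_wt"].sum()
        s.loc[det.index, "wt1"] = det["base_wt"] * factor
    s = s[s["status"].isin(STATUS_DETERMINED)]
    # Step 2: within each nonresponse cell, ratio of all inscope weight to the
    # weight of companies whose eligibility was determined in the interview.
    inscope = s[s["status"] != "out_of_scope"].copy()
    for _, g in inscope.groupby("nr_cell"):
        det = g[g["status"].isin(ELIG_DETERMINED)]
        factor = g["wt1"].sum() / det["wt1"].sum()
        inscope.loc[det.index, "final_wt"] = det["wt1"] * factor
    return inscope
```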

After the nonresponse adjustments, the distribution of the adjusted weights will be examined within reported industry type and size classes. We may trim a small number of extremely large weights to prevent large increases in the variances of survey estimates that use these weights. We will use a design-based approach for weight trimming. Trimming the weights reduces the variance of the estimates but may introduce bias. To reduce any potential bias, the trimmed portion of the sampling weight will be distributed uniformly to the other companies within the same reported industry type and size stratum (the stratum to which the company with the trimmed weight belongs), preserving the original weighted total number of companies. The goal of this weight trimming is to balance any increase in bias due to trimming against the reduction in sampling error, so as to minimize the mean squared error of the estimates.
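
A sketch of the trimming-and-redistribution step follows; the cap rule shown (a multiple of the median weight within the stratum) is illustrative only, and the actual trimming thresholds will be set during weighting.

```python
# Trim weights above a cap and spread the excess uniformly over the other
# companies in the same reported industry-by-size stratum, preserving the
# weighted total. Column names and the cap rule are hypothetical.
import pandas as pd

def trim_weights(df: pd.DataFrame, wt: str = "final_wt", cap_mult: float = 4.0) -> pd.DataFrame:
    out = df.copy()
    out["trim_wt"] = out[wt]
    for _, g in out.groupby(["reported_industry", "reported_size"]):
        cap = cap_mult * g[wt].median()
        high, low = g[g[wt] > cap], g[g[wt] <= cap]
        if len(high) and len(low):
            excess = (high[wt] - cap).sum()
            out.loc[high.index, "trim_wt"] = cap
            out.loc[low.index, "trim_wt"] = low[wt] + excess / len(low)
    return out
```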

It is reasonable to suspect that nonresponse may be related to characteristics that are unavailable on the DMI file used as the sampling frame. The most obvious factor may be a company’s predisposition toward hiring individuals with disabilities, with companies that have a higher interest being more likely to respond to the survey. This can be partially (but not fully) addressed by weighting based on variables that are correlated with hiring individuals with disabilities (such as industry and size class). To gauge the magnitude of this potential bias, we will conduct an analysis of early and late responders to the survey, comparing the percentage of companies that actively recruit and hire individuals with disabilities. If predisposition to hire individuals with disabilities is related to response propensity, we would expect that early responders would be more likely to actively recruit/hire individuals with disabilities than late responders. Early and late respondents will be defined based on the data collection period and response rate over time. This analysis will provide ODEP with information about potential nonresponse bias on unobservable characteristics.
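
For example, the early/late comparison could be carried out as a simple two-proportion test, as sketched below; the variable names and the cutoff defining an early responder are hypothetical and would be set from the observed response pattern over the field period.

```python
# Two-proportion z-test comparing the share of early versus late responders
# that actively recruit/hire individuals with disabilities.
import pandas as pd
from statsmodels.stats.proportion import proportions_ztest

resp = pd.read_csv("respondents.csv")                  # hypothetical file
early = resp[resp["completion_week"] <= 4]             # hypothetical cutoff
late = resp[resp["completion_week"] > 4]

counts = [early["actively_recruits"].sum(), late["actively_recruits"].sum()]
nobs = [len(early), len(late)]
z, p = proportions_ztest(counts, nobs)
print(f"early = {counts[0]/nobs[0]:.2%}, late = {counts[1]/nobs[1]:.2%}, p = {p:.3f}")
```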

Imputation will be used for variables pertinent to nonresponse adjustment. We will use hot-deck imputation methods, which will use sampling strata and other sampling frame variables to form imputation cells. Missing values on a variable for a non-respondent (the recipient) are replaced with data taken from a similar respondent in the same cell (the donor). We anticipate that there will be very little missing data on the variables of interest because data on the DMI are relatively complete and in our experience most respondents are likely to answer these basic questions about employer characteristics.
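
A minimal sketch of the hot-deck step is shown below, assuming imputation cells formed from sampling strata and other frame variables; donors are drawn at random (with replacement) within the recipient's cell.

```python
# Hot-deck imputation: fill missing values of a variable with values drawn
# from donor respondents in the same imputation cell. Column names are
# hypothetical; cells with no donors would be collapsed in practice.
import numpy as np
import pandas as pd

rng = np.random.default_rng(12345)

def hot_deck(df: pd.DataFrame, var: str, cell_vars: list) -> pd.Series:
    filled = df[var].copy()
    for _, g in df.groupby(cell_vars):
        donors = g[var].dropna()
        missing = g[g[var].isna()].index
        if len(donors) and len(missing):
            filled.loc[missing] = rng.choice(donors.to_numpy(), size=len(missing))
    return filled
```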

B.4 Test of Procedures or Methods to be Undertaken

A pretest of the telephone survey was completed in August 2017. Cognitive interviews were conducted with seven HR managers/CEOs of companies covering a range of employee sizes. The cognitive interviews focused on specific survey questions to ensure clarity, understanding, and the ability to respond appropriately, and to obtain a more definitive estimate of the length of the survey. Findings from the pretest were used to revise the telephone survey, improving the wording of some questions and shortening the survey.

B.5 Individuals Consulted on Statistical Methods and Individuals Responsible for Collecting and/or Analyzing the Data

B.5.1 Individuals Consulted on Statistical Methods

The following Westat staff were consulted on statistical methods in preparing this submission to OMB:

  • Dr. Joseph Gasper (240) 314-2470

  • Dr. Avni Goskel (301) 251-4395

  • Dr. Robert Fay (240) 314-2318

In addition, we assembled a Technical Working Group (TWG) consisting of seven experts in disability employment and/or research methods. The TWG has already been consulted on the study design and the data collection instruments. The TWG members are:

  • Jason Bryn, BAE Systems

  • Bob Fay, Ph.D., Westat

  • Ellen Galinsky, Families and Work Institute

  • Lori Golden, Ernst and Young

  • Andrew Houtenville, Ph.D., University of New Hampshire

  • Peter Rutigliano, Ph.D., Sirota Consulting

  • Zary Amirhosseini, Massachusetts General Hospital



B.5.2 Individuals Responsible for Collecting and/or Analyzing the Data

The following Westat staff are responsible for collecting and analyzing the data for this study:

  • Dr. Joseph Gasper (240) 314-2470

  • Dr. Jocelyn Marrow (240) 314-5887

  • Dr. Avni Goskel (301) 251-4395

  • Michael Hornbostel (240) 314-2578






References

Baruch, Y. & Holtom, B. C. (2008). Survey Response Rate Levels and Trends in Organizational Research. Human Relations 61(8),1139–1160.

Dillman, D. (1978). Mail and Telephone Surveys: The Total Design Method. New York, NY: John Wiley and Sons.

Employment and Training Administration (ETA). (n.d.) Work opportunity tax credit. Retrieved from: https://www.doleta.gov/business/incentives/opptax/eligible.cfm

Holtgraves, T., Eck, J., & Lasky, B. (1997). Face management, question wording, and social desirability. Journal of Applied Social Psychology, 27, 1650-1671.

Kaye, H.S., Jans, L.H., & Jones, E.C. (2011). Why don’t employers hire and retain workers with disabilities? Journal of Occupational Rehabilitation, 21(4), 526-536. http://doi.org/10.1007/s10926-011-9302-8

Magidson, J. (2005). SI-CHAID 4.0 user’s guide. Belmont, Massachusetts: Statistical Innovations Inc.

McAdams, T., Kerwin, J., Olivio, V., & Goskel, H. (2011). National Assessment of the Occupational Safety and Health Workforce. (Prepared under contract to the National Institute for Occupational Safety and Health). Rockville, MD: Westat.

Sudman, S., & Bradburn, N. M. (1982). Asking questions: A practical guide to questionnaire design. San Francisco, CA: Jossey-Bass.

Tourangeau, R., & Yan, T. (2007). Sensitive questions in surveys. Psychological Bulletin, 133, 859-883.

U.S. Bureau of Labor Statistics (2016). Employment status of the civilian population by sex, age, and disability status, not seasonally adjusted. Retrieved from: http://www.bls.gov/news.release/empsit.t06.htm

U. S. Census Bureau (Census). (2014). Selected economic characteristics for the civilian non-institutionalized population by disability status: 2010-2014 American Community Survey 5-year estimates. Washington, DC: Author. Retrieved from: http://factfinder.census.gov/faces/tableservices/jsf/pages/productview.xhtml?pid=ACS_14_5YR_S1811&prodType=table

U.S. Department of Labor (2010). Instructions for Form LM-3 Labor Organization Annual Report. Washington, DC: Author. Retrieved from: https://www.dol.gov/OLMS/regs/compliance/LM-3_Instructions_AR.pdf

U.S. Equal Employment Opportunity Commission (EEOC). (2005). The ADA: Your employment rights as an individual with a disability. Washington, DC: Author. Retrieved from: https://www.eeoc.gov/facts/ada18.html





