Worker Classification Survey

OMB: 1235-0028


Worker Classification Information Collection

Supporting Statement for Paperwork Reduction Act
Submissions of Survey

A. Justification

1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.

The Department of Labor's (DOL) Wage and Hour Division (WHD) is responsible for administering and enforcing a number of laws that establish the minimum standards for wages and working conditions in the United States. Collectively, these labor standards cover most private, state, and local government employment and protect over 135,000,000 workers in more than 7,300,000 establishments throughout the United States and its territories. Current labor law does not require employers to disclose information regarding employment status (whether the worker is considered an employee or not), the basis for those status determinations, or pay (including hours worked, pay rates, and wages paid) to workers. In the absence of required disclosure, employers may intentionally or unintentionally classify a worker as a contractor rather than as an employee without the worker's full knowledge.

Worker misclassification can be understood as the practice, intended or unintended, of treating a worker who is an employee under the law as something other than an employee (i.e., an independent contractor), depriving the employee of legal wage entitlements, including minimum wage and/or overtime. The classification of workers affects whether workers receive the benefits of statutory coverage and access to employer-provided benefits; affects competition between compliant and non-compliant employers; and impacts the funding and administration of a number of federal and state government programs. Several key laws, by design, only protect "employees": the Fair Labor Standards Act, the Family and Medical Leave Act, the Occupational Safety and Health Act, and the National Labor Relations Act, as well as Unemployment Insurance and Workers' Compensation programs (GAO, 2007). Misclassification as an independent contractor may deny a worker the protections of those laws. Additionally, independent contractors do not have access to many employer-provided benefits such as health insurance and pension programs. Workers who discover that they have been misclassified after having worked for an employer for a period of time will likely incur significant legal costs to correct the classification and receive appropriate back wages. In contrast, past survey results have suggested that some workers prefer this type of work arrangement because it allows a flexible schedule to accommodate other obligations, meets a need for additional income, or substitutes for a more permanent job, or even because it avoids required tax obligations (GAO, 2007).

As discussed above, employers who misclassify workers may achieve significant administrative and labor cost reductions, giving them a profound advantage over other employers who do not misclassify their workers. This, in turn, may generate a loss in wages and benefits to workers and a loss in tax revenues for Federal and state governments. According to one estimate, if only one percent of all employees were misclassified nationally, the loss in overall unemployment insurance revenue due to underreporting would be nearly $200 million annually (GAO 2009 and Planmatics, Inc. 2000). This may be an underestimate; some states report losing between $5 million and $20 million annually on unemployment insurance payments alone (Harris, 2010; Canak and Adams, 2010; National Alliance for Fair Contracting, Inc., 2009). The GAO estimates that misclassification results in more than $2.7 billion per year in unpaid Social Security, unemployment insurance, and income taxes (GAO, 2007). To address the problem of misclassification, DOL launched the Misclassification Initiative under the auspices of Vice President Biden's Middle Class Task Force. Furthermore, the Secretary of Labor recently announced the signing of a Memorandum of Understanding (MOU) between DOL and the Internal Revenue Service (IRS) that will allow the agencies to work together and share information more easily in order to reduce the incidence of misclassification of employees, help reduce the tax gap, and improve compliance with federal labor laws. However, data that would facilitate a better understanding of the scope and magnitude of misclassification are currently lacking. Such data are critical for determining the impact of misclassification on both compliant and misclassified workers and employers. The data collection effort described here will facilitate the efforts of state governments and workers' compensation programs in their outreach and education of workers and employers. There are two target groups of respondents for this data collection effort: workers and employers. These are described in further detail under Question 2 below. The information collected in both components is necessary for the Department's outreach and education of workers and employers. Results will be used by the U.S. Department of Labor to improve policies and benefits for all workers and employers and to inform the Department's collaboration with other state and federal agencies. The type of data collection described here is authorized by 29 USC 551 and Public Law 111-117, Division D (December 16, 2009).

2. Indicate how, by whom, how frequently, and for what purpose the information is to be used. For revisions, extensions, and reinstatements of a currently approved collection, indicate the actual use the agency has made of the information received from the current collection.

DOL plans to compile an analytical research report on the findings and results of a nationally representative survey of workers. We will also report on a qualitative study that includes results from in-depth interviews with employers (Abt Associates has been contracted to assist in the research). The information collected from the survey and in-depth interviews will support the Department's Misclassification Initiative and its efforts to promote fair hiring practices and access to critical workplace benefits, opportunities, and protections.

Information about what is collected and how it will be used is provided under the descriptions of each data collection effort below.

Worker Survey

DOL is seeking to administer a new survey to collect information about employment experiences and workers’ knowledge of basic employment laws and rules so as to better understand employees’ experience with their classification status. This is the first time DOL will field a survey to examine worker classification. The survey instrument utilizes and adapts existing survey questions, and it also incorporates new survey questions specific to this study (Attachment C). The data collection effort will gather information about workers’ employment and pay arrangements and will measure workers’ knowledge about their current job classification, and their knowledge about the rights and benefits associated with their job status.

Employer Research (In-Depth Interviews)

The second target group for this data collection is employers, employer consultants, and employer representatives from industries that are known through audit research to have higher rates of misclassification. This research is qualitative and is designed to help the Department better understand part of the decision processes and challenges that employers face when making hiring and staffing arrangements.

The existing literature suggests a number of reasons why employers classify workers as other than employees, and these reasons can differ both across industries and within them (Planmatics, 2000). Some misclassification is an honest error. In other cases, the nature of work in some industries or occupations lends itself more naturally to alternative employment structures, such as seasonal labor in farming and construction work. Consequently, some of the cost pressures associated with employment (versus independent contractor classification) will likely affect all businesses of a certain size (the cost of health care, for example). Others will have differential effects depending upon the industry and the nature of the work (for example, industries with a traditionally unionized workforce or, as in construction, those that entail high-risk job duties associated with high workers' compensation insurance premiums).

To better understand the factors that influence employers' hiring and employment classification practices, Abt Associates (the government contractor) will conduct 16-20 in-depth interviews with employers, employer consultants, and employer representatives. This represents approximately three interviews in each of the six industries that the Department has identified as having a higher likelihood of employees being misclassified: construction, home health, food service, trucking, hotels, and manufacturing. Three interviews per industry allow for some variation within an industry (size of the company and role of the respondent, described below). Abt anticipates that recruitment will be challenging, and the total number of interviews reflects the time and effort it will take to garner participation. This number balances the research priorities, the project schedule, and cost considerations.

These interviews will be conducted either by phone or in person, based upon the preference of the respondents. The protocols, which include open-ended questions, will serve as a guide for the interviews rather than a strict question-and-answer script (Attachment E). Detailed notes will be taken during the interviews, and these will be systematically coded using standard qualitative research methods. The interviews will be conducted by Abt researchers. Further information about the respondent groups is provided below.

Interviews with employer consultants will be conducted with staff at law and accounting firms who help companies understand and comply with employment classification regulations. Given this role, such employer consultants would have worked through issues related to worker classification across a number of employers and employment sectors.

Abt will also interview employer representatives who are leaders in trade associations and industry specific advocacy groups. They advocate on behalf of many different types of firms in their industry. They will be well versed in the regulatory and policy environment for companies in their industries and the strategies and practices of the firms they represent. The target respondent from employer representative organizations (industry specific) would be the person or persons who specialize in legislative or regulatory affairs.

Finally, Abt will interview employers. Employer respondents will be identified in a number of ways: 1) through information Abt learns in the first round of employer consultant and employer representative interviews; and 2) from the list of industries identified as having a higher likelihood of misclassifying employees. At the company level, the target respondent will be someone in an executive position who knows the most about the firm's higher-level strategy issues, including human resources policies, hiring, and worker classification practices. If this knowledge resides with more than one person, Abt would suggest a joint interview and would make every accommodation to make the interview convenient for the respondents. The contractor will analyze and report the results, but the identity of the employers and organizations that participate (or any potentially identifying information) will not be shared with DOL.

3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

Worker Survey

Abt will field the worker survey using a computer assisted telephone interview (CATI) technique. Abt has attempted to minimize burden in a number of ways. Respondents will be offered the opportunity to conduct the survey at a time most convenient to them, and they may call a toll-free number to schedule or conduct an interview. Following standard survey protocol, respondents will be informed that they may choose not to answer any question and that they may end the survey at any time. Further, respondents will be offered the opportunity to begin the survey in one session and finish it in another. The Worker Survey interviews are expected to take on average fifteen minutes per respondent.

In-Depth Interviews

All interviews will be conducted either in person or by telephone. Telephone interviews will reduce costs for travel time and expenses. The interviews will not be recorded.

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose(s) described in 2 above.

Worker Survey

A review of relevant research and national surveys has revealed that there are no other recent surveys of a large, random sample of households that ask questions specifically addressing employment classification. The Contingent and Alternative Employment Arrangement (CAEA) supplement to the Current Population Survey includes some questions that identify independent contractors, on-call workers, temporary help, and workers provided by contract firms. However, two factors limit use of CAEA data for the purposes of this research: 1) it has not been conducted since 2005, and 2) it does not contain the important, detailed questions about the job duties performed in contingent or alternative work arrangements that would enable researchers to identify whether or not the employment classification is appropriate.

DOL, the IRS and States perform employer audits to investigate worker classification, but estimates of the extent of misclassification are highly variable because the overall percentage of employers audited is very small and misclassification varies by industry (GAO, 2009). Furthermore, a limitation of using compliance action information is that investigations are often triggered by complaints alleging violations; therefore, investigated employers would not necessarily be representative of all employers. Additionally, while the Contingent Work Supplement to the Current Population Survey includes questions about alternative work arrangements, it has not included questions that are pertinent to workers' understanding of their employment status or of the rights and benefits that are associated with employment status. No existing survey of a national, random sample of workers includes questions related to potential misclassification. A study funded by the Ford Foundation, the John Randolph Haynes and Dora Haynes Foundation, the Joyce Foundation and the Russell Sage Foundation entitled "Working without Laws" contained a survey that addressed some questions about worker classification, but this was not conducted with a nationally representative sample.

This survey brings together questions and data from three types of sources that allow us to specifically address required topics:

1. New Survey Questions to capture new, relevant and required information:

Worker classification questions derived from components of the Economic Reality Test, the IRS Form SS-8, and the ABC Test for employment classification, as well as employee rights and knowledge questions.

2. Adaptation of Existing Survey Questions for new, relevant and required information. The Working Without Laws Survey contains several relevant questions; however, they have not been administered to a randomly selected, representative sample of Americans. Similarly, we have adapted questions from the Canadian Self-Employment Survey that had not previously been administered to a representative sample in the United States.

3. Existing or Adapted Survey Questions from other surveys: (American Time Use Survey, Current Population Survey (CPS), CPS Basic Monthly, CPS March Supplement, NHIS Questionnaire, Family and Medical Leave Act (FMLA) Employer and Employee Surveys, CPS Contingent and Alternative Employment Arrangements Supplement). These questions are used for a variety of purposes: 1) to update information from the CPS Contingent and Alternative Employment Arrangements Supplement, which was a special supplement to the CPS that was last conducted in 2005; 2) to provide relevant, comparable information to other national surveys; and 3) for the purposes of weighting. To our knowledge, the supplement is not scheduled for inclusion on the CPS in the near future.

Abt has included a crosswalk of the adapted questions from the above surveys to be included in the worker survey (Attachment F).

In-Depth Interviews

In preparation for this research, Abt conducted a review of the economic and social policy research on employment misclassification. Although the IRS and many states perform audits to investigate misclassification, the information collected is quantitative and is used to estimate rates of misclassification. Furthermore, audit-based reports do not describe the regulatory and enforcement environments employers face regarding worker classification, nor do they provide information about employers' knowledge, attitudes, and practices around classifying workers. Additionally, as discussed above, a limitation of using compliance action information is that investigations are often triggered by complaints alleging violations; therefore, investigated employers would not necessarily be representative of all employers.

5. If the collection of information has a significant impact on a substantial number of small businesses or other small entities describe the methods used to minimize burden.

Worker Survey

The telephone survey technique is being used to minimize burden on workers. Small businesses and other small entities are not involved.

In-depth Interviews

Abt expects that no more than four of the twenty in-depth interview participants will represent small businesses. Abt anticipates that this number of interviews will be sufficient to gain substantial qualitative insight into the experience of small businesses with regard to worker classification, given the focus of the research questions being posed in the study. The Small Business Administration's Office of Advocacy defines a small business as an independent enterprise having fewer than 500 employees. The data collection procedures have been designed to minimize the burden on these individuals, as well as on representatives from larger organizations, in the following ways: 1) the advance letter and accompanying materials (Attachment E) inform the respondents of the objectives of the interview; being notified in advance will allow the interviews to be conducted more efficiently and effectively, and participants may review these materials at their convenience; and 2) participants will be given every opportunity to conduct interviews at a time and location that is most convenient to them.

6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

If the proposed data collection is not accomplished, the Congress, the Department of Labor, and other policy makers will have no substantive, relevant data upon which to base policy decisions regarding worker classification. For example, current labor law does not require employers to disclose information regarding employment status. In the absence of required disclosures, workers may not have information about benefits to which they are legally entitled. If workers are misclassified, the GAO estimates that unpaid taxes may total more than $2.7 billion per year in Social Security, unemployment insurance, and income tax. More data are needed about the nature, scope, and magnitude of worker misclassification so that Federal and State policymakers can identify how best to address the issue of how workers are classified.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • Requiring respondents to report information to the agency more often than quarterly;

  • Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • Requiring respondents to submit more than an original and two copies of any document;

  • Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • Requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.

The survey and in-depth interviews will not involve any of these circumstances.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and record-keeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years—even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

The Department has conducted extensive outreach efforts with Congressional, academic, and private industry constituencies as well as interested agencies within the Executive branch. Comments and suggestions from all interested parties were solicited, reviewed and considered in preparing for the final survey product in an effort to efficiently collect required information while minimizing the reporting burden on the public. Additionally, the Department has convened an advisory panel of outside experts to review the research design. The advisory panel members, their institutional affiliation and area of expertise are included below:

I. Labor Law

1. Catherine Ruckelshaus, Legal Co-Director, National Employment Law Project. Expertise: immigrants and work, enforcement of workplace standards, wage and hour protections, nonstandard workforce.

II. Economic/Policy

2. Ross Eisenbrey, Vice President, Economic Policy Institute. Expertise: labor and employment law, occupational safety and health, pension policy.

III. Private Business/Employer

3. William C. Dunkelberg, Chief Economist, National Federation of Independent Business (NFIB). Expertise: small business, entrepreneurship, consumer behavior and consumer credit, and government policy.

IV. Government Enforcement and Regulation

4. Connie Klipsch, Former Regional Administrator, Department of Labor, Wage and Hour Division (WHD). Expertise: Fair Labor Standards Act (FLSA) employment classification.

V. Academic Perspective

5. Abel Valenzuela, Jr., Professor of Urban Planning and Chicano Studies; Director, Center for the Study of Urban Poverty - ISSR, UCLA. Expertise: day labor, informal economy.

6. Allison Morantz, Professor of Law, Stanford University. Expertise: organizations, labor and employment law.



The comments and suggestions we received from these sources and the advisory panel recommended that this data collection effort bring together new information on workers' employment arrangements and workers' knowledge of their rights and benefits, together with information that would also allow comparability across time, as described in response to Item 4 above.

In addition, WHD published a Federal Register Notice on January 11, 2013, inviting public comments about this information collection (78 FR 2447). The agency received 36 timely comments; 23 expressed unique views, while seven requested an extension of the comment period. All comments addressed the information collection through the worker survey and/or the employer/employer group interviews. Comments were received from the following entities: American Federation of Labor and Congress of Industrial Organizations (AFL-CIO); Direct Selling Association (DSA); National Association of Manufacturers (NAM); National Federation of Independent Business (NFIB); Society for Human Resource Management (SHRM); United Brotherhood of Carpenters (UBC); Connecticut Department of Labor; International Brotherhood of Electrical Workers (IBEW); National Association of Wholesaler-Distributors (NAW); Associated Builders and Contractors, Inc. (ABC); Associated General Contractors (AGC) of America; American Trucking Associations (ATA); U.S. Chamber of Commerce (the Chamber); Coalition to Promote Independent Entrepreneurs (the Coalition); College and University Professional Association for Human Resources (CUPA-HR); Grawe Law; HR Policy Association; Morgan Lewis & Bockius LLP; Pepper Hamilton LLC; and WorldatWork. The agency considered all comments.

AFL-CIO states, “DOL’s proposed information collection request is a vital step towards fashioning an appropriately comprehensive response to the widespread and damaging problem of employee misclassification.” Similarly, UBC writes, “The DOL proposal offers a necessary and overdue examination of [the employee misclassification] problem.” The Department agrees that information needs to be collected on workers’ knowledge of their classification status and their knowledge of the rights associated with that status. The AFL-CIO further states, “DOL has appropriately sought to minimize the burden of the proposed information collections on survey respondents to the extent feasible.”

Most comments suggested specific clarification in the instructions or wording of a given question on the worker survey. The Department appreciates the detailed feedback and has reviewed and adopted suggestions as appropriate. Many comments also suggested deleting or adding questions to the worker survey that related to how workers are classified, or further categorization of survey responses. While the Department appreciates the desire for more data, asking additional questions would lengthen the time necessary to administer the survey and potentially negatively impact survey response rates. The Department has reexamined the survey questions, has eliminated some existing questions and has added some new questions in response to commenters’ suggestions and to maximize the value of the data received from the worker survey. The Department believes that the revised survey will generate the data sought to increase understanding of worker knowledge of employment classification while minimizing the burden to the public.

Some comments suggested an increased sample size in order to conduct sub-group analyses of interest (see comments from ABC, CUPA-HR, NAM, NAW, and The Chamber). NAM, NAW, ABC, and CUPA-HR write, “In order to obtain useful results, the sample plan needs to obtain enough completed interviews to enable statistically reliable estimates of target items (e.g., self-employed versus employee and correctly versus incorrectly classified) cross-tabulated by salient respondent characteristics.” The Chamber writes, “The survey sample size should be large enough to support statistically reliable estimates of classification and classification correctness cross-tabulated by salient worker characteristics such as gender, age cohort, educational attainment, occupation and industry.”

The study is not necessarily designed to support inference for subgroups defined by the cross-classification of employment status and classification status, largely because the true misclassification rate is not known at this time. The Department acknowledges that many subgroup analyses could yield valuable findings, but the sampling design reflects the funding available under the current contract.

Several comments suggest that the proposed worker survey is unnecessary (see comments by DSA, NAM, NAW, AGC, ABC, CUPA-HR, NFIB, and ATA). NFIB, NAM, NAW, ABC and CUPA-HR write, “…through disclosures required under state law and the Internal Revenue Code, workers already have access to a significant amount of information regarding their employment status and pay (including hours worked, pay rates and wages paid).” Similarly, AGC states, “The proposed collection of information is unnecessary for the performance of the functions of the WHD and will therefore not have practical utility.”

DOL acknowledges that some information on employment status is available from other sources, including audits conducted by the Internal Revenue Service (IRS); information from this survey will complement that existing information. The survey data will be population based, allowing for the computation of rates of perceived self-employment status, rates of workers being treated as self-employed, and rates of workers "likely misclassified" according to appropriate legal criteria. In addition, DOL recognizes that information and resources already exist that allow workers to better understand their classification, including information available on DOL's website. The goal of the survey is not to educate workers on worker classification, but to assess workers' knowledge of their classification status and their knowledge of the rights associated with that status. A better understanding of the scope and magnitude of potential classification irregularities or discrepancies is critical to WHD's mission of protecting and enhancing the welfare of the Nation's workforce.

Several comments addressed a perceived bias in the survey language, or questioned whether the survey would yield credible results (see comments from DSA, ATA, ABC, CUPA-HR, NAM, NAW, Morgan Lewis & Bockius LLP, the Coalition). NAM, NAW, ABC, and CUPA-HR assert that the introductory survey statement “…is, at best, value laden and thus a potential source of bias.” Similarly, Morgan Lewis & Bockius LLP note that, “The tone and language of the proposed Survey questions appear to be likely to elicit responses suggesting worker misclassification, and fail to provide the interviewer with sufficient data to evaluate the appropriateness of the worker's classification…. The suggestive language and tone of the questions appear highly likely to yield biased, unreliable responses.” The Coalition writes, “In the Survey, researchers will make worker-status determinations based entirely on Survey responses provided by individuals only, and with no cross-examination of the individuals to test the validity of their responses. It follows that these determinations would be patently unreliable.”

The introductory survey questions and interviewer instructions to determine the main job have been revised. More broadly, the Department acknowledges that a detailed classification determination must consider a multitude of facts associated with each case. The Department does not promote the survey as an instrument that wholly determines classification. Rather, data from the survey can provide an estimate of potential classification irregularities or discrepancies as well as information about workers' knowledge of basic employment laws and rules. Survey interviewers are not required to have knowledge of legal standards, as they themselves do not make classification determinations; they simply follow questions and logic patterns through an automated Computer Assisted Telephone Interviewing system. The survey questionnaire and its logic patterns, as well as the algorithms that will be used to analyze the survey data, have been developed collaboratively by Abt Associates researchers, the Department's Office of the Solicitor, and experts at the Wage and Hour Division. All employment classification determinations will be made under the close guidance of the Department's Office of the Solicitor, whose mission is to meet the legal service demands of the Department. All analytic plans for the worker classification data collected, including the scenarios and resulting algorithms that determine employment classification, will be carried out only after detailed review, discussion, and final approval by the Department.
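For illustration only, the following is a minimal, hypothetical sketch of how survey responses might feed an ABC-test-style "likely misclassified" flag. It is not the Department's actual algorithm, which will be finalized under the guidance of the Office of the Solicitor; all variable names, defaults, and logic below are assumptions for demonstration.

# Illustrative sketch only: a simplified, hypothetical flag for "likely
# misclassified" built from survey responses. The actual survey algorithms
# will be developed and approved under DOL guidance; the field names and
# logic below are assumptions for demonstration.

def likely_misclassified(resp: dict) -> bool:
    """Return True if a respondent treated as self-employed answers in a way
    that suggests employee status under ABC-test-style criteria."""
    treated_as_self_employed = resp.get("receives_1099", False)

    # ABC-style indicators (all hypothetical variable names):
    # A: free from the control and direction of the hiring entity
    free_from_control = not resp.get("employer_sets_hours", True)
    # B: work performed outside the usual course of the hiring entity's business
    outside_usual_business = resp.get("work_outside_core_business", False)
    # C: customarily engaged in an independently established trade
    independent_trade = resp.get("has_other_clients", False)

    looks_like_employee = not (free_from_control and outside_usual_business
                               and independent_trade)
    return treated_as_self_employed and looks_like_employee


if __name__ == "__main__":
    example = {
        "receives_1099": True,
        "employer_sets_hours": True,
        "work_outside_core_business": False,
        "has_other_clients": False,
    }
    print(likely_misclassified(example))  # True: flagged for further review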

Some comments suggested confusion regarding the method of data collection and analyses for the employers/employer groups, with several comments referring to the “employer survey” (see comments from Grawe Law and HR Policy Association). Additionally, several comments raised concerns about a perceived bias regarding the employer-focused data collection efforts (see comments from WorldatWork, Pepper Hamilton LLC, HR Policy Association, and Morgan Lewis & Bockius LLP). Pepper Hamilton LLC writes, “The information gathered from only 16-20 representatives of employers/businesses, and to focus on industries that the DOL “has identified as having a higher likelihood of employees being misclassified” is likely to produce highly skewed information and data.” Similarly, HR Policy Association writes, “The results will be, by design, biased and not reflective of the broader use of independent contractors in the U.S. economy. Moreover, the limited number of interviews—three per industry—is not nearly enough for the Department to accurately obtain a broad range of employer views within these six industries.”

To clarify, the employer-focused data collection is not a survey but rather qualitative, exploratory one-on-one interviews. The goal of exploratory research is to formulate problems, clarify concepts, and form hypotheses. It is not designed to be representative. Since the Department is interested in learning why and how misclassification may be taking place, it is appropriate to focus on those industries where it is suspected that misclassification occurs. From this research, the Department can begin to understand the decision factors. This research was designed to provide preliminary insight. If common themes and motivations are identified, those may be helpful in expanding future research efforts.

9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

Worker Survey

The sample design for the survey includes a 40% allocation to telephone numbers obtained from a list of cellular telephone exchanges, or cellular frame cases. This allocation balances (i) survey costs to the extent possible (cell interviews are twice as expensive as landline interviews) and (ii) the precision of the survey estimates (having more cell interviews reduces large weights and, in turn, reduces the standard errors). This 40% allocation to the cell frame is somewhat higher than is typical for national dual-frame RDD surveys. Cost considerations often lead survey designers to keep the cell allocation as low as possible (AAPOR Cell Phone Task Force, 2010). For this study, however, the cell RDD frame is much more effective than the landline frame for reaching adults in the labor force, racial and ethnic minorities, and low-income households (Link et al., 2007), all key characteristics in this study. Additionally, vulnerable workers (those who are seasonal or paid in cash) often use disposable cellular phones, which are included in the cell frame. By conducting 40% of the interviews in the cell sample, Abt will be able to bring in more respondents belonging to these important analytic groups.

This amounts to 4,000 completed interviews on cell phones. To compensate for telephone charges incurred from the survey and to encourage the young/mobile/single population to participate, Abt will remunerate cell phone respondents $10. This effort will help minimize the risk of nonresponse bias, achieve a high response rate, and, ultimately, improve our estimates among cell-only or cell-mostly respondents, who are more difficult to reach. The RDD landline telephone respondents will not be offered any gift or payment.
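As a rough arithmetic check on the allocation described above, the following sketch (assuming roughly 10,000 completed extended interviews, consistent with the burden estimates in Item 12) reproduces the 40 percent cell-frame split and the implied cell remuneration cost; the total incentive cost shown is a derived figure, not a contract amount.

# Back-of-the-envelope check of the dual-frame allocation described above.
# Assumes roughly 10,000 completed extended interviews; the $10 remuneration
# applies only to cell-frame respondents.

completed_interviews = 10_000      # approximate target for the extended interview
cell_share = 0.40                  # allocation to the cellular RDD frame

cell_completes = int(completed_interviews * cell_share)
landline_completes = completed_interviews - cell_completes
cell_incentive_cost = cell_completes * 10   # $10 per cell respondent

print(cell_completes)        # 4000 completed cell interviews
print(landline_completes)    # 6000 completed landline interviews (no incentive)
print(cell_incentive_cost)   # 40000 dollars of cell remuneration (derived)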

A nonresponse follow-up survey (NRFU) is one of the methods we will use to evaluate nonresponse in the Worker Survey, and it will include incentive payments. Using current nonresponse imputation models without adequate representation from hard-core refusals could bias survey results enough to affect the quality of the eventual data. Abt expects to complete approximately 200 NRFU interviews, which will provide a sufficient case base for meaningful nonresponse analysis.

The NRFU will collect information on workers who fail to respond to the survey and provide insight into whether the nonrespondents differ from the respondents on the characteristics of interest (e.g., employment experiences and workers’ knowledge of basic employment laws and rules). Specifically, interviewers will call back a subsample (n=500) of households that declined the original survey. In addition, among this subsample, all landline cases that can be matched to an address (through reverse lookup) will receive a letter encouraging them to cooperate with the interview. Through the letters and call backs, Abt will attempt to recruit an eligible employee to complete a shortened interview featuring a $20 remuneration.
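To illustrate the kind of comparison the NRFU supports, here is a minimal sketch contrasting a characteristic of interest between main-survey respondents and NRFU respondents; the variable and the data values are purely hypothetical, and the actual analysis plan is described in Part B and Attachment D.

# Illustrative sketch of a nonresponse bias check: compare a characteristic of
# interest between main-survey respondents and NRFU respondents. The variable
# names and data are hypothetical.

from statistics import mean

main_survey = [1, 0, 1, 1, 0, 1]   # e.g., 1 = knows own classification status
nrfu_sample = [0, 0, 1, 0, 0]      # responses from initially refusing households

main_rate = mean(main_survey)
nrfu_rate = mean(nrfu_sample)

# A large gap would suggest that nonrespondents differ from respondents on the
# characteristic of interest, i.e., potential nonresponse bias.
print(f"Main survey rate: {main_rate:.2f}")
print(f"NRFU rate:        {nrfu_rate:.2f}")
print(f"Difference:       {main_rate - nrfu_rate:+.2f}")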

Incentives are a common feature in NRFU surveys because, by definition, the NRFU sample did not cooperate with the original survey, and so a major change in the recruitment protocol is required to elicit cooperation in the NRFU. Tourangeau and colleagues (1997) noted in their report to the Federal Highway Administration (FHWA-PL-98-029) that larger monetary incentives (e.g., $20 to $50) are a common element of NRFU designs for household surveys. For example, Peytchev et al. (2009) documented how a $20 incentive was used in a successful NRFU to the National Intimate Partner and Sexual Violence Survey for the Centers for Disease Control and Prevention.

In-Depth Interviews

Employers, employer representatives, and employer consultants who participate in the qualitative data collection receive no payments or gifts. Abt will provide respondents with an executive summary report of the findings upon completion of the project. DOL has approved the provision of the executive summary report to respondents, provided this occurs after DOL has published the findings on its website.



10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Verbal assurance of privacy will be provided to all respondents of the worker survey and in-depth interviews and in materials mailed to respondents. In addition, Abt Associates will take measures to remove key identifiers (e.g., respondent name, name of company, and name of employer) prior to data analysis, so that individual responses or aggregate results cannot subsequently be linked to a specific individual or employer. The basis for the assurance of privacy is the privacy statement and non-disclosure agreement that are part of the project's contract. In addition, see Section B.5 for a detailed explanation of the steps Abt will take to ensure the privacy of the public-use data set.

The survey data will be stored on an Abt-SRBI computer that is protected by a firewall that monitors and evaluates all attempted connections from the Internet. Private information on each survey respondent (name and telephone number only) will be maintained in a separate data file, apart from the survey data, so that it is not possible to link particular responses to individual respondents. Once the survey is completed, all private data on each respondent will be destroyed. Any data used for analysis by the contractor or the Department will be completely de-identified. The entire database will be encrypted so that any stored data are further protected. Finally, access to any data with identifying information will be limited to contractor staff directly working on the survey.
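As an illustration of the separation of identifiers from survey data described above (not Abt's actual processing pipeline; the field names and records are hypothetical), a de-identification step might look like the following sketch.

# Illustrative sketch: separate identifying contact information from
# substantive survey responses so analysis files carry no direct identifiers.
# Field names and example records are hypothetical.

IDENTIFIERS = {"name", "telephone_number"}

def split_identifiers(rows):
    """Return (contact_rows, deidentified_rows) from raw survey records."""
    contacts, deidentified = [], []
    for row in rows:
        contacts.append({k: v for k, v in row.items() if k in IDENTIFIERS})
        deidentified.append({k: v for k, v in row.items() if k not in IDENTIFIERS})
    return contacts, deidentified

if __name__ == "__main__":
    raw = [{"name": "R. Smith", "telephone_number": "555-0100",
            "industry": "construction", "receives_1099": "yes"}]
    contacts, analysis_file = split_identifiers(raw)
    print(analysis_file)  # [{'industry': 'construction', 'receives_1099': 'yes'}]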

Participation in the survey and in-depth interviews is voluntary. All analyses, summaries or briefings will be presented at the aggregate level and it will not be possible to link specific responses to individual respondents or their employers in any way. The database delivered to DOL will not include any identifying information such as names, addresses, telephone numbers, or social security numbers, or any other information that might support reverse identification of respondents.

The exact statement indicating the privacy of respondents’ answers is attached (Attachment B).

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

The survey does not contain highly sensitive questions. Respondents are asked to report about their employment arrangements, but they provide only very general information about their employer (or the source of wages) for the work they conduct. This information is limited to the industry in which they work and the total number (reported in ranges) of workers at the firm/company. Such questions do not collect sufficient information to make any employer individually identifiable. The income and demographic questions included are standard survey questions.

Abt conducted cognitive tests on the worker survey with nine volunteer respondents in Chicago. These purposively selected respondents included workers who were potentially misclassified in order to test the applicability of questions on different types of workers (employees and self-employed). The respondents came from a diversity of backgrounds, industries, and education levels in order to test applicability of the questions for different types of workers (salaried versus hourly, for example) and to capture the range of possible comprehension issues. The survey included in this package reflects the findings from those interviews. The cognitive testing did not reveal any hesitance from respondents in answering the survey questions.

The in-depth interviews with employers do not include any sensitive questions.

12. Provide estimates of hour burden of the collection. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

  • If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-1.

Worker Survey

Annual hour burden:

1) Screeners: 17,906 households x 5 minutes each = 1,492 hours

2) Extended interviews: 10,060 (a) x 15 minutes each = 2,515 hours

Total burden (17,906 unduplicated respondents, 24,966 total responses) = 4,007 hours

Main survey annualized cost to respondents (4,007 hours at $23.28 (b) per hour): $93,283

3) Nonresponse interviews: 200 x 5 minutes each = 17 hours

Annualized cost to respondents (17 hours at $23.28 per hour): $396

TOTAL ANNUALIZED COST TO RESPONDENTS: $93,679

(a) Includes sixty (60) pre-test cases.

(b) U.S. Department of Labor, Bureau of Labor Statistics, Table B-3. Average hourly and weekly earnings of all employees on private nonfarm payrolls by industry sector, seasonally adjusted (accessed January 2012 from http://www.bls.gov/webapps/legacy/cesbtab3.htm).

The worker survey annual hour burden contains three components: a screening interview, an extended interview, and a nonresponse interview. The worker survey is a general population study, and therefore the average hourly rate for all employees on private nonfarm payrolls was used to determine the costs. Eligible households must contain at least one person who worked for pay during the target reference period. To determine eligibility, interviewers will conduct a short interview (up to five minutes) with 17,906 households. It is estimated that, of those, 10,060 will go on to complete the extended interview (line 2 above). Finally, to analyze nonresponse bias, we will conduct a nonresponse follow-up survey with up to 200 households that failed to respond to the main survey. This will be a five-minute survey (line 3 above).
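The arithmetic behind the worker survey burden estimates can be reproduced as a check with the sketch below; it simply restates the figures in the table above, rounding minutes to whole hours and costs to whole dollars.

# Reproduces the worker-survey burden arithmetic shown above.
# Minutes are converted to hours and rounded as in the table.

HOURLY_WAGE = 23.28  # average hourly earnings, all private nonfarm employees

screener_hours = round(17_906 * 5 / 60)    # 1,492 hours
extended_hours = round(10_060 * 15 / 60)   # 2,515 hours
main_hours = screener_hours + extended_hours   # 4,007 hours
main_cost = round(main_hours * HOURLY_WAGE)    # $93,283

nrfu_hours = round(200 * 5 / 60)           # 17 hours
nrfu_cost = round(nrfu_hours * HOURLY_WAGE)    # $396

total_cost = main_cost + nrfu_cost             # $93,679
print(main_hours, main_cost, nrfu_hours, nrfu_cost, total_cost)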







In-Depth Interviews

Annual hour burden:

Recruitment (includes calls and review of materials): 100 executives x 15 minutes each = 25 hours

In-depth interviews: 20 x 60 minutes each = 20 hours

Total burden (100 unduplicated respondents, 120 total responses) = 45 hours

ANNUALIZED COST TO RESPONDENTS (45 hours at $85.02 (a) per hour): $3,826

(a) U.S. Department of Labor, Bureau of Labor Statistics, Table 1. National employment and wage data from the Occupational Employment Statistics survey by occupation, May 2011; mean hourly wage of chief executives (accessed May 2012 from http://www.bls.gov/news.release/ocwage.t01.htm).

13. Provide an estimate for the total annual cost burden to respondents or record-keepers resulting from the collection of information (Do not include the cost of any hour burden shown in Items 12 and 14).

  • The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

  • If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

  • Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.

Worker Survey

The survey will not involve any additional cost burden, other than that described above.

In-Depth Interviews

The in-depth interviews will not involve any additional cost burden, other than that described above.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.

Worker Survey

This survey involves a one-time cost to the Federal Government and to the respondents. The cost to the Federal Government for the Worker Survey totals approximately $1.6 million. This includes creating and fielding the survey, incentives for respondents, analysis, and reporting on the results. See below for a detailed breakdown of these costs.

In-Depth Interviews

These in-depth interviews involve a one-time cost to the Federal Government and to the respondents. The cost to the Federal Government for the in-depth interviews totals $100,500. This includes developing the in-depth interviews protocol, conducting the in-depth interviews, providing a summary report to respondents, analysis, and reporting on the results. See below for a detailed breakdown of these costs.

Total Cost

Cost to the Federal Government to produce the worker survey and in-depth interviews totals $1.8 million. A breakdown of these costs is presented in Exhibit 1.

Exhibit 1. Breakdown of Costs by Project Tasks

Sample and survey design: $399,500 (22% of total cost)

Data collection, processing, and management: $900,000 (50% of total cost), comprising:
   Interviewing: $702,000
   CATI programming and data management: $108,000
   Project management: $90,000

Analysis, review, and interpretation of the findings: $250,000 (14% of total cost)

Preparation of reports and documentation: $150,000 (8% of total cost)

Conduct in-depth interviews: $100,500 (6% of total cost)

15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-1.

This is a new collection of information.

16. For collections of information whose results are planned to be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

Worker Survey

Data collected in the Worker Survey will be analyzed and results provided in a report to be issued by DOL. Data will be presented primarily in a descriptive statistical manner, employing cross-tabulations. Please see Attachment D for more detail.

In-Depth Interviews

Qualitative data collected from the in-depth interviews with employers, employer consultants, and industry-specific employer representatives will be analyzed and results provided in a report to be issued by DOL. The non-statistical data collected will contribute to a formative and summative assessment of respondents' knowledge of, and attitudes toward, the challenges and decision processes related to hiring and employing permanent employees, contractors, and other alternative staffing arrangements.

The analysis will consist of coding text data, an iterative process that includes reading, reviewing, and filtering data to identify prevalent themes relating to each of the research questions. Researchers will code the data using a qualitative analysis software package. Qualitative analysis software programs such as NVivo facilitate the analysis of large quantities of qualitative data by enabling researchers to develop and test hypotheses using codes that are assigned to specific portions of the narrative. Such software also allows the research team to organize and categorize data within a case or across cases.
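As a simple illustration of this coding-and-tallying approach (the actual analysis will use a dedicated qualitative package such as NVivo; the codes, cases, and counts below are hypothetical), themes can be counted across coded interview segments:

# Illustrative sketch of thematic coding: assign codes to interview excerpts
# and tally how often each theme appears across cases. Codes and cases are
# hypothetical.

from collections import Counter

coded_segments = [
    {"case": "employer_01", "codes": ["cost_pressure", "seasonal_work"]},
    {"case": "employer_02", "codes": ["regulatory_uncertainty"]},
    {"case": "consultant_01", "codes": ["cost_pressure", "industry_norms"]},
]

theme_counts = Counter(code for seg in coded_segments for code in seg["codes"])
for theme, count in theme_counts.most_common():
    print(theme, count)   # e.g., cost_pressure appears in two cases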

Exhibit 2 displays the time schedule for the entire project.

Exhibit 2. Worker Survey and Qualitative Data Collection Project Time Schedule

Cognitive testing of survey: May 2012

Finalize survey instruments and justification for surveys: September 2013

Office of Management and Budget (OMB) package under review: October-November 2013

Final approval by OMB: no later than November 30, 2013

Conduct in-depth interviews: January-February 2014

Telephone interviews begin: January 1, 2014

Telephone interviews end: June 30, 2014

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The expiration date will appear on the advance materials.

18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-L.

There are no exceptions.

References

American Association for Public Opinion Research. 2008. "Guidelines and Considerations for Survey Researchers When Planning and Conducting RDD and Other Telephone Surveys in the U.S. With Respondents Reached via Cell Phone Numbers." Available at http://www.aapor.org/uploads/Final_AAPOR_Cell_Phone_TF_report_041208.pdf.

Bernhardt, A., D. Polson, and J. DeFilippis. 2009. "Working Without Laws: A Survey of Employment and Labor Law Violations in New York City."

Canak, William and Randall Adams. 2010. “Misclassified Construction Employees in Tennessee.” Accessed August 14, 2012 from https://carpenters.org/misclassification/ALL%20DOCUMENTS/TN%20fraud%20study%201-15-10.pdf.

GAO. 2007. “Employee Misclassification: Improved Outreach Could Help Ensure Proper Worker Classification” (GAO-07-859T).

GAO. 2009. “Employee Misclassification: Improved Coordination, Outreach, and Targeting Could Better Ensure Detection and Prevention” (GAO-09-717).

Harris, Seth D. 2010. Statement of Seth D. Harris, Deputy Secretary, U.S. Department of Labor, before the Committee on Health, Education, Labor, and Pensions of the U.S. Senate. June 17, 2010. Accessed July 23, 2011 from http://www.dol.gov/_sec/newsletter/2010/20100617-2.htm.

Link, Michael W., Michael P. Battaglia, Martin R. Frankel, Larry Osborn and Ali H. Mokdad. 2007. “Reaching the U.S. Cell Phone Generation: Comparison of Cell Phone Survey Results with an Ongoing Landline Telephone Survey.” Public Opinion Quarterly 71:814-839.

National Alliance for Fair Contracting, Inc. Report of the Ohio Attorney General on the Economic Impact of Misclassified Workers for State and Local Governments in Ohio. February 18, 2009. Accessed August 14, 2012 from http://www.faircontracting.org/PDFs/prevailing_wage/Ohio_on_Misclassification.pdf

Planmatics, Inc. (Lalith de Silva et al.). 2000. "Independent Contractors: Prevalence and Implications for Unemployment Insurance Programs."

Peytchev, A., R.K. Baxter, and L.R. Carley-Baxter. 2009. "Not All Survey Effort Is Equal: Reduction of Nonresponse Bias and Nonresponse Error." Public Opinion Quarterly 73 (4): 785-806.

Tourangeau, R., M. Zimowski, and R. Ghadialy. 1997. “An Introduction to Panel Surveys in Transportation Studies.” Prepared for the Federal Highway Administration.
