Family and Medical Leave Act Employer and Employee Surveys, 2011

OMB: 1235-0026


2011 FAMILY AND MEDICAL LEAVE ACT (FMLA) EMPLOYER AND
EMPLOYEE SURVEYS

Supporting Statement for the Paperwork Reduction Act Submission, covering both surveys

A. JUSTIFICATION

  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.

In 1996 and 2000, the Federal Government funded the collection of nationally representative data on the Family and Medical Leave Act of 1993 (FMLA) from employees and employers. In 2006, the Department of Labor (herein referred to as "DOL" or "the Department") issued a Request for Information (RFI) seeking information from the public on the administration and use of the FMLA (http://www.dol.gov/whd/06-9489.pdf). That effort was limited by the lack of more current nationally representative data. (See p. 125 of the summary of the comments received in response to the RFI, noting also that "The RFI was not meant to be a substitute for survey research about the leave needs of the work force and/or leave policies being offered by employers." (http://www.dol.gov/whd/FMLA2007Report/2007FinalReport.pdf).) On December 7, 2009, DOL published a notice in the Federal Register stating that it "is reviewing the implementation of these new military family leave amendments and other revisions of the current regulations" (http://www.dol.gov/asp/regs/unifiedagenda/fall_2009_Regulatory_Plan.pdf). Reliable, up-to-date information is needed to determine the needs of working families as they balance the demands of work and family and to develop appropriate revised regulations. This information collection request covers comprehensive, nationally representative surveys of employees and employers on issues related to the FMLA.

  2. Indicate how, by whom, how frequently, and for what purpose the information is to be used. For revisions, extensions, and reinstatements of a currently approved collection, indicate the actual use the agency has made of the information received from the current collection.

DOL plans to update the two previous FMLA surveys and to compile an analytical research report on the findings and results of the surveys (Abt Associates has been contracted to assist in the survey research). To support analyses of changes over time, much of the design, methodology, and survey content will be unchanged from the 2000 effort conducted by Westat. However, the revised instrument includes changes to some of the questions, reflecting an analysis of the earlier survey, changes in the external environment, and past data collection experience. The final research report will fully discuss the survey findings; compare and contrast the results of the 2011 surveys with the 1995 and 2000 surveys; and estimate the number of employees eligible and not eligible for FMLA, as well as the number of employees who have taken FMLA-covered leave. Information obtained from previous surveys was used by policymakers and researchers both inside and outside the government to analyze special topics regarding the FMLA or family and medical leave issues generally. Updated data will provide insight into what has changed in the decade since the previous surveys.

  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

Employee Survey:

We will field the Employee Survey using a computer-assisted telephone interview (CATI) technique. We have attempted to minimize burden by allowing respondents to provide the information in a short telephone survey. Respondents will be offered the opportunity to conduct the survey at a time most convenient to them, and will also be offered the opportunity to call a toll-free number to schedule or conduct an interview at a time most convenient to them. Following standard survey protocol, respondents will be informed that they may choose not to answer any question and that they may end the survey at any time. Further, the respondent will be offered the opportunity to begin the survey in one session and finish it in another session. The Employee Survey interviews are expected to take on average 20 minutes per leave-taking respondent, 10 minutes per leave-needing respondent, and 7 minutes for those who are employed but are neither leave takers nor leave needers.

Employer Survey:

This survey will use a combination of Internet administration and CATI techniques. Once the appropriate employer respondent has been identified, and prior to actual survey administration, a packet will be mailed to prospective respondents containing a cover letter explaining the survey and a page describing the information about leave takers that the employer may need to gather before the telephone interview (Attachment A). We have attempted to minimize the burden on respondents by allowing them to provide the information either through the Internet or over the phone. Respondents who assisted with preliminary cognitive testing indicated that, because of automated record keeping, gathering the information to report will take no more than 30 minutes; the exact length of time will vary by company size, with smaller companies requiring less time. Based on that preliminary cognitive testing, we expect the main part of the survey to take 20 minutes over the phone; Internet administration is estimated to take between 15 and 20 minutes, depending on the answers. Total time per respondent is therefore approximately 50 minutes at a maximum.

  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose(s) described in 2 above.

Employee Survey:

We have designed the survey to provide comparison data for the FMLA Employee Survey conducted by the DOL Commission on Family and Medical Leave in 1995 and by the Department of Labor in 2000. There are no other surveys of a large, random sample of households that asked questions specifically related to the required topics regarding family and medical leave.

Employer Survey:

We have designed the survey to provide comparison data for the FMLA Employer Survey conducted by the DOL Commission on Family and Medical Leave in 1995 and by the Department of Labor in 2000. There are no other surveys of a large, random sample of businesses of diverse sizes that ask questions specifically related to the required topics regarding family and medical leave.

  5. If the collection of information has a significant impact on a substantial number of small businesses or other small entities, describe the methods used to minimize burden.

Employee Survey:

The telephone survey technique is being used for the Employee Survey to minimize burden on leave-taking and leave-needing employees. Small businesses and other small entities are not involved.

Employer Survey:

About one-third of the Employer Survey respondents are expected to represent small businesses. The Small Business Administration's Office of Advocacy defines a small business as an independent enterprise having fewer than 500 employees. The data collection procedures have been designed to minimize the burden on these respondents, as well as on representatives of larger organizations, in the following ways: 1) the advance letter and accompanying materials (Attachment A) inform respondents that survey questions may require them to consult administrative records in order to complete the instrument accurately, and being notified in advance will minimize the need to pause during the survey to consult the proper records; 2) the employer survey seeks information about a twelve-month period and allows employer respondents to select the twelve-month period most convenient for them; and 3) the Internet survey allows the respondent to end any given session and return to their previous answers at their discretion, so that they can complete the survey at a time and place most convenient to them.

  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

If the proposed data collection is not accomplished in a timely manner, the Congress, the Department of Labor, and other policy makers will have no substantive, relevant data upon which to base policy decisions regarding family and medical leave issues.

  7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • Requiring respondents to report information to the agency more often than quarterly;

  • Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • Requiring respondents to submit more than an original and two copies of any document;

  • Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • Requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.

The survey will not involve any of these circumstances.

  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and record-keeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years – even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

The original Federal Register notice appeared April 1, 2011 at 76 FR 18254. An extension to allow for comments to be received was published in the Federal Register on June 7, 2011 at 76 FR 32991.

Response to Comments for 2011 Family and Medical Leave Act Survey Federal Register Notice

The Department of Labor (the Department or DOL) published a Federal Register Notice on April 1, 2011, inviting public comments about this information collection. 76 FR 18254. The agency received 17 timely comments. Fifteen comments addressed the information collection of the surveys, of which 14 supported the Department's collection of data on the use of the Family and Medical Leave Act (FMLA). Comments were received from AARP; American Association of University Women; Aon Hewitt; Applied Economic Strategies, LLC (AES); Association of Flight Attendants-CWA; Debra Bringman; Business and Professional Women's Foundation (BPWF); Ernest Canelli III; Center for Law and Social Policy; Chamber of Commerce of the United States of America (the Chamber); Human Rights Campaign; National Center for Transgender Equality and National Coalition for LGBT Health (collectively referred to here as NCTE); National Partnership for Women and Families (NPWF); National Women's Law Center; United States Senate, Committee on Health, Education, Labor, and Pensions (the Senate); a group of 91 worker advocacy and/or workplace flexibility associations and organizations (collectively referred to here as Advocacy Organizations); and an unidentified individual. The agency considered all comments received during the open comment period.

Aon Hewitt states, "We applaud the Department of Labor for undertaking this survey…. We believe the survey to be comprehensive and that it will reap valuable data." Most comments express similar support. The Department agrees that new information needs to be collected on the use of and need for FMLA leave in order to update understanding of leave-taking behavior and to close data gaps remaining from the previous surveys in 1995 and 2000 and the DOL's Request for Information in 2006. The Advocacy Organizations further state that "…the information collected through the surveys will have practical utility…." and "…impose minimal burdens."

Many comments suggest specific clarification in the instructions or wording of a given survey question. The Department appreciates the detailed feedback and has adopted suggestions as appropriate. Most comments also suggest adding questions to be asked of employees and/or employers on a wide range of issues relating to FMLA leave and workplace policies, workplace flexibility, or demographic questions for further categorization of survey responses. While the Department appreciates the desire for more data, asking additional questions would lengthen the time necessary to administer the survey and potentially result in a negative impact on survey response rates. The Department has reexamined the survey questions, and has eliminated or edited some existing questions while adding some new questions in response to commenters’ suggestions, in order to maximize the value of the data received. For example, in response to comments, the Department added a question to the employee survey regarding employee knowledge of FMLA leave reasons. The Department believes that the revised survey will generate the data sought to increase understanding of the current use of FMLA leave and other employer policies while minimizing the burden to the public.

A few comments suggest gathering more information about the military family leave provisions resulting from amendments to the FMLA pursuant to the National Defense Authorization Act for Fiscal Years 2008 and 2010 (see comments by NPWF, AES, and the Chamber). The Department agrees that obtaining more data pertaining to the military family leave provisions of the FMLA would be valuable. As originally proposed, this information collection contained 12 questions that addressed the military family leave provisions of the FMLA. In order to capture additional data for leave taken under these provisions, the Department modified two questions concerning leave reasons to include military family leave and added four new questions. For example, the Department modified the question asking employees about the reason for their longest leave to include issues arising from the deployment of a military member, and added a new follow-up question regarding the type of issue arising from the deployment.

The Chamber and AES raised questions concerning a perceived bias in the survey regarding respondents to the employee survey who are both “leave needers” and “leave takers.” The Chamber writes, “On page 5 of the draft employee survey a note under the leave designation column states that ‘leave needer trumps leave taker if respondent is both.’” They assert that this programming instruction “…may improperly inflate the proportion of ‘leave needers’ in the survey sample, because persons who were both leave takers and also experience an unsatisfied leave need will only be asked the ‘leave needer’ questions if selected for in-depth interviews.” Similarly, AES notes that, “This is a potentially serious flaw that may bias the survey results by increasing the number of leave needers and reducing the number of leave takers, when in fact the respondent is a leave taker too.” The Department believes that this concern is unfounded for two reasons.

First, the instruction pertains to selection probabilities but not survey estimation. At the end of the screening, the interviewer must determine which of the eligible adults in the household will be selected for the extended interview. Two rare populations are of particular interest – leave needers and leave takers. Based on the 2000 survey, leave needers are significantly rarer than leave takers (2.4% versus 16.5% of employed adults, respectively). Due to this very low incidence, leave needers will be selected for the extended interview with a higher probability than leave takers. Both groups are selected with a higher probability than “employed only” adults. The instruction in question specifies that if an adult in the household is both a leave needer and leave taker, then for the purposes of extended interview selection, that respondent is to be assigned the probability of selection of a leave needer (because that is the rarer of the two groups to which they belong). The weighting protocol will adjust for these differential selection probabilities to ensure that all respondents are represented in proportion to their actual incidence in the population. Contrary to the concerns raised, this does not mean that adults who are both leave needers and leave takers will be counted only as leave needers in the survey estimates and analysis. Adults who are both leave needers and leave takers will be counted in both groups in the survey estimates and analysis. Weighting will correct for the fact that they have been sampled at a higher rate relative to adults who are only a leave taker or who are employed only.

Second, selected respondents who are known to be a leave needer and a leave taker will be administered the question modules for both leave needers and leave takers. If the extended interview respondent is both a leave needer and a leave taker, then he or she will be administered all of the questions designated for both groups. This will provide data on their experience in both respects and they will be included in the estimates based on both leave needers and leave takers. For these reasons, the Department believes the instruction in question does not present a threat of bias to survey estimates.
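
To make the selection-and-weighting logic concrete, the following is a minimal sketch under assumed, hypothetical within-household selection probabilities; the actual probabilities and weighting protocol belong to the survey contractor and are not specified here. It shows how inverse-probability base weights allow an adult who is both a leave needer and a leave taker to contribute to both group totals while correcting for the higher rate at which that adult was sampled.

```python
# Illustrative sketch only: base weights are the inverse of each adult's
# selection probability, so a respondent who is both a leave needer and a
# leave taker is counted in BOTH group totals, with the weight correcting
# for the higher rate at which leave needers are selected.

# Hypothetical within-household selection probabilities for the extended interview.
SELECTION_PROB = {
    "leave_needer": 0.90,    # rarer group, selected with higher probability
    "leave_taker": 0.50,
    "employed_only": 0.25,
}

def base_weight(is_needer: bool, is_taker: bool) -> float:
    """Assign the selection probability of the rarest group the adult belongs to."""
    if is_needer:            # "leave needer trumps leave taker" applies to selection only
        return 1.0 / SELECTION_PROB["leave_needer"]
    if is_taker:
        return 1.0 / SELECTION_PROB["leave_taker"]
    return 1.0 / SELECTION_PROB["employed_only"]

# Toy respondent records: (is_needer, is_taker)
respondents = [(True, True), (True, False), (False, True), (False, False)]

weighted_needers = sum(base_weight(n, t) for n, t in respondents if n)
weighted_takers = sum(base_weight(n, t) for n, t in respondents if t)

print(f"Weighted leave-needer total: {weighted_needers:.2f}")
print(f"Weighted leave-taker total:  {weighted_takers:.2f}")
```

In this sketch the first respondent appears in both totals, which mirrors the point made above: the "trumping" rule affects only the probability of selection, not how the respondent is classified in estimation.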

Finally, several comments request the final survey results be released to the public or recommend a type of analysis be performed by the DOL. (See the comments from NCTE, NPWF, BPWF, the Senate, and the Chamber.) The Department will publish a written report describing the final survey results and will make raw data from the survey available to the public.

The Department has conducted extensive outreach efforts with Congressional, academic, and private industry constituencies, as well as interested agencies within the Executive branch. Comments and suggestions from all interested parties were solicited, reviewed, and considered in preparing the final survey product, in an effort to efficiently extract required information while minimizing the reporting burden on the public.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

Employee Survey:

The research literature on incentives in landline and cellular random digit dial (RDD) surveys suggests that incentives would improve the cooperation rate in household surveys (Singer et al. 1999; 2000; Brick et al. 2005). Incentive payments to survey respondents have been used extensively for many years to improve survey response rates, and there is considerable research-based evidence supporting the value of compensation for increasing cooperation and improving the speed and quality of response in a broad range of data collection efforts. The offer of a monetary incentive can help persuade the respondent to participate in the survey, and it may reduce the "cold call" effect by piquing interest during the survey introduction, when vital information is conveyed about the survey's purpose, content, and timing.

The Employee Survey features both a screener and an extended interview. In theory, incentives could be offered at one or both stages. The research literature and the survey sample design indicate that extended interview incentives would be cost effective for this survey while screener incentives would not (Arbitron 2003; Cantor et al. 1998; Kropf et al. 2000; Singer et al. 2000; Cantor et al. 2007). A related issue is whether incentives should be pre-paid or post-paid ("promised"). Due to the cell phone sample and unlisted landlines, fewer than half of the numbers sampled for the survey (about 40%) could be matched to an address, which makes pre-paid incentives impractical. The extended interview incentives will therefore need to be post-paid.

The research literature indicates that we can expect about an 8 percentage point increase in the response rate if $15 is promised as opposed to $0 (Cantor et al. 2003; Strouse and Hall 1997).

We propose to test the benefit of this type of incentive in the Employee Survey using a randomized experiment for the first replicate of telephone numbers released in the field for the landline RDD sample. We will offer a randomly determined 50% of 600 first replicate respondents (treatment group) $10 to complete the interview and no incentive to the balance of respondents (control group). We will compare the cooperation rates, overall response rates, and the mean number of call attempts per completion in the two groups. Based on the literature, we hypothesize that when a monetary incentive is used, the cooperation rate will be higher and mean number of call attempts per completion will be lower. Decreasing the interviewer effort required to complete the landline interviews would lower the data collection costs, resulting in a net cost saving from the incentives. We also hypothesize that the promise of an incentive will motivate participants to complete the survey versus terminating early, resulting in a higher survey completion rate. A gain of at least five percentage points in the cooperation rate for the treatment group over the control group (based on contacted working, residential numbers) would suggest that we proceed with an incentive for all landline respondents. Please refer to the “Non-response Follow-up Survey” section in #3 of Part B for additional information.
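
As a rough illustration of how the treatment and control groups might be compared, the sketch below applies a simple two-proportion z-test to hypothetical cooperation counts for the roughly 300 treatment and 300 control cases in the first replicate; the counts and the choice of test are illustrative assumptions, while the five-percentage-point decision threshold comes from the text above.

```python
# Sketch of the incentive-experiment comparison, assuming a simple
# two-proportion z-test on cooperation rates; the counts below are
# hypothetical inputs, not results.
from math import sqrt

def two_prop_z(success_a: int, n_a: int, success_b: int, n_b: int):
    """Return (z statistic, difference in proportions) for group A minus group B."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se, p_a - p_b

# Hypothetical first-replicate counts: ~300 treatment ($10 promised), ~300 control.
z, diff = two_prop_z(success_a=195, n_a=300, success_b=165, n_b=300)

# Decision rule from the text: proceed with the incentive for all landline
# respondents if the treatment cooperation rate exceeds the control rate by
# at least 5 percentage points.
proceed = diff >= 0.05
print(f"z = {z:.2f}, difference = {diff:.1%}, proceed with incentive: {proceed}")
```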


Employer Survey:

Employer survey respondents receive no payments or gifts.

  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Verbal assurance of confidentiality will be provided to all respondents of the employer and employee surveys. In addition, Abt Associates will remove key identifiers (e.g., name of company, name of employer) prior to data analysis, so that individual responses or aggregate results cannot subsequently be linked to a specific individual or employer. The basis for the assurance of confidentiality is the confidentiality statement and non-disclosure agreement that is part of the project's contract. This includes information classified as confidential under the Privacy Act of 1974. In addition, see Section B.5 for an explanation of the detailed steps we will take to ensure the confidentiality of the public-use data set.

The survey data will be stored on an Abt-SRBI computer that is protected by a firewall that monitors and evaluates all attempted connections from the Internet. Confidential information on each survey respondent (name and telephone number, only) will be maintained in a separate data file apart from the survey data so that it is not possible to link particular responses to individual respondents. Once the survey is completed, all confidential data on each respondent will be destroyed. Any data used for analysis by the contractor or the Department will be completely de-identified. The entire database will be encrypted so that any data stored will be further protected. Finally, access to any data with identifying information will be limited only to contractor staff directly working on the survey.

Participation in the survey is voluntary. All analyses, summaries or briefings will be presented at the aggregate level and it will not be possible to link specific responses to individual respondents in any way. The database delivered to DOL will not include any identifying information such as names, addresses, telephone numbers, or social security numbers, nor any other information that might support reverse identification of respondents.
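
A minimal illustration of the separation described above appears below; it is not Abt-SRBI's actual procedure or file layout, and the field names are hypothetical.

```python
# Hypothetical sketch: identifying fields are split into a separate, restricted
# file and destroyed after fielding; only the de-identified record enters the
# analysis and public-use files.
IDENTIFIERS = {"name", "telephone_number"}

def split_record(record: dict) -> tuple[dict, dict]:
    """Return (contact_info, de_identified_response) for one respondent."""
    contact = {k: v for k, v in record.items() if k in IDENTIFIERS}
    response = {k: v for k, v in record.items() if k not in IDENTIFIERS}
    return contact, response

record = {
    "name": "Jane Doe",                 # illustrative value only
    "telephone_number": "555-0100",
    "respondent_id": "R000123",         # random ID, not derivable from identity
    "leave_taker": True,
    "longest_leave_reason": "own serious health condition",
}

contact, response = split_record(record)
print(response)   # contains no direct identifiers
```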

The exact statement indicating the confidentiality of respondents’ answers is attached (Attachment B).

  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

We are limiting what we ask to information needed to update the 2000 study entitled "Balancing the Needs of Families and Employers." Updates will include both asking the earlier questions again (to estimate trends) and asking new questions based on changed regulations, regulatory changes under consideration, the changing external environment, and lessons learned in the analysis and use of the earlier surveys. This includes asking people about a serious health condition, and about taking leave or needing to take leave for that condition. These questions are central to the goal and focus of this project.

The Department of Labor has allocated funds to update these data. The survey asks only about the broad reason for leave-taking (i.e., general categories), not details. This information is vital for determining what working families need to help them balance the demands of work and family. All data will be aggregated so that information about individual respondents will not be identifiable.

We conducted cognitive tests on the employee survey with nine volunteer respondents in Chicago. These purposively selected respondents included employees who took, or needed to take, leave from work for a variety of family and medical reasons, in order to test the applicability of the questions to different types of leave takers and leave needers. The respondents came from diverse backgrounds and education levels, in order to test the applicability of the questions to different types of employees (salaried versus hourly, for example) and to capture the range of possible comprehension issues. The survey included in this package reflects the findings from those interviews.

We have conducted three cognitive interviews with volunteer respondents from a small company (fewer than 100 employees), a mid-size company (100-999 employees), and a large company (1,000+ employees). No major problems were uncovered during the testing. The survey included in this submission contains edits that resulted from this preliminary testing. Further testing of the Internet version of the survey is scheduled once the survey has been fully programmed. No significant, substantive changes are expected.

The employee survey includes the following question on sexual orientation:

D9. Do you consider yourself to be:

1 Heterosexual or straight;

2 Gay or lesbian; or

3 Bisexual?

4 SOMETHING ELSE (VOL)

8 DK (VOL)

9 REF (VOL)

This question follows the best practices outlined by the Williams Institute regarding how to ask survey respondents to self-identify sexual orientation (Williams Institute 2009). We experienced no issues with this question in the nine cognitive tests conducted with the employee survey.

While this question is not usually included in Federally sponsored surveys, the coverage of same-sex partners for family and medical leave is of considerable recent policy interest. For example, in June 2010, DOL clarified that the FMLA definition of "son and daughter" encompasses the child of an employee standing "in loco parentis," regardless of the employee's legal or biological relationship to the child. Combined with other questions in the survey, this information will provide DOL with the ability to conduct data-based analyses to better understand the magnitude, needs, and experiences of same-sex partners regarding family and medical leave from work.

  12. Provide estimates of hour burden of the collection. The statement should:

  • Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

  • If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-1.


For Employee Survey:

Annual hour burden:

  Screeners
    18,500 households x 2 minutes each: 617 hours
  Extended interview
    1,440 leave-taker respondents x 20 minutes each: 480 hours
    250 leave-needer respondents x 10 minutes each: 42 hours
    1,310 employed-only respondents x 7 minutes each: 153 hours
  Total burden (18,500 respondents): 1,292 hours

Annualized cost to respondents (1,292 hours at $22.21* per hour): $28,695

*U.S. Department of Labor, Bureau of Labor Statistics, Table B-3. Average hourly and weekly earnings of all employees on private nonfarm payrolls by industry sector, seasonally adjusted (accessed from the following website as of September 2010: http://www.bls.gov/webapps/legacy/cesbtab3.htm)



For Employer Survey:

  Sampled establishments: 2,572
  Respondents: 1,800
  Frequency of response: once

Annual hour burden:

  Screeners
    2,572 establishments x 5 minutes each: 214 hours
  Data gathering by respondent
    1,800 respondents x 45 minutes each: 1,350 hours
  Extended interview
    1,800 respondents x 20 minutes each: 600 hours
  Total burden (2,572 respondents): 2,164 hours

Annualized cost to respondents (2,164 hours at $50.73* per hour): $109,780


*U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment Statistics survey, national employment and wage data by occupation, May 2009; hourly rate for Human Resources Managers (accessed from http://www.bls.gov/news.release/ocwage.t01.htm)
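
As a cross-check, the hour-burden and cost totals in the two tables above follow from straightforward arithmetic; the sketch below reproduces it, rounding each line item to whole hours as the tables do.

```python
# Reproduces the burden arithmetic reported above; respondent counts, minutes,
# and hourly rates are taken from the tables and footnotes.

def hours(respondents: int, minutes_each: float) -> int:
    """Line-item burden in whole hours, rounded as in the tables."""
    return round(respondents * minutes_each / 60.0)

# Employee Survey
employee_hours = (
    hours(18_500, 2)     # screeners: 617 hours
    + hours(1_440, 20)   # leave-taker interviews: 480 hours
    + hours(250, 10)     # leave-needer interviews: 42 hours
    + hours(1_310, 7)    # employed-only interviews: 153 hours
)
employee_cost = employee_hours * 22.21   # average private nonfarm hourly earnings

# Employer Survey
employer_hours = (
    hours(2_572, 5)      # screeners: 214 hours
    + hours(1_800, 45)   # record gathering: 1,350 hours
    + hours(1_800, 20)   # extended interviews: 600 hours
)
employer_cost = employer_hours * 50.73   # hourly rate for Human Resources Managers

print(f"Employee Survey: {employee_hours:,} hours, ${employee_cost:,.0f}")
print(f"Employer Survey: {employer_hours:,} hours, ${employer_cost:,.0f}")
```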


  13. Provide an estimate for the total annual cost burden to respondents or record-keepers resulting from the collection of information (Do not include the cost of any hour burden shown in Items 12 and 14).

  • The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

  • If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

  • Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.

The survey will not involve any additional cost burden, other than that described above.

  14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.

For Employee Survey:

This survey involves a one-time cost to the Federal Government and to the respondents. The cost to the Federal Government for the Employee Survey totals $938,500. This includes revising and fielding the survey, incentives for respondents, analysis, and reporting on the results. See below for a detailed breakdown of these costs.

For Employer Survey:

This survey involves a one-time cost to the Federal Government and to the respondents. The cost to the Federal Government for the Employer Survey totals $561,500. This includes revising and fielding the survey, providing a summary report to respondents, analysis, and reporting on the results. See below for a detailed breakdown of these costs.

Total Cost:

The cost to the Federal Government to produce both the Employer and Employee Surveys totals $1.5 million. A breakdown of these costs is presented in Exhibit 1.


Exhibit 1. Breakdown of Costs by Project Tasks

  Activity                                                Approximate cost   Percentage of total cost
  Sample and survey design                                $180,000           12%
  Data collection, processing, and management             $825,000           55%
    Interviewing                                          $676,500           --
    CATI/Web programming and data management              $61,875            --
    Project management                                    $86,625            --
  Analysis, review, and interpretation of the findings    $375,000           25%
  Preparation of reports and documentation                $120,000           8%
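
The Exhibit 1 figures are internally consistent; the short sketch below verifies that the task costs sum to the $1.5 million total, that the three sub-tasks sum to the data collection line, and that the listed percentages follow from those totals.

```python
# Consistency check of the Exhibit 1 cost breakdown.
TOTAL = 1_500_000
tasks = {
    "Sample and survey design": 180_000,
    "Data collection, processing, and management": 825_000,
    "Analysis, review, and interpretation of the findings": 375_000,
    "Preparation of reports and documentation": 120_000,
}
data_collection_subtasks = {
    "Interviewing": 676_500,
    "CATI/Web programming and data management": 61_875,
    "Project management": 86_625,
}

assert sum(tasks.values()) == TOTAL
assert sum(data_collection_subtasks.values()) == tasks["Data collection, processing, and management"]

for name, cost in tasks.items():
    print(f"{name}: ${cost:,} ({cost / TOTAL:.0%})")
```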

  15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-1.

This is a one-time survey.

  16. For collections of information whose results are planned to be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

Employee Survey:

Data collected in the Employee Survey will be analyzed and results provided in a report to be issued by DOL. Data will be presented primarily in a descriptive statistical manner, employing cross-tabulations. No complex analytical techniques will be used in assessing these data.

Employer Survey:

Data collected in the Employer Survey will be analyzed and results provided in a report to be issued by DOL. Data will be presented primarily in a descriptive statistical manner, employing cross-tabulations. No complex analytical techniques will be used in assessing these data.
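
As an indication of the planned descriptive presentation, the sketch below builds a weighted cross-tabulation with pandas; the variable names and values are purely illustrative and do not correspond to the surveys' actual data files.

```python
# Illustrative weighted cross-tabulation of the kind planned for the report.
import pandas as pd

df = pd.DataFrame({
    "leave_status": ["taker", "taker", "needer", "needer", "employed_only", "employed_only"],
    "employer_covered": ["yes", "no", "yes", "no", "yes", "no"],
    "weight": [1.1, 2.0, 0.9, 1.0, 1.4, 1.2],      # survey weights (hypothetical)
})

# Row percentages of employer FMLA coverage within each leave-status group.
table = pd.crosstab(
    index=df["leave_status"],
    columns=df["employer_covered"],
    values=df["weight"],
    aggfunc="sum",
    normalize="index",
)
print(table.round(2))
```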

Exhibit 2 displays the time schedule for the entire project.

Exhibit 2. Employee Survey and Employer Survey Project Time Schedule

  Cognitive testing of surveys                            December 2010-January 2011
  Finalize survey instruments and justification for surveys   January 2011
  Office of Management and Budget (OMB) package under review  February/March 2011
  Draw sample                                             March 2011
  Final approval by OMB                                   April 15, 2011
  Telephone interviews begin                              April 30, 2011
  Telephone interviews end                                September 30, 2011
  Survey data codebook                                    November 2011
  Draft methodology report                                December 2011
  Final methodology report                                February 2012

  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The expiration date will appear on the advance materials.

  18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.

There are no exceptions.

References

Arbitron. 2003. The Effect of a Pre-paid and Promised Incentive on Response Rate. Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Nashville, TN.

Brick, J., Montaquila, J., Hagedorn, M., Roth, S., and Chapman, C. 2005. "Implications for RDD Design from an Incentive Experiment." Journal of Official Statistics 21:571-589.

Cantor, D., Cunningham, P., and Giambo, P. 1998. “Testing the Effects of a Pre-Paid Incentive and Express Delivery to Increase Response Rates on a Random Digit Dial Telephone Survey.” Paper presented at the Annual Meeting of the American Association for Public Opinion Research, St. Louis, MO.

Cantor, D., Cunningham, P., Triplett, T., and Steinbach, R. 2003. “Comparing Incentives at Initial and Refusal Conversion Stages on a Screening Interview for a Random Digit Dial Survey.” Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Nashville, TN.

Cantor, D., O'Hare, B., and O'Connor, K. 2007. "The Use of Monetary Incentives to Reduce Nonresponse in Random Digit Dial Telephone Surveys." In Advances in Telephone Survey Methodology. New York: Wiley.

Kropf, M., Scheib, J., and Blair, J. 2000. “The Effect of Alternative Incentives on Cooperation in Telephone Survey.” Proceedings of the American Statistical Association, Survey Research Section, 1081-1085.

Singer, E., Van Hoewyk, J., Gebler, N., Raghunathan, T., and McGonagle, K. 1999. "The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys." Journal of Official Statistics 15:217-230.

Singer, E., Van Hoewyk, J., and Maher, M. 2000. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly 64:171-188.

Strouse, R. and Hall, J. 1997. “Incentives in Population-based Health Surveys.” Proceedings of the American Statistical Association, Survey Research Section, 952-957.

Williams Institute. 2009. “Best Practices for Asking Questions about Sexual Orientation on Surveys.” Accessed from the following website in December 2010: http://www.law.ucla.edu/WilliamsInstitute/pdf/SMART_FINAL_Nov09.pdf.

