Evaluation Response Plans, ETA


National Agricultural Workers Survey (NAWS)


OMB: 1205-0453


Department of Labor, Employment and Training Administration


Response to March 13, 2009 Office of Management and Budget Clearance Terms for the National Agricultural Workers Survey, OMB No. 1205-0453


Per the Office of Management and Budget’s (OMB) March 13, 2009 clearance terms for the National Agricultural Workers Survey (NAWS), the Employment and Training Administration (ETA) commissioned an independent review of the survey to investigate the extent of potential biases in the survey’s estimates related to potential errors in the calculation of the survey’s weights. As directed by OMB, ETA consulted with the Bureau of Labor Statistics (BLS) regarding the necessary qualifications and experience of the statistician who would perform the work. ETA also vetted the statement of work for the requirement with BLS.


ETA acknowledged and fully agreed with the need for the independent evaluation and is confident that it will result in improvements in the survey that will benefit the public, ETA, and the many Federal agencies that rely on the survey’s findings. The evaluation findings appear in the document [1205-0453 Evaluation of NAWS Weights and Point Estimates_12.30.09.pdf].


The evaluator provided its draft report to ETA on December 21, 2009. On December 22, 2009, it presented its preliminary findings to staff from ETA’s Office of Policy Development and Research (OPDR). Staff from the BLS Office of Survey Methods Research also attended the briefing. The four major findings are as follows:


  1. While the modified equations for the sampling weights in Part B of the supporting statement addressed the concerns raised by BLS during its initial review, and the NAWS contractor’s program code for implementing the weights is accurate, the revised formulas and the program are missing the first-stage selection probabilities;


  2. For the major variables it examined using national-level data for 2001-2005, 2006, and 2008, the evaluator found no significant or substantive differences in the point estimates and sampling errors when using the analysis weights developed following the methods in Part B of the supporting statement;


  3. Large survey design effects may reduce the cost-effectiveness of the survey and the reliability of its estimates; and


  4. The lack of detailed field-level forms, protocols, and quality control regimens for selecting workers for an interview may lead to non-random selection of workers.


Sampling Weights and Accuracy of Published Results

The evaluator found no practical differences in key national-level estimates when the old and new weights, developed by the NAWS contractor, were applied to the data. The estimates for the same key variables were also found to be accurate when the evaluator applied a weight it developed that incorporated the first-stage selection probabilities (the step missing from the NAWS contractor’s weights).
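In standard multi-stage design-weight notation (a sketch of the general issue, not the contractor’s exact formulas), the overall selection probability of a sampled worker is the product of the stage-level probabilities, and the base weight is its inverse; dropping the first-stage term yields the incomplete weight the evaluator identified:

```latex
% Two-stage selection: stage 1 selects cluster j (e.g., a grower),
% stage 2 selects worker k within cluster j
\pi_k \;=\; \pi^{(1)}_{j(k)} \cdot \pi^{(2)}_{k \mid j(k)},
\qquad
w_k \;=\; \frac{1}{\pi_k}
     \;=\; \frac{1}{\pi^{(1)}_{j(k)}} \cdot \frac{1}{\pi^{(2)}_{k \mid j(k)}}
% Omitting the first-stage probability gives the incomplete weight:
\tilde{w}_k \;=\; \frac{1}{\pi^{(2)}_{k \mid j(k)}}
```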


While these preliminary findings indicate that it will not be necessary for ETA to republish printed or Web-published estimates, they do point to the need to modify the NAWS public access data set. Specifically, the current post-sampling weight variable, which omits the first-stage selection probabilities, needs to be replaced with a corrected weight that includes them.


A final determination regarding the weights and the accuracy of all of the most recently published NAWS estimates, however, cannot be made until the evaluator has examined: 1) the possibility of making non-response adjustments at each stage of selection (currently, a regional-level adjustment is made) and how the new adjustments, if developed, impact point estimates; 2) the weights for, and their impact on, regional-level estimates; and 3) the full set of demographic and employment variables that are commonly included in published reports. As discussed below, this work will begin immediately and will be completed by March 1, 2010.
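For context, a stage-level non-response adjustment of the kind referred to in item 1 is commonly computed as a weighted response ratio within adjustment cells (illustrative notation only; the evaluator’s eventual specification may differ):

```latex
% Weighted response-ratio adjustment within cell c at a given stage:
% S_c = sampled eligible units in cell c, R_c = respondents in cell c
a_c \;=\; \frac{\sum_{i \in S_c} w_i}{\sum_{i \in R_c} w_i},
\qquad
w^{\mathrm{adj}}_i \;=\; a_c \, w_i \quad (i \in R_c)
```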


Survey Design Effects

The evaluator found the NAWS to have large survey design effects (Deffs), which implies that some estimates may be unreliable. To fully understand the survey design features that produce large Deffs, a more thorough review of the survey methodology, leading to a re-design of several of its component parts, will be necessary.
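For reference, the design effect compares the variance of an estimate under the actual design with its variance under simple random sampling of the same size; Kish’s approximation relates it to average cluster size and intraclass correlation (standard definitions, not figures from the evaluation):

```latex
% Design effect: variance under the actual design relative to SRS
\mathrm{Deff}(\hat{\theta})
  \;=\; \frac{\mathrm{Var}_{\mathrm{design}}(\hat{\theta})}
             {\mathrm{Var}_{\mathrm{SRS}}(\hat{\theta})},
\qquad
n_{\mathrm{eff}} \;=\; \frac{n}{\mathrm{Deff}}
% Kish approximation for cluster samples, with average cluster
% size \bar{m} and intraclass correlation \rho:
\mathrm{Deff} \;\approx\; 1 + (\bar{m} - 1)\,\rho
```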


Primary among these is changing from simple random sampling (SRS) of growers to selection based on probabilities proportional to the size (PPS) of hired and contract farm labor expenditures. PPS sampling of growers was in place throughout the first half of the survey’s history, from 1989-1998, but changed to SRS in 1999 at the request of the National Institute for Occupational Safety and Health (NIOSH), a key Federal partner that uses the NAWS for occupational injury and health surveillance. NIOSH is also aware of the large design effects and is open to returning to PPS sampling of growers to correct the problems inherent in SRS.
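As an illustration of the mechanics only (not the NAWS contractor’s implementation), systematic PPS selection of growers from a cumulated size measure, here hypothetical labor-expenditure figures, can be sketched as:

```python
import random

def pps_systematic(sizes, n):
    """Select n units with probability proportional to size (systematic PPS).

    sizes: positive size measures (e.g., each grower's hired and contract
    farm labor expenditures).  For simplicity, assumes no single unit's
    size exceeds total/n (i.e., no certainty selections).
    Returns the indices of the selected units.
    """
    total = sum(sizes)
    interval = total / n                      # skip interval on the size scale
    start = random.uniform(0, interval)       # random start within first interval
    points = [start + k * interval for k in range(n)]
    selected, cum, i = [], 0.0, 0
    for idx, size in enumerate(sizes):
        cum += size                           # running cumulative size
        while i < n and points[i] <= cum:     # point falls in this unit's span
            selected.append(idx)
            i += 1
    return selected
```

Units with larger expenditures occupy a larger span of the cumulative size scale and are therefore hit by the equally spaced selection points with proportionally higher probability.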


Selection of Workers

NAWS interviewers undergo extensive and frequent training on all aspects of the survey’s field components, including how to randomly select workers for an interview. While the training manual currently contains instructions for randomly selecting workers, and interviewers are dispatched in pairs, with each team including an experienced/lead interviewer, detailed field-level forms, protocols, and quality control regimens for selecting workers do not currently exist. The evaluator noted that the lack of forms/directives and a quality control program may result in non-random sampling of workers.
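By way of illustration only (the actual forms and protocols remain to be developed), a documented field rule for random worker selection might take the form of a systematic, every-k-th selection from the crew roster, which an interviewer can apply and record on a standard form:

```python
import random

def select_workers(roster_size, n_interviews, rng=None):
    """Systematic (every-k-th) selection of workers from a crew roster.

    roster_size: number of workers present at the site
    n_interviews: number of interviews allocated to the site
    Returns 1-based roster positions of the workers to approach.
    Illustrative sketch only -- not the NAWS contractor's procedure.
    """
    rng = rng or random.Random()
    n = min(n_interviews, roster_size)
    step = roster_size / n                # selection interval
    start = rng.uniform(0, step)          # random start within first interval
    return [int(start + k * step) + 1 for k in range(n)]
```

A rule of this kind is auditable: the recorded roster size, interval, and random start let a quality control reviewer verify after the fact that the selected positions follow from the documented procedure.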


ETA’s Proposed Action Steps and Deliverables

To address the evaluator’s findings, ETA proposes taking a number of action steps and submitting their associated deliverables to OMB. For each action step, ETA will marshal the necessary resources, including personnel with the requisite skills. ETA looks forward to discussing the following action plan, including the proposed deliverable dates, with OMB:


  • Development of single-stage non-response adjustments and assessment of the weights for, and accuracy of, regional-level estimates – This requirement will be assigned to the current evaluator and will be completed by March 1, 2010.


  • Development of forms/directives and a quality control program for randomly selecting workers for an interview – This task will be assigned to the NAWS contractor. An independent evaluator, however, will also be tasked with assessing current worker sampling procedures and helping the NAWS contractor develop appropriate forms/directives. This work will commence immediately and will be completed by October 1, 2010: the first interview cycle of fiscal year 2011.


  • Update and modification of the NAWS public access data set to add the 2007-2009 data and the new post-sampling weight variable – As in the past, OPDR will send the revised data set to BLS for approval prior to making it available to the public via the NAWS Web page. OPDR staff and the NAWS contractor will coordinate efforts to complete this work by April 1, 2010.


  • Development and posting of errata announcements on the NAWS Web page – Changes to the statistical methods documentation (e.g., the formulation of new sampling and post-sampling weight variables), any necessary corrections to previously released estimates, and instructions to users of the public data for producing robust estimates will be posted. OPDR staff will coordinate with the NAWS contractor to complete this task by April 1, 2010.


  • Interagency meeting with Federal partners to discuss the need to re-design the NAWS methodology – ETA will consult with all the Federal agencies that use NAWS data so that the re-design accomplishes the goal of reducing survey design effects while continuing to meet, as best as possible, each agency’s information needs. This meeting will take place during the last week of January 2010.


  • Survey re-design to address each of the concerns discussed in the evaluator’s report – While much of this work can be completed by the NAWS contractor, some portions of it may require an independent researcher with highly specialized statistical sampling expertise. This task will require additional funding and portions of the requirement may need to be competed. While some minor changes will be introduced immediately, and others beginning in October 2010, it may not be possible to implement all aspects of the re-design until October 2011: the first interview cycle of fiscal year 2012. As there are three data collection cycles per year, with counties and growers being pulled and the interviews being allocated across the cycles before the first cycle begins, any major methodological changes would need to be implemented by October 1st of each year, the start-date of the first cycle. The envisioned time-line for incorporating the necessary methodological changes is as follows:


Immediate Changes:

  • Incorporating first-stage selection probabilities into the weight variable

  • Application of the new weight variable to all future data analysis

  • Addition of the new weight variable to the public data set


Changes that could be introduced by October 1, 2010 (the first interview cycle of FY 2011):

  • Using simple random sampling at the second-stage of selection

  • Using probability proportional to size sampling of growers

  • Using newly developed worker sampling procedures and quality control program

  • Incorporating single-stage non-response adjustments in the weight variable


Changes that could be introduced by October 1, 2011 (the first interview cycle of FY 2012):

  • Any other changes that are deemed necessary as a result of the survey re-design effort


  • Submission of Paperwork Reduction Act (PRA) clearance package – The NAWS currently has authorization until March 31, 2010. As part of the PRA process to continue the survey, a 60-day public comment notice was published in the Federal Register on November 30, 2009. Upon summarizing any comments it receives, ETA will submit the PRA package to OMB. The new package will incorporate the recommendations from the independent evaluation, including the revised and approved formulas for the survey’s weights, as well as any feedback ETA receives from NAWS Federal partners. The new PRA package, which will be submitted to OMB by February 15, 2010, will request a continuation of the survey, with the methodological changes discussed above, for 18 months, from April 1, 2010 until September 30, 2011, during which time all necessary survey design changes will be developed.


  • Quarterly reports – ETA proposes to submit quarterly reports to OMB to document progress and/or any technical difficulties regarding the completion of the aforementioned deliverables.


