WIOA Implementation Evaluation OMB Supporting Statement: Part B. 1290-NEW
In this document, the Department of Labor (DOL) requests clearance from the Office of Management and Budget (OMB) under the Paperwork Reduction Act (PRA) for a new collection: a survey of state-level workforce staff associated with the Workforce Innovation and Opportunity Act (WIOA) Implementation Evaluation. This survey will be conducted online across all 50 states and the District of Columbia.
The WIOA implementation evaluation was funded by the DOL Chief Evaluation Office (CEO) in partnership with the Employment and Training Administration (ETA) with the goal of better understanding the implementation of WIOA, the variation in implementation across states and localities, and the need for further administrative guidance, regulations, technical assistance (TA), and policy. DOL contracted with Mathematica Policy Research, in partnership with Social Policy Research Associates, to conduct the evaluation.
The respondent universe for the state survey is all 50 states and the District of Columbia. All respondents in the universe will be invited to complete the survey, so no statistical methods will be used to select a sample. Each state has a unique context and approach to implementing WIOA, and the objective of the state survey is to capture the breadth of variation across states. An 80 percent or higher response rate is expected for the survey.
To understand the progress of WIOA implementation, the study team will conduct the survey data collection in 2019, over a period of approximately five months. The three-hour survey will be conducted online through a website into which workforce staff in each state can log to complete the survey. For respondents who do not complete the survey on the website, the study team will mail hard copy questionnaires to states and conduct reminder calls. Login information will be sent via hard copy mail and email to the lead contact in each state.
The data collected from survey respondents will provide information about states’ experiences implementing WIOA, including commonalities and differences in implementation. The study team will try to collect completed surveys from all 50 states and the District of Columbia, so no sampling methodology will be used to select respondents. Within each state, state staff will select the appropriate persons to complete the survey on behalf of the state.
The data gathered through the state survey will be tabulated using simple descriptive statistics (such as means and percentages), summary statistics, and cross-tabulations. Examples might include the percentage of states that consulted with partners in developing their initial WIOA state plan or the characteristics of states that have implemented an integrated intake approach at American Job Centers (AJCs) since WIOA was enacted.
The majority of the survey questions will be closed-ended; open-ended responses will be back-coded. No statistical methodology (such as sample stratification) or estimation will be needed in the analysis of the survey data. Analysis will follow a common set of steps: data cleaning, variable construction, and computing descriptive statistics. To prepare data for analysis, a series of data checks will be run, allowing for an examination of frequencies and means, and an assessment of the extent of missing data.
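As an illustration only (the study’s actual analysis tooling is not specified in this statement), the descriptive steps above — frequencies, means, and an assessment of missing data — might be sketched in Python with pandas. The data values and column names below are hypothetical:

```python
import pandas as pd

# Hypothetical survey extract: one row per state, closed-ended items
# back-coded to categorical or numeric values. All names are illustrative.
df = pd.DataFrame({
    "state": ["AL", "AK", "AZ", "AR"],
    "consulted_partners": ["Yes", "Yes", "No", None],  # one missing response
    "num_ajcs_integrated": [12, 3, 7, 5],
})

# Frequencies for a closed-ended item, retaining missing values in the counts
freq = df["consulted_partners"].value_counts(dropna=False)

# Simple descriptive statistic for a numeric item
mean_integrated = df["num_ajcs_integrated"].mean()

# Extent of item-level missing data, per question
missing_counts = df.isna().sum()

print(freq.to_dict())
print(mean_integrated)
print(missing_counts["consulted_partners"])
```

The same frame would support the cross-tabulations mentioned above (for example, `pd.crosstab` of one closed-ended item against another).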
The study will not use a probability sample for the selection of states since all states will be included. There are no unusual problems requiring specialized sampling procedures.
To limit burden, the study team will collect survey data from each state only once during a five-month period starting in 2019.
To encourage participation by states, Federal DOL staff will sign the initial survey notifications sent out via mail and email. The study team will make use of survey methods and best practices to encourage high response rates while minimizing burden and non-response. These methods include:
Web administration. It is anticipated that most respondents will prefer to complete the survey online. This mode allows respondents to complete the survey on their own schedule and at their own pace, and to do so over multiple sessions. The web survey system used by the data collection team also supports mobile browsers on devices such as tablets and cellular phones.
Multiple modes of administration. For states that do not choose the web mode, the data collection team will mail hard copy questionnaires. Offering this second mode provides flexibility and allows states to select the easiest way to complete the survey, encouraging response. To comply with Section 508 of the Rehabilitation Act, states or staff within states who may have difficulty completing a web survey will be offered the option of completing the survey by telephone.
Multiple notifications and reminders. The data collection team will send all states a notification letter and email signed by DOL staff shortly before the survey is fielded. This approach will capture states’ attention and communicate the legitimacy of the study. The notification will describe the content of the survey, explain how to access the web-based questionnaire, provide assurances about the privacy of responses to the questionnaire, and give a telephone number to call with questions or other concerns. The advance notification will be followed by frequent, timed reminders sent via mail and email. The data collection team will also conduct reminder calls to states, using study staff who are well versed in WIOA implementation, the study’s goals, and the survey design. The calls will be an opportunity to answer questions and address states’ concerns.
Locating. The process of locating appropriate staff within each state begins before sending out the first mailing. This involves checking and verifying the full sample list against publicly available information. For states that do not respond to calls or notifications, the contact information (including email addresses, telephone numbers, and physical addresses) for their staff or offices will be rechecked and verified using other sources, including locating vendors, public searches, and administrative records.
Modular design. The single survey instrument is split into modules based on topic area with the intent that the different modules can be completed by different respondents within each state. This approach is designed to both ease respondent burden and improve data quality by enabling collection from the most appropriate staff person within the state.
Methods to ensure data reliability. The study team will use several well-proven strategies to ensure the reliability of the survey data. The survey has been extensively reviewed by project staff, staff at DOL, and the technical working group (TWG). The survey has also been pretested. These steps have been taken to make the questions as simple and straightforward as possible while targeting a consistent interpretation across states and staff.
Every aspect of the web program will be thoroughly tested before being put into production to ensure that skip patterns, text boxes, and response options all function correctly. To encourage complete responses, all respondents will be informed that their responses will be kept private, that reports will never identify a state’s specific data, and that staff names will not be disclosed.

States that begin but do not complete the survey will be included in all reminder efforts to encourage participation and reduce survey and item non-response. Reminder efforts include telephone calls, during which respondents can ask questions or seek clarification about the survey or survey process. States will also be able to complete the survey by hard copy (or telephone) if they prefer that to the online option. Data from completed web surveys will be reviewed throughout the fielding period for accuracy and consistency.

To prepare survey data for analysis, the study team will run data checks, examine frequencies and means, and assess the extent of missing data. Survey non-response will be addressed in two ways. States that do not complete the survey at all will be excluded from the analysis. States with missing responses to individual questions (item non-response) will be excluded from the denominator of any analysis of those items (available case analysis). In addition, the study team will identify the number of missing responses to each item on the questionnaire.
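For concreteness, the available case approach described above — excluding item non-responders from an item’s denominator while counting the missing responses — can be sketched as follows. The data, state codes, and item name are hypothetical:

```python
import pandas as pd

# Hypothetical item responses, one per state; None marks item non-response.
responses = pd.Series(
    [1, 0, 1, None, 1],
    index=["CO", "CT", "DE", "FL", "GA"],
    name="integrated_intake",
)

n_missing = int(responses.isna().sum())  # item non-response count
available = responses.dropna()           # available case analysis: drop missing
pct_yes = 100 * available.mean()         # denominator = responding states only

print(n_missing, len(available), pct_yes)
```

Here one state is missing, so the percentage is computed over the four responding states rather than all five.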
All data collection procedures and instruments to be used in the WIOA Implementation Study have been reviewed by content and methodological experts to ensure clarity and optimal ordering of the questions.
The WIOA Implementation Study survey procedures, instruments, and protocols were tested to ensure they could be used effectively to conduct data collection, to evaluate the clarity of the questions to be asked, to identify possible modifications to either question wording or question order that could improve the quality of the data, and to estimate respondent burden. The items contained in the survey instrument were thoroughly tested with three individuals representing state agencies in July 2018. After each pilot test participant completed the survey, project staff debriefed the participant using a standard debriefing protocol to determine whether any words or questions were difficult to understand and answer. Adjustments were made to the survey instrument to ensure that the questionnaire and systems would function as designed and collect the information necessary to answer the research questions.
B.5. Individuals consulted on statistical aspects of design and on collecting and/or analyzing data
Consultations on the methods used in this study were part of the study design phase to ensure technical soundness. As stated previously, no statistical methods will be employed in a sample design. Consultations included stakeholders within the U.S. Department of Labor and a technical working group (TWG). Members of the study team and the TWG are listed in Table B.1.
Table B.1. Individuals who were consulted for the WIOA Implementation Evaluation

Mathematica Policy Research
  Grace Roemer, Project Director
  Pamela Holcomb, Co-Principal Investigator
  Samina Sattar, Deputy Project Director
  Ryan Callahan, Survey Director

Social Policy Research Associates
  Kate Dunham, Co-Principal Investigator

TWG members
  Yvette Chocolaad, Policy Director, National Association of State Workforce Agencies
  Cynthia Forland, Assistant Commissioner, Workforce Information and Technology Services, Washington State Employment Security Department
  Allison Metz, Senior Research Scientist and Director of the National Implementation Research Network (NIRN), University of North Carolina at Chapel Hill
  Ron Painter, Chief Executive Officer, National Association of State Workforce Boards
  Carl Van Horn, Distinguished Professor of Public Policy and Director, John J. Heldrich Center for Workforce Development, Rutgers University
Staff responsible for overseeing the collection and analysis of data are listed in Table B.2.
Table B.2. Individuals who will oversee the collection and analysis of data for the WIOA Implementation Evaluation

Mathematica Policy Research
  Grace Roemer, Project Director
  Pamela Holcomb, Co-Principal Investigator
  Samina Sattar, Deputy Project Director
  Ryan Callahan, Survey Director

Social Policy Research Associates
  Kate Dunham, Co-Principal Investigator