ETA TA Web Survey Supporting Statement Part B


Quick Turnaround Surveys on Workforce Investment Act Implementation


OMB: 1205-0436





Supporting Statement for the

Paperwork Reduction Act of 1995



Part B.

Collections of Information Employing

Statistical Methods



Technical Assistance Feasibility Study





U.S. Department of Labor

Employment and Training Administration

200 Constitution Ave., NW

Washington, DC 20210










PART B. SUBMISSION FOR COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1.1 Project Objectives


The Employment and Training Administration (ETA) of the U.S. Department of Labor (DOL) requests clearance to conduct a web-based survey with recipients of technical assistance (TA) provided by the ETA. Annually, the ETA makes a substantial investment in TA for the public workforce system to fulfill its mission of optimizing the U.S. labor market by expanding opportunities for high-quality job training and employment, supporting real-time labor market information, and providing income maintenance to unemployed workers. The TA varies according to multiple factors, including subject matter, modality/activity, provider, recipient, and funding source. The ETA therefore faces extraordinary complexity in planning to evaluate the effectiveness of its TA investments. To support its efforts in this regard, ETA plans to produce a feasibility report for an impact study. The research activities described below and in Part A will provide critical, detailed information from TA recipients about their experiences with the TA, their use of TA, and their views on the effectiveness of different types of TA. These data will inform our recommendations about the type of TA initiative that would be a suitable subject for an impact study.


B1. Respondent Universe and Sampling Methods


1.2 Sampling Design / Respondent Selection


Three methods are being used to select respondents to the Web-based survey. These are based on respondent type. State-level directors (of workforce agency divisions and of Workforce Investment Boards [WIBs]), will not be sampled – the survey will be distributed to the universe. Directors at the local workforce investment board (LWIB) level will be stratified by geographic area and then randomly selected within strata. Finally, current grantees (the third group) will be sampled purposively. Exhibit 1 shows the proportions of each respondent group.


Exhibit 1: Survey Respondents

  Respondent Group                                                              | Number of Respondents | Sampling Method
  State Workforce Agency Division Directors (or equivalent)                     | 50                    | No sampling.
  State Workforce Agency Unemployment Insurance Division Directors (or equiv.)  | 50                    | No sampling.
  Workforce Investment Board (WIB) Directors                                    | 50                    | No sampling.
  Local WIB Directors                                                           | 100                   | Random sampling, stratified by geographic region and urban/rural designation.
  Current ETA Grantees (Grant Directors or equivalent)                          | 150                   | Purposive sampling.
  TOTAL                                                                         | 400                   |


1.3 Potential Respondent Universe

The potential respondent universe is directors from any organization (government, public, private) that has participated, as a grantee or partner, in an ETA grant in the past two years.


1.4 Sampling Unit


Individual directors are the sampling unit, as these individuals typically make or finalize TA decisions for their organizations.


1.5 Population Frame and 1.6 Estimated Sample Size

Exhibit 2 shows the number of entities in the universe covered by the collection and in the corresponding sample.


Exhibit 2: Respondent Universe and Sample

  Respondent Group                        | Sample/Universe | Sampling Method
  Workforce Agency Division Directors     | 50/50           | No sampling.
  Workforce Agency UI Division Directors  | 50/50           | No sampling.
  WIB Directors                           | 50/50           | No sampling.
  Local WIB Directors                     | 100/431         | Stratified random sampling.
  Current ETA Grantees                    | 150/thousands   | Purposive sampling.


1.7 Expected Response Rates


We expect an 80 percent response rate for the survey, based on extensive pre-survey and follow-up activities coordinated with the U.S. Department of Labor. Examples of similar efforts that yielded such a response rate include the Job Corps National Survey Data Collection Project and Project GATE, both of which were conducted for the Employment and Training Administration at the U.S. Department of Labor.
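The arithmetic behind the expected yield can be sketched as follows; the group labels and counts come from Exhibit 1, and the calculation is purely illustrative:

```python
# Illustrative arithmetic only: expected completed surveys at the
# assumed 80 percent response rate, using the sample sizes in Exhibit 1.
sample_sizes = {
    "State Workforce Agency Division Directors": 50,
    "State UI Division Directors": 50,
    "WIB Directors": 50,
    "Local WIB Directors": 100,
    "Current ETA Grantees": 150,
}
response_rate = 0.80

# Expected completes per respondent group, rounded to whole respondents.
expected_completes = {g: round(n * response_rate) for g, n in sample_sizes.items()}
total_expected = sum(expected_completes.values())
print(total_expected)  # 320 completed surveys out of 400 invitations
```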


B2. Statistical Methods for Sample Selection and Degree of Accuracy Needed


2.1 Statistical Methodology for Stratification


Stratification is not statistical per se. The geographic regions by which the random sample of LWIB directors will be stratified are as follows:

  • Geographic Region¹

    • Northeast (Connecticut, Maine, Massachusetts, New Hampshire, New Jersey, New York, Pennsylvania, Rhode Island, and Vermont)

    • Midwest (Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Nebraska, North Dakota, Ohio, South Dakota, and Wisconsin)

    • South (Alabama, Arkansas, Delaware, District of Columbia, Florida, Georgia, Kentucky, Louisiana, Maryland, Mississippi, North Carolina, Oklahoma, South Carolina, Tennessee, Texas, Virginia, and West Virginia)

    • West (Alaska, Arizona, California, Colorado, Hawaii, Idaho, Montana, Nevada, New Mexico, Oregon, Utah, Washington, and Wyoming)


We will sample 25 LWIB directors from each stratum.
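The selection procedure described above can be sketched in a few lines; the frame data below are invented stand-ins (the real frame is the universe of 431 LWIBs), and the fixed seed is an assumption added only to make the illustration reproducible:

```python
import random

# Hypothetical sketch of the stratified selection: group the universe of
# 431 LWIBs by Census region, then draw 25 directors at random from each
# of the four strata, yielding a sample of 100.
REGIONS = ["Northeast", "Midwest", "South", "West"]

def stratified_sample(frame, per_stratum=25, seed=2014):
    """frame: list of (lwib_id, region) tuples covering the universe."""
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    sample = []
    for region in REGIONS:
        stratum = [lwib for lwib, r in frame if r == region]
        sample.extend(rng.sample(stratum, per_stratum))
    return sample

# Toy frame: 431 LWIBs spread across the four regions.
frame = [(f"LWIB-{i:03d}", REGIONS[i % 4]) for i in range(431)]
selected = stratified_sample(frame)
print(len(selected))  # 100 directors, 25 per region
```

Because every stratum contributes the same fixed count rather than a count proportional to its size, the draw guarantees regional variety but not proportional representation, consistent with the survey's stated non-representative purpose.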


2.2 Sample Selection Methodology


The Web-based survey is being administered to the universe of State Workforce Agency division directors, Unemployment Insurance Division directors, and WIB directors; i.e., there is no sampling.


LWIB directors will be randomly sampled from the (stratified) universe of LWIBs. The geographic strata will ensure that LWIBs from a variety of regions and with a variety of characteristics have an opportunity to participate. The survey is intended to collect contextual data from various TA recipient perspectives. It is not intended to be representative. There is no weighting of sites in the sample selection procedures.


2.3 Estimates of Variance


N/A


2.4 Analysis Plans


The results of the web survey will be analyzed using basic descriptive statistics. Cross-tabs will allow comparisons of means, standard deviations, etc. Key variables will include:


  • Funding type (formula or discretionary)

  • Years of experience with ETA grants

  • Years of experience with ETA TA

  • Respondent role in organization

  • Technical assistance type (modality)
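A cross-tabulation of this kind can be sketched with a simple frequency count; the records below are invented stand-ins for survey responses, and the variable names are hypothetical:

```python
from collections import Counter

# Hypothetical sketch of the planned descriptive analysis: a simple
# cross-tabulation of funding type against TA modality.
responses = [
    {"funding": "formula", "modality": "webinar"},
    {"funding": "formula", "modality": "peer learning"},
    {"funding": "discretionary", "modality": "webinar"},
    {"funding": "discretionary", "modality": "on-site"},
    {"funding": "discretionary", "modality": "webinar"},
]

# Count each (funding type, modality) combination.
crosstab = Counter((r["funding"], r["modality"]) for r in responses)
for (funding, modality), count in sorted(crosstab.items()):
    print(f"{funding:14s} {modality:14s} {count}")
```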


A content analysis will be conducted on data from the open-ended survey items. These data will be coded using the survey questions as the guiding framework, then expanded beyond that framework by identifying additional themes and clustering related ideas.



2.5 Minimal Substantively Significant Effect


N/A


2.6 Unusual Problems


N/A


2.7 Periodic / Cyclical Data Collection


N/A


B3. Maximizing Response Rates and Addressing Nonresponse


3.1 Methods to Maximize Response Rate / Issues of Non-Response


The survey will be preceded by a Training and Employment Notice (TEN) issued by the ETA. When delivered, the survey will be accompanied by an e-mail explaining the need for the data collection. A direct, unique web link will be provided for every potential respondent, making the survey easy to access (i.e., no password or multiple log-ins necessary). The survey will be designed so that respondents can exit and re-enter if they wish to complete it in more than one sitting. The survey is brief (respondents can complete it in about 15 minutes). Finally, all non-respondents will receive two reminder e-mails, 1 week and 2 weeks after survey delivery. The survey timeframe, from TEN through closing, will be 8 weeks.


3.1a Nonresponse Bias Analyses


The survey is not intended to be representative.

3.1b Nonresponse Weights


N/A


3.1c Other Procedures to Address Missing Data


N/A


3.2 Accuracy and Reliability of Information Collected


The results from the web survey are not expected to be representative, and researchers will not make generalizations based on them.


3.3 Justification for non-systematic data-collection


The selection of current ETA grantees will be a purposive sample to maximize the quality of respondent feedback. This is important because we have a limited number of individuals to whom we can reach out under this particular OMB clearance; we cannot cast a wide net. To ensure a high response rate and high-quality responses, we will (1) target grantees who, according to Federal Project Officers, are particularly engaged with ETA TA, and (2) call on partners (such as colleges, Goodwill, and chambers of commerce) with whom we have strong relationships.


B4. Test Procedures


4.1 Test of Procedures and Methods to Minimize Burden and Improve Utility


In addition to being tested by experienced employment and training researchers who are not working on this particular project, the survey was piloted with four individuals from across the respondent groups (one state-level director, two LWIB directors, and one discretionary grantee). Each pilot was followed by a cognitive interview, which allowed researchers to make small changes to the survey to clarify questions and response options. Feedback from the four respondents was consistent: they interpreted the survey in the same way, requested clarification on the same questions, and offered the same suggestions for improving certain questions. During these pilot activities, the evaluators also confirmed that the estimated burden of approximately 15 minutes per respondent is accurate.


4.2 Approval for Pilot Tests with 10 or More Respondents


N/A


B5. Contact Information


5.1 Consultant Contact Information


No uncompensated individuals were consulted on any aspect of this design.


5.2 Analyst Organization Information


No uncompensated agency unit, contractors, grantees, or other persons will collect and/or analyze the information.


1 Based on Bureau of Labor Statistics – Consumer Expenditure Survey (http://www.bls.gov/cex/csxgloss.htm).


