Green Jobs Implementation Evaluation

OMB: 1205-0487


SUPPORTING STATEMENT

FOR THE PAPERWORK REDUCTION ACT OF 1995


Justification for Collection of Information for the Green Jobs and Healthcare Grants Implementation Study

Part B: Submission for Collections of Information Employing Statistical Methods


1. Respondent Universe and Sampling Methods


No sampling methods are being used to select respondents, since the universe of grantees (i.e., all 152 grantees) will be expected to complete the survey. Therefore, no attempt will be made to draw inferences to any population other than the set of units that responds to the data collection effort described in Part A.


2. Information Collection Procedures


The grantee survey data will be collected via a web survey developed using the software tool KeySurvey (www.keysurvey.com).


2.1 Statistical methodology for stratification and sample selection. No sampling, stratification, or estimation procedures will be used in the web survey that is the focus of this submission. As such, no unusual problems requiring specialized sampling procedures are anticipated. Similarly, it is not necessary to determine a degree of accuracy or to use periodic data collection cycles to reduce burden.


2.2 Estimation procedures. Because the evaluation includes the universe of grantees (i.e., all 152 grantees), sampling error is not a concern; the analysis will consist of simple descriptive statistics and straightforward regressions.
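
To make the planned analysis concrete, the following is a minimal sketch of descriptive statistics and a straightforward regression on the grantee survey data, assuming the responses are exported to a flat file; the file name and all variable names are hypothetical, not drawn from the actual instrument.

    # Sketch of the planned descriptive analysis (hypothetical file and fields).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("grantee_survey.csv")  # one row per grantee (N = 152)

    # Simple descriptive statistics across the universe of grantees.
    print(df[["participants_enrolled", "completions", "grant_amount"]].describe())
    print(df["training_focus"].value_counts())

    # A straightforward regression, e.g., training completions on
    # enrollment and grant funding.
    model = smf.ols("completions ~ participants_enrolled + grant_amount",
                    data=df).fit()
    print(model.summary())

Because the data cover the full universe rather than a sample, such regressions describe relationships among these 152 grantees rather than supporting inference to a larger population.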


3. Methods to Maximize Response Rates


The project will use a variety of techniques to maximize response rates to the grantee survey, including: 1) working to identify the best point(s) of contact at each grantee and partner site; 2) sending a survey invitation packet communicating DOL/ETA endorsement prior to the survey; 3) providing the online survey directly through a convenient email link; 4) tracking participation and sending periodic reminders to non-respondents; and 5) gathering the survey data by telephone from those who do not respond to the online survey.

We hope to achieve a response rate of 100 percent to the grantee survey. DOL has indicated that completing the survey is mandatory for its grantees and stressed this requirement in its Solicitation for Grant Applications. In addition, we will employ the following approach, which is designed to maximize efficiency and minimize costs:

  • A clear, streamlined survey instrument that will make responding easy and ensure accurate responses. The survey will be designed to be as brief as possible, with clear, easy-to-answer questions (mostly closed-ended, with a few open-ended questions).

  • An official advance letter that will gain attention and legitimize the study. The letter, containing log-in information, will be mailed to grantee sample members to further encourage participation.

  • A two-stage outreach strategy for reluctant responders that will result in a high conversion rate. Beginning in week 2, we will send e-mail reminders to those who have not responded. Beginning in week 4, we will conduct follow-up telephone calls (in addition to e-mail reminders) to non-responders. IMPAQ telephone interviewers are trained in refusal conversion, and experienced refusal converters will be assigned to work with reluctant responders to maximize conversion rates. In addition, we expect DOL to reach out to reluctant responders to remind them of their requirement to participate. The tracking logic behind this staged outreach is sketched below.
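
The staged outreach reduces to simple date arithmetic on the case-tracking records. A minimal sketch, assuming each record carries an invitation date and a completion flag (both field names hypothetical):

    from datetime import date

    def outreach_action(invited_on: date, completed: bool, today: date) -> str:
        """Return the next outreach step for a single grantee case."""
        if completed:
            return "none"                  # survey already submitted
        weeks_out = (today - invited_on).days // 7
        if weeks_out >= 4:
            return "telephone follow-up"   # stage 2: calls begin in week 4
        if weeks_out >= 2:
            return "e-mail reminder"       # stage 1: reminders begin in week 2
        return "wait"                      # still within the initial window

    # A case invited five weeks ago and still open is routed to the phone stage.
    print(outreach_action(date(2012, 1, 2), False, date(2012, 2, 6)))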

4. Tests of Procedures


After the grantee survey instrument has been developed, it will be pretested with a convenience sample of up to nine respondents from the grantee sites. The pretest will be based on cognitive interviewing techniques, and the instrument will be revised as appropriate following the pretest.


After the survey content has been finalized and approved by OMB, the instrument will be programmed for self-administration on the Web. Research staff and programmers will thoroughly test the computerized questionnaire. A testing protocol will be developed along with various testing scenarios to ensure that the instrument is performing correctly for all types of respondents. Test scenarios will be used to evaluate whether question wording and response choices are accurate, whether instructions are clear, and whether skip patterns are functioning properly. Thorough testing will ensure that any errors are corrected prior to launching the main survey.
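
As one illustration of what such a test scenario might look like, the following sketch checks a single hypothetical skip rule against responses represented as key-value records; the question IDs and the rule itself are invented for illustration, not drawn from the actual instrument.

    def check_skip_pattern(resp: dict) -> list:
        """Flag violations of one illustrative skip rule: Q5 (partner detail)
        should be answered only when Q4 (has partners) is 'yes'."""
        errors = []
        if resp.get("Q4") == "yes" and resp.get("Q5") is None:
            errors.append("Q5 missing although Q4 = yes")
        if resp.get("Q4") == "no" and resp.get("Q5") is not None:
            errors.append("Q5 answered although Q4 = no (should be skipped)")
        return errors

    # Test scenarios covering both branches of the skip.
    scenarios = [
        {"Q4": "yes", "Q5": "community college"},  # valid
        {"Q4": "no",  "Q5": None},                 # valid
        {"Q4": "yes", "Q5": None},                 # violation
        {"Q4": "no",  "Q5": "community college"},  # violation
    ]
    for s in scenarios:
        print(s, "->", check_skip_pattern(s) or "OK")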


We expect a 100% response rate for this web-based survey, as completing it is a requirement of grant performance. However, if we do not receive 100% of the responses, we will conduct a non-response analysis.


Non-Response Analysis for the Survey. The actual difference between respondents and non-respondents on survey estimates cannot be observed directly; non-response bias is therefore typically explored using indirect measures. We will compare the characteristics of completed and non-completed cases to determine whether there is any evidence of significant non-response bias in the completed sample, using variables available for all grantees from the grant applications, award documents, and quarterly progress reports. Based on the findings of the non-response analysis, weighting or other statistical adjustments will be made as needed to correct for non-response bias in the completed sample.1
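
A minimal sketch of such a comparison and a simple weighting-class adjustment follows, assuming frame variables (e.g., grant amount and grant type) are available for every grantee together with a 0/1 response indicator; the file and field names are hypothetical.

    import pandas as pd
    from scipy import stats

    frame = pd.read_csv("grantee_frame.csv")  # one row per grantee

    # Compare respondents and non-respondents on a continuous frame
    # characteristic (grant amount) with a two-sample t-test.
    resp = frame.loc[frame["responded"] == 1, "grant_amount"]
    nonresp = frame.loc[frame["responded"] == 0, "grant_amount"]
    print(stats.ttest_ind(resp, nonresp, equal_var=False))

    # For a categorical characteristic (grant type), a chi-square test.
    table = pd.crosstab(frame["grant_type"], frame["responded"])
    chi2, p, dof, expected = stats.chi2_contingency(table)
    print(chi2, p)

    # Simple weighting-class adjustment: inflate each respondent's weight
    # by the inverse of the response rate within its grant type.
    rates = frame.groupby("grant_type")["responded"].mean()
    frame["nr_weight"] = frame["grant_type"].map(1.0 / rates)
    frame.loc[frame["responded"] == 0, "nr_weight"] = 0.0

If no significant differences emerge, the unweighted estimates stand; otherwise the adjusted weights would be applied to the completed cases.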



5. Statistical Consultants


This data collection effort is primarily qualitative in nature; as such, we will not be using any statistical consultants for the project.


The agency responsible for receiving and approving contract deliverables is:

Employment and Training Administration

U.S. Department of Labor

Frances Perkins Building
200 Constitution Avenue, NW

Washington, DC 20210


Person Responsible:


Savi Swick, Federal Project Officer

(202) 693-3382

[email protected]

All data collection and analysis will be conducted by:


IMPAQ International and its subcontractor, AED.


1 The use of indirect measures such as demographics to conduct non-response analysis is supported in the literature. See O’Neil, G. and Dixon, J. (2005). Nonresponse Bias in the American Time Use Survey. ASA Section on Survey Research Methods, pp. 2958-2966 [www.bls.gov/tus/papersandpubs.htm]; Groves, R.M. (2006). Nonresponse Rates and Nonresponse Bias in Household Surveys. Public Opinion Quarterly, 70, 646-675; and Kasprzyk, D. and Giesbrecht, L. (2003). Reporting Sources of Error in U.S. Government Surveys. Journal of Official Statistics, 19(4), pp. 343-363.


