
Evaluation of the Community-Based Job Training Grants

OMB: 1205-0480










SUPPORTING STATEMENT

FOR THE PAPERWORK REDUCTION ACT OF 1995








SUBMISSION FOR SURVEY OF

COMMUNITY-BASED JOB TRAINING GRANT RECIPIENTS





Prepared for:


U.S. Department of Labor, Employment and Training Administration

200 Constitution Avenue, NW

Washington, DC 20210



Prepared by:


The Urban Institute

2100 M Street, NW

Washington, DC 20037





July 20, 2010

Revised January 5, 2011

Table of Contents


A) Justification

1) Circumstances that make the collection of information necessary

2) How, by whom, and for what purpose the information is to be used

3) Use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology

4) Identification of duplication of data collection efforts

5) Impacts on small businesses or other small entities

6) Consequences if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles in reducing burden

7) Special circumstances

8) Public comments in response to Federal Register notice and consultation with outside representatives

9) Payment or gift to respondents

10) Assurance of confidentiality provided to respondents

11) Additional justification for any questions of a sensitive nature

12) Estimates of the hour and cost burden for the information collection

a) Hour burden of the collection of information

b) Annualized cost to respondents for the hour burden for collection of information

13) Estimate for the total annual cost burden to respondents or record-keepers resulting from the collection of information

14) Estimates of annualized costs to the Federal government

15) Reasons for any program changes or adjustments

16) Plans for tabulation and publication

17) Approval to not display the expiration date for OMB approval

18) Exceptions to the certification statement

B) Collection of Information Employing Statistical Methods

1) Potential respondent universe and any sampling or other respondent selection methods to be used

2) Describe the procedures for the collection of information including:

a) Statistical methodology for stratification and sample selection

b) Estimation procedures

c) Statistical techniques to ensure accuracy for the purposes described in this justification

d) Specialized sampling procedures to correct unusual problems

e) Periodic data collection cycles to reduce burden

3) Methods to maximize response rates and to deal with issues of non-response

4) Tests of procedures or methods to be undertaken

5) Name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency


SUPPORTING STATEMENT FOR THE PAPERWORK REDUCTION ACT OF 1995


SUBMISSION FOR SURVEY OF

COMMUNITY-BASED JOB TRAINING GRANT RECIPIENTS



The U.S. Department of Labor, Employment and Training Administration (ETA) is seeking Office of Management and Budget (OMB) approval to collect information from recipients of the first four rounds of the Community-Based Job Training Grants (CBJTG) through a survey of grantee organizations and site visits to eight of them.


ETA supported the CBJTG program as an investment in building “the capacity of community colleges to train workers in the skills required to succeed in high-growth, high-demand industries.”1 CBJTG provides grants for the development and implementation of industry-specific job training programs at community colleges to meet the workforce needs of such industries as health care, energy, and advanced manufacturing. Two hundred and seventy-nine grants were issued between 2005 and 2009 in the first four rounds of grant competition.2 Most of the grant awards went to community and technical colleges, although in the later rounds, some grants were also made to community college districts, state community college systems and organizations, and agencies within the public workforce investment system.

ETA has contracted with the Urban Institute, a nonprofit, nonpartisan research organization based in Washington, DC, to conduct an evaluation of the CBJTG program. The evaluation will draw mainly on data collected through the survey of grant recipients, a review of grant documents, and site visits to eight grant projects. The Urban Institute completed a review of available grantee documents (grant applications, statements of work, quarterly narrative and quantitative reports) in the winter of 2008 and developed a report on grantee characteristics and planned project activities. The survey data collected through this effort will provide a comprehensive picture of the different grant-funded projects and identify grant implementation issues to date. Site visits to eight grantees will further document trends and patterns across grantees and yield detailed descriptions of projects implemented by the selected grantees.


The survey will be administered to all grantees in the first four rounds. To reduce respondent burden, the survey will be administered in a Web-based format that allows for automatic skip patterns. Grantees will also have the option to complete and return a paper version. The survey will gather data on grantee organization type, size and structure, project design and objectives, recruitment efforts and target populations, training, capacity-building and other program activities, partners’ contributions, and plans for sustaining programming and leveraging resources. It would not be feasible to systematically collect these data using another method, such as phone interviews or extensive site visits to all grant recipients, within the constraints of the project.


The second data collection activity for this evaluation, site visits to eight selected CBJTG program grantee organizations, will deepen our understanding of grant implementation. The eight sites were selected to ensure variation along a list of criteria. The focus of the visits will be to further document the activities supported with the grant funds, the extent to which the key objectives of the overall CBJTG program are addressed, the nature of activities conducted and products developed, and partnerships involved. The site visits will provide information on common trends and patterns across grantees as well as implementation challenges and successes. It would be impossible to gather data of comparable richness and detail through any other data collection strategy.


A final report will provide an analysis of data from the survey and the site visits. Findings from these data will be integrated with reviews of grant applications and quarterly reporting documents.


  A. Justification

    1. Circumstances that make the collection of information necessary

This evaluation of the CBJTG program will shed light on the implementation of the program. The Solicitations for Grant Applications (SGAs) for the program included wording that alerted applicants to the possibility of outside evaluation. The SGAs state that any grant-funded program should be prepared to “provide access to program operating personnel and participants, as specified by the evaluator(s) under the direction of ETA including after the expiration of the grant.”3


The CBJTG evaluation, including the proposed survey, is being conducted in compliance with Section 172 of the Workforce Investment Act of 1998. Section 172, “Evaluations,” directs the Secretary to “provide for the continuing evaluation of the programs and activities” carried out under Title I—Workforce Investment Systems.


The proposed survey will provide data essential to the evaluation of the CBJTG program. While grantees are required to submit quarterly reports to ETA with aggregate participant and financial data, little is known about the current structure and operation of grantee projects and activities. A standardized survey of all grantees, such as the one proposed, is demonstrably the best method to gather comprehensive information on the following:

  1. Overall characteristics of grant recipient organizations;

  2. Capacity-building efforts and training programs and activities conducted with the CBJTG funds;

  3. Training goals originally proposed by grantees and their progress in meeting them;

  4. The types, numbers, and roles of grantee partner organizations; and

  5. Grantee plans for sustaining activities after the completion of the grant period.


The proposed site visits would provide additional information for the evaluation. In particular, the site visit data will improve our understanding of the implementation of the grant program, providing deeper insights into such areas as:

  • program context;

  • program design and goals;

  • start-up and ongoing implementation issues;

  • budget and costs, staffing, and staff development;

  • client characteristics;

  • outreach, intake, and assessment;

  • components and content of services;

  • client flow;

  • organizational and partner linkages and integration of services;

  • leveraged resources;

  • program replicability and sustainability; and

  • expected outcomes.


Additionally, the site visits will allow researchers to interview representatives of grantees’ partner organizations, including community-based organizations, employers, the workforce investment system and educational institutions. As neither the survey nor the grantee-provided reporting and documents include information from the perspective of partner organizations, the site visits will provide unique information on:

  • Who the partners are;

  • How and why they became involved in the project;

  • The partners’ roles in the project; and

  • Partner perceptions of the impacts of the project on individuals who have completed the training program and on meeting employer demand for well-qualified, skilled workers in the target industry.


In addition, the U.S. Government Accountability Office (GAO) completed a report4 to congressional requesters and presented testimony5 before a Senate subcommittee on Department of Labor employment and training programs, including the CBJTG program. Together, these documents examine the award and evaluation processes for three ETA discretionary grant programs: the High Growth Job Training Initiative, the Workforce Innovation in Regional Economic Development initiative, and CBJTG. Although GAO focuses on the need for more rigorous study of the impact and outcomes of the grant programs, it also calls for an evaluation of the implementation of the grants, such as this study.

    2. How, by whom, and for what purpose the information is to be used

The survey of CBJTG recipients is a new one-time information collection, which will be used to provide standardized data on grant recipients and their projects for evaluative purposes. Many of the first-round grant recipients will have completed their grant-funded activities, while grant recipients in the later rounds will still be operating.6


A survey respondent can be any individual at the grantee organization with sufficient knowledge of the organization and program, and multiple individuals will be able to contribute to a given response. The Urban Institute will administer the survey via the World Wide Web, although respondents will also have the option of completing a paper version. Questions will concern the following:

  • The economic and community context of the program

  • Program design and goals

  • Program setup and initial implementation

  • Program components and services

  • Recruitment and assessment of training participants

  • Numbers and characteristics of participants

  • Partnerships

  • Leveraged funds

  • Staffing and technical assistance

  • Budgets and costs

  • Plans to continue after the grant ends, if any, and resources involved

  • Available outcome data.

The vast majority of the survey questions are closed-ended to allow for quantitative analysis of the data, such as cross-tabulations, and to reduce human error. Responses to open-ended questions will be coded for analysis.
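To illustrate the kind of closed-ended analysis planned, the minimal sketch below cross-tabulates two hypothetical survey items. It uses Python with pandas purely for illustration; the study itself anticipates SAS or Stata, and the variable names and values are invented.

```python
# Illustrative only: hypothetical item names and response values, not the
# actual CBJTG survey variables. The evaluation plans to use SAS or Stata;
# pandas appears here just to show the cross-tabulation concept.
import pandas as pd

responses = pd.DataFrame({
    "org_type": ["community college", "community college",
                 "workforce agency", "community college"],
    "offers_career_ladder": ["yes", "no", "yes", "yes"],
})

# Cross-tabulate organization type against a closed-ended yes/no item.
print(pd.crosstab(responses["org_type"], responses["offers_career_ladder"]))
```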


The site visits are also a new, one-time data collection, and will complement the survey data by providing greater depth of understanding of grant implementation. The focus of these site visits will be to fully document the activities supported with the grant funds, the extent to which the key objectives of the overall CBJTG program are addressed, the nature of activities conducted and products developed, and the partnerships involved. Information collected through the site visits will concern common trends and patterns across grantees as well as implementation challenges and successes. Specifically, the site visits will help address the following types of questions:

  • What workforce challenges are the grantees addressing? What is the economic climate in which the grantee is operating, and how has it affected the project?

  • Have the grantees’ goals evolved over the course of the grant period? If so, how?

  • Have the grantees had any recruitment challenges? How have these been addressed? Can any promising practices be identified?

  • Have there been any challenges in establishing the training programs? How have these been addressed? Can any promising practices be identified?

  • Do the CBJT grants facilitate career ladder/advancement opportunities? How do these career ladders work and have there been any problems in implementing them?

  • Do the CBJT grants establish new pipelines of workers for the target industry/industries, including pipelines for youth? How well do the grantees match skilled workers with employers’ workforce needs?

  • Are community or technical college grantees working in partnership with other community colleges (i.e., in a network/consortium)? If so, what are the benefits and challenges of such partnerships?

  • Do the grants lead to sustained, increased contacts, and/or more joint efforts between employers, the workforce investment system, and community colleges? Has the grantee developed partnerships at the regional level?

  • How do grantees interact with the workforce system?

  • What challenges, if any, did the grantees experience in engaging partners? How were these addressed? Can any promising practices be identified?

  • What challenges, if any, did the grantees experience in capacity-building efforts? How were these addressed? Can any promising practices be identified?

  • Are CBJT grants sufficiently intensive to reasonably expect them to produce measurable impacts on the capacity of community and technical colleges to deliver training?


The strategy for selecting grantees for in-depth site visits ensures that the final sample exhibits variation along the following four dimensions: region, industry of focus, timing of grant, and organizational structure. In addition to these four primary selection criteria, selected grantees exhibit variation in their target population, plans for partnerships, and geographic characteristics.


Respondents interviewed on each visit will include, at a minimum:


Grantee organization respondents:

  • The CBJTG project director;

  • CBJTG program supervisors and staff involved in direct service delivery to CBJTG participants; and


Partner organization respondents:

  • Workforce development agency program administrators/staff;

  • Staff at other partnering educational facilities;

  • Staff from corporate and industry partners with links to CBJTG projects; and

  • Other community leaders and individuals knowledgeable about the initiative.


The exact number and type of individuals who will be interviewed will vary by site, depending on the specific grantee model, including its service components and involvement of partners.


All materials developed from the analyses of this data collection effort are intended to reach multiple audiences including:

  • ETA and DOL staff

  • Community colleges, technical colleges, workforce investment agencies and organizations, and other similar training providers

  • Community and technical college associations

  • Industry groups

  • Researchers

  • Policymakers at the state and federal levels of government looking to design similar programs and

  • Others interested in understanding the experiences and lessons from the CBJTG program.


The survey and site visit data will be analyzed and presented in a final report to ETA and posted on the ETA and Urban Institute web sites. This report will guide ETA in understanding the range of CBJTG programs and their implementation, informing the development and implementation of future initiatives.


Included as part of this submission is a document entitled “Research Questions and Data Sources,” which provides detailed information on the research questions the grantee survey will address. A copy of the discussion guide for key grantee staff is also included.


    3. Use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology


Survey respondents will be able to choose whether to respond online to an electronic version of the survey or respond on a paper version to be returned by fax. However, we expect that nearly all of them will choose to respond to the online version of the survey, as has been the case in similar projects completed by the Urban Institute.


The web-based survey was created and tested in CHECKBOX, a commercial software application for developing and administering online surveys. The main advantage of the online survey is the automatic tabulation of responses, which reduces both the hours of staff time needed for survey processing and the possibility of introducing human error into the data. The automated skip patterns embedded in the online survey also place less of a burden on the respondent than the customary “if-then go to” instructions of a paper-and-pencil questionnaire (see Attachment 2 for the draft survey instrument). The web and paper versions of the questionnaire will both be in modular formats that allow the primary respondent to pass sections or questions on to other staff members who may be better equipped to address particular topics.
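The CHECKBOX configuration itself is not reproduced in this statement; the sketch below simply illustrates the logic of an automated skip pattern, in which an answer determines the next question shown. The question identifiers are hypothetical.

```python
# Minimal sketch of automated skip-pattern logic. Question identifiers are
# hypothetical and do not reflect the actual CHECKBOX survey configuration.
def next_question(current_id: str, answer: str) -> str:
    """Return the identifier of the next question to display."""
    skip_rules = {
        # A grantee reporting no partner organizations skips the
        # partner-detail block and goes straight to sustainability items.
        ("Q10_has_partners", "no"): "Q20_sustainability",
        ("Q10_has_partners", "yes"): "Q11_partner_roles",
    }
    # Default: proceed to the next question in sequence (stubbed here).
    return skip_rules.get((current_id, answer), "NEXT_IN_SEQUENCE")

print(next_question("Q10_has_partners", "no"))  # -> Q20_sustainability
```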


It is expected that only a few respondents will elect to use the paper version; therefore, the survey is designed primarily for completion online. We will check for any indication of survey-mode bias by testing for significant differences between responses on surveys completed on paper and online.7
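The statement does not name a specific test for mode effects; given that only a handful of paper responses are expected, Fisher’s exact test on a single item is one reasonable choice. A sketch with invented counts:

```python
# Hedged sketch of a survey-mode check: do web and paper respondents answer
# one yes/no item differently? All counts below are invented.
from scipy.stats import fisher_exact

#             "yes"  "no"
table = [[120, 110],   # web respondents
         [  4,   4]]   # paper respondents

odds_ratio, p_value = fisher_exact(table)
print(f"p = {p_value:.3f}")  # a large p-value gives no evidence of mode bias
```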


Collected survey data will be stored in a SQL Server database located on a server in an access-controlled room at the Urban Institute. The database server is behind a firewall that monitors and evaluates all attempted connections from the Internet. The entire database is encrypted using AES for Windows 2000. All transmission of data from remote users to the web server is encrypted using Secure Sockets Layer (SSL) through VeriSign’s Secure Site licensing; subsequent transmission of data from the web server to the database is encrypted by SQL Server.


Interviews during the site visits will be conducted face-to-face, although researchers may follow up with interviews by telephone if there is a need for further clarification after the visit. While on site, interviews will be conducted by teams of two researchers, one who will guide the discussion and one who will primarily take notes on a laptop computer. All hard copies of site visitor notes and audiotapes will be stored in a locked file cabinet when not in use. At the Urban Institute, electronic versions of site visitor notes will be stored on a confidential drive set up by the Urban Institute IT department. Respondents will not interface with any automated, electronic, mechanical, or other technological collection techniques or other forms of information technology during this portion of the proposed data collection.


    4. Identification of duplication of data collection efforts

The information we propose to collect from CBJTG recipients is not otherwise available. There is no other systematic qualitative assessment of the CBJTG program. The information currently being collected from CBJTG recipients through the narrative quarterly reports to ETA is not standardized in a way that allows data analysis.8 To the extent feasible, we will integrate the participant and financial reports submitted by grant recipients into the analysis for this evaluation. However, those reports present purely quantitative data and lack the kind of in-depth information on grant activities that the survey and site visits will provide.


    5. Impacts on small businesses or other small entities

The CBJTG recipient survey does not impact small businesses or other small entities. All survey respondents are community or technical colleges, community college districts, state community college systems, or government workforce entities such as departments of labor, One-Stop Career Centers, and Workforce Investment Boards.


There is a chance that the data collection during the site visits may impact a small business, should one of the grantee partners be a small business. For grantee partners, participation in interviews is voluntary, so a small business may choose not to be interviewed. In addition, interviews with partners will last no longer than one hour and can be done in person or by phone, whichever is most convenient for the small business (or any other) respondent.


    6. Consequences if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles in reducing burden

Given the significant expenditures involved in implementing the CBJTG program, and the role that this and similar grant programs are intended to play in shaping the nation’s workforce system, it is critical to document the different models and projects that are operating under the initiative, examine and assess the implementation to date, and identify innovative features and potentially promising strategies. The CBJTG survey and site visits are critical to this evaluation project, as they represent the only opportunity to gather comprehensive and in-depth information on implementation from all grantees in the first four rounds.


    7. Special circumstances

There are no special circumstances that would cause this information collection to be conducted in a manner that would:

  • require respondents to report information to the agency more often than quarterly;

  • require respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • require respondents to submit more than an original and two copies of any document;

  • require respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • be in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • require the use of statistical data classification that has not been reviewed and approved by OMB;

  • include a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • require respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality.


    8. Public comments in response to Federal Register notice and consultation with outside representatives

Notification of this survey was published in the Federal Register, Vol. 75, No. 87 (Thursday, May 6, 2010: pp. 24990-24991). The public was given 60 days from the date of publication to submit comments. There were no comments from the public during this time period.


The American Association of Community Colleges (AACC) was asked to review and comment on the survey. Representatives from the organization provided feedback that helped clarify several of the questions, ensure that they were appropriate for the community college respondents, and identify questions that were missing. The comments from AACC have been addressed and are reflected in the final version of the survey.


    9. Payment or gift to respondents

Survey respondents will receive no payments or gifts.

    10. Assurance of confidentiality provided to respondents

The CBJTG recipients to whom the survey is distributed, as well as any respondents interviewed during the site visits, will be assured that their responses will be kept private. Steps will be taken, in accordance with the Urban Institute Institutional Review Board (IRB) guidelines, to assure respondents that the information they provide is considered private and will not be shared with anyone outside of the research team in a manner that would allow respondent identification, unless the research team is legally ordered otherwise. All findings from the survey will be presented at the aggregate level. Findings from the site visits will be presented at the organizational level, in order to provide detail and illustrative examples, but no individual respondents will be identified or quoted in any publication. Prior to collecting data, each survey and interview respondent will be given the pertinent privacy information, an explanation of the nature of the study, and a description of the time necessary to participate. However, no binding guarantees of confidentiality will be offered. Please see the first pages of each data collection instrument for respondent privacy statements.


To protect survey respondent privacy, both electronic and paper copy survey data will be secured (procedures are described in the response to item 3 above). While the survey is still active, access to any data with identifying information will be limited only to contractor staff directly working on the survey and will require special usernames and passwords. Once the survey is closed to respondents, responses will be downloaded for analysis from the SQL server database and kept on a controlled access, encrypted network drive. Hard copies of the survey will be entered into the electronic format and kept in a locked file cabinet in a designated Urban Institute employee’s office. All survey hard copies will be shredded upon completion of the evaluation.


To protect site visit respondent privacy, all hard copies of site visitor notes and audiotapes will be stored in a locked file cabinet when not in use. At the Urban Institute, electronic versions of site visitor notes will be stored on a confidential drive set up by its IT department. Access to this drive will be limited to research staff members who are working on the project and have signed the confidentiality pledge. A similar data security procedure will be followed for information obtained from the follow-up telephone interviews with program staff. Three years after the project is completed, notes will be shredded and electronic files securely deleted.


    11. Additional justification for any questions of a sensitive nature

There are no questions of a sensitive, personal, or private nature included in the survey or the site visit interview guides.


    12. Estimates of the hour and cost burden for the information collection

      a) Hour burden of the collection of information

Survey

The survey will be fielded to all 279 grantees from CBJTG rounds 1-4. Respondents are designees whom the grantee organization deems to have sufficient knowledge of the training program to complete the survey, and a grantee may have multiple respondents. Specifically, the respondent will most likely be an administrator at a community or technical college or a four-year college or university (95.7 percent of the grantees). A much smaller percentage of grantee respondents (4.3 percent) will be workforce development professionals (e.g., a One-Stop Career Center manager or Workforce Investment Board executive director).


The estimated response rate is 90 percent. Although participation in evaluation activities is required as a condition of the grant award, we expect that due to changes in staffing, about ten percent of grantees will not respond to the survey.


The response time for the five grantees that pre-tested the survey in March 2010 averaged 60 minutes and varied with the complexity of their programs. Table 1 provides an estimate of the respondent burden for completing the survey.


Table 1. Estimated Time Burden for Respondents to CBJTG Grant Recipient Survey



Category | Sample Size | Response Rate | Number of Respondents | Frequency | Average Time Per Respondent | Total Burden Hours
Staff person(s) of grant recipient organization | 279 | 90% | 251 | Once | 60 minutes | 251 hours
Total | 279 | -- | 251 | -- | -- | 251 hours
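The figures in Table 1 follow directly from the universe size, the expected response rate, and the average completion time; a worked version of the arithmetic:

```python
# Worked arithmetic behind Table 1, using the figures from the table itself.
grantees = 279
response_rate = 0.90
minutes_per_response = 60

respondents = round(grantees * response_rate)            # 279 x 0.90 -> 251
burden_hours = respondents * minutes_per_response / 60   # 251 hours
print(respondents, burden_hours)
```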


Site Visits

Researchers propose to visit eight grantee organizations for the site visit portion of the data collection. At each of these eight sites, we plan to interview approximately six respondents from the grantee organization, two respondents from partner employers, one from the local workforce investment system partner, and one from another key partner, to be identified by the grantee.


Because we selected the visit sites to represent the diversity of grantee organization types, grantee organization respondents will vary accordingly. The majority will be administrators, staff, or faculty at a community or technical college or a four-year college or university. At at least one site, the grantee organization will represent the workforce investment system, so its respondents will be workforce development staff (e.g., a One-Stop Career Center manager, career counselor, program manager, or Workforce Investment Board executive director).


Respondents at employer partners will be managers at local businesses. Respondents at workforce investment system partners will be workforce development staff, and respondents at other partner organizations may be employees at government agencies, social service agencies or community-based organizations.


The expected response rate by the grantees is 100 percent. Participation in evaluation activities is required as a condition of the grant award. The research team will schedule interviews in advance of arriving on site.


The expected response rate for grantee partner organizations is also 100 percent. The primary contact at the grantee organization will assist the Urban Institute in identifying appropriate contacts at partner organizations and scheduling interviews. Since the research design requires only three partner interviews per site, and almost all grantees have at least three partner organizations, we anticipate little difficulty in recruiting the necessary number of partner respondents.


The primary contact at each site will spend an estimated four hours completing the interview and assisting the research team with site visit preparation. All other interviews will last approximately one hour. This timeframe will not vary, given the limited time researchers will have on site.


Table 2. Estimated Time Burden for Respondents to CBJTG Site Visit Interviews



Category | Sample Size | Response Rate | Number of Respondents | Frequency | Average Time Per Respondent | Total Burden Hours
Primary contact at grant recipient organization | 8 | 100% | 8 | Once | 4 hours | 32 hours
Staff person(s) of grant recipient organization | 40 | 100% | 40 | Once | 60 minutes | 40 hours
Staff at employer partner | 16 | 100% | 16 | Once | 60 minutes | 16 hours
Staff at workforce investment partner | 8 | 100% | 8 | Once | 60 minutes | 8 hours
Staff at other partner | 8 | 100% | 8 | Once | 60 minutes | 8 hours
Total | 80 | -- | 80 | -- | -- | 104 hours


      b) Annualized cost to respondents for the hour burden for collection of information

Survey

Table 3 presents the total costs to survey respondents. The estimated cost for staff at grant recipient organizations is based on median hourly wages for administrative service managers at colleges, universities, and professional schools and for local government managers, as listed in the May 2009 National Industry-Specific Occupational Employment and Wage Estimates from the U.S. Department of Labor, Bureau of Labor Statistics, available on the Department’s web site.9 These costs to the respondents for completing the survey and participating in the site visit interviews are expected to come out of their grant funds, since respondents are required to participate in evaluation activities as a condition of the grant award.


Table 3. Estimated Costs to CBJTG Survey Respondents Based on Hour Burden



Category | Estimated Number of Respondents | Total Hours | Median Hourly Wage | Total Annualized Cost
Administrative service managers; colleges, universities and professional schools (95.7 percent of respondents) | 240 | 240 hours | $37.49 | $8,997.60
Local government managers (4.3 percent of respondents) | 11 | 11 hours | $34.52 | $379.72
Total | 251 | 251 hours | -- | $9,377.32
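The totals in Table 3 are burden hours multiplied by the applicable median hourly wage; a worked version of the arithmetic:

```python
# Worked arithmetic behind Table 3, using the figures from the table itself.
rows = [
    ("Administrative service managers (colleges)", 240, 37.49),
    ("Local government managers",                   11, 34.52),
]
total = 0.0
for label, hours, wage in rows:
    cost = hours * wage
    total += cost
    print(f"{label}: {hours} hours x ${wage:.2f} = ${cost:,.2f}")
print(f"Total annualized cost: ${total:,.2f}")  # $9,377.32
```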


Site Visits

The total cost to respondents for the CBJTG site visits is presented in Table 4.


The estimated costs to employer, workforce investment system, and other partner agency respondents are based on median hourly wages for administrative service managers in manufacturing, local government managers, and civic and social organization managers, respectively.


Table 4. Estimated Costs to CBJTG Site Visit Respondents Based on Hour Burden



Category | Estimated Number of Respondents | Total Hours | Median Hourly Wage | Total Annualized Cost
Grantee organization respondents | 48 | 72 hours | $37.49 | $2,699.28
Employer partner respondents | 16 | 16 hours | $43.55 | $696.80
Workforce investment system partner respondents | 8 | 8 hours | $34.52 | $276.16
Other partner respondents | 8 | 8 hours | $28.07 | $224.56
Total | 80 | 104 hours | -- | $3,896.80



    13. Estimate for the total annual cost burden to respondents or record-keepers resulting from the collection of information

Neither the survey nor the site visits will require respondents to purchase equipment or services or to establish new data retrieval mechanisms. There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection. The content of the survey and the site visit instrument is based on the respondents’ experiences, opinions, and factual information. Therefore, the cost to respondents solely involves the time in answering the questions on the survey and the time to complete the interview. This is captured in the burden estimates provided in A.12.


    14. Estimates of annualized costs to the Federal government

The estimated cost of this evaluation, including the proposed data collection effort, to the Federal government is $500,000 and will be borne by ETA.


    15. Reasons for any program changes or adjustments

This is a new request.


    16. Plans for tabulation and publication

After collecting the survey data, the Urban Institute team will present them in summary formats that allow ETA and other stakeholders to better understand the variety of CBJTG programs and their implementation. Details of the programs will be summarized, and tables, charts, and graphs will be used to illustrate the results. A statistical software package such as SAS or Stata will be used to conduct the analyses. The analysis will also integrate the findings from the document review and exploratory site visits to provide a full description of the CBJTG program.


The analysis of survey data will immediately follow its collection. All analysis files will be downloaded from the SQL server database and kept on a controlled access, encrypted network drive.


Qualitative and quantitative analytic activities related to the site visits will begin after the completion of all eight visits. The Urban Institute will prepare individual site summaries and perform a cross-site analysis of key topics such as program design and changes since inception, service delivery models, training typologies, client flow, linkages with other partners, participant outcomes, resource leveraging, potential for replication, and implementation lessons. The cross-site analysis will focus on the same key topic areas covered in the site summaries but will also capture similarities and differences between the sites in key programmatic and operational features and implementation experiences and challenges.


Once the data analysis is completed, the Urban Institute will prepare a final report and submit it to ETA. The report will include a stand-alone summary, an executive summary, the main body, and appendices with additional analyses from the survey. We anticipate that main sections of the final report will describe key findings, promising practices, and implementation challenges. If agreed to by ETA and resources allow, the Urban Institute will also produce a short policy brief, geared toward a practitioner audience and highlighting key lessons and challenges in developing training in high-growth and high-demand occupations at community colleges.


The Urban Institute will send a draft outline of the report and the policy brief to ETA for review. Then it will submit these deliverables in draft to ETA two months after receiving feedback on the outline. One month after receipt of comments on the drafts, the Urban Institute will make revisions and submit the final report and brief. The final report and brief will be published on the ETA and Urban Institute web sites. In addition, a public use data file, stripped of organizational identifiers, will be provided to ETA. These deliverables will be publicly released regardless of findings.


    17. Approval to not display the expiration date for OMB approval

The OMB approval number and expiration date will be displayed or cited on all information collection instruments.


    18. Exceptions to the certification statement

There are no exceptions to the certification statement.

  B. Collection of Information Employing Statistical Methods

    1. Potential respondent universe and any sampling or other respondent selection methods to be used

The respondent universe for this survey comprises the 279 organizations awarded grants in rounds 1-4 of the CBJTG program. Of these organizations, 95.7 percent are community colleges (including community college districts and state systems), other educational institutions (e.g., four-year universities, boards of regents), and technical colleges; the remaining 4.3 percent represent the public workforce system. Because we will survey the entire universe of 279 grant recipients, no sample will be drawn. The expected response rate is 90 percent.

The primary respondent for each grantee organization, who will receive the initial contact e-mail with instructions on how to complete the online survey, will be the individual listed as the primary contact for each grantee in ETA records. Most commonly, this individual was listed as the “person to be contacted” on the grantee’s initial application. Since the level of detailed programmatic knowledge may vary among these primary contacts, the survey instructions and the introductory e-mail direct this individual to delegate sections of the survey to others in the organization who may have more complete knowledge of the grant program. For some grantees, ETA has multiple contacts on file; in such cases, if the primary contact is unresponsive, the Urban Institute team will send the survey instructions to the others. If the survey cannot reach either the primary or the secondary contacts, the survey team will call the grantee organization and work with its staff to identify and reach the appropriate individual.


All data collection through the grantee survey will be based on the entire universe of CBJTGs in rounds 1-4. In all reports, other publications and statements resulting from the survey, no attempt will be made to draw inferences outside the grantee universe. We plan to look at the response group by grant round, type of organization, industry, and other potentially relevant variables to determine if there are any significant differences between the non-respondent and the respondent groups. Any such differences will be reported and considered in the interpretation of the findings.
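The statement does not specify the comparison method; a chi-square test of independence on organization type is one natural way to check whether non-respondents differ from respondents. A sketch with invented counts:

```python
# Hedged sketch of a nonresponse check with invented counts: are respondents
# and non-respondents distributed differently across organization types?
from scipy.stats import chi2_contingency

#            colleges  workforce system
observed = [[241, 10],   # respondents
            [ 26,  2]]   # non-respondents

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")  # large p -> similar groups
```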


The potential universe for site visit selection is the 279 CBJTG program grantee organizations. From this universe, eight grantee organizations were selected for two-day site visits. The site visits are designed to provide in-depth information about a select group of grantees. Although reports and publications will highlight lessons and themes from the site visits, language will be included to be clear that the results from the site visits should not be generalized to the population of CBJTG grantees.


The potential respondents at each selected site include staff from the grantee organization and partner organizations, as outlined in Figure 1. For each of the eight sites, the member of the two-person site visit team primarily responsible for logistics will make initial contact by phone with the individual listed as the primary contact in ETA records. The site visit team will then send an e-mail to inform the grantee organization of the study and request its cooperation. The initial telephone contact will provide background about the project and seek additional information on organizations and partners in order to identify key respondents. Based on this information, the site visit team will contact respondents and determine the best timing for the visit in order to accommodate the schedule of local respondents.


Figure 1. Potential Respondents for Site Visits

Grantee Organization (possibilities include a community college or technical college, community college consortium or system, other educational institution, or a workforce investment organization):

  • Project Director

  • Project staff, i.e., staff responsible for curricular development, support services, participant recruitment and tracking, and other areas of project development and administration

  • Instructors

  • Dean or another community college administrator

  • Student counselors/advisors

Partner Organization (e.g., a community-based organization, other community or technical organization, college or university, employer or industry group, or workforce investment organization):

  • Staff responsible for partnering activities (e.g., job placement, internships, job shadowing for trainees, curricular consulting, recruitment, support services, provision of training or financial resources)


    2. Describe the procedures for the collection of information including:

      a) Statistical methodology for stratification and sample selection

Since this is a qualitative study of CBJTG implementation across the country, no statistical methods will be used to sample respondent populations. All 279 grantees in rounds 1-4 will be surveyed. Additionally, no statistical methods will be used to select the grantee organizations for the site visits as the sample is neither random nor representative.

      b) Estimation procedures

This survey is intended to develop an inventory of grantee goals, activities, project context, and future project plans, not to make statistical inferences about these efforts. Similarly, the site visits are designed to provide in-depth qualitative information about grantees; no estimation procedures will be used.

      c) Statistical techniques to ensure accuracy for the purposes described in this justification

No statistical techniques will be used to ensure accuracy.

      d) Specialized sampling procedures to correct unusual problems

No specialized sampling procedures will be used.

      e) Periodic data collection cycles to reduce burden

Both the site visits and the survey are one-time data collection efforts and will not require periodic data collection cycles.


    3. Methods to maximize response rates and to deal with issues of non-response

Survey

To ensure the full documentation of activities for all grantees, achieving a high response rate is important. We expect that the steps outlined below will produce a response rate of 90 percent of all grantee organizations, because the SGAs state that grantee organizations are required to participate in evaluation activities. Reminding grantees of this requirement in the documentation accompanying the survey will help ensure this high response rate.


Other survey procedures are also designed to ensure high response rates. ETA will send advance letters to all grant directors one month before the survey is fielded. The letter will specify the date on which the survey is scheduled to be sent, the formats in which it will be available (online or in a Microsoft Word version, if needed), the time expected to complete the survey, and the survey’s originator (the Urban Institute).


On the scheduled date, the Urban Institute will e-mail all CBJTG grant directors the link to the web-based survey and instructions for its completion. Respondents will be provided with a contact should they encounter any problems or questions as they complete the survey. Through CHECKBOX, the study team will be able to track who has started the survey, monitor their progress, and follow up with those grantees that have not started or completed the survey. Follow-up with grantee respondents will be done through e-mail reminders.


The Urban Institute will use a PC-based tracking system to monitor the receipt of surveys, status of follow-up reminders, attachments provided by respondents, completion of data entry, and need for further clarification. As each survey is reviewed, follow-up e-mails and telephone calls will be made to those respondents whose surveys contain errors, unclear responses, or missing information. If an evaluation team member is uncertain about how to code a response to an open-ended question or whether follow-up is needed, the survey team leader will review the item. All coding decisions made in such cases will be documented to assure consistency in coding. Surveys completed electronically will be uploaded into a Microsoft Excel database and kept on the dedicated controlled access, encrypted network drive.
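The tracking system itself is not described beyond this paragraph; the sketch below shows one minimal shape such a per-grantee status record could take. All field names are hypothetical.

```python
# Minimal sketch of a per-grantee tracking record; all field names are
# hypothetical and do not describe the actual Urban Institute system.
from dataclasses import dataclass, field

@dataclass
class SurveyStatus:
    grantee_id: str                      # hypothetical identifier
    survey_received: bool = False
    reminders_sent: int = 0
    attachments: list = field(default_factory=list)
    data_entry_complete: bool = False
    needs_clarification: bool = False

record = SurveyStatus(grantee_id="CB-001")
record.reminders_sent += 1               # log a follow-up e-mail reminder
print(record)
```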


Site Visits

For the site visits, it is expected that all (or nearly all) of the grantee organizations we approach will agree to participate. The selected sites have been secured; site visitors will work closely with the primary contact for each grantee in ETA records to help in scheduling the site visit. One member of the two-person site visit team will take responsibility for working with the primary contact person to handle the scheduling and logistics, e.g., identifying appropriate interview respondents. Dates for site visits will be set at least one month in advance to allow ample time to schedule interviews. Interview appointments will then be confirmed via e-mail the week prior to the visit. The site visit team will request that a quiet, relatively private setting (e.g., a conference room) be made available to interview those who do not have private offices, in order to encourage respondents to speak freely.


    4. Tests of procedures or methods to be undertaken

In March 2010, five CBJTG program grantees were contacted to pre-test the online version of the survey. To help ensure representativeness, the five grantees varied along the following dimensions: round of the grant program, operational status, industry, number of training programs, and organization type. Brief follow-up telephone interviews were conducted with each of the pre-test participants to obtain their feedback on the survey. Based on this feedback, several changes were made to the survey instrument in five general areas: timing/burden; documents grantees are advised to gather before taking the survey; substance of questions; technical issues; and missing areas.


In addition, the American Association of Community Colleges (AACC) was asked to review and comment on the survey. Representatives from the organization provided feedback that helped clarify several of the questions, ensure that they were appropriate for community college respondents, and identify questions that were missing. The comments from AACC and the grantee respondents have been addressed and are reflected in the final version of the survey.


Site visit instruments have been reviewed for content, methodology, and burden estimate by internal reviewers at the Urban Institute. Overall, reviewers report that the discussion guides capture the intended data and minimize burden on respondents.


    5. Name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency


All data collection and analysis will be conducted by:


The Urban Institute

2100 M Street, NW

Washington, DC 20037


Persons Responsible: Lauren Eyster, Project Director

(202) 261-5621

[email protected]

Demetra Nightingale, Co-Principal Investigator

(202) 261-5570

[email protected]

1 U.S. Department of Labor, Employment and Training Administration. 2008. “The President’s Community-Based Job Training Grants.” http://www.doleta.gov/business/PDF/cbjt_overview.pdf.

2 Notice of Availability of Funds and Solicitation for Grant Applications (SGA) for Community-Based Job Training Grants, 70 Fed. Reg. 22905 (May 3, 2005); Notice of Availability of Funds and Solicitation for Grant Applications (SGA) for Community-Based Job Training Grants, 71 Fed. Reg. 37984 (July 3, 2006); Notice of Availability of Funds and Solicitation for Grant Applications (SGA) for Community-Based Job Training Grants, 73 Fed. Reg. 60340 (October 10, 2008); Notice of Availability of Funds and Solicitation for Grant Applications (SGA) for Community-Based Job Training Grants, 75 Fed. Reg. 12272 (March 15, 2010). See also ETA news release, January 16, 2009, http://www.dol.gov/opa/media/press/eta/archive/eta20090068.htm.

3 75 Fed. Reg. 12272 (March 15, 2010); 73 Fed. Reg. 60349 (Oct. 10, 2008); 71 Fed. Reg. 37959 (July 3, 2006); 70 Fed. Reg. 22913 (May 3, 2005).

4 U.S. Government Accountability Office. (2008). Employment and Training Program Grants: Evaluating Impact and Enhancing Monitoring Would Improve Accountability. Report to Congressional Requesters.

5 U.S. Government Accountability Office. (2008). Employment and Training Program Grants: Labor Has Outlined Steps for Additional Documentation and Monitoring but Assessing Impact Still Remains an Issue. Testimony before the Senate Subcommittee on Employment and Workplace Safety, Committee on Health, Education, Labor and Pensions.

6 Survey questions will be pre-programmed to reflect a respondent’s operational status.

7 Newcomer, Kathryn E., and Timothy Triplett. 2010. “Using Surveys.” In Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer, eds., Handbook of Practical Program Evaluation, 3rd ed. San Francisco: Jossey-Bass.

8 OMB No. 1205-0N465.

