Evaluation of the Trade Adjustment Assistance Community College Career Training Grants Program

OMB: 1291-0007



B. Collection of Information Employing Statistical Methods

1. Potential respondent universe and any sampling or other respondent selection methods to be used


Survey


The respondent universe for the online college survey comprises the 867 colleges across all 178 Rounds 1-3 grant recipients. As the Urban Institute research team will survey the entire universe of 867 colleges, no sample will be drawn. The expected response rate is 90 percent.

The list of primary (and secondary, if available) respondent names and e-mail addresses for each college will be developed in collaboration with the ETA staff responsible for the TAACCCT grant program and the primary contact for each grantee organization. The primary respondent is defined as the person responsible for the grant's implementation at each college. Since the level of detailed programmatic knowledge may vary among these primary contacts, the survey instructions and the introductory e-mail direct this individual to delegate sections of the survey to others in the organization who have more complete knowledge of the grant program. If the survey invitation does not reach either the primary or the secondary contact, the Urban Institute team will contact the grantee organization and work with its staff to identify and reach the appropriate individual.


All data collection through the college survey will be based on the entire universe of TAACCCT colleges in Rounds 1-3. In all reports, publications, and statements resulting from the survey, no attempt will be made to draw inferences beyond the grantee universe. We plan to examine the respondent group by grant round, type of organization, industry, and other potentially relevant variables to determine whether there are significant differences between the respondent and non-respondent groups. Any such differences will be reported and considered in the interpretation of the findings.


Key variables to be collected in the survey relate to: (1) program development and capacity-building activities – program goals, staffing, educational methods and curricular improvements, credentials, transfer and articulation agreements, strategic alignment with other systems, and technology; (2) participants' experience – eligibility, supports, enrollment, length of programs, and remedial components; (3) partnerships with employers and other stakeholders – number of partnerships, role of partners, and experiences with partners; and (4) sustainability – whether the program(s) will continue, challenges to sustainability, and the future of partnerships.


Site Visits


The potential universe for site visit selection is the 129 Rounds 2 and 3 TAACCCT grant recipients. From this universe, 20 grantee organizations will be selected for three- to four-day site visits. The site visits are designed to provide in-depth information about a group of grantees with a range of characteristics and grant designs. Although reports and publications will highlight lessons and themes from the site visits, language will be included making clear that results from the site visits should not be generalized to the full population of TAACCCT grantees.


Suitable sites will be identified using information in the grantee database, which provides data on each grant's intervention and evaluation design based on grant applications, evaluation plans, and performance reports. We will also use supporting documentation, including narrative progress reports and notes from clarification calls to 15 grantees. We will rank-order a list of potential sites, describing the characteristics that make each grantee suitable for inclusion in the evaluation. From this list, 10 sites will be identified from Rounds 2 and 3 to ensure that we visit a range of grants with varying approaches and experiences to inform the implementation analysis. An alternate with similar characteristics will be identified for each site in the event that a selected grantee is unable to participate in the data collection.


The following criteria will be used for site selection: the TAACCCT intervention being implemented; number of consortium members; industry sector; program size and structure; types of credentials on which grantees are focusing; target population; funding; geographic region; partnership structure; and evaluation design. The last criterion is essential for assessing the feasibility of similar grant interventions participating in and sustaining random assignment for an experimental evaluation or other more rigorous evaluation designs. In addition, sites will be selected to inform the cross-site non-experimental analysis.


The potential respondents at each selected site include staff from the grantee organization, partner colleges (if a consortium grant), and other partner organizations, as outlined in Table 5. For each of the 20 sites, the member of the two-person site visit team primarily responsible for logistics will make initial contact by phone with the individual listed as the primary contact in ETA records. The site visit team will then send an e-mail to inform the grantee organization of the study and request its cooperation. The initial telephone contact will provide background about the project and seek additional information on organizations and partners to identify key respondents. Based on this information, the site visit team will contact respondents and determine the best timing for the visit to accommodate the schedules of local respondents.


To prepare for the focus groups, we will discuss with each TAACCCT grantee the best and most culturally appropriate recruitment techniques. We will work with the grantees to recruit students for participation in one of two ways. First, we will ask the grantee to provide a comprehensive list of program participants with their contact information, and we will then recruit participants using recruitment letters and, if necessary, follow-up phone calls. Alternatively, if the grantee prefers to have potential participants contacted by program staff (e.g., the program coordinator), we will provide the recruitment materials to facilitate those outreach efforts.

Table 5. Respondent Universe for Web Survey and Site Visits

Data Collection Activity | Universe/Sampling Frame | Respondent Description
Online college survey | Universe of 867 Round 1-3 TAACCCT colleges (census) | Program coordinator or the most knowledgeable person designated by the college
Site visits: Semi-structured interviews | Universe of 129 Round 2 and 3 grant recipients; purposive sample of 20 grantees representing a range of grantee characteristics and program design features | TAACCCT project coordinators, other college staff, industry and community partners, and employer partners
Site visits: Focus groups | Universe of approximately 400 TAACCCT students contacted for focus group participation across the 20 grant sites; convenience sample of 320 students willing and able to attend | Students participating in TAACCCT-funded programs


At the outset of the interviews, we will inquire about the vision and need for the program, including its goals and intended outcomes. Focusing on the structure of the capacity-building efforts conducted by the TAACCCT grantees, we will document the grantee's (and, for consortia, each participating college's) organization and characteristics, administrative structure (i.e., single site or consortium), and the roles and responsibilities of key implementation staff. The history and nature of the partnerships and collaborations will be explored, as well as the resources leveraged to facilitate start-up and implementation.


Inquiry concerning the capacity-building efforts stemming from the TAACCCT initiative will focus on the overall planning and design that supported the development of the grant activities; the resources needed to implement the activities (e.g., staff training and new hires; facilities and infrastructure; curricula; student services); the processes by which the activities were implemented; and evaluation capacity. Discussion of the TAACCCT Core Elements will focus on the use of evidence-based models and approaches and the implementation of stacked and latticed credentials, highlighting career pathways, program curricula, modalities, use of technology, and the credentials, certifications, and degree programs adopted. We will also explore the use of online and technology-enabled learning. Transfer and articulation policies and agreements between partner colleges and other institutions of higher education will be explored, as well as the strategic alignment between these and other key stakeholders.


Examination of key characteristics will focus on multiple dimensions of the local/regional context, including the economy and labor market conditions, population, employer needs, policy and budget climate, political environment, and historical program experience. Additional characteristics to explore are the target population(s) served and the reasons for this focus, as well as the target industry and occupational focus. Because the grants selected for fieldwork will also inform the evaluability assessment, we will ask about recruitment and retention of the target population(s). These questions will address student eligibility, application/intake, enrollment, and orientation. Progress monitoring, academic and social supports for students, and job supports (e.g., placement, shadowing) as part of capacity-building efforts will be examined. We will ask about facilitators and challenges experienced during the planning and implementation of the particular model.


We will inquire about program outputs and outcomes at three levels: students (enrollment, retention, educational attainment, and employment); institutional capacity building (both internal and external); and overall grantee accomplishments (to date). Questions about the evaluation design and status will also be addressed. Finally, we will inquire about the sustainability potential and plans of each grantee.


For the focus groups, we will ask about participants’ reasons for enrollment, how they were recruited, their educational and employment aspirations, orientation to the program, satisfaction with the program, linkages to other college services and assistance, interactions with partners, and achievement of goals.

2. Describe the procedures for the collection of information including:

The data will be collected through the online college survey, semi-structured interviews held at selected grantees, and focus groups of students participating in TAACCCT-funded programs. The survey instrument is provided in Attachment 2, the interview protocols in Attachment 3, and the focus group guide in Attachment 4.

Statistical methodology for stratification and sample selection

Since this is a qualitative study of TAACCCT implementation across the country, no statistical methods will be used to sample respondent populations. All 867 colleges in Rounds 1-3 will be surveyed. Additionally, no statistical methods will be used to select the grantee organizations for the site visits, as that sample is purposive rather than random or representative.

Estimation procedures

This survey is intended to develop an inventory of grantee goals, activities, project context, and future project plans, not to make statistical inferences about these efforts. Similarly, the site visits are designed to provide in-depth qualitative information about grantees; no estimation procedures will be used. The data analysis will be descriptive.

Statistical techniques to ensure accuracy for the purposes described in this justification

No statistical techniques will be used to ensure accuracy.

Specialized sampling procedures to correct unusual problems

No specialized sampling procedures will be used.

Periodic data collection cycles to reduce burden

Both the site visits and the survey are one-time data collection efforts and will not require periodic data collection cycles.


3. Methods to maximize response rates and to deal with issues of non-response



Survey

To ensure full documentation of activities across all TAACCCT colleges, a high response rate is important. We expect the steps outlined below to produce a response rate of 90 percent across all colleges because the Solicitation for Grant Applications (SGA) states that grant recipients are required to participate in evaluation activities. Reminding grantees and their partner colleges of this requirement in the documentation accompanying the survey will help ensure this high response rate.


Other survey procedures are also designed to ensure high response rates. To reach grantees prior to fielding the survey, DOL will send advance letters to all grant directors one month before the survey (see Attachment 5). The letter will specify the date on which the survey is scheduled to be sent, the formats in which it will be available (online or, if needed, in a Microsoft Word version), the time expected to complete the survey, and the survey's originator (the Urban Institute).


On the scheduled date, the Urban Institute will e-mail all primary college contacts the link to the online survey and instructions for completion. Respondents will be provided with a contact should they encounter any problems or questions as they complete the survey. Through CheckBox, the research team will be able to track who has started the survey, monitor progress, and follow up with grantees that have not started or completed it. Follow-up with grantee respondents will be done through periodic e-mail reminders.


The Urban Institute research team will use a PC-based tracking system to monitor the receipt of surveys, the status of follow-up reminders, attachments provided by respondents, completion of data entry, and the need for further clarification. As each survey is reviewed, follow-up e-mails and telephone calls will be made to respondents whose surveys contain errors, unclear responses, or missing information. If a research team member is uncertain about how to code a response to an open-ended question or whether follow-up is needed, the survey team leader will review the item. All coding decisions made in such cases will be documented to ensure consistency in coding. Surveys completed electronically will be uploaded into a Microsoft Excel database and kept on a dedicated, controlled-access, encrypted network drive.


The research team will examine whether the expected 10 percent non-response among the college universe introduces bias into the measurement of the variables of interest. We do not expect systematic non-response across grant rounds because the survey will be fielded at the same point in each grant's period of performance, but it will be important to confirm this through analysis. The tracking system allows us to monitor who has not responded to the survey, and we can compare survey respondents and non-respondents on variables such as the size of the grant, whether it was competitive or state-designated, consortium or single-college status, region, and industry of intervention to learn more about possible non-response patterns. Any such patterns will be discussed in the final report to ensure proper interpretation of the results.
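
To illustrate the kind of descriptive comparison envisioned here, the sketch below shows one way it might be carried out in Python. The file name and column names (responded, grant_type, consortium, region, industry, grant_size) are hypothetical stand-ins for the tracking-system export, and the chi-square and t-tests are illustrative rather than part of the approved design.

```python
# Hypothetical sketch of a respondent/non-respondent comparison.
# The file name and column names are assumptions, not the actual
# tracking-system export format.
import pandas as pd
from scipy.stats import chi2_contingency, ttest_ind

colleges = pd.read_csv("taaccct_tracking_export.csv")  # hypothetical export

# Compare the distribution of each categorical characteristic
# between respondents (responded == 1) and non-respondents.
for var in ["grant_type", "consortium", "region", "industry"]:
    table = pd.crosstab(colleges[var], colleges["responded"])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"{var}: chi2 = {chi2:.2f}, p = {p:.3f}")

# Compare mean grant size between the two groups (Welch's t-test).
resp = colleges.loc[colleges["responded"] == 1, "grant_size"]
nonresp = colleges.loc[colleges["responded"] == 0, "grant_size"]
t, p = ttest_ind(resp, nonresp, equal_var=False)
print(f"grant_size: t = {t:.2f}, p = {p:.3f}")
```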



Site Visits

For the site visits, it is expected that all of the grantee organizations approached will agree to participate. Once the selected sites have been confirmed, site visitors will work closely with the primary contact listed for each grantee in ETA records to schedule the visit. One member of the two-person site visit team will take responsibility for working with the primary contact to handle scheduling and logistics, e.g., identifying appropriate interview respondents. Dates for site visits will be set at least one month in advance to allow ample time to schedule interviews. Interview appointments will then be confirmed via e-mail the week prior to the visit. Should a potential respondent not be available during the visit, the research team will follow up to arrange a time to interview the person by phone.


4. Tests of procedures or methods to be undertaken

In the late summer and early fall of 2014, the National Evaluation Team tested the data collection instruments. The instrument design and questions were based on those developed for several previous implementation evaluations of similar grant programs, including the High Growth Job Training Initiative (USDOL), the Community-Based Job Training Grants (USDOL), and the Health Profession Opportunity Grants (USDHHS). However, it was important to test the instruments with key respondents for TAACCCT. The team tested the online survey and qualitative site visit instruments to ensure that the instruments (and each question) are clearly written and understandable to respondents, fully cover appropriate topics, ask questions appropriately, and offer respondents a complete listing of response categories for each closed-ended question.


The team conducted the following activities to pretest the college survey and the qualitative instruments.


Survey. In addition to internal testing of the online survey, the team asked nine institutions to complete the survey. Seven completed the survey, and one partially completed it. We selected pretesters to vary by grant type (consortium of colleges or a single institution) and industry of focus. Survey completers were asked to provide feedback on the survey's ease of use, its relevance to their TAACCCT experience, and the readability and completeness of the questions and response options.



Qualitative site visit instruments. The team tested all qualitative instruments at least once to assess respondent comprehension of questions, response burden, organization of the questions, and delivery method. The data collection for this part of the study comprises six instruments, one for each respondent type: (1) the grant program director; (2) college staff; (3) college leaders; (4) supportive services staff; (5) employer partners; and (6) focus groups of program participants. In total, the team recruited 10 testers at grantee institutions, with each tester reviewing at least one instrument. (No more than nine individuals were asked to pretest any one instrument.) The testing modes were the following:



  • In-person testing with one grantee institution in the greater Washington, DC area

  • Written feedback through email

  • Telephone interviews using the instruments


Based on the results of the testing, most questions were clear and were answered by testers. However, several changes were made to the survey and qualitative instruments in the following general areas: wording of questions; definitions of terms; sequencing and organization of questions; technical programming issues, including pre-populating information from the grant application to make various questions easier to answer; and guidance on documents that institutions should gather before taking the survey. Overall, the survey completion time was as expected, around 90 minutes; as suggested in the survey's introduction, this time could be reduced with better information about the documents useful for completing the survey and through some of the suggested question revisions. The attached instruments incorporate the revisions suggested by the testing.


5. Name and telephone number of individuals consulted

List the names and telephone numbers of individuals consulted on the statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The agency overseeing this evaluation is:


U.S. Department of Labor

Chief Evaluation Office

200 Constitution Ave., NW

Washington, DC 20210


Person Responsible: Erika Liliedahl, Project Officer

(202) 693-5992

[email protected]


All data collection and analysis will be conducted by:


The Urban Institute

2100 M Street, NW

Washington, DC 20037


Person Responsible: Lauren Eyster, Project Director

(202) 261-5621

[email protected]
