
Follow-Up Survey Information for Green Jobs and Health Care Impact Evaluation, American Recovery and Reinvestment Act Grants

OMB: 1205-0506


PART A: SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION: FOLLOW-UP SURVEYS, GREEN JOBS AND HEALTH CARE AND HIGH GROWTH TRAINING GRANT INITIATIVES

The Employment and Training Administration (ETA) in the U.S. Department of Labor (DOL) is undertaking the Green Jobs and Health Care (GJ-HC) Impact Evaluation of the Pathways Out of Poverty and Health Care and High Growth Training grant initiatives. The goal of this evaluation is to determine the extent to which enrollees achieve increases in employment, earnings, and career advancement as a result of their participation in the training provided by Pathways and Health Care grantees and to identify promising practices and strategies for replication. ETA has contracted with Abt Associates and its subcontractor, Mathematica Policy Research, to conduct this evaluation.

In July 2011, OMB approved the baseline data collection for this evaluation (OMB 1205-0486), and in March 2012, OMB approved a subsequent request for the process study data collection, which includes site visits and focus group administration (OMB 1205-0487). This information collection request is limited to the follow-up interviews planned for all study participants 18 months and 36 months after baseline collection (the draft follow-up telephone questionnaire is presented in Attachment 1).

Requests for approval for this evaluation needed to be submitted in three parts for several reasons. The main reason is that it was necessary to (1) conduct random assignment and collect baseline data early in the study period to obtain the sample size needed for the estimation of program impacts and (2) conduct two rounds of process study visits, including one when the early sample was participating in the training program. In addition, the study structure required that the baseline data inform the development of the follow-up data collection effort. As a result, it was necessary to first obtain clearance for the baseline data collection and gain experience in its implementation before the follow-up instruments could be developed and submitted for clearance. Thus, ETA is now requesting OMB approval of these follow-up instruments so that the evaluation can be completed on schedule.

Justification

1. Circumstances Necessitating Data Collection

As part of a comprehensive economic stimulus package funded under the American Recovery and Reinvestment Act of 2009 (ARRA), DOL funded a series of grant initiatives to promote training and employment in select high-growth sectors of the economy. Individuals facing significant barriers to employment, as well as those recently displaced by the economic downturn, are the high-priority labor pools targeted by these ARRA initiatives. As part of ARRA's focus on labor demand, the Department places particular emphasis on high-growth and emerging industries, especially emerging "green" sectors of the economy and health care fields with pressing skill shortages. These grant programs are consistent with ETA's emphasis on customized, sector-based labor market solutions and on targeting job seekers (including incumbent workers) who face significant barriers to economic self-sufficiency as a resource for growth sectors that face skill shortages or anticipate a need to hire.

ARRA’s focus on providing training for workers to fill jobs in high-growth and emerging industries comes at a critical time. During periods of both recession and expansion, it is important to address the challenge of building and maintaining a productive workforce to ensure long-term economic competitiveness. This applies particularly in industries, such as health care, education, and energy, in which the Bureau of Labor Statistics projects significant job growth over an extended time (Bureau of Labor Statistics 2010). However, several factors, including declines in educational attainment among American workers, a skilled workforce in need of replacements for aging and retiring workers, and continued immigration, are affecting workforce skill levels and the ability of employers to remain competitive and increase productivity (Dohm and Shniper 2007). Training programs like those funded by ARRA are designed either to provide these skills or to provide an entry-level career path toward acquiring them.

ETA’s grant programs represent an important step towards increasing post-secondary education and training in high-growth areas, particularly those related to health and green jobs. These programs supply resources for providing training, encourage partnerships between different service delivery systems, feature strong employer involvement, and focus on the provision of innovative and promising training strategies. To learn about the impacts of this significant investment of resources in training programs, ETA has funded a rigorous evaluation using a random assignment research design.

The two goals of this evaluation are to (1) determine if members of the randomly assigned treatment group (who have access to grant-funded services) achieve greater employment, earnings, and career advancement than otherwise equivalent control group members, and (2) identify promising practices and strategies for producing those effects for possible future replication. The study uses an experimental design to measure the impact of access to grant-funded training and support services, as well as a process study to examine intervention implementation and operations and provide context for interpreting impact study results. The evaluation is a census of the participating trainees at four study sites and will measure the effectiveness of the training strategies adopted by these four grantees, which were selected from among the 93 grantees funded under the Pathways Out of Poverty and the Health Care and Other High Growth Industries programs. The evaluation team based selection of these grantees primarily on the strength and scale of the grantees’ intervention and their ability to support the requirements of this type of evaluation. The study will not indicate whether the two grant funding vehicles as a whole produce beneficial effects, but rather will tell ETA whether any of the specific training approaches used in the four study sites are worth emulating as successful models for improving workforce outcomes in the green jobs and/or health care sectors.

As described in the clearance package submitted earlier for the baseline data collection, during the intake period all persons who apply for grant services in the study sites and are determined to be eligible are given information about the study (including information on random assignment) and asked to sign a form confirming they have received and understand information about the study. (This form was approved in the earlier package). Everyone who consents to participate is asked to complete a Baseline Information Form that gathers information on sample members’ background characteristics (this form was also approved in the earlier package). Grantee staff then enter the person’s data into the web-based Participant Tracking System and the system randomly assigns each participant to either the treatment or the control group. Staff then notify the participant of his or her assignment status. People who do not consent to participate in the study are not randomized or served with grant funding, but may obtain training and employment help from other sources on their own. Such training and employment help is also available to randomly assigned control group members who are excluded from grant-funded training and services as well as treatment group members who can access grant-funded services.
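To make the mechanics concrete, the following minimal sketch illustrates site-specific random assignment with unequal allocation ratios (such as the 2:1 treatment-to-control split targeted in Grand Rapids; see Table A.1 below). It is an illustration only: the function, the probability table, and the seed are our assumptions, not the actual logic of the web-based Participant Tracking System.

```python
import random

# Illustrative treatment-assignment probabilities derived from the Table A.1
# sample targets; the actual Participant Tracking System logic is not
# published in this package.
TREATMENT_PROBABILITY = {
    "AIOIC (MN)": 600 / 1200,        # 1:1 allocation
    "Grand Rapids (MI)": 600 / 900,  # 2:1 allocation
    "North Central Texas": 589 / 1074,
    "Kern (CA)": 425 / 850,          # 1:1 allocation
}

def assign(site: str, rng: random.Random) -> str:
    """Return a random group assignment for a consenting participant at a site."""
    return "treatment" if rng.random() < TREATMENT_PROBABILITY[site] else "control"

rng = random.Random(20120921)  # fixed seed so the sketch is reproducible
print(assign("Grand Rapids (MI)", rng))
```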

As noted above, a second clearance package was submitted for the process study visits. This research activity involves conducting two rounds of site visits to the four grantees participating in the evaluation. During these site visits, the evaluation team will observe program activities; conduct semi-structured interviews with administrators, staff, partners, and employers; and—in the first round of visits—hold focus groups with participants. Observations and discussion topics cover the program environment, participant flow through random assignment and program services, the nature and content of the training provided, the control group environment, and grantee perspectives on implementation challenges and intervention effects. The qualitative information collected during these visits will enable the team to describe the program design and operations in each site, help interpret the impact analysis results, and identify lessons learned for purposes of program replication for those models found to have positive labor market impacts.

The final round of data collection—the two follow-up surveys submitted with this package—will complement these baseline and process study data collection efforts by looking at outcomes for the treatment and control group members. The evaluation will address the following research questions:

  • What is the impact of the selected grantee programs on the receipt of education and training services by treatment group members, in terms of both the number who receive these services and the total hours of training received?

  • What is the impact of the programs on the completion of training and educational programs and on the receipt of certificates and credentials from these programs?

  • What is the impact of the programs on employment levels and earnings? To what extent do the programs result in earnings progression?

  • To what extent do the programs result in any employment (regardless of sector)? To what extent do the programs result in employment in the specified sector in which the training was focused?

  • What features of the programs seem to be associated with positive impacts, particularly in terms of target group, curricula and course design, and additional supports?

  • What are the lessons for future programs and practices?

At each selected site, individuals are being randomly assigned to a treatment or control group. A total of 4,024 sample members are expected across the four sites, with target sample size totals varying by site as shown in Table A.1.

Table A.1. Projected Number of People to Be Randomized for the Green Jobs-Health Care Impact Evaluation, by Site

Site                  Treatment Group Members   Control Group Members   Total Sample
AIOIC (MN)                      600                     600                1,200
Grand Rapids (MI)               600                     300                  900
North Central Texas             589                     485                1,074
Kern (CA)                       425                     425                  850
Total                         2,214                   1,810                4,024

AIOIC = American Indian Opportunities Industrialization Center

For this evaluation, the treatment condition is defined as having the opportunity to enroll in training funded by either a Pathways or a Health Care grant. The treatment condition varies across the sites, depending on the training programs grantees implement with their funds and the context in which each program operates. The control condition, or counterfactual, is defined as not having the opportunity to enroll in training funded by Pathways or Health Care grants. However, control group members will not be prevented from enrolling in other locally available training programs or services in the community. It is reasonable to assume that some people assigned to the control group will find opportunities to receive training from other sources.1 This configuration—a comparison between outcomes for participants with access to the services of the focal programs and those with access to other services in the community—is a common design for random assignment studies of training programs. It is also one that answers the most relevant policy question: Are participant outcomes improved when services of the type funded by the Pathways and Health Care grants are added to the configuration of training services already available in the community?

The conceptual framework, depicted graphically in Figure 1.1, outlines the ways in which elements of the intervention as well as outside factors are expected to influence short- and long-term outcomes of treatment group members. The intervention characteristics are what DOL is funding under the Pathways and Health Care grants. The programs themselves generate outputs, as shown in Figure 1.1. These program outputs then lead to the short-term outcomes of employment, earnings, job quality, and potential additional education and advanced training. The short-term outcomes in turn lead to the longer-term outcomes of potentially better employment, earnings, job quality, job persistence, career advancement, and personal or family economic stability. The conceptual framework recognizes that it is not only the characteristics of the program that generate these outputs and outcomes; environmental context and personal characteristics are also influential factors. The arrows between specific boxes in the model represent the expected influences among the factors.

Figure 1.1. Conceptual Framework for GJ-HC Impact Evaluation


The conceptual framework is general enough to capture variation along several dimensions. A major dimension of interest to the evaluation is what distinguishes green jobs from health care or high-tech industry jobs, and elements of the conceptual framework capture variation that might exist in that regard. For example, among environmental factors that might matter, the framework includes sector distribution, which might be measured as the percentage of the local labor market that manufacturing jobs comprise. Similarly, the contents of the central Intervention Characteristics box are site-specific, as relevant to the evaluation, but can all be captured in one overarching framework. This conceptual framework is the foundation for both the process and impact portions of the evaluation. For instance, the process analysis approved earlier may create site-specific versions of this conceptual model, providing rich context for interpreting results from the impact analysis, and the model also suggests key subgroup analyses to be explored.

Overview of Data Collection

In order to address the above impact research questions adequately, the evaluation will need to collect detailed baseline and follow-up data from different sources at various points in time.

Data collected at the earlier, baseline stage enable the evaluation team to describe the characteristics of study participants at the time they are randomly assigned to the treatment or control group, ensure that random assignment produces matching groups, create subgroups for the analysis, provide contact information to locate individuals for follow-up surveys, and improve the precision of the impact estimates. As noted earlier, the request for approval of the baseline data collection effort was included in a separate OMB package.

Data collected through process study site visits will enable the evaluation team to look closely at grantees’ program structures and how program staff deliver services to the treatment group. Two rounds of visits will be conducted. The visits will include interviews with key program partners (such as One-Stop Career Centers, community-based service organizations, and community colleges) and a few local employers from relevant employment sectors. The first round of visits also will include group discussions with program participants.

Two follow-up telephone surveys, the focal point of this clearance package, will be attempted with 4,024 study participants (all 2,214 members of the treatment group and all 1,810 members of the control group). Telephone interviews, which will be conducted 18 and 36 months after random assignment, will serve two major purposes: the first is to provide information on service receipt and educational outcomes; the second is to examine long-run employment and economic security. While each wave of the survey addresses both issues to some extent, given their timing in relation to participation in training, the first will have a greater focus on service receipt and educational attainment, and the second will have a relatively greater focus on employment, earnings, and career progression.

2. How, by Whom, and for What Purposes Will the Information Be Used?

ETA requests clearance to collect follow-up survey data on service receipt, educational outcomes, long-run employment, economic security, and other outcomes pertinent for evaluating the impacts of the ARRA grant initiatives on participants. The 18-month and 36-month follow-up surveys are described in detail below along with specific details on how, by whom, and for what purposes the information will be used.

18-Month Follow-up Telephone Survey

Telephone interviewers will attempt to reach all study participants, including all treatment and control group members as part of the first telephone survey 18 months after their random assignment dates.

Data on service receipt, a primary focus of the 18-month survey, will aid in developing an understanding of any subsequent program impacts on labor market outcomes. Because impact estimates will be based on differences in outcomes between the treatment and control groups, it will be especially important to understand what, if any, training and related services the control group received. The reason for this is that an estimated impact that cannot be statistically distinguished from zero could be driven by high participation of control group members in services that are similar to grant-funded services.

Additional data elements from the 18-month survey will support analysis of the short-term impacts of the interventions. In particular, as shown in Figure 2.1, the 18-month survey will collect information on key outcomes of interest in domains such as the acquisition of credentials, employment and earnings, quality of jobs, match between job type and the training program, and total income and use of public benefits. To determine whether grant-funded training affects barriers to and attitudes toward work, the 18-month survey will also collect opinions about work similar to the information collected on the baseline information form (Attachment 1 contains the draft follow-up telephone questionnaire).

36-Month Follow-up Telephone Survey

Telephone interviewers will also attempt to contact all treatment and control group members 36 months after random assignment to administer a second telephone follow-up survey. This survey will use the same instrument used for the 18-month data collection effort (again, see Figure 2.1 and Attachment 1). However, given the activities that participants are likely to be involved in over time, the 36-month survey will focus less on program participation measures and more on long-run employment and earnings. The survey will document longer-run employment and income, wage and earnings progression, career advancement, job characteristics (including employee benefits), and use of public benefit programs. For individuals who complete the 36-month interview but were not interviewed at 18 months, the retrospective questions, which otherwise cover the period since the previous interview, will extend back to study enrollment.

Figure 2.1. 18- and 36-Month Survey Data Elements


EMPLOYMENT AND EARNINGS

Employment Since the Beginning of the Follow-Up Period(a)
  • Employed (name of employer, location of employer)
  • Earnings
  • Wage rate and hours worked
  • Industry/occupation
  • Length of time in current job
  • Availability of fringe benefits (paid time off, health insurance, etc.)(b)
  • Work schedule (regular, split shift, odd job, etc.)
  • Job is on a career pathway
  • Represented by union
  • Number of jobs held
  • Industry/occupation of previous jobs
  • Periods when laid off from job
  • Reasons for job separation
  • Work-related activities in the past week

Barriers to Employment & Opinions About Work
  • Factors that limit ability to work
  • Lowest acceptable hourly wage
  • Criminal behavior (asked in Grand Rapids only): parole/probation violations, arrests, convictions, incarceration

EDUCATIONAL OUTCOMES & SERVICE RECEIPT

Training/Education
  • Type & number of basic education courses
  • Secondary education
  • Post-secondary education
  • Occupational skills training
  • Occupation for which being trained
  • Other types of skills training
  • Dates and duration attended
  • Reason program not completed
  • Obtained new job or promotion because of training
  • Degree/training useful for current job
  • Who paid for training and how much

Acquisition of Credentials
  • Completion of training/education
  • Attained a degree, license, certification, or other credential
  • Type of degree, license, certification, or credential
  • Field of study of degree
  • Received a high school diploma or GED
  • Where obtained degree, diploma, license, certification, or credential

Employment-Related Support
  • Additional types of assistance received
  • Paid/unpaid internship or on-the-job training
  • Case management/counseling

Supportive and Other Services
  • Type of supportive services
  • Received needs-related payment
  • Other services received

Perspectives on Services, If Any Received
  • Why chose to seek training/employment services

Financial Hardship
  • Difficulty making ends meet
  • Financial difficulties related to housing/paying bills/savings

Current Family Status & Demographics
  • Date of birth/age (if not obtained during baseline)
  • Race and ethnicity (if not obtained during baseline)
  • Gender (if not obtained during baseline)
  • Household composition

Income and Receipt of Public Benefits
  • Receipt of TANF, SNAP, SSI, UI, TAA, WIC, or other benefits
  • Total months receiving benefits
  • Total average monthly benefit amount
  • Household income
(a) Some data elements about employment are collected for all jobs held since the beginning of the follow-up period. The beginning of the follow-up period is "since random assignment" for (1) all survey respondents completing the 18-month interview and (2) survey respondents completing the 36-month interview who did not complete the 18-month interview; it is "since the 18-month interview" for survey respondents who completed the 18-month interview and are completing the 36-month interview. Other data elements about employment are asked about focal jobs only.

(b) Health insurance coverage is asked about in the context of both employer-provided coverage and coverage through other sources.


3. Use of Improved Technology to Reduce Burden

Computer assisted telephone interviewing (CATI) will be used for the telephone surveys. CATI allows interviewers to move swiftly through the survey instrument, asking only those questions that are relevant to a particular respondent, based on his or her earlier answers. This reduces the length of time respondents spend on the phone, and minimizes the likelihood that respondents will be asked to answer questions that do not apply to them, which is often an issue with in-person or paper-and-pencil interviews.

CATI is a good choice of method of administration for telephone interviews with large numbers of respondents. With CATI, information about sample members, such as information collected on their baseline information forms, can be preloaded to improve question flow and data accuracy, and reduce respondent burden. CATI programs are efficient and accept only valid responses based on preprogrammed checks for logical consistency across answers. Interviewers are thus able to correct errors during the interview, eliminating the need for costly callbacks to respondents. Also, dialing errors are almost completely eliminated because calls will be made through a preview dialer. The preview dialer allows interviewers to review case history notes and the history of dispositions. The interviewer then presses one button to dial the number after reviewing the case (this is akin to one-touch or speed dialing). An automated call scheduler will simplify scheduling and rescheduling of calls to respondents, and can assign cases to specific interviewers such as those who are trained in refusal conversion techniques or those who are fluent in Spanish. In addition, the flexibility of CATI programming allows for the scheduling of interview times that are convenient for the sample member.
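As a purely hypothetical sketch (not the contractors' actual CATI software), the snippet below illustrates the two mechanisms described above: skip logic that routes respondents past questions that do not apply to them, and validation that accepts only valid response codes. The question wording, codes, and preloaded field are invented for illustration.

```python
# Hypothetical sketch of CATI-style skip logic and response validation.
# Question IDs, response codes, and the preloaded field are illustrative
# assumptions, not items from the actual GJ-HC instrument.

def ask(prompt: str, valid: set) -> str:
    """Re-prompt until the interviewer keys a valid response code."""
    answer = input(prompt + " ").strip()
    while answer not in valid:
        answer = input(f"Invalid entry; valid codes are {sorted(valid)}: ").strip()
    return answer

def interview(preload: dict) -> dict:
    responses = {}
    # Skip logic: the employment-detail item is asked only of working respondents.
    responses["employed"] = ask("Q1. Are you currently employed? (1=Yes, 2=No)", {"1", "2"})
    if responses["employed"] == "1":
        responses["hours"] = ask(
            "Q2. How many hours per week do you usually work? (1-80)",
            {str(h) for h in range(1, 81)},
        )
    # Preloaded baseline data are confirmed rather than collected again.
    if "date_of_birth" in preload:
        responses["dob_ok"] = ask(
            f"Q3. We have your date of birth as {preload['date_of_birth']}. "
            "Is that correct? (1=Yes, 2=No)",
            {"1", "2"},
        )
    return responses

if __name__ == "__main__":
    print(interview({"date_of_birth": "1985-06-15"}))
```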

4. Efforts to Identify Duplication

The study team has reviewed the existing literature and existing data sources to ensure that this data collection effort does not duplicate existing or available data. There is no other source for the information that will be collected in the follow-up surveys. Answers to the survey questions are not included in the data that grantees are required to collect and report to DOL, and there are no administrative data sources that provide the range of data elements needed. The study will collect information about sample members' UI-covered employment and earnings from wage records maintained by state Unemployment Insurance (UI) agencies, either directly from the states or from the National Directory of New Hires (NDNH). However, these wage records provide only total quarterly earnings in UI-covered employment and not the critical information the study needs about employment experiences, such as wage rates, hours worked, availability of fringe benefits, or whether the job made use of the person's training.

5. Methods to Minimize Burden on Small Businesses or Entities

This data collection does not involve small businesses or other small entities.

6. Consequences of Not Collecting the Data

The data collected in the follow-up administrations will enable the GJ-HC impact evaluation to generate precise, unbiased estimates of the impacts of the training services offered. Results from this rigorous evaluation will inform policymakers about net impacts for participants and the context within which the programs operate.

Without collecting follow-up interview data from study participants, the study could not meet its goal of determining the extent to which enrollees in the four sites included in the evaluation experienced increases in service and credential receipt, employment and earnings, and career progression.

7. Special Data Collection Circumstances

This data collection effort does not involve any special circumstances.

8. Federal Register Notice and Consultations Outside the Agency

Federal Register Notice

As required by 5 CFR 1320.8(d), a Federal Register Notice announcing the Green Jobs and Health Care Impact Evaluation of the Pathways Out of Poverty and Health Care and High Growth Training grant initiatives was published on April 12, 2012 (Vol. 77, pp. 22001-22003). The notice provided the public an opportunity to review and comment on the planned data collection and evaluation for 60 days following its publication. No comments were received.

Consultations Outside the Agency

Consultations on the research design, sample design, and data collection procedures were part of the study design phase of the evaluation. The purposes of these consultations were to ensure the technical soundness of the study and the relevance of its findings and to verify the importance, relevance, and accessibility of the information sought in the study.

Peer Review Panel Members

Ms. Maureen Conway, Executive Director, Economic Opportunities Program, Aspen Institute

Dr. Harry J. Holzer, Professor, Georgetown Public Policy Institute

Dr. Robert J. LaLonde, Professor, The Harris School, University of Chicago

Mr. Larry Orr, Larry Orr Consulting

Dr. Burt S. Barnow, Amsterdam Professor of Public Service, The Trachtenberg School of Public Policy and Public Administration, George Washington University

Ms. Mindy Feldbaum, Director for Workforce Development Programs, National Institute for Work and Learning

9. Respondent Payments

It is critical to maximize sample members' cooperation with follow-up survey data collection efforts and increase survey response rates, thereby ensuring the representativeness of the sample and providing data that are complete, valid, reliable, and unbiased. Given the importance of this evaluation, the data collection must maintain the highest standards. Providing a modest payment to study subjects who complete a given follow-up interview can contribute to that goal by significantly increasing response rates. Because response to telephone surveys has been declining in recent years and the costs associated with achieving high response have been increasing, the use of respondent payments has become common practice for survey studies (Curtin et al. 2005). These payments can help achieve high response rates by increasing sample members' propensity to respond (Singer et al. 2000). Studies offering respondent payments show decreased refusal rates and increased contact and cooperation rates. Among sample members who initially refuse to participate, the availability of payments increases refusal-conversion rates. These payments also can help contain costs by reducing the effort and funds expended to resolve a case and the number of interim refusals. These operational cost savings and direct participant benefits justify offering payments to survey respondents.

In addition to helping gain cooperation to increase the overall response rate, respondent payments also increase the likelihood of participation from subgroups with a lower propensity to cooperate with the survey request. This is another important factor in helping to ensure the representative nature of the outcome data and the quality of the data being collected. For example, Jäckle and Lynn (2007) find that respondent payments increase the participation of sample members who are more likely to be unemployed. There is also evidence that respondent payments bolster participation among those with lower interest in the survey topic (Jäckle and Lynn 2007; Kay 2001; Schwartz et al. 2006), resulting in data that are more nearly complete. It has also been established that payments do not impair the quality of the data obtained (for example, by increasing item nonresponse or distorting the distribution of responses) from groups who would otherwise be underrepresented in a survey (Singer et al. 2000).

Offering respondent payments is the final critical addition to intensive efforts to establish contact with prospective respondents and gain their cooperation with the planned data collection. A $25 payment will be offered to respondents to thank them for the time they spend completing each follow-up interview. Such a sign of appreciation motivates sample members to participate in the survey and may influence their decision to provide updated contact information, especially during the 18 months between the first and second follow-up surveys. The current study plan involves locating efforts by the evaluation team as well as field follow-up for hard-to-find cases. The $25 payment offered at the 18-month interview is expected to help reduce locating effort at 36 months.

To leverage fully the benefits of both rounds of payments at 18 and 36 months, the payments will be mentioned when contact is established with the participants and attempts are made to gain their cooperation.

10. Confidentiality

Abt Associates and Mathematica have well-established safeguards to ensure the privacy and protection of all data collected from study participants. This includes policies and procedures related to privacy, physical and technical safeguards, and approaches to the treatment of personally identifiable information (PII).

Privacy Policy

Abt and Mathematica are committed to compliance with federal, state, and DOL data security requirements, and will take steps to ensure that all study staff comply with relevant policies related to secure data collection, data storage and access, and data dissemination and analysis. Both contractors have security policies that meet the legal requirements of the Freedom of Information Act and related regulations to ensure and maintain the privacy of data relating to program participants.

Privacy Safeguards

All interviewers as well as regular contractor staff are required to sign a company data security pledge as a condition of employment. The data security agreement covers all data employees use in the course of their normal duties. Employees who break this agreement face immediate dismissal and possible legal action. Beyond this, all staff working with PII will sign data security agreements. Hard copies of documents will be kept in securely locked file cabinets, electronic data files will be encrypted, and access to study files will be strictly limited to study staff who have been identified by the project director as having a need to view those files. Personal computers of study staff will be locked when not in use. Respondents will be given written assurance in all advance materials and verbal reminders during the survey administration that the information they provide will be kept private and will not be disclosed to anyone but the researchers authorized to conduct the study, except as otherwise required by law. No information will be reported by the contractor in any way that permits linkage to individual respondents, unless required by law, and the information will be destroyed once the final study report has been released.

11. Questions of a Sensitive Nature

The follow-up interviews will collect information from participants who have consented to participate in this evaluation. Information will be collected on services received through the program in areas such as case management, assessments, training or educational courses, and supportive services; any credentials earned; the details of jobs held since random assignment; income and the use of public benefits such as Temporary Assistance for Needy Families (TANF), the Supplemental Nutrition Assistance Program (SNAP), Supplemental Security Income (SSI), and UI; opinions about work; experiences with the services received through the program; and criminal activity. This type of information is generally collected as part of enrollment in government-funded training programs and is therefore not considered sensitive. However, depending on an individual's particular circumstances, any question could be perceived as sensitive. Evaluation team interviewers are well trained to show sensitivity while remaining impartial. Also, if a respondent refuses or appears reluctant to answer a question that asks for specific financial information, such as the amount earned in a given period, numeric ranges are generally offered as an alternative. Finally, to encourage reporting, reluctant respondents are reminded that their answers will be kept private.

Listed below are items that may be considered sensitive and the justification for including them: 

  • Information on employment history; participation in TANF, SNAP, and other government programs; household income; and work-related barriers is needed to conduct analyses of employment outcomes, income support program participation, and household income/poverty. The outcomes, taken together, provide a comprehensive picture of sample members’ economic self-sufficiency throughout the follow-up period. Information on work-related barriers also provides important insight on whether the training programs can reduce barriers to work. Such understanding will facilitate the design of programs that include appropriate strategies for overcoming those barriers. Information about sample members’ involvement in the criminal justice system during the follow-up period is to be collected at the one study site that is serving a large number of individuals who had been involved in the justice system prior to study enrollment. This information also will provide insights about sample members’ availability for participation in the legitimate labor market and their integration into productive society.

  • Updated participant contact information, collected during the first follow-up interview at 18 months after random assignment, is essential for re-establishing contact for the 36-month follow-up survey. The name and contact information of up to three individuals who know the participant are collected for use in the event that the contact information the study team has for the participant from baseline becomes outdated during the period between follow-up administrations.

  • Information on date of birth, address, and telephone numbers is needed to identify and contact participants. This information was collected at baseline, and remains part of the respondent’s information. Except in instances in which errors are found, there will be no need to collect this information again. However, during follow-up survey administration, respondents will be asked to confirm this information.

12. Hour Burden of the Collection of Information

The time burden for administering the follow-up surveys is estimated to be 40 minutes for the average interview in each wave of data collection (at 18 months and at 36 months). The estimated total hour burden on study participants of the 18- and 36-month follow-up surveys is 4,426 hours (Table A.2). Based on a targeted response rate of 80 percent, an estimated 3,219 respondents are expected to complete each of the two follow-up surveys, and each interview is estimated to take 40 minutes on average. The 20 percent who do not complete the survey may still be contacted by telephone interviewers; therefore, 5 minutes per nonrespondent is used as an estimate of their burden. Hence, the total time for sample members to complete the two surveys is (3,219 × 40 × 2) + (805 × 5 × 2) minutes, which, when divided by 60, equals approximately 4,426 hours.
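The arithmetic behind these totals can be checked directly. The short calculation below simply reproduces the figures reported in this section and in Table A.2; the variable names are ours.

```python
# Reproduce the burden-hour arithmetic from Item 12 / Table A.2.
respondents, nonrespondents, waves = 3_219, 805, 2

per_wave_respondent_hours = respondents * 40 / 60        # 2,146 hours per wave
per_wave_nonrespondent_hours = nonrespondents * 5 / 60   # ~67 hours per wave
total_minutes = respondents * 40 * waves + nonrespondents * 5 * waves

print(per_wave_respondent_hours)     # 2146.0
print(per_wave_nonrespondent_hours)  # 67.08..., reported as 67
print(total_minutes / 60)            # 4426.16..., reported as 4,426
```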

Table A.2. Burden Estimates for Study Participants

Respondents (Follow-up Surveys)                 Number of Instances of Collection   Frequency of Collection   Average Time per Respondent   Burden (Hours)
18-month follow-up respondents                  3,219                               Once                      40 minutes                    2,146
18-month follow-up contacted non-respondents    805                                 Once                      5 minutes(a)                  67
36-month follow-up respondents                  3,219                               Once                      40 minutes                    2,146
36-month follow-up contacted non-respondents    805                                 Once                      5 minutes                     67
Total Unduplicated Respondents: 4,024           Responses: 8,048                    --                        --                            4,426


(a) The 20 percent who do not complete the survey may still be contacted by telephone interviewers; therefore, five minutes per nonrespondent is used as an estimate of their burden.


The estimated total cost burden for the data collection is presented below in Table A.3. The total estimated costs for these data collection activities are $85,953. The average hourly wage in that table, $19.42, is based on the Bureau of Labor Statistics (BLS) average hourly earnings of production and nonsupervisory employees on private, nonfarm payrolls (May 2011 Employment Situation table B-8, Current Employment Statistics, BLS, U.S. DOL). Our respondents, by nature of their eligibility for the programs in which they are participating, often are unemployed or employed at a low wage. Though the follow-up surveys are conducted well into or after completion of the targeted programs, the wage used for this calculation and resultant cost estimate are likely overestimates, so the projected annual cost shown here is likely to be higher than the actual cost incurred.

Table A.3. Total Cost Estimates for Follow-up Surveys

Data Collection Activity                      Total Burden Hours   Average Hourly Wage   Total Cost
Respondents and contacted non-respondents     4,426                $19.42                $85,953
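The Table A.3 total can be reproduced the same way; again, the variable names are ours and the figures come from this section.

```python
# Reproduce the cost arithmetic from Table A.3.
burden_hours = 4_426
hourly_wage = 19.42  # BLS average hourly wage cited above
print(round(burden_hours * hourly_wage))  # 85953, reported as $85,953
```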



Since this data collection will take place over a multiyear period rather than during the course of a single year, the annualized burden is conservatively estimated as half the total multiyear burden.

Table A.4. Burden Estimates for Study Participants, Annualized

Respondents (Follow-up Surveys)                 Number of Instances of Collection   Frequency of Collection   Average Time per Respondent   Burden (Hours)
18-month follow-up respondents                  3,219                               Once                      40 minutes                    2,146
18-month follow-up contacted non-respondents    805                                 Once                      5 minutes(a)                  67
36-month follow-up respondents                  3,219                               Once                      40 minutes                    2,146
36-month follow-up contacted non-respondents    805                                 Once                      5 minutes                     67
Total Unduplicated Respondents: 4,024           Annualized Responses: 4,024         --                        --                            Annualized Burden: 2,213



13. Estimated Annualized Respondent Capital and Maintenance Costs

There are no direct costs to respondents, and respondents will incur no start-up or ongoing financial costs. The only cost to respondents is the time involved in being interviewed; these costs are captured in the burden estimates provided in Item 12.

14. Estimated Annualized Cost to the Federal Government

Table A.5 presents the total cost to the federal government of engaging the Abt-Mathematica team to conduct the GJ-HC Impact Evaluation over a five-year period. To annualize the cost, we divide the five-year total ($7,992,852) by 5 for an average annual cost of $1,598,570. It is important to note that these figures are total costs for the entire evaluation and not just for the follow-up surveys.

Table A.5. Annual Costs for Entire Green Jobs and Health Care Impact Evaluation

Year    Dates       Cost
1       2010-2011   $1,598,570
2       2011-2012   $1,598,570
3       2012-2013   $1,598,570
4       2013-2014   $1,598,570
5       2014-2015   $1,598,570
Total               $7,992,850


Please note that no annualized costs for this final segment of the evaluation were entered in ROCIS with this submission because the total costs were already reflected in segments previously approved by OMB.


15. Changes in Burden

This is a new information collection request.

16. Publication Plans and Project Schedule

The first round of follow-up surveys will begin in early 2013, with the second round beginning 18 months later. The timeline for reporting survey findings is given in Table A.6.

Table A.6. Study Timeline

Time          Activity
Summer 2011   Baseline data collection begins
Winter 2013   Baseline data collection ends
Winter 2013   First round of follow-up surveys begins
Summer 2014   Second round of follow-up surveys begins; first round of follow-up survey data collection ends
Spring 2015   Interim report published based on first round (18-month) survey data
Summer 2015   Second round of follow-up survey data collection ends
Fall 2016     Final report published based on 36-month survey data


17. Reasons for Not Displaying Expiration Date of OMB Approval

The expiration date for OMB approval will be displayed on all forms associated with this data collection.

18. Exception to the Certification Statement

No exception to the certification statement is requested for this data collection.



1 Treatment group members also will be able to access other community-based training programs or services not offered by the grant.
