


Supporting Statement for OMB Clearance Request


Part A


Health Profession Opportunity Grants Impact Study (HPOG-Impact)

0970-0394








Original: June 2012

Revised: October 2012



Submitted by:

Office of Planning,
Research & Evaluation

Administration for Children & Families

U.S. Department of Health
and Human Services


Federal Project Officers

Molly Irwin and Hilary Forster


Table of Contents



Attachments:

Instrument 1: Supplemental Baseline Questions

Attachment A: References

Attachment B: Informed Consent Form

Attachment C: 60 Day Federal Register Notice

Attachment D: Sources and Justification for the Supplemental Baseline Questions

Attachment E: Logic Models Justifying the Inclusion of the Child Outcomes Roster

Attachment F: HPOG Impact Analysis Plan: Estimating 15 and 30-36 Month Impacts

Attachment G: Constructs for HPOG-Impact Data Collection Efforts

Attachment H: Screen Shots of the PRS Data Collection System







Part A: Justification

This section provides supporting statements for the collection of information for an impact evaluation of the Health Profession Opportunity Grants (HPOG) program, funded by the U.S. Department of Health and Human Services (HHS), Administration for Children and Families (ACF). The grants fund programs that provide Temporary Assistance for Needy Families (TANF) recipients, other low-income individuals, and members of Native American tribes with training and support needed to find and keep employment in healthcare occupations and fill the growing demand for skilled healthcare workers. Thirty-two grants were awarded in September 2010 to government agencies, community-based organizations, post-secondary educational institutions, and tribal-affiliated organizations to conduct these activities.

ACF is implementing a multi-pronged research and evaluation approach for the HPOG program to better understand and assess the activities conducted and their results. The current submission is in support of the HPOG Impact Study (HPOG-Impact).

Abt Associates and its partner, The Urban Institute, are conducting two other evaluations on behalf of ACF as part of the HPOG research portfolio. The Implementation, Systems and Outcome Evaluation of the Health Profession Opportunity Grants to Serve TANF Recipients and Other Low-Income Individuals is an effort to design and implement an HPOG grantee and participant tracking and management information system called the Performance Reporting System (PRS) and to develop designs appropriate for evaluating the implementation, systems change and outcomes of the HPOG programs. The Innovative Strategies for Increasing Self-Sufficiency (ISIS) project is an evaluation of nine career pathways programs training low-income individuals for various occupations, including healthcare jobs. Three of the nine programs evaluated in ISIS will be HPOG programs. Some of the data collection activities for both of these projects have already received OMB clearance. (The OMB clearance number for the Performance Reporting System (PRS) developed under the Implementation, Systems and Outcomes study is 0970-0394; the OMB clearance number for ISIS baseline instruments is 0970-0343.)

Other HPOG-related research and evaluation activities include a separate evaluation of the Tribal HPOG grants currently being conducted by NORC at the University of Chicago (OMB clearance number 0970-0395).

ACF and its contractors are engaged in many efforts to coordinate these evaluation activities so that each evaluation capitalizes on related work being done in other projects, so that burden is minimized for grantees and for study participants, and so that comparable data from different, related evaluations may be combined to enhance the cumulative development of knowledge useful to government policy makers, program operators, and the public. HPOG-Impact will utilize data from the PRS as well as adapted versions of instruments developed and approved for both ISIS and for the execution of the Implementation, Systems and Outcomes evaluation design.

In this document, we request a revision to OMB clearance number 0970-0394 to add baseline questions to the PRS in support of HPOG-Impact. As noted above, the HPOG PRS is currently in use by all HPOG grantees. These added questions will be administered to HPOG-Impact study participants only, including both those in the treatment group(s) and control group, at baseline (i.e., prior to random assignment and prior to intake into the programs studied for the treatment group) in conjunction with the currently approved PRS. These Supplemental Baseline Questions will complement baseline data already being collected about the universe of HPOG program participants through the PRS. The Supplemental Baseline Questions include adapted items from instruments used in ISIS that have also been approved by OMB (clearance number 0970-0343), a small number of new questions necessary to achieve the HPOG-Impact research goals, and a child roster. Subsequent OMB submissions, described below, will seek clearance for additional HPOG-Impact data collection activities beyond this baseline data collection.

A1: Necessity for the Data Collection

The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval for the supplemental baseline data collection activities described in this request to support the research that is part of HPOG-Impact.

A1.1 Study Background

As part of the Affordable Care Act (ACA) of 2010, Congress authorized funds for the HPOG program “to conduct demonstration projects that provide eligible individuals with the opportunity to obtain education and training for occupations in the healthcare field that pay well” (Grant Announcement HHS-2010-ACF-OFA-FX-0126). These demonstration projects are intended to address two pervasive and growing problems: the increasing shortfall in the supply of qualified healthcare professionals in the face of expanding demand, and the increasing requirement for a post-secondary education to secure a job with a living wage for families.

HPOG-Impact will assess how effectively grantees achieve these goals and how variations in program services affect program impacts. HPOG-Impact will also identify which practices in implementing career pathways programs for TANF recipients and other low-income individuals are especially worthy of replication. As such, HPOG-Impact will fill a void in the sectoral training and career pathways literature, both about program effectiveness and about which types of programs or program components are most effective. Few large-scale impact studies of career pathways efforts exist, and none has demonstrated the impact of specific program components and models (Literature Review: Career Pathways Programs, Werner, Dun Rappaport, et al.).1

Although prior impact studies have demonstrated that post-secondary vocational training programs can improve the employability and earnings of low-income individuals, there is little evidence on the contribution to impacts made by specific program components. The overall goal of the HPOG Impact Study is to estimate the independent impact of core HPOG program components. The study design, including the data collection strategy, is in service to this goal.

We anticipate that findings will inform the design and implementation of more effective postsecondary education and training programs for low-wage, low-skilled workers and other nontraditional students. As such, study results will be useful to a variety of consumers, including, for example: program developers and practitioners; institutions of higher education; federal policymakers charged with the oversight of federally-funded post-secondary education and training programs for low-income workers; Congress, in providing program funds for such programs; private foundations with the mission to fund and assess such programs; and the evaluation and research community.



A1.2 Legal or Administrative Requirements that Necessitate the Collection

H.R. 3590, the ACA, mandates an evaluation of the HPOG demonstration projects (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(B)). The Act further indicates that the evaluation will be used to inform the final report to Congress (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(C)). Examining effects on the demand side of the healthcare sector is a requisite element of the evaluation activities mandated by the ACA in authorizing HPOG. The Act calls for evaluation activities to assess the success of HPOG in “creating opportunities for developing and sustaining, particularly with respect to low-income individuals and other entry-level workers, a health professions workforce that has accessible entry points, that meets high standards for education, training, certification, and professional development, and that provides increased wages and affordable benefits, including healthcare coverage, that are responsive to the workforce’s needs” (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(B)).

A1.3 Study Design

The goal of HPOG-Impact is to evaluate the effectiveness of approaches that HPOG grantees use to provide Temporary Assistance for Needy Families (TANF) recipients and other low-income individuals with opportunities for education, training and advancement within the healthcare field. Specifically, HPOG-Impact is intended to evaluate variation in HPOG grantee programs’ impact on participants that is associated with variation in HPOG program components and models.

There are a total of 32 HPOG grantees; the HPOG-Impact Evaluation will target 20 of those 32. All 20 of the grantees to be included in the evaluation serve TANF recipients and other low-income individuals. Of the 12 grantees that will not participate in the evaluation, five are tribal grantees. These tribal grantees serve a unique population, and HPOG programs serving this population are being evaluated under OMB # 0970-0395. Of the remaining seven grantees serving TANF recipients and other low-income individuals, three were selected to participate in another ACF-sponsored RCT evaluation – the Innovative Strategies for Increasing Self-Sufficiency project, or ISIS. ISIS is evaluating nine programs that meet a stringent definition for what constitutes a career pathways program. The HPOG Impact Study has been collaborating closely with the ISIS project on baseline and follow-up data items and design issues. The HPOG Study will have access to the baseline and follow-up data from ISIS, and we plan to use their data on the three HPOG grantees in conducting the HPOG-Impact analysis. As such, the HPOG Impact Study will include 23 of the 27 HPOG grantees serving TANF recipients and other low-income individuals.

The four HPOG grantees serving TANF recipients and other low-income individuals not included in the HPOG-Impact analysis are each engaged in independent research projects with a university partner. The University Partnership Grant projects each focus on a different set of research questions, and each involves individual-level data collection from HPOG participants. These four HPOG grantees have not been included in the impact study in order to avoid excessive burden on program staff and participants. These programs do not represent unique training designs, and excluding them from the HPOG-Impact study does not undermine the impact study’s goal of evaluating the impact of specific program components and implementation strategies.

Grantee sites selected for this study differ from other grantee sites in ways that may be associated with outcomes measured. Agency interpretations of study results will reference this aspect of the study design and its potential impact on study results.

The HPOG-Impact study design includes: (1) within most HPOG grantees, an experiment in which eligible program applicants will be randomly assigned to a treatment group that is offered access to the HPOG program or to a control group that is not offered the opportunity to enroll in HPOG; and (2) within a small number of HPOG grantees, an experiment in which eligible program applicants will be randomly assigned to have access to the “standard” HPOG program, to have access to an “enhanced” HPOG program, or to be part of a control group that is not offered the opportunity to enroll in HPOG. The control group in both scenarios will have access to whatever other programs and services are available in the local community.

The study team will engage in individualized conversations with the 20 HPOG grantees about their participation in the evaluation. With each grantee, we will discuss the possibility that it could test the enhancement or enhancements selected for the evaluation (in addition to participating in the mandatory random assignment study of the effects of the HPOG program). Grantees that are most willing and able to implement the intervention in a short period of time, and those able to contribute larger samples to the study, will be favored in selecting interested grantees. We have stated that the number will be in the range of three to six, although more is a possibility, with the exact number being determined by sample size demands and the enhancement budget constraint. Our goal is to include enough grantees of sufficient size to reach the target sample size for the enhancement impact sample, N = 1,250. This may include some smaller grantees if enthusiasm for planned variation and other factors are favorable, up to the limits of project resources.

We anticipate that the number of grantees testing enhancements will be in the range of three to six, with the exact number determined by sample size demands and the enhancement budget. The uncertainty about the number of grantees stems from the fact that we are aiming to randomly assign 1,250 participants to the enhanced HPOG program, but it is unclear at this time how many HPOG grantees will be required to meet this sample size target.

The difference between the enhancement grantees and all the other grantees participating in the study will have minor implications for the evaluation. All of the variation in program components available in the study sample—whether randomly induced or natural—will be used to estimate the contribution of components to program impacts. Those variations that occur within site, at random, will provide stronger causal evidence of the impact of different components but will not be applied separately to a distinct study question. The focal question for the evaluation is “How do different program components or elements contribute to program impacts?”—a subject on which all sources of variation in components/elements have something to contribute. Hence, if enhancement sites with random variation have a different character than non-enhancement sites with only natural variation, the evidence base for addressing the “what works?” question will not be tilted either way by the division into the two subsets of sites. That said, we expect that grantees that are most willing and able to implement the intervention in a short period of time, and those able to contribute larger samples to the study, will be favored in selecting interested grantees.





HPOG-Impact also will include implementation research, featuring: (1) a Grantee Survey that will identify HPOG program implementation models and components that may be associated with variation in participant impacts, and (2) case studies of grantees implementing models and program components of interest.

Additionally, participant data collected through HPOG-Impact for both the treatment and control groups will be matched to long-term employment and earnings data from ACF’s National Directory of New Hires (NDNH). A preliminary agreement is in place.

A1.4 Research Questions

HPOG-Impact will address the following research questions:

  1. What impacts do HPOG programs have on outcomes of interest?

  2. To what extent do these impacts vary by subgroups of interest?

  3. To what extent does HPOG program participation (in particular components, with particular dosage) have an impact on outcomes of interest?

  4. To what extent do various HPOG program models or components have varying impacts?

  5. To what extent do specific program enhancements have impacts, relative to the “standard” HPOG program?

  6. How does parental participation in various HPOG program models and components affect outcomes for children?

A1.5 Universe of Data Collection Efforts for HPOG-Impact

To address these research questions, HPOG-Impact will include the following:

  1. PRS Baseline Data Collection

  2. Supplemental Baseline Questions

  3. 15-month Follow-up Survey

  4. Grantee Survey (a survey of all HPOG grantees serving TANF recipients and other low-income individuals)

  5. Case studies of selected HPOG grantees (site visits, observations, staff and management interviews)

  6. 30-36-Month Follow-up Survey

  7. Follow-up data collection on children of study participants

The first of these study components, the baseline PRS data collection, has already been approved by OMB (clearance number 0970-0394). This submission requests clearance for the second component listed above. The Supplemental Baseline Questions are new and provide critical details about study participants. These data are not available through any current source, including the PRS as it currently exists.

We anticipate submitting additional OMB packages to request permission to conduct the third through seventh components. Like the data collected through the supplemental baseline questions, these additional items are unavailable through any other current studies or datasets. The third through seventh study components will collect new data.

A2: Purpose of the Survey and Data Collection Procedures

Overview of Purpose and Approach

This section discusses how information obtained through each of the study components will be used to assess the impact of HPOG on outcomes of interest and to assess variation in impacts attributable to specific program components and models. HPOG programs are intended to help low-income adults “obtain education and training for occupations in the healthcare field that pay well” (Grant Announcement HHS-2010-ACF-OFA-FX-0126). Accordingly, grantees seek to generate participant outcomes related to: training attendance and completion, award of industry-recognized credentials, employment in healthcare, and increases in wages. Grantees also are tasked with addressing personal barriers that can prevent participants from completing training, and/or from obtaining and retaining employment in healthcare.

To assess the impact of HPOG, the evaluation will collect data about targeted outcomes for program participants and for control group members. Using experimental impact analysis, the evaluation will estimate the extent to which HPOG program designs lead to changes in participant outcomes compared to changes in outcomes for individuals in the control group. In assessing the relative impacts of specific program components, the evaluation is combining prospective systematic variation of program models within a few selected HPOG grantees with natural variation in program models across many HPOG grantees.

We achieve “prospective systematic variation” by introducing enhancement(s) in selected sites and then comparing participants receiving services as usual, those receiving the enhancement, and those in the control group receiving no HPOG services. We will examine “natural variation” in program models by estimating the effects of models through comparisons of those assigned to HPOG with a no-HPOG control group. Additionally, natural variation will be examined through statistical modeling of the variation in impacts across sites as a function of the program models or components.

By “prospective systematic variation” of program models, we are referring to the estimated effects of program enhancements—that is, program components that have been added to enhance the program—via three-way random assignment, in those sites that test the selected program enhancement. By “natural variation in program models across HPOG grantees,” we are referring to the estimated effects of program models that we can obtain by conducting two-way random assignment to HPOG or control, combined with statistical modeling of the variation in impacts across sites as a function of the program models or components. The first approach provides unbiased estimates of the effects of program enhancements for a small and select group of grantees. Therefore, while we can be confident in the results for this group of grantees, we cannot be confident that the results from this analysis based on prospective systematic variation will be generalizable to the broader population of HPOG grantees. In contrast, the second approach may provide estimates that are more broadly applicable to HPOG grantees, since the sample will include a larger and more representative set of HPOG grantees. However, to estimate the effects of program components using natural variation, we have to rely on nonexperimental methods to compare the impacts in HPOG grantees that differ in their program components, but where the program components were not randomly assigned to grantees. Therefore, the estimates based on natural variation may contain some bias: differences in impacts between sites may be attributed to particular program components when, in fact, they are at least partially attributable to other factors (e.g., other program components, the strategies used to implement particular program components, and differences in the populations of trainees served across grantees).
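For readers who prefer notation, the natural variation analysis can be sketched in a simplified, purely illustrative form (the actual specifications appear in Attachment F, and the symbols below are our own) as a model of site-level impacts as a function of program components:

\[ \hat{\Delta}_j = \beta_0 + \sum_k \beta_k C_{jk} + u_j \]

where \( \hat{\Delta}_j \) is the experimental impact estimate for grantee \( j \), \( C_{jk} \) indicates whether grantee \( j \) operates program component \( k \), \( \beta_k \) captures the association between component \( k \) and impacts, and \( u_j \) is a grantee-level error term. The three-way random assignment in enhancement sites, by contrast, yields a direct experimental contrast between the enhanced and standard program arms.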

We plan to combine these two approaches because both have the potential to contribute to the study. Prospective systematic variation allows us to test program enhancements and obtain unbiased estimates of their effects; natural variation allows us to obtain estimates of the effects of a wider range of program components, and for a broader population of grantees.

An important potential contribution of this study is that we have identified several alternative analytic approaches for estimating the impacts of program components based on the natural variation across sites, and, at the same time, we will also have experimental estimates of the impacts of some of those components (the tested enhancements). This provides a unique opportunity to understand the merits of various analytic approaches: we will be able to cross-validate results from the natural variation analysis with those from the experimental analysis to determine which non-experimental analyses are preferable for analyzing the effects of the components of multi-faceted interventions such as HPOG. In fact, having both “systematic variation” and “natural variation” is essential to our ability to make these methodological advances, which is a unique strength of the HPOG-Impact study design.

Each of the proposed study components serves an important purpose in the overall analysis and evaluation. Some baseline data about treatment and control group members will be used to develop covariates for the impact analysis and will facilitate sub-group analysis. The Grantee Survey will provide consistent information on program models and components. This will allow the study to assess variation in participant outcomes attributable to models and components identified in that survey. The analysis plan includes applying innovative analytic approaches, including experimental endogenous subgroup analysis (Peck, 2003), multi-level modeling (e.g., Bloom, Hill & Riccio, 2003; Raudenbush and Bryk, 2002) and other new approaches currently in development for these purposes. Data collected at 15 months and at 30-36 months after random assignment will allow the study to estimate the shorter- and longer-term treatment/control contrast on outcomes of interest. Case studies of grantees identified as implementing particular models will provide detailed information about how successful models work in practice. Attachment G provides more detail about the constructs and measures that will be included in each of the above mentioned study components. Attachment F is a detailed analysis plan that shows what analytic techniques will be used. The next section provides more information about each of the study’s major data collection components and discusses how each contributes to the impact evaluation. Although we describe all of the proposed data collection efforts for HPOG-Impact, we provide most detail on the purpose and execution of the Supplemental Baseline Questions (A2.2 below), which is the one for which this OMB package requests clearance.

A2.1 PRS Baseline Data Collection

HPOG-Impact will collect baseline data about study participants, including both treatment and control group members. Those data will be collected through both the PRS (OMB clearance number 0970-0394, described here) and its Supplemental Baseline Questions (described below).

Items from the PRS will provide researchers with important baseline demographic information about study participants. Including these items will allow the impact evaluation to describe the study sample; to identify balance between the treatment and control groups; to increase the precision of estimates regarding the impact of program components; and to identify subgroups for subgroup impact analysis at follow-up. Baseline variables will not be used in this study to measure change over time. The PRS also collects contact information for study participants. The contact information collected at baseline is necessary to enhance researchers’ ability to locate respondents for follow-up surveys that will measure intervention outcomes.

A2.2 Supplemental Baseline Questions

The Supplemental Baseline Questions include a small subset of questions adapted from the OMB-approved ISIS Baseline Information Form (BIF) and Self-Administered Questionnaire (SAQ) (both approved under OMB clearance number 0970-0343) about participants’ experiences in and expectations for education and employment, and barriers to employment. In addition, it includes some questions about individuals’ work preferences and self-efficacy.

Integrated into the PRS, data from these Supplemental Baseline Questions also will provide information on participant attitudes and expectations about the program and about employment in the healthcare industry. As with the PRS Baseline Data Collection, variables collected through the Supplemental Baseline Questions will also be used to demonstrate that random assignment yielded balanced groups on most baseline characteristics.

HPOG-Impact will collect these data because prior research on employment and training programs suggests that these factors may be associated with participants’ ability to complete training and to obtain and retain employment (Matus-Grossman and Gooden, 2002; Fein and Beecroft, 2006; Michalopoulos and Schwartz, 2001). The data elements that comprise the Supplemental Baseline Questions will increase our ability to test the relative effectiveness of specific program components and to assess variation in impact for specific subgroups of interest. The Supplemental Baseline Questions render the study much better able to achieve these objectives than it would be if it only used current PRS data items. Specifically, understanding people’s experience with prior training and education, their barriers to and preferences for work, and their motivations and self-efficacy will improve our ability to identify which treatment group members access various components of the HPOG program. Prior work documents that psycho-social questions such as these are important to sorting participants by characteristics that predict outcomes of interest (e.g., Gibson, 2003; Peck, 2007). In fact, this analysis “depends critically on the richness of the covariates” (Schochet & Burghardt, 2007, p. 104), which is why we have included these specific additional variables at baseline: to increase our ability to correctly classify individuals according to their program experiences.

The Supplemental Baseline Questions also include a child roster, which lists research sample members’ children under 18 who reside with them at least half of the time. This roster will be used to create a sampling frame for follow-up surveys that collect data about child outcomes. (The justification for including this roster is discussed in more detail in Section A.11 and in Attachment E, which presents a logic model illustrating how career pathways programs may generate changes in child outcomes and which summarizes relevant literature.)

A2.2a How baseline data collection will be conducted

Baseline data collection will extend for the entire period of random assignment, beginning with the pilot phase in November 2012 and continuing for approximately 16 months, through February 2014.

Site staff will administer an Informed Consent Form for the existing PRS and Supplemental Baseline Questions when individuals who apply to the program are found eligible. Potential study participants will complete a single consent form, based on the current PRS Informed Consent Form, that allows researchers (1) to access data they provide at enrollment and (2) to contact them for additional follow-up questions for HPOG-Impact. Importantly, the Supplemental Baseline Questions will be integrated into the PRS so that there is a single baseline data collection for each potential study participant. (The Informed Consent Form is included as Attachment B.)

During enrollment, site staff first will describe the study and administer the Informed Consent Form. All eligible applicants for HPOG during the intake period for the study must sign the paper Informed Consent Form to be part of the study. Next, site staff will administer the PRS, including the Supplemental Baseline Questions. After individuals complete the Informed Consent Form, the PRS and its Supplemental Baseline Questions, a secure, web-based software program will randomly assign them into either the treatment or control group (or into one of the two treatment groups or the control group, if it is a grantee where an enhancement is being evaluated).

Program applicants will be subject to random assignment after they consent to be in the research. Those who do not consent to participate in the research will not have access to HPOG program services during the HPOG-Impact study period but may receive any other service or benefit for which they are eligible.
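To make the enrollment and assignment flow described above concrete, the following minimal Python sketch illustrates two- and three-arm random assignment. It is purely illustrative: the function and variable names are our own, and it does not represent the secure web-based system the study will actually use.

```python
import random

def assign_applicant(consented: bool, enhancement_site: bool) -> str:
    """Illustrative assignment of one eligible HPOG applicant to a study arm."""
    if not consented:
        # Applicants who do not consent are not randomly assigned and do not
        # have access to HPOG services during the study period.
        return "not enrolled in study"
    if enhancement_site:
        # Three-way random assignment in grantees testing an enhancement.
        return random.choice(["standard HPOG", "enhanced HPOG", "control"])
    # Two-way random assignment in all other study grantees.
    return random.choice(["HPOG treatment", "control"])

# Example: one consented applicant at a grantee without an enhancement test.
print(assign_applicant(consented=True, enhancement_site=False))
```

An operational system would also enforce target assignment ratios and maintain a secure record of each assignment, which this sketch omits.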

A2.3 15-Month Follow-up Survey

The 15-month Follow-up Survey—which will be submitted to OMB in a future clearance package—will be used to document program impacts on outcomes of interest (Attachment G identifies key constructs that will be included in this and all other future HPOG-Impact surveys). As such, data from the 15-month Follow-up Survey will be used to address all of the major research questions posed for the study. In particular, program experiences and program participation outputs will be collected in order to understand the contrast between the treatment and control groups. Further, program impacts – across all the outcomes collected in the survey – will be estimated as the difference in mean treatment group outcomes and mean control group outcomes measured at follow-up. This will extend both to the grantees with a single treatment and the grantees where two treatments are in place: their standard HPOG program and the selected enhancement. Further, we will use the Follow-up Survey data, along with results from the Grantee Survey (discussed next), to evaluate the relative effectiveness of various program models and components. Baseline data will be used in these analyses to create subgroups (in both single- and multiple trait categories, including program-related subgroups), for which we will also estimate 15-month follow-up impacts.
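As an illustrative sketch only (the study’s actual estimation strategy is detailed in Attachment F, and the notation here is ours), the basic treatment/control contrast described above can be written as a regression-adjusted impact estimate:

\[ Y_i = \alpha + \delta T_i + X_i'\gamma + \varepsilon_i \]

where \( Y_i \) is a follow-up outcome for sample member \( i \), \( T_i \) equals 1 for treatment group members and 0 for control group members, \( X_i \) is a vector of baseline covariates drawn from the PRS and the Supplemental Baseline Questions, and \( \hat{\delta} \) is the estimated program impact.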

A2.4 Grantee Survey

The Grantee Survey, to take place in fall 2013, will be used to describe the universe of HPOG grantees and will explore variation in how grantees implement the program. (As noted earlier, Attachment G identifies key constructs that will be included in this and all other future HPOG-Impact surveys.) The Grantee Survey will include questions about the characteristics of programs that are hypothesized in the career pathways literature to yield particularly strong participant impacts. As such, the survey will be critical to classifying grantees and to identifying distinct service delivery models. The impact analysis will consider how variation in participant outcomes is attributable to specific program models and components. This Grantee Survey will be coordinated with the future execution of the Implementation, Systems and Outcomes evaluation design and the ISIS evaluation, both referenced above.

A.2.5 Case studies of selected HPOG grantees (site visits, observations, staff and management interviews)

In the course of the study, HPOG site teams will make multiple visits to grantee programs to monitor site operations and the implementation of random assignment, beginning in winter 2013. During one or more of those visits, site research staff will also use structured observations and staff and management interviews to validate the results of the Grantee Survey. (As noted earlier, Attachment G identifies key constructs that will be included in this and all other HPOG-Impact surveys.) The Interview Guides for these site visits will be submitted to OMB in a future clearance package.

A2.6 30-36-Month Follow-up Survey

The 30-36-Month Follow-up Survey—which will be submitted to OMB in a future clearance package—will be used to document program impacts. (Again, Attachment G identifies key constructs that will be included in this and all other future HPOG-Impact surveys.) As such, its data will be used to address all of the major research questions posed for the study. In particular, HPOG program experiences and use of relevant services (both in-program for treatment group members and out-of-program services for both research groups) will be compared between treatment and control groups to understand the contrast that is being evaluated. Further, program impacts – across all the outcomes collected in the survey – will be estimated as the difference in mean treatment group outcomes and mean control group outcomes measured at this point of follow-up. This will extend to both the grantees with a single treatment and the grantees where two treatments are in place: their standard HPOG program and the selected enhancement. We also will use the Follow-up Survey data to evaluate the relative effectiveness of various program models and components. Baseline data will be used to create subgroups (in both single- and multiple trait categories, including program-related subgroups), for which we will also estimate 30-36-month follow-up impacts. The 30-36-Month Follow-up Survey also will be combined with findings from the 15-month Follow-up Survey to measure changes in impacts over time.

A.2.7 Follow-up Data Collection on Children of Study Participants

The Supplemental Baseline Questions include a child roster, which will be used to create a sampling frame for follow-up surveys that collect data about child outcomes. (As noted above, the justification for including this roster is discussed in more detail in Section A.11 and in Attachment E, which presents a logic model illustrating how career pathways programs may generate changes in child outcomes and which summarizes relevant literature.) Because the nature of the follow-up data collection on children of study participants is dependent on children’s ages, it is premature at this time to specify the kind of data collection that will be conducted. As Attachment E illustrates, the kinds of child outcomes that may be associated with parental impacts tied to program participation and components will vary depending on children’s ages. The way in which outcome data should be collected also depends on the ages of children in the sample. (School-aged children could complete surveys; data on infants and preschoolers would need to be collected through some other means.)

A2.8 Who Will Use the Information

Researchers on the HPOG-Impact study team will have access to and use of the data collected as part of the evaluation. Some of the data also will be available to researchers studying the ISIS project. At the conclusion of the HPOG-Impact study, Abt Associates will provide ACF with a restricted-use data set containing individual-level data that are stripped of all personally identifying information.

A3: Improved Information Technology to Reduce Burden

The HPOG-Impact Supplemental Baseline Questions will be collected through the Internet-based PRS, which Abt Associates and The Urban Institute developed for the Implementation, Systems and Outcome Evaluation (0970-0394). The Urban Institute maintains the system and allows entry and secure maintenance of information on participants (as well as information on the grantees and their programs). HPOG grantees will enter participant-level data directly into the PRS. Attachment H provides screen shots of the PRS.

Key features of the PRS that reduce burden include:

  • Internet-Based Application. The PRS is implemented on a secure HPOG website maintained by The Urban Institute. Staff at the grantee or sub-grantee level who are granted authorization to access the system receive a secure password and are able to enter and/or view data on their program’s participants (but not those in programs operated by other grantees). HPOG evaluators are able to view all data from participants across all grantees, but private information (such as participant name and Social Security number) will be used only for the purposes identified in informed consent forms signed by the participant.

  • Efficient and Secure Data Entry Format. The PRS is structured to reduce the burden on grantees while ensuring adequate detail and accuracy. A data streaming capability allows some grantee representatives to program their existing information systems to interface with the PRS so that participant data already included in existing grantee or provider management information systems can be entered directly into the PRS. Directly populating the HPOG system with existing electronic data in this manner reduces data entry burden and minimizes data entry errors. Data items that cannot be directly streamed by a grantee are entered directly by authorized grantee program staff into the PRS. Data items that include private information (e.g., Social Security numbers) are automatically encrypted at data entry or at the point of electronic interface.

The HPOG PRS was developed using the highest standards of technology and data security. Data for grantee-level and individual-level records is stored securely in an SQL server database. The web interface for data entry and reporting is programmed in ColdFusion. The system is maintained on a highly secure Internet Information Server (IIS) web server at The Urban Institute, physically located in The Urban Institute’s server room in Washington, DC. The database software is MySQL (current version 5.1.36). When stored, all project files containing private data are encrypted using PGP software, a tool for encrypting storage media and for creating encrypted compressed files. Specifically, the secure data system used for the collection of baseline data will encrypt individuals’ SSNs and other individual identifiers and assign numeric project IDs to sample members. Actual SSNs, separated from other individual baseline data and identifiers, will be used only to access data on quarterly earnings from the National Directory of New Hires (NDNH), as explained in the Justification Package. Pass-through NDNH data will be stripped of all individual identifiers including SSNs.

A4: Efforts to Identify Duplication

Many of the Supplemental Baseline Questions are adapted from instruments currently being used in ISIS, with a small number of additional questions necessary for estimating the effects of program models or components. The Supplemental Baseline Questions will be integrated into the PRS baseline data collection as a separate data module. Study participants will participate in only one baseline data collection activity for HPOG. Also, as noted in the previous section, to minimize duplication of data entry, the Supplemental Baseline Questions, like the PRS, will be designed with the capacity to interface electronically with existing data systems used by the grantees or their partners, to the extent that is relevant and possible. The data collected by the Supplemental Baseline Questions are not included in the PRS and thus would not otherwise be collected for prospective research sample members.

A5: Involvement of Small Organizations

None of the HPOG grantees are small businesses. The primary organizations involved in this study are community colleges, workforce development agencies and community-based organizations that operate occupational training programs. Burden is minimized for those entities by requesting only the information required to achieve the study’s objectives, integrating the Supplemental Baseline Questions into the existing PRS already in use by HPOG grantees, and by allowing for the streaming of data collected and stored in systems other than the PRS. In addition, at the time the grants were awarded, ACF informed all grantees of the reporting requirements, and adequate resources have been provided to coordinate the data collection and reporting. There should be no adverse impact for any grantees participating in the study.

A6: Consequences of Less Frequent Data Collection

This is a one-time data collection. Earlier, we identified a number of critical uses of baseline data. Here, we note briefly how several of these uses would be affected were baseline data not available.

A.6.1 Description of Sample at Random Assignment

Obtaining baseline data allows the HPOG-Impact team to describe the research sample’s characteristics and to confirm balance between the treatment and control groups. Without baseline data, we would lack the ability to confirm that random assignment produced statistically similar treatment and control groups. More importantly, we would not be able to describe in detail the population these innovative programs are serving.

A.6.2 More Precise Estimates of Program Impacts

Without baseline values of measures correlated with outcomes, our estimates for the impact analysis would be less precise. The more precise the estimates are, the more confident evaluators can be in determining which strategies improve outcomes for low-income families.

A.6.3 Analysis of Sub-populations (and program components)

Lack of baseline data would mean the HPOG-Impact team would be unable to estimate impacts for subgroups of interest to provide insights for improved program targeting. In addition, one of our strategies for estimating the impacts of various program components relies on baseline data to create subgroups associated with participant pathways through the program. Thus, without baseline data we would lose not only the commonly understood single-trait subgroup analyses but also a core part of the analysis plan for identifying what it is specifically about HPOG programs that drives impacts.

A7: Special Circumstances

There are no special circumstances for the proposed data collection.

A8: Federal Register Notice and Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13 and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995)), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on Tuesday, February 7, 2012, Volume 77, Number 25, page 6126, and provided a 60-day period for public comment. This notice also included a request to waive future 60-day Federal Register Notices. A copy of this notice is included as Attachment C. During the notice and comment period, the government received two requests for copies of the instrument. Those requests were fulfilled and no public comments were received.

Consultation with Experts Outside of the Study

The majority of the items in the Supplemental Baseline Questions are adapted from previously approved data collection instruments for ISIS (clearance number 0970-0343), as described in Attachment D. The new child roster collects HPOG participants’ children’s first names and birth dates, their relationship to the study participant, the time they spend living with the HPOG participant, and the other adults with whom the children live, but it does not collect any other substantive information about children. We will use this information to create a sampling frame for longer-term follow-up about the program’s influence on children. Both the roster and the rationale for including it were developed in consultation with senior methodological and substantive experts, including:

  • Howard Rolston, PhD, Principal Associate, Abt Associates

  • Barbara Goodson, PhD, National Expert on Child and Family Research, Dillon-Goodson Research Associates

  • Anne Wolf, PhD, Associate, Abt Associates

Dr. Goodson and her colleagues developed two logic models that delineate the theoretical linkage between career pathways programs and outcomes for pre-school and school-age children. Dr. Goodson and her colleagues also summarized relevant research explaining and justifying the logic models. These logic models and related research are included as Attachment E. Dr. Goodson’s logic models will inform both HPOG-Impact and future ISIS data collection.

A9: Payment of Respondents

Our study plan does not include payments to respondents at baseline, in the PRS or the Supplemental Baseline Questions, but will propose payments of $30 for completing the follow-up surveys, as reflected in the consent form (see Attachment B). Offering incentives to gain cooperation and solicit participation is a well-established practice in social science research and program evaluation for both small-scale studies and sample surveys. Participants are provided incentives to compensate them for their time and as a gesture of appreciation for voluntary participation in data collection activities. Incentives are needed because it takes effort for a respondent to take the time to participate in the interview or survey.

At baseline, HPOG applicants will be motivated to respond to the baseline interview as part of the program’s intake procedures. At follow-up, however, study participants may be less interested in responding to a survey of detailed information on HPOG program experiences (for the treatment group), other education, training and employment services received since random assignment (for the entire research sample), current employment and earnings status, psycho-social well-being, family well-being and other measures of interest (also for the entire sample). Motivation to answer these questions at follow-up may be even lower for control group subjects who are not vested in the program. Thus, to prevent differential nonresponse between treatment and control groups, ACF recommends offering respondents $30 as a token of appreciation to improve cooperation at follow-up. Estimates of program impacts may be biased if the respondents in each group are not comparable due to differential group nonresponse.

Many surveys are designed to offer incentives of varying types with the goal of increasing survey response. Monetary incentives at one or more phases of data collection have become fairly common, including some federally-sponsored surveys. Examples include the National Survey on Drug Use and Health (NSDUH, Substance Abuse and Mental Health Services Administration), the National Survey of Family Growth (NSFG, National Center for Health Statistics), the National Health and Nutrition Examination Survey (NHANES, National Center for Health Statistics), the National Survey of Child and Adolescent Well-Being (NSCAW, Administration for Children and Families), and the Early Childhood Longitudinal Study-Birth Cohort (ECLS-B, U.S. Department of Education).

There has been extensive publication about the relative efficacy of different monetary incentives, but several federal agencies have determined $20-$30 to be effective. The U.S. Census Bureau has experimented with and begun offering monetary incentives for several of its longitudinal panel surveys, including the Survey of Income and Program Participation (SIPP) and the Survey of Program Dynamics (SPD). SIPP has conducted several multi-wave incentive studies, most recently with their 2008 panel, comparing results of $10, $20, and $40 incentive amounts to those of a $0 control group. They examined response rate outcomes in various subgroups of interest (e.g., the poverty stratum), use of targeted incentives for non-interview cases, and the impact of base wave incentives on later participation. Overall, $20 incentives increased response rates and improved the conversion rate for non-interview cases. (Creighton et al, 2007). The National Survey on Drug Use and Health (NSDUH, Substance Abuse and Mental Health Services Administration) conducted an experiment in which the cost per interview in the $20 incentive group was five percent lower than the control group, whereas the $40 incentive group cost was four percent lower than the control, due to reduced effort needed in gaining cooperation (Kennet et al., 2005). The NSDUH adopted an intermediate incentive of $30 because the greatest increase in response rate was found in the $20 incentive condition, and the $40 condition obtained a higher variation in per-interview costs. A similar incentive experiment conducted for the National Survey of Family Growth (NSFG, National Center for Health Statistics) Cycle 5 Pretest examined $0, $20, and $40 incentive amounts. The additional incentive costs were more than offset by savings in interviewer labor and travel costs (Duffer et al, 1994).

Providing tokens of appreciation for responding to surveys may be particularly important for control group members, as there are no plans to offer them preferential selection into HPOG in the years following the evaluation. The impacts of HPOG programs/components will be reported in a final report due in June 2016. However, the HPOG grants end in September 2015, nine months before the report is due. Therefore, it will not be possible to offer HPOG services to control group members after the findings from the evaluation are known and the success of the program can be assessed.

A10: Confidentiality of Respondents

Although the Supplemental Baseline Questions themselves do not involve collecting individual identification data, the HPOG-Impact baseline data collection will include individual identification data collected through the existing PRS. All HPOG-Impact study participants will complete both the PRS and the Supplemental Baseline Questions that are added to it for the purpose of the impact evaluation.

Respondent privacy will be protected to the extent allowed by law. ACF recognizes that HPOG grantees serve vulnerable populations (per the authorizing legislation), and that grantees must protect those populations from any risks of harm from the research and evaluation activities. Accordingly, as is done when collecting participant data in the PRS, HPOG-Impact will obtain informed consent from all study participants. This informed consent will ensure that participants understand the nature of the research and evaluation activities being conducted. The Informed Consent Form is included as Attachment B.

As a part of informed consent, the following rationale for data collection and privacy assurances will be provided to HPOG participants by grantees:

  • Research is being conducted to see how well various approaches to training for healthcare jobs work. This program and research are funded by the U.S. Department of Health and Human Services, and they may fund other research on this program in the future.

  • In this program, we will collect some personal information from you, such as your name, date of birth, Social Security number, and your involvement in other programs. The researchers studying the program for the government also need this information. All of the information about you collected for the program or for the research studies will be kept completely private to the extent allowed by law, and no one’s name will ever appear in any report or discussion of the evaluation results.

  • As part of the study, researchers may contact some of you in the future. You may refuse to answer any of their specific questions at any time.

  • Researchers and program staff using the information collected must take all necessary actions to protect your information and they will pledge their agreement to protect privacy. All Abt Associates employees must sign a data confidentiality pledge on accepting an offer of employment. Any individual allowed access to identifiable data for this project must sign an additional user agreement pledging confidentiality. Urban Institute employees must sign a similar pledge of confidentiality upon employment. Individuals accessing data through the PRS must sign an additional PRS User Agreement that indicates that they will keep those data secure.

A11: Sensitive Questions

Supplemental Baseline Questions

Among the Supplemental Baseline Questions are items addressing respondents’ employment barriers, personal preferences, motivations and self-efficacy. Some respondents may consider these somewhat personal questions to be sensitive. Because grantee agency staff will be collecting the data, we designed the questions to be as neutral and non-personal as possible, framing them as part of an assessment of getting to know clients’ needs and strengths so that the program might best meet their needs and capitalize on their strengths.

The literature provides ample support for including these items as barriers to employment as outlined in the justification table included in Attachment D. Including these items is necessary to describe the study population and evaluate their moderating effects on program impacts. Furthermore, questions pertaining to personal preferences, motivations and self-efficacy will be especially useful for identifying the pathways that participants follow through multi-faceted programs, thereby allowing us to estimate the impacts of various program models and components, which is the central research question that HPOG-Impact considers. Program staff will remind potential study members during the enrollment process that they may refuse to answer individual items. Potential study members will also be reminded that their responses will be kept private, to encourage their candid responses.

Child Roster

The Supplemental Baseline Questions also include a child roster, which requests basic information needed to design a sampling strategy for future data collection: the first name and birth date of each of the respondent’s minor children, the respondent’s relationship to each child (biological parent, foster parent, etc.), the amount of time each child lives with the respondent, and the identity of other adults spending time with the child.

We have included this roster because studies of related programs suggest that programs such as HPOG may have impacts on children, even in the absence of impacts for parents, and that these impacts may vary with age. Understanding impacts on both children as well as HPOG participants is important to ACF given the agency’s dual focus on the well-being of low-income children and families. Attachment E presents two logic models that illustrate the effects that career pathways programs may generate for preschool-aged children and for older children. It also includes a narrative that explains the logic model and that cites research supporting it.

Generally, we believe it is important to include the child roster because, as the logic models and supporting literature illustrate, career pathways programs such as HPOG have the potential to generate both positive and negative outcomes for children. (Impacts for children ages 0-5 and for older children may be quite different, because older children may be more likely than preschoolers to engage in negative behaviors as a consequence of decreased supervision that results from parental engagement in education and/or employment.) Research on related welfare reform programs illustrates the potential for HPOG programs to generate these kinds of outcomes for youth, but that research does not specifically assess outcomes associated with career pathways programs.

Including the child roster in the baseline data collection allows the study to sample a focal child (or children) for inclusion in a follow-up survey, whether directly, via questions posed to parents, or by accessing publicly available data. Including birth dates supports a sampling plan that would allow the research team to target age-appropriate follow-up survey questions to particular sample members. Including questions about children’s relationships to respondents is important because the child/respondent relationship may be a moderating factor. Future follow-up surveys will collect data about participants’ children’s developmental and behavioral outcomes so that impacts on children can be estimated.
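To illustrate how the roster’s birth dates could support age-based sampling and age-appropriate follow-up questions, the sketch below selects a hypothetical focal child from a roster at follow-up. The roster structure, field names, age cutoffs, and selection rule here are illustrative assumptions only; the actual sampling plan would be developed later and is not specified in this submission.

# Hypothetical sketch: selecting a focal child from a baseline child roster
# for age-targeted follow-up questions. All field names, age cutoffs, and the
# random selection rule are illustrative assumptions, not the study's plan.
import random
from datetime import date

roster = [  # hypothetical roster entries for one respondent
    {"first_name": "Ana", "birth_date": date(2008, 3, 14), "relationship": "biological parent"},
    {"first_name": "Leo", "birth_date": date(2001, 7, 2), "relationship": "biological parent"},
]

def age_in_years(birth_date, on):
    """Completed years of age on a given date."""
    return on.year - birth_date.year - ((on.month, on.day) < (birth_date.month, birth_date.day))

follow_up_date = date(2015, 5, 1)                    # illustrative follow-up fielding date
minors = [c for c in roster if age_in_years(c["birth_date"], follow_up_date) < 18]

focal_child = random.choice(minors)                  # illustrative: simple random selection
focal_age = age_in_years(focal_child["birth_date"], follow_up_date)
module = "preschool/early childhood module" if focal_age <= 5 else "older child/youth module"
print(focal_child["first_name"], focal_age, module)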

Although the child roster asks few questions specifically about children (only their first names, dates of birth, relationships to the study participant, the amount of time the participant spends with each child, and the identity of other adults who spend time with the child), participants may be reluctant to offer such information. Prior to enrolling in the study, participants will be given an informed consent form, which states that participation in the study is required to access HPOG services. (See Attachment B.) In addition, any future data collection efforts involving children would require additional permission from parents.

A12: Estimation of Information Collection Burden

Exhibit A12.1 presents the estimated annual reporting burden on study participants completing the Supplemental Baseline Questions and on grantees who enter those data. Response times were estimated based on the contractors’ prior experience with similar data collections. We estimate that the additional questions integrated into the PRS will take sample members approximately 15 minutes to complete, and we assume that it will take grantees an equivalent amount of time to enter the data for each participant. Our estimate of the annualized number of respondents assumes an overall sample intake period of up to 24 months across the 20 grantees to be included in the study. The total number of individual participants over the 24 months is estimated to be 10,250, so the annual number of respondents is 5,125, as reflected in the exhibit.
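For reference, the annualized respondent and burden-hour figures for the Supplemental Baseline Questions can be reproduced with the simple arithmetic below. This is an illustrative sketch only; all input values are taken from the paragraph above, and the variable names are ours.

# Illustrative check of the annualized burden figures for the
# Supplemental Baseline Questions (inputs taken from the text above).
total_participants = 10_250          # expected sample intake over the full period
intake_months = 24                   # overall sample intake period
minutes_per_response = 15            # estimated completion time per sample member

annual_respondents = total_participants / (intake_months / 12)    # 5,125
hours_per_response = minutes_per_response / 60                    # 0.25
annual_burden_hours = annual_respondents * hours_per_response     # 1,281.25 (reported as 1,281)

print(annual_respondents, hours_per_response, annual_burden_hours)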



Exhibit A12.1: Annual Information Collection Activities and Cost

Instrument | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Average Hourly Wage | Total Annual Cost
HPOG Performance Reporting System (PRS)² | 32 | 2 | 31.2 | 1,997 | $29.81 | $59,531
Supplemental Baseline Questions (program participants and control group members) | 5,125 | 1 | 0.25 | 1,281 | $3.41 (see description in paragraph below) | $4,369
Supplemental Baseline Questions (grantees) | 20 grantees | 256.25 | 0.25 | 1,281 | $28.29³ | $36,239
Totals (Supplemental Baseline Questions only) | | | | 2,562 | | $40,608



Our calculation of the annualized burden cost of the supplemental questions has two parts. First, we calculate the annual burden cost to HPOG applicants. In this part of the calculation, we use the average hourly wage of HPOG participants ($9.74) and the percentage of participants employed at intake (35%) to value respondent burden at an average of $3.41 per hour. When multiplied by the total annual burden in hours (1,281), the result is an annualized total burden cost of $4,369. Next, we calculate the burden to grantees of administering the supplemental questions to HPOG applicants. Each of the 20 grantees in the study will collect the supplemental data annually from an average of approximately 256 HPOG applicants (5,125 in total), for an annual burden of 1,281 hours. When multiplied by the average hourly wage for the general class of workers performing this function ($28.29), the total comes to $36,239. When added to the burden cost for applicants, the annual total is $40,608.
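As an illustrative check (not part of the approved calculation), the two-part computation above can be reproduced as follows. The reported figures round some intermediate values, so the results here may differ from Exhibit A12.1 by a few dollars.

# Illustrative reproduction of the two-part annualized burden-cost calculation.
# Inputs are taken from the text above; reported figures round intermediate
# values, so results may differ slightly from the exhibit.
annual_burden_hours = 1_281.25                   # 5,125 responses x 0.25 hours each

# Part 1: burden cost to HPOG applicants
participant_wage = 9.74                          # average hourly wage of employed participants
share_employed_at_intake = 0.35
valued_wage = round(participant_wage * share_employed_at_intake, 2)   # $3.41 per hour
applicant_cost = annual_burden_hours * valued_wage                    # about $4,369

# Part 2: burden cost to grantees administering the questions
grantee_wage = 28.29                             # BLS average hourly wage (see footnote 3)
grantee_cost = annual_burden_hours * grantee_wage                     # about $36,247 (reported: $36,239)

total_annual_cost = applicant_cost + grantee_cost                     # about $40,616 (reported: $40,608)
print(round(applicant_cost), round(grantee_cost), round(total_annual_cost))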



A13: Cost Burden to Respondents or Record Keepers

Not applicable. The proposed information collection activities do not place any new capital cost or cost of maintaining capital requirements on respondents. All grantees will use their existing computers to access the Internet and enter data on the secure Internet site.

A14: Estimate of Cost to the Federal Government

The proposed data collection will begin on November 1, 2012 and is anticipated to continue through February 28, 2014. The total cost for these data collection activities will be $344,866, and the annual cost to the Federal government for the proposed data collection will be $172,433. The total annual cost of the information collection, including previously approved information collections, is $339,430.
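The annual figure above is consistent with spreading the total data collection cost over two years; the sketch below simply makes that arithmetic explicit. The two-year annualization period is inferred from the reported figures rather than stated in the text.

# Illustrative annualization check; the two-year period is an inference from the figures.
total_data_collection_cost = 344_866
annualization_years = 2
annual_cost = total_data_collection_cost / annualization_years   # 172,433.0, matching the reported $172,433
print(annual_cost)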

A15: Change in Burden

This is a request to add questions to an already approved collection. The original PRS system was approved for 1,997 annual burden hours. The supplemental questions add 2,562 annual burden hours to this information collection (see Exhibit A12.1).

A16: Plan and Time Schedule for Information Collection, Tabulation and Publication

16.1 Analysis Plan

HPOG-Impact data collection activities will support four major deliverables:

  1. A baseline report that describes the research sample, including treatment-control group equivalence and the subgroups for which separate impact estimates will eventually be reported. The baseline report will use data from the PRS and the Supplemental Baseline Questions. It will be submitted to ACF approximately six months after baseline data collection is completed.

  2. An implementation report that describes the program operations of grantees in the study. This report will summarize the results of the Grantee Survey. It will assess differences and commonalities in the ways grantees run their HPOG programs and, in doing so, will identify distinct models of HPOG implementation. The implementation report will also summarize the results of case studies of grantees identified as implementing specified HPOG program models. The implementation report is expected to be completed by the end of 2013.

  3. A 15-month follow-up report that presents the impacts of specific program models and components, estimated as the difference in mean outcomes between the treatment and control groups, both among grantees that offer a single HPOG program and among grantees that offer a standard versus an enhanced version of the HPOG program. (A simple illustration of this treatment-control comparison appears after this list.) This report will rely on the 15-month Follow-up Survey as well as on administrative data to assess variation in participant outcomes associated with variation in HPOG program characteristics. Accordingly, the report will identify research-proven models and components that achieve more favorable 15-month outcomes for participants, including for specific targeted subgroups of participants. The report will summarize the results of all of the data collection activities. The 15-month follow-up report is expected to be completed by the end of September 2015.

  4. A 30-36 month follow-up report that presents the long-term impacts of specific program models and components, estimated as the difference in mean outcomes between the treatment and control groups, both among grantees that offer a single HPOG program and among grantees that offer a standard versus an enhanced version of the HPOG program. This report will rely on the 30-36-Month Follow-up Survey as well as on administrative data to assess variation in participant outcomes associated with variation in HPOG program characteristics. Accordingly, the final report will identify research-proven models and components that achieve more favorable long-term outcomes for participants and for their children, including for specific targeted subgroups of participants. The report will summarize the results of all of the data collection activities and is expected to be completed by the end of September 2017.
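For illustration only, the treatment-control comparison referenced in deliverables 3 and 4 amounts to a difference in mean outcomes between the two research groups, as sketched below with hypothetical data. The actual estimation approach, including any covariate adjustment and subgroup analyses, is specified in Attachment F.

# Minimal sketch of a difference-in-means impact estimate.
# The outcome values below are hypothetical; actual models are described in Attachment F.
from statistics import mean

# Example outcome: earnings in a follow-up month, one value per sample member
treatment_outcomes = [1250.0, 980.0, 1430.0, 0.0, 1100.0]   # hypothetical treatment group
control_outcomes = [900.0, 1010.0, 0.0, 870.0, 760.0]       # hypothetical control group

impact_estimate = mean(treatment_outcomes) - mean(control_outcomes)   # 952.0 - 708.0 = 244.0
print(f"Estimated impact: {impact_estimate:.2f}")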

16.2 Time Schedule and Publications

Exhibit 16.2 presents an overview of the project schedule for information collection. It also identifies deliverables associated with each major data collection activity.

Exhibit 16.2: Overview of Project Data Collection Schedule

Data Collection Activity | Timing | Associated Publications
1. PRS baseline data collection | Currently operating under OMB #0970-0394 | Baseline report, 15-month follow-up report
2. Supplemental Baseline Questions | Beginning December 2012 | Baseline report, 15-month follow-up report
3. 15-month Follow-up Survey | Beginning February 2014 | Implementation report, 15-month follow-up report
4. Implementation Survey of Grantees | Beginning fall 2013 | Implementation report, 15-month follow-up report
5. Site visits, observations, and staff and management interviews | Beginning winter 2013 | Implementation report, 15-month follow-up report
6. 30-36-Month Follow-up Survey | Beginning May 2015 | 30-36 month follow-up report
7. Follow-up Data Collection on Children of Study Participants | Beginning May 2015 | 30-36 month follow-up report



A17: Reasons not to Display OMB Expiration Date

All instruments created for HPOG-Impact will display the OMB approval number and the expiration date for OMB approval.

A18: Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

1 Public Private Ventures’ Sectoral Employment Impact Study (Maguire, Freely, Clymer, et al., 2010) is an impact evaluation of sectoral employment programs; A Promising Start: Year Up’s Initial Impacts on Low-Income Young Adults’ Careers (Roder and Elliott, 2011) is a small-scale random assignment impact study of a sectoral employment effort that does not target healthcare. The impact evaluation of the national Employment Retention and Advancement (ERA) Project (Hendra, Dillman, Hamilton, et al., 2010) is another recent impact study of a workforce development program, but it is not specifically focused on career pathways or healthcare.

2 This instrument and all related burden were previously approved on September 29, 2011 (0970-0394). This request is for approval of the supplemental baseline questions, which will be added to the approved PRS.

3 Source: Bureau of Labor Statistics, National Compensation Survey, 2010: combined average hourly wage across education, training, and library occupations and community and social services occupations.

