
Supporting Statement for OMB Clearance Request


Part A: Justification



Pathways for Advancing Careers and Education (PACE) – Second Follow-up Data Collection


OMB No. 0970-0397




November 2014




Submitted by:

Brendan Kelly

Office of Planning, Research
and Evaluation

Administration for Children
and Families

U.S. Department of Health and Human Services





Supporting Statement for OMB Clearance Request – Part A: Justification

Table of Contents



Introduction

In this document, we provide justification for the next set of data collection activities for the Pathways for Advancing Careers and Education (PACE) evaluation sponsored by the Office of Planning, Research, and Evaluation (OPRE) in the Administration for Children and Families (ACF) in the U.S. Department of Health and Human Services (HHS).1

OMB approval was received in November 2011 for the PACE baseline data collection and in August 2013 for the 15-month data collection (OMB No. 0970-0397).

This submission seeks OMB approval for one follow-up data collection instrument:

  • A 36-Month Follow-up Survey

Subsequent OMB submissions will seek clearance for any future follow-up data collection activities.

PACE is one project within the broader portfolio of research that OPRE is using to assess the success of career pathways programs and models. This strategy includes a multi-pronged research and evaluation approach for the Health Profession Opportunity Grants (HPOG) program to better understand and assess the activities conducted and their results. To maximize learning across the portfolio, survey development for the PACE and HPOG baseline and follow-up surveys is being coordinated, and the majority of the data elements collected in these surveys are similar. Three data collection efforts for HPOG have been approved under OMB clearance number 0970-0394, and a fourth (new) request is being submitted at the same time as this request. The HPOG and PACE research teams coordinated on development of the 36-month survey.

A.1 Necessity for the Data Collection

ACF seeks approval for the follow-up data collection activities described in this request in order to support a study conducted for it by Abt Associates (Abt). The PACE evaluation will assess a range of promising post-secondary career pathways programs that promote the improvement of education, employment, and self-sufficiency outcomes for economically disadvantaged adults. The major goal of PACE is to assess the effectiveness of a group of these programs in increasing 1) the receipt of educational credentials, 2) employment and earnings, and 3) self-sufficiency and other measures of well-being. ACF believes the development of rigorous evidence on these matters will be of great use to both policymakers and program administrators.

A.1.1 Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection. ACF is undertaking the collection as part of its ongoing effort to improve the economic well-being of the low-income population.

A.1.2 Study Background

ACF conceived of the PACE project as a test of promising interventions for improving the economic prospects of low-income individuals and families. After extensive outreach to the program and policy community (conducted under OMB clearance No. 0970-0343), ACF determined that the focus of PACE would be programs that fit into the career pathways framework. Appendix A is an exhibit of the career pathways framework and the theory of change used to guide the PACE evaluation.

A.1.3 Study Design

PACE study sites target low-income adults who are interested in occupational skills training. The sites conduct random assignment of individuals to one of two groups: a treatment group that is offered the innovative career pathways intervention, or a control group that can access "business-as-usual" services and any other services except the PACE services. The sample size in eight of the nine sites ranges from 500 to 1,200, with most near 1,000, equally distributed between the two research groups (in one of these sites, the sample will come from three sub-sites). The ninth site has an estimated sample of 2,540 across eight sub-sites, with 1,695 in the treatment group and 845 in the control group. Appendix B provides summaries of the nine PACE programs.

A.1.4 Research Questions

Overall, the PACE evaluation seeks to address the following research questions:

  • Implementation—What services are provided under each intervention? What are the characteristics of the populations served? How are services implemented? What are the issues and challenges associated with implementing and operating the service packages and policy approaches studied? How do services available to the treatment group compare to the services available to the control group? How does the take-up and utilization of services by the treatment group compare to the take-up and utilization of the control group?

Implementation data will provide a fuller understanding of the conditions surrounding these career pathway programs and the contexts in which they operate. This information also will allow researchers to assess the quality of the implementation of these programs—assessments that will be important to the interpretation of program impact results.

  • Impact—What are the net impacts of career pathway programs on educational outcomes (program completion and attainment of credentials and degrees) and economic outcomes (earnings, employment levels, and wage progression)?

The impact study will use baseline data, 15-month follow-up survey data, and 36-month follow-up survey data (for which this package seeks approval) to address the impact study research questions. Additionally, data on study participants' wages will be collected from the National Directory of New Hires (NDNH). This is expected to reduce the burden on study participants by negating the need to ask detailed earnings questions on the follow-up survey. In addition, administrative records are not subject to recall error or non-response. The research team also plans to use the National Student Clearinghouse (NSC) to gather information about college persistence and degree completion for study participants. NSC is currently the only data source that tracks postsecondary student enrollment across states.

Data collected for PACE will provide a rich body of information from which to answer these key research questions. For example, the research team will be able to say with confidence whether a program improved credential receipt at the time of the 36-month follow-up. The research team will analyze and report on each PACE program as a separate study. In addition, the research team will conduct limited cross-site impact analyses. This package seeks clearance for the following impact study information collection items: the 36-month follow-up survey, and the contact information update letter with a contact update form, which will help to ensure the research team can locate the study participants for the follow-up data collection activities.

  • Cost effectiveness—What are the costs of career pathway programs in the study? Do the estimated benefits of providing services outweigh the costs of these programs?

The project will address the cost-effectiveness of programs through comparison of net economic benefits with net program costs in the cost-benefit study. Although the bulk of cost data will come from programs’ existing administrative records, PACE researchers will augment their understanding of program costs through interviews with program staff.

A.1.5 Data Collection in the PACE Evaluation

In 2011 and 2013, the PACE project obtained OMB clearance for the baseline data and 15-month follow-up data collection (OMB No. 0970-0397). The following instruments were approved under this clearance:

  • Basic Information Form (BIF) for participants that collects general demographic and contact information. The BIF is administered during intake prior to random assignment by an intake staff person or self-administered on a paper form. (Approved November 2011)



  • Self-Administered Questionnaire (SAQ) for participants that collects more sensitive and personal information, including several psycho-social items designed and validated by the testing firm ACT, Inc. The SAQ is also completed during intake prior to random assignment and is self-administered on a paper form. (Approved November 2011)



  • First-round Interview Guides for interviews with program staff, used to collect information from site personnel. The project team is interviewing program administrators and staff at PACE sites and other organizations that partner with PACE sites to deliver services. The first-round interview guides were used for interviews conducted during the pilot and early full implementation stages. (Approved November 2011)

  • Basic Information Form Modification. This small but very important modification to the BIF allows the study team to collect information at baseline regarding the sample members’ children (as applicable). The newly added questions help to establish a sampling frame for future follow-up activities that estimate the effects of the programs on the children of those in the study who are parents. (Approved August 2013)

  • 15-month Follow-up Survey. The follow-up survey collects information from study participants 15 months following the date of random assignment. The follow-up survey is administered by telephone using specially trained interviewers and captures information on outcome measures for treatment and control group members in several domains including education and training, employment and income, and life circumstances. Field follow-up is used to contact participants who could not be reached after multiple phone attempts. (Approved August 2013)

  • Second Round Interview Guides for Program Leadership/Managers, Instructional Staff, Case Managers/Advisors, and Partners. Interview topic guides for the implementation study are used during a second round of site visits to each program to collect information from PACE program staff and other organizations involved in the delivery of services. The interview guides collect data to describe the programs as implemented, including core components, management and staffing, and contextual factors. In addition to describing the interventions, this information will help the research team interpret impact results. (Approved August 2013)

  • Online Surveys of Case Managers/Advisors, Managers/Supervisors and Instructional Staff. Online staff surveys are administered at each of the nine PACE programs. The case manager/advisor survey focuses on the issues covered with students (personal, academic, career planning, employment, financial), the amount of time spent with students, staff development activities, and how closely student progress and completion are monitored. The manager/supervisor survey inquires about staff background, the nature of assistance provided to program participants, and staff development and morale. The instructor survey elicits quantitative data about class size, the extent to which basic skills are integrated with training instruction, the use of and time spent on different instructional modes, instructor backgrounds, staff development activities, staff autonomy, and morale. (Approved August 2013)

  • In-depth Study Participant Interviews. In-depth interview guides are used to collect information from a sample of study participants from each site at two points in time, as well as for a brief interim telephone check-in. In the seven programs that are single sites (i.e., no sub-sites), the team is interviewing 10 treatment and 5 control group members. In programs with sub-sites, the team is interviewing 10 treatment and 5 control group members per sub-site in one program and half that number per sub-site in another. This information will be used to gain a more comprehensive understanding of treatment and control group members' experiences with the services. (Approved August 2013)

This submission seeks clearance for the follow-up data collection instrument developed for the PACE evaluation. A copy is provided in Appendix C.

Instrument: 36-Month Follow-up Survey (Appendix C). The follow-up survey will collect information from study participants approximately 36 months following random assignment. The follow-up survey will be administered by telephone using specially trained interviewers and will capture information on longer-term outcome measures for treatment and control group members in several domains, including education and training, employment and income, and life circumstances. Field follow-up will be used to contact participants who could not be reached after multiple phone attempts. Many of the questions to be asked at 36 months were approved for the 15-month survey, and most other items have been asked in other OMB-approved studies. A summary of the sources of survey items is provided in Appendix D.



ACF will seek OMB approval at a later point for additional information collections as part of the PACE evaluation. Data sources that will need OMB approval include:

  • Follow-up surveys: ACF may submit an additional follow-up survey for review in the future. The next follow-up survey would take place 60 months after random assignment. Additional survey instruments will be submitted to OMB for clearance.

Other Data sources:

  • Government administrative records: These records include Unemployment Insurance (UI) and federal wage records. The project has established an agreement with the HHS Administration for Children and Families’ Office of Child Support Enforcement to utilize the UI and wage records from the National Directory of New Hires.

  • National Student Clearinghouse: NSC includes 3,641 participating public and private institutions that collectively represent approximately 93 percent of higher education enrollments nationwide. The project will use NSC data for information on college persistence and degree completion.

  • Program records: The project team is collecting data on outcomes from the programs for the implementation study and for the impact study in sites where program records are available. Illustrative outcomes include measures of basic academic skills, services received, and credits and credentials earned. For some outcomes – such as educational attainment at post-secondary institutions – it is possible we will gather data from centralized state databases. The information included in these records will differ from site to site based on the information collected by each site's management information system(s). Where the site is a community college, the study team may be able to get reasonably comparable data on both treatment and control group members, but for the most part, the team will be limited to data on the former. This will not impose burden on programs because they currently collect this data for their own use.

A.2 Purpose of Survey and Data Collection Procedures

A.2.1 Overview of Purpose and Approach

The PACE project is an evaluation of promising programs and policies for improving employment and self-sufficiency outcomes for low-income, low-skilled adults. The PACE study is using an experimental design in nine programs to assess the impact of promising interventions on credential attainment, employment, earnings, and general well-being, and will also include an implementation study and a cost-benefit study.

The purpose of this second follow-up survey is to measure outcomes and impacts on service receipt, educational attainment, employment, and other life circumstances approximately 36 months after random assignment. These data will be used for the PACE impact study, as well as to understand treatment and control differentials for the implementation study. The instrument can be found in Appendix C.

A.2.2 Data Collection Process

The follow-up survey data collection will take place approximately 36 months following random assignment, which began in the first program in November 2011. Therefore, the follow-up data collection will start in late 2014 upon OMB approval. The last program will end random assignment in December 2014, and the sample for this cohort will be released to the field in December 2017. Allowing for six months to field the sample, follow-up survey data collection will conclude in June 2018.

A.2.3 Who Will Use the Information

The primary beneficiaries of this planned data collection effort will be ACF, other federal agencies, program operators, and low-income individuals themselves. ACF will use the information to assess the effects of the PACE programs for low-income individuals. These data will begin to answer ACF's questions about impacts of the post-secondary career pathways programs in all study domains: for example, education and credential achievement, employment and earnings, and income. Similarly, the Departments of Labor and Education have expressed a strong interest in the PACE study in particular and career pathways program effectiveness in general. The results of the PACE study could inform programmatic and funding decisions for all three agencies. Organizations (e.g., community colleges, workforce development agencies, community-based organizations) that are operating or creating career pathways programs will use the study information to refine or design programs for their target populations. Finally, low-income individuals will benefit from this information to the extent that it demonstrates the cost-effectiveness of career pathways programming and contributes to a body of evidence to inform program and policy design and investments.

Secondary beneficiaries of this data collection will be those in the public policy and social science research community who are interested in further understanding initiatives to promote economic self-sufficiency of individuals and families through comprehensive career pathways programs. At the conclusion of the PACE study, the research team will provide ACF with a restricted-use data set containing individual level data stripped of all personally identifying information. The restricted-use data will be made available to researchers for approved secondary uses.

Ultimately, these data will benefit researchers, policy analysts, and policy makers in a wide range of program areas. The effects of post-secondary career pathways programs on the well-being of low-income individuals and families could manifest themselves in many dimensions and could be relevant to an array of other public programs. This project offers the first opportunity to obtain reliable measures of these effects. The long-term indirect benefits of this research are therefore likely to be substantial.

A.2.4 Instrument Item-by-Item Justification

Exhibit A-2 describes the target respondents, content, and reason for inclusion for the new data collection activity. For more information about previously approved instruments, see the previous information collection request (OMB No. 0970-0397), approved August 2013. Copies of all the data collection instruments are provided as appendices.


Exhibit A-2. Item-by-Item Justification of Data Collection Instruments

Data Collection Activity: Study Participant Follow-up Survey

Data Collection Instrument(s): Instrument 1: 36-Month Follow-up Survey (Appendix C)

Respondents: Overall expected sample of 7,386 (80% of the 9,232 sample members from the nine PACE programs).

Content:

  • Employment and training history

  • School spells

  • Gaps in education and school

  • Job spells

  • Job conditions

  • Education and career goals

  • Services and assistance

  • 21st century skills

  • Household composition

  • Income and material well-being

  • Time out of home/child supervision

  • Child education-related goals and support

  • Child outcomes

  • Contact information

Reason for Inclusion: This follow-up period of 36 months will provide a longer-term look at education and employment outcomes, as well as the first opportunity to explore parenting practices and child outcomes.



A.3 Improved Information Technology to Reduce Burden

The PACE evaluation will generate a substantial amount of data and will use a combination of data collection methods. For each data collection activity, the study team has selected the form of technology that enables the collection of valid and reliable information in an efficient way while minimizing burden. This evaluation will use improved technology to facilitate the collection of the survey data in standardized and accurate ways that also ensure the protection of the data collected.

The follow-up survey will be administered using CATI (computer-assisted telephone interviewing) technology for telephone interviews and CAPI (computer-assisted personal interviewing) for in-person interviewing when the individual cannot be located for a telephone survey. CATI and CAPI technology reduces respondent burden, as interviewers can proceed more quickly and accurately through the survey instruments, minimizing the interview length. Computerized questionnaires ensure that the skip patterns work properly, minimizing respondent burden by not asking inappropriate or non-applicable questions. For example, respondents who did not participate in post-secondary training will be routed past questions only relevant to those who did. Computer-assisted interviewing can build in checkpoints, which allow the interviewer or respondent to confirm responses, thereby minimizing data entry errors. Finally, automated survey administration can incorporate hard edits to check for allowable ranges for quantity and range value questions, minimizing out-of-range or unallowable values.
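To illustrate the kind of routing and range-checking logic described above, the minimal Python sketch below shows how a computerized instrument can skip detail items for respondents who answer "no" to a gate question and flag out-of-range numeric answers. The question names and the 0-36 range are hypothetical placeholders for illustration only; they are not items from the PACE instrument or from the survey contractor's CATI/CAPI software.

    # Hypothetical sketch of CATI-style skip logic and a hard range edit; not the PACE instrument.
    def administer(responses: dict) -> dict:
        answers = {}

        # Gate question: detail items are asked only of respondents reporting any training.
        answers["attended_training"] = responses["attended_training"]  # "yes" or "no"
        if answers["attended_training"] == "yes":
            answers["months_in_training"] = responses["months_in_training"]
            # Hard edit: reject out-of-range values so they can be re-asked rather than stored.
            if not 0 <= answers["months_in_training"] <= 36:
                raise ValueError("months_in_training outside allowable range 0-36")
        # Respondents who answered "no" are routed past the training detail items entirely.

        return answers

    print(administer({"attended_training": "no"}))
    print(administer({"attended_training": "yes", "months_in_training": 12}))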

The agency considered using a planned missingness and multiple imputation method to reduce burden on respondents by establishing subpanels that receive different forms of the questionnaire. This would enable the research team to report about a larger set of outcomes without causing the average interview duration to exceed 60 minutes. However, the research team was concerned about the large number of programs being evaluated and analytic difficulties that would be encountered in trying to impute the missing data in ways that fully reflected differences in effectiveness across programs. The agency and research team instead opted to employ a variety of efficiencies to reduce the length of the survey overall to 60 minutes. This included analyzing data from the 15-month survey to determine which 21st Century Skills scales worked best to answer the study questions and removing or consolidating questions that were similar to those asked at 15 months.

A.4 Efforts to Identify Duplication

The purpose of the follow-up survey for the PACE evaluation is to obtain current information on the status and well-being of individuals in the PACE evaluation study sample. Information about these respondents' educational achievement, economic well-being, and job skills development is not available through any other source, nor is information about parenting and child outcomes. The evaluation will utilize administrative data (e.g., wage records) in conjunction with survey data to avoid duplication of reporting.

The research team will also avoid duplication in this study by use of the centrally maintained data system, which links all the data collected at baseline and follow-up (and during the subsequent active and passive tracking efforts) with subsequent information gathered from administrative sources. This eliminates the need to ask about personal characteristics or background factors for known household members on follow-up surveys.

Of the nine sites included in PACE, three are programs that received Health Profession Opportunity Grants (HPOG) administered by ACF and a fourth is a sub-grantee to an HPOG-funded program. ACF is funding implementation and impact evaluations of the HPOG program and the PACE and HPOG research teams worked closely to coordinate data collection in the four programs that are part of both studies. Areas of coordination include:

  • Development of the 36-month follow-up survey included in this clearance request. The teams worked in close collaboration to develop the follow up instrument. The HPOG evaluation is also submitting an OMB clearance package at this time under OMB #0970-0394.

  • Data sharing. All data collected for the three HPOG sites in PACE will be shared with the HPOG research team for inclusion in the HPOG implementation and impact studies.

A.5 Involvement of Small Organizations

The primary organizations involved in this study are community colleges, workforce development agencies, and community-based organizations that operate occupational training programs. Burden is minimized for these entities by requesting the minimum information required to achieve the study’s objectives. On-site interviews with program staff cover topics on which the study team is unable to collect sufficient information by other means.

A.6 Consequences of Less Frequent Data Collection

The data collection effort described in this document is essential to the PACE evaluation. Collecting data less frequently would jeopardize ACF's ability to conduct the impact analyses. Delays in administering the follow-up survey increase the risk that respondents will have trouble recalling details about the services they received and that key milestone events will be missed as study participants move through training and education.

A.7 Special Circumstances

The proposed data collection activities are consistent with the guidelines set forth in 5 CFR 1320.6 (Controlling Paperwork Burden on the Public, General Information Collection Guidelines). There are no circumstances that require deviation from these guidelines.

A.8 Federal Register Notice and Consultation

In accordance with the Paperwork Reduction Act of 1995, the Administration for Children and Families (ACF) at the Department of Health and Human Services published a notice in the Federal Register on June 23, 2014 (Vol. 79, pp. 35549-50; FR Doc. 2014-14566). A copy of the notice is shown in Appendix E. During the notice and comment period, the government received two requests for information about the data collection activity. Those requests were fulfilled. No comments were received.

A.9 Incentives for Respondents

For the evaluation to be most successful, the study team determined that monetary gifts should be provided to the study participants in appreciation of the time they spend participating in the data collection activities. These tokens of appreciation are a powerful tool for maintaining low attrition rates in longitudinal studies, especially for participants in the control group because these sample members are not receiving any (other) program benefits or services. The use of monetary gifts for the PACE follow-up surveys can help ensure a high response rate, which increases confidence in producing unbiased impact estimates. Low response rates increase the danger of differential response rates between the treatment and control groups, leading to possible non-comparability between the two groups and potentially biased impact estimates.

Four factors helped to determine the amounts for the follow-up survey:

  1. Respondent burden, both at the time of the interview and over the life of the evaluation;

  2. Costs associated with participating in the interview at that time;

  3. Other studies of comparable populations and burden; and

  4. Experience to date with the 15-month follow-up survey.

Previous research has shown that sample members with low incomes and/or low educational attainment are responsive to incentives, as are minority group members. These characteristics are expected to be heavily represented in the PACE study population.2

The amounts offered for the PACE follow-up survey and tracking responses are as follows:

  • Follow-up Survey (all study participants): $40. This is an increase of $10 over the amount offered for the 15-month survey. The higher amount is expected to reduce nonresponse bias to the 36-month follow-up for this difficult-to-track PACE population. Feedback received by the survey firm from 15-month survey respondents indicates that they appreciate the payment amount and consider it fair given the length of the interview. (Additional information on determining the survey amount is provided below.)

  • Contact information update letters (all study participants): Contact information update letters will be mailed to respondents every four months in order to obtain the most up-to-date contact information and minimize the likelihood of being unable to contact a participant (see Appendix F). Each contact information update letter will include two $1 bills (total $2) as a thank you for updating or confirming contact information. This is a change in protocol from the 15-month survey, in which participants were paid $5 if they confirmed or changed their contact information. The change to a prepayment in the contact information update letters is based on extensive literature documenting the effectiveness of prepayments (see Church 1993; Yammarino, Skinner, and Childers 1991; Edwards et al. 2002; Edwards et al. 2005).

  • In addition, study participants will receive $5 with the advance letter (Appendix G). (Advance letters are mailed one week before calls begin for a particular cohort.) This is a change in protocol from the 15-month survey. This prepaid gift was incorporated into the 36-month survey protocol based on literature documenting attrition in response rates across waves of panel surveys (see Deng et al. 2013; Hernandez 1999; Clark and Mack 2009; Fitzgerald, Gottschalk, and Moffitt 1998; Hill 1991).

Respondents to the follow-up survey will receive $40 to complete a 60-minute questionnaire that will be administered via telephone, or in person if the individual is not reachable by phone. The proposed amount is based on similar surveys the research team has conducted with this population. Tokens of appreciation help to secure the cooperation of individuals over the duration of the study period and reduce the potential for individuals to fail to complete the surveys.

Many surveys are designed to offer incentives of varying types with the goal of increasing survey response. Monetary incentives at one or more phases of data collection have become fairly common, including in some federally sponsored surveys. Examples include the National Survey on Drug Use and Health (NSDUH, Substance Abuse and Mental Health Services Administration), the National Survey of Family Growth (NSFG, National Center for Health Statistics), the National Health and Nutrition Examination Survey (NHANES, National Center for Health Statistics), the National Survey of Child and Adolescent Well-Being (NSCAW, Administration for Children and Families), and the Early Childhood Longitudinal Study-Birth Cohort (ECLS-B, U.S. Department of Education).

There has been extensive publication about the relative efficacy of different monetary incentives. The U.S. Census Bureau has experimented with and begun offering monetary incentives for several of its longitudinal panel surveys, including the Survey of Income and Program Participation (SIPP). SIPP has conducted several multi-wave incentive studies, most recently with its 2008 panel, comparing results of $10, $20, and $40 incentive amounts to those of a $0 control group. They examined response rate outcomes in various subgroups of interest (e.g., the poverty stratum), use of targeted incentives for non-interview cases, and the impact of base wave incentives on later participation. Overall, $20 incentives increased response rates and improved the conversion rate for non-interview cases (Creighton et al., 2007). The National Survey on Drug Use and Health (NSDUH, Substance Abuse and Mental Health Services Administration) conducted an experiment in which the cost per interview in the $20 incentive group was five percent lower than in the control group, whereas the cost in the $40 incentive group was four percent lower than in the control group, due to the reduced effort needed in gaining cooperation (Kennet et al., 2005). The NSDUH adopted an intermediate incentive of $30 because the greatest increase in response rate was found in the $20 incentive condition, and the $40 condition produced higher variation in per-interview costs. A similar incentive experiment conducted for the National Survey of Family Growth (NSFG, National Center for Health Statistics) Cycle 5 Pretest examined $0, $20, and $40 incentive amounts. The additional incentive costs were more than offset by savings in interviewer labor and travel costs (Duffer et al., 1994).

A.10 Privacy of Respondents

The information collected under this data collection will be kept private to the fullest extent provided by law. The information requested under this collection will be private in a manner consistent with 42 U.S.C. 1306, 20 CFR 401 and 402, and OMB Circular No. A-130.

A.10.1 Data Privacy Protections

The study team has established rigorous data security and privacy provisions. First, all data users are aware of and trained on their responsibilities to protect participants' personal information, including the limitations on uses and disclosures of data. (Each study team member who works with data signs an Individual Investigator/Confidentiality Agreement, which outlines the individual's responsibilities in complying with the standards and requirements for protecting data.) The research databases are designed to limit access to authorized users with levels of access commensurate with each person's role on the project. The web server hosting the database is maintained in a secure facility with power backup, network redundancy, and system monitoring. In addition, daily backups of the server are maintained at the data center and an off-site location. The database and website are password protected, and access is provided only after user authentication.

The PACE Participation Agreement (see Appendix H) ensures a commitment to keeping personal information private. This assurance will also be made to all respondents as part of the introduction to the follow-up survey. For both survey data and corresponding administrative data on sample members, computer security will be maintained by individual passwords and folder permissions which limit access to files to only those project staff members who require access to these files.

The ACF Office of Planning, Research and Evaluation is in the process of publishing a System of Records Notice (SORN) titled OPRE Research and Evaluation Project Records and a Privacy Impact Assessment (PIA) titled ACF Research and Evaluation Studies.

A.11 Sensitive Questions

The follow-up survey includes questions about physical and emotional health and substance use, items that some respondents may consider sensitive. The literature provides ample support for including these items because such factors can be barriers to education and employment. Including these items is necessary to describe the study population and evaluate mediating effects on program impacts. Program staff will remind study members during the interviewing process that they may refuse to answer individual items. Study members will also be reminded that their responses will be kept private, to encourage candid responses.

A.12 Estimation of Information Collection Burden

Baseline Data Collection Already Approved

The total burden for the instruments already approved was estimated to be 21,454 hours.

Administration of these previously approved instruments continues, and the total remaining burden is 2,598 hours, or 866 hours annually over three years.

Current Information Collection Request

Follow-up data collection for the PACE evaluation will occur for each individual approximately 36 months following random assignment. Exhibit A-12 shows the estimated burden for both the new instrument and the previously approved instruments that have burden remaining. It shows the average time, in hours, that study participants and program staff and partners are estimated to spend completing each data collection instrument.

The average hourly wage was calculated for each respondent group based on information from the Bureau of Labor Statistics3 or the federal minimum wage. The mean hourly rate4 for each respondent group was calculated as follows (the arithmetic is illustrated in the sketch after the list):

  • Study participant: the minimum hourly wage ($7.25) plus a 40 percent adjustment to account for benefits, or $10.15 per hour.

  • Case manager: Community and Social Service Occupations (SOC code 21-0000) wage rate of $21.50 plus a 40 percent adjustment for benefits, or $30.10.

  • Instructional staff: Education, Training, and Library Occupations (SOC 25-0000) wage rate of $24.76, plus a 40 percent adjustment for benefits, or $34.66.

  • Program leadership/Manager: Social and Community Service Manager Occupations (SOC 11-9151) wage rate of $31.61, plus a 40 percent adjustment for benefits, or $44.25.
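The loaded rates in the list above follow a single rule: the base wage is increased by 40 percent to account for benefits. A minimal sketch of that arithmetic, using only the figures quoted in the bullets, is shown below; the script is illustrative rather than part of the burden methodology.

    # Illustrative only: reproduces the 40 percent benefits loading described above.
    BENEFITS_LOADING = 0.40

    base_wages = {
        "Study participant (federal minimum wage)": 7.25,
        "Case manager (SOC 21-0000)": 21.50,
        "Instructional staff (SOC 25-0000)": 24.76,
        "Program leadership/manager (SOC 11-9151)": 31.61,
    }

    for group, wage in base_wages.items():
        loaded = round(wage * (1 + BENEFITS_LOADING), 2)
        print(f"{group}: ${wage:.2f} base -> ${loaded:.2f} loaded")
    # Yields $10.15, $30.10, $34.66 (34.664 rounded), and $44.25 (44.254 rounded).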



Exhibit A-12. Total Data Collection Burden

[This information collection request is for a three-year period.]

Instrument | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Annual Burden Hours | Average Hourly Wage | Total Annual Cost

Previously Approved Instruments
Baseline data collection: Basic Information Form | 24 | 8 | 1 | 0.25 | 2 | $10.15 | $20
Baseline data collection: Self-administered Questionnaire | 24 | 8 | 1 | 0.33 | 3 | $10.15 | $30
15-Month Follow-up Survey | 2,900 | 967 | 1 | 0.833 | 806 | $10.15 | $8,171
Study Participant In-depth Interview Guide | 144 | 48 | 1 | 1 | 48 | $10.15 | $487
Study Participant Check-in Call | 144 | 48 | 1 | 0.16 | 8 | $10.15 | $81

Current Request for Approval
36-Month Follow-up Survey | 7,386 | 2,462 | 1 | 1 | 2,462 | $10.15 | $24,989



Total Burden Hour Request

Total burden is displayed in Exhibit A-12. The total burden for the already approved but continuing information collection and the new request is 9,987 hours, or 3,329 hours per year over three years. The annual burden is equivalent to $33,779 based on respondents' estimated hourly compensation of $10.15, or a total of $101,338 over three years.
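As an illustrative check on the hour totals reported above, the sketch below sums the annual burden hours shown in Exhibit A-12 and multiplies the result by the three-year clearance period; the row values are taken directly from the exhibit.

    # Annual burden hours from Exhibit A-12 (previously approved plus current request).
    annual_burden_hours = {
        "Basic Information Form": 2,
        "Self-administered Questionnaire": 3,
        "15-Month Follow-up Survey": 806,
        "In-depth Interview Guide": 48,
        "Check-in Call": 8,
        "36-Month Follow-up Survey": 2462,
    }

    annual_total = sum(annual_burden_hours.values())  # 3,329 hours per year
    three_year_total = annual_total * 3               # 9,987 hours over the clearance period
    print(annual_total, three_year_total)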

A.13 Cost Burden to Respondents or Record Keepers

This data collection effort involves no recordkeeping or reporting costs for respondents other than those described in item A.12 above.

A.14 Estimate of Cost to the Federal Government

The annual cost for all information collection under this OMB number is $3.9 million. This includes the cost of the development of data collection instruments and tools, the administration of the follow-up surveys, and the analysis and reporting of data.

A.15 Change in Burden

This evaluation involves new data collection that increases the public reporting burden under this OMB number. Section A.12 details the burden figures.

A.16 Plan and Time Schedule for Information Collection, Tabulation and Publication

The evaluation contractor, Abt Associates, and its subcontractors will collect, analyze, tabulate and report the data collected for the PACE evaluation to ACF.

A.16.1 Analysis Plan

The PACE data collection activities will support the following major deliverables in addition to those already produced under the same OMB number:

  1. Site-specific 36-month impact studies. Each interim report will describe the program impact on key indicators, including career pathways-relevant training, earnings and career-track employment, and family well-being. Inasmuch as programs use different strategies and target varying outcomes, the site-specific impact studies will also focus on whether impacts vary by subgroups. The site-specific interim reports, using data from the 36-month survey, will also include cost-benefit information. Interim impact reports will be drafted on a rolling basis according to a schedule determined by the timing of the 36-month surveys. The first report will be drafted by November 2017 and the last will be written by January 2019. For each report, a revision will be submitted after receiving comments from ACF. The first report is projected to be finalized by February 2018 and the last by April 2019.

In addition to these primary analyses, data from this information collection will be important to conducting a number of secondary analyses. For example, the PACE impact evaluation will consider the extent to which component parts of a career pathway program contribute to program impacts and compare existing career pathway resources with programming developed under PACE, assuming the available data support such a comparison. This question is of primary concern for the HPOG impact evaluation and will form part of the secondary analysis for the PACE impact evaluation.

A.16.2 Time Schedule and Publications

Exhibit A-16 presents an overview of the project schedule for all information collection under OMB #0970-0397. It also identifies deliverables associated with each major data collection activity.

Exhibit A-16. Overview of Project Data Collection Schedule

Data Collection Activity | Timing | Associated Publications
1. Baseline data collection | Currently operating under OMB # 0970-0397 | Site-specific implementation reports, interim impact reports
2. Supplemental baseline questions on BIF | Currently operating under OMB # 0970-0397 | Site-specific implementation reports, interim impact reports
3. 15-Month Follow-up survey | Currently operating under OMB # 0970-0397 | Site-specific implementation reports, interim impact reports, cost-benefit report
4. Survey of instructors and case managers/advisors | Information collection completed under OMB # 0970-0397 | Site-specific implementation reports
5. Site visits, staff and management interviews | Information collection completed under OMB # 0970-0397 | Site-specific implementation reports
6. In-depth interviews with study participants | Currently operating under OMB # 0970-0397 | Site-specific implementation reports
7. 36-Month Follow-up survey | Beginning in 2014 upon OMB approval | Site-specific, interim and final impact reports



A.17 Reasons Not to Display OMB Expiration Date

All data collection instruments created for the PACE evaluation will display the OMB approval number and expiration date.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

This submission requests no exceptions to the Certification for Paperwork Reduction Act Submissions (5 CFR 1320.9).

1 From the project inception in 2007 through October 2014, the project was called Innovative Strategies for Increasing Self-Sufficiency.

2 See among the sources documenting this recommendation: Allen P. Duffer et al., "Effects of Incentive Payments on Response Rates and Field Costs in a Pretest of a National CAPI Survey" (Research Triangle Institute, May 1994), passim; see also "National Adult Literacy Survey Addendum to Clearance Package, Volume II: Analyses of the NALS Field Test" (Educational Testing Service, September 1991), pp. 2-3.

3 http://www.bls.gov/oes/current/oes_nat.htm

4 Assuming 2080 FTE hours worked.

