
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes





Integrating Financial Capability and Employment Services



OMB Information Collection Request

0970 – New Collection





Supporting Statement

Part B



NOVEMBER 2021








Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officer:

Lisa Zingman


Part B


B1. Objectives

Study Objectives

The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services seeks approval for the Integrating Financial Capability and Employment Services (InFin) study. The objective of this exploratory, descriptive study is to build more evidence for policymakers and practitioners about the extent, forms, and practices of incorporating financial capability interventions into employment and training (E&T) programs serving low-income adult populations. This study also aims to help establish a basis for research and evaluation in this area. As noted in Supporting Statement A, the study research questions include:

  • To what extent are employment coaching or training programs incorporating financial capability training?

  • Why are they incorporating financial capability training? What factors, including any state or local policies, drive this decision?

  • What are the key inputs, activities, and outputs of financial capability trainings as implemented in employment and training programs?

  • What are the efforts to evaluate financial capability training in employment and training contexts to date?

  • What are the research gaps in these areas and options for future research and evaluation efforts to address them?

To advance these goals, this study includes six data collection components:

  • A survey of employment and training programs integrating financial capability services (Instrument 1)

  • Program phone interviews with employment and training programs integrating financial capability services (Instrument 2)

  • Virtual site visit interviews with employment and training programs integrating financial capability services (Instrument 3)

  • Interviews with participants of employment and training programs integrating financial capability services (Instrument 4)

  • Interviews with employers that offer financial capability services to their employees (Instrument 5)

  • Focus groups with program administrators of employment and training programs integrating financial capability services (Instrument 6)


Generalizability of Results

This study is an exploratory study designed to build evidence about the extent, forms, and practices of incorporating financial capability interventions into E&T programs serving low-income adults. The information from this study will help establish a basis for potential future research and evaluation in this area by ACF or by other agencies and will highlight examples of promising strategies for E&T programs interested in incorporating financial capability services into their programming, as well as for financial capability providers considering incorporating employment services directly or through partnerships.

The study is intended to present an internally valid description of financial capability services offered in employment and training programs serving low-income adults, not to support statistical generalization to other sites or service populations.


Appropriateness of Study Design and Methods for Planned Uses


We will not use the data to make statistical inferences or generalize findings beyond the study sample. However, the information gathered from the nonrepresentative sample will provide a snapshot of the extent, forms, and practices of incorporating financial capability interventions in employment and training programs serving low-income adults.


This exploratory, descriptive study takes advantage of the relative strengths of several different types of qualitative data collection to address all of the study objectives in depth.


Through the survey (Instrument 1), we will be able to document high-level characteristics of financial capability interventions, including types of employment programs incorporating financial capability services; types of financial capability services integrated; models of integration; and any past, current, and potential future evaluation activities for these services. A web-based survey is the most appropriate method for collecting this information because it will allow us to collect information systematically across many different types of employment and training programs. The survey will also provide information that will help with the selection of programs for phone interviews and virtual site visit interviews.


We will also conduct semi-structured interviews with several different programs and populations. Semi-structured interviews allow for greater flexibility than a web-based survey and will let us collect more detailed information. By allowing respondents to guide the discussion, we will likely discover topics that are understudied and therefore not well understood, a major goal for this exploratory study. Semi-structured interviews will be used for the Program Phone Interviews, Virtual Site Visit Interviews, Interviews with Participants, and Interviews with Employers (Instruments 2-5). The multiple types of interviews will ensure that the study gathers detailed information from different perspectives and on different types of interventions. More specifically:

  • The Program Phone Interviews will collect qualitative information on E&T programs implementing financial capability services across a variety of program types.

  • The Virtual Site Visit Interviews will collect more in-depth qualitative information from multiple perspectives on a limited number of notable models of E&T programs implementing financial capability services that involve higher levels of complexity (e.g., partnerships in service delivery across more than one agency or organization).

  • The Interviews with Participants will gather information on participant perspectives, including feedback on how E&T programs could better provide financial capability services.

  • The Interviews with Employers will collect qualitative information on a variety of employer-based financial capability programs—a topic emphasized as an important component of the employment-related financial capability service delivery ecosystem for low-income workers.


This study also plans to use Focus Groups with Program Administrators (Instrument 6) to explore broad research questions by allowing administrators from different programs to talk about challenges each has encountered and to collaboratively discuss approaches for confronting those challenges that build on the successes of their respective programs.


As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.


B2. Methods and Design

Target Population

The target populations for this study include employment and training program leadership and staff, participants of employment and training programs, and employers. The target populations and estimated sample size for each data component are detailed below.


  • Survey (Instrument 1). The unit of analysis for the survey will be employment and training programs offering financial capability services. We will collect information from up to 80 employment and training programs, likely from program managers or directors.

  • Program Phone Interviews (Instrument 2). The unit of analysis will be employment and training programs offering financial capability services. We will collect information from 15 employment and training programs, likely from program managers or directors.

  • Program Virtual Site Visit Interviews (Instrument 3). At each of the four selected employment and training programs, we will collect information from leadership and frontline staff, interviewing two program managers and six frontline staff per program, for a total of 32 staff.

  • Participant Interviews (Instrument 4). The unit of analysis for the interviews will be the participant. We will collect information from 16 participants.

  • Program Interviews with Employers (Instrument 5). The unit of analysis will be the employer, and we will collect information from 10 representatives of companies delivering financial capability services to their employees (likely an HR representative at each).

  • Program Administrator Focus Groups (Instrument 6). The unit of analysis for the focus groups will be the group discussion. We will collect information from 10 program administrators in two group discussions.



Sampling and Site Selection

Respondent recruitment for each data collection component is described below:


Survey (Instrument 1). Because this is an exploratory study and we do not have a population from which to sample, we will use a convenience sample to identify potential respondents who can provide key information on their programs’ financial capability services. This sample will be based on programs identified through expert discussions, an earlier review of relevant literature, internet searches by project team members, and nominations. Programs will not be representative of the population of employment and training programs that offer financial capability services. Instead, we aim to include a variety of employment and training program types to understand the range of integration models (e.g., curricula or workshops covering both topics, coaching covering both job search and financial goals, co-located services, referral networks).


Program Phone Interviews (Instrument 2). Potential participants for interviews will primarily be identified using purposive sampling based on prior survey responses and other available information from the survey sample identification process. We will consider two criteria in choosing programs to interview:

  1. Achieving a variety of program type and features across all programs selected, on dimensions such as:

    • Agency type

    • E&T program type

    • Type of financial capability services offered

    • Nature of partnerships and referral arrangements

    • Populations served

  2. Inclusion of programs that earlier study activities (e.g., the literature review, the survey) suggest have particularly notable models.

We will seek to interview managers from a set of programs that meet these criteria to understand how implementing financial capability services looks in different employment and training contexts.

Program Virtual Site Visit Interviews (Instrument 3). Based on prior survey responses, phone interview responses, and other available information from the survey sample identification process, we will select programs to participate in virtual site visit interviews. We will prioritize larger programs with greater complexity (since gathering multiple perspectives will be most important for larger programs with more complex approaches to service delivery) and programs that the previous rounds of information collection have indicated have noteworthy or innovative approaches that other programs and researchers could learn from. The study team aims to speak with 1-2 program managers and 6 staff from each program, depending on the staffing patterns at each site and staff availability. The program managers will identify the staff that they think would be best positioned to respond to the research questions, including any staff at partner agencies that provide financial capability services.


Participant Interviews (Instrument 4). Program managers at the four programs selected for virtual site visit interviews will identify potential interview participants. The research team will provide program managers with guidance for selecting individuals who have participated in financial capability services connected to the employment and training program. Program managers will reach out to these individuals and ask whether they agree to participate in an interview. The research team will then contact those who agree to confirm their interest in participating.


Program Interviews with Employers (Instrument 5). Potential participants for interviews will be identified via a literature scan, Google search, and expert consultations. Using these sources, we will compile a list of employers that potentially offer financial capability services to their employees. We will select employers to interview that reflect a range of financial capability services, industry sectors, and other characteristics.


Program Administrator Focus Groups (Instrument 6). The research team will identify potential administrators to invite to the focus groups primarily based on information collected through the prior survey, phone interviews, and site visits. Information the team will consider includes the nature of the challenges in program administration and the notable practices identified at the earlier stages, as well as any interest expressed by program administrators during the previous information collection stages.


B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The survey (Instrument 1) was designed based on information collected in a foundational literature review for the study. We identified constructs we wished to capture and scanned existing survey instruments, such as the Consumer Financial Protection Bureau Survey of Financial Education Providers, for measures to adopt or adapt. For questions related to the strength of the organization’s partnership, we relied on validated questions used in Greenwald and Zukoski’s Assessing Collaboration: Alternative Measures and Issues for Evaluation.1 Many items were drafted by the team to capture information specific to the project.


We aimed to keep the survey relatively brief, around 20 minutes on average. Questions in the survey instrument have been included because they directly address the research questions that motivate this project. The items are primarily closed-ended.


The survey was also pre-tested with three programs: a county-administered TANF program, a state-administered TANF program, and a community-based organization. Pre-testing confirmed that our burden estimates were accurate, that the questions resonated with respondents, and that the web platform functioned properly. We made a slight change in the survey flow as a result of the pretest.


Guides for the Program Phone Interviews, Program Virtual Site Visit Interviews, Participant Interviews, Interviews with Employers, and Program Administrator Focus Group (Instruments 2-6) were developed to gather more detailed information than the survey can provide in order to answer the study’s research questions. The research team based the specific topics covered by each guide’s questions on the input and feedback gathered through the consultation with experts described in Supporting Statement A. The research team designed the guides to be semi-structured to allow the interviewers flexibility to tailor the discussions, given the potential variety in program arrangements (in terms of specific services delivered, target populations, integration between employment and training and financial capability services, and partnerships involved in service delivery) and given that more background information may be available from earlier stages of information collection for some interviews than for others. Questions in each guide are designed for particular respondents; for example, the research team made an effort to use less technical language in the Participant Interviews guide. The questions in the Program Administrator Focus Group guide are designed to generate discussion among the focus group participants rather than to elicit direct responses to the facilitator.



B4. Collection of Data and Quality Control

ACF has contracted with MEF Associates and its subcontractor, the Urban Institute, to conduct this study. MEF Associates and Urban Institute staff will collect all data for each component mentioned above. Recruitment procedures and the mode of data collection for each component are described below.


Survey (Instrument 1). We will send potential respondents an email explaining the purpose of the survey (Appendix A). The email will include a link to Qualtrics, a web-based survey platform. While the survey is being fielded, we will spot-check responses as they come in to ensure that skip patterns are performing properly and to assess any issues with missing data.

Program Phone Interviews (Instrument 2). We will reach out to program managers via email (Appendix B) and request their participation in these interviews to support the study. Interviews will be conducted over videoconferencing or phone and scheduled at a time that is convenient for the respondent.


Program Virtual Site Visit Interviews (Instrument 3). We will reach out to program managers via email (Appendix C) and request their participation in these interviews to support the study. If the site agrees to participate in the site visit interview, we will request a list of names and contact information for other staff and partners that may be included in interviews. Then we will reach out via email to each contact listed to confirm their interest in participating.


Participant Interviews (Instrument 4). Potential participants for interviews will be identified by the four programs selected for virtual site visit interviews. To begin, we will ask program managers who have agreed to participate in the site visit interviews to provide, via a secure transfer method that protects potential participants’ information, a list of participants who have agreed to be contacted about the interviews. We will then reach out to each person listed to confirm their interest in participating. (No outreach materials are included as an appendix because program managers will lead the first outreach step.)


Program Interviews with Employers (Instrument 5). We will reach out to HR representatives via email (Appendix D) and request their participation in these interviews to support the study.


Program Administrator Focus Groups (Instrument 6). The focus groups will occur after the survey, phone interviews, and site visits. We will reach out to managers of programs identified through their participation in the earlier stages of information collection to explore their interest in participating in the focus groups (see Appendix E for associated outreach materials). We will emphasize the potential to learn from the other focus group participants.


Interviews will be conducted in the summer and fall of 2022 (pending OMB approval) and will last no more than 90 minutes. Because the COVID-19 pandemic may be ongoing, all semi-structured interviews and focus groups will be conducted via videoconferencing (or over the phone if the interviewees are unable to use videoconferencing). At least two members of the research team will be on each phone or video call. During the interviews, one person will lead the discussion and one person will take detailed notes. We will also audio record the interviews with the permission of respondents.


All interviewers/focus group facilitators will attend a training that will cover the overall goal of the study and related research questions, data security and storage, obtaining participant consent, how to ask interview questions, focus group dynamics and procedures, and program visit logistics. The training will also cover ways to reduce burden by avoiding asking questions that have already been answered in earlier stages of information collection (except where multiple perspectives are valuable).



B5. Response Rates and Potential Nonresponse Bias

Response Rates

None of the six data components is designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported.


Nonresponse

As participants will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated.



B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination.



B7. Data Handling and Analysis

Data Handling

For the survey, Qualtrics provides real-time logic checks and access to data frequencies that will make data review efficient. After the first phase of data collection, project staff will review data frequencies to make sure the responses remain internally consistent. After data collection has ended, we will further check data frequencies and make logical edits when necessary. We will examine whether there are systematic patterns in nonresponse, and we will recode other-specify survey responses into valid response options as appropriate.
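To illustrate the kind of post-collection review described above, the following is a minimal sketch in Python using pandas; the file name, variable names, skip-pattern rule, and recoding rule are all hypothetical and would depend on the final Qualtrics export.

    import pandas as pd

    # Hypothetical export file; actual Qualtrics column names will differ.
    df = pd.read_csv("infin_survey_export.csv")

    # Review response frequencies for key closed-ended items.
    for item in ["org_type", "services_offered", "integration_model"]:
        print(df[item].value_counts(dropna=False))

    # Check a skip pattern: programs answering "No" to offering financial
    # coaching should not have answered the coaching follow-up item.
    violations = df[(df["offers_coaching"] == "No") & df["coaching_followup"].notna()]
    print(f"{len(violations)} skip-pattern violations to review")

    # Logical edit: recode an other-specify response into a valid option.
    is_college = df["org_type_other"].str.contains("college", case=False, na=False)
    df.loc[is_college, "org_type"] = "Educational institution"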

For all semi-structured interviews and focus groups, we will have both detailed notes and audio recordings. Prior to importing the notes into qualitative analysis software (see discussion below), the study team will ensure that the notes are complete and accurate by checking the detailed notes taken during the interview or focus group against the audio recording.

Data Analysis

For closed-ended survey questions, we will conduct descriptive analysis, calculating frequencies and means as appropriate. Due to the small number of respondents and the lack of a sampling frame, subgroup analysis will be limited. We may conduct subgroup analysis by organization type, depending on the variation among the final set of respondents.
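As a sketch of this descriptive analysis, again assuming a cleaned analysis file with hypothetical variable names:

    import pandas as pd

    df = pd.read_csv("infin_survey_clean.csv")  # hypothetical cleaned file

    # Frequencies, as percentages, for a categorical item.
    print(df["integration_model"].value_counts(normalize=True).mul(100).round(1))

    # A mean for a numeric item, overall and by organization type (the
    # possible subgroup breakdown noted above).
    print(df["n_fincap_workshops"].mean())
    print(df.groupby("org_type")["n_fincap_workshops"].agg(["count", "mean"]))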


When the data collection period has ended, we will code open-ended responses (that is, other-specify and open-text responses). A trained analyst will code the qualitative responses, starting with a basic coding scheme that will evolve to capture the major categories and subcategories observed in the data. Once coded, responses will be counted and analyzed descriptively. In addition, anecdotes illustrative of themes may be included in public-facing documents.
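Once the analyst has assigned codes, tallying them might look like the following minimal sketch; the file layout (one row per response, with multiple codes separated by semicolons) is a hypothetical convention.

    import pandas as pd

    # Hypothetical file of analyst-assigned codes: one row per open-ended
    # response, with codes such as "credit building; financial coaching".
    coded = pd.read_csv("coded_open_text.csv")

    # Split multi-code cells, trim whitespace, and count each category.
    counts = coded["codes"].str.split(";").explode().str.strip().value_counts()
    print(counts)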


For the semi-structured interviews and focus groups, the guides were developed to answer the key research questions. The research team will fully transcribe audio recordings of interviews and focus groups and will clean notes in cases where participants did not consent to being recorded. We will develop a coding scheme to organize the data into themes or topic areas. Transcripts will be coded (tagged based on the theme or topic to which they are relevant) and analyzed using a qualitative analysis software package such as NVivo or Dedoose.


Data Use

This study will produce a final report, a brief on employer-based interventions, a brief on participant perspectives, a brief on credit reporting by employers, and a brief on challenges in incorporating financial capability into employment and training programs. This information will help establish a basis for future research and evaluation in this area and will highlight examples of promising strategies for E&T programs interested in incorporating financial capability services into their programming, as well as for financial capability providers considering incorporating employment services directly or through partnerships.



B8. Contact Persons

The information for this study is being collected by MEF Associates and the Urban Institute on behalf of ACF. Principal Investigator Sam Elkin will oversee all data collection activities. The Federal Project Officer for this study is Lisa Zingman.



Attachments

Instrument 1. Survey of E&T Programs Integrating Financial Capability

Instrument 2. Phone Interview Protocol

Instrument 3. Virtual Site Visit Interview Protocol

Instrument 4. Participant Interview Protocol

Instrument 5. Employer Interview Protocol

Instrument 6. Program Administrator Focus Group Protocol

Appendix A. Survey Outreach and Reminder Emails

Appendix B. Phone Interview Outreach Email

Appendix C. Virtual Site Visit Outreach Email

Appendix D. Employer Interview Outreach Email

Appendix E. Program Administrator Focus Group Outreach Email

Appendix F: Institutional Review Board Approval





1 Greenwald, Howard, and Ann P. Zukoski. 2018. "Assessing Collaboration: Alternative Measures and Issues for Evaluation." American Journal of Evaluation 39(3): 322-335.


