
The SUPPORT Act Grants Evaluation

OMB No. 1290-0042

June 2022



Part B: Statistical Methods

Overview

The Chief Evaluation Office (CEO), in partnership with the Employment and Training Administration (ETA), is sponsoring an implementation evaluation of grants awarded under the Substance Use-Disorder Prevention that Promotes Opioid Recovery and Treatment for Patients and Communities (SUPPORT) Act. CEO is seeking approval from OMB under the Paperwork Reduction Act for the data collection instruments associated with the evaluation. With the goal of producing important information on innovative practices and perceived implementation challenges in providing services that integrate employment services and substance use disorder (SUD) treatment services, the U.S. Department of Labor (DOL) awarded nearly $20 million in SUPPORT Act grants to four state workforce agencies. Grantees may use these funds to provide a range of employment services for affected individuals. The grants can also be used to train and support two types of workers: workers personally affected by opioid misuse or other SUDs (including having a friend or family member with a substance use disorder), and workers who seek to transition to professions that address the opioid crisis (such as addiction and SUD treatment, mental health services, and pain management). Finally, grantees can use a portion of their funds for individual or group outpatient treatment and recovery services, in addition to using funds for employment services. DOL contracted with Abt Associates and its partner to conduct an implementation evaluation that will inform program administrators and practitioners on providing services that address both employment and treatment needs for people with SUDs. The data collection instruments described in this submission will be used to collect implementation study data.

Part B of the Supporting Statement for the SUPPORT Act Grants Evaluation addresses the issues pertaining to Collection of Information Employing Statistical Methods. This submission includes the following instruments: the Interview Guide for Data Collection Planning; the Grantee, Sub-grantee, and Partner Surveys; the Interview Guide for Grantee and Sub-grantee Site Visits; the Participant Consent Form; the In-Depth Participant Interview Guide; the Participant Interview Information Form; and the Final Reflection Interview Guide.

B.1 Respondent Universe and Sampling Methods

DOL awarded four SUPPORT Act grants to state workforce agencies. The four state grantees awarded sub-grants to 18 local workforce agencies (“sub-grantees”). Sub-grantees work with a variety of local partners across the workforce development, justice, and behavioral health care systems. This section outlines the respondent universe and sampling methods for each of the instruments involved in the implementation study.

Interview Guide for Data Collection Planning

The respondent universe for the Interview Guide for Data Collection Planning is all four grantees and the eight sub-grantees selected for implementation study visits.

Selection of sub-grantees. Eight sub-grantees were selected for implementation study visits based on information collected during sub-grantee clarification calls conducted in December 2020 and January 2021. A purposive sampling approach was used to select sub-grantees with an adequate range of conditions of interest. The pre-specified site selection criteria include key factors that might affect program implementation and that best reflect the varied local contexts of policymakers and practitioners interested in creating similar programs:

  • Population density. SUD is not specific to urban or rural areas. However, access to treatment services, training providers, and employers may be more limited for rural areas, necessitating different partnerships and service delivery approaches than in more populated areas.

  • Economic context. Sub-grantees vary in the structure of their local labor markets in terms of primary industries, qualifications needed for local jobs, and unemployment rates, among other factors, prior to COVID-19. Sub-grantees will likely also vary in the timeline and extent of their economies’ pandemic recovery. The economic context will affect sub-grantee outcomes in many ways; for example, economic insecurity can fuel SUD rates, and poor labor markets affect employment prospects for program participants. Selected sub-grantees include a range of unemployment rates at the time of site selection, and within three out of four states, the recommended sub-grantees include counties with the highest and/or lowest unemployment rates in their states.

  • Grant implementation progress. Some sub-grantees were able to implement grant activities quickly and enroll more participants relatively early in the grant, while others were delayed in forming partnerships and enrolling participants. Early implementers may have experience that facilitates implementation, such as previous experience with similar grants, knowledgeable and tenured staff, and established partnerships in the service area. Sub-grantees slower to implement may have limited experience planning similar interventions or have other reasons for delays. Sub-grantees were selected to include a mix of early implementers and those that experienced implementation delays.

  • Projected cost per participant. State-level grantees vary in their projected cost per participant, as do sub-grantees. Selected sub-grantees represent a range in projected cost per participant, from $5,000 to $10,979.

  • Proximity to the state capital. The team will begin each site visit in the state capital, meeting with grantee staff to understand the larger administrative and grant context. In order to avoid spending limited on-site time travelling, each state has at least one sub-grantee location within proximity to the state capital (e.g., no more than two hours by car).

In addition to having variation in the characteristics above, recommended sub-grantees also provide the opportunity to learn more about specific implementation features identified during the sub-grantee calls: justice system partners, training emphasis on recovery-specific occupations, targeted employer engagement and education on SUD, and enhanced approaches to supportive services and case management.

  • Justice system partners. Sub-grantees reported engaging a range of agencies within the justice system, such as courts, probation/parole, and the sheriff’s department, primarily to recruit participants. Some partnerships enable sub-grantees to make contact with participants at the point where their substance use has led or contributed to justice system contact, while others will focus on pre-release contact (with the aim of facilitating smooth reentry and sustained recovery through a handoff to the workforce system), as well as post-release coordination.

  • Training emphasis on recovery-specific careers. SUPPORT Act grants aim to both address employment needs of people affected by SUD and address the need for workers in recovery-specific occupations.

  • Targeted employer engagement and education on SUD. In addition to standard work with employers around job development and placement, sub-grantees reported plans to educate employers on hiring those with SUD and making appropriate adjustments to their employer policies and practices.

  • Enhanced approaches to case management. Sub-grantees reported that people in recovery from SUD have unique case management needs that may warrant more intensive case management than typically provided in the workforce system.

Grantee, Sub-grantee, and Partner Surveys

As part of the implementation study, an online survey will be administered to all four SUPPORT Act grantees, all 18 sub-grantees, and up to five community and employer partners per sub-grantee, for the purpose of systematically documenting program operations and the types of services provided across the sub-grantees. The Grantee Survey will be more limited in nature, reflecting the role of grantees in daily program operations; it will focus on grantees’ coordination with DOL, sub-grantees, and partners. The Sub-grantee Survey will cover background and context, program implementation and services provided, employer engagement and perceptions about SUD, perspectives on participant experiences, and relationships with partners. The Partner Survey will cover the history and nature of partners’ involvement in grant-funded services, employer engagement, and relationships with the sub-grantee and other partners. The list of primary (and secondary, if available) respondent names and emails for each grantee and sub-grantee will be developed in collaboration with DOL staff. Community and employer partners will be selected via purposive sampling; partner respondent names and emails will be collected through the Sub-grantee Survey.



Interview Guide for Grantee and Sub-grantee Site Visits

The respondent universe for the interview guides is the 4 grantees and a subset of 8 of the 18 sub-grantees, including their partners. The implementation study includes a site visit to each grantee and to the eight selected sub-grantees to document grantee and sub-grantee practices in more detail from a smaller number of programs. This will allow a better understanding of how programs were implemented; their partnerships; perceived challenges faced in implementing programs and how they were overcome; and plans for replication, scaling, and sustainability beyond the grant period. Site visitors will use a modular interview guide that will be tailored to each respondent based on their roles and responsibilities.

The evaluation team will conduct 1.5-day site visits to each of the four grantees and 3-day site visits to the eight selected sub-grantees. DOL expects response rates for the site visits to be 100 percent.

Selection of interviewees. For each of the grantees, the evaluation team will conduct semi-structured interviews with the grant director or their designee. For each of the 8 sub-grantees, the evaluation team will conduct semi-structured interviews with staff in roles funded by the SUPPORT Act grant, as well as grant partners, including employer representatives. The evaluation team will interview up to 10 sub-grantee staff and staff employed by partnering organizations and employers. Staff interviewed may be affiliated with the sub-grantee itself, probation departments, human service agencies, court systems, and/or an employer, depending on sub-grantee staffing and partners. The exact number of interview respondents in each of these categories will be unique to the sub-grantee, depending on the entities collaborating with them on the grant and the structure of their grant activities. The evaluation team estimates interviewing a total of 4 grant directors; 16 grantee staff; and 80 sub-grant staff, including directors, staff, and partners. Interviewees will be selected via purposive sampling.

The respondent universe for interviewees is not known because the evaluation team has not yet gathered detailed information on the staffing of the SUPPORT Act grant-funded programs, including staffing within each partnering entity and among employers. The evaluation team will contact each of the grantees and the 8 sub-grantees to discuss the structure of their grant (i.e., within their organization, and with partners and employers) and the staffing at each organization involved. This information will provide the respondent universe from which the interviewees will be identified. The evaluation team will use the Interview Guide for Data Collection Planning (described above) to identify appropriate respondents in each category.

Interview Guide for In-depth Participant Interviews

Up to five participants per sub-grantee, selected via purposive sampling, will participate in in-depth interviews during the site visits. In-depth participant interviews will collect detailed information on participants’ experiences and attitudes about the SUPPORT Act grant-funded programs. This guide will cover participant experience in several areas, including how the opioid crisis has affected them and/or their families, the barriers participants in recovery have faced to finding and retaining employment, the role of employment and employment services in alleviating the effects of the crisis on their lives, their interest in and goals for participating in the program, and their perspectives on how well grant-funded services have met their needs.

In-Depth Participant Interview Consent Form

For the in-depth participant interviews, all of the respondents will first be asked to consent to the interview. The consent form states that the evaluators treat the information respondents provide as private. All publications based on the interviews will report findings anonymously and the names of participant respondents will not be published in any form. Only the evaluation team will be able to identify individual responses. To protect respondents’ privacy, all data will be stored on a password protected drive established at the contractor site. Access to this drive will be limited to research staff members who are working on the project and have signed the non-disclosure agreement.

Participant Interview Information Form

Following the in-depth participant interviews, all of the respondents will be asked to complete a brief form. This instrument will capture demographic information that will allow the research team to describe in aggregate the group of respondents. To protect respondents’ privacy, names will not be recorded, and all data will be stored in a locked file cabinet at the contractor site. Access to these forms will be limited to research staff members who are working on the project and have signed the non-disclosure agreement.

Final Reflection Interview Guide

Approximately one year after the site visits, a semi-structured interview protocol will be used to guide virtual interviews with program managers for all four grantees and the eight sub-grantees selected for implementation study visits. Guides will be used to document  implementation lessons and the extent to which services and partnerships will be sustained once the grants end. 

Exhibit B1 presents the sampling methods and target response rates for each of the data collection instruments.

Exhibit B1. SUPPORT Act Grants Evaluation Respondents

| Respondent Universe | Respondent Subgroup | Sampling Methods and Target Response Rate | Data Collection Instrument | Number of Respondents in Population | Number of Respondents in Sample |
|---|---|---|---|---|---|
| Grantees and sub-grantees | Grantee directors | All four grantees will participate in the study. Grantees agreed to participate in the evaluation as a condition of receiving SUPPORT Act grant funding; therefore, the team expects a very high (100 percent) response rate. | Data Collection Planning Interview Protocol | All 4 grantees | All 4 grantees |
| Grantees and sub-grantees | Sub-grantee directors | Eight sub-grantees have been purposively selected for the site visits. The team expects a very high (100 percent) response rate among sub-grant directors and staff. | Data Collection Planning Interview Protocol | 18 sub-grantees | 8 sub-grantees |
| Grantees, sub-grantees, and partners | Grantees | Grant directors or their designees from all four grantees will participate in the site visits. Grantees agreed to participate in the evaluation as a condition of receiving SUPPORT Act grant funding; therefore, the team expects a very high (100 percent) response rate. | Grantee, Sub-grantee, and Partner Interview Guide | All 4 grantees | All 4 grantees |
| Grantees, sub-grantees, and partners | Sub-grantees | Sub-grant directors and staff from eight purposively selected sub-grantees will participate in site visits; approximately 5 staff from each sub-grantee will participate in interviews. Evaluation team members will review the topics of interest with sub-grantees to identify the respondent(s) most knowledgeable about those topics. The team expects a very high (100 percent) response rate among sub-grantee directors and staff. | Grantee, Sub-grantee, and Partner Interview Guide | 90+ staff members (5+ per sub-grantee across 18 sub-grantees) | Approximately 40 staff members (5 per 8 sub-grantees) |
| Grantees, sub-grantees, and partners | Partners | Up to 15 community and employer partner representatives from each of the eight purposively selected sub-grantees will participate in site visits. Evaluation team members will review the topics of interest with sub-grantees to identify the respondent(s) most knowledgeable about those topics. The team expects a very high (100 percent) response rate among sub-grantee partners, if involved in program operations. | Grantee, Sub-grantee, and Partner Interview Guide | 270+ community and employer partner representatives (in 18 sub-grantees) | 120 community and employer partner representatives (in 8 selected sub-grantees) |
| Program participants | Participants | Up to five participants from each of the eight purposively selected sub-grantees will participate in in-depth interviews. The team expects a high (90 percent) response rate among program participants. | In-depth Participant Interview Guide | 90+ participants (5+ for each sub-grantee) | At least 36 participants (up to 5 participants per 8 sub-grantees at 90%) |
| Program participants | Participants | Up to five participants from each of the eight purposively selected sub-grantees who participate in in-depth interviews will complete a consent form. The team expects a high (90 percent) response rate among program participants. | Participant Consent Form | 90+ participants (5+ for each sub-grantee) | At least 36 participants (up to 5 participants per 8 sub-grantees at 90%) |
| Program participants | Participants | Up to five participants from each of the eight purposively selected sub-grantees who participate in in-depth interviews will complete an interview information form. The team expects a high (90 percent) response rate among program participants. | Participant Interview Information Form | 90+ participants (5+ for each sub-grantee) | At least 36 participants (up to 5 participants per 8 sub-grantees at 90%) |
| Grantees and sub-grantees | Grant directors | Grant directors or their designees from all four grantees will participate in the final reflection interviews. Grantees agreed to participate in the evaluation as a condition of receiving SUPPORT Act grant funding; therefore, the team expects a very high (100 percent) response rate. | Final Reflection Interview Guide | All 4 grantees | All 4 grantees |
| Grantees and sub-grantees | Sub-grant directors | Sub-grant directors or their designees from the eight purposively selected sub-grantees will participate in the final reflection interviews. The team expects a very high (100 percent) response rate among sub-grant directors and staff. | Final Reflection Interview Guide | 18 sub-grantees | 8 selected sub-grantees |
| Grantees | Grant directors or other staff identified by the grantee as the appropriate respondent | All 4 grantees will be asked to respond to the web-based survey. Grantees agreed to participate in the evaluation as a condition of receiving SUPPORT Act grant funding; therefore, the team expects a very high (100 percent) response rate. | Web-based survey | All 4 grantees | All 4 grantees |
| Sub-grantees | Sub-grant directors or other staff identified by the grantee as the appropriate respondent | All 18 sub-grantees will be asked to respond to the web-based survey. The team expects a high (90 percent) response rate among sub-grant directors and staff. | Web-based survey | 18 sub-grantees | At least 16 sub-grantees (aim for 18) |
| Partners | Partners identified by sub-grantees as appropriate respondents | All sub-grantees will be asked to identify five partners actively involved in program operations. The team expects a high (80 percent) response rate among sub-grantee partners, if involved in program operations. | Web-based survey | 90 partners (5 for each of the 18 sub-grantees) | 72 partners (aim for 90) |

Though this will be the first set of data collection activities for the SUPPORT Act Grants Evaluation, the evaluation team has extensive experience administering similar data collection instruments and has achieved high response rates for them (e.g., Using TANF Funds to Provide Housing Assistance During the COVID-19 Pandemic1 and the Evaluation of the TAACCCT Round 4 Grant Program2). Also, the Department of Labor Funding Opportunity Announcement (FOA-ETA-20-01) for the SUPPORT Act Grants established that the department is committed to documenting activities across grantees and that all grantees must fully participate in any evaluation initiated by DOL as a condition of the grant award.

B.2 Procedures for Collection of Information

This section describes the data collection and analysis procedures for the SUPPORT Act Grants Evaluation’s implementation study activities. The study includes a round of data collection planning interviews with the four grantees and eight purposively selected sub-grantees. Grantees and the eight selected sub-grantees will then participate in site visits to collect in-depth qualitative information on sub-grantee operations. The evaluation team will hold semi-structured interviews with a variety of stakeholders involved in the SUPPORT Act grants, allowing the team to document grant planning and implementation, changes to operations over time, participant outcomes, and post-grant sustainability. The implementation study also relies on the Grantee, Sub-grantee, and Partner Surveys, designed to collect consistent information from the four grantees, all 18 sub-grantees, and up to five partners for each sub-grantee. The surveys will be administered using a web-based system. No statistical methods will be used for stratification or sample selection for the implementation study.

Procedures with special subpopulations: With the SUPPORT Act grants, DOL is interested in learning more about innovative practices and perceived implementation challenges in providing services that integrate employment and SUD treatment services. The proposed data collection efforts will collect data from SUPPORT Act grant participants who may be receiving treatment or recovery supports for SUDs. The participant consent process for the in-depth interviews will include assurances that the interviews are voluntary, that participants can decline to answer questions (including questions about personal health information), and that they can end the interview at any time. The evaluation team will consult with Abt Associates’ Institutional Review Board to ensure that the consent process and interview questions comply with participants’ rights to health data privacy under the Health Insurance Portability and Accountability Act. All other data collection will be from grantees, sub-grantees, and their partners, which do not constitute a special population.

Interview Guide for Grantee and Sub-grantee Site Visits

The data will be collected through semi-structured interviews held by telephone and on-site with the eight sub-grantees selected for the implementation study. As discussed in section B.1, all four grantees are included, and the eight sub-grantees for the implementation study were purposively selected. No statistical methods will be used to select grantees, sub-grantees, or partners. The sample is intended to be neither random nor representative.

The site visits are designed to provide in-depth qualitative information about grantees; no estimation procedures will be used. The data analysis will be descriptive. No statistical techniques will be used to ensure accuracy. The evaluation team does not foresee any unusual problems that would require specialized sampling procedures.

In-depth Participant Interviews

While scheduling sub-grantee site visits, the evaluation team will work closely with sub-grantee staff to identify and recruit a diverse group of participants. We will ask program staff to identify five participants who present a range of characteristics (race and ethnicity, gender, age), stage of participation in the program (recently enrolled, already in training or work experience, completed training or work experience), types of services accessed (training, employment, supportive services) and type of targeted worker (personally affected by opioid misuse or other SUDs, workers seeking to transition to professions that address the opioid crisis). If a participant declines to participate, we will select another participant with similar experiences, to ensure a sample size of five participants per sub-grantee.

The evaluation team will analyze transcribed interviews with participants to gain a thorough understanding of the participant experience. We will code de-identified transcripts of in-depth participant interviews using qualitative data analysis software and analyze coded transcripts using a grounded theory approach (Glaser & Strauss, 1967; Strauss & Corbin, 1994) in which we will use patterns in the data to generate key themes on the participant experience. Descriptors related to interviewee characteristics will also be assigned to each interview to explore differences and similarities in experiences across characteristics, such as duration in the program, gender, or prior work/training experience.
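
To illustrate the descriptor-based comparison step, the following is a minimal sketch, not the evaluation team’s actual code. It assumes coded excerpts have been exported from the qualitative data analysis software to a CSV file; the file name and column names (interview_id, code, program_stage) are hypothetical.

```python
# Minimal sketch: tabulating qualitative codes by interviewee descriptors.
# Assumes coded excerpts were exported from the qualitative analysis software
# to a CSV with hypothetical columns: interview_id, code, program_stage.
import pandas as pd

excerpts = pd.read_csv("coded_participant_excerpts.csv")  # hypothetical export

# Count how often each code was applied, overall and by a descriptor of
# interest, to surface candidate themes and compare subgroup experiences.
code_counts = excerpts["code"].value_counts()
code_by_stage = pd.crosstab(excerpts["code"], excerpts["program_stage"])

print(code_counts.head(10))   # most frequently applied codes
print(code_by_stage)          # code frequency by stage of participation
```

A similar tabulation could be repeated for other descriptors (e.g., gender or prior work/training experience) to explore differences and similarities in themes across interviewee characteristics.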

Grantee, Sub-grantee, and Partner Surveys

The evaluation team will administer the survey to the universe of four SUPPORT Act grantees and 18 sub-grantees. Thus, no statistical methods will be used to select grantees or sub-grantees. Sub-grantees will identify up to five partners who are actively involved in their programs. The surveys will be used to develop an inventory of grantee goals, activities, project context, and future project plans. Estimation procedures will be, for the most part, very simple. Much of the data collection will be on a census basis, removing the need for survey weights. There will be one survey per grantee and sub-grantee. Partners will be purposively selected by sub-grantees based on their level of engagement in the program and will not be representative of all partner organizations. There will also be one survey per nominated partner. The evaluation team does not foresee any unusual problems that would require specialized sampling procedures.

For the sub-grantee and partner surveys, we will summarize and compare patterns in sub-grantee and partner approaches and services. For the grantee, sub-grantee, and partner surveys, we will also conduct a social network analysis to document the following concepts:

  • Network strength, driven by the strength of individual relationships within the network, as defined by interaction frequency and purpose (for example, whether organizations were in meetings together vs. whether they collaborated on projects).

  • Centralization, or the distribution of power within the network, derived from the prominence (or centrality) of individual organizations within a network and whether well-connected organizations are connected to other well-connected organizations.

  • Density, as defined by the proportion of organizations in the network that work on a set level of partnership.

  • Comprehensiveness, as defined by partnerships across different types of work.

We will develop visualizations called sociograms, which show the “nodes” (organizations) and “edges” (ties weighted by the frequency and strength of relationships) for each sub-grantee, by state and by type of organization (e.g., workforce system, justice system, treatment provider), to facilitate explanation of these concepts. Understanding these aspects of the sub-grantee and partner networks will help explain how social network features influenced implementation of planned activities under the grants.
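
As a concrete illustration of these measures, the sketch below is a simplified example, not the evaluation’s actual code. It computes density and Freeman degree centralization for a small, hypothetical sub-grantee network and draws a basic sociogram; the organization names and tie weights are invented for illustration.

```python
# Illustrative sketch: density, degree centralization, and a simple sociogram
# for one hypothetical sub-grantee partner network.
import networkx as nx
import matplotlib.pyplot as plt

G = nx.Graph()
G.add_edge("Workforce board", "Treatment provider", weight=3)    # project collaboration
G.add_edge("Workforce board", "Probation department", weight=2)  # joint meetings
G.add_edge("Workforce board", "Employer A", weight=1)            # referrals only
G.add_edge("Treatment provider", "Probation department", weight=2)

# Density: share of possible ties among organizations that actually exist.
density = nx.density(G)

# Freeman degree centralization: how concentrated ties are around the most
# connected organization (0 = evenly spread, 1 = perfect star). Assumes n > 2.
cent = nx.degree_centrality(G)
n = G.number_of_nodes()
centralization = sum(max(cent.values()) - c for c in cent.values()) / (n - 2)

print(f"density={density:.2f}, degree centralization={centralization:.2f}")

# Sociogram: nodes are organizations; edge widths reflect tie strength.
pos = nx.spring_layout(G, seed=1)
widths = [G[u][v]["weight"] for u, v in G.edges()]
nx.draw_networkx(G, pos, width=widths, node_color="lightsteelblue")
plt.axis("off")
plt.show()
```

In practice, one such network would be built per sub-grantee from the survey’s partnership items, with tie weights derived from reported interaction frequency and purpose.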

B.3 Methods to Maximize Response Rates and Deal with Non-response

This section describes the methods to maximize response rates for the SUPPORT Act Grants Evaluation. The data collection efforts depend heavily on gaining the cooperation, buy-in, and collaboration of grantees, sub-grantees, their partners, and participants. The evaluation team believes that grantees, sub-grantees, and their partners are interested in supporting DOL efforts to expand similar services and thus are willing to participate in an evaluation to build the evidence base around this workforce model. Grantees are aware of the federal evaluation and have agreed to participate as a condition of their DOL grant. The evaluation team is committed to providing the support and guidance needed to ensure minimal burden and high response rates. The evaluators expect high response rates for each of the data collection efforts described in this package, ranging from an 80 percent response rate for the partner survey to a 100 percent response rate for the interviews. Plans are in place to reach out to participants to obtain any missing data (see details below).

Interview Guide for Grantee and Sub-grantee Site Visits

For the site visits, it is expected that all of the grantee organizations approached will agree to participate.3 Once selected sites have been confirmed, site visitors will work closely with the primary DOL contact for each grantee to schedule the site visit. One member of the two-person site visit team will take responsibility for working with the primary contact person to handle scheduling and logistics, such as identifying appropriate interview respondents. Dates for site visits will be set at least one month in advance to allow ample time to schedule interviews. Interview appointments will then be confirmed via e-mail the week prior to the visit. Should a potential respondent not be available during the visit, the study team will follow up to schedule a time to interview the person by phone.

In-depth Participant Interviews


Sub-grantee staff will identify individuals who are engaged in program services and willing to participate in interviews. The research team will ask staff to remind participants of their interview appointments and identify replacements if individuals choose not to participate. The evaluation team proposes to offer an incentive valued at $20 to participants who complete in-depth interviews. The incentive is a way to thank participants for their time. Participants will receive the incentive in the form of a gift card.

Grantee, Sub-grantee, and Partner Surveys

To achieve the targeted response rates on the Grantee, Sub-grantee, and Partner Surveys, the evaluation team will take the steps outlined below:

  • DOL will send advance letters or emails to all grantee and sub-grantee directors two weeks before the survey. The communication will specify the date on which the survey is scheduled to be sent, the format in which it will be available (web-based), the time expected to complete the survey, and the survey’s originator.

  • The evaluation team will ask sub-grantee directors to inform partners they have been nominated to participate in the survey.

  • On the scheduled date, the evaluator will email all primary grantee, sub-grantee, and partner contacts the link to the online survey and instructions for completing it. Respondents will be provided with a contact should they encounter any problems or questions as they complete the survey.

  • The evaluation team will track who has started the survey, monitor their progress, and follow up with any respondents who have not started or completed the survey. Follow-up with respondents will occur through periodic email reminders and telephone calls.

  • As each survey is reviewed, follow-up emails and telephone calls will be made to those respondents whose surveys contain errors, unclear responses, or missing information in answers provided.

  • The evaluation team will remind grantees and sub-grantees in survey-related communications that participation in evaluation activities is a requirement of their grant.

  • The evaluation team will ask sub-grantee directors to remind partners that the survey is an important part of the evaluation.

The evaluation team will develop the surveys with a skip logic feature, so that a question must be answered before moving to the next appropriate question for that respondent. This will ensure that all submitted surveys are complete and that there is no possibility of item nonresponse. As described above, the team will make numerous efforts to achieve a unit response rate of 90 percent or above. Since the unit nonresponse rate is expected to be very low, the plan is to tabulate complete cases in the event of any missingness.
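
The sketch below illustrates what this complete-case approach amounts to in practice. It is a hypothetical example, not the evaluation team’s actual tracking system; the file name and column names (submitted, respondent_type) are assumptions.

```python
# Minimal sketch of a unit response-rate check and complete-case tabulation.
# Each row represents one fielded survey; "submitted" is 1 if it was
# completed and submitted, 0 otherwise (hypothetical file and columns).
import pandas as pd

surveys = pd.read_csv("survey_fielding_status.csv")

fielded = len(surveys)
completed = surveys["submitted"].sum()
print(f"Unit response rate: {completed / fielded:.0%} (target: 90% or above)")

# Skip/forced-response logic prevents item nonresponse, so analysis simply
# tabulates complete cases: drop any surveys that were never submitted.
complete_cases = surveys[surveys["submitted"] == 1]
print(complete_cases.groupby("respondent_type").size())
```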

B.4 Tests of Procedures

This section describes any tests of the data collection instruments included in this submission.

Interview Guides

Experienced implementation researchers developed the implementation study interview guides, including the Interview Guide for Data Collection Planning and Final Reflection Interview Guide, deriving the types of questions from protocols used for other federal evaluations of workforce programs. The estimated completion time for the implementation study interviews is based on extensive experiences with previous site visits to DOL grantees for other projects.

In-depth Participant Interviews

Experienced implementation researchers developed the In-depth Participant Interview Guide, Participant Consent Form, and Participant Interview Information Form, deriving the types of questions from those used for other federal evaluations of workforce programs. The estimated completion time for the in-depth participant interviews is based on extensive experience with similar populations in other studies.

Grantee, Sub-grantee, and Partner Surveys

The Grantee, Sub-grantee, and Partner Surveys were developed and reviewed by DOL staff and evaluation team members. The evaluation team pre-tested the surveys with one SUPPORT Act grantee, one sub-grantee, and one partner representative. Pre-test respondents provided feedback on the experience of completing the surveys, both in written comments and in telephone conversations with an evaluation team member. Respondents commented on their perceptions of the clarity and flow of survey items, ease of completion, and time requirements. After pretesting, the evaluation team revised the instruments based on the feedback. Respondents that completed the surveys during the pre-test will be given their completed surveys to review and update when the full survey is fielded, reducing burden while ensuring all responses are accurate and up to date.


B.5 Individuals Consulted on Statistical Aspects of the Design

With DOL oversight, Abt Associates and its partner MDRC are responsible for conducting the SUPPORT Act Grants Evaluation. The individuals listed in Exhibit B2 below contributed to the design of the implementation study activities. Both the conduct and analysis of data for the implementation study will be under the direction of Robin Koralek (Abt Associates) and Kyla Wasserman (MDRC). Hannah Betesh, the project director for the evaluation, will have oversight of all data collection efforts.

Exhibit B2: Individuals Consulted on Data Collection

| Name | Telephone Number | Role in Study |
|---|---|---|
| Hannah Betesh | (301) 347-5990 | Project Director |
| Karin Martinson | (301) 347-5726 | Principal Investigator |
| Karen Gardiner | (301) 347-5116 | Project Quality Advisor |
| Robin Koralek | (301) 347-5613 | Implementation Study Co-lead |
| Kyla Wasserman | (212) 340-8656 | Implementation Study Co-lead |



Inquiries regarding the statistical aspects of the study’s planned analysis should be directed to:

Hannah Betesh

Project Director, Abt Associates

301-347-5990

Kuang-Chi Chang

Contracting Officer’s Representative, Chief Evaluation Office

202-693-5992




1 Dunton, Lauren, Cara Sierks, Nishi Kumar, and Asaph Glosser. (2021). Supporting Families Experiencing Homelessness: Strategies and Approaches for TANF Agencies. OPRE Report No. 2022-02. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

2 Judkins, David, Karen Gardiner, Adrienne Smith, and Douglas Walton. (2020). Trade Adjustment Assistance Community College and Career Training: Round 4 Early Outcomes Study Report. Report prepared for the U.S. Department of Labor, Chief Evaluation Office. Rockville, MD; and Washington, DC: Authors.

3 The expected response rate by the grantees is 100 percent. Participation in evaluation activities is required as a condition of the grant award.


