Evaluation of the Homeless Veterans’ Reintegration Program (HVRP)

OMB: 1290-0032


EVALUATION OF THE HOMELESS VETERANS’ REINTEGRATION PROGRAM (HVRP)

OMB No. 1290-0NEW

May 2020

PART B: DATA COLLECTION ACTIVITIES

The Chief Evaluation Office of the U.S. Department of Labor (DOL) has commissioned an evaluation of the Homeless Veterans’ Reintegration Program (HVRP), a competitive grant program administered by DOL’s Veterans’ Employment and Training Service (VETS). HVRP helps veterans experiencing homelessness find and hold meaningful employment. It does so by providing employment services and by developing partnerships with other service providers to help address the complex circumstances of homeless veterans. The HVRP evaluation offers an opportunity to build knowledge about the implementation and effectiveness of these grants.

This package requests clearance for four data collection instruments as part of the implementation evaluation:

  1. Grantee Survey

  2. Key Informant Interview Guide

  3. HVRP Veteran Interview Guide

  4. Non-HVRP Veteran Interview Guide

B.1. Respondent universe and samples

Descriptions of the respondent universe and sampling for each instrument follow and are summarized in Table B.1.

Grantee survey. The study team will not sample respondents for the grantee survey. The survey will be administered to all grantees awarded in Program Year 2020.

Key informant interview guide. From among the current set of grantees, the study team will identify eight grantees to participate in site visits. We will select grantees for site visits to reflect a diverse set of grantees and to capture grantees represented in the impact study. The impact study will identify HVRP participants using the Workforce Integrated Performance System (WIPS) administrative data set. Once we identify the HVRP participants, we will use the administrative data to identify the HVRP grantees from which they received services. For those grantees that still operate a grant, we will collect basic information, such as population served, urbanicity, and region, to select a diverse set of grantees.

Within each grantee selected for site visits, we will identify a set of interview respondents. Working with the HVRP grantee director, we will identify grantee staff involved in the program and the partner managers and staff working with the HVRP grantee. We also will discuss interviewing other community stakeholders that may contribute understanding of the service environment for homeless veterans. Respondents could include the Veteran Affairs (VA) homeless coordinator, the local Disabled Veterans’ Outreach Program (DVOP) specialist at the local American Job Center (AJC), and a Continuum of Care (CoC) director.

HVRP veteran interview guide. The study team will conduct in-depth interviews with eight current and former HVRP participants at each site visit grantee. For each grantee, we will ask the HVRP program director to identify veterans who represent different experiences, such as when they separated from the military, whether they are single, and the barriers they have faced (for example, prior incarceration). We expect to sample 10 individuals from each grantee with the expectation that eight will participate in the interviews.

Non-HVRP veteran interview guide. In each of the eight communities, the study team will conduct interviews with eight non-HVRP veterans who receive services through the AJC. We will work with program staff at the AJC, most likely the DVOP, to identify veterans for the interviews. This will be a purposive sample to learn about why veterans sought services at the AJC and not through HVRP. We expect to sample 10 individuals from each community with the expectation that eight individuals will participate in the interviews.

In our experience conducting in-depth interviews for other projects, such as the Parent and Child Together project conducted for the U.S. Department of Health and Human Services and the Evaluation of Demonstration Projects to End Childhood Hunger (EDECH) for the U.S. Department of Agriculture, not all individuals recruited and scheduled for interviews show up for their appointments. For example, in 2017, the EDECH project completed 61 percent of scheduled interviews. We expect a higher completion rate because we will work closely with program staff to identify and schedule participants for interviews. Nonetheless, we plan to schedule two additional HVRP veteran and non-HVRP veteran interviews per site to account for no-shows.




Table B.1. Sampling and response rate assumptions, by data collection activity and respondent type

| Data collection activity | Sampling method | Respondent universe | Respondent universe size | Sample | Estimated response rate | Estimated responses |
|---|---|---|---|---|---|---|
| Grantee survey | Census | All PY 2020 grantees | 149 | 149 | 90% | 134 |
| Key informant interview guide | Purposive | Informants in all PY 2020 grantees | 2,384* | 128 | 100% | 128 |
| HVRP veteran interview guide | Purposive | All HVRP veteran participants | 18,000* | 80 | 80% | 64 |
| Non-HVRP veteran interview guide | Purposive | All non-HVRP veterans receiving services through the AJC | 17,800* | 80 | 80% | 64 |

*Estimated.
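The estimated-responses column follows arithmetically from the sample sizes and response-rate assumptions above. A minimal sketch (the function and variable names are illustrative, not part of the study materials):

```python
# Expected completed responses implied by Table B.1's assumptions.
# Sample sizes and rates are taken from the table; rounding is illustrative.

def expected_responses(sample_size: int, response_rate: float) -> int:
    """Round the expected number of completes to the nearest whole response."""
    return round(sample_size * response_rate)

assumptions = {
    "Grantee survey":                   (149, 0.90),  # census of PY 2020 grantees
    "Key informant interview guide":    (128, 1.00),  # 8 sites x 16 informants
    "HVRP veteran interview guide":     (80, 0.80),   # 8 sites x 10 recruited
    "Non-HVRP veteran interview guide": (80, 0.80),   # 8 sites x 10 recruited
}

for activity, (n, rate) in assumptions.items():
    print(f"{activity}: {expected_responses(n, rate)} expected responses")
```

For example, 149 grantees at a 90 percent response rate yields the 134 expected grantee survey responses shown in the table.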


B.2. Statistical methods for sample selection and degree of accuracy needed

We do not anticipate using statistical methods for sample selection. The survey will be a census of all PY 2020 grantees. For the site visit data collection activities, we will attempt to interview all of the key grantee, partner, and community informants, but we will use purposive sampling if the set of possible respondents is larger than 16 in any one site. In such instances, we will work with the HVRP grantee director to determine the respondents best able to inform the study about the grant activities, veteran experiences, and the community services available to homeless veterans.

We expect to conduct an average of 16 key respondent interviews per grantee/community. This estimate is based on a review of HVRP applications, which indicated an average of five staff members per HVRP program (such as a grantee administrator, a program director, two case managers, and one job developer). We also have reviewed other materials and determined that it is reasonable to expect to speak with 11 partners or community stakeholders, on average, including VA services (two respondents); an AJC manager; a DVOP specialist; an LVER; a CoC representative; employers (two); and partners (three), such as a community college and community-based organizations.

For the HVRP veteran and non-HVRP veteran interviews, the study team will recruit veterans to participate in the interviews. For HVRP veteran interviews, we expect to complete interviews with eight veterans in each site, or 64 HVRP veterans total and, for the non-HVRP veteran interviews, we expect to complete interviews with eight veterans in each site, or 64 non-HVRP veterans total.

The number of in-depth interviews conducted for qualitative research—absent constraints like a limited sample or resources—is generally determined by whether the research has reached the point of saturation, or the point where few new questions or themes arise from continued data collection (Remler and Van Ryzin 2014). Given the literature and our own experience with qualitative research methods, we believe that this number of interviews will reach the point of saturation.

Literature on qualitative research generally does not set a firm standard for the number of interviews necessary to achieve saturation, and the number tends to differ across research projects. For example, Guest, Bunce, and Johnson (2006) found that as few as six interviews were enough to reach the point of saturation in a relatively heterogeneous sample; 34 of their 36 codes were developed after the initial six interviews, and 35 of the 36 were identified after 12 interviews.1 In contrast, Griffin and Hauser (1993) analyzed data from in-depth interviews and focus groups and hypothesized that about 90 percent of the codes and themes would be observable after between 20 and 30 in-depth interviews. Additionally, Creswell (1998, p. 64) recommends between 20 and 30 interviews to reach the point of saturation, and Morse (1994, p. 225) suggests that between 30 and 50 interviews are needed (as cited in Guest et al. 2006). Our sample of 64 interviews per group of interviewees will allow us to reach the point of saturation while accommodating the potential for heterogeneous experiences among the HVRP and non-HVRP veterans participating in the interviews.


Recruitment will begin by working with HVRP grantees to generate a list of veterans who have been enrolled in the program for at least six months or who have exited the program in the past six months. We will request that this list include a diverse set of veterans (women, people experiencing chronic homelessness, and people returning to the community from prison) and their contact information. The team will select diverse participants from this list to recruit and schedule for interviews. For the non-HVRP veteran interviews, we will ask staff at the AJC, most likely the DVOP specialist, for assistance in identifying and recruiting ten veterans for interviews. We expect to complete interviews with eight veterans in each site, or 64 non-HVRP veterans total. If interviews need to be conducted via telephone or video conference, we will work with the HVRP grantee and the AJC/DVOP to ensure that no veteran is excluded because they lack personal access to a reliable telephone or internet connection or a device that can be used to conduct the interview.

Conducting interviews with eight HVRP veterans and eight non-HVRP veterans per site will provide the study team with important information. These interviews are not meant to produce generalizable or representative information. However, they will provide context for the impact study findings regarding veterans’ different pathways to the HVRP program or to the AJC and their different experiences receiving services through HVRP and AJCs.

A plan for the assessment and correction of survey nonresponse bias will not be necessary for the analysis of grantee survey data. It is anticipated that the response rate for the grantee survey will be 90 percent.

All data collection activities will take place one time only.

Our analysis plans for each component are described below:

  • For the grantee survey: We will summarize the quantitative data using basic descriptive statistics. Our analysis will follow standard steps: data cleaning, variable construction, and computation of descriptive statistics. Despite best efforts to encourage full response to the survey instrument, some items will likely be left missing or incomplete. During data cleaning, the study team will look for unusual patterns of item nonresponse. If item nonresponse is less than 10 percent, the study report will simply indicate the proportion missing. If it is greater than 10 percent, the study team will examine the types of respondents that did not respond and determine whether the data item suffers from nonresponse bias. Some items of less significance could be dropped from the analysis. Others could be presented in reports, but the study report will provide clear information on the nonresponse issue and describe any cautions readers should take in interpreting the results.

To facilitate analysis, we will create variables to address the implementation constructs of interest and then, to prepare the data for analysis, we will run a series of data checks, examine frequencies and means, and assess the extent of missing data. We will use these data to identify key ingredients of the HVRP model, to create a typology of service approaches, and to classify HVRP grantees using this framework.

  • For the key informant interviews: We will analyze the data to develop common themes from the research questions as well as site profiles. These common themes for site visit interviews will be organized by the interview guide framework: (1) target population and enrollment process; (2) key components of the HVRP program model; (3) HVRP partners; (4) implementation challenges and facilitators. The common themes for the comparison area interviews will focus on: (1) available services for homeless veterans, (2) community partnerships, and (3) the system for providing services.

  • For the qualitative data from the HVRP and non-HVRP veteran interviews: We will use NVivo, a qualitative data analysis software, to create a uniform coding scheme aligned with the interview guide. We will code this data and identify common themes that emerge from the interviews.
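The 10 percent item-nonresponse screen described for the grantee survey (first bullet above) can be sketched as follows. This is an illustrative sketch of the rule, not study code; the function name, threshold constant, and example data are hypothetical:

```python
# Sketch of the item-nonresponse screen: items missing for more than
# 10 percent of respondents are flagged for a nonresponse-bias review;
# others are simply reported with their missing rate.

MISSING_THRESHOLD = 0.10  # 10 percent cutoff from the analysis plan

def screen_item(responses: list) -> dict:
    """Summarize item nonresponse; None marks a skipped answer."""
    n = len(responses)
    missing = sum(1 for r in responses if r is None)
    rate = missing / n if n else 0.0
    return {
        "missing_rate": rate,
        "needs_bias_review": rate > MISSING_THRESHOLD,
    }

# Hypothetical item: 2 of 10 respondents skipped it (20% > 10% threshold),
# so it is flagged for a nonresponse-bias review.
item = [3, 5, None, 4, 2, None, 5, 4, 3, 5]
print(screen_item(item))
```

Items that fall at or below the threshold would simply be reported with their missing proportion, as the bullet describes.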



B.3. Methods to maximize response rates and deal with nonresponse

1. Grantee survey

Response rates. Participation in the survey is required of grantees. To further encourage participation, the study team plans to hold a webinar with HVRP grantees to inform them of the survey and its importance for helping DOL improve the program. The study team will send the link to the web survey via email to the director at each grantee and provide a way for the director to identify the appropriate respondent for the identified HVRP grant. We will re-send the web survey link to the appropriate respondent as needed and monitor progress throughout the fielding period. Reminders to complete the survey will be sent weekly to directors, or their designees, who have not yet fully completed the survey instrument, until the desired response rate is met. As needed, we also will call directors, or their designees, who have been unresponsive.

Through the webinar, support from DOL, and development of a succinct survey instrument, we expect to achieve a response rate of 90 percent. This rate is comparable to that achieved on other establishment surveys where participation is required. For example, a grantee survey for the Youth Career Connect project (OMB Control No. 1291-0003, discontinued 5/31/18) reached a 100 percent response rate, albeit for a smaller number of grantees. On the School Nutrition and Meal Cost Study (OMB Control No. 0584-0596, discontinued 3/23/2017), response rates ranged from 87 to 96 percent on three different web-based establishment surveys.

Data reliability. The use of the web mode allows for sophisticated skip logic and fills within the instrument, so that questions and response options shown are logical based on the respondent’s previous answers. This will contribute to more complete responses and better data quality, as respondents will only see the items that apply to their HVRP program.

2. Other instruments

Our plan to select and visit eight grantees and their communities will be sufficient for our intended uses. While the grantee survey will include all grantees, the site visits will explore more deeply the structure, processes, and partnerships of a subset of grantees. We will select from the grantees represented in the impact study so that they are diverse on certain characteristics, such as location and grantee organization type, to ensure a diversity of experiences.

We will use several well-proven strategies to ensure the reliability of site visit data. First, the small site visit team, whose members are all experienced site visitors, will be thoroughly trained in the issues of importance to this particular study, including how to probe for additional details to help interpret responses to interview questions. Second, this training and the use of the protocols will ensure that data collection is standardized across sites. When appropriate, the protocols will use standardized checklists to further ensure that information is collected systematically. Finally, we will assure all interview respondents of the privacy of their responses. These same staff will also conduct the key informant interviews in comparison areas using the same strategies.

Additional details follow describing strategies to maximize response to the specific instruments:

  • Key informant interview guide. Although the study team will try to arrange interviews that accommodate respondents’ scheduling needs, there might be instances when a respondent is unable to meet while the team is on site; when this happens, a member of the study team will request to meet with the respondent’s designee or schedule a follow-up call at a more convenient time. With these approaches, the study team anticipates a 100 percent response rate for staff interviews, as has been achieved on similar qualitative data collection efforts, such as those for the Workforce Investment Act Adult and Dislocated Worker Programs Gold Standard Evaluation, the Evaluation of the Linking to Employment Activities Pre-Release grants, the Evaluation of the Summer Youth Employment Initiative, and the Impact Evaluation of the Trade Adjustment Assistance Program.

  • HVRP veteran interview guide. For each site visit grantee, we will work with the HVRP grantee director to identify and recruit current and former HVRP participants. We will attempt to recruit 80 veterans (that is, ten at each site) in order to conduct in-depth interviews with 64 current and former participants, expecting an 80 percent response rate. To maximize response rates, we will schedule each interview at a time and place convenient for the interviewee. If these interviews need to take place virtually, the research team will use the same method of communication (telephone or video conference, and the same platform) that the HVRP provider uses to deliver virtual case work and services; this will increase the veterans’ fluency and comfort with the medium. The interviewer will follow up shortly before the interview to remind the participant of the upcoming appointment. We will also provide a $50 gift card incentive for participating in the 90-minute interview, which will help offset costs such as transportation. If the interview is conducted virtually, the incentive, as well as a copy of the consent form, will be delivered to the provider in advance so it can be given to the veteran at the time of the interview.

  • Non-HVRP veteran interview guide. For each site visit grantee, we will work with the DVOP specialist or an AJC program manager for assistance in identifying and recruiting non-HVRP participants. We will attempt to recruit 80 veterans (that is, ten at each site) in order to conduct interviews with 64 veterans, expecting an 80 percent response rate. To maximize response rates, we plan to schedule interviews so that they coincide with the veterans’ appointment times at the AJC. If these interviews need to take place virtually, the research team will use the same method of communication (telephone or video conference, and the same platform) that the AJC/DVOP uses to deliver virtual case work and services; this will increase the veterans’ fluency and comfort with the medium. We will also provide a $25 gift card incentive for participating in the 45-minute interview, which will help offset costs such as transportation. If the interview is conducted virtually, the incentive, as well as a copy of the consent form, will be delivered to the provider in advance so it can be given to the veteran at the time of the interview.



B.4. Tests of procedures or methods

We pretested the grantee survey with six grantees that had a grant that is no longer funded (that is, ended June 30, 2018). This included one grantee that held multiple grants during the previous program year; the others each held one grant. Each grantee representative completed the instrument on hard copy and participated in a debriefing call to determine whether any words or questions were difficult to understand or answer. Changes to the instrument resulting from the pretest feedback have been incorporated into the grantee survey. The pretest also revealed that the individuals managing the HVRP grant were best able to respond to the questions in the survey, and that this person was often not the director at the grantee organization. Therefore, as described in Section B.3, we will provide grantee directors with the information covered in the instrument so that they can designate the person most qualified to respond for the HVRP program. For grantees operating more than one grant, there will likely be different respondents for each grantee survey. Therefore, we will send separate grantee survey invitations for each grant, to allow for different respondents to complete the instrument for each grant.

If site visit interviews need to be converted to telephone or video conference formats, we will make every attempt to use the same platform that the grantee uses (such as Skype or Zoom) to increase the interview respondent’s fluency with the medium. We will test the platform as part of site visit preparation to ensure successful access prior to the first virtual site visit interview.



B.5. Contact information and confidentiality

The following individual was consulted on statistical methods.

Mathematica

Dr. Peter Schochet (609) 936-2783

The following individuals will be primarily responsible for collecting and analyzing the data for the agency:

Mathematica

Ms. Linda Rosenberg (609) 936-2762

Dr. Dana Rotz (617) 301-8979

Ms. Mindy Hu (510) 830-3710


Urban Institute

Ms. Mary Cunningham (202) 261-5764

References

Creswell, J. 1998. Qualitative Inquiry and Research Design: Choosing Among Five Traditions. Thousand Oaks, CA: Sage.

Griffin, Abbie, and John R. Hauser. 1993. “The Voice of the Customer.” Marketing Science 12 (1): 1–27.

Guest, Greg, Arwen Bunce, and Laura Johnson. 2006. “How Many Interviews Are Enough? An Experiment with Data Saturation and Variability.” Field Methods 18 (1): 59–82. https://www.sfu.ca/~palys/GuestEtAl-2006-HowManyInterviewsAreEnough.pdf

Morse, J. 1994. “Designing Funded Qualitative Research.” In Handbook of Qualitative Research, edited by N. Denzin and Y. Lincoln, 220–235. Thousand Oaks, CA: Sage.

Remler, Dahlia K., and Gregg G. Van Ryzin. 2014. Research Methods in Practice: Strategies for Description and Causation. Thousand Oaks, CA: Sage.



1 The team completed 60 interviews total.


