
Supporting Statement for Paperwork Reduction Act Submissions

(Evaluation of Emergency Housing Voucher Program)

(OMB # 2528-new)



B. Collections of Information Employing Statistical Methods


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



In total, SPR and Abt Global (“the research team”) expect to field the survey to 611 Public Housing Agencies (PHAs) and 371 Continuums of Care (CoCs) that operated Emergency Housing Voucher (EHV) programs. The research team will also conduct qualitative follow-up telephone interviews with 25 PHAs, 20 CoCs, and five other community-based organizations, for a total of 50 interviews.



Web-based Surveys

To understand more about how PHAs and CoCs implemented their EHV programs, the research team will conduct two web-based surveys:

  • A PHA survey, fielded to all 611 PHAs that received allocations of EHVs (Appendix B); and

  • A CoC survey, fielded to all 371 CoCs that partnered with these 611 PHAs to implement the EHV program (Appendix C).

The two surveys will collect information about the implementation of the EHV program, primarily through closed-ended questions. The expected respondents will be the PHA and CoC administrators most knowledgeable about the EHV program at their respective organizations. Only one person will submit the survey from each organization.

Topics in the PHA survey will include: the PHA/CoC partnership; the EHV referral and eligibility processes; issuing EHVs; the housing search and lease-up processes; supplemental PHA funding for the EHV program; and challenges and strengths of the EHV program.

Topics in the CoC survey will include: the PHA/CoC partnership; the EHV referral and eligibility processes; search, lease up, and services; and challenges and strengths of the EHV program.



Exhibit B-1 shows the survey population universes, expected number of completes, and expected response rates for each of the two web surveys.

  • For the PHA survey, the research team will send the survey to all 611 PHAs that participated in the EHV program, with the goal of achieving at least 489 completes, an 80 percent response rate.

  • For the CoC survey, the research team will send the survey to all 371 CoCs that partnered with one or more PHA on an EHV program, with the goal of achieving at least 297 completes, an 80 percent response rate.

Exhibit B-1: Survey Sampling Plan

                               PHA Survey     CoC Survey
  Universe                     611 PHAs       371 CoCs
  Expected No. of Completes    489            297
  Expected Response Rate       80%            80%



Follow-up Telephone Interviews

To clarify information provided through the surveys, observe patterns and unique experiences across the included communities, confirm or expand on early hypotheses, and delve deeper into topics of interest, the research team will conduct follow-up telephone interviews with leaders of PHAs and CoCs (or other alternative partners, such as Victim Service Providers [VSPs]) in 25 purposefully selected communities.

Communities will be selected to represent variation across EHV communities, EHV program choices/strategies, and voucher use, among other factors (see more details about the selection process under question 2). If a selected community declines to participate in the follow-up telephone interviews, the research team will choose another community with similar characteristics and programmatic choices so that the target number of communities and interviews is achieved. A 100 percent response rate is expected.

  • Topics in the PHA follow-up telephone interviews include: partnerships and targeting; referrals and eligibility; housing search and lease-up processes; outcomes; and lessons learned (see Appendix D).


  • Topics in the CoC or other partner organization follow-up telephone interviews include: partnerships and targeting; referrals and eligibility; services; and challenges and strengths of the EHV program (see Appendix E).


Exhibit B-2 shows the sample description, potential respondent universe, expected sample size, and expected response rate for each of the follow-up telephone interview types.



Exhibit B-2: Follow-up Telephone Interview Sampling Plan

Follow-up Telephone Interviews with PHAs
  Sample Description: PHA administrative staff most knowledgeable about their agency’s EHV program
  Potential Respondent Universe: Administrative staff from all PHAs that completed the PHA survey (expected to be at least 489 PHAs)
  Expected Respondent Sample Size: 50 administrative staff participating in 25 interviews (up to two staff members per interview)
  Expected Response Rate: 100%
  Total Expected Respondents: 50

Follow-up Telephone Interviews with CoCs
  Sample Description: CoC administrative staff most knowledgeable about their organization’s EHV partnership
  Potential Respondent Universe: Administrative staff from all CoCs that completed a CoC survey (expected to be at least 297 CoCs)
  Expected Respondent Sample Size: 40 administrative staff participating in 20 interviews (up to two staff members per interview)
  Expected Response Rate: 100%
  Total Expected Respondents: 40

Follow-up Telephone Interviews with Other EHV Partner Organizations
  Sample Description: Administrative staff of other EHV partner organizations most knowledgeable about their organization’s EHV partnership
  Potential Respondent Universe: Administrative staff from all other (non-CoC) EHV partner organizations that partner with a PHA that completed a PHA survey
  Expected Respondent Sample Size: 10 administrative staff participating in 5 interviews (up to two staff members per interview)
  Expected Response Rate: 100%
  Total Expected Respondents: 10

Overall Total Expected Respondents: 100



2. Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.



None of the data collection activities involve estimation procedures or unusual problems requiring specialized sampling procedures. In addition, no periodic data collection cycles will be used as both the PHA and CoC web surveys and follow-up telephone interviews are one-time data collection efforts.

Data Collection Procedures for the Web-based Surveys

The research team will send the surveys to all PHAs that received EHV allocations and to all of their partner CoCs, so there is no sample selection.


Data Collection and Sample Selection Procedures for the Follow-up Telephone Interviews

The follow-up telephone interview respondents are limited in number and are not intended to constitute a representative sample of all EHV programs. No statistical or randomization processes will be used to select communities for participation.

To be selected to participate in the interviews, both a community’s PHA and CoC must have completed the web-based surveys. From the list of communities that meet this criterion, the research team will select a group that represents variation across EHV communities, EHV program choices/strategies, and voucher use, among other factors. The research team will create a PHA/CoC Selection Index that catalogs each community on the following criteria of interest:

  • community factors, such as HUD region, urbanicity, housing market characteristics, PHA size, initial EHV allocation, number of PHAs associated with the CoC, use of an alternative partner, and being a Moving to Work (MTW) PHA;

  • EHV program choices/strategies, such as allocation of EHVs across the four qualified household types for EHVs, the CoC’s target subpopulations (e.g., unsheltered adults, individuals exiting from permanent supportive housing [PSH], households in rapid re-housing, chronically homeless individuals), planned use of optional waivers and services (for example, waiver of income verification at admission, pre-inspection of units, and landlord incentives), and number of referral partners;

  • indicators of EHV use/performance, such as percentage of EHV allocation used, whether any EHVs were reallocated, success rates in leasing up, average time to lease up, amount of rapid lease-up incentives received by PHA, and use of optional waivers; and

  • other factors, such as participation in other EHV-related research.

We will purposefully select 25 communities, ensuring that the sample represents a diverse group of strategies and criteria. We will also identify alternative communities to account for refusals or inability to participate as well as any alterations to the list made by HUD in the review process.
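
For illustration only, the sketch below shows one way a selection index of this kind could be organized and screened, using Python and the pandas library. The community identifiers, field names, and selection heuristic are hypothetical placeholders; the study’s actual index may be maintained in other software (for example, SAS, R, or a spreadsheet) and will apply the full set of criteria listed above.

```python
# Hypothetical sketch of a PHA/CoC Selection Index; the field names, values, and
# selection heuristic below are placeholders, not the study's actual procedure.
import pandas as pd

# Each row catalogs one community on criteria like those listed above.
index = pd.DataFrame([
    {"community": "A", "pha_survey_complete": True, "coc_survey_complete": True,
     "hud_region": 9, "urbanicity": "urban", "pct_ehv_used": 0.92},
    {"community": "B", "pha_survey_complete": True, "coc_survey_complete": False,
     "hud_region": 4, "urbanicity": "rural", "pct_ehv_used": 0.55},
    {"community": "C", "pha_survey_complete": True, "coc_survey_complete": True,
     "hud_region": 4, "urbanicity": "rural", "pct_ehv_used": 0.47},
    # ... remaining EHV communities ...
])

# Eligibility rule described above: both the PHA and the CoC completed the web surveys.
eligible = index[index["pha_survey_complete"] & index["coc_survey_complete"]]

# One simple purposive heuristic: within each HUD region/urbanicity stratum, keep the
# communities with the lowest and highest voucher utilization, up to 25 in total.
TARGET = 25
selected = (
    eligible.sort_values("pct_ehv_used")
    .groupby(["hud_region", "urbanicity"], group_keys=False)
    .apply(lambda g: g.iloc[[0, -1]].drop_duplicates())
    .head(TARGET)
)
print(selected["community"].tolist())
```

In practice, the full index would include all of the community, program-choice, and performance fields listed above, and the resulting list (with identified alternates) would be reviewed by HUD before recruitment begins.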

While only CoCs will be included in the web survey, the research team will include non-CoC alternative partners, such as VSPs, in the interviews in up to five of the 25 communities. The research team will work with HUD (and, once selected, the PHA or CoC) to determine which of these alternative partners to invite.

3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Every data collection instrument and question has been carefully scrutinized to ensure that it aligns with the research questions, is needed for the evaluation, and does not duplicate other existing data sources. The research team will also clearly communicate the reason for collecting the data to the respondents and make it as easy as possible for respondents to provide the data.

The research team will provide appropriate translations and accommodations to maximize response rates. HUD and the research team will use translation services as needed for individuals with Limited English Proficiency. The web-based surveys will be translated into Spanish for PHAs and CoCs in Puerto Rico. If any of these communities are selected for follow-up telephone interviews, the interview protocols will be translated into Spanish and a Spanish-speaking member of the research team will conduct the interviews in Spanish.

In addition, the research team will offer accommodations to support participation in both the web-based surveys and follow-up phone interviews. Individuals who prefer to take the web-based survey with a live person over the phone (for example, if it is difficult for them to read or complete a survey on a screen) will be given that opportunity. For the phone interviews, the research team will provide closed captioning and/or a sign language interpreter for anyone requesting such accommodations. HUD and the research team welcome and are prepared to receive calls from individuals who are deaf or hard of hearing, as well as individuals with speech or communication disabilities. When telephoning prospective participants, the research team will ask if participants need a reasonable accommodation or language access service. If prospective participants require information to be presented in an accessible format, reasonable accommodations, or language assistance to participate in this study, they will be invited to contact [INSERT STAFF CONTACT], the [TITLE/ROLE], by phone at (XXX) XXX-XXXX or by email at [INSERT EMAIL ADDRESS].

We will proactively maximize response rates by conducting outreach to potential respondents using the materials contained in Appendices F, G, H, and I. On all survey outreach materials, we will include a study-specific hotline and email address for PHAs and CoCs to ask questions about the web surveys, which will be closely monitored by members of the research team. Respondents can also use this contact information to request survey accommodations as needed.

We also plan to maximize response rates and deal with issues of non-response by using the following methods, described below by data collection method.

Web-based Surveys

Maximizing Response Rates

The research team will track the number of completed surveys for both PHAs and CoCs and produce a weekly tracking report for each survey to identify which PHAs and partners have not responded. This will enable the research team to monitor the response rate and identify respondents that require telephone follow-up. The research team will prioritize follow-up to ensure that we have complete pairs of PHAs and their partner organizations, as well as responses from PHAs of different sizes and in different geographies.
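
As a purely illustrative example of the weekly tracking report described above, the short Python sketch below tallies completes and response rates by survey and flags non-responders for telephone follow-up. The input file, field names, and prioritization rule (contacting non-responders whose partner organization has already completed its survey first) are assumptions for the sketch, not the team’s actual tracking system.

```python
# Hypothetical weekly survey tracking report; file layout and field names are assumed.
import pandas as pd

status = pd.read_csv("survey_status.csv")  # assumed: one row per fielded PHA or CoC survey
# assumed columns: survey ("PHA"/"CoC"), community_id, org_name, completed (True/False)

# Response rate by survey, for the weekly report.
report = (
    status.groupby("survey")["completed"]
    .agg(completes="sum", fielded="count")
    .assign(response_rate=lambda d: d["completes"] / d["fielded"])
)
print(report)

# Flag non-responders, prioritizing incomplete PHA/CoC pairs (partner already completed).
completed_communities = set(status.loc[status["completed"], "community_id"])
nonresponders = status[~status["completed"]].copy()
nonresponders["partner_complete"] = nonresponders["community_id"].isin(completed_communities)
nonresponders.sort_values("partner_complete", ascending=False).to_csv(
    "telephone_followup_list.csv", index=False
)
```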

The research team will begin sending reminder emails to both PHA and CoC non-responders after Week 4 of survey administration and will send at least three email reminders to non-responders over the 12-week survey period. In addition, the research team will place telephone reminder calls beginning in Week 6 of data collection. These calls will be made by professional interviewers from Abt’s virtual call center, who will make up to two calls to each non-respondent on weekdays during the respondent’s local working hours. Interviewers will receive training on the survey instruments and on how to answer frequently asked questions. Using this rigorous process, the research team aims to achieve an 80 percent response rate on both the PHA and CoC web surveys.

Additionally, to reduce burden and increase accuracy, the research team will pre-populate the survey using HUD administrative data, including the number of EHVs allocated to the PHA, the number of EHVs issued, and the number of EHVs leased up.

Dealing with Non-Response

After administering the web-based surveys, the research team will apply data cleaning and verification procedures to these data. This will involve using simple statistical techniques in the research team’s standard statistical analysis packages (SAS and R) to identify outliers or missing data and, if needed, to assess the best method for addressing nonresponse bias (e.g., inverse propensity weighting, raking procedures) and then correct for it, using an 80 percent response rate as the threshold. The goal of the nonresponse bias correction will be to ensure that the survey weights generate estimates representative of all PHAs that received EHVs with respect to program size as of 2020, mix of programs offered (e.g., public housing or vouchers or both), average Housing Assistance Payment (HAP) in 2021, total administrative costs per household per unit month in 2021, and local economic and housing market conditions as of 2021 (e.g., unemployment, median income, ratio of average rents to area median income).
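
As a rough illustration of one of the nonresponse adjustments named above (inverse propensity weighting), the Python sketch below estimates each PHA’s probability of responding from frame-level covariates and weights respondents by the inverse of that probability. The input file, variable names, and logistic model are assumptions made for illustration; the research team’s actual adjustment may instead use raking or other procedures and would be carried out in SAS or R.

```python
# Illustrative inverse propensity weighting for unit nonresponse; the data file,
# covariates, and model below are hypothetical stand-ins for the frame-level
# characteristics described above (program size, program mix, HAP, costs, market).
import pandas as pd
from sklearn.linear_model import LogisticRegression

frame = pd.read_csv("pha_frame.csv")  # assumed: one row per PHA in the 611-PHA universe
covariates = ["program_size_2020", "avg_hap_2021", "admin_cost_2021", "unemployment_2021"]

# Model the probability that a PHA responded as a function of frame covariates.
model = LogisticRegression(max_iter=1000).fit(frame[covariates], frame["responded"])
frame["p_respond"] = model.predict_proba(frame[covariates])[:, 1]

# Weight respondents by the inverse of their estimated response propensity so that
# weighted totals approximate the full universe of PHAs that received EHVs.
respondents = frame[frame["responded"] == 1].copy()
respondents["nr_weight"] = 1.0 / respondents["p_respond"]
print(respondents["nr_weight"].describe())
```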



Follow-up Telephone Interviews

Maximizing Response Rates

In recruiting PHAs, CoCs, and alternative partners for the interviews, the research team will emphasize the minimal burden involved (each interview is 60 or 90 minutes, virtual, and scheduled at the respondent’s convenience) as well as the value of the activity to the overall study effort. See Appendices H and I for recruitment materials. The interviewer will contact the chosen PHA and CoC or other partner twice by email and, if the email attempts are unsuccessful and a phone number is available, once by phone. If contacts for the PHA, CoC, or other partner are unsure or have questions about participation, the assigned staff person will address their concerns and explain the value of participation.

Additionally, interviews can be conducted in Spanish (for any selected organizations in Puerto Rico or for others who prefer) and using closed captioning or a sign language interpreter for those who request these accommodations.

Dealing with Non-Response

The research team expects to complete the planned number of interviews (25 with PHAs, 20 with CoCs, and five with alternative partners) because, if a PHA, CoC, or other partner is unresponsive or declines to participate after the recruitment attempts, the research team will move on to the next community on the list that best matches the declining community’s program or context.
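
To make the replacement rule concrete, the hypothetical sketch below looks up the not-yet-selected community in the Selection Index that is numerically closest to a declining community on a few index criteria. The column names and the simple squared-difference distance are illustrative assumptions only; the team’s actual replacements will be chosen by reviewing program choices and community context, as described above.

```python
# Hypothetical nearest-match replacement lookup over the Selection Index; the
# criteria columns and distance measure are illustrative, not the study's method.
import pandas as pd

def closest_replacement(index: pd.DataFrame, declining_id: str, criteria: list) -> str:
    """Return the not-yet-selected community most similar to the declining community."""
    target = index.loc[index["community"] == declining_id, criteria].iloc[0]
    candidates = index[(~index["selected"]) & (index["community"] != declining_id)]
    distances = ((candidates[criteria] - target) ** 2).sum(axis=1)
    return candidates.loc[distances.idxmin(), "community"]
```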


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



While the surveys are in OMB clearance, the research team will pre-test each of the web-based surveys with two PHAs and two CoCs using paper copies of the survey. The research team will purposefully select PHAs and CoCs expected to have different survey experiences to ensure the survey can be successfully completed in each circumstance. We will include one CoC that worked with more than one PHA to test the experience of filling out the survey, and the time involved, when accounting for multiple EHV partnerships.

No pretesting will occur for the follow-up telephone interview guides. However, the instruments were reviewed by a subject matter expert (a former high-level PHA and CoC administrator). Three individuals with lived experiences using EHVs also reviewed the study design and provided insight into questions that they felt should be included. The research team incorporated feedback from the subject matter expert and the lived experience group into both the follow-up telephone interview guides and the web-based survey instruments.



5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

HUD contracted with Social Policy Research Associates (SPR) to conduct this evaluation. SPR also subcontracted with Abt Global (Abt) to support the evaluation and lead the web-based survey task. The following individuals from SPR and Abt were consulted on statistical and other aspects of the design and/or will collect and analyze study data:


SPR staff:

Christian Geckeler, co-Principal Investigator (510-788-2461)

Anne Paprocki, Project Director (510-768-8499)


Abt staff:

Lauren Dunton, co-Principal Investigator (301-634-1779)


HUD staff:

HUD’s Contracting Officer’s Representative (COR), Jeffrey Chen, reviewed the statistical aspects of the design and had them reviewed by other subject matter experts at HUD. Anne Fletcher (202-236-1484) is the COR’s supervisor. If there are any questions about this submission, please call either the HUD COR, Jeffrey Chen (212-542-7422) or SPR's Project Director, Anne Paprocki (510-768-8499).



