
Evaluation of the HUD-DOJ Pay for Success Permanent Supportive Housing Demonstration

OMB: 2528-0319


Supporting Statement Part B for Paperwork Reduction Act Submission Evaluation of the HUD-DOJ Pay for Success Re-Entry Permanent Supportive Housing Demonstration



B. Collections of Information Employing Statistical Methods



  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Evaluation Overview

Accompanying the Pay for Success (PFS) Permanent Supportive Housing (PSH) Demonstration, HUD and DOJ funded a national evaluation to assess whether PFS is a viable financing approach for scaling supportive housing in order to improve outcomes for a re-entry population, that is, for individuals returning to the community from jail or prison. The overarching goal of this national evaluation is to learn how the PFS model is implemented in diverse settings with varied populations of individuals reentering the community from jail and/or prison who have histories of homelessness and high health and behavioral health care needs. These individuals return to community contexts that vary in housing and services availability and in experience with Housing First and PSH, as well as to political and policy contexts that vary in experience with PFS and in the availability of financing to support PSH. This national evaluation is focused on questions about the effectiveness of PFS as an approach to financing and implementing PSH. (Questions concerning the effectiveness of PSH in promoting housing stability, reducing recidivism, and/or reducing health care utilization are being assessed by local evaluators at each site.)

This information collection renewal request concerns two specific data collection activities that are part of the national evaluation: (1) the HUD-DOJ PFS Key Project Partners Annual Web-based Partnership Survey (referred to as the Partnership Survey), which is focused on the development and functioning of partnerships and community-level collaborations that may benefit the target population, and (2) the HUD-DOJ PFS Key Project Partners Quarterly Time Use Study (referred to as the Time Use Study), which is being conducted as part of a study of the staff time that is used to develop each PFS project, through the phases of its PFS life cycle of feasibility analysis, transaction structuring, and project implementation.

Partnership Survey

Effectively serving the Demonstration's target population, who are high utilizers of healthcare, criminal-justice, and housing resources, requires collaboration across multiple governmental agencies, often at multiple levels of government (e.g., county, city, state), as well as with non-governmental housing providers and contracted service providers. An important hypothesis in the Demonstration's theory of change is that PFS can be a vehicle for galvanizing partnership and collaboration to promote effective cross-sector implementation. At the same time, PFS also requires the involvement of additional organizations, especially intermediary organizations that help develop PFS transactions, as well as evaluators.

To study the function of these partnerships, the evaluation includes a computer-based survey of organizations involved in the Demonstration – primarily the leadership of the organizations – concerning partnership and collaboration. (Other modes of survey administration will be made available for individuals with disabilities, as discussed below.)

The Partnership Survey is a cost-effective approach to obtaining input from a broad set of participating organizations concerning how the partnerships needed for the Demonstration are functioning and how those partnerships evolve over time and across the three phases of the Demonstration. (The instrument is included in Appendix A.) The primarily quantitative data from the survey are complementary to qualitative data being collected during interviews and site visits. The Partnership Survey will be administered annually to approximately 64 individuals in total across the four remaining active sites. The respondent universe consists of key project respondents who are active members of executive or steering committees overseeing the Demonstration projects or actively involved in implementation. The survey will be administered to key project respondents at all organizational partners in the Demonstration sites, shown in Exhibit B-1.

The Urban Institute will work with each site’s intermediary organization, which is the HUD grantee, to identify the most appropriate individuals in relevant management or other central roles in each organization/agency who can knowledgeably respond to questions concerning the organizational partnerships in the Demonstration. Based on knowledge of the sites through the first 7 years of the evaluation, we estimate that an average of 9 respondents per site will respond across key partner organizations. No statistical sampling will be involved.

At each site, staff to be surveyed will be identified from the six types of organizations involved in the Demonstration shown in Exhibit B-1:

Exhibit B-1. Description of Key Project Respondents for Partnership Survey

Organization | Staff Category/Role
Grantees/Intermediaries | Financial intermediary/fiscal agent; Knowledge/programmatic intermediary/project coordinator; Analysis/reporting compliance
Government partners | Budget/finance; Programmatic; Communications/elected officials; Legal
Service providers | Executive/associate director; Program director; Outreach director; Financial director
Evaluation partner | Team leads; Associates/analysts
Technical assistance advisors | Team leads; Associates/analysts
Investors/funders | Investor leads; Associates/analysts





Exhibit B-2: Partnership Survey: Proposed Populations, Sampling Frames, Sample Sizes, and Response Rates

Group | Universe | No. of completes | Response rate
Grantees/intermediaries | 11 | 8 | 73%
Government partners | 13 | 10 | 77%
Service providers | 18 | 9 | 50%
Evaluation partner | 5 | 1 | 20%
Technical assistance advisors | 9 | 5 | 56%
Investors/funders | 7 | 3 | 43%
Total | 64 | 36 | 56%

Sample: no sampling (the survey is fielded to the full universe).



Time Use Study

PFS requires complex multi-party transactions and involves a set of organizations (intermediary organizations, investors, and evaluators) beyond those involved in a typical two-party contract between a government agency and a service provider. Although the theory of change for the Demonstration hypothesizes that a benefit of PFS will be to improve collaboration across government and service sectors, PFS also imposes considerable costs in terms of people's time, especially during the pre-implementation phases of feasibility analysis and transaction structuring. To assess these time costs, the evaluation includes a Time Use Study, which involves requesting time-use information from a key informant at each participating organization.

For the Time Use Study, the Urban Institute will contact one key informant at up to six organizations/agencies per Demonstration site. Depending on nonresponse, organizational opt-in, and whether sites are actively participating in the Demonstration, estimated requests for information per quarter range from 16 to 36 (i.e., up to 144 requests per year).

Each data collection request at an organization concerns how much time was spent by individuals in that organization on the Demonstration, during the preceding quarter.

As the proposed data collection follows five prior waves of data collection, the Urban Institute will continue to work with each site’s intermediary organization, which is the HUD grantee, to identify the active organizations in each site and the most appropriate key informant in each organization. No statistical sampling will be involved. We assume that all of these active organizational partners and the HUD grantee organization will participate in the Time Use Study. Response rates for previous waves range from 58 percent to 85 percent, with an average response rate of 76 percent. We estimate that future waves can also achieve an approximately 75 percent response rate.

Exhibit B-3: Time Use Study: Proposed Populations, Sampling Frames, Sample Sizes, and Response Rates


Group | Universe | Number of complete responses | Response rate
Key Staff Informants | 36 | 29 | 80%

Sample: no sampling (data are collected from the full universe of informants).

Note: The universe size, and consequently the response rate, are subject to change based on the aforementioned factors: site participation, chronic organizational nonresponse, shifting organizational involvement, and refusal to participate in data collection.

  2. Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

Partnership Survey

Overview: The Urban Institute will administer the Partnership Survey to the selection of participants described in Exhibit B-1, capturing the perspective of key stakeholders who were involved in the project in the preceding year. The research team developed a survey instrument that draws on tested questions from other surveys intended to measure the strength of partnership and community- and system-level changes, and which has now been used for five prior survey waves. The survey instrument includes questions in the areas of collaboration with partners, data sharing and outcomes, and barriers to service provision.

Survey Administration: The survey will generally be administered online using Qualtrics survey software. It is designed to be completed online and accessible via multiple platforms, including computers, tablets, and smartphones. A PDF version will be available for download for informational purposes. The survey is designed to take approximately fifteen minutes to complete.

Stakeholders will be contacted by email and invited to take the survey using an individualized link. This advance mailing will offer information in large font for individuals with visual impairments and notify individuals with disabilities that they may request reasonable accommodation in order to participate in the study. The Section 508-compliant web survey will offer an alternative to the phone survey for individuals with hearing impairments.

The advance mailer will also detail language assistance options for persons with limited English proficiency (LEP). Survey language needs will be determined before the survey is administered so that translations can be made available. In prior waves, the research team did not identify any LEP needs among the survey population of stakeholders for the initiative. If LEP needs are discovered, translation services will be engaged to make the survey available in other languages, as needed; HUD’s Office of Policy Development and Research has a translation services contractor that can provide these services.

Non-respondents will receive an automated follow-up email after one week and after three weeks, and the survey will close after one month. If there is a low response rate from a given site, personalized reminders will be sent to non-respondents by the Urban Institute staff who lead the work with that site and, where useful, from a site's grantee organization to other partnering organizations. The universe of respondents will be updated as needed for each survey if and when key project partners change at each Demonstration site.

Data Management and Storage: While the survey is being fielded, completed and partially completed surveys will be stored on the Qualtrics secure site. Access to the survey by respondents will be through a link with a unique ID provided in the email invitation. Once the survey has been completed, respondents will no longer have access. Access to survey data by Urban Institute staff will be password-controlled and limited to staff who are involved in fielding the survey and who have signed the confidentiality agreement. All survey and other sensitive data will be saved in access-controlled files, with access granted only to Urban Institute staff who need the raw data and have signed the confidentiality agreement. Access will only be available through password-protected computers. Data stripped of personal identifiers will be shared with HUD.

Follow up and Quality Control: During the survey administration period, email reminders will be sent on different days to respondents that have not responded after one week and again after three weeks. The number of respondents to be surveyed is not large, and the pool of non-respondents is expected to be small. Efforts to increase response rates during the survey period will most likely be more effective than post-survey adjustments for producing reliable data. Progress on survey administration will be reported to HUD showing ongoing response rates.

Analysis: Responses to survey items will be grouped into indices based on inter-item correlations and reliability analyses, in order to increase the reliability of reported responses and reduce spurious reports of change over time or phase in individual items. (Individual items will not generally be presented.) A statistical summary of the responses will be provided to HUD and DOJ upon survey completion along with tables of frequencies. Results from the Partnership Survey will be presented in the final report as descriptive statistics and correlations. Crosstabs of survey responses by organization type and respondent role will also be produced when statistical tests indicate variation by organization type or role is reliable (i.e., statistically significant). Descriptive statistics will be constructed to show change over Demonstration phases (which vary across sites over time) and data collection waves.
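As one illustration of the index-construction step described above, inter-item reliability can be checked with Cronbach's alpha before averaging items into an index. This is a generic sketch, not the evaluation's actual analysis code, and the item values below are hypothetical:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()  # sum of per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of respondents' total scores
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Hypothetical 5-point Likert responses to three collaboration items
responses = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
])

alpha = cronbach_alpha(responses)      # ~0.92 here; >= 0.7 is a common acceptability threshold
index_scores = responses.mean(axis=1)  # per-respondent index, if alpha is acceptable
```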

Data Delivery: The annual data submission to HUD will consist of the cleaned survey data collected to date, de-identified with individual names removed and respondents identified by role (e.g., investor, end payor, etc.). Metadata for each site will provide the file name, a complete description of the survey methodology, the dates fielded, the participant universe, and the response rate. The methodology description will include any measures taken to correct for nonresponse, if necessary. Submissions will also include a data dictionary, tables of frequencies, and copies of the survey instrument for each file.

Other Notes: The Partnership Survey does not require sample selection or specialized sampling procedures because it will be fielded annually to the full universe of respondents (organizations involved in the Demonstration sites). Administering this survey less frequently would affect the reliability of the data.

  • Statistical methodology for stratification and sample selection: N/A

  • Estimation procedure: N/A

  • Degree of accuracy needed for the purpose described in the justification: N/A

  • Unusual Problems requiring specialized sampling procedures: N/A

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden: N/A



Time Use Study

Overview: The Urban Institute will identify staff members at each organization who are able to report on time spent on PFS-related tasks by themselves and other staff members at their organization. The identified staff member will typically be in an administrative role. Intermediary organizations may have staff members able to report on other organizations, for instance if they have responsibility for organizing high-level meetings, but those staff members are unlikely to have complete information. The research team will identify informants with knowledge of, and access to, data describing time use for each category/role across the organizations listed in Exhibit B-1.

Consent: Through phone and email contact, identified staff members will be asked to participate in the Time Use Study. Respondents will be informed that their participation is voluntary, but that capturing their valuable perspective is helpful for the evaluation.

Data Collection: Outreach will be conducted once per quarter via email. The email will offer information in large font for individuals with visual impairments and notify individuals with disabilities that they may request reasonable accommodations, including a phone-based interview. A Section 508-compliant email-based data collection will also be offered as an alternative for individuals with hearing impairments.

The email will also offer language assistance options for persons with limited English proficiency. Language needs will be determined before data is collected, so that translations can be made available to individuals with limited English proficiency (LEP). In prior waves, the research team did not find any such need for individuals requested to participate in the Time Use Study for the initiative. If LEP needs are discovered, translation services will be engaged to make the survey available in other languages, as needed. HUD’s Office of Policy Development and Research has a translation services contractor that can help provide translation services if needed.

Providing this information is expected to take the informant one hour or less. Time not captured through submissions in HUD's Disaster Recovery Grant Reporting (DRGR) system will fall into two categories: (1) time spent by individuals who are covered by the grant but who are spending more time than is covered; and (2) time spent by individuals who are not covered by the grant.

If the organization does not submit time into DRGR, the research team will probe for an estimate of all time spent on PFS tasks in the previous quarter, by role. If the organization reports time spent through DRGR, the research team will investigate whether there were activities not covered in the DRGR report. Some time spent by high-level staff may be captured through attendance records at executive and steering committees. The research team will also explore with the representative at each organization whether there are other, simple ways to capture that time involvement.

Data Management and Storage: Access to survey data by Urban Institute staff will be password-controlled and limited to staff who are involved in fielding the survey and who have signed the confidentiality agreement. All survey and other sensitive data will be saved in access-controlled files, with access granted only to Urban Institute staff who need the raw data and have signed the confidentiality agreement. Access will only be available through password-protected computers. After individual identifiers are removed, the survey data will be sent to HUD.

Analysis: The research team will analyze the data annually and summarize the costs of each PFS stage as projects move through the lifecycle. This will be broken down by partner organization and by Demonstration site, by how many persons at what level are involved in working on each PFS project, and by what portion of PFS project time is and is not covered by the HUD grant. In addition, these time costs will be described by PFS lifecycle phase. Because different Demonstration sites’ grants cover different lifecycle phases, data from different sites will be involved in the estimates for the different phases.

For the final evaluation, hours of time use will be monetized by being multiplied with median hourly wages within relevant labor categories from the Bureau of Labor Statistics Occupational Employment Statistics. By combining information from actual grant spending from DRGR with time estimates from staff informants and any information about additional funding, descriptions of the overall time spent and the financial costs of that time will be developed.
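The monetization step described above can be sketched as follows. The wage figures and hours below are hypothetical placeholders; the actual calculation would use median hourly wages from the BLS Occupational Employment Statistics for the relevant labor categories and reported time from the Time Use Study:

```python
# Hypothetical median hourly wages by labor category (placeholders, not BLS data)
median_wage = {"executive": 60.00, "program_manager": 38.50, "analyst": 32.00}

# Hypothetical hours reported for one quarter by a site's informants
quarterly_hours = {"executive": 12, "program_manager": 45, "analyst": 80}

# Monetized time cost = hours x median hourly wage, summed across categories
monetized = {role: round(hours * median_wage[role], 2)
             for role, hours in quarterly_hours.items()}
total_cost = round(sum(monetized.values()), 2)  # 5012.5
```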

Data Delivery: The annual data submission to HUD will consist of tables showing the amount of time spent, by role averaged across sites and by site averaged across roles. Because there are too few people in any given role within a site, the submission will not include tables crossing these two dimensions. Files will be provided in CSV format.

Other Notes: The Time Use Study does not require sample selection or specialized sampling procedures because it will be conducted with the full universe of respondents. Conducting the data collection less frequently would affect the reliability of the data.

  • Statistical methodology for stratification and sample selection: N/A

  • Estimation procedure: N/A

  • Degree of accuracy needed for the purpose described in the justification: N/A

  • Unusual Problems requiring specialized sampling procedures: N/A

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden: N/A

  3. Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Partnership Survey

The proposed survey will be fielded to the full universe of Demonstration site organizations and will not be based on sampling. The accuracy and reliability of the survey data will depend on adequate response rates and data quality. Earlier in the project, the Urban Institute conducted webinars and other informational calls with grantees to introduce the concept of the Partnership Survey and solicit their feedback. This engagement prior to the survey period was intended to build buy-in among respondents and improve response rates. HUD will also send a Respondent Contact Letter to all respondents during the survey period to encourage participation. The majority of current respondent targets have been engaged in the project for multiple years and have developed familiarity with the Urban Institute and the Demonstration evaluation through repeated contact. Most respondents also have working relationships with the grantee leads and are responsive to emails from either the Urban Institute site lead or the grantee lead.

For the Partnership Survey, the expected response rate is 70 percent. The study team will employ a variety of techniques to ensure the highest possible response rate, including: survey technology that allows respondents to use various types of electronic devices to complete the survey; effective communication before the survey to prepare respondents for participation; assurance that only de-identified, aggregated data will be shared; survey reminders throughout the fielding period; ongoing response tracking; and email and phone follow-up with non-responders. Reminders will be sent to key personnel that have not responded after one week and again after three weeks. Furthermore, if, after three weeks, particular categories of grantee types or stakeholder roles have not yet achieved the target response rate, stakeholders within those categories will receive phone call reminders. The research team will share response rates with the grantees and the HUD COR to encourage higher response rates.



Time Use Study

Response rates have varied over the Time Use Study’s life, with grantees and partners making use of grant funds providing the most consistent data. Organizations with less engaged or intermittent roles have typically provided less data. The evaluation team will address this with targeted email reminders. Because the Time Use Study is intended to obtain information based on an organization's internal administrative data and other records, we assume that late responses remain reliable. Therefore, this quarterly Time Use Study data collection will not be officially closed until the completion of the entire evaluation's data collection period. That is, completing any incomplete responses for earlier quarters is encouraged anew with each new wave of quarterly data collection, which increases the response rate.

  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The Urban Institute conducted a pretest of each survey in January and February 2017 with 8 individuals across sites who were active in PFS-related tasks and volunteered for the test. The objectives were to: (a) review each survey for wording, flow, and meaning; (b) verify the estimated time to complete the survey; and (c) conduct post-survey interviews with respondents to assess their interpretation of the questions and the reasoning behind their answers.

After administering the pretest survey, the Urban Institute conducted interviews with the test sample to learn about survey fatigue and question clarity and answerability. Feedback on formatting and content was solicited from all test participants via e-mail and by phone call. Based on these interviews, the Urban Institute made several changes to the survey instrument and administrative procedures.

Changes made to the Partnership Survey based on test participant feedback included the addition of a progress bar showing where one is in the survey, removal of ambiguity concerning whether an item was asking about the PFS project or the broader community, and added questions about data infrastructure and sustainability.

Respondents reported that the web-based Partnership Survey took less than 15 minutes to complete (timing in Qualtrics showed an average of 13 minutes) and that the weekly text and email surveys took less than 1 minute to complete. The initial plan for collecting Time Use Study data involved a weekly SMS text message sent to stakeholders on Fridays, requesting estimates of time spent on the project in the preceding week. Pre-testing this method with 9 stakeholders representing different types of organizations across the sites indicated that it would have low response rates (several respondents found it intrusive). Weekly email requests similarly had poor response rates. As a result, less frequent (quarterly) data collection with more options (email or interview) and more robust reminders, complemented with DRGR administrative data where available, was identified as the best method for collecting complete data from a range of stakeholders. Please see Part A for more information on the changes that resulted from the pre-test.



  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractors, grantees, or other person(s) who will actually collect or analyze the information for the agency.

HUD has contracted with The Urban Institute for the Evaluation of the HUD-DOJ Pay for Success Re-Entry Permanent Supportive Housing Demonstration. The HUD Contracting Officer’s Representative (COR) reviewed all the procedures and had them reviewed by other subject matter experts at HUD. If there are any questions about this submission, please call either the HUD COR, Marina Myhre, (202-402-5705), or the Urban Institute co-Principal Investigators, Samantha Batko (202-261-5436) and Kelly Walsh (917-589-0381).



