
Supporting Statement Part B for Paperwork Reduction Act Submission Evaluation of the HUD-DOJ Pay for Success Re-Entry Permanent Supportive Housing Demonstration

OMB # 2528-0319


Part B. Collections of Information Employing Statistical Methods



  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Partnership Survey

The partnership survey will be administered annually to the universe of partners in management roles, an estimated 240 people, of whom approximately 168 are expected to respond each year. These partners include active participants in executive and steering committees and people involved in implementation.

Among the six current demonstration sites, two (Maryland and Alaska) include two jurisdictions each that involve largely distinct organizations; thus, the demonstration involves 8 jurisdictions. (One additional HUD grantee participated in the feasibility analysis phase of the demonstration but elected not to pursue PFS financing at the end of 2017 and will not be included in these data collection efforts.) In some jurisdictions, the relevant government organizations/departments may formally be separate agencies (e.g., the housing agency and the department of health), while in others they may be departments within a single agency, such as a housing department within a Health and Human Services agency. The survey will be administered to organizations/departments that participate on executive and steering committees and/or in project implementation.

Exhibit 1 shows the basic types of organizational/agency partners expected at a typical jurisdiction. Which organizations/departments are participating will vary by site, and the participating organizations/departments will also vary over the phases of the PFS demonstration, with some of the implementation partners (e.g., some service providers) only being definitively identified and engaged as the project moves from its planning phases (feasibility analysis and transaction structuring) into its implementation phase. At some government partners, there are several active participants in the various roles shown in Exhibit 1.



Exhibit 1. Description of Key Project Partners

Organization | Staff Category/Role
Grantees/Intermediaries | Financial intermediary/fiscal agent; knowledge/programmatic intermediary/project coordinator; analysis/reporting compliance
Government partners | Budget/finance; programmatic; communications/elected officials; legal
Service providers | Executive/associate director; program director; outreach director; financial director
Evaluation partner | Evaluator team leads and associates/analysts
Technical assistance advisors | Team leads and associates/analysts
Investors/funders | Investor leads and associates/analysts



Based on our knowledge of the sites through our qualitative work on the evaluation to date, we estimate that each jurisdiction will have, on average, 20 relevant staff categories of respondents, with 1-2 people surveyed in each category, for a total of approximately 240 respondents (8 jurisdictions x 20 staff categories x an average of 1.5 respondents per category).

The Urban Institute will work with each site’s intermediary organization, which is the HUD grantee, to identify the most appropriate individuals in relevant management roles in each organization/agency for the partnership survey; no formal sampling will be involved. Based on Urban’s past survey administration work, a 70% response rate from this 240-person universe will provide sufficient data. Given the existing relationship between the respondents and the intermediary, Urban expects this rate is achievable. Assuming an estimated 70% response rate for this universe of 240 individuals, 168 people are expected to respond to the survey (Exhibit 2).
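For illustration only, the following minimal sketch (in Python, with hypothetical variable names) reproduces the arithmetic behind the universe and expected-response figures, assuming the midpoint of the 1-2 respondents-per-category range:

```python
# Illustrative check of the respondent universe and expected responses.
# Assumes the midpoint (1.5) of the 1-2 respondents-per-category range
# and the 70 percent response rate described above.
jurisdictions = 8
staff_categories_per_jurisdiction = 20
avg_respondents_per_category = 1.5
response_rate = 0.70

universe = int(jurisdictions * staff_categories_per_jurisdiction
               * avg_respondents_per_category)
expected_responses = int(universe * response_rate)

print(universe)            # 240
print(expected_responses)  # 168
```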

Exhibit 2. Partnership Survey: Respondents and Response Rate

Respondent category | Universe | Number covered by collection | Number of responses expected | Response rate
Key Project Partners | 240 | 240 | 168 | 70%



Time Use Interviews

Time use interviews will be conducted with one key informant at an estimated average of 8 organizations/agencies per jurisdiction (across the 8 participating jurisdictions at the six demonstration sites), for an estimated 64 interviews per quarter.

At each phase of the project, we estimate that about half of the partnering organizations (on average 6) will be more active participants, in the sense that multiple people in the organization are working on the project, some of whom may be billing time to the grant through HUD’s Disaster Recovery Grant Reporting (DRGR) system. Interviews will also be conducted with a key informant at each of these organizations.

Based on our ongoing data collection efforts (monthly phone interviews and annual site visits), we have observed that for some partnering organizations, participation is often limited to executive-level involvement in steering committee and executive committee meetings. These participants’ time use will be identified through one interview with the intermediary organization that is tasked with organizing the project and its meetings, and through the collection of meeting participation rosters.

The Urban Institute will work with each site’s intermediary organization, which is the HUD grantee, to identify each jurisdiction’s active organizations (in the sense just described) and the most appropriate key informant at each; no formal sampling will be involved. We assume that all of these active partner organizations and the HUD grantee organization will participate in these time use interviews.



Exhibit 3. Time Use Interview: Respondents and Response Rates

Respondent category | Universe | Number covered by collection | Number of responses expected | Response rate
Key Staff Informants | 64 | 64 | 64 | 100%



  2. Describe the procedures for the collection of information including:

    1. Statistical methodology for stratification and sample selection

    2. Estimation procedure

    3. Degree of accuracy needed for the purpose described in the justification

    4. Unusual Problems requiring specialized sampling procedures

    5. Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

At the start of the survey period (targeted for September 2018 for the surveys and interviews), HUD will send an introductory Respondent Contact Letter to respondents to explain the importance of the evaluation and of participation in data collection, and the role of the Urban Institute (Appendix A).

Partnership Survey

Overview: The Urban Institute will field the Partnership Survey to the universe of participants described in Exhibit 1. The research team developed a survey instrument that draws on tested questions from other surveys intended to measure the strength of partnerships and community- and system-level changes. The survey instrument includes questions in the areas of collaboration with partners, data sharing and outcomes, and barriers to service provision.

Survey Administration: The survey will be administered online using Qualtrics survey software. Stakeholders will be contacted by email and invited to take the survey using an individualized link. The survey is designed to be completed online and is accessible through multiple platforms, such as computer, tablet, and smartphone; a PDF version will be available for download for informational purposes only. The survey is designed to take approximately fifteen minutes to complete. Non-respondents will receive an automated follow-up email after one week and again after three weeks, and the survey will close after one month. The universe of respondents will be updated as needed for each survey if and when key project partners change at each demonstration site.

Data Management and Storage: While the survey is being fielded, completed and partially completed surveys will be stored on the Qualtrics secure site. Access to the survey by respondents will be through a link with a unique ID provided in the email invitation. Once the survey has been completed, respondents will no longer have access. Access to survey data by Urban Institute staff will be password-controlled and limited to those staff involved in fielding the survey and who have signed the confidentiality agreement. All survey and other sensitive data will be saved to an encrypted network drive, with access limited to Urban Institute staff with a need to work with raw data, who have signed the confidentiality agreement. Access will only be available on-site, through password-protected computers.

Follow-up and Quality Control: During the survey period, email reminders will be sent on different days to respondents who have not responded after one week and again after three weeks. The number of respondents to be surveyed is not large, and the pool of non-respondents is expected to be small. Efforts to increase response rates during the survey period will most likely be more effective than post-survey adjustments for producing reliable data. Progress on survey administration will be reported biweekly to HUD with production reports showing ongoing response rates.
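For illustration only, the following sketch shows one way the ongoing response rates for these production reports might be tabulated, assuming a simple tracking file with one row per invited respondent (all column names and values below are hypothetical, not taken from the actual survey system):

```python
# Illustrative biweekly production-report tabulation (hypothetical data).
import pandas as pd

tracking = pd.DataFrame({
    "site": ["Site 1", "Site 1", "Site 2", "Site 2", "Site 2"],
    "role": ["Government partner", "Service provider",
             "Government partner", "Investor/funder", "Service provider"],
    "completed": [True, False, True, True, False],
})

# Overall response rate and response rate by respondent role.
overall_rate = tracking["completed"].mean()
by_role = tracking.groupby("role")["completed"].agg(invited="size", completed="sum")
by_role["response_rate"] = by_role["completed"] / by_role["invited"]

print(f"Overall response rate: {overall_rate:.0%}")
print(by_role)
```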

Analysis: Upon survey completion, a summary with frequency tables for all survey questions will be provided. Results from the Partnership Survey will be presented in the final report as descriptive statistics and correlations. Crosstabs of survey responses by organization type and respondent role will also be produced.
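As an illustrative sketch of this descriptive output, the snippet below produces a frequency table for one survey item and a crosstab by organization type; the column names, response categories, and values are hypothetical:

```python
# Minimal sketch of frequency tables and crosstabs (hypothetical data).
import pandas as pd

responses = pd.DataFrame({
    "org_type": ["Grantee/intermediary", "Government partner",
                 "Service provider", "Government partner", "Investor/funder"],
    "role": ["Project coordinator", "Budget/finance",
             "Program director", "Legal", "Investor lead"],
    "q1_collaboration": ["Strongly agree", "Agree", "Agree",
                         "Neutral", "Strongly agree"],
})

# Frequency table (counts and percentages) for one survey question.
counts = responses["q1_collaboration"].value_counts()
percents = responses["q1_collaboration"].value_counts(normalize=True).mul(100).round(1)
print(pd.DataFrame({"count": counts, "percent": percents}))

# Crosstab of responses by organization type.
print(pd.crosstab(responses["org_type"], responses["q1_collaboration"]))
```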

Data Delivery: The annual data submission to HUD will consist of the cleaned survey data collected to date, de-identified with sites identified by a number (e.g., Site 1, Site 2) and respondents identified by role (e.g., investor, end payor). Metadata for each site will provide the file name, a complete description of the survey methodology, the dates fielded, the participant universe, and the response rate. The methodology description will include any measures taken to correct for nonresponse, if necessary. Files will be provided in both SAS and CSV formats. Submissions will also include a data dictionary, tables of frequencies, and copies of the survey instrument for each file.
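For illustration only, the sketch below shows one way the de-identification step might be carried out before delivery, assuming a raw extract that contains site names; the site names, column names, and output file name are hypothetical:

```python
# Illustrative de-identification step before data delivery (hypothetical data).
import pandas as pd

raw = pd.DataFrame({
    "site_name": ["City A", "County B", "City A"],
    "respondent_role": ["investor", "end payor", "service provider"],
    "q1_collaboration": [4, 5, 3],
})

# Replace site names with sequential numeric labels (Site 1, Site 2, ...).
site_labels = {name: f"Site {i + 1}"
               for i, name in enumerate(sorted(raw["site_name"].unique()))}
deidentified = (raw.assign(site=raw["site_name"].map(site_labels))
                   .drop(columns=["site_name"]))

# Deliver as CSV; the SAS file, data dictionary, and frequency tables
# would be produced separately.
deidentified.to_csv("partnership_survey_deidentified.csv", index=False)
```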

Other Notes: The Partnership Survey does not require sample selection or specialized sampling procedures because it will be fielded to the full universe of respondents. Administering this survey less frequently would affect the reliability of the data.

    • Statistical methodology for stratification and sample selection: N/A

    • Estimation procedure: N/A

    • Degree of accuracy needed for the purpose described in the justification: N/A

    • Unusual Problems requiring specialized sampling procedures: N/A

    • Any use of periodic (less frequent than annual) data collection cycles to reduce burden: N/A



Time Use Interviews

Overview: The Urban Institute will identify staff members at each organization who are able to report on time spent on PFS-related tasks by all other staff members. The identified staff member will typically be in an administrative role. Intermediary organizations may have staff members able to report on other organizations, for instance if they have responsibility for organizing high-level meetings, but those staff members are unlikely to have complete information. The research team will identify informants to cover each category/role across the organizations listed in Exhibit 1.

Consent: Through phone and email contact, identified staff members will be asked to participate in the time use interview. Prior to data collection, potential participants will receive a letter from HUD (Appendix A), followed by an email from the Urban Institute (Appendix B), encouraging their participation. Respondents will be informed that their participation is voluntary but important to the evaluation.

Interview administration: The interviews will be conducted once per quarter, at a time convenient for the informant, and are expected to take one hour or less. Time not captured through DRGR submissions will fall into two categories: 1) time spent by individuals who are covered by the grant but who are spending more time than is covered; and 2) time spent by individuals who are not covered by the grant.

If the organization does not submit time into DRGR, the research team will probe for an estimate of all time spent on PFS tasks in the previous quarter, by role. If the organization reports time spent through DRGR, the research team will investigate whether there were activities not covered in the DRGR report. Some time spent by high-level staff may be captured through attendance records at executive and steering committees. The research team will also explore with the representative at each organization whether there are other, simple ways to capture that time involvement.

In addition to the DRGR submissions, key informants will be asked during interviews about any other funding sources leveraged to support the PFS process, and relevant documents will be collected where possible.

Data Management and Storage: Access to survey data by Urban Institute staff will be password-controlled and limited to those staff involved in fielding the survey who have signed the confidentiality agreement. All sensitive data will be saved to an encrypted network drive, with access limited to Urban Institute staff with a need to work with raw data who have signed the confidentiality agreement. Access will only be available on-site, through password-protected computers.

Analysis: The research team will analyze the data annually and summarize the costs of each PFS stage as projects move through the lifecycle. By combining actual grant spending from DRGR with time estimates from staff informants and any information about additional funding, the team will develop descriptions of the overall time spent. These descriptions will be broken down by partner and by site, by how many people at what level are involved in working on each PFS project, and by what portion of PFS project time is and is not covered by the HUD grant. In addition, these time costs will be described by PFS lifecycle phase. Because different demonstration sites’ grants cover different lifecycle phases, data from different sites will be involved in the estimates for the different phases.
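As an illustrative sketch of this combination step, the snippet below merges DRGR-covered spending with interview-based estimates of uncovered effort and summarizes them by lifecycle phase; the phases, column names, and values are hypothetical:

```python
# Illustrative combination of DRGR-covered spending with interview-based
# estimates of uncovered hours, by PFS lifecycle phase (hypothetical data).
import pandas as pd

drgr_spending = pd.DataFrame({
    "site": ["Site 1", "Site 1", "Site 2"],
    "phase": ["Feasibility", "Transaction structuring", "Feasibility"],
    "covered_dollars": [25000, 40000, 30000],
})

interview_estimates = pd.DataFrame({
    "site": ["Site 1", "Site 1", "Site 2"],
    "phase": ["Feasibility", "Transaction structuring", "Feasibility"],
    "uncovered_hours": [200, 350, 150],
})

# Merge the two sources so each site/phase row shows covered spending
# alongside the estimated uncovered effort.
combined = drgr_spending.merge(interview_estimates, on=["site", "phase"], how="outer")

# Summarize covered spending and uncovered hours by lifecycle phase.
by_phase = combined.groupby("phase")[["covered_dollars", "uncovered_hours"]].sum()
print(by_phase)
```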

Data Delivery: The annual data submission to HUD will consist of tables showing the amount of time spent by role (averaged across sites) and by site (averaged across roles). Because there are too few people in any given role within a site, the submission will not include tables crossing these two dimensions. Files will be provided in CSV format.
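For illustration only, the sketch below produces the two delivery tables just described from a hypothetical time-use file; the column names, file names, and values are assumptions, not the actual deliverable specification:

```python
# Illustrative construction of the two time-use delivery tables (hypothetical data).
import pandas as pd

time_use = pd.DataFrame({
    "site": ["Site 1", "Site 1", "Site 2", "Site 2"],
    "role": ["Project coordinator", "Budget/finance",
             "Project coordinator", "Legal"],
    "quarterly_hours": [120, 40, 90, 25],
})

# Table 1: hours by role, averaged across sites.
hours_by_role = time_use.groupby("role")["quarterly_hours"].mean()

# Table 2: hours by site, averaged across roles.
hours_by_site = time_use.groupby("site")["quarterly_hours"].mean()

hours_by_role.to_csv("time_use_by_role.csv")
hours_by_site.to_csv("time_use_by_site.csv")
```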

Other Notes: The Time Use Interviews do not require sample selection or specialized sampling procedures because they will be conducted with the full universe of respondents. Conducting the interviews less frequently would affect the reliability of the data.

    • Statistical methodology for stratification and sample selection: N/A

    • Estimation procedure: N/A

    • Degree of accuracy needed for the purpose described in the justification: N/A

    • Unusual Problems requiring specialized sampling procedures: N/A

    • Any use of periodic (less frequent than annual) data collection cycles to reduce burden: N/A



  3. Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Partnership Survey

The proposed survey will be fielded to the full universe of respondents and will not be based on sampling. The accuracy and reliability of the survey data will depend on adequate response rates and data quality. The Urban Institute conducted webinars and other informational calls with grantees to introduce the concept of the Partnership Survey and solicit their feedback. This engagement prior to the survey period was intended to build buy-in among respondents and improve response rates. HUD will also send an introductory Respondent Contact Letter to all respondents at the start of the survey period to encourage participation (Appendix A).

For the Partnership Survey, the expected response rate is 70 percent. The study team will employ a variety of techniques to ensure the highest possible response rate, including: survey design that allows respondents to use various types of electronic devices to complete the survey; effective communication before the survey to prepare respondents for participation; assurance that only de-identified, aggregated data will be shared; survey reminders throughout the fielding period; ongoing response tracking; and email follow-up with non-responders. Reminders will be sent to key personnel who have not responded after one week and again after three weeks. Furthermore, if, after three weeks, particular categories of grantee types or stakeholder roles have not yet achieved the target response rate, stakeholders within those categories will receive phone call reminders. Completed surveys from the late-to-complete cohort will be compared with all other completed surveys to test for significant differences. The research team will share response rates with the grantees and the HUD GTR to encourage higher response rates.

To measure the extent of response bias for each survey, the research team will conduct two types of tests. First, it will stratify the sample by characteristics of the organization and role of the respondent and compare the characteristics of respondents to those of non-respondents. If the characteristics of respondents are significantly different from those of non-respondents, adjustments to the estimates through weighting may be used, and the results of the testing will be reported. The team will also use t-tests to compare respondents to non-respondents, using information from non-respondents who have participated in one of the other surveys. Because the entire population of interest, which is not large, will be surveyed, and because the expected pool of non-respondents is small, the team anticipates that efforts to increase response rates overall will be more effective than post-survey adjustments for producing reliable data.
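For illustration only, the sketch below shows the kind of two-sample t-test such a nonresponse check might use, assuming a frame-level file that records a response indicator plus a numeric characteristic known for respondents and non-respondents alike; all names and values are hypothetical:

```python
# Illustrative nonresponse bias check (hypothetical data).
import pandas as pd
from scipy import stats

frame = pd.DataFrame({
    "responded": [1, 1, 1, 0, 1, 0, 1, 1, 0, 1],
    "years_in_partnership": [3, 5, 2, 1, 4, 2, 6, 3, 1, 5],
})

respondents = frame.loc[frame["responded"] == 1, "years_in_partnership"]
nonrespondents = frame.loc[frame["responded"] == 0, "years_in_partnership"]

# Two-sample t-test comparing respondents to non-respondents; a significant
# difference would prompt weighting adjustments and would be reported.
t_stat, p_value = stats.ttest_ind(respondents, nonrespondents, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```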

Time Use Interviews

It is expected that the interviewees described in Part B Question 1, the grantees and partners making use of grant funds, will participate fully. Investor organizations may have a lower participation rate. To encourage participation, HUD will send an introductory Respondent Contact Letter to all selected informants at the start of the data collection (Appendix A).



  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The Urban Institute conducted a pretest of each survey in January and February 2017 with 8 individuals across sites who were active in PFS-related tasks and volunteered for the test. The objectives were to: (a) test each survey for wording, flow, and meaning; (b) verify the estimated time to complete the survey; and (c) conduct post-survey interviews with respondents to assess their interpretation of the questions and the reasoning behind their answers.

After administering the pretest survey, the Urban Institute conducted interviews to learn about survey fatigue, question clarity, and answerability. Feedback on formatting and content was solicited from all testers via email and by phone. Based on these interviews, Urban made several changes to the surveys and administration plans.

Changes made to the partnership survey based on tester feedback included the addition of a progress bar, the removal of ambiguity about whether the survey is asking about the PFS project or the broader community, and the addition of questions about the benefits of data infrastructure and sustainability.

Respondents reported that the web-based partnership survey took less than 15 minutes (timing in Qualtrics showed an average of 13 minutes) and that the weekly text and email surveys took less than 1 minute to complete. Please see Part A for more information on the changes that resulted from the pretest.

  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractors, grantees, or other person(s) who will actually collect or analyze the information for the agency.

    1. The agency responsible for receiving and approving contract deliverables is:

Office of Policy Development and Research, Program Evaluation Division

U.S. Department of Housing and Urban Development

451 Seventh St. SW

Washington, DC 20410

Person Responsible: Marina Myhre, Social Science Analyst/GTR, HUD (202-402-5705)

    2. The organization responsible for survey design, data collection, and data analysis is:

The Urban Institute

2100 M St. NW

Washington, DC 20037

Persons Responsible:

Mary Cunningham, Co-Principal Investigator, Urban Institute (202-261-5764)

Akiva Liberman, Co-Principal Investigator, Urban Institute (202-261-5704)

Christopher Hayes, Survey Lead, Urban Institute (202-261-5650)

Timothy Triplett, Senior Survey Advisor, Urban Institute (202-261-5579)



