
STEPPED AND TIERED RENT DEMONSTRATION EVALUATION







OMB INFORMATION COLLECTION REQUEST

REVISION OF CURRENTLY APPROVED COLLECTION







SUPPORTING STATEMENT

PART B





February 21, 2025



Submitted to:





U.S. Department of Housing and Urban Development

OMB Control # 2528-0339



B. Collections of Information Employing Statistical Methods


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The U.S. Department of Housing and Urban Development (HUD) seeks approval to collect additional data from public housing site staff and study participants for the Stepped and Tiered Rent Demonstration (STRD) Evaluation. This information collection request covers the research activities in the second phase of the project.

30-Month Follow-Up Survey Sampling and Target Population

From January 2023 to November 2024, eligible households in the 10 participating housing agencies were randomly assigned to either the stepped or tiered rent rules or the standard rent rules. The 30-month follow-up survey will be conducted with heads of households participating in the study to assess the effects of the alternative rent policies on key outcomes that cannot be captured with administrative records, such as material hardship and the program group’s perspective on the alternative rent policies. The survey data will also be used to estimate the effects of the alternative rent rules on employment, earnings, housing subsidy, and other key outcomes. MDRC’s subcontractor, Decision Information Resources, Inc. (DIR), will conduct web- and phone-based data collection for the participant survey. The follow-up survey will take approximately 15 minutes to complete and will be fielded to 8,000 participants (Exhibit B.1), evenly divided between the program and control groups. MDRC anticipates a 50 to 60 percent response rate without an in-person field effort, which would yield approximately 4,000 to 4,800 respondents. For burden estimates, we conservatively assume that all 8,000 sampled participants respond.
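The anticipated respondent range cited above follows directly from the fielded sample and the response-rate bounds:

\[
8{,}000 \times 0.50 = 4{,}000 \qquad\text{and}\qquad 8{,}000 \times 0.60 = 4{,}800.
\]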

Exhibit B.1: Sample Sizes by Site

| State | Housing Agency | Alternative Rent Rules | Standard Rent Rules | Total Households Enrolled in Study | Survey Fielded Sample |
| --- | --- | --- | --- | --- | --- |
| Tiered rent | | | | | |
| OH | Akron Metropolitan Housing Authority | 1,286 | 1,291 | 2,577 | 1,322 |
| WA | Everett Housing Authority | 292 | 308 | 600 | 308 |
| WV | Charleston-Kanawha Housing Authority | 547 | 545 | 1,092 | 560 |
| OR | Housing Authority of Washington County | 232 | 243 | 475 | 244 |
| TX | Houston Housing Authority | 1,535 | 1,518 | 3,053 | 1,566 |
| | Tiered Rent Total | 3,892 | 3,905 | 7,797 | 4,000 |
| Stepped rent | | | | | |
| NC | Housing Authority of the City of Asheville | 405 | 407 | 812 | 812 |
| IN | Fort Wayne Housing Authority | 561 | 554 | 1,115 | 1,115 |
| CA | Housing Authority of the County of Kern | 514 | 526 | 1,040 | 1,040 |
| VA | Portsmouth Redevelopment and Housing Authority | 210 | 212 | 422 | 422 |
| UT | Housing Connect (Housing Authority of the County of Salt Lake) | 298 | 265 | 563 | 563 |
| | Stepped Rent Total | 1,988 | 1,964 | 3,952 | 3,952 |




Staff Interviews


The data collection activities for Phase 2 also include a third round of interviews with public housing agency (PHA) staff via video conferencing to understand their implementation experiences with the alternative rent policies. MDRC will conduct this round of staff interviews at each of the 10 study sites, roughly three to six months after the first triennial recertification effective dates. We expect to conduct two group interviews at each site to understand each PHA’s experience administering triennial recertifications, gathering the required documents from families, applying the verification hierarchy, and calculating TTPs and hardship remedies, as well as the staff level of effort for implementation activities.

  • One round of group interviews will be conducted with two to four frontline staff at each of the 10 study sites (using the PHA Housing Specialist Group Implementation Interview Guide).

  • One round of group interviews will also be conducted with two to four directors or other senior leadership with oversight of MTW and STRD implementation and PHA strategy (using the PHA Program Director/Manager Group Interview Guide).

Cost Study Staff Interviews and Cost Checklist

The cost interview team will conduct interviews with PHA staff at each of the 10 PHAs to collect time-use data on actions related to the administration of the new rent policies. Prior to the interviews, managers and specialists will be asked to complete a cost checklist indicating the activities they have worked on in their current position; the checklist is estimated to take approximately six minutes to complete. Interviews will be conducted concurrently with the implementation interviews, three to six months after PHA staff begin conducting the first tiered rent triennial recertifications and stepped rent eligibility reviews. The team expects that two managers and two specialists from each site will join the group interviews.



  2. Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


30-Month Participant Survey

DIR will collect the follow-up survey data beginning with a push-to-web effort, followed by Computer-Assisted Telephone Interview (CATI) follow-up, across a targeted 14-week data collection window for each potential participant. Eligible households will be invited to participate via both United States Postal Service (USPS) mail and email (as available). Additional reminder postcards and emails will be sent to non-responders throughout the fielding period to locate respondents and maximize response rates.

Web Protocols. All sample members will be given two weeks to complete the survey via the web option before outbound CATI calls are initiated (the study team has an email or USPS address for every sample member). Sample members will receive an initial web invitation (via email and USPS mail) to participate in the self-administered survey. The invitation includes information about the follow-up survey, the respondent’s rights as a participant, contact information for DIR’s study-specific toll-free number, and instructions for accessing the online version of the survey, including the survey link, a unique respondent PIN, and a QR code (should the respondent choose to complete the survey on a mobile device). The invitation will also inform sample members that if they complete the survey by a specified date (two weeks after the start of the data collection wave), they will receive an early-bird incentive of $10 in addition to the $30 incentive we propose to offer all respondents. A reminder mailing will be sent to arrive approximately one week before the two-week web push cutoff date specified in the initial mailing. The self-administered web option will remain available to all respondents for the duration of their data collection period.

Telephone Interview Protocols. Sample members without a valid email or USPS address will become eligible for outbound dialing immediately upon the start of their survey window. Those who have valid addresses but do not complete the 30-month participant survey during the two-week web push will become eligible for outbound CATI dialing two weeks after survey launch.

After six weeks of outbound CATI dialing (eight weeks after survey launch), non-responders will receive a refusal conversion mailing to boost response rates as the end of the data collection period approaches. The mailing will offer an additional $10 if respondents complete the survey before the end of the data collection window, for a total of $40 in incentives. Prior to the last two weeks of data collection, respondents will receive a final refusal conversion mailing sent via priority mailer to increase the likelihood that it is opened. This mailing will include a $5 prepaid incentive. Sample members who receive this final refusal conversion mailing will also be offered the additional $10 for completing the survey before the end of the data collection window, for a total of $45. For consistency, sample members who receive the final refusal conversion mailing by email will also be offered a total of $45 for completing the survey; however, because a prepaid incentive cannot be sent via email, the additional offer for completing before the end of the data collection window will be $15 rather than $10.

The table below summarizes the proposed outreach and incentive offers across the data collection period.

Exhibit B.2. Proposed Respondent Outreach and Incentive Structure

| # | Outreach | Mode | USPS Type | Incentive Offer | Timing |
| --- | --- | --- | --- | --- | --- |
| 1 | Invitation with Early Bird Offer | USPS/Email | Letter | $10 + $30 ($40) | Launch |
| 2 | Early Bird Reminder | USPS/Email | Post Card | $10 + $30 ($40) | Week 1 |
| 3 | Reminder 1 | Email | N/A | $30 | Week 4 |
| 4 | Reminder 2 | Email | N/A | $30 | Week 6 |
| 5 | Refusal Conversion | USPS/Email | Post Card | $10 + $30 ($40) | Week 8 |
| 6 | Refusal Conversion Reminder | Email | N/A | $10 + $30 ($40) | Week 10 |
| 7 | Final Refusal Conversion | USPS | Priority Mail | $5 prepaid; $10 + $30 postpaid ($45) | Week 12 |
| 7 | Final Refusal Conversion | Email | N/A | $15 + $30 postpaid ($45) | Week 12 |
| 8 | Final Reminder | USPS/Email | Post Card | $15 + $30 postpaid ($45) | Week 13 |



As shown above, the proposed outreach will be sent via email, USPS mail, or both. A total of five pieces of outreach will be sent via USPS. The type of mailing will vary across pieces of outreach, increasing the likelihood that the respondent opens the outreach. A total of eight pieces of outreach will be sent via email, including versions of the five pieces sent via USPS. Because email outreach is less expensive than USPS, three additional pieces of outreach will be sent by email only. Using a combination of USPS and email further increases the likelihood of reaching the targeted respondents.
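To make the fielding logic concrete, the sketch below encodes the Exhibit B.2 schedule as data and looks up the next outreach step for a non-responder. This is a minimal illustration in Python; the Outreach structure and the next_outreach helper are hypothetical, not part of DIR’s actual survey management system.

```python
from dataclasses import dataclass

@dataclass
class Outreach:
    week: int          # weeks after survey launch (0 = launch)
    description: str
    mode: str          # "USPS", "Email", or "USPS/Email"
    total_offer: int   # total incentive ($) available at this stage

# Hypothetical encoding of Exhibit B.2 (field names are illustrative).
SCHEDULE = [
    Outreach(0,  "Invitation with Early Bird Offer", "USPS/Email", 40),
    Outreach(1,  "Early Bird Reminder",              "USPS/Email", 40),
    Outreach(4,  "Reminder 1",                       "Email",      30),
    Outreach(6,  "Reminder 2",                       "Email",      30),
    Outreach(8,  "Refusal Conversion",               "USPS/Email", 40),
    Outreach(10, "Refusal Conversion Reminder",      "Email",      40),
    Outreach(12, "Final Refusal Conversion",         "USPS/Email", 45),
    Outreach(13, "Final Reminder",                   "USPS/Email", 45),
]

def next_outreach(weeks_since_launch: int) -> Outreach | None:
    """Return the next scheduled outreach for a non-responder."""
    for step in SCHEDULE:
        if step.week >= weeks_since_launch:
            return step
    return None  # past week 13: the 14-week window is closing

print(next_outreach(5))  # -> Reminder 2 (Email, $30 offer) at week 6
```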


Staff Interviews

For the PHA staff interviews, MDRC will work with each PHA liaison to identify interview participants and the best timing for this round of interviews. Where we identify more staff performing a particular role than we can interview, we will give priority to those with the most experience administering the new rules or those who, based on our technical assistance work with the sites, we believe would have important perspectives to offer. This approach will allow us to gather the data needed to answer the research questions. We do not plan to draw statistical inferences from the staff interview data.

Cost Study Data Collection

The cost interview team will also conduct staff interviews to collect staff time-use data on administrative actions, including triennial recertifications, interim recertifications, and hardship requests, as well as activities that do not result in a formal action. The team will work with PHA liaisons to identify staff for interviews and to determine the best timing for this data collection effort to accommodate staff workload.

Data on staff wages, salaries, and fringe benefits will be collected from MTW Reports and other supporting documentation produced by the PHAs. The cost team will work with PHA staff to confirm the accuracy of the cost data. Additional personnel costs, including supervision and support for frontline staff activities, will be gathered from supplemental sources, including the Housing Choice Voucher Administrative Fee Study, and confirmed with PHA staff.


Impact Estimation Approach

This demonstration uses an experimental design that randomly assigns households within each participating PHA to either the alternative rent policy or to a control group that will continue to be subject to the traditional percent-of-income rent policy. The power of the experimental research design comes from the fact that, with an adequate sample size, random assignment ensures that the intervention and control groups will be similar in the distribution of observed and unobserved baseline and pre-baseline characteristics. Thus, post-baseline differences between the two groups can be interpreted as effects of the intervention. The basic estimation strategy used here is analogous to the methodology that MDRC and other social science researchers have used in social experiments over the last few decades to generate credible results. The analysis will compare average outcomes for the intervention and control groups, and it will use regression adjustments, controlling for a range of baseline characteristics, to increase the precision of the statistical estimates. In making these adjustments, an outcome, such as total earnings or total housing assistance payment (HAP), is regressed on an indicator for intervention group status and a range of other background characteristics. The following basic impact model would be used:

$Y_i = \alpha + \beta P_i + \delta X_i + \varepsilon_i$

where $Y_i$ is the outcome measure for sample member $i$; $P_i$ equals one for program (or intervention) group members and zero for control group members; $X_i$ is a set of background characteristics for sample member $i$; $\varepsilon_i$ is a random error term for sample member $i$; $\beta$ is the estimate of the impact of the program on the average value of the outcome; $\alpha$ is the intercept of the regression; and $\delta$ is the set of regression coefficients for the background characteristics.
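For concreteness, the sketch below shows how an impact model of this form could be estimated as an ordinary least squares regression. It is a minimal Python sketch using the statsmodels library and simulated data; the variable names (earnings, program, age, baseline_earnings) are illustrative, not the study’s actual field names.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500

# Simulated analysis file: one row per head of household.
df = pd.DataFrame({
    "program": rng.integers(0, 2, n),            # P_i: 1 = alternative rent, 0 = control
    "age": rng.integers(18, 65, n),              # example baseline characteristic
    "baseline_earnings": rng.normal(15_000, 6_000, n),
})
# Y_i: outcome measure (here, simulated follow-up earnings with a true
# impact of $1,000 built in for illustration).
df["earnings"] = 20_000 + 1_000 * df["program"] + rng.normal(0, 8_000, n)

# Regression-adjusted impact estimate: regress the outcome on the
# treatment indicator plus baseline covariates (the X_i in the model).
X = sm.add_constant(df[["program", "age", "baseline_earnings"]])
fit = sm.OLS(df["earnings"], X).fit()

# The coefficient on "program" is the estimated beta (average impact),
# reported with its standard error for significance testing.
print(fit.params["program"], fit.bse["program"])
```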

In estimating impacts on earnings and employment outcomes, separate estimates will be produced for the heads of households, other adults in the household, all adults combined, and the household (as defined at the time of random assignment). However, heads of households will be the primary unit of analysis for estimating confirmatory impacts, and for a fuller range of subgroup analyses and analyses of impacts on survey-based outcome measures. For housing outcomes, such as subsidy levels and use of homeless services, the unit of analysis is the household.


The main impact analysis will pool the samples across the cluster of PHAs that are implementing the same rent policy to estimate the effects of the alternative rent model for all those sites combined. Pooling increases the precision of impact estimates, which becomes especially relevant when estimating effects for subgroups of the full sample. The analysis will include Houston in the tiered rent cluster, even though that PHA implemented a modified tiered rent. The differences in the policy specification are minor, and it is unlikely that a differential effect for Houston could be clearly attributed to those differences rather than to other site-level factors. A sketch of the pooled specification appears below.
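The following minimal sketch, under the same assumptions as the previous example, adds site indicators to the basic model so that the program coefficient reflects within-site program/control contrasts averaged across the combined sample. The site labels and data are simulated for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2_000

# Simulated pooled sample across the five tiered rent sites
# (site labels are illustrative shorthand, not official names).
df = pd.DataFrame({
    "site": rng.choice(["Akron", "Everett", "Charleston", "WashCo", "Houston"], n),
    "program": rng.integers(0, 2, n),
    "baseline_earnings": rng.normal(15_000, 6_000, n),
})
df["earnings"] = 20_000 + 1_000 * df["program"] + rng.normal(0, 8_000, n)

# Site indicators, C(site), absorb site-level differences, so the pooled
# coefficient on "program" averages the within-site impact estimates.
pooled = smf.ols("earnings ~ program + baseline_earnings + C(site)",
                 data=df).fit()
print(pooled.params["program"], pooled.bse["program"])
```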


  3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


30-Month Participant Survey

In addition to using incentive payments to help maximize response, DIR will contact sample members with various outreach materials (e.g., advance letters, emailed invitations, and follow-up mailings) that convey the importance of the study and the contribution their participation will make to future rent reform policy. DIR will also maximize response by providing options for completing the survey (web and telephone) and by maintaining contact with the sample through various modes across many attempts. DIR will follow up with participants to encourage them to complete the survey via web with email and telephone invitations, as well as hardcopy mailings of invitation letters and reminders. DIR will translate the participant survey into two languages: Spanish and Arabic.1 These languages were identified during study enrollment as the two languages, besides English, most commonly preferred by study participants for communication from the study team. Respondents can easily switch languages in the web survey using on-screen toggle buttons. Over the telephone, Spanish-speaking and Arabic-speaking interviewers will be assigned to cases identified on a prior call as needing either Spanish or Arabic.

PHA Staff Interviews

Based on our experience, nonresponse has not been an issue for PHA staff interviews, even though staff participation in research interviews is always voluntary. In previous rounds of staff interviews for this study, and in similar projects, the MDRC team has been successful in scheduling and conducting interviews with the individuals identified, including executive staff and directors, managers and supervisors, and frontline specialists.

MDRC will give the PHAs sufficient advance notification before beginning these interviews and will request their assistance in scheduling interviews. In the 6-8 weeks leading up to the interviews, MDRC will connect with PHA liaisons and identify a pool of staff for these interviews. As noted above, if we identify more staff performing a particular role than we can interview, we will give priority to those with the most experience administering the new rules, or those who, based on our technical assistance work with the sites, we believe would have especially important perspectives to offer. Once the list is finalized, MDRC will conduct the necessary outreach and provide selected staff with relevant background information on the goals of the interview, its duration, and how it will be conducted (in person, individual or group, phone or video conference). Staff will be informed that no special preparation will be needed for the interview.

For staff unable to participate in a scheduled interview because of an unforeseen situation or a scheduling conflict, MDRC will attempt to reschedule the interview by phone or video conference.

Cost Study Questionnaire

To maximize response rates, the team will ensure the questionnaire is concise, user-friendly, and accessible. The team will also send personalized follow-up reminders to encourage participation. To address non-response and confirm accuracy, the team will conduct follow-up interviews with PHA staff and review data submitted on the questionnaire.


  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


30-Month Participant Survey

Many of the items in the proposed long-term follow-up survey have been successfully administered to low-income households in similar large-scale studies, such as the Rent Reform Demonstration and the Family Self-Sufficiency study. However, because the proposed survey is a compilation of items from multiple sources, a pilot test will be conducted with up to nine members of the tiered and stepped rent groups using DIR’s CATI call center to confirm survey length, the flow of the script, and the alignment of response options with the answers provided by participants.

Staff Interviews

The proposed qualitative data collection protocols (which are designed to serve as discussion guides) adapt and build on the protocols used in previous rounds of field research conducted for the STRD evaluation. The research team has therefore already tested how the types of questions used in the staff interview protocols work in the field and does not anticipate needing advance protocol testing. MDRC will prepare training materials and train experienced interviewers to conduct the staff interviews. This training will be conducted in the weeks leading up to the interviews with PHA staff.

  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



HUD has contracted with MDRC and its partners: Decision Information Resources, Inc. (DIR), Barbara Fink, and David Long (independent consultants). The 30-Month Participant Survey instrument, the PHA staff interview protocols, and the questionnaire draw on similar data collection tools used for other HUD-commissioned evaluations that MDRC has led. The HUD Contracting Officer’s Representative (COR) reviewed the household follow-up survey instrument, PHA staff interview guides, and the PHA staff cost questionnaire, and had them reviewed by other subject matter experts at HUD. If there are any questions about this submission, please call the HUD COR, Paul Joice (312-913-8597); the MDRC co-Principal Investigator and Project Manager, Nina Castells (212-340-7605); or the co-Principal Investigator, Nandita Verma (212-340-8849). MDRC consulted Charles Michalopoulos, Howard Bloom, and Cynthia Miller on the statistical aspects of the study design. In addition, the following team members contributed to the development of the data collection tools: Joshua Vermette, Data Manager (212-340-4451); Jonathan Bigelow, Implementation Researcher (212-340-8646); Keith Olejniczak, Technical Assistance Lead (212-340-2306); Barbara Fink, Consultant (267-992-7000); and David Long, Consultant (609-865-4705).



1 Study participants were asked at study enrollment what their preferred language was for receiving communication materials from the study team. 97.8% said English, 0.7% said Spanish, and 0.3% said Arabic. Of the remaining 1.2%, 0.9% named a different language as their preferred language for communication (26 additional languages were named, though none exceeded 0.2% of participants). The remaining 0.3% of study participants did not respond to this question.




