Supporting Statement for Paperwork Reduction Act Submission

Rent Reform Demonstration: Long-Term Follow-Up Survey

OMB # 2528-0306


A. Justification


This supporting statement provides information on the next phase of data collection activities associated with the Rent Reform Demonstration evaluation administered by the U.S. Department of Housing and Urban Development (HUD). The alternative rent policy was developed by a design team, which includes: MDRC and its subcontractors at the Urban Institute, Quadel Consulting, the Bronner Group, and independent and academic consultants; HUD; and participating housing agencies. MDRC is leading the demonstration evaluation effort.

This Supporting Statement provides information on the proposed long-term follow-up survey of participants enrolled in the Rent Reform Demonstration. It builds upon the baseline data collected under Task Order 1 (TO1) and the collection of staff and participant interviews, administrative records, cost data, and administrative data authorized under Task Order 2 (TO2) under the Office of Management and Budget (OMB) control number 2528-0306, expiration date 10/31/2018.


  1. Explain the circumstances that make the collection of information necessary.


The primary goal of the Rent Reform Demonstration is to test the effects of an alternative rent policy on voucher holders and the housing authorities that issue their vouchers. HUD’s Office of Policy Development and Research (PD&R) awarded a contract to MDRC in September 2012 to design and implement the Rent Reform Demonstration. While design and implementation took longer than anticipated, over 6,600 Housing Choice Voucher (HCV) program participants were enrolled in the Demonstration from June 2015 through March 2016 (TO1). A baseline survey was administered at the HCV participants’ recertification just after random assignment. While the original Demonstration called for a long-term follow-up survey to occur as part of TO2, PD&R did not have the funds to award the survey contract until March 2018. (A new contract was awarded to Decision Information Resources, Inc., to conduct the long-term follow-up survey, with a subsequent contract awarded to MDRC to conduct the comprehensive impact analysis of the long-term follow-up survey and administrative data.) While MDRC is currently conducting the short-term analysis and will be conducting the long-term impact analysis in the next year using administrative data and a small number of study participant interviews, the long-term follow-up survey is necessary to contextualize any findings from the administrative data.



1.1 Overview of the Rent Reform Demonstration


Features of the Alternative Rent Model


The MDRC team consulted extensively with HUD program and research staff, advocacy organizations, Moving to Work (MTW) Public Housing Agencies (PHAs), and consulting organizations that support the activities of PHAs to develop the alternative rent model. The resulting framework includes several fundamental features while leaving some room for PHA discretion in adapting those features to local conditions.


The alternative rent policy applies only to Housing Choice Voucher (HCV) program recipients1 and includes the following key features:


  • Simplifying the calculation of the household’s total tenant payment (TTP) and subsidy amount (an illustrative calculation is sketched after this list) by:


    1. Eliminating deductions and allowances,

    2. Changing the percent of income that a household pays for its share, from 30 percent of adjusted income to 28 percent of gross income,

    3. Ignoring a household’s income from assets when the total value of its assets is $25,000 or less, and

    4. Simplifying the policy for determining utility allowances.


  • Using retrospective income in setting a household’s TTP and housing subsidy (to discourage intentional reductions in income)

  • Establishing a minimum TTP of at least $50 and requiring that all households pay a minimum amount of rent directly to the landlord, to mirror the landlord-tenant relationship in the non-subsidized rental market


  • Conducting income recertifications triennially rather than annually (or biennially, as in the case of one PHA), so that earnings gains do not increase TTP for three years (thus creating a strong work incentive by eliminating, for an extended period, the implicit housing-subsidy-related “tax” on increased earnings)


  • Limiting household-requested interim recertifications to a maximum of one per year, to protect households when their income drops while limiting the burden to the housing agency


  • Establishing a suitable hardship policy that identifies a standard set of hardship conditions and remedies to protect households from excessive rent burdens.
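
To make these features concrete, the sketch below illustrates how the simplified TTP calculation referenced in the first bullet could combine the 28-percent-of-gross-income rate, the $25,000 asset-income threshold, and the $50 minimum TTP listed above. It is a minimal illustration only: the function and variable names are hypothetical, and real PHA calculations also involve utility allowances, payment standards, and hardship remedies that are not modeled here.

```python
def alternative_ttp(retrospective_gross_income, asset_income, total_assets):
    """Illustrative monthly total tenant payment (TTP) under the alternative policy.

    All inputs are annual dollar amounts. This is a simplified sketch, not the
    PHAs' actual subsidy calculation.
    """
    countable_income = retrospective_gross_income
    if total_assets > 25_000:          # asset income is ignored when assets are $25,000 or less
        countable_income += asset_income
    monthly_ttp = 0.28 * countable_income / 12   # 28 percent of gross income, no deductions
    return max(monthly_ttp, 50.0)                # minimum TTP of $50


# Example: $15,000 in retrospective annual earnings and negligible assets gives
# 0.28 * 15,000 / 12 = $350 per month under this simplified rule.
print(alternative_ttp(15_000, 0, 1_000))  # 350.0
```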


To increase the likelihood that the alternative rent policy encourages tenants to increase their work efforts, it is important to clearly and periodically inform them of the implicit incentives associated with not having to report any earnings gains for three years. They must also understand the safeguards in place to protect them from excessive rent burdens if their incomes fall. The MDRC team, as part of its technical assistance role, has helped housing authorities develop appropriate materials and strategies for communicating these incentives and safeguards.



Eligibility


The alternative rent policy applies only to HCV recipients. Eligible sample members include voucher holders with vouchers that are administered under the MTW demonstration. Non-MTW vouchers (i.e., Veterans Affairs Supportive Housing, Moderate Rehabilitation, and Shelter Plus Care), Enhanced Vouchers, and Project-Based Vouchers were excluded from the study, as were households that have ported out. Additionally, the study is focused on work-able populations and does not include elderly households, disabled households, households that will become elderly during the course of the long-term study, or households in which at least one member does not have legal status in the U.S. Households receiving a child care deduction at the time of random assignment were also excluded from the study, as were households participating in the PHA’s Family Self-Sufficiency program, homeownership programs, or any of the PHA’s special programs with partner agencies. Lastly, households with a Housing Assistance Payment (HAP) of $0 at the time of random assignment were not included in the study.


Evaluation design and components


The demonstration is using a randomized controlled experiment to compare the current rent subsidy policy for HCVs to an alternative rent policy. Four MTW PHAs are participating in the study. The demonstration is being guided by a comprehensive research agenda structured around three study components: impacts, implementation processes, and cost analysis.


In order to evaluate the impacts of the alternative rent policy, all households that met the eligibility requirements for the demonstration were randomly assigned to either a new policy group or an existing rent policy group (i.e., the control group). Households did not have an opportunity to switch their assignment from one rent policy group to the other. This approach is consistent with the MTW Demonstration’s policy of authorizing PHAs to implement and test innovative rent policies to try to help voucher holders become self-sufficient and to reduce administrative costs. When MTW PHAs that are not part of the Rent Reform Demonstration have implemented their own rent reforms, they have not been expected to limit these new policies to tenants who volunteer for them.

A research design that includes the broader eligible population fits the compelling need for HUD and Congress to understand the effectiveness of the new rent policy for the full eligible population, not just for a subset of volunteers recruited for a special demonstration project. For a variety of reasons, volunteers may not adequately represent the full eligible population. The new rent policy itself offered all tenants the possibility of becoming economically better off, while also including a number of safeguards intended to help prevent them from becoming economically worse off. All tenants were given an opportunity to withdraw from having their personally identifiable data shared with the researchers if they wished that it not be disclosed.


The evaluation plan includes an exception for one of its sites, the Louisville Metropolitan Housing Authority (LMHA). Households in this PHA that were randomly assigned to the new policy were given the option to opt out of the new rent policy group in addition to withdrawing from having their data disclosed to the researchers. HUD agreed to this exception to be responsive to concerns raised by LMHA and the local community. Twenty-two percent of families chose to opt out of the new policy in Louisville.


The study will examine the effects of the alternative policy from two perspectives: that of the housing agencies and that of the voucher holders.


A. Research questions concerning PHAs

The goal of reducing the burden and costs that PHAs incur in administering the current rent rules is one of the primary motivations in rent reform. Indeed, it was a major reason why the MTW PHAs wanted to join MTW in the first place. As such, the demonstration will assess to what extent the alternative rent model actually simplifies the administration of rent subsidies, and improves PHA finances, without placing undue burdens on residents. Related goals concern PHAs’ ability to stretch their budgets to serve more residents in need of housing assistance, such as by reducing average subsidy levels and the duration of subsidy receipt.


The evaluation will address these important issues in the following ways:


  • Document the alternative rent model as implemented in practice. The alternative rent model and implementation strategy were developed and implemented at the four participating PHAs (District of Columbia Housing Authority, Lexington Housing Authority, LMHA, and San Antonio Housing Authority), including any adaptations of the model to their local circumstances. The PHAs’ experiences in operating the model varied, given their different administrative systems, organizational capacities, and local contexts, but the main features of the model and implementation were consistent across all of the PHAs.


Three components of the evaluation will document how the new rent policy is operationalized in each PHA: (1) research on the implementation of the model; (2) ongoing technical assistance and monitoring efforts that began as part of TO1 and continued into TO2; and (3) a cost study. Data collected through these methods (which combine direct observations, interviews, and more standardized measurement) will be used to compute the costs of operating the rent systems and to determine whether, and by how much, the alternative system yields cost savings. The data collection will also include explorations of the choices PHAs make, for example, with regard to hardship exemptions, and the procedures they establish to approve or deny exemption requests. The research will document the kinds of changes in administrative processes, including data management systems and software, income verification procedures, and staff deployment, all of which are crucial for understanding whether the new policies simplify or complicate rent administration for the PHAs, and whether strategies adopted by some PHAs are more efficient and accurate than others and worthy of emulation. This documentation began under TO1 and continues under TO2, using the information we are collecting through our technical assistance to help PHAs implement the new policies and set up random assignment. Future data collection efforts are expected to generate richer data for systematic comparisons of site implementation experiences and practices.

  • Measuring effects on tenant turnover and the availability of vouchers. Changes in the rent rules could change tenant turnover in a number of ways. First, they may increase earnings and income and, in turn, increase or hasten exits from the voucher program. Second, and in contrast, the new rent policy could reduce tenant turnover, either because more voucher holders come to view voucher receipt as more attractive relative to unsubsidized housing on the private market than they otherwise would, or because the triennial recertification reduces household contact with the PHA. As described later, we will compare lengths of stay and reasons for exit for the program and control groups, and we will try to discern the relative influence of different factors on that impact.


  • Effects on tenants’ housing-related hardships. Changes in the rent rules affect tenants’ rent burden and thus their likelihood of being evicted or having their utilities shut off. For example, families at the lower end of the income distribution may strain to afford a high minimum rent, or those with higher incomes may fall into arrears if their income drops, unless adequate hardship protections are included in the rent policy. As described below, we will assess the effects on housing hardships and rent burden in the impact analysis, comparing rates of several dimensions of material hardship and also eviction rates for families in the program and control groups. We will do this for all families and for certain subgroups of families thought to be most at risk for these hardships.


  • Effects on PHAs’ costs and ability to serve more eligible families. The effect of rent reform on tenant turnover is a key question because it concerns not only the well-being of families with vouchers but also the number of families PHAs can afford to serve with a given budget. In theory, a new rent policy may allow PHAs to stretch their budgets and fund subsidies for more families by causing tenants’ earnings to grow and/or, more directly, by reducing the amount of subsidy offered over time.


Alternatively, some policies could end up increasing rather than reducing PHAs’ costs. That might be the case, for example, if the policy causes some tenants to work less than they might have otherwise (e.g., because of a fear of the implications of the loss of deductions such as child care), despite the fact that they have a greater economic incentive to work in the following three years, while having no effect on other tenants’ work rates. If that is the case, the aggregate amount of tenant contributions (relative to the 30-percent rule) would be lower at the end of the three-year rent-freeze period and heading into the subsequent period, thus costing the PHA more in subsidies. Although this result is not anticipated, it cannot be ruled out. The evaluation will determine whether it in fact occurs, or whether, as hoped, the new policy produces increases in tenant work effort. Whether or not a new policy is budget-neutral, and whether it achieves the broader goals of rent reform, will depend to a very important extent on the changes it causes in participants’ labor market and housing decisions.


  • Administrative reforms and PHA cost savings. The alternative rent policy should reduce the administrative burden on the housing agency because it should be simpler to administer and require fewer tenant-staff interactions. This should produce administrative cost savings. It will thus be important to document how the changes in rent policy affect a variety of administrative processes and the extent to which there is any offsetting increase in the administrative burden of dealing with hardship cases under the new policy.


  • Assessing administrative efficiencies with an eye toward “scaling up.” In any demonstration project, one must be concerned that the ways of operating a program or policy as a special research initiative may not mirror the ways it would operate as a scaled-up policy. In the Rent Reform Demonstration, the PHAs will be required to operate dual rent systems – the current income-based system plus an alternative system. At the very least, this means that the PHAs will not be able to achieve the same efficiencies and economies of scale with the hybrid system as they could if the new policy were operated at full scale for all voucher recipients.


Although it is not possible to avoid this problem in the context of the demonstration, we propose to include as part of a longer-term implementation study an assessment of where further operational efficiencies could be achieved if the rent policy were implemented at scale. For example, we would look for ways in which everything from staff deployment to information systems could be modified or consolidated if the new policy were adopted wholesale.


B. Research questions concerning individuals and families


A premise behind many rent reform proposals is that the reforms will benefit assisted families as well as PHAs. Thus, drawing on the available data sources (which now include a participant survey), the evaluation is designed to assess whether this is true, using the following approach:


  • Assessing voucher recipients’ understanding of rent reform incentives. Tenants’ understanding of the new model and its implicit incentives will inform how they make labor market and housing choices. Using qualitative research methods, as well as survey data, the evaluation will explore whether tenants understand the new rules, and the “frames” they use in interpreting them, such as whether they believe that “extra work is penalized.” The MDRC team will conduct a small number of interviews to get a read on voucher recipients’ understanding of these issues, while the long-term follow-up survey will ask program group participants about their awareness and understanding of the triennial recertification feature, minimum rent, restrictions on interims, hardship policies, and other safeguards. Participants will also be asked about their perspective on the fairness of the minimum rent policy and their overall preference for the new rent rules or the existing rules.


  • Measuring tenant outcomes: We identify the following clusters of tenant outcomes.


Household composition and structure: To explore effects on household composition and structure, the analysis would rely on information collected about all household members, including names, ages, employment status (if appropriate), and relationship to the head of household through the HUD 50058 form, as well as survey data that will include information about who is currently living in the household.


Work behaviors: Unemployment Insurance wage records will include information on employment and earnings. Survey data will include information on education and training, including education or certifications obtained or in process, as well as the respondents’ current or recent job experience in terms of hours, wages, type of work, and associated employer-provided benefits. If the respondent is not employed, we will ask about barriers to employment (e.g., child care, transportation) and efforts to obtain employment.


Income, assets, and rent burden: If rent reform increases tenants’ disposable income, it may help them accumulate assets. Data on income (from housing agency data), rent, and utility payments would be used to construct measures of rent burden. Survey questions will collect total household income and the benefits that members of the household received (unemployment insurance wage records capture only the earnings of the head of household and other individual adults, not total household earnings). In addition, we will ask questions related to financial hardship, material and food security, savings, debt, credit scores, and taxes. This section of the questionnaire will include items about health coverage, whether funded by employers or other private sources or by government programs like Medicaid.


Homelessness: We will measure effects on homelessness using Homeless Management Information Systems (HMIS) data.


Other government benefits: Temporary Assistance for Needy Families (TANF) and Supplemental Nutrition Assistance Program (SNAP) data will be used to examine effects on other subsidy receipt and amounts, since changes in the receipt of these public benefits may flow from any impacts that rent reform has on tenants’ earnings.


Voucher use: Using the HUD 50058 data (the form that HUD uses to collect data on the families that receive housing assistance from the PHAs that administer the housing assistance programs serving them), the study would examine the effects of the alternative rent policy on the duration of voucher receipt and reasons for exit. The survey will ask about residential moves since study enrollment, reasons for those moves, HCV exits, and satisfaction with housing and neighborhood. In addition, we will ask questions about experiences with PHAs, landlords, and evictions.


Knowledge and perceptions of rent rules: In-depth qualitative data would be used to examine voucher recipients’ perceptions, understanding, and awareness of the rent rules, and their attitudes toward the PHA and frontline staff. A section of the long-term follow-up survey focusing on program experience will only be asked of program group respondents. It will have questions unique to participation in the Rent Reform Demonstration.


C. Counterfactual and service context


Knowing not only the control group’s outcomes, but also the “treatment” they receive is crucial to making sense of impact findings in any random assignment trial. Put differently, what a randomized trial tests is not simply the effects of an intervention for the program group, but, rather, the difference in treatment between the program and control groups, which is what produces the difference in outcomes (or program “impacts” or effects) between those two groups. This is critical because it will influence what the experiment actually tests.


For this demonstration, the extent to which the MTW sites had already implemented some features of rent reform was a subject of initial exploration. Our scan of MTW plans and reports indicated that MTW PHAs have taken some steps in this direction, and some have moved quite far down the road. Thus, the benchmark against which the sites’ impacts would be judged differs, making the intervention (the alternative rent policy) a bigger change or a qualitatively different change in some sites than in others (to limit this problem, we have not selected sites that have already instituted wholesale rent reform). For instance, we know that in the sites we have selected, the counterfactual will differ in terms of the length of the current recertification period: some are annual, but one is already biennial.


The control group benchmarks could also differ across sites in terms of the intensity of services available to study participants. For instance, the control group might receive more employment services in certain sites simply because those sites are located in “service-rich” environments, which might influence the effectiveness of the incentives. These circumstances of the control group will have very important implications for interpreting the impact findings and drawing lessons for policy.


The MDRC team would address this issue as part of its implementation research, drawing on information obtained in the PHA site selection discussions and visits in which the alternative rent rules are discussed as part of TO1; through later interviews with PHA staff; through PHA data on participation in self-sufficiency programs (where appropriate); and through the participant interviews, which would ask respondents about the extent to which they receive relevant work-related services.


D. Measuring the effects on residential mobility


Rent reform may affect the residential choices that voucher holders make. For example, because the tenants’ rent share will remain fixed for each three-year recertification period, some tenants may choose to move to higher-cost apartments or neighborhoods if they increase their earnings and keep more of their extra income.


For voucher holders, we would rely on HUD 50058 data to track overall mobility rates for treatment and control groups.


  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


How will the information be used?


The findings from the study will be used to inform the Federal government, public housing agencies, and other stakeholders about the effectiveness of the alternative rent policy. Specifically, the follow-up survey will document and contextualize administrative data findings related to employment, earnings, and hardship and study participants’ experience with the demonstration.


Who will collect the information?


Under TO1, baseline data were collected through interviews administered by housing agency staff during recertification and from housing agency or HUD data. Study participants were also provided with a brief information sheet that ensured that they understood the demonstration, as well as their role and rights within the study. This form was approved under the study’s original submission to OMB under TO1. Under TO2, the longer-term evaluation collected follow-up data through interviews with key informants and access to housing agency/HUD data and other administrative records (employment records, for example). As with the baseline data, and in an effort to limit the data collection burden on participants, the participant interviews focused on information that is not readily available in existing data sources (or not available in the format required for the evaluation). Under a new contract, Decision Information Resources, Inc. will collect data through a long-term follow-up survey (Appendix A).


Informed consent to participate in the long-term follow-up survey will be gathered in the opening section of the survey. It will ensure that participants: 1) understand the Rent Reform Demonstration evaluation, as well as their role and rights within the study; and 2) provide their consent to participate in the long-term follow-up survey. To ensure that all survey participants receive a clear, consistent explanation of the project, the language will be clear, plainly written, and will inform them that participation is voluntary and that strict rules are in place to protect sample members’ privacy.


The long-term follow-up survey is structured to cover the following topics:

  • Education and Training

  • Job History, Work Search, and Barriers to Employment

  • Household Composition and Childcare

  • Household Income, Material Hardship & Food Security

  • Housing and Moving

  • Program Experience


Data collected at baseline were analyzed and presented in the baseline report on the demonstration, Reducing Work Disincentives in the Housing Choice Voucher Program: Rent Reform Demonstration Baseline Report2, published by HUD in October 2017. This report is the first of several that will be issued over the course of the project. Its purpose was to establish a foundation for future assessments of the implementation, impacts, and costs of the new rent policy. It describes the new policy, the rationale behind each of its critical elements, and the manner in which it is being evaluated. It also describes the process for identifying and enrolling families into the study, the background characteristics of those families, the amounts the families have begun paying for their rent and utilities under the new rent rules compared with the existing rules, and the housing subsidies they initially received. Future reports (expected to be released in 2019 and 2020) will examine the PHAs’ implementation experiences; the relative burden of the new policy on PHAs and the costs they incurred to administer it; the policy’s effects on families’ contributions toward their rent and utilities; and its effects on families’ employment, earnings, and receipt of housing subsidies and other government benefits.


  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology.

The long-term follow-up survey will feature a multi-mode approach, using both a self-administered web-based option and a computer-assisted telephone interview (CATI) option.


  4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in item 2 above.

The information collection will not duplicate information that is already available. Where possible, the evaluation will use available data sources, such as tenant data reported by PHAs to HUD through the Inventory Management System/PIH Information Center (IMS/PIC) system.


  5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

All survey respondents will be individuals. We do not anticipate that this study will burden small businesses.


  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing the burden.

This evaluation represents an important opportunity for the Federal government to build a body of knowledge about the effects of an alternative rent policy. This is consistent with the Administration’s strong focus on evidence-based policymaking. If this study is not conducted and the data not collected, analyzed, reported, and disseminated, Federal and local program or policy decisions will not be informed by high quality evidence upon which to base critical decisions regarding future rent policy. Without the long-term follow-up survey data, the Department’s understanding of the impacts of the rent reform model would be limited to what can be gleaned from administrative data alone and would lack the richness and context that follow-up survey data provides.


  7. Explain any special circumstances that would cause an information collection to be conducted in a manner:


The proposed data collection activities are consistent with the guidelines set forth in 5 CFR 1320 (Controlling Paperwork Burdens on the Public). There are no special circumstances that require deviation from these guidelines.


  • Under this ICR, HUD will not conduct any data collection requiring respondents to report information to the agency more often than quarterly;

  • Under this ICR, HUD will not conduct any data collection requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • Under this ICR, HUD will not conduct any data collection requiring respondents to submit more than an original and two copies of any document;

  • Under this ICR, HUD will not conduct any data collection requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

  • Under this ICR, HUD will not conduct any data collection in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • Under this ICR, HUD will not conduct any data collection requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • Under this ICR, HUD will not conduct any data collection that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • Under this ICR, HUD will not conduct any data collection requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB.

The notice 83 FR 3178 was posted in the Federal Register on January 23, 2018. The Federal Register Notice appeared on pages 3178 and 3179. No comments on the collection have been received.


The instrument will be pretested with up to nine study participants prior to OMB approval in order to obtain feedback on the clarity of the instructions, the wording of the questions, the skip patterns, and the data elements that will be collected through the survey instrument.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

As a token of appreciation, Decision Information Resources (DIR) will offer an incentive of $50 to households who participate in the long-term follow-up survey. To enhance response rates, we will offer an additional $10 as either an “early bird” incentive (for those who complete within the first 2 weeks after survey launch) or as an increased incentive at the tail end of the data collection window for any outstanding nonrespondents.

The respondent payments and justification for each proposed instrument are outlined below.

  • Pre-Launch Incentives—Before the launch of the survey, all Rent Reform Demonstration sample members will receive a $2 cash incentive, mailed via USPS with a tracking flyer (Appendix B), to encourage them to provide updated contact information so that we can reach them when the survey is released. The flyer will offer three ways to provide updated contact information: a web link, a phone number, and an email address. This will be our first request for updated contact information from the sample since they completed the Baseline Information Form, which by then will have been more than 40 months earlier. As such, this $2 cash incentive is critical both to motivate them to respond to this request and to set the stage for all other contacts from the survey firm (DIR).

  • Long-Term Survey Incentives—Rent Reform Demonstration respondents who agree to participate in the long-term follow-up survey will receive a $50 cash incentive. This amount is based on the length of the survey (30 minutes) and the amount of time since random assignment (approximately 42 months). Survey respondents who complete via the web option within two weeks of the start of their survey window will be given a $10 ‘early bird’ incentive, in addition to the $50, for a total of $60. This option is included to encourage completion via the web because it is a more cost-efficient option (compared to the CATI and field-initiated options). Early-bird incentives have been shown to be effective in boosting initial response rates and thus reducing costs, as fewer cases require phone and field follow-up3. This approach was tested in the 12-month follow-up survey in the YouthBuild evaluation and implemented in subsequent waves of follow-up surveys (OMB #1205-0503). The $50 incentive for completing the survey regardless of timing is similar to amounts previously approved by OMB in past studies such as the Subsidized Transitional Employment Demonstration 12-month and 30-month follow-up surveys ($40 and $50, respectively; OMB #0970-0413) and the Choice Neighborhoods Initiative baseline survey ($50; OMB #2528-0286). A $50 incentive was also used for the baseline survey administered in the foundation-funded Housing Opportunities and Services Together (HOST) study, which targeted a similar population. Both MDRC’s and DIR’s prior experience interviewing similar populations indicates that these incentive amounts are necessary to reach response rate goals.

The purpose of the incentives is to motivate participant cooperation and ultimately improve response rates, a strategy that has been empirically evaluated and supported4. Offering to pay respondents for their time will decrease the likelihood of refusal and, at the same time, increase the likelihood that sample members stay engaged. Incentives are also a token of appreciation that acknowledges the time burden of participating in the survey. These proposed incentives will be used in conjunction with the planned techniques DIR will employ to improve response rates with the hard-to-reach sample included in the Rent Reform Demonstration. Studies suggest that pre-incentives yield higher response rates compared to promised/contingent incentives5. There is also evidence that a small prepaid incentive given as a “token of appreciation” encourages responsiveness6.


Further, our experience with low-income populations engaged in long-term evaluation efforts, and in particular those with a control group, suggests that incentives are required to reach response rate goals. Singer et al.7 conducted a meta-analysis and found that incentives in methodologies like those planned for the Rent Reform Demonstration (telephone and in-person follow-up) were effective at increasing response rates among underrepresented demographic groups, such as low-income and non-white individuals.8 As noted above, several OMB-approved studies with similar samples and similar methodologies have included incentive levels similar to what we are requesting clearance for in this submission, although we note that the survey lengths are not exactly the same. In addition, by the time the Rent Reform Demonstration survey launches, approximately 40 months will have passed since the last contact with sample members, and the sample was randomly assigned to the program or control group before members were even aware of the study (although they received a study information sheet at that time). There is likely little to no awareness of the upcoming survey among the control group, and minimal expectation among the program group. These factors make it critical to receive approval for the proposed incentive levels to ensure response rates high enough to yield representative results.


  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.



The information requested under this collection is protected and held confidential in accordance with 42 U.S.C. 1306, 20 CFR 401 and 402, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130.

The survey contractor, DIR, will protect the confidentiality of the data it collects to the extent provided by law through its regular high-security safeguards and practices. All respondents will be informed that any personal information they provide in the survey will be used only for the purpose of this study. Individuals will not be identified in prepared reports. All research staff working on the project have been trained to protect confidential information and have signed a pledge stating that they will keep all information gathered confidential to the extent provided by law. All papers that contain study participant names or other identifying information will be kept in locked areas and any computer documents containing identifying information will be protected with a password.


DIR understands the critical issue of data security and protection. DIR conducts threat analyses, maintains protection plans, and performs backups on a regular basis to safeguard against and prepare for potential threats and harm that could arise from power failures, fire, hurricanes, flooding, piracy, and information hacking. DIR operates under National Institute of Standards and Technology (NIST) guidelines, and our business continuity plan has adopted these guidelines for data protection and recovery. DIR’s data protection strategies include:

  • A secure server room with increased power input

  • A coded-card access system that identifies personnel with access to the server room; limited building access off hours; and locks on all doors and building facilities

  • Procedures for data backup and recovery, including an off-site redundant server


Upon completion of the study, DIR will contract with an appropriate facility for the destruction of study data. All data—hardcopy and electronic—will be stored in secure archive facilities according to the time period specified in the contract. At the expiration of this specified time period, data will be destroyed with the approval of our HUD Contracting Officer’s Technical Representative (COTR). The contracted facility will be responsible for the secure and certified shredding of all hardcopy documents collected during the study and for the eradication of electronic data.


  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.

The data collection proposed under the long-term follow-up survey will focus on participant experiences with the alternative and current rent policies, as well as topics such as education and training, job history, household composition, income, residential status, and family welfare. Many of the questions in the long-term follow-up survey are potentially sensitive for respondents, but they cover some topics that are generally discussed with (and documented by) housing agency staff at the time of recertification. For instance, voucher holders are asked to report on employment and income at the time of recertification. However, the evaluation is interested in documenting additional background characteristics that may influence participants’ responses to the alternative rent policy but are not captured by administrative data. Without these additional questions, the research findings could not be explained: we might know that the alternative rent policy encouraged participants to work more hours or earn higher incomes, but we would not know why. Thus, it is crucial that DIR collect the type of information included in the attached long-term follow-up survey (see Appendix A). Respondents will be informed by study staff prior to survey data collection that their answers are private to the extent provided by law, that they may refuse to answer any question, that results will only be reported in the aggregate, and that their responses will not have any effect on any services or benefits they or their family members receive.


  12. Provide estimates of the hour burden of the collection of information.

The hour burden estimates for the long-term follow-up survey data collection for Rent Reform respondents are outlined below in Table 1. The estimates are based on DIR’s experience with previous random assignment studies involving similar populations and data collection.

Cost to respondents for collection of information

Households participating in the Rent Reform Demonstration range widely in employment status and earnings. We have estimated the hourly wage at the expected prevailing minimum wage, which is $7.25 per hour in Kentucky and Texas, the same as the federal minimum wage (Source: U.S. Department of Labor, https://www.dol.gov/whd/minwage/america.htm). The hourly minimum wage in the District of Columbia is expected to be $13.25 by Q3 of 2018 (Source: District of Columbia Department of Employment Services, https://does.dc.gov/sites/default/files/dc/sites/does/page_content/attachments/Minimum%20Wage%20Amendment%20Revision%20Act%20Poster%20-%20Eng.pdf). Accordingly, we assume an hourly rate across all sites of $8.96, an average of these two rates weighted by the sample at each site (1,903 participants in Washington, D.C. and 4,756 enrolled in Kentucky and Texas). To calculate the maximum cost to respondents, we have assumed full employment across the sample.

Based on these assumptions, and the frequency and duration of responses listed in Table 1, the estimated total respondent cost is $32,815.55 (6,659 sample members * $8.96 * (0.05 hours + 0.5 hours) * 1 response).
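
As a check on these figures, the short calculation below reproduces the weighted hourly rate and the total respondent cost from the sample sizes and wage rates stated above; it is provided for transparency only and introduces no new data.

```python
# Reproduce the weighted hourly wage and total respondent cost reported above.
dc_sample, ky_tx_sample = 1_903, 4_756      # sample members in D.C. vs. Kentucky/Texas
dc_wage, ky_tx_wage = 13.25, 7.25           # assumed hourly minimum wages

weighted_wage = (dc_sample * dc_wage + ky_tx_sample * ky_tx_wage) / (dc_sample + ky_tx_sample)
print(round(weighted_wage, 2))              # 8.96

hours_per_respondent = 0.05 + 0.50          # consent form plus follow-up survey
respondent_cost = (dc_sample + ky_tx_sample) * 8.96 * hours_per_respondent
print(round(respondent_cost, 2))            # 32815.55
```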

Table 1. Estimated Reporting Burden for Survey Respondents

Information Collection | Number of Respondents | Frequency of Response | Responses per Annum | Burden Hours per Response | Annual Burden Hours | Hourly Cost per Response | Annual Cost
Consent Form | 6,659 | 1.00 | 6,659 | 0.05 | 333 | $8.96 | $2,983.68
Long-Term Follow-Up Survey | 6,659 | 1.00 | 6,659 | 0.50 | 3,330 | $8.96 | $29,836.80
Housing Authority Database Extraction Activities | 4 | 1.00 | 4 | 1.00 | 4 | $36.33 | $145.32
TOTAL | 13,322 | | | | 3,667 | | $32,965.80

Cost burden to PHA staff

On three separate occasions (once before the survey launches and twice more during data collection), DIR and MDRC will request the contact information provided to the PHAs during the triennial recertification process. For program staff supporting data extraction activities that are not part of their regular duties, the estimate uses the median hourly wages of selected relevant occupations (classified by Standard Occupational Classification (SOC) codes) from the Occupational Employment Statistics program of the U.S. Department of Labor’s Bureau of Labor Statistics. A standard wage assumption of $36.33 per hour was created by averaging the median hourly wage rates for these selected occupations:


Occupation | SOC Code | Median Hourly Wage Rate
Database Administrator | 15-1141 | $41.84
Social/Community Service Manager | 11-9151 | $30.82


Source: Occupational Employment Statistics, accessed online May 28, 2018 at https://www.bls.gov/oes/current/oes_stru.htm


Based on this assumption, and the frequency and duration of responses listed in Table 1, the estimated total PHA staff costs for database extraction are $435.96 (4 staff (1 staff * 4 sites) * $36.33 * 1 hour * 3 responses). Note, this estimate does not include fringe benefits or other overhead costs.


Total estimated cost: $33,251.51.
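
The PHA staff cost and the combined total can be reproduced the same way; the brief check below simply restates the arithmetic already given in the text.

```python
# Reproduce the PHA database-extraction cost and the combined total reported above.
pha_cost = 4 * 36.33 * 1 * 3        # 1 staff member at each of 4 sites, $36.33/hour, 1 hour, 3 requests
respondent_cost = 32_815.55         # total respondent cost from the burden estimate above
print(round(pha_cost, 2))                    # 435.96
print(round(pha_cost + respondent_cost, 2))  # 33251.51
```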


  13. Provide an estimate for the total annual cost burden to respondents or record-keepers resulting from the collection of information.

There are no costs to the respondents to participate beyond the time needed to respond to the tracking requests and answer the long-term follow-up survey questions. No equipment, printing, or postage charges will be incurred by the participants.


  14. Provide estimates of annualized costs to the Federal Government.

The total cost to the federal government for the long-term follow-up survey of the Rent Reform Demonstration study is $3,047,914 (Table 2).


Table 2. Estimated Cost to the Government

IDIQ Labor Category | Estimated Hours | Estimated Costs
Class I - Senior | 163 |
Class II - Associate | 1,140 |
Class IV - Junior | 8,790 |
Class V - Editorial | 1,026 |
Class VI - Clerical (including CATI interviewers and field staff) | 43,979 |
Total Labor Cost | | $2,317,964
Consultant | | $3,741
Programming | | $3,750
NCOA (Tracking Database) | | $13,320
Translation | | $4,000
Travel | | $52,570
Printing/Copying | | $32,796
Postage/Delivery | | $24,222
Field Expenses (including technology) | | $99,188
Telephone | | $5,258
CATI ODC | | $27,437
Incentives for Survey Respondents | | $301,113
Subcontract Administration | | $162,555
Total Expenses | | $3,047,914


  15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.

This submission is a revision of an existing collection (OMB #2528-0306). The revision is necessary to obtain approval for longer term data collection activities that are part of the Rent Reform Demonstration study. The long-term follow-up survey will provide contextual information to better understand the impacts of the revised rent reform model.


  16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


To determine the effectiveness of the alternative rent policy, the evaluation is designed to collect four categories of data: 1) baseline data; 2) implementation and process data; 3) administrative records; and 4) long-term follow-up survey data, which will contextualize and help interpret the findings from the other data sources. The baseline report was published in January 2018. The interim impact report is expected to be published in early 2019. The long-term impact analysis is expected to be published as a series of working papers in late 2020 or early 2021. A comprehensive impact analysis, including an analysis of the long-term follow-up survey and administrative data, is expected in late 2021 or early 2022.



  17. If you are seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The expiration date for OMB approval will be displayed on any forms completed as part of the data collection.


  18. Explain each exception to the topics of the certification statement identified in Certification for Paperwork Reduction Act Submissions.

This submission requests no exceptions to the Certification for Paperwork Reduction Act Submissions (5 CFR 1320.9).



1 See eligibility section below for specific eligibility criteria.


2 https://www.mdrc.org/publication/reducing-work-disincentives-housing-choice-voucher-program

3 Coopersmith, J., Klein Vogel, L., Bruursema, T., & Feeney, K. (2016). Effects of Incentive Amount and Type on Web Survey Response Rates. Survey Practice; De Santis, J., Callahan, R., Marsh, S., & Perez-Johnson, I. (2016, May). Early-Bird Incentives: Results from an Experiment to Determine Response Rates and Cost Effects. Paper presented at the 71st annual meeting of the American Association for Public Opinion Research, Austin, TX; and Ward, C., Stern, M., Vanicek, J., Black, C., Knighton, C., & Wilkinson, L. (2014). Evaluating the Effectiveness of Early Bird Incentives in a Web Survey [PowerPoint slides]. Retrieved from https://www.census.gov/fedcasic/fc2014/ppt/02_ward.pdf.

5 Petrolia, D. R., & Bhattacharjee, S. (2009). Revisiting incentive effects: Evidence from a random-sample mail survey on consumer preferences for fuel ethanol. Public Opinion Quarterly, 73(3), 537-550; and Church, A. H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public Opinion Quarterly, 57(1), 62-79.

6 Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method (3rd ed.). Hoboken, NJ: John Wiley & Sons.

7 Singer, E., Groves, R. M., & Corning, A. D. (1999). Public Opinion Quarterly, 63(2), 251-260.

8 Berlin, M., Mohadjer, L., & Waksberg, J. (1992). An experiment in monetary incentives. Proceedings of the Survey Research Section of the American Statistical Association, 393-398; de Heer, W., & de Leeuw, E. (2002). Trends in household survey non-response: A longitudinal and international comparison. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey Non-response (pp. 41-54). New York: John Wiley; and Singer, E., & Kulka, R. (2000). In Studies of Welfare Populations: Data Collection and Research Issues, Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, Ver Ploeg, M., Moffitt, R. A., & Citro, C. F. (Eds.). Washington, DC: National Academies Press, pp. 105-128.
