
Supporting Statement for Paperwork Reduction Act Submission

American Housing Survey

OMB Number 2528-0017


A. Justification


  1. Necessity of Information Collection

We request clearance for the proposed questions to be used on the 2021 American Housing Survey (AHS). We will collect data for the majority of the sample between May 3 and September 14, 2021. This request is a revision to the currently approved data collection request for the AHS due to changes in content and increased sample size.

In 2015, the AHS began a new longitudinal panel. The sample design has two components: an integrated national sample and an independent metropolitan areas sample. The integrated national sample includes three parts: (1) 35,731 national cases representative of the US and the nine Census divisions, (2) a 12,060-case oversample of subsidized renters, and (3) a 47,175-case oversample of the top 15 metropolitan areas in the US. The total integrated national sample for 2021 will be 94,966 cases. For 2021, the independent metropolitan areas sample will cover an additional 10 metropolitan areas and will include 32,919 cases (approximately 3,000 per metropolitan area). The ten metropolitan areas were selected from the U.S. Department of Housing and Urban Development’s (HUD) “Next 20” group, as described in HUD’s report, “Metro Area Selection Strategies & Decisions for the 2015 AHS & Beyond.” The total AHS sample size will be 127,885.
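
As a quick arithmetic check of the sample composition described above, the following minimal sketch (in Python, using only the figures quoted in this section) sums the components:

```python
# Check of the 2021 AHS sample composition figures quoted above.
national_cases = 35_731                 # national sample (US and 9 Census divisions)
subsidized_renter_oversample = 12_060   # subsidized renter oversample
top15_metro_oversample = 47_175         # oversample of the top 15 metropolitan areas

integrated_national = national_cases + subsidized_renter_oversample + top15_metro_oversample
independent_metro = 32_919              # 10 additional metropolitan areas

total_ahs_sample = integrated_national + independent_metro

print(f"Integrated national sample: {integrated_national:,}")  # 94,966
print(f"Total AHS sample:           {total_ahs_sample:,}")     # 127,885
```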


Starting in 2009, the AHS questions were classified into “core” modules and “supplemental” modules in order to minimize respondent burden and satisfy widening needs for data content. Questions in the core modules are asked in each survey and typically undergo only minor revisions between surveys. Questions in the supplemental modules are asked only in selected survey years.


Title 12, United States Code, Sections 1701z-1, 1701z-2(g), and 1701z-10a provide authority to collect this information.

HUD uses information from the AHS to prepare the Worst Case Needs reports to Congress. HUD was directed to prepare this report series by U.S. Senate Appropriations Committee in 1990 (Committee Report to accompany H.R. 5158, The VA-HUD Appropriations Act for FY 1991 (S. Rpt. 101-474)). HUD also uses these data to prepare other special reports for Congress and its committees concerning the effect of legislation on the housing stock.


The 2021 data collection procedures and questionnaire content are similar to the 2019 survey with the following exceptions:


  1. Redesign of the Core Mortgage Module: The purpose of the redesign was to simplify the questions and streamline the flow of the module. The primary purpose of the Mortgage module is to measure housing costs, which feed into HUD’s measurement of housing affordability. Secondary purposes include measuring the household’s financial capacity in terms of access to credit and understanding the sources of financing for alterations and repairs.


  2. Removal of Three Supplemental Modules from the 2019 AHS: The Food Security, Home Accessibility, and Post-Secondary Education supplemental modules will not be included in the 2021 survey.


  3. Conclusion of the Housing Insecurity Research Module Follow-On: There will not be a Housing Insecurity Research Module Follow-on module for the 2021 AHS. Analysis continues on the data collected in the 2019 AHS.


  4. Reinstatement of the Delinquent Payments and Notices Supplemental Module: The Delinquent Payments and Notices module collects data on whether people had to move due to lack of financial means or other support and where they would stay if they left the household. This supplemental module was last deployed in 2017.


  5. Introduction of Five New Supplemental Modules: To continue the strategy of supplemental modules in order to minimize respondent burden and satisfy widening needs for data content, five new supplemental modules have been added to the survey – Intent to Move, Expanded Renter Housing Search, Wildfire Risk, Pets, and Smoking. These modules collect data on whether the respondent plans to move, the renter housing search process, housing characteristics that increase wildfire risk, household pets, and smoke and smoking in the housing unit. Please refer to the attached items booklet for the questions in these modules and the entire AHS questionnaire.


  6. Sample Split for Supplemental Modules: A split of the survey sample will be used to maximize the number of supplemental modules that can be included in the 2021 AHS. Fifty percent of the sample will be asked the Intent to Move and Expanded Renter Housing Search modules. The other 50 percent will be asked the Wildfire Risk, Pets, and Smoking modules. The full sample will receive the Delinquent Payments and Notices Module. The Wildfire Risk module will be targeted toward geographical areas at increased risk for wildfires.


  7. Nonresponse Bias Incentives Experiment: A proposed experiment will test whether targeting incentives to units at risk of nonresponse and likely to introduce bias can lower nonresponse bias in the integrated national sample.



We also request clearance for the reinterview questions to be used in conjunction with this survey. We will conduct a second interview at approximately 7 percent of the total addresses in the survey for the purpose of interviewer quality control. Reinterview questions ask respondents whether they recall general details from the original interview. The 2021 reinterview instrument will contain five questions about the AHS questionnaire. Each respondent will be asked all five questions. We have included in this clearance the cost and respondent burden estimates for the reinterview.


  2. Needs and Uses


Both HUD and outside entities use the core modules of the AHS extensively. The core modules capture information about building and unit characteristics, housing quality, fuel and electricity costs, resident mobility and recent movers, rent and mortgage expenses, household demographic characteristics, income, and repairs and remodeling frequency and expenses. The following subsections describe the internal and external uses of the core modules and expected uses of the supplemental modules.


  a. HUD’s Internal Needs for the Core Modules



HUD has numerous needs for the AHS to support Congressional reporting requirements, programmatic needs, and ongoing research.


The needs include, but are not limited to:


    1. Worst Case Housing Needs: Congress requires HUD to produce the Worst Case Housing Needs report every two years. This report is based almost entirely on the AHS.


    2. Worst Case Housing Needs of People with Disabilities: HUD produces a supplemental report to the Worst Case Housing Needs report providing national estimates and information on the critical housing problems confronting low-income renter families that include people with disabilities.


    3. Characteristics of HUD Assisted Renters and Their Units: HUD produces a report detailing the housing conditions of HUD-assisted renters. This report is based entirely on the AHS responses of units that match to HUD administrative records of subsidized housing.


    4. Housing Program Monitoring: AHS data are used to evaluate, monitor, and design HUD programs to improve efficiency and effectiveness. From a HUD policy perspective, the AHS data have proved valuable in analyzing the potential effects of program design and redesign proposals. Past data have enabled HUD, for instance, to determine under what conditions a moderate-income, multifamily construction program might be needed and feasible; to examine the effect of low vacancy rates on housing maintenance and quality; and to evaluate how housing assistance programs help welfare recipients.


    5. National Housing Market Program of Research: HUD’s Office of Policy Development and Research (PD&R) continuously monitors the state of the nation’s housing market. The AHS contributes to this effort by providing estimates of vacancy, financing types, homeowner equity, and housing values, to name a few.


    6. Regional and Local Housing Market Research: HUD PD&R uses the AHS data as one source for creating Comprehensive Housing Market Analyses and other local housing market intelligence reports. These reports help HUD field economists evaluate the feasibility and market impacts of proposed multifamily assisted housing project investments.


    7. Affordable Housing Program of Research: HUD PD&R uses the AHS to conduct research on the number of affordable rental units in the housing stock and the degree to which rents are affordable to low- and moderate-income families and to very-low-income families.


    8. Housing and Demographics Program of Research: HUD PD&R uses the AHS to conduct research on demographic distributions by types of housing units. Of particular interest are housing choices by low-income female householders, minorities, first-time homebuyers, the elderly, and households nearing retirement.



  b. Core Modules Uses External to HUD


National and local policy analysts, program managers, budget analysts, and Congressional staff use the AHS data to advise executive and legislative branches about housing conditions and the suitability of policy initiatives. Academic researchers and private organizations also use the AHS data in efforts of specific interest and concern to their respective communities.

Data from the AHS are the major source of estimates of the space-rental value of housing (a component of personal consumption expenditures) and of the rental income of persons (a component of both personal income and national income). The Bureau of Economic Analysis (BEA) uses the AHS data in preparing metropolitan income and product accounts. The specific data that the BEA uses are those defining farm or nonfarm location, type of housing unit, occupancy status, tenure of the occupant, and the expenditures related to housing (rent, utilities, mortgage, and so on). The AHS data are also used to evaluate the housing program benefits reported on the Current Population Survey (CPS) and the Survey of Income and Program Participation (SIPP). The Department of Energy’s Energy Information Administration (EIA) issues an annual report, the Annual Energy Review, using the heating fuel data collected in the AHS (http://www.eia.gov/totalenergy/data/annual/).

Data from the AHS are the primary input into Harvard’s Joint Center for Housing Studies estimate of the size of the remodeling market (http://www.jchs.harvard.edu/research/remodeling-futures).



  c. Supplemental Module Needs and Potential Uses


New data are being collected in the 2021 survey on whether the respondent plans to move, the renter housing search process, housing characteristics that increase wildfire risk, household pets, delinquent payments and notices for mortgage, rent, or utility bills, and smoking. We will collect these data in the following six modules.

  1. Intent to Move Module: Virtually no other surveys have the ability to ask people if they intend to move, then actually measure whether they moved. Including these questions in the AHS will help determine if intent-based moving questions have validity.

  2. Expanded Renter Housing Search Module: This will become a very important series of questions during the pandemic recovery. Many renters may have suffered reductions in their credit scores due to job loss. We are interested in whether they will have difficulty finding rental housing.  

  3. Wildfire Risk Module:  This module, developed with input from the National Fire Prevention Association, will be useful in analyzing housing characteristics that make housing units vulnerable to wildfires in areas at greater risk of wildfire.


  4. Pets Module: When a pets question was included in the 2017 AHS Disaster Preparedness module, HUD received considerable interest in the AHS data. Further examination of the results revealed that the AHS estimates did not match other major surveys, such as the General Social Survey (GSS). In the interest of providing accurate data, we decided to replicate the results of the GSS. To do this, we adopted the GSS questions about pet ownership, with a few changes.

  5. Delinquent Payments and Notices Module: The Delinquent Payments and Notices topical module comprises a subset of the questions from the 2013 AHS Doubled-Up Households module. The original Doubled-Up Households module was created as a rotating topical module that collected data on people who had to temporarily move in with other households in the last year; why people left their previous homes to move in with other households; and, in what is now the Delinquent Payments and Notices module, whether people had to move due to lack of financial means or other support and where they would stay if they left the household. The Delinquent Payments and Notices module was last included in the 2017 AHS.

This module is related to HUD’s Strategic Goal 2, “Meet the Need for Quality Affordable Rental Homes.” In particular, Subgoal 2A is “End homelessness and substantially reduce the number of families and individuals with severe housing needs.” Doubling up is widely seen as a precursor to homelessness.

HUD expects the Department of Education to be interested in the data produced by this module. Children in temporary doubled-up conditions are considered homeless for the purposes of education policy, and efforts are made to ensure that these children attend the same schools as their housing situation changes. The United States Interagency Council on Homelessness and the National Alliance to End Homelessness Research Institute have each expressed strong interest in the findings from this module.

  6. Smoking Supplement: These questions on the frequency of smoking in the home and secondhand smoke are important to help assess the impact of HUD’s smoke-free housing rule, which became effective in July 2018. There have also been efforts to promote the adoption of smoke-free policies in assisted multifamily housing, so the data would be important for that purpose as well.


  d. Nonresponse Bias Incentives Experiment


As with many federal surveys, the AHS has experienced declining response rates, requiring increasing amounts of time and effort to reach the 80 percent response rate preferred by the Office of Management and Budget. In particular, response rates have declined from approximately 85 percent in the 2015 wave to 80.4 percent in the 2017 wave to 73.3 percent in the 2019 wave.


If the features we want to, but cannot, measure for nonresponders differ systematically from those of responders, nonresponse can lead to bias. The attached nonresponse bias memorandum (NRB Summary Memo Draft) presents multiple forms of evidence suggesting that AHS is at risk of nonresponse bias: responders and non-responders differ systematically on a range of attributes, and AHS estimates diverge from the equivalent quantities measured through the 2010 Census count.


The purpose of this project is to determine whether and how the provision of unconditional, prepaid cash incentives included in advance letters can reduce nonresponse bias. The proposed incentive project will test whether targeting differing levels of incentives ($0, $1, $5, and $10) to units in the integrated national sample both with a high risk of non-response and likely to introduce bias can successfully decrease nonresponse bias.


The experimental design will test three main outcomes and three secondary outcomes. The three main outcomes are:


1. The effect of propensity-determined allocation of incentives on the difference in sample and population means of selected covariates, including both characteristics correlated with important outcomes and, where possible, important outcomes themselves.


2. The effect of propensity-determined allocation of incentives on the response rate.


3. The effect of a one-dollar change in incentive on the response rate.
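
To make the first main outcome concrete, the sketch below illustrates the basic comparison involved: the mean of a covariate among responding units against a known population benchmark. The variable names and benchmark value are hypothetical placeholders, not figures from the AHS or the Project Design Document.

```python
# Illustrative nonresponse-bias check: compare the respondent mean of a covariate
# with an external population benchmark. All values here are hypothetical.

def nonresponse_bias(respondent_values, population_mean):
    """Difference between the respondent mean and the population benchmark."""
    sample_mean = sum(respondent_values) / len(respondent_values)
    return sample_mean - population_mean

# Example: renter share among responders versus an assumed benchmark of 0.36.
responder_renter_flags = [1, 0, 0, 1, 0, 1, 0, 0]  # 1 = renter, 0 = owner
bias = nonresponse_bias(responder_renter_flags, population_mean=0.36)
print(f"Estimated bias in renter share: {bias:+.3f}")  # +0.015 in this toy example
```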


Including incentives will not affect the overall respondent burden estimates.


The attached Project Design Document describes in more detail the intended test outcomes, sample design, protocol for data collection, and decision criteria.


Finally, information quality assessment is an integral part of the pre-dissemination review of information disseminated by the Census Bureau (fully described in the Census Bureau’s Information Quality Guidelines). Information quality assurance is also integral to information collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act.


  3. Use of Information Technology


  a. Data Collection


The U.S. Census Bureau began conducting all AHS interviewing by computer with the 1997 AHS enumeration. A Census Bureau Field Representative (FR) conducts the interview using a Blaise Computer-Assisted Personal Interviewing (CAPI) instrument. The same survey instrument is used for all interviews. However, the instrument code includes skip patterns and makes use of dependent interviewing techniques, which means that some questions do not need to be asked again in future enumerations, decreasing respondent burden for households in sample.
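
As an illustration only (the production instrument is written in Blaise, and its actual logic is not reproduced here), the following sketch shows the idea behind dependent interviewing and skip patterns: a question is skipped when a usable answer already exists from the prior wave. Field names and prompts are hypothetical.

```python
# Hypothetical sketch of dependent interviewing / skip-pattern logic.
prior_wave_answers = {"year_built": 1978, "units_in_building": None}

def ask(question_id, prompt):
    prior = prior_wave_answers.get(question_id)
    if prior is not None:
        # Dependent interviewing: reuse the prior-wave answer rather than
        # asking the respondent the full question again.
        return prior
    # Skip pattern: only reached when no usable prior answer exists.
    return input(prompt)

year_built = ask("year_built", "In what year was this building built? ")
units = ask("units_in_building", "How many housing units are in this building? ")
```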


The AHS has not collected data via the Internet or through the Electronic Data Interchange because of the significant investment in time and research needed to establish these types of electronic reporting in an ongoing survey. However, the Census Bureau has plans to use a multimode Internet self-response and CAPI survey in a future interview cycle.

  b. Data Dissemination


The Census Bureau currently makes the public-use microdata collected by the AHS available to the public on the Census Bureau website at:

http://www.census.gov/programs-surveys/ahs


The Census Bureau will make the 2021 AHS summary data available via the AHS Table Creator Tool (http://www.census.gov/programs-surveys/ahs/data/interactive/ahstablecreator.html).


The data being disseminated and released are not individually identifiable and will have been cleared for release/dissemination by the Census Bureau's Disclosure Review Board.


  4. Efforts to Identify Duplication


  a. Duplication in the Core Modules


HUD consulted with other government agencies and determined that the AHS is the only data source with both detailed information on the physical condition of the housing inventory and on the rents of housing units. Although housing data are collected as part of the American Community Survey (ACS) (Census Bureau), the Consumer Expenditure Survey (CES) (Bureau of Labor Statistics), and the Residential Energy Consumption Survey (RECS) (Department of Energy), none of these surveys provides the longitudinal data over a period of years or the detailed information available from the AHS. The CES collects housing cost data but does not collect detailed information on vacant units. The RECS does not collect mortgage or detailed housing cost data. Neither the ACS nor the RECS has detailed information on the physical condition of housing units or information on vacant units. Thus, these datasets could not serve as substitutes for the measures produced by the AHS that detail worst case housing needs.


The purposes of the AHS and the other surveys cited above also differ according to each agency’s goals and objectives. The HUD surveys involve personal/household behavior with respect to housing and community development issues. However, human behavior in general is conditional on fundamental familial, demographic, housing, and economic variables. Generally, HUD is not interested in the levels of individual variables, but in the relationships among variables. Therefore, HUD must observe the values of the variables for the same individuals in the same sample to capture the covariance structure. (All multivariate statistical procedures rely on the covariance structure.) The AHS asks about the same fundamental variables but goes further and asks numerous detailed questions about other aspects of housing consumption, finance, and moving. In order to understand human behavior and detailed housing information, HUD needs to know how the fundamental housing variables affect or are related to the more detailed housing variables. It would make no sense to collect detailed information about housing cost burdens and mortgage financing without information on fundamental housing attributes such as the size, value, or rent of the housing unit.


  b. Duplication in the Supplemental Modules


HUD undertook considerable effort to determine if the supplemental modules would be duplicative of existing surveys. HUD’s conclusions are below:


  1. Intent to Move Module: Virtually no other surveys have the ability to ask people if they intend to move, then actually measure whether they moved. The Survey of Income and Program Participation asks about moving intentions as a starting point for recontacting households in the following wave of the survey, but the data are not edited or released.


  2. Expanded Renter Housing Search Module: We are not aware of any other major federal surveys with similar questions.


  3. Wildfire Risk Module: We are not aware of any other major federal surveys with similar questions.


  4. Pets Module: When a pets question was included in the 2017 AHS Disaster Preparedness module, the AHS estimates did not match those of other major surveys, such as the GSS. To provide accurate data and allow comparison with the GSS, we adopted the GSS questions about pet ownership, with a few changes.


  5. Delinquent Payments and Notices Module: The most recent research report attempting to estimate the population of doubled-up households is:


Mykyta, Laryssa, and Suzanne Macartney. June 2012. “Sharing a Household: Household Composition and Economic Well-Being: 2007–2010.” Current Population Reports, U.S. Census Bureau. Accessed July 7, 2012, at http://www.census.gov/prod/2012pubs/p60-242.pdf.


The aforementioned report used data from the Survey of Income and Program Participation (SIPP). HUD PD&R believes that the SIPP-based analysis has shortcomings that make it difficult to accurately measure the doubled-up household population. First, SIPP does not address the issue of risk of housing loss. It focuses instead on doubled-up households at the time of the interviews and on shifts over different interview waves rather than on housing loss and out-movers. Second, SIPP questions do not directly allow for an assessment of “economic” doubled-up households, which is of most interest to HUD. Instead, SIPP permits analysis of the presence of “additional adults,” describing their basic demographic characteristics and shifts in numbers over time, plus changes in overall household economic well-being and eligibility for means-tested public benefits given a change in household composition.


  6. Smoking Supplement: The questions were suggested by HUD’s Office of Healthy Homes and were last included in the 2015 AHS. The AHS is the most appropriate vehicle for collecting these data because it is a random sample of U.S. housing and has an established infrastructure for implementation and reporting. The oversample of HUD-assisted units in the AHS makes it possible to assess the impacts of HUD policies on smoking in HUD-assisted units. There is no other regularly administered survey that routinely captures these data.


  5. Minimizing Burden


We have designed the AHS questions to obtain the required information, while keeping respondent burden to a minimum. The data are collected only from individual households, not small businesses or other small entities. For unoccupied units, data are collected from a “knowledgeable respondent,” who could be a landlord, property manager, rental agent, real estate agent, or neighbor.





  6. Consequences of Less Frequent Collection


Because the AHS is a longitudinal survey, we interview our sample periodically to provide intermittent readings between decennial censuses. The interval between AHS interviews is two years. Less frequent enumerations would reduce HUD’s ability to detect changes in worst case housing needs. Without this ability, the Administration and Congress would be unable to formulate policy on housing assistance.


  7. Special Circumstances


  • Requiring respondents to report information to the agency more than quarterly; Not Applicable.

  • Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it; Not Applicable.

  • Requiring respondents to submit more than an original and two copies of any document; Not Applicable.

  • Requiring respondents to retain records other than health, medical, government contract, grant-in-aid, or tax records for more than three years; Not Applicable.

  • In connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study; Not Applicable.

  • Requiring the use of a statistical data classification that has not been reviewed and approved by OMB; Not Applicable.

  • That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; Not Applicable.

  • Requiring respondents to submit proprietary, trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law. Not Applicable.


We collect the data in a manner consistent with OMB guidelines, and there are no special circumstances.


  8. Consultations Outside the Agency


  a. Federal Register Comments


Attached is a copy of the Federal Register Notice required by 5 CFR 1320.8(d). The Notice was published on August 7, 2020.


HUD received no comments.


  b. Consultations Influencing the 2021 AHS Core Modules


The content of the 2021 AHS Core Modules is the result of many years of consultation and testing, starting with the development of the 1984 AHS questionnaire. For the original 1984 AHS questionnaire, approximately 250 prospective data users representing diverse areas of interest were consulted. The BEA suggested modifications to the original questionnaire to improve BEA’s estimates and to improve the clarity and consistency of the questions.


HUD and the Census Bureau routinely consult with outside groups who are frequent users of the AHS, including the National Association of Home Builders (NAHB) and the Harvard Joint Center for Housing Studies (JCHS). Because of the depth of their experience with the AHS, these groups often make recommendations concerning minor changes to AHS questions. The Neighborhood Quality Module was added to the core, and the number of questions in the module was reduced, after consultation with NAHB. In consultation with JCHS, questions on the date of completion and the source of financing for remodeling jobs were added to the Home Improvement and Remodeling Module. We also worked with JCHS to combine some of the Home Improvement job categories to reduce respondent burden. The EIA at the Department of Energy was consulted in the development of utility cost allocation models, which are used in the 2021 AHS to model utility costs from household and housing characteristics and climate data.


  c. Consultations Influencing the 2021 AHS Supplemental Modules


The process of developing the 2021 AHS supplemental topic modules included consultations with several outside groups.

  1. Intent to Move Module: The 2021 supplemental module is sponsored by the Harvard Joint Center for Housing Studies and the National Academy of Sciences Committee on National Statistics.


  2. Expanded Housing Search Module: The 2021 supplemental module expands on the core housing search questions. HUD consulted with the Harvard Joint Center for Housing Studies.


  3. Wildfire Risk Module: The 2021 supplemental module is sponsored by the National Fire Prevention Association.


  4. Pets Module: The 2021 supplemental module was adapted from similar questions in the General Social Survey (GSS). In 2018, NORC at the University of Chicago published results from the GSS pets questions showing more pet owners than were reported in a pets question in the 2017 AHS. Given the importance of the presence of pets to households, and the differing estimates, HUD determined it was worth replicating the GSS survey questions.


  5. Delinquent Payments and Notices Module: The questions were developed by a panel of experts assembled by HUD PD&R and modified by AHS personnel at HUD and Census. The panel of experts included representatives from the Urban Institute, the University of Pennsylvania, the United States Interagency Council on Homelessness, Westat, the National Alliance to End Homelessness Research Institute, Abt Associates, and Wayne State University.


  6. Smoking Supplement: The questions were suggested by HUD’s Office of Healthy Homes and were last included in the 2015 AHS.


  d. Consultations Influencing the Nonresponse Bias Incentives Experiment


HUD consulted with the Office of Evaluation Sciences (OES) at the U.S. General Services Administration to design the Nonresponse Bias Incentives Experiment.


  9. Paying Respondents


HUD and Census intend to test whether incentives reduce nonresponse bias in the integrated national sample. Following a design developed by the GSA, HUD and Census will test eight treatment conditions, leading to three main outcomes and three secondary outcomes. For a more detailed discussion on the rationale for incentives, see Section 2 of the attached Project Design Document.


An early finding in the literature on incentives is that, while response rates increase as the incentive amount increases, they do so at a decreasing rate (Armstrong 1975).1 In a large meta-analysis of the effect of incentive amounts on response rates, Mercer et al. (2015) showed that (1) the type of incentive and the survey mode appear to matter for the dose-response curve (see Section 2.3 of the Project Design Document), and (2) a relative paucity of data on varying amounts in the context of mixed-mode panel surveys such as the AHS makes it difficult to generalize to those contexts from the extant literature.2 Understanding where the inflection point lies in the AHS survey sample will help to determine whether a flat $5 incentive, as is used in the National Household Education Surveys Program (NHES), makes sense, or whether differing amounts need to be used among different subgroups.


Our study plans to randomize respondents to one of four amounts: $0, $1, $5, and $10. The $5 amount is chosen because it corresponds to amounts used in similar surveys such as the NHES.


We include the $1 amount because it is possible that the bulk of the response rate increase can be generated with one dollar. However, the medium-cost scenario seems very plausible. Mercer et al. (2015), for example, found that, on average, in-person surveys that paid $5 versus nothing had a response rate increase of 5 percentage points, those that paid $10 versus nothing had an increase of 7 percentage points, and those that paid $20 had an increase of 9 percentage points. In other words, while doubling the incentive from $5 to $10 produced a 40 percent increase in effectiveness, doubling it from $10 to $20 produced only about a 29 percent increase in effectiveness.
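
The percentage comparisons above are simple relative changes in the percentage-point gains reported by Mercer et al. (2015); a minimal check of that arithmetic:

```python
# Relative gain in effectiveness when the incentive amount doubles, using the
# average response-rate increases cited above from Mercer et al. (2015).
gain_5 = 5    # percentage-point increase, $5 vs. $0
gain_10 = 7   # percentage-point increase, $10 vs. $0
gain_20 = 9   # percentage-point increase, $20 vs. $0

print((gain_10 - gain_5) / gain_5)    # 0.4    -> 40 percent more effective ($5 to $10)
print((gain_20 - gain_10) / gain_10)  # ~0.286 -> about 29 percent ($10 to $20)
```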


For this reason, we believe it makes sense to test an amount of $10. Moreover, the panel context of the AHS argues in favor of including at least one substantial incentive amount. In particular, it is important to know how incentives in one wave affect response patterns in subsequent waves. While respondents may very easily forget having received $1 or $5 two years ago given the largely symbolic value of these sums, $10 seems more likely to stand out in one’s memory.

This raises the prospect that, either through habit-formation or recall, large incentive amounts may durably increase response rates beyond the one wave in which they are conducted or lead to an expectation of similar incentives in future waves. This is a possibility largely unexplored in the literature.


Incentives will range from $0 to $10 and will be targeted among units in the integrated national sample (n=93,616 units). Approximately 70 percent of units will be sent no incentive. Approximately 7.5 percent of units will be sent a $1 incentive. Approximately 7.5 percent of units will be sent a $5 incentive. Approximately 15 percent of units will be sent a $10 incentive. The total cost of the incentives is $182,572.



Half Sample | Amount | Proportion | Number | Total Cost
Propensity-Independent (50%) | $0 | 70% | 32,765 | $0
Propensity-Independent (50%) | $1 | 7.5% | 3,511 | $3,511
Propensity-Independent (50%) | $5 | 7.5% | 3,511 | $17,555
Propensity-Independent (50%) | $10 | 15% | 7,021 | $70,021
Propensity-Determined (50%) | $0 | 70% | 32,765 | $0
Propensity-Determined (50%) | $1 | 7.5% | 3,511 | $3,511
Propensity-Determined (50%) | $5 | 7.5% | 3,511 | $17,555
Propensity-Determined (50%) | $10 | 15% | 7,021 | $70,021
Total | | | | $182,572


The precise design of the experiment is detailed in the attached Project Design Document.
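
For orientation only, the sketch below contrasts the two allocation schemes. The arm amounts and overall shares come from the table above; the response-propensity threshold and the reweighted shares used inside the propensity-determined half are hypothetical placeholders, not values from the Project Design Document.

```python
import random

ARMS = [(0, 0.70), (1, 0.075), (5, 0.075), (10, 0.15)]  # (incentive amount, target share)

def draw_arm(shares, rng):
    """Draw one incentive amount according to the given shares."""
    r, cumulative = rng.random(), 0.0
    for amount, share in shares:
        cumulative += share
        if r < cumulative:
            return amount
    return shares[-1][0]

def propensity_independent(unit, rng):
    # Every unit faces the same arm probabilities, regardless of its characteristics.
    return draw_arm(ARMS, rng)

def propensity_determined(unit, rng):
    # Concentrate the nonzero incentives on units with a low predicted probability
    # of responding (high nonresponse risk). Threshold and shares are hypothetical.
    if unit["predicted_response_propensity"] < 0.5:
        return draw_arm([(0, 0.40), (1, 0.15), (5, 0.15), (10, 0.30)], rng)
    return 0

rng = random.Random(2021)
unit = {"predicted_response_propensity": 0.35}
print(propensity_independent(unit, rng), propensity_determined(unit, rng))
```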



  10. Assurance of Confidentiality


The Census Bureau collects these data in compliance with the Privacy Act of 1974 and OMB Circular A-108. The Census Bureau will send each sample address a letter (AHS-26/66(L)) in advance of the interview containing the information required by this act.


The Advance Letter informs the respondents of the voluntary nature of this survey and states that there are no penalties for failure to answer any question. The letter explains why the information is being collected, how it will be used, and that it will take approximately 40 minutes to complete the interview. The letter displays the OMB control number and date of expiration.


As part of the introduction for personal-visit households, the Census Bureau FRs will ask the respondents if they received the Advance Letter. If not, the FRs will give the letter to the respondents and allow them sufficient time to read the contents. We also display the program website and the toll-free phone number of the regional office for which the FR works as a way for the respondent to verify the FR's employment with the Census Bureau. For interviews conducted by telephone, FRs will read to the respondents a condensed version of the advance letter that includes the information required by the Privacy Act.


After the interview is completed, the FRs will give the respondents a "Thank You" Letter (AHS-28/68(L)). Both the Advance Letter and the Thank You Letter state that all information respondents give to Census Bureau employees is held in strict confidence under Title 13, United States Code. Each FR has taken an oath to this effect and is subject to a jail term, a fine, or both, if he/she discloses any information given to him/her.


The data collected under this agreement are confidential under Title 13, U.S.C., Section 9(a). Should HUD staff require access to Title 13 data from this survey to assist in the planning, data collection, data analysis, or production of final products, those staff members are required to obtain Census Bureau Special Sworn Status (SSS). They must demonstrate that they have suitable background clearance and they must take Title 13 Awareness Training.


Any access to Title 13 data at HUD is subject to prior approval by the Census Bureau's Data Stewardship Executive Policy Committee upon assurance that the HUD facility and information technology security meet Census Bureau requirements.




  11. Justification for Sensitive Questions


The survey does not include any questions of a sensitive nature.


  12. Estimate of Hour Burden


We estimate the respondent burden to be approximately 63,137 hours. Refer to the following table for more detailed information.


Information Collection | Number of Respondents | Frequency of Response | Responses Per Annum | Burden Hours Per Response | Annual Burden Hours | Hourly Cost Per Response | Annual Cost
Occupied Interviews | 86,962 | 1 | 86,962 | 0.66 | 57,395 | $0.00 | $0.00
Vacant Interviews | 12,788 | 1 | 12,788 | 0.33 | 4,220 | $0.00 | $0.00
Non-interviews | 24,298 | 1 | 24,298 | 0.00 | 0 | $0.00 | $0.00
Ineligible | 3,837 | 1 | 3,837 | 0.00 | 0 | $0.00 | $0.00
Subtotal | 127,885 | 1 | 127,885 | 0.00 | 0 | $0.00 | $0.00
Reinterviews | 8,952 | 1 | 8,952 | 0.17 | 1,522 | $0.00 | $0.00
Total | 136,837 | | 136,837 | | 63,137 | |
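
The annual burden hours in the table follow from multiplying the responses per annum by the burden hours per response; a minimal check of that arithmetic, using only the figures shown above:

```python
# Annual burden hours = responses per annum x burden hours per response.
rows = {
    "Occupied Interviews": (86_962, 0.66),
    "Vacant Interviews":   (12_788, 0.33),
    "Non-interviews":      (24_298, 0.00),
    "Ineligible":          (3_837, 0.00),
    "Reinterviews":        (8_952, 0.17),
}

total_hours = 0.0
for name, (responses, hours_per_response) in rows.items():
    hours = responses * hours_per_response
    total_hours += hours
    print(f"{name}: {hours:,.0f} hours")

print(f"Total: {total_hours:,.0f} hours")  # approximately 63,137
```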




The 2021 AHS sample will be split into two groups. Fifty percent of the sample will be asked the Intent to Move and Expanded Renter Housing Search modules. The other 50 percent will be asked the Wildfire Risk, Pets, and Smoking modules. This will maximize the number of supplemental modules that can be included while not increasing overall response burden.
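
A minimal sketch of the 50/50 split described above (the assignment mechanics are an assumption for illustration, and the geographic targeting of the Wildfire Risk module noted in section 1 is ignored here):

```python
import random

rng = random.Random(2021)  # fixed seed so the illustration is reproducible

GROUP_A = ["Intent to Move", "Expanded Renter Housing Search"]
GROUP_B = ["Wildfire Risk", "Pets", "Smoking"]
FULL_SAMPLE = ["Delinquent Payments and Notices"]  # asked of every case

def supplemental_modules():
    """Return the supplemental modules asked of one sampled case under the 50/50 split."""
    in_group_a = rng.random() < 0.5
    return FULL_SAMPLE + (GROUP_A if in_group_a else GROUP_B)

print(supplemental_modules())
```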


  13. Estimate of Cost Burden


The annualized cost estimate to respondents for burden hours is $0. There are no costs to respondents other than that of their time to respond.


  14. Cost to Federal Government


HUD estimates the 2-year survey cycle costs to the government for the 2021 AHS, including 10 metropolitan areas, to be $66.4 million.


Cost Items | FY 2020 | FY 2021 | Total
Professional Staff | $12,600,000 | $13,200,000 | $25,800,000
Field Data Collection | | $45,500,000 | $45,500,000
Technology | $0 | $2,500,000 | $2,500,000
Total | $12,600,000 | $61,200,000 | $73,800,000


The figures above are based on the following factors.

  • For professional staff, the estimates are based on actual money spent in FY 2019 and budgeted “not-to-exceed” amounts for FY 2021. Professional staff include survey methodologists, statisticians, computer programmers and other IT support, communications specialists, and managers.


  • For field data collection, projected costs reflect “not-to-exceed” amounts. The projected costs are provided by the Census Bureau’s field case management cost projection model. The cost projection model uses information on costs from prior surveys (including, but not limited to, the AHS), specifications for the current AHS, and current local and regional labor rates.


  • Technology costs include the purchase and maintenance of laptops. This estimate is provided by the Census Bureau and reflects a cost-sharing portion of the Census Bureau’s annual technology costs for CAPI-based surveys. All surveys using CAPI share in the cost of technology.


  15. Reason for Change in Burden


The estimated respondent burden for 2021 (63,137 hours) is slightly higher than the respondent burden cited in the 2019 AHS Supporting Statement. The reason is that we increased the size of the subsidized renter oversample after the 2019 AHS. Our estimated 2021 AHS response rates are based on the rates observed in the 2019 AHS.



  16. Project Schedule


The Census Bureau has scheduled the majority of 2021 field enumeration for the AHS survey to begin May 3 and end September 14, 2021. The entire reinterview data collection will span May 4 through September 21, 2021.


The projected release date of the National and Metropolitan public use files (PUFs) is summer 2022. When processing the data, the Census Bureau usually implements basic data edits to ensure consistency. In some cases, statistical models are used to allocate (impute) missing values, such as values for income, utility costs, and so on. Allocated values can be identified by analysts with the help of flag variables included in the data set that tag such edits. We also create new variables by collapsing or combining questions in the survey.


HUD and the Census Bureau will issue product announcements when releasing the PUFs, as well as the Table Creator tables as agreed upon with HUD. The Department of Commerce or HUD may release other publications.


The data being disseminated and released are not individually identifiable and will have been cleared for release/dissemination by the Census Bureau's Disclosure Review Board.


  17. Request to Not Display Expiration Date


The OMB number and expiration date will be included on the AHS-26/66(L) Advance Letter. Because the questionnaire is an automated instrument, the respondent will not see the OMB number and expiration date.


  18. Exceptions to the Certificate


There are no exceptions.

1 Armstrong, J. S. 1975. “Monetary Incentives in Mail Surveys.” Public Opinion Quarterly 39 (1): 111–16.

2 Mercer, Andrew, Andrew Caporaso, David Cantor, and Reanne Townsend. 2015. “How Much Gets You How Much? Monetary Incentives and Response Rates in Household Surveys.” Public Opinion Quarterly 79 (1): 105–29.
