
Supporting Statement for Paperwork Reduction Act Submission

Outcomes Evaluation of the Choice Neighborhoods Program

OMB Control # 2528-New



A. Justification



1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.



This is a new collection. The Office of Policy Development and Research of the U.S. Department of Housing and Urban Development (HUD) is conducting an Outcomes Evaluation of the Choice Neighborhoods Program. Choice provides direct investments through competitive grants targeted to neighborhoods marked by high rates of poverty and distressed public or HUD-assisted housing. Choice is one of HUD’s primary tools to support planning and implementation efforts that catalyze redevelopment in cities across the nation through an ambitious, multifaceted strategy focused on three components: housing, people, and neighborhood. By leveraging public and private dollars to support locally driven strategies, local leaders, residents, government officials, and community stakeholders work together to create and implement a transformation plan that revitalizes distressed HUD housing, supports residents, and addresses challenges in the neighborhood.

The Choice grant is expected to attract greater levels of public and private investment in the neighborhood, increase civic engagement, and catalyze physical and perceived neighborhood improvements. Choice activities include the redevelopment of public or assisted target housing, the provision of case management and supportive services for residents of the target developments, and physical improvements in the neighborhoods, such as improvements to infrastructure and public facilities, neighborhood housing, commercial development, and community amenities. Since fiscal year (FY) 2010-11, HUD has engaged with more than 30 neighborhoods, awarding more than $862 million in Implementation Grants and $38 million across 85 Planning Grants.

Under contract with HUD, The Urban Institute completed a baseline evaluation in 2016 of the first five Choice implementation sites: Quincy Corridor neighborhood in Boston, Massachusetts; Woodlawn neighborhood in Chicago, Illinois; Iberville/Tremé neighborhood in New Orleans, Louisiana; Eastern Bayview neighborhood in San Francisco, California; and Yesler neighborhood in Seattle, Washington.

In 2018, HUD was directed by Congress to undertake a follow-up evaluation of the Choice Neighborhoods Program.1 In 2019, HUD contracted with The Urban Institute and its partners, Decision Information Resources (DIR) and Case Western Reserve University’s (CWRU) National Initiative on Mixed-Income Communities, for a follow-up evaluation, which is the subject of this information collection request. The follow-up evaluation builds on the Urban Institute’s baseline study by incorporating measures of outcomes and impact in key areas of interest. The evaluation will use qualitative and quantitative methods to answer the following overarching research question: whether public and private dollars were successfully leveraged to (1) replace distressed public and assisted housing with high-quality mixed-income housing that is well managed and responsive to the needs of the surrounding neighborhood, (2) improve outcomes for households in the target housing, including in employment and income, health, and education, and (3) create the conditions necessary for public and private reinvestment in distressed neighborhoods to improve amenities and assets. The evaluation will include the original neighborhoods from the baseline evaluation and the following 2013 grantees: Near East Side neighborhood in Columbus, Ohio; South Norwalk neighborhood in Norwalk, Connecticut; North Central Philadelphia neighborhood in Philadelphia, Pennsylvania; and Larimer/East Liberty neighborhood in Pittsburgh, Pennsylvania. Data are collected under 12 U.S.C. §§ 1701z-1 and 1701z-2.


2. Indicate how, by whom and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

This evaluation is designed to answer a range of questions about Choice implementation and early outcomes across nine Choice sites:

  1. Quincy Corridor neighborhood in Boston, Massachusetts

  2. Woodlawn neighborhood in Chicago, Illinois

  3. Iberville/Tremé neighborhood in New Orleans, Louisiana

  4. Eastern Bayview neighborhood in San Francisco, California

  5. Yesler neighborhood in Seattle, Washington

  6. Near East Side neighborhood in Columbus, Ohio

  7. South Norwalk neighborhood in Norwalk, Connecticut

  8. North Central Philadelphia neighborhood in Philadelphia, Pennsylvania

  9. Larimer/East Liberty neighborhood in Pittsburgh, Pennsylvania



The research questions are organized into four main topic areas; each question may be answered through multiple data collection modes. The four topic areas and associated research questions are:



  1. Housing

    1. How has the housing mix changed in the revitalized developments and, by extension, in the surrounding neighborhood? How well did grantees accomplish their plans to redevelop mixed-income target housing? What, if any, were the impediments to completion?

    2. How well did grantees adhere to their one-for-one replacement requirements? What, if any, were the impediments to completion?

    3. What are the differences in how grantees with public housing versus grantees with HUD-assisted target housing progressed?

    4. How has the revitalized housing physically changed and what design principles (e.g., compatibility with and enrichment of the surrounding neighborhood, defensible space, connectedness to the surrounding neighborhood) were incorporated?

    5. How has management and quality of housing changed for replacement units?

    6. What is the management and housing quality of non-replacement units?



  2. People

    1. Resident characteristics and experience:

      1. What are the characteristics of households living in the different unit types, and how have those changed over time?

      2. Has Choice contributed to a self-reported improvement in their housing conditions, economic prospects, and quality of life?

    2. Housing satisfaction, relocation, and stability:

      1. What was the resident experience with relocation and re-occupancy (as applicable)?

      2. What are the experience and perspectives of residents living in the replacement units?

      3. What is residents’ satisfaction with their current housing and neighborhood?

    3. Case management and supportive services coordination:

      1. What are the differences in how grantees provided case management and coordinated services, and which strategies seemed more effective in achieving resident outcomes?



  3. Neighborhoods

    1. How did the neighborhoods change during and after Choice, as reflected in housing market conditions, general economic conditions, and public safety?

    2. What changes have occurred in the quantity and nature of investment in the neighborhoods since the time of application?

    3. How are grantees using Choice funding for Critical Community Improvements?



  4. Governance

    1. Did Choice lead to strengthened or new partnerships and leveraged resources?



Information Collection Request Overview

This Information Collection Request (ICR) includes eleven data collection instruments to collect information from Choice residents; HUD staff; housing, people, and neighborhood implementation leads; and other local stakeholders, such as anchor institution staff and city officials:



Instruments to collect data from Choice residents:

  • Choice Neighborhoods Outcomes Study Household Survey (Appendix A)

  • Choice Neighborhoods Protocol: Resident/Neighborhood Leader (Appendix B)



Instruments to collect data from other Stakeholders:

  • Choice Neighborhoods Protocol: Grantee/City Officials (Appendix C)

  • Choice Neighborhoods Protocol: HUD Choice Neighborhoods Grant Managers (Appendix D)

  • Choice Neighborhoods Protocol: Housing (Appendix E)

  • Choice Neighborhoods Protocol: People (Appendix F)

  • Choice Neighborhoods Protocol: Education (Appendix G)

  • Choice Neighborhoods Protocol: Neighborhoods (Appendix H)


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

The study uses a household survey, semi-structured interviews, neighborhood observations, and analysis of secondary and administrative data. For each data collection activity, the contents of the instrument were compared against the research questions forming the project’s goals to ensure that each instrument is thorough yet collects only the minimum information necessary for the project’s purpose. The contractor will conduct each data collection activity involving public burden using the mode that is (1) most appropriate for the research questions being answered and (2) least burdensome for respondents.


The survey of households in the Choice neighborhoods will take advantage of computer-assisted survey technology to reduce burden on respondents. The 35-minute instrument will be offered via web, by telephone, and in person to give respondents the easiest means of providing data. The subcontractor will program the survey in Voxco, a mixed-mode survey software package, so the survey can be administered seamlessly across web, telephone, and field interviewer-assisted modes. The web instrument will allow respondents to complete the survey at a time convenient for them without the risk of losing a paper questionnaire; respondents who are unable to complete the survey in one sitting may save their place and return to the questionnaire at another time. The instrument will automatically skip to the next appropriate question based on a respondent’s answers, reducing missing data. Respondents will receive an advance mailer that provides information about the study, displays the OMB control number and other required PRA language, and provides a URL for completing the survey online and a telephone number for completing it by phone. The advance mailing will offer information in large font for individuals with visual impairments and notify individuals with disabilities that they may request reasonable accommodations in order to participate in the study. The Section 508-compliant web survey will offer an alternative to the phone survey for individuals with hearing impairments. The advance mailer will also detail language assistance options for persons with limited English proficiency (LEP); the research team will make the survey available in four other languages, as needed. Survey language needs are determined before the survey is administered so that survey translations and interviewer trainings can be conducted systematically. For the Choice baseline survey, the research team identified the five most commonly spoken languages in the Choice communities (English, Spanish, Cantonese, Somali, and Vietnamese) and offered the telephone survey in those languages. Disposition data from the baseline survey indicate that only 69, or 1.7 percent, of the 3,967 households invited to participate did not do so because of a language barrier. The outcomes study survey will be offered in the same languages; the web survey will be offered in English and Spanish, the two most common languages.
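
For illustration only, the kind of conditional routing (skip logic) described above can be sketched as follows. This is a minimal, hypothetical example: the question identifiers and routing rules are invented for this sketch, and the actual instrument is programmed in Voxco, not in Python.

```python
# Illustrative sketch of survey skip logic: route respondents past modules that do
# not apply to them, based on prior answers. Question IDs and routing rules are
# hypothetical, not the actual Voxco programming for the Choice household survey.
from typing import Dict, Optional

ROUTES = {
    # current question -> (answer that triggers the branch, next question if it matches,
    #                      next question otherwise)
    "Q1_lived_at_baseline": ("yes", "Q2_relocation_experience", "Q5_housing_satisfaction"),
    "Q2_relocation_experience": (None, "Q5_housing_satisfaction", "Q5_housing_satisfaction"),
    "Q5_housing_satisfaction": (None, None, None),  # end of this illustrative module
}

def next_question(current: str, answers: Dict[str, str]) -> Optional[str]:
    """Return the next question to administer, or None when the module ends."""
    trigger, if_match, otherwise = ROUTES[current]
    if trigger is None:
        return if_match
    return if_match if answers.get(current) == trigger else otherwise

# Example: a respondent who did not live in the target development at baseline
# skips the relocation module entirely, reducing burden and missing data.
print(next_question("Q1_lived_at_baseline", {"Q1_lived_at_baseline": "no"}))
# -> Q5_housing_satisfaction
```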


The qualitative interviews require direct person-to-person communication to capture the depth of information needed to understand the Choice program’s outcomes. To reduce respondent burden, respondents may skip any question they cannot or do not want to answer, and interviewers will limit each interview to one hour. With respondents’ permission, the project team will audio-record the interviews to support transcription and minimize the need for follow-up clarification. The research team will provide reasonable accommodations for individuals with disabilities, based on requests identified while scheduling interviews; accommodations may include, for example, the use of telecommunications relay services for respondents who are deaf or hard of hearing. The research team will also provide language assistance services to persons with limited English proficiency and anticipates using interpreters to interview such respondents, including for some community leader interviews in Seattle that may be conducted in a language other than English. The team will identify community leaders as early as possible to determine the need for interpreter services.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

HUD is not aware of any other study that this research effort would duplicate. No other data source, either public or private, has been identified that provides the information available from these data collections. There is no similar information available at the national, regional, or local level that could be used or modified for the purposes described.

The data collection instruments were designed so that no two instruments collect the same information, even when addressing the same research question. Different respondents may be asked the same questions to capture different knowledge and perspectives.

This study includes a sizeable household survey that will provide information about all three components of the Choice program (Housing, People, and Neighborhoods) and will include three groups at each of the five baseline sites: (1) households from the original target developments; (2) households in new, revitalized non-replacement units; and (3) households in new, revitalized replacement units. Skip logic will be built into the survey so that subsets of questions are administered to the appropriate respondent groups.

5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I) describe any methods used to minimize burden.

This information collection will not have a significant impact on a substantial number of small entities. A small number of representatives from small nonprofit organizations and service providers will be asked to participate in qualitative interviews, but participation is voluntary. The qualitative interviews require direct person-to-person communication to capture the depth of information needed to understand the Choice program’s outcomes. To reduce respondent burden, respondents may skip any question they cannot or do not want to answer, and each interview will be limited to one hour, with 30 minutes of preparation and follow-up.

6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

The survey and interviews are one-time data collection events; each respondent will be surveyed or interviewed only once. The proposed data collection activities aim to provide the government with evidence on the outcomes and impact of the Choice program and the investment it brings to the surrounding neighborhoods. If the proposed activity is not implemented, the government will have to rely on incomplete or limited information to recommend program or policy improvements.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • requiring respondents to report information to the agency more than quarterly;

  • requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • requiring respondents to submit more than an original and two copies of any document;

  • requiring respondents to retain records other than health, medical, government contract, grant-in-aid, or tax records for more than three years;

  • in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


The proposed data collection activities are consistent with the guidelines set forth in 5 CFR part 1320 (Controlling Paperwork Burdens on the Public). There are no special circumstances that require deviation from these guidelines; none of the circumstances listed above applies to this collection.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

  • Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping disclosure, or reporting format (if any) and the data elements to be recorded, disclosed, or reported.

  • Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years -- even if the collection of information activity is the same as in prior periods. There may be circumstances that preclude consultation in a specific situation. These circumstances should be explained.

In accordance with 5 CFR 1320.8(d) (Paperwork Reduction Act of 1995), HUD published a 60-Day Notice of Proposed Information Collection in the Federal Register on June 1, 2020 (Docket No. FR-7029-N-04, pages 33189-33191), announcing the agency’s intention to request OMB review of data collection activities for the Outcomes Evaluation of the Choice Neighborhoods Program. The notice provided a 60-day period for public comment, with comments due July 31, 2020. No public comments were received. A copy of the notice is included with this ICR as Appendix I.

The Outcomes Evaluation of the Choice Neighborhoods Program was developed and is being implemented with the assistance of The Urban Institute, the study’s contractor. Key members of the Urban Institute’s team include Project Manager Kathryn Reynolds; Co-Principal Investigators Diane Levy and Brett Theodos; and Technical Reviewer Ingrid Gould Ellen.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

The research team is offering a $50 incentive for survey participation to households in the Choice program and a $40 incentive to resident leaders for qualitative interviews. Additionally, to thank the 50 pre-test participants for their time taking the survey and answering questions about the survey experience, DIR will offer a $50 incentive payment in the form of either a hardcopy gift card or an electronic payment. To increase awareness of the survey and encourage sample members to update their contact information so that they can be reached by phone or email, DIR will send a flyer to all sample members with a $2 cash pre-incentive. In addition, respondents who complete the web-based survey during a two-week period before computer-assisted telephone interviewing (CATI) begins will receive a $10 early-bird incentive on top of the $50 completion incentive. Individuals with disabilities must be provided reasonable accommodations, and contractors must ensure effective communication with individuals with disabilities during this study. Similarly, meaningful access must be provided to persons with limited English proficiency.

Research has shown that such tokens of appreciation are effective at increasing participation from populations with lower education levels2 as well as low-income and nonwhite populations.3 Leading survey research organizations, including the American Association for Public Opinion Research (AAPOR) and the American Statistical Association (ASA), agree that incentives improve response rates and are financially prudent for survey researchers because they reduce the time spent pursuing responses; incentives used for federal surveys can therefore save taxpayer money.

By encouraging otherwise reluctant respondents to participate, the incentives reduce the risk of nonresponse bias, namely the risk that the research team draws inaccurate or biased conclusions about the program.

To prevent the token of appreciation from being coercive, the Urban Institute’s Institutional Review Board (IRB) will carefully review the information to be disclosed to potential subjects to ensure that the incentives, and how they will be provided, are clearly described. The consent process will accurately state known benefits without exaggerating them. In addition, $50 was the amount offered for participation in the baseline survey seven years ago and is considered appropriate given the time participants are being asked to give.

10. Describe any assurance of confidentiality provided to respondents and the basis for assurance in statute, regulation or agency policy.

HUD has entered into a contract with an independent research firm, The Urban Institute, to conduct this research effort. HUD and The Urban Institute will make every effort to maintain the privacy of respondents, to the extent permitted by law. The subjects of this information collection and the nature of the information to be collected require strict confidentiality procedures. The information requested under this collection is protected and held private in accordance with 5 U.S.C. § 552a (Privacy Act of 1974) and OMB Circular No. A-130. HUD will publish a System of Records Notice (SORN) in the Federal Register before any data that will be stored and retrieved using personally identifiable information (PII) are collected. The SORN is currently under review with HUD’s Privacy Office. Additionally, individuals will not be cited as sources of information in prepared reports.

Personally Identifiable Information

The information we collect will be kept private to the extent permitted by law. Names will not be linked to comments or responses.

Prior to administering the household survey, the Urban Institute will request PII from HUD to use for sampling and for contacting sample members. This will include the following fields:

  • Address: Unit address; Unit apartment number; City; State (Postal); Zip code; MSA; Latitude; Longitude; Census tract; State (FIPS); County; Census Block Group; Central city indicator; Minor Civil Division; Place code

  • Head of household contact information (full name)

  • Flag for whether the household was a baseline survey sample member (CN_ID, FDSAMPLE, NRSAMPLE in the baseline survey file)

  • Flag for baseline survey completion

  • Date record was updated

  • Date last recertified

During the household survey, respondents will be asked to provide additional PII about their household. This will include the following:

  • Marital status of the householder

  • Ethnicity of the householder

  • Race of the householder

  • First name, initials, or nickname for every person in the household

  • Relationship of every person in the household to the householder

  • Birthday of everyone in the household (with age confirmation)

  • Sex of everyone in the household

Data will be publicly reported in aggregate form only. The Urban Institute will obtain Institutional Review Board (IRB) approval for all data collection under this contract. The Urban Institute developed, and its IRB approved, a confidentiality pledge. All researchers working with the data will read and sign the confidentiality pledge, agreeing to adhere to the data security procedures laid out in the approved IRB submission. The contractor will safeguard all data, and only authorized users will have access to them. Information gathered for this study will be made available only to researchers authorized to work on the study. Information will not be maintained in a paper or electronic system from which data are directly retrieved by individuals’ personal identifiers.

Assurances of Privacy

Survey respondents will be told the purposes for which the information is collected, and that any identifiable information about them will not be used or disclosed for any other purpose, except under such circumstances as may be required by law. Respondents will be given this assurance during recruitment, and in the survey instruments. Respondents will be informed that participation is voluntary, that they may refuse to answer any question, and that they may stop their participation at any time.

For qualitative interviews, The Urban Institute will use the informed consent documents attached to each interview protocol to obtain consent for participation in the study. These forms detail the risks and benefits of participating and the expected privacy for each participant. These respondents are not in categories designated as vulnerable populations, and the information the evaluation team will collect is not highly sensitive. Because some study participants will be local agency or organization leaders, administrators, or staff members, and because the team will name the sites in our reports, individuals reading the reports may be able to attribute particular information or comments to those respondents. The evaluation team will inform respondents about this potential risk.

Data Security and Monitoring

The Urban Institute will have a data security plan that outlines how the project will store, transfer, and destroy sensitive information, as well as the precautions to be taken during each of these activities to ensure the security of those data. The contractor has a secure server for web-based data collection, utilizing its existing and continuously tested web-based survey infrastructure. The infrastructure features HTTPS (secure socket, encrypted) data communication; authentication (login and password); firewalls; and multiple layers of servers, all implemented on a mixture of platforms and systems to minimize vulnerability to security breaches. Hosting on an HTTPS site ensures that data are transmitted using 128-bit encryption, so that transmissions intercepted by unauthorized users cannot be read as plain text. This security measure is in addition to standard password authentication that precludes unauthorized users from accessing the web application. The contractor has established data security plans for handling all data during all phases of survey execution and data processing for the surveys it conducts. Its existing plans meet the requirements of U.S. federal government agencies and are continually reviewed in light of new government requirements and survey needs. This security is based on (1) exacting company policy promulgated by the highest corporate officers in consultation with systems staff and outside consultants, (2) a secure systems infrastructure that is continually monitored and evaluated with respect to security risks, and (3) secure work practices of an informed staff who take all necessary precautions when dealing with private data.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

No sensitive questions will be asked as part of the qualitative interviews; the only sensitive questions are in the household survey instrument (Appendix A). Respondents may skip any questions they do not want to answer.


The goal of the survey is to obtain information about the experiences of residents who lived or currently live in the target developments. For those who lived in the target developments at baseline, the survey will obtain information about their well-being in 2021 that can be compared to their well-being in 2013 and 2014. For those who live in the target developments in 2021, the survey will provide information about their current circumstances.


Some of our research questions about resident experiences require us to ask survey questions that people consider sensitive. Most people consider their income and experiences with hardship, health, household composition, and information about their children to be sensitive. This survey collects information about each of these areas.


Before starting the survey, all respondents will be informed that their identities will be kept private and that they do not have to answer any question that makes them uncomfortable. Although some questions may be sensitive for some respondents, they have been successfully asked of similar respondents in other data collection efforts, such as in the Choice Baseline Study.


12. Provide estimates of the hour burden of the collection of information. The statement should:

  • indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices;

  • if this request covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I; and

  • provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead this cost should be included in Item 13.

Exhibit A-1 shows the projected burden hour estimates for data collection for the household survey and the qualitative interviews with HUD staff; housing, people, and neighborhood implementation leads; and other local stakeholders, such as anchor institution staff and city officials. These estimates assume the maximum possible number of study participants and are based on the time needed to complete each data collection activity. The total annual cost burden to respondents is approximately $38,028.00.

For the household survey (Appendix A) of Choice neighborhood residents (residents of market-rate and subsidized housing units) in the new developments, we assume that the respondent earns approximately $17 per hour. This is the weighted average of the hourly wage for a laborer, a licensed nurse, and an office clerk, as reported in the Bureau of Labor Statistics’ Occupational Employment Statistics as of May 2018. For 2,388 respondents taking 0.58 hours (approximately 35 minutes) to complete the survey, the total cost would be $23,545.68. We will also interview 5 resident leaders (Appendix B) as part of the qualitative assessment of Choice Neighborhoods, assuming the same wage rate for these residents ($17 per hour).

For the high-level informants (i.e., city officials and lead grantees) of Choice (Appendix C), we assume that the likely respondent earns approximately $42.30 per hour. We expect to interview 45 respondents for 60 minutes. We also assume that respondents will spend a half hour reviewing materials prior to the interview, for a total of 1.5 hours. The total cost would be $2,855.25.

For HUD staff informants (Appendix D), we assume that the likely respondent earns an average of approximately $75.82 per hour (GS13-15 pay as reported by the Office of Personnel Management as of January 2020). We expect to interview 18 respondents for 60 minutes. Again, we also assume that respondents will spend a half hour reviewing materials prior to the interview, for a total of 1.5 hours. The total cost would be $2,047.14.

For Housing informants (Appendix E) (i.e., housing developers, housing implementation lead, public housing and affordable-housing property management staff), we assume that the likely respondent earns approximately $35.39 per hour. This is the weighted average of the hourly wage for architects and property managers as reported by the Bureau of Labor Statistics’ Occupational Employment Statistics in May of 2018. We expect to interview 54 respondents for 60 minutes. We assume that respondents will spend a half hour reviewing materials prior to the interview, for a total of 1.5 hours. The total cost would be $2,866.59.

For the People informants (i.e., case management and other service providers, people implementation lead) of Choice (Appendix F), we assume that the likely respondent earns approximately $23.92 per hour. This is the weighted average of the hourly wage for child, family and school social workers as reported by the Bureau of Labor Statistics’ Occupational Employment Statistics in May of 2018. We expect to interview 63 respondents for 60 minutes. We assume that respondents will spend a half hour reviewing materials prior to the interview, for a total of 1.5 hours. The total cost would be $2,260.44.

For the Education informants (i.e., education implementation lead and education-related implementation staff) of Choice (Appendix G), we assume that the likely respondent earns approximately $23.92 per hour. This is the weighted average of the hourly wage for child, family and school social workers as reported by the Bureau of Labor Statistics’ Occupational Employment Statistics in May of 2018. We expect to interview 27 respondents for 60 minutes. We assume that respondents will spend a half hour reviewing materials prior to the interview, for a total of 1.5 hours. The total cost would be $968.76.

Lastly, for the Neighborhood informants (i.e., local anchor institution staff, community leaders, implementation lead, and local police precinct commanders) of Choice (Appendix H), we assume that the likely respondent earns approximately $35.52 per hour. This is the weighted average of the hourly wage for clergy, police commanders and nonprofit CEOs as reported by the Bureau of Labor Statistics’ Occupational Employment Statistics in May of 2018. We expect to interview 63 respondents for 60 minutes. We assume that respondents will spend a half hour reviewing materials prior to the interview, for a total of 1.5 hours. The total cost would be $3,356.64.

Exhibit A-1: Estimated Hour and Cost Burden of Information Collection

Information Collection | Number of Respondents | Frequency of Response | Responses Per Annum | Burden Hours Per Response | Annual Burden Hours | Hourly Cost Per Response | Cost
Household survey | 2,388 | 1 | 2,388 | 0.58 | 1,385.04 | $17.00 | $23,545.68
Interviews with resident leaders | 5 | 1 | 5 | 1.5 | 7.5 | $17.00 | $127.50
Interviews with high-level informants: lead grantees, city officials and staff | 45 | 1 | 45 | 1.5 | 67.5 | $42.30 | $2,855.25
Interviews with HUD staff | 18 | 1 | 18 | 1.5 | 27 | $75.82 | $2,047.14
Interviews with housing informants: housing implementation lead, housing developers, public housing and affordable-housing property management staff | 54 | 1 | 54 | 1.5 | 81 | $35.39 | $2,866.59
Interviews with people informants: people implementation lead, case management staff, other service providers | 63 | 1 | 63 | 1.5 | 94.5 | $23.92 | $2,260.44
Interviews with education informants: education implementation lead, education implementation staff | 27 | 1 | 27 | 1.5 | 40.5 | $23.92 | $968.76
Interviews with neighborhood informants: implementation lead, local police precinct commanders, local anchor institution staff, community leaders | 63 | 1 | 63 | 1.5 | 94.5 | $35.52 | $3,356.64
Total | 2,663 |  | 2,663 |  | 1,797.54 |  | $38,028.00
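
The figures in Exhibit A-1 follow directly from multiplying the number of respondents by the burden hours per response and by the hourly cost per response. As an illustrative check only (a minimal sketch using the values from the exhibit; it is not part of the approved collection materials), the totals can be reproduced as follows:

```python
# Reproduces the burden-hour and cost arithmetic in Exhibit A-1.
# Values are taken directly from the exhibit; this is an illustrative check only.

rows = [
    # (collection, respondents, hours_per_response, hourly_cost)
    ("Household survey",                 2388, 0.58, 17.00),
    ("Resident leader interviews",          5, 1.5,  17.00),
    ("High-level informant interviews",    45, 1.5,  42.30),
    ("HUD staff interviews",               18, 1.5,  75.82),
    ("Housing informant interviews",       54, 1.5,  35.39),
    ("People informant interviews",        63, 1.5,  23.92),
    ("Education informant interviews",     27, 1.5,  23.92),
    ("Neighborhood informant interviews",  63, 1.5,  35.52),
]

total_hours = sum(n * h for _, n, h, _ in rows)
total_cost = sum(n * h * w for _, n, h, w in rows)
print(f"Total annual burden hours: {total_hours:,.2f}")   # 1,797.54
print(f"Total annual cost burden:  ${total_cost:,.2f}")   # $38,028.00
```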




13. Provide an estimate of the total annual cost burden to respondents or recordkeepers resulting from the collection of information (do not include the cost of any hour burden shown in Items 12 and 14).


  • The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life); and (b) a total operation and maintenance purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s) and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities;

  • If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10) utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

  • generally, estimates should not include purchases of equipment or services, or portions thereof made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


This data collection effort involves no recordkeeping or reporting costs for respondents other than the time burden to respond to questions on the data collection instruments as described in item 12 above. There is no known additional cost burden to the respondents.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.



This data collection for the Outcomes Evaluation of the Choice Neighborhoods Program is being carried out under a HUD contract with The Urban Institute and its subcontractors, DIR and CWRU. The estimated cost to the Federal government for this data collection totals $1,109,294. This cost is based on the contractor’s and subcontractors’ labor hours to (a) develop the surveys and interview protocols for the study ($165,119) and (b) perform data collection and analysis activities, including outreach to participants, administration of surveys and interviews, and data analysis ($944,175).

The data collection costs are one-time costs based on the competitively bid and awarded contract for this study.



15. Explain the reasons for any program changes or adjustments reported in Items 13 and 14 of the OMB Form 83-I.

This submission is a new request for approval.



16. For collection of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


The data collected will be analyzed, tabulated, and reported to HUD by the evaluation contractor, The Urban Institute.


Exhibit A-3 presents an overview of the data collection and analysis schedule.



Exhibit A-3: Project Schedule

Task | Description | Timeframe (after OMB approval)
Survey administration | Surveys of households in the Choice neighborhoods | June 2021 – October 2021
Qualitative interviews | Interviews with high-level staff, HUD staff, and housing, people, and neighborhood leads | March 2021 – June 2021
Final paper | Drafting and release of final paper | December 2021 – January 2022



17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The expiration date for OMB approval will be displayed on all forms completed as part of the data collection.

18. Explain each exception to the certification statement identified in item 19.

This submission describing data collection requests no exceptions to the certification statement for this Paperwork Reduction Act submission (5 CFR 1320.9).

1 See pages H2880 - H2881 of the Congressional Record for March 22, 2018, included as an attachment and also available at: https://www.congress.gov/crec/2018/03/22/CREC-2018-03-22-bk3.pdf


2 Berlin, Martha, Leyla Mohadjer, Joseph Waksberg, Andrew Kolstad, Irwin Kirsch, D. Rock, and Kentaro Yamamoto. 1992. "An experiment in monetary incentives." Proceedings of the Survey Research Methods Section of the American Statistical Association. Alexandria, VA: American Statistical Association.

3 James, Jeannine and Richard Bolstein. 1990. “The Effect of Monetary Incentives and Follow-Up Mailings on the Response Rate and Response Quality in Mail Surveys.” Public Opinion Quarterly 54 (3): 346-61.

