
SUPPORTING STATEMENT

U.S. Department of Commerce

U.S. Census Bureau

ACS Messaging Key Informant Interviews

OMB Control No. 0607-0760


This Supporting Statement provides additional information regarding the Census Bureau’s request for approval of the proposed information collection, ACS Messaging Key Informant Interviews. The numbered questions correspond to the order shown on Office of Management and Budget Form 83-I, “Instructions for Completing OMB Form 83-I.”


A. Justification


  1. Necessity of the Information Collection


The U.S. Census Bureau requests authorization from the Office of Management and Budget (OMB) to conduct ACS Messaging Key Informant Interviews, as part of the CLMSO Generic Clearance for Data User and Customer Evaluation Surveys, OMB No. 0607-0760.


The American Community Survey (ACS) collects detailed socioeconomic data from about 3.5 million households in the United States and 36,000 in Puerto Rico each year. The ACS is a multi-mode survey: households initially receive a series of mailings encouraging them to respond online or by mail. Census Bureau representatives then attempt to follow up with the remaining households by telephone, and a sub-sample of households that cannot be matched with a telephone number is reached through in-person visits from a Census Bureau field representative.


Tabulations from that data collection are released publicly on an annual basis. The ACS allows the Census Bureau to provide timely and relevant housing and socio-economic statistics even for small levels of geography. The ACS generates data that help determine how more than $450 billion in federal and state funds for education, public health, and other public interests are distributed each year (Groves, 2012). In addition, ACS data is widely used by analysts in the private sector and academic community. The quality and representativeness of the population estimates derived from the ACS, and the success of the survey, rest largely on the response rate (see 2013 ACS Integrated Communications Plan).


The American Community Survey Office (ACSO) is currently conducting a series of closely related research projects on ACS messaging and the mail package. These studies use a variety of qualitative and quantitative methodologies, including cognitive interviews, mental modeling, quantitative surveys, and field tests. This research aims to increase ACS participation rates and improve the value of Census Bureau data products to the broader universe of data users. The Census Bureau is also developing strategies to address concerns that the ACS is too intrusive, especially when recipients receive phone calls or personal visits from the Census Bureau. Improving self-response rates from the initial mailings could reduce the number of those follow-up contacts that impact the public.


In addition, substantial taxpayer savings can result from greater participation in the initial self-response phases (primarily online and by mail), because higher self-response reduces labor-intensive telephone and in-person follow-up activities. In 2011, just under 60 percent of households self-responded to the ACS data collection (Olson, 2013). The Census Bureau anticipated a net savings of more than $875,000 per year in nonresponse follow-up costs from an additional reminder postcard that increased overall response rates by 1.6 percent (see Chestnut, 2010).
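
To make the cost logic concrete, the back-of-the-envelope sketch below works through the arithmetic this paragraph implies. Only the 1.6 percent response lift and the more-than-$875,000 net-savings figure come from this statement (Chestnut, 2010); the sample size is approximate, and both unit costs are hypothetical placeholders, not Census Bureau figures.

```python
# Hypothetical sketch of the self-response savings logic. Only the 1.6
# percent lift and the >$875,000 net-savings figure appear in this
# statement; the unit costs below are illustrative placeholders.

annual_sample = 3_500_000      # approximate ACS annual household sample
response_lift = 0.016          # added self-response from the reminder postcard

cases_avoided = annual_sample * response_lift   # follow-up cases averted
followup_cost_per_case = 30.00                  # hypothetical phone/in-person unit cost
postcard_cost_per_unit = 0.22                   # hypothetical print-and-mail unit cost

net_savings = (cases_avoided * followup_cost_per_case
               - annual_sample * postcard_cost_per_unit)
print(f"Illustrative net savings: ${net_savings:,.0f}")  # about $910,000
```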


Previous Census Bureau analysis has identified socioeconomic and demographic characteristics that predict self-response rates. In an analysis of 2005 ACS self-response rates, characteristics such as income below the poverty line, minority status, and non-citizen status were associated with below-average self-response rates. Other factors associated with lower response included renting, being single, living in multi-unit buildings, having less than a high school education, and being under age 30 (see Bates, 2008). In nonresponse research on another large-scale government-funded survey, the American Time Use Survey (ATUS), researchers found substantial support for the notion that socioeconomic factors play a role in nonresponse, suggesting that these individuals are less well integrated into society and are therefore harder to reach (Abraham, Maitland, and Bianchi, 2006). As a result, this study design incorporates elements that explore how local government agencies and community non-profits effectively reach those high-interest communities.


  2. Needs and Uses


As part of the ACS Messaging and Mail Package Assessment, this study is designed to gather insights from leaders in organizations that use data professionally or conduct outreach to low-income, minority, or immigrant populations. By better understanding how these two groups interact with ACS data and with their communities, the Census Bureau can improve its outreach and data delivery efforts, especially to the high-interest populations that have the lowest ACS self-response rates and currently require costly follow-up from Census Bureau field representatives. This study is specifically designed to reach beyond existing formal channels for external consultation (such as advisory boards and committees) in order to identify original approaches for the ACS.


Specifically, the objectives of the Key Informant Interviews are to:

  • Assess existing experience with the ACS, along with knowledge gaps and barriers, in each stakeholder segment

  • Identify other information sources that data users access for population data

  • Understand how to link ACS data to areas of interest for various stakeholder segments

  • Find engagement opportunities to encourage greater usage of ACS data by decision-makers outside of the federal government

  • Highlight key barriers, outreach channels, and effective messages for reaching high-interest communities

  • Recommend how to incorporate the findings, particularly those concerning high-interest communities, into subsequent research projects, such as the ACS Refinement Messaging study


The Key Informant Interviews will use a qualitative telephone methodology to gain rich insights from leaders in organizations that use data professionally or conduct outreach to low-income, minority, or immigrant populations. We will conduct n=100 in-depth telephone interviews with priority stakeholders from five key segments nationally.


The sample sizes are designed to allow the research team to assess a broad range of stakeholders, with more interviews allotted to segments that have greater heterogeneity and a broader range of experiences. The five segments (with sample sizes) are: private sector business (n=30), academic / research (n=15), state and local government (n=15), tribal government and organizations (n=10), and community or advocacy associations (n=30).
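
As a quick illustration, the sketch below (illustrative Python; the segment names and counts are taken directly from the paragraph above) confirms that the allocation sums to the planned n=100:

```python
# The five stakeholder segments and their allotted interviews, as
# described in the sampling paragraph above.
segments = {
    "private sector business": 30,
    "academic / research": 15,
    "state and local government": 15,
    "tribal government and organizations": 10,
    "community or advocacy associations": 30,
}

# Confirm the allocation sums to the planned total of 100 interviews.
assert sum(segments.values()) == 100
for name, n in segments.items():
    print(f"{name}: n={n}")
```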


Many ACS stakeholders in local government and community advocacy groups engage frequently with high-interest populations, especially low-income, minority, and immigrant communities that have the lowest self-response rates for the ACS. These interviews are designed to gain insight from organizations experienced in delivering outreach and services in the areas with the lowest ACS self-response. Insights from these groups will likely help improve efforts to increase response rates among high-interest communities, which will decrease the overall respondent burden of the ACS and decrease costs for the Census Bureau.


Additionally, investigating the changing needs of professional data users will enable the Census Bureau to review the way it reports ACS data to the public and help guide improvements to online data publication so that ACS data is most useful to key ACS stakeholders. This will help support the ACSO’s goal of being “valued by data users, trusted by the public, and supported by influential stakeholders” (ACS Integrated Communications Plan, 2013, p. 2).


These open-ended discussions will inventory stakeholder knowledge of ACS, identify key gaps, discuss potential themes and key messages, and assess the best communication and outreach channels.


These findings are designed to provide guidance for internal Census Bureau decision-making only. The findings are not intended for publication or public dissemination. While the results will inform ACS design and outreach to improve effectiveness and reduce cost, they will not be used to drive any policy decisions. Reports will state that the data were produced for messaging and exploratory research, not for official estimates or policy decisions. Further, data from the study will be included in reports with clear statements about methodology and limitations.


  3. Use of Information Technology


While collecting data, the Census Bureau will use computer-assisted telephone interviewing (CATI) technology to process responses. Interviews will also be recorded and transcribed digitally. This technology is the most appropriate collection methodology, given the need for rich qualitative insight from organizations spread across the country.


  4. Efforts to Identify Duplication


Though it builds on ACSO’s past studies of ACS messaging and outreach, this study does not duplicate previous efforts. The study is informed by a thorough review of existing Census Bureau research on communications (see “Communications Research and Analytics Review”) and expands upon those efforts by drilling down into usability and outreach issues with specific key stakeholders.


  5. Minimizing Burden


The Census Bureau has incorporated several design elements to minimize the burden on small entities, especially local governments (n=15), tribal governments and organizations (n=10), community and advocacy associations (n=30), and small private sector businesses (approximately one third of the n=30 interviews). The questionnaire has been streamlined, using closed-ended questions to route respondents to the relevant open-ended questions. This keeps the time burden on respondents to around 30 minutes.


  6. Consequences of Less Frequent Collection


This study is a one-time, special test with a defined period for data collection.


  7. Special Circumstances


The Census Bureau will collect these data in a manner consistent with OMB guidelines.


  8. Consultations Outside the Agency


Consultants outside of the Federal Government include:


Kiera McCaffrey

Reingold

202.559.4436

[email protected]


Sam Hagedorn

Penn Schoen Berland

202.962.3056

[email protected]


Jack Benson

Reingold

202.559.4455

[email protected]


Robert Green

Penn Schoen Berland

202.962.3049

[email protected]



  9. Paying Respondents


The research team will not pay respondents for participation.


  10. Assurance of Confidentiality


This survey is being conducted under Title 13, but the results will not be statutorily confidential. Rather, we will inform respondents that “we intend to protect your anonymity by not asking for your name, address, or other personal information that could easily identify you.”


  11. Justification for Sensitive Questions


This survey does not include questions of a sensitive nature. In addition, participants may decline to answer any question(s) they choose.


  12. Estimate of Hour Burden



                            Number of     Estimated        Estimated
                            Responses     Response Time    Burden Hours
Key Informant Interviews    100           30 minutes       50
Key Informant Screener      600           3 minutes        30
Total                                                      80
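
For transparency, the following minimal sketch (illustrative Python, not part of the collection instrument) reproduces the burden arithmetic in the table above, using only the response counts and per-response times shown:

```python
# A minimal sketch reproducing the burden-hour arithmetic from the table
# above. The counts and per-response times come from the table itself;
# the helper function is purely illustrative.

def burden_hours(responses: int, minutes_per_response: float) -> float:
    """Total burden in hours for one collection component."""
    return responses * minutes_per_response / 60

interviews = burden_hours(100, 30)  # 100 x 30 min = 50.0 hours
screener = burden_hours(600, 3)     # 600 x 3 min  = 30.0 hours
print(interviews, screener, interviews + screener)  # 50.0 30.0 80.0
```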


  13. Estimate of Cost Burden


There are no costs to respondents, other than that of their time, to respond to the survey.


  14. Cost to Federal Government


The Census Bureau incurs costs in staff time for reviewing development and analysis documents. In addition, the total cost of this project is an estimated $86,300.

This research is part of a larger communications services, strategic consulting, and applied research project with private contractors.


  15. Reason for Change in Burden


Not applicable; this is a new data collection.


  16. Project Schedule


While no results will be published, the project’s timeline is detailed below. The timeline assumes OMB approval by January 10, 2014.

Activity              Begin Date    End Date
OMB Approval          -             1/10/2014
Field Interviews      1/13/2014     2/7/2014
Conduct Analysis      2/10/2014     2/28/2014
Finalized Findings    -             3/17/2014


  17. Request to Not Display Expiration Date


The questionnaire will include the OMB control number and expiration date. This information will be conveyed verbally to the respondents.



  18. Exceptions to the Certification


There are no exceptions to the certification statement.


Attachments


A – Citations and References

B – Key Informant Interviews Questionnaire and Sampling Plan


