
SUPPORTING STATEMENT

U.S. Department of Commerce

U.S. Census Bureau

ACS Messaging Benchmark and Refinement Study

OMB Control No. 0607-0760


This Supporting Statement provides additional information regarding the Census Bureau's request for approval of the proposed information collection, the ACS Messaging Benchmark and Refinement Study. The numbered questions correspond to the order shown on the Office of Management and Budget Form 83-I, "Instructions for Completing OMB Form 83-I."


A. Justification


  1. Necessity of the Information Collection


The U.S. Census Bureau requests authorization from the Office of Management and Budget (OMB) to conduct the ACS Messaging Benchmark and Refinement Study as part of the CLMSO Generic Clearance for Data User and Customer Evaluation Surveys, OMB No. 0607-0760.


The American Community Survey (ACS) collects detailed socioeconomic data from about 3.5 million households in the United States and 36,000 households in Puerto Rico each year. Tabulations from that data collection are published annually. ACS data are widely used inside and outside the federal government, including to help determine how more than $450 billion in federal and state funds are distributed each year (Groves, 2012).


The ACS is a multi-modal survey: households initially receive a series of mailings encouraging them to respond online or by mail. These modes are collectively referred to as self-response. In 2012, just under 60 percent of households self-responded to the ACS (Olson, 2013). Census Bureau representatives attempt to follow up with the remaining households by telephone, and a sub-sample of households that cannot be matched to a telephone number is visited in person by Census Bureau field representatives. For a full description of the ACS data collection methodology, see the 2009 ACS Design and Methodology Report.


Telephone and in-person ACS completions are significantly more labor-intensive, and therefore more expensive, for the Census Bureau, so greater participation in the self-response phases can yield significant cost savings. For example, the Census Bureau anticipated a net savings of more than $875,000 per year in nonresponse follow-up costs from increasing the overall mail response rate by 1.6 percent with an additional reminder postcard (see Chesnut, 2010).


The American Community Survey Office (ACSO) is currently conducting a series of closely related research projects with potential ACS respondents on messaging and the ACS mail package. This research aims to increase participation rates in the ACS and to address concerns that the ACS is too intrusive.


This data collection activity is designed to support those goals by providing formative research that will help ACS decision-makers develop effective messages. The study will provide practical utility for guiding further communications research by testing differences between various messages. Other qualitative research projects, including key informant interviews, mental modeling interviews with Census Bureau staff who collect telephone and in-person responses, and deliberative focus groups, will all contribute to developing effective messages for respondents.


  2. Needs and Uses


The Census Bureau has previously conducted significant communications research around the decennial census. Segmentation studies such as the Census Barriers, Attitudes, and Motivators Study (CBAMS) and CBAMS II have found that appeals to community benefit are broadly effective at raising interest in participating in the census (see Bates et al., 2009; Conrey et al., 2012). Other research methodologies, such as focus groups, have reached similar conclusions (see Newburger, July 2009; Newburger, August 2009). In addition, messages about "mandatory participation" are highly effective at catching respondents' attention and boosting response rates in data collections (Leslie, 1996; Schwede, 2008; Navarro, 2011).


However, there are limits to applying decennial messaging research to an ongoing sample survey like the American Community Survey, because the ACS and the decennial census differ in major ways. The decennial enumeration is not only a high-visibility event with a large motivational campaign; it is also asked of every person in the United States and has a very tangible benefit in apportioning representation in Congress. In addition, the decennial census asks relatively few questions (in part because the ACS replaced the census long form for the 2010 cycle). By contrast, the American Community Survey is a continuous survey that is less well known and asks more questions of only a few participants in any specific geographic area at a time.


The current ACS Messaging and Mail Package Assessment research is designed to address these unique challenges. This research program aims to increase ACS self-response rates and improve the value of Census Bureau data products to data users. The Census Bureau is also developing messaging strategies to address concerns that the ACS is too intrusive, especially concerns arising from telephone calls or personal visits by Census Bureau field representatives. Improving self-response to the initial mailings would reduce the number of these follow-up contacts, which are costly for the Bureau and burdensome for the public.


The Messaging Benchmark and Refinement survey is a two-phase telephone messaging study that builds on qualitative research projects currently in progress. Each phase will consist of n=1,000 quantitative telephone interviews with U.S. adults who generally handle the mail for their household. Respondents will hear a random selection of six messages, with subsequent questions measuring their response to each message. The survey is designed to indicate, within a reasonable degree of certainty, which messages are more or less effective than others among survey respondents; it will not produce detailed statistical inferences about the population as a whole.


The initial Benchmark phase will focus on identifying the best message themes (e.g., civic duty, importance for governance, community benefit) surrounding participation in the ACS. The Refinement phase will build on those findings by drilling down on particular phrases, words, and tones within the highest-scoring Benchmark themes that convey the importance of the ACS. The two phases of the study will be independent: respondents to the Benchmark phase will not be contacted in the Refinement phase. As a result, the study will not be used to measure changes in perception over time by comparing results between the two phases.


The sample will include n=750 completed landline interviews and n=250 completed cell phone interviews, all selected through random digit dialing (RDD). The landline sample will be stratified by ACS self-response rates to ensure that high-, medium-, and low-response counties are proportionally represented in the overall survey results. To obtain the necessary number of completes, we will begin with approximately n=52,250 working, non-commercial landline numbers and n=20,000 cell phone RDD numbers (details of the sampling plan are included in Part B). Because we are not aiming to calculate or disseminate statistically valid estimates about the population, the survey does not need to attain the OMB-standard response rates required to produce such estimates.
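
To illustrate the proportional stratification described above, the minimal Python sketch below allocates the 750 landline completes across response-rate strata. The stratum shares shown are hypothetical placeholders; the actual proportions are defined by the sampling plan in Part B.

    # Illustrative sketch only: proportional allocation of landline completes
    # across ACS self-response strata. The stratum shares are hypothetical;
    # the actual proportions come from the sampling plan in Part B.
    landline_target = 750
    stratum_shares = {
        "high-response counties": 0.40,    # hypothetical share
        "medium-response counties": 0.35,  # hypothetical share
        "low-response counties": 0.25,     # hypothetical share
    }
    allocation = {stratum: round(landline_target * share)
                  for stratum, share in stratum_shares.items()}
    print(allocation)
    # {'high-response counties': 300, 'medium-response counties': 262,
    #  'low-response counties': 188}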


During the message testing, respondents will hear six of eleven messages. After calculating the mean score for each message on the likelihood-to-respond metric (5 for "much more likely" and 1 for "much less likely"), we will rank the messages by their mean scores. With a sample of 545 randomized observations per message, we expect to be able to detect a Cohen's d of 0.28 between average message scores on the 5-point scale with 80% power and a family-wise α = .05, using a Bonferroni correction for multiple comparisons. Effect sizes of this magnitude are generally considered between small (d ≈ 0.2) and medium (d ≈ 0.5) in size (Cohen, 1992). Using Tukey's homogeneous subsets testing, we will obtain a ranking of which messages (or sub-groups of messages) are statistically different from one another.
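
A minimal sketch of this power calculation in Python follows, assuming independent two-sample comparisons between message means and a Bonferroni correction across all 55 pairwise tests among the eleven messages. Because the actual design assigns six messages to each respondent, within-respondent correlation may shift the exact detectable effect size, so the sketch is an order-of-magnitude check rather than a reproduction of the design computation.

    # Sketch of the detectable-effect-size calculation described above,
    # under simplifying assumptions (independent two-sample t-tests,
    # Bonferroni correction over all pairwise message comparisons).
    from statsmodels.stats.power import TTestIndPower

    n_messages = 11
    n_per_message = 545  # ~1,000 interviews x 6 of 11 messages per respondent
    n_comparisons = n_messages * (n_messages - 1) // 2  # 55 pairwise tests
    alpha_per_test = 0.05 / n_comparisons  # Bonferroni-adjusted alpha

    # Solve for the smallest Cohen's d detectable with 80% power.
    d = TTestIndPower().solve_power(
        effect_size=None,
        nobs1=n_per_message,
        alpha=alpha_per_test,
        power=0.80,
        ratio=1.0,
        alternative="two-sided",
    )
    print(f"Per-test alpha: {alpha_per_test:.5f}; minimum detectable d: {d:.2f}")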

The interviews will take approximately 20 minutes. Interviews will be conducted in English, with Spanish-language callbacks as necessary.


The analysis will examine specific messages to identify effective themes and promising areas for further research in the Refinement phase of the study. Participants will be asked several attitudinal questions similar to those in CBAMS II. Participants who are distrustful of or skeptical toward the government will evaluate potential message themes for improving trust in the Census Bureau. We conservatively anticipate that around 25% of the sample (or about 250 respondents) will qualify for the drill-down on intrusiveness and privacy questions.


The objectives of the ACS Messaging Benchmark and Refinement study are to:

  • Provide insight into applicable topics, themes, specific messages, phrases, and tones for increasing interest in participating in ACS data collection activities;

  • Recommend messages that effectively address concerns about privacy, intrusiveness, and harassment among people who are distrustful of government; and

  • Develop messages that can broadly inform other Census Bureau communications, including talking points, press releases, and the website.


These findings are designed to provide guidance for internal Census Bureau decision-making only; they are not intended for publication or public dissemination. While the results may inform ACS messaging and subsequent research, they will not be used to drive any policy decisions. Data from the study will be included in reports with clear statements about the methodology and its limitations.


  3. Use of Information Technology


While collecting data, the Census Bureau contractor will use computer-assisted telephone interviewing (CATI) technology to record responses and will autodial landline numbers through an interview management system. The contractor will submit landline phone numbers to automated verification before calling respondents, reducing the number of calls made to nonworking, fax, or commercial numbers.


  4. Efforts to Identify Duplication


By focusing on the ACS's specific messaging challenges, this study does not duplicate previous research efforts.


  5. Minimizing Burden


The proposed data collection consists of telephone questions asked of a sample of households, not businesses or other small entities.


  6. Consequences of Less Frequent Collection


The proposed study is a one-time, special test with a defined period for data collection.


  7. Special Circumstances


The Census Bureau will collect these data in a manner consistent with OMB guidelines.


  8. Consultations Outside the Agency


Consultants outside the Federal Government include:

Kiera McCaffrey

Reingold

202.559.4436

[email protected]


Jack Benson

Reingold

202.333.0400

[email protected]

Sam Hagedorn

Penn Schoen Berland

202.962.3056

[email protected]


Robert Green

Penn Schoen Berland

202.962.3049

[email protected]


  9. Paying Respondents


Respondents will not be paid or provided with gifts for participating in this data collection.




  10. Assurance of Confidentiality


We are conducting this study under Title 13, but the results will not be statutorily confidential. Rather, we will inform respondents that their answers are anonymous, and we will not ask for any personally identifiable information. The Census Bureau will keep all information collected in this test protected in a secure environment, and contact information will not be maintained. Respondents will be assured that their responses will be kept anonymous by the Census Bureau and its contractors. This study also complies with the Privacy Act and the Paperwork Reduction Act.


  11. Justification for Sensitive Questions


This survey does not include questions of a sensitive nature.


  12. Estimate of Hour Burden


The Benchmark and Refinement surveys will each have n=1,000 completed interviews, which are anticipated to take approximately 20 minutes. In addition, survey participants must meet two screening criteria: they must be eighteen years or older and must generally handle the mail for their household. The research team anticipates that 2,000 individuals will answer the screening questions to yield n=1,000 completed interviews for each phase. Households contacted for the Benchmark phase will be removed from the sample for the Refinement phase so that no potential respondents are re-contacted.



                             Total # of     Estimated        Estimated
                             Households     Response Time    Burden Hours
Benchmark Questionnaire      1,000          20 min           333
Benchmark Screener           2,000          2 min            67
Refinement Questionnaire     1,000          20 min           333
Refinement Screener          2,000          2 min            67

TOTAL                                                        800
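
The burden-hour figures follow directly from the rows above (hours = households × minutes ÷ 60, rounded to the nearest hour); a minimal sketch reproducing the totals:

    # Reproduce the burden-hour totals from the table above:
    # hours = households * minutes / 60, rounded to the nearest hour.
    rows = [
        ("Benchmark Questionnaire", 1000, 20),
        ("Benchmark Screener", 2000, 2),
        ("Refinement Questionnaire", 1000, 20),
        ("Refinement Screener", 2000, 2),
    ]
    total = 0
    for name, households, minutes in rows:
        hours = round(households * minutes / 60)
        total += hours
        print(f"{name}: {hours} hours")
    print(f"TOTAL: {total} hours")  # 333 + 67 + 333 + 67 = 800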


  13. Estimate of Cost Burden

In general, participants will not incur costs other than their time. Some cell phone respondents (25% of interviews) may incur phone charges, either in used minutes or in per-minute costs. The industry-wide average revenue to phone companies per cell phone minute of talk time is estimated at $0.12 (Sur, 2013); thus, the cost to an individual cell phone respondent for a 20-minute interview is expected to be equivalent to at most $2.40. However, the great majority of cell phone subscribers (more than 80%) have "postpaid" plans with a predetermined number of monthly minutes or have free night and weekend calling, so there will be no cost to almost all respondents.


  14. Cost to Federal Government


The total cost of this project is estimated at $286,650. This research is part of a larger communications services, strategic consulting, and applied research project with private contractors. In addition, the Census Bureau incurs costs in staff time for reviewing development and analysis documents.


  15. Reason for Change in Burden


Not applicable; this is a new data collection.


  16. Project Schedule


While no results will be published, the project timeline is detailed below. This timeline assumes OMB approval by January 10, 2014.


This survey uses an iterative design: findings from the initial Benchmark phase will inform the specific messages selected for drill-down in the Refinement phase. In practice, this will involve making data-informed revisions to the Benchmark questionnaire to test variations on specific messages and to remove messages that will not be carried into the Refinement phase. Between the two phases, the sample design, screening questions, and overall structure of the survey will remain the same. After the questionnaire has been refined, the Census Bureau will provide OMB with the final questionnaire for the Refinement phase and the changes made during the refinement process.



Activity                                        Begin Date    End Date

Benchmark Phase
  OMB approval                                  -             1/10/2014
  Fielding                                      1/13/2014     1/23/2014
  Conduct analysis                              1/24/2014     2/5/2014
  Finalized findings                            -             2/13/2014

Refinement Phase
  Revisions to Refinement questionnaire
    based on Benchmark findings                 2/14/2014     2/25/2014
  OMB receives Refinement questionnaire         -             2/25/2014
  Fielding                                      2/27/2014     3/12/2014
  Conduct analysis                              3/13/2014     3/26/2014
  Finalized findings                            -             4/10/2014


  17. Request to Not Display Expiration Date


The data collection instruments will include the OMB control number and expiration date. This information will be conveyed verbally to the respondents during the interview.


  18. Exceptions to the Certification


There are no exceptions to the certification statement.


Attachments

A – References and Works Cited

B – Cognitive Interview Report

C – Benchmark Survey Questionnaire (English)

D – Benchmark Survey Questionnaire (Spanish)
