Census Barriers, Attitudes & Motivators Survey II

OMB: 0607-0947

A. Justification



  1. Necessity of the Information Collection



Every ten years, the U.S. Census Bureau is constitutionally mandated to count everyone (citizens and non-citizens) residing in the United States. An accurate count is critical for many reasons including but not limited to:


• Congressional reapportionment;

• Redistricting congressional boundaries;

• Community planning; and

• Distribution of public funds and program development.


To facilitate the data collection effort for the 2010 Census, the Census Bureau developed an Integrated Communications Plan (ICP). The role of the ICP was to increase public awareness and to motivate people to respond to the census promptly, saving millions of taxpayer dollars. The specific objectives of the ICP were to:


• Increase mail response;

• Improve cooperation with enumerators; and

• Improve overall accuracy and reduce differential undercount.


The Census Bureau conducted the Census Barriers, Attitudes, and Motivators Survey (CBAMS) in 2008 to gain an in-depth understanding of the public's opinions about the 2010 Census. The results of that survey revealed that there were distinct mindsets toward the Census, and customizing outreach to these attitudinal mindsets is an important part of the Census Bureau's communications strategy for 2020 and beyond. In CBAMS II, the Census Bureau will extend that research to further specify the segments and to learn about their stability and structure. The results of CBAMS II will inform the market research program and communications for the 2020 Census.


The legal authority under which this information is being collected is Title 13 U.S.C. Sections 141 and 193.

  2. Needs and Uses



CBAMS II has several goals, one of which is to determine how consistent the Census mindsets that emerged from the first CBAMS are over time, especially now in the post-censal environment. We originally developed five mindsets, each with a unique set of attitudes toward the Census, barriers to participation, and motivators to participate: Leading Edge, Head Nodders, Unacquainted, Insulated, and Cynical Fifth.


Combined with other research efforts, the results of CBAMS informed the 2010 Census Integrated Communications Campaign by allowing us to produce messaging that resonated with each group. The Leading Edge had a good grasp of the purposes of the Census and its benefits and were the most likely to respond. Head Nodders generally had an average understanding of the Census and were the second most likely group to respond. The Unacquainted had no knowledge of the Census, while the Insulated likely knew what the Census was but were not likely to know why it is conducted or what its benefits are. Finally, the Cynical Fifth is the population segment that is distrustful of the government and not likely to respond. While there is some clustering, racial/ethnic groups are found throughout all five mindsets. Further information on the CBAMS mindsets is available in the paper, “Messaging to America: Census Barriers, Attitudes, and Motivators Survey Results” by Bates et al.


Methodologically, CBAMS II is very similar to the first CBAMS. We were able to expand the CBAMS II questionnaire after removing the media usage questions that had been needed to direct the paid advertising campaign in 2010. Also, the 2008 Census Dress Rehearsal sites that were removed from the CBAMS I frame are now eligible for selection.


The analytic goals of CBAMS II are to:


  • Determine the best method for identifying Census mindsets by evaluating the reliability of mindset creation algorithms from CBAMS I and CBAMS II.

    • Are the Census mindsets from CBAMS I consistent over time?

  • Understand more about the mindsets, especially addressing the following questions:

    • Is there a qualitative distinction between people who are unaware of the Census and those who lack extensive knowledge of the Census? That is, are the Insulated and Unacquainted mindsets from CBAMS I really different?

    • What are the characteristics and belief profiles of people whose attitude toward the Census is negative? That is, what are the Census dislikers all about?

    • What sub-segments exist within the large positive segments? That is, what is the big group of Census likers all about?

    • What are the full attitudinal and demographic profiles of CBAMS II mindsets? That is, what is each of the final segments really like?

  • Define who is in each mindset:

    • The Census Bureau has classified Census locations into “clusters” that have specific demographic profiles. We will crosswalk the segments to these clusters and evaluate the correspondence.

    • We will also relate segments to hard-to-count groups.

    • Census behavior: self-reports of decennial Census experience, while not perfectly accurate, will be an important part of evaluating the utility of each segment.

  • Measure attitudes toward the possible use of administrative records to supplement or replace Nonresponse Followup to the Census and relate those attitudes to CBAMS II mindsets. This aspect is categorized as exploratory research as discussed below.

  • Determine which communications media can best reach each mindset.


We will employ in-person, landline telephone, and cell phone interview protocols. The CBAMS II sample is stratified to capture the following hard-to-reach populations: linguistically isolated households, areas with high concentrations of Hispanic and Asian populations, American Indian reservations, and rural economically disadvantaged populations. The remaining sample is broken into big, mid, and small markets based upon Designated Market Area size; each stratum is further segmented by hard-to-count scores.


The results from CBAMS II will serve as a baseline measure for further research conducted between now and the 2020 Census, including potential future replications of this survey. If we are, in fact, in a position to conduct additional CBAMS iterations, CBAMS II results will be the first in a time series of data that tracks mindsets over time. An exceptional understanding of our audience will better equip us to produce advertising that will resonate strongly with each group.


Additionally, Census Bureau management will utilize CBAMS II results to inform intercensal communications efforts and will publicly share research results including aggregated statistics. Findings may also be presented at methodological conferences. CBAMS II results, however, will not be used for official Census Bureau population estimates or for measures of the economy.


CBAMS II interviews will be conducted via telephone, both landline and cell, and by in-person interviews. The in-person interviews are reserved for those populations that are historically hard-to-reach, and these are the same groups that were interviewed in-person for the original CBAMS.


The U.S. Census Bureau has developed clustered Census tracts based on demographic information. These “clusters” are demographic segments used in planning communications efforts. One goal of CBAMS II is to examine the links between the attitudinal mindsets and the demographic clusters. To do this, we will identify the most likely tract membership for each sample record. For in-person interviews, ICF Macro will have the addresses of the sampled records, and for landlines, we will have the exchanges, which can be located with some precision. However, having the respondent's ZIP code will help to refine this matching process, especially for cell phone interviews, since cell phone exchanges cannot be located with precision. The only use of this “most likely tract” assignment will be to classify records into demographic Census “clusters.” We will then link the CBAMS II respondents to the Planning Database (PDB) by tract number to obtain cluster membership as well as Census 2000 participation and hard-to-count data. This linkage will be used to inform CBAMS II but will not be used to update the PDB.
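
To illustrate the mechanics of this linkage, a minimal sketch follows. The file names and column names (for example, tract_fips, cluster_id, htc_score) are hypothetical stand-ins; the actual PDB layout and matching rules are not specified in this statement.

    # Hypothetical sketch of the tract-to-cluster linkage described above;
    # file and column names are stand-ins, not the real PDB schema.
    import pandas as pd

    # Respondent records, each assigned a "most likely" Census tract from
    # address (in-person), exchange (landline), or ZIP code (cell).
    respondents = pd.read_csv("cbams2_respondents.csv")

    # Planning Database extract keyed by tract: demographic cluster plus
    # Census 2000 participation and hard-to-count scores.
    pdb = pd.read_csv("pdb_tracts.csv")

    # Left join keeps every respondent and attaches PDB attributes where
    # the tract matches; the result informs CBAMS II analysis only and is
    # never written back to the PDB.
    linked = respondents.merge(pdb, on="tract_fips", how="left")
    print(linked[["respondent_id", "cluster_id", "htc_score"]].head())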


We also intend to compile a limited set of questions from CBAMS II that we can insert into future surveys that will allow us to classify respondents from those surveys into CBAMS II mindsets, thus expanding our understanding of each mindset over time.


The CBAMS II questionnaire includes a section in which motivators must be ranked. We will use MaxDiff (maximum difference scaling) to accomplish this task, which provides two important benefits, described below.

First, when defining segments where the benefits sought are part of the segmentation, one typically needs an ordering of the importance of the benefits. Benefit-oriented segmentation is often used in advertising segmentation and message development. When using typical importance rating scales, either numerical or word-anchored, one tends to see considerable clustering at the upper end of the scale. MaxDiff is one tool to help break these ties and produce an ordering. Alternatives include conjoint analysis, choice-based conjoint, ranking, and Q-sorts, but most of these are cumbersome or impossible to implement over the phone and can be very time consuming. MaxDiff avoids these time and difficulty issues and is commonly used in the private sector, as discussed in Orme and Johnson (2009).
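
The sketch below illustrates the structure of MaxDiff data with a simple count-based score. The motivator items and the simulated respondent are invented for illustration, and production analyses typically use multinomial logit or hierarchical Bayes estimation rather than raw counts (Orme and Johnson, 2009).

    # Illustrative MaxDiff tasks with count-based scoring; items and the
    # simulated respondent are invented for this sketch.
    import random
    from collections import Counter

    motivators = ["funding", "representation", "civic duty",
                  "community planning", "it's the law"]

    # Each task shows a small subset; the respondent names the MOST and
    # LEAST important item in that subset.
    tasks = [random.sample(motivators, 3) for _ in range(10)]

    best, worst, shown = Counter(), Counter(), Counter()
    for task in tasks:
        # Simulated respondent: prefers items earlier in the master list.
        best[min(task, key=motivators.index)] += 1
        worst[max(task, key=motivators.index)] += 1
        shown.update(task)

    # Simple score in [-1, 1]: (times best - times worst) / times shown.
    scores = {m: (best[m] - worst[m]) / shown[m] for m in motivators if shown[m]}
    for m, s in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{m:20s} {s:+.2f}")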

A second benefit is that MaxDiff leads quickly to a typing tool for assigning respondents to a segment, as mentioned above. Typically, such tools are implemented as a series of questions like, “Which comes closest to how you feel: statement A or statement B?” They can also be built from preferences, importance ratings, agreement items, and so on. Usually, after five or so questions, a respondent can be assigned to a segment with reasonably high reliability, as in the sketch below. This assignment tool is much easier to implement than a battery of multipoint rating questions with an associated assignment tool using discriminant analysis or logistic regression. In many cases, a battery-based assignment tool is too long to include in subsequent research and goes unused; the segmentation then has no real impact on subsequent testing, research, and decisions.
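
A minimal sketch of such a typing tool follows. The questions, routing order, and segment cutoffs are invented for illustration; the operational tool would be derived empirically from the CBAMS II segmentation.

    # Hypothetical short typing tool; questions and routing are invented,
    # not the actual CBAMS II instrument.
    def assign_mindset(answers):
        """answers maps question ids to 'A' or 'B', from items like
        'Which comes closest to how you feel: statement A or B?'"""
        if answers["q1"] == "B":   # e.g., deeply distrusts government data uses
            return "Cynical Fifth"
        if answers["q2"] == "B":   # e.g., has never heard of the Census
            return "Unacquainted"
        if answers["q3"] == "B":   # e.g., knows of it but not its purpose
            return "Insulated"
        if answers["q4"] == "A":   # e.g., strong grasp of Census benefits
            return "Leading Edge"
        return "Head Nodders"

    print(assign_mindset({"q1": "A", "q2": "A", "q3": "A", "q4": "A"}))
    # -> Leading Edge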


Finally, the CBAMS II questionnaire will include some exploratory research to evaluate public opinion about the use of administrative records for completing Census forms during Nonresponse Followup. Three sets of questions, each framing the use of administrative records differently (cost, burden, control), will be presented in a split questionnaire: each respondent will receive one set of administrative records questions, and the question sets will be divided equally among participants. Public opinion about the use of administrative records will be compared across the three frames. The tables below show the estimated sample sizes and minimum detectable differences (MDD) for overall comparisons and for the mindsets, based on these assumptions (a computational check follows the tables):

  • 80% power

  • 5% type I error (95% confidence level)

  • Maximum variance: p*q = 0.5*0.5 = 0.25

  • Design effect = 2




                              CBAMS I                Expected CBAMS II Sample Size
Mindset                       population estimate    Total    Frame 1   Frame 2   Frame 3
Total                         --                     4,000    1,333     1,333     1,333
Leading Edge                  26%                    1,040      347       347       347
Head Nodders                  41%                    1,640      547       547       547
Insulated and Unacquainted    13%                      520      173       173       173
Cynical Fifth                 19%                      760      253       253       253


Between-frame MDD

Total                          7.7%
Leading Edge                  15.0%
Head Nodders                  12.0%
Insulated and Unacquainted    21.3%
Cynical Fifth                 17.6%


Within-frame MDD

Edge-Nodders                  13.6%
Edge-Ins/Unaq                 18.5%
Edge-Fifth                    16.4%
Nodders-Ins/Unaq              17.3%
Nodders-Fifth                 15.1%
Ins/Unaq-Fifth                19.6%
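
As a check on the tables above, the sketch below reproduces representative MDD values from the stated assumptions, assuming the standard two-proportion MDD formula inflated by the design effect; that formula choice is an inference from the stated assumptions, not a documented computation.

    # Sketch reproducing representative MDD figures from the stated
    # assumptions (80% power, 5% two-sided type I error, p*q = 0.25,
    # design effect 2). The two-proportion formula is assumed here.
    from math import sqrt

    Z_ALPHA = 1.96    # two-sided 5% type I error
    Z_BETA = 0.8416   # 80% power
    PQ = 0.25         # p * q at p = 0.5 (maximum variance)
    DEFF = 2          # design effect

    def mdd(n1, n2):
        """Minimum detectable difference between two proportions."""
        return (Z_ALPHA + Z_BETA) * sqrt(DEFF * (PQ / n1 + PQ / n2))

    print(f"Total, between frames:        {mdd(1333, 1333):.1%}")  # ~7.7%
    print(f"Leading Edge, between frames: {mdd(347, 347):.1%}")    # ~15.0%
    print(f"Edge-Nodders, within frame:   {mdd(347, 547):.1%}")    # ~13.6%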


Quality is an integral part of the pre-dissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau’s Quality Guidelines). Information quality is also integral to the information collections conducted by the Census Bureau and is incorporated in the clearance process required by the Paperwork Reduction Act.




  3. Use of Information Technology


Approximately 80 percent of the interviews will be conducted using computer assisted telephone interviewing (CATI). Telephone interviews will be conducted with respondents on cell phones and landline telephones.


A survey verification line with interactive voice response (IVR) will be available to field inquiries about the authenticity of the survey, to allow the respondent to opt out of the survey, or to transfer the caller to a company representative to complete the interview (during operating hours).

  4. Efforts to Identify Duplication


The CBAMS II sample is unduplicated against the CBAMS I sample, meaning that we will not have the same respondent in both surveys. Because this is a nationally representative sample, we are still able to achieve one of our primary goals: to evaluate the reliability of the segments originally measured in CBAMS I. No other survey has contained all the questions needed to recreate these segments.

  5. Minimizing Burden


The data collection does not impact small entities.

  6. Consequences of Less Frequent Collection


This is a one-time data collection.

  7. Special Circumstances


There are no special circumstances. Data collection is conducted in accordance with the Office of Management and Budget (OMB) guidelines.




  8. Consultations Outside the Agency


On December 29, 2010, we published a notice in the Federal Register (Vol. 75, No. 249, pages 81965-81966) seeking public comment on the necessity, content, and scope of the data collection. We have not received any comments to date.


Outside the Federal Government, consultants include:


Randy Zuwallack, MS

ICF Macro

802-264-3724

[email protected]

Frederica Conrey, PhD

ICF Macro

802-264-3785

[email protected]

Mike A. Lotti

Accretive Insights

585-734-1216 (mobile)

[email protected]

Peter V. Miller, PhD

Department of Communication Studies

Northwestern University

Past President, American Association for Public Opinion Research

847-491-5835

[email protected]


  9. Paying Respondents


The sample for CBAMS II is nationally representative and includes subsamples from population segments that are historically known to be hard to count, including American Indians, Hispanics, Asians, and economically disadvantaged households residing in rural areas. Previous research indicates that these populations would be under-represented in an RDD telephone survey; therefore, we will conduct this portion of the data collection via in-person interviews. Furthermore, maximizing the number of in-person interviews in CBAMS II is particularly important because households with available phone numbers may differ in characteristics from those without telephones and those with unlisted phone numbers.


To increase the response rate of these hard-to-count subgroups and obtain the necessary number of completed interviews, we will offer a $10 cash incentive to households selected for in-person interviews regardless of whether or not they participate. OMB survey guidance notes that research has consistently shown that giving an unconditional incentive when first contacting the respondent is more effective in obtaining cooperation than the promise of an incentive after completion of the survey. We will complete a minimum of 800 interviews with an expected number of 1,000. Incentive payments, then, will not exceed $10,000. Providing incentives to these groups is not cost prohibitive, and it is both beneficial and necessary to expend additional effort and expense to secure survey participation.


The RDD landline and cellular respondents will not be offered any gift or payment. We decided not to offer cell phone incentives for two reasons:


1. Very few plans now pay per minute, and

2. We have no evidence that cell phone incentives promote response.


Almost all post-paid subscribers have unlimited plans or blocks of minutes; about 12.5% of subscribers are pre-paid. Conventionally, pre-paid subscribers have paid per minute, but unlimited and block-minute plans are now increasingly common in the pre-paid market. The actual proportions of unlimited and traditional pre-paid plans in the market are not available; however, in the second quarter of 2009, 1.5 million pre-paid unlimited plans were added, more than 50% of all new subscribers, while only about 7% of new subscribers were traditional pre-paid subscribers. In other words, there is good evidence that pay-per-minute plans are currently a small portion of the market and are decreasing in popularity.


Even if all pre-paid subscribers were pay-per-minute, an incentive would be warranted in only about 12.5% of cases. Given the rapid shift toward unlimited pre-paid plans in the market, the actual proportion of all interviews for which the incentive would be warranted might be as low as 1 or 2%. Offering an incentive to all respondents does not seem like an efficient use of project funds unless there is a positive impact on data quality. These numbers suggest that there is little reason to believe that the sample could be meaningfully skewed by the exclusion of an incentive. Even lower income populations now appear to have access to unlimited calling plans, according to the 14th Mobile Wireless Competition Report.


Regarding cell phone response rates, in the past two years there has been interest in whether expensive cell phone incentives actually promote response. There is currently little to no literature available on this topic. Macro International has run some small pilot studies to examine the impact of cell incentives. In one natural experiment, a client used an incentive and then stopped for cost reasons. Over five months, among 391 respondents who were contacted and told about the incentive, 57% completed the interview. Over the next five months, 941 respondents were contacted and not offered an incentive; again, 57% completed the interview. That interview was also comparable in length to the CBAMS survey.


Results from CBAMS II have the potential to benefit multiple other censuses and surveys conducted by the U.S. Census Bureau, as well as those conducted by others in the Federal statistical system. Such research is necessary to accomplish the U.S. Census Bureau's mandated purpose of conducting censuses and surveys to produce high-quality, accurate national demographic and economic data about America's people and economy. Use of incentives may encourage, maintain, or increase respondent participation in surveys, resulting in improved quality of the data collected and more accurate overall survey results. Without incentives, our ability to accomplish this purpose would be severely hindered.


  10. Assurance of Confidentiality


This survey is being conducted under Title 13, but it is not a Title 13, Section 9 data collection that would legally protect the confidentiality of the data collected. Therefore, while we assure our respondents that we have every intention of keeping their answers anonymous, we will not make a legal promise of confidentiality. Additionally, we are not collecting sensitive personally identifiable information in CBAMS II. We address this during interview consent with the statement: “We intend to protect your anonymity by not asking for your name, address, or other personal information that could easily identify you.”


Additionally, we are not explicitly stating that the Census Bureau is the agency sponsoring this survey; we refer instead to “a federal agency,” unless the respondent specifically requests the identity of the sponsor. The prenotification letters identify ICF Macro, an independent survey research firm, as the entity responsible for data collection. In CBAMS II, we are trying to learn more about the Cynical Fifth, a mindset that emerged from CBAMS. This mindset accounts for approximately 20% of the population, comes from a fairly representative cross-section of the United States, is not dominated by a few race/ethnic groups or a single socioeconomic class, and is characterized by high skepticism, concerns about confidentiality, and a low belief that the Census is a civic obligation (Bates et al., 2009). With this approach, we are attempting not to alienate those respondents who may help us learn more about this mindset.

  11. Justification for Sensitive Questions


The survey does not include questions of a sensitive nature.

  12. Estimate of Hour Burden


The annual respondent burden for conducting 4,200 interviews is estimated at 1,757 hours. This estimate includes 800 in-person interviews, 2,500 landline telephone interviews, and 900 cell phone interviews. The average length for all three survey modes is estimated to be 23 minutes. The estimated time to screen a household for eligibility is 2 minutes, and the estimated percentage of qualifying households is 95 percent. The interview is administered once to each respondent. A worked check of this estimate appears below.
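
The arithmetic behind the 1,757-hour figure can be reconstructed from the numbers above; the sketch below shows one decomposition consistent with them (the split into interviewing and screening time is a reading of the estimate, not an official formula).

    # Reconstruction of the 1,757-hour burden estimate from the figures
    # in this section; the decomposition is inferred, not official.
    interviews = 800 + 2500 + 900           # in-person + landline + cell = 4,200
    interview_hours = interviews * 23 / 60  # 1,610 hours of interviewing

    screens = interviews / 0.95             # households screened at a 95%
    screen_hours = screens * 2 / 60         # qualification rate; ~147 hours

    print(round(interview_hours + screen_hours))  # -> 1757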

  13. Estimate of Cost Burden


Cell phone respondents (24% of respondents) may incur phone charges, either in used minutes or in per-minute cost. As of 2008, the average revenue to phone companies per cell phone minute of talk time was $0.05, so the cost to an individual cell phone respondent for a 23-minute interview is expected to be equivalent to $1.15. However, the great majority of cell phone subscribers (more than 80%) have “postpaid” plans with a predetermined number of monthly minutes (Federal Communications Commission, 2010), so the incremental cost in dollars to most respondents will be $0.

  14. Cost to Federal Government


The cost of this data collection is an estimated $1.2 million.

  15. Reason for Change in Burden


Not applicable; this is a new data collection.

  16. Project Schedule



The timeline below is based on receiving OMB approval on 4/29/2011.

Task                             Start         Finish
Program CATI English             January 19    February 8
CATI testing and revisions       February 8    February 25
Translation                      January 19    February 10
Load landline and cell sample    April 29      May 2
Phone data collection            May 2         July 17
Produce field materials          April 29      May 6
Train in-person interviewers     April 22      May 6
Distribute survey packets        May 6         May 6
In-person data collection        May 6         July 8
Data entry and processing        May 6         July 27

  17. Request to Not Display Expiration Date


The data collection instruments will include the OMB control number and expiration date. This information will be conveyed verbally to the respondents.

  18. Exceptions to the Certification


There are no exceptions to the certification statement.

