
SUPPORTING STATEMENT – PART B

U.S. Department of Commerce

U.S. Census Bureau

2010 Census Integrated Communication Program Evaluation

OMB Control Number 0607-XXXX


1. Universe and Respondent Selection



The 2010 Census ICP Evaluation will employ a design stratified by the hard-to-enumerate populations (also referred to below as race/ethnicity groups). This design oversamples the five race/ethnicity groups, but it also takes into account the eight audience segmentation clusters defined by DraftFCB in its development of the 2010 Census Integrated Communications Campaign (ICC). Drawing a sample primarily by segmentation cluster would not achieve sufficient sample sizes for the race/ethnicity groups because the clusters are not sufficiently homogeneous by race/ethnicity.



Since the research questions of interest for the 2010 Census Integrated Communications Program (ICP) Evaluation mention five specific race/ethnicity groups, the sample size is divided equally among these five race/ethnicity groups and the remaining non-targeted group. The sample size of 500 per race/ethnicity group used in the Census 2000 Partnership and Marketing Program Evaluation (PMPE) resulted in design effects around 2.0 and standard errors of 3.2 percent (on a binary proportion of 50 percent). The same sample sizes are proposed for the 2010 Census ICP Evaluation. Wave 3 will include an additional 1,200 interviews above the 3,000, drawn only from the Core sample; these additional interviews will support the observational study design described below.
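As a check on the figures above, the reported 3.2 percent standard error follows from the standard complex-sample approximation for a 50 percent proportion with n = 500 and a design effect of 2.0 (a textbook formula, not taken from the PMPE documentation):

```latex
SE(\hat{p}) = \sqrt{\text{deff}\cdot\frac{p(1-p)}{n}}
            = \sqrt{2.0\cdot\frac{0.5\times 0.5}{500}}
            = \sqrt{0.001}\approx 0.032 \quad (\text{about } 3.2 \text{ percent}).
```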

Table 1 below summarizes NORC’s planned total sample sizes and provides the percentage of households nationally of each race/ethnicity (from the March 2007 Current Population Survey) and the number of interviews that would be obtained with an equal-probability sample in each wave.

Table 1

Proposed Sample Sizes by Race/Ethnicity


Race/Ethnicity | Sample Size (Waves 1/2/3) | Percent of Households | Sample Size for Equal-Probability Samples (Waves 1/2/3)
Hispanic | 500/500/900 | 11.18 | 335/335/470
African-American | 500/500/900 | 12.12 | 364/364/509
Asian | 500/500/500 | 3.89 | 117/117/163
American Indian/Alaska Native | 500/500/500 | 1.30 | 39/39/55
Native Hawaiian/Other Pacific Islander | 500/500/500 | 0.23 | 7/7/10
All Other | 500/500/900 | 71.28 | 2,138/2,138/2,993
TOTAL | 3,000/3,000/4,200 | 100.00 | 3,000/3,000/4,200


As shown in Table 1 above, only two of the five hard-to-enumerate populations (Hispanics and non-Hispanic African-Americans) would yield a sufficient number of interviews from nationally representative samples of 3,000 or 4,200 interviews.
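To make the arithmetic behind the last column of Table 1 explicit, the sketch below multiplies each wave's total sample size by the March 2007 CPS household percentages; it reproduces the published figures to within rounding:

```python
# Expected interviews per group under an equal-probability sample (last column of Table 1).
pct_households = {
    "Hispanic": 11.18,
    "African-American": 12.12,
    "Asian": 3.89,
    "American Indian/Alaska Native": 1.30,
    "Native Hawaiian/Other Pacific Islander": 0.23,
    "All Other": 71.28,
}

for total in (3_000, 4_200):                  # Waves 1/2 total and Wave 3 total
    print(f"Equal-probability sample of {total}:")
    for group, pct in pct_households.items():
        expected = total * pct / 100          # expected completes without oversampling
        print(f"  {group:40s}{expected:8.0f}")
```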



Oversampling within a nationally representative sample can achieve the necessary sample sizes for both groups, so NORC will use a nationally representative sample to achieve the targets for Hispanics, non-Hispanic African-Americans, and non-Hispanic Others (including Whites) and designate this as the Core Sample (1,500 completed interviews in Waves 1 and 2; 2,700 completed interviews in Wave 3). For the remaining three hard-to-enumerate race/ethnicity groups (American Indian and Alaska Natives, Asians, and Native Hawaiian and Other Pacific Islanders), manipulating the tracts and segments selected will not be sufficient to meet the target sample sizes. Therefore, these must be fielded as independent samples.

Nationally Representative Core Sample

To select the 2010 Census ICP Evaluation Core Sample, NORC will use its 2000 NORC National Frame to draw a nationally representative sample of addresses. The 2000 NORC National Frame is efficient, powerful, and flexible because it takes advantage of the availability of the United States Postal Service (USPS) postal address lists for much of the United States. Traditional listing (sending out field employees to list every housing unit in certain selected census blocks) will be used only in areas for which USPS address lists are unavailable. NORC has researched the USPS lists, and this work shows that the USPS lists outperform traditional listings with better coverage at a lower cost.



The 2000 NORC National Frame is a traditional two-stage nationally representative area probability sample. First-stage units, or National Frame Areas (NFAs), were selected within three categories. The twenty-four largest metropolitan statistical areas (MSAs) were selected with certainty into the Certainty Urban category (three are combinations of MSAs), which contains about 45 percent of the U.S. population. Thirty other MSAs were selected in the Non-Certainty Urban category, which represents 30 percent of the U.S. population. This category includes only urban census tracts within these MSAs. The third category, for rural NFAs, consists of 25 selections from the rural census tracts in the non-certainty MSAs, plus counties or county pairs that are not within a metropolitan statistical area. This category represents 25 percent of the U.S. population. Therefore, the 2000 NORC National Frame has 79 NFAs in total. In order to create a larger base (through greater concentration of interviews) for the Wave 3 observational study design (see below), this project will use only about half of the NORC NFAs (44 in all, because the certainty NFAs must all be retained).



Urban and rural are defined roughly by whether or not the addresses can be found in NORC's database of USPS addresses. Within urban areas, the 2000 NORC National Frame second-stage units are entire census tracts. In rural areas, smaller segments are selected, each consisting of at least 300 housing units (according to 2000 decennial census data) within one census tract. Since the eight audience segmentation clusters defined by DraftFCB and used for stratifying this sample are also defined at the census-tract level, the sample can be balanced across clusters by sorting the second-stage units on segmentation cluster before selection.
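A minimal sketch of this idea follows (not NORC's production procedure): sorting tracts by segmentation cluster before a systematic selection yields a sample balanced across clusters ("implicit stratification"). The tract list and field names are hypothetical.

```python
import random

# Hypothetical frame of 2,000 tracts, each assigned to one of the eight audience clusters.
tracts = [{"tract_id": i, "cluster": random.randint(1, 8)} for i in range(2_000)]

def systematic_sample(units, n):
    """Select n units using a random start and a fixed skip interval."""
    interval = len(units) / n
    start = random.uniform(0, interval)
    return [units[int(start + k * interval)] for k in range(n)]

# Sorting first means each cluster contributes roughly in proportion to its share of the frame.
sample = systematic_sample(sorted(tracts, key=lambda t: t["cluster"]), 100)
```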



In order to achieve higher precision for estimates of change, NORC will use a mixed panel design in which Waves 2 and 3 will consist of a panel component and a cross-sectional component. There is the possibility of a conditioning effect, in that respondents may change their attitudes because of a previous survey, but the conditioning effect is likely to be small. NORC will measure this conditioning effect analytically. The sampling plan calls for half of the interviews in Waves 2 and 3 to come from a panel while half come from a fresh sample. It is important to note that the panel design implies that Waves 2 and 3 will have 250 panel respondents from Wave 1 for each of the six race/ethnicity groups.
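As an illustration (an assumption about the analysis, not NORC's stated method), one way to check for a conditioning effect is to compare a Wave 2 estimate between panel respondents and the fresh cross-sectional respondents in the same group:

```python
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Simple two-sample z statistic for a difference in proportions (ignores design effects)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical Wave 2 awareness rates for one race/ethnicity group (250 panel, 250 fresh cases).
z = two_proportion_z(p1=0.62, n1=250, p2=0.57, n2=250)
print(f"z = {z:.2f}")  # |z| > 1.96 would suggest a conditioning (or sample composition) difference
```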



The 2010 Census ICP Evaluation will use an oversampling methodology to achieve equal numbers of interviews among Hispanics, non-Hispanic African-Americans, and non-Hispanic Others. It is important to note that Hispanics and non-Hispanic African-Americans are both larger proportions of the U.S. population than they were ten years ago, so less oversampling will be necessary for the 2010 Census ICP Evaluation. Second-stage tracts will be stratified within the NORC National Frame Areas into high- and low-density tracts for both Hispanics and non-Hispanic African-Americans, resulting in four types of tracts. Tracts that are high-density for Hispanics, for non-Hispanic African-Americans, or for both will be selected at a higher rate. Once the tracts are selected, NORC can also select housing units within high-density tracts at a higher rate to achieve the targets for Hispanics and non-Hispanic African-Americans.
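The sketch below shows the two-level oversampling logic with purely hypothetical rates: tracts in the high-density strata, and housing units within them, are selected at higher rates, and the base weight is the inverse of the overall selection probability.

```python
tract_rate = {"high_density": 0.030, "low_density": 0.010}  # hypothetical tract selection rates
unit_rate = {"high_density": 0.020, "low_density": 0.010}   # hypothetical within-tract housing unit rates

def base_weight(stratum):
    prob = tract_rate[stratum] * unit_rate[stratum]  # overall probability of selecting a housing unit
    return 1.0 / prob

for stratum in ("high_density", "low_density"):
    print(stratum, round(base_weight(stratum)))
# The resulting differential weights are what drive the design effects discussed later; the
# supplemental samples cap the associated loss in effective sample size at 20 percent.
```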



For Wave 3, 1,200 additional interviews will be obtained in the Core Sample. These cases will be selected in the paired locations associated with the observational case study design discussed below.

The 2010 Census ICP Evaluation sample will consist of sampled addresses. The Core Sample plan, including assumptions about telephone number match rates, completion by the various modes, and other factors, is summarized below in Table 2.



Table 2

Summary of Core Sample Strategy


Sampling Step | Assumed Rate (%) | Wave 1 | Wave 2 | Wave 3
Addresses Selected | | 4,580 | 2,290 | 5,954
Acquire Phone Number | 62.0 | 2,840 | 1,420 | 3,691
No Phone Number → Attempt Internet | 38.0 | 1,740 | 870 | 2,263
Phone Interviewing | | | |
  Advance Letter | | 2,840 | 1,420 | 3,691
  Working Residential Number Rate | 70.0 | | |
  Screener Completion | 64.0 | | |
  Interview Completion | 72.0 | | |
  COMPLETES by Telephone Contact | | 916 | 458 | 1,191
Nonresponse → Attempt Internet | | 1,924 | 962 | 2,501
COMPLETES by Internet | 5.0 | 183 | 92 | 238
Field Interviewing | | 3,481 | 1,740 | 4,525
  Sub-sampling for Face-to-Face Interviewing | 20.0 | 696 | 348 | 905
  Occupied Housing Units | 90.0 | | |
  Screener Completion | 80.0 | | |
  Interview Completion | 80.0 | | |
  COMPLETES In-Person | | 401 | 200 | 521
Total COMPLETES (Weighted) | 67.78 | 1,500 | 750 | 1,950


Five hundred completed interviews are planned for each of the three race/ethnicity groups for Wave 1. For Wave 2, there will be 250 completed interviews from a subsample of Wave 1 completes and 250 fresh completed interviews for each of the three race/ethnicity groups. For Wave 3, there will again be 250 panel completes and 250 fresh completed interviews for each of the three race/ethnicity groups (plus 1,200 completed interviews for the observational study design).
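The Wave 1 column of Table 2 can be reproduced from the assumed rates shown there; the short check below follows the cases through the telephone, Internet, and in-person stages.

```python
# Check of the Wave 1 column of Table 2 using the assumed rates in the table.
addresses = 4_580

phone_matched   = addresses * 0.62                     # about 2,840 with a telephone number
no_phone        = addresses * 0.38                     # about 1,740 routed to the Internet option

phone_completes = phone_matched * 0.70 * 0.64 * 0.72   # working number x screener x interview, about 916
phone_nonresp   = phone_matched - phone_completes      # about 1,924 also offered the Internet option

web_pool        = no_phone + phone_nonresp             # about 3,664
web_completes   = web_pool * 0.05                      # about 183

field_pool      = web_pool - web_completes             # about 3,481
field_sample    = field_pool * 0.20                    # about 696 subsampled for in-person follow-up
field_completes = field_sample * 0.90 * 0.80 * 0.80    # about 401

total = phone_completes + web_completes + field_completes
print(round(total))                                    # about 1,500 Core completes in Wave 1
```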

Three Supplemental Samples

As mentioned above, manipulating the tracts and segments selected will not be sufficient to meet the target sample sizes for American Indian and Alaska Natives, Asians, and Native Hawaiian and Other Pacific Islanders. Therefore, these must be fielded as independent samples. The NORC National Frame can provide addresses for the Asian and Native Hawaiian samples, as well as the urban areas for the American Indian/Alaska Native sample, but not for the American Indian/Alaska Native reservations. For the reservations, NORC will try to obtain address lists from the tribes, but we expect to conduct traditional listing (a field person lists every housing unit within designated areas for use in sampling) in some, most, or all of the selected reservations.

American Indian and Alaska Native (AIAN)

According to the 2000 decennial census, there were 3,420,171 persons living in the United States who were non-Hispanic and AIAN (alone or in combination with another race), and 998,199 living on any of the 651 reservations (29.3 percent of the AIAN population). In order to balance costs and coverage, NORC will select most addresses for the Waves 1, 2, and 3 AIAN samples from 10 of the 283 reservations with at least 250 AIAN residents. To increase the coverage from 29.3 percent of the AIAN population to 45.6 percent, we will also select addresses for the Waves 1, 2, and 3 AIAN samples from seven of the ten urban areas with AIAN population densities of at least 2 percent (i.e., at least 2 percent of the population is AIAN). The reservations with at least 250 AIAN residents have a high AIAN density of 18 percent, whereas the ten urban areas have a combined AIAN density of under 4 percent. By oversampling tracts with higher AIAN densities, we plan to achieve a hit rate (percentage of households with an eligible AIAN adult) of 14.9 percent. Further oversampling is possible, but the variation in the probabilities (and weights) would reduce the effective sample size, so we will limit the loss in effective sample size due to differential probabilities to 20 percent (i.e., a design effect of 1.25 due to differential probabilities). It will be necessary to select 10,665 addresses to obtain 500 AIAN interviews, partly because we assume a lower rate of telephone matching (50 percent).
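The 20 percent cap and the design effect of 1.25 are two statements of the same standard relationship between differential weighting and effective sample size:

```latex
n_{\text{eff}} = \frac{n}{\text{deff}}, \qquad
\text{deff} = 1.25 \;\Longrightarrow\; n_{\text{eff}} = \frac{n}{1.25} = 0.8\,n ,
```

so 500 AIAN interviews with a design effect of 1.25 carry roughly the information of 400 equal-probability interviews.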

Asians

According to the 2000 decennial census, there were 11,266,934 persons living in the United States who were non-Hispanic and Asian (alone or in combination with another race). Of these, 77.7 percent live in the 1,261 U.S. cities with at least 1,000 Asians (alone or in combination). NORC will select the Waves 1, 2, and 3 samples from a representative sample of 25 of these 1,261 cities. The cities range in Asian density from 65.84 percent to 1.94 percent. By oversampling tracts with higher Asian densities, we expect an overall hit rate of approximately 12.5 percent. Further oversampling (and a higher hit rate) is again possible, but we again limit the loss in effective sample size due to differential probabilities to 20 percent. This strategy provides more coverage of the Asian population than the 2000 PMPE did, with almost twice the density. It will be necessary to select 11,288 addresses to obtain 500 Asian interviews.


Native Hawaiian and Other Pacific Islanders (NHOPI)

According to the 2000 decennial census, there were 860,965 persons living in the United States who were non-Hispanic and NHOPI (alone or in combination with another race). Of these persons, 32.8 percent live in the state of Hawaii, and 23.32 percent of Hawaii residents are NHOPI; less than 1 percent of residents are NHOPI in every other state. The state with the largest NHOPI population outside of Hawaii is California, which contains 25.4 percent of U.S. NHOPIs, but only 0.64 percent of California residents are NHOPI. All NHOPI samples will be selected from the five counties in Hawaii. The U.S. Census Bureau, with NORC, considered data collection in California as well as Hawaii, but the additional costs at such a low density are prohibitive. NHOPI residents are not as concentrated in Hawaii as the other two oversample groups are in their selected areas, but by oversampling tracts with higher NHOPI densities, we expect an overall hit rate of approximately 26.0 percent. Further oversampling (and a higher hit rate) is again possible, but we again limit the loss of effective sample size due to differential probabilities to 20 percent. With this assumed hit rate, it will be necessary to select 5,418 addresses to obtain 500 NHOPI interviews.



Paid Advertising Heavy-Up Experiment (PAHUE)

We had originally planned an observational case study design to supplement the analytical models used with the time-series and longitudinal survey data. These studies, while inferior to an experiment, would have used a combination of aggregate advertising coverage, ratings (exposure estimates), characterizations of advertising message content, and survey responses to help separate the effects of the media campaign from those of the partnership activities and other non-media components; they would also have supported examination of the effects of different types of paid advertising and partnership activities.


Now that the Census Bureau has added an experiment with a strong design, PAHUE, NORC has redirected the budget for the observational case studies toward the collection of PAHUE cases.



For the PAHUE, eight pairs of DMAs (designated market areas) were matched on indicators such as hard-to-count scores, mail return rates in Census 2000, race/ethnicity composition, poverty rates, urban/rural composition, linguistically isolated population, and number of households. NORC will collect interviews in all sixteen of these DMAs during Waves 1 and 3.
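The document does not describe the matching algorithm itself; as an illustrative sketch only, one standard way to form pairs is to minimize distance on standardized indicators (e.g., hard-to-count score, Census 2000 mail return rate, percent linguistically isolated). The DMA names and indicator values below are hypothetical.

```python
import math

dmas = {
    "DMA_A": [62.0, 0.71, 0.18], "DMA_B": [61.0, 0.72, 0.17],
    "DMA_C": [45.0, 0.80, 0.05], "DMA_D": [46.0, 0.79, 0.06],
}

def standardize(values_by_dma):
    cols = list(zip(*values_by_dma.values()))
    means = [sum(c) / len(c) for c in cols]
    sds = [math.sqrt(sum((x - m) ** 2 for x in c) / len(c)) or 1.0 for c, m in zip(cols, means)]
    return {k: [(x - m) / s for x, m, s in zip(v, means, sds)] for k, v in values_by_dma.items()}

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

z = standardize(dmas)
names = list(z)
candidates = sorted(
    ((distance(z[a], z[b]), a, b) for i, a in enumerate(names) for b in names[i + 1:]),
    key=lambda t: t[0],
)
print(candidates[:2])  # closest candidate pairs; a real matching would not reuse a DMA in two pairs
```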



2. Procedures for Collecting Information

The 2010 Census ICP Evaluation will employ a mixed-mode approach for collecting data in the nationally representative probability surveys. An address-based sampling design marries the comprehensive coverage of address lists with the cost effectiveness of telephone data collection.



This mixed-mode data collection approach emphasizes the cost efficiencies of telephone interviewing alongside the flexibility of targeted in-person data collection to increase the overall response rate. NORC expects that approximately 65 percent of all completed questionnaires will be fielded by phone and the remaining 35 percent of all cases will be fielded either in person or through a web-based survey. Routing a larger number of completed interviews through the telephone center will permit allocation of field resources to work with hard-to-enumerate populations. It is anticipated that a higher proportion of in-person interviews will be completed in the American Indian, Native Hawaiian, and Asian oversampled target groups because reluctant respondents and households without telephones will likely be more common among these groups.

Paper and pencil questionnaires

Field staff working the in-person field follow-up will administer a paper and pencil instrument (PAPI). The PAPI questionnaires will be wave specific and will be used in conjunction with the PAPI screeners. Field Interviewers will be responsible for returning completed questionnaire forms to NORC’s Chicago offices using the secure methods dictated under Title 13. Optical scanning technology will be deployed in the central office to create data files from the responses entered into the completed questionnaires.

Self-administered web survey.

To increase the number of options for households to respond, a web survey alternative will also be made available. The web survey will be programmed to provide respondents with instructions to guide them through the questionnaire. Advance letters will be personalized to allow easy access and provide a link back to the sample control file. To maintain respondent confidentiality, web respondents will be assigned a unique user name and password to access our secure website and their responses will be encrypted during transmission.

Batch locating and telephone numbers.

To maximize the number of sample members available during the telephone effort, NORC will run all sampled addresses through a two-tier telephone number matching protocol. First, addresses will be sent to Targus, a computerized locating service that uses a large database of information to match addresses to telephone numbers. It is expected that approximately 60 percent of cases will match to a telephone number through this method. Any unmatched cases will undergo more advanced locating through Accurint®, a widely accepted locate-and-research tool available to government, law enforcement, and commercial customers. Using a variety of publicly and privately available sources, a batch of addresses is submitted for telephone matching. NORC expects to match phone numbers for at least an additional 2 percent of addresses through this method.
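The flow of this two-tier matching can be sketched as follows; the lookup functions are hypothetical stand-ins for the vendor batch services (Targus, then Accurint), not real APIs.

```python
def lookup_targus(address):
    """Hypothetical first-tier batch match; returns a phone number or None."""
    ...

def lookup_accurint(address):
    """Hypothetical second-tier match for cases the first tier could not resolve."""
    ...

def match_phone_numbers(addresses):
    matched, unmatched = {}, []
    for addr in addresses:                      # tier 1: Targus batch match (~60% expected)
        number = lookup_targus(addr)
        if number:
            matched[addr] = number
        else:
            unmatched.append(addr)
    for addr in unmatched:                      # tier 2: Accurint (~2% additional expected)
        number = lookup_accurint(addr)
        if number:
            matched[addr] = number
    return matched                              # unmatched cases keep the Internet/in-person path
```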



Advance letters

NORC will mail all sample members a customized advance letter packet prior to the start of data collection. The content will present the study's purposes in general terms and provide examples of the types of questions to be answered, as well as the approximate amount of time needed to complete the interview. Households for which a telephone number has been obtained will be mailed an advance letter including a project toll-free number participants can call to receive more information, provide a preferred telephone number for future contacts, or complete the interview. For households where a matching telephone number has not been identified, the advance letter packet will also contain a postcard and a postage-paid return envelope so potential respondents can mail contact information back to the project. Phone numbers provided on returned postcards will be keyed and added to the telephony queue for outbound dialing. For households that have a high likelihood of containing non-English speakers, the letter will be multi-lingual and include NORC's alternative-language toll-free numbers. Such households will be identified using Census block/tract data. Where appropriate, a web site address, password, and login information will be provided as well.



3. Methods to Maximize Response

In order to maximize response rates, the 2010 Census ICP Evaluation will rely on a mixed-mode data collection design that offers respondents alternative ways to complete the survey. NORC will begin the data collection effort by telephone, expecting to obtain phone numbers for approximately 60 percent of the selected sample. Cases will then also be solicited for web completion. A sub-sample of all cases not completed will be fielded in-person in order to reach reluctant respondents and households without telephone or Internet access.


NORC is targeting weighted response rates generally between 65 percent (for the American Indian and Alaska Native sample) and 68 percent (for the core sample that includes Hispanics, non-Hispanic Blacks, and non-Hispanic Others) under the current sampling strategies. Weighting the response rate is necessary due to the subsampling for face-to-face interviewing.
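As an illustration of this adjustment (a generic weighted-rate formulation, stated as an assumption rather than the exact computation NORC will use), cases subsampled for face-to-face follow-up at rate f = 0.20 enter the response rate with weight 1/f:

```latex
RR_{w} = \frac{\sum_{i \in \text{completes}} w_i}{\sum_{i \in \text{eligible}} w_i},
\qquad
w_i =
\begin{cases}
1 & \text{case resolved before the in-person subsampling step,}\\
1/0.20 = 5 & \text{case subsampled for in-person follow-up.}
\end{cases}
```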


To assist in gaining cooperation for the survey, advance letters will be mailed to all households selected for the study. These letters will inform potential respondents that interviewers will be attempting to contact them as well as include a project toll-free phone number for potential respondents to call if they would like to verify the validity of the study. Telephone interviewers will be equipped to answer respondents’ questions regarding the study, interview procedures and uses of data, including confidentiality questions.


Given the positive correlation between Census non-response and non-English-speaking status and/or linguistic isolation, multi-lingual data collection will also help to increase response rates. NORC will have full translations of the evaluation questionnaire in the languages chosen for hard-copy 2010 Census forms: English, Spanish, Chinese, Korean, Vietnamese, and Russian. Telephone and in-person data collection will be conducted in the languages of the 2010 Census form, and NORC has the capability to conduct ad hoc translations for additional languages. In addition, supporting materials will be translated into the relevant languages and tailored appropriately to address the targeted populations.


Incentives will also be used to increase response rates. Respondents will not initially be offered an incentive to participate. To better gauge the effectiveness of incentive use across the different modes of data collection within the address-based sampling design, a set of experiments will be implemented. First, a small pool of funds will be available for use in a $10 conversion incentive experiment within the telephone sample. A second experiment is planned for the panel sample members in the second and third waves of data collection: a treatment group will receive a self-administered questionnaire (SAQ) with a $2 incentive enclosed, plus the offer of an additional $10 upon return of the completed SAQ. Finally, incentives will be used in the face-to-face interviewing environment as a tool to gain cooperation within eligible households. All eligible respondents fielded in the face-to-face environment will receive $20 for participation.

4. Test of Procedures or Methods

Three separate instruments have been developed to measure public awareness of 2010 Census ICC activities before, during, and after the Census. At least 75 percent of the questionnaire items for all three waves’ instruments have previously been administered in national surveys cleared by Office of Management and Budget (OMB). Chiefly, the source instruments are from the 2000 Census PMPE and the Census Barriers, Attitudes, and Motivations Survey (fielded in 2008 by Macro International Inc.). Additional pre-testing of the proposed instruments has been conducted by NORC.


In preparation for data collection, four rounds of cognitive testing were conducted as part of the development of three separate survey instruments: the Wave 1 questionnaire, the Wave 2 questionnaire, and the Wave 3 questionnaire. Two of these rounds involved testing the Wave 1 instrument, and one round each was devoted to testing the Wave 2 and Wave 3 instruments. Each round included no more than nine participants. The cognitive testing process provided information about questionnaire structure, clarity of survey items, and time estimates for survey completion. Respondents were recruited by convenience sampling from the general public, the five hard-to-enumerate populations identified as priorities by the Census Bureau, and the eight ‘audience segments’ defined at the census-tract level.


Because respondents would be unlikely to report current exposure to the 2010 Census ICC, cognitive testing protocols were revised to ask about other public information/participation campaigns (e.g., ‘get out the vote’ efforts for the November 2008 election and obesity-related campaigns). After initial testing of all three waves’ instruments, a revised Wave 1 questionnaire was tested. Respondent answers and comments from all four rounds of cognitive interviews informed the structure and inclusion of survey items in the final versions of the questionnaires.


5. Contact(s) for Statistical Aspects and Data Collection


For questions on statistical methods or the data collection described above, please contact

Lawrence Cahoon

U.S. Census Bureau

DSSD, HQ4K065

4600 Silver Hill Road

Washington, DC 20233

Phone: 301-763-9294

Email: [email protected]


Steven Pedlow

NORC

55 East Monroe Street

Chicago, IL 60603

Phone: 312-759-4000

Email: [email protected]

Attachments to the Supporting Statement



Appendix A: Advance letter sent to core sample households with matched phone number

Appendix B: Advance letter sent to oversample households with matched phone number

Appendix C: Advance letter sent to core sample households without a phone number match

Appendix D: Advance letter sent to oversample households without a phone number match

Appendix E: Conversion reminder letter sent to core sample households after CATI dialing

Appendix F: Conversion reminder letter sent to oversample households after CATI dialing

Appendix G: Advance letter for panel participants

Appendix H: Postcard for address/phone updates

Appendix I: General informed Consent statement to be included in interviewer scripts

Appendix J: Thank you letter mailed to participants following interview

Appendix K: Lifecycle flow of selected sample units through data collection

Appendix L: 2010 Census ICP Evaluation Questionnaire Wave 1

Appendix M: 2010 Census ICP Evaluation Questionnaire Wave 2

Appendix N: 2010 Census ICP Evaluation Questionnaire Wave 3

Appendix O: Comments from Cynthia Taeuber

Appendix P: Letter responding to Cynthia Taeuber’s comments

Appendix Q: Letter from Andrew Reamer of the Metropolitan Policy Program of the Brookings Institution


