Generic Clearance for Questionnaire Pretesting Research

OMB: 0607-0725


2020 Census Tracking Survey


Request: The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We propose to conduct an in-person and online cognitive evaluation of the 2020 Census Tracking Survey in English and Spanish. As part of this submission, we are seeking approval for cognitive interviews conducted by Census Bureau staff and for online data collection designed and supervised by Census Bureau staff. Please note that we will use the name “2020 Census Attitudes Survey” during public testing to avoid confusion over the use of the word “tracking” in the title.


Purpose: The purpose of this research is to test added or revised questions in the 2020 Census Tracking Survey in English and Spanish. In 2010, the Census Bureau monitored public opinion in a weekly telephone survey to identify general attitudes toward the census and people’s intent to participate. For this decade, the 2020 Tracking Survey will help track U.S. public sentiment concerning matters that may bear upon 2020 Census participation. Data collection for the 2020 Tracking Survey will be through random-digit-dial (RDD) telephone interviews, employing a cross-sectional national random probability sample of the U.S. population that yields 1,400 completed cases monthly in 2019 and 1,400 completed cases weekly in 2020. Data from the public opinion survey will inform Advertising Campaign Optimization before and during the enumeration of the 2020 Census.


The 2020 Census Tracking Survey will also include a nonprobability web panel survey. This will provide the research team with data from various subgroups and audiences, which can be analyzed with the precision necessary to base communications campaign decisions on. Respondents will come from multiple vendors because of the sample size and the length of the field period; this vendor mix will remain constant throughout. The survey will be offered in both English and Spanish, employing sample quotas based on age, sex, region, Hispanic origin, race, and education. The data will then be weighted by the same variables at a more granular level, with the addition of Hispanic nativity. There will be approximately n = 2,100 interviews per month in 2019 and per week from January to May of 2020. There will be an oversample of Asian respondents, and the entire sample will be cross-sectional (i.e., it will not include duplicate responses from a single panelist throughout the weekly field period). A number of quality control measures will be taken to ensure high data quality, including checks for speeding, straightlining, the same panelist attempting to respond multiple times, and illogical or suspicious answers (e.g., the same open-ended response used throughout the survey or across surveys). The data from this survey will allow the campaign optimization team to detect statistically significant changes in attitudes toward the 2020 Census among Black/African Americans, Hispanics, and Asians, as well as other subgroups such as younger respondents. The information collected from the probability and nonprobability samples will provide an opportunity to monitor changes in attitudes about the census over time and to explore relevant topics such as message awareness, data confidentiality, the citizenship question, and other emerging areas that may affect census participation.
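Quality control checks of the kind described above (speeding, straightlining) can be sketched in a few lines. This is an illustrative sketch only; the field names and thresholds are assumptions, not the survey’s actual rules.

```python
# Hedged sketch of two of the quality-control flags named above: speeding
# and straightlining. Thresholds and record fields are illustrative
# assumptions, not the vendor's or the Census Bureau's actual criteria.

def flag_speeding(duration_seconds, minimum_seconds=120):
    """Flag a response completed faster than a plausible minimum time."""
    return duration_seconds < minimum_seconds

def flag_straightlining(grid_answers):
    """Flag a battery of grid items answered with one identical value."""
    return len(grid_answers) > 1 and len(set(grid_answers)) == 1

def screen_response(response):
    """Return the list of quality-control flags raised for one response."""
    flags = []
    if flag_speeding(response["duration_seconds"]):
        flags.append("speeding")
    if flag_straightlining(response["grid_answers"]):
        flags.append("straightlining")
    return flags

suspect = {"duration_seconds": 45, "grid_answers": [3, 3, 3, 3, 3]}
clean = {"duration_seconds": 600, "grid_answers": [1, 4, 2, 5, 3]}
print(screen_response(suspect))  # ['speeding', 'straightlining']
print(screen_response(clean))    # []
```

In practice, such flags would be combined with the other checks named above (duplicate panelists, repeated open-ended responses) before a case is removed or replaced.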


Part 1: In-Person Cognitive Test


Population of Interest: The planned cognitive pretesting evaluation will focus on assessing and improving the questionnaire content for the general population.


Timeline: There will be one round of cognitive testing that will begin in February 2019 and end no later than May 2019.


Language: Cognitive testing will be conducted in English and Spanish.


Method: From February 2019 to May 2019, staff from the Center for Behavioral Science Methods will conduct 30 cognitive interviews: 15 in English and 15 in Spanish, using a paper version of the telephone-administered instrument (see Attachment 1: 2020 Census Tracking Survey Questionnaire and Attachment 2: 2020 Census Tracking Survey Questionnaire (web questionnaire)).


Sample: For the cognitive testing, our recruiting efforts for English interviews will target respondents with average to low educational attainment and diverse demographics, including race, age, and sex. For Spanish, we will target recruiting to monolingual Spanish speakers with average to low educational attainment.

 

Recruitment: For the cognitive testing, participants will be recruited using a combination of word-of-mouth, fliers posted at local community organizations such as recreation centers, advertisements on Craigslist.com, and broadcast messages distributed through the Census Bureau’s daily online newsletter. Interviews will be conducted at the Census Bureau’s Response Research Laboratory and at locations convenient to interviewees in DC, Maryland, Virginia, and West Virginia.


Informed Consent: We will inform participants that their response is voluntary and that the information they provide is confidential and will be accessed only by employees involved in the research project. We will use the standard consent form used at the Center for Behavioral Science Methods. This consent form indicates that the respondent agrees that the interview can be audio-taped to facilitate analysis of the results. Participants who do not consent to being video- or audio-taped will still be allowed to participate.


Cognitive Interview Protocol: For the 2020 Census Tracking Survey, we will conduct interviews using the think-aloud method and retrospective probes (see Attachment 3: 2020 Census Tracking Survey Cognitive Interview Protocol Paper). Using retrospective probes allows the respondent to answer sections of the questionnaire uninterrupted before being asked probing questions about the survey questions they just answered. During probing, respondents will be asked how they came up with their answers, about their comprehension of specific words and terms in the questionnaire, and about the level of difficulty of the questions. Participants will be asked to complete the entire survey.


Use of Incentive: For cognitive interviews, due to the length of the interview and the necessity to travel to test locations, we plan to offer an incentive of $40 to offset the costs of participation, such as travel and parking.


Length of Cognitive Interview: We estimate 90 minutes per respondent, overall. (Respondents will be screened using the Census Bureau’s generic screener; thus the burden hours for screenings are covered under a separate request.) The total estimated respondent burden for this request is 45 hours (90 minutes x 30 interviews).


Below is a list of materials to be used in the cognitive testing study:

  1. Attachment 1: 2020 Census Tracking Survey Questionnaire

  2. Attachment 2: 2020 Census Tracking Survey Questionnaire (web questionnaire)

  3. Attachment 3: 2020 Census Tracking Survey Cognitive Interview Protocol Paper

  4. Attachment 4: Demographic Questionnaire


Part 2: Online Data Collection: Online Probing for Pretesting


Population of Interest: The online probing for pretesting will focus on assessing a subset of questions and probes from the in-person cognitive interview protocol. This online data collection will examine a total of 14 questions and 7 probes only (see Attachment 5: 2020 Census Tracking Survey Online instrument and probes). We will test how similar or different the responses from the in-person cognitive interviews and the online responses are, both in English and Spanish, for selected items. This research will examine the comparability of findings from in-person cognitive interviews and online probing in government surveys on salient issues identified previously in the literature (Childs, Clark Fobia, Holzberg, and Morales 2016; García Trejo and Schoua-Glusberg 2017; Goerman, Meyers, Sha, Park, and Schoua-Glusberg 2018; Meitinger and Behr 2016).


Sample: Participants will be recruited using a nonprobability sample from a Qualtrics panel (n = 500). For this online pretesting study, participant qualification criteria will follow quota targets by sex, education, nativity, and region. The sample of 500 respondents will be divided by language spoken at home (250 in English and 250 in Spanish). The table below describes the specific quotas by sex, education, nativity, and region. This study focuses on internal validity rather than on representativeness of any population. The sample size also takes into account break-offs, incomplete data, and participants who do not follow the task instructions, similar to other samples used for studies of this nature.





Variable  | English                                                | Spanish
----------|--------------------------------------------------------|-------------------------------------------------------
Sex       | Men (120), Women (130)                                 | Men (120), Women (130)
Education | High school or less (100), more than high school (150) | High school or less (150), more than high school (100)
Nativity  | U.S. born (250)                                        | U.S. born (150), foreign born (100)
Region    | Census regions: 125 from the West; 125 from the Northeast and South | Census regions: 150 from the West; 100 from the Northeast and South

Source: Adapted from the sample sizes available in the 2016 Collaborative Multiracial Post-election Survey (http://cmpsurvey.org/wp-content/uploads/2017/11/CMPS_Toplines.pdf). For more information about the Census regions, see https://www2.census.gov/geo/pdfs/maps-data/maps/reference/us_regdiv.pdf
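As a sanity check, each quota variable’s cells should sum to the per-language target of 250 completes. A minimal sketch using the counts from the quota table (the cell labels are illustrative shorthand, not the vendor’s field names):

```python
# Sketch verifying that each quota variable's cells sum to the target of
# 250 completes per language, using the counts from the quota table.
quotas = {
    "English": {
        "sex": {"men": 120, "women": 130},
        "education": {"hs_or_less": 100, "more_than_hs": 150},
        "nativity": {"us_born": 250},
        "region": {"west": 125, "northeast_south": 125},
    },
    "Spanish": {
        "sex": {"men": 120, "women": 130},
        "education": {"hs_or_less": 150, "more_than_hs": 100},
        "nativity": {"us_born": 150, "foreign_born": 100},
        "region": {"west": 150, "northeast_south": 100},
    },
}

for language, variables in quotas.items():
    for variable, cells in variables.items():
        assert sum(cells.values()) == 250, (language, variable)
print("all quota cells sum to 250 per language")
```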


Based on the demographic characteristics required for this study, Qualtrics calculated the cost of purchasing sample for 500 participants (250 in English and 250 in Spanish) as follows:



               | n   | CPI*   | Total
English Sample | 250 | $4.50  | $1,125
Spanish Sample | 250 | $15.00 | $3,750
Total          |     |        | $4,875

*CPI: Cost Per Interview
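The cost arithmetic above (cost per interview times sample size, summed across languages) can be checked with a short script; the figures are those quoted in the table.

```python
# Sketch reproducing the sample-cost arithmetic in the table above.
# CPI = cost per interview; figures are those quoted by Qualtrics.
samples = {
    "English": {"n": 250, "cpi": 4.50},
    "Spanish": {"n": 250, "cpi": 15.00},
}

totals = {lang: s["n"] * s["cpi"] for lang, s in samples.items()}
grand_total = sum(totals.values())

print(totals)       # {'English': 1125.0, 'Spanish': 3750.0}
print(grand_total)  # 4875.0
```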


Timeline: The online data collection will happen between February 2019 and May 2019.


Language: Online data collection will be conducted in English and Spanish only.

Method: For this online survey we will focus only on the following questions: intention to participate in the census, census awareness, messaging-related questions, and language questions (see Attachment 5: 2020 Census Tracking Survey Online instrument and probes). For half of the participants, the probing will be retrospective, to mirror the in-person cognitive interviews. The rest of the participants will receive concurrent probing immediately after they answer a question. Respondents will first answer demographic questions and then awareness, mode, and intention-to-participate questions, among others. For the message awareness questions, respondents will randomly receive messages related to the mandatory nature of the census and the benefits of the census (for the list of messages to be tested, see Attachment 6: 2020 Census Tracking Survey List of messages). We will also split the sample so that participants test different wordings of the intention-to-participate and census awareness questions (both in English and Spanish), and we will add three items related to language measurement because the online panel only measures language spoken at home (see Attachment 7: 2020 Census Tracking Survey language questions for participants).

Respondents will have a one-week period to complete the survey. Respondents who are interested in participating will volunteer to respond and thus will not be randomly selected. This is a convenience sample used to learn more about the difference or similarity of responses when comparing online and in-person interview modes. Data collection will be restricted to people living in the United States. We will not need an invitation for respondents to participate because Qualtrics sends surveys directly to its online panelists. Qualtrics pre-screens participants, and we will use their pre-screening variables. CBSM will program and host the survey for the purchased sample through the Census Bureau’s instance of the Qualtrics online platform, which has been FedRAMP and Census ATO certified. The survey page will include a survey introduction (see Attachment 8: 2020 Census Tracking Survey Introduction to the survey). An example of how the questions look in Qualtrics is included in Attachment 9: 2020 Census Tracking Survey Examples from Qualtrics platform.

Use of Incentive: The Qualtrics sample will be given an incentive by the Qualtrics vendor. Per Qualtrics, the exact amount and form that each respondent receives can vary depending on the participant’s profile, how they were recruited, and the form of incentives they have elected to receive (e.g., e-gift cards, points, cash). Generally speaking, though, respondents receive approximately $1.00 to $1.50, or a relatively equivalent value, for completing a 15- to 20-minute consumer survey.


Length of online survey: We estimate 10 minutes per respondent, overall. For this calculation we are assuming a 75 percent incidence rate; this means that if we aim to have a total of 500 completed interviews, we will need to interview up to 875 people in total. The estimated respondent burden at a 75 percent incidence rate is calculated as follows:


Qualtrics:
  Participants screened:                0
  Minutes per participant (screening):  0
  Total screening burden:               0 minutes
  Maximum number of participants:       875
  Minutes per participant (collection): 10
  Total collection burden:              8,750 minutes
  Total maximum burden (screening + collection): 8,750 minutes

Total Burden: 8,750 minutes (145.83 hours)
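The total burden figure follows directly from 875 participants at 10 minutes each; a quick check of the arithmetic:

```python
# Sketch reproducing the respondent-burden calculation above.
participants = 875  # maximum number of participants
minutes_each = 10   # estimated survey length per respondent

total_minutes = participants * minutes_each
total_hours = total_minutes / 60

print(total_minutes)          # 8750
print(round(total_hours, 2))  # 145.83
```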



Below is a list of materials to be used in the current online data collection:

  1. Attachment 5: 2020 Census Tracking Survey Online instrument and probes

  2. Attachment 6: 2020 Census Tracking Survey List of messages

  3. Attachment 7: 2020 Census Tracking Survey language questions for participants

  4. Attachment 8: 2020 Census Tracking Survey Introduction to the survey

  5. Attachment 9: 2020 Census Tracking Survey Examples from Qualtrics platform


References

Childs, J., Clark Fobia, A., Holzberg, J. L., and Morales, G. 2016. “A Comparison of Cognitive Testing Methods and Sources: In Person Versus Online Nonprobability and Probability Methods.” International Conference on Questionnaire Design, Development, Evaluation, and Testing, Miami, Florida. Available at: https://ww2.amstat.org/meetings/qdet2/OnlineProgram/AbstractDetails.cfm?AbstractID=303312

García Trejo, Y. and Schoua-Glusberg, A. 2017. “Use and Practice of Mobile Web Surveys Among Monolingual Spanish Speakers.” Survey Practice. Available at: http://www.surveypractice.org/index.php/SurveyPractice/article/view/410/html_93

Goerman, P., Meyers, M., Sha, M., Park, J., and Schoua-Glusberg, A. 2018. “Comparable Meaning of Different Language Versions of Survey Instruments: Effects of Language Proficiency in Cognitive Testing of Translated Questions.” In Advances in Comparative Survey Methods: Multinational, Multiregional, and Multicultural Contexts (3MC), eds. Johnson, T., Pennell, B., Stoop, I. A. L., and Dorer, B. (Wiley Series in Survey Methodology).

Meitinger, K. and Behr, D. 2016. “Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?” Field Methods, 28(4), 363-380.

The contact persons for questions regarding data collection and the design of this research are listed below:

Yazmin Argentina García Trejo

Center for Behavioral Science Methods

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-3355

[email protected]


Jennifer Hunter Childs

Center for Behavioral Science Methods

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-4927

[email protected]



