
Supporting Statement Attachment B - ACS Messaging Benchmark– Cognitive Interview Report

OMB Control Number 0607-XXXX

American Community Survey, US Census Bureau

Benchmark Survey Cognitive Interviews Report

Supporting Statement Attachment B


This report covers cognitive interview findings for the ACS Messaging Benchmark survey conducted December 2–4, 2013. It discusses the results for each question and recommends ways to improve the survey instrument. Recommendations from the cognitive interviews are incorporated into the OMB supporting statement.


SUMMARY OF RECOMMENDED CHANGES:

  • Q9: Remove the question about whether participation in the survey is “required by law,” in order to test the other messages using likelihood to respond metrics. Respondents repeatedly mentioned the legal requirements when discussing whether other messages would make them more or less likely to respond to the ACS. We will still test the “required by law” message in the messaging section.

  • Q12: Replace the question about whether the federal government “does more harm than good” with a widely used question about how often people can “trust the government in Washington to do what is right.”

  • Q15: Replace the question about whether ACS participation can present a “benefit or harm” with two separate questions: one about whether it can benefit or harm the individual, and one about whether it can benefit or harm the community.

  • Q24, 26, 33: Minor wording changes to three messages (Non-partisan, Accurate and Timely, and Privacy/Security) to improve clarity

  • Q46: Remove the marital status question

  • Q52: Modify the rent/own question to clarify that the question is at the household-level


METHODOLOGY:

Cognitive testing helps ensure that potential respondents are capable of understanding and properly responding to the Benchmark survey instrument. The approach focuses on whether participants are able to comprehend, interpret, and respond to our questions—without confusion or cognitive overload. The process identifies potential “red flags” that might affect the validity of the survey instrument and allows the research team to improve the questionnaire before fielding.


The cognitive interviews for the Benchmark survey used a verbal probing approach, in which the interviewers probed specific survey questions using a series of planned as well as unplanned (spontaneous) follow-ups. Verbal probing ensures that critical sections of the survey instrument are explored in every interview, in a manner that makes it easy for respondents to provide candid feedback (Willis, 1999).


Participants were recruited over the telephone from a mix of random digit dial (RDD) and commercially available listed telephone numbers. Eight screening questions and a quota system were used to ensure a broad range of participants. Interviewers set up appointments with respondents who were not able to complete the cognitive interview immediately. Interviews averaged 38 minutes in length, and participants received a $40 honorarium for their time. Cognitive interviews were transcribed and reviewed for quality. Interviews were conducted in English, though at least one participant indicated that their first language was not English.
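
For illustration, the screener’s quota logic described above can be sketched as follows. This is a minimal sketch only: the quota cells and targets shown are simplified assumptions, and the actual screener used eight questions and its own cell definitions.

from collections import Counter

# Hypothetical quota targets per demographic dimension; the real screener
# used eight questions and its own cell definitions.
QUOTA_TARGETS = {
    "gender": {"male": 5, "female": 4},
    "age": {"<35": 2, "35-64": 5, "65+": 2},
}

completed = Counter()  # completed interviews per (dimension, level) cell


def screener_accepts(recruit):
    """Accept a recruit only if every quota cell they fall into still has room."""
    for dimension, targets in QUOTA_TARGETS.items():
        level = recruit.get(dimension)
        if level not in targets:
            return False                      # outside the defined cells
        if completed[(dimension, level)] >= targets[level]:
            return False                      # that cell is already full
    return True


def record_completion(recruit):
    """Update cell counts after a completed cognitive interview."""
    for dimension in QUOTA_TARGETS:
        completed[(dimension, recruit[dimension])] += 1


# Example: a 40-year-old man is accepted while the "male" and "35-64" cells have room.
print(screener_accepts({"gender": "male", "age": "35-64"}))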


In total, nine cognitive interviews were completed and analyzed. Two additional interviews were conducted but excluded from the final analysis because of limited respondent engagement. The final set of interviews included individuals with the following characteristics:


Gender
  • 5 Males
  • 4 Females

Race/Ethnicity
  • 2 Whites
  • 2 African Americans
  • 3 Hispanics
  • 2 Asians

Age
  • 2 who are <35
  • 5 who are 35-64
  • 2 who are 65+

Census Region*
  • 4 from Midwest
  • 2 from South
  • 3 from West


*Note: the two interviews excluded from analysis because of data quality concerns were from the Northeast


Over the telephone, the cognitive interview went through the full ACS Messaging Benchmark questionnaire. At twelve (12) designated points in the survey, the interviewer probed the participant’s understanding of the questions, their ability to interpret the responses being requested, and any confusion resulting from the questions asked or language used. In addition, interviewers spontaneously probed in situations where respondents seemed to have an issue with an item in the survey. At the conclusion of the interview, the interviewer also gave respondents an opportunity to express any other thoughts they had about the process or the survey.


The following analysis looks at each question that had a structured probe.




Question 6:

  1. Original:

Q6. Overall, how would you describe your general feelings about the Census?


  1. Very favorable

  2. Somewhat favorable

  3. Somewhat unfavorable

  4. Not at all favorable

  5. Don’t know (DO NOT READ)

  6. Refused (DO NOT READ)


(Note: question originally asked in CBAMS II)



  1. Probe(s):

  • What sort of things did you think about when answering that question?


  1. Results:


In general, participants had a strong basis for answering this question, based on specific experiences with the Census Bureau or an understanding of how the decennial census is used (e.g., for reapportionment of congressional representation). Eight of the nine respondents had “very favorable” or “somewhat favorable” feelings towards the Census. One middle-aged Asian man recalled that, “it was a straight-forward census asking straight-forward questions.” Others mentioned that it was important to understand how “the population has dispersed itself” and that the Census’ goal was to have an accurate count for dividing up federal funds.


One participant recalled filling out the 2010 Census and not liking the choices available for race or ethnicity. The person felt the categories didn’t really apply to their family. Another mentioned “thinking about all the immigrants (inaudible) being counted that shouldn’t be.”


  1. Suggested revision:


Based on the personal experiences with the previous decennial census that participants cited, no changes are recommended.






QUESTION 7, 8:

  1. Original:

Q7. Have you ever heard of the American Community Survey?


  1. Yes

  2. No

  3. Don’t know (DO NOT READ)

  4. Refused (DO NOT READ)


/* DISPLAY */ ## IF HAVE HEARD OF ACS ## As you may know, The American Community Survey is conducted by the US Census Bureau. Each year roughly three percent of all US households are selected at random to participate. The survey asks questions about you and the people in your household. For example, it asks about topics such as your commute time, income, and the age of children.


/* DISPLAY */ ## IF HAVE NOT HEARD OF ACS ## I would like to tell you some information about the American Community Survey. The American Community Survey is conducted by the US Census Bureau. Each year roughly three percent of all US households are selected at random to participate. The survey asks questions about you and people in your household. For example, it asks about topics such as your commute time, income, and the age of children.


Q8. ## IF “NO” OR “DON’T KNOW” TO ACS AWARENESS ## Have you ever heard of that before?


  1. Yes

  2. No

  3. Don’t know (DO NOT READ)

  4. Refused (DO NOT READ)
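
The display logic above branches on the answer to Q7. The sketch below mirrors that routing in Python for illustration only; the function name, answer codes, and abbreviated display text are stand-ins rather than part of the instrument specification.

INTRO_AWARE = ("As you may know, the American Community Survey is conducted by the "
               "US Census Bureau. ...")       # abbreviated IF HAVE HEARD OF ACS display
INTRO_UNAWARE = ("I would like to tell you some information about the American "
                 "Community Survey. ...")     # abbreviated IF HAVE NOT HEARD OF ACS display


def route_awareness(q7_answer):
    """Return the script steps read after Q7, mirroring the display logic above."""
    steps = []
    if q7_answer == "yes":
        steps.append(INTRO_AWARE)             # IF HAVE HEARD OF ACS
    else:
        steps.append(INTRO_UNAWARE)           # IF HAVE NOT HEARD OF ACS
    if q7_answer in ("no", "don't know"):
        steps.append("Q8. Have you ever heard of that before?")
    return steps


# Example: an unaware respondent hears the description and is then asked Q8.
print(route_awareness("no"))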



  1. Probe(s):

  • Can you describe in your own words what the American Community Survey is?


  1. Results:


None of the nine participants had previously heard of the American Community Survey, even after hearing the short description of it. This is consistent with Census staff’s experience that many ACS respondents’ first exposure to the ACS comes from the initial notification letters.

When asked to describe the ACS in her own words, one woman replied, “from what you just told me?” This implies that the description in the display is critically important for the survey instrument.


When asked to describe what the ACS is in their own words, three respondents said that households were selected at random. In addition, three respondents noted that only a fraction of the population is selected to participate in the ACS. Two participants expressed concern that by interviewing only a “fragment” of the population, the survey would not get a true picture of the public. One participant responded that the three examples cited seemed intrusive. One participant, whose first language was not English, initially had some difficulty understanding that the ACS was implemented by the Census Bureau.


  1. Suggested revision:


No changes recommended, as this section appears to be functioning as intended. The description gives survey participants a shared understanding of what the ACS is, its purpose, and some of the material asked during data collection. The three examples in the description (commute time, income, and the age of children) were included because they represent both the range of ACS questions and topics that Census Bureau field interviewers have identified as some of the more controversial (Olson 2013).





QUESTION 10:

  1. Original:

Q10. Overall, how would you describe your general feelings about the American Community Survey?


  1. Very favorable

  2. Somewhat favorable

  3. Somewhat unfavorable

  4. Very unfavorable

  5. Don’t know (DO NOT READ)

  6. Refused (DO NOT READ)



  1. Probe(s):

  • What sort of things did you think about when answering that question? How did you decide on your answer?


  1. Results:


The answer choices from the nine interviews are outlined below:


Choice | Count
Very favorable | 1
Somewhat favorable | 5
Somewhat unfavorable | 1
Very unfavorable | 1
Don’t know (NOT READ) | 1
Refused (NOT READ) | 0



Six of the nine respondents said they were either “very favorable” or “somewhat favorable.” Participants mentioned “seeing the importance of doing it” and that “the information would be useful” for learning about the demographics of a community. One African-American male said he was favorable to the program because it might impact “whether or not you fund certain programs that may exist in my community, then I would think it’s favorable.”


When probed about how they came to their decision, four participants mentioned that they lacked the knowledge about the survey needed to form an opinion. One Hispanic male said, “I guess it would depend on what the actual questions that were being asked and whether I thought they were valid.” An African-American male prefaced his comments by saying, “I mean, I’m not too familiar with it, but if it’s dealing with, along the same lines as the census, then I would think it would be favorable.” Similarly, a different African-American male had the following exchange:


RESPONDENT: I mean, this is new to me, hearing just that. I guess it would be something that I would still have to learn, and I guess have more research on… It’s no different than the 10 year census. Most that’s told to us, we heard of that. They broadcast that on the news, and we see commercials about that when it’s coming. So you’re fully aware of what’s going on. But with American [Community Survey], I didn’t know anything about that until you told me.

INTERVIEWER: It’s the lack of information, then?

RESPONDENT: Yes, it would be the lack of information.

INTERVIEWER: How did you decide then that you were “somewhat favorable?”

RESPONDENT: Just interested. I mean, they always come up with something, I guess you learn as you go. And I have no problem with that. As long as I’m told roughly what it is, and how I can benefit from it.


In addition, one participant expressed skepticism that “Washington politicians” would really look at the surveys of their community and another participant reported feeling uncomfortable about making a survey “obligatory.”


  1. Suggested revision:


No changes recommended. As all nine participants were hearing about the American Community Survey for the first time, it is understandable that they wanted to learn more about the survey before expressing an opinion on the telephone. Nevertheless, eight of the nine interview participants did select an answer, and the reasons they cited (from wanting to benefit their community to concerns over whether politicians would take action) illustrate an engagement with the concept of the ACS, the data collection process, and how it is used.




QUESTION 11:

  1. Original:

Q11. Overall, how would you describe your feelings about the federal government?

  1. Very favorable

  2. Somewhat favorable

  3. Somewhat unfavorable

  4. Very unfavorable

  5. Don’t know (DO NOT READ)

  6. Refused (DO NOT READ)



  1. Probe(s):

  • What kinds of things did you think about when answering that last question?


  1. Results:


The answer choices from the nine interviews are outlined below:


Choice | Count
Very favorable | 2
Somewhat favorable | 3
Somewhat unfavorable | 4
Very unfavorable | 0
Don’t know (NOT READ) | 0
Refused (NOT READ) | 0


Respondents appeared to have very little difficulty answering this question. Their personal experiences and political philosophy seemed to have a significant impact on how they answered. Participants recognized that this question was broader than the Census Bureau or ACS discussion that had come in previous questions. Upon probing, frequently cited reasons for their answers included “partisanship” and “how split the government is.”


Other reasons for their responses included healthcare reform, views about Barack Obama, and “the general well-being” of the people. One man described how, in theory, he had a very favorable view of the government, but added, “I think they should be doing or making an effort to compromise on things, and that’s what I find frustrating.” A senior Asian woman who was “very favorable” toward the federal government said, “without the federal government, we couldn’t exist, now could we? I mean they’re the top guys, in charge first.”


One participant mentioned that they viewed the federal government’s involvement in healthcare in a positive light. On the other hand, another respondent said their answer reflected “my healthcare situation, and how it’s being imposed on me.” When asked why, a Midwest woman described her reasons as follows:


RESPONDENT: Just politics in general.

INTERVIEWER: Anything particular about politics?

RESPONDENT: No. Just the fact that I’m not happy with the president. I’m not happy with Obamacare.


She also indicated at the end of the survey that asking her opinion on the federal government was the one “really uncomfortable” question to answer. She said, “You never know about that one.”


  1. Suggested revision:


No changes recommended. One interesting finding was that no participants indicated that they had a “very unfavorable” view of the federal government, while four participants were “somewhat unfavorable.” While this could be entirely due to the particular interviews or quotas not being representative of the population as a whole, there could also be an observer effect from having the interviews recorded. This could be an issue to monitor in subsequent fielding of the Benchmark survey. However, the survey instrument treats both “somewhat unfavorable” and “very unfavorable” responses as factors for potential drilldown into the additional messaging section.
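
To make the drilldown routing concrete, the sketch below shows one way the qualification check could be evaluated. The question IDs and qualifying answers are taken from the drilldown criteria noted in this report (the ## DRILLDOWN CRITERIA ## markers and the discussion above); the any-answer-matches rule is an assumption for illustration, not the instrument’s actual specification.

# Illustrative drilldown check; the any-match rule below is an assumption.
DRILLDOWN_CRITERIA = {
    "Q11": {"somewhat unfavorable", "very unfavorable"},  # feelings about the federal government
    "Q15": {"personally harm you"},                       # ACS could personally harm you
    "Q16": {"strongly agree", "somewhat agree"},          # ACS is an invasion of privacy
    "Q17": {"no", "don't know", "refused"},               # confidentiality not understood as required by law
}


def qualifies_for_drilldown(answers):
    """Return True if any recorded answer matches a flagged drilldown criterion."""
    return any(
        answers.get(qid, "").lower() in qualifying
        for qid, qualifying in DRILLDOWN_CRITERIA.items()
    )


# Example: someone who somewhat agrees the ACS is an invasion of privacy qualifies.
print(qualifies_for_drilldown({"Q16": "Somewhat agree"}))   # True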





QUESTION 12:

  1. Original:

Q12. Which of the following best describes your opinion? (READ CHOICES)

  1. The US federal government does more harm than good

  2. The US federal government does about as much harm as good

  3. The US federal government does more good than harm

  4. Don’t know (DO NOT READ)

  5. Refused (DO NOT READ)



  1. Probe(s):

  • What were you thinking about when answering that last question?



  1. Results:


Five respondents indicated that the US federal government does at least as much harm as good. The answer choices from the nine interviews are outlined below:


Choice | Count
The US federal government does more harm than good | 1
The US federal government does about as much harm as good | 4
The US federal government does more good than harm | 4
Don’t know (DO NOT READ) | 0
Refused (DO NOT READ) | 0


This question elicited a wide range of reasons for why respondents answered the way they did.


Several participants cited their own situations to describe how they decided to answer this question. One Asian man posited, “one’s opinion of the government is based on life experience.” Another participant who said the government did more harm than good said, “people dictating what I need to do to comply with the law, when the things that they write into the law are actually harmful to me”.


Two participants thought about social class. One woman said, “it’s about even… it’s good for the people who have a decent income and are up there. But the people who are average and below poverty; the government doesn’t seem to do anything.” An African-American male said he was thinking, “About people that are in a low income bracket,” and that government did not have a lot of programs that were designed to help people in the low income brackets.


There was also a wide range of specific policies that participants put forward as examples. One Hispanic woman said she was, “thinking of the Clean Air Act, any environmental regulations that are in place, but I also feel that currently… I think of Congress and how Congress is susceptible to lobby.” Two respondents cited airplanes flying as an example of the federal government doing good.


An Asian woman said that she thought it was about equal because while “President Obama works hard. He’s an honest man and tries to do his best,” the “Republicans” were against his policies. In this way, she interpreted the President and Congress as off-setting each other.


  1. Suggested revision:


Replace the initial question with the following question:

Q12. How often do you think you can trust the government in Washington to do what is right?


  1. Just about always

  2. Most of the time

  3. Only some of the time ## DRILLDOWN CRITERIA ##

  4. Never (DO NOT READ) ## DRILLDOWN CRITERIA ##

  5. Don’t know (DO NOT READ)

  6. Refused (DO NOT READ)



As the range of responses in the cognitive interviews indicates, the initial question is open to a wide range of interpretations. The intent of this question is to measure whether a respondent is cynical about government. As a result, we recommend changing the question to one that more broadly captures sentiments towards government.


This question is commonly used in the academic literature to measure trust in government and political cynicism (see Levi & Stoker, 2000; Citrin & Muste, 1999; Aberbach, 1969). It has been asked from 1958 to the present as part of the NSF-funded National Election Studies (NES). The question asks directly about trust in government, while encompassing the ethical qualities of leaders, the ability and efficiency of government officials, and the correctness of their policy decisions (Stokes, 1962).



QUESTION 13:

  1. Original:

Q13. Which of the following best describes your opinion?

  1. The US federal government knows too much about Americans

  2. The federal government knows enough about Americans

  3. The federal government knows too little about Americans

  4. Don’t know (DO NOT READ) /* DO NOT ROTATE */

  5. Refused (DO NOT READ) /* DO NOT ROTATE */



  1. Probe(s):

  • Tell me a little more about that. How did you choose your answer to that one?


  1. Results:


The answer choices from the nine interviews are outlined below:


Choice | Count
The US federal government knows too much about Americans | 2
The federal government knows enough about Americans | 5
The federal government knows too little about Americans | 2
Don’t know (DO NOT READ) | 0
Refused (DO NOT READ) | 0


An equal number of respondents said the government knows too much about Americans as said the government knows too little. Five respondents said the government knows enough about Americans.


Several respondents referred to current events in their responses. One Asian male said, “We are bombarded in the media with what Uncle Sam’s been up to lately, right?” Another respondent who thought the government had enough information mentioned telephone snooping.


Some participants answered that the federal government didn’t have enough understanding of their communities and what they needed. One Hispanic male who thought the government knows too little said, “they [the federal government] don’t know what’s going on out here. They really don’t. They don’t know, and I don’t think they want to know.”



Other respondents said that the government already had access to a large amount of information. One respondent said the government “already have access to enough information, I think, to do what they need to do.”


Both participants who said the government knows too much expressed a general sense of concern about “too much prying going on into my personal business” and concerns about “big brother watching.”


One African-American male thought the government knew enough.


RESPONDENT: “You know what I mean? I mean, even if you are unemployed and you have to ask for Social Security or social service, or whatever, they make you fill out all of the forms that you, you know, in order to obtain that, you have to jump through the hoops and do all of that, so they know enough. They know about everybody here.”


  1. Suggested revision:


No recommended change. This question is very useful for identifying various attitudes towards information collection, especially for a statistical agency like the Census Bureau.


QUESTION 14:

  1. Original:

Q14. How likely would you be to participate in the American Community Survey (ACS) if contacted by the Census Bureau?


  1. Very likely

  2. Somewhat likely

  3. Somewhat unlikely

  4. Very unlikely

  5. Don’t know (DO NOT READ)

  6. Refused (DO NOT READ)



  1. Probe(s):

  • Tell me a little more about that. What would influence your decision?


  1. Results:


This question was very clearly influenced by Q9, when participants heard that participation in the survey was required by law. All participants said they were at least “somewhat likely” to participate in the ACS.


Choice | Count
Very likely | 6
Somewhat likely | 3
Somewhat unlikely | 0
Very unlikely | 0
Don’t know (DO NOT READ) | 0
Refused (DO NOT READ) | 0


Four respondents explicitly said their response was because of the earlier question. One Hispanic male said he would definitely participate because, “I’m not looking to break the law.” Other participants said that, “by law if I had to do it, I would do it,” and that they would, “now that I know it’s obligatory.” One participant, having been told that he was required by law to participate, wondered, “what are the consequences if I don’t?”


Other participants said that they wanted to help make the Census Bureau work better. One Hispanic woman said, “If it helps the governments choose better ways of doing the Census, just help in general, then I would definitely be more inclined to do it.” Another participant said he wanted the government to be “educated” about the people they govern. Some participants said they would participate because they saw it as closely related to the US Census with “just different names.”


Two participants noted that they would likely participate, but that it would depend on what was asked of them. One Asian male remarked that the best way for him to participate would be to “keep it short and simple.” Another participant noted that, “It would depend on a lot of the questions, and I would hope that I would have the option of saying no… a lot depends on the questions. If they’re going to ask me how many people that live in my house I can tell them the number, but when they start to ask me what the ages are, I kind of resent that.”


  1. Suggested revision:


There was a strong order effect from the earlier question about participation being required by law. The legal requirements were cited specifically in four of the nine interviews. This suggests that messages about mandatory participation are powerful messages—which is consistent with findings from previous Census Bureau mail testing (Leslie, 1996) and ACS survey design estimates (Nevarro et al. 2011).


Recommendation: remove the following question and display:


Q9: If your household is selected to participate in the American Community Survey, do you believe you are required by law to fill out the survey?


  1. Yes

  2. No

  3. Don’t know (DO NOT READ)

  4. Refused (DO NOT READ)


/* DISPLAY */ ## Q9 = NO, DON’T KNOW, OR REFUSED ## Actually, completing the American Community Survey is required by law, just like completion of the census once every ten years.



This message will still be tested in the message section as Q29 (OBLIGATED TO RESPOND). This will enable the research team to compare its impact on anticipated response in a randomized setting with other messages. Similarly, several participants cited their expectation that participation was required by law when responding to Metric B for Q23-33.


Keeping Q9 and removing only the informative display would presumably be unsettling for participants, who may wonder why the interviewer asked them their opinion about whether it was legally required before moving on to other topics.

QUESTION 15:

  1. Original:

Q15. Do you believe that answering the American Community Survey could [personally benefit you in any way, personally harm you OR personally harm you in any way, personally benefit you] or neither benefit nor harm you?

  1. Personally benefit you

  2. Personally harm you ##DRILLDOWN CRITERIA##

  3. Neither benefit nor harm /* DO NOT ROTATE */

  4. Both benefit and harm /* DO NOT ROTATE */

  5. Don’t know (DO NOT READ) /* DO NOT ROTATE */

  6. Refused (DO NOT READ) /* DO NOT ROTATE */



  1. Probe(s):

  • What sort of things did you think about when answering that question?


  1. Results:


Participants were less certain how to answer this question. They chose the following answers:


Choice | Count
Personally benefit you | 3
Personally harm you | 0
Neither benefit nor harm | 3
Both benefit and harm | 1
Don’t know (DO NOT READ) | 2
Refused (DO NOT READ) | 0


When asked what sort of things they were thinking about when answering the question, some participants described potential benefits to the community.


One Hispanic female said, “the Census is supposed to tally how many people there are so that we can better allocate resources to communities that need it more. So when I thought about taking that survey, I thought, OK it could help my community, but…when someone says, ‘Do it, it’ll benefit the community’ it’s kind of like I’ll believe it when I see it. So that’s why I said it would neither benefit nor harm me.”


Three respondents considered if there could be negative consequences for filling out the survey. One respondent who thought it could both benefit and harm said he wasn’t sure, “whether, if I answered incorrectly, there could be retribution.” Another thought that it wouldn’t affect him personally, but “if you’re a convicted felon” or if you “declared bankruptcy,” then completing the ACS could harm you. One African-American male said, “I don’t know. But it can’t harm me, I mean; I’m a law-abiding citizen, so it can’t harm me in any way.”


In addition, two respondents didn’t think the survey would impact them in any way.

One respondent thought they didn’t have enough information, based on the initial examples, to answer the question.


  1. Suggested revision:

Previous cognitive testing from CBAMS II identified this as a potentially difficult question to answer (see Conrey 2012). Participants answered the question either by talking about how the ACS would affect them personally (for example, by attracting government attention) or by how it would benefit the community. The current question asks specifically whether the survey would affect them personally.


We recommend splitting the question to capture both the community impact and the individual impact of completing the ACS:


Q15. Do you believe that answering the American Community Survey could [personally benefit you in any way, personally harm you OR personally harm you in any way, personally benefit you] or neither benefit nor harm you?

  1. Personally benefit you

  2. Personally harm you ##DRILLDOWN CRITERIA##

  3. Neither benefit nor harm /* DO NOT ROTATE */

  4. Both benefit and harm /* DO NOT ROTATE */

  5. Don’t know (DO NOT READ) /* DO NOT ROTATE */

  6. Refused (DO NOT READ) /* DO NOT ROTATE */


Q15_COMM. Thinking more generally, do you believe that answering the American Community Survey could [benefit your community in any way, harm your community OR harm your community in any way, benefit your community] or neither benefit nor harm your community?

  1. Benefit your community

  2. Harm your community

  3. Neither benefit nor harm /* DO NOT ROTATE */

  4. Both benefit and harm /* DO NOT ROTATE */

  5. Don’t know (DO NOT READ) /* DO NOT ROTATE */

  6. Refused (DO NOT READ) /* DO NOT ROTATE */


Question 16:

  1. Original:

Q16. Would you say you agree with the following statement? The American Community Survey is an invasion of privacy.

  1. Strongly agree ## DRILLDOWN CRITERIA ##

  2. Somewhat agree ## DRILLDOWN CRITERIA ##

  3. Somewhat disagree

  4. Strongly disagree

  5. Don’t know (DO NOT READ)

  6. Refused (DO NOT READ)



  1. Probe(s):

  • What does “invasion of privacy” mean to you in this question?


  1. Results:


Participants were split as to whether the ACS was an invasion of privacy. They chose the following answers:


Choice | Count
Strongly agree | 0
Somewhat agree | 4
Somewhat disagree | 2
Strongly disagree | 1
Don’t know (DO NOT READ) | 1
Refused (DO NOT READ) | 1


When asked what “invasion of privacy” meant, two respondents mentioned that it was a combination of the specific questions asked and the way the government makes them answer. One Hispanic man said “invasion of privacy” meant “information that you don’t need to know.”


One African-American male said that while, “the survey asks personal questions about my life particularly,” he didn’t see that as an invasion because he was voluntarily offering the information. He continued that an invasion involved, “some type of mechanism that intrudes on my life whether I want it to or not.”


One woman said “invasion of privacy” made her think of, “personal questions about your finances or what have you,” but with social media, people are sharing more information than they need to already. “I don’t think government would invade anybody’s privacy because they’re asking you to take the survey first and they’re not just going into your house and going through your things to gather that information.”


One Hispanic male somewhat agreed that the ACS was an invasion of privacy, because it could lead to greater government scrutiny from the “IRS” and “more attention from government agencies on what I’m up to.”


Two respondents noted that they couldn’t be sure without seeing all of the questions. Another said he was basing his response on, “the example of some of the questions” from earlier in the survey.


  1. Suggested revision:


No changes recommended. Participants understood that the questions were of a more personal nature, and were able to evaluate whether that constituted an invasion of privacy to them.


Question 17:

  1. Original:

Q17. As far as you know, are the Census Bureau and the American Community Survey required by law to keep information gathered confidential?


  1. Yes

  2. No ## DRILLDOWN CRITERIA ##

  3. Don’t know (DO NOT READ) ## DRILLDOWN CRITERIA ##

  4. Refused (DO NOT READ) ## DRILLDOWN CRITERIA ##


(Note: This question is adapted from CBAMS II)



  1. Probe(s):

  • What does it mean to “keep information confidential” in that question?


  1. Results:


When asked whether the Census Bureau was required to keep ACS answers confidential, six said that it was. The breakdown of answer choices is below:


Choice | Count
Yes | 6
No | 1
Don’t know (DO NOT READ) | 2
Refused (DO NOT READ) | 0


Three respondents said they certainly expected that the Census Bureau would keep ACS responses confidential. One Hispanic man said that confidential meant that, “that I can answer freely, without any fear of more government interference in my life.”


Respondents described how the aggregate findings would be shared, but not the individual responses. They said that “general information,” “high-level” findings, and “results” would be shared, but not a “personal name,” “unique,” or “specific information about particular people.” One Asian woman said that some responses would be OK to share, but others would not.


  1. Suggested revision:


No suggested change. Respondents had a clear notion of “confidential,” and that some aggregate information would be released. They also provided examples of information that they expected not to be released, such as individual names, social security numbers, and incomes.




Question 18-22:

  1. Original:

/* DISPLAY */ I would like to read you some statements about the American Community Survey and the Census Bureau.


/* METRIC A */ Would you say this statement makes you trust the Census Bureau…? (READ CHOICES, READ AGAIN IF NECESSARY)

  1. Much more

  2. Somewhat more

  3. Neither more nor less

  4. Somewhat less

  5. Much less

  6. Don’t know (DO NOT READ)

  7. Refused (DO NOT READ)


/* RANDOM ROTATE SERIES */


Q18. The Census Bureau is different than many other parts of the federal government. They are solely a research organization.


Q19. The US Census has been in existence since the 1790s and the American Community Survey has been conducted in some form or another since the 1850s.


Q20. Participating in the American Community Survey is safe. All individual responses are protected by law and are not shared with anyone – not even other government agencies.

Q21. By law, Census Bureau employees cannot publically release any information that could identify an individual. The penalties for unlawful disclosure can be up to two hundred and fifty thousand dollars or up to five years in prison.


Q22. Millions of Americans participate in the American Community Survey every year. However, the ACS does not release any information that can identify any individual who participates.


/* END SERIES */
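
The /* RANDOM ROTATE SERIES */ directive means the five statements are read in a random order, with the same trust question asked after each one. A minimal sketch of that administration loop is shown below; the abbreviated statement text and the `ask` callback are illustrative only, not part of the instrument specification.

import random

# Abbreviated statement text; the full wording appears in Q18-Q22 above.
TRUST_STATEMENTS = {
    "Q18": "The Census Bureau is different ... solely a research organization.",
    "Q19": "The US Census has been in existence since the 1790s ...",
    "Q20": "Participating in the American Community Survey is safe ...",
    "Q21": "By law, Census Bureau employees cannot release identifying information ...",
    "Q22": "Millions of Americans participate every year ...",
}

METRIC_A = "Would you say this statement makes you trust the Census Bureau...?"


def administer_trust_series(ask):
    """Read the five statements in random order, collecting the trust metric after each.

    `ask` is a callable that poses one prompt and returns the recorded answer.
    """
    order = list(TRUST_STATEMENTS)
    random.shuffle(order)                     # /* RANDOM ROTATE SERIES */
    return {qid: ask(TRUST_STATEMENTS[qid] + " " + METRIC_A) for qid in order}


# Example with a stand-in interviewer that always records "Neither more nor less".
print(administer_trust_series(lambda prompt: "Neither more nor less"))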



  1. Probe(s):

  • From the last set of questions, what stood out to you? Were any of the messages particularly memorable?

  • Did any of those last questions seem confusing or difficult to answer?


  1. Results:


Six participants qualified to hear the drilldown section on intrusiveness/privacy.



Would you say this statement makes you trust the Census Bureau…? (counts among the six participants who heard the series)

Message | Much more | Somewhat more | Neither | Somewhat less | Much less | Don’t know | Refused
Census is different | 2 | 0 | 4 | 0 | 0 | 0 | 0
Legacy / history since 1790s | 2 | 1 | 3 | 0 | 0 | 0 | 0
Responses are safe, not shared | 1 | 3 | 1 | 1 | 0 | 0 | 0
Fines, improper release | 2 | 2 | 0 | 1 | 1 | 0 | 0
Millions participate | 2 | 1 | 2 | 0 | 1 | 0 | 0


Among the six participants that heard the privacy messages (in a random order), each message got a mixture of responses. This suggests that the survey instrument would work as intended by allowing for comparison between messages. None of the respondents indicated that they “didn’t know” or were confused by the messages.


In the probing section, four respondents said that they remembered the message that the Census Bureau doesn’t share individual answers with other government agencies. As one African-American male said, “the last one was memorable in that they don’t share any of the information with anyone, not even other federal government offices.”


Three respondents mentioned that the legacy/history message was particularly memorable. One African American said the “1790s” and “1850s” stood out to him as showing how long the survey has been conducted. A Caucasian woman was also surprised: “wow, why didn’t I hear about this? It’s been around for so long!”


One African-American male indicated that each of the statements would make him trust the ACS much more. When asked why, he said, “whatever is stated with the United States Census, then I would trust them, based on that alone.” He also noted that the $250,000 fine or 5 years in prison was a particularly memorable message.


  1. Suggested revision:


No changes recommended based on the six cognitive interviews that qualified for this section.


Question Q23-Q33:

  1. Original:

/* DISPLAY */ Now I would like to read you some statements and ask your opinion on each. For each statement, I will ask you how believable you find it and I will also ask if that statement would make you more or less likely to complete the American Community Survey.

/* METRIC A */ How believable is this statement?


  1. Very believable

  2. Somewhat believable

  3. Somewhat unbelievable

  4. Very unbelievable

  5. Don’t know (DO NOT READ)

  6. Refused (DO NOT READ)


/* METRIC B */ And when it comes to completing the American Community Survey, would you say this statement makes you… (READ CHOICES)


  1. Much more likely to complete the ACS

  2. Somewhat more likely

  3. Neither more nor less likely

  4. Somewhat less likely

  5. Much less likely to complete the ACS

  6. Don’t know (DO NOT READ)

  7. Refused (DO NOT READ)


(RESPONDENTS RANDOMLY HEAR 6 OF 11 STATEMENTS)
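
Each respondent is randomly assigned six of the eleven message statements and answers both metrics for each one. The sketch below illustrates that assignment step; the short message labels match the tables that follow, while the helper names and `ask` callback are illustrative assumptions only.

import random

# Short message labels matching the row labels in the tables below.
MESSAGES = [
    "Determine where funding goes", "Non-partisan", "Essential for Govern.",
    "Accurate and Timely", "Learn about where they live", "Good for Economy",
    "Obligated to respond", "Only a few", "Legacy / Patriotism",
    "Easy / Tech.", "Privacy / Security",
]


def administer_message_section(ask, n_shown=6):
    """Randomly select n_shown of the 11 messages and record both metrics for each."""
    shown = random.sample(MESSAGES, n_shown)   # RESPONDENTS RANDOMLY HEAR 6 OF 11 STATEMENTS
    return {
        message: {
            "metric_a": ask("How believable is this statement? " + message),
            "metric_b": ask("Would this statement make you more or less likely "
                            "to complete the ACS?"),
        }
        for message in shown
    }


# Example with a stand-in interviewer that always records "Somewhat believable".
print(administer_message_section(lambda prompt: "Somewhat believable"))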



  1. Probe(s):

  • From the last set of questions, what stood out to you? Were any of the messages particularly memorable?

  • Did any of those last questions seem confusing or difficult to answer? (If so, what was confusing?)


  1. Results:


All nine participants heard six different messages.







Metric A: How believable is this statement? (counts)

Message | Very believable | Somewhat believable | Somewhat unbelievable | Very unbelievable | Don’t know | Refused
Determine where funding goes | 1 | 2 | 2 | 1 | 0 | 0
Non-partisan | 2 | 2 | 1 | 1 | 0 | 0
Essential for Govern. | 2 | 3 | 0 | 0 | 0 | 0
Accurate and Timely | 0 | 2 | 2 | 2 | 0 | 0
Learn about where they live | 0 | 3 | 1 | 0 | 0 | 0
Good for Economy | 0 | 2 | 2 | 0 | 0 | 0
Obligated to respond | 1 | 1 | 2 | 0 | 0 | 0
Only a few | 5 | 0 | 0 | 0 | 0 | 0
Legacy / Patriotism | 1 | 1 | 2 | 0 | 1 | 0
Easy / Tech. | 1 | 0 | 1 | 0 | 0 | 0
Privacy / Security | 1 | 5 | 1 | 0 | 0 | 0



Metric B: Would this statement make you more or less likely to complete the ACS? (counts)

Message | Much more likely | Somewhat more likely | Neither more nor less | Somewhat less likely | Much less likely | Don’t know | Refused
Determine where funding goes | 1 | 2 | 2 | 1 | 0 | 0 | 0
Non-partisan | 1 | 3 | 2 | 0 | 0 | 0 | 0
Essential for Govern. | 1 | 3 | 1 | 0 | 0 | 0 | 0
Accurate and Timely | 0 | 3 | 3 | 0 | 0 | 0 | 0
Learn about where they live | 0 | 2 | 2 | 0 | 0 | 0 | 0
Good for Economy | 0 | 0 | 4 | 0 | 0 | 0 | 0
Obligated to respond | 0 | 3 | 1 | 0 | 0 | 0 | 0
Only a few | 1 | 4 | 0 | 0 | 0 | 0 | 0
Legacy / Patriotism | 1 | 1 | 3 | 0 | 0 | 0 | 0
Easy / Tech. | 0 | 0 | 1 | 1 | 0 | 0 | 0
Privacy / Security | 1 | 5 | 1 | 0 | 0 | 0 | 0


In general, participants were able to answer each of the questions for all of the messages. Only one participant answered a message as “don’t know.” He was concerned that the beginning of the message talked about the decennial census while the end talked about the American Community Survey. To him, it seemed that the message “was trying too hard to say the constitution—kind of like added on to [the ACS].”


The only message to have more respondents say it was unbelievable than believable was the Accurate and Timely message. Four participants said it was “somewhat unbelievable” or “very unbelievable” and only two participants said it was “somewhat believable.”


One Asian male noted that the “Founding Fathers and going back to the 1700s” message was memorable, but it wouldn’t get him “jumping out of my chair.” He said it wouldn’t particularly affect him at all.


An African-American male noted that he didn’t find the message about growing the economy credible. He said that he would, “love for that to be the case” if the survey would benefit communities. But he was “pessimistic” because he didn’t believe it would have those effects for his community.



  1. Suggested revision:


Based on the cognitive testing, we recommend two minor wording changes.


Because the Accurate and Timely message had low believability marks, we recommend the following change:


Q26. ##ACCURATE AND TIMELY DECISION MAKING ##


The American Community Survey is often the most [replacing “only”] reliable source of accurate and timely statistical information essential for decision making.



One participant got confused by the wording in the Privacy/Security message. For clarity, we recommend the following change:


Q33. ##PRIVACY/SECURITY##


All individual information collected as part of the American Community Survey is kept strictly confidential. The answers from individual respondents cannot be shared with anyone—not even other government agencies.











Question 24:

  1. Original:

In the messaging section, six participants heard the following message.


Q24. The American Community Survey is required by law to be completely non-partisan and non-political. This ensures that the statistics the Census Bureau gathers and produce are both reliable and trustworthy.



  1. Probe(s):

  • What did the phrase “non-partisan and non-political” mean to you in that series?


  1. Results:


In general, participants had a clear idea of what “non-political” meant in the message. One female respondent said the survey would not be, “geared towards any political party, whether it is left or right.” Others used words like “no consideration given to the politics,” and that it was “not favoring Republicans or Democrats.”


One Caucasian woman said she didn’t know what “non-partisan” meant, but the rest of the message was that, “politics are not supposed to play a part in the information that’s gathered, that’s used, that decisions are made on.” Among those who offered descriptions of both “non-political” and “non-partisan,” both started with a description of “non-political.”


Two participants noted that they didn’t think that it was possible for something to be completely non-political. As one said, “I think that everything is political now. Everything. That’s why I said, you know, that was somewhat unbelievable.”


  1. Suggested revision:


No substantive changes recommended. However, because participants who described both terms consistently began with “non-political,” and one participant did not know what “non-partisan” meant, we recommend reversing the order of the two terms:


Q24. The American Community Survey is required by law to be completely non-political and non-partisan [order reversed]. This ensures that the statistics the Census Bureau gathers and produce are both reliable and trustworthy.




Question 46:

  1. Original:

Q46. What is your marital status?


  1. Now married

  2. Widowed

  3. Divorced

  4. Separated

  5. Never married

  6. Don’t know (DO NOT READ)

  7. Refused (DO NOT READ)


(Note: question initially asked in CBAMS II)



  1. Probe(s): There was not a structured probe for this question.


  1. Results:


One elderly participant noted that many of these terms would describe her. After the interviewer clarified which one best currently described her, she said widowed.


  1. Suggested revision:


We recommend removing this question, as it is unlikely to be a significant demographic variable in our analysis.



Question 52:

  1. Original:

Q52. Do you rent or own your own house or apartment?

  1. Rent

  2. Own

  3. Other /* SPECIFY */

  4. Don’t know (DO NOT READ)

  5. Refused (DO NOT READ)


(Note: question initially asked in CBAMS II)



  1. Probe(s): There was not a structured probe for this question.


  1. Results:


One participant illustrated how this question can be difficult to answer for some respondents as currently worded.


RESPONDENT: My mom owns. Well, I live with her, so can I just say own? I don’t know how to answer that because I live here.

INTERVIEWER: OK. You live with your parents?

RESPONDENT: Yes.


  1. Suggested revision:


Q52. Does your household rent or own your house or apartment?

  1. Rent

  2. Own

  3. Other /* SPECIFY */

  4. Don’t know (DO NOT READ)

  5. Refused (DO NOT READ)



This revision makes clear that the intent of the question is for the household as a whole, not just for the individual respondent.


