2006 PFU Results


Census Coverage Measurement 2010 Person Followup and Person Followup Reinterview Operations and Respondent Debriefings


OMB: 0607-0961







November 6, 2007


DSSD 2010 CENSUS COVERAGE MEASUREMENT MEMORANDUM SERIES #2006-D7-13



MEMORANDUM FOR Brian Monaghan

Chief, Field Division


Attention: FLD Decennial Management Training, Oversight, and Recruiting Branch


From: David C. Whitford Signed November 6, 2007

Chief, Decennial Statistical Studies Division


Through: Donna Kostanich and Magdalena Ramos

Co-Chairs, Census Coverage Measurement Operational Integration Team


Prepared by: Beth Nichols, Jennifer Hunter Childs, Joanne Pascale,

Laurie Schwede,

(Statistical Research Division)

Julie Bibb, Vicki Smith, Sandy Norton,

(National Processing Center)

Jamie Burnham, and Patricia Sanchez

(Decennial Statistical Studies Division)


Subject: 2006 Census Coverage Measurement Person Followup Interview:

Trip Reports: January 2007



I. Background


A major goal of Census Coverage Measurement (CCM) in the 2006 Census Test was to

test new methodology for improving determination of Census Day residence. This goal

was motivated by problems with the 2000 Accuracy and Coverage Evaluation (A.C.E.) in conjunction with the presence of large numbers of duplicate enumerations in Census 2000. The most serious limitation of the 2000 A.C.E. was the ineffectiveness of its person interview and person followup interview in establishing a person’s Census Day residence. This led to an underestimation of erroneous census enumerations, including duplicates, and caused us to overstate the net undercount.

We needed evaluations during the 2006 Census Test to determine whether the revised Person Followup (PFU) established a person’s Census Day residence correctly. For 2006, the primary evaluation of Census Day residence is found in Adams and Nichols (2007). As part of that evaluation, nine headquarters staff observed 47 PFU interviews at occupied housing units (HUs) and conducted 18 respondent debriefings to help evaluate how accurately the PFU assigned residence status. These nine staff members also tape-recorded 40 PFU interviews (with respondent permission) for a study of how interviewers completed the PFU forms given the data provided by the respondent.


This memorandum documents our observations from all of the interviews we observed during our stay in Austin, Texas, and Cheyenne River, South Dakota. It also summarizes the changes to the PFU we recommend for 2008. These recommendations will be forwarded to the CCM Questionnaire Wording and Automation Team (CCM QWAT) for discussion. A future memorandum will document the form evaluation results.


II. Methodology


The complete methodology for the respondent debriefing project is explained in DSSD 2010 CENSUS COVERAGE MEASUREMENT MEMORANDUM SERIES #2006-D7-10, “Conducting the 2006 Census Coverage Measurement Person Interview and Person Followup Interview Respondent Debriefings.”


Eight headquarters staff observed PFU interviews in Austin, TX between January 11 and 23, 2007 (one of the eight did not see any interviews due to an ice storm in Austin), and one headquarters staff member observed PFU interviews in Cheyenne River, SD during roughly the same time. All staff members who participated in this project either were on the CCM QWAT and had participated in the creation of the PFU, or were experts in the residence rules and familiar with coverage measurement issues and complex living situations.


This document presents a list of recommended changes to the PFU based on post-trip meetings held by these individuals (Section III), a summary of findings from these interviews (Section IV), and a complete person-by-person observation report from each individual (Section V). The 2006 PFU form (English translation) is attached.


III. Recommended changes to the Person Followup Interview for 2008


The following recommendations are in no particular order of importance. Note that for the 2006 Census Test the CCM PFU data collection used a Paper-Assisted Personal Interview (PAPI) on landscape 8 ½ x 14 paper.


1. The PFU dates approach worked well - keep this approach.


2. Maintain the legal-size landscape form and general navigation scheme.

  • Try to keep ‘over for yes, down for no’ pattern throughout form - emphasize skip patterns.

  • Revisit skips for questions (Q)1a-h in Section C.

3. Shorten Section C of the form from seven person pages to four (hopefully this will work!).

  • Modify form by having many of the “other address” probes on the same page. The idea is that no one will have addresses for each and every probe.

  • Omit summary page.

  • Revisit Q10-12 - conduct analyses to see if these got useful data, possibly omit Q10-12. (NPC analysts stated they did not use the data to code cases.)

4. Consider three different options for printing the Spanish forms.

  • Spiral bound on top with hard back, like ILB but 8 ½ x 14, with Spanish on the reverse of each page, matched page to page. Would eliminate need for clipboards.

  • Docuprint two forms for every household.

  • Docuprint two forms targeted when Person Interview (PI), Nonresponse Followup (NRFU) or Mailout/Mailback (MO/MB) form was completed in Spanish (or at least when PI was completed in Spanish).


5. Improve maps for 2008/2010 and consider using Global Positioning System (GPS) in rural areas for future censuses (2020).

  • Ensure maps are better for 2008

  • Update maps throughout field test for 2008

  • Plan to use GPS in CCM for rural areas (identified by Type of Enumeration Area) in 2020. This entails using GPS systems in listing as well as other operations.

  • Provide a copy of the HU reference list along with the maps (especially in rural areas) to help enumerators orient themselves.


6. Modify materials.

  • Have one 8.5 x 11 page containing the letter/flashcard/calendar combo to hand to every respondent. The calendar would be on one side and the other side would be ½ letter and ½ flashcard. There would be separate handouts for English and Spanish.

  • Revise letter to be clearer about purpose of visit.

  • Add a paragraph on the Spanish letter saying that if you need a Spanish-speaking interviewer and the current one does not speak Spanish, we will try to send one.

  • Give a Census bag (big enough to contain the clipboards) to each interviewer (purpose: safeguards Title 13 data, promotes authenticity of interviewers).

  • Continue to use plastic bags to package forms.


7. Work on training.

  • Points for gaining cooperation—address how to respond to questions such as: “Why are you here again?” and “Why are you asking only about one household member?”

  • Training situations that show what to do when followup person has moved out.

  • Clarify what to do when you find a Spanish-speaking household.

  • Train using clipboard and train on how to use clipboard.

  • Stress in training that even if the respondent (R) and interviewer do not think a calendar would help, they should give it a try because often it does, in fact, aid in recall.

  • Use calendar/letter/flashcard in training.

  • Use complete form (including Spanish translation) in training.

  • Offer a short update training two weeks after being in the field.

  • Review possible problems in remote/rural areas and the remedies (example: R not recognizing official address & R providing a P.O. Box as the current address).

8. In 2006, the biggest problem with PFU was finding a knowledgeable respondent.

  • Rework Section A of the questionnaire entirely to guide the selection of the knowledgeable respondent.

  • During training, note problem with lag time of PFU operation. There were many movers.


9. Re-evaluate proxy rules

  • Make it clear when you can immediately get a proxy

  • Re-evaluate the rule requiring six visits before taking a proxy

  • In special situations (xx miles and/or XX visits), allow interviewers to telephone to attempt to set up an interview.


10. Reduce interviewer burden

  • Print some general reason for returning to the housing unit to conduct the PFU on each questionnaire by type of case.

  • Evaluate the Spanish translation through cognitive testing. (There is planned cognitive testing of the Spanish PI questionnaire. Many of the PI and PFU questions are the same. Add the unique PFU questions to the debriefing of that PI cognitive testing.)

  • Add whether the PI was conducted in Spanish in case management, so the crew leader (CL) can assign Spanish-speaking interviewers to known Spanish cases.

  • Try to improve “same as” shortcut when interviewers want to indicate that one household member lived at the same address as another.

  • Add “sample address” as college address response option.

  • Consider photo IDs

  • Encourage “promotional material about the Census” to include what is and what is not collected. For example, in the Census advertising, make sure it is clear that the Census does not collect mother’s maiden name or social security numbers.


IV. Summary of Findings


In this section, we identify findings of particular interest that led to the recommendations above. When possible, we reference findings to the individual report from which they came or to post-observation meetings. These are referenced by the authors’ initials following the finding. In addition to findings from the trip reports, where pertinent, there are also notes from two meetings where this team discussed these findings to generate the recommendations above.


Initial Code:

BN=Beth Nichols, JC=Jennifer Hunter Childs, JP=Joanne Pascale, LS=Laurie Schwede,

J. Bibb=Julie Bibb, VS=Vicki Smith, SN=Sandy Norton, J. Burnham=Jamie Burnham


A. General Observations


1. Identifying followup people:

a. Followup people were often long-gone by the time PFU came around, and it was often not possible to locate anyone who knew anything about them. (VS)

b. There needs to be more explanation in training that there are really two phases to the PFU:

i. Investigating in order to find someone who is knowledgeable about the followup people if they are no longer at the sample address (SA)

ii. Conducting the PFU itself once a knowledgeable R is found (BN)

c. The interviewers did not know what to do when the respondent did not know someone on the roster. This situation came up in a couple of interviews I observed, but the interviewers did not ask the followup questions to determine who might know the people. This led to confusion on determining if a person is fictitious. (J. Burnham)

d. Contrary to procedures, interviewers were seen contacting three people in the same household to determine fictitious.


2. Spanish:

a. The operation seems not to have enough Spanish-speaking interviewers. (VS, JP, SN)

b. Procedures for finding a translator in the household (e.g., a child) or the neighborhood seem unclear (VS, JP) and it seemed that no effort was made to assign highly Hispanic neighborhoods to Spanish speakers. (SN)

c. Not all materials are printed in Spanish (e.g., notice of visit). (VS) Also when an English-only interviewer is trying to talk to a Spanish-only respondent to explain that a Spanish-speaking interviewer will return another time, obviously the respondent is mystified. It would be useful to leave a letter in Spanish with the respondent which explains this. (J. Bibb)

d. Spanish on the flipside of the form seemed problematic (VS, SN), but others said it did not seem to be a problem. (BN, JP)

e. Translation may be at a ‘high fluency’ and may need to be simplified. This will be explored in the cognitive testing of the Spanish PI form with the translated “dates questions” in a debriefing supplement. (JC)

f. One interviewer thought the PFU translation was more usable than the PI translation. We should also listen to the tapes of Spanish interviews and evaluate how they went. (BN)


3. Gaining Cooperation

a. Respondents (even landlords) feel hassled and do not understand purpose of return visit. (VS)

b. Respondents did not seem to understand the introductory letter. (VS)

c. Interviewers who explained the purpose in terms of a discrepancy seemed to be able to help respondents understand the purpose of the visit. (JC)

d. Some interviewers thought better community support and information to the public would help. (J. Bibb)


4. Respondent Burden

a. Most new addresses were reported at Q1 and Q2 in Section C. When interviewers continued asking the probe questions, often the address had already been reported, but interviewers failed to use the ‘same as Name’s location’ box and so re-recorded everything. This is really burdensome and error-prone. A proposal would be to ask Q1 and Q2 as is but format the others as probes, rather than each question having a full set of followup questions (landmarks, type of place, etc.). The initial question on military/job/etc., would still be read to the respondent, but we could save a lot of space on the form if we did not have to print followups for each question, but just left space for perhaps two additional addresses. (J. Bibb) Data from the Questionnaire Design Experimental Research Survey (QDERS) show the maximum number of addresses collected for any one person was five.

b. When all the answers are the same for multiple household members, repeating the questions over and over is burdensome. (SN, J. Burnham, BN)

c. Looking at the data, “same as” and “yes, already recorded” were used, but there were also cases where “same as” was marked and address information was collected anyway. Thus, the shortcuts were somewhat effective, but not as effective as they could have been.

d. Looking at the data, only 2 out of 4199 cases had three addresses recorded to Q3-Q9. No one had more than three addresses to these questions. Two address fields for these questions plus notes should be sufficient in all cases.


5. Form Staple and Size

a. Staple in lower left corner is problematic. (all) We should show interviewers in training how to use the clipboard. (JC) If we use landscape legal size paper, docuprinting dictates that the staple go in the lower left corner. (4/10 meeting note)

b. Many interviewers reported that they did not like size and would have preferred 8 ½ x 11. (J. Bibb)

c. The number of pages was manageable, even with a five-person household. (BN)

d. Many interviewers preferred the laptop. (SN, JC)



6. Post-interview work

a. Some interviewers spent a lot of time post-interview filling out the notes and summary Sections. (JC)

b. Some interviewers complained that the summary page was ‘silly, redundant, unnecessary.’ Analysts do not think these pages were helpful. (J. Bibb)


7. Skip patterns & formatting

a. Skips were confusing and inconsistent (some were more explicit than others). The general ‘over to the right for yes; down for no’ pattern (Section C Q2-Q9) seemed to work fine but other areas (e.g., Q10-12) did not. (BN, VS, J. Burnham) Q10 skip patterns were missed consistently. Questions 10, 11 and 12 were asked when not needed. (JC, J. Bibb, J. Burnham)

b. Skip from Q1a to 1h was difficult. Perhaps put Q1h twice on that page. (BN)

c. Skips and instructions need to be marked better. (VS)


8. Flashcards

a. Flashcards were used inconsistently. (all)

b. Interviewers recommended that flashcards be laminated and extras provided; that the bags be produced with a clear plastic pocket attached to the outside, in which the flashcards could be kept; or that the calendar be printed on the clipboards. (JP)

  1. In 2008, NRFU and PI will use “handout” flashcards that respondents can keep. (4/10 meeting note)


9. Calendars

a. Calendars were used inconsistently. (all)

b. One observer demonstrated that the calendar could help a respondent recall dates and after that the interviewer used it more consistently, with success. (J. Bibb)


10. Docuprinting: (all)

a. We need to make sure both rosters are printed (if appropriate).

b. We need to check age filters for babies less than one year old and for people with age from census (not PI). According to our plans, if date of birth was missing, it would be asked in PFU. However, sometimes age was not printed on form, nor was date of birth asked, indicating that we possibly had date of birth but failed to calculate or to print age on the form.

c. Sometimes two Section C’s were printed in error for definite matches.

11. Maps

a. Although the plan for the 2006 PFU test was that interviewers would have maps to assist in finding the followup cases, and the maps were created and delivered to the field, we observed that interviewers in Austin did not have the required Census Bureau maps. As a result, most had to Google the directions themselves and still had trouble. Also, without maps, interviewers could not confirm whether they were outside their sample block. (J. Bibb, J. Burnham)

Post-meeting e-mail discussion (10/17/07): Field has determined that maps were not used in Austin as they should have been. There were comments in all the debriefings (interviewers, crew leaders, FOSs, and office) suggesting that there was variable availability (late delivery), use, and quality of the maps. Field is aware of the issue and will stress to the RO/field staff the importance of the requirement to use the Census Bureau maps for this operation. Also, map development problems encountered for 2006 will be corrected. These changes shall ensure correct application of the PFU procedures for DR and 2010 operations.

b. Inaccurate census maps and unclear mapspots in the South Dakota site continue to be a problem in PFU, as they were during the PI operation. Map-related problems that I observed in PFU (and also in PI) include:

i. Cluster maps that do not have clear descriptions of boundaries;

ii. Maps with no roads or features within them to triangulate the location of target households; maps not drawn to scale; and,

iii. Most pervasive of all, maps with just the perimeter boundaries and just the scattered target households marked by squares, with no other reference points within the cluster to determine whether a given house or trailer is the right one or not.

In looking at some of these maps with no other marked features, I would have had a lot of problems first in just finding the houses, and second, in trying to figure out which units on the ground were in scope. Other houses should be marked on the maps to help interviewers find the right ones relative to the others. Also enumerators expected maps to be updated based on PI, but they were not. (LS)

c. In 2008 interviewers will have better maps, based on CCM independent listing, and the HU matching and Housing Unit Followup (HUFU) operations should help a lot and make it much more likely that PFU addresses are correct. The new maps should also help interviewers figure out whether a given address is within their cluster. NPC thinks this will not be a problem in 2008. (4/17 meeting note)


12. Address Mix-Ups

a. In rural areas sometimes PFU people were recorded at the wrong address. Interviewers need better instruction on what to do here; also it may help to give them a list of ALL HUs in the block, not just the PFU addresses, so they have a chance of figuring out the mix-up. (J. Bibb, LS)

b. Sometimes people in extremely rural areas and Indian reservations do not know their address in the same way the address gets printed on the form. So in Section C, Q1 they say ‘no’ to the address that is read out to them, and then at Q1b the address they do use gets recorded as if it was an alternate address (often a P.O. Box), which it is not. (LS) NPC thought that they got sufficient notes in these cases anyway. (4/17 meeting notes)

c. Interviewers need better training on what to do if people are listed at the wrong address. Specifically, they need to understand that the goal is to followup people, not necessarily addresses, if there is any discrepancy between the two. And they need to be trained to take very good notes on these cases (generally the notes have been good). (4/17 meeting notes)



13. Training

a. Interviewers thought training was helpful and adequate. (J. Bibb)

b. One interviewer said it would be helpful to have some kind of “mid-session” debriefing or training, roughly two weeks after they’ve been in the field. The purpose would be to allow interviewers to come together and share their experiences and problems, and exchange tips on how they overcame those problems. (JP)

c. Interviewer reported there was no training case where the respondent did not know the followup person, even though this happened a lot in the field. (BN)




14. Equipment

a. A legal-size clipboard does not fit into the bags used in 2006. Currently, there are no plans and no budget for bags in 2010 but we recommend advocating for this for the following reasons: (from meeting discussions)

1. Some interviewers go around on foot and have nothing to carry the materials in.

2. A clearly marked bag lends credibility and authority to the interviewer.

3. A bag could be useful for confidentiality since it would shield the data from sight.

4. The outside of the bag could include a plastic sleeve to display the calendar or flashcard. This may make it more likely that interviewers would actually show the materials to the R, since they would not necessarily have to take them out of the bag.

5. On Indian reservations where residents are suspicious of strange vehicles and people, a census bag may help to communicate that the interviewer is with Census and alleviate some of the suspicion.

6. If we do get funding for bags we should ensure they are the appropriate size for the clipboards and other factors (such as accommodating flashcards). (4/10 meeting note)

b. Clipboards worked well when interviewers understood how to use them. (BN, JC) Interviewers should be trained using the clipboard as a prop.

c. Clipboards not as useful in the rain - had to wait at the door to organize form - downside of using paper forms. (SN)

d. Plastic bags worked well to keep forms organized. (BN)

e. Given concerns about identity theft, interviewers think they would have more credibility if they had photo identifications (IDs). (JP)

f. One interviewer recommended that in promotions, the Census notify the public about what information they will and will not be collecting – specifically, that the Census will not ask for Social Security Number or mother’s maiden name. (JP)

g. Not having a Census office impeded efficient meetings for PFU Crew Leaders and their staffs. (LS)



15. Proxy rules

a. Interviewers did not seem to understand proxy rules when the followup person had moved out. (BN)

b. Six proxy attempts were too many, especially when people had to drive far. (4/10 meeting notes)

c. For distant/rural areas it can be very inefficient to drive there only to find no one home, which suggests it may be more productive to phone ahead for an appointment. (LS) However, this could invite refusals that would not happen in a face-to-face encounter. Need to evaluate trade-offs and perhaps come up with a rural-specific scheme, such as ‘once you’ve driven xx miles and/or attempted xx visits, it is ok to call’. (meeting notes)



B. Section-by-Section Observations


1. Cover Page

a. There may have been printing and/or matching errors, as it was often not clear why certain people were being followed up. (JP, JC) More importantly, from the information printed on the rosters, it was often not clear at all to the interviewer (or the observer!) what the situation was and what information the PFU needed to capture for that particular case. (J. Bibb) Interviewers sometimes did not understand why six people were on the roster, but only one was followed up. (SN)

b. The questionnaire should give the interviewer some idea of why the people are being followed up, because that would give them a better overview of the case. Interviewers I observed were forming their own opinion as to why we were following up certain people and that would sometimes affect how they approached the interview. (J. Burnham, JP)

c. We may need to add a question to confirm street number. In apartment complexes where the unit numbers match what is on the form, the actual street address may be different. (JC)


2. Section A

a. When the landlord can give at least some information on when the people moved out, there is not a good place to capture this on the form. (JC) Likewise, if something was learned in casual conversation with the sample address residents about the followup people, there was nowhere to record it. (BN)

b. Some interviewers did not use these questions to determine whether a neighbor would be a qualified proxy; others did not use them to track down a “Smith/Jones” case; others did not use them at all. (JC)

c. We observed an interviewer who used Section A as a record-of-visits page. (VS, JC)

d. Section A did not work at all to prompt interviewers to actually look for a more knowledgeable person OUTSIDE the household (e.g., landlord, neighbor). Interviewers seemed to get the concept of a ‘more knowledgeable person’ but the forms did not adequately prompt them to go find those types of people – they were just focused on filling out their three names. (J. Bibb, BN) The intent of Q4 is not clear. May need to ask ‘qualifying’ questions for a knowledgeable respondent – whether the respondent knows if the followup people were there on Census Day, whether they still get mail, etc. (J. Bibb)

e. Location of the respondent type codes was problematic. (J. Bibb, BN)

f. Interviewers reported liking that the questionnaire asked if it was ok to ask about deceased person. (JP)

g. Interviewers reported that the introduction is inadequate and it is more helpful to be more explicit with respondents. (JP) In general, she said, respondents are less annoyed with the PFU interview than they were with the PI, because they grasp the purpose of the PFU (to clarify a known problem). (JP)

h. Drop Q4 (J. Bibb notes it was never asked) but include instructions for finding other knowledgeable respondents in training.

i. In 2000 they had an introduction that included some kind of screening to identify an eligible informant. We should look at that and see if we can adapt some of the text into the Section A questions. (4/10 meeting note) Post meeting note: These instructions were household-centered, whereas our interview was person-centered. Also reports from 2000 were that these questions were not helpful for interviewers.


3. Section C

a. Dates and Back-and-Forth Questions

i. Some difficulty was observed with “stayed all the time” box when a person had moved. Is it all the time after the move? (JC)

ii. It was also noted that when a followup person stayed several places (going back and forth between all of them), the interviewer did not ask but recorded that the daughter lived at each place from 1/1/06 to 12/31/06. (JC)

iii. It was not clear how to ask Q1h. The question is vague; ‘how often’ does not have a time frame – does it mean ‘how often during the calendar year of 2006’? (JP)

iv. In general the dates questions seemed to work well and the back/forth was a bit fuzzy sometimes but interviewers usually figured it out. (J. Bibb) Overlapping dates usually meant the person went back and forth during that time.

v. When multiple dates are given, it is unclear how to report them. (BN)

vi. The layout is confusing and multi-barreled because it is several questions embedded in one. Some enumerators were confused as to when to mark what boxes in moved vs. back and forth situations. Additionally, the critical question of whether the person was at this place on Census Day is hidden among other response categories. (LS)

vii. Question 8h (and others like it) - Analysts thought these questions worked well and captured the kind of information they needed in order to determine where someone should be counted. Analysts thought dates were almost always given in nonproxy interviews and the checkboxes were also useful. Although we thought it might be problematic to have overlapping dates, analysts thought that it was actually helpful because you could clearly see that the person was going back and forth between those two places during the overlapping time period. Although some concerns were brought forth about the format of the 8h-type questions, the analysts thought the data brought in by them were good. We need to maintain the ability to collect all the data, even if the format changes. (4/17 meeting notes)

viii. The concepts being asked about – dates ‘moving’ in and out, going back and forth, etc – did not map on very well to how people on the Indian reservation think about their residency patterns. However, analysts said that interviewer notes were generally sufficient for them to code people to an address. (4/17 meeting notes)


b. Shortcuts

Using the “Same as” shortcut to indicate repeated addresses did not seem to work; interviewers re-wrote everything anyway. (BN, J. Burnham) Suggestion: automate! If that cannot be done, we should train better on shortcuts if we keep them. We will review the forms for skip pattern problems and possibly reformat or eliminate these shortcuts if they are not useful.


c. Question-by-Question Problems


Q1: Stayed at Sample Address

If the answer is “no,” the skips are fairly easy to follow, but if “yes,” the skip is hard to follow. (JP)


Q1c: Landmarks

This question was usually met with confusion and/or a “do not know” answer. One interviewer reported that respondents might be able to provide better information if they understood the purpose of the request, and recommended modifying the wording – something like: “Are there any landmarks, such as a school or hospital, that would help someone find the address?” (JP)

Q2: Other Address

In two cases, the respondent had an alternate address, reported in Q2, but at Q2g the interviewer asked “Did he live HERE?” (i.e., at the original sample address), not THERE (the alternate address). (JC)


Q2g: On April 1st

This item was sometimes overlooked, and the crew leader had to ask interviewers to return to the field and get the answer. (JP)


Q4: College Address

This was a problem for people who were attending college but did not actually live/stay overnight there. (JC, JP) They did not know how to respond to these questions. (Interviewers need a quick way to record that the person stayed at the sample address (SA)).



Q9: Group Quarters (GQs)

i. Many interviewers understood and gave good explanations of GQs. But two interviewers (who probably did not read the question as worded) almost elicited a ‘yes’ response because the respondent worked at a nursing home or visited a grandmother at a nursing home. (JC)

ii. In Q9c, it was recommended that we eliminate response categories 13-19 because they are not on the flashcard and mention of hospitals as an example by interviewers confuses people to think that general hospital stays should be reported here. (JC, BN)


Questions 10-12c

i. Skips were problematic.

ii. In an observed interview, the respondent had other places to stay but was not able to give any other addresses; the interviewer read only Q11 (since only one address had been recorded) and skipped 12a-12c. (LS)

  1. Additionally, a recommendation was made to ask everyone 12a-12c to ensure Census Day residence is accurately determined. (LS)

  2. While problematic due to skips not being followed, 10-12c were not very useful to analysts. For the most part they could derive residency through the earlier set of questions. These questions had been put in as a backup in case the earlier questions were insufficient, but when they turned up inconsistent answers, analysts used responses given in the earlier parts of the form instead. Sometimes it seemed that respondents would answer the 12a series with where they thought they should be counted, rather than actually answering the question. Analysts thought we did not need questions 10-12c. (4/17 meeting note)

  3. Q12a: Around 4/1 question: We may want to add words like “Just to confirm” since multiple addresses have already been talked about. (JP)

Q13: Demographics

These questions are not required by design requirements, but might still be useful for analysts. (4/10 meeting note) Post meeting note: The decision was made to drop the demographic Q13 series, because it is not required.


Q13a: Name

There are no instructions for what to do if the name is not spelled correctly – should the interviewer cross out the pre-printed name and write in a new name? One possibility would be to add a set of yes/no check boxes, and if “no” is selected, allow a space for writing in the correctly spelled name. Interviewers noted that some respondents find it difficult to listen to a spelling of a name, and both interviewers and respondents are tempted to just look at the pre-printed name. (JP)


Q13c: Date of Birth

Some knew age but not date of birth (DOB). If we keep this sequence, we need to add a place to capture age. (J. Bibb)


4. Record of Visits

  1. Record of visits may be more useful on the front page, rather than tucked at the back. Interviewers do not seem to fill them out. (JC) An interviewer mentioned that it would be helpful to have the record of visits on the front of the form, because it was a lot of trouble having to keep flipping back to the front to get the address information. (J. Burnham)

  2. The cover page ended up being used as a record-of-visits in some instances. (JC)

c. As part of the discussion on how to identify fictitious people and knowledgeable informants, the placement of the record-of-visits page came up. Interviewers complained about it being in the back, and some thought moving it to the front would be an improvement. If the 'other' roster column were dropped from the cover page, there would be space. Dropping the 'other' column would mean that for cases that do have an 'other' roster, the two forms would need to be clerically identified and attached. However, we think these cases are very rare, so this could be worth the trade-off. (4/10 meeting notes)

V. Person by Person Observations


Jenny Childs

January 11 - January 17, 2007


I observed two interviewers over the course of four days (and was iced in for the remaining two days of my trip). With one interviewer, I was able to observe a set of about 17 new cases, all in an apartment complex, that she worked from the beginning almost to the end. With the other interviewer, I saw the end of a set of cases, where we visited HUs where either no one had answered previously or the knowledgeable person had not been home. Both interviewers had worked PI. One explicitly told me (without prompting) that she preferred the laptop to the paper form. I had observed her during PI as well, and she had been very proficient with the laptop. She had difficulty managing the pieces of paper. The other interviewer was split as to whether he liked the computer or the paper. He mentioned benefits of both and wavered as to which he preferred.


It seems that a lot of work needed to be done post-interview by the interviewer to clean up the form. They did the interview as quickly as possible, leaving some filling in to do later. One interviewer told me she spent hours after the interviews at home completing the forms, including writing notes on them. For this reason, I did not get to see how the summary and notes pages appeared. Those will have to be assessed post-operation. We should get at least a box of forms to look through to see how interviewers used the notes sections. The analysts should also provide their input on the summary page as well as the notes fields.


Interviewer #1

Interview #1

This was a proxy interview and was not taped. The respondent was the neighbor of the followup person, who she reported had moved. On the roster were what looked like a father, son, and daughter. The interviewer followed up about the father, though the father and son had the same name and I am not sure she was following up on the correct person. It seemed like there would be more uncertainty revolving around the children. The neighbor said the children stayed with him part of the time.


The interviewer showed the respondent the calendar, but did not show the flashcard. The respondent was not very knowledgeable about dates, and the interviewer did not use the Section A questions to determine if she was a knowledgeable respondent. She reported that the man moved to another city in TX for a construction job, but she did not know when, or the address of where he moved. The interviewer asked Section C Question 2g about the wrong address: she said, "Was he HERE on Saturday, April 1st?" The respondent did not know the answer in any case.


This interviewer had a good explanation of the GQ question. She understood the intent well.


This interviewer missed the skip pattern for Question 10 consistently. She answered 10, 11 and the 12 series for just about everyone.


The next day, the interviewer told me her crew leader said she should continue to try to get in touch with the landlord, because the proxy had given her mostly don’t know responses.


Interview #2

This was a single-person household with a census roster only printed on the form. The person lived at one place only, but moved in on 1/6/06. We did not get the other address, but this is probably ok because it was so early in the year. The interviewer used the GQ card and offered the calendar, but the respondent did not want to see the calendar.


Interview #3

This was a census roster with one followup person. I think there were two people on the roster, but only one was followed-up. The interviewer used the calendar. The respondent was the followup person and had moved in to SA from a Naval Air Station. He reported living in the barracks there. He gave move dates, but I did not write them down (they are on the PFU form).


Here, an interesting problem with the checkboxes after the dates showed itself. Upon encountering a move, she selected "stayed all the time" for the SA, because after the move, he stayed there all the time. I think she thought the "moved" box meant "moved out."


Interview #4

This interview had both a PI and a census roster with three people on each. All three people should have matched. The first pair had a different name on each side (an Asian name and an English name for the same person). The other two pairs differed only by a very minor discrepancy (a different middle initial, I think) and probably should have been matched. There was a Section B for the first person, but not the other two. There were two person sections for each person. The interviewer only asked the questions once for each person. I think the latter two people should have just been matched and not followed up at all.


The interviewer did not follow the skips for Section B. She asked both questions, and wrote in to question 2 that they were the “same person.” Generally, this interviewer had trouble with the skips aside from the general Section C “across for yes and down for no” path.


The interviewer used the calendar on this one. The respondent first reported that she had moved from two places (I think a parental home and a college home) to this place. She gave a move date, and the first address she gave for Section C Q2 was her parents' home. After she completed the Q2 series, she realized that her move was completed in 2005, not 2006. We looked at the calendar and confirmed this (she had been at the new place longer than five months). I am not sure if this got corrected on the form, though the interviewer had used a pencil and might have corrected it afterwards.


One of the biggest problems identified in my observations deals with a college address. The respondent reported that, yes, she was in college. She went to a community college and gave its name. The interviewer asked for the address (not reading the question as worded). The respondent said she did not know the address of the college. Then the interviewer got stuck because she did not know how to indicate on the form that the respondent only stayed at SA while attending college. We need to be more careful about making sure we can indicate "stayed only at SA" for college housing. She wrote a note on the form, but was not able to indicate this within the questions on the questionnaire.


The interviewer did not read the “just to confirm, you only had one address” question. She just marked the answer as if she had.


Section B was missing for Person 2 and Person 3, who should have been matched without having to ask. There were also duplicate person pages for these people.


Interview #5

This was a Smith/Jones case, though I did not see if the Smith/Jones pages printed. The person on the PI roster answered the door and reported she had lived at only one place during 2006. She had lived at the SA for four years. She did not know, nor had she heard of the other family. The interviewer asked if she knew the other people. I asked if she had even heard of them. The interviewer did not use Section A to help her know what to do with this case. In fact, I think she did not know how to complete it after the interview with the person on the PI roster. Section A needs to be revised to be more helpful.


This respondent almost answered “yes” to the GQ question because she works in a nursing home. The interviewer probably did not read the question exactly as worded (as she was not reading most questions as worded), and she clarified that it was only if she was living there, not working there. This did get recorded correctly.


Later, we went to the apartment manager to ask about several households. The apartment manager looked up the census roster family and said that no one by that name had lived in this complex at all. Seems to be fictitious. In this case, she needs one more respondent, but it will be fruitless. If the apartment manager can definitively say those people never lived there, then it is very doubtful that anyone else in the apartment will know them either.


Interview #6

This was an interesting case to try to get an interview for, but relatively boring once we got the interview. We had knocked on this door the day before with no answer. This day we knocked again with no answer. We went across the hall to another SA, when a guy came up and knocked on the door. The interviewer took the opportunity and asked if he was there to see the girl listed on the census roster (a single person on the census roster). He said "yes," and she answered the door. The interviewer persisted with the interview even though the respondent claimed to be busy and had a guest at her door. The guy left and we got the interview. She reported only one place to live. The only thing we found out was that her middle initial had been wrong on the form.


This respondent almost reported yes to the GQ question because she had visited her grandma in a nursing home. I do not think the interviewer read the question as worded – it states “spend even one night.” The interviewer clarified, correctly, that this did not apply if she did not stay there.


There were 17 cases in her caseload from this apartment complex, mainly from two buildings in it. Most had a census roster, but no PI roster printed on the form. I wonder why they ended up in followup... I wonder if we can look up these cases and see.



Interview #7

There were two people on the roster – a mother and daughter. The mother first said that the daughter did not "officially" live anywhere else. She did stay with several friends locally. At one of these places, she stayed with people who were originally from outside the state. In another neighborhood, she had two places to stay. We did not get complete addresses, but we did get the street, a cross street or landmark, and the name of at least one person she stayed with at each place. There were three addresses total with friends. The respondent also gave her ex-husband's address, saying that when mom is out of town, she drops the daughter off there, but the daughter does not stay there – she leaves to stay with friends because she does not like it there.


For the dates, the interviewer did not ask, but confirmed that the girl went back and forth all year, and she put 1-1-06 to 12-31-06 for each address. This did not function as intended.


Most of these other addresses came out during Q2 and there was not enough room to write them all down. Perhaps we should think about having room for another address next to Q2. Interviewer also asked the 2g question wrong – about “here” instead of there, but she wrote the answer down correctly. (She found out the girl was at SA on 4/1 – so she marked “no” for the other address).


Mom reported that the daughter stayed less than half the time at each place but, in response to my probing, said that the sample address was the place the daughter stayed more than any other place.


Later in the interview, it came up that the respondent and her daughter had moved on Census Day. Because of all the other places to live, this did not come up immediately – and I am not sure if it got recorded on the form. They moved on Census Day and had filled out a census form at the old address. She said she did not fill out a form at this address, but an interviewer came by wanting to know why she did not fill one out (not sure if she did a NRFU interview). The daughter had moved out of SA “permanently” just yesterday. On Q12, interviewer marked moved after 4/1 and after the PI interview date because of the most recent move. I think she marked SA for around 4/1 and around the PI interview date.


This was an interesting case of mobility that is documented in more detail on our debriefing sheets (not included here for confidentiality purposes). I am not sure the interview captured the information correctly. We needed to know where she was more than anywhere else. The Q12 series was probably important in this case, but that data ignored the move on 4/1 and concentrated on the move during January 2007.


From what I learned, it looks like the daughter (and mother) should have been an inmover for PI. I think the rule is that if you move on Census Day, you are to be counted where you filled out your form. If so, she should have been counted at another HU on Census day. I think she was a resident of SA on Interview Day, though.


Interview #8

This was a poorly conducted Spanish interview. There was a single person on the Census roster. The respondent did not speak very much English at all. The interviewer knew very limited Spanish. The interviewer called her husband (a fluent Spanish speaker) and he tried to help her with the Spanish, but her pronunciation was pretty bad. She completed the interview – answering that the respondent had no other place to live, but I am not sure how much he actually understood.


Apartment Manager Proxies

The interviewer took the remainder of the cases for this apartment building to the Apartment Manager. In many cases, she already knew the followup person had moved out, based on reports by the current residents. She had about five addresses where she had not found out any information yet (no one home). I encouraged her to ask the apartment manager if they were still living there, and we found out that two of those had also moved out.


In six cases, total, the followup person(s) had moved out. The apartment manager was able to give a move out date, but nothing else. There is not really a good place to capture this information on the form. It is likely the only information we will get in these cases because the people in these apartments do not seem to know one another. From move dates, I gather that one household moved out after the PI, five households moved out between Census Day and PI, and one moved out in August (unsure whether it was before or after PI).


There was one additional case where a different person lived at SA than the followup person from the Census roster and the Apartment Manager reported that the other person had been there since 11/05. I would guess the census person is either fictitious or an address mix-up.


Interviewer #2

Interview #1 and only

This was a Spanish-language case in which the respondent refused the tape. A single person was on the Census roster. She reported no other addresses. The interviewer asked Q12 "Around 4/1" despite only having one address. He seemed to have problems with that Q10 check-item skip pattern too. The interviewer did speak fluent Spanish and was very good at addressing the respondent's concerns.


The age on this form was missing, and we asked the college question despite the woman appearing to be in her mid-50s. No date of birth question was printed. Does that mean we had her date of birth? Is there a problem reading in the census date of birth for question filters?


Noninterview #2

This case was kind of funny – Jamie saw it too. The PI Roster seemed fictitious (famous name). There was a different name on the Census roster. Interviewer had talked to neighbors and no one knew anything about who might have lived there. Interestingly, he had not reported any of these contacts on his record of visits – it was still blank. The one piece of information I thought might be useful for the analysts, which he had not written down yet, at least, was that there was a sign for an Austin Historical Landmark on this house. The unit appeared vacant. A good investigator could have gone to the registry of landmarks and looked for the owner, contacted them and asked if anyone lived there. I do realize this is above and beyond the call of duty for the average interviewer, but I wonder if analysts are allowed to do anything like that.





General observations

Interviewer #1 was using the blank Census Roster on the front cover to record her record of visits. She had listed all the people she had talked to in a couple of cases. I think this indicates that the record of visits would be more useful on the front page. Interviewer #2 did not use the record of visits, as far as I could see. We visited five addresses that he had been trying to get and nothing was written on the record of visits.


Interviewer #1 was not good with the skip patterns. She generally answered more questions than she needed to. I saw Interviewer #2 make a skip error on Q10 also. Lesson learned – avoid unique skip patterns if possible. Both seemed fine with the general Section C “over if yes, down if no” skip pattern.


Section A was not helpful. Interviewer #1 did not use it to try to get a knowledgeable respondent, nor did she use the "have you heard of" question even in the cases where it would be useful. We should look at the forms when they come back to see if interviewers ever used Q3 and Q4. I think that since we are not doing outmover tracing, we do not need Q3. If Q4 is not useful for interviewers, we could use that space better, since it is not data captured. From Section A, it was not clear how to deal with people who had moved out or possibly fictitious people. How to report respondent type was also not clear.


Address mix-ups

When we were trying to find the apartment complex initially, Interviewer #1 had MapQuest directions. We pulled up to this complex, did not see a street number, but continued to look at the unit numbers and found the numbers we were looking for. After knocking on the third door of the addresses she had, someone answered. She asked if he was the guy listed on the form. He said the first name was right, but the last name was wrong. She asked if he knew the other person on the roster and he said no. At that point, I interrupted and asked him what the street address was. It was a different street number! I learned two things here: 1) we should confirm the address in the questionnaire, and 2) apartment mix-ups can occur at the street-number level. We had two cases in this caseload where the people on the roster did not live at the SA and the Apartment Manager said they had never lived there. I suspect no one in the complex will know these people, but they could live at the same unit number in a different complex. This complex was one of many on the street.


The dates seemed to work well for a move, but not so well for a back and forth place. I think we need to revise so that it asks for dates for a move, and asks for amount of time spent for a back and forth.


Both interviewers were well versed in how to introduce this interview. Both understood that there had been some kind of discrepancy in the information previously received and that they needed to resolve it. This worked with respondents rather well. I think people could understand that.


We need to tell interviewers in training how to use the clipboard. Some had difficulty since the staple was in the bottom left.


One interviewer mentioned that the language on the Spanish flashcard was at a very high reading level. Looking at it, the vocabulary is quite advanced. Other than that, I did not hear any negative feedback on the Spanish translation.


In 2008, we need to make sure everything prints as it should. Here are problems noted with the 2006 docuprinting:

1) Both rosters do not always print in cases where there were both a Census and a CCM roster. Printing both rosters would help interviewers understand what they are following up.

2) People who should have been matched perfectly were issued two Section C's with no Section B.

3) There might be a problem with the age/DOB filters. One problem was observed with a middle-aged person on the Census roster: the DOB question did not print, yet no age printed and the age filters were not used.



Beth Nichols


I observed four interviewers (working for two different crew leaders) over the course of three days (1.11.07-1.13.07). All the interviewers I accompanied had worked the PI operation. I saw one completed English-language interview and one completed Spanish-language interview. For both these interviews, the entire PI household was followed up. Both respondents agreed to be audio-tape recorded; I asked an additional question at the English-language interview, which could be considered a respondent debriefing.

In addition to these two cases, I also saw these types of cases:

  • a followup case where the current sample address residents had never heard of the Census followup person.

  • a followup case where the respondent would not answer questions about herself (she was a PI followup) until her roommate arrived home and could be present during the interview; this respondent also said the Census followup person lived next door.

  • a followup case where the respondent recognized the first name of the PI followup person, but said that person did not live there anymore.

  • one PI followup person who was home, but did not answer the door. The interviewer left a notice of visit and called me later and told me she got the interview over the phone. This followup person was incapacitated.

  • eight cases of no one home.


The interviewers I accompanied also told me they had other cases where the sample address was now vacant and they were waiting to talk to the Apartment Manager about the followup people.


These were the problems I encountered.


Problem 1: The first thing to note is how few completed cases I saw. I think this occurred for a number of reasons. First, none of the interviewers had many cases to begin with (an average of three cases per interviewer). I think that is due to the delay in sending out the cases. I do not really understand why there was this delay, but my last interviewer had heard they were getting more cases on Monday the 15th. Second, and perhaps more importantly, the PFU task is very different from the PI task. Not only do you have to get someone to answer the door and agree to an interview, you then have to find a person who knows about the followup people. The PI did not have this second step. One interviewer told me that even though he was told it was going to be a short operation, he thought it would take longer, because there was a lot of investigative work to do.


Solution 1: Emphasize to interviewers (especially those who conducted the PI) that this task is different from the PI task and that they need to be investigators. With an emphasis on the different task, interviewers might take it upon themselves to go and try to find out the truth regarding whether people were fictitiously listed, or whether we just need to find another R to give us good data.


Problem 2: I am not really sure interviewers understood exactly what to do when the current occupants do not know and have never heard of the followup people. Although I saw only one instance of this in the field, and only talked to four interviewers, here are the problems I see with this situation.

  • When asked, one interviewer told me there was no training case devoted to a situation where the respondent did not know the followup person and had never heard of the followup person. This interviewer did not know what to do when presented with the situation.

  • There were form management problems. In Q2 of Section A, the interviewer forgot to collect the current occupant’s name while at the door and later in the car did not see the relationship between the R type and the R Type Code Key at the top of the page. Thus, he did not know what to write in the R type column. (I showed him the key code.)

  • The form misses critical pieces of data to use in coding possible fictitious cases. For the case I saw where the current occupant had never heard of the followup Census person, we found out through casual conversation that the current occupants had lived at the address for over two years and that the followup person was not the neighbor. There is no convenient place to put these critical pieces of information in Section A of the PFU form. Other interviewers told me they had cases where new people had moved to the followup address and yet they had already gotten three people who said they did not know and had never heard of the followup person. In my opinion, we should not code a person as fictitious under those circumstances. Unfortunately, it is not clear from the form whether the analysts will know the full story. To code someone as fictitious, I want to be sure we have a good picture of who the three respondents are.

  • For the case I saw, the interviewer started to ask Q4 in Section A… “Is there someone else who might know the people who lived around here/there in 2006?” but stopped mid-question because the question did not make sense for the situation at hand. In this case, the current occupants had lived at the unit for two years, and said candidly that the person was not a neighbor and they didn’t know anyone by that name. The respondents are the people who know who lived around the unit. Q4 attempts to obtain the name of the neighbor who knows everyone in the neighborhood, but the intent gets muddled by the addition of the year, making the question awkward for current occupants who have lived in the unit prior to 2006.


Solution 2: For this situation revamp the form to capture not only who the respondent is (name), and the relationship to the followup person’s address, but also possibly an address, phone number, and how long they lived at the address (or how long they have known who has lived at the followup person’s address or around the area). Rework the respondent type key code into the form since it is not seen at the top of the form. Eliminate Q4. Include this situation as a training case. Training should also emphasize how to find additional respondents for this situation. Do we tell the interviewer that some folks might be fictitious and that we need three good respondents to be sure we are confident in coding someone as fictitious? If we do not, perhaps we should.


Problem 3: I heard interviewers say a lot of cases had followup people who no longer lived at the address anymore. When I observed this situation, my interviewer seemed hesitant to get data from the proxy. This attitude of not taking a proxy might be leftover from the PI training. (Honestly, I am not sure of the proxy rules for PFU.) The interviewer I saw could have possibly gotten more data from the current occupant about someone who had moved out. Instead the interviewer seemed to want to try to contact the outmover before taking proxy data. The interviewer I saw did get the current occupant’s phone number and jotted it down in the notes. This was a very good idea since there was no specific space in Section A for the phone number of the respondent.


Solution 3: Create training cases like this. Review our proxy rules for situations like this.


Problem 4: Three possible docuprinting errors.

  • The front cover only prints the rosters if someone on the roster needed to be followed up. Interviewers did not know this; instead they assumed whatever rosters printed were the rosters we had. (During the form creation phase, that is how we initially intended the rosters to print.) With the current printing scheme, interviewers were speculating about the situation by looking at the front cover, e.g., this person was missed on the Census, since no one was printed on the Census side. (I was aware of this issue prior to my observations, but interviewers did not seem to know about this printing glitch.)

  • Some names are followed up twice (one name is on the Census side and one is on the PI side). These look like the same people, yet no Section B (duplicate name/possible match) was printed. I saw forms like this (but no interview with this situation). My interviewers told me this was confusing. They wanted to verify that it was the same person and not ask the second iteration of the questions for that person.

  • I noticed that babies less than one year old on Census Day have all the alternate addresses printed for them. I saw this in two cases. I suspect that if age=0 then somehow the filter does not work for those alternate addresses.


Solution 4: Fix these errors for 2008. Be sure to print both rosters (Census and PI) on the front cover since the interviewers speculate that a case might be more difficult than it really is with only one roster present.
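The age-zero symptom in the third bullet of Problem 4 is consistent with a classic falsy-zero bug in the print-filter logic. The sketch below is purely hypothetical; we do not have the actual docuprint code, and the function name `should_print_alternate_addresses` is invented for illustration of how such a glitch could arise:

```python
# Hypothetical sketch of the suspected docuprint filter bug (not actual code).
def should_print_alternate_addresses(age):
    """Buggy version: a bare `if age` treats age 0 the same as a missing
    age, so infants fall into the "unknown age" default branch and every
    alternate address prints for them."""
    if age:  # BUG: 0 is falsy, so age 0 skips the suppression rule
        return age >= 1
    return True  # unknown age: print everything to be safe


def should_print_alternate_addresses_fixed(age):
    """Fixed version: test for None explicitly so age 0 is handled."""
    if age is not None:
        return age >= 1
    return True


# The observed glitch: babies (age 0) get all alternate addresses printed
assert should_print_alternate_addresses(0) is True
# With an explicit None check, age 0 correctly suppresses them
assert should_print_alternate_addresses_fixed(0) is False
assert should_print_alternate_addresses_fixed(None) is True
```

If the 2008 docuprint logic works anything like this, reviewing how it distinguishes "age is zero" from "age is unknown" would be a natural place to start.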


Problem 5: I am not sure why some cases went to followup when they seemed so straightforward.

  • The completed Spanish case had five followup PI people. The household structure was a nuclear family. They all moved into the unit in May 2006. The respondent knew and responded with the complete date of the move. The interviewer got the complete inmover address, with ZIP code. The respondent reported her son was in the hospital for a few days in April. Other than this misunderstanding of our GQ question, there was no other alternate address mentioned. (See below.)

  • The other completed case had three followup PI people. The household structure was mother/father and child. They had lived at the unit on Census Day. The respondent gave an out of the country vacation address for herself, but she was only there for a few weeks. I asked about additional alternate addresses and there were none.

  • Other cases where I reviewed the form seemed straightforward as well, such as a case where there were two people on both rosters and the names matched.


Solution 5: Until I find out why these two cases went to followup, it is difficult to give a solution to this problem. Perhaps look at refining the followup workload.


Problem 6: The inclusion of, “Hospital with patients who have no usual home elsewhere” on the Q9c list of responses in Section C is problematic. The Spanish interviewer I saw did not use the type of place flashcard for this question and instead reworded the question with some examples of GQs. He used “hospitals” as one of his examples and the respondent reported that her small son was in the hospital for asthma from April 3-5.


Solution 6: Eliminate the second column of response choices in 9c and instead use an “other category” with a write-in response.


Problem 7: The separate flashcard and calendar do not work as intended. I did not see the flashcard or calendar used during an interview. One interviewer had lost the flashcard and did not use it in the Spanish interview. Another interviewer left her flashcards in the car and just showed the Q9c GQ list on the form to the respondent. My last interviewer had all the flashcards with him, but I did not see those in action since we did not see any completed interviews. For the other type-of-place questions, the interviewers just read the first half of the question and the respondent would say house or apartment. The interviewer always gave the letter explaining the interview.


Solution 7: I am not sure how to fix this problem. Perhaps we could embed the flashcard into the letter explaining the interview or eliminate it. Perhaps eliminate the calendar.


Problem 8: There might be a form design problem with the skip from 1a to 1h in Section C. I saw one interviewer fill in both 1a and 1g for all three followup people and then complete 1h. For this family, the interviewer should not have completed 1g. This should be investigated with a review of all the forms.


Solution 8: Rethink the form design for this series of questions. Perhaps put 1h twice on Section C, page 1 (once under 1a and again under 1g).


Problem 9: When multiple dates are given, such as when the respondent reported she went on vacation for several nonconsecutive weeks, the interviewer did not know what to write in the "dates field" and asked the respondent for the dates of the last time she stayed at the vacation place. The interviewer did correctly verify that this stay was for less than half the time and, I assume, checked that box. However, the interviewer did not handle the dates field as we had intended: for multiple nonconsecutive stays, the interviewer should have left the "dates field" blank, filled out the checkboxes, and written a note.


Solution 9: Redesign the form layout for the last columns in Section C. Perhaps use three categories (All the Time, Moved, and Back and Forth) and only ask for dates if the person moved from or to that place. The checkbox categories could be selected for any back and forth stay. We could also have a notes area for the back and forth situation where the interviewer could write dates in an open-ended format.


Problem 10: The form often requires the interviewer to rewrite the same information numerous times for the multiple people in the household, including dates of moves. I do not know if interviewers understood that they could select "Same as Name's location" for the address. In the Spanish interview I saw, the interviewer did not write down the address multiple times while he was conducting the interview; instead, he said he would do that later, after he finished interviewing for the day. I recommend a review of forms to see if interviewers understood the concept of "Same as…" I am also not sure whether interviewers knew what to do if the same address was mentioned for a person multiple times, since I did not see that situation.


Solution 10: This form would do well to be automated. In fact, of the four interviewers I observed, three preferred the PI because, with the automated PI, the interviewers did not have to worry about following the correct skip sequence; the PI instrument would do it for them. Given that the form will not be automated, we should conduct a review of the forms for skip problems.


Problem 11: The interviewer who conducted the interview in Spanish used the Spanish translation, but claimed that during the PI he translated the English on the fly because the Spanish was terrible. Given that many of the translated questions were the same on both the PI and PFU, I was surprised he used the PFU Spanish translation. When asked which Spanish questions were problematic, he said the college and military alternate address questions are worded incorrectly.


Solution 11: Listen to the tape recording of his interview and determine which of the Spanish questions he reworded. Cognitively test the Spanish translation and revise accordingly.


What worked generally well:


Not a problem 1: The number of pages in the form was manageable. No one complained to me about the length of the form. I saw a five-person followup household, and the length of the form for that household did not pose a usability problem.


Not a problem 2: Use of the clipboard. Interviewers seemed to use the clipboard to write on. They did not have each case clipped under the clipboard. I am not sure whether they were trained on this aspect.


Not a problem 3: The plastic bags held the forms together nicely. Of the four interviewers I observed, all kept the forms in the plastic bags. The plastic bags helped keep the documents together when spread out in the car. Interviewers were aware that the forms should be face down in the car and were sensitive to the confidentiality of the material on the forms.


Not a problem 4: There did not seem to be a problem with having English on one side of the page and the Spanish translation on the reverse.


Not a problem 5: The general navigation scheme used in the alternate address questions, Section C Q2-Q9, worked well. Interviewers knew that if the respondent answered “Yes” to the main question stem, the interviewer navigated to the right and followed the flow for the row. If the respondent answered, “No” then the interviewer navigated vertically down the column to the next main alternate address question.



Jamie Burnham


I observed CCM PFU interviewing in Austin, Texas, January 11-13, 2007. I observed three interviewers from two different Crew Leader Districts and saw nine interviews in English. Of the nine completed interviews, I conducted only one respondent debriefing. Four of the respondents refused to be taped, but I was able to tape record the other five interviews.


  1. PFU Questionnaire

    • Cover Page/Record of Contacts

      • The questionnaire should give the interviewer some idea of why the people are being followed up, because that would give them a better overview of the case. It would be helpful to have both rosters printed on the front of the form. The interviewers I observed were forming their own opinion as to why we were following up certain people and that would sometimes affect how they approached the interview.

      • The interviewers I observed did not like the form being stapled on the bottom left side of the packet, because it made them flip the pages into their body. This made it more difficult to get through the interview quickly. One interviewer commented that we should staple the form on the top left side since the English version is used the most. They also did not clip the form onto the clipboard, but would use the clipboard to write on.

      • One interviewer mentioned that it would be helpful to have the Record of Contacts on the front of the form, because it was a lot of trouble having to keep flipping back to the front to get the address information that needed to be recorded on the back.


    • Section A - Introduction

      • The interviewers I observed did not know what to do when the respondent did not know someone on the roster. This situation came up in a couple of interviews I observed, but the interviewers did not ask the followup questions to determine who might know the people. This led to confusion in determining whether a person was fictitious.



    • Section B – Possible Match/Duplicate

      • This Section was not printed for certain cases when there were people on both rosters with the exact (or very similar) name. However, the person questions (Section C) were printed for both names. The interviewers were confused about how to handle this during the interview.


    • Section C – Person Questions

      • For the most part, the interviewers seemed able to follow the skips in this Section. They seemed to understand that a "No" response would send them down the form to the next series of questions and a "Yes" response would send them across the form. However, the interviewers I observed did not follow the skips when the skips deviated from this pattern. It seemed they would not take the time to read the skip instructions because they were trying to rush through the form. This often led to confusing questions for both the interviewer and the respondent. The following series of questions seemed to cause the most confusion.


Just to confirm, Name lived or stayed at only one address during 2006. Is that correct? If yes, skip next two questions.


Around April 1, 2006, where did Name live or sleep most of the time?


Around Interview Day, where did Name live or sleep most of the time?


The interviewers would ask all three questions because they would not see the instruction to skip the last two questions. I am not sure of the best way to fix this problem. The most obvious solution would be to have these questions follow the same skip format used throughout the rest of the form; however, there might not be space on the form to lay the questions out in that format. If the layout of the form cannot be changed, we need to look into ways to make the skips stand out more.

      • Using a person-based approach when asking the alternate address questions makes the interview long and repetitive in large households. In one interview I observed, we were following up on a family of six people. The respondent kept saying that the entire family had moved in on the same date and no one had any other addresses, but the interviewer had to continue asking the questions for each person. This may just be a limitation of the questionnaire. The interviewer was also not following the skips, which made the interview much longer. He did not use the "same as" option when entering the same address for each person, so he would ask all of the address questions (type of place, landmarks, and neighbors) for each person even though we had previously collected this information for that address.


  2. Procedure Comments:

  • The interviewers did not receive maps for their assignments, and there were a few instances where a map was needed. During PI, the interviewers did not seem to need maps very often to find their assignments, but they were more familiar with the areas where they were interviewing then. Since the PFU assignments were more spread out, interviewers were not always familiar with the location of the address. One interviewer was using a city map but was still having trouble finding some of the streets. Another interviewer did not have a map and spent a lot of time driving around trying to find the neighborhood.



Joanne Pascale


I visited Austin to observe PFU interviews and to conduct respondent debriefings in cases where the interview itself may not have fully captured the living situation of household members. I was paired with three different interviewers on three different days (January 12, 13 and 14), but due to bad weather on the last day I was unable to observe interviews and instead debriefed the interviewer on the forms, procedures, training, etc. Altogether I observed three interviews (one on the 12th and two on the 13th) and taped them all; no debriefings were conducted. One interview was in Spanish (the interviewer asked a neighbor to translate), and one was completely straightforward with no apparent ambiguity (a single man, living at the sample address and nowhere else for the entire year), so it did not warrant a debriefing. The third interview was a good candidate – the respondent reported a second home and said she stays mostly at the sample address but her husband stays mostly at the second home – but she was generally hostile about the interview and seemed sensitive to being probed on the fact that she and her husband did not live together most of the time.


1. Assessment of Operations Observed, Problems and Recommendations


Overall the interviewers seemed very competent in conducting the interviews, and knowledgeable about the goals of the operation. Below are general observations, as well as item-specific comments and recommendations.


A. Preprinted forms

Some cases seemed to have either matching or printing problems. For example:

1. In the following case, only Peter and George (not their real names) were followup cases, each with a Section C (apparently Mark and John had been matched). It was not clear why Tim and Harry were not also followup cases:

CCM roster          Census roster

1. Peter            1. George

2. Mark             2. Harry

3. John             3. Mark (but spelled differently than on the CCM roster)

4. Tim              4. John (but spelled differently than on the CCM roster)

2. One person showed up on both CCM and Census rosters, with the exact same age and name spelling, and there were two “Section Cs” for her.

3. In several cases, individuals’ ages were not printed.

[Note: details on these cases, including ID information, were forwarded to DSSD]


B. Procedures

Given concerns about identity theft, interviewers think they would have more credibility if they had photo IDs. One also recommended that in promotions, the Census notify the public about what information it will and will not be collecting – specifically, that the Census will not ask for Social Security Number or mother’s maiden name. (According to one interviewer, there was some kind of fraud going on in Austin where someone posed as a Census taker but asked for these extra pieces of information for use in identity theft.)

  1. One interviewer said it would be helpful to have some kind of “mid-session” debriefing or training, roughly two weeks after they’ve been in the field. The purpose would be to allow interviewers to come together and share their experiences and problems, and exchange tips on how they overcame those problems.

  2. Regarding respondents’ recall and dates, one interviewer who worked both PI and PFU had the impression that respondents could recall dates fairly accurately in the PI, but by the time PFU was conducted their memory was failing.

  3. All interviewers preferred an automated instrument in general. And regarding Spanish interviews, one interviewer said it was very helpful in the PI to be able to toggle back and forth between English and Spanish wording for any given question. When I asked if she ever tried to do something similar with the Spanish wording on the flipside of the English PFU pages, she said she had not tried that, but that it would not be as useful or easy as on the laptop.

  4. Some interviewers took a lot of initiative to do some detective work. Two were real estate agents and often used public records to try to obtain a full address if the respondent only gave them a partial address. If a house had a realtor’s sign up, one interviewer would sometimes call the agent to see if the agent would pass on the interviewer’s name and phone number to the sellers. Another interviewer also used public records to obtain the dates that a respondent’s boyfriend was incarcerated.


C. The PFU Form


General

1. The form should not be stapled at the bottom left of the packet; this forces the interviewer to turn the pages toward him/herself, and filling out the subsequent pages becomes difficult. Alternatives would be to have one vertical staple in the top left corner, or possibly three vertical staples going down the side so that the interviewer would be more inclined to clip the whole packet to the clipboard.

2. Interviewers asked if the forms would be scanned by computer or keyed by hand; they’d like to know how careful they need to be when checking the boxes.


Flashcards

3. Interviewers recommended that the flashcards be laminated because they get tattered and dirty after several uses, and they asked for extras in case they lose them.

4. For the three interviews I observed, the “type of place” flashcard was always used. The calendar, however, was never used.

5. One idea for increasing the likelihood that the flashcards get used would be to print them on the U.S. Census bag itself, on the opposite side of the Census seal. Another possibility would be to produce the bags with a clear plastic pocket attached to the outside, and the flashcards could be kept inside the pocket. In some cases the interviewer would not necessarily need to even take the flashcard out, if the bag was in a position so the respondent could see it. Finally, another possibility would be to print the calendar on the clipboards. None of these measures would ensure the flashcards get used, but it would make it more convenient for the interviewers to actually show them to respondents.

6. For the “type of place” flashcard it may be useful to list the response categories with numbers or letters to give a better sense of confidentiality (e.g., respondents who were in jail might prefer to provide a letter or number rather than say “correctional facility”).


Cover page

7. In general, the cover page should tell a clearer “story” about what the living situation of the household members was (at the time of the Census and PI), and why the case is in followup. One feature that could help is to list the entire household roster of both CCM and Census residents, and also include some kind of marker or flag next to the individual(s) who are to be followed up.

8. Interviewers said that having the phone number for the sample address is a big help.


Introduction


9. Asking if it is ok to ask questions about a deceased person is a very good idea.


Section A

10. One interviewer found the introduction inadequate and finds it more helpful to be more explicit with respondents. She explains that there was some kind of discrepancy in the data collected previously and that is the reason for this followup interview – to clear up the discrepancy so that no one is missed or counted twice. In general, she said, respondents are less annoyed with the PFU interview than they were with the PI, because they grasp the purpose of the PFU (to clarify a known problem). With the PI, most respondents just felt like they were being hassled a second time.


Section C

11. Q1: If the answer is “no” the skips are fairly easy to follow but if “yes” the skip is hard to follow.

12. Q1c: This question is usually met with confusion and/or a “don’t know” answer. The interviewer thinks respondents might be able to provide better information if they understood the purpose of the request, and recommended modifying the wording – something like: “Are there any landmarks, such as a school or hospital, that would help someone find the address?”

13. Q1a and Q1h: In one case, the followup person was a student who lived in Dallas during the school year but comes home to the sample address during the summers. While Q1a was fairly straightforward for entering dates at the SA, it was not clear whether or how to ask Q1h. The most appropriate answer was ‘back and forth’ but it was not clear how the next part should be answered since the question is vague; ‘how often’ does not have a time frame – does it mean ‘how often during the calendar year of 2006?’

14. Q2g: This item was sometimes overlooked, and the crew leader had to ask interviewers to return to the field and get the answer.

15. Q4: For respondents attending college but just commuting (i.e., not staying overnight there), it would be helpful to have a response category so they do not end up going down the path to collect the college address.

16. Q12a: We may want to add words like “Just to confirm” since multiple addresses have already been talked about.

17. Q13a: There are no instructions for what to do if the name is not spelled correctly – should the interviewer cross out the pre-printed name and write in a new name? One possibility would be to add a set of yes/no check boxes, and if “no” is selected, allow a space for writing in the correctly-spelled name. All interviewers noted that some respondents find it difficult to listen to a spelling of a name, and both interviewers and respondents are tempted to just look at the pre-printed name.


D. Interesting Case


As mentioned above, in one case the respondent reported that she lived most of the time at the sample address and had a second home where she spent two weekends per month. During her husband’s interview she reported that he spends most of his time at the second home. She was unable to report which address either of them stayed at on April 1st or on the date of the PI. What the instrument did not capture was whether the husband spent most of his time at the second home throughout the entire calendar year of 2006, or whether that was a recent development. In theory he could have spent most of his time at the sample address on Census Day, but later begun spending more of his time at the second home.



Vicki Smith


I, along with two other CCM Analysts, observed CCM PFU at the Austin site January 20 through January 23. I observed four interviewers and saw two completed interviews (one of which was in Spanish); I did not conduct any respondent debriefings. The Spanish interview was straightforward, with the family being nonmovers.


The observation for PFU was very frustrating. Overwhelmingly, the people needing additional information had moved out months ago, and no one seemed to know them, let alone be able to answer any residency questions about them. All the areas of Austin I observed were large apartment complexes with multiple buildings. In the first complex we visited, the cluster caseload was so large that it was divided among four interviewers.


In three of the five block clusters I observed, the people needing followup were exclusively Spanish speaking households. In one of the three block clusters, the interviewer did speak enough Spanish to conduct an interview, but she was not fluent in the language. The crew leader referred to her as his “half-Spanish” interviewer.


In the other two block clusters, which were exclusively Hispanic, the interviewer spoke no Spanish. In these clusters, we found a lot of people at home and willing to do the interview, but no one in the household was bilingual, so the only thing the interviewer could do was give the household a letter explaining the purpose of the visit.


In general, people were tired of the Census Bureau “harassing them” for the same information they felt they had given months ago. The landlords were reluctant to give even minimal information such as move-in/move-out dates. When the analysts were coding the PFU forms, it seemed like we saw a lot of noninterviews and “K” codes (unable to contact a knowledgeable respondent).


Following is a breakdown of the households we visited, and the problems we encountered trying to obtain an interview.


January 20 -

This was the very large complex that was divided among four interviewers. The interviewer I observed had three buildings. We visited ten apartments, and only one person was home. The person who answered the door was an adult son visiting his mother. She did live there on April 1st and on the day the PI was conducted, but he was unwilling and uncomfortable answering any further questions without her knowledge or consent. He politely closed the door in our faces.


Of the nine apartments where no one was home, five had the prior notice of visit still in the door.

One apartment had two notices of visit still stuck in the door, as well as a phone book. I would assume this was a vacant apartment, but the interviewer had not contacted the landlord yet, so he did not know for sure. One woman, in a prior visit, had threatened to call the police; we did not visit her apartment since the interviewer doubted she would cooperate.


January 21 -

The HUs in this cluster were small multi-unit buildings that are subsidized housing. The cluster was predominantly Spanish speaking households, and it was here that my only completed interview took place. We visited 20 households that day: three scheduled a callback, one completed the interview, four knew of the people needing followup but did not know them well enough to complete an interview, and the rest were not at home. In this cluster, we tried in the morning as well as in the evening; had I counted all attempts, the total would be close to 40.


January 22 -

The cluster we visited this day was located in a gated community; we sat in the car, waited for someone to enter the complex, and followed them in. We visited eleven HUs in this cluster. Of the eleven, we completed one interview (the respondent was on the phone with the utility company and refused to be taped) and contacted two households who did not know the people needing followup (they were recent inmovers to the apartment); the rest were not at home. One household the interviewer was attempting to contact had no electric meter. I told the interviewer the apartment was definitely vacant and had probably been vacant for at least a month.


January 23 -

The cluster we visited this day was exclusively Hispanic, and the interviewer I was with that day did not speak Spanish at all. We visited nineteen apartments. Seven of the nineteen were households needing PFU, but because neither of us spoke Spanish, we could only give them a letter and tell them in English that a Spanish speaking interviewer would visit in a few days. (I doubt they understood.) The rest of the households were not at home at the time of our visit.


Over the course of my visit, I was able to talk to the interviewers, a crew leader, and an assistant crew leader about problems or concerns with the PFU form. All found the form awkward because of the way it was stapled in the bottom left corner. Even though they used a clipboard, the interviewers had to turn the pages toward their bodies instead of the more natural way, away from their bodies (as would have been the case had it been stapled in the upper right or left corner). When I asked about the preferred way, all agreed it would be better either stapled in the upper right corner or in booklet form, stapled on the left. All the interviewers agreed the skip patterns need to be marked better; they suggested the instructions be in a different color or, if that were not possible, in bold italics. All said that even though we handed the households a letter, the respondents really did not understand why we were coming back months and months later.


One suggestion I have is to print the notice of visit that the interviewer leaves at each household when no one is home with the time and contact number in English on one side and in Spanish on the other (similar to the letter explaining the purpose of the visit). The back side is currently blank, so why not use both sides of the paper? In Hispanic households, unless someone in the household or a neighbor is bilingual, the household has virtually no idea who visited or why.


I also observed that when we were following up census-only households, the household had invariably moved out months ago. These cases were the noninterviews and the “K” codes I referred to previously. All of the interviewers I spoke to said that the apartment leases had the option of being month-to-month; in the Hispanic cluster, leases were exclusively monthly, and the people who did open their doors had little furniture in their apartments. This reinforced the notion that these people are highly mobile, and future followup attempts for these types of people will result in a high level of unresolved codes, as they did in 2006.



Sandy Norton

January 20 - January 23, 2007


I observed four interviewers over the course of four days. All of the interviewers had worked the PI operation. I observed three complete English interviews and taped them all, and I conducted two debriefings. With all of the interviewers, I only saw the end of a set of cases; no one had been assigned any new cases.


The first day I observed was a Saturday; the weather was 40 degrees and raining. The interviewer had sixteen cases in a large apartment complex. Because of the rain, the interviewer could not take the forms out of their packages until he arrived at the sample address, and once there it would take a few seconds to gather the materials he would need for the interview (calendar, flashcards). When conducting an interview, he had a hard time turning the pages; his fingers were numb from the weather, and the page turning was cumbersome. I conducted a debriefing on this case. Of the sixteen cases: no one home (6), no one in the household spoke English (6), completed interview (1), movers/never heard of household (2), and one refused to answer the door when the interviewer identified himself.


On the second day of observations, the assigned area was a low-income housing area where the majority of households were Hispanic. The interviewer was a white male in his late 40s who spoke no Spanish. He did not understand why he kept getting areas where the majority of households spoke Spanish. He completed one interview, which I received permission to tape. The interview was completely straightforward (a single man, living at the sample address and nowhere else for the entire year), so no debriefing was needed.


On the third day of observations, the interviewer was a 75-year-old white female. I do not know if I made her nervous or if she was not familiar with the form. There were two people listed on the PFU. During the interview she kept turning the page over and ending up on the Spanish version. She tried to end the interview after completing the questions for person one. At this point the respondent was getting a little impatient and kept telling the interviewer the answers were the same for her husband. They had two seasonal homes, but they spend more time at the sample address. I conducted a debriefing for this case.


On the fourth day of observations, the interviewer had only cases where either no one was home previously or he had not yet contacted three knowledgeable respondents. He needed to contact someone at the manager’s office; we arrived at the apartment complex around 2:00 pm, but the manager’s office had closed at 1:00 pm.


Debriefing:

The other two interviews were interesting. In the first, a mother and her son had moved to the sample address in the middle of April. The mother had been living in another Texas city with her brother until the first of April. She and her son then moved into a hotel for one week while they were looking for a car. They then moved to Austin, TX, and stayed in another hotel until they moved into the sample address. The son had been homeless until he moved into the hotel with his mother; from January through April 1, 2006, he was staying in homeless shelters or living in his car.


The second interview was with a husband and wife who had two seasonal homes but spend more time at the sample address. The interviewer was the 75-year-old female; she was nervous and lost her place during the interview. I conducted the debriefing to verify that they spent more time at the sample address.


All of the interviewers I talked to said they preferred the laptop to the form. The staple in the bottom left was cumbersome. One interviewer suggested they punch holes at the top of the form and use rings.


The interviewers did not understand why there were six people listed on the PI side of the form, but they were only asking questions about one person in the household. The roster was confusing to the interviewers.


For the three interviews I observed, the “type of place” flashcard was always used.


Weather permitting, the interviewers used the clipboard. When we arrived at the area, the interviewers would take each case out of its plastic bag and attach all of the forms to the clipboard. When it was raining this was not feasible; they had to wait until they were at the front door to take the form out.


Again the interviewers could not understand why they were assigned cases where the household was Hispanic. The four interviewers I observed did not speak Spanish.



Julie Bibb


I observed the PFU operation for the 2006 Census Test at the Austin, TX test site from January 20 to January 23, 2007. I was able to observe 15 interviews with three different interviewers (one of whom I had previously observed during PI). Unfortunately, most of the interviews (11) were with the same interviewer. He was not one I was originally scheduled to go out with, but on three occasions the interviewer I was scheduled with did not show (one quit, one had to work late at her other job, and one told me on the phone she did not want to interview that day), so I had to make use of whomever the crew leader had going out. The third time, I was paired with this particular interviewer because he was headed back to a rather interesting case where he had only been able to obtain information for one of the followup people. He was following up a second person, but was also going to inquire (his idea) about the person we had followed up previously. More about that later. I conducted respondent debriefings in nine of these cases. Only one person hesitated when I asked to tape the interview, and upon being told it was strictly voluntary she allowed it.


Here are my thoughts from observing the interviews and talking with several interviewers as they met with their crew leader.


General:

Not having maps was a major concern and time consumer for the interviewers. After receiving their assignments, all the interviewers I talked to said they had to Google their own directions and still spent time driving around trying to find their addresses. All three interviewers I was with had trouble navigating to their assignments to some degree. One assignment was in a rural-type area, and the interviewer remarked that he had lived in Austin his whole life and never knew it had such places. Another drove (and walked) around trying to find the correct apartment complex and buildings. The third drove around quite a while (in the cold rain) until she found the correct addresses. Several interviewers I spoke with were adamant that maps should have been given with the assignments, as in the PI. Also, without maps, there is no way for interviewers to realize they are outside their sample block.


In the rural sample block, it would have helped to have a list of all the HUs, not just the followup addresses. The actual addresses and the person rosters for them were all mixed up. (Later, I learned there were PI duplicates because of the mix-ups.) The interviewer was not sure whether he should go ahead and do the PFU interview with the followup household since they were not at the correct address. He wanted to know whether we were following up people or addresses. Fortunately, he decided to go ahead and do the interview and make a note about the address. He mentioned he would talk to his crew leader. Training should address what to do in these situations of address and apartment mix-ups, if it does not already. (The addresses used for mailing and those on the electric meters were all different.)


Two of the interviewers I was with did not speak Spanish. When encountering a Spanish-speaking household, there was no way for them to communicate that they would send a Spanish-speaking interviewer. This left the respondents with blank stares. One of the interviewers showed the Spanish side of the questionnaire to the respondent, but I am still not sure the household realized someone else would come to conduct an interview. Speaking louder did not seem to help! It seemed rude to just walk away. A letter stating in Spanish that another interviewer will be coming is needed.


The interviewers consistently did not like the size of the paper; they thought 8x11 would have been less cumbersome and would have fit on the clipboard better. They did not like the staple in the bottom left corner, which made it awkward to turn the pages in toward themselves. I got mixed reviews on paper vs. laptop. One interviewer thought the paper was a more flexible instrument, allowing the interviewer to go back and change something based on new information the respondent offers in the course of conversation (changing dates, etc.). Another did not like the paper and preferred the automated version. Others said they could use either. It probably just depends on how computer savvy the interviewer is.


The interviewers in general all agreed the training and training materials were satisfactory. All seemed to like the job, but they said this was a tougher operation in which to get the needed data. The public was tired of the multiple census visits (although in the interviews I observed, most people cooperated after expressing slight exasperation with yet another visit), and followup people had moved out and proxies were reluctant to give information. Some thought better community support and information to the public would have helped.


Recall error seemed to play a part because of the time span since Census Day. Some respondents said they just could not remember and some were corrected by spouses as to dates.

Cover:

First, both rosters need to be printed on the followup form. Second, it would also help if the interviewers had some indication of the information that was needed. (I myself could not even tell by looking at the front of the form what information I was hoping to capture.) All of the interviewers I talked to said they felt they could have done a much better job of getting us the information we needed. This would also have kept the interviewers from speculating about the reasons for the visit. In a few cases, the respondent would ask why we were there again, and the interviewer would say it looked like someone was left off the census form since there was no roster. This caused confusion, because the respondent would state that they did fill out their census questionnaire and/or that many census people had already visited, and it also raised concern for the integrity of the census. One man actually refused the interview and cited news stories about the Census Bureau “losing data.” One interviewer said it was particularly awkward because the respondent wanted to know why we were asking only about them and not about everyone in the household. I witnessed several respondents ask this exact question. When the interviewers answered that there must be some “discrepancy” with the data, respondents looked concerned. The interviewers would have liked to give the respondents a more precise answer and felt this would have elicited better data, e.g., “We need to verify the date you moved in” or “We need to verify the address from which you moved.”


Section A:

This whole Section needs rethinking; it did not work as planned. Training may have played a part in that, but the design of this Section could be improved. First, this Section was used as a contact list, not a knowledgeable-respondent list. The Section is intended to record three respondents who were knowledgeable about the status of the HU and its occupants on Census Day, in order to code people who are fictitious and need to be removed. The interviewers I spoke with all seemed to know what a knowledgeable respondent was. When encountering inmovers, they would remark to me that of course the inmovers would not know the followup person if they had moved in after Census Day. However, they still listed them in this Section. They would fill all three lines with inmovers, whether from the same household or next door, telling me this was their procedure for coding “fictitious” people, even though they instinctively knew these were probably the people who had moved out after Census Day. One interviewer did remark that he tried to at least interview the landlord for one of his respondents. While coding these forms in the After Followup (AFU) clerical coding operation, we had an enormous number of “K” codes, which meant we could not code these people fictitious (even though the form was filled out that way) because we recognized these were not three “knowledgeable” respondents. The interviewer sometimes even left notes saying the landlord said these people had moved out. There must have been a breakdown in training, because the interviewers knew the respondents were not knowledgeable and knew the followup people were real people who had moved out after Census Day, yet every interviewer I spoke to said they were following procedures for coding “fictitious” people. One interviewer, not knowing how to record that the respondent knew the followup person (she still lived there!), just not “well enough” to answer questions, said he continued to ask until he got three people who did not know the followup person, thus making this person fictitious, so he no longer had to look for a knowledgeable respondent (his three lines were filled).


Having said that, I will also say the interviewers did their job in following the flow of the form. This Section was simply not designed to weed out the contacts from the knowledgeable respondents. When the respondents said they had never heard of the followup people, the interviewers would ask on the fly whether the respondents did in fact live here on Census Day, whether they knew who did or whether anyone lived here on Census Day, or whether perhaps they still get mail for these people. Those types of questions need to be incorporated into this Section, and only when someone is determined to be knowledgeable about Census Day residents should their name be recorded as such.


Several interviewers did not like the placement of the R type code key (top right corner). One interviewer did not know it was there until I pointed it out. It was suggested that, to be useful, it should be better placed or have an asterisk or arrow to alert the interviewer to its location.


I do not believe I ever saw item 4 used during AFU coding. I did, however, see item 3 used and interviewers then traced movers to get the information.


Section B:

There were not many comments on the content of Section B; rather, the comments concerned the omission of Section B when the interviewers thought there should be one. I think this was mostly due to a printing problem with the PFU form. It seemed that both halves of a match went to followup because both the PI person and the census person qualified for followup. So there were two Section C's (one for the PI person, one for the census person) but no Section B. The interviewers (and crew leader) thought the possible-match Section must not be printing, since these clearly were not two different people. I made a note of a few of these cases, and upon research found them to be matches, not possible matches. So in these cases, the printing problem was with Section C, not Section B, or perhaps with what was flagged for followup. I did see one Section B printed, and the interviewer remarked it was the first he had seen. It seemed to print correctly. However, I do not know whether all Section B's were printed when necessary.


Section C:

Most of the interviewers felt this Section was very redundant and should be shortened. Almost all of the alternate addresses were captured with the second question in this Section. (Did you live anywhere else in 2006?) This address was then repeated as needed throughout the Section for each address type category. A few interviewers checked the “same as” box, but most recorded the same address and all accompanying information two or three times. This really slowed down the interview and made it repetitive when the interviewer kept asking for the same information. One interviewer started adding the words “in addition to” so the respondents would not provide the same address. There were times when the respondent answered no to question 2, but then, upon being probed by a more specific question, answered yes. So, to shorten the interview, relieve respondent burden, and reduce the bulkiness of the form while still keeping all of our probes, I propose this Section be changed: leave questions 1 and 2 as is, but print and ask questions 3 through 8 only as probes. This will accomplish the same results. (It will be transparent to the respondent.) We could provide space to capture all of the information for two additional addresses (making a total of four per person). The probes and space for additional addresses could fit on one page. In the unlikely event someone has a fifth address, it could be recorded in the notes.


The dates approach seemed to work; dates were almost always provided. The interviewers did not seem to use their calendar, but after I used one during a debriefing and the respondent was able to provide a date because of it, the interviewer I was with used his calendar in every subsequent interview where the respondent hesitated over dates. He met with success in procuring dates and commented that he had not thought before that it would work. Only one person told him that looking at a calendar would not help; he just could not remember. Perhaps training should place more stress on the fact that calendars do work. This particular interviewer then picked up on respondents' verbal cues about calendars and used them in subsequent interviews, i.e., asking respondents about July 4th, etc., to try to jog their memory.


The back and forth concept was a little fuzzy for one interviewer but most of the respondents seemed to get it. It could become confusing when there were more than two addresses and/or there was a move involved at the same time that someone was cycling with another address. However, in almost all the cases I saw, the interviewer figured it out through conversation and was able to record the information. They did get the concept of trying to figure out where most of the time was spent. Notes were left in the more confusing cases.


A couple of interviewers thought the skip patterns were confusing. They thought the form was not consistent in stating skip patterns, i.e., some were explicit while the interviewer had to figure out others.

It seemed like questions 11 through 12c were not always asked according to skip patterns. 12a and 12b always seemed to get asked.


All interviewers I observed verified the spelling of the person’s name and asked for nickname.


When asked for date of birth (DOB), some respondents did not know the DOB but provided an age (or approximate age); however, there was no space to record age. We should add age to the form.


Only one interviewer said he did not mind filling out the summary and notes at the end of Section C, although he did find it redundant. The other interviewers I tagged along with or spoke to thought it was “silly,” “redundant,” or “unnecessary.” I have to say that when using the form for coding during AFU, the summary information was either redundant or, worse, confusing. When it contradicted the marked responses in the address portion of Section C, I went with the address portion. It was much more informative and explicit about how much time was spent at each address, and I knew it was filled out while talking to the respondent. In the cases I observed, the summary was filled out after the interviewer left the respondent.


Section D:

I did not observe Section D, nor did I hear any comments from interviewers about it. I suspect Section D was never printed because Smith/Jones cases were not flagged (or printed?) correctly for followup. Only the Jones (census) household went to followup.


Forms:

The notice of visit (NOV) form was left by all interviewers. One interviewer left the NOVs in the mailboxes. When I suggested this was illegal in KY (trying to protect the Census Bureau), she informed me it was OK in TX. (She also works in the police department by day.) I did notice the other interviewers took great care to strategically place the form where it would not be blown away but would still be seen, without using the convenience of the mailbox!


The calendar was only used by the one interviewer after I used it during the debriefing. I have found this to be a valuable tool (and so did he). It should be stressed in training.


The group quarters flashcard was used by one interviewer, but only at the GQ question (not at the beginning of Section C, as indicated on the form). Another interviewer showed the form to the respondent at the GQ question.


Interviewers:

Two of the interviewers seemed very adept at reading the questionnaire and gathering information through conversation when needed to clarify answers. One of the interviewers stumbled over the questions and led the respondents into answers, i.e., “You said he probably stayed here three or four months, so we’ll assume he was here April 1st, is that correct?” He also added the words “while he was staying here” when asking if the followup person had any other addresses.

One interviewer, while a good interviewer, kept interrupting my debriefings. When the respondent would give me an answer different from what he had recorded on the questionnaire, he would pipe up with “but you told me…” or “well, you said he was here…”. I had to stop two of the interviewers from leaving notes of what was said during the debriefing. I am still not sure they did not add it later.


PFU:

There seemed to be several problems with either cases getting flagged for followup or with the printing of the followup forms. I mention them because I do not know which it is:

* There were matches where both the PI person and the census person were followed up.

* The Smith (PI) halves of Smith/Jones situations did not get followed up.

* There were partial PI households that went to followup.

* There were partial household nonmatches that did not get followed up.

* Rosters printed out of order.


Interesting case:

I observed a case in followup where a man may have had an alter ego. Upon being asked about the followup person, the respondent said that was him. Then he said no, that person was his partner and he was “inside.” The interviewer asked if he could proceed to the door (we were in the driveway) to talk to that person. The respondent said there was no one else in the house; his was the only car in the driveway. Through the course of the interview, the respondent said the followup person comes and goes and it is hard to tell when he would be there. The interview was very confusing, with the respondent waffling on his answers and the interviewer, very confused, trying to make the man stick to an answer. The followup person seemed to fit the description of the respondent. I established during the debriefing that the followup person should not be counted at the sample address. However, there was a woman who needed followup at this address as well, and the interviewer was coming back the next day to get that interview (the respondent said she would be there); the interviewer mentioned he would ask her about the followup male as well. I met up with the interviewer to be present for the interview the next day, but the followup female was not there at the appointed time. I subsequently followed this case in AFU coding, and apparently the interviewer never got an interview with the followup female.



Laurie Schwede


Overview


I accompanied six different CCM PFU enumerators as they attempted to interview respondents on the Cheyenne River Reservation in South Dakota between Tuesday, January 16 and Saturday, January 20. It was bitterly cold, with temperatures hovering around 10 to 20 degrees during the day, and taking our gloves off during outdoor interviews to write on the questionnaire and take notes left us with numb fingers by the end of the interviews. It was much colder at night and nearly pitch dark in this mostly rural area by 5:30 p.m.


We attempted to contact 15 households in different parts of the reservation, some of them in vain over two or three visits at different times of the day and week. We got three completed interviews on tape and I conducted qualitative followup interviews in all three.


In one of these cases, the young woman identified for followup had traveled during 2006 to three different locations for varying lengths of time ranging up to three months, but her usual residence, Census Day residence, and preponderance of time in 2006 were at the place where we interviewed her. In the second taped case, the one young man we were supposed to follow up on had stayed with his grandparents for two to three months while working at a temporary job, but he left without warning and the grandparents simply could not recall whether this had been around Census Day or not. Despite the enumerator’s best efforts and my qualitative probing, I could not get a clear answer from the respondents to determine whether this tenuously attached man was there as of Census Day or not; all we were able to get was that the stay may have been sometime between March and August. In the third complete interview, the family of three was not found at the house described on the docuprinted form, but rather in a different house that looked nothing like the described house, several miles away.


Additionally, I was able to collect information on two more cases pertaining to who should be counted in PFU households in South Dakota around Census Day. In one (case 4), the enumerator and I had gone to the house and found no one at home, but the enumerator knew the woman who lived there and, by phone, learned that the couple we were supposed to interview did not live there, but at another place several miles away. During our trip the next day, the enumerator pointed out their house in the distance (but I did not observe the actual interview there). This was the second case where families/couples were listed in the wrong house, and both of the “wrong houses” seemed to be in the same cluster. I wonder how many other mistakes there might have been in this cluster that may not have been identified.


In the other (case 5), another enumerator had completed the PFU interview on her own at the correct household, but told the crew leader the next morning she felt she had not finished the case because she had learned that a baby and several people now in the household had not been counted in either the census or the PI. She did not know whether she was somehow supposed to add these new people to the PFU instrument, and if so, where, since the PFU questionnaire does not seem to ask questions about additions or omissions. The crew leader told her to go back to the respondent’s house and find out when the baby was born and when the other people arrived. I tagged along with the enumerator to this ad hoc followup interview and she learned the baby and the additional people had arrived in the late summer and fall, so they should not be included here. During the questions, the respondent glanced at the PFU instrument on the table, read the names on the PI roster and census roster and asked why her partner was not listed on either one. I probed then to find out about her partner’s living situation and learned he had been living there continuously since very early in 2006, when the couple first moved into this place together. He should have been included on both rosters, but was on neither. However, the PFU instrument had no designated questions or instructions on what to do if a new omission is identified. (I do not know if the enumerator included him in her notes or not). I also learned more about a child on the CCM PI roster who was not there (or even on the reservation) on Census Day, but who was not identified as a PFU followup person. Full information on these cases is included in the spreadsheet.


Persistent problems with census maps and mapspots that are likely to affect coverage


Maps and mapspots are critical components of enumeration in a rural area like this, where there are few paved roads, but most people live in houses that are scattered across the land in this very sparsely settled area.


Inaccurate census maps and unclear mapspots in the South Dakota 2006 Census Site continue to be a problem in CCM PFU, as they were during the CCM PI operation last summer. Map-related problems that I observed in PFU (and also in PI) include: cluster maps that do not have clear descriptions of boundaries; maps with no roads or features within them to triangulate the location of target households; maps not drawn to scale; and, most pervasive of all, maps with just the perimeter boundaries and just the scattered target households marked by squares, with no other reference points within the cluster to determine whether a given house or trailer is the right one or not. In looking at some of these maps with no other marked features, I would have had a lot of problems first in just finding the houses, and second, in trying to figure out which units on the ground were in scope.


In one case, an enumerator and I drove slowly and repeatedly around an enumeration area with trailers and houses, trying to figure out which two were selected for PFU interviews, based only on the physical descriptions of the units docuprinted on the forms, without any streets marked or named on the map. After a while, the enumerator had to leave due to a personal obligation, and we gave up without deciding where the units were.


This map issue has been a big, persistent concern of the enumerators. Several enumerators told me that they had worked on previous 2006 Census Test operations at this site. In each operation, they had identified problems with maps and had asked if they could write the corrections on the maps, but the trainers said no. Changes were rarely, if ever, made for subsequent operations. One enumerator estimated that about 75 percent of the maps he had seen had significant errors in them. As an example, another enumerator told me that in a previous operation, in one cluster, all of the mapspot numbers were bunched together near a landmark and the enumerator had a very difficult time figuring out which units should be included.


This enumerator said that when training started on this final operation (CCM PFU), the first question the returning enumerators asked was whether the maps had been corrected. When told they had not, the trainees all groaned. He told me a rumor was going around among the enumerators that Census Bureau headquarters (HQ) staff had deliberately left the maps with errors just to see how the Indians would handle the problems. I assured him that HQ mapping staff was working in good faith to produce good maps and I could not imagine that anyone in HQ would deliberately produce faulty maps as a test to see how Indian enumerators would react to problems. I told him there had to be some other explanation for the mapping problem, maybe something associated with the costs of making changes to maps, or some communication problems with the map issue not having been accepted as a problem that needed fixing. The fact that he would even tell me some enumerators suspected someone in HQ deliberately created mistakes to test them suggests that the map problem has been a sore spot for enumerators in this test site for a long time. Had their changes to the maps been accepted earlier and maps modified, the PI and the Census would likely have had fewer differences and the PFU operation might have been done more quickly, with fewer errors to follow up on.

Lack of fit between census addresses and the addresses people use on this reservation


There is also a lack of fit between the official addresses we have from census canvassing and the types of addresses people actually use on this reservation. This was clearly evident in this observation trip, as it was in my CCM PI trip last summer.


In two of the three PFU interviews I observed, the actual street or road name address docuprinted on the form was unknown to the respondents at those addresses and led them to answer the critical question C.1 differently than the questionnaire designers intended. This sent them down a longer questionnaire path. For example, in one of these cases, the actual target person listed on the PFU form was at home at the time of our visit and became our respondent. When the enumerator asked her question C.1, “Did you live or stay at 10 Banyan Street anytime in 2006?” she said, “No, I’ve never heard of that address.” The “no” answer skipped her into questions 1b through 1f, which asked at what address she lived in 2006. She answered PO Box xxxx, which did not provide us any location-specific information, and took us through more questions about the type of place, the names of others who lived at this address, and the names of neighbors before reaching the calendar question, which would have been her second question had she answered “yes” to C.1.


At the end of the interview, I asked about this and the respondent said that the address the enumerator read off to her could be the official address, but she had never heard what that official address is. Like most others on this reservation, she has always used just her post office box number, never any street address. The “no” answer she gave was the result of hearing an address she never heard of, not the result of her having some other residence elsewhere. The house where we interviewed her is definitely where she lives, so the answer to our intended C1 question should have been “yes.” By answering “no” she was thus sent down the path that tried to collect another address for another place and had to answer extra questions that should not have been necessary.


In the second interview I observed, we had the same problem with question C.1, as well as problems with a faulty description of the house that was likely related to the mapspotting problems mentioned above. The docuprinted address was “County Road 999, blue house with white split-rail fence.” We drove on this road and found a house that matched the physical description perfectly, but a placard showed that a family with a different name lived there. No one answered, so we left. We continued driving several miles down the road and came to another house that looked completely different; we would have passed it by but for the fact that we saw a sign in front bearing the same last name as the respondents we were looking for. We went to the house, verified the last name of the residents, and the enumerator started the interview. Knowing the physical description of the property was wrong, the enumerator truncated the docuprinted address when he read question C.1: “Did you live or stay at County Road 999 anytime in 2006?” The respondent said he had never heard of this road, and when asked 1b for a new address, he just gave a post office box number. As in the case described above, the enumerator asked the additional questions, and during the rest of the interview it became apparent that the respondents at this house very rarely went anywhere else. They clearly lived here. Again, they would have answered “yes” but for the address vs. post office box problem, which resulted in a “no” answer and a series of unnecessary questions that made it appear they had a second residence. They do not.


At the end of this interview, I asked how long the respondents had lived in this house on this road, and they said more than 30 years! The respondent told me that they just have P.O. Box numbers where they receive their mail. Yet later, in the car, looking at the census maps, I saw that County Road 999 was identified and marked, and it actually formed one of the boundaries of this cluster. However, the space within this very isolated cluster on the map was almost entirely grey, with a few lines on it but no description of what the lines indicated. The only HUs marked on the cluster map were the four or five in the sample, so without identifiable landmarks or any other houses by which one could calibrate where one was, the map was very vague and it would be easy to make mistakes, like the mistake of placing this family in a house with a very different physical description several miles away.


Later, one enumerator advised me to check the local telephone directory, which includes maps with the names and locations of telephone subscribers. I did check this directory, and the county road names in this cluster and others are clearly marked there. Yet very few people who live there have ever used these official addresses. Further, in some cases, people have local names for the roads based on original settlers' names rather than “County Road xxx,” but when enumerators asked the RO (Regional Office) trainer which address to use, the trainer firmly told them to use the official names.


After this interview, the enumerator told me he had lived his whole life in a town on this reservation and it was not until he started working as a Census Bureau enumerator that he learned that there were street names and street numbers associated with houses. A crew leader told me that when she was hired for this operation, the Census Office had already closed, but she had to be able to send and receive mail from the RO by FEDEX. Not knowing the street number or street name of the house in which she’d lived for three years, she had to go to the sheriff to get this information, and then had to affix numbers on her house showing the street number, and then give FEDEX directions to reach her house!


The address issue manifests itself in another part of the CCM interviews too, when the respondent mentions someone stayed in another place and we ask for the address of the other place, or about landmarks and other distinguishing features. Most respondents answer this question with just a general description of the location, such as near highway 212 (which runs east-west across the entire state), or several streets away, or in the country several miles from a certain town. To them, these answers are sufficient, for they do not know or use the precise street addresses, and they could find these people, if it became necessary to do so. But for us, these general answers do not provide enough detail for analysts to determine precisely where these places are. We may want to get some of these forms sent here and go through them to see how useable the addresses of alternate places are in determining where people should be counted.


These issues of faulty maps, vague mapspots, use of official street names that do not match how people think about addresses and directions, and inaccurate house descriptions are core issues on this reservation that I believe are contributing to coverage errors. There is evidence that these types of problems also occur on other Indian reservations and in Native Alaska areas (see Tongue 2006 and Craver 2006), and possibly in rural areas in general.


These are relatively new sets of factors that, to my knowledge, we have not explored in any depth in past coverage studies. I strongly suggest that we need more research on factors affecting enumeration procedures and coverage in rural areas and those lacking city-style addresses.

One of the enumerators told me that using the hand-held computers with GPS worked very well for him and he really hoped we would use this for the 2010 census, as it would allow him to specify exactly where each HU is located.


This does not solve the wording problem of C.1, though, where a “no” answer given because of an unrecognized formal address may send a respondent who usually lives at the same place through questions designed to identify another residence when there is none.


How could this problem be fixed in a way that would not adversely affect enumeration of city-style addresses? I offer a brainstorm suggestion on this in the final section on specific questions.


The “amount of time” questions did not always work on the reservation


Trying to specify times of moves and/or start and end times for back-and-forth movements was difficult. In the first observed PFU interview, the woman reported several stays at other places during the year. As the interview unfolded, I noticed that she gave some overlapping dates for trips to a training session elsewhere in the state, a trip to the coast, and a trip to a western state to stay with relatives, but she did not seem to be aware of the anomalies. Originally, the trip to stay with relatives seemed to be from June or so to November, which would have been about six months, but I learned during probing that she was at the training session in June and July, then went to the coast for a week, then went to stay with the relatives for four months, then returned to her home where we interviewed her.


In the second PFU case, the grandparents simply could not remember when their grandson stayed with them, despite all the questions in the PFU and the enumerator’s and my various attempts to help them recall this. Their grandson was a tenuously attached person, who had just arrived to take a short-term job with a relative and then suddenly left. The respondents said this young man moves around a lot, mostly with his mother, and that during 2006 he had been living north of the reservation with her, and also in another state for a while. The grandparents did not keep in regular touch and did not know for sure where he was during the time he was away.


Here is something that the crew leader told me should be kept in mind. The Sioux people on this reservation are Plains Indians who traditionally were nomadic people who moved around on horses following the buffalos across the plains and set up teepees that could be picked up and moved often. While those times are long gone and people no longer live in teepees, there is still a lot of mobility in and out of households and on and off the reservation. I noted in my PI trip report that there were a number of tenuously housed individuals identified in the PI interviews I happened to observe, and that several of the enumerators themselves were tenuously attached. When I returned to the reservation in January, I asked about the enumerators I had met in July. In the six months I had been gone, two of the young enumerators had moved off the reservation and one more was planning to leave in two weeks. Moving in and out of households is taken for granted on this reservation. People stay for varying lengths of time. Specific dates that people arrive and leave are not typically remembered or recorded, so we may not be able to get this information from most respondents on this reservation or others like it.


Our emphasis on trying to establish precise dates of back and forth movements is somewhat at odds with this pattern of reservation living. One enumerator said that, with the frequent back and forth moves, it was difficult for the enumerators to try to answer the question as to whether more than half of the year had been spent here or elsewhere. More research is needed on how to improve enumeration in areas of high mobility and fluid households.


The effects of not having a Census Office during this operation


CCM PFU is the final operation of the 2006 Census Test, but the Local Census Office was closed before this operation started. The crew leaders and enumerators, many of whom had worked on CCM PI and previous operations related to the 2006 Census Test, seemed dispirited by this, and it made meetings more difficult. Also, only two crew leaders had been selected for this operation, while more than that had hoped to get these positions. About 17 people were selected as enumerators, but only about 150 cases would be assigned to South Dakota, so on average there would not be many cases for each enumerator to do.


One of the crew leaders made arrangements to have her daily meetings with enumerators in a local restaurant, at a large table in a second room away from the main dining room. This enabled her to go over each of the completed forms, and if something were not complete, she would send the enumerator out to get the rest of the information. By attending these meetings, I negotiated to go out with enumerators to observe interviews, and also learned about cases that I would otherwise have not known about, including case 5 (described elsewhere). During the time I was there, five of her enumerators came to these meetings.


The other crew leader had not made arrangements for an indoor place to meet, and met his enumerators very briefly outside public buildings in the very cold air to give assignments and collect completed questionnaires. I only saw two of his enumerators in the two days I met with his team. He told me that some of the enumerators hired and assigned to him turned out not to have cars available; this was hindering his group from getting interviews completed. In this rural area, not having a vehicle severely restricts how many cases an interviewer can get done. Having access to a car should be nearly a requirement for being hired as an enumerator in a rural area like this.


At the time of my observation, the operation had just recently started, enumerators were not fully comfortable with the questionnaire, and cases were coming in from the RO quite slowly. As a result, most of the enumerators only had two or three cases total to work on at any one time. They were trying to get the interviews, but it also seemed harder to find people at home in January than it did last July. It was also winter and bitterly cold, and it got dark early. It seemed much more intrusive to be going to people’s houses unannounced in the dark, and this was rarely done. For all of these reasons, it seemed to take much longer to find people at home.


Exclusive reliance on cold-call personal visit interviews does not seem efficient or cost effective


At the time I was there, almost all of the interviews were being attempted as unannounced personal visits, and most of the time no one was home at the sample households.


My own experiences with two enumerators can show how time consuming and frustrating this is on a vast rural reservation.


In the first case, the enumerator and I drove for more than an hour on Tuesday to get to another town where our two sample households were located on the same street. No one was home at either house. As we sat in the car talking, we noticed teenagers come around the corner and enter each of the houses. We talked with each of them and were told that their parents were not going to be home for hours, but would be home the next afternoon. One of the teenagers said she’d have her mother call the enumerator, but the mother would definitely be home the following day. We thought we had these interviews pretty much scheduled for the next day and drove the 40 miles back to town where the enumerator dropped me off at the hotel.


The following day we waited until later in the afternoon to drive the 40+ miles back to the other town. We saw a blue car parked in the first driveway, but started at the other house and found no one was home. By the time we started walking to the first house, the blue car had pulled out of the driveway slowly and was moving down the road away from us. When we knocked at the door, a teenage boy answered and said his parents had gone to a social event and would not be back until later. We walked back to our car and saw the blue car pass slowly by the house again and go around the corner. We decided to go to another part of town and wait for half an hour. When we returned, the blue car was again parked in the driveway and music could be heard from inside the house. After our third knock, we heard a male voice say that someone was at the door, but no one came to open it; through a window, we could see the back of the head of one person sitting in the living room. We did not know if this was an adult or just a friend of the teenager’s, and we finally gave up and drove the 40 miles back to the main town. On the following day, the enumerator traveled up there once more, in vain. The case was then reassigned to another enumerator who lived up in that area and who might have more luck.


All in all, over three days, the enumerator drove nearly 240 miles and he and I clocked more than 11 hours of work time without getting these interviews completed.


In the second case, another enumerator knew the family at the sample HU. I also knew this household, because I had conducted a taped PI interview there last summer and identified a number of omitted persons there. This household was the only one assigned to the enumerator at the time, and it was a 45 minute drive from my hotel. We tried going in the afternoon, but no one was home. We made plans to go again the following night at 7 p.m. and I arrived ½ hour early at the spot where I was to meet the enumerator. She was not there yet, so I drove past the house in the dark, and saw an older woman walking from the house to the driveway and getting into the back door of an idling car. The car then pulled away and I went back to the rendezvous place. About 10 minutes later, the enumerator arrived and we drove back to the house. No one was home. The enumerator knew where a relative of one of the residents lived, told me she was authorized to go to that relative’s house, and asked if we wanted to go there now. I said, “Sure,” and the enumerator then drove her truck off the road at an unpaved horse crossing spot, exited the vehicle to take down a rope fence, got back in, drove through, and got out again to replace the rope. She then drove in the back country over a very rutted, two-track road for a while, then left that and just seemed to drive across unmarked land for about five minutes, in the pitch black darkness with only the headlights to provide light. It was surreal. We approached a trailer in the middle of nowhere, with dogs emerging from under the trailer to approach the vehicle. She went to the door and knocked, but no one was home here either, so she dropped me off at my car and I drove back to the hotel. The two trips with this enumerator to try to get just one interview ate up about 5-6 hours of enumerator (and observer) time, without resulting in a completed interview.
The following morning I heard through the reservation grapevine that an emergency had come up and the woman had to leave on very short notice.


Because of the many hours the two enumerators and I spent trying in vain to find these three distant respondents at home, I asked the crew leader in one of the morning meetings whether it was permissible for enumerators to use the respondent phone numbers on the form to call ahead and set up a time for personal visit interviews. Neither she nor any of the enumerators could recall anything in CCM PFU training that said they could use the respondent phone numbers docuprinted on the form to set up appointments, so they were not doing this.


After hearing about these cases, the crew leader did think it could be a good idea to call ahead to make an appointment and explain the purpose of the visit. It was at this same time that the RO coordinator was calling the crew leaders to ask why so few completed interviews had been turned in.


All of this suggests that it could be more efficient and cost effective, as well as more respectful of potential respondents’ privacy and time, to use the phone numbers printed on the form to call potential respondents, explain what we are doing, and ask to set up a time convenient to them to do the interview. It is true that advance phone calls might lead to firm refusals, and it might be harder for respondents to refuse when we just show up unannounced, but there may be a tipping point at which it makes sense to call first.


Recommendations: Consider adding a statement or two to the training materials telling enumerators they can use those docuprinted phone numbers to call people to schedule appointments for in-person interviews. Also, consider doing research on the tradeoffs for response rates, coverage, costs, and time of doing cold-call visits or pre-arranged visits with advance phone call contacts.


Specific questions on the form:


1. Earlier I mentioned problems with question C.1, which asks “Did you live or stay at [docuprinted address] anytime in 2006?” The problem was that if the respondent had not heard of the official address identified in the question, they would answer no, even though they had lived in this place for most of the year. The “no” answer would take them through questions 1b to 1e which would seek to identify the address of another residence.


Just as a brainstorm suggestion, maybe the wording of C1 could be broken into two questions. The first could ask something like, “Did you live at this (house, apartment or mobile home) anytime in 2006?” If the respondent says yes, then we could ask the followup question on the address: Is xxxx Banyan Street the correct address for this (house/apartment/mobile home)? Splitting up the question in this way would enable people on the reservation to answer yes to living here, but no to the address. The “no” answer could take them to a question to get the address (even if it is just a PO Box number), then skip them to the calendar question. A “yes” answer could also lead to verifying the address, then skipping to the calendar question. This could work if the interview is being conducted at the HU itself, but may not work if the personal interview is conducted away from the target HU. I think this might be a way to do it without causing problems for the majority of people with city-style addresses.
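This brainstorm split can be sketched as simple routing logic. Everything below is hypothetical: the function name, the return values, and the exact skip targets are illustrative assumptions for discussion, not the actual PFU instrument.

```python
# Hypothetical routing sketch for the proposed two-question split of C.1.
# Question wording and skip targets follow the brainstorm above; names
# are made up for illustration.

def route_c1(lived_here, address_matches, docuprinted_address,
             reported_address=None):
    """Return (next routing step, address to record).

    lived_here      -- "Did you live at this (house, apartment, or
                        mobile home) anytime in 2006?"
    address_matches -- "Is [docuprinted address] the correct address
                        for this place?"
    """
    if not lived_here:
        # Genuinely lived elsewhere: route to the other-residence items.
        return ("questions 1b-1e", None)
    if address_matches:
        return ("calendar question", docuprinted_address)
    # Lived here but does not recognize the official address: record
    # whatever address the respondent uses (even a P.O. Box) and continue.
    return ("calendar question", reported_address)

# A reservation respondent who lives at the unit but has never used the
# official county-road address no longer falls into the other-residence path:
print(route_c1(True, False, "xxxx Banyan Street", "P.O. Box 152"))
# → ('calendar question', 'P.O. Box 152')
```

The point of the sketch is that the “lived here” judgment and the address verification become independent answers, so an unrecognized official address can no longer misroute a year-round resident.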


2. One of the enumerators specifically identified a question enumerators had difficulty with: 8f.

Other versions of the same question include 3e, 5f, 6f, and 7f. He said that enumerators differed in their opinions as to whether the downward pointing skip arrow after “Back and Forth” applied just to “Back and Forth” or to “Back and Forth” as well as “Moved.” The trainer told them it was just for “Back and Forth” but some enumerators may not have been convinced. Adding a skip instruction just after the “Moved” box could alleviate this problem. Whatever fix is made here would need to be made to other versions of this same question.


3. This question 8f (and those like it) is also in an unconventional format that we typically try to avoid when designing questions: a multiple-barreled item that asks several questions within one numbered item. The numbering of the response categories is strange and unlike other questions on this instrument: response 1 = moved, 2 = back and forth. Since that is the end of the type-of-move question, a new item number and numbering system would be expected for “Situation,” but instead the categories simply continue: 3, 4, and 7 cover proportion of time over the year, while 8 asks about certain days of the week and 9 asks about daytime only (part of the day). Shifting the reference period within a question is not often done. Additionally, yet another question is embedded under response category 4: if respondents answer category 4, half of the time, they are supposed to mark either 5 or 6 for being there or not on April 1. This means that for one numbered multi-barreled question (8f), respondents may mark anywhere from one to five response categories (for example: 2, back and forth; 4, half of the time; 5, yes; 8, certain days of the week, with those sub-response categories marked; and 9, daytime only), or maybe just one (moved). This multiple reporting could lead to errors.


The reason for this unconventional formatting is likely to be space limitations on the page, but I think this question is far too important to try to squeeze multiple questions together and risk errors. These questions, after all, are critical ones in determining whether a person should be counted at this address or at another address in the census and in the CCM.


One absolutely critical question is whether the respondent was at this address on Census Day; this is a central question in the census residence rules. It needs to be formatted as a stand-alone question that everyone answers. This is part of the crux of the operation; these questions should be formatted separately to have the maximum potential of getting accurate information.


I strongly recommend separating out the multiple embedded questions and reformatting the page to accommodate separate questions.
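To make the recommendation concrete, the mixed categories of 8f might be regrouped into single-purpose items along the following lines. This is a sketch only: the category numbers and groupings follow the description above, but the exact category wording and the new item letters (8g, 8h, 8i) are hypothetical.

```python
# The current multi-barreled item packs all of these categories under one
# number, 8f. Category numbers follow the description in the text above;
# exact labels are approximations.
CURRENT_8F = {
    1: "Moved",
    2: "Back and forth",            # skip arrow intended for this answer only
    3: "Most of the time",
    4: "Half of the time",          # embeds the April 1 follow-up (5/6)
    5: "Yes, there on April 1",
    6: "No, not there on April 1",
    7: "(Small) proportion of the time",
    8: "Certain days of the week",
    9: "Daytime only (part of the day)",
}

# A hypothetical regrouping into stand-alone items, each with its own
# number and its own response categories. Note the Census Day question
# becomes a separate item that everyone answers.
SEPARATED = {
    "8f (type of move)":       ["Moved", "Back and forth"],
    "8g (proportion of year)": ["Most of the time", "Half of the time",
                                "(Small) proportion of the time"],
    "8h (pattern of stays)":   ["Certain days of the week",
                                "Daytime only (part of the day)"],
    "8i (Census Day)":         ["Yes, there on April 1",
                                "No, not there on April 1"],
}

# Every category from the multi-barreled item survives the split:
flat = [c for cats in SEPARATED.values() for c in cats]
assert sorted(flat) == sorted(CURRENT_8F.values())
```

Under a split like this, each item collects exactly one mark, so the one-to-five-marks ambiguity of the current format disappears.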


Questions 1g and 1h do separate out the request for dates during the year (1g) from the situation (1h), so this is a start on disaggregating. However, the formatting of 1h differs from that in 8f and the other items. When revisions are made to the question, all variations of the questions should have the same format.


4. Also on this question, I saw interviewers stumble on the wording of response categories 1 and 2, saying “moved” or “moved back and forth” (inserting moved in front of back and forth). The dual use of the word “moved” confused respondents. I suggest rewording these as “moved out” and “go back and forth” to clearly distinguish them.


5. Questions 10-12c are intended to get the final answer on where the person should be counted. As currently formulated, 10 asks the enumerator to determine how many addresses the person had in 2006. If there was just one, the person is skipped to question 11 which asks, “Just to confirm, you lived or stayed at only one address during 2006. Is that correct?” If the respondent agrees with this, he/she is not asked the central questions used to establish where they should be counted: 12a, b and c asking where they lived and slept most of the time around April 1 and around PI day, and the pattern of movement. If they had listed more than one address, these questions 12a, b and c would be asked.


A problem occurred in South Dakota interviews because respondents did not really know or offer street addresses. They often gave only a P.O. Box number, and for any other place where the person stayed during the year, they tended to give just vague descriptions that were of little use to us. In the first interview, with the woman who lived at her house but had stayed at three other places through the year, she gave no specific addresses for the other places. At question 10, the interviewer thus thought she had only one address, and confirmed that with her in question 11. He followed the skip over the critical questions 12a, 12b, and 12c that she should have been asked, to definitively show on the PFU form where she should be counted. This is not really an outlier case that can be dismissed. I recall complicated cases like this in the PI operation, too, where the qualitative followup interview was definitely needed to straighten out the respondent’s complicated patterns of movement and living situation.


In reading through the Austin PFU trip reports, I note that a number of colleagues who observed during PFU said that enumerators were missing question 10 and sometimes asking all of these questions without skipping. Some of them are recommending that the skip patterns be improved so that 12a, 12b, and 12c are asked less.


I would make the opposite recommendation. Delete the check in item 10 and delete the verification of just one address in 11.


The purpose of this is to make sure that every person is asked to answer 12a, 12b and 12c.

Why? As a long-term member of Ed Byerly’s 2010 Census Residence Rules Working Group, I know that question 12a is almost an exact embodiment of the central residence rule we reworked and reworded for the 2010 decennial census. This question asks “Around April 1, 2006, where did you live and sleep most of the time?” Because it encapsulates the residence rule, and one central purpose of this research project is to determine Census Day residence, this should be the central question in CCM. It should be asked for every person, regardless of whether the person stayed in just one place or more than one place during the year, even at the risk of respondents finding this to be redundant. The other central residence rule question that needs to be asked of everyone is “Were you living or staying at this place on April 1, 2006?” The answer to these two questions, plus (in some cases) where the person was staying on April 1, if not here, would be the keys to determining each person’s usual residence in the census.
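The routing change being recommended can be sketched as follows. This is an assumed rendering of the item flow for discussion, not the actual form logic; item labels follow the description above.

```python
def items_to_ask(address_count, revised=False):
    """Which items a respondent receives in the final residence section.

    Current form: item 10 counts the person's 2006 addresses; a single
    address routes to the confirmation item 11 and skips 12a-12c.
    The recommended revision drops the item-10 check and the item-11
    confirmation so that every person answers 12a, 12b, and 12c.
    """
    if revised:
        return ["12a", "12b", "12c"]   # asked of everyone
    if address_count == 1:
        return ["10", "11"]            # central questions skipped
    return ["10", "12a", "12b", "12c"]

# The one-address respondent who actually stayed at several vaguely
# described places is no longer skipped past the central questions:
print(items_to_ask(1))                 # → ['10', '11']
print(items_to_ask(1, revised=True))   # → ['12a', '12b', '12c']
```

The cost of the revision is some redundancy for true single-address respondents; the benefit is that the Census Day residence questions are never skipped on the strength of an address count that, as the South Dakota cases show, can be wrong.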


References Cited:


Adams, T. and Nichols, E. (2007). “2006 Census Test Evaluation #5: Census Day Residence” 2006 Census Test Memo #67. U.S. Census Bureau.


Craver, A. (2006). “Household Adaptive Strategies Among the Inupiat,” in Complex Ethnic Households in America, edited by L. Schwede, R. L. Blumberg, and A. Y. Chan. Lanham, MD: Rowman & Littlefield Publishers, Inc.

Tongue, N. (2006). “I Live Here and I Stay There: Navajo Perceptions of Households on the Reservation,” in Complex Ethnic Households in America, edited by L. Schwede, R. L. Blumberg, and A. Y. Chan. Lanham, MD: Rowman & Littlefield Publishers, Inc.


Attachment


cc: DSSD CCM Contacts List








1 For purposes of confidentiality, I am not presenting the real addresses of respondents. The made-up address here is structured to show the city-style address format that was docuprinted on the PFU form.

2 The real location and description of this house is confidential. This description is an example of the format of the location/description in this case.

