Census Coverage Measurement 2010 Person Followup and Person Followup Reinterview Operations and Respondent Debriefings

OMB: 0607-0961





November 10, 2009


DSSD 2010 CENSUS COVERAGE MEASUREMENT MEMORANDUM SERIES #2008-D7-14


MEMORANDUM FOR: Magdalena Ramos

Co-chair, Census Coverage Measurement Operational

Integration Team


Patrick Cantwell

Co-chair, Census Coverage Measurement Operational

Integration Team


From: Gia Donnalley Signed November 10, 2009

Chief, Coverage Measurement Design for Data Collection Operations Branch

Decennial Statistical Studies Division


Prepared by: Patricia Sanchez and Travis Pape

Coverage Measurement Design for Data Collection Operations Branch

Decennial Statistical Studies Division


Subject: Lessons Learned from the 2009 Census Coverage Measurement Person Followup Operational Test


This memorandum documents the lessons learned from the 2009 Census Coverage Measurement (CCM) Person Followup operational test based on trip report summaries and a facilitated discussion with the participants in the test held on April 2, 2009.


If you need further information, contact Patricia Sanchez on 301-763-9268 or Travis Pape on 301-763-5744.


Attachments


cc: DSSD CCM Contacts List




Lessons Learned from the Activities Related to the 2009 Census Coverage Measurement Person Followup Operational Test

  1. Background


1.1 General

Due to budget shortfalls, the CCM Person Followup (PFU) operation was one of the CCM activities dropped from the 2008 Census Dress Rehearsal. Although the PFU form and operation were tested in the 2006 Census Test, findings revealed that changes to the PFU instrument and training were needed prior to 2010. To assess these changes and determine whether any further changes are required for the 2010 Census PFU questionnaire, the Decennial Statistical Studies Division (DSSD) conducted a reduced-scope field test in March 2009.


The PFU interviewing phase of the CCM program involves the follow-up of persons to resolve inconsistent information between data from the CCM Person Interview (PI) and from the census. The CCM PI is designed to collect information about the members of each household selected for interviewing and about other places where each member lived or stayed around Census Day and around CCM PI Interview Day. After the CCM PI operation is completed, person matching is conducted between data from the CCM PI and data from the census. This involves a computer match, followed by a clerical match. Cases are targeted for a PFU interview when information is insufficient to determine residence, match, or duplication status. The completed PFU forms are reviewed and used to code a person’s final status.


The cases sent to PFU include people whose residence status is unresolved, CCM people who are possible matches to census people, possible duplicates in either the CCM or the census, and CCM and census nonmatches. The interviewer must collect as much address and dates-of-stay information as possible for each follow-up person in order to resolve each person’s status. Since the 2010 PFU will be conducted about ten months after Census Day, some of the follow-up people could have moved, and it may be difficult to find knowledgeable respondents. A person is considered a knowledgeable respondent if he or she knows the follow-up person well enough to answer questions about where the follow-up person was living during the year. If no such respondent can be found, the interviewer must contact three people who would have known the follow-up person had that person lived in the area before the follow-up person can be classified as “fictitious” (that is, as “did not exist in the area”).


1.2 Operational Test Design and Implementation

The main goal of this operational test was to test changes to Section A – Introduction, which includes questions to help the interviewer identify either a knowledgeable respondent who knows the follow-up person or three knowledgeable respondents who can verify that the follow-up person did not exist. In addition, we tested minor layout changes to the Person Questions in Section C.


To conduct this test, we originally sampled 250 households from the recently expired sample of the Current Population Survey (CPS). We targeted people who had recently moved into or out of the CPS sample households. Due to insufficient name data and safety issues, only 184 cases were sent to the field (143 in Washington, DC, and Fairfax County, VA; 41 in Louisville, KY).


The interviewers for this test were nine regular Census Headquarters and National Processing Center (NPC) employees who were not familiar with the PFU operation. Each interviewer was paired with an observer for each day of interviewing. The 11 observers were also from Headquarters and NPC, and the majority of them had participated in the development of the PFU form.


The PFU questionnaire is printed by Docuprint, which allows each form to be customized to each case: different pages can be included for different types of cases, and individual questions can be tailored, such as by including the follow-up person’s name in the text of a question.


1.3 Summary of Questionnaire Changes

Table A provides a summary of the different sections of the 2006 and 2008 PFU forms and an indication of what was changed.


Table A: Changes between the 2006 and 2008 Person Follow-up Forms

Front cover (contains the Census and PI rosters):
- Included “Possible Reasons for Follow-up”
- Added more note space
- Eliminated the third column (which was not used in 2006)
- Added a Miscellaneous Question indicator to alert interviewers

Section A (Introduction):
- Major changes
- More information collected on respondents contacted
- Attempts to define a knowledgeable respondent (KR)

Section B (Possible match):
- No change

Section C (Person Questions):
- Minor skip sequence changes, trying to highlight shortcuts
- Collapsed questions for alternate addresses onto one page
- Some questions eliminated

Section D (April 1 occupancy):
- No changes

Section E (Geocoding):
- Was not tested in 2006
- 2008 uses the basic 2000 design

Back page (Record of Visits):
- Minor changes

Introductory letter/flashcard/calendar:
- Major changes. In 2006, we had a separate letter, flashcard, and calendar. For 2008/2010, we propose a combined introductory letter/flashcard/calendar: two separate color-coded 8.5 x 11 forms (one English, one Spanish), with the letter on one side and the calendar and List A – Places that House Groups of People on the other.

  2. Summary of Observations About the Questionnaire


This section compiles the lessons learned and suggested improvements for the PFU form based on the reduced-scope PFU operational test.


A. Cover Page:

There was a suggestion to mark which people on the rosters will be followed up by using an asterisk (*).


B. Section A:

  1. Introduction – While some interviewers memorized the introduction as suggested in training, others used their own interpretation of it. There were problems using the introduction with proxies, and the wording was hard to see in its current location. The Information Sheet, which included the introductory letter, was not always handed out during the introduction, though when forgotten it was usually handed out later in the interview. There were suggestions to create a reference card, including the introduction among other things, that interviewers could pull out when they had questions. We considered moving the introduction to the cover page.

  2. Questions 1-1c (Identifying a respondent who knows the follow-up person) – There were many instances where these questions were not asked as worded or not asked at all. Some checkboxes were not marked correctly, and skip patterns were not followed correctly. There were difficulties with what to mark for respondents who have heard of the follow-up person but do not know them well enough to answer questions. Some interviewers expressed difficulty with seemingly offensive questions, such as asking whether the respondent knew the follow-up person well enough when the respondent had already spontaneously mentioned that the follow-up person was her husband. Rules for erasing data when using this section for multiple respondents were not made clear, and reusing questions for multiple respondents was confusing even when not erasing. Suggestions were made to extend the notes space and reorder the response options in Question 1 (Q1).

  3. Questions 2-2b (Identifying three knowledgeable respondents who do not know the follow-up person) – There were instances of skip pattern problems, not reading questions as worded, and entering data in Q2 incorrectly. In Q2, respondent names were being recorded when they should not have been, and useful information was not being recorded in the notes area. There was some discussion about moving these questions earlier in the interview and asking about knowledge of the follow-up person’s address before asking if the respondent knows the follow-up person.

  4. Noninterview Assessment (This allows interviewers to provide a reason why they were unable to complete an interview for this follow-up person) – There was difficulty determining the relationship between the Noninterview Assessment and the Final and Person Level Outcome codes. The location of the Noninterview Assessment was inconvenient and easily missed; interviewers may have skipped this section because it looked like a ‘For Office Use Only’ area.

  5. Other Section A Comments – There were questions about the following: what to do when talking to a respondent who is not knowledgeable (when to collect name), when to erase, and where to collect additional information about the follow-up person. There was a request for more notes space on this page. The interviewers had trouble understanding the concept of a knowledgeable respondent. There were also navigation difficulties in this section.


C. Section C:

  1. Questions 1-1j (Questions about the sample address or other place the follow-up person lived) – There was a problem with interviewers not referring to the year in Question 1g, since it was not explicitly mentioned in the question; this resulted in different data than intended. There was discussion about adding formatting (e.g., MM/DD/YYYY) to the answer space of the date questions (such as 1g). Additional problems included confusion caused by the arrows in Question 1, interviewers skipping over 1h and 1i, and interviewers not using the verification technique correctly. There was a suggestion to add an “All year” checkbox for Q1g.

  2. Questions 2-2j (Other place the follow-up person lived during the year) – There was confusion with the moved “before/after Census Day” and “before/after PI Interview Day” checkboxes because the second row of checkboxes was often redundant. Interviewers also had trouble following the skip patterns for the “Same as” checkboxes.

  3. Questions 3-3h (College address) – These questions are fine as is.

  4. Questions 4-8, A1-I2 (Alternate addresses) – There were instances of interviewers skipping questions. There were questions about whether to record short hotel stays as alternate addresses.

  5. Questions 9-9e (Group Quarters address) – Many people found the wording of Q9 to be awkward. There was a suggestion to add an introduction to this question to make it easier to understand. There was discussion about changing the words used to refer to the list of places that house groups of people. There were several suggestions of wording to replace “the list.” This wording needs further discussion. There was a discussion about how question 9e (and similar questions about April 1st) is difficult for respondents to answer (Were you/Was he/she there on Tuesday, April 1st?). A decision was made to leave 9e as is, since it gathers the exact data that is needed for Residence Status coding.

  6. Questions 10-10a (Respondent information) – Interviewers had problems with the skip instructions in this section, especially when the respondent was the follow-up person. There were instances of interviewers incorrectly entering the names of people who were not knowledgeable respondents in this section. Many interviewers filled out this section after the interview, which may have led to missing or incorrect data. There were suggestions to reorder these questions to make the section flow more easily.

  7. Question 10b (Miscellaneous Questions) – There were no problems with the Miscellaneous Questions seen in the field. In at least one instance, the indicator that we added on the front cover seemed to help the interviewer remember to look for these. There was a request that these questions be printed in BOLD since they are usually questions to be read to the respondent.

  8. Other Section C comments – Some interviewers missed the instruction to probe for street, city, and state when collecting alternate addresses. There were instances of interviewers not reading questions as worded, forgetting to ask questions, not verifying properly, and apologizing to respondents for questions that seemed repetitive. There was confusion about repeating an address that had already been given; the clerical matchers have told us that they want interviewers to record an address as many times as applicable, so this needs to be made clearer during training. Some interviewers had difficulty when multiple addresses were given in one question. Interviewers consistently missed the phrase “Now, we’ll talk about Person’s Name” at the beginning of each Section C. Some interviewers had trouble recording answers from multiple respondents on the same question. Interviewers thought that the form had too many pages but that the arrows and skip instructions were helpful; there were suggestions to add stop signs to the form along with the skip instructions. The group discussed a suggestion to give respondents a list of the address types that would be asked about during the interview to prepare them. While this was an interesting suggestion, the majority of the group thought that it would not be beneficial because it would lead to shortcutting questions.


D. Record of Visits:

The location of the Record of Visits on the back of the form proved to be awkward for some interviewers. Some interviewers did not complete an entry in the Record of Visits for every visit attempt, and there was some difficulty completing the respondent classification and visit outcome code. There were instances of interviewers filling out the Record of Visits in the car after the interview, which led to some missing information about the visit. There were questions about when interviewers should complete the Record of Visits, what to do if there are more than 10 visits, where to write notes, what is considered a “visit,” how complete the addresses should be, and how much space is really needed (some thought the current space allotted was not sufficient). There was a suggestion to print the sample address on the Record of Visits page to make it easier to record the full address for each visit, so that interviewers may be more inclined to complete entries for every visit attempt. More information in the Record of Visits could also aid Reinterviewers in re-contacting the household.


E. Other Form Comments:

Many participants thought that the form was lengthy and that there were repetitive questions. Some interviewers used the cover page as a Record of Visits since it was more convenient than the Record of Visits page on the back cover. There were suggestions to highlight questions to be read to the respondent, to staple the form differently, and to print “Do not ask” to indicate interviewer instructions.


  3. Summary of Observations About Training


This section compiles the lessons learned and suggested improvements from the training session for the PFU operational test.

A. Section A:

Introduction – There were suggestions to stress the importance of using the script when talking to proxy respondents.

Other – Training should stress the “Mission” of PFU, reading questions as worded, the definition of a knowledgeable respondent, and always handing out the Information Sheet to respondents. There were suggestions for additional scenarios, including more examples with movers and proxies, examples with differences between the name of the follow-up person printed on the form and the name usually used, more examples using the Noninterview Assessment, and examples using Section A Q2 for three different respondents. There were also suggestions to include in training an explanation of the difference in intent between Q1 (Have you heard of) and Q1a (Do you know), discussions of good places to find proxy respondents, suggestions for what to do when encountering a respondent who is not a knowledgeable respondent or a no-one-home situation, and more examples of probing questions that might be useful in difficult situations.


B. Section B:

There were questions about when to mark the second person’s outcome code as Complete (Valid Skip) when two names are confirmed to refer to the same person: during the interview or after the interview is complete. This should be made clearer during training.


C. Section C:

Training should stress the following: reading questions as worded, proper verification techniques (vs. asking) and when to use them, referencing the information sheet, how and when to probe for additional information, and what to do if respondents offer multiple addresses at the same time.


D. Record of Visits:

Training should spend more time on how to fill out the Record of Visits correctly, including how to record multiple proxy visits and no one home visits. Training should stress the importance of recording all attempts.


E. Other Training Comments:

  1. Training should stress the reason that these questions are important (also known as the “Mission” of PFU). That might give the interviewers a better understanding of why all the questions are important and should be read as worded. It might also reduce the interviewers’ tendencies to apologize for asking the questions. Training could include sample wording to give to respondents when respondents complain about seemingly repetitive questions.

  2. Training should explain what to do when respondents give vague answers such as “not to my knowledge” and when respondents give multiple answers to the same question. Multiple respondents answering the same questions should be discussed. More training should be given on what to write in notes spaces and where to write notes for different situations.

  3. In the discussion of probing, it would be beneficial to include suggested probes to find additional respondents or a more knowledgeable respondent.

  4. There was also a suggestion to make the practice scenarios more realistic, such as making the interviewers stand and use the clipboard. It may be beneficial to have the interviewers pair up to practice scenarios instead of going through the practice cases as a class.

  5. Some interviewers and observers felt that training did not address how to gain access to secured apartment buildings, where to find a proxy respondent, what to do when the follow-up person(s) has moved, what to do when confronted with non-English speakers, how to set up callback appointments, and when to consider a case a Noninterview.

  6. Additional suggestions included stressing that interviewers should take their time while conducting an interview.

  7. There were instances of interviewers writing notes on paper other than the PFU questionnaire. This should be discouraged during training, since these notes could contain Title 13 or other personally identifiable information.

  8. There was a suggestion to pre-fill the answers to the practice PFU forms in the workbook for the more complicated scenarios. The interviewers are instructed to fill out the forms as they follow along with the class, but it can get difficult to keep up during the later, complex scenarios.


  4. Summary of Observations About Materials and Other


This section compiles the lessons learned and suggested improvements for the PFU procedures and concepts.


  1. Materials – The majority of interviewers had positive comments about the training materials used. There was a suggestion to create a Quick Reference Guide that would pull out the most used information in the manual for easier access. One possibility would be to add a one-sheet list of important information in the Information Booklet that interviewers could tear out if needed.

  2. Information Sheet - There were no problems with the content or layout of the Information Sheet, but interviewers had trouble remembering to hand it to respondents and to refer to it during the interview.

  3. Other

    • Several of the participants felt that an official Census bag would a) make it easier for the interviewer to carry all their materials, b) be more secure for transporting Title 13 information, and c) make the interviewers look more official, which could aid in gaining cooperation. Current Field plans already call for Census bags to be included for all CCM operations in 2010.

    • There were both positive and negative comments on using the clipboards with the PFU forms. Left-handed interviewers found the clipboards difficult to use because the PFU forms were stapled on the right side.

    • Currently, we do not encourage interviewing landlords of large apartment complexes. Some observers felt that we can often gain useful information from these people. This issue requires further discussion.


  5. Recommendations


This section compiles the recommendations for changes to the PFU form and training.


A. Cover Page:

      1. Move the introduction to the front cover, above the “Possible Reasons for Follow-up” section. The introduction should only be read once to each respondent, so it does not need to appear on every Section A page. Moving it to the cover will also help to streamline Section A.

      2. Expand the Miscellaneous Question indicator Docuprinted on the cover to be more descriptive, such as “Miscellaneous Question in Section C, Question 10.”



B. Section A:

  1. Revise the layout for Section A completely to enable the interviewer to record respondent information for each respondent contacted, not just the one to three knowledgeable respondents necessary to gather the information or to code a person fictitious.

  2. Remove the Noninterview Assessment subsection. This was a new question in 2008 and the team feels that by revising Section A, the Noninterview Assessment subsection is no longer necessary for either the interviewer or the matchers at NPC coding the forms.

C. Section C:

  1. Remove long arrows from the Question 1 series (e.g., from 1 to 1g). The long arrows were confusing to interviewers and redundant since skip instruction wording was printed for each response option. Shorter arrows should remain as they are.

  2. Question 1a: remove the “Same as …” option. This shortcut caused more problems than it solved; removing it and its arrow will reduce clutter on this page and make the remaining navigation easier to understand.

  3. In Question 1d, remove the parentheses around the references to the list on the back of the letter. The parentheses made the references optional to read, so interviewers were never referring to the list of Places that House Groups of People. The revised wording is: Is that place a house or apartment or another type of place like those shown on the list I gave you? The list is on the back of the letter.

  4. Change the wording of Question 1g to Please look at the calendar on the back of the letter. During 2010, when did you/he/she live or stay at this address? (remove parentheses and add year).

  5. For all questions requesting dates (1g, 2g, 3g, F1, F2, and 9d), include lightly printed mm/dd/yyyy date guides in the “From” and “To” answer fields.

  6. There were differences of opinion as to what time period Question 1i referred to: the whole year or, in the case of movers, just the part of the year that the follow-up person lived at that address. The team recommends adding the year to Question 1i to help standardize the responses: During 2010, did you/he/she stay at that place:

  7. In Question 9, remove the white text box where the follow-up person’s name would be Docuprinted and replace it with the “you/he/she” text in the bolded, normal font. A consistent font should reduce interviewers’ difficulty reading this question. Additionally, change the first sentence to Please look at the list on the back of the letter.

  8. Reorder Questions 10, 10a, and 10b –

Question 10 should now contain the Miscellaneous Question or, if there is no question, a Docuprinted skip instruction.


Question 10a. What is your name and phone number?

Name_________________________

Phone: ( ) -


Question 10b: What is your address?

[] [SAMPLE ADDRESS]

Other Address: ___________________________


Question 10c: Respondent type (same as it used to be in Question 10).


This order should flow better and reduce missing information. Additionally, use a set of parentheses for the area code and open space to record the phone number, instead of individual boxes for each digit. This is similar to the phone number format used on the 2006 PFU form.

  9. Notes – Add lined pages to each form for interviewers to write notes. For existing notes spaces, use faint lines instead of darker lines so that interviewers do not feel constrained to the small space but still have a line to write on.


D. Training:


  1. The following concepts should be added to the practice scenarios:

    1. Movers/Proxies

    2. Fictitious person with 3 different respondents

    3. No one home

    4. Encountering a person who is not a knowledgeable respondent

    5. Additional Probing

    6. Multiple responses given in the same question

These should be added to existing scenarios where possible.

  2. The definition of a knowledgeable respondent should be emphasized more in training. Clearer guidelines might help interviewers understand this concept better.

  3. More instruction should be given on proper verification techniques. Examples of good verification should be included in training.

  4. Training should include examples of probes that can be used when talking to a respondent who is not knowledgeable to aid in finding a knowledgeable respondent.

  5. Interviewer trainees should stand and use a clipboard during the practice scenarios to make the practice more realistic.

  6. Training should stress the difference in intent between Section A Question 1 (Have you heard of NAME?) and Question 1a (Do you know NAME well). Without sufficient training on these questions, interviewers may incorrectly reword these to switch their meanings. This will lead to difficulties in identifying knowledgeable respondents.
















