Volume I:
Request for Clearance for Respondent Debriefing Interviews for the 2009 National Household Education Survey (NHES) Study Pilot Test
1850-0803 v.18
November 19, 2009
Justification
The Random Digit Dial (RDD) National Household Education Survey (NHES), like most RDD surveys, has experienced a rapid decline in response rates. In response to these declining response rates, the National Center for Education Statistics (NCES) has implemented a multi-stage redesign of the study. The primary goal for a revised NHES is to increase response rates without increasing measurement or coverage error in the study. To achieve this objective, a number of modes, including face-to-face interviews and Internet data collection, were evaluated for the redesign. Based on this evaluation, it was determined that a mail-out/mail-back survey with telephone nonresponse follow-up had the greatest potential to achieve the response rate goal within the budget and precision constraints of the study.
NHES screens households to identify eligibility for one of three possible topical interviews: Early Childhood Program Participation (ECPP) and two versions of Parent and Family Involvement in Education (PFI), one for schooled and one for homeschooled children. To minimize the mode effects that can arise from the switch to self-administered questionnaires, and to maximize response, NCES has engaged in a multi-faceted development approach. The first step in this approach was cognitive interviewing focused on the proposed instruments and methods (OMB# 1850-0803 v.14). The next step, an operational pilot test with approximately 10,800 respondents, is currently underway and will be followed by a large-scale field test in 2011 (OMB# 1850-0768 v.6). This request for clearance is to expand the knowledge gained from the Pilot Test, to better inform the 2011 field test design, by re-contacting 30 respondents to debrief them on their decision to respond, their experience taking the interview, and specific data items.
The respondent debriefing provides a unique opportunity to understand how respondents view the materials, decide to participate, and interpret specific items in a true data collection setting. Unlike respondents to the cognitive interviews used to develop the questionnaires, respondents to the debriefing did not answer an advertisement to participate in research, nor did they complete the survey in the presence of a researcher.
The objective of the debriefing is to better understand respondents' experience in three specific areas of the operation: the decision to participate, the impact of mailing materials, and response error issues.
Decision to Participate: The NHES pilot employed a sequential multimode approach to nonresponse follow-up. Some respondents received three mailing packages (two by first-class mail, one by FedEx), while others received telephone nonresponse follow-up after the first or second mailing. Each wave of the mailing packages contained a different letter about the study. We will use the debriefing to understand why respondents chose to participate at a particular stage rather than an earlier one. We will also attempt to learn which aspects of the mailing materials were most compelling in their decision to participate. This will help us refine our mailing strategy and materials for the 2011 field test.
Impact of Mailing Materials: Four different versions of the screener package are being tested in the 2009 Pilot study (Screenout, Core, Engaging, and Bilingual). Response to the screener questionnaire package currently varies from 51% to 56%, depending on the package. The debriefing can provide valuable insight into respondents' impressions of the different mailing packages and the aspects of each package that positively or negatively influenced their decision to participate. This debriefing may shed light on the differential response rates among the packages.
Response Error Issues: An early manual review of completed questionnaires has shown a number of interesting data anomalies, such as:
Some households reported that all members of the household were under the age of 20 but did not complete the household roster.
Most households used their children's names on the roster; however, some reported only initials or left the name blank.
A key skip pattern on the Early Childhood Program Participation survey was routinely misinterpreted.
A date field was routinely misinterpreted on the bilingual screener, while it was answered correctly on all other versions.
We aim to learn more about these and other response variations through the debriefing. The information garnered from these interviews will be used to revise the questionnaires for the 2011 Field Test.
Design
The debriefings will be conducted as one-on-one telephone interviews between a respondent who completed at least a screener interview and a trained qualitative interviewer. The interviewer will follow a prewritten protocol (draft attached) but will be free to deviate from it in order to address specific issues or anomalies in the respondent's written or verbal reports. These interviews are expected to last less than thirty minutes.
Table 1. Respondent debriefing burden hours.
Respondents | Maximum interview length (minutes) | Total burden hours
30 | 30 | 15
We propose to conduct a total of thirty interviews for this debriefing. Participants will be selected from among all completed NHES interviews for which we have a telephone contact number. Respondents will be selected based on meeting criteria from columns A-D in Table 2.
We will select participants based on various combinations of four characteristics: 1. Response wave (first mailing, second mailing, FedEx, telephone); 2. Screener package (core, engaging, screenout, bilingual); 3. Topical eligibility (none, ECPP, PFI-School); and 4. Data anomalies (children's names, skip patterns, household roster, other). The goal is to have a variety of combinations from the grid illustrated in Table 2; an illustrative selection sketch follows the table.
Table 2. Key respondent characteristics that will be used to select debriefing respondents
A. Response Wave | B. Screener Package | C. Topical Eligibility | D. Data Anomalies
Responded to first mailing | Core | None | Did not provide full names for children
Responded to second mailing | Engaging | ECPP | Skip pattern issues
Responded to FedEx mailing | Screenout | PFI | Household roster issues
Responded by telephone | Bilingual | | Other anomalies
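The clearance request does not prescribe an algorithm for assembling the 30-case sample, so the following is only a minimal sketch, in Python, of how a selection spanning the Table 2 characteristics might be made. The field names, case records, and greedy coverage rule are hypothetical illustrations, not part of the study's actual procedures.

# Hypothetical sketch: the study documents do not specify a selection
# algorithm, so this greedy pass simply favors cases whose combination of
# Table 2 characteristics has not yet been picked.

def select_debriefing_sample(cases, n=30):
    """Pick up to n cases, preferring unseen combinations of
    (response wave, screener package, topical eligibility, anomaly)."""
    selected, seen_combos = [], set()
    # First pass: take cases whose full characteristic combination is new.
    for case in cases:
        combo = (case["wave"], case["package"], case["topical"], case["anomaly"])
        if combo not in seen_combos:
            selected.append(case)
            seen_combos.add(combo)
        if len(selected) == n:
            return selected
    # Second pass: fill any remaining slots with unused completed cases.
    for case in cases:
        if case not in selected and len(selected) < n:
            selected.append(case)
    return selected

# Invented example cases; the real selection would draw from all completed
# interviews that provided a telephone contact number.
cases = [
    {"id": "C001", "wave": "first mailing", "package": "core",
     "topical": "none", "anomaly": "names"},
    {"id": "C002", "wave": "second mailing", "package": "engaging",
     "topical": "ECPP", "anomaly": "skip pattern"},
    {"id": "C003", "wave": "FedEx mailing", "package": "screenout",
     "topical": "PFI", "anomaly": "roster"},
    {"id": "C004", "wave": "telephone", "package": "bilingual",
     "topical": "none", "anomaly": "other"},
]
print([c["id"] for c in select_debriefing_sample(cases, n=30)])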
Automation
Given the small scale of this data collection, using automated tools would not be an effective use of resources.
Frequency of Collection
This is a one-time data collection.
Consultations Outside the Agency
Past versions of the questionnaires, from the 1989 field test through the 2007 national study, were reviewed during development by technical review panels composed of individuals with expertise on the issues included in those studies. The 2009 Pilot Test methodology and instruments were developed with input from a technical review panel of experts in survey methodology.
Recruiting and Paying Respondents
Participants will not receive an incentive for the interview.
Assurance of Confidentiality
Respondents will be informed at the beginning of the debriefing call that their participation is voluntary. Additionally, the paper questionnaires and letters that respondents have already received informed them of the voluntary nature of the interviews and of the applicable confidentiality pledge. No information will be obtained from minors. No personally identifiable information will be maintained after the debriefing analyses are completed. With respondents' permission, interviews will be recorded for analysis. The recordings will be destroyed at the conclusion of the project.
Estimate of Hour Burden
We expect each debriefing interview to last less than thirty minutes. Thus, the estimated total respondent burden is 15 hours (30 respondents × 30 minutes; see Table 1).
Estimate of Cost Burden
There is no direct cost to respondents.
Cost to the Government
The anticipated cost of conducting these interviews is less than $15,000.
Project Schedule
To reduce the possibility of the respondents having difficulty recalling the recruitment materials and their decision to participate, we would like to begin interviewing as soon as possible.
Analysis and Publication
Results from this study will be used to revise data collection instruments and plans for a field test in 2011. The individual data will not be reported or published.
NHES:2009 Pilot Test
Respondent Debriefing: Initial Test Interviews
November 30, 2009
Research Goal: To gain insight into respondents' motivation to participate in the NHES survey and to understand their experience with the survey materials. We hope that respondents will be able to provide information about the various design elements and the extent to which these influenced their decision to participate and their ability to successfully complete the survey.
Interview # ________
Screener version ______________________________
Response wave and mode _______________________
Topical? Version _________________________
Data anomalies to probe:
Call attempts:
INTRODUCTION
Hello, I'm (name) calling from (fill). May I please speak with {NAME/the person who completed an Education Survey several/a few weeks ago}?
When would be a convenient time to contact {NAME/the person who completed the survey}?
[IF DO NOT HAVE NAME: Who should I ask for when I call back?__________________]
WHEN THE RESPONDENT COMES TO THE PHONE
Hello, I understand that you completed an Education Survey a few weeks ago. You gave us a phone number on the survey in case we had any more questions. We actually do have a few more questions; could we talk with you about them?
IF YES
We are calling to try to find out some important information about what people who completed the survey thought about it. This information will help us improve the survey. This is a research project and your participation is voluntary. You can stop at any time and you can skip any question you wish. We expect this to take less than 30 minutes.
Everything that we cover here will be kept confidential. I would also appreciate your permission to audio record this conversation. The recording is for note-taking purposes only; it allows me to listen to what you say rather than trying to write it all down. When the recorder starts, I'll need to get your permission on the tape.
[IF RESPONDENT AGREES, START RECORDER] I have started the recorder. Do I have your permission to record our discussion?
START OF DEBRIEFING
Think back to when you first received the survey packet in the mail. When it came in the mail, what did you think?
{IN THIS SECTION, PROBE RESPONDENT COMMENTS TO ASCERTAIN WHETHER HE/SHE IS RECALLING/DESCRIBING THE SCREENER OR TOPICAL INTERVIEW. ASK RESPONDENT TO DESCRIBE THE PACKAGE.}
What made you open the packet (as opposed to throwing it away)? What do you remember about the envelope or mailing materials?
[DEPENDING ON MAILING RESPONDED TO:]
[FIRST MAILING] Are you the person who usually receives the mail? Did you speak with anyone else before deciding to complete the survey (e.g., another family member, Westat, DOE)? Do you remember how soon after receiving the questionnaire you decided to complete it? If more than a week, what are some reasons you waited to complete it? (NOTE: Parents are responding more slowly than nonparents; look for child-related reasons.)
[SECOND MAILING] Do you remember receiving another mailing of the questionnaire? What happened to the original mailing? What are some reasons you didn't complete that one? Why did you decide to complete the second one?
[THIRD MAILING] Do you remember receiving other mailings of the questionnaire? What happened to the earlier mailings? Why did you decide to complete this one? Was your reaction to this FedEx mailing different from your reaction to the other mailings? Why (not)?
[ALL]
Please tell me what you remember about what was in the package. (RECALL OF LETTER, INCENTIVE, RETURN ENVELOPE)
Did you read the letter?
IF YES: Was the letter informative? What do you remember about the reasons for the study?
IF NO: Why didn't you read the letter?
What was the overall message or impression you got from the survey packet (letter, envelope, and survey)?
Was there anything that was unclear or unnecessary? Was there any information you wanted that was not given?
SCREENER EXPERIENCE
Now, I would like to talk about the initial survey you completed about your household:
How long did you think the survey would take to complete? Do you recall how long it actually took?
Could you tell me what you remember about the questions you answered? How would you describe the experience of completing the survey? (LOOK FOR: reaction to engaging items and nonresponse adjustment items; NOTE whether any specific items are mentioned)
IF RESPONDENT ENUMERATED CHILDREN/YOUTH: How did you feel about writing down information about the kids in your household? How did you think the information would be used?
[PROBES, AS APPLICABLE:]
You did not write in names or initials for your children. Tell me about that.
You gave initials rather than names for your children. Tell me about that.
You wrote about your children's (enrollment/grade/INFORMATION) but left the space for names or initials blank. Tell me about that.
[IF RESPONDENT REPORTED HOUSEHOLD MEMBERS 20 OR YOUNGER BUT DID NOT ENUMERATE:]
You told us that (number) people in your household are age 20 or younger. Is that right? [IF NO, PROBE HH COMPOSITION.]
I see that you did not fill out any of the specific information such as name, age, or grade for those people. Tell me about that.
Once you completed the initial survey, what did you expect to happen?
{DID R EXPECT TO RECEIVE ANOTHER SURVEY?}
Did you receive any other mailings or phone calls from us? (PROBE WHETHER SCREENER/TOPICAL. DID RESPONDENT OPEN IT? POSSIBLE PROBE: Was it the same as the survey you already completed or something else?)
IF COMPLETED THE TOPICAL
We sent your household another questionnaire mailing a couple of weeks after you sent back the initial survey form; that was a questionnaire about (CHILD). Can you tell me what you thought when that arrived?
[PROBES] Did you receive the survey or did someone else in the household? Was there any discussion in your household about the second survey and its content? Did you see a connection between the first survey you completed and the second one you received? What was the connection?
What made you want to complete that survey?
Was there anything that made you not want to complete that survey?
IF NECESSARY: One child's name was printed on the second survey. How did you feel when you saw that child's name on the survey?
Did you have difficulty reporting on that child? If so, what items were difficult to answer? Why?
(ASK CASE-SPECIFIC DATA ANOMALY ITEMS)
CONCLUSION
Let’s think back over the entire study process. Were there any items that you were unsure how to answer or unable to answer? (PROBE FOR ANY CASE SPECIFIC ITEMS)
IF NECESSARY: Was it easy or difficult?
After doing the survey, do you feel that you’ve done something for education? (PROBE FOR SPECIFICS)
Thank you very much for your time. Goodbye.