NCES Cognitive, Pilot, and Field Test Studies System

OMB: 1850-0803

Volume I:


Request for Clearance for Respondent Debriefing Interviews for the 2011 National Household Education Survey (NHES) Study Field Test


1850-0803 v.46


March 7, 2011


Justification


The Random Digit Dial (RDD) National Household Education Survey (NHES), like most RDD surveys, has experienced a rapid decline in response rates. As a result, the National Center for Education Statistics (NCES) has implemented a multi-stage redesign of the study. The primary goal for a revised NHES is to increase response rates without increasing measurement or coverage error. To achieve this objective, a number of modes, including face-to-face interviews and Internet data collection, were evaluated for the redesign. Based on this evaluation, it was determined that a mail-out/mail-back survey with telephone nonresponse follow-up had the greatest potential to achieve the response rate goal within the budget and precision constraints of the study.


NHES screens households to identify eligibility for one of two possible topical interviews: Early Childhood Program Participation (ECPP) and Parent and Family Involvement in Education (PFI). To minimize the mode effects that can arise from the switch to self-administered questionnaires, and to maximize response, NCES has engaged in a multi-faceted development approach. The first step in this approach was cognitive interviewing focused on the proposed instruments and methods (OMB# 1850-0803). The second step, an operational pilot test with approximately 10,800 respondents, was conducted from September to December 2009 (OMB# 1850-0768). Following the pilot test, additional cognitive interviews were conducted to further improve instrument design (OMB# 1850-0803). The fourth step, a large-scale field test, began in January 2011 and is currently underway (OMB# 1850-0768). This clearance request seeks to expand the knowledge gained from the Field Test and inform the 2012 full data collection design by re-contacting 50 respondents to debrief them on their decision to respond, their experience taking the interview, and specific data items.


The respondent debriefing provides a unique opportunity to understand how respondents view the materials, decide to participate, and interpret specific items in a true data collection setting. Unlike respondents to the cognitive interviews, which were used to develop and refine the questionnaires, debriefing respondents did not answer an advertisement to participate in research, nor did they complete the survey in the presence of a researcher.


The debriefings have two main objectives: 1) to collect information on items of particular interest to OMB and NCES, such as school choice, student mobility, and changes in the parent/guardian background sections; and 2) to collect information on characteristics of the survey that motivate response (such as the use of an incentive) and respondents' perceptions of specific survey features (such as asking for children's names or a phone number for the household).


Specific survey items: The NHES: 2011 Field Test includes six topical questionnaire versions (ECPP mainline and alternate, PFI mainline and alternate, ECPP Spanish, and PFI Spanish). The debriefings will focus primarily on respondents who returned an alternate topical questionnaire, as these versions contain substantive changes from the mainline versions tested in the NHES: 2009 Pilot Test. The specific survey items to be examined from the PFI include questions about whether the child has been in the same school since September; school choice (e.g., does the district allow choice, does the child attend a regularly assigned school, does the child attend a charter school); parent attendance at school activities; and use of Internet instruction for school-related courses. On the ECPP, questions about how parents define children's reading activities will be discussed. The parent section, common to the PFI and ECPP, will be discussed with respondents to see if they report any confusion or difficulty related to the use of gender-neutral wording in the parent/guardian sections rather than the mother/father wording used in previous NHES administrations. We will use these results to refine question wording for the 2012 data collection and improve the quality of the data collected.


Impact of Mailing Materials: Seven different screener questionnaires were tested in the NHES: 2011 Field Test (pilot, screenout with and without child's name, engaging with and without child's name, bilingual, and dual English/Spanish). Response to the screener questionnaire package currently varies from 52.3% to 54.6% in the national sample depending on the package. All screener versions except the Pilot contain a question asking for the respondent's phone number. The screener debriefing questions will focus on the perceived sensitivity of items asking respondents for children's names and a phone number for the household. This debriefing may shed light on the differential response rates between the packages. In addition to testing different screener packages, different incentive levels, delivery modes, and mailing envelopes are being tested in the topical mailings based on results from the NHES: 2009 Pilot Test. The topical debriefing can provide valuable insight into respondents' impressions of the different mailing packages and the aspects of the package that positively or negatively influenced their decision to participate. This will help us refine our mailing strategy and materials for the 2012 data collection.

Design


The debriefings will be conducted as a one-on-one telephone interview between a respondent who completed a screener and topical interview and a trained qualitative interviewer. The interviewer will follow a prewritten protocol (draft attached) but will be free to deviate in order to address specific issues or anomalies in the respondent’s written or verbal reports. These interviews are expected to last 20 minutes.


Table 1. Respondent debriefing burden hours

Respondents   Maximum interview length (minutes)   Total burden hours
50            20                                   17


We propose to conduct a total of 50 interviews for this debriefing. We will call cases that meet the selection criteria outlined below until 50 interviews have been completed. Each case selected will receive up to five call attempts.


Respondents will be selected based on a combination of characteristics shown in Table 2: type of screener package, type of topical form, and responses to specific items of interest. The primary selection criteria will be responses to the specific items of interest detailed in column C of Table 2; the respondent selection criteria for each item are provided below the item in the table. Selecting respondents who meet these criteria will provide insight into potential sources of confusion and into what respondents are considering when answering these items. Because of the focus on PFI items, the aim is to complete 35 PFI interviews and 15 ECPP interviews. Because the items of interest appear on the alternate forms, interviews will focus on respondents to the alternate survey forms. In addition to the criteria related to the topical, completion of the screenout-with-name, engaging-with-name, bilingual, or dual Spanish and English forms will also be part of the selection criteria. We aim to complete 15 interviews with screenout-with-name respondents, 15 with engaging-with-name respondents, 10 with bilingual form respondents, and 10 with dual Spanish and English package respondents. Focusing on the with-name versions will provide insight into respondent reactions to the screener items most likely to be perceived as sensitive.


Table 2. Key respondent characteristics that will be used to select debriefing respondents

A. Screener package
- Screenout with name
- Engaging with name
- Bilingual
- Dual forms: Spanish and English

B. Topical
- ECPP alternate
- PFI alternate

C. Specific item of interest (selection criterion shown below each item)

PFI #11. Since the beginning of this school year, has the child been in the same school?
   Respondents who answered "Yes"

PFI #7. Does your public school district let you choose schools for this child?
   Respondents who answered "Yes", "No", or "Don't know"

PFI #4. Is this school a charter school?
   Respondents who answered "No" to question 7 and "No" to question 4 or "Yes" to question 5

PFI #5. Did you move to your neighborhood so that this child could attend his/her current school?
   Respondents who answered "No" to question 7 and "No" to question 4 or "Yes" to question 5

PFI #15. Since the beginning of the school year, how many times have any of this child's teachers or school staff contacted you about... Behavior problems this child is having in school; Problems this child is having with school work; Very good behavior; Very good school work
   Respondents reporting any contacts

PFI #22. Some students take school-related courses over the internet. Is this child receiving instruction that way?
   Respondents who answered "Yes" or "No"

PFI #23. Is that instruction provided by any of the following places? Your local public school; a charter school; another public school; a private school; a college, community college, or university; someplace else (specify)
   Respondents who answered "Yes" to question 22

PFI #25. Since the beginning of this school year, how many times has any adult in this child's household done any of the following things at this child's school? Attended a school or class event such as a play, dance, sports event, or science fair; Served as a volunteer in this child's classroom or elsewhere in the school; Attended a general school meeting, for example, an open house or back-to-school night; Attended a meeting of the parent-teacher organization or association; Participated in fundraising for the school; Served on a school committee; Met with a guidance counselor in person
   Respondents reporting participation in any activities

PFI #16. Since the beginning of this school year, how many days has this child been absent from school?
   Any PFI topical questionnaire

ECPP #71. Does this child ever read or pretend to read storybooks on his/her own?
   Respondents who answered "Yes" to question 71

ECPP #72. Does this child actually read the words written on the page or does he/she look at the book and pretend to read?
   Respondents who answered "Yes" to question 71

ECPP #73. When this child pretends to read a book, does it sound like a connected story or does he/she tell what is in each picture without much connection between them?
   Respondents who answered "Yes" to question 71

ECPP and PFI Parent section instructions (example: "Answer questions 93 to 109 about yourself if you are the child's parent or guardian. If you are not the child's parent or guardian, answer questions 93 to 109 about one of this child's parents or guardians living in the household.")
   Any topical questionnaire
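The quota targets described above (35 PFI / 15 ECPP interviews; 15 screenout-with-name, 15 engaging-with-name, 10 bilingual, and 10 dual-package interviews) can be expressed as a simple case-selection filter. The sketch below is illustrative only: the field names ("topical", "screener") and record structure are assumptions, not part of the field-test data system, and it omits the item-response criteria and the five-call-attempt protocol described elsewhere in this document.

```python
# Illustrative sketch of quota-based case selection for the debriefing sample.
# Field names ("topical", "screener") are hypothetical; the actual field-test
# case records are not specified in this document.

TOPICAL_QUOTAS = {"PFI": 35, "ECPP": 15}  # 50 interviews total
SCREENER_QUOTAS = {
    "screenout_with_name": 15,
    "engaging_with_name": 15,
    "bilingual": 10,
    "dual_spanish_english": 10,
}

def select_cases(cases):
    """Select cases until every topical and screener-package quota is filled."""
    topical_done = {k: 0 for k in TOPICAL_QUOTAS}
    screener_done = {k: 0 for k in SCREENER_QUOTAS}
    selected = []
    for case in cases:
        t, s = case["topical"], case["screener"]
        # A case counts toward both its topical and its screener quota,
        # so it is selected only if neither quota is already full.
        if (topical_done.get(t, 0) < TOPICAL_QUOTAS.get(t, 0)
                and screener_done.get(s, 0) < SCREENER_QUOTAS.get(s, 0)):
            topical_done[t] += 1
            screener_done[s] += 1
            selected.append(case)
    return selected
```

Because both quota sets sum to 50, filling either set caps the sample at the 50 completed interviews proposed above.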


Automation

Given the small scale of this data collection (50 interviews), it would not be cost effective to use automated tools.


Frequency of Collection

This is a one-time data collection.


Consultations Outside the Agency


Past versions of the questionnaires, from the 1989 field test through the 2007 national study, were reviewed during development by technical review panels composed of individuals with expertise in the issues covered by those studies. The 2009 Pilot Test and 2011 Field Test methodology and instruments were developed with input from a technical review panel of experts in survey methodology.


Recruiting and Paying Respondents


Participants will not receive an incentive for the interview.


Assurance of Confidentiality

No information will be obtained from minors. No personally identifiable information will be maintained after the debriefing analyses are completed. Interviews will be recorded for analysis with respondent permission. The recordings will be destroyed at the conclusion of the project.


Respondents will be informed that their participation is voluntary at the beginning of the debrief interview call and that: “All information you provide may only be used for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C., § 9573). I would also appreciate your permission to audio record this conversation. The recording will be for note-taking purposes only. This allows me to listen to what you say and not have to worry about trying to write down what you are saying at the same time. When the recorder starts, I’ll ask again for your permission.”


Estimate of Hour Burden


We expect each debriefing interview to last no more than 20 minutes. Thus, the estimated total respondent burden is 17 hours (Table 1).
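The burden figure can be reproduced directly from the values in Table 1; a minimal sketch of the arithmetic (50 interviews at a maximum of 20 minutes each, rounded up to whole hours):

```python
import math

respondents = 50            # completed debriefing interviews (Table 1)
minutes_per_interview = 20  # maximum interview length (Table 1)

total_minutes = respondents * minutes_per_interview  # 1000 minutes
burden_hours = math.ceil(total_minutes / 60)         # 16.67 rounded up
print(burden_hours)  # 17
```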


Estimate of Cost Burden


There is no direct cost to respondents.


Cost to the Government

The anticipated cost of conducting these interviews is less than $16,500.


Project Schedule


To reduce the possibility of the respondents having difficulty recalling the recruitment materials and their decision to participate, we would like to begin interviewing as soon as possible.


Analysis and Publication


Results from this study will be used to revise data collection instruments and mailing materials for full-scale data collection in 2012. The individual data will not be reported or published.
