2009 National Household Education Surveys Program (NHES: 2009)

Response to OMB Comments

OMB: 1850-0768

UNITED STATES DEPARTMENT OF EDUCATION

National Center for Education Statistics







July 30, 2009



MEMORANDUM



To: Shelly Martinez

From: Andy Zukerberg, Chris Chapman

Re: NCES Response to OMB NHES Questions

7/30/2009: Amended text is in blue.


  1. Confidentiality pledge – If NCES wishes to use CIPSEA in the NHES pilot (or NHES generally), it must first demonstrate to OMB sufficiently that it has in place procedures (e.g., time-limited retention of PII) that make it practically infeasible for the collected data to be used in support of anti-terrorism efforts. Unless NCES can demonstrate that to OMB’s satisfaction, it should rely on the ESRA pledge.

Per the email exchange with OMB, we will use ESRA for this round of the NHES testing. All materials will be updated to reflect the ESRA pledge. Also, attached to this package, as supplemental material, is a memo on Handling of Personally Identifiable Information (PII) in the 2009 NHES Pilot.



  2. Why are the required PRA elements (e.g., voluntary, confidentiality) not included in either the cover letter or the questionnaire?


Both the confidentiality pledge and the voluntary participation statement are included on the questionnaire with the commonly asked questions (NHES 2009 Pilot FAQs.docx). The fourth question in the FAQ document states: “You may choose not to answer any or all questions in this survey.”


The confidentiality pledge also appears in the cover letters for the topical surveys, stating:

“Your responses will be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002), Public Law 107-279, Section 183].”



The cover letters are being updated to make the voluntary nature of the survey and the confidentiality pledge clearer. Additionally, the questionnaires and FAQs will be updated to clarify these protections. Samples of the updates for each form type (Screener questionnaire, Screener cover letter, Screener nonresponse follow-up letters 1 and 2, Topical questionnaire, Topical cover letter, Topical nonresponse follow-up letters 1 and 2) are attached to this document. The other documents will be modified accordingly.


The following text has been added to the cover page of the Screener Surveys (Engaging, Core, and Screen-out):


The National Center for Education Statistics is authorized to conduct this survey under Section 9543, 20 US Code. Your participation is voluntary. Your answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Section 9573, 20 US Code).



The following text will be added below the instructions on the inside of the Topical Surveys (ECPP, PFI: Homeschool and Enrolled):

We are authorized to collect this information by Section 9543, 20 US Code. You do not have to provide the information requested. However, the information you provide will help the Department of Education’s ongoing efforts to learn more about the educational experiences of children and families. There are no penalties should you choose not to participate in this study.


Your answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Section 9573, 20 US Code). Your responses will be combined with those from other participants to produce summary statistics and reports.


This survey is estimated to take an average of 20 minutes, including time for reviewing instructions and completing and reviewing the collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to: Andrew Zukerberg, National Center for Education Statistics, U.S. Department of Education, 1990 K Street NW, Room 9036, Washington, DC 20006-5650. Do not return the completed form to this address.




  3. Why is the confidentiality pledge restated verbatim twice on the FAQ page?

This was an oversight and has been corrected. See the attached updated commonly asked questions page from the screener questionnaire (NHES 2009 Pilot FAQs.docx).



  4. Please update the statute cited in A10 as it is out of date.

This section has been updated to reflect the current statute and the decision to conduct the data collection under ESRA.



  5. Why did NCES decide not to experiment with prepaid incentives for the topical questionnaires?

We will be experimenting with prepaid incentives for the topical questionnaires. Section A9 of the package indicates that:

For those households in which someone is selected as the subject of an ECPP or PFI questionnaire, cases will be subsampled to receive $0, $5 or $15 with the topical surveys to experimentally test effects on response of three different levels of monetary incentive in the mail survey.

The incentive is prepaid and will be included in the first topical mailing package. We chose a prepaid incentive because the literature indicates that this method of providing an incentive is generally more effective in self-administered questionnaires than a postpaid incentive.


There may have been some confusion in the package between this prepaid experiment (experiment 4) for the topical interview and another experiment (experiment 6 in the package) that will look at the effectiveness of a nominal postpaid incentive on telephone response rates for the topical survey. For that experiment, a small group of respondents who are selected to complete the topical interview by telephone rather than by mail will be offered a $5 incentive by the CATI interviewer, paid upon completion of the interview. This allows the interviewer to make the offer directly to the potential respondent, rather than mailing the incentive to the household, where it might be collected by someone other than the household member who answers the phone.
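
For illustration only, the random subsampling into the three prepaid-incentive conditions could be implemented along the following lines. This is a minimal sketch assuming equal three-way allocation and placeholder case identifiers; it is not the actual NHES sampling specification.

```python
import random

# Minimal sketch of random assignment to the prepaid-incentive experiment.
# The equal three-way allocation and the case identifiers are illustrative
# assumptions, not the actual NHES sampling design.
INCENTIVE_ARMS = [0, 5, 15]  # dollar amounts enclosed with the first topical mailing

def assign_incentives(case_ids, seed=2009):
    """Randomly assign each sampled topical case to one incentive arm,
    keeping the three groups (near-)equal in size."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = list(case_ids)
    rng.shuffle(shuffled)
    # Deal the shuffled cases round-robin into the three arms.
    return {cid: INCENTIVE_ARMS[i % len(INCENTIVE_ARMS)]
            for i, cid in enumerate(shuffled)}

if __name__ == "__main__":
    cases = [f"HH{n:05d}" for n in range(1, 10)]  # placeholder case IDs
    for case, amount in sorted(assign_incentives(cases).items()):
        print(f"{case}: enclose ${amount} in first topical mailing")
```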





  6. Can NCES provide assurances to OMB that the experiment evaluations will include sufficient cost data to yield a good cost-benefit analysis?

We will be capturing information on the number of mailings, paradata on when (if at all) the case went to the phone, the mailing or call on which the case was completed or finalized, the number of phone attempts, the cost per mailing, and the cost of telephone interviewing. While we will have the data needed for a cost-benefit analysis, our primary objective is to evaluate the response and bias properties of the methods.
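
To make the intended evaluation concrete, here is a minimal sketch of a cost-per-completed-case calculation from the captured data. All cost figures and case records below are invented placeholders, not actual NHES costs.

```python
# Hypothetical cost-per-complete tabulation from mailing and CATI paradata.
# The unit costs and case records are placeholders for illustration only.
COST_PER_MAILING = 2.50       # assumed cost of one mail package
COST_PER_CALL_ATTEMPT = 4.00  # assumed cost of one CATI attempt

cases = [
    # (mailings sent, phone attempts, completed?)
    (2, 0, True),
    (3, 4, True),
    (3, 6, False),
]

total_cost = sum(m * COST_PER_MAILING + p * COST_PER_CALL_ATTEMPT
                 for m, p, _ in cases)
completes = sum(1 for _, _, done in cases if done)

print(f"Total field cost:  ${total_cost:.2f}")
print(f"Completed cases:   {completes}")
print(f"Cost per complete: ${total_cost / completes:.2f}")
```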


  7. Questionnaires.

    1. Please clarify the strategy used for selecting the specific questions in the longer screener questionnaires.

    2. Please also clarify the strategy for including so many items in these questionnaires (i.e., more than we were expecting).

    3. Please also clarify which of the screener questions are expected to be useful for nonresponse bias analysis.

The three versions of the Screener (Screen-out, Core, and Engaging) will be randomly assigned to addresses. The Screen-out Screener has only the questions needed for determining household eligibility and sampling, and for identifying the sampled child when re-contacting the household to complete the Topical survey. The Core Screener has those plus a few questions that will be useful for weighting and/or bias analysis. The Engaging Screener has the Core Screener questions plus a set of questions aimed at engaging the respondent, in order to encourage cooperation. Many of the additional items in the Core and Engaging Screeners were proposed by the Technical Review Panel (TRP) during or after the January 9, 2009, meeting.

The Engaging Screener items were selected by starting from a large set of policy and civic engagement items used on previous education surveys. From that list, three criteria were used to determine which to field: (1) questions that cognitive interview participants indicated were thought provoking, interesting, or otherwise engaging were kept; (2) items that a large share of respondents were anticipated to be able to answer positively (e.g., having 10 or more books in the house) were added, on the theory that giving respondents many opportunities to provide “yes” answers would be affirming and engage them in the survey process; and (3) some items were added to maintain the flow and organization of the questionnaire (e.g., use of a library or bookstore). During the cognitive interviewing, a number of the Core and Screen-out survey participants indicated that they felt the surveys were too short and did not provide enough information to the sponsor to be useful. Many questioned why the Department of Education was asking for information about the household rather than education-related questions. In refining the Engaging questionnaire we attempted to balance this concern against overall burden. We settled on two pages of engaging education questions followed by one page of weighting/nonresponse bias analysis items and the child roster. No negative comments regarding the Engaging Screener length were received during the subsequent cognitive interviewing.



During cognitive interviewing, many respondents without school-age children indicated that they thought the survey did not apply to them. The engagement items were added to broaden the appeal of the survey beyond households containing a K-12-aged child. The questions were pulled from existing education studies conducted by polling firms (e.g., SRBI and Gallup) as well as Department of Education surveys. The table below indicates the source for each of the engagement items. Most were taken from the NHES Adult Education (fielded in 1991, 1995, 1999, 2001, 2003, 2005) and Civic Involvement (fielded in 1996, 1999) modules. The items were modified for household-level reporting and, in some cases (e.g., books in the house and visits to a library or bookstore), to allow more respondents to report a positive answer. NCES is eager to work with groups in the Department of Education and other agencies to identify a better set of engaging questions that convey a general (not K-12) appeal and are analytically interesting to the government. If the Engaging Screener is successful in this test, we will begin pursuing these alternate items.

1. Main focus of federal government
   Source: New item, designed to have general appeal. A broad opinion item is used first so that everyone can answer. During cognitive interviewing, respondents found this item thought provoking.

2. Federal government spending on education
   Source: Adapted from a 2006 SRBI/Time Magazine education poll.

3. Role of government in school policy
   Source: Adapted from a CER/ICR education poll; similar to a Gallup Phi Delta Kappa education poll item.

4. Training for a new job
   Source: Adapted from the NHES Adult Education survey.

5. Training for current job
   Source: Adapted from the NHES Adult Education survey.

6. Training courses taken
   Source: Adapted from the NHES Adult Education survey.

7./8. Quality of public schools nationally / in the community
   Source: A version of this item appears on many education surveys.

9. School / community relations
   Source: Developed to touch on issues around perception of schools in the community at large.

10. Schools preparing students for careers
   Source: New item, designed to be education related but have general appeal.

11. Sources of information on current events
   Source: Adapted from the NHES Civic Involvement survey.

12. Books in home
   Source: Adapted from the NHES Civic Involvement survey.

13. Other reading materials in home
   Source: Adapted from the NHES Civic Involvement survey.

14. Hours spent reading
   Source: Adapted from the NHES Civic Involvement survey.

15. Visiting a library / bookstore
   Source: Adapted from the NHES Civic Involvement survey.



Items included in the Core and Engaging Screeners for weighting and/or bias analysis purposes are those that have been used for these purposes in past NHES surveys, along with a few additional items (a sketch of how such items could feed a response propensity model follows the list):



  • Is this house… ? (Home tenure question)

Home tenure has been used in weighting for previous NHES surveys; this will be considered as a possible weighting variable, as we expect that coverage and response propensities for renters may differ from those of others (primarily owners).

  • What is the highest grade or level of school completed among the adults in this household?

Educational attainment has been used in weighting for previous NHES surveys; this will be considered as a possible weighting variable, as we expect that response propensities (particularly for a Department of Education-sponsored survey) will differ among households with different levels of educational attainment.

  • How many females live in this household?

How many males live in this household?

Of everyone in this household, how many are age 20 or younger?

This set of questions will be used to determine whether there is an adult female in the household. Age and sex of adults are items that have been used in weighting in previous NHES surveys; we will consider this for weighting, as we expect that households with an adult female will have higher response propensities than households without an adult female.

  • How many computers are in this household that can access the internet?

This question was added to allow for assessment of whether households with internet access differ in response propensities from households without internet access. This may inform consideration of whether to include the web as a mode in future collections.

  • How many years have you lived at this address?

This question was added to allow for assessment of whether more stable/established households differ in response propensities from households that have more recently moved into their homes.

  • Are there any adults in this household who do not speak English at home?

What language do they speak at home?

At the recommendation of the TRP, these questions were added as an attempt to measure English literacy, and will be considered in response propensity estimation.
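
As an illustration of how these screener items could feed response propensity estimation, the following sketch fits a logistic regression of topical response on a few of the items above. The synthetic records, variable coding, and choice of model are assumptions made for illustration; they are not the NHES weighting specification.

```python
# Hypothetical sketch of response propensity estimation from screener items.
# The records, coding, and model are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic screener records: [owns_home, max_education (1-5),
# adult_female_present, internet_computers, years_at_address]
X = np.array([
    [1, 4, 1, 2, 10],
    [0, 2, 0, 0, 1],
    [1, 5, 1, 3, 7],
    [0, 3, 1, 1, 2],
    [1, 3, 0, 1, 15],
    [0, 1, 0, 0, 1],
])
y = np.array([1, 0, 1, 1, 1, 0])  # 1 = responded to the topical survey

model = LogisticRegression().fit(X, y)
propensity = model.predict_proba(X)[:, 1]

# Inverse-propensity factors of the kind used in nonresponse adjustment
adjustment = 1.0 / propensity
for p, a in zip(propensity, adjustment):
    print(f"propensity={p:.2f}  weight adjustment={a:.2f}")
```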






Attachments (3)

Revised Commonly Asked Question Sheet (NHES 2009 Pilot FAQs.docx)

Revised OMB Submission Chapter A (Part A NHES2009 Pilot 2009-05-22 Supporting Statement.docx)

Handling of Personally Identifiable Information (PII) in the 2009 NHES Pilot Memo (Handling_of_PII_in_NHES_2009_pilot.docx)



Core Screener Questionnaire (NHES 2009 Screener Core.docx)

Core Screener Cover Letter (SCNR1 Letter CORE_ENGAG.docx)

Core Non Response Follow up Letter (SCNR2 Letter CORE_ENGAG.docx)

3rd Mailing Core Screener Non Response Follow up Letter (SCNR3 Letter CORE_ENGAG.docx)

ECPP Topical Questionnaire (NHES 2009 ECPP.docx)

ECPP Cover Letter (NHES TOPICAL1 Letter – ECPP.docx)

ECPP Non Response Follow up Letter (TOPICAL2 Letter – ECPP.docx)

3rd Mailing ECPP Non Response Follow up Letter (TOPICAL3 Letter – ECPP.docx)



1990 K Street, N.W., Washington, DC 20006-5650

Our mission is to ensure equal access to education and to promote educational excellence throughout the Nation.




