
National Household Education Survey 2016 (NHES:2016)

Full-scale Data Collection



OMB# 1850-0768 v.11

Part A
















April 2015


Revised in July 2015







TABLE OF CONTENTS

JUSTIFICATION


A.1 Circumstances Necessitating Collection of Information

A.2 Purposes and Uses of the Data

A.3 Use of Improved Information Technology

A.4 Efforts to Identify Duplication

A.5 Collection of Data from Small Businesses

A.6 Consequences of Less Frequent Data Collection

A.7 Special Circumstances of Data Collection

A.8 Consultations Outside the Agency

A.9 Payments to Respondents

A.10 Assurance of Confidentiality

A.11 Sensitive Questions

A.12 Estimated Response Burden

A.13 Cost to Respondents

A.14 Cost to the Federal Government

A.15 Reasons for Program Changes

A.16 Publication Plans and Project Schedule

A.17 Approval to Not Display the Expiration Date for OMB Approval

A.18 Exceptions to the Certification Statement



List of Tables

Table

1 Child survey eligibility rates by predicted household response propensity score

2 Incentive amounts and sample sizes for targeted incentive experiment

3 Estimated response burden for NHES:2016



List of Exhibits

Exhibit

1 Surveys conducted under the National Household Education Surveys Program, by years administered: 1991 through 2012

2 NHES:2016 schedule of major activities






JUSTIFICATION

NHES Program - Request for Clearance

The National Household Education Survey (NHES) is a data collection program of the National Center for Education Statistics (NCES) designed to provide descriptive data on the education activities of the U.S. population, with an emphasis on topics that are appropriate for household surveys rather than institutional surveys. Such topics have covered a wide range of issues, including early childhood care and education, children’s readiness for school, parents’ perceptions of school safety and discipline, before- and after-school activities of school-age children, participation in adult and career education, parents’ involvement in their children’s education, school choice, homeschooling, and civic involvement. NCES received approval in September 2014 to conduct the full-scale NHES in 2015 (OMB# 1850-0768 v. 10) but decided to delay data collection until 2016 to further refine the surveys and data collection processes and to better manage budget constraints. This request is to conduct NHES:2016 full scale data collection, as described in this submission.

A.1 Circumstances Necessitating Collection of Information

The Education Sciences Reform Act of 2002 (ESRA 2002: 20 U.S. Code § 9543) defines the legislative mission of NCES to collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations. The NHES is specifically designed to support this mission by providing a means to investigate education issues that cannot be adequately studied through the Center’s institution-based data collection efforts. For example, some school-age children are homeschooled rather than attending a public or private school. There is no available sample frame that includes all of the homeschooling students across the United States. Likewise, although attaining a postsecondary credential has become increasingly important for securing opportunities to get high-return jobs in the United States in the 21st century, NCES has traditionally only collected data on postsecondary certificates and degrees awarded through credit-bearing instruction in institutions of higher education that participate in Title IV federal student aid programs. These comprise only a portion of the subbaccalaureate education and training that American adults seek and complete in order to learn the skills they need for finding and keeping good-paying jobs.

It is efficient and economical to interview parents about their children’s participation in child care programs and family participation in school and other education activities through a household-based approach rather than incurring the cost and nonresponse involved in enlisting schools, obtaining lists of parents, and sampling parents from those lists. Similarly, it is also most efficient to interview adults through a household-based approach rather than trying to obtain lists from a myriad of private credential-awarding bodies. Also, the household approach allows for inclusion of adults who do not participate in training or have a credential, providing a point of comparison.

Repeating the NHES:2012 child surveys will provide the first trend data available under the new NHES design. Tracking trends in education topics on a regular, repeating basis is a key research goal of the NHES program. Adding the adult education component will provide the first publicly available adult education data from the NHES in over a decade and will provide detailed data on previously undermeasured non-degree credentials.

A.2 Purposes and Uses of the Data

The NHES:2016 data collection will provide policymakers and researchers with data on early childhood education, parent and family involvement in education, homeschooling, and adult training and education that are not available elsewhere. Researchers nationwide rely on NHES data for important policy analyses. Survey data from the NHES have been used for a large number of descriptive and analytic reports and articles, including NCES education indicators, reports, and statistical abstracts; publications of other Federal agencies; policy analyses; theses and dissertations; conference papers; and journal articles. A list of NHES publications issued by NCES can be found on the NHES website, http://nces.ed.gov/nhes.

NHES Program

NHES uses a two-stage design in which sampled households complete a screener questionnaire to enumerate household members and their key characteristics. Within-household sampling from the screener data determines which household member receives which topical survey. NHES typically fields 2 to 3 topical surveys at a time, although the number has varied across its administrations. Surveys are administered in English and in Spanish. Data from the NHES are used to provide national cross-sectional estimates on populations of special interest to education researchers and policymakers.

Beginning in 1991, NHES was administered approximately every other year as a landline random-digit-dial (RDD) survey. During a period of declining response rates in all RDD surveys, NCES decided to conduct a series of field tests to determine if a change to self-administered mailed questionnaires would improve response rates. A feasibility test of the new design was conducted in 2009 followed by a field test in 2011. The field test results helped to inform the final design of a full-scale NHES collection in 2012 (OMB# 1850-0768 v.9), which included the Early Childhood Program Participation (ECPP), the Parent and Family Involvement in Education-Enrolled (PFI-E), and the Parent and Family Involvement in Education-Homeschooled (PFI-H) surveys.

ATES Development

During the same period of time, NCES began supporting developmental work on new questionnaire items for federal household surveys on work-related education, training, and credentials for adults and out-of-school youth. The Interagency Working Group on Expanded Measures of Enrollment and Attainment (GEMEnA) is a collaboration among federal statistical agencies established by the OMB Office of Statistical and Science Policy, the Council of Economic Advisors, and the Under Secretary of Education to improve federal household statistics on the attainment of non-degree credentials such as industry-recognized certifications, occupational licenses, and educational certificates. In 2012, GEMEnA’s commission expanded to include the development of new and revised measures of enrollment or participation in education and training for work. One of GEMEnA’s roles is to guide NCES’s development of a new household survey on these topics to support research and policy analysis. To achieve this purpose, NCES conducted focus groups, cognitive interviews, and two pilot studies (OMB# 1850-0803), first a two-stage telephone survey and then a single-stage self-administered mail survey. Detailed information and reports from these activities can be found at nces.ed.gov/surveys/gemena.

In 2016, the NHES will field the first full-scale administration of the Adult Training and Education Survey (ATES), which will provide new measures of adults’ educational and occupational credentials, including counts of (1) adults who have an industry-recognized certification or occupational license, including the number of such credentials, the type of work they are for, their perceived labor market value, and the role of education in preparing for them; (2) adults who have educational certificates, including the subject field of the certificate, its perceived labor market value, and its role in preparing for occupational credentialing; and (3) adults who have completed an initial work experience program (such as an apprenticeship or internship), including characteristics of the program and its perceived labor market value.

NHES Feasibility Study

One of NCES’s goals from the beginning of the GEMEnA project was to determine the feasibility of eventually incorporating a survey of adults back into the NHES. In 2014, NCES conducted a Feasibility Study testing the integration of an adult topical survey into NHES mail operations and processing (OMB# 1850-0803). The NHES Feasibility Study (NHES-FS) included several experiments to inform the final design of the 2016 full-scale NHES. Using one household survey platform for both child and adult surveys provides greater efficiency in the data collection and reduces overall national burden by maximizing the use of a single household sample draw. Before adding an adult survey back into the NHES, it was important to test the feasibility of using a mail survey to screen households for both adults and children, and to test different approaches to collecting topical data from households (e.g., sampling either an adult or a child from the same household for topical follow-up compared to sampling both an adult and a child for follow-up). The NHES-FS also included several other experiments to test approaches aimed at decreasing unit and item nonresponse. The results of these experiments have informed the design of the NHES:2016.

NHES Cognitive Interviews

NCES conducted several rounds of cognitive interviews from December 2014 through June 2015 (listed below) that led to changes designed to improve respondents’ understanding of the language used in the letters, postcards, and survey items. The results of these cognitive interviews informed the development of the documents in this submission and are presented in Appendix 4.

  • Phase 1 Spanish-language interviews for respondent contact materials and screener instrument (OMB# 1850-0803 v.128)

  • Phase 2 Spanish-language interviews for topical questionnaires (OMB# 1850-0803 v.131)

  • Phase 3 Spanish- and English-language interviews on contact materials for the web experiment (OMB# 1850-0803 v.136)

  • PFI and ECPP interviews (OMB# 1850-0803 v.121)

  • ATES interviews (OMB# 1850-0803 v.127)

Overview of NHES:2016 Target Population

The NHES:2016 will include the ATES, the PFI-E, the PFI-H, and the ECPP. Adults ages 16 to 65 who are not enrolled in or homeschooled for grade 12 or below will be eligible for the ATES topical survey, and children from birth through 12th grade who are ages 20 or younger will be eligible for the child-focused surveys. The PFI-E samples children and youth ages 20 or younger enrolled in kindergarten through 12th grade, while the PFI-H targets families of children and youth ages 20 or younger homeschooled for the equivalent of kindergarten through 12th grade. The ECPP samples children ages 6 or younger who are not yet enrolled in kindergarten. Adults knowledgeable about the care and education of the sampled children respond to the surveys about children, whereas sampled adults answer ATES surveys about themselves.
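To make these eligibility rules concrete, the following illustrative sketch (in Python) encodes them as simple checks. The field names are hypothetical simplifications introduced here; the authoritative eligibility definitions and sampling rules are given in Supporting Statement Part B.

# Illustrative only: a simplified encoding of the NHES:2016 eligibility rules
# described above. Field names (age, enrollment indicators) are hypothetical;
# the authoritative definitions are in Supporting Statement Part B.

def ates_eligible(age, enrolled_or_homeschooled_k12):
    """Adults ages 16 to 65 not enrolled in (or homeschooled for) grade 12 or below."""
    return 16 <= age <= 65 and not enrolled_or_homeschooled_k12

def ecpp_eligible(age, enrolled_in_kindergarten_or_higher):
    """Children ages 6 or younger who are not yet enrolled in kindergarten."""
    return age <= 6 and not enrolled_in_kindergarten_or_higher

def pfi_enrolled_eligible(age, enrolled_k_through_12):
    """Children and youth ages 20 or younger enrolled in kindergarten through grade 12."""
    return age <= 20 and enrolled_k_through_12

def pfi_homeschool_eligible(age, homeschooled_k_through_12):
    """Children and youth ages 20 or younger homeschooled for kindergarten through grade 12."""
    return age <= 20 and homeschooled_k_through_12

# Example: a 4-year-old not yet in kindergarten is ECPP-eligible.
assert ecpp_eligible(4, False)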

This submission includes a number of letters and postcards for each stage of the study tailored for the screener instrument and each topical survey. It also includes respondent materials designed for a planned web survey experiment described in the section entitled NHES:2016 Experiments. All English-language respondent contact materials are provided in Appendix 1 and all English- and Spanish-language mail survey materials are provided in Appendix 2. Appendix 2 also includes a table describing differences between the mail surveys and the web surveys.

NHES:2016 Screener Instrument

The household screener instrument was revised from the 2012 NHES to include a complete listing of all household members rather than only the children in the household. The response rates for a 5-person, child-only screener and a 10-person, all-household-member screener were found to be comparable in a small experiment conducted in conjunction with an ATES pilot test in 2013. The NHES-FS used the 10-person screener as part of its goal of evaluating the procedures needed to include an adult-focused survey in the NHES, and it included an experiment comparing response rates for a screener that asked for age measured in years with those for a screener that asked for age measured as year and month of birth. Based on the results of this experiment, the NHES:2016 will use the screener that asks for age as year and month of birth. English and Spanish versions of the screener are shown in Appendix 2.

NHES:2016 Topical Surveys

As shown in Exhibit 1, each administration of the NHES has included more than one topical survey. The NHES:2016 will include one adult-focused survey (ATES) and three child-focused topical surveys (PFI-E, PFI-H, and ECPP). The planned NHES:2016 administration of the PFI and ECPP surveys is a repeat of the child-focused topics administered for the first time in mail survey mode as part of NHES:2012. NCES decided to repeat these surveys in 2016 to establish the first trend data for the newly designed NHES. Tracking changes in the population over time is a key research goal of the NHES program. To develop ATES content, NCES, under the guidance of GEMEnA, has been creating and testing new survey items since 2009, including through expert review, focus groups, cognitive tests, and pilot tests. English and Spanish versions of the PFI-E, PFI-H, ECPP, and ATES are shown in Appendix 2.

Exhibit 1. Topical surveys conducted under the National Household Education Surveys Program, by years administered: 1991–2012

NHES administrations: 1991, 1993, 1995, 1996, 1999 (see note 1), 2001, 2003, 2005, 2007, and 2012.

Topical surveys fielded across these administrations: early childhood education/program participation; adult education; school readiness; school safety and discipline; parent and family involvement in education; homeschooling; civic involvement; after-school programs and activities (see notes 2 and 3); and household library use.

1 The NHES:1999 was a special end-of-decade administration that measured key indicators from the surveys fielded during the 1990s.

2 The After-School Programs and Activities Survey of the NHES:1995 only asked about children in first through third grades.

3 The After-School Programs and Activities Survey of the NHES:2001 also included items on before-school programs.

SOURCE: U.S. Department of Education, National Center for Education Statistics, National Household Education Surveys Program (NHES), 1991–2012.

The Parent and Family Involvement in Education Surveys (PFI)

The PFI, previously conducted in 1996, 2003, 2007, and 2012, surveys families of children and youth enrolled in kindergarten through 12th grade or homeschooled for these grades, with an age limit of 20 years. It addresses specific ways that families are involved in their children’s school; school practices to involve and support families; involvement with children’s homework; and involvement in education activities outside of school. Parents of homeschoolers are asked about their reasons for choosing homeschooling and the resources they used in homeschooling. Information about child, parent, and household characteristics is also collected. To minimize response burden and potential respondent confusion, separate enrolled and homeschooled versions of the PFI questionnaire were created for self-administration. This submission includes both the PFI-E and PFI-H instruments.

The Early Childhood Program Participation Survey (ECPP)

The ECPP, previously conducted in 1991, 1995, 2001, 2005, and 2012, surveys families of children ages 6 or younger who are not yet enrolled in kindergarten and provides estimates of children’s participation in care by relatives and non-relatives in private homes and in center-based daycare or preschool programs (including Head Start and Early Head Start). Additional topics addressed in ECPP interviews have included family learning activities; out-of-pocket expenses for nonparental care; continuity of care; factors related to parental selection of care; parents’ perceptions of care quality; child health and disability; and child, parent, and household characteristics.

The Adult Training and Education Survey (ATES)

The ATES provides a means to investigate issues related to adults’ education, training, and credentials that cannot be adequately studied through the Center’s institution-based data collection efforts. It targets non-institutionalized adults in the United States ages 16 to 65 who are not enrolled in grade 12 or below. The ATES will collect information on educational attainment; the prevalence and characteristics of certifications and licenses and their holders; the prevalence and characteristics of educational certificates and certificate holders; and the completion and key characteristics of work experience programs such as apprenticeships and internships. It will also collect detailed employment and background information.

NHES:2016 Experiments

Web Experiment

NCES is planning an experiment as part of NHES:2016 to evaluate response rates for a subsample of respondents requested to complete the screener and topical instruments over the internet. The web instruments are being developed using the paper and pencil versions submitted for clearance herein, with a few differences described in Appendix 2. This experiment will also test real-time sampling between the screener and the topical stages of data collection. The functionality of the web interface will permit immediate sampling of a household member for a topical survey, and if the screener respondent is the sampled adult respondent or the most knowledgeable adult about the sampled child respondent, the web instrument will allow him or her to continue immediately to the topical survey. The benefit of immediate sampling is that it will create the potential for respondents to complete the screener and topical surveys in one sitting rather than being contacted on two separate occasions to complete the survey; this should increase topical response rates while reducing cost and burden. At any stage during the web experiment, respondents will be able to call the Census Bureau to receive a paper and pencil version of the survey. In addition, beginning at the point of the third nonresponse follow up mailing for both the screener and the topical surveys, sampled web households that have not yet responded will automatically receive a paper and pencil version of the survey and continue with paper and pencil follow-up thereafter.
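The real-time sampling flow can be pictured with the following highly simplified sketch in Python. The equal-probability selection and the even child/adult split shown here are purely illustrative assumptions; the actual NHES:2016 within-household selection probabilities and instrument logic are specified in Supporting Statement Part B and the web instrument specifications.

import random

# Highly simplified illustration of the real-time sampling flow described above.
# Equal-probability selection and the 50/50 child/adult split are assumptions made
# purely for illustration; actual selection probabilities are given in Part B.

def sample_topical_person(roster):
    """roster: list of dicts with keys 'person_id', 'eligible_child', 'eligible_adult'."""
    children = [p for p in roster if p["eligible_child"]]
    adults = [p for p in roster if p["eligible_adult"]]
    pool = children if (children and (not adults or random.random() < 0.5)) else adults
    return random.choice(pool) if pool else None

def can_continue_immediately(sampled, screener_respondent_id, is_most_knowledgeable):
    """The web instrument lets the screener respondent proceed straight to the topical
    survey if he or she is the sampled adult (ATES), or is reported as the adult most
    knowledgeable about the care and education of the sampled child."""
    if sampled is None:
        return False
    if sampled["eligible_adult"]:
        return sampled["person_id"] == screener_respondent_id
    return is_most_knowledgeable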

From the original sample of 206,000 households, 35,000 will be allocated to the web experiment. The experiment is being designed to measure (1) overall web screener and web topical response rates, (2) demographic characteristics of web respondents versus paper-and-pencil respondents in the web sample, (3) the number and type of respondents who try to answer the survey on a smartphone, (4) the number of breakoffs at the screener and topical stages, and (5) the number of screener respondents with a sampled child in their household who indicate that they are not the most knowledgeable about the care and education of the sampled child. A random half of the web respondents will be asked for an email address for nonresponse follow-up to evaluate whether that request leads to breakoffs. With 80 percent power and a significance level of 0.05, the treatment group size of 35,000 will allow a minimum detectable difference in screener response rates between the web treatment group and the paper treatment group of approximately 1 percentage point. At the topical level, response rate differences of 3.5 percentage points or greater will be detectable.
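For illustration, a standard two-proportion minimum detectable difference calculation of the kind underlying these figures is sketched below. The assumed baseline screener response rate of roughly 64 percent and the assumed size of the paper comparison group are not taken from this document; they are shown only to indicate the form of the calculation.

from math import sqrt
from statistics import NormalDist

# Sketch of a standard two-proportion minimum detectable difference (MDD)
# calculation. The baseline response rate (~64 percent) and the size of the
# paper comparison group are illustrative assumptions; the text above reports
# only the resulting MDDs.

def minimum_detectable_difference(p, n1, n2, alpha=0.05, power=0.80):
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    return z * sqrt(p * (1 - p) * (1 / n1 + 1 / n2))

# Screener stage: 35,000 web cases vs. an assumed ~126,000 paper comparison cases.
print(minimum_detectable_difference(0.64, 35_000, 126_000))  # ~0.008, roughly 1 percentage point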

Targeted Incentive Experiment

Preliminary analysis of an experiment in the NHES-FS comparing no incentive to a $5 incentive suggests that there are certain types of households for which the incentive had a relatively limited impact on response rates. This means that NCES can potentially reduce operational costs without introducing bias by targeting specific households with a smaller incentive. At the other end of the spectrum, there are certain households that are less likely to respond. The Total Survey Error1 paradigm developed by Paul Biemer indicates that nonresponse bias should be reduced to enhance the quality of survey estimates; therefore, NCES could potentially improve the quality of survey estimates by offering a larger incentive to encourage responses from sampled households that might not otherwise respond. For the NHES:2016, NCES plans to create a response propensity model using data from the NHES-FS, the address-based sample frame, and Census data to identify suitable households to target with a smaller or larger incentive. The plans for the targeted incentive experiment are described in section A.9 of this document.

Seeded Sample of Certificate Holders

As part of its ongoing effort to evaluate new survey measures of non-degree credentials, the NHES:2016 will include a seeded sample of 1,000 known holders of educational certificates. This experiment will follow the model successfully used in the ATES Pilot Study of 2010, NATES:2013, and NHES:FS of including a small opportunity sample of credential holders to help evaluate the characteristics of “true” credentials and to assess the extent of false negative responses to the main certificate survey item. These cases will only receive the ATES topical survey—not the screener—and are not included in the main national sample of 206,000 screener cases for operations, weighting, or analysis.

A.3 Use of Improved Information Technology

The paper and pencil instruments in the NHES:2016 will be collected for NCES by the Census Bureau using three complementary survey systems - (1) Amgraf One Form Plus, (2) Docuprint, and (3) integrated Computer Assisted Data Entry (iCADE), chosen for their efficiency and accuracy in the data collection process.

  • Forms Design. Questionnaires will be created using Amgraf One Form Plus. Completed hardcopy forms can be processed by iCADE to capture responses through optical mark recognition (OMR) and keying from image (KFI). Questionnaires will be printed, trimmed, and stitched through an in-house print on-demand process using a Docuprint system which allows personalization of some survey items. The data from the questionnaires will be captured by the iCADE technology/software, which automatically extracts all check box entries (OMR) and captures and displays an image of all other entries to an operator for KFI.

  • Image Preprocessing. iCADE applies image preprocessing to the forms in their image format in order to correct any skewing at the time of scanning, and the iCADE software performs registration to align the individual questionnaire page template with the appropriate scanned image. The scanner despeckles the image to remove unwanted pixels.

  • Data Capture. iCADE reads the form image files, checks the presence of data, processes all check box fields through OMR, and presents an image of the handwritten fields to an operator for KFI.

  • Verification. Extracted KFI data are subject to 100% field validation according to project specifications. If a data value violates a validation rule, the value is flagged for review by verifiers, who interactively review the images and the corresponding extracted data and resolve validation errors. A generic sketch of this validation step follows the list below.

  • Archiving. Images will be scanned and archived to magnetic storage located on a secured server in case they are needed later. This eliminates the need to save paper copies of the completed questionnaires.
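The verification step can be pictured with a generic rule-based validation sketch such as the following. The fields and rules shown are hypothetical and do not reflect the Census Bureau’s actual project specifications.

# Generic illustration of rule-based field validation of the kind described in
# the Verification step above. The field names and rules here are hypothetical;
# the actual validation rules are defined in the project specifications.

VALIDATION_RULES = {
    "child_age": lambda v: v.isdigit() and 0 <= int(v) <= 20,
    "zip_code": lambda v: v.isdigit() and len(v) == 5,
}

def validate_record(record):
    """Return the fields whose keyed values violate a rule and must be flagged
    for interactive review against the scanned image."""
    flagged = []
    for field, rule in VALIDATION_RULES.items():
        value = record.get(field, "")
        if not rule(value):
            flagged.append(field)
    return flagged

# Example: a keyed age of "210" would be flagged for verifier review.
print(validate_record({"child_age": "210", "zip_code": "20782"}))  # ['child_age']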

The NHES:2016 web experiment described in section A.2 will be conducted to determine whether respondents will access and respond to a web-based screener instrument followed by a web-based topical survey. The web-based instruments are designed to minimize respondent burden by automating the cumbersome skip patterns that respondents must navigate themselves in the paper-and-pencil instruments. The instruments will be securely hosted on the NCES server, and regular updates on incomplete cases will be securely transmitted to the Census Bureau for nonresponse follow-up.

A.4 Efforts to Identify Duplication

PFI and ECPP

Population: Most other surveys do not address the topics covered in NHES for the populations of interest. For example, the Head Start Family and Child Experiences Survey (FACES) focuses on children in Head Start, whereas all children who have not yet started kindergarten are of interest in the ECPP Survey. Likewise, the National Survey of Early Care and Education (NSECE) focuses primarily on low income children and their program participation. The National Survey of Parents of Public School Students and Survey of Family and School Partnerships in Public Schools focus on parents of children in public schools. Those whose children attend private schools or are homeschooled are not represented. Some studies, such as the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B); the Early Childhood Longitudinal Study, Kindergarten Class of 1998-1999 (ECLS-K); and the Early Childhood Longitudinal Study-Kindergarten Class of 2010-11 (ECLS-K:2011) focus on single-year cohorts that are followed over time and therefore do not provide nationally representative data on different age groups. The NHES surveys are designed to complement these longitudinal collections with more frequent and more inclusive cross-sectional data.

Survey Content: Extant studies are limited in the content that they include relative to the goals of the NHES surveys. Studies such as the National Survey of America’s Families and the National Study of the Changing Workforce collect some information on child care or program participation, but their primary emphasis is on other topics, and the depth of information on early care and education experiences is limited. The Head Start FACES project collects information on Head Start program participation and some family measures, but does not account for all nonparental care and programs. The Current Population Survey October Education Supplement is limited to a relatively small number of items on education participation and does not address the roles that parents play in their children’s school, schoolwork, and home activities. Also, no nationally representative study other than the NHES collects detailed data on homeschooling.

Current Estimates and Measuring Change Over Time: Many of the extant surveys follow one cohort or periodic cohorts (e.g., the ECLS-K, Head Start FACES, NSECE) or are no longer conducted (e.g., the National Survey of America’s Families, Family Involvement in Education: A National Portrait). As a result, they cannot meet the NHES goal of providing up-to-date cross-sectional estimates and measures of change over time, both for children who have not yet started kindergarten and for children in kindergarten through 12th grade.

ATES

Senior policy officials in the Departments of Education, Commerce, and Labor, foundations including the Gates Foundation and Lumina, and research organizations such as the Georgetown Center for Education and the Workforce have recognized that there is a lack of valid statistical information on the prevalence of industry-recognized certifications and educational certificates and have called for the development of new data sources. A series of meetings during the fall of 2009 launched a broad effort to begin to define and enumerate these credentials. NCES conducted a review of research literature and data collections since the work of a previous Interagency Committee in 2000, from which NCES developed a bank of existing survey items on certifications (completed 11/2009) and educational certificates (completed 1/2010). This research found no surveys that adequately captured comprehensive data on the extent to which adults participate in training or non-Title IV credit-bearing education and attain non-degree credentials.

Because of these limitations in extant studies, and because of the household-based sampling of the NHES, NCES plans to conduct the PFI-E, PFI-H, ECPP, and ATES surveys under the NHES program. Appendix 3 contains a review of other surveys that cover topics similar to those in the NHES child surveys; the review shows that there is little overlap between the NHES and these other surveys. Although GEMEnA’s work has resulted in the addition of survey items on certifications and licenses to the Current Population Survey and other federal surveys, the ATES is the only survey that collects detailed information on the attainment of non-degree credentials from the general U.S. adult population.

A.5 Collection of Data from Small Businesses

Not applicable.

A.6 Consequences of Less Frequent Data Collection

This request is for clearance of the NHES:2016. Topics covered in the child-focused surveys proposed for this collection have been addressed in previous NHES administrations; repeating the surveys on a regular basis allows for analysis of trends over time. In the past, the NHES was administered on a biennial cycle. The last full NHES study was conducted in 2012. Rather than conduct a full-scale NHES in 2014, NCES decided to field the NHES Feasibility Study to evaluate the procedures needed to incorporate an adult-focused survey into the mail survey mode. Due to funding constraints, and in order to allow for developmental testing between cycles, NCES moved to a triennial NHES survey administration. The next full-scale NHES is projected to take place in 2019. NCES believes that this is the longest interval between administrations that will still allow the NHES to fulfill its purpose of tracking changes in key education estimates over time.

A.7 Special Circumstances of Data Collection

None of the special circumstances listed in the instructions for completing the supporting statement apply to NHES:2016.

A.8 Consultations Outside the Agency

A Technical Review Panel (TRP) comprising leading experts in survey methodology was established to provide input to the redesign of the NHES system. Most members of the panel met in February 2010 to discuss the proposed design for the field test, and their comments and suggestions led to changes reflected in this submission.


Technical Review Panel Participants and Their Affiliation at the Time of TRP Recruitment


Nancy Bates

U.S. Census Bureau

649 A. St. N.E.

Washington, DC 20002



Paul Beatty

National Center for Health Statistics

Division of Health Care Statistics

3311 Toledo Road,

Hyattsville, MD 20782



Johnny Blair

Survey Sampling and Methodology

Abt Associates Inc.

4550 Montgomery Avenue

Bethesda, MD 20814-3343



Stephen Blumberg

National Center for Health Statistics

3311 Toledo Road

Hyattsville, MD 20782



Mick Couper

Survey Research Center

University of Michigan

ISR, 426 Thompson Street

Ann Arbor, MI 48104



Don Dillman

Social and Economic Sciences Research Center, Professor

Washington State University

133 Wilson Hall

Pullman, WA 99164-4014



Robert Groves

Survey Research Center, Institute for Social Research

University of Michigan

426 Thompson Street

Ann Arbor, MI 48106-1248



Scott Keeter

Pew Research Center

1615 L. St. NW. Suite 700

Washington, DC 20036



Kristen Olsen

Survey Research and Methodology

University of Nebraska-Lincoln

201 N. 13th St.

Lincoln, NE 68588-0241



Roger Tourangeau

Joint Program in Survey Methodology

University of Maryland

1218 LeFrak Hall, University of Maryland

College Park, MD 20742



Gordon Willis

Division of Cancer Control / Population Sciences

National Cancer Institute

6130 Executive Blvd, MSC 7344, EPN 4005

Bethesda, MD 20892-7344




The content of the NHES:2016 child-focused topical surveys repeats the content developed for the NHES:2012 administration and prior NHES administrations. As a result, the PFI and ECPP surveys reflect the cumulative input of many experts in the field and past NHES Technical Review Panels. In order to ensure that the ECPP and PFI surveys address important issues in the topical areas of interest and incorporate important emerging issues, the design phase of the 2012 study included consultations with experts in the substantive areas addressed in the surveys. These experts included persons in government agencies, academe, and research organizations.


Substantive Experts: ECPP and Their Affiliation at the Time of TRP Recruitment


Jerry West - Mathematica

Mathematica Policy Research, Inc.

600 Maryland Ave., SW, Suite 550

Washington, DC 20024-2512



Ann Collins – Abt Assoc. Cambridge, MA

Abt Associates Inc.

55 Wheeler Street

Cambridge, MA 02138-1168



Ron Haskins – Brookings Institution and Casey Foundation

The Brookings Institution

1775 Massachusetts Ave., NW

Washington, DC 20036



Ivelisse Martinez-Beck – HHS Division of Child and Family Development

Administration for Children and Families

370 L’Enfant Promenade, S.W.

7th Floor West, Room 7A011

Washington, D.C. 20447



Lynda Laughlin – Census

U.S. Census Bureau

4600 Silver Hill Road

Suitland, MD 20746



Substantive Experts: PFI and Their Affiliation at the Time of TRP Recruitment


Richard Brandon – Univ. of Washington

Human Services Policy Center, Evans School of Public Affairs

University of Washington

1107 NE 45th St.

Seattle, WA 98105



Annette Lareau – Univ. of Pennsylvania

Department of Sociology

University of Pennsylvania

McNeil Hall

Philadelphia, PA 19104



Joyce Epstein – The Johns Hopkins University

Center for Social Organization of Schools

3003 N. Charles St., Suite 200

Baltimore, MD 21218



Lawrence Aber - NYU

Steinhardt School of Culture, Education, and Human Development

New York University

82 Washington Square East

New York, NY 10003



As noted above, the ATES is a product of ongoing work guided by GEMEnA, which has met monthly since October 2009 and consists of senior staff from the Bureau of the Census, the Bureau of Labor Statistics, the Council of Economic Advisors, the National Center for Education Statistics, the National Center for Science and Engineering Statistics, the Office of Statistical and Science Policy (OMB), and the Office of the Under Secretary of Education. In addition, GEMEnA established an Expert Panel of substantive experts in the fields of workforce education, economic development, and non-degree credentials that met in November of 2012 and March and December of 2014 to provide input on ATES content.


Survey and Methodology Experts: GEMEnA Member Agency Representatives

Census Bureau

Kurt Bauman

James Spletzer


Bureau of Labor Statistics

Dori Allard

Harley Frazis


National Center for Science and Engineering Statistics

Dan Foley

John Finamore


Council of Economic Advisors

Jordan Matsudairas



OMB Office of Statistical and Science Policy

Shelly Martinez


Department of Education – Office of the Under Secretary

Jon O’Bergh




National Center for Education Statistics

Sharon Boivin

Lisa Hudson

Kashka Kubzdela

Sarah Grady

Andy Zukerberg



Substantive Experts: GEMEnA Expert Panel Members

Jim Van Erden
Senior Policy Advisor
National Association of State Workforce Agencies/
Information Technology Support Center
Washington, DC

Evelyn Ganzglass
Director of Workforce Development
CLASP
Washington, DC

Parminder Jassal
Executive Director
ACT Foundation
Austin, TX

Morris Kleiner
Professor and Director of Graduate Studies
Humphrey School of Public Affairs
University of Minnesota
Minneapolis, MN

James Parker
Senior Research and Policy Associate
Council for Advancement of Adult Literacy
New York City, NY

Kent Phillippe
Associate Vice President, 
Research and Student Success
American Association of Community Colleges
Washington, DC



Kenneth Poole
CEO/President
Center for Regional Economic Competitiveness
Arlington, VA

Andrew Reamer
Research Professor
George Washington Institute of Public Policy
George Washington University
Washington, DC

Jesse Rothstein
Associate Professor of Public Policy and Economics
Richard & Rhoda Goldman School of Public Policy
University of California, Berkeley
Berkeley, CA

Jeff Strohl
Director of Research
Center on Education and the Workforce
Georgetown Public Policy Institute
Georgetown University
Washington, DC

Michelle Van Noy
Researcher
Heldrich Center for Workforce Development
Rutgers, The State University of New Jersey
New Brunswick, NJ

Holly Zanville
Senior Research Officer
Lumina Foundation
Indianapolis, IN


A.9 Payments to Respondents

Screener incentives. The NHES:2003 included an extensive experiment in the use of small cash incentives to improve unit response. The experiment demonstrated that gains in respondent cooperation could be realized with relatively modest cash incentives (Brick et al. 2006). Such incentives were used in NHES:2005 and NHES:2007. The NHES:2011 Field Test included a screener-level incentive experiment testing the effect on response rates of including a $2 versus a $5 cash incentive in the initial screener mailing. The $5 incentive was associated with higher response rates than the $2 incentive, so the $5 incentive was used in the NHES:2012. Results from the NHES-FS indicate that the $5 incentive is associated with higher response rates than no incentive. We will continue with this approach in NHES:2016 and use a $5 cash incentive in the first screener questionnaire mailing, except for households allocated to the treatment groups for the incentive experiment (described in detail below and in section A-2 above).

Topical incentives. The NHES:2012 included an incentive experiment at the topical level to further refine an optimal strategy for the use of incentives in the NHES. For those households in which a child was selected as the subject of an ECPP or PFI questionnaire, cases that responded to the first or second mailing of the screener received a $5 cash incentive with the initial topical survey mailing. Evidence from the 2011 Field Test indicated that topical response rates could benefit significantly by providing later screener respondents with a larger topical incentive. To confirm this finding, NCES subsampled late screener respondents (those responding to the 3rd or 4th questionnaire mailing) to receive either a $5 or $15 cash incentive with their first topical survey mailing. The results from the NHES:2012 indicate that among later screener responders, the $15 incentive was associated with higher response rates compared to the $5 incentive. Based on these findings, the same strategy is planned for NHES:2016. NCES will send a $5 cash incentive in the initial topical mailing to cases that respond to the first or second screener mailing and a $15 cash incentive in the initial topical mailing to any cases that respond later than three days after the third screener mailing.

Targeted incentive experiment incentives. As described in section A-2, the NHES:2016 will also include a targeted incentive experiment designed to examine the effectiveness of leveraging auxiliary frame data to target lower screener incentives to households expected to be most likely to respond regardless of incentive amount, and higher screener incentives to households expected to be less likely to respond. The plan for the screener incentive experiment is described below.

The allocation to the targeted incentive experiment totals 45,000 households, of which 35,000 will be randomly selected for the “modeled incentive treatment group” and 10,000 will be randomly selected for the “$2-only treatment group.” Addresses in the modeled incentive treatment group will be placed in one of four treatments, depending on the predicted response propensity attributed to that address. To predict response propensity for sample members, a logistic regression model was estimated using the 2014 NHES-FS screener sample, with each household’s final screener response status (respondent or nonrespondent) as the dependent variable and a set of characteristics appended to the frame as the independent variables. These characteristics come from block group- or tract-level Census data and commercially available auxiliary data and include the race/ethnicity makeup of the Census block group or tract, the age makeup of the Census block group or tract, whether or not the address has a phone number match, and the Census Bureau’s “low response score” for the address’s block group or tract. The resulting model coefficients will be applied to the NHES:2016 cases assigned to the modeled incentive group, generating a predicted response propensity for each address. Households in the modeled incentive group will be sorted by their predicted response propensities and divided into four strata:

  1. a very high-propensity stratum (those whose predicted response propensity is above the 95th percentile—having response rates above 90 percent),

  2. a high-propensity stratum (those whose predicted response propensity is between the 75th and 95th percentile—having response rates between 80 and 90 percent),

  3. a medium-propensity stratum (those whose predicted response propensity is between the 15th and 74th percentiles—having response rates approximately equal to the overall NHES-FS screener response rate of 69 percent),

  4. and a low-propensity stratum (those whose predicted response propensity is below the 15th percentile—having response rates below 50 percent).

Households in the very high-propensity stratum will receive $0 with the NHES screener. Households in the high-propensity stratum will receive a $2 screener incentive, and households in the low-propensity stratum will receive $10. The remaining households in the modeled incentive group (the medium-propensity stratum) will receive the same $5 screener incentive as the main NHES sample. The treatment group will be compared to the control group in the main NHES sample (which will receive the standard $5 screener incentive) on the following dimensions: screener response rate (overall and by key subgroups); topical response rate; cost per unit (incentive + mailing costs only); demographic characteristics (from both frame and topical data); responses to key survey items; differences between respondents and nonrespondents in frame indicators used to target the incentives; and response quality among respondents.
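For illustration only, the following sketch shows one way the modeled-incentive scoring and assignment described above could be operationalized. The variable names are hypothetical stand-ins for the frame characteristics listed in the text, and the production model specification is determined by the 2014 NHES-FS data rather than by this example.

import pandas as pd
from sklearn.linear_model import LogisticRegression

# Illustrative sketch of the modeled-incentive assignment described above.
# Frame variable names are hypothetical stand-ins for the block group- or
# tract-level characteristics listed in the text.

FRAME_VARS = ["pct_hispanic", "pct_under_18", "has_phone_match", "low_response_score"]

def fit_propensity_model(fs_frame: pd.DataFrame) -> LogisticRegression:
    """Fit final screener response status (1 = respondent) on frame characteristics
    using the 2014 NHES-FS screener sample."""
    model = LogisticRegression(max_iter=1000)
    model.fit(fs_frame[FRAME_VARS], fs_frame["responded"])
    return model

def assign_screener_incentives(model: LogisticRegression, sample: pd.DataFrame) -> pd.Series:
    """Score NHES:2016 modeled-incentive cases and map the percentile strata
    described above to screener incentive amounts (in dollars)."""
    propensity = model.predict_proba(sample[FRAME_VARS])[:, 1]
    pct = pd.Series(propensity, index=sample.index).rank(pct=True)
    incentive = pd.Series(5, index=sample.index)      # medium stratum (15th-74th percentile): $5
    incentive[pct > 0.95] = 0                         # very high propensity: $0
    incentive[(pct > 0.75) & (pct <= 0.95)] = 2       # high propensity: $2
    incentive[pct < 0.15] = 10                        # low propensity: $10
    return incentive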

We had initially planned to send $7 to the quarter of households with the lowest response propensity, but we refined the design to target only the lowest 15 percent and to raise the amount to $10. Outside experts on incentives, including Dr. Paul Lavrakas, advised us that $7 is not likely to have a different effect on respondents than $5.


To truly test whether an increased incentive improves response among low response propensity households, we plan to test $10 in this group. Ten dollars was chosen because it is currently the maximum amount that is feasible to pay an NHES screener respondent. The results of this experiment will therefore show whether a larger incentive will entice these households to respond. If these households do not respond at higher rates to $10 than to $5, we will know that increased incentives are not successful for this group and will seek alternative contact procedures or other survey design features to attract these respondents in future NHES administrations. Had we tested an amount below the maximum we are willing to pay, we would not know whether a larger amount would have been sufficient to increase response, and we would instead have to consider more costly contact procedures, such as in-person data collection, to attract the low response propensity cases.


We also believe that testing $10 with the lowest response propensity group is important because this group contains a higher proportion of households with children. Almost 39 percent of households in the lowest response propensity group are predicted to have a child eligible for either the NHES Early Childhood Program Participation survey or the Parent and Family Involvement in Education survey. This is a higher proportion of child survey eligibility than exists in any other response propensity group, and is also higher than the eligibility rate for the total NHES:FS sample of 30.7 percent. See Table 1 below for details.


Table 1. NHES child survey eligibility rates by predicted household response propensity score thresholds, incentive groups, and NHES Feasibility Study:2014 response rates

Predicted response propensity | Incentive amount | NHES child survey eligibility rate | NHES:FS screener response rate
Total across all households |  | 30.7 | 68.7
Very high (above 95th percentile) | $0 (1,750) | 5.9 | 91.5
High (75th to 95th percentile) | $2 (7,000) | 19.3 | 83.8
Medium (15th to 74th percentile) | $5 (21,000) | 36.3 | 66.2
Low (below 15th percentile) | $10 (5,250) | 38.9 | 45.8

NOTE: Numbers in parentheses show the count of screener households assigned to receive each incentive amount. Child survey eligibility rate proportions are estimated using the NHES:FS nonresponse-adjusted screener weights, and all differences between the groups are statistically significant with p<.01.



In addition to the subsample assigned to the modeled incentive treatment group, a subsample of 10,000 households (referred to as the “$2-only treatment group”) will be assigned to automatically receive $2. This will allow response rates under the targeted incentive structure to be compared to those under both a uniform $5 structure and a uniform $2 structure. It will also allow more detailed analysis of the sensitivity of different types of households to the incentive amount in order to determine whether the targeted incentive approach could be further refined in future administrations. We hypothesize that the data will show that we could use a $2 incentive for a larger proportion of the household sample in future NHES administrations.

The incentive experiment will take place at the screener level only. However, households that receive $10 at the screener stage will also receive a $10 topical incentive (unless they return a screener at the third mailing wave or after, in which case they will receive a $15 topical incentive, as was done successfully in the NHES:2012 to boost topical response among late-screener respondents). Table 2 summarizes the incentive structure and sample size for each control and treatment group in the targeted incentive experiment.


Table 2. Incentive amounts and paper screener sample sizes for targeted incentive experiment control and treatment groups: NHES:2016 Full Scale Data Collection

Predicted response propensity | $5-only control group/main collection | Modeled incentive treatment group | $2-only treatment group
Very high (above 95th percentile) | $5 (6,300) | $0 (1,750) | $2 (500)
High (75th to 95th percentile) | $5 (25,200) | $2 (7,000) | $2 (2,000)
Medium (15th to 74th percentile) | $5 (75,600) | $5 (21,000) | $2 (6,000)
Low (below 15th percentile) | $5 (18,900) | $10 (5,250) | $2 (1,500)

NOTE: Numbers in parentheses show the count of screener households assigned to receive each incentive amount. Counts exclude web screener cases. All 35,000 web screener cases will receive the $5 incentive.


Screener and topical response rates within the modeled incentive treatment group will be compared to those within the $5-only control group and the $2-only treatment group. In addition, key characteristics of respondents will be compared between the groups to determine the possible effect of the modeled incentive approach on unit nonresponse bias. At the screener level, the treatment group allocations will allow a response rate difference of about 1 percentage point or greater between the entire 35,000-household modeled incentive group and the main non-experimental NHES sample of 126,000 households, and a difference of about 2.5 percentage points or greater between the modeled incentive group and the random $2-only group, to be statistically detected with 80 percent power. At the topical level, the detectable response rate difference will be approximately 2.5 percentage points between the modeled incentive group and the main non-experimental NHES sample, and approximately 5 percentage points between the modeled incentive group and the random $2-only group.

Web experiment incentives. Only households that are not selected for the web experiment will be eligible for the targeted incentive experiment; all households selected for the web experiment will receive the standard $5 screener incentive. Web experiment cases will also receive topical incentives2 comparable to those in the main collection, with the exception that web experiment respondents who complete the screener and topical surveys without a break-off or a change in the sampled household member will not receive a topical incentive.

A.10 Assurance of Confidentiality

Respondents will be informed of the voluntary nature of the survey and of the confidentiality provision in the initial cover letter and on the questionnaires, stating that their responses may be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S. Code § 9573].

Additionally, all staff members and subcontractors working on the NHES and having access to the data are required to sign the NCES Affidavit of Nondisclosure. Notarized affidavits are kept on file by the contractor and submitted to NCES quarterly. In addition, all contractor staff members who have access to confidential data and work on the project more than 30 days are required to have a federal background check.

A.11 Sensitive Questions

The NHES is a voluntary survey, and no persons are required to respond to it. In addition, respondents may decline to answer any question in the survey. Respondents are informed of the voluntary nature of the survey in the cover letter that accompanies the questionnaire, as well as on the actual questionnaire. At the same time, some items in the surveys may be considered sensitive by some respondents:

ATES: The ATES includes an experiment asking about earnings. This information may be considered sensitive:

  • Personal earnings in the past year (categorical)

A measure of earnings is important because educational attainment is statistically associated with earnings, and the empirical properties of the survey measures may differ for people with different earnings levels. The American Community Survey (ACS) was the source for most of the ATES employment and background items. Item response rates for earnings questions were reasonably high in the 2014 NHES Feasibility Study. The item response rate for personal earnings was 96.4 percent.

PFI and ECPP: Child development and education experts consider economic disadvantage and children’s disabilities to be important factors in children’s school experiences and their activities outside of school. As a result, the child surveys contain measures of these characteristics, including:

  • Household income;

  • Receipt of public assistance in the form of Temporary Assistance to Needy Families (TANF), food stamps, and the Women, Infants, and Children program (WIC); and

  • Children’s disability conditions.

Measures of household income and government assistance are important because access to early childhood programs by at-risk children and the education involvement of families of children from different socioeconomic backgrounds are of interest to policymakers, child development specialists, and educators. These items are important in identifying children at risk and have been administered successfully in previous NHES studies. Respondents are also asked the age at which they first became a parent, which may be sensitive for parents in some situations.

The 2012 response rates for these items were very high. For total household income, the 2012 PFI survey had an item response rate of 95.4 percent. Item response rates for receipt of public assistance were also high: for Temporary Assistance to Needy Families, 97.9 percent; for the Women, Infants, and Children program, 97.7 percent; and for food stamps, 98.4 percent. In the 2012 mail survey, it is not possible to examine item missing data for child disability because of the multiple-response list format of the question; missing data may indicate either unreported data or that the child does not have a disability. However, in prior NHES collections, response to this item was high: in the 2007 PFI, the item response rates were over 99 percent. In the 2012 PFI, the item response rate for the age at which the child’s parent first became a parent to any child was 96.2 percent for the first parent reported and 96.0 percent for the second parent reported.

ECPP Survey: In addition to the items above, the ECPP survey also includes questions about assistance to pay for child care. This measure is important to understand families’ and children’s access to early childhood programs.

PFI Survey: The PFI survey includes items concerning children’s school performance and difficulties in school. Among these are:

  • Children’s school performance and difficulties, including school grades, grade retention, suspensions, and expulsions; and

  • Identification of children’s schools.

Items concerning school performance and difficulty are important to the PFI survey as correlates of parent and family involvement in children’s education. These items were asked in the NHES:2012 PFI and item response rates for these items were high: 99.0 percent for children’s grades, 97.6 percent for out-of-school suspension, and 97.5 percent for expulsion.

Another element of the surveys that may be sensitive to some parents is the identification of children’s schools. This feature allows analysts to link the NHES data to other NCES datasets containing additional information about schools, greatly enhancing the ability to examine the relationships between students’ and families’ experiences and the characteristics of schools. The item response rate for the identification of the child’s school was 97.0 percent in NHES:2012.

A.12 Estimated Response Burden

The NHES:2016 will screen 206,000 households. An expected screener response rate of approximately 64.021 percent and an address ineligibility3 rate of approximately 9.375 percent are assumed, bringing the total number of expected screeners to 119,520.4 From these completed screeners, it is expected that approximately 47.411 percent or 56,666 households will contain an eligible adult but no eligible children; approximately 30.4 percent or 36,333 households will contain an eligible adult and an eligible child; and approximately 1.25 percent or 1,494 households will contain an eligible child but no eligible adults (for example, children who live with grandparents above age 65)5. The seeded sample will not receive a screener questionnaire and is in addition to the screener sample of 206,000 households. A detailed description of the planned sampling design is provided in this submission in Supporting Statement Part B.
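The expected screener yield described above follows from straightforward arithmetic, reproduced approximately in the short sketch below; the exact counts in the text reflect stratum-specific eligibility and response rates.

# Reproducing the expected screener yield arithmetic described above (figures
# are approximate; the text's exact counts reflect stratum-specific rates).
SAMPLE = 206_000
INELIGIBLE_RATE = 0.09375   # undeliverable addresses
SCREENER_RR = 0.64021

eligible_addresses = SAMPLE * (1 - INELIGIBLE_RATE)      # ~186,688
completed_screeners = eligible_addresses * SCREENER_RR   # ~119,520

adult_only = completed_screeners * 0.47411               # ~56,666 households
adult_and_child = completed_screeners * 0.304            # ~36,333 households
child_only = completed_screeners * 0.0125                # ~1,494 households

print(round(eligible_addresses), round(completed_screeners))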

The response burden per instrument and the total response burden are shown in Table 3. The administration times for the screener, adult questionnaires, and child questionnaires are based on practice administrations and past experience. The expected number of respondents and number of responses are based on the expected numbers of completed surveys of each type, discussed in section B.1.3. The hourly rate of $22.65 is based on the average for all civilian workers from the December 2014 National Compensation Survey (http://www.bls.gov/news.release/ecec.t02.htm). For the NHES:2016, a total of 32,029 burden hours are anticipated, resulting in a burden cost to respondents of approximately $725,457.


Table 3. Estimated response burden for NHES:2016

Interview form | Estimated time (minutes) | Number sampled | Anticipated response rate | Estimated number of respondents | Estimated number of responses | Total time (hours)
Screener | 8 | 206,000* | 64.021% | 119,520 | 119,520 | 15,936
ATES questionnaire - national sample | 10 | 63,855 | 74.245% | 47,409 | 47,409 | 7,902
ATES questionnaire - seeded sample | 10 | 1,000 | 60.000% | 600 | 600 | 100
ECPP questionnaire | 20 | 9,540 | 79.245% | 7,560 | 7,560 | 2,520
PFI-Enrolled questionnaire | 20 | 20,224 | 79.242% | 16,026 | 16,026 | 5,342
PFI-Homeschooled questionnaire | 20 | 869 | 79.171% | 688 | 688 | 229
Study total |  |  |  | 191,803 | 191,803 | 32,029

* Approximately 9.375% of addresses will be returned by USPS as invalid, reducing the final sample size to 186,688 addresses. Calculations of number of screener respondents are based on 186,688 addresses rather than 206,000.

NOTE: Eligibility and response rates for the national sample are estimated based on NHES:2012 and the 2014 NHES Feasibility Study (NHES-FS), and represent rounded weighted averages of the rates expected within each experimental treatment group. The response rate for the ATES seeded sample is approximated based on the response rate to the NHES-FS seeded sample. Details may not sum to totals due to rounding.
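The burden totals in Table 3 and the estimated burden cost can be reproduced with the arithmetic sketched below, using the per-response times and response counts from the table.

# Reproducing the burden arithmetic in Table 3 and the cost estimate in the text.
HOURLY_RATE = 22.65  # average for all civilian workers, December 2014 National Compensation Survey

# (estimated minutes per response, estimated number of responses) from Table 3
ROWS = {
    "Screener": (8, 119_520),
    "ATES - national sample": (10, 47_409),
    "ATES - seeded sample": (10, 600),
    "ECPP": (20, 7_560),
    "PFI-Enrolled": (20, 16_026),
    "PFI-Homeschooled": (20, 688),
}

total_hours = round(sum(minutes * n / 60 for minutes, n in ROWS.values()))
print(total_hours)                        # 32029 burden hours
print(round(total_hours * HOURLY_RATE))   # 725457 dollars of burden cost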



A.13 Cost to Respondents

There are no recordkeeping requirements associated with the NHES and no costs to respondents beyond the time to participate presented in Table 3 above.

A.14 Cost to the Federal Government

The total cost of the NHES:2016 to the federal government is approximately $11.8 million over a period of 20 months. This includes all direct and indirect costs of the design, data collection, analysis, and reporting phases of the study, as well as the creation of data sets.

A.15 Reasons for Program Changes

This is a revision of the NHES collection. The decrease in the estimated respondent burden compared to that approved for the NHES:2015 results from the NHES:2015 having been planned to include five topical surveys (PFI-Enrolled, PFI-Homeschooled, ECPP, CWS, and TWS), while the NHES:2016 will include four topical surveys (PFI-Enrolled, PFI-Homeschooled, ECPP, and ATES). The initially planned Credentials for Work Survey (CWS) and the pilot Training for Work Survey (TWS) have been combined for the NHES:2016 into a single, more time-efficient ATES instrument.

A.16 Publication Plans and Project Schedule

Exhibit 2 presents the schedule of project activities for NHES:2016. Based on the results of NHES:2016, datasets, statistics, and reports will be produced. The following are the planned outcomes of the NHES:2016:

  • A fully documented public-use data set that will be available for download from the NCES website;

  • A fully documented restricted-use data set that will be available for restricted-use data license holders only;

  • A codebook with weighted and unweighted frequencies of all variables; and

  • First Look Reports that highlight key findings from the study.

Exhibit 2. NHES:2016 schedule of major activities

Task | Date of scheduled conduct/completion
Survey instruments formatting and printing | November-December 2015
Data collection begins (advance letter mailing) | January 2, 2016
Data collection ends | August 31, 2016
Public-use data files released | August 31, 2017
Restricted-use data files released | September 30, 2017


A.17 Approval to Not Display the Expiration Date for OMB Approval

The OMB authorization number and expiration date will be displayed on the questionnaires and web screener.

A.18 Exceptions to the Certification Statement

There are no exceptions to the certification statement.

1 Biemer, P. (2010). Total Survey Error: Design, Implementation, and Evaluation. Public Opinion Quarterly, 74 (5): 817-848.

2 Web experiment cases will receive the standard $5 or $15 topical incentive depending on which mailing of the letter prompted their web response.

3 Ineligible addresses are those that are undeliverable. An address is coded as ineligible when one or more of its screener mailings are returned as a postmaster return (PMR) and no mailings are returned as completed or refused.

4 Address eligibility and response rates are estimated based on the 2014 NHES Feasibility Study (NHES-FS), and are calculated to account for expected differential response rates within sampling strata and experimental treatment groups.

5 Percentages based on estimates from the 2014 NHES Feasibility Study (NHES-FS) screener.


