
June 2011




Education Longitudinal Study: 2002
(ELS:2002)


Third Follow-up 2011 Field Test


OMB Supporting Statement

Part A









OMB# 1850-0652 v.7










National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education



TABLE OF CONTENTS


List of Appendixes

Appendix 1. Draft Field Test Questionnaire

Appendix 2. Reliability Re-interview Questionnaire

Appendix 3. Commissioned Paper: Randall J. Olsen

Appendix 4. Commissioned Paper: Michael Shanahan

Appendix 5. Commissioned Items: Robert Lent

Appendix 6. Data Collection Materials (Brochure, Lead Letters)

Appendix 7. Cognitive Labs Report Summary

Appendix 8. Results of the Panel Maintenance Incentive Experiment


Preface

This request concerns the third follow-up of the Education Longitudinal Study: 2002 (ELS:2002), an ongoing longitudinal study with a field test in 2011 and a full-scale data collection in 2012. This document requests clearance for field test and main study data collection activities and supplements earlier requests concerned with direct locating and contacting of individual respondents or their parents. ELS:2002 is being conducted by RTI International under contract to the U.S. Department of Education (contract number ED-04-CO-0036/0004).

More specifically, this clearance request seeks Office of Management and Budget (OMB) approval for the field test questionnaires and for incentive experiments to be implemented in the data collection phase of the project. The request includes the estimated burden to respondents for the field test and full-scale studies. Additionally, the document contains a request for a waiver of the 60-day Federal Register notice for the full-scale study clearance to be submitted in 2011. It should be noted that generic clearance for cognitive testing of new and revised questionnaire items was requested separately, in a June 2010 submission (field test), and may be requested, if needed, in the September 2011 submission (full-scale) under OMB# 1850-0803.

The ELS:2002 study involves computer-assisted data collection (web, telephone, and field) with sample members who participated in the base-year or first follow-up ELS:2002 study (a subset of whom also participated in the second follow-up). The study will also involve the collection of postsecondary education transcripts for the cohorts in 2013–14. Full details for the transcript collection will be submitted to OMB in the full-scale data collection clearance package.

In this supporting statement, we describe the purposes of the study, review the overall design, describe the field test and full-scale data collection procedures, and explain how the collected information addresses the statutory provisions of the Education Sciences Reform Act of 2002 (P.L. 107-279). Subsequent sections of this document respond to OMB instructions for preparing supporting statements. Section A addresses OMB’s specific instructions for justification and provides an overview of the study’s design. The draft questionnaire is appended to this submission and is represented by topic area in the justifications portion of Section A. Section B describes the collection of information and statistical methods.

A. Justification of the Study

A.1 Circumstances Making Collection of Information Necessary

A.1.a Purpose of This Submission

The materials in this document support a request for clearance to conduct the third follow-up of ELS:2002. The basic components and key design features of ELS:2002 are summarized below.

Base Year

  • Baseline survey of high school sophomores, in spring term 2002 (field test in spring term 2001).

  • Assessments in Reading and Mathematics.

  • Parents and English and math teachers were surveyed in the base year. School administrator questionnaires were collected.

  • Additional components for this study included a school facilities checklist and a media center (library) questionnaire.

  • Sample sizes were about 750 schools and approximately 17,600 students (15,300 base-year respondents). Schools were the first-stage unit of selection, with sophomores randomly selected within schools.

  • Oversampling of Asian American students and of private schools.

  • Design linkages (test concordances) with other assessment programs—the Program for International Student Assessment (PISA) and the National Assessment of Educational Progress (NAEP)—and test score reporting linkages to the prior longitudinal studies.

First Follow-up

  • Follow-up in spring 2004 (spring 2003 for field test), when most sample members were seniors, but some were dropouts or enrolled in other grades.

  • Student questionnaires, dropout questionnaires, cognitive tests, and school administrator questionnaires administered.

  • Returned to the same schools for data collection, but separately followed transfer students.

  • Sample members who were no longer in school were followed by telephone (computer-assisted telephone interview; CATI) or field (computer-assisted personal interview; CAPI) data collection.

  • Freshening for a nationally representative senior cohort.

  • High school transcript component in fall/winter, 2004–05 (2003–04 for field test).

Second Follow-up

  • Follow-up in spring 2006 (spring 2005 for field test) using web-based self-administered instrument with telephone (CATI) and field (CAPI) data collection for nonresponse follow-up.

  • Focus on transition to postsecondary education, labor force participation, and family formation, with emphasis on postsecondary access and choice.

Third Follow-up

  • Follow-up in summer 2012 (summer 2011 field test) using web-based self-administered instrument with telephone (CATI) and field (CAPI) data collection for nonresponse follow-up.

  • Collection of postsecondary transcripts.

  • Focus on postsecondary education, labor force participation, and family formation, with emphasis on college persistence and attainment.

The third follow-up study will provide data to map and understand the outcomes of the high school cohorts’ transition to adult roles and statuses at about age 26. For the cohort as a whole, the third follow-up will obtain information that will permit researchers and policymakers to better understand issues of postsecondary persistence and attainment, as well as the sub-baccalaureate (and, to a more limited degree, baccalaureate) rates of economic and noneconomic return on investments in education. The third follow-up will also provide information about high school completion (for students who dropped out or were held back) and the status of dropouts, late completers, and students who have obtained an alternative credential, such as the GED. Finally, for both college-bound and non-college-bound students, the third follow-up will map their labor market activities and family formation.

For many cohort members, complex pathways, with alternative timings and durations for work and postsecondary enrollment, will be followed at this point of transition. In the 6-year period since the previous round, a sample member may both have worked and attended school, either serially or simultaneously; a cohort member may have attended school part-time or full-time and combined education and work spells with marriage and family formation. The singular strength of longitudinal studies is their power to provide data on transitions that are both complex and of some duration. The transition from adolescence to adult roles—and in particular, the transition to and through postsecondary education, and to labor force activity, and family formation—is of this very type.

A.1.b Legislative Authorization

The National Center for Education Statistics (NCES) of the Institute of Education Sciences (IES), U.S. Department of Education, is conducting this study, as authorized under Section 153 of the Education Sciences Reform Act of 2002 (P.L. 107-279), which requires NCES to:

“collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including—

(1) collecting, acquiring, compiling (where appropriate, on a state-by-state basis), and disseminating full and complete statistics (disaggregated by the population characteristics described in paragraph (3)) on the condition and progress of education, at the preschool, elementary, secondary, postsecondary, and adult levels in the United States, including data on—

(A) state and local education reform activities;

(B) student achievement in, at a minimum, the core academic areas of reading, mathematics, and science at all levels of education;

(C) secondary school completions, dropouts, and adult literacy and reading skills;

(D) access to, and opportunity for, postsecondary education, including data on financial aid to postsecondary students;

(E) teaching, including—

(i) data on in-service professional development, including a comparison of courses taken in the core academic areas of reading, mathematics, and science with courses in noncore academic areas, including technology courses; and

(ii) the percentage of teachers who are highly qualified (as such term is defined in section 9101 of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 7801)) in each state and, where feasible, in each local educational agency and school;

(F) instruction, the conditions of the education workplace, and the supply of, and demand for, teachers;

(G) the incidence, frequency, seriousness, and nature of violence affecting students, school personnel, and other individuals participating in school activities, as well as other indices of school safety, including information regarding—

(i) the relationship between victims and perpetrators;

(ii) demographic characteristics of the victims and perpetrators; and

(iii) the type of weapons used in incidents, as classified in the Uniform Crime Reports of the Federal Bureau of Investigation;

(H) the financing and management of education, including data on revenues and expenditures;

(I) the social and economic status of children, including their academic achievement;

(J) the existence and use of educational technology and access to the Internet by students and teachers in elementary schools and secondary schools;…

(L) the availability of, and access to, before-school and after-school programs (including such programs during school recesses);

(M) student participation in and completion of secondary and postsecondary vocational and technical education programs by specific program area; and

(N) the existence and use of school libraries;

(2) conducting and publishing reports on the meaning and significance of the statistics described in paragraph (1);

(3) collecting, analyzing, cross-tabulating, and reporting, to the extent feasible, information by gender, race, ethnicity, socioeconomic status, limited English proficiency, mobility, disability, urban, rural, suburban districts, and other population characteristics, when such disaggregated information will facilitate educational and policy decisionmaking;…

(6) acquiring and disseminating data on educational activities and student achievement (such as the Third International Math and Science Study) in the United States compared with foreign nations;

(7) conducting longitudinal and special data collections necessary to report on the condition and progress of education;”

Section 183 of the Education Sciences Reform Act of 2002 further states that:

“…all collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and Center of the Institute, shall conform with the requirements of section 552a of title 5, United States Code [which protects the confidentiality rights of individual respondents with regard to the data collected, reported, and published under this title].”

A.1.c Prior and Related Studies

In 1970, NCES initiated a program of longitudinal high school studies. Its purpose was to gather time-series data on nationally representative samples of high school students that would be pertinent to the formulation and evaluation of education policies.

Starting in 1972, with the National Longitudinal Study of the High School Class of 1972 (NLS:72), NCES began providing education policymakers and researchers with longitudinal data that linked education experiences with later outcomes, such as early labor market experiences and postsecondary education enrollment and attainment. The NLS:72 cohort of high school seniors was surveyed six times (in 1972, 1973, 1974, 1976, 1979, and 1986). A wide variety of questionnaire data were collected in the follow-up surveys, including data on students’ family background, schools attended, labor force participation, family formation, and job satisfaction. In addition, postsecondary transcripts were collected.

Almost 10 years later, in 1980, the second in a series of NCES longitudinal surveys was launched, this time starting with two high school cohorts. High School and Beyond (HS&B) included one cohort of high school seniors comparable to the seniors in NLS:72. The second cohort within HS&B extended the age span and analytical range of NCES’s longitudinal studies by surveying a sample of high school sophomores. With the sophomore cohort, information became available to study the relationship between early high school experiences and students’ subsequent education experiences in high school. For the first time, national data were available showing students’ academic growth over time and how family, community, school, and classroom factors promoted or inhibited student learning. In a leap forward for education studies, researchers, using data from the extensive battery of cognitive tests within HS&B, were also able to assess the growth of cognitive abilities over time. Moreover, data were now available to analyze the school experiences of students who later dropped out of high school. These data became a rich resource for policymakers and researchers over the next decade and provided an empirical base to inform the debates of the education reform movement that began in the early 1980s. Both cohorts of HS&B participants were resurveyed in 1982, 1984, and 1986. The sophomore cohort was also resurveyed in 1992. Postsecondary transcripts were collected for both cohorts.

The third longitudinal study of students sponsored by NCES was the National Education Longitudinal Study of 1988 (NELS:88). NELS:88 further extended the age and grade span of NCES longitudinal studies by beginning the data collection with a cohort of eighth-graders. Along with the student survey, it included surveys of parents, teachers, and school administrators. It was designed not only to follow a single cohort of students over time (as had NCES’s earlier longitudinal studies, NLS:72 and HS&B), but also, by “freshening” the sample at each of the first two follow-ups, to follow three nationally representative grade cohorts over time (8th-, 10th-, and 12th-grade cohorts). This not only provided comparability of NELS:88 to the existing cohorts but also enabled researchers to conduct both cross-sectional and longitudinal analyses of the data. In 1993, high school transcripts were collected, further increasing the analytic potential of the survey system. Students were interviewed again in 1994 and 2000, and in 2000–01 their postsecondary education transcripts were collected. In sum, NELS:88 represents an integrated system of data that tracked students from middle school through secondary and postsecondary education, labor market experiences, and marriage and family formation.

HSLS:09. Finally, although not a prior study, the High School Longitudinal Study of 2009 (HSLS:09) is a related NCES study, and indeed the successor study to ELS:2002. It began in the fall of 2009 with a nationally representative sample of public and private schools and a student sample of entering high school freshmen. HSLS:09 ninth-graders will be resurveyed in 2012, 2013, 2015, and 2021. The base-year survey included a survey and math assessment of students as well as surveys of school administrators, counselors, science teachers, math teachers, and parents. The first follow-up includes another survey and math assessment for the same students (some of whom may have transferred or left school entirely), along with surveys of school administrators, counselors, and parents. HSLS:09 is similar in its objectives to the other high school longitudinal studies, but it places greater emphasis than prior studies on choice behaviors associated with coursetaking and careers in science, technology, engineering, and mathematics.

A.2 Purposes and Use of ELS:2002

ELS:2002 is designed to monitor the transition of a national sample of young people as they progress from tenth grade through high school and on to postsecondary education and/or the world of work. ELS has collected data on young people in high school from multiple perspectives; previous waves surveyed parents, teachers, and school administrators. The study follows young adults on many pathways, including high school dropouts, early high school graduates, and college-bound and non-college-bound graduates. Because it draws on respondent survey information as well as administrative records such as transcripts, ELS is able to provide information on the many possible outcomes of secondary education.

ELS:2002 supports both longitudinal and descriptive cross-cohort analyses, although the study is first and foremost a longitudinal study. Survey items are chosen for their usefulness as outcome measures, particularly in the context of previously collected predictor items. ELS:2002 content will be kept comparable to that of the prior NCES high school studies, to facilitate cross-cohort comparisons (for example, trends over time can be examined by comparing 1980, 1990, and 2002 high school sophomores; or 1972, 1980, 1982, 1992, and 2004 high school seniors). The 2012 (third follow-up) round of ELS:2002 can be compared to the year 2000 round of NELS:88, when cohorts from both studies will be, typically, 8 years beyond high school graduation.

The third follow-up interview will focus on postsecondary education, work experiences, family formation, community involvement, and other life course outcomes. It will also address a range of new issues concerning students’ attainment in postsecondary education, the amount of student aid received, and, from transcripts requested from all colleges attended, a complete record of the courses they enrolled in and the grades they received. New data will also be collected, through job summary measures, on the dynamics of the employment sample members have entered and their progress in finding and forming a promising career. In addition, special attention will be given to high school dropouts’ progress toward a high school diploma, GED, or other equivalency, including GED test score information. Because some sample members will have chosen not to continue their education in the 8 years following high school, a series of questions will focus on experiences in the workforce. Because another group of respondents will have been both going to school and working, work and educational summaries must be collected covering the 6 years since the last interview. In addition to collecting factual information about educational enrollments and work experiences, the interview will collect information on respondents’ basic life goals. As sample members turn 26 years of age, the modal age of participants at the time of the interview, marriage and parenthood become more common; the third follow-up is therefore the appropriate time to determine which participants have started forming families. With regard to community involvement, participation in volunteer work and the political process will be examined. All of these outcomes must be collected within the compass of a relatively brief (35-minute) interview.

The questionnaire (Appendix 1) and its research area justifications (Part C) are presented in terms of key research areas. The research areas reflect the research agenda for the third follow-up study, based both on the precedent of the prior secondary longitudinal studies and new considerations and measures, as identified by the commissioned papers, Technical Review Panel (TRP), and project staff.

A.3 Improved Information Technology

The web-based data collection technology employed in the ELS:2002 second follow-up will be used again in the third follow-up. With this technology, the survey instruments are carefully designed to be virtually indistinguishable from one another in screen text and skip patterns across all three modes of data collection: self-administered web, CATI, and CAPI. We expect that in the third follow-up, over 40 percent of responses will be web self-administered. The advantages of a web-based instrument include real-time data capture and access, data editing in parallel with data collection, and increased efficiency in timely data delivery. The field test collection, however, will not employ CAPI, because of the smaller yield needed to meet the objectives of the field test. The CATI component will begin in the fourth week of data collection.

Additional features of the system include (1) online help for selected screens to assist in question administration (in all three modes); (2) full documentation of all instrument components, including variable ranges, formats, record layouts, labels, question wording, and flow logic; (3) capability for creating and processing hierarchical data structures to eliminate data redundancy and conserve computer resources; (4) a scheduler system to manage the flow and assignment of cases to interviewers by time zone, case status, appointment information, and prior case disposition; (5) an integrated case-level control system to track the status of each sample member across the various data collection activities; (6) automatic audit file creation and timed backup to ensure that, if an interview is terminated prematurely and later restarted, all data entered during the earlier portion of the interview can be retrieved; and (7) a screen library containing the survey instrument as displayed to the respondent (or interviewer).

A.4 Efforts to Identify Duplication

Since the inception of its secondary education longitudinal program in 1970, NCES has consulted with other federal offices to ensure that the data collected in the series do not duplicate other national data sources. The inclusion on the Technical Review Panels for ELS:2002 both of members of the research community and of other government agencies helps to focus study and instrument design on features of youth transition that ELS:2002 uniquely can illuminate.

ELS:2002 does not duplicate, but temporally extends, the prior NCES longitudinal studies—NLS:72, HS&B, and NELS:88.

Other NCES studies involve assessments of age groups similar to ELS:2002 (PISA 15-year-olds; NAEP eighth-graders and high school seniors), but they are not longitudinal and do not collect data from parents. By the time of the second follow-up (2006, when most sample members had been out of high school for 2 years), there is some similarity in sample to the NCES Beginning Postsecondary Students (BPS) longitudinal study. However, BPS focuses only on beginning postsecondary students, including late entrants into the system. In contrast, ELS:2002 includes both cohort members who go on to postsecondary education and those who do not—but it misses many late entrants to the system, even if it were to follow sample members to age 31. Thus BPS and ELS:2002 are fundamentally complementary, not duplicative.

The only non-NCES federal study comparable to ELS:2002 is the Bureau of Labor Statistics (BLS) National Longitudinal Survey of Youth (NLSY). The NLSY97, which samples respondents closer in age to ELS:2002 than does the NLSY79, shares with ELS:2002 (and the prior NCES high school cohorts) the goal of studying the transition of adolescents into adult roles. However, NLSY is an age cohort while ELS:2002 is a grade cohort, and NLSY is household based while ELS:2002 is school based. Although both studies are interested in education and labor market experiences (and their interrelationship), ELS:2002 puts more emphasis on postsecondary education, while NLSY stresses labor market outcomes and collects detailed employment event histories. Thus, as with BPS, ELS:2002 and the two NLSY cohorts are complementary rather than duplicative.

A.5 Methods Used to Minimize Burden on Small Businesses

This section has limited applicability to the proposed data collection effort. Target respondents for ELS:2002 are individuals, and direct data collection activities via web-based self-administration, CATI, and CAPI will involve no burden to small businesses or entities. Small entities such as high schools are no longer included in the data collection scheme. However, the collection of postsecondary transcripts may involve some small entities (defined as proprietary or not-for-profit postsecondary institutions enrolling fewer than 1,000 students), in which case the future package for the full study that will include the transcript collection effort will also address issues of burden minimization for small entities.

A.6 Frequency of Data Collection

This submission describes activities for the field test and full-scale survey of ELS:2002 third follow-up, in the larger context of the purposes and procedures of the study. One design element that is central to fulfilling the purpose of the study is the frequency or periodicity of data collection.

The rationale for conducting ELS:2002 is based on a historical national need for information on academic and social growth, school and work transitions, and family formation. In particular, recent education and social welfare reform initiatives, changes in federal policy concerning postsecondary student support, and other interventions necessitate frequent studies. Repeated surveys are also necessary because of rapid changes in the secondary and postsecondary educational environments and the world of work. Indeed, longitudinal information provides better measures of the effects of program, policy, and environmental changes than would multiple cross-sectional studies.

To address this need, NCES began the National Longitudinal Studies Program approximately 40 years ago with NLS:72. That study collected a wide variety of data on students’ family background, schools attended, labor force participation, family formation, and job satisfaction at six data collection points through 1986. NLS:72 was followed approximately 10 years later by HS&B, a longitudinal study of two high school cohorts (10th- and 12th-grade students). NELS:88 followed an eighth-grade cohort; its final data collection took place in 2000, when the cohort had reached a modal age of 26 years. With the addition of ELS:2002, a 32-year trend line will be available. Taken together, these studies provide much better measures of the effects of social, environmental, and program and policy changes than would a single longitudinal study or multiple cross-sectional studies.

It could be argued that more frequent data collection would be desirable—that is, that there would be a gain in having a program of testing and questionnaire administration that is annual throughout the high school years. However, the 2-year interval was employed with both the HS&B sophomore cohort and NELS:88, and it proved sufficient to the realization of both studies’ primary objectives. Although there would be benefits to more frequent data collection in the high school years, the effect would be to greatly increase the burden on schools and individuals, and costs would also rise greatly. Probably the most cost-efficient and least burdensome method for obtaining continuous data on student careers through the high school years is the collection of school records. High school transcripts were collected for a subsample of the HS&B sophomore cohort, as well as for the entire NELS:88 cohort retained in the study after eighth grade. A similar academic transcript data collection (covering grades 9 through 12) was conducted for the first follow-up of ELS:2002.

Periodicity of the survey after the high school years (at the very terminus of the study) may also be questioned—there is a 6-year gap between the 2006 round (2 years out of high school) and the final round in 2012 (8 years out of high school). Undoubtedly, more process and postsecondary education context information could be obtained if there were surveys in the intervening years (say at age 22, which would optimally capture the college experience). However, the strategy of waiting until about age 26 for the third follow-up interview is extremely cost-effective, in that the information collected at that time includes both final outcomes and statuses, and provides a basis for identifying the postsecondary institutions that individual sample members have attended. In turn, postsecondary transcripts are then obtained that provide continuous enrollment histories for specific courses taken, and provide records of course grades and other information needed to analyze postsecondary persistence and attainment.

A.7 Special Circumstances of Data Collection

All data collection guidelines in 5 CFR 1320.5 are being followed. No special circumstances of data collection are anticipated.

A.8 Consultants Outside the Agency

The 60-day Federal Register notice was published on December 20, 2010 (75 FR, No. 243, p. 79352). No public comments were received in response to this notice. In recognition of the significance of ELS:2002, several strategies have been incorporated into the project’s work plan that allow for the critical review and acquisition of comments regarding project activities, interim and final products, and projected and actual outcomes. These strategies include consultations with persons and organizations both internal and external to the National Center for Education Statistics, the U.S. Department of Education, and the federal government.

ELS:2002 project staff have established a Technical Review Panel (TRP) to review study plans and procedures. The third follow-up TRP includes some of the earlier ELS:2002 panelists for continuity with prior phases of the study. However, the membership has been reconstituted to reflect the shift in focus from high school experiences to postsecondary and labor market transitions that mark the final outcomes of the study. See Exhibit A-1 for a list of the TRP membership and their affiliations. The TRP met to discuss the ELS:2002/12 field test study design, research priorities, and survey content September 30–October 1, 2010.

ELS:2002 project staff also enlisted three academic consultants as part of a research area plan to offer advice on priorities and new topic areas. Two of these individuals wrote position papers (Olsen and Shanahan) and a third wrote new items (Lent).

Exhibit A-1. Education Longitudinal Study:2002 (ELS:2002) Third Follow-up Technical Review Panel


Participants and Staff Contact List



Technical Review Panelists

Sara Goldrick-Rab

University of Wisconsin-Madison

1025 West Johnson Street, 575K

Madison, WI 53706

Phone: (608)265-2141

E-mail: [email protected]

Robert Gonyea

Indiana University

Center for Postsecondary Research

107 S. Indiana Avenue, Eigenmann 443

Bloomington, IN 47405

Phone: (812)856-5824

E-mail: [email protected]

Robert Lent

University of Maryland

RM 3214D Benjamin Building

College Park, MD 20742

Phone: (301)774-6390

E-mail: [email protected]

Amaury Nora

The University of Texas at San Antonio

College of Education and Human Development

One UTSA Circle

San Antonio, TX 78249

Phone: (210)458-4370

E-mail: [email protected]

Randall Olsen

The Ohio State University

921 Chatham Lane, Suite 100

Columbus, OH 43221

Phone: (614)442-7348

E-mail: [email protected]

Aaron Pallas

Columbia University, Teachers College

464 Grace Dodge Hall

New York, NY 10027

Phone: (212)678-8119

E-mail: [email protected]

Kent Phillippe

American Association of Community Colleges

One Dupont Circle, NW, Suite 410

Washington, DC 20036

Phone: (202)728-0200

E-mail: [email protected]

Barbara Schneider

Michigan State University

516B Erickson Hall

East Lansing, MI 48824

Phone: (517)432-0300

E-mail: [email protected]

Michael Shanahan

University of North Carolina at Chapel Hill

Department of Sociology

CB#3210, Hamilton Hall

Chapel Hill, NC 27599

Phone: (919)843-9865

E-mail: [email protected]

Marvin Titus

University of Maryland

EDHI

Room 2200 Benjamin

College Park, MD 20742

Phone: (301)405-2220

E-mail: [email protected]

U.S. Department of Education and other Federal and Non-Federal Invitees

Elise Christopher

U.S. Department of Education, NCES

1990 K Street, NW, Room 9021

Washington, DC 20006

Phone: (202)502-7899

E-mail: [email protected]

Stephanie Cronen

American Institutes for Research

Education Statistics Services Institute

1990 K Street, NW, Suite 500

Washington, DC 20006

Phone: (202)403-6419

E-mail: [email protected]

Bruce Daniel

Kforce Government Solutions

2750 Prosperity Avenue, Suite 300

Fairfax, VA 22031

Phone: (703)245-7350

E-mail: [email protected]

Sandy Eyster

American Institutes for Research

Education Statistics Services Institute

1990 K Street, NW, Suite 500

Washington, DC 20006

Phone: (202)403-6149

E-mail: [email protected]

Mary Frase

National Science Foundation

Directorate of Social, Behavioral and Economic Sciences

Science Resources Statistics

4201 Wilson Blvd. Suite 965 S

Arlington, VA 22230

Phone: (703)292-7767

E-mail: [email protected]

Brian Harris-Kojetin

Office of Management and Budget

725 17th Street NW

Room 10201

Washington, DC 20503

Phone: (202)395-7314

E-mail: [email protected]

Lisa Hudson

U.S. Department of Education, NCES

1990 K Street, NW, Room 8104

Washington, DC 20006

Phone: (202)502-7358

E-mail: [email protected]

Tracy Hunt-White

U.S. Department of Education, NCES

1990 K Street, NW, Room 8113B

Washington, DC 20006

Phone: (202)502-7438

E-mail: [email protected]

Stuart Kerachsky

U.S. Department of Education, NCES, IES

1990 K Street NW, Room 9116

Washington, DC 20006

Phone: (202)502-7442

E-mail: [email protected]

Kashka Kubzdela

U.S. Department of Education, NCES

1990 K Street, NW, Room 9014

Washington, DC 20006

Phone: (202)502-7411

E-mail: [email protected]

Laura LoGerfo

U.S. Department of Education, NCES

1990 K Street NW, Room 9022

Washington, DC 20006

Phone: (202)502-7402

E-mail: [email protected]

Rochelle Martinez

Office of Management and Budget

725 17th Street, NW

Room 10202 NEOB

Washington, DC 20503

Phone: (202)395-3147

E-mail: [email protected]

David Miller

American Institutes for Research

Education Statistics Services Institute

1990 K Street, NW, Suite 500

Washington, DC 20006

Phone: (202)403-6533

E-mail: [email protected]

Isaiah O’Rear

U.S. Department of Education, NCES

1990 K Street, NW

Washington, DC 20006

Phone: (202)502-7378

E-mail: Isaiah.O'[email protected]

Jeffrey Owings

U.S. Department of Education, NCES

1990 K Street, NW, Room 9105

Washington, DC 20006

Phone: (202)502-7423

E-mail: [email protected]

Leslie Scott

American Institutes for Research

Education Statistics Services Institute

1990 K Street, NW, Suite 500

Washington, DC 20006

Phone: (202)654-6542

E-mail: [email protected]

Marilyn Seastrom

U.S. Department of Education, NCES

1990 K Street, NW, Room 9051

Washington, DC 20006

Phone: (202)502-7303

E-mail: [email protected]

Matthew Soldner

U.S. Department of Education, NCES

1990 K Street, NW, Room 8121

Washington, DC 20006

Phone: (202)219-7025

E-mail: [email protected]

Tom Weko

U.S. Department of Education, NCES

1990 K Street, NW, Room 8099

Washington, DC 20006

Phone: (202)502-7643

E-mail: [email protected]

Andrew White

U.S. Department of Education, NCES, IES

1990 K Street, NW, Room 9105

Washington, DC 20006

Phone: (202)502-7472

E-mail: [email protected]

John Wirt

U.S. Department of Education, NCES

El/Sec Sample Survey Studies Program-ESLSD

1990 K Street, NW, Room 9028

Washington, DC 20006

Phone: (202)502-7478

E-mail: [email protected]

Contractor and Subcontractor Staff

Mark Dennis

Millennium Services 2000+ Incorporated

8121 Georgia Ave., Suite LL2

Silver Spring, MD 20910

Phone: (240)839-5113

E-mail: [email protected]

Steven Ingels

RTI International

701 13th Street NW, Suite 750

Washington, DC 20005

Phone: (202)974-7834

E-mail: [email protected]

Donna Jewell

RTI International

P.O. Box 12194

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: (919)541-7266

E-mail: [email protected]

Erich Lauff

RTI International

P.O. Box 12194

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: (919)990-8492

E-mail: [email protected]

Tiffany Mattox

RTI International

P.O. Box 12194

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: (919)485-7791

E-mail: [email protected]

Daniel Pratt

RTI International

PO Box 12194

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: (919)541-6615

E-mail: [email protected]

John Riccobono

RTI International

P.O. Box 12194

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: (919)541-7006

E-mail: [email protected]


A.9 Provision of Payments or Gifts to Respondents

We propose using respondent incentives at two junctures for the forthcoming study: first, at the panel maintenance (contact information confirmation/update) stage, when sample members are first contacted (approved under OMB# 1850-0652 v.5); second, at the data collection stage, when they are expected to complete the questionnaire.

Panel Maintenance Incentive. For the fall 2010 panel maintenance activity, RTI conducted an experiment with the field test sample in which half of the sample (the panel maintenance sample comprises student sample members as well as their parents) was offered a $10 incentive check, sent upon receipt of updated or confirmed contact information. The other half of the sample was not offered an incentive.

A summary of the results of the experiment is included in Appendix 8. The results demonstrate the effectiveness of a relatively small incentive ($10) in encouraging participation among sample members: 25% of the treatment cases (those offered an incentive) responded to the contact information request, compared with 20% of the control cases (those not offered an incentive). Among the panel maintenance respondents, 82% provided contact information that was not already available in our contact database.

While the incentive experiment did not specifically target historically difficult-to-track cases (e.g., second follow-up interview nonrespondents), the significant increase in participation for the treatment group can be seen overall and in the five characteristics listed in Appendix 8: second follow-up interview respondent; second follow-up interview early respondent (responded in the first 4 weeks); male; obtained a regular high school diploma (excluding GED and certificate of completion); and attended a postsecondary institution. Increases in participation did not reach significance in all subgroups, but they occurred in the same direction, with the treatment subgroup showing higher participation rates than the control, for all respondent subgroups evaluated, including second follow-up nonrespondents, second follow-up late respondents, and those who ever reported dropping out of high school. An increase in these groups’ response in the third follow-up would provide great analytic value. Getting otherwise difficult sample members to respond earlier in the data collection period will save the labor, telephone, prompting-mailing, and case review costs associated with working those cases over several months. Further, each case participating in the contact information update process is potentially one less case requiring intensive tracing, for which the estimated cost is approximately $62 per case, including tracing specialist labor, telephone costs, and proprietary database fees.
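To illustrate how a treatment-control difference of this size can be judged significant, the sketch below applies a standard two-proportion z-test to the reported rates (25% vs. 20%). The group sizes are hypothetical placeholders, not figures from the experiment; the actual counts appear in Appendix 8.

    # Two-proportion z-test comparing treatment (25%) and control (20%)
    # panel maintenance response rates. Group sizes are assumed.
    from math import sqrt
    from scipy.stats import norm

    def two_proportion_z(p1, n1, p2, n2):
        """z statistic and one-sided p-value for H0: p1 <= p2."""
        pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        return z, 1 - norm.cdf(z)

    z, p = two_proportion_z(0.25, 500, 0.20, 500)  # assumed n = 500 per group
    print(f"z = {z:.2f}, one-sided p = {p:.4f}")   # z ~ 1.89, p ~ 0.03
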

If 25% of our full-scale sample participates in the fall 2011 panel maintenance, the $10 incentive and associated costs amount to less than $41,000 overall. This modest investment should result in current contact information for a greater number of sample members at the start of data collection, which saves data collection costs as well as intensive tracing costs. Although quantifying the expected savings and/or potential response rate increase is not possible, an illustration of the potential savings is useful. If 5% more of our sample (800 cases) that would otherwise have required intensive tracing (estimated at $49,600) instead provided updated contact information and received a $10 incentive, then the net savings, after the roughly $41,000 total incentive cost, would be approximately $9,000. Additionally, if 250 of these 800 cases also responded during the early web self-administration period rather than requiring significant telephone and/or field follow-up, additional savings would be up to $25,000. Based on the success of the experiment, and assuming the trends seen in the field test hold for the full-scale sample, the incentive procedure is recommended for implementation when the next set of sample maintenance materials is sent to the full-scale sample in the fall of 2011 (approved under OMB# 1850-0652 v.5).
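The cost illustration above can be reproduced with simple arithmetic. In the sketch below, the per-case figures come from the text; the full-scale sample size of 16,000 and the roughly $100 saved per avoided telephone/field case are assumptions implied by the stated totals.

    # Reconstructs the panel maintenance cost illustration from figures in
    # the text. Sample size and per-case follow-up savings are assumptions.
    full_sample = 16000                    # implied by 800 cases = 5% of sample
    responders = int(0.25 * full_sample)   # 25% participation assumption
    incentive_cost = 10 * responders       # $40,000; ~$41,000 with mailing costs
    tracing_savings = 800 * 62             # 800 fewer cases at ~$62 per trace
    net_savings = tracing_savings - 41000  # ~$8,600, i.e., roughly $9,000
    early_web_savings = 250 * 100          # assumed ~$100 saved per CATI/CAPI case
    print(incentive_cost, tracing_savings, net_savings, early_web_savings)
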

Data Collection Incentive. We also propose incentives for questionnaire completion in the third follow-up. There are two possible models for the use of questionnaire completion incentives in the full-scale study; which is chosen depends on field test results. First, the successful OMB-approved incentives of the ELS:2002 second follow-up could be repeated. Indeed, respondents may expect to receive amounts similar to what they received before for what is a similar task and burden, 6 years later.

The alternative is to use prior-round information to model response propensities and to prioritize cases based on that information, such that response bias is minimized. The experimental method is similar in concept to the incentive plan offered in the second follow-up but is a statistically better-grounded refinement of that approach. In the second follow-up plan, all respondents were given an incentive, but demonstrably hard-to-get groups, such as past-round nonrespondents and high school dropouts, were given a larger incentive. (There was also an incentive for early response.) Because the propensity-modeling plan uses more data, the groups of individuals more likely to respond as a result of a higher incentive can, in theory, be more accurately determined. And because the plan considers respondent information (such as a full range of response and sociodemographic characteristics) more inclusively and broadly, it should also be able to determine which cases would contribute most to bias in estimates and to ensure that those cases receive priority.

Incentive payments to respondents were a major feature of the data collection plan for the 2006 round of ELS:2002. About 90% of first follow-up respondents and 67% of first follow-up nonrespondents participated in the second follow-up, for an overall second follow-up response rate of 88%. The results of the 2003 field test experiments and the success of the 2004 round of data collection provided evidence of the value of respondent incentives in achieving high response rates (see NCES 2006-344, appendix J). An incentive plan with a two-step propensity model based on data from previous waves is designed to further improve upon these response rates.

Exhibit A-2 summarizes the specific elements of the 2006 incentive plan. The regular or “base” incentive amount for all ELS:2002 sample members who had never been identified as dropouts and had participated in the first follow-up (F1) data collection was $20. For sample members who participated in the base-year study but did not participate in 2004, the regular incentive was higher, at $40. Likewise, those who had ever been identified as dropouts through the 2004 round were offered $40 as a base incentive.

Exhibit A-2. Second Follow-up Full-Scale Respondent Incentive Plan: 2006

Respondent type              Regular incentive   Early completer   Difficult case   Final difficult ($10 prepaid)

F1 nonrespondent             $40                 $50               $50              $60

Ever dropout                 40                  50                50               60

F1 respondent, nondropout    20                  30                30               40

NOTE: F1 = First follow-up.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Education Longitudinal Study of 2002 (ELS:2002), Second Follow-up, 2006.

For the ELS Third Follow-Up Field Test, we propose experimentally testing a new methodology designed to focus on and minimize nonresponse bias in the final survey estimates. This experimental approach aims to reduce nonresponse bias by using multiple sources of data to produce models that estimate a sample member’s response propensity prior to the commencement of data collection. After we empirically identify those sample members with the lowest response propensities, we implement a differential incentive structure in an attempt to encourage these sample members’ participation. We offer larger incentives to low-propensity cases because these cases often disproportionately contribute to nonresponse bias and can be harmful to the precision of survey estimates.

The response propensity approach under development rests on several key assumptions that will be tested in the ELS F3 field test (FT). First, the approach assumes that low-propensity cases (i.e., the cases least likely to respond to the survey) are fundamentally different from sample members with high response propensities. If differences between low- and high-propensity cases do exist and are large enough, survey estimates are likely to be affected; thus, our second assumption is that low-propensity cases contribute to nonresponse bias. With these assumptions in mind, the goal of the proposed approach is first to identify the low-propensity cases that we believe are likely to contribute to nonresponse bias, and then to increase their response propensity with a targeted intervention. To be effective, an intervention during data collection aimed at reducing bias by targeting low-propensity cases ideally (but not necessarily) would meet two conditions: (1) the calculated response propensity should be a significant predictor of the response outcome, and (2) the response propensity should be significantly associated with survey variables of interest (observed only among respondents). These assumptions have not yet been tested; the purpose of the proposed design is to allow us to test empirically, with the field test sample, whether nonresponse bias can be reduced by identifying and targeting cases with predicted low response propensity.


The ELS F3 FT will be used to test empirically whether intervening on low-propensity cases can be a practical and effective method to improve overall survey estimates. The first step will be to determine whether response propensities effectively predict survey response outcomes. It can then be determined whether the variance of the response propensity was lowered because of an intervention, and whether the association between the response propensity and selected survey variables was reduced.



The experiment will be carried out as follows:

Step 1—Estimate response propensities for ELS 2011 field test sample members. Response propensities have been estimated by predicting the second follow-up response outcome for all 2011 ELS field test sample members. Some of the variables used in the propensity modeling preparation and analysis are listed below:

  • Response status at each prior data collection

  • Mode of response in past rounds

  • Timing of response in past rounds (e.g., during early completion period)

  • Time of day of prior response

  • Number of call attempts

  • Panel maintenance update request response status

  • Completeness of contact information including address, phone, email

  • Level of recency of contact information including address, phone, email

  • High school completion status

  • Postsecondary enrollment status

  • Type of postsecondary institution attended (e.g., private, public, 4-year, 2-year)

  • Employment status

  • Family formation status

  • Assessment performance in prior rounds

  • High school academic performance (e.g., GPA)

Predicted probabilities of a completed interview in the second follow-up were used to divide the 2011 field test sample into two equal groups: a low- and a high-response-propensity group (n = 530 for each group). The low-propensity group consists of those sample members we predict to be the least likely to be interviewed. This approach represents a significant refinement of the nonresponse avoidance approaches previously used for ELS: in prior waves, “difficult cases” were identified using only a single characteristic, such as prior nonresponse or high school dropout status.

The best-fitting response prediction model included the following predictors: base-year response status, F1 response status, F2 call count, whether the case ever refused, whether any contact was ever made with the case, whether the sample member’s mother is a college graduate, and whether the sample member was ever in an AP class. High-propensity cases in the F2 FT had an overall response rate of 92%, while low-propensity cases had a response rate of roughly 55%. Although some survey variables were included in the response propensity model, it is informative to note that survey variables not ultimately retained as significant predictors can still be correlated with response propensity; that is, a relationship between response propensity and substantive variables can exist even when those variables are not significant predictors in the model. For instance, in the ELS F2 FT, high- and low-propensity cases showed sometimes large differences across key survey variables (e.g., whether currently enrolled in a postsecondary institution, whether the sample member took AP exams in high school, and whether the sample member contributes to the support of a dependent).
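As a concrete illustration of Step 1, the sketch below fits a logistic regression of second follow-up response on the predictors named above and splits the sample at the median predicted propensity. The file name, column names, and scikit-learn implementation are illustrative assumptions, not the study’s actual code.

    # Illustrative Step 1: logistic-regression response propensity model and
    # median split into low/high groups. All names here are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    cases = pd.read_csv("els_ft_cases.csv")  # hypothetical case-level file
    predictors = ["by_responded", "f1_responded", "f2_call_count",
                  "ever_refused", "ever_contacted", "mother_college_grad",
                  "ever_ap_class"]           # predictors named in the text

    model = LogisticRegression(max_iter=1000)
    model.fit(cases[predictors], cases["f2_responded"])
    cases["propensity"] = model.predict_proba(cases[predictors])[:, 1]

    # Equal-sized groups (n = 530 each in the field test), then a random
    # treatment/control split within the low-propensity group (n = 265 each).
    cases["low_propensity"] = cases["propensity"] < cases["propensity"].median()
    low = cases[cases["low_propensity"]].sample(frac=1, random_state=0)
    treatment_ids = low.index[: len(low) // 2]
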

Step 2—A multiphased data collection experiment to test higher incentives with low-propensity cases. The approach we propose targets low-propensity cases with different interventions throughout data collection. In the third follow-up field test, we will implement these interventions experimentally. As described above, response propensities for all cases have been calculated, and each case has been assigned low- or high-propensity status. Low-propensity cases were further split into treatment and control groups (n = 265 for each group).

Phase 1 (early response period)—The first 3 weeks of data collection constitute the early response period, during which interviews are self-administered on the web. No outbound calling will be done during this period, though sample members may call in to the Help Desk and agree to complete a CATI interview then. We will offer differential incentives to nonrespondent cases based on their response propensity: the incentive will be $45 for low-propensity treatment cases and $25 for control cases and high-propensity cases.

Phase 2 (outbound calling begins)—After the early response period, outbound calling will begin. The incentives will be the same as in Phase 1: $45 for low-propensity treatment cases and $25 for all other cases.

Phase 3 (larger incentives for all remaining nonrespondents)—Toward the end of data collection (beginning at week 10), we will offer larger incentives to all remaining nonrespondents: $55 for low-propensity treatment cases and $35 for control cases and high-propensity cases. This is a final effort to bring as many sample members as possible into the respondent pool. The incentive structure is presented by data collection phase and propensity group in Exhibit A-3 for the field test and Exhibit A-4 for the main study.
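The phase-by-group amounts reduce to a simple lookup, sketched below with the field test dollar values from the text; the function and flag names are illustrative.

    # Field test incentive amounts by data collection phase and case group,
    # as described above and in Exhibit A-3. Names are illustrative.
    INCENTIVES = {
        # (phase, is_low_propensity_treatment_case): dollars offered
        (1, True): 45, (1, False): 25,  # early web response period
        (2, True): 45, (2, False): 25,  # outbound calling begins
        (3, True): 55, (3, False): 35,  # end-of-collection increase for all
    }

    def incentive(phase: int, low_prop_treatment: bool) -> int:
        """Incentive offered to a nonrespondent case in the given phase."""
        return INCENTIVES[(phase, low_prop_treatment)]

    assert incentive(1, False) == 25 and incentive(3, True) == 55
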

We understand that splitting the sample based on response propensities could result in some critical subgroups (e.g., low-income or minority sample members) disproportionately receiving larger incentives if they fall into the low-response-propensity group. After response propensities were calculated for the field test, we evaluated them across demographic groups and found no disproportionate representation. We are nevertheless prepared to poststratify the propensity groups in the main study if NCES and OMB so desire.

Exhibit A-3. Incentives by Data Collection Phase and Propensity Group for Field Test

Phase and group                        Percent of total respondents   Number of respondents   Incentive amount

Phase 1                                30%                            150

  High Prop. and Low Prop. – Control                                  112                     $25

  Low Prop. – Treatment                                               38                      $45

Phase 2                                44%                            220

  High Prop. and Low Prop. – Control                                  165                     $25

  Low Prop. – Treatment                                               55                      $45

Phase 3                                26%                            130

  High Prop. and Low Prop. – Control                                  97                      $35

  Low Prop. – Treatment                                               33                      $55



Exhibit A-4. Incentives by Data Collection Phase and Propensity Group for Main Study

Phase and group     Percent of total respondents   Number of respondents   Incentive amount

Phase 1             30%                            4,243

  High Propensity                                  3,182                   $25

  Low Propensity                                   1,061                   $45

Phase 2             44%                            6,222

  High Propensity                                  4,666                   $25

  Low Propensity                                   1,556                   $45

Phase 3             26%                            3,677

  High Propensity                                  2,758                   $35

  Low Propensity                                   919                     $55



Step 3—Evaluating results. The evaluation of the modeling experiment will be conducted using methods established in Schouten et al. (2009). We intend to evaluate the experimental results by examining how well our model predicts response outcomes and by investigating whether our treatments minimized bias. First, we will look at the response rates for groups defined by estimated response propensity (i.e., how well the assigned response propensities actually predict the survey outcome). We will then address whether the variance of the response propensities, S²(ρ), was lowered and whether the association between the response propensity and any survey variables y we choose to examine, Cov(ρ, y), was reduced, thus minimizing nonresponse bias in survey estimates of means and proportions. In parallel with the ELS F3 FT data collection, in which experimental interventions are planned for cases with low predicted response propensities, we will conduct analyses with the ELS F2 main study data. These main study analyses, in addition to the F3 FT results, will provide a more complete picture of the potential utility of this approach, as well as refine the model for the main study. For the main study, sample weights will be incorporated into these analyses and the model-building exercise.
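The two evaluation quantities can be computed directly. The sketch below does so with NumPy and relies on the standard approximation (following Bethlehem, as used in the Schouten et al. framework) that the bias of a respondent mean is roughly Cov(ρ, y)/mean(ρ); the function names and unweighted formulas are illustrative assumptions.

    # Step 3 evaluation quantities: variance of estimated response
    # propensities, S^2(rho), and their covariance with a survey variable y,
    # Cov(rho, y). Under the standard approximation, the nonresponse bias of
    # the respondent mean of y is roughly Cov(rho, y) / mean(rho).
    import numpy as np

    def propensity_variance(rho: np.ndarray) -> float:
        """S^2(rho): smaller means more balanced response."""
        return float(np.var(rho, ddof=1))

    def approx_bias(rho: np.ndarray, y: np.ndarray) -> float:
        """Approximate nonresponse bias of the respondent mean of y."""
        cov = float(np.cov(rho, y, ddof=1)[0, 1])
        return cov / float(np.mean(rho))

    # A successful intervention should shrink both quantities relative to
    # their values before the differential incentives were applied.
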

NCES is testing this particular response propensity approach to minimizing nonresponse bias for the first time, in several field test studies. While the study populations and the intervention used in each vary, the goal of the approach is the same: identify cases with low response propensity, implement a targeted intervention to increase their response propensity, and then evaluate the data to assess the impact on nonresponse bias. We will review all propensity modeling experiment results and their relation to sample bias, and we will identify the best approaches for each study to yield the most benefit for data quality in the full-scale collections.



A.10 Assurance of Confidentiality

RTI has prepared a data security plan (DSP) for the ELS:2002 third follow-up data collection. The plan strengthens the confidentiality protection and data security procedures developed for prior rounds of ELS:2002 and represents best-practice survey systems and procedures for protecting respondent confidentiality and securing survey data. An outline of this plan is provided in Exhibit A-5. The ELS:2002 third follow-up data collection DSP will

  • establish clear responsibility and accountability for data security and the protection of respondent confidentiality with corporate oversight to ensure adequate investment of resources;

  • detail a structured approach for considering and addressing risk at each step in the survey process and establish mechanisms for monitoring performance and adapting to new security concerns;

  • include technological and procedural solutions that mitigate risk and emphasize the necessary training to capitalize on these approaches; and

  • be supported by the implementation of data security controls recommended by the National Institute of Standards and Technology for protecting federal information systems.

Exhibit A-5. ELS:2002 Third Follow-up Data Security Plan Outline

ELS:2002 Data Security Plan Summary

Maintaining the Data Security Plan

Information Collection Request

Our Promise to Secure Data and Protect Confidentiality

Personally Identifying Information That We Collect and/or Manage

Institutional Review Board Human Subject Protection Requirements

Process for Addressing Survey Participant Concerns

Computing System Summary

General Description of the RTI Networks

General Description of the Data Management, Data Collection, and Data Processing Systems

Integrated Monitoring System

Receipt Control System

Instrument Development and Documentation System

Data Collection System

Document Archive and Data Library

Employee-Level Controls

Security Clearance Procedures

Nondisclosure Affidavit Collection and Storage

Security Awareness Training

Staff Termination/Transfer Procedures

Subcontractor Procedures

Physical Environment Protections

System Access Controls

Survey Data Collection/Management Procedures

Protecting Electronic Media

Encryption

Data Transmission

Storage/Archival/Destruction

Protecting Hard-Copy Media

Internal Hard-Copy Communications

External Communications to Respondents

Handling of Mail Returns, Hard-Copy Student Lists, and Parental Consent Forms

Handling and Transfer of Data Collection Materials

Tracing Operations

Software Security Controls

Data File Development: Disclosure Avoidance Plan

Data Security Monitoring

Survey Protocol Monitoring

System/Data Access Monitoring

Protocol for Reporting Potential Breaches of Confidentiality

Specific Procedures for Field Staff



Under this plan, the ELS:2002 third follow-up data collection will conform fully to federal privacy legislation, including the Privacy Act of 1974 (5 U.S.C. 552a) and Section C of the Education Sciences Reform Act of 2002 (P.L. 107-279). Consistent with the Privacy Act, these data will constitute a system of records, per system of records notice 18-13-01, National Center for Education Statistics Longitudinal Studies and the School and Staffing Surveys (64 FR 30181–82, June 4, 1999).

More specifically, it is expected that ELS:2002 will conform to the NCES Restricted Use Data Procedures Manual and NCES Standards and Policies. The plan for maintaining confidentiality includes obtaining signed confidentiality agreements and notarized nondisclosure affidavits from all personnel who will have access to individual identifiers. Each individual working on ELS:2002 will also complete the e-QIP clearance process. The security plan includes annual personnel training regarding the meaning of confidentiality and the procedures associated with maintaining confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. The training will also cover controlled and protected access to computer files, built-in safeguards concerning status monitoring and receipt control systems, and a secured and operator-manned in-house computing facility.

Immediately prior to field test data collection, contacting materials will be sent to sample members and a parent to initiate data collection and offer access to the web survey (see appendix 6). Sample members are more transient at this age than their parents, so we want to engage the parents in case the sample member has relocated since our last contact with him/her. The letter to parents thanks them for their past assistance with the study, informs them that we are trying to reach their children for the third follow-up, and requests their assistance in contacting and communicating with their children about the study. We will provide parents with complete information about the third follow-up data collection except for their children’s study ID and password. This exception will protect sample members’ privacy and help ensure data security.

The letters to both sample members and parents will describe the voluntary nature of the survey. The mailed materials will include a brochure describing the study and how the data will be used, and conveying the extent to which the identity of respondents and their responses will be kept confidential. The prenotification letter for the study will contain the following statement:

“Your answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002) Public Law 107-279, Section 183].”

During the telephone interview, the following informed consent statement will be read verbatim. We have slightly modified the language used in this passage to more accurately reflect a telephone/personal contact.

“As mentioned in the letter, you previously participated in ELS:2002 with about 15,000 other students across the country who were selected from 10th-grade classes in 2001 or 12th-grade classes in 2003. This survey is part of an education research study sponsored by the U.S. Department of Education. The purpose of ELS:2002 is to provide information that will be used to improve the quality of education in America. The interview will ask questions about your further schooling and work experiences. On average, it takes about 35 minutes to complete, depending on your responses.

Participation is voluntary. Your answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002) Public Law 107-279, Section 183]. You may withdraw from the study at any point. However, your answers are very important because they represent many others who were not selected to take part. You may skip any question that you don’t want to answer.”

Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file. A separate locator database for these sample members will be maintained in a secure location. All hard-copy tracing directory updates will be destroyed after they are entered into magnetic form and verified.

A.11 Sensitive Questions

The student interview contains items about earnings, assets, and debts. Federal regulations governing the administration of such questions, which might be viewed as “sensitive” because they touch on personal or private information, require (a) clear documentation of the need for the information as it relates to the primary purpose of the study, (b) provisions that clearly inform respondents of the voluntary nature of their participation in the study, and (c) assurances of confidential treatment of responses. Information about earnings and assets is vital to labor force analysis and provides important indicators of the rate of return to the respondent's educational experiences.

If a sample member's SSN remains unknown despite the prior rounds of data collection, it will be collected in the student interview. This information is needed to obtain data from a variety of extant data sources, including student financial aid data from the Central Processing System (CPS), loan and Pell Grant data from the National Student Loan Data System (NSLDS), and GED test results. A description of the matching procedures and the security measures in place for linkages to extant data sources is provided in Part D.

We also plan to verify or collect locating information for the sample member and contact persons in case further data collection of this sample occurs in the future.

A.12 Estimates of Hour Burden for Information Collection for the Field Test and Full-scale Study

Estimates of response burden for the ELS:2002 third follow-up field test and full-scale study sample maintenance (tracing) and data collection activities are shown in Exhibit A-6. (The sample maintenance activities have already been approved by OMB.)

The field test administration will also include a reinterview with a randomly selected subset of 50 respondents. The purpose of this reinterview is to evaluate the temporal stability (in effect, the test-retest reliability) of the reinterview items. In choosing items for the reinterview, preference is given to items that meet the following criteria: (1) newly designed items, or other new items (such as those borrowed from non-NCES studies) whose measurement properties are not well known, or (2) radically revised versions of items previously used in ELS:2002 or its predecessor studies. Additionally, the items should be factual rather than attitudinal.
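
For the factual items selected, temporal stability can be summarized with conventional agreement statistics such as percent agreement and Cohen's kappa between the original and reinterview responses. A minimal sketch follows (the item and responses are hypothetical, and kappa is one standard choice rather than a method prescribed by the study):

    from collections import Counter

    def cohens_kappa(first, second):
        """Cohen's kappa for test-retest agreement on one categorical item."""
        n = len(first)
        observed = sum(a == b for a, b in zip(first, second)) / n
        c1, c2 = Counter(first), Counter(second)
        expected = sum(c1[c] * c2[c] for c in c1) / n**2  # chance agreement
        return (observed - expected) / (1 - expected)

    # Hypothetical yes/no item asked at the interview and the reinterview.
    t1 = ["yes", "no", "no", "yes", "no", "no"]
    t2 = ["yes", "no", "yes", "yes", "no", "no"]
    print(round(cohens_kappa(t1, t2), 3))   # 0.667 on this toy data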

Exhibit A-6. Estimated Sample Maintenance and Data Collection Burden on Respondents for Field Test Study (2011) and Main Study (2012)

                               Sample    Expected        Number of     Average burden/      Range of response   Total burden
                                         response rate   respondents   response (minutes)   times (minutes)     (hours)
Sample maintenance
  Field test (2011)             1,060    20%                  212       5                   ----                    18
  Full-scale study (2012), 1   16,200    20%                3,240       5                   ----                   270
  Full-scale study (2012), 2   16,200    20%                3,240       5                   ----                   270
Data collection
  Field test (2011)             1,060    50%                  530      35                   25 to 45               309
  Student reinterview              63    80%                   50      10                   5 to 15                  8
  Full-scale study (2012)      16,200    90%               14,580      35                   25 to 45             8,505

NOTE: Table does not include the transcript collection, which will take place in 2013–14 and will be submitted to OMB in a separate package.
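
The total-burden column in Exhibit A-6 follows from multiplying the number of respondents by the average minutes per response and dividing by 60, rounded to the nearest hour. A quick arithmetic check (illustrative code, not part of the study systems):

    # Burden hours = respondents x average minutes per response / 60.
    rows = {
        "Field test data collection (2011)": (530, 35),     # -> 309 hours
        "Student reinterview":               (50, 10),      # -> 8 hours
        "Full-scale data collection (2012)": (14_580, 35),  # -> 8,505 hours
    }
    for label, (n, minutes) in rows.items():
        print(f"{label}: {round(n * minutes / 60):,} hours")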

Included in the notification letter and on the entry page to the online survey will be the following burden statement:

“According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number of this information collection is 1850-0652 and it is completely voluntary. The time required to complete this information collection is estimated to average around 35 minutes per response. If you have any comments concerning the accuracy of the time estimate or suggestions for improving the interview, please write to: U.S. Department of Education, Washington, DC 20202-4537. If you have comments or concerns regarding the status of your individual interview, write directly to: Education Longitudinal Study (ELS), National Center for Education Statistics, 1990 K Street NW, Washington, DC 20006.”

A.13 Estimates of Costs

There are no capital, startup, or operating costs to respondents for participation in the project. No equipment, printing, or postage charges will be incurred.

Estimated costs to the federal government for ELS:2002 are shown in Exhibit A-7. The estimated costs to the government of data collection for the third follow-up field test and full-scale studies are presented separately. The contract estimates include all staff time, reproduction, postage, and telephone costs associated with the management, data collection, analysis, and reporting for which clearance is requested.

A.14 Costs to Federal Government

Exhibit A-7. Total Costs to NCES

Costs to NCES                     Amount (in $)
Total ELS:2002/12 costs
  Salaries and expenses                 200,000
  Contract costs                      9,647,075
Total annual ELS:2002/12 cost         3,282,358

NOTE: All costs quoted are exclusive of incentive fee. Table does not include transcript collections.
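
As an arithmetic check on Exhibit A-7, the annual figure is consistent with spreading the combined salaries and contract costs over three years; the three-year averaging period is our inference from the numbers, not a statement in the table:

    # Inferred check: (salaries + contract) / 3 years ~= annual cost.
    total = 200_000 + 9_647_075
    print(round(total / 3))        # 3,282,358 -- matches Exhibit A-7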

A.15 Reasons for Changes in Response Burden and Costs

Projected estimates of response burden and costs are based on experience from the second follow-up study and more recent studies, including BPS:04/09. The increase in burden from the last approved clearance package reflects the fact that the last clearance covered only address updates, whereas this clearance also requests approval for the field test data collection burden.

A.16 Publication Plans and Time Schedule

The ELS:2002/12 field test will be used to test and improve the instrumentation and associated procedures. Publications and other releases of information relevant to the data collection effort will be part of the reports resulting from the full-scale study, and both public-use and restricted-use data files will be important products. The ELS:2002 data will be used by public and private organizations to produce analyses and reports covering a wide range of topics. The third follow-up will add a fourth point in time for longitudinal analysis and extend the cross-cohort comparison to the predecessor cohorts (NELS:88, HS&B, and NLS-72).

Data files will be distributed to a variety of organizations and researchers, including offices and programs within the U.S. Department of Education, the Congressional Budget Office, the Department of Health and Human Services, Department of Labor, Department of Defense, the National Science Foundation, the American Council on Education, and a number of other education policy and research agencies and organizations. The ELS:2002 contract requires the following reports, publications, or other public information releases:

  • detailed methodological reports (one each for the field test and full-scale survey—in the form of a comprehensive Data File Documentation Report covering the base year through the third follow-up, with an appendix for the field test) describing all aspects of the data collection effort;

  • complete restricted-use, longitudinal full-scale study data files and documentation for research data users, including postsecondary institution transcript data;

  • corresponding public-use data files for public access to ELS:2002 base-year to third follow-up results; and

  • a “first look” summary of significant descriptive findings for dissemination to a broad audience (the analysis deliverable will include technical appendices).

Final deliverables for the third follow-up are scheduled for completion in 2013. (Final deliverables for the transcript study are scheduled for completion in 2015.) The operational schedule for the ELS:2002 third follow-up field test and full-scale study is presented in Exhibit A-8.

Exhibit A-8. Operational Schedule for ELS:2002/12 Field Test and Full-Scale Activities

Activity                                           Start      End
Field test
  Panel maintenance: contact updates for sample    10/2010    6/2011
  First round of cognitive testing of items        8/2010     9/2010
  Data collection                                  7/2011     12/2011
  Second round of cognitive testing                10/2011    12/2011
Full-scale study
  Panel maintenance: contact updates for sample    9/2010     6/2012
  Data collection                                  7/2012     1/2013
Transcript collection
  Pilot testing of operations                      2/2013     8/2013
  Transcript data collection                       8/2013     3/2014
  Transcript keying and coding                     11/2013    8/2014



A.17 Approval to Not Display Expiration Date for OMB Approval

The expiration date for OMB approval of the information collection will be displayed on data collection instruments and materials. No special exception to this requirement is requested.

A.18 Exception to Certification for Paperwork Reduction Act Submissions

No exceptions are requested to the certification statement identified in the Certification for Paperwork Reduction Act Submissions of OMB Form 83-I.

1 To the extent that the relationship between response propensity and the key survey variables is not the same among respondents and nonrespondents, only the first condition may be sufficient: the estimated response propensity needs to be associated with the response outcome.

