

December 2011




Education Longitudinal Study: 2002
(ELS:2002)


Third Follow-up, 2012 Full-scale Study


OMB Supporting Statement

Part A









OMB# 1850-0652 v.8










National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education








LIST OF APPENDIXES

Appendix 1: Main Study Questionnaire

Appendix 2: Data Collection Materials (Brochure, Lead Letters)

Appendix 3: Cognitive Labs Report Summary

Preface

This request concerns the third follow-up of the Education Longitudinal Study: 2002 (ELS:2002), an ongoing longitudinal study with a completed field test in 2011 and a forthcoming full-scale data collection in 2012. This document requests clearance for data collection activities and supplements earlier requests concerned with the 2011 field test, direct locating and contacting of individual respondents or their parents, and two generic clearances for cognitive interviews. Per the field test approval (OMB # 1850-0652 v.7), this submission is subject to a 60-day Federal Register notice waiver. ELS:2002 is being conducted by the National Center for Education Statistics, part of the Institute of Education Sciences, within the U.S. Department of Education. The primary contractor for this study is RTI International (Contract number ED-04-CO-0036/0004).



A. Justification of the Study

A.1 Circumstances Making Collection of Information Necessary

A.1.a Purpose of This Submission

The materials in this document support a request for clearance to conduct the third follow-up of ELS:2002, which has the following basic components and key design features:

Base Year

  • baseline survey of high school sophomores, in spring term 2002;

  • assessments in reading and mathematics;

  • surveys of parents, English and math teachers, media center specialists, and school administrators, plus a facilities checklist;

  • sample sizes of about 750 schools and approximately 17,600 students (15,300 base-year respondents; schools are the first-stage unit of selection, with sophomores randomly selected within schools);

  • oversampling of Asian American students and of private schools;

  • design linkages (test concordances) with other assessment programs: the Program for International Student Assessment (PISA), the National Assessment of Educational Progress (NAEP), and test score reporting linkages to the prior longitudinal studies.

First Follow-up

  • follow-up in spring 2004, when most sample members were seniors, but some were dropouts or enrolled in other grades;

  • student questionnaires, dropout questionnaires, in-school math assessments, and school administrator questionnaires;

  • returned to the same schools, but separately followed transfer students and those no longer in school by telephone (computer-assisted telephone interview; CATI) or field (computer-assisted personal interview; CAPI);

  • freshening for a nationally representative senior cohort; and

  • high school transcript component in fall/winter 2004–05.

Second Follow-up

  • follow-up in 2006 using a web-based self-administered instrument, telephone (CATI), and field (CAPI) data collection.

Third Follow-up

  • follow-up in 2012 using a web-based self-administered instrument, telephone (CATI), and field (CAPI) data collection; and

  • collection of postsecondary transcripts.



The third follow-up study will provide data to map and understand the outcomes of the high school cohorts’ transition to adult roles and statuses at about age 26. For the cohort as a whole, the third follow-up will obtain information that will permit researchers and policymakers to better understand issues of postsecondary persistence and attainment, as well as sub-baccalaureate (and, to a more limited degree, baccalaureate) rates of economic and noneconomic return on investments in education. The third follow-up will also provide information about high school completion (for students who dropped out or were held back) and the status of high school dropouts, late completers, and students who have obtained an alternative credential, such as the GED. Finally, for both college-bound and non-college-bound students, the third follow-up will map their labor market activities and family formation.

For many cohort members, complex pathways, with alternative timings and durations for work and postsecondary enrollment, will have been followed by this point of transition. In the 6-year period since the previous round, a sample member may have both worked and attended school, either serially or simultaneously; a cohort member may have attended school part-time or full-time and combined education and work spells with marriage and family formation. The singular strength of longitudinal studies is their power to provide data on transitions that are both complex and of some duration. The transition from adolescence to adult roles, and in particular the transition to and through postsecondary education, to labor force activity, and to family formation, is of this very type. The timing of the ELS:2002/12 data collection will facilitate capturing all of this complexity at a time when such roles are becoming the norm for many young adults.

A.1.b Legislative Authorization

ELS:2002 is sponsored by the National Center for Education Statistics (NCES) of the Institute of Education Sciences (IES), in close consultation with other offices and organizations within and outside the U.S. Department of Education (ED). ELS:2002 is authorized under Section 9543 of the Education Sciences Reform Act of 2002 (20 U.S.C. 9543).

A.1.c Prior and Related Studies

In 1970, NCES initiated a program of longitudinal high school studies. Its purpose was to gather time-series data on nationally representative samples of high school students that would be pertinent to the formulation and evaluation of education policies.

Starting in 1972, with the National Longitudinal Study of the High School Class of 1972 (NLS:72), NCES began providing education policymakers and researchers with longitudinal data that linked education experiences with later outcomes, such as early labor market experiences and postsecondary education enrollment and attainment.

Almost 10 years later, in 1980, the second in the series of NCES longitudinal surveys, High School and Beyond (HS&B), was launched; it included a cohort of high school seniors comparable to the seniors in NLS:72 as well as a sophomore cohort.

The third longitudinal study of students sponsored by NCES was the National Education Longitudinal Study of 1988 (NELS:88), a cohort of eighth-graders.

The High School Longitudinal Study of 2009 (HSLS:09), the successor study to ELS:2002, follows a cohort of fall 2009 ninth-graders.

A.2 Purposes and Use of ELS:2002

ELS:2002 is designed to monitor the transition of a national sample of young people as they progress from tenth grade through high school and on to postsecondary education and/or the world of work. ELS:2002 has collected data on young people in high school from multiple perspectives; previous waves surveyed parents, teachers, and school administrators. This study follows young adults on many pathways, including high school dropouts, early high school graduates, college-bound graduates, and non-college-bound graduates. Because it draws on respondent survey information as well as administrative records such as transcripts, ELS:2002 is able to provide information on the many possible outcomes of secondary education.

ELS:2002 supports both longitudinal and descriptive cross-cohort analyses, although the study is first and foremost a longitudinal study. Survey items are chosen for their usefulness as outcome measures, particularly in the context of previously collected predictor items. ELS:2002 content will be kept comparable to that of the prior NCES high school studies, to facilitate cross-cohort comparisons (for example, trends over time can be examined by comparing 1980, 1990, and 2002 high school sophomores; or 1972, 1980, 1982, 1992, and 2004 high school seniors). The 2012 (third follow-up) round of ELS:2002 can be compared to the year 2000 round of NELS:88, when cohorts from both studies will be, typically, 8 years beyond high school graduation.

The third follow-up interview will focus on postsecondary education, work experiences, family formation, community involvement, and other life course outcomes. It will also address a range of new issues concerning students’ attainment in postsecondary education, the amount of student aid received, and, from college transcripts from all colleges attended, a complete record of the courses they enrolled in and the grades they received. New data will also be collected, through job summary measures, on the dynamics of the employment respondents have entered and on their progress in finding and forming a promising career. An important context for these data will be the special economic circumstances that this cohort has encountered. In addition, special attention will be given to high school dropouts’ progress toward a high school diploma, GED, or other equivalency, including GED test score information. Because some sample members will have chosen not to continue their education in the 8 years following high school, a series of questions will focus on experiences in the workforce. Because another group of respondents will have been going to school and working, work and educational summaries must be collected covering the 6 years since the last interview. In addition to collecting factual information about educational enrollments and work experiences, the interview will collect information on respondents’ basic life goals. As sample members turn 26 years of age, the modal age of participants at the time of the interview, marriage and parenthood become more common; the third follow-up is therefore the appropriate time to determine which participants have started forming families. With regard to community involvement, participation in volunteer work and in the political process will be examined. All of these outcomes must be collected in this round within the compass of a relatively brief (35-minute) interview.

A.2.a Content Justifications

The questionnaire is provided in Appendix 1 and a change justifications grid for survey items in Part C. While the content of the field test questionnaires was justified in the approved field test OMB submission, there are some changes in content based on the findings of the field test, cognitive testing, and the deliberations and recommendations of the Technical Review Panel. These changes are of three kinds: some field test items have been deleted, some items have been added, and some field test items have been revised. The grid in Part C documents and justifies these changes.

A.3 Improved Information Technology

The same web-based data collection technology employed in the ELS:2002 second follow-up will be used again in the third follow-up. With this web technology, the survey instruments have been carefully designed to be virtually indistinguishable from each other in terms of screen text and skip patterns across all three modes of data collection: self-administered web, CATI, and CAPI. In the third follow-up, over 40 percent of responses are expected to be web self-administered. The advantages of a web-based instrument include real-time data capture and access, including data editing in parallel with data collection, and increased efficiency in timely delivery of the data.

Additional features of the system include (1) online help for selected screens to assist in question administration (in all three modes); (2) full documentation of all instrument components, including variable ranges, formats, record layouts, labels, question wording, and flow logic; (3) capability for creating and processing hierarchical data structures to eliminate data redundancy and conserve computer resources; (4) a scheduler system to manage the flow and assignment of cases to interviewers by time zone, case status, appointment information, and prior case disposition; (5) an integrated case-level control system to track the status of each sample member across the various data collection activities; (6) automatic audit file creation and timed backup to ensure that, if an interview is terminated prematurely and later restarted, all data entered during the earlier portion of the interview can be retrieved; and (7) a screen library containing the survey instrument as displayed to the respondent (or interviewer).
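
To make the preceding description concrete, the following is a purely hypothetical sketch, in Python, of the kind of mode-agnostic item definition such a system implies: a single specification of screen text, response options, help text, and flow logic that can be rendered identically in the web, CATI, and CAPI modes. The variable names and question wording are invented for illustration and are not drawn from the ELS:2002 instrument.

    # Hypothetical sketch of a mode-agnostic item definition; not the actual
    # ELS:2002 instrument specification or its documentation system.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Item:
        name: str                       # variable name used in instrument documentation
        text: str                       # identical question text in all three modes
        choices: List[str]              # response options
        help_text: str = ""             # online help displayed on request
        skip_to: Callable[[str], str] = lambda answer: "NEXT"  # flow (routing) logic

    # One definition drives the web, CATI, and CAPI renderings of the screen.
    ever_enrolled = Item(
        name="EVER_ENROLLED",
        text="Since the last interview, have you ever been enrolled in a postsecondary institution?",
        choices=["Yes", "No"],
        help_text="Include enrollment at 2-year colleges, 4-year colleges, and trade schools.",
        skip_to=lambda answer: "ENROLLMENT_DATES" if answer == "Yes" else "EMPLOYMENT",
    )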

A.4 Efforts to Identify Duplication

Since the inception of its secondary education longitudinal program in 1970, NCES has consulted with other federal offices to ensure that the data collected in the series do not duplicate other national data sources. Including both members of the research community and representatives of other government agencies on the ELS:2002 Technical Review Panels helps focus study and instrument design on features of the youth transition that ELS:2002 can uniquely illuminate.

ELS:2002 does not duplicate, but temporally extends, the prior NCES longitudinal studies—NLS:72, HS&B, and NELS:88.

Other NCES studies involve assessments of age groups similar to those in ELS:2002 (PISA 15-year-olds; NAEP eighth-graders and high school seniors), but they are not longitudinal and do not collect data from parents. By the time of the second follow-up (2006, when most sample members had been out of high school for 2 years), there is some overlap in sample with the NCES Beginning Postsecondary Students (BPS) study. However, BPS focuses only on beginning postsecondary students, including late entrants into the system. In contrast, ELS:2002 includes both cohort members who go on to postsecondary education and those who do not, but misses many late entrants to the system. Thus BPS and ELS:2002 are fundamentally complementary, not duplicative.

The only non-NCES federal study that would appear to be comparable to ELS:2002 is the BLS National Longitudinal Survey of Youth (NLSY). The NLSY79 and, with respondents closer to ELS:2002 in age, the NLSY97 share with ELS:2002 (and the prior NCES high school cohorts) the goal of studying the transition of adolescents into adult roles. There are, however, important design differences between NLSY79/NLSY97 and ELS:2002 that render them more complementary than duplicative. NLSY is a household-based longitudinal survey; ELS:2002 is school-based. NLSY follows an age cohort, while ELS:2002 follows a grade cohort. For both NLSY cohorts, base-year Armed Services Vocational Aptitude Battery (ASVAB) test data are available, but there is no longitudinal high school achievement measure. Although NLSY97 also gathers information from schools (including principal and teacher reports and high school transcripts), it cannot study school processes in the same way as ELS:2002, given its household sampling basis. Any given school contains only one to a handful of NLSY97 sample members, a number that constitutes neither a representative sample of students in the school nor a sufficient number to provide within-school estimates. Additionally, ELS:2002 puts more emphasis on postsecondary education, while NLSY stresses labor market outcomes and collects detailed employment event histories. Thus, although both studies provide important information for understanding the transition from high school to the labor market, ELS:2002 is uniquely able to provide information about education processes and within-school dynamics and how these affect both academic achievement and ultimate labor market outcomes.

A.5 Methods Used to Minimize Burden on Small Businesses

This section has limited applicability to the proposed data collection effort. Target respondents for ELS:2002 are individuals, and direct data collection activities via web-based self-administration, CATI, and CAPI will involve no burden to small businesses or entities.

A.6 Frequency of Data Collection

The rationale for conducting ELS:2002 is based on a historical national need for information on academic and social growth, school and work transitions, and family formation. In particular, structural changes in the economy; sector changes such as the growth of community colleges; changing youth demography; the continuing need to monitor postsecondary educational access, choice, persistence and attainment; and changes in federal policy concerning postsecondary student support and other interventions necessitate frequent studies. By following the same students over time, longitudinal studies provide better measures of the effects of program, policy, and environmental changes than would multiple cross-sectional studies.

To address this need, NCES began the National Longitudinal Studies Program approximately 40 years ago with NLS:72. That study collected a wide variety of data on students’ family background, schools attended, labor force participation, family formation, and job satisfaction at five data collection points through 1986. NLS:72 was followed approximately 10 years later by HS&B, a longitudinal study of two high school cohorts (10th- and 12th-grade students). NELS:88 followed an eighth-grade cohort through its final data collection point in 2000, when the cohort’s modal age was 26 years. With the addition of ELS:2002, a 32-year trend line will be available. Taken together, these studies provide much better measures of the effects of social, environmental, and program and policy changes than would a single longitudinal study or multiple cross-sectional studies.

It could be argued that more frequent data collection would be desirable; that is, there would be a gain in having a program of testing and questionnaire administration that is annual throughout the high school years. However, the 2-year interval was employed with both the HS&B sophomore cohort and NELS:88, and it proved sufficient to the realization of both studies’ primary objectives. Although there would be benefits to more frequent data collection in the high school years, such a design would greatly increase both the burden on schools and individuals and the cost of the study. Probably the most cost-efficient and least burdensome method for obtaining continuous data on student careers through the high school years is the collection of school records. High school transcripts were collected for a subsample of the HS&B sophomore cohort, as well as for the entire NELS:88 cohort retained in the study after eighth grade. A similar academic transcript data collection (covering grades 9 through 12) was conducted for the first follow-up of ELS:2002.

The periodicity of the survey after the high school years (at the very terminus of the study) may also be questioned: there is a 6-year gap between the 2006 round (2 years out of high school) and the final round in 2012 (8 years out of high school). Undoubtedly, more process and postsecondary education context information could be obtained if there were surveys in the intervening years (say, at age 22, which would optimally capture the college experience). However, the strategy of waiting until about age 26 for the third follow-up interview is cost-effective, in that the information collected at that time includes both final outcomes and statuses and provides a basis for identifying the postsecondary institutions that individual sample members have attended. Postsecondary transcripts are then obtained that provide continuous enrollment histories for specific courses taken, along with records of course grades and other information needed to analyze postsecondary persistence and attainment.

A.7 Special Circumstances of Data Collection

All data collection guidelines in 5 CFR 1320.5 are being followed. No special circumstances of data collection are anticipated.

A.8 Consultants Outside the Agency

In recognition of the significance of ELS:2002, several strategies have been incorporated into the project’s work plan that allow for the critical review and acquisition of comments regarding project activities, interim and final products, and projected and actual outcomes. These strategies include consultations with persons and organizations both internal and external to the National Center for Education Statistics, the U.S. Department of Education, and the federal government.

ELS:2002 project staff have established a Technical Review Panel (TRP) to review study plans and procedures. The third follow-up TRP includes some of the earlier ELS:2002 panelists for continuity with prior phases of the study. However, the membership has been reconstituted to reflect the shift in focus from high school experiences to postsecondary and labor market transitions that mark the final outcomes of the study. See Exhibit A-1 for a list of the TRP membership and their affiliations. The TRP met in October of 2010 and in November of 2011, and its recommendations, based on field test results presented at the November 2011 session, have been taken into consideration in revising the instrument for the full-scale study.

Exhibit A-1. Third Follow-up Technical Review Panel (Research and Policy Community Members)


Sara Goldrick-Rab

University of Wisconsin-Madison

1025 West Johnson Street, 575K

Madison, WI 53706

Phone: (608)265-2141

E-mail: [email protected]


Robert Gonyea

Indiana University

Center for Postsecondary Research

107 S. Indiana Avenue, Eigenmann 443

Bloomington, IN 47405

Phone: (812)856-5824

E-mail: [email protected]


Donald Heller

The Pennsylvania State University

406 Rackley Building

University Park, PA 16802

Phone: (814) 865-9756

E-mail: [email protected]


Robert Lent

University of Maryland

RM 3214D Benjamin Building

College Park, MD 20742

Phone: (301)774-6390

E-mail: [email protected]

Amaury Nora

The University of Texas at San Antonio

College of Education and Human Development

One UTSA Circle

San Antonio, TX 78249

Phone: (210)458-4370

E-mail: [email protected]


Randall Olsen

The Ohio State University

921 Chatham Lane, Suite 100

Columbus, OH 43221

Phone: (614)442-7348

E-mail: [email protected]


Aaron Pallas

Columbia University, Teachers College

464 Grace Dodge Hall

New York, NY 10027

Phone: (212)678-8119

E-mail: [email protected]


Kent Phillippe

American Association of Community Colleges

One Dupont Circle, NW, Suite 410

Washington, DC 20036

Phone: (202)728-0200

E-mail: [email protected]


Michael Shanahan

University of North Carolina at Chapel Hill

Department of Sociology

CB#3210, Hamilton Hall

Chapel Hill, NC 27599

Phone: (919)843-9865

E-mail: [email protected]

Marvin Titus

University of Maryland

EDHI

Room 2200 Benjamin

College Park, MD 20742

Phone: (301)405-2220

E-mail: [email protected]




A.9 Provision of Payments or Gifts to Respondents

Incentive payments to respondents, ranging from $20 to $60, were a major feature of the data collection plan for the ELS:2002 second follow-up study in 2006. About 90 percent of first follow-up respondents and 67 percent of first follow-up nonrespondents responded in the second follow-up, for an overall second follow-up weighted response rate of 88 percent (89 percent unweighted). The results of the 2003 field test experiments and the success of the 2004 round of data collection provided evidence of the value of respondent incentives in achieving high response rates (Education Longitudinal Study of 2002: Base-Year to First Follow-up Data File Documentation, NCES 2006-344, Appendix J, Section J3, 2005). In the second follow-up plan, all respondents were given an incentive, with larger incentives for early response and for hard-to-reach groups such as past-round nonrespondents and high school dropouts.

In the recently completed third follow-up field test, the use of incentives was tied to a response propensity experiment. The details and results of the experiment are presented in Part E of this package. In summary, the response propensity model successfully predicted response outcomes, and the inclusion of low-propensity cases showed an apparent reduction in unit-level bias. Including more low-propensity cases in the data may reduce bias and may help improve final estimates, since low-propensity cases appear to differ in terms of their survey responses. A higher incentive amount for low-propensity experiment cases ($45 versus $25, with both amounts increased to $55 and $35, respectively, in week 10 of data collection) produced an observed, but not statistically significant, 6 percent higher response rate for the low-propensity experiment cases than for the low-propensity control cases. Because the interview yield goal was met, data collection ended two months before its scheduled end, and it is not known how the experiment would have concluded had the field test continued for those two additional months. Please refer to Section B.3 for the full set of strategies recommended for full-scale study data collection.

For the third follow-up full-scale study, we propose a base incentive of $25, with certain groups of cases (described below) strategically targeted for modified treatments, including a higher incentive level of $55. The first group identified for modified treatment in the third follow-up is high school dropouts. Sample members who have ever dropped out of high school are an important analytic group that was targeted for an increased incentive in past rounds (an additional $20 in the second follow-up). In the third follow-up field test, the response rate for cases who had ever dropped out was 41.1 percent, compared with 59.7 percent for sample members who had never dropped out (p < .001). Given the analytic interest in cases who have ever dropped out of high school, targeting these sample members is indicated.

In the third follow-up field test, we identified groups of cases for modified treatment by using prior-round information to model response propensities, with the goal of minimizing response bias. Because the propensity-modeling approach considers respondent information (such as a full range of response and sociodemographic characteristics) inclusively and broadly, it was posited that it would also be able to determine which cases would contribute most to bias in estimates and to ensure that these cases receive priority. The details and results of the third follow-up field test experiment are presented in Part E of this package.
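
For illustration only, the sketch below shows the general shape of such a propensity model: a logistic regression fit on prior-round data and used to score current-round cases, with low-propensity cases becoming candidates for targeted treatment. The predictor names are hypothetical placeholders rather than actual ELS:2002 variables; the model actually used in the field test is documented in Part E.

    # Illustrative sketch only; predictor names are hypothetical placeholders.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    PREDICTORS = ["ever_dropout", "prior_round_nonrespondent", "ses_quartile", "contact_attempts"]

    def fit_propensity_model(prior: pd.DataFrame) -> LogisticRegression:
        """Fit a response propensity model on prior-round cases.

        `prior` has one row per case, the PREDICTORS columns, and a 0/1
        'responded' column recording whether the case responded in that round.
        """
        model = LogisticRegression(max_iter=1000)
        model.fit(prior[PREDICTORS], prior["responded"])
        return model

    def score_cases(model: LogisticRegression, current: pd.DataFrame) -> pd.Series:
        """Predicted probability of response for current-round cases.

        Cases in, say, the lowest quintile of these scores would be the
        low-propensity group considered for the higher incentive.
        """
        return pd.Series(model.predict_proba(current[PREDICTORS])[:, 1], index=current.index)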

In light of the field test results and recent discussions between OMB and NCES, ELS:2002 will move toward implementing a survey design with more responsive and adaptive features. ELS:2002 will move beyond response rates toward a metric that better indicates bias reduction and stability in key estimates. The goal of this new approach is to produce, when the data are weighted, less biased and more precise estimates of key population parameters and characteristics. Section B.3 contains further detail on the new approach, which targets cases based on a case-level Mahalanobis distance function describing a nonrespondent case’s distance from the mean responding case.

ELS:2002 contains three key data collection points at which the Mahalanobis function should be assessed: (1) at the conclusion of the early web self-administration phase, before CATI begins; (2) when cases are being targeted for CAPI; and (3) prior to sending out a proposed express shipment mailing containing a $5 prepaid incentive near the end of data collection. At each of these points, NCES and its contractor will evaluate progress, including the analytic importance of the remaining cases and how different they are from the cases already interviewed. To reduce the risk of bias, some of the remaining cases will be designated for special targeting, and the team will then apply the appropriate strategies listed below. As part of the responsive design being developed for ELS:2002/12, paradata (e.g., call counts, refusal history, contact status) will be regularly analyzed to help determine where design changes may be beneficial for reducing nonresponse.
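
For illustration only, the sketch below shows one straightforward way a case-level Mahalanobis distance of this kind could be computed from prior-round and paradata covariates. The covariate matrices and the targeting cutoff are hypothetical placeholders; the computation actually planned for ELS:2002/12 is the one described in Section B.3.

    # Illustrative sketch only; inputs and cutoffs are hypothetical placeholders.
    import numpy as np

    def mahalanobis_distances(respondents: np.ndarray, nonrespondents: np.ndarray) -> np.ndarray:
        """Distance of each nonrespondent from the mean responding case.

        Both arrays are cases x covariates (e.g., prior-round responses,
        sociodemographic indicators, and paradata such as call counts).
        """
        mu = respondents.mean(axis=0)              # mean responding case
        cov = np.cov(respondents, rowvar=False)    # covariance of the covariates
        cov_inv = np.linalg.pinv(cov)              # pseudo-inverse for numerical stability
        diffs = nonrespondents - mu
        # d_i = sqrt((x_i - mu)' S^{-1} (x_i - mu)) for each nonrespondent i
        return np.sqrt(np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs))

    def flag_for_targeting(distances: np.ndarray, n_target: int) -> np.ndarray:
        """Indices of the most atypical remaining cases (largest distances),
        which would be flagged for targeted treatments such as the $55 incentive."""
        return np.argsort(distances)[::-1][:n_target]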

Specifically, plans include classifying all cases who have ever dropped out of high school as a special group to target prior to data collection, and developing customized approaches for those cases as well as the special target cases identified at the three points during data collection to reduce the risk of bias. The strategies include:

    • Pre-data collection intensive tracing

    • Field tracing/CAPI field interviewing

    • Overnight express shipment near the end of data collection with $5 prepaid incentive

    • Increased incentive amount for targeted groups: $55 for targeted groups; $25 for other sample members

    • Phone interviewing from the start of data collection (i.e., no web-only period). This strategy is proposed only for ever-dropouts, who will be identified prior to data collection.

We recommend a base incentive of $25 for completing the third follow-up interview, increasing to $55 if the case is targeted for modified treatment as a result of one or more of the three evaluations during data collection. Cases will also be targeted for the additional $5 prepaid incentive mailing near the end of data collection based on the evaluation performed prior to that mailing. Cases who have ever dropped out of high school would receive $55 throughout data collection as well as the additional $5 prepaid incentive mailing, provided they have not responded to the interview prior to that mailing. The evaluations may result in adjustments to the selection of cases for certain treatments as data collection progresses.

From a cost perspective, increased incentives are also indicated for achieving higher response from special groups. In the second follow-up, sample members who had ever dropped out of high school presented considerable data collection challenges. We offered up to $60, and even with that increase over the base level, that group’s response rate was still comparatively low (83 percent unweighted for ever-dropouts vs. 89 percent unweighted overall). The costs of working a case longer in CATI, or of moving it on to fieldwork, are far greater than the cost of increasing the incentive above the base level. The field test average data-collector labor cost for a completed case in RTI’s Call Center Services was $58 (not including the cost of incentives, mailings, labor other than data collectors, or other costs). If incentives yield more self-administered web interviews before the CATI effort begins, then only a small cost per case is incurred in Help Desk labor. Similarly, if incentives enable CATI interviewers to secure an interview earlier in the data collection period than would otherwise have been achieved, then the average dollar amount per completed case will decrease. Further, if a case is not interviewed during CATI and the sample member must be contacted and interviewed by field staff, that additional effort is expensive: in the ELS:2002 second follow-up, a CAPI completed case cost an average of $530 on top of the already-incurred costs. The ELS:2002/12 data collection approach places particular emphasis on tracing, including multiple pre-data-collection panel maintenance updates, intensive tracing, field follow-up, and targeted mailings. These efforts are augmented with the incentive plan described above.

A.10 Assurance of Confidentiality

A data security plan (DSP) was developed and approved by the computer security review board for the ELS:2002 third follow-up. The ELS:2002 DSP represents best-practice survey systems and procedures for protecting respondent confidentiality and securing survey data. An outline of this plan is provided in Exhibit A-2. The ELS:2002 DSP:

  • establishes clear responsibility and accountability for data security and the protection of respondent confidentiality with corporate oversight to ensure adequate investment of resources;

  • details a structured approach for considering and addressing risk at each step in the survey process and establishes mechanisms for monitoring performance and adapting to new security concerns;

  • includes technological and procedural solutions that mitigate risk and emphasize the necessary training to capitalize on these approaches; and

  • is supported by the implementation of data security controls recommended by the National Institute of Standards and Technology for protecting federal information systems.



Exhibit A-2. ELS:2002 Third Follow-up Data Security Plan Outline

ELS:2002 Data Security Plan Summary

Maintaining the Data Security Plan

Information Collection Request

Our Promise to Secure Data and Protect Confidentiality

Personally Identifying Information That We Collect and/or Manage

Institutional Review Board Human Subject Protection Requirements

Process for Addressing Survey Participant Concerns

Computing System Summary

General Description of the RTI Networks

General Description of the Data Management, Data Collection, and Data Processing Systems

Integrated Monitoring System

Receipt Control System

Instrument Development and Documentation System

Data Collection System

Document Archive and Data Library

Employee-Level Controls

Security Clearance Procedures

Nondisclosure Affidavit Collection and Storage

Security Awareness Training

Staff Termination/Transfer Procedures

Subcontractor Procedures

Physical Environment Protections

System Access Controls

Survey Data Collection/Management Procedures

Protecting Electronic Media

Encryption

Data Transmission

Storage/Archival/Destruction

Protecting Hard-Copy Media

Internal Hard-Copy Communications

External Communications to Respondents

Handling of Mail Returns, Hard-Copy Student Lists, and Parental Consent Forms

Handling and Transfer of Data Collection Materials

Tracing Operations

Software Security Controls

Data File Development: Disclosure Avoidance Plan

Data Security Monitoring

Survey Protocol Monitoring

System/Data Access Monitoring

Protocol for Reporting Potential Breaches of Confidentiality

Specific Procedures for Field Staff



Under this plan, the ELS:2002 third follow-up data collection will conform fully to federal privacy legislation, including the Privacy Act of 1974 (5 U.S.C. 552a) and Section 9543 of the Education Sciences Reform Act of 2002 (20 U.S.C.). ELS:2002 will also conform to the NCES Restricted Use Data Procedures Manual and NCES Standards and Policies. The plan for maintaining confidentiality includes obtaining signed confidentiality agreements and notarized nondisclosure affidavits from all personnel who will have access to individual identifiers. Each individual working on ELS:2002 will also complete the e-QIP clearance process. The plan includes annual personnel training regarding the meaning of confidentiality and the procedures associated with maintaining it, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. The training will also cover controlled and protected access to computer files, built-in safeguards in the status monitoring and receipt control systems, and a secured and operator-manned in-house computing facility.

Communication materials provided or sent to sample members will include a statement about the voluntary nature of the survey and of the confidentiality provision, stating that their responses may be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002) 20 U.S.C., § 9573].

A.11 Sensitive Questions

The interview contains items about earnings, assets, and debts. Federal regulations governing the administration of such questions, which might be viewed as “sensitive” because they concern personal or private information, require (a) clear documentation of the need for the information as it relates to the primary purpose of the study, (b) provisions that clearly inform respondents of the voluntary nature of participation in the study, and (c) assurances of confidential treatment of responses. Information about earnings and assets provides vital labor force variables and important indicators of the rate of return of educational experiences to the respondent.

If a sample member’s SSN is unknown despite the prior rounds of data collection, it will be collected in the interview. This information is needed to obtain data from a variety of extant data sources, including student financial aid data from the Central Processing System (CPS), data from the National Student Loan Data System (NSLDS) loan and Pell Grant files, and GED test results. A description of matching procedures and the security measures in place for the linkages to extant data sources is provided in Part D, while the wording of the SSN question can be found in Appendix 1.

A.12 Estimates of Hour Burden for Information Collection for the Full-scale Study

Estimates of response burden for the ELS:2002 third follow-up full-scale study interview are shown in Exhibit A-3. The field test interview took on average 37.4 minutes to complete. Based on field test results and Technical Review Panel input, the interview has been trimmed and streamlined (see Appendix 1 for the questionnaire and Part C for a summary of changes), resulting in an estimated overall average interview length of 35 minutes for the full-scale instrument.




Exhibit A-3. Estimated Burden for ELS:2002 Third Follow-up Full-scale Study

Activity | Sample | Expected response rate | Number of respondents | Average burden/response (minutes) | Range of response times (minutes) | Total burden (hours)
Spring 2012 panel maintenance | 16,200 | 20% | 3,240 | 5 | ---- | 270
ELS:2002/12 interview | 16,200 | 90% | 14,580 | 35 | 25 to 45 | 8,505

NOTE: Table does not include transcript collection which will take place in 2013-14 and will be submitted in a separate package. The table includes the pre-data collection panel maintenance planned for May/June 2012. The table does not include the already-approved full-scale panel maintenance activities (OMB# 1850-0652 v.8) conducted in the fall of 2011.

Assuming a $20 hourly wage, the cost to ELS:2002/12 respondents for completing the survey is estimated at $170,100. Combined with the panel maintenance estimated cost of $5,400, the overall estimate is $175,500.
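
As a simple cross-check of these figures, the arithmetic below reproduces the burden hours in Exhibit A-3 and the respondent cost estimates, using the $20 hourly wage assumption stated above.

    # Reproduces the burden-hour and respondent-cost figures cited above.
    interview_hours = 14_580 * 35 / 60          # 8,505 hours for the ELS:2002/12 interview
    maintenance_hours = 3_240 * 5 / 60          # 270 hours for spring 2012 panel maintenance
    hourly_wage = 20                            # assumed $20 hourly wage, per the text

    print(interview_hours * hourly_wage)        # 170100.0 -> interview respondent cost
    print(maintenance_hours * hourly_wage)      # 5400.0   -> panel maintenance cost
    print((interview_hours + maintenance_hours) * hourly_wage)  # 175500.0 -> overall estimate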

Included in the notification letter and on the entry page to the online survey will be the following burden statement:

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number of this voluntary information collection is 1850-0652. The time required to complete this information collection is estimated to average around 35 minutes per response. If you have any comments concerning the accuracy of the time estimate or suggestions for improving the interview, please write to: U.S. Department of Education, Washington, DC 20202-4537. If you have comments or concerns regarding the status of your individual interview, write directly to: Education Longitudinal Study (ELS), National Center for Education Statistics, 1990 K Street NW, 9th floor, Washington, DC 20006.

A.13 Estimates of Costs

There are no capital, startup, or operating costs to respondents for participation in the project. No equipment, printing, or postage charges will be incurred.

A.14 Costs to Federal Government

Estimated costs to the federal government for ELS:2002 are shown in Exhibit A-4. Included in the contract estimates are all staff time, reproduction, postage, and telephone costs associated with the management, data collection, analysis, and reporting for which clearance is requested.


Exhibit A-4. Total Costs to NCES

Costs to NCES | Amount (in $)
Total ELS:2002/12 costs | 10,397,075
Salaries and expenses | 750,000
Contract costs | 9,647,075

NOTE: All costs quoted are exclusive of award fee. Table does not include transcript collections.

A.15 Reasons for Changes in Response Burden and Costs

The apparent increase in respondent burden reflects the fact that the last OMB approval covered the ELS:2002 third follow-up field test and the field-test and full-scale panel maintenance activities, while this request covers the ELS:2002 third follow-up full-scale study data collection plus pre-collection full-scale panel maintenance activities.

A.16 Publication Plans and Time Schedule

The ELS:2002 contract requires the following reports and other public information releases:

  • a detailed methodological report (in the form of a comprehensive Data File Documentation Report covering the base year through the third follow-up, with an appendix for the field test) describing all aspects of the data collection effort; and

  • complete restricted-use and public-use longitudinal data files and documentation for research data users and a First Look Report, presenting initial descriptive findings for dissemination to a broad audience.

Final deliverables for the third follow-up are scheduled for completion in 2013. (Final deliverables for the transcript study are scheduled for completion in 2015.) The operational schedule for the ELS:2002 third follow-up field test and full-scale study is presented in Exhibit A-5.


Exhibit A-5. Operational Schedule for ELS:2002/12 Field Test and Full-Scale Activities

Activity | Start | End
Field test
  Panel maintenance: contact updates for sample* | 10/2010 | 6/2011
  First round of cognitive testing of items* | 8/2010 | 9/2010
  Data collection* | 7/2011 | 9/2011
  Second round of cognitive testing* | 9/2011 | 10/2011
Full-scale study
  Panel maintenance: contact updates for sample* | 10/2010 | 6/2012
  Data collection | 7/2012 | 1/2013
Transcript collection
  Pilot testing of operations | 2/2013 | 8/2013
  Transcript data collection | 8/2013 | 3/2014
  Transcript keying and coding | 11/2013 | 8/2014

* Denotes activities already approved by OMB.

Note: The current request for OMB review includes only data collection activities for the full-scale study.

A.17 Approval to Not Display Expiration Date for OMB Approval

The expiration date for OMB approval of the information collection will be displayed on data collection instruments and materials. No special exception to this requirement is requested.

A.18 Exception to Certification for Paperwork Reduction Act Submissions

No exceptions are requested to the certification statement identified in the Certification for Paperwork Reduction Act Submissions of OMB Form 83-I.
