Part A HSLS-09 HS Transcript and College Update FT

High School Longitudinal Study of 2009 (HSLS:09) High School Transcript Collection and College Update Field Test and Second Follow-up Panel Maintenance

OMB: 1850-0852



October 14, 2011



High School Longitudinal Study of 2009 (HSLS:09)

College Update and Transcript Field Test (2012)




Supporting Statement

Part A






Request for OMB Review
OMB# 1850-0852 v.10







Submitted by


National Center for Education Statistics

U.S. Department of Education


Table of Contents

Section Page



EXHIBITS

Number Page




TABLES

Number Page




Attachments

Appendix 1 Data Collection Communication Materials

Appendix 2 College Update Interview (and Reinterview) Questionnaire

Appendix 3 High School Transcript Component Data Elements



High School Longitudinal Study of 2009


This document has been submitted to request clearance under the Paperwork Reduction Act of 1995 and 5 CFR 1320 for the High School Longitudinal Study of 2009 (HSLS:09). The study is being conducted for the National Center for Education Statistics (NCES) by RTI International—with the American Institutes for Research (AIR), Windwalker Corporation, Horizon Research Inc., Research Support Services (RSS), and MPR Associates (MPR) as subcontractors—under contract to the U.S. Department of Education (Contract number ED-IES10C0033).

The purpose of this Office of Management and Budget (OMB) submission is to request clearance for the HSLS:09 High School Transcript and College Update field tests (2012) and the second follow-up panel maintenance, and to request a 60-day Federal Register notice waiver for the clearance of the corresponding main study (2013) data collections, which are not expected to differ substantively from the field test data collections. This submission contains a description of the High School Transcript collection, along with a draft College Update questionnaire, which can be completed by either students or parents.

A. Justification

A.1 Circumstances Necessitating Collection of Information

A.1.a Purpose of This Submission

The materials in this document support a request for clearance for the field test and a 60-day Federal Register notice waiver for the College Update and High School Transcript collection for HSLS:09. The basic components and key design features of HSLS:09 are summarized below:

  • a survey and mathematics assessment administered to more than 21,000 9th-graders in 944 high schools during fall 2009;

  • surveys of 9th-graders’ parents, mathematics and science teachers, school administrators, and school counselors in fall 2009;

  • a first follow-up in spring 2012, when sample members are high school juniors, have dropped out, or are in other grades;

  • surveys of school administrators, school counselors, and a subsample of parents in 2012;

  • a return to the same schools, with separate follow-up of transfer students and dropouts;

  • a College Update with parents or students in the summer/fall after modal senior year (2013);

  • high school transcript component in the 2013-2014 academic year (records data for grades 9–12); and

  • post-high school follow-ups by web survey and computer-assisted telephone interview (CATI; scheduled for spring 2015 and spring 2021).

HSLS:09 supports several goals of the President’s agenda and the American Competitiveness Initiative (ACI): 1) Strengthening math and science education; 2) Improving the high school experience in the United States; 3) Expanding access to postsecondary education. First, HSLS:09 brings a new and special emphasis to the study of youth transition by exploring the paths that students pursue in the fields of science, technology, engineering, and mathematics (STEM). Second, the College Update takes a snapshot of the cohort at a key transition point between high school and postsecondary education or work. The HSLS:09 data will help analysts examine how high school experiences influence this transition. Third, HSLS:09 will provide policymakers with data about the factors that promote or inhibit postsecondary access and matriculation.

Indeed, before reaching postsecondary education, students must complete secondary education, an attainment that eludes a large fraction of the student population. High schools that do not graduate at least 60% of their students have been labeled “dropout factories” (Balfanz and Legters, 2004). Yet the decision to drop out is made at several levels: by the individual, within the parent and family, among peers, and within high schools. HSLS:09 provides data to illuminate the multilevel factors involved in students’ decisions to leave high school without a diploma. Eventually, a percentage of dropouts will obtain alternative credentials such as the GED, and the timing and means of earning these credentials will be captured in the HSLS:09 data. All students, including dropouts and holders of alternative diplomas, may pursue some form of postsecondary education at some point, so their postsecondary educational aspirations, plans, and expectations must be understood as well.

The HSLS:09 survey collects student-level factors, such as expectations and academic preparation, that may combine with background characteristics to facilitate or inhibit enrollment at a postsecondary institution. The postsecondary choice set includes the type of institution attended, whether it was a first-choice institution, the intensity of enrollment (full- or part-time), and the means of funding these choices, among other matters. The choice is also shaped by institutional characteristics, such as the types of institutions available, the programs they offer, the financial aid they provide, and their reputation, size, cost, and social environment.

Once students matriculate at postsecondary institutions, additional issues that HSLS:09 can address include rate of return on sub-baccalaureate and baccalaureate education, postsecondary educational persistence, patterns of transfer and ‘swirling’, and baccalaureate attainment. Brock (2010) points out that changes in federal policy and public attitudes have opened up higher education to women, minorities, and nontraditional students, who are overrepresented in sub-baccalaureate programs and in community colleges (Provasnik and Planty 2008). Students at two-year colleges, however, are less likely than those at four-year institutions to complete a bachelor’s degree, even when they plan and expect to do so (Brock 2010; Provasnik and Planty 2008). With the increased prevalence and importance of community college comes the need to understand better who attends, for what reasons, under what circumstances, and to what outcomes. Success in community college reflects the combined influence of opportunity structures, institutional practices, and the social, economic, and academic attributes students bring to college (Goldrick-Rab 2010).

With more nontraditional and underprepared students in community college, it is important to understand nontraditional pathways into and through higher education. As Adelman observes, “the complexity of student postsecondary enrollment patterns has accelerated” (Adelman 2006). Issues such as patterns of transfer are particularly important, as exemplified by purposeful migration from two-year to four-year institutions versus “swirling” (wandering from one school to another), which has a significant negative association with degree completion (Adelman 2006). Stopping out, dropping in, and auditing are additional patterns of attendance that must be delineated and understood and that may be tied to the relatively low baccalaureate attainment rate. Even when low-SES and first-generation college students begin in 4-year institutions, they are disproportionately likely to engage in “reverse” movement from 4-year to 2-year schools, which in turn is associated with much lower odds of completion (Goldrick-Rab and Pfeffer 2009). The HSLS:09 College Update and subsequent follow-ups will examine these complexities of postsecondary educational access, choice, and success.

In sum, the HSLS:09 design, with its carefully selected points of data collection, captures the ever-longer and more branching path to adulthood. The in-school rounds capture the transition into high school, the bulk of the high school experience, the gains in algebraic reasoning achievement expected during the high school years, and the transition out of high school, from early plans and expectations to actual choice. The two high school assessment and survey points are complemented by the continuous data from 9th grade through 12th grade that will be supplied by high school transcripts. The College Update collects the evolved plans and decisions of students at the very transition point out of high school and onto a postsecondary pathway and opens a window onto issues of access and choice. Subsequent rounds of data collection will explore patterns of postsecondary attendance, especially postsecondary educational persistence, intensity of enrollment, remediation, transfer, and attainment, as well as the transition of non-college-going young adults into the labor market.

A.1.b Legislative Authorization

HSLS:09 is sponsored by NCES, within the Institute of Education Sciences (IES), in close consultation with other offices and organizations within and outside the U.S. Department of Education (ED). HSLS:09 is authorized under Section 9543 of the Education Sciences Reform Act of 2002 (20 U.S.C. § 9543).

A.1.c Prior and Related Studies

In 1970, NCES initiated a program of longitudinal high school studies. Its purpose was to gather time-series data on nationally representative samples of high school students that would be pertinent to the formulation and evaluation of education policies.

Starting with the National Longitudinal Study of the High School Class of 1972 (NLS:72), NCES began providing education policymakers and researchers with longitudinal data that linked education experiences with later outcomes. Almost 10 years later, in 1980, High School and Beyond (HS&B) included one cohort of high school seniors comparable to the seniors in NLS:72. The third longitudinal study of students was the National Education Longitudinal Study of 1988 (NELS:88), which began with a cohort of 8th-graders. The Education Longitudinal Study of 2002 (ELS:2002) started with a 10th-grade cohort in 2002, with follow-ups in 2004 and 2006 and a third follow-up scheduled for 2012.

These studies have investigated the educational, personal, and vocational development of students and the school, familial, community, personal, and cultural factors that affect this development. Each of these studies has provided rich information about the critical transition from high school to postsecondary education and the workforce. HSLS:09 continues on the path of its predecessors while also creating a new focus on factors associated with choosing, persisting in, and succeeding in STEM course-taking and careers.

A.2 Purpose and Use of Information Collection

HSLS:09 is intended to be a general-purpose dataset; that is, it is designed to serve multiple policy objectives. Policy issues studied through HSLS:09 include the identification of school attributes associated with mathematics achievement, college entry, and career choice; the influence that parent and peer involvement have on student achievement, activities, plans, decisions, and development; the factors associated with dropping out of the education system; and the transition of different groups (for example, racial and ethnic, gender, and socioeconomic status groups) from high school to postsecondary institutions and the labor market, and especially into STEM curricula and careers.

The objectives of HSLS:09 also encompass the need to support both longitudinal and cross-cohort analyses and to provide a basis for important descriptive cross-sectional analyses. HSLS:09 is first and foremost a longitudinal study; hence survey items are chosen for their usefulness in predicting or explaining future outcomes as measured in later survey waves. Compared to its earlier counterparts, HSLS:09 incorporates considerable design changes, specifically in the grade at which data are collected in the in-school rounds, that will need to be considered when making trend comparisons. NELS:88 began with an eighth-grade cohort in the spring term; although this cohort is not markedly different from the fall-term 9th-grade cohort of HSLS:09 in terms of student knowledge base, it differs at the school level: the HSLS:09 time point represents the beginning of high school rather than the point of departure from middle school. HSLS:09 includes a spring-term 11th-grade follow-up (which none of the predecessor studies did) because only modest gains have been seen on assessments in the final year of high school, and the 11th-grade follow-up minimizes unit nonresponse problems associated with testing in the spring term of the senior year. The design of HSLS:09 calls for information to be collected from parents of (modal) 11th-graders and for the use of a parent/student College Update survey. The collection of transcripts will provide continuous term-by-term data for grades 9 through 12, which will relate achievement gains to coursetaking choices. All of these data will allow trend comparisons of coursetaking patterns and other common high school policy issues with HS&B, NELS:88, and ELS:2002.

The HSLS:09 College Update allows analysts to understand students’ college plans, preparation, and information-seeking behavior. The survey questions will capture whether and where students applied, whether and where they were accepted, and with what financial aid (if any); what they expect to be doing in the fall after their intended high school graduation date; and parental resources and support. Past studies have waited for retrospective data on these critical topics, but, given today’s policy climate, these issues are too important to tolerate the inaccuracies of post hoc recall. The College Update will provide data at the most critical time period to predict postsecondary matriculation and to track how students’ plans evolve from the beginning of high school to the end of high school.

A.2.a Content Justifications: College Update and High School Transcripts

College Update Questionnaire

The College Update is a 20-minute questionnaire to be answered by either the student or the parent. The questions are largely objective, since either a student or a parent must know enough to provide the same answers. The College Update instrument anchors responses to November 1, when college-bound students should be enrolled in postsecondary education and most of the cohort will have completed high school. Main study cohort members will not be interviewed again until 2015, when most will be two years beyond high school and when information about postsecondary plans, choice sets, and decision processes would be stale and subject to recall bias and ex post facto rationalization.

The questionnaire is organized around seven topics: (1) high school completion status; (2) plans for fall 2012 (for the field test; 2013 for the main study); (3) applications to postsecondary education institutions; (4) acceptance to postsecondary institutions; (5) applications for and offers of financial aid; (6) rationale for deciding on a particular institution or deciding not to attend a postsecondary institution; and (7) employment experiences and plans. The College Update marks another stage for tracing and measuring the evolution of postsecondary plans, including STEM-related plans, and for connecting plans to outcomes. Each topic in the questionnaire contributes to the fuller understanding of the factors affecting postsecondary educational access and choice, or to employment outcomes. Additionally, and relevant to all topic areas, the College Update asks the respondent to report how well their high school counselor prepared them to apply to college, apply for financial aid, and find a job. Opinions from students (and parents) can then be compared against reported goals of the high school counselor for research on the role of high school counselors in the transition beyond high school. The following section discusses each topic in more detail.

High School Completion Status. Since high school completion is a critical defining event for this cohort, four distinct types of completion status will be discerned: 1) early completion (ahead of the modal completion time for the cohort); 2) on-time completion (the mode for the cohort); 3) non-completion (still in high school or in homeschool); and 4) non-completion (dropout). The survey will map these four critical pathways. Because of differences in the quality of educational experience that each represents, it is also important to know the type of high school credential obtained. Additionally, names of high schools attended since last interviewed will be collected, which will be valuable to the high school transcript collection.

Plans for Fall 2012. The field test questionnaire asks respondents to report their plans on November 1, 2012 (2013 for the main study), when the majority of the cohort will be entering postsecondary education, entering the workforce, serving in the military, starting a family, or a combination of these or other activities. Respondents planning to continue their education or training will be asked to report their chosen postsecondary institution, the type of program, full-time and part-time enrollment status, and expected major field of study. Those planning to work will be asked if they plan to work full-time or part-time. Respondents electing to enlist in the military will be asked if they plan to be on active duty.

Applications to Postsecondary Institutions. The College Update supplies an invaluable opportunity to take a snapshot of the cohort’s plans at a key transition point. All respondents, regardless of fall plans, will be asked the names of at most two postsecondary institutions to which they applied, in addition to the institution they will attend, if applicable. Respondents then will be requested to mark which of these schools was or is their first choice.

Acceptance to Postsecondary Institutions. For each of the institutions to which the student applied, the respondent will be asked if he/she was accepted, rejected, or waitlisted. Additionally, the respondents will be asked to report their first choice among those schools where they were accepted.

Applications for and Offers of Financial Aid. College choice is sensitive to cost, and cost can be mitigated by financial aid. The College Update instrument asks all respondents, regardless of postsecondary plans, whether financial aid was applied for and whether financial aid was received. The instrument ascertains the type of financial aid offered by the institution to be attended on November 1, as well as financial aid offered by the institution designated as the respondent’s first choice among those where he or she was accepted. In addition, the instrument ascertains financial aid offered from other sources, such as a community or religious organization. Those students (and families) who did not apply for financial aid will be asked their reasons for not applying. The information collected in this section of the questionnaire will be complemented by federal financial aid data from the Central Processing System (FAFSA) and the National Student Loan Data System for federal loans (Stafford, Perkins) and federal grants (Pell), which will be acquired later through administrative records matching.

Rationale for Deciding on a Particular School. In order to understand why cohort members accepted an offer of admission from a particular school or opted not to attend any, the questionnaire asks about financial aid, private costs to the student (and family), the reputation of the institution, and proximity to home. Cost—actual or perceived—is important information for modeling choice. Information is gathered about cost relative to the first-choice institution and the school that will actually be attended. Those who chose not to continue their education or training will be asked about their rationale for deciding not to extend their education after high school.

Employment. The field test College Update also collects information on employment in the fall of 2012 (2013 for the main study). Questions will include the job title and duties, whether the job is an apprenticeship or will lead to licensure in an occupational field, relation to career goals, whether the high school assisted the student in getting the job, and compensation.

Locating Information. The College Update also collects some additional locating information to assist in tracking the cohort as it disperses at this key transition point.

High School Transcripts

High school transcript studies have been conducted by NCES as part of the Longitudinal Studies Program and the National Assessment of Educational Progress (NAEP) High School Transcript Studies (HSTS) program since 1982. Transcripts constitute the official, fixed record of student coursetaking. Apart from the enormous burden that self-reporting would impose, students find it difficult to recall accurate information about the number and types of courses taken, the dates taken, and the grades received over a four-year period. As such, transcripts are considered more accurate than students’ self-reported information. While the collection of administrative records from schools presents a number of challenges and has its own set of limitations (e.g., uncooperative schools, incomplete records), transcripts offer an objective, reliable, cost-effective means of obtaining information about crucial aspects of students’ educational experiences.

Specific content requested in the High School Transcript component includes:

Student-level information:

  • Type of diploma awarded

  • Date diploma awarded

  • Date student left school (for students who did not graduate)

  • Reason student left school (graduated, transferred, etc.)

  • Cumulative GPA

  • Dual (concurrent) enrollment

  • Standardized test scores for the PSAT, SAT, ACT, and Advanced Placement tests


Coursetaking histories for grades 9 through 12 (plus high-school-level courses, such as algebra, taken before 9th grade):

  • Course title and number

  • Year, grade level, and term course taken

  • Number of credits earned

  • Grade assigned

School-level information:

  • Grade scale

  • Course grade weighting system used, if any

  • Availability of student-level information

  • GPA formula

  • Carnegie unit conversion information

  • Term system used

  • Course catalogs (if not collected previously)

  • Types of diplomas granted

  • Credits required for different types of diplomas


Course catalogs will be collected, and a course offerings file will be created in addition to a transcript file.
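To illustrate how the school-level elements above (grade scale, grade weighting system, Carnegie unit conversion) combine with keyed course records, the following is a minimal sketch of a standardized, credit-weighted GPA computation. All field names, scale values, and weights are hypothetical and do not represent the actual HSLS:09 file layout or any particular school's policy.

```python
# Hypothetical sketch: combining a school's reported grade scale,
# honors weighting, and Carnegie unit conversion to standardize GPA.
# Values below are illustrative assumptions, not HSLS:09 specifications.

GRADE_SCALE = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}
HONORS_WEIGHT = 0.5          # additive weight for honors/AP, if the school uses one
CARNEGIE_PER_CREDIT = 1.0    # school-specific credit-to-Carnegie-unit conversion

def standardized_gpa(courses):
    """Credit-weighted GPA from keyed course records.

    Each course is a dict like:
      {"grade": "B", "credits": 0.5, "honors": True}
    """
    quality_points = 0.0
    total_units = 0.0
    for c in courses:
        units = c["credits"] * CARNEGIE_PER_CREDIT
        points = GRADE_SCALE[c["grade"]]
        if c.get("honors"):
            points += HONORS_WEIGHT
        quality_points += points * units
        total_units += units
    return quality_points / total_units if total_units else 0.0

transcript = [
    {"grade": "A", "credits": 1.0, "honors": False},
    {"grade": "B", "credits": 1.0, "honors": True},
    {"grade": "C", "credits": 0.5, "honors": False},
]
# standardized_gpa(transcript) yields 3.4 under these assumed weights
```

Because schools report different grade scales, weighting systems, and term structures, a computation of this kind must be parameterized per school, which is why those elements are requested alongside the course records.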

The immense value of high school transcripts as objective, reliable measures of crucial aspects of students’ educational experiences is widely recognized. In the course of the first major NCES transcript study (in High School and Beyond, 1982), a methodological comparison was made of student self-reports and school records. This comparison (Fetters, Stowe, and Owings, 1984) established that, with respect to level of detail, accuracy, and completeness, transcript data are vastly superior to student self-reports of exposure to learning situations and grades received.

When coupled with data on students’ family backgrounds and demographic characteristics, school environments, and standardized achievement and outcome measures, transcripts permit the specification of complex models of educational processes. Moreover, transcript components of longitudinal studies such as HS&B, NELS:88, ELS:2002, and HSLS:09 permit the measurement of high school program and course effects on post-high school outcomes in both the labor force and postsecondary education. The data are invaluable both for inter-cohort analysis (Dalton et al., 2007) and intra-cohort analysis, such as measuring the relationship between mathematics coursetaking and achievement gains (Bozick and Ingels 2008).

A.3 Use of Improved Information Technology and Burden Reduction

A.3.a College Update

With few exceptions, data for the HSLS:09 College Update will be collected via electronic media. Student sample members and their parents will be given a username and password and will be asked to have one of the two (either student or parent) complete the College Update via the Internet. There will be a computer-assisted telephone interview (CATI) follow-up for student sample members and their parents who do not complete the web questionnaire by self-administration. Computer control of interviewing efficiently manages survey activities, including scheduling of calls, generation of reports on sample disposition, data quality monitoring, interviewer performance monitoring, and the flow of information between telephone and field operations.

Additional features of the CATI system include: 1) online help for each screen to assist interviewers in question administration; 2) full documentation of all instrument components, including variable ranges, formats, record layouts, labels, question wording, and flow logic; 3) capability for creating and processing hierarchical data structures to eliminate data redundancy and conserve computer resources; 4) a scheduler system to manage the flow and assignment of cases to interviewers by time zone, case status, appointment information, and prior case disposition; 5) an integrated case-level control system to track the status of each sample member across the various data collection activities; 6) automatic audit file creation and timed backup to ensure that, if an interview is terminated prematurely and later restarted, all data entered during the earlier portion of the interview can be retrieved; and 7) a screen library containing the survey instrument as displayed to the interviewer.

A.3.b High School Transcripts

As a first step in the high school transcript collection, RTI collected course catalogs, when and where possible, during the first follow-up field test in-school survey administration for the academic years 2008–09 to 2011–12 and will do likewise with main study schools for years 2009–10 to 2012–13. To the extent that the catalogs are available in electronic format, we will obtain catalogs electronically via e-mail or upload to the study website. Any schools whose catalogs cannot be obtained in that manner will be asked to provide hard copy course catalogs via an express delivery service.

High school transcripts will be requested for the cohort from the schools from which the field test (and main study) students were sampled as part of the HSLS:09 base year, as well as from all additional schools attended by the students since then. A complete transcript will be requested for each sample member. Several methods will be used to obtain the transcript data, including:

  1. Asking school staff to upload electronic transcripts for sampled students to a secure study website;

  2. Asking school staff to send electronic transcripts for sampled students by secure File Transfer Protocol;

  3. Asking school staff to send electronic transcripts via e-mail with encrypted attachments;

  4. Obtaining transcripts directly using a dedicated server at the University of Texas at Austin (described in more detail below) for those schools participating in the program;

  5. If none of the above methods is possible, school staff may send transcripts to a secure electronic fax at RTI (after sending a confirmed test page); or

  6. If none of the above methods is possible, school staff may send the transcripts via an express delivery service after redacting personally identifiable information.

The fourth collection method listed above is a relatively new process. Approximately 100 high schools across the nation currently send academic transcripts in standardized electronic formats via a dedicated server at the University of Texas at Austin. The server now supports Electronic Data Interchange (EDI) and XML formats. Based on RTI’s experience with the collection of postsecondary transcripts for B&B:08/09 and BPS:04/09, it is likely that very few high schools will provide data via this server. However, offering this option will reduce the transcript-reporting burden for those schools that do use it.
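As a rough illustration of what consuming an XML-format transcript feed might involve, the sketch below parses course entries from a small XML fragment. The element and attribute names are hypothetical; the actual server uses standardized EDI/XML transcript formats whose schemas differ from this example.

```python
# Hypothetical sketch: extracting course records from an XML transcript.
# Element/attribute names are illustrative only, not the real schema.
import xml.etree.ElementTree as ET

SAMPLE = """
<transcript studentId="FT-001">
  <course code="270404" title="Algebra II" grade="B" credits="1.0"/>
  <course code="230101" title="English 11" grade="A" credits="1.0"/>
</transcript>
"""

def parse_courses(xml_text):
    """Return a list of course dicts from one XML transcript document."""
    root = ET.fromstring(xml_text)
    return [
        {"code": c.get("code"), "title": c.get("title"),
         "grade": c.get("grade"), "credits": float(c.get("credits"))}
        for c in root.findall("course")
    ]
```

A structured feed of this kind is what makes the server option low-burden for schools: records arrive already keyed, so the manual keying step described below can be skipped for these cases.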

After collecting the transcripts and catalogs, data from the transcripts will be keyed, when needed, and the courses coded. Courses will be coded using a course-coding taxonomy based on the Classification of Secondary School Courses (CSSC) from the 2000 National Assessment of Educational Progress (NAEP) High School Transcript Study, or an updated coding taxonomy if applicable. Because the CSSC is a modified version of the Classification of Instructional Programs (CIP), RTI will carefully review both the CSSC and the 2010 CIP to identify any updates that may be beneficial to carry over to the CSSC. Guidance will be provided by NCES, technical review panel members, and other key personnel on refining and reviewing the taxonomy for transcript coding and new courses and fields of study.

Verifications of transcript data keying and coding at the student level will be performed. Any errors will be recorded and corrected as needed. Once the transcripts for each school are keyed and coded, transcript course coding at the school level will be reviewed by expert coders to ensure that: (1) coding taxonomies have been applied consistently and data elements of interest have been coded properly within schools; (2) program information has been coded consistently according to the program area and sequence level indicators in course titles; (3) records of sample members who attended multiple schools do not have duplicate entries for credits that transferred from one school to another; and (4) additional information has been noted and coded properly.
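One of the school-level review steps, checking that sample members who attended multiple schools do not carry duplicate entries for transferred credits, lends itself to an automated flagging pass before expert review. The following is a minimal sketch under assumed field names (which are illustrative, not the HSLS:09 file layout).

```python
# Hypothetical sketch: flag course entries that appear on transcripts
# from more than one school for the same student, which may indicate
# duplicated transfer credits. Field names are illustrative only.
from collections import defaultdict

def flag_duplicate_transfers(records):
    """Group one student's keyed course records by (course code, year, term)
    and return any key that appears on transcripts from multiple schools."""
    seen = defaultdict(set)
    for r in records:
        key = (r["cssc_code"], r["year"], r["term"])
        seen[key].add(r["school_id"])
    return [key for key, schools in seen.items() if len(schools) > 1]

records = [
    {"school_id": "S1", "cssc_code": "270404", "year": 2010, "term": 1},
    {"school_id": "S2", "cssc_code": "270404", "year": 2010, "term": 1},  # transferred copy
    {"school_id": "S2", "cssc_code": "230101", "year": 2011, "term": 2},
]
```

A flag of this kind only surfaces candidates; whether a repeated entry is a genuine duplicate or a legitimately retaken course is a judgment left to the expert coders described above.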

A.4 Efforts to Identify Duplication and Use of Similar Information

Since 1970, NCES has consulted with other federal offices to ensure that the data collected in this important series of longitudinal studies do not duplicate information from any other national data sources. In addition, NCES staff have regularly consulted with nonfederal associations such as the College Board, American Educational Research Association, the National Association for College Admission Counseling, the American Association of Community Colleges, and other groups to confirm that the HSLS:09 data to be collected are not available from any other sources. These consultations also provide methodological and substantive insights from other studies of secondary and postsecondary students and labor force members. This openness to input and feedback ensures that the data collected through HSLS:09 will meet the needs of the federal government and other interested agencies and organizations.

Within NCES, HSLS:09 builds on and extends past studies rather than duplicating them. First, the instrumentation and design of HSLS:09 explicitly complement the redesign of NPSAS and BPS. HSLS:09 staff ensure that the study asks the questions raised by NPSAS and BPS about what happens to their participants before they enter those postsecondary studies. Indeed, the postsecondary and secondary longitudinal survey staff collaborate extensively to align the foci of the research questions, the definition and meaning of study constructs, and the measurement of these constructs across survey programs. Such collaboration maximizes the possibility of producing an analytically valuable data product of interest to educators, researchers, and policymakers.

Second, design articulation with prior NCES secondary longitudinal studies also shows coordination, not duplication. These earlier studies were conducted during the 1970s, 1980s, 1990s, and the early 2000s and represent education, employment, and social experiences and environments different from those experienced by the HSLS:09 student sample. In addition to extending prior studies temporally as a time series, HSLS:09 extends them conceptually. To a greater degree than the prior secondary longitudinal studies, HSLS:09 provides data that are necessary to understand the role of different factors in the development of student commitment to attend higher education and the steps necessary to persist and succeed in college (applying for financial aid, taking courses in specific sequences, etc.). Further, HSLS:09 focuses on the factors associated with choosing and persisting in mathematics and science course-taking and STEM careers. These focal points present a marked difference between HSLS:09 and its predecessor studies.

The only other dataset that offers so large an opportunity to understand the key transitions into postsecondary institutions or the world of work is the Department of Labor (Bureau of Labor Statistics) longitudinal cohorts, the National Longitudinal Survey of Youth 1979 and 1997 cohorts (NLSY79, NLSY97). However, the NLSY youth cohorts represent temporally earlier cohorts than HSLS:09. There are also important design differences between NLSY79/NLSY97 and HSLS:09 that render them more complementary than duplicative. NLSY is a household-based longitudinal survey; HSLS:09 is school-based. For both NLSY cohorts, base-year Armed Service Vocational Aptitude Battery (ASVAB) test data are available, but there is no longitudinal high school achievement measure. Although NLSY97 also gathers information from schools (including principal and teacher reports and high school transcripts), it cannot study school processes in the same way as HSLS:09, given its household sampling basis. Any given school contains at most a few NLSY97 sample members, a number that constitutes neither a representative sample of students in the school nor a sufficient number to provide within-school estimates. Thus, although both studies provide important information for understanding the transition from high school to the labor market, HSLS:09 is uniquely able to provide information about education processes and within-school dynamics and how these affect both school achievement and ultimate labor market outcomes, including outcomes in STEM education and occupations.

Both NAEP and the secondary longitudinal studies sponsor periodic collections of transcripts, but the NAEP transcript data cannot be used longitudinally, since NAEP is a cross-sectional study.

A.5 Impact on Small Businesses or Other Small Entities

Target respondents for HSLS:09 are individuals who typically have recently completed high school. Data collection activities will involve no burden to small businesses or entities.

A.6 Consequences of Collecting the Information Less Frequently

This submission describes the field test and main study data collection for the College Update, which will take place in the summer/fall of 2012 for the field test, with the main study conducted a year later, and a high school transcript collection in the 2012-13 academic year for the field test and 2013-14 for the main study. A second follow-up is scheduled for the spring of 2015. The tentative design for the study calls for a final round at about age 26 (2021). Recent education and social welfare reform initiatives, changes in federal policy concerning postsecondary student support, and other interventions necessitate frequent studies. Important areas of change for which better information is needed include the increasing role of community colleges and the needs of demographic minorities and first-generation college-goers. Repeated surveys are also necessary because of rapid changes in the secondary and postsecondary education environments and the world of work. Indeed, longitudinal information arguably provides better measures of the effects of program, policy, and environmental changes than would multiple cross-sectional studies.

The HSLS:09 cohort is first surveyed at the very beginning of high school to provide a baseline which also includes the full pool of potential high school dropouts. The First Follow-up occurs in what will be, for most, the spring of their junior year. The College Update is a snapshot taken in the summer/fall after the cohort’s modal senior year and records sample members’ status in terms of the transition to higher education and the workforce, with an anchor in expected status as of October 2013 for the main study. The timing is important in that it provides a fresh and immediate look at the outcomes of the cohort’s postsecondary planning. High school transcripts will be collected in the 2013-14 academic year, when most cohort members have completed high school. Postsecondary follow-ups are tentatively planned for the modal three-years-out of high school time point, the ideal juncture at which to study postsecondary access and choice, and for eight-years-out of high school, to capture final outcomes. While an argument could be made for additional data points, less frequent collection would adversely affect the study’s ability to meet its goals.

A.7 Special Circumstances Relating to Guidelines of 5 CFR 1320.5

All data collection guidelines in 5 CFR 1320.5 are being followed. No special circumstances of data collection are anticipated.

A.8 Consultations Outside NCES

Consultations with persons and organizations both internal and external to NCES and the federal government have been pursued. In the planning stage for HSLS:09, there were many efforts to obtain critical review and to acquire comments regarding project plans and interim and final products. The first follow-up Technical Review Panel (TRP) has also been convened and serves as the major vehicle through which future consultation will be achieved in the course of the project. The TRP met in September of 2010 and in June of 2011, and its recommendations on the data elements included in the College Update have been considered in developing the College Update instruments.

For base-year and first follow-up assessment development, a mathematics advisory panel comprising the following experts was formed:

  • Hyman Bass, Professor of Mathematics, University of Michigan;

  • Katherine Halvorsen, Professor of Mathematics and Statistics, Smith College;

  • Joan Leitzel, President Emeritus, University of New Hampshire and Professor of Mathematics (retired), Ohio State University;

  • Mark Saul, Mathematics Teacher (retired), Bronxville High School, NY; and

  • Ann Shannon, Mathematics Education Consultant, Oakland, CA.

Additional consultants outside ED and members of the base-year and first follow-up Technical Review Panels include the following individuals:

Base-Year Technical Review Panel and NCES Research Consultants

Dr. Clifford Adelman
The Institute for Higher Education Policy
1320 19th Street, NW, Suite 400
Washington, DC 20036
Phone: (202) 861-8223 ext. 228
Fax: (202) 861-9307
E-mail:
[email protected]

Dr. Kathy Borman
Department of Anthropology, SOC 107
University of South Florida
4202 Fowler Avenue
Tampa, FL 33620
Phone: (813) 974-9058
E-mail:
[email protected]

Dr. Daryl E. Chubin
Director
Center for Advancing Science & Engineering Capacity
American Association for the Advancement of Science (AAAS)
1200 New York Avenue, NW
Washington, DC 20005

Dr. Jeremy Finn
State University of New York at Buffalo
Graduate School of Education
409 Baldy Hall
Buffalo, NY 14260
Phone: (716) 645-2484
E-mail:
[email protected]

Dr. Thomas Hoffer
NORC
1155 E. 60th Street
Chicago, IL 60637
Phone: (773) 256-6097
E-mail:
[email protected]

Dr. Vinetta Jones
Howard University
525 Bryant Street NW
Academic Support Building
Washington, DC 20059
Phone: (202) 806-7340 or (301) 395-5335
E-mail:
[email protected]

Dr. Donald Rock
Before 10/15: K11 Shirley Lane
Trenton NJ 08648
Phone: 609-896-2659
After 10/15: 9357 Blind Pass Rd, #503
St. Pete Beach, FL 33706
Phone: (727) 363-3717
E-mail:
[email protected]

Dr. James Rosenbaum
Institute for Policy Research
Education and Social Policy
Annenberg Hall 110 EV2610
Evanston, IL 60204
Phone: (847) 491-3795
E-mail:
[email protected]

Dr. Russ Rumberger
Gevirtz Graduate School of Education
University of California
Santa Barbara, CA 93106
Phone: (805) 893-3385
E-mail:
[email protected]

Dr. Philip Sadler
Harvard-Smithsonian Center for Astrophysics
60 Garden St., MS 71
Office D-315
Cambridge, MA 02138
Phone: (617) 496-4709
Fax: (617) 496-5405
E-mail: [email protected]

Dr. Sharon Senk
Department of Mathematics
Division of Science and Mathematics Education
Michigan State University
D320 Wells Hall
East Lansing, MI 48824
Phone: (517) 353-4691 (office)
E-mail: [email protected]

Dr. Timothy Urdan
Santa Clara University
Department of Psychology
500 El Camino Real
Santa Clara, CA 95053
Phone: (408) 554-4495
Fax: (408) 554-5241
E-mail:
[email protected]

Other Consultants Outside ED

Dr. Eric Bettinger
Associate Professor, Economics
Case Western Reserve University
Weatherhead School of Management
10900 Euclid Avenue
Cleveland, OH 44106
Phone: (216) 386-2184
E-mail:
[email protected]

Dr. Audrey Champagne
Professor Emerita
University at Albany
Educational Theory and Practice
Education 119
1400 Washington Avenue
Albany, NY 12222
Phone: (518) 442-5982

Dr. Stefanie DeLuca
Assistant Professor
Johns Hopkins University
School of Arts and Sciences
Department of Sociology
532 Mergenthaler Hall
3400 North Charles Street
Baltimore, MD 21218
Phone: (410) 516-7629
E-mail:
[email protected]

Dr. Laura Hamilton
RAND Corporation
4570 Fifth Avenue, Suite 600
Pittsburgh, PA 15213
Phone: (412) 683-2300 ext. 4403
E‑mail:
[email protected]

Dr. Jacqueline King
Director for Policy Analysis
Division of Programs and Analysis
American Council on Education
Center for Policy Analysis
One Dupont Circle, NW
Washington, DC, 20036
Phone: (202) 939-9551
Fax: 202-785-2990
E-mail:
[email protected]

Dr. Joanna Kulikowich
Professor of Education
The Pennsylvania State University
232 CEDAR Building
University Park, PA 16802-3108
Phone: (814) 863-2261
E‑mail:
[email protected]

Dr. Daniel McCaffrey
RAND Corporation
4570 Fifth Avenue, Suite 600
Pittsburgh, PA 15213
Phone: (412) 683-2300 ext. 4919
E-mail:
[email protected]

Dr. Jeylan Mortimer
University of Minnesota—Dept. of Sociology
909 Social Sciences Building
267 19th Avenue South
Room 1014a Social Sciences
Minneapolis, MN 55455
Phone: (612) 624-4064
E-mail:
[email protected]

Dr. Aaron Pallas
Teachers College
Columbia University
New York, NY 10027
Phone: (646) 228-7414
E-mail:
[email protected]

Ms. Senta Raizen
Director
WestEd
National Center For Improving Science Education
1840 Wilson Blvd., Suite 201A
Arlington, VA 22201-3000
Phone: (703) 875-0496
Fax: (703) 875-0479
E-mail:
[email protected]



Technical Review Panel—First Follow-Up

Brian Cook
American Council on Education
One Dupont Circle NW, Suite 800
Washington, DC 20036
Voice: (202) 939-9381
Email: [email protected]

Regina Deil-Amen
Center for the Study of Higher Education
University of Arizona
1430 E. Second Street
Tucson, AZ 85721
Voice: (520) 621-8468, or (520) 444-7441
Email: [email protected]

Jeremy Finn
State University of New York at Buffalo
Graduate School of Education
409 Baldy Hall
Buffalo, NY 14260
Voice: (716) 645-6116, or (716) 645-2484 x1071
Email: [email protected]

Thomas Hoffer
National Opinion Research Center (NORC)
1155 E. 60th Street
Chicago, IL 60637
Voice: (773) 256-6097
Email: [email protected]

Vinetta Jones
Howard University
525 Bryant Street NW
Academic Support Building
Washington, DC 20059
Voice: (202) 806-4947, or (301) 395-5335
Email: [email protected]

Amaury Nora
University of Texas at San Antonio
College of Education and Human Development
One UTSA Circle
San Antonio, Texas 78294
Voice: (210) 458-5436, or (210) 458-7394
Email: [email protected]

Jesse Rothstein
Richard and Rhoda Goldman School of Public Policy
University of California, Berkeley
2607 Hearst Avenue
Berkeley, CA 94720-7320
Voice: (510) 643-8561
Email: [email protected]

Russ Rumberger
University of California, Santa Barbara
Gevirtz Graduate School of Education
2329 Phelps Hall
Santa Barbara, CA 93106
Voice: (805) 451-6091
Email: [email protected]

Sarah E. Turner
Department of Leadership, Foundations and Policy
294 Ruffner Hall
University of Virginia
Charlottesville, VA 22903-2495
Voice: (434) 982-2383
Email: [email protected]

Timothy Urdan
Santa Clara University
Department of Psychology
500 El Camino Real
Santa Clara, CA 95053
Voice: (408) 554-4495
Email: [email protected]



A.9 Explanation of Payment or Gift to Respondents

The objective of the College Update phase of the HSLS:09 data collection is to gather information about students’ plans following high school. The data collection period begins in mid-June, immediately after most students have graduated from high school, and continues through October as they embark on their post-high school endeavors. Because the study will occur during a time of high student mobility, the questionnaire can be completed by either a student or a parent.

When contacting sample members outside of school, lower response rates are a concern. To address this concern, the HSLS:09 College Update includes plans to monitor response rates across the data collection period and to implement a three-phase adaptive incentive plan.

The use of a pre-paid incentive has been demonstrated to be effective in studies such as BTLS and B&B:08/09 (BTLS Incentive Experiment OMB# 1850-0868 v1; Wine, Cominole, Janson, and Socha 2010) and has been shown to increase response rates by up to 12% (Cantor, O’Hare, and O’Connor 2008). In order to boost response rates, a monetary incentive will be offered to nonresponding cases of interest.

HSLS:09 Use of Incentives

In both the base year and first follow-up field tests, NCES offered students $10 to participate in school. Students who had to participate out of school proved more elusive both to find and to convince to participate. Students not enrolled at the school at the time of the first follow-up study were offered $40 to complete the questionnaire outside of school. Students still enrolled at the base year school were offered $15 to complete the questionnaire outside of school if they missed the in-school session.

The first follow-up field test did not provide monetary incentives for parent participation, and this component elicited a low response rate. Since parents may participate in the College Update if their students are unavailable, incorporating approaches to boost the parent response rate is critical to this proposal. As a result of an experiment conducted in the base year1, 20% of challenging parent cases will receive a $20 incentive for participating in the first follow-up main study, which is currently underway.

College Update Incentive Plan

For the College Update, the field test data collection period will be divided into three phases, with incentives targeted through a responsive design aimed at reducing bias in the final estimates. The three phases are:

  1. A two-week web data collection period. At the start of the first phase of data collection, each of the parents and students in the College Update sample2 will receive a letter asking them to log onto the web to complete the questionnaire.

  2. A three-week web plus CATI data collection period. After the two-week web-only data collection period, outbound calling to sample members will commence and continue for three weeks.

  3. A nonresponse follow-up period. After three weeks of CATI data collection, the Mahalanobis distance function (discussed further below) will determine the target cases for nonresponse follow-up. Target cases will receive a $5 pre-paid cash incentive in this reminder mailing, and the letter will promise that an additional $10 will be sent upon completion of the questionnaire. Cases not identified as cases of interest for nonresponse follow-up will receive no monetary incentive. Table 1 shows the expected cost and response rate of this incentive proposal.

The estimates in Table 1 are based on experience to date with HSLS:09 parent and out-of-school student data collection. We estimate that 25 percent of sample members will participate by the end of the five-week early data collection period (phases 1 and 2 above). Of the approximately 754 cases in the College Update field test, this leaves an estimated 566 potential ‘late’ responders. During the third phase, Mahalanobis distance functions will be evaluated and a logical cut point (the largest distance scores) will be established so that the incentive is offered to approximately 375 of the 566 cases.


Table 1. Expected Cost and Response Rate

Group | Number of cases | Projected cost of incentive | Projected number of respondents | Projected response rate
College Update sample | 754 | $5,063 | 644 | 85%
Estimated early respondents | 754 | $0 | 189 | 25%
Estimated late respondents | 566 | $5,063 | 456 | 81%
  No incentive | 191 | $0 | 137 | 72%
  $5 prepay with promise of $10 more | 375 | $5,063 | 319 | 85%
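The phase-3 arithmetic behind Table 1 can be checked with a back-of-envelope calculation. The counts and rates below come from the text above; the computed total differs from the table's $5,063 only because of how the expected number of completions is rounded.

```python
# Sketch of the phase-3 incentive cost in Table 1; counts and rates are
# taken from the text, not from the actual sampling files.
TARGET = 375             # nonrespondents selected by the MD cut point
TARGET_RATE = 0.85       # assumed completion rate among targeted cases
PREPAY, PROMISE = 5, 10  # $5 pre-paid; $10 promised on completion

completions = round(TARGET * TARGET_RATE)       # expected completions
cost = TARGET * PREPAY + completions * PROMISE  # every target gets the prepay
print(completions, cost)  # 319 5065
```

Every targeted case receives the $5 prepay regardless of response, so the cost is dominated by the size of the target group, not the completion rate.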


Mahalanobis Distance Function

The Mahalanobis distance function (MD) is a person-level measure and can be defined simply as the distance between a single case and the mean value for all responding cases. In this case, larger distance scores describe cases likely to be dissimilar from existing respondents. That is, because of the variables used in the calculation of MD (see Table 2 below), these cases would be characterized by notable differences, for example, in terms of student expected education, timing of Algebra 1 course taking, parent level of education, performance on the assessment, enrollment status, etc.


Table 2. Proposed variables to include in model to identify target cases

Survey variables

  • Student expected education level

  • Timing of Algebra I coursetaking

  • Performance on the assessment

  • Enrollment status

  • Parent level of education

  • Race

  • Gender

Sample frame data

  • School type

  • Metro area

Paradata

  • Whether sample member contacted the help desk

  • Whether sample member logged in but did not complete the College Update questionnaire

  • Number of contact attempts in the early data collection period

  • Whether sample member made an appointment to complete the interview

  • Whether sample member told interviewer they would do the web interview

  • Student base-year and first follow-up response outcomes

  • Parent base-year and first follow-up response outcomes

  • Parent response in the panel maintenance update

  • Student enrollment status at first follow-up

  • Reason for prior student nonresponse (refusal, absent), if applicable

  • Call counts in base year and first follow-up


The MD incorporates a measure of the likelihood of ultimate response among current nonrespondents. Therefore, in addition to survey variables and analytic variables, paradata are crucial to include with the substantive data in the case-selection process to optimize the case selection itself.

Using the MD, which will incorporate a large number of variables, is superior to simply monitoring response rates to identify target cases during nonresponse follow-up. Response rates, even within an important stratum, provide only limited information about who the respondents and nonrespondents are across many important variables. When targeting cases during nonresponse follow-up, it is most efficient to select nonrespondent cases that differ from respondents across many variables, rather than a single variable. This nonresponse follow-up approach will target cases that have the greatest potential for bias reduction, i.e., those that demonstrate the most differences from existing respondents. It is worth noting that there are certain nonrespondents (e.g., study withdrawals and cases having no contact information whatsoever) who will not participate regardless of the effort employed.
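The selection rule described above (rank nonrespondents by distance from the respondent mean and target the largest scores) can be sketched as follows. The data here are simulated placeholders; a production version would use the survey, frame, and paradata variables listed in Table 2.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical numeric case-level variables (e.g., expected education,
# assessment score, prior call counts) for 200 respondents and 60
# current nonrespondents; values are simulated for illustration only.
respondents = rng.normal(0.0, 1.0, size=(200, 3))
nonrespondents = rng.normal(0.5, 1.2, size=(60, 3))

# Mahalanobis distance of each nonrespondent from the respondent mean,
# using the respondent covariance structure.
mean = respondents.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(respondents, rowvar=False))
diff = nonrespondents - mean
md = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

# Target the cases most dissimilar from existing respondents: take the
# largest distance scores above a chosen cut point (here, the top 20).
n_target = 20
targets = np.argsort(md)[::-1][:n_target]
print(md[targets[0]] >= md[targets[-1]])  # True: ranked by distance
```

In practice the cut point would be set where the distance scores show a logical break, rather than at a fixed count.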


Analysis Plan

To assess the effectiveness of the approach outlined above, we propose first to examine differences in key variables between response rates excluding the phase 3 results (i.e., as if the nonresponse follow-up had never been implemented) and response rates including the phase 3 results. Second, we will compare the survey estimates, relative precision, and the average cost per case between the two groups. Third, we will examine the data for indications of reduced nonresponse bias resulting from phase 3 of data collection. To do this, we will compare estimates produced from the phase 1 and 2 respondents against the respondent set that additionally includes the phase 3 respondents. Estimates will be calculated for a variety of survey questions, including the timing of Algebra I coursetaking and student educational expectations. If differences are identified, this would suggest that the phase 3 incentive was effective in lowering the potential for bias by capturing responses from sample members who would otherwise not have participated. Finally, another sign of success would be an increase of at least 5 percentage points in the overall response rate.
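A minimal sketch of the third check: compare an estimate computed from the early (phase 1 and 2) respondents against the estimate after phase-3 respondents are added. The indicator and all counts below are invented for illustration, not HSLS:09 results.

```python
# Illustrative bias check with invented data: the indicator (took Algebra I
# by 9th grade) and the counts are placeholders, not HSLS:09 results.
early_algebra_by_9th = [1] * 120 + [0] * 69     # phase 1-2 respondents
phase3_algebra_by_9th = [1] * 150 + [0] * 169   # respondents added in phase 3

def prop(values):
    """Proportion of cases with the characteristic."""
    return sum(values) / len(values)

early_est = prop(early_algebra_by_9th)
full_est = prop(early_algebra_by_9th + phase3_algebra_by_9th)

# A notable shift once phase-3 cases are added would signal that the
# targeted follow-up captured sample members unlike the early responders.
print(round(early_est, 3), round(full_est, 3))  # 0.635 0.531
```

The production comparison would use survey weights and standard errors; this sketch only illustrates the direction of the check.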

At the close of data collection, we will also re-examine the mean MD scores for respondents and nonrespondents. The goal of incentivizing cases with large MD scores is to reduce the differences between respondents and nonrespondents; therefore, at the conclusion of data collection, the mean Mahalanobis values for respondents and nonrespondents should no longer differ significantly. That is the desired outcome, because significant differences would indicate a potential risk of nonresponse bias.


Transcript Reimbursement. Schools will be reimbursed for the cost of preparing and sending transcripts at the school’s standard rate. If additional costs are incurred by the schools, RTI will reimburse such expenses to the extent that they are reasonable and properly documented.

A.10 Assurance of Confidentiality Provided to Respondents

A data security plan (DSP) for HSLS:09 was approved by the IES Disclosure Review Board chair for the base-year and first follow-up studies. Revisions to the plan will be made to account for changes associated with the College Update and Transcript collections. The HSLS:09 plan represents best-practice survey systems and procedures for protecting respondent confidentiality and securing survey data. An outline of this plan is provided in Exhibit A-1. The HSLS:09 DSP

  • establishes clear responsibility and accountability for data security and the protection of respondent confidentiality with corporate oversight to ensure adequate investment of resources;

  • details a structured approach for considering and addressing risk at each step in the survey process and establishes mechanisms for monitoring performance and adapting to new security concerns;

  • includes technological and procedural solutions that mitigate risk and emphasize the necessary training to capitalize on these approaches; and

  • is supported by the implementation of data security controls recommended by the National Institute of Standards and Technology for protecting federal information systems.

Exhibit A-1. HSLS:09 Data Security Plan Outline

HSLS:09 Data Security Plan Summary

Maintaining the Data Security Plan

Information Collection Request

Our Promise to Secure Data and Protect Confidentiality

Personally Identifying Information That We Collect and/or Manage

Institutional Review Board Human Subject Protection Requirements

Process for Addressing Survey Participant Concerns

Computing System Summary

General Description of the RTI Networks

General Description of the Data Management, Data Collection, and Data Processing Systems

Integrated Monitoring System

Receipt Control System

Instrument Development and Documentation System

Data Collection System

Document Archive and Data Library

Employee-Level Controls

Security Clearance Procedures

Nondisclosure Affidavit Collection and Storage

Security Awareness Training

Staff Termination/Transfer Procedures

Subcontractor Procedures

Physical Environment Protections

System Access Controls

Survey Data Collection/Management Procedures

Protecting Electronic Media

Encryption

Data Transmission

Storage/Archival/Destruction

Protecting Hard-Copy Media

Internal Hard-Copy Communications

External Communications to Respondents

Handling of Mail Returns, Hard-Copy Student Lists, and Parental Consent Forms

Handling and Transfer of Data Collection Materials

Tracing Operations

Transcript Operations

Software Security Controls

Data File Development: Disclosure Avoidance Plan

Data Security Monitoring

Survey Protocol Monitoring

System/Data Access Monitoring

Protocol for Reporting Potential Breaches of Confidentiality

Specific Procedures for Field Staff

In adherence with the rules outlined in the DSP, invitation letters will be sent to sample members (students and parents). Respondents will be informed of the voluntary nature of the survey and of the confidentiality provision in the initial cover letter and on the questionnaires, which state that their responses may be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002) 20 U.S.C., § 9573]. The materials sent will also include a brochure describing the study and the extent to which respondents and their responses will be kept confidential (Appendix A).

Additionally, HSLS:09 will conform to the NCES Restricted Use Data Procedures Manual and NCES Standards and Policies. The plan for maintaining confidentiality includes obtaining signed confidentiality agreements and notarized nondisclosure affidavits from all personnel who will have access to individual identifiers. Each individual working on HSLS:09 will complete the e-QIP clearance process. The plan includes annual personnel training on the meaning of confidentiality and the procedures for maintaining it, particularly as they relate to handling requests for information and providing assurance to respondents about the protection of their responses. The training will cover controlled and protected access to computer files under the control of a single database manager; built-in safeguards in the status monitoring and receipt control systems; and a secured, operator-staffed in-house computing facility.

All data transferred electronically for HSLS:09 will be transmitted through a secure server at NCES. The system requires that both parties to the transfer be registered users of the NCES Members Site and also that their Members Site privileges be set to allow use of the new service. This service is designed for the secure transfer of electronic files containing personally identifying information (i.e., data protected under the Privacy Act or otherwise posing risk of disclosure).

This secure server has been used successfully and without incident on HSLS:09 and other NCES studies, and procedures have been put into place for using it to transfer confidential data. Members Site privileges are set up and carefully controlled by the NCES Chief Technology Officer (CTO). The service, designed by ED/NCES specifically for the secure transfer of electronic files containing personally identifying information, can be used for NCES-to-Contractor, Contractor-to-Subcontractor, Subcontractor-to-Contractor, and Contractor-to-Other-Agency data transfers. The party uploading the information onto the secure server at NCES is responsible for deleting the file(s) after the successful transfer has been confirmed. Data transfers using this system will include notification to IES, the NCES CTO, the NCES Deputy Commissioner, and the NCES project officer. The notification will include the names and affiliations of the parties in the data exchange/transfer and the nature and approximate size of the data to be transferred.

A.11 Justification for Sensitive Questions

The College Update asks no questions that normally would be deemed sensitive.

A.12 Estimates of Annualized Burden Hours and Their Cost to Respondents

Estimates of response burden for the HSLS:09 College Update and transcript data collection activities are shown in Exhibits A-2 and A-3. Also included are the estimates for a panel maintenance which will be conducted between the College Update and second follow-up study. The panel maintenance will consist of a mailing to each sample member and his or her parent/guardian asking that they log onto the survey website to update contact information or that they complete a hardcopy address update. Estimates of response burden are developed from experience with the base-year and first follow-up HSLS:09 questionnaires, experience from the high school transcript collection for ELS:2002, and experience from other education longitudinal studies (e.g., ELS:2002, NELS:88, HS&B).

Exhibit A-2. Estimated Burden for HSLS:09 College Update and Transcript Field Test

Respondents | Sample | Expected response rate | Number of respondents | Number of responses | Average burden per response1 | Range of response times | Total burden (hours)
College Update: Students or Parents | 754 | 92% | 694 | 694 | 20 minutes | 15-25 minutes | 231
College Update: Reinterview | 200 | 50% | 100 | 100 | 20 minutes | 15-25 minutes | 33
Transcripts: School registrar (base year schools) | 26 | 92% | 24 | 24 | 60 minutes | 15-180 minutes | 24
Transcripts: School registrar (transfer schools) | 72 | 92% | 66 | 66 | 60 minutes | 30-90 minutes | 66
Panel maintenance | 754 | 20% | 151 | 151 | 5 minutes | 3-7 minutes | 13
Total | | | 884 | 884 | | | 367
Carry over from First Follow-up Full Scale Collection (to end August 2012)2 | | | 36,562 | 44,124 | | | 24,305
Grand Total | | | 37,446 | 45,159 | | | 24,672

1 Transcript time is specified by school not by student. The higher burden per response is due to the greater number of transcripts to be processed by each school.

2 For the respondent burden estimate we are carrying over the burden from the First Follow-up Full Scale Collection (OMB# 1850-0852 v.9) because that collection will end in August 2012, after the field test will have begun.

Exhibit A-3. Estimated Burden for HSLS:09 College Update and Transcript Main Study

Respondents | Sample | Expected response rate | Number of respondents | Average burden per response1 | Range of response times | Total burden (hours)
College Update: Students or Parents | 24,700 | 92% | 22,724 | 20 minutes | 15-25 minutes | 7,575
Transcripts: School registrar (base year schools) | 944 | 92% | 868 | 60 minutes | 15-180 minutes | 868
Transcripts: School registrar (transfer schools) | 2,371 | 92% | 2,182 | 60 minutes | 30-90 minutes | 2,182
Panel maintenance | 24,700 | 20% | 4,940 | 5 minutes | 3-7 minutes | 412
Total | | | 30,714 | | | 11,037

1 Transcript time is specified by school not by student. The higher burden per response is due to the greater number of transcripts to be processed by each school.

For the College Update, assuming that half of the respondents are students and half are parents, and assuming an hourly rate of $7.25 for students and $20 for parents, the total cost to participants is estimated at $837 for students and $2,310 for parents for the field test, and $27,459 for students and $75,750 for parents for the main study. For the reinterviews conducted as part of the field test, we are planning for half the respondents to be students and half to be parents. The plan is that if the initial interview is done by the student, the reinterview will be done by the parent, and vice versa. This will enable us to evaluate the comparability of answers provided by parents and students. Using the same hourly rates for students and parents, the cost of the reinterviews is estimated at $120 for students and $330 for parents.
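The respondent-cost arithmetic above can be reproduced as follows. The wage rates, 20-minute response time, and 50/50 student-parent split come from the text; small differences from the cited dollar figures reflect rounding.

```python
# Rough check of the main-study respondent-cost arithmetic; wage rates and
# the 50/50 student-parent split come from the text above.
STUDENT_WAGE, PARENT_WAGE = 7.25, 20.00  # dollars per hour
MINUTES_PER_RESPONSE = 20

def respondent_cost(n_respondents, hourly_wage):
    """Total cost: respondents x hours per response x hourly wage."""
    return n_respondents * (MINUTES_PER_RESPONSE / 60) * hourly_wage

# Main study College Update: 22,724 respondents, half students, half parents.
students = parents = 22_724 // 2
print(round(respondent_cost(students, STUDENT_WAGE)))  # close to the $27,459 cited
print(round(respondent_cost(parents, PARENT_WAGE)))    # close to the $75,750 cited
```

The same function applied to the field-test counts reproduces the smaller figures cited for that collection.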

Included in the parent and student notification letters will be the following statement:

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number of this voluntary information collection is 1850-0852. The time required to complete this information collection is estimated to average 20 minutes per questionnaire, including the time to review instructions and to complete and review the information collection. If you have any comments concerning the accuracy of the time estimate or suggestions for improving the interview, please write to: High School Longitudinal Study of 2009 (HSLS:09), National Center for Education Statistics, 1990 K Street NW, Washington, DC 20006.


The cost to the school registrar is estimated at $20 per hour. The cost for the school registrar to prepare and submit transcripts is estimated at $480 for the field test and $17,360 for the main study.

The cost to panel maintenance participants is estimated at $20 per hour: $260 for the field test and $8,240 for the main study.
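These registrar and panel-maintenance cost estimates can likewise be checked against the main-study burden hours in Exhibit A-3 at the stated $20 hourly rate (a sketch; note that the $17,360 figure corresponds to the 868 base-year school registrar hours, with transfer-school hours tabulated separately):

```python
# Verify the main-study registrar and panel-maintenance cost estimates
# against the burden hours in Exhibit A-3, at the stated $20/hour rate.
rate = 20                       # dollars per hour
registrar_cost = 868 * rate     # base-year school registrar burden hours
panel_cost = 412 * rate         # panel maintenance burden hours

print(registrar_cost)           # 17360 -- matches the $17,360 figure
print(panel_cost)               # 8240  -- matches the $8,240 figure
```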

A.13 Estimates of Total Annual Cost Burden to Respondents

There are no capital, startup, or operating costs to respondents for participation in the project. No equipment, printing, or postage charges will be incurred.

A.14 Annualized Cost to the Federal Government

Estimated costs to the federal government for HSLS:09 are shown in Exhibit A-4. The estimated costs to the government for the College Update and transcript data collection are presented separately for the field test and the main study. Included in the contract estimates are all staff time, reproduction, postage, and telephone costs associated with the management, data collection, analysis, and reporting for which clearance is requested.

Exhibit A-4. Total Costs to NCES

Costs to NCES                                         | Amount
------------------------------------------------------|------------
Total HSLS:09 College Update and transcript costs     | $8,607,518
    Salaries and expenses                             | 300,000
    Contract costs                                    | 8,307,518
Field test College Update and transcripts (2012-13)   | $1,659,923
    Salaries and expenses                             | 150,000
    Contract costs                                    | 1,509,923
Main study College Update and transcripts (2013-14)   | $6,947,595
    Salaries and expenses                             | 150,000
    Contract costs                                    | 6,797,595

NOTE: All costs quoted are exclusive of award fee. Field test costs represent Task 4 of the HSLS:09 contract; main study costs include tasks 5 and 6.

A.15 Reasons for Program Changes

The primary change associated with this submission is the shift from school-based student data collection to the College Update collection, which occurs after students have graduated from high school. Because the previously cleared First Follow-up Full Scale Collection (OMB# 1850-0852 v.9) will not have finished by the time the field test data collections begin, there is an apparent increase in the currently approved respondent burden.

A.16 Publication Plans and Project Schedule

The formal contract for HSLS:09 requires the following reports, publications, or other public information releases in the main study:

1. a detailed methodological report describing all aspects of the College Update and high school transcript main study design and data collection procedures (a working paper detailing the methodological findings from the field test will also be produced);

2. complete data files and documentation for research data users in the form of both a restricted-use and public-use electronic codebook (ECB) and a public-use data tool (i.e., EDAT); and

3. a descriptive First Look Report, reporting initial findings on issues of interest to the secondary school and higher education community, as determined by NCES.

Exhibit A-5. Operational Schedule for HSLS:09

HSLS:09 activity         | Start date | End date
-------------------------|------------|----------
Field test               |            |
    College Update       | June 2012  | Oct. 2012
    Transcript collection| Sept. 2012 | Jan. 2013
    Panel maintenance    | June 2013  | Aug. 2013
Main study               |            |
    College Update       | June 2013  | Oct. 2013
    Transcript collection| Sept. 2013 | Jan. 2014
    Panel maintenance    | June 2014  | Aug. 2014



A.17 Reason(s) Display of OMB Expiration Date Is Inappropriate

The expiration date for OMB approval of the information collection will be displayed on data collection instruments and materials. No special exception to this requirement is requested.

A.18 Exceptions to Certification for Paperwork Reduction Act Statement

There are no exceptions to the certification statement identified in the Certification for Paperwork Reduction Act Submissions of OMB Form 83-I.

References

Adelman, C. (2006). The Toolbox Revisited: Paths to Degree Completion From High School Through College. Washington, DC: U.S. Department of Education.

Balfanz, R., and Legters, N. (2004). Locating the Dropout Crisis: Which High Schools Produce the Nation’s Dropouts? Where Are They Located? Who Attends Them? (Report 70). Center for Research on the Education of Students Placed At Risk (CRESPAR), The Johns Hopkins University. Available: http://www.csos.jhu.edu/crespar/techReports/Report70.pdf

Bozick, R., and Ingels, S.J. (2008). Mathematics Coursetaking and Achievement at the End of High School: Evidence From the Education Longitudinal Study of 2002 (ELS:2002) (NCES 2008-319). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Brock, T. (2010). “Young Adults and Higher Education: Barriers and Breakthroughs to Success.” The Future of Children, 20(1).

Cantor, D., O’Hare, B.C., and O’Connor, K.S. (2008). “The Use of Monetary Incentives to Reduce Nonresponse in Random Digit Dial Telephone Surveys.” In Advances in Telephone Survey Methodology. Hoboken, NJ: John Wiley & Sons.

Dalton, B., Ingels, S.J., Downing, J., and Bozick, R. (2007). Advanced Mathematics and Science Coursetaking in the Spring High School Senior Classes of 1982, 1992, and 2004 (NCES 2007-312). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Dillman, D.A. (2007). Mail and Internet Surveys: The Tailored Design Method. New York: Wiley.

Fetters, W.B., Stowe, P.S., and Owings, J.A. (1984). Quality of Responses of High School Students to Questionnaire Items, High School and Beyond (NCES 84-216). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Goldrick-Rab, S. (2010). “Challenges and Opportunities for Improving Community College Student Outcomes.” Review of Educational Research, 80(3).

Goldrick-Rab, S., and Pfeffer, F. (2009). “Beyond Access: Explaining Social Class Differences in College Transfer.” Sociology of Education, 82(2).

Provasnik, S., and Planty, M. (2008). Community Colleges: Special Supplement to The Condition of Education 2008 (NCES 2008-033). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Memorandum of December 21, 2010, to Shelly Wilkie Martinez, OMB, from Freddie Cross and Kathryn Chandler, through Kashka Kubzdela, NCES. Results of BLTS Incentives Experiment (OMB# 1850-0868 v1).

Wine, J., Cominole, M., Janson, N., and Socha, T. (2010). “2008/09 Baccalaureate and Beyond Longitudinal Study (B&B:08/09) Field Test Methodology Report.” Working Paper Series, NCES 2010-02.



1 In the base-year main study, an incentive experiment was conducted among parents to test the effectiveness of three incentive amounts ($0, $10, or $20) for the most challenging set of cases. “Challenging” cases are defined as those that (1) refused to participate once, (2) reached a preset number of call attempts, or (3) had no telephone contact information. A $5 prepaid gift card was also sent to parents with no telephone contact information.

2 The College Update sample includes only those students who participated in the base year study and/or the first follow-up study. Students who did not participate in either prior round would not be included in the College Update or subsequent data collection rounds. Of the 754 students/parents included in the College Update field test sample, 26 sample members (3 percent) either requested that they be removed from the study or did not have sufficient contact information.
