PartA HSLS-09 First Follow-up Field Test 2011


High School Longitudinal Study of 2009 (HSLS:09) First Follow-up Field Test 2011

OMB: 1850-0852


December 6, 2010



High School Longitudinal Study of 2009 (HSLS:09), First Follow-up

Field Test 2011




Supporting Statement

Part A







Request for OMB Review
OMB# 1850-0852 v.7










National Center for Education Statistics

U.S. Department of Education


Table of Contents








Attachments

Appendix A Data Collection Communication Materials

Appendix B Student Questionnaire

Appendix C Parent Questionnaire

Appendix D Administrator Questionnaire

Appendix E Counselor Questionnaire


High School Longitudinal Study of 2009

This document has been submitted to request clearance under the Paperwork Reduction Act of 1995 and 5 CFR 1320 for the High School Longitudinal Study of 2009 (HSLS:09). The study is being conducted for the National Center for Education Statistics (NCES) by the Research Triangle Institute (RTI) International—with the American Institutes for Research (AIR), Windwalker Corporation, Horizon Research Inc., Research Support Services (RSS), and MPR Associates (MPR) as subcontractors—under contract to the U.S. Department of Education (Contract number ED-04-CO-0036).

The purpose of this Office of Management and Budget (OMB) submission is to request clearance for the HSLS:09 first follow-up field test (to be conducted in 2011) and a 60-day Federal Register notice waiver for the main study (to be conducted in 2012). This submission contains questionnaires for school administrators and counselors, and draft questionnaires for students and parents, which will be revised by early January 2011 based on results of cognitive testing that will be conducted in December 2010.

A. Justification

A.1 Circumstances Necessitating Collection of Information

A.1.a Purpose of This Submission

The materials in this document support a request for clearance for the field test and a 60-day Federal Register notice waiver for the first follow-up main data collection for HSLS:09. The basic components and key design features of HSLS:09 are summarized below:

Base Year

  • survey of high school 9th-graders in the fall term of 2009;

  • mathematics assessment;

  • surveys of parents, mathematics and science teachers, school administrators, and school counselors;

  • sample sizes of 944 schools from which more than 21,000 students participated in data collection (schools are the first-stage unit of selection, with 9th-graders randomly selected within schools); and

  • oversampling of private schools and Asians/Pacific Islanders.

Follow-ups to the base-year study are also planned:

First Follow-up

  • follow-up in spring 2012, when most sample members are high school juniors, but some have dropped out or are in other grades;

  • student questionnaires, mathematics assessment, parent survey (subsample of parents), and school counselor and administrator questionnaires to be administered;

  • returning to the same schools, but separately following transfer students;

  • a “college update” with parents or students in the summer after modal senior year (2013); and

  • high school transcript component in fall 2013 (records data for grades 9–12).

Second Follow-up and Beyond

  • post–high school follow-ups by web survey and computer-assisted telephone interview (CATI). The second follow-up is scheduled for spring 2015.

  • an additional follow-up is tentatively scheduled for spring 2021.

HSLS:09 links to its predecessor longitudinal studies by addressing many of the same issues of transition from high school to postsecondary education and the labor force. At the same time, HSLS:09 brings a new and special emphasis to the study of youth transition by exploring the path that leads students to pursue and persist in courses and careers in the fields of science, technology, engineering, and mathematics (STEM). HSLS:09 is designed to measure math achievement gains in the first 3 years of high school, but also to relate tested achievement to students’ choice of, access to, and persistence in courses, college, and careers, especially in the STEM pipeline. The HSLS:09 assessment will serve not just as an outcome measure, but also as a predictor of readiness to proceed into college and, in particular, STEM courses and careers. The assessment administered in the first follow-up will be the same as the base-year assessment, with fewer than twenty new items; these new items are needed to avoid ceiling effects among the more advanced students who will take the test. Questionnaires focus on factors that shape students’ decision-making about courses and postsecondary options, including what factors, from parental input to considerations of financial aid for postsecondary education, enter into these decisions.

HSLS:09 supports two of the three goals of the American Competitiveness Initiative (ACI), which aims to strengthen math and science education, foreign language studies, and the high school experience in the United States. Information collected from students, parents, and school staff will help to inform and shape efforts to improve the quality of math and science education in the United States, increase our competitiveness in STEM-related fields abroad, and improve the high school experience.

There are several reasons the transition into adulthood is of special interest to federal policy and programs. Adolescence is a time of physical and psychological changes. Attitudes, aspirations, and expectations are sensitive to the stimuli that adolescents experience, and environments influence the process of choosing among opportunities. Parents, educators, and those involved in education policy decisions all share the need to understand the effects that the presence or absence of good guidance from the school, in combination with that from the home, can have on the educational, occupational, and social success of youth.

These patterns of transition cover individual and institutional characteristics. At the individual level, the study will examine education attainment and personal development. In response to policy and scientific issues, data will also be provided on the demographic and background correlates of education outcomes. At the institutional level, HSLS:09 will focus on school effectiveness issues, including resources, strategies, and programs that may affect students’ mathematics and science courses and achievement, as well as college entry in general.

By collecting extensive information from students, parents, school staff, and school records, it will be possible to investigate the relationship between home and school factors and academic achievement, interests, and social development at this critical juncture. The extent to which schools are expected to provide special services to selected groups of students to compensate for limitations and poor performance (including special services to assist those lagging in their understanding of mathematics and science) will be examined. Resources to assist in guiding parents and students through the college decision process, from information-seeking behaviors to filing financial aid forms, will be explored in how they relate to college entry. Moreover, the study will focus, for example, on basic policy issues related to parents’ role in the education success of their children, including parents’ education attainment expectations for their children, beliefs about and attitudes toward curricular and postsecondary education choices, and any preparation made for their child’s life past high school.

Additionally, because the initial survey focused on 9th-graders, it will also permit the identification and study of high school dropouts and underwrite trend comparisons with dropouts identified and surveyed in the High School and Beyond Longitudinal Study (HS&B), the National Education Longitudinal Study of 1988 (NELS:88), and the Education Longitudinal Study of 2002 (ELS:2002).

In sum, through its core and supplemental components, HSLS:09 data will allow researchers, educators, and policymakers to examine motivation, achievement, and persistence in STEM course-taking and careers. More generally, HSLS:09 data will allow researchers from a variety of disciplines to examine issues of college entry, persistence, and success, and how changes in young people’s lives and their connections with communities, schools, teachers, families, parents, and friends affect these decisions, including:

  • academic (especially in math and science), social, and interpersonal growth;

  • transitions from high school to postsecondary education, and from school to work;

  • students’ choices about, access to, and persistence in math and science courses, majors, and careers;

  • the characteristics of high schools and postsecondary institutions and their impact on student outcomes;

  • family formation, including marriage and family development, and how prior experiences in and out of school correlate with these decisions; and

  • the contexts of education, including how minority and at-risk status is associated with education and labor market outcomes.

A.1.b Legislative Authorization

HSLS:09 is sponsored by NCES, within the Institute of Education Sciences (IES), in close consultation with other offices and organizations within and outside the U.S. Department of Education (ED). HSLS:09 is authorized under Section 153 of the Education Sciences Reform Act of 2002 (P.L. 107-279, Title I, Part C), which requires NCES to

“collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including—

(1) collecting, acquiring, compiling (where appropriate, on a State-by-State basis), and disseminating full and complete statistics … on the condition and progress of education, at the preschool, elementary, secondary, postsecondary, and adult levels in the United States, including data on—

(A) State and local education reform activities; …

(C) student achievement in, at a minimum, the core academic areas of reading, mathematics, and science at all levels of education;

(D) secondary school completions, dropouts, and adult literacy and reading skills;

(E) access to, and opportunity for, postsecondary education, including data on financial aid to postsecondary students; …

(J) the social and economic status of children, including their academic achievement…

(2) conducting and publishing reports on the meaning and significance of the statistics described in paragraph (1);

(3) collecting, analyzing, cross-tabulating, and reporting, to the extent feasible, information by sex, race, ethnicity, socioeconomic status, limited English proficiency, mobility, disability, urbanicity, and other population characteristics, when such disaggregated information will facilitate educational and policy decisionmaking; …

(7) conducting longitudinal and special data collections necessary to report on the condition and progress of education…”

Section 183 of the Education Sciences Reform Act of 2002 further states that:

“…all collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and Center of the Institute, shall conform with the requirements of section 552a of title 5, United States Code [which protects the confidentiality rights of individual respondents with regard to the data collected, reported, and published under this title].”

A.1.c Prior and Related Studies

In 1970, NCES initiated a program of longitudinal high school studies. Its purpose was to gather time-series data on nationally representative samples of high school students that would be pertinent to the formulation and evaluation of education policies.

Starting in 1972, with the National Longitudinal Study of the High School Class of 1972 (NLS:72), NCES began providing education policymakers and researchers with longitudinal data that linked education experiences with later outcomes, such as early labor market experiences and postsecondary education enrollment and attainment. The NLS:72 cohort of high school seniors was surveyed five times (in 1972, 1973, 1974, 1979, and 1986). A wide variety of questionnaire data were collected in the follow-up surveys, including data on students’ family background, schools attended, labor force participation, family formation, and job satisfaction. In addition, postsecondary transcripts were collected.

Almost 10 years later, in 1980, the second in a series of NCES longitudinal surveys was launched, this time starting with two high school cohorts. High School and Beyond (HS&B) included one cohort of high school seniors comparable to the seniors in NLS:72. The second cohort within HS&B extended the age span and analytical range of NCES’s longitudinal studies by surveying a sample of high school sophomores. With the sophomore cohort, information became available to study the relationship between early high school experiences and students’ subsequent education experiences in high school. For the first time, national data were available showing students’ academic growth over time and how family, community, school, and classroom factors promoted or inhibited student learning. In a leap forward for education studies, researchers, using data from the extensive battery of cognitive tests within HS&B, were also able to assess the growth of cognitive abilities over time. Moreover, data were now available to analyze the school experiences of students who later dropped out of high school. These data became a rich resource for policymakers and researchers over the next decade and provided an empirical base to inform the debates of the education reform movement that began in the early 1980s. Both cohorts of HS&B participants were resurveyed in 1982, 1984, and 1986. The sophomore cohort was also resurveyed in 1992. Postsecondary transcripts were collected for both cohorts.

The third longitudinal study of students sponsored by NCES was the National Education Longitudinal Study of 1988 (NELS:88). NELS:88 further extended the age and grade span of NCES longitudinal studies by beginning the data collection with a cohort of eighth-graders. Along with the student survey, it included surveys of parents, teachers, and school administrators. It was designed not only to follow a single cohort of students over time (as had NCES’s earlier longitudinal studies, NLS:72 and HS&B), but also, by “freshening” the sample at each of the first two follow-ups, to follow three nationally representative grade cohorts over time (8th-, 10th-, and 12th-grade cohorts). This provided not only comparability of NELS:88 to existing cohorts, but it also enabled researchers to conduct both cross-sectional and longitudinal analyses of the data. In 1993, high school transcripts were collected, further increasing the analytic potential of the survey system. Students were interviewed again in 1994 and 2000, and in 2000–01 their postsecondary education transcripts were collected. In sum, NELS:88 represents an integrated system of data that tracked students from middle school through secondary and postsecondary education, labor market experiences, and marriage and family formation.

The Education Longitudinal Study of 2002 (ELS:2002) was the fourth longitudinal high school cohort study conducted by NCES. ELS:2002 started with a sophomore cohort and was designed to provide trend data about the critical transitions experienced by students as they proceed through high school and into postsecondary education or their careers. Student questionnaires and assessments in reading and mathematics were collected along with surveys of parents, teachers, and school administrators. In addition, a facilities component and school library/media studies component were added for this study series. Freshening occurred at the first follow-up in 2004 to allow for a nationally representative cohort of high school seniors, which was followed by the collection of high school transcripts. A second follow-up was conducted in 2006, and a third follow-up is scheduled for 2012 (preceded by a 2011 field test).

These studies have investigated the education, personal, and vocational development of students, and the school, familial, community, personal, and cultural factors that affect this development. Each of these studies has provided rich information about the critical transition from high school to postsecondary education and the workforce. HSLS:09 will continue on the path of its predecessors while also focusing on the factors associated with choosing, persisting in, and succeeding in STEM course-taking and careers.

A.2 Purpose and Use of Information Collection

HSLS:09 is intended to be a general-purpose dataset; that is, it is designed to serve multiple policy objectives. Policy issues studied through HSLS:09 include the identification of school attributes associated with mathematics achievement, college entry, and career choice; the influence that parent and community involvement have on students’ achievement and development; the factors associated with dropping out of the education system; and the transition of different groups (for example, racial and ethnic, gender, and socioeconomic status groups) from high school to postsecondary institutions and the labor market, and especially into STEM curricula and careers. HSLS:09 inquires into students’ values and goals, investigates factors affecting risk and resiliency, gathers information about the social capital available to sample members, inquires into the nature of student interests and decision-making, and delineates students’ curricular and extracurricular experiences. HSLS:09 includes measures of school climate; each student’s native language and language use; student and parental education expectations; attendance at school; course and program selection; college plans, preparation, and information-seeking behavior; interactions with teachers and peers; as well as parental resources and support. The HSLS:09 data elements are designed to support research that speaks to the underlying dynamics and education processes that influence student achievement, growth, and personal development over time.

The objectives of HSLS:09 also encompass the need to support both longitudinal and cross-cohort analyses and to provide a basis for important descriptive cross-sectional analyses. HSLS:09 is first and foremost a longitudinal study; hence survey items are chosen for their usefulness in predicting or explaining future outcomes as measured in later survey waves. Compared to its earlier counterparts, there are considerable changes to the design of HSLS:09 that will affect the ability to produce trend comparisons. NELS:88 began with an eighth-grade cohort in the spring term; although this cohort is not markedly different from the fall-term 9th-grade cohort of HSLS:09 in terms of student knowledge base, it differs at the school level in that the HSLS:09 time point represents the beginning of high school rather than the point of departure from middle school. HSLS:09 includes a spring-term 11th-grade follow-up (a wave none of the predecessor studies included) because only modest gains have been seen on assessments in the final year of high school, and the 11th-grade follow-up minimizes the unit nonresponse problems associated with testing in the spring term of the senior year. The design of HSLS:09 calls for information to be collected from parents of (modally) 11th-graders, the use of a parent/student College Update survey, and a collection of transcripts to provide continuous data for grades 9–12.

A.2.a Content Justifications

Overview. This section contains justifications for the HSLS:09 First Follow-up field test instruments. Draft field test questionnaires—student, parent, administrator, and counselor—have been included, each in its own appendix (Appendices B, C, D, and E, respectively).

At time of administration, these will be electronic instruments, with no paper version, so the respondent will be automatically routed through the instrument without seeing actual skip instructions. In web versions of the questionnaire, when a respondent violates the logic of the questionnaire, the respondent will receive an error message (for example, if the respondent marks that 150% of his time is spent surfing the internet, an error message will explicitly inform the respondent of the problem [exceeds 100%] and ask for resolution). In the CATI interview, the error or inconsistency prompt comes from the interviewer. Error messages have not been included in this clearance package, but will be developed when the final items are programmed.
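The range check described above can be sketched as a simple validation routine. This is a hypothetical illustration only: the function name, message wording, and limits are assumptions, since the actual error messages will be developed when the final items are programmed.

```python
def check_percentage(value):
    """Validate a percent-of-time response against the 0-100 range.

    Returns an error message string when the response violates the
    questionnaire logic, or None when the response is acceptable.
    Hypothetical sketch; actual prompts are defined at programming time.
    """
    if value > 100:
        # e.g., a respondent reporting 150% of time spent on an activity
        return ("Your answer exceeds 100%. Please enter a value "
                "between 0 and 100.")
    if value < 0:
        return ("Percentages cannot be negative. Please enter a value "
                "between 0 and 100.")
    return None  # response passes the range check
```

In the web instrument this kind of check fires automatically; in the CATI setting the interviewer delivers the equivalent prompt.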

All questionnaires and the assessment serve to support the overall purposes of HSLS:09, which are to understand the factors (e.g., experiences, behaviors, attitudes, interactions with people) that influence students’ decision-making process about high school courses, postsecondary options, and occupation goals (especially within the STEM pipeline), and to understand how these decisions evolve through secondary school and beyond.

Student Survey: Purposes and Content Justification

The draft student questionnaire appears as Appendix B of this submission. Three primary research questions drive the student questionnaire:

1. How do students decide what courses to take in high school and what to pursue after their time in high school concludes (e.g., college, work, careers, the military)? What factors affect their decision-making, particularly factors that are malleable to school or parent influence?

a. Opportunities

b. Barriers

c. Attitudes

d. Past behaviors

e. Plans

2. What factors lead students toward or away from college entry, and in particular, STEM courses and careers?

3. How and why do students’ attitudes, goals, and learning approaches evolve in the course of high school?

The draft field test student questionnaire attempts to provide information that will help to address these and related questions from the student’s perspective.

Research Justification for HSLS:09 First Follow-up (FFU) Student Surveys

Regular student survey. The core of HSLS:09’s FFU is the student survey of primarily high school juniors. This survey will gather substantially similar material to the base-year survey and provide longitudinal data. The longitudinal data are critical to the design of the study by providing information about the stability and evolution of education expectations and plans. To reinforce the ability of this longitudinal design to identify processes and practices leading to high achievement and advanced attainment, additional data on the socioeconomic background of base-year students (originally gathered by the base-year parent questionnaire, but with high levels of missing data) will be gathered. The questionnaire will ask about home life, school experiences, math and science experiences, and plans for postsecondary education and work. The 35-minute survey consists of approximately 50% items repeated from the base-year questionnaire, with another 30% adapted from other NCES studies.

Transfer student questionnaire. We anticipate that most base-year sample members will remain in their base-year school; however, some will have transferred to other schools either through reassignment or moving. Because changing schools can affect school engagement and achievement by disrupting relationships and the coherence of curricula, surveying transfer students is an integral part of constructing a longitudinal portrait of education experiences and outcomes of the original cohort. The large majority of the transfer student questionnaire will be the same as the regular student questionnaire; the difference will be additional questions asking about reasons for moving and perceptions of the new school.

Dropout/early graduate survey. The FFU will survey dropouts and those who completed school (whether with a regular diploma or alternative credential such as GED). The dropout survey is critical for understanding the personal and institutional experiences that produce early school departure and for developing effective policies to limit school failure. The dropout questionnaire will cover content such as reasons for dropping out, prevention programs experienced, interventions received, plans for completing high school or an equivalency degree, and out-of-school work and training experiences. Early graduates at this stage are often alternative credential holders and will be asked substantially similar questions about reasons for finishing, future plans, and current activities.

Home-schooled student survey. Finally, some base-year sample members will have shifted from traditional school-based settings to a home-schooled environment. Home-schooled students face challenges in maintaining community ties, access to extracurricular activities, and exposure to rich curriculum materials, while potentially benefiting from more customized instruction. Because this environment is not well understood, special home-school questions will address reasons for home-schooling and access to local school resources. However, the large majority of the home-school questions will be the same as the regular student survey, asking, for example, about math and science interest and education and occupation plans after high school.

Parent Survey

The draft parent questionnaire appears as Appendix C of this submission. The parent questionnaire complements the student questionnaire by providing information on the student’s context and history, reporting on parental school involvement, and describing the home environment (e.g., values, expectations, and opportunities). Three research questions frame the parent questionnaire:

1. What social capital resources are available in the home environment to support children’s academic development and decision making (e.g., parent involvement in child’s decision making; course selection; planning for college or the labor market; shifts in involvement around key transitions—middle to high school, high school to postsecondary life; child’s involvement in extracurricular activities; child’s involvement in community activities [e.g., Girl Scouts, church groups])?

2. What human capital resources are available in the home environment to support children’s academic development and decision making (e.g., parents’ background in mathematics; parents’ background in science; parents’ attitudes about the importance of math, science, and education in general; parents’ expectations for children’s education achievement; and parents’ expectations for their child’s career)?

3. What financial capital resources are available in the home environment to support children’s academic development and decision making (e.g., preparation for financing college)?

Parents are key sources of detailed information about family resources, home life, and financial and other planning for college. Parents are often in a position to provide more accurate and less biased answers about their children, specifically in the area of college financial planning, which is a major focus of this round of surveys. Because parental expectations, encouragement, and resources are significant predictors of postsecondary education enrollment and attainment, and because parents often have considered the postsecondary education of their children before these choices become immediate to students themselves, the parent survey can provide critical insight into the likely trajectories and barriers that students will experience as they progress from high school to college. Likewise, within the context of their own occupation experiences, parents are able to provide an alternative perspective on the strengths, weaknesses, and interests that their child will bring to the workforce beyond that formed by the student, who has more limited work knowledge. Understanding the education and occupation preparation of high school juniors is thus particularly contingent on obtaining the parents’ viewpoint.

In a longitudinal context, the parent survey provides an important and rare chance to examine the stability of parental involvement in student life, alterations to family structure, changes in economic circumstance, and other home changes which can have a major impact on the psychological and material world of adolescents. The conjunction of longitudinal data from parents with longitudinal data from students makes for a powerful research tool to address the core education research question of how family experiences shape student outcomes.

The parent survey will complement the student surveys in capturing a comprehensive portrait of the support system of high school students. For example, the parent survey will include a special module for parents of dropouts, enabling further exploration of reasons for and interventions to prevent dropping out. The parent survey will also be particularly helpful in filling in information for home-schooled students whose change in setting closely involves the parent. Likewise, parents may be able to provide better data about reasons for moving that led to a student’s school transfer. Similarly, the parent survey will complement questions in the school counselor survey regarding how course-taking choices are made, particularly in HSLS:09’s focus on mathematics and science.

Approximately half of the parent survey will consist of repeated base-year items, with the remainder primarily addressing college applications, admissions tests, and financing, or special questions for parents of dropouts. The major source for new items will be the NELS:88 1992 parent survey.

School Administrator Questionnaire

The draft school administrator questionnaire appears as Appendix D of this submission. School administrators provide contextual information about themselves, school climate, staffing, and resources.

The purpose of the HSLS:09 School Administrator Questionnaire is to support the study’s main research objectives: How do young adults choose the pathways they do, particularly pathways into science, technology, engineering, and mathematics (STEM) careers? What role does high school (or the high school years) play in students’ ultimate decisions? And, how does “algebra learning” in high school shape students’ decisions to pursue a career in STEM specifically? To achieve its purpose, the HSLS:09 School Administrator Questionnaire has been designed to provide school-level contextual data for examining and interpreting students’ decision making and planning processes. And, because HSLS:09 schools comprise a nationally representative sample, questionnaire data may also be used to draw a descriptive profile of American high schools with 9th and 11th grades.

Although questionnaire items were selected to achieve the overall goals and purposes of the study as mentioned above, selection was guided primarily by the desire to address the following questions specific to schools:

1. What school structures, policies, practices, and offerings facilitate or inhibit different high school trajectories and decisions (e.g., course-taking, dropping out, going on to work or college)?

2. What programs and policies do schools offer to assist students at risk of school failure, students at risk of dropping out, and students struggling in math and science?

3. What are the school-level correlates of high-achieving schools in math and science (e.g., principal training and experience, climate, ease of hiring and retaining qualified math and science teachers, program offerings in math and science, and supports for struggling students)?

4. What is the math and science focus of schools (e.g., what explicit activities, if any, are schools engaged in to raise students’ interest and performance in math and science)? Is this focus associated with students’ subsequent performance in math and science and decisions to pursue careers in math and science?

Items were also selected based on the need to collect certain data in students’ 11th-grade rather than 9th-grade year. For example, school practices and policies that are less likely to change over time were reserved for the 11th-grade School Administrator Questionnaire. This division of items was also intended to keep the burden of the 11th-grade questionnaire to 30 minutes.

The School Administrator Questionnaire collects information on the school in five domains: (1) school and student characteristics; (2) teaching staff characteristics; (3) school policies, practices, and programs; (4) school governance and climate; and (5) principal background and experiences. Data gathered in the School Administrator Questionnaire can be merged with data from the student and counselor questionnaires and the student cognitive assessment. This link will allow researchers to determine the school structures, policies, and practices that may encourage or discourage different high school trajectories and decisions.

Counselor Survey

The draft school counselor questionnaire appears as Appendix E of this submission. The counselor component is targeted to the head counselor or whomever the head counselor designates as a knowledgeable source about the questionnaire contents. The HSLS:09 first follow-up is not a study of counselors and cannot generalize to counselors as a population; rather, it uses counselor data contextually, to illuminate characteristics and practices of the school, particularly those related to student placement in mathematics and science and to the availability and role of counseling services as students transition out of high school. Key research questions that the counselor survey may help to address include the following:

1. How do students get placed into and out of classes?

2. What counseling resources are available to the students within the school (e.g., how many counselors; what is their student load; what do they do, what are their responsibilities—such as course placement, college planning, career planning; transitions from high school to postsecondary)?

3. What are the course placement procedures, policies, and graduation requirements (e.g., how many credits/courses in English, in math, in science, etc.)?

4. What college and workplace preparation practices occur at the school (e.g., AP classes/AP exams, preparation for SATs, work experience programs, job fairs)?

Questionnaire items were selected based on the need to collect certain data in students’ 11th-grade year. For example, questions from the base-year survey about students’ transition from middle school to high school were dropped in favor of questions about students’ transition from high school to postsecondary school or work.

The School Counselor Questionnaire collects information on the school in five areas: (1) counseling services provided; (2) course placement policies; (3) school-based remediation and enrichment services offered (with a focus on STEM); (4) postsecondary counseling; and (5) out-of-school learning experiences/opportunities. Data gathered from the counselor questionnaire can be merged with data from the student assessment and survey to determine if and how disparities in education aspirations, expectations, and outcomes of various student populations can be attributed to different counseling resources and practices.

A.3 Use of Improved Information Technology and Burden Reduction

The HSLS:09 first follow-up will follow the path forged by the base year, with virtually all questionnaire data collected electronically. In addition, the student assessment will again be a computer-assisted two-stage adaptive test. For the student component, the school’s computer lab will be used when available and, as a backup, multiple laptops will be supplied for use by the sampled students. A trained session administrator will assist students with computer issues as needed. This is the same approach that proved effective in the HSLS:09 base-year administration. However, because this round of data collection includes out-of-school students such as dropouts and transfer students, we will use the field test to evaluate the feasibility of self-administration of the computerized assessment and questionnaire; that is, we will examine the option of having students outside the base-year schools log on and complete the questionnaire and assessment on a computer to which they have access.

School administrators, counselors, and parents will be given a username and password and will be asked to complete their relevant questionnaires via the Internet. A computer-assisted telephone interview (CATI) follow-up will be conducted for school administrators, counselors, and parents who do not complete the web questionnaire by self-administration. Computer control of interviewing offers accurate and efficient management of survey activities, including case management, scheduling of calls, generation of reports on sample disposition, data quality monitoring, interviewer performance, and flow of information between telephone and field operations.

Additional features of the CATI system include (1) online help for each screen to assist interviewers in question administration; (2) full documentation of all instrument components, including variable ranges, formats, record layouts, labels, question wording, and flow logic; (3) capability for creating and processing hierarchical data structures to eliminate data redundancy and conserve computer resources; (4) a scheduler system to manage the flow and assignment of cases to interviewers by time zone, case status, appointment information, and prior cases disposition; (5) an integrated case-level control system to track the status of each sample member across the various data collection activities; (6) automatic audit file creation and timed backup to ensure that, if an interview is terminated prematurely and later restarted, all data entered during the earlier portion of the interview can be retrieved; and (7) a screen library containing the survey instrument as displayed to the interviewer.
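To illustrate the kind of case prioritization that a scheduler such as the one described in feature (4) performs, the following is a minimal, hypothetical sketch. The case identifiers, statuses, and priority ordering are ours for illustration only; they are not the HSLS:09 CATI system’s actual design.

```python
from dataclasses import dataclass, field
import heapq

# Hypothetical priority ordering: firm appointments are called first,
# then callbacks, then fresh cases, then refusal-conversion attempts.
STATUS_PRIORITY = {"appointment": 0, "callback": 1, "new": 2, "refusal": 3}

@dataclass(order=True)
class Case:
    priority: int                        # lower value = called sooner
    case_id: str = field(compare=False)  # sample member identifier
    status: str = field(compare=False)   # current case disposition

def build_queue(cases):
    """Order cases for interviewer assignment by status priority."""
    heap = [Case(STATUS_PRIORITY[status], cid, status) for cid, status in cases]
    heapq.heapify(heap)
    return [heapq.heappop(heap).case_id for _ in range(len(heap))]

# A scheduler of this kind would also weigh time zone and appointment
# time; here only case status drives the ordering.
print(build_queue([("P10", "new"), ("P11", "appointment"), ("P12", "refusal")]))
```

In a production scheduler the priority would combine status with time zone, appointment information, and prior case disposition, as the feature list above indicates.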

A.4 Efforts to Identify Duplication and Use of Similar Information

Since the inception of its secondary education longitudinal studies program in 1970, NCES has consulted with other federal offices to ensure that the data collected in this important series of longitudinal studies do not duplicate the information from any other national data sources within the U.S. Department of Education or other government agencies. In addition, NCES staff have regularly consulted with nonfederal associations such as the College Board, American Educational Research Association, the American Association of Community Colleges, and other groups to confirm that the data to be collected through this study series are not available from any other sources. These consultations also provided, and continue to provide through the HSLS:09 Technical Review Panel, methodological insights from the results of other studies of secondary and postsecondary students and labor force members, and they ensure that the data collected through HSLS:09 will meet the needs of the federal government and other interested agencies and organizations. Other longitudinal studies of secondary and postsecondary students (i.e., NLS:72, HS&B, NELS:88, ELS:2002) have been sponsored by NCES in the past. HSLS:09 builds on and extends these studies rather than duplicating them.

First, current efforts explicitly complement the redesign of NPSAS and BPS with the instrumentation and design of HSLS:09. Second, design articulation with prior NCES secondary longitudinal studies also shows coordination, not duplication. These earlier studies were conducted during the 1970s, 1980s, 1990s, and the early 2000s and represent education, employment, and social experiences and environments different from those experienced by the HSLS:09 student sample. In addition to extending prior studies temporally as a time series, HSLS:09 extends them conceptually. Unlike preceding secondary longitudinal studies, HSLS:09 provides data that are necessary to understand the role of different factors in the development of student commitment to attend higher education and then to take the steps necessary to succeed in college (taking the right courses, taking courses in specific sequences, etc.). Further, HSLS:09 focuses on the factors associated with choosing and persisting in mathematics and science course-taking and STEM careers. These focal points present a marked difference between HSLS:09 and its predecessor studies.

The only other dataset that offers so large an opportunity to understand the key transitions into postsecondary institutions or the world of work is the Department of Labor (Bureau of Labor Statistics) longitudinal cohorts, the National Longitudinal Survey of Youth 1979 and 1997 cohorts (NLSY79, NLSY97). However, the NLSY youth cohorts represent temporally earlier cohorts than HSLS:09. There are also important design differences between NLSY79/NLSY97 and HSLS:09 that render them more complementary than duplicative. NLSY is a household-based longitudinal survey; HSLS:09 is school based. For both NLSY cohorts, baseline Armed Services Vocational Aptitude Battery (ASVAB) test data are available, but there is no longitudinal high school achievement measure. Although NLSY97 also gathers information from schools (including principal and teacher reports and high school transcripts), it cannot study school processes in the same way as HSLS:09, given its household sampling basis. Any given school contains only one to a handful of NLSY97 sample members, a number that constitutes neither a representative sample of students in the school nor a sufficient number to provide within-school estimates. Thus, although both studies provide important information for understanding the transition from high school to the labor market, HSLS:09 is uniquely able to provide information about education processes and within-school dynamics and how these affect both school achievement and ultimate labor market outcomes, including outcomes in science, technology, engineering, and mathematics education and occupations.

A.5 Impact on Small Businesses or Other Small Entities

This section has limited applicability to the proposed data collection effort. Target respondents for HSLS:09 are individuals (typically nested within an institutional context) in public and private schools; first follow-up data collection activities will involve no burden to small businesses or entities.

A.6 Consequences of Collecting the Information Less Frequently

This submission describes the field test and full-scale data collection for the first follow-up of HSLS:09. The first follow-up main study will take place in the spring of 2012, preceded by a field test in 2011. A college update interview will take place in the summer of 2013, and a high school transcript collection in the fall of 2013. A second follow-up is scheduled for the spring of 2015. The tentative design for the study calls for a final round at about age 26 (2021). Recent education and social welfare reform initiatives, changes in federal policy concerning postsecondary student support, and other interventions necessitate frequent studies. Repeated surveys are also necessary because of rapid changes in the secondary and postsecondary education environments and the world of work. Indeed, longitudinal information provides better measures of the effects of program, policy, and environmental changes than would multiple cross-sectional studies.

The scheduled student follow-ups of HSLS:09 are less frequent than the 2-year interval employed with HS&B, NELS:88, and ELS:2002: the first follow-up takes place 2.5 years after the base year, and the second follow-up 3 years after the first follow-up. However, data may be collected at grade 12, and a high school transcript study conducted soon after graduation will provide continuous course-taking data for all on-time or early completers. The initial data collection occurred at the start of the students’ high school careers and will allow researchers to understand decision-making processes as they pertain to the selection of STEM-related courses. By following up at the end of the students’ junior year, researchers will be able to measure achievement gain and postsecondary planning information. Collecting a postsecondary update in summer 2013 (directly after most students’ 12th-grade year) and transcripts in fall 2013 will minimize burden on schools and respondents while allowing further inter-cohort comparability with the main transition themes of the prior studies. The second follow-up is scheduled for the second year after high school, consistent with the timing of the predecessor studies, thus facilitating comparisons in the domain of postsecondary access and choice. Despite the changes in grade cohorts and data collection time points for the first two rounds, general trends will still be measurable, because the same key transitions, albeit at slightly different data collection points, will be captured in the HSLS:09 data. Probably the most cost-efficient and least burdensome method of obtaining continuous data on student careers through the high school years is the collection of school records; in most cases, transcript data are also more accurate than self-report data.
High school transcripts were collected for a subsample of the HS&B sophomore cohort, and for the entire NELS:88 cohort retained in the study after eighth grade and the entire ELS:2002 sophomore and senior cohorts.

A.7 Special Circumstances Relating to Guidelines of 5 CFR 1320.5

All data collection guidelines in 5 CFR 1320.5 are being followed. No special circumstances of data collection are anticipated.

A.8 Consultations Outside NCES

The 60-day Federal Register notice was published on October 13, 2010 (75 FR 62806). No public comments were received in response to this notice.

Consultations with persons and organizations both internal and external to NCES and the federal government have been pursued. In the planning stage for HSLS:09, many efforts were made to obtain critical review and comment on project plans and on interim and final products. The first follow-up Technical Review Panel has also been convened and serves as the major vehicle for future consultation over the course of the project.

For base year assessment development, a mathematics advisory panel comprising the following experts was formed:

  • Hyman Bass, Professor of Mathematics, University of Michigan;

  • Katherine Halvorsen, Professor of Mathematics and Statistics, Smith College;

  • Joan Leitzel, President Emeritus, University of New Hampshire and Professor of Mathematics (retired), Ohio State University;

  • Mark Saul, Mathematics Teacher (retired), Bronxville High School, NY; and

  • Ann Shannon, Mathematics Education Consultant, Oakland, CA.

Additional consultants outside ED and members of the base-year and first follow-up Technical Review Panels include the following individuals:

Base-Year Technical Review Panel and NCES Research Consultants

Dr. Clifford Adelman
The Institute for Higher Education Policy
1320 19th Street, NW, Suite 400
Washington, DC 20036
Phone: (202) 861-8223 ext. 228
Fax: (202) 861-9307
E-mail:
[email protected]

Dr. Kathy Borman
Department of Anthropology, SOC 107
University of South Florida
4202 Fowler Avenue
Tampa, FL 33620
Phone: (813) 974-9058
E-mail:
[email protected]

Dr. Daryl E. Chubin
Director
Center for Advancing Science & Engineering Capacity
American Association for the Advancement of Science (AAAS)
1200 New York Avenue, NW
Washington, DC 20005

Dr. Jeremy Finn
State University of New York at Buffalo
Graduate School of Education
409 Baldy Hall
Buffalo, NY 14260
Phone: (716) 645-2484
E-mail:
[email protected]

Dr. Thomas Hoffer
NORC
1155 E. 60th Street
Chicago, IL 60637
Phone: (773) 256-6097
E-mail:
[email protected]


Dr. Vinetta Jones
Howard University
525 Bryant Street NW
Academic Support Building
Washington, DC 20059
Phone: (202) 806-7340 or (301) 395-5335
E-mail:
[email protected]

Dr. Donald Rock
Before 10/15: K11 Shirley Lane
Trenton NJ 08648
Phone: 609-896-2659
After 10/15: 9357 Blind Pass Rd, #503
St. Pete Beach, FL 33706
Phone: (727) 363-3717
E-mail:
[email protected]

Dr. James Rosenbaum
Institute for Policy Research
Education and Social Policy
Annenberg Hall 110 EV2610
Evanston, IL 60204
Phone: (847) 491-3795
E-mail:
[email protected]

Dr. Russ Rumberger
Gevirtz Graduate School of Education
University of California, Santa Barbara
Santa Barbara, CA 93106
Phone: (805) 893-3385
E-mail:
[email protected]

Dr. Philip Sadler
Harvard-Smithsonian Center for Astrophysics
60 Garden St., MS 71
Office D-315
Cambridge, MA 02138
Phone: (617) 496-4709
Fax: (617) 496-5405
E-mail: [email protected]

Dr. Sharon Senk
Department of Mathematics
Division of Science and Mathematics Education
Michigan State University
D320 Wells Hall
East Lansing, MI 48824
Phone: (517) 353-4691 (office)
E‑mail:
[email protected]

Dr. Timothy Urdan
Santa Clara University
Department of Psychology
500 El Camino Real
Santa Clara, CA 95053
Phone: (408) 554-4495
Fax: (408) 554-5241
E-mail:
[email protected]

Other Consultants Outside ED

Dr. Eric Bettinger
Associate Professor, Economics
Case Western Reserve University
Weatherhead School of Management
10900 Euclid Avenue
Cleveland, OH 44106
Phone: (216) 386-2184
E-mail:
[email protected]

Dr. Audrey Champagne
Professor Emerita
University at Albany
Educational Theory and Practice
Education 119
1400 Washington Avenue
Albany, NY 12222
Phone: (518) 442-5982

Dr. Stefanie DeLuca
Assistant Professor
Johns Hopkins University
School of Arts and Sciences
Department of Sociology
532 Mergenthaler Hall
3400 North Charles Street
Baltimore, MD 21218
Phone: (410) 516-7629
E-mail:
[email protected]

Dr. Laura Hamilton
RAND Corporation
4570 Fifth Avenue, Suite 600
Pittsburgh, PA 15213
Phone: (412) 683-2300 ext. 4403
E‑mail:
[email protected]

Dr. Jacqueline King
Director for Policy Analysis
Division of Programs and Analysis
American Council on Education
Center for Policy Analysis
One Dupont Circle, NW
Washington, DC, 20036
Phone: (202) 939-9551
Fax: 202-785-2990
E-mail: [email protected]

Dr. Joanna Kulikowich
Professor of Education
The Pennsylvania State University
232 CEDAR Building
University Park, PA 16802-3108
Phone: (814) 863-2261
E‑mail:
[email protected]

Dr. Daniel McCaffrey
RAND Corporation
4570 Fifth Avenue, Suite 600
Pittsburgh, PA 15213
Phone: (412) 683-2300 ext. 4919
E-mail:
[email protected]

Dr. Jeylan Mortimer
University of Minnesota, Department of Sociology
909 Social Sciences Building
267 19th Avenue South
Room 1014a Social Sciences
Minneapolis, MN 55455
Phone: (612) 624-4064
E-mail:
[email protected]

Dr. Aaron Pallas
Teachers College
Columbia University
New York, NY 10027
Phone: (646) 228-7414
E-mail:
[email protected]

Ms. Senta Raizen
Director
WestEd
National Center For Improving Science Education
1840 Wilson Blvd., Suite 201A
Arlington, VA 22201-3000
Phone: (703) 875-0496
Fax: (703) 875-0479
E-mail: [email protected]



Technical Review Panel—First Follow-Up

Brian Cook
American Council on Education
One Dupont Circle NW, Suite 800
Washington, DC 20036
Voice: (202) 939-9381
Email: [email protected]

Regina Deil-Amen
Center for the Study of Higher Education
University of Arizona
1430 E. Second Street
Tucson, AZ 85721
Voice: (520) 621-8468, or (520) 444-7441
Email: [email protected]

Jeremy Finn
State University of New York at Buffalo
Graduate School of Education
409 Baldy Hall
Buffalo, NY 14260
Voice: (716) 645-6116, or (716) 645-2484 x1071
Email: [email protected]

Thomas Hoffer
National Opinion Research Center (NORC)
1155 E. 60th Street
Chicago, IL 60637
Voice: (773) 256-6097
Email: [email protected]

Vinetta Jones
Howard University
525 Bryant Street NW
Academic Support Building
Washington, DC 20059
Voice: (202) 806-4947, or (301) 395-5335
Email: [email protected]

Amaury Nora
University of Texas at San Antonio
College of Education and Human Development
One UTSA Circle
San Antonio, TX 78294
Voice: (210) 458-5436, or (210) 458-7394
Email: [email protected]

Jesse Rothstein
Richard and Rhoda Goldman School of Public Policy
University of California, Berkeley
2607 Hearst Avenue
Berkeley, CA 94720-7320
Voice: (returning to UC-B after leave)
Email: [email protected]

Russ Rumberger
University of California, Santa Barbara
Gevirtz Graduate School of Education
2329 Phelps Hall
Santa Barbara, CA 93106
Voice: (805) 451-6091
Email: [email protected]

Sarah E. Turner
Department of Leadership, Foundations and Policy
294 Ruffner Hall
University of Virginia
Charlottesville, VA 22903-2495
Voice: (434) 982-2383
Email: [email protected]

Timothy Urdan
Santa Clara University
Department of Psychology
500 El Camino Real
Santa Clara, CA 95053
Voice: (408) 554-4495
Email: [email protected]


A.9 Explanation of Payment or Gift to Respondents

Incentives are proposed to maximize school and student participation within schools and to encourage students and parents to participate outside of school. Incentives are also intended to help improve the chances of study participation by non-responding sample members. The use of incentives provides significant advantages to the government in terms of increased overall response rates, timely data collection, and reduction of nonresponse bias. In turn, increased response rates result in decreased data collection costs.

The incentive structure requested for the HSLS:09 first follow-up is presented by respondent type in Exhibit A‑1 for the field test and Exhibit A‑2 for the main study. A breakdown of incentives by type of case and data collection phase is provided in Exhibit A-3 for the field test and Exhibit A-4 for the main study. A description and rationale for each incentive is provided below.


Exhibit A-1. Incentives by respondent type proposed for field test

Respondent | Incentive/Honorarium
School | Subscription or equivalent (~$50)
School coordinator | $100, plus $25 for ≥ 85% or $50 for ≥ 92% student participation
IT coordinator | $50
School reimbursement for costs incurred | Up to $100 as required by schools
In-school student | $10
Out-of-school student (OOS) | $15 for completing the questionnaire plus $10 for completing the assessment; an additional $25 to pre-identified “low propensity to respond” cases after the early web data collection period expires
Parent | none
School administrator | none
School counselor | none

Note: Student cases would be categorized as “low propensity to respond” prospectively, based on their contact and response history in addition to their current enrollment status. The additional incentive for low-propensity cases would be implemented only after the three-week early web data collection period had expired and outbound calling efforts had commenced.


Exhibit A-2. Incentives by respondent type proposed for full scale

Respondent

Incentive/Honorarium

School

Subscription equivalent; list of choices ~$50

School coordinator

$100 plus $25 for ≥ 85% or $50 for ≥ 92% student participation

IT coordinator

$50

School reimbursement for

costs incurred

Up to $100 as required by schools

In-school student

$10

Out of school student (OOS)

$15 for completing questionnaire plus $10 for completing assessment. An additional $25 to pre-identified “low propensity to respond” cases after the early web data collection period expires.

Parent

$20 for ”difficult cases” only

School administrator

none

School counselor

none

NOTE: Student incentives for out-of-school data collection would be applied as described in the note for Exhibit A-1. For parents, incentives are offered only to the subset of the population who become “difficult cases” (estimated to be 20% of parents, at $20 each) in the main study. No field test parent cases will receive incentives, because the yield required for the field test does not justify such an incentive. Parent response rate requirements for the main study, however, combined with the positive results of the HSLS:09 base-year incentive experiment, justify the $20 incentive for “difficult cases.”


Exhibit A-3. Incentives by type of case and data collection phase for field test

Type of case and phase | % of sample | Response rate | % of respondents by phase / of all student respondents | Number of respondents | Survey incentive amount | Additional assessment incentive | Total incentive amount
In-school student | 83% | 80% | 100% / 71.2% | 398 | $10 | NA | $10
Out-of-school student* | 17% | 80% | | | | |
  Early Web | | | 30% / 8.6% | 48 | $15 | $10 | $25
  Production – high prop. | | | 50% / 14.5% | 81 | $15 | $10 | $25
  Production – low prop. | | | 20% / 5.7% | 32 | $40 | $10 | $50
Parent | 100% | 48% | | | | |
  Early Web | | | 30% | 90 | NA | NA | NA
  Production | | | 50% | 150 | NA | NA | NA
  Difficult | | | 20% | 60 | NA | NA | NA

Note: In-school nonrespondents will be contacted out of school; they are included in the number of respondents but are not reflected in the percent of sample, to avoid double counting of sample members. “High prop.” refers to high response propensity cases and “low prop.” to low response propensity cases. Percent of sample is the percent of the overall sample in each category (i.e., in-school student, out-of-school student, and parent), and the percentages associated with the data collection periods are the percent of responding sample members who participate within each period.



Exhibit A-4. Incentives by type of case and data collection phase for main study

Type of case and phase | % of sample | Response rate | % of respondents by phase / of all student respondents | Number of respondents | Survey incentive amount | Additional assessment incentive | Total incentive amount
In-school student | 83% | 90% | 100% / 78.7% | 18,829 | $10 | NA | $10
Out-of-school student* | 17% | 80% | | | | |
  Early Web | | | 30% / 6.4% | 1,531 | $15 | $10 | $25
  Production – high prop. | | | 50% / 10.7% | 2,551 | $15 | $10 | $25
  Production – low prop. | | | 20% / 4.3% | 1,020 | $40 | $10 | $50
Parent | 100% | 80% | | | | |
  Early Web | | | 30% | 2,748 | NA | NA | NA
  Production | | | 50% | 4,580 | NA | NA | NA
  Difficult | | | 20% | 1,832 | $20 | NA | $20


Note: In-school nonrespondents will be contacted out of school; they are included in the number of respondents but are not reflected in the percent of sample, to avoid double counting of sample members. “High prop.” refers to high response propensity cases and “low prop.” to low response propensity cases. Percent of sample is the percent of the overall sample in each category (i.e., in-school student, out-of-school student, and parent), and the percentages associated with the data collection periods are the percent of responding sample members who participate within each period.


A few of the incentives presented in the tables above were approved as part of the HSLS:09 First Follow-up School Recruitment Procedures and Materials change request (1850-0852 v.6) in August 2010. As in the base year, school coordinators will be offered an honorarium of $100, with the opportunity to earn an additional $25 for achieving at least an 85% student participation rate or an additional $50 for achieving a student response rate of 92% or better at the school. A modest token of appreciation to the schools, with an estimated value of $50 per school, in the form of a choice of 1-year science- or math-related magazine subscriptions for the school media center, was also approved for the field test. During the field test recruitment effort, we will ask schools to suggest additional low-cost alternatives that are meaningful to them, in order to develop a list of five options to present to schools during the main study and to assist with school recruitment and retention for the first follow-up. Lastly, as in the base-year field test, a $10 incentive was approved for first follow-up field test in-school student respondents.

Incentives for students. The use of a $10 monetary student incentive was approved by OMB for students participating in in-school sessions for the base year and the first follow-up field test (OMB# 1850-0852 v.2 and v.6). We request that the same incentive be offered to students participating in school during the main study data collection. Most participants in the HSLS:09 first follow-up will be nearing the end of their junior year of high school, making them similar to high school seniors, for whom research has demonstrated the importance of incentives for participation in voluntary research studies (National Commission on NAEP 12th Grade Assessment and Reporting, 2004; National Research Council, 2003). In the base-year main study, 9th graders were given a goody bag of education supplies worth an estimated $5. We propose to give students a $10 cash incentive, since a cash incentive is anticipated to be more positively received by upperclassmen than a token incentive. Supporting this point, an experiment conducted during the ELS:2002 first follow-up field test found that high school seniors were more likely to participate when receiving a $20 cash incentive (95.2% student response rate) than a token incentive (86.8% response rate). In addition, the cash incentive responds to the increased student reluctance to leave class for 90 minutes to participate in voluntary research that we encountered in the HSLS:09 base-year data collection, and it offsets the perceived stress of missing class to take another assessment. Finally, the $10 incentive should help increase response rates for the in-school session, thus reducing the number of students requiring costlier web, CATI, or field follow-up.

It is anticipated that approximately 75% of students will be available to participate in the in-school data collection for the HSLS:09 first follow-up. An estimated 8% of students will be enrolled in the base-year school but will be absent or unable to participate in the in-school session and will need to be contacted for an out-of-school administration. The remaining 17% of students will no longer be enrolled in the base-year school and will need to be contacted out of school for the study. Our experience on the ELS:2002/04 first follow-up demonstrated that additional incentives were necessary to gain cooperation from students participating outside of school. We propose to offer a $15 base incentive for students completing the questionnaire outside of school. For the first time in the series of high school longitudinal studies, NCES also will be administering the student assessment outside of school in addition to the questionnaire. We propose to offer students an additional $10 for completing the assessment, for a total of $25 to students who complete both components of the study.

Some students will be more reluctant than others and will be classified as having a “low propensity to respond” to the HSLS:09 first follow-up. Among the most serious problems created by nonresponse is bias, which can lead to inaccurate estimates and compromise data quality. Survey organizations commonly address nonresponse bias by attempting to increase the survey response rate, usually by pursuing the nonresponse cases most likely to be interviewed. However, this approach may not reduce nonresponse bias even if higher response rates are achieved; in fact, nonresponse bias could even be increased by adding more cases that are similar to those that have already responded (Merkle and Edelman 2009). If low propensity (i.e., difficult to complete) cases are brought into the response pool, we anticipate that this will not only increase the weighted response rate but also result in less biased survey estimates.

RTI is currently undertaking an initiative, modeled on the responsive design methodologies developed by Groves (Groves and Heeringa, 2006), to develop new approaches to improving survey outcomes that incorporate different responsive and adaptive features. Although the initiative is still in development, RTI has implemented several of these procedures on recent studies and has published preliminary results (Rosen et al., in press; Peytchev et al., 2010). RTI’s approach aims to reduce nonresponse bias by using multiple sources of data to produce models that estimate a sample member’s response propensity before data collection commences. After empirically identifying sample members with the lowest response propensities, the field team targets those cases with interventions (such as a higher incentive, prompting, or use of a select group of interviewers specially trained in refusal conversion techniques, as appropriate for the sample) in an attempt to maximize the average response propensity.

The ultimate goal of the approach is to minimize bias by targeting the cases that, based on the available data, are expected to have a low likelihood of response and a high likelihood of contributing to nonresponse bias. Because the propensity-modeling plan considers respondent information (including survey response behaviors and socio-demographic characteristics) more inclusively and broadly, it is expected that it will also be able to determine which cases would potentially contribute most to minimization of bias in estimates, and ensure that these cases receive priority, via an effective treatment.

Criteria reviewed to determine response propensity classifications will include participation in the base-year study, enrollment status (e.g., dropout, transfer), existence of contact information (e.g., mailing address, telephone number, email address), parent participation in the base year, type of school, and school locale. Other variables may be added as the model is finalized; however, race/ethnicity, gender, income, and socioeconomic status will not be included, because the model focuses on history of participation. An initial “low propensity to respond” classification will be made at the start of data collection, and the propensity model will be refined during the early web period based on actual early web response. The identified “low propensity to respond” cases would be offered an increased incentive, but only after the three-week early web data collection period has expired and outbound telephone contacts have commenced. At that point, “low propensity to respond” students would be offered $40 to complete the questionnaire and $10 to complete the mathematics assessment, for a total of $50. We propose to implement this incentive immediately following the early web data collection period to ensure that all sample members have the opportunity to respond in the early phase and to better determine which cases have a lower propensity to respond. The $50 total targeted to low propensity cases ($40 for the questionnaire plus $10 for assessment completion) provides a strong incentive to encourage cooperation among the set of cases that would otherwise potentially increase bias through nonresponse. These incentive amounts are comparable to those offered to difficult cases in the ELS first follow-up study, in which the challenges of obtaining participation from such cases made it necessary to request additional incentives during the data collection period to achieve target response rates.
At that time, OMB approved increasing the incentive for the difficult cases from $40 to $60, which resulted in a final response rate of 78% among those cases, compared with an overall response rate of 87%. The increased incentive level helped to generate a 20% increase in the overall out-of-school response rate over the last 8 weeks of ELS data collection.
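For illustration, the propensity-classification step described above can be sketched as follows. This is a hypothetical sketch, not the actual RTI model: the weights, intercept, and field names are placeholder assumptions, and a production model would be fit empirically (e.g., by logistic regression on base-year response data) and refit as early web responses arrive.

```python
import math

# Hypothetical logit weights for the classification criteria named in the
# text (base-year participation, enrollment status, contact information,
# parent participation). All values are illustrative placeholders.
WEIGHTS = {
    "base_year_respondent": 1.2,  # participated in the base-year study
    "still_enrolled": 0.8,        # vs. dropout/transfer status
    "has_phone": 0.5,             # contact information on file
    "has_email": 0.3,
    "parent_responded": 0.6,      # parent participated in the base year
}
INTERCEPT = -0.5


def propensity(member: dict) -> float:
    """Estimated response probability: inverse logit of a linear score."""
    score = INTERCEPT + sum(w for key, w in WEIGHTS.items() if member.get(key))
    return 1.0 / (1.0 + math.exp(-score))


def flag_low_propensity(sample: list, fraction: float = 0.25) -> list:
    """Return the lowest-propensity fraction of the sample for targeting."""
    ranked = sorted(sample, key=propensity)
    cutoff = max(1, int(len(ranked) * fraction))
    return ranked[:cutoff]
```

Under this sketch, the flagged cases are the ones that would receive the targeted treatment (here, the increased incentive) once the early web period ends.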

Prior high school longitudinal studies, such as the ELS:2002/04 FFU, have shown that low propensity cases, which include large percentages of dropout students (an extremely important policy-relevant group), are particularly difficult to reach and to convince to participate. These cases require added effort to secure their participation, and they respond positively to an increased incentive and to more intensive, focused outreach efforts. The HSLS:09/12 plan is designed to minimize nonresponse bias, achieve sufficient yield for analytic needs, limit the number of cases requiring more costly follow-up steps, and, in particular, accommodate the additional assessment component for out-of-school cases among low propensity cases.

All other students participating outside of school will be offered the $15 incentive for completing the questionnaire with the additional $10 for completing the mathematics assessment for a total of $25.

IT Coordinator. During the base-year field test, it was determined that an IT Coordinator was needed at each school to facilitate the use of the school computer labs and to ensure that the school’s computers and network connectivity were compatible with the Sojourn CD, which provides a secure connection between the school’s computers and the NCES website for data collection. In the base-year main study, OMB approved a $50 honorarium for IT Coordinators who facilitated the in-school data collection. This honorarium proved extremely effective in enlisting the assistance of an IT Coordinator in the schools. We propose to continue to offer the $50 honorarium to IT Coordinators for the first follow-up study.

Incentive for counselors. No incentive is proposed for counselors to complete their questionnaires. This precedent was set in the base-year study, which achieved high counselor response rates without the use of monetary incentives. Counselors typically provide the information requested in the questionnaire, as well as the administrative records, as part of their normal duties. Because of the nature of the study, NCES anticipates that many school principals will designate a counselor to perform the school coordinator duties, in which case the counselor will receive the coordinator honorarium previously approved by OMB.

Incentive for school administrators. NCES has achieved high response rates for the school administrator questionnaire on the HSLS:09 base year and on ELS:2002 and the ELS:2002 follow-up conducted in 2004. Based on past experience, no incentive will be offered for the school administrator questionnaire on HSLS:09.

Incentives for parents. For the parent data collection, we do not request an incentive for the field test, but propose that one be offered to a subset of parents for the full-scale study. The field test parent data collection will consist of a small set of parents with a low response rate expectation, designed to test the questionnaire and procedures while containing costs. The full-scale study, however, will comprise a subsample of parents for which achieving high response rates is critical. In the base year, we experienced challenges achieving high parent response and used an incentive experiment to determine the most effective incentive threshold. Based on the results of that experiment (submitted to OMB earlier this year), we propose to offer a $20 incentive for nonresponse follow-up for the most challenging cases. The decision to offer an incentive to parents will be governed by rules similar to those implemented in the base-year incentive experiment: the incentive will be offered to sample members who have not responded after a high number of calls from RTI, to sample members who have refused, and to sample members for whom we have a good address but no working telephone number. Given the two-year lapse between data collections and the effectiveness of the experiment, using these conditions to determine the timing of parent incentive offers should be effective for the first follow-up study.

Reimbursement of reasonable school expenses. In some cases there may be requests from schools for reimbursement of expenses associated with the testing session. For example, a number of base-year schools requested reimbursement for the production of enrollment lists and three others asked for reimbursement to keep the school open for testing sessions that occurred outside of normal school hours. Such cases will be reviewed by project staff on an individual basis and will be approved if the request is deemed reasonable.

A.10 Assurance of Confidentiality Provided to Respondents

A data security plan (DSP) for HSLS:09 was developed and approved by the computer security review board for the base-year study. Revisions to the plan to account for changes associated with the First Follow-Up Study are in progress. The HSLS:09 plan represents best-practice survey systems and procedures for protecting respondent confidentiality and securing survey data. An outline of this plan is provided in Exhibit A-5. The HSLS:09 DSP will

  • establish clear responsibility and accountability for data security and the protection of respondent confidentiality with corporate oversight to ensure adequate investment of resources;

  • detail a structured approach for considering and addressing risk at each step in the survey process and establish mechanisms for monitoring performance and adapting to new security concerns;

  • include technological and procedural solutions that mitigate risk and emphasize the necessary training to capitalize on these approaches; and

  • be supported by the implementation of data security controls recommended by the National Institute of Standards and Technology for protecting federal information systems.

Exhibit A-5. HSLS:09 Data Security Plan Outline

  • HSLS:09 Data Security Plan Summary
  • Maintaining the Data Security Plan
  • Information Collection Request
  • Our Promise to Secure Data and Protect Confidentiality
  • Personally Identifying Information That We Collect and/or Manage
  • Institutional Review Board Human Subject Protection Requirements
  • Process for Addressing Survey Participant Concerns
  • Computing System Summary
  • General Description of the RTI Networks
  • General Description of the Data Management, Data Collection, and Data Processing Systems
  • Integrated Monitoring System
  • Receipt Control System
  • Instrument Development and Documentation System
  • Data Collection System
  • Document Archive and Data Library
  • Employee-Level Controls
  • Security Clearance Procedures
  • Nondisclosure Affidavit Collection and Storage
  • Security Awareness Training
  • Staff Termination/Transfer Procedures
  • Subcontractor Procedures
  • Physical Environment Protections
  • System Access Controls
  • Survey Data Collection/Management Procedures
  • Protecting Electronic Media
  • Encryption
  • Data Transmission
  • Storage/Archival/Destruction
  • Protecting Hard-Copy Media
  • Internal Hard-Copy Communications
  • External Communications to Respondents
  • Handling of Mail Returns, Hard-Copy Student Lists, and Parental Consent Forms
  • Handling and Transfer of Data Collection Materials
  • Tracing Operations
  • Software Security Controls
  • Data File Development: Disclosure Avoidance Plan
  • Data Security Monitoring
  • Survey Protocol Monitoring
  • System/Data Access Monitoring
  • Protocol for Reporting Potential Breaches of Confidentiality
  • Specific Procedures for Field Staff



Invitation letters will be sent to districts, schools, and sample members (students, parents, school administrators, and school counselors). Respondents will be informed of the voluntary nature of the survey and of the confidentiality provision in the initial cover letter and on the questionnaires, stating that their responses may be used for statistical purposes only and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002), Public Law 107-279, Section 183]. The material sent will also include a brochure describing the study and the extent to which respondents and their responses will be kept confidential (Appendix A).

Additionally, HSLS:09 will conform to the NCES Restricted-Use Data Procedures Manual and NCES Standards and Policies. The plan for maintaining confidentiality includes obtaining signed confidentiality agreements and notarized nondisclosure affidavits from all personnel who will have access to individual identifiers. Each individual working on HSLS:09 will complete the e-QIP clearance process. The plan includes annual personnel training on the meaning of confidentiality and the procedures for maintaining it, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. The training will cover controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured and operator-manned in-house computing facility.

A.11 Justification for Sensitive Questions

With the exception of family income, no sensitive questions are included. Income is needed to support poverty and socioeconomic status variables. All responses are voluntary. For parents reluctant to give a precise amount, answers within a broad categorical range may be recorded.

A.12 Estimates of Annualized Burden Hours and Their Cost to Respondents

Estimates of response burden for the HSLS:09 first follow-up field test and full-scale data collection activities are shown in Exhibits A-6 and A-7. These estimates are based on experience with the base-year HSLS:09 questionnaires and with other longitudinal education studies (e.g., ELS:2002, NELS:88, HS&B). Please note that the time students will spend completing the cognitive assessment has not been included in the estimated burden.

Exhibit A-6. Estimated Burden for HSLS:09 First Follow-up Field Test

Respondents                   Sample   Expected        Number of     Average burden   Range of           Total burden
                                       response rate   respondents   per response     response times     (hours)
School Coordinators               24                            24   240 minutes      180-300 minutes              96
IT Coordinators                   24                            24   120 minutes      60-180 minutes               48
School Administrators             24   92%                      22   30 minutes                                    11
School Counselors                 24   92%                      22   30 minutes                                    11
Students—Questionnaire           720   69%                     500   35 minutes       35 minutes                  292
Students—Assessment              720   69%                     500   40 minutes       40 minutes                  333
Parents                          600   50%                     300   30 minutes       30 minutes                  150
Panel Maintenance (Parents)      720   20%                     144   3 minutes        2-4 minutes                    7
Total                                                          892                                                615

NOTE: Student assessment time is shown for reference but, per section A.12, is not included in the respondent and burden-hour totals.

Exhibit A-7. Estimated Burden for HSLS:09 First Follow-up Full-Scale Study

Respondents                                       Sample   Expected        Number of     Average burden   Range of           Total burden
                                                           response rate   respondents   per response     response times     (hours)
School Coordinators (enrollment update)              944   95%                     897   20 minutes       10-30 minutes               299
Panel Maintenance (Parents)                       24,700   20%                   4,940   3 minutes        2-4 minutes                 247
School Coordinators (data collection logistics)      944   92%                     868   240 minutes      180-300 minutes           3,472
IT Coordinators                                      944   92%                     868   120 minutes      60-180 minutes            1,736
School Administrators                                944   92%                     868   30 minutes                                   434
School Counselors                                    944   92%                     868   30 minutes                                   434
Students—Questionnaire                            24,700   92%                  22,724   35 minutes       35 minutes               13,256
Students—Assessment                               24,700   92%                  22,724   40 minutes       40 minutes               15,149
Parents                                            7,500   75%                   5,625   30 minutes       30 minutes                2,813
Total                                                                           31,821                                            22,392



The burden for the full-scale student enrollment update by school coordinators and the panel maintenance (student address update) by parents is included with this request because these activities will take place before the full-scale package is approved in fall 2011.

The cost to the school coordinator and IT coordinator is estimated at $20 per hour. The cost for the school coordinator for data collection activities is estimated at $1,920 for the field test and $69,440 for the main study. The cost for the IT coordinator is estimated at $960 for the field test and $34,720 for the main study.

Assuming an hourly rate of $7.25, the cost to student participants is estimated at $2,117 for the field test and $96,104 for the main study. For parents, assuming a $20 hourly wage, the cost to parent respondents is estimated at $3,000 for the field test and $56,250 for the main study.

For school administrators, approximately three-fourths of the questionnaire is typically completed by clerical staff in the school office with the last section completed by the school principal. Again assuming a $20 hourly cost, the cost to respondents is $220 for the field test and $8,680 for the main study.

For the counselor questionnaire, the respondent dollar cost, assuming an average hourly rate of $20 for school employees, is estimated to be $220 for the field test and $8,680 for the main study.

The panel maintenance address update is estimated to take 2-4 minutes to complete. Assuming a cost of $10 per hour for parents to complete this activity, the total cost would be $70 for the field test and $2,470 for the main study.
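As a cross-check, the burden-hour and cost arithmetic above can be reproduced directly from the field-test figures in Exhibit A-6 and the hourly rates stated in the text. This is a sketch, and the convention of rounding burden hours before applying the wage rate is an assumption that reproduces the reported student figure.

```python
# Spot-check of the field-test burden and cost arithmetic in section A.12.
# Respondent counts and per-response minutes come from Exhibit A-6.

def burden_hours(respondents, minutes_per_response):
    """Total burden hours for one row of the burden table."""
    return respondents * minutes_per_response / 60

# Burden hours (questionnaire only; assessment time is excluded from totals).
student_hours = round(burden_hours(500, 35))   # 291.7 -> reported as 292 hours
parent_hours = burden_hours(300, 30)           # 150 hours
coordinator_hours = burden_hours(24, 240)      # 96 hours

# Respondent costs: burden hours times the stated hourly rate.
student_cost = student_hours * 7.25            # $2,117 at $7.25/hour
parent_cost = parent_hours * 20                # $3,000 at $20/hour
coordinator_cost = coordinator_hours * 20      # $1,920 at $20/hour
```

Each computed value matches the corresponding figure reported in the text for the field test.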

Included in the parent, school administrator, and counselor notification letters will be the following burden statement:

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number of this voluntary information collection is [1850-0852]. The time required to complete this information collection is estimated to average 30 minutes for the parent, school counselor, and school administrator questionnaires, including the time to review instructions and complete and review the information collection. The student questionnaire will be no more than 35 minutes in length, and the math test will take about 40 minutes. If you have any comments concerning the accuracy of the time estimate or suggestions for improving the interview, please write to: High School Longitudinal Study of 2009 (HSLS:09), National Center for Education Statistics, 1990 K Street NW, Washington, DC 20006.

A.13 Estimates of Total Annual Cost Burden to Respondents

There are no capital, startup, or operating costs to respondents for participation in the project. No equipment, printing, or postage charges will be incurred.

A.14 Annualized Cost to the Federal Government

Estimated costs to the federal government for HSLS:09 are shown in Exhibit A-8. The estimated costs to the government for data collection for the field test and main study are presented separately. Included in the contract estimates are all staff time, reproduction, postage, and telephone costs associated with the management, data collection, analysis, and reporting for which clearance is requested.

Exhibit A-8. Total Costs to NCES

Costs to NCES                            Amount
Total HSLS:09 first follow-up costs      $15,557,197
  Salaries and expenses                      719,900
  Contract costs                          14,774,940
Field test (2011)                          3,541,587
  Salaries and expenses                      215,648
  Contract costs                           2,894,294
Full-scale survey (2012)                  12,015,610
  Salaries and expenses                      504,252
  Contract costs                          11,880,646

NOTE: All costs quoted are exclusive of incentives. Field test costs represent Task 2 of the HSLS:09 contract; full-scale study costs represent Task 3.

A.15 Reasons for Program Changes

The decrease in burden for this collection reflects the fact that the last OMB approval covered the base-year main study data collection, whereas this request covers data collection for the HSLS:09 first follow-up field test.

A.16 Publication Plans and Project Schedule

The formal contract for HSLS:09 requires the following reports, publications, or other public information releases:

1. a detailed methodological report describing all aspects of the full-scale study design and data collection procedures (a working paper detailing the methodological findings from the field test will also be produced);

2. complete data files and documentation for research data users in the form of both a restricted-use electronic codebook (ECB) and a public-use data tool (i.e., EDAT); and

3. special tabulations of issues of interest to the higher education community, as determined by NCES.

The operational schedule for the HSLS:09 field test and full-scale study is shown in Exhibit A‑9.

Exhibit A-9. Operational Schedule for HSLS:09

HSLS:09 activity                               Start date   End date
Field test
  School recruitment*                          Sept. 2010   May 2011
  Enrollment status verification*              Oct. 2010    Dec. 2010
  Parent address update*                       Oct. 2010    Dec. 2010
  Cognitive interviewing                       Dec. 2010    Jan. 2011
  Batch tracing*                               Jan. 2011    Jan. 2011
  Student in-school data collection            March 2011   Jun. 2011
  Self-administered web-based data collection  March 2011   Jun. 2011
  Conduct telephone interviews                 March 2011   Jun. 2011
  Conduct field interviews                     March 2011   Jun. 2011
  Process data, construct data files           Jun. 2011    Aug. 2011
  Prepare/update field test reports            Jun. 2011    Dec. 2011
  College update                               Jun. 2012    Oct. 2012
  Transcript collection                        Sept. 2012   Jan. 2013
  Panel maintenance                            Fall 2013
Full-scale study
  School recruitment                           Jan. 2011    May 2012
  Enrollment status verification               Sept. 2011   Dec. 2011
  Parent address update                        Sept. 2011   Dec. 2011
  Cognitive interviewing                       Sept. 2011   Oct. 2011
  Batch tracing                                Oct. 2011    Oct. 2011
  Student in-school data collection            Jan. 2012    Jun. 2012
  Self-administered web-based data collection  Feb. 2012    Aug. 2012
  Conduct telephone interviews                 Feb. 2012    Aug. 2012
  Conduct field interviews                     Feb. 2012    Aug. 2012
  Process data, construct data files           Jun. 2012    Dec. 2012
  Prepare/update reports                       Jun. 2012    Dec. 2012
  College update                               Jun. 2013    Oct. 2013
  Transcript collection                        Sept. 2013   Jan. 2014
  Panel maintenance                            Fall 2014
* Denotes activities already approved by OMB.
Note: The current request for OMB review includes only data collection activities for the field test study.

A.17 Reason(s) Display of OMB Expiration Date Is Inappropriate

The expiration date for OMB approval of the information collection will be displayed on data collection instruments and materials. No special exception to this requirement is requested.

A.18 Exceptions to Certification for Paperwork Reduction Act Statement

There are no exceptions to the certification statement identified in the Certification for Paperwork Reduction Act Submissions of OMB Form 83-I.

References

Groves, R. M., & Heeringa, S. (2006). Responsive design for household surveys: tools for actively controlling survey errors and costs. Journal of the Royal Statistical Society, Series A: Statistics in Society, 169(3), 439-457.

Merkle, D. M., & Edelman, M. (2009). An Experiment on Improving Response Rates and Its Unintended Impact on Survey Error. Survey Practice, March 2009.

National Commission on NAEP 12th Grade Assessment and Reporting. (2004, March). 12th Grade Student Achievement in America: A New Vision for NAEP. Retrieved September 22, 2010, from http://www.nagb.org/publications/12_gr_commission_rpt.pdf

National Research Council, Committee on Increasing High School Students’ Engagement and Motivation to Learn. (2003). Engaging Schools: Fostering High School Students’ Motivation to Learn. Washington, DC: National Academies Press.

Peytchev, A., Riley, S., Rosen, J. A., Murphy, J. J., & Lindblad, M. (2010). Reduction of Nonresponse Bias in Surveys through Case Prioritization. Survey Research Methods, 4(1), 21-29.

Rosen, J. A., Murphy, J. J., Peytchev, A., Riley, S., & Lindblad, M. (in press). The Effects of Differential Interviewer Incentives on a Field Data Collection Effort. Field Methods.
