HSLS 09 Supporting Statement Part A

High School Longitudinal Study of 2009 (HSLS:09)

OMB: 1850-0852








High School Longitudinal Study of 2009 (HSLS:09)





Supporting Statement
Request for OMB Review (SF83-I)
OMB# 1850-0852 v.3







Submitted by:

National Center for Education Statistics

U.S. Department of Education






June 17, 2009



Contents

A. Justification

1. Circumstances Making the Collection of Information Necessary

a. Purpose of This Submission (PIA, D, 2, a, i)

b. Legislative Authorization

c. Prior and Related Studies

2. Purpose and Use of Information Collection

a. Content Justifications

b. Sharing survey information (PIA, D, 2, a, ii)

c. Impact on respondent privacy (PIA, D, 2, a, iii)

3. Use of Improved Information Technology and Burden Reduction

4. Efforts to Identify Duplication and Use of Similar Information

5. Impact on Small Businesses or Other Small Entities

6. Consequences of Collecting the Information Less Frequently

7. Special Circumstances Relating to Guidelines of 5 CFR 1320.5

8. Consultations Outside NCES

9. Explanation of Any Payment or Gift to Respondents: Field Test Incentive Results and Recommendations for the Main Study

10. Assurance of Confidentiality Provided to Respondents (PIA, D, 2, iv)

11. Justification for Sensitive Questions

12. Estimates of Annualized Burden Hours and Costs: Changes

13. Estimates of Other Total Annual Cost Burden

14. Annualized Cost to the Federal Government

15. Explanation for Program Changes or Adjustments

16. Plans for Tabulation and Publication and Project Time Schedule

17. Reason(s) Display of OMB Expiration Date Is Inappropriate

18. Exceptions to Certification for Paperwork Reduction Act Submissions

B. Collection of Information Employing Statistical Methods

1. Target Universe and Sampling Frames

2. Statistical Procedures for Collecting Information

a. School Frames and Samples

b. Student Frames and Samples

c. Weighting, Variance Estimation, and Imputation

3. Methods for Maximizing Response Rates

4. Individuals Consulted on Statistical Design

Appendix A. Recruitment Materials

Appendix B. IRB Approval for Recruitment Materials

Appendix C. Proposed Main Study Student Questionnaire

Appendix D. Proposed Main Study Parent Questionnaire

Appendix E. Proposed Main Study School Administrator Questionnaire

Appendix F. Proposed Main Study Teacher (Math and Science) Questionnaire

Appendix G. Proposed Main Study Counselor Questionnaire

Appendix H. Proposed Main Study State Augmentation Administrative Records Collection

Appendix I. TRP Meeting Minutes

Appendix J. Summary of Changes Memo


List of Tables

Table 1. Incentives

Table 2. Estimated burden on respondents for full-scale study

Table 3. Estimated burden on parents for full-scale study

Table 4. Estimated burden on teachers for full-scale study

Table 5. Estimated burden on school administrators for full-scale study

Table 6. Estimated burden on school counselors for full-scale study

Table 7. Estimated burden on state employees for full-scale study State Augmentation

Table 8. Total costs to NCES (updated)

Table 9. HSLS:09 schedule

Table 10. Illustrative school sample allocation and expected yields (full-scale study HSLS:09)

Table 11. Student sample allocation and expected yields for HSLS:09 ninth-graders

Table 12. Consultants on statistical aspects of HSLS:09



List of Exhibits

1. HSLS:09 data security plan outline


High School Longitudinal Study of 2009

This document has been prepared to support the clearance of study data elements and procedures under the Paperwork Reduction Act of 1995 and 5 CFR 1320 for the study titled High School Longitudinal Study of 2009 (HSLS:09). This study is being conducted by RTI International—with the American Institutes for Research (AIR), Windwalker Corporation, Horizon Research Inc., Research Support Services (RSS), and MPR Associates (MPR) as subcontractors—under contract to the U.S. Department of Education (Contract number ED-04-CO-0036/0003).

The purpose of this Office of Management and Budget (OMB) submission is to supersede the field test clearance document with a full supporting statement that requests clearance for the instrumentation and data collection methods, as well as sampling and recruitment activities, for the HSLS:09 main study.

In this supporting statement for Standard Form (SF) 83-I, we report the purposes of the study, review the data elements for which clearance is requested, and describe how the collected information addresses the statutory provisions of Section 153 of the Education Sciences Reform Act of 2002 (P.L. 107-279). Subsequent sections of this document respond to the OMB instructions for preparing supporting statements to SF 83-I. Section A addresses OMB’s specific instructions for justification and provides an overview of the study’s design and data elements. Section B describes the collection of information employing statistical methods.

A. Justification

1. Circumstances Making the Collection of Information Necessary

a. Purpose of This Submission (PIA, D, 2, a, i)

The materials in this document support a request for clearance for the main study of HSLS:09. The basic components and key design features of HSLS:09 are summarized below:

Base Year

  • baseline survey of high school ninth-graders, in fall term, 2009;

  • cognitive test in mathematics;

  • parents and mathematics and science teachers to be surveyed in the base year; school administrator and school counselor information will also be collected;

  • administrative records collected on coursetaking behavior in grades 8 and 9;

  • sample sizes of 944 schools and more than 23,000 sampled students (schools are the first-stage unit of selection, with ninth-graders randomly selected within schools); and

  • oversampling of private schools and Asian students.





First Follow-up

Specifications have not yet been provided for follow-ups to the base-year study, although the following have been discussed:

  • follow-up in 2012 in the spring term, when most sample members are high school juniors, but some are dropouts or in other grades;

  • student questionnaires, mathematics assessment, and school administrator questionnaires to be administered;

  • returning to the same schools, but separately following transfer students; and

  • high school transcript component in 2013 (records data for grades 9–12).

Second Follow-up

  • post–high school follow-ups by web survey and computer-assisted telephone interview.

HSLS:09 will provide a link to its predecessor longitudinal studies, which address many of the same issues of transition from high school to postsecondary education and the labor force. At the same time, HSLS:09 will bring a new and special emphasis to the study of youth transition by exploring the path that leads students to pursue and persist in courses and careers in the fields of science, technology, engineering, and mathematics (STEM). HSLS:09 will measure math achievement gains in the first 3 years of high school, but also will relate tested achievement to students’ choice, access, and persistence—both in mathematics and science courses in high school, and thereafter in the STEM pipelines in postsecondary education and in STEM careers. That is to say, the HSLS:09 assessments will serve not just as an outcome measure, but also as a predictor of readiness to proceed into STEM courses and careers. Questionnaires will focus on factors that motivate students for STEM coursetaking and careers.

Additionally, HSLS:09 will focus on students’ decisionmaking processes. Generally, the study will question students on when, why, and how they make decisions about courses and postsecondary options, including what factors, from parental input to considerations of financial aid for postsecondary education, enter into these decisions.

HSLS:09 supports two of the three goals of the American Competitiveness Initiative (ACI), which aims to strengthen math and science education, foreign language studies, and the high school experience in the United States. Information collected from students, parents, teachers, counselors, and school administrators will help to inform and shape efforts to improve the quality of math and science education in the United States, increase our competitiveness in STEM-related fields abroad, and improve the high school experience.

There are several reasons the transition into adulthood is of special interest to federal policy and programs. Adolescence is a time of physical as well as psychological changes. Attitudes, aspirations, and expectations are sensitive to the stimuli that adolescents are exposed to, and environments influence the process of choosing among opportunities. Parents, educators, and those involved in policy decisions in the educational arena all share the need to understand the effects that the presence or absence of good educational guidance from the school, in combination with that from the home, can have on the educational, occupational, and social success of youth.

These patterns of transition cover individual as well as institutional characteristics. At the individual level the study will look into educational attainment and personal development. In response to policy and scientific issues, data will also be provided on the demographic and background correlates of educational outcomes. At the institutional level, HSLS:09 will focus on school effectiveness issues, including tracking, promotion, retention, and curriculum content, structure, and sequencing, especially as these affect students’ choice of and assignment to different mathematics and science courses and achievement in these two subject areas.

By collecting extensive information from students, parents, teachers, school counselors, school administrators, and school records, it will be possible to investigate the relationship between home and school factors and academic achievement, interests, and social development at this critical juncture. The school environment will be captured primarily through student, teacher, and administrator reports. The extent to which schools are expected to provide special services to selected groups of students to compensate for limitations and poor performance (including special services to assist those lagging in their understanding of mathematics and science) will be examined. Base-year teachers will report on sampled students’ specific classroom environment and supply information about their own background and training. Moreover, the study will focus (in particular through the base-year parent survey) on basic policy issues related to parents’ role in the educational success of their children, including parents’ educational attainment expectations for their children, beliefs about and attitudes toward curricular and postsecondary educational choices, and the correlates of active parental involvement in the school; these are among the many questions HSLS:09 will address about the home education support system and its interaction with the student and the school.

Additionally, since the survey will focus on ninth-graders, it will also permit the identification and study of high school dropouts and underwrite trend comparisons with dropouts identified and surveyed in the High School and Beyond Longitudinal Study (HS&B), the National Education Longitudinal Study of 1988 (NELS:88), and the Education Longitudinal Study of 2002 (ELS:2002).

Finally, HSLS:09 provides for a further level of analysis for certain states, through the linkage of HSLS:09 data with state longitudinal data systems. NCES has received funds from the National Science Foundation (NSF) to perform two functions: (1) sample additional schools to produce state-representative samples and (2) perform the State Data Record Augmentation (SDRA), which involves the collection of state administrative records from each of 10 states: California, Florida, Georgia, Michigan, North Carolina, Ohio, Pennsylvania, Tennessee, Texas, and Washington. Linking to state records systems greatly extends the power and utility of HSLS:09 and marks an important innovation with many possible positive future implications.

In sum, through its core and supplemental components, HSLS:09 data will allow researchers, educators, and policymakers to examine motivation, achievement, and persistence in STEM coursetaking and careers. More generally, HSLS:09 data will allow researchers from a variety of disciplines to examine changes in young people’s lives and their connections with communities, schools, teachers, families, parents, and friends along a number of dimensions, including the following:

  • academic (especially in math and science), social, and interpersonal growth;

  • transitions from high school to postsecondary education, and from school to work;

  • students’ choices about, access to, and persistence in math and science courses, majors, and careers;

  • the characteristics of high schools and postsecondary institutions and their impact on student outcomes;

  • family formation, including marriage and family development, and how prior experiences in and out of school correlate with these decisions; and

  • the contexts of education, including how minority and at-risk status is associated with education and labor market outcomes.

b. Legislative Authorization

HSLS:09 is sponsored by the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES), in close consultation with other offices and organizations within and outside the U.S. Department of Education (ED). HSLS:09 is authorized under Section 153 of the Education Sciences Reform Act of 2002 (P.L. 107-279, Title 1 Part C), which requires NCES to

“collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including

(1) collecting, acquiring, compiling (where appropriate, on a State-by-State basis), and disseminating full and complete statistics … on the condition and progress of education, at the preschool, elementary, secondary, postsecondary, and adult levels in the United States, including data on—

(A) State and local education reform activities; …

(C) student achievement in, at a minimum, the core academic areas of reading, mathematics, and science at all levels of education;

(D) secondary school completions, dropouts, and adult literacy and reading skills;

(E) access to, and opportunity for, postsecondary education, including data on financial aid to postsecondary students; …

(J) the social and economic status of children, including their academic achievement…

(2) conducting and publishing reports on the meaning and significance of the statistics described in paragraph (1);

(3) collecting, analyzing, cross-tabulating, and reporting, to the extent feasible, information by gender, race, ethnicity, socioeconomic status, limited English proficiency, mobility, disability, urbanicity, and other population characteristics, when such disaggregated information will facilitate educational and policy decisionmaking; …

(7) conducting longitudinal and special data collections necessary to report on the condition and progress of education…”

Section 183 of the Education Sciences Reform Act of 2002 further states that

all collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and Center of the Institute, shall conform with the requirements of section 552a of title 5, United States Code [which protects the confidentiality rights of individual respondents with regard to the data collected, reported, and published under this title] and the confidentiality standards of subsection (c) of this section [which prohibits anyone from using individually identifiable data collected under the Education Sciences Reform Act (ESRA) for any purpose other than research, statistics, or evaluation (with the exception of the Attorney General, per amendment by the USA PATRIOT Act) and prohibits the publication of information that could result in the identification of a person who provided data under ESRA].

c. Prior and Related Studies

In 1970 NCES initiated a program of longitudinal high school studies. Its purpose was to gather time-series data on nationally representative samples of high school students that would be pertinent to the formulation and evaluation of educational policies.

Starting in 1972 with the National Longitudinal Study of the High School Class of 1972 (NLS:72), NCES began providing educational policymakers and researchers with longitudinal data that linked educational experiences with later outcomes, such as early labor market experiences and postsecondary education enrollment and attainment. The NLS:72 cohort of high school seniors was surveyed five times (in 1972, 1973, 1974, 1979, and 1986). A wide variety of questionnaire data were collected in the follow-up surveys, including data on students’ family background, schools attended, labor force participation, family formation, and job satisfaction. In addition, postsecondary transcripts were collected.

Almost 10 years later, in 1980, the second in a series of NCES longitudinal surveys was launched, this time starting with two high school cohorts. High School and Beyond included one cohort of high school seniors comparable to the seniors in NLS:72. The second cohort within HS&B extended the age span and analytical range of NCES’s longitudinal studies by surveying a sample of high school sophomores. With the sophomore cohort, information became available to study the relationship between early high school experiences and students’ subsequent educational experiences in high school. For the first time, national data were available showing students’ academic growth over time and how family, community, school, and classroom factors promoted or inhibited student learning. In a leap forward for educational research, researchers, using data from the extensive battery of cognitive tests within HS&B, were also able to assess the growth of cognitive abilities over time. Moreover, data were now available to analyze the school experiences of students who later dropped out of high school. These data became a rich resource for policymakers and researchers over the next decade and provided an empirical base to inform the debates of the educational reform movement that began in the early 1980s. Both cohorts of HS&B participants were resurveyed in 1982, 1984, and 1986. The sophomore cohort was also resurveyed in 1992. Postsecondary transcripts also were collected for both cohorts.

The third longitudinal study of students sponsored by NCES was the National Education Longitudinal Study of 1988. NELS:88 further extended the age and grade span of NCES longitudinal studies by beginning the data collection with a cohort of eighth-graders. Along with the student survey, it included surveys of parents, teachers, and school administrators. It was designed not only to follow a single cohort of students over time (as had NCES’s earlier longitudinal studies, NLS:72 and HS&B), but also, by “freshening” the sample at each of the first two follow-ups, to follow three nationally representative grade cohorts over time (8th-grade, 10th-grade, and 12th-grade cohorts). This provided not only comparability of NELS:88 to existing cohorts, but it also enabled researchers to conduct both cross-sectional and longitudinal analyses of the data. In 1993, high school transcripts were collected, further increasing the analytic potential of the survey system. Students were interviewed again in 1994 and 2000, and in 2000–2001 their postsecondary educational transcripts were collected. In sum, NELS:88 represents an integrated system of data that tracked students from middle school through secondary and postsecondary education, labor market experiences, and marriage and family formation.

The Education Longitudinal Study of 2002 was the fourth longitudinal high school cohort study conducted by NCES. ELS:2002 started with a sophomore cohort and was designed to provide trend data about the critical transitions experienced by students as they proceed through high school and into postsecondary education or their careers. Student questionnaires and assessments in reading and mathematics were collected along with surveys of parents, teachers, and school administrators. In addition, a facilities component and school library/media studies component were added for this study series. Freshening occurred at the first follow-up in 2004 to allow for a nationally representative cohort of high school seniors, which was followed by the collection of high school transcripts. An additional follow-up was conducted in 2006.

These studies have investigated the educational, personal, and vocational development of students, and the school, familial, community, personal, and cultural factors that affect this development. Each of these studies has provided rich information about the critical transition from high school to postsecondary education and the workforce. HSLS:09 will continue on the path of its predecessors while also focusing on the factors associated with choosing, persisting in, and succeeding in STEM coursetaking and careers.

2. Purpose and Use of Information Collection

HSLS:09 is intended to be a general-purpose dataset; that is, it will be designed to serve multiple policy objectives. Policy issues to be studied through HSLS:09 include the identification of school attributes associated with achievement (especially in mathematics); the influence that parent and community involvement have on students’ achievement and development; the factors associated with dropping out of the educational system; changes in educational practices over time; and the transition of different groups (for example, racial and ethnic, gender, and socioeconomic status groups) from high school to postsecondary institutions and the labor market, and especially into STEM curricula and careers. HSLS:09 will inquire into students’ values and goals, investigate factors affecting risk and resiliency, gather information about the social capital available to sample members, inquire into the nature of student interests and decision-making, delineate students’ curricular and extracurricular experiences, and catalogue their school programs and coursetaking experiences and results. HSLS:09 will obtain from teachers information about the classroom, school climate, and teacher background. HSLS:09 will include measures of school climate, each student’s native language and language use, student and parental educational expectations, attendance at school, course and program selection, planning for college, interactions with teachers and peers, perceptions of safety in school, parental income, resources, and home education support system. The HSLS:09 data elements will support research that speaks to the underlying dynamics and educational processes that influence student achievement, growth, and personal development over time.

The objectives of HSLS:09 also encompass the need to support both longitudinal and cross-cohort analyses and to provide a basis for important descriptive cross-sectional analyses. HSLS:09 is first and foremost a longitudinal study; hence survey items will be chosen for their usefulness in predicting or explaining future outcomes as measured in later survey waves. Compared to its earlier counterparts, there are considerable changes to the design of HSLS:09 that will have some impact on the ability to produce trend comparisons. NELS:88 began with an eighth-grade cohort in the spring term; while this cohort is not markedly different from the fall-term ninth-grade cohort of HSLS:09 in terms of student knowledge base, it differs at the school level in that the HSLS:09 time point represents the beginning of high school rather than the point of departure from middle school. HSLS:09 includes a spring-term 11th-grade follow-up (even though none of the predecessor studies did) because only modest gains have been seen on assessments in the final year of high school and because an 11th-grade follow-up minimizes the unit nonresponse problems associated with testing in the spring term of the senior year. The design of HSLS:09 calls for information to be collected from parents of 12th-graders and the use of transcripts to provide continuous data for grades 9–12. These data elements will provide the basis for trend analysis between HSLS:09 and its predecessor studies.

When contacting schools that have agreed to participate in HSLS:09, we will ask school IT staff a series of questions to identify the issues associated with using the school’s computer laboratories and computer equipment for the student component of HSLS:09. These questions will focus on the availability of a computer lab or a location at the school with computers that might be available for the sessions. For schools that have computers available, IT staff will report on the capacity of the computer lab (or other location with a set of computers) with regard to the number of computers and Internet connectivity, the security of the computers at the school, and whether RTI and NCES will be permitted to use the computer lab (or comparable location) to conduct HSLS:09. As a backup, we are prepared to bring in five laptops per school to conduct the student assessment and survey. The questions we plan to ask the school are

  1. Do you have a computer lab in your school or other location with multiple computers?

  2. How many computers are there in the computer lab (or comparable location) that can be connected to the Internet?

  3. What type of Internet connections do you have in the computer lab (or comparable location)?

    1. High-speed connection

    2. Dial-up connection

    3. None

  4. Which operating system (Windows 2000/XP, Mac OS, Linux, etc.) runs on these computers?

  5. What web browser(s) (name and version) are installed on these computers (e.g., Internet Explorer 6.0, Mozilla Firefox 2.0, Netscape 6)?

  6. Is the Internet activity of these computers recorded and/or monitored in any way?

  7. How many students and/or classes per day use the computers in the computer lab?

  8. Can RTI International use the computers at the school for conducting the web-based student assessment and survey for students participating in the High School Longitudinal Study?

  9. Are the school computers protected by

    1. antivirus software;

    2. antispyware software; or

    3. Internet firewall?

  10. Will you allow RTI International to run checks on the school computers to verify that they are not infected with viruses or spyware?

  11. Will you allow RTI International to remove viruses and spyware found as the result of the check proposed in Question 10?
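The questions above feed a simple operational decision: use the school’s own lab if it can support the web-based sessions, or fall back to the five laptops brought per school. A minimal sketch of that decision, assuming hypothetical field names keyed to the question numbers (the readiness criteria and the `session_plan` function are illustrative assumptions, not part of the HSLS:09 protocol):

```python
# Hedged sketch: decide between the school's lab and the laptop backup.
# Field names and readiness criteria are illustrative assumptions.

LAPTOPS_PER_SCHOOL = 5  # backup equipment level stated in the text


def session_plan(lab_info):
    """Return (equipment source, machine count) for a school's session."""
    usable = (
        lab_info.get("has_lab", False)                        # Q1
        and lab_info.get("num_computers", 0) > 0              # Q2
        and lab_info.get("internet", "none") == "high-speed"  # Q3
        and lab_info.get("rti_permitted", False)              # Q8
    )
    if usable:
        return ("school lab", lab_info["num_computers"])
    return ("laptop backup", LAPTOPS_PER_SCHOOL)


# A school with a permitted, high-speed lab of 20 machines uses its own lab:
plan = session_plan({"has_lab": True, "num_computers": 20,
                     "internet": "high-speed", "rti_permitted": True})
# A school with no usable lab gets the laptop backup:
fallback = session_plan({"has_lab": False})
```

In practice the security items (Q6, Q9–Q11) would also inform the decision; the sketch keeps only the capacity and permission checks for brevity.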

a. Content Justifications

Overview. This section discusses the content of the HSLS:09 instruments. Several appendices have been produced to supply further documentation relevant to the revision of the field test instruments for the main study. Specifically, one appendix (J) is a change memo, further described below. Another appendix (I) contains the summaries of the November 2007, January 2008, and January 2009 HSLS:09 Technical Review Panel (TRP) meetings; these minutes have been appended to provide information about the panel’s deliberations, including recommendations concerning study and instrument design. In addition, the six draft field test questionnaires—student, parent, administrator, mathematics teacher, science teacher, and counselor—have been included, each in its own appendix, along with the proposed revised versions for the main study.

All questionnaires and the assessment serve to support the overall purposes of HSLS:09, which are to understand the factors (e.g., experiences, behaviors, attitudes, interactions with people) that influence students’ decision-making process about high school courses, postsecondary options, and occupational goals (especially within the STEM pipeline), and to understand how these decisions evolve through secondary school and beyond.

In the attached change memo (appendix J), a series of tables updates the post-field test status of all HSLS:09 questionnaires: student, parent, administrator, math and science teacher, and school counselor. The tables (1) indicate the status of each field test item (retained, dropped, or revised); (2) provide a crosswalk between the question numbering of the questionnaires appended to the July 2008 regular clearance submission (also appended to this document), the item name, and the location of each item on the revised instruments proposed for the main study and appended to this document; and (3) provide a justification for each change. A few new items are recommended and have been added to the tables as well as to the revised questionnaires.

The principal changes can be concisely summarized. First, for the main study student instrument, there has been a shift from a heavy predominance of belief and attitude items toward more of a balance with behavioral measures. This shift reflects the fact that the large (n > 1,000) ninth-grade field test sample was exploited, through matrix sampling, to test a large number of alternative belief and attitude scales. Reliability and other analyses were then used to winnow the pool: weaker scales, and weaker items within scales, were dropped.
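The reliability-based winnowing described above can be illustrated with a standard internal-consistency statistic such as Cronbach’s alpha. This is a minimal sketch under stated assumptions: the toy response matrix and the conventional 0.70 retention cutoff are illustrative, not HSLS:09 values, and the actual field test analyses were more extensive.

```python
# Hedged sketch of reliability-based scale winnowing using Cronbach's alpha.
# The data and the 0.70 cutoff are illustrative assumptions.

def cronbach_alpha(responses):
    """Cronbach's alpha for a list of per-respondent item-score rows."""
    k = len(responses[0])  # number of items in the scale

    def pvar(xs):  # population variance
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([row[j] for row in responses]) for j in range(k)]
    total_var = pvar([sum(row) for row in responses])
    # alpha = (k / (k - 1)) * (1 - sum of item variances / total variance)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)


# Illustrative field-test responses: 4 respondents x 3 items (1-5 Likert)
responses = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [1, 2, 1],
]

alpha = cronbach_alpha(responses)   # about 0.95 for this toy data
keep_scale = alpha >= 0.70          # assumed retention cutoff
```

A scale falling below the cutoff would be a candidate for dropping, and item-level statistics (e.g., alpha-if-item-deleted) would guide dropping weak items within retained scales.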

In addition, two important series of college questions were added—items whose measurement properties have already been tested through prior NCES high school longitudinal cohort studies or the National Household Education Survey. One series attempts to elicit more probabilistic information about the subjective certainty of educational entry and attainment expectations. A second series—parallel in both the student and parent instruments—tests knowledge of the costs of postsecondary educational options, including a question that asks for the respondent’s degree of confidence in the accuracy of their postsecondary educational cost estimate.

A more fine-grained record of all specific changes, with a justification for each individual change, is provided in the change memo (appendix J).




Student Survey: Purposes and Content Justification

Four primary research questions drive the student questionnaire:

  1. How do students decide what courses to take in high school and what to pursue after their time in high school concludes (e.g., college, work, careers, the military)? What factors affect their decision-making, particularly factors that are malleable to school or parent influence?

    1. Opportunities

    2. Barriers

    3. Attitudes

    4. Past behaviors

    5. Plans

  2. What factors lead students toward or away from STEM courses and careers?

  3. How do students’ attitudes and learning approaches evolve in the course of high school?

    1. Confidence, efficacy

    2. Motivation, engagement, belonging

    3. Reasoning and problem-solving

  4. How do students prioritize various commitments or influences in high school? How do they balance social and academic engagement?

The main study student questionnaire attempts to provide information that will help to address these and related questions from the student’s perspective.

Justifications for Student Questionnaire Content. Items selected for the student questionnaire have been developed and reviewed by the NCES project officer, project staff at RTI International, and the HSLS:09 Technical Review Panel. The questionnaire will collect information on seven domains: (1) student locating and contact information; (2) student background and previous experiences; (3) students’ social context and interpersonal influences; (4) students’ values and their determinants; (5) students’ expectations for the future and their determinants; (6) student decisions; and (7) the environment of students’ ninth-grade math and science classes. Each domain and the questions that comprise it are detailed below.

Part II: Student Background and Previous Experiences

Because this study focuses on students’ trajectories through high school and the decisions they make along the way, it is important to gather information on their background and experiences before they enter high school. This is crucial in establishing baseline information on students so that effects of school, family, and peers on educational trajectories and decisions can be identified apart from preexisting characteristics of sample members. Once collected, these measures will be used mostly as predictors of secondary and postsecondary outcomes.

Part III: Social Context and Interpersonal Influences

In accord with a host of past research conducted by NCES, students’ daily interactions with their parents, teachers, and peers play a large role in their transition to high school, their current academic investments, as well as their future plans. Like the previous section, this section will collect information that will mostly serve as predictors of later outcomes. Since many of the items measured here will also be collected in later rounds, this section provides a baseline for analyses of change and stability in social context and interpersonal influences, and how this overlaps with academic success and social development. Specifically, it will allow analysts to explore issues of school climate and support networks as students progress through high school.

Part IV: Students’ Values and Their Determinants

Research in social psychology, economics, and sociology identifies values as key determinants of behavior. For example, if students value going to college, their behaviors will be directed toward that end. In a departure from previous NCES longitudinal studies, HSLS:09 will include items that directly measure the value students place on school in general and, in particular, on math and science and their future careers, as well as the factors that shape these values. These factors include belonging, motivation, identity, perceived opportunities, and perceived costs. These items can be used as outcomes predicted by student background, previous experiences, social context, and interpersonal influences, or they can be used as predictors of educational and occupational decisions measured in later rounds.

Part V: Students’ Expectations for the Future and Their Determinants

In addition to values, past research using NCES longitudinal studies and other data sources has demonstrated that expectations for the future play a role in shaping behavior. For example, if students expect to work in their family business when they grow up, their behaviors will be directed toward that end. To further explore the role that these expectations play in the decision-making processes of contemporary adolescents, HSLS:09 will include a host of items that directly measure the expectations students hold for postsecondary education and careers, as well as the factors that influence these expectations. These include attributions, self-efficacy, deterrents, and negative experiences. As with students’ values, these items can be used as outcomes predicted by student background, previous experiences, social context, and interpersonal influences, or they can be used as predictors of later educational and occupational decisions. Additionally, given the historical use of expectations questions on NCES longitudinal surveys, analysts can make comparisons with previous high school cohorts. For the main study, probabilistic versions of the expectations questions, which ask the respondent to posit the likelihood of the postsecondary outcome, are included.

Part VI: Student Decisions

The central outcomes measured in the student questionnaire are the decisions made by the students. These include coursetaking (specifically in mathematics and science), student engagement, student time use, college, and careers. The information on these decisions will allow analysts to examine their relationship with constructs measured in the previous substantive sections. These include, but are not limited to, the relationship between sociodemographic background and coursetaking patterns; the role of parent and teacher support in student engagement; linkages between early orientations toward math and science and pursuit of a STEM career; and how values and expectations influence postsecondary decisions. In this base-year survey, we focus on three decisions: coursetaking, engagement, and time use.

Part VII: The Environment of Students’ Math and Science Classes

A unique feature of HSLS:09 is the in-depth examination of the math and science courses that students take in the ninth grade. In the survey, students will identify their current math and science courses and will answer a battery of questions about their perceptions of their teachers’ effectiveness. This information will be augmented by surveys of the teachers. Using this information in tandem with the other components of the student survey and the assessment, analysts will have rich information that will help identify the influence that teachers and ninth-grade math and science courses have on achievement. We focus on two key components of the classroom: the teacher’s approach to students and the perceived effectiveness of the teacher.


Parent Survey


The parent questionnaire complements the student questionnaire by providing information on the student’s context and history, reporting on parental school involvement, and describing the home environment (e.g., values, expectations, and opportunities). Three research questions frame the parent questionnaire:


  1. What social capital resources are available in the home environment to support children’s academic development and decision making (e.g., parent involvement in child’s decision making; course selection; planning for college or the labor market; shifts in involvement around key transitions – middle to high school, high school to postsecondary life; child’s involvement in extracurricular activities; child’s involvement in community activities [e.g., Girl Scouts, church groups])?

  2. What human capital resources are available in the home environment to support children’s academic development and decision making (e.g., parents’ background in mathematics; parents’ background in science; parents’ attitudes about the importance of math, science, and education in general; parents’ expectations for children’s educational achievement; and parents’ expectations for their child’s career)?

  3. What financial capital resources are available in the home environment to support children’s academic development and decision making (e.g., household income, savings, savings set aside for college education)?

The parent interview will begin with questions about the student’s family situation. The first question will establish how the respondent is related to the ninth-grader. The parent or guardian who is most knowledgeable about the ninth-grader’s schooling will be asked to respond. Experience with this approach in the NCES longitudinal studies has shown that the vast majority of respondents to the parent interview will in fact be parents, most often mothers. However, given the diversity of family structures today, a sizable number will be grandparents, other relatives, or guardians.

Parents will then report on the language background and immigration status of the family and of the child specifically. Once background details are gathered, parents will encounter questions on their involvement with the child’s education and schooling, their discussions and expectations about coursetaking and postsecondary education, and the opportunities they provide, whether these opportunities are related to financial or social capital.

Finally, parents will be asked to provide information that will assist RTI in locating them and their ninth-grader in future follow-ups of the study. Locating items will include the respondent’s name, address, home and work telephone numbers, e-mail address, and Social Security number; the ninth-grader’s Social Security number; the spouse/partner’s name; and, if the ninth-grader has a parent/guardian outside the interviewed household, that person’s name, address, and home and work telephone numbers. Respondents will also be asked for the name, address, and telephone number of a close relative or friend, along with that person’s relationship to the respondent.

Date of interview is also recorded, as well as source of any assistance in completing the questionnaire. All of these elements—locating information, date of interview, information about assistance in interview completion—are taken from the ELS:2002 base-year survey of parents.

School Administrator Questionnaire

The school administrator instrument provides contextual information about the principal, school characteristics, school climate, staffing, and resources.

The purpose of the HSLS:09 School Administrator Questionnaire is to support the study’s main research objectives: How do young adults choose the pathways they do, particularly pathways into science, technology, engineering, and mathematics (STEM) careers? What role does high school (or the high school years) play in students’ ultimate decisions? And, what role does “algebra learning” in high school play in students’ decisions to pursue a career in STEM specifically, and more globally, in providing students with the ability to reason, persist, and achieve throughout life?

To achieve its purpose, the HSLS:09 School Administrator Questionnaire has been designed to provide school-level contextual data and control variables (e.g., public/private high school; tracking) for examining and interpreting students’ decision making and planning processes. And, because HSLS:09 schools will comprise a representative sample, questionnaire data may also be used to draw a descriptive profile of the course and program offerings, reform efforts, and math and science focus of American high schools with 9th and 11th grades.

Although questionnaire items were selected to achieve the overall goals and purposes of the study as mentioned above, selection was guided primarily by the desire to address the following questions specific to schools:

  1. What school structures, policies, practices, and offerings facilitate or inhibit different high school trajectories and decisions (e.g., coursetaking, dropping out, going on to work or college)?

  2. What programs and policies do schools offer to assist students at risk of school failure, including students at risk of dropping out, students transitioning from middle school to high school, and students struggling in math and science?

  3. What are the school-level correlates of high-achieving schools in math and science (e.g., principal training and experience, climate, ease of hiring and retaining qualified math and science teachers, program offerings in math and science, and supports for struggling students)?

  4. What is the math and science focus of schools (e.g., what explicit activities, if any, are schools engaged in to raise students’ interest and performance in math and science)?

  5. Is the math and science focus of schools associated with students’ subsequent performance in math and science and decisions to pursue careers in math and science?

Items were also selected based on the need to collect certain data in students’ 9th- versus 11th-grade year. For example, the question on block scheduling is proposed for the 9th-grade School Administrator Questionnaire because it is subject to change. Asking the item later would not tell us whether the school previously had block scheduling to which the HSLS:09 student was exposed and from which he or she may have benefited. School practices and policies that are less likely to change over time will be asked about in the 11th-grade School Administrator Questionnaire scheduled for the spring of 2012. This division of items also keeps the burden of the questionnaire to 30 minutes. Four sets of items have already been identified for the 11th-grade HSLS School Administrator Questionnaire: (1) student evaluation items; (2) accountability and standardized assessment items; (3) extracurricular activities items, including career exploration; and (4) an additional item on climate.

The School Administrator Questionnaire collects information on the school in five domains: (1) school and student characteristics; (2) teaching staff characteristics; (3) school policies, practices, and programs; (4) school governance and climate; and (5) principal background and experiences. Data gathered in the School Administrator Questionnaire can be merged with data from the student, counselor, and teacher questionnaires and the student cognitive assessment. This linkage will allow researchers to determine the school structures, policies, and practices that facilitate or inhibit different high school trajectories and decisions, such as coursetaking, dropping out of school or going on to college or work, and specifically with respect to decisions concerning the pursuit of STEM careers.

Teacher Survey: Mathematics Teacher Survey, Science Teacher Survey

The teacher surveys complement the student survey by providing school context information and data about the opportunities and resources available to support student achievement. Key questions that can be answered by the teacher component are as follows:

  1. What do math and science teachers do in the classroom that engages and encourages students to pursue STEM pathways, or alternatively, disengages and discourages students from choosing STEM pathways?

    1. Interactions with students

    2. Approaches to teaching (not exact practices)

    3. Expectations, efficacy, beliefs about student potential

    4. Personal background/history (e.g., reasons for entering profession, knowledge and comfort with math/science)


  2. How do math and science teachers perceive the quality and supply of the resources and support they have available to teach effectively?

    1. Induction practices (transition from career training to career)

    2. Textbooks (usage linked to HSLS sample member’s classroom), manipulatives, supplies

    3. Curriculum specialists

    4. Current teaching assignment (e.g., how many different classes the teacher is instructing, how many planning periods the teacher has, if any, the average class size, and whether he/she feels the classes are too big).

The Teacher Questionnaires (mathematics and science) collect information on teachers in five areas: (1) teacher background; (2) teacher attitudes/beliefs; (3) instructional policies/programs; (4) textbook use; and (5) school/departmental climate. Data gathered from the teacher questionnaire can be merged with data from the student assessment and survey. This linkage of data will allow researchers to use the teacher data contextually with the student as the primary unit of analysis. Linkages will be possible between the teacher’s/student’s specific classroom and the textbook materials used in each ninth-grader’s mathematics and science courses.

Counselor Survey

The counselor component is targeted to the head counselor or whomever the head counselor designates as a knowledgeable source about the questionnaire contents. HSLS:09 is not a study of counselors and cannot generalize about counselors as a special population, but employs counselor data contextually to illuminate characteristics and practices of the school, particularly those related to student placement in mathematics and science, and the availability and role of counseling services vis-à-vis the transition into high school and out of high school. Key research questions that the counselor survey may help to address include the following:

  1. How do students get placed into and out of classes?

  2. What counseling resources are available to the students within the school, e.g., how many counselors; what is their student load; what do they do, what are their responsibilities (i.e., course placement, college planning, career planning; transitions from middle school, from high school to postsecondary)? What supports are available for struggling students and for gifted students?

  3. What are the tracking procedures and policies and graduation requirements (e.g., how many credits/courses in English, in math, in science, etc.)?

The School Counselor Questionnaire collects information on the school in six areas: (1) counseling services provided; (2) postsecondary counseling; (3) course placement policies; (4) school-based remediation and enrichment services offered (with a focus on STEM); (5) out-of-school learning experiences/opportunities; and (6) school climate. Data gathered from the counselor questionnaire can be merged with data from the student assessment and survey. This linkage of the data will allow researchers to determine to what degree disparities in educational aspirations, expectations, and outcomes of various student populations are accounted for by differences in their educational experiences. Questions selected for the counselor questionnaire were reviewed by the HSLS:09 Technical Review Panel at its November 2007, January 2008, and January 2009 meetings.

Part I of the Counselor Questionnaire collects information on the counseling services available at the school, including the number of full-time and part-time counselors, certification of those counselors, and goals of the counseling program. This section also includes questions on the frequency of counselor/student interactions.

Part II focuses on services the school offers to assist students with the transition from high school to postsecondary education or work.

Part III of the Counselor Questionnaire addresses the procedures used to place students into their ninth-grade mathematics and science courses. In addition, this section includes questions about requirements for entry into advanced mathematics and science courses.

Part IV emphasizes alternative educational opportunities available for students during both the academic year and in summer, including programs/policies designed to encourage students in STEM, and whether the school offers vocational-technical programs. Also of interest are interventions and services available for both at-risk students and advanced/gifted students.

Part V addresses student opportunities to take STEM courses not offered at school.



b. Sharing survey information (PIA, D, 2, a, ii)

NCES is authorized to use information collected under ESRA for statistical purposes, provided that no published information can be used to identify a specific respondent. Furthermore, only individuals authorized by the IES Director may be permitted to examine the individual reports. To meet these requirements, any direct identifiers present in the data collection are removed from each survey record; then all NCES data files that include indirect identifiers considered potentially personally identifiable undergo a confidentiality edit. This edit is implemented using perturbation techniques to alter some of the responses in the individual data records prior to any release of data or data tabulations. Thus all released information is protected in a consistent way. Further, the perturbation techniques are designed to preserve the level of detail that exists in the information provided by respondents.
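As a generic illustration only—this is not NCES’s actual confidentiality-edit procedure, and all field names and parameters below are hypothetical—one common perturbation technique is random data swapping, which exchanges a field’s value between randomly paired records so that no individual record can be trusted as exact, while marginal distributions are preserved:

```python
# Illustrative sketch of perturbation via random data swapping.
# NOT the actual NCES confidentiality edit; field names, swap rate,
# and seed are hypothetical choices for demonstration.
import random

def swap_values(records, field, swap_fraction=0.05, seed=42):
    """Return a copy of records with `field` swapped between random pairs."""
    rng = random.Random(seed)
    out = [dict(r) for r in records]          # leave the input untouched
    k = max(2, int(len(out) * swap_fraction)) # how many records to involve
    k -= k % 2                                # need an even count to pair up
    chosen = rng.sample(range(len(out)), k)   # distinct records to perturb
    for i in range(0, k, 2):
        a, b = chosen[i], chosen[i + 1]
        out[a][field], out[b][field] = out[b][field], out[a][field]
    return out
```

Because values are exchanged rather than altered, the overall distribution of the swapped field is unchanged—consistent with the goal of preserving the level of detail in released tabulations.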

NCES publishes data collected under ESRA in tabular form. NCES makes data available for statistical uses in either a public use data file or a restricted use data file. In the case of public use data files, following the confidentiality edits, the individual records are subjected to additional disclosure limitation analysis that results in either further data perturbations or coarsening, or both. These analyses and the resulting data are reviewed by the IES Disclosure Review Board to ensure that appropriate protections have been implemented before a recommendation to release the data is made. Since ESRA prohibits the use of any data collected or made available under ESRA in any attempt to identify survey respondents, users of public use data must attest to their willingness to follow the legal requirements imposed under ESRA before they may access the public use data.

In the case of restricted use data, under ESRA the Director of IES may authorize individuals to examine individual respondent records. Using this authority, NCES established a licensing system (see http://nces.ed.gov/statprog/rudman/toc.asp) whereby qualified researchers and their sponsoring institutions or organizations enter into a legally binding contract to protect the confidentiality of the individual respondent records and to abide by the specific terms of section 183 of ESRA and those specified in the license document, the data security plan, and the affidavits of nondisclosure. After meeting these conditions and duly executing the legal documents, qualified researchers are permitted to analyze restricted use data in a physically and electronically secure space.

NCES also makes data available to external researchers through the use of online data tools. With these tools, the data user can produce summary statistics, tabulations, and analyses, but cannot access the individual respondent records. These data tools can use either restricted use or public use data files; however, when a data tool is supported by a restricted use data file, additional protections are included in the software and limitations are imposed on the output to ensure appropriate protection of the underlying data.

c. Impact on respondent privacy (PIA, D, 2, a, iii)

Prior to students being asked to participate in this study, parents are notified about their child’s selection for participation in the study. In addition to describing the purpose of the survey, the use of the data, the legal authority for the data collection, and the confidentiality protections, the notification describes the voluntary nature of the study. In addition, under the Protection of Pupil Rights Act, the student questionnaires have been reviewed to ensure that there are no questionnaire items that include topics specified in law as requiring written parental consent (this includes the seven topics called out in law as being potentially demeaning or invasive toward the student or their family members). Furthermore, anyone granted access to these data, either public or restricted use, must agree that they will make no attempt to identify any individual survey respondent; unauthorized contact with survey respondents is strictly prohibited. (Please see appendix A for the recruitment documents and information provided to prospective respondents, and item 10 below for a more specific discussion of the confidentiality laws.)

3. Use of Improved Information Technology and Burden Reduction

For the first time in the series of NCES longitudinal studies, all questionnaire data will be collected in electronic media only. In addition, the student assessment will also be a computer-assisted two-stage adaptive test. For the student component, we will use the school’s computer lab when available, and, as a backup, we will bring multiple laptops into the school for use by the sampled students. A member of the research team will be present to assist students with computer issues as needed.
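The logic of a two-stage adaptive test can be illustrated schematically: performance on a short routing stage determines which second-stage form the student receives. The sketch below is purely hypothetical—the cut scores, item count, and form names are invented for illustration and are not the actual HSLS:09 assessment specifications:

```python
# Hypothetical sketch of two-stage adaptive routing.
# The cut scores (0.40, 0.70), item count, and form names are
# invented for illustration, not the HSLS:09 specifications.

def route_second_stage(first_stage_correct: int, n_items: int = 15) -> str:
    """Choose a second-stage form based on routing-test performance."""
    proportion = first_stage_correct / n_items
    if proportion < 0.40:
        return "low-difficulty form"
    elif proportion < 0.70:
        return "moderate-difficulty form"
    return "high-difficulty form"
```

Matching second-stage difficulty to demonstrated ability in this way is what lets an adaptive design measure achievement more precisely than a single fixed form of the same length.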

School administrators, teachers, and parents will be given a username and password and will be asked to complete the questionnaire via the Internet. Follow-up for school administrators, teachers, and parents who do not complete the web questionnaire by self-administration will be in the form of computer-assisted telephone interviewing (CATI). Computer control of interviewing offers accurate and efficient management of survey activities, including case management, scheduling of calls, generation of reports on sample disposition, data quality monitoring, interviewer performance, and flow of information between telephone and field operations.

Additional features of the system include (1) online help for each screen to assist interviewers in question administration; (2) full documentation of all instrument components, including variable ranges, formats, record layouts, labels, question wording, and flow logic; (3) capability for creating and processing hierarchical data structures to eliminate data redundancy and conserve computer resources; (4) a scheduler system to manage the flow and assignment of cases to interviewers by time zone, case status, appointment information, and prior cases disposition; (5) an integrated case-level control system to track the status of each sample member across the various data collection activities; (6) automatic audit file creation and timed backup to ensure that, if an interview is terminated prematurely and later restarted, all data entered during the earlier portion of the interview can be retrieved; and (7) a screen library containing the survey instrument as displayed to the interviewer.

4. Efforts to Identify Duplication and Use of Similar Information

Since the inception of its secondary education longitudinal studies program in 1970, NCES has consulted with other federal offices to ensure that the data collected in this important series of longitudinal studies do not duplicate the information from any other national data sources within the U.S. Department of Education or other government agencies. In addition, NCES staff have regularly consulted with nonfederal associations such as the College Board, American Educational Research Association, the American Association of Community Colleges, and other groups to confirm that the data to be collected through this study series are not available from any other sources. These consultations also provided, and continue to provide through the HSLS:09 Technical Review Panel, methodological insights from the results of other studies of secondary and postsecondary students and labor force members, and they ensure that the data collected through HSLS:09 will meet the needs of the federal government and other interested agencies and organizations.

Other longitudinal studies of secondary and postsecondary students (i.e., NLS:72, HS&B, NELS:88, ELS:2002) have been sponsored by NCES in the past. HSLS:09 builds on and extends these studies rather than duplicating them. These earlier studies were conducted during the 1970s, 1980s, 1990s, and the early 2000s and represent educational, employment, and social experiences and environments different from those experienced by the HSLS:09 student sample. In addition to extending prior studies temporally as a time series, HSLS:09 will extend them conceptually. The historical studies do not fully provide the data that are necessary to understand the role of different factors in the development of student commitment to attend higher education and then to take the steps necessary to succeed in college (take the right courses, take courses in specific sequences, etc.). Using items and inventories, the study will enable researchers to move beyond the traditional covariates to ask, “How do students and parents construct their choice set?” Further, HSLS:09 will focus on the factors associated with choosing and persisting in mathematics and science coursetaking and STEM careers. These focal points present a marked difference between HSLS:09 and its predecessor studies.

The only other datasets that offer so large an opportunity to understand the key transitions into postsecondary institutions and/or the world of work are the Department of Labor (Bureau of Labor Statistics) longitudinal cohorts, the National Longitudinal Survey of Youth 1979 and 1997 cohorts (NLSY79, NLSY97). Clearly, however, the NLSY youth cohorts represent temporally earlier cohorts than HSLS:09. There are also important design differences between NLSY79/NLSY97 and HSLS:09 that render them more complementary than duplicative. NLSY is a household-based longitudinal survey; HSLS:09 is school based. For both NLSY cohorts, baseline Armed Services Vocational Aptitude Battery (ASVAB) test data are available, but there is no longitudinal high school achievement measure. While NLSY97 also gathers information from schools (including principal and teacher reports and high school transcripts), it cannot study school processes in the same way as HSLS:09, given its household sampling basis. Any given school contains only one to a handful of NLSY97 sample members, a number that constitutes neither a representative sample of students in the school nor a sufficient number to provide within-school estimates. Thus, although both studies provide important information for understanding the transition from high school to the labor market, HSLS:09 is uniquely able to provide information about educational processes and within-school dynamics and how these affect both school achievement and ultimate labor market outcomes, including outcomes in science, technology, engineering, and mathematics education and occupations.

5. Impact on Small Businesses or Other Small Entities

This section has limited applicability to the proposed data collection effort. Target respondents for HSLS:09 are individuals, typically nested within the institutional context of public and private schools; base-year data collection activities will involve no burden to small businesses or other small entities.

6. Consequences of Collecting the Information Less Frequently

This submission describes the full-scale data collection for the base year of HSLS:09. Base-year data collection, which was preceded by a field test in 2008, will take place in the fall of 2009. First follow-up data collection will take place 2½ years later, in the spring term of 2012, with a field test in 2011. The initial out-of-school follow-up is tentatively scheduled for 3 years thereafter.

The rationale for conducting HSLS:09 is based on a historical national need for information on academic and social growth, school and work transitions, and family formation. In particular, recent education and social welfare reform initiatives, changes in federal policy concerning postsecondary student support, and other interventions necessitate frequent studies. Repeated surveys are also necessary because of rapid changes in the secondary and postsecondary educational environments and the world of work. Indeed, longitudinal information provides better measures of the effects of program, policy, and environmental changes than would multiple cross-sectional studies.

To address this need, NCES began the National Longitudinal Studies Program more than 35 years ago with the National Longitudinal Study of 1972 (NLS:72). This study collected a wide variety of data on students’ family background, schools attended, labor force participation, family formation, and job satisfaction at five data collection points through 1986. NLS:72 was followed approximately 10 years later by High School and Beyond (HS&B), a longitudinal study of two high school cohorts (10th- and 12th-grade students). The National Education Longitudinal Study of 1988 (NELS:88) followed an eighth-grade cohort, which, upon completion in 2000, reflected a modal respondent age of about 26 years. The Education Longitudinal Study of 2002 (ELS:2002) followed a 10th-grade cohort and makes available a 32-year trend line.

The scheduled student follow-ups of HSLS:09 are less frequent than the 2-year interval employed with HS&B, NELS:88, and ELS:2002. The first follow-up takes place at 2½ years after the base year, and the second follow-up 3 years after the first follow-up. However, parent data may be collected at grade 12, and a high school transcripts study to be conducted soon after graduation will provide continuous coursetaking data for the cohort’s high school careers for all on-time or early completers. The initial data collection occurs at the start of the students’ high school careers and will allow researchers to understand decisionmaking processes as they pertain to the selection of STEM-related courses. By following up at the end of the students’ junior year, researchers will be able to measure achievement gain as well as postsecondary planning information. Collecting parent and transcript information in the 12th grade will minimize burden on schools and respondents, while also allowing for further intercohort comparability with the main transition themes of the prior studies. The second follow-up is scheduled to occur in the second year after high school, which is on track with the timing of the predecessor studies, thus facilitating comparisons in the domain of postsecondary access and choice. Despite the changes in grade cohorts and data collection time points for the first two rounds, general trends will still be measurable, since the same key transitions, albeit with slightly different data collection points, will be captured with the HSLS:09 data.

Probably the most cost-efficient and least burdensome method for obtaining continuous data on student careers through the high school years is the collection of school records. In most cases, transcript data are also more accurate than self-reports. High school transcripts were collected for a subsample of the HS&B sophomore cohort, as well as for the entire NELS:88 cohort retained in the study after eighth grade and the entire ELS:2002 sophomore and senior cohorts. The collection of administrative records will take place at the onset of HSLS:09 to identify coursetaking behaviors in grades 8 and 9, and a full transcript study is tentatively scheduled to take place after high school graduation.

7. Special Circumstances Relating to Guidelines of 5 CFR 1320.5

All data collection guidelines in 5 CFR 1320.5 are being followed. No special circumstances of data collection are anticipated.

8. Consultations Outside NCES

Consultations with persons and organizations both internal and external to the National Center for Education Statistics, the U.S. Department of Education (ED), and the federal government have been pursued. In the planning stage for HSLS:09, many efforts were made to obtain critical review and comments on project plans and on interim and final products. We are in the process of convening the Technical Review Panel, which will become the major vehicle for future consultation over the course of the project. Consultants outside ED and members of the Technical Review Panel include the following individuals:

Technical Review Panel

Dr. Clifford Adelman

The Institute for Higher Education Policy

1320 19th Street, NW, Suite 400

Washington, DC 20036

Phone: (202) 861-8223 ext. 228

Fax: (202) 861-9307

E-mail: [email protected]


Dr. Kathy Borman

Department of Anthropology, SOC 107

University of South Florida

4202 Fowler Avenue

Tampa, FL 33620

Phone: (813) 974-9058

E-mail: [email protected]


Dr. Daryl E. Chubin

Director

Center for Advancing Science & Engineering Capacity

American Association for the Advancement of Science (AAAS)

1200 New York Avenue, NW

Washington, DC 20005


Dr. Jeremy Finn

State University of New York at Buffalo

Graduate School of Education

409 Baldy Hall

Buffalo, NY 14260

Phone: (716) 645-2484

E-mail: [email protected]

Dr. Thomas Hoffer

NORC

1155 E. 60th Street

Chicago, IL 60637

Phone: (773) 256-6097

E-mail: [email protected]


Dr. Vinetta Jones

Howard University

525 Bryant Street NW

Academic Support Building

Washington DC 20059

Phone: (202) 806-7340 or (301) 395-5335

E-mail: [email protected]


Dr. Donald Rock

Before 10/15: K11 Shirley Lane

Trenton NJ 08648

Phone: 609-896-2659

After 10/15: 9357 Blind Pass Rd, #503

St. Pete Beach, FL 33706

Phone: (727) 363-3717

E-mail: [email protected]


Dr. James Rosenbaum

Institute for Policy Research

Education and Social Policy

Annenberg Hall 110 EV2610

Evanston, IL 60204

Phone: (847) 491-3795

E-mail: [email protected]


Dr. Russ Rumberger

Gevirtz Graduate School of Education

University of California

Santa Barbara, CA 93106

Phone: (805) 893-3385

E-mail: [email protected]


Dr. Philip Sadler

Harvard-Smithsonian Center for Astrophysics
60 Garden St., MS 71

Office D-315

Cambridge, MA 02138

Phone: (617) 496-4709

Fax: (617) 496-5405

E-mail: [email protected]


Dr. Sharon Senk

Department of Mathematics

Division of Science and Mathematics Education

D320 Wells Hall

East Lansing, MI 48824

Phone: (517) 353-4691 (office)

E‑mail: [email protected]


Dr. Timothy Urdan

Santa Clara University

Department of Psychology

500 El Camino Real

Santa Clara, CA 95053

Phone: (408) 554-4495

Fax: (408) 554-5241

E-mail: [email protected]

Other Consultants Outside ED

Dr. Eric Bettinger

Associate Professor, Economics

Case Western Reserve University

Weatherhead School of Management

10900 Euclid Avenue

Cleveland, OH 44106

Phone: (216) 386-2184

E-mail: [email protected]


Dr. Audrey Champagne

Professor Emerita

University at Albany

Educational Theory and Practice

Education 119

1400 Washington Avenue

Albany, NY 12222

Phone: (518) 442-5982


Dr. Stefanie DeLuca

Assistant Professor

Johns Hopkins University

School of Arts and Sciences

Department of Sociology

532 Mergenthaler Hall

3400 North Charles Street

Baltimore, MD 21218

Phone: (410) 516-7629

E-mail: [email protected]


Dr. Laura Hamilton

RAND Corporation

4570 Fifth Avenue, Suite 600

Pittsburgh, PA 15213

Phone: (412) 683-2300 ext. 4403

E‑mail: [email protected]


Dr. Jacqueline King

Director for Policy Analysis

Division of Programs and Analysis

American Council on Education

Center for Policy Analysis

One Dupont Circle, NW

Washington, DC, 20036

Phone: (202) 939-9551

Fax: 202-785-2990

E-mail: [email protected]


Dr. Joanna Kulikowich

Professor of Education

The Pennsylvania State University

232 CEDAR Building

University Park, PA 16802-3108

Phone: (814) 863-2261

E‑mail: [email protected]


Dr. Daniel McCaffrey

RAND Corporation

4570 Fifth Avenue, Suite 600

Pittsburgh, PA 15213

Phone: (412) 683-2300 ext. 4919

E-mail: [email protected]


Dr. Jeylan Mortimer

University of Minnesota, Department of Sociology

909 Social Sciences Building

267 19th Avenue South

Minneapolis, MN 55455

Room 1014a Social Sciences

Phone: (612) 624-4064

E-mail: [email protected]


Dr. Aaron Pallas

Teachers College

Columbia University

New York, NY 10027

Phone: (646) 228-7414

E-mail: [email protected]


Ms. Senta Raizen

Director

WestEd

National Center For Improving Science Education

1840 Wilson Blvd., Suite 201A

Arlington, VA 22201-3000

Phone: (703) 875-0496

Fax: (703) 875-0479

E-mail: [email protected]

9. Explanation of Any Payment or Gift to Respondents: Field Test Incentive Results and Recommendations for the Main Study

A school-level incentive experiment was conducted in the field test to identify methods to help offset some of the challenges associated with obtaining school cooperation. The experiment compared the effect on school participation of a $500 technology allowance against that of no incentive. NCES’s contractor, RTI International, found that the technology allowance had no impact on schools’ decision-making processes: participation rates were not higher in the incentive group.

Before the schools were initially contacted, the treatment and control groups for the incentive experiment were drawn from the field test school sample, which included 92 schools (90 eligible; 2 ineligible). These schools were divided evenly between “incentive offer” schools and “no incentive offer” schools (46 in each group; both ineligibles were in the “incentive offer” group). The technology allowance was in the form of a check written to the school to be used at the school’s discretion, though HSLS:09 field staff recommended that it be used toward technology for the school to align with the focus of the study. The technology allowance was introduced to school districts and schools in the lead letter and during initial telephone conversations about the study when discussing the benefits of participating. This incentive was re-emphasized to district and school officials on the phone and via email when hesitancy was expressed about participation in HSLS:09.

Initially, 48 of the 90 eligible sampled schools agreed to participate. Seven of those schools declined after initially agreeing, either because they felt the burden of the study was too great (they were unable to make the computerized administration work and were unwilling to have HSLS on site for an extended period using laptops) or because they could not fit HSLS into the school schedule. Thus, RTI administered the field test in a total of 41 schools. Of these, 20 were offered the $500 technology allowance and 21 were not.

The majority of schools that were offered the technology allowance yet still declined to participate gave one of two reasons: (1) too much high-stakes testing is already scheduled in the school, leaving no time to fit in another test, or (2) severe budget and staffing cuts make it difficult to spare the time or staff for another assessment. Additionally, many schools indicated that the incentive was too small to be meaningful to the school. Other school officials told RTI that no monetary incentive would be enough to offset the fact that students are over-tested and there is simply no time in the schedule for another test.

In sum, the nearly equal participation rates of schools offered and not offered the technology allowance lead to the conclusion that the allowance was not an effective tool for gaining cooperation. As a result, eliminating the technology allowance for schools is recommended for the main study.

Though school-level incentives did not prove fruitful, several school districts and schools indicated that providing school-specific results would be a compelling inducement to participate. School-level test results would offer a tangible response to schools’ concerns about the benefits of their participation, answering the school’s question: “what do we get out of this?” Thus, providing school-level test results to schools is recommended for the main study. These results would be provided in aggregate form to preserve the confidentiality promised to students.

The possibility of supplying schools with data depends on having a robust in-school participating sample, and this requirement would be made clear to all schools that have an interest in receiving school-level test results. The school would be required to have at least 10 participating students and an 85 percent participation rate among the eligible sampled students.
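The reporting eligibility rule described above can be sketched as a simple check (an illustrative sketch; the function name is ours, not from the study):

```python
def eligible_for_school_report(participating: int, sampled: int) -> bool:
    """Return True if a school qualifies for school-level test results:
    at least 10 participating students AND an 85 percent participation
    rate among eligible sampled students."""
    if sampled == 0:
        return False
    return participating >= 10 and participating / sampled >= 0.85

# A school with 22 of 25 sampled students participating (88%) qualifies;
# a school with 9 of 10 (90%) does not, because fewer than 10 participated.
```

Both conditions must hold: the 10-student floor protects against disclosure risk in very small samples, while the 85 percent rate guards against nonresponse bias in the reported mean.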

It is recommended that the school-level mathematics assessment score be provided to the school because it is the safest information to provide from a confidentiality and disclosure avoidance perspective and, at the same time, the data of most interest. The score reported would be the school’s mean IRT-estimated number-right score. Based on confidence intervals and statistical significance at .05, results could be reported in three broad bands: below, at, or above the national norm (and various subnational norms) on the same assessment. Alternatively, the scale score itself could be provided, with confidence intervals, for school versus national or subnational comparisons. The following comparisons could be made: school score versus national norm; school score versus mean score for its sector (public, Catholic, or other private); school score versus mean score for its Census region; and, for public schools only (given the low n for cross-classified Catholic and other private schools), school score by the relevant locale code (city, suburban, town, or rural).

Respondent incentives for most sample members were determined in the original OMB package. Table 1 shows the field test and main study incentive structures for each respondent type and, where applicable, a justification for the changes proposed.

There are no proposed changes to incentive decisions for parent, school counselor, or school administrator respondents, or for the school coordinator honorarium. Two changes from the field test model will be implemented based on approvals to the originally submitted OMB package: (1) the school incentive will not be implemented for the main study, after lackluster results in the field test, and (2) students will receive an educational “goodie bag” (a drawstring bag with school supplies such as pencils and pads; estimated value $5 each) instead of the $10 cash incentive offered in the field test.

Based on unforeseen field test results, and thus not discussed in the original OMB package, one change and one addition to the HSLS incentive package are proposed. Specifically, a change in the design of the teacher survey warrants the elimination of the variable teacher incentive, and the field test experience suggests that an honorarium for an IT coordinator at each school is warranted to facilitate the administration of the questionnaire and assessment.

The field test teacher survey had teachers of the sampled 9th-grade students report on the textbook used in each class in which a sampled student was enrolled. Teachers were offered a $25 incentive for participation, including the first textbook report, and, owing to the high and variable burden of providing textbook information, an additional $5 for each textbook reported on thereafter. As described in the portion of this change memo that documents content changes in the questionnaires, it is recommended that the textbook items on the teacher questionnaire be eliminated. As a result, the additional variable incentive is no longer applicable, leaving a single teacher incentive of $25.

Table 1. Incentives

School
  Field test incentive: incentive experiment, $500 vs. no incentive
  Main study incentive: none
  Justification: $500 incentive did not influence school participation

Student
  Field test incentive: 9th- and 12th-grade students received $10
  Main study incentive: educational “goodie bag”
  Justification: approved in original OMB package

Parent
  Field test incentive: none
  Main study incentive: none
  Justification: no change

Math and science teacher
  Field test incentive: $25 plus an additional $5 for each textbook reported on after the 1st
  Main study incentive: $25
  Justification: removed sliding scale due to elimination of textbook reporting

School counselor
  Field test incentive: none
  Main study incentive: none
  Justification: no change

School administrator
  Field test incentive: none
  Main study incentive: none
  Justification: no change

School coordinator
  Field test incentive: $100 base plus $25-$50 based on student participation rates
  Main study incentive: $100 base plus $25-$50 based on student participation rates
  Justification: no change

IT coordinator
  Field test incentive: none (role established in the middle of data collection)
  Main study incentive: proposed honorarium of $50
  Justification: role was critical to the success of the computerization of HSLS

During the field test, the support of an IT person at the school was invaluable in testing the live CD (the means of securely administering the survey and assessment) and in helping the Session Administrator (SA) load and troubleshoot the live CDs in the school computer lab on test day. Several of the schools that backed out of the study after initially agreeing to participate did so because of problems getting the computerized assessment to work in the computer lab. During debriefing meetings at the end of the data collection, the SAs unanimously requested that an IT coordinator be designated at each school, in addition to the school coordinator (SC), and that this IT coordinator be offered a $50 honorarium. The IT coordinator’s duties relative to those of the school coordinator suggest that an amount of about $50 would be appropriate. The wider context is that honoraria are meant as an expression of appreciation, not as literal labor-market payments, but they must be more than a token sum if they are to be effective in enlisting extra help for the study. This addition would not result in an additional cost to the project, as the money for the IT coordinator would come from the savings incurred by not using the variable teacher incentive for the main study.

10. Assurance of Confidentiality Provided to Respondents (PIA, D, 2, iv)

RTI has developed a data security plan (DSP) for HSLS:09 that was acceptable to Neil Russell, the Institute of Education Sciences Disclosure Review Board Chair, and to the computer security review board. The HSLS:09 plan strengthens the confidentiality protection and data security procedures developed for ELS:2002 and represents best-practice survey systems and procedures for protecting respondent confidentiality and securing survey data. An outline of this plan is provided in exhibit 1. The HSLS:09 DSP will

  • establish clear responsibility and accountability for data security and the protection of respondent confidentiality with corporate oversight to ensure adequate investment of resources;

  • detail a structured approach for considering and addressing risk at each step in the survey process and establish mechanisms for monitoring performance and adapting to new security concerns;

  • include technological and procedural solutions that mitigate risk and emphasize the necessary training to capitalize on these approaches; and

  • be supported by the implementation of data security controls recommended by the National Institute of Standards and Technology for protecting federal information systems.

Exhibit 1. HSLS:09 data security plan outline

HSLS:09 Data Security Plan Summary

Maintaining the Data Security Plan

Information Collection Request

Our Promise to Secure Data and Protect Confidentiality

Personally Identifying Information That We Collect and/or Manage

Institutional Review Board Human Subject Protection Requirements

Process for Addressing Survey Participant Concerns

Computing System Summary

General Description of the RTI Networks

General Description of the Data Management, Data Collection, and Data Processing Systems

Integrated Monitoring System

Receipt Control System

Instrument Development and Documentation System

Data Collection System

Document Archive and Data Library

Employee-Level Controls

Security Clearance Procedures

Nondisclosure Affidavit Collection and Storage

Security Awareness Training

Staff Termination/Transfer Procedures

Subcontractor Procedures

Physical Environment Protections

System Access Controls

Survey Data Collection/Management Procedures

Protecting Electronic Media

Encryption

Data Transmission

Storage/Archival/Destruction

Protecting Hard-Copy Media

Internal Hard-Copy Communications

External Communications to Respondents

Handling of Mail Returns, Hard-Copy Student Lists, and Parental Consent Forms

Handling and Transfer of Data Collection Materials

Tracing Operations

Software Security Controls

Data File Development: Disclosure Avoidance Plan

Data Security Monitoring

Survey Protocol Monitoring

System/Data Access Monitoring

Protocol for Reporting Potential Breaches of Confidentiality

Specific Procedures for Field Staff

Under this plan, HSLS:09 will conform fully to federal privacy legislation, including

  • the Privacy Act of 1974 (5 U.S.C. 552a);

  • Section C of the Education Sciences Reform Act of 2002 (P.L. 107-279);

  • the USA Patriot Act of 2001 (P.L. 107-56);

  • the Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. 1232g; 34 CFR Part 99);

  • the Protection of Pupil Rights Amendment (PPRA) (20 U.S.C. § 1232h; 34 CFR Part 98); and

  • Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107–347 (CIPSEA).

Consistent with the Privacy Act, these data will constitute a system of records, per system of records notice 18-13-01, National Center for Education Statistics Longitudinal Studies and the School and Staffing Surveys.

HSLS:09 will also conform to the NCES Restricted Use Data Procedures Manual and NCES Standards and Policies. The plan for maintaining confidentiality includes obtaining signed confidentiality agreements and notarized nondisclosure affidavits from all personnel who will have access to individual identifiers. Each individual working on HSLS:09 will also complete the e-QIP clearance process. The plan also includes annual personnel training regarding the meaning of confidentiality and the procedures associated with maintaining it, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. The training will also cover controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured and operator-manned in-house computing facility.

Invitation letters will be sent to states, districts, and schools describing the voluntary nature of this survey. The material sent will include a brochure to describe the study and to convey the extent to which respondents and their responses will be kept confidential. (Materials are provided in appendix A.)

Contact materials to parents, including the study brochure, explicitly convey the voluntary nature of parent and student participation, and explain the confidential nature of the data, noting that responses will be used for statistical purposes and may not be disclosed in identifiable form except as required by law. All activities are undertaken with parental consent, either active or passive, as required by the school. Additionally, the introductory script for students at the survey session reiterates the voluntary nature of participation. (Materials are provided in appendix A.)

All recruiting materials and procedures will be reviewed and approved by RTI’s Committee for the Protection of Human Subjects prior to sample selection. This committee serves as RTI’s Institutional Review Board (IRB) as required by 45 CFR 46. It is RTI policy that all RTI research involving human subjects, regardless of funding source, undergo IRB review in a manner consistent with the regulations in 45 CFR 46 to ensure that all such RTI studies comply with applicable regulations concerning informed consent, confidentiality, and protection of privacy.

11. Justification for Sensitive Questions

Questions about sensitive behaviors or beliefs are not asked in the proposed HSLS:09 instruments. The exception is a few perhaps marginally sensitive items on the proposed parent questionnaire: the interview protocol contains several items about income and marital and dependency status. Federal regulations governing the administration of such questions, which might be viewed by some as "sensitive" because they request personal or private information, require (a) clear documentation of the need for the information as it relates to the primary purpose of the study, (b) provisions that clearly inform respondents of the voluntary nature of participation in the study, and (c) assurances of confidential treatment of responses.

The collection of data related to income, earnings, assets, and indebtedness is central to understanding key policy issues driving this study. One such issue concerns access to and successful completion of postsecondary education. Information about parental financial assets and liabilities can play an important role in explaining postsecondary expectations and choices, whether one starts and finishes the postsecondary program of one's choice, and whether and at what level financial aid should be made available.

The collection of information about marital status and dependents also facilitates the exploration of key policy issues given the importance of family structure to the home environment and home education support system.

12. Estimates of Annualized Burden Hours and Costs: Changes

Estimates of response burden for the HSLS:09 full-scale data collection activities are shown in tables 2 through 7. Please note that the time students will spend completing the cognitive assessment is not included in the estimated burden. The estimated number of respondents and the corresponding burden hours increase relative to the previously cleared OMB submission for HSLS:09 solely because of the State Supplement portion of the design (funded by NSF). The original design required 800 schools: 600 public, 100 Catholic, and 100 other private. The State Supplement requires an additional 144 participating schools, for a total of 944. The total sample of 9th-grade students given in table 2 is thus calculated as 944 schools times an average of 25 students per school, or 23,600. This also increases the samples of school administrators and school counselors from 800 to 944 each, and the parent sample from 21,800 to 23,600.
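The sample-size arithmetic above can be verified directly (a sketch; the school counts and the 25-students-per-school average are taken from the text):

```python
# Original school design plus the NSF-funded State Supplement.
original_schools = 600 + 100 + 100          # public + Catholic + other private
state_supplement = 144
total_schools = original_schools + state_supplement   # 944 schools

# Student sample: average of 25 sampled 9th graders per school.
students_per_school = 25
student_sample = total_schools * students_per_school  # 23,600 students

print(total_schools, student_sample)  # 944 23600
```

The administrator and counselor samples equal the school count (one of each per school), which is why both rise from 800 to 944.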

Table 2. Estimated burden on 9th-grade students for full-scale study

9th graders, full-scale (2009)
  Sample: 23,600
  Expected response rate: 92 percent
  Number of respondents: 21,713
  Average burden per response: 35 minutes
  Total burden: 12,666 hours


Table 3. Estimated burden on parents for full-scale study

Parents, full-scale (2009)
  Sample: 23,600
  Expected response rate: 92 percent
  Number of respondents: 21,713
  Average burden per response: 30 minutes
  Range of response times: 30 minutes
  Total burden: 10,857 hours


Table 4. Estimated burden on teachers for full-scale study

Teachers (math, science), full-scale (2009)
  Sample: 11,328
  Expected response rate: 92 percent
  Number of respondents: 10,422
  Average burden per response: 30 minutes
  Range of response times: 30 minutes
  Total burden: 5,211 hours

Table 5. Estimated burden on school administrators for full-scale study

School administrators, full-scale (2009)
  Sample: 944
  Expected response rate: 98 percent
  Number of respondents: 925
  Average burden per response: 30 minutes
  Total burden: 463 hours


Table 6. Estimated burden on school counselors for full-scale study

Counselors, full-scale (2009)
  Sample: 944
  Expected response rate: 92 percent
  Number of respondents: 869
  Average burden per response: 30 minutes
  Total burden: 435 hours


Costs to respondents may be estimated as follows. For high school students, we used $7.25 per hour for the main study to estimate the cost to participants, which amounts to a total of $157,419. For parents, assuming a $20 hourly wage, the cost to parent respondents is estimated to be $217,130 for the 2009 base-year main study.
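The parent figure follows from the burden hours in table 3 times the assumed wage; the student figure works out to $7.25 per respondent (the hour basis behind it is not stated in the text, so that line is shown only as a consistency check):

```python
# Parent cost: 21,713 respondents x 30 minutes each at $20/hour.
parent_hours = 21_713 * 30 / 60          # 10,856.5 burden hours
parent_cost = round(parent_hours * 20)   # $217,130

# Student cost: the stated $157,419 equals $7.25 per respondent;
# the burden-hour basis for this figure is not given in the text.
student_cost = round(21_713 * 7.25)      # $157,419
```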

Assuming an hourly wage of $20 for school personnel, main study respondent costs for the teacher component add to $104,200. In the field test, for teachers in the linked design (math and science teachers providing contextual data for student analysis), teacher burden was highly variable because teachers may have had different numbers of classes to provide information for or, more importantly, different numbers of students to evaluate. However, this design will not be implemented in the HSLS full-scale main study. Additionally, teachers participating in the full-scale study will not be providing textbook information for each of their classes, resulting in minimal variability among teachers. Thus, the teacher burden is reduced from the field test experience to 30 minutes per teacher.

Sample sizes for the teacher sample are harder to predict with full accuracy than other sample sizes in HSLS, since the number is not preset for this component and some of the information needed to model probable sample sizes is not available from other national datasets. (Ideally, one would be able to tap comprehensive national statistics for how many science and mathematics teachers, in each school in a simulated stratified probability-proportionate-to-size (PPS) sample, were engaged in teaching ninth-graders.)

For school administrators (the greater part of the questionnaire is typically completed by clerical staff in the school office with the last section completed by the school principal), again assuming a $20 hourly cost, the cost to respondents is $9,250 in the main study.

For the counselor questionnaire, the respondent dollar cost, assuming an average hourly rate of $20 for school employees, is estimated to be $8,680 in the main study.

With the State Data Records Augmentation, there is a possibility that administrative records held at the state level in data warehouses or other such entities will be merged with HSLS data. The process described earlier in this memo will require at least one senior and one junior staff person in each state to prepare the data file and transfer it to NCES. Data received at NCES will be merged with HSLS data by NCES’s contractor, RTI International. The estimated burden presented in table 7 is a rough estimate of the time the data file preparation and transfer process will take and the burden imposed on each state. The expected response rate is likely to be less than 100 percent; however, the numbers in this table assume the best-case scenario for data collection and collaboration with states. Based on the Teacher Compensation Survey pilot study estimates, there is a $42 hourly cost for state data technicians and an $83 hourly cost for state data managers. Thus, for collaborating with HSLS to extract data, transfer data to NCES, and respond to quality checks, the cost is estimated to be $58,800 for the technicians and $12,450 for the managers in the main study, for a total of $71,250.
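The state-augmentation cost figures follow directly from the staffing counts and hourly rates stated above (a sketch using the Teacher Compensation Survey pilot rates):

```python
# Hourly rates from the Teacher Compensation Survey pilot study estimates.
TECH_RATE, MGR_RATE = 42, 83   # dollars per hour

technicians, tech_hours = 20, 70   # 20 technicians, 70 hours each
managers, mgr_hours = 10, 15       # 10 managers, 15 hours each

tech_burden = technicians * tech_hours   # 1,400 burden hours
mgr_burden = managers * mgr_hours        # 150 burden hours

tech_cost = tech_burden * TECH_RATE      # $58,800
mgr_cost = mgr_burden * MGR_RATE         # $12,450
total_cost = tech_cost + mgr_cost        # $71,250
```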

Table 7. Estimated burden on state employees for full-scale study State Augmentation

Full-scale (2009): 10 states; expected response rate 100 percent

20 technicians
  Average burden per response: 70 hours
  Total burden: 1,400 hours
  Total cost: $58,800

10 managers
  Average burden per response: 15 hours
  Total burden: 150 hours
  Total cost: $12,450

Total
  Total burden: 1,550 hours
  Total cost: $71,250


13. Estimates of Other Total Annual Cost Burden

There are no capital, startup, or operating costs to respondents for participation in the project. No equipment, printing, or postage charges will be incurred.

14. Annualized Cost to the Federal Government

The cost estimate to the federal government, given the current scope of the contract, must also be revised. Updated estimated costs for HSLS:09 are shown in table 8, along with the cost estimates supplied in the prior submission. The updated figures are comprehensive estimated costs, including additions to the scope of the contract subsequent to the delivery of the field test OMB package. Additional efforts for the state-representative sample augmentation (recruitment, data collection, weight development, data preparation, etc.) are included.

Table 8. Total costs to NCES (updated)

Costs to NCES                      Updated Amount    Original Amount
Total HSLS:09 base-year costs        $18,226,580       $15,205,684
  Salaries and expenses                  719,900           719,900
  Contract costs                      17,506,680        14,485,784

Field test (2008)                      3,541,587         3,035,673
  Salaries and expenses                  215,648           215,648
  Contract costs                       3,325,939         2,820,025

Full-scale survey (2009)              14,684,993        12,170,011
  Salaries and expenses                  504,252           504,252
  Contract costs                      14,180,741        11,665,759

NOTE: All costs quoted are exclusive of award fee. Field test costs represent Tasks 2 and 5 of the HSLS:09 contract; base-year main study costs include Tasks 1, 3, 4, and 6.


As noted above, Supporting Statement Parts A and B as well as three sets of appendices are attached to this memo: (1) the originally submitted hardcopy versions of the field test questionnaires; (2) the revised field test questionnaires in their proposed form for the main study; and (3) a memorandum summarizing all instrument changes between the field test and main study.

15. Explanation for Program Changes or Adjustments

There is an increase in burden hours due to moving from the field test to a full-scale collection.

16. Plans for Tabulation and Publication and Project Time Schedule

The HSLS:09 field test will be used to test and improve the instrumentation and associated procedures. Reports resulting from the field test and main study will include publications and other significant releases of information relevant to the data collection effort, and both public-use (Data Analysis System [DAS]) and restricted-use (electronic codebook [ECB] microdata) files will be important products of the full-scale survey. The HSLS:09 data will be used by public and private organizations to produce analyses and reports covering a wide range of topics.

Data files will be made available to a variety of organizations and researchers, including offices and programs within the U.S. Department of Education, the Congressional Budget Office, the Department of Health and Human Services, the Department of Labor, the Department of Defense, the National Science Foundation, the American Council on Education, and a number of other education policy and research agencies and organizations. The HSLS:09 contract requires the following reports, publications, or other public information releases:

  • detailed methodological reports (one each for the field test and full-scale survey) describing all aspects of the data collection effort;

  • complete full-scale study data files and documentation for research data users;

  • a DAS for public access to HSLS:09 results;

  • an ECB for restricted access to HSLS:09 microdata; and

  • a “first look” summary of significant descriptive findings for dissemination to a broad audience (the analysis deliverable will include technical appendices).

Final deliverables are scheduled for completion by mid-2010.

The operational schedule for the HSLS:09 field test and full-scale study is presented in table 9.

17. Reason(s) Display of OMB Expiration Date Is Inappropriate

The expiration date for OMB approval of the information collection will be displayed on data collection instruments and materials. No special exception to this requirement is requested.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification statement identified in the Certification for Paperwork Reduction Act Submissions of OMB Form 83-I.

Table 9. HSLS:09 schedule

Activity                             Start      End
Base year
  School sampling                    2/2008     2/2008
  Sample recruitment                 2/2008     11/2009
  List receipt, student sampling     8/2009     11/2009
  Student/staff data collection      9/2009     11/2009
  Parent data collection             9/2009     2/2010
  Nonresponse follow-up              10/2009    2/2010



1 RTI International is a trade name of the Research Triangle Institute.

