TRP Minutes

HSLS 09 Appendix_I_TRP_Meeting_Minutes_APRIL8.doc

High School Longitudinal Study of 2009 (HSLS:09)

OMB: 1850-0852



Appendix I
TRP Meeting Minutes









High School Longitudinal Study of 2009 (HSLS:09)



Technical Review Panel Meeting

November 28–29, 2007



Meeting Summary








Submitted by:

RTI International

PO Box 12194

Research Triangle Park, NC 27709


Prepared December 14, 2007


High School Longitudinal Study of 2009 (HSLS:09)
Technical Review Panel Meeting Summary

November 28–29, 2007


Meeting Attendees

Clifford Adelman

Eric Banilower

Kathy Borman

Robert Bozick

Jack Buckley

Laura Burns

James Chromy

Daryl Chubin

Steve Ferrara

Jeremy Finn

Howard Fleischman

James Griffith

Becki Herman

Debbie Herget

Thomas Hoffer

Lisa Hudson

Tracy Hunt-White

Steven Ingels

Vinetta Jones

Steve Leinwand

Laura LoGerfo

Rochelle Martinez

Edith McArthur

Stacey Merola

Jeffrey Owings

Gary Phillips

Daniel Pratt

John Riccobono

Donald Rock

Michael Ross

Russ Rumberger

Phillip Sadler

Marilyn Seastrom

Sharon Senk

Marsha Silverberg

Timothy Urdan

Thomas Weko

John Wirt


November 28, 2007

Welcome From NCES, Introduction to HSLS:09, Components and Study Goals For Base Year and Beyond

Laura LoGerfo

  • Purpose of the meeting is to seek expert advice on what to expect in preparing the High School Longitudinal Study of 2009 (HSLS:09) and how to overcome hurdles when planning for this study.

  • HSLS:09 will continue in the tradition of NCES high school studies in some respects: longitudinal, data on access and choice, equity, transitions to postsecondary education or work.

  • In other respects HSLS:09 represents a new direction for the NCES high school studies: shifting focus to decisionmaking that affects transition to postsecondary education and work, and within that context, a special emphasis on science, technology, engineering, and math (STEM); web-based administration; collecting more information from administrative records; potential for state-representative samples in some states.

  • Items for discussion include the following:

    • how should HSLS:09 consider STEM in the context of the study;

    • state linkages;

    • digging deeper into administrative records;

    • new developments (e.g., computer administration); and

    • sample design.

Introductions, Format, and Objectives of the Technical Review Panel Meeting, HSLS:09 Schedule, through Base Year Full-scale

Daniel Pratt

  • RTI would like to get the Technical Review Panel’s (TRP) input on the following topics:

    • baseline assessments;

    • components of the study in general;

    • instruments (i.e., questionnaires);

    • sample design; and

    • data collection plan.

  • HSLS:09 focus is on math and science.

    • A great deal of effort has been placed on improving those skills.

    • The American Competitiveness Initiative focuses on U.S. competitiveness with other countries in math and science.

  • HSLS:09 is also a study of student choices and the decision-making process.

  • Fall 9th graders were chosen for HSLS:09 to enable us to track students from their early high school experiences on. Follow-up is planned for 2 years later when most of the students are spring 11th graders, though sample members will be followed outside of school as well. The spring 11th grade timing was selected in part because of concerns about the test-taking motivation and engagement of high school seniors. Dr. Rosenbaum (via e-mail) asked for consideration of a spring 12th grade follow-up as well.

  • The study will follow the same students over the years, including students who leave the originally sampled schools, with plans to conduct an out-of-school component.

Session Action Items/Additional Points to Consider:

  • Send the research questions to the panel for consideration. This will make it easier to provide feedback.

Sample Design, Field Test, and Full-scale Study

Steven Ingels

General Design Specification

  • Main study will:

    • sample 20,000 students in 9th grade from about 800 schools, plus a supplemental sample of approximately 1,800 Asian students;

    • draw a school sample of about 1,350 schools; and

    • recruit 800 schools and approximately 25 students per school.

  • Field test will:

    • sample about 1,200 students from 9th grade and 1,200 from 12th grade;

    • yield 1,000 student surveys per grade; and

    • sample 85 schools to yield 50 participating schools.

  • RTI estimates that 10% of the 9th graders will be repeating the grade.

School Eligibility

  • Located in one of the 50 states or the District of Columbia

  • Public schools, Catholic schools, and other private schools

  • State department of education schools (represented on the Common Core of Data (CCD) and Private School Survey (PSS))

  • Include fall 9th and spring 11th grade students (for the main study) and fall 12th grade students (for the field test)

  • The following schools will be excluded:

    • ungraded schools;

    • Bureau of Indian Affairs (BIA) schools;

    • special education schools;

    • area vocational schools that do not enroll students directly; and

    • Department of Defense (DoD) schools.

School Sampling Frame and Stratification Plan

  • Public schools (CCD 2005–2006); private schools (PSS 2005–2006)

  • Sample stratification: school type (public, Catholic, other private schools), census region (Northeast, Midwest, South, and West), and locality (city, suburban, town, and rural)

  • There will be enough schools to report across all nine census divisions and a variable will be constructed to support analysis both by region (4) and division (9).

Student Sampling Assumptions and Yield Rates

  • Oversampling will be conducted for Asian students because of the precision requirements.

  • There are no explicit attempts made to oversample special education students.

  • Students from the sample who cannot respond, even with accommodations, will be included contextually through the parent questionnaire, teacher reports, administrative records, etc.

  • Students from the sample who participate in the 9th grade will be followed, as will those who were selected but did not participate in 9th grade.

  • While yield rates reflect an additive attrition with each round of data collection, sample members will be followed longitudinally even if they miss a data point during the study.

Teacher, School Counselor, and Parent Samples

  • One math and one science teacher will be selected for each student (per the current contract).

  • One school counselor will be selected from each school (preferably the lead school counselor).

  • A parent most knowledgeable about the student’s school situation will be selected. The parents will be asked directly who knows the most about the school situation, so parents will self-select into the study (as was done in NELS:88 and ELS:2002).

Session Action Items/Additional Points to Consider:

  • About 5% of 9th graders are in schools that span grades 7–9. Those students will not be sampled; the study will not select schools that do not have grades 9–11.

  • Use the new OMB race/ethnicity categories—Asians are no longer combined with Pacific Islanders.

  • When documenting special education students who cannot be validly assessed use the NAEP excluded student questionnaire as a model.

  • Consider doing a census of all math and science teachers while also creating a link between the student and his or her specific teachers.

  • Guidance counselors are now referred to as school counselors.

Overall Designs for Math and Science Assessments

Gary Phillips

  • 9th graders will complete the assessments on computers. They will be re-assessed as 11th graders.

  • A 2-stage 30-item math test focused on algebra and algebraic reasoning is proposed.

  • A 20-item science test focused on scientific inquiry is proposed.

  • A total of 50 minutes of testing time is proposed.
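As background for the routing concept in the proposed 2-stage math test, the following is a minimal sketch: the stage-1 (routing) score determines which second-stage form a student receives. The cutoffs and form names here are illustrative assumptions, not the operational HSLS:09 design.

```python
def route_second_stage(stage1_correct: int, n_stage1: int = 10) -> str:
    """Two-stage adaptive routing sketch: the score on a short routing
    stage determines the difficulty of the second-stage form.
    The 40%/70% cutoffs are illustrative, not operational values."""
    share = stage1_correct / n_stage1
    if share < 0.4:
        return "low-difficulty form"
    if share < 0.7:
        return "middle-difficulty form"
    return "high-difficulty form"
```

Because each student then sees items near his or her own ability level, more measurement information is gained per item than with a fixed-form test of the same length.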

Session Action Items/Additional Points to Consider

  • There were concerns about the small number of items on the assessments. Measuring change over a 3-year period is difficult with a 30-item assessment. A student may show no change only because the measurement is not fine enough to detect relatively modest gains. Don Rock indicated that to measure change, at least 40 items were needed for the math test. Also, each item would need to be administered to at least 300 field test students. A number of possible solutions were suggested:

  1. Eliminate the science assessment. Don Rock indicated that he does not expect the science assessment to measure much change between 9th and 11th grade with only 20 items, and because it is not an adaptive (i.e., 2-stage) test. Most of the panelists were comfortable with this concession, although some were not, given the special focus on STEM.

  2. Split the sample. Administer either math or science assessment, but not both, to any given student.

  3. Lengthen the in-school session. There was concern that this would make recruiting schools and students much more difficult.

  4. Remove administration of the student questionnaire from the in-school session, and prompt students to complete the questionnaire on their own time. The incentive would be tied to completion of the questionnaire. This would allow the full 90 minutes for the administration of the assessments. However, there was concern about the effect this would have on the response rate.

  • Concern was expressed over the ordering of the administration of the survey questionnaire and the assessment. The proposal presented to the panel is to have the survey first, followed by the assessments. The following concerns were expressed:

  1. Some were concerned that students may not perform as well as they could on the assessments because they are tired from completing the questionnaire. Also, answers to questions (e.g., race/ethnicity) may have a priming effect on the assessments.

  2. It was mentioned that the priming effect could occur on the questionnaire if the assessments were administered first. For example, if the student performed poorly on the assessment, then their answers on the questionnaire could be influenced.

  3. One alternative mentioned was to administer the demographic and critical questionnaire items first, then administer the assessments, and conclude with the remaining questionnaire items.

  4. Another recommendation was to divide the session into two 45-minute blocks and allow parents and/or students to choose which class period(s) to forgo to take part in HSLS:09 (e.g., study hall, physical education). Concern was expressed about adding burden on the school with this approach.

  5. It was suggested that different orderings could be tested during the field test (e.g., survey first, then assessment; or assessment first, then survey, etc.)

  • Don Rock recommended adding some questions at the 7th grade level because at the beginning of 9th grade, students are still essentially 8th graders. Need 7th grade items to distinguish the lowest achievers.

    • Be sensitive to overall floor and ceiling and the ability to estimate change

    • Concern was expressed about using the schools’ computers. A suggestion was made to try to avoid technical problems by bringing laptops to schools.

Mathematics Assessment

Steve Leinwand

  • Algebra is the focus.

  • Math advisory group used to develop the assessment items includes:

    • Ann Shannon (formerly consultant of America’s Choice);

    • Mark Saul (rotator at the National Science Foundation [NSF]);

    • Katherine Halvorsen (Smith College);

    • John Dossey (former president, National Council of Teachers of Mathematics [NCTM]);

    • Hyman Bass (University of Michigan); and

    • Joan Leitzel (former president of the University of New Hampshire).

  • Testing time of 25–30 minutes is proposed; each student receives 30 items.

  • All items are 4-option multiple choice questions.

  • Plan to allow students to use their own graphing calculators or an online version of an equivalent calculator accessible on the computer.

  • Approximate distribution of items is as follows: ¼ low, ½ moderate, and ¼ high complexity.

Session Action Items/Additional Points to Consider

  • The following concerns were expressed about the use of a graphing calculator:

    • Some feel it is a distraction.

    • Others questioned how often graphing calculators are used in the 8th grade and in algebra. Some indicated that they are not used often, so many students would not be familiar with them. Also, allowing the use of a graphing calculator could be an equity issue if some students have used them and others have not.

    • Some wondered what the correlation is between understanding math concepts and how well a student uses a graphing calculator.

    • It was pointed out that if students are not allowed to use graphing calculators in 9th grade, but are then allowed to use them in 11th grade, the 9th-11th grade linking items would no longer be comparable.

    • It was noted that the Program for International Student Assessment (PISA) allows the use of 4-function calculators.

  • The recommendation from the majority of the panel was not to use graphing calculators with the 9th grade test (the assumption is that even if used at 11th grade, no 11th grade test items requiring or benefiting from graphing calculators will appear on the 9th grade assessment).

  • There was no clear consensus on allowing use of a graphing calculator in 11th grade. It was suggested that an experiment in the field test may be helpful in determining the best approach.

  • A middle-ground alternative supported by some was to allow the use of a 4-function calculator in both the 9th and 11th grades.

  • There was a question as to what incentive students have to take the test seriously.

  • Concern was raised about a ceiling effect given the small number of items. The concern was that the test in 11th grade may not show much growth, especially for those pursuing STEM.

  • Concern was raised about being able to make distinctions at the lowest levels. Items that apply algebra to real-life situations are needed.

  • Don Rock indicated that since the math test would be adaptive, one minute per item was reasonable because students would be getting items appropriate for their level.

  • There was a question about the race/ethnicity/gender of the math advisory group and the item writers. Concern was raised about the items, given the lack of racial/ethnic diversity of the math advisory group. It was noted that the item writers were racially diverse. Also, men and women are both well represented among the advisory group and item writers.

Science Assessment

Steve Ferrara

  • Scientific inquiry is the focus: inquiry-based skills, not analytical skills.

  • Science was defined using the National Assessment of Educational Progress (NAEP) definition.

  • Three approaches to inquiry are: inquiry-specific, content-minimized, and content-provided items.

  • 20 minutes of testing time is proposed.

  • Tests are fixed length.

  • Average of 1 minute per item is proposed.

  • All items are 4-option multiple choice.

  • It will be administered online.

Session Action Items/Additional Points to Consider

  • A question was raised about whether test items were a test of paradigms or of science. The panel questioned the focus on inquiry rather than content.

  • There were concerns that the science assessment would actually be a test of reading ability. Some were concerned that students with low reading ability would be disadvantaged.

  • Some panelists favored replacing the science assessment with a literacy assessment.

  • To test on science content, a suggestion was made to test everyone on middle school science content. Possible sources for content standards were: (1) NSF’s classifications of science or middle school content for 9th grade items, (2) the National Research Council’s content standards for science (grades 5–8), and (3) the American Association for the Advancement of Science’s (AAAS) benchmarks for grades 5–8.

  • A question was raised about the value of having a 20-item science test at the expense of the math test.

  • One suggestion was to consider removing the science assessment altogether.

  • Concern was expressed about the importance of scientific inquiry as it relates to science.

  • Concern was expressed that 1 minute per item would not be enough time for some students.

Item Screening, Calibration, Estimation

Gary Phillips

  • Field test results will be used to precalibrate for the full-scale assessments. The results of the full-scale assessments will then be postcalibrated.

  • A three-parameter Item Response Theory (IRT) model will be used to estimate a score for each examinee. There will be two types of scales: a scale score, calculated using Bayesian estimates, and a domain score, which estimates what the student’s score would be if he or she had answered all the items.

  • Three other possibilities that remove the measurement error and could potentially be used with HSLS:09 are: (1) plausible values (using Trends in International Mathematics and Science Study [TIMSS] and NAEP methodology), (2) marginal maximum likelihood (National Assessment of Adult Literacy [NAAL] methodology), and (3) Murray Aitkin’s 4-level hierarchical linear model.

  • Don Rock expects that scores with a 2-stage adaptive math test would be as reliable as NAEP’s plausible values because there are more items per student and the selection of items is based on a routing test.

  • RTI needs to determine what statistical analysis will be used.
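To illustrate the scoring approach discussed above, the sketch below shows a 3-parameter logistic (3PL) IRT response function, a Bayesian (expected a posteriori) scale-score estimate, and a domain score projected over the item pool. The item parameters, prior, and quadrature grid are illustrative assumptions, not HSLS:09 specifications.

```python
import math

def p_3pl(theta, a, b, c):
    """3PL IRT model: probability of a correct response given ability theta,
    discrimination a, difficulty b, and guessing (lower asymptote) c."""
    return c + (1.0 - c) / (1.0 + math.exp(-1.7 * a * (theta - b)))

def eap_score(responses, items, n_points=81):
    """Bayesian (EAP) scale score: posterior mean of theta over an N(0,1)
    prior, given a 0/1 response vector and items as (a, b, c) tuples."""
    lo, hi = -4.0, 4.0
    num = den = 0.0
    for k in range(n_points):
        theta = lo + (hi - lo) * k / (n_points - 1)
        w = math.exp(-0.5 * theta * theta)  # unnormalized N(0,1) prior weight
        for u, (a, b, c) in zip(responses, items):
            p = p_3pl(theta, a, b, c)
            w *= p if u else (1.0 - p)      # likelihood of the response
        num += theta * w
        den += w
    return num / den

def domain_score(theta_hat, items):
    """Domain score sketch: expected percent correct if the student had
    answered every item in the pool."""
    return 100.0 * sum(p_3pl(theta_hat, a, b, c) for a, b, c in items) / len(items)
```

Here `eap_score` corresponds to the Bayesian scale score and `domain_score` to the projection onto the full item pool, matching the two scale types described above.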

Data Collection Challenges and Plans, Including Field Test Experiments

Debbie Herget

Recruitment

  • RTI has gained endorsements from 19 national organizations that support its efforts and has solicited endorsements to add validity. As we move forward with specific states, we will try to get buy-in from state organizations. Cliff Adelman suggested tapping the Data Quality Campaign for its endorsement.

  • An incentive experiment will be conducted during the field test: $500 will be given to half of the schools that participate; the others will not receive an incentive. The money will be listed as a technology allowance in the budget.

    • OMB often requires that an experiment about incentives be conducted before they will approve them.

  • An incentive experiment will also be conducted with students during the field test. Participating students from half of the schools will receive $20 for participating in the study. Similar experiments conducted for ELS:2002 found that student incentives are effective.

  • Windwalker mails information packets, and RTI follows up with a phone call.

  • States are notified about the districts and schools being recruited.

  • Diocesan approval will be sought for Catholic schools; other private schools will be contacted directly.

  • In cases of very small schools, if the school requests that all students be assessed, HSLS:09 will accommodate.

Challenges

  • Overall test burden

  • Timing of assessment

  • Length of assessment

  • Lack of benefit to individual schools and students

  • Voluntary nature of study plus the lack of name recognition

  • Not all parents will speak English; parental consent forms will be translated into various Asian languages and Spanish.

Session Action Items/Additional Points to Consider:

  • The cash incentive to schools was discussed and the following points were mentioned:

    • Some thought that $500 was too little. One study that was mentioned paid schools $1,200 to $1,500. Another study found that $1,000 is greatly appreciated by urban schools, but not as much by suburban schools in more affluent communities.

    • A panelist recommended giving the $500 incentive to the math department.

  • One suggestion is to consider offering school-level results as an incentive for schools to participate (results should describe how the school fits into national or regional performance, even if not reported at the school level).

    • If results are offered at the school level, the principal would need to agree not to distribute the information outside the school, to comply with regulations.

    • Concern was expressed that parents may be disinclined to let their student participate if they know that this information would be shared with the school. Parents would need to be notified. Also, results may bias performance in 11th grade.

  • There was some question as to how many teachers would respond using the Web survey. The Schools and Staffing Survey found that the Web survey was not used often by teachers. However, more schools have computers now, so it may be more successful this time.

  • It was noted that half of Asian students live in second-language households. HSLS:09 needs to translate for these populations, especially given the oversampling of Asians.

  • It was noted that year-round schools start in July. In these schools, the assessment will be administered 3 months into the school year so these students may have higher scores than students in schools on traditional calendars.

  • There is a need to plan for multiple sessions in multitrack schools.

  • One panelist asked if block scheduling had been taken into account when designing the data collection strategy. This is one reason for the 1½-hour session.


November 29, 2007

Instrument Design: Plans and Issues

Steven Ingels

  • HSLS:09 will serve two primary functions:

  1. General purpose dataset to investigate factors related to students’ academic and social development; and

  2. Special stress on STEM antecedents and outcomes.

  • Sources of information:

    • school administrative records

    • 8th grade math-science records data in 2009

    • 9-12th grade transcripts after graduation

    • linkable external data sources

    • school administrator questionnaire

    • school counselor questionnaire

    • student questionnaire

    • student assessment in math and science

    • teacher questionnaire

    • parent questionnaire

  • For the field test, the sample size would allow for matrix sampling of survey questionnaire items to increase the number of items that can be evaluated.

  • 9th grade teachers will not be asked to rate the sampled students because they are unlikely to know the students well enough at the beginning of the school year.

  • Transcripts help fill gaps and provide continuous information for grades 9-12.

  • States need to buy in to confidentiality requirements to participate; they will not be identifiable.

Session Action Items/Additional Points to Consider:

  • Link external data sources that will enrich data without having additional burden on students.

  • Is there a linkage with state databases for student-level data?

  • How will you choose the teachers selected to participate?

    • Will you do a census of all math and science teachers? There was a motion to survey all math and science teachers to get a sense of school climate.

    • There was an additional suggestion to survey all math teachers but to include a question about whether they teach 9th grade; it would also be important to ask the teachers about the class in which they have the sampled student.

    • Do they have to be math or science teachers?

    • There was a suggestion that it was still important to get the science teachers, even if the science assessment is cancelled.

    • Will they be the teachers of the students sampled?

  • Concerns were expressed that if students cannot be linked to a particular classroom, the value of those types of questions is diminished.

Student Questionnaire

Robert Bozick

  • Survey will be online.

  • It will be administered in English.

  • An emphasis was placed on student decisionmaking, using a social psychology framework.

Session Action Items/Additional Points to Consider:

  • RTI agreed to send out a diagram of the conceptual model with a list of domain priorities. The panel members are going to send responses to RTI about the diagram and domains to remove or rethink.

  • Immigration module issues are the following:

    • The question “What is your first language?” should be considered.

    • Rather than asking about years in the United States, it may be better to ask students what grade they started in school in the United States.

    • To measure language dominance, it may be better to ask the students how often they speak a specific language with their peers or with their mother.

    • The discrimination item should not be asked just of immigrants; if asked only of immigrants, it should be removed.

  • 9th grade retention domain:

    • Consider including these options: “I don’t know” or “my parent made me.”

  • Interests and Goals Module:

    • Distinctions should be made among aspirations and expectations, ideas, and plans.

    • The sequence of the questions is important and should be considered when organizing the instrument.

    • The interest and goals section should be decreased because there are too many items, though Vinetta Jones commented that the attitude questions are key, especially considering that minority students often have fewer opportunities for action.

    • Include as response options: positive, neutral, negative, and “just don’t care.”

    • There is interest in the consistency of expectations across time.

    • There is a desire to figure out if students are thinking about the future.

    • Rather than asking about the “certainty” of future plans, it may be better to get at expectations unobtrusively. For example, “What do you plan to do the year after graduation?” and “Would you be disappointed if you were not a college graduate after age 30?”

    • There was a suggestion that the pair of items about aspirations and expectations from the High School and Beyond study (HS&B) may be better measures than those proposed (used in NELS and ELS).

    • Interest in school:

    • If you are asking whether they are interested in school, follow the question up with why (the National Education Longitudinal Study [NELS] asks this question).

    • Consider adding items about their attitudes toward school.

    • Limit the extrinsic motivation items.

    • Interest in math and science:

    • Consider asking about role models.

    • The NSF may have some items specifically for math and/or science interest that would be useful.

    • Planned coursetaking:

    • What are students’ perceptions of course offerings? How do students perceive their placement in the school? There is often a dissonance between students’ location in the school and the track they are on based on records.

    • May need to cut aspirations a bit to make sure we get behavior. Behavior is important from a policy perspective.

    • In the interest and goals domain and planned coursetaking construct, keep the data element related to perception of current high school program.

    • Values:

    • Utility gets closer to the areas being examined.

    • Need to put something in the stem about students’ ranking of activities.

    • For the life values data elements within the interests and goals domain, find out how they have been used and aggregated by researchers and select from them on that basis.

    • Consider that there is a difference between extracurricular activities and after-school programs and they have different impacts.

    • Identity formation:

    • For role models, recommend asking about an adult who really influences the student’s choices.

    • Ask the question: “How do you see yourself in the future?”

    • Include a question about the future sense of self such as “People like me do this…” Tim will look for samples and send them to the group.

  • Adaptation to high school:

    • This area is weak currently.

    • What does “adaptation to high school” mean? This appears in the behavior and feedback.

    • Marsha Silverberg is going to send some comments about this area.

    • Consider using Melissa Roderick’s transition to middle school items.

    • Preload on current courses so you ask about just one course they are in.

  • Math and science coursetaking:

    • Include a question that asks: “How much math do you intend to take?”

    • Include a question about whether the student is currently taking math. Some students in schools with block scheduling may not take math until the spring semester.

  • Grades:

    • Grades are useful for secondary data analysis.

    • We could ask them what grade they received in 8th grade math.

    • May be better to just use the 8th grade scores that we get from the administrative records.

  • School-sponsored activities:

    • The ELS:2002 inventory is extremely long. You may want to distinguish sports from nonsports.

    • This construct links to the items about how you value activities.

    • Since it is fall of 9th grade and students may not have established themselves in extracurricular activities, you may need to ask about what they are planning to do.

  • School climate:

    • Consider asking one question about current math/science class climate.

  • Employment:

    • The panelists felt that it was very important to ask about student employment.

    • Time investment was considered the most important; type of job and relevance to occupational goals were not considered as important.

  • Social and cultural experiences:

    • This construct was felt to be redundant with others; panelists suggested removing it.

  • Family and home life:

    • The panelists felt that all the questions in this construct are high priority.

  • Positive/negative experiences with STEM:

    • Get some STEM-related questions from the National Math Panel.

  • Remove the regulated learning and locus of control data elements.

  • Remove the engagement in school item since there are better measures of this.

  • A question about homework should be included.

  • Consider reviewing Uri Treisman’s study on how Asian and Black students interact with their respective groups; reviewing this might provide more information on the STEM material.

    • Distinguish Southeast Asian from East Asian, etc. (cultural identity is key to much of this)

  • The perceived obstacles construct includes items that deal with the obstacles the school puts in the student’s way (zero tolerance policies, teachers and counselors, school rules).

  • For the next meeting you may want to consider how the data will be analyzed.

Parent Questionnaire

Laura Burns

  • The parent questionnaire will take 30 minutes on average, with 5 of those minutes to collect information that will be used to locate students and parents in the follow-ups.

  • Parents will complete the survey online. If they are unable to complete the survey online, they will be interviewed over the phone.

  • It will be translated into Spanish.

Session Action Items/Additional Points to Consider:

  • General

    • Due to the oversampling of Asian students, consider translating the parent questionnaire into Asian languages, particularly Chinese, Korean, and Vietnamese.

    • Limit the number of items. One panelist answered all of the questions, including the immigrant questions; it took him 54 minutes to complete.

    • Consider adding an item about residential mobility. Refer to the Census Bureau’s study of migration.

    • Consider adding an item that asks if the child participates in a gifted and talented program.

    • Consider adding questions about social capital (e.g., whether parent knows friends, parents of friends).

    • Consider adding questions about parenting (e.g., monitoring).

    • Consider adding questions about how far in school parents want the student to go and how far they think the student will go.

    • Consider adding questions about the degree to which parents are informed about the immediate and future paths in high school (e.g., whether a high school preparation night was offered in 8th grade and whether the parent attended; the parent’s knowledge of IB, AP, GATE).

    • Consider adding a question about how parents learn to negotiate the system (e.g., has the school done anything to inform them of what their student needs to do to get into a 4-year college?) and whether they understand the consequences of decisions.

    • Consider asking parents about their ideas about coursetaking.

  • Family structure domain:

    • Panelists suggested reducing the number of questions on family structure. They advised referring to the American Community Survey or ECLS for examples of how to collect this information efficiently.

    • The panelists agreed that family size is important.

    • Stability of family structure over time is important. Proposed family structure questions are just a snapshot.

    • Consider adding an item that asks for the student’s total number of siblings and the student’s birth order.

    • Distinguishing among full, half, and step-siblings is not important.

  • Race/ethnicity

    • Use the current OMB race/ethnicity categories.

    • Consider not asking for Hispanic subgroups if there will not be enough students in each subgroup to be analytically useful.

  • Language:

    • It is important to know if the household is monolingual or bilingual, and if bilingual, what the dominant language is. Language proficiency is also important.

    • Some felt the proposed questions work. Others thought there were too many questions on language.

    • The absence of the task-oriented language questions (from NELS:88 and ELS:2002) was questioned.

  • Religion:

    • Most panelists recommended eliminating the questions about religious denomination. However, it was noted that analysts may be interested in the religious denomination in the context of school choice (e.g., denomination of students in Catholic private schools).

    • Some felt that religiosity is more important than denomination.

  • Respondent’s occupation:

    • There is interest in getting a sense of whether the parent uses STEM on the job. Some thought it was better to ask for the parent’s opinion. Others were concerned that some parents who use basic math skills (e.g., cashiers) may indicate that they use STEM on the job.

    • Ask for industry to understand the context of the occupation.

    • There was a suggestion to ask immigrants about their occupation in their country of origin. Others felt that the highest level of education and the field in which the degree was earned were all that was needed.

  • Wealth:

    • Recommended using items on wealth from the NCES postsecondary study instead of the proposed items (e.g., assets $10,000 or more).

  • Stop-out:

    • The utility of the item about reasons for dropping out was questioned.

  • Changing schools:

    • The utility of the item about reasons for changing schools was questioned.

  • Academic classes outside of school:

    • The question about whether the student has taken any courses outside of school was considered very important.

    • A suggestion was made to add tutoring to the question about education outside of school.

  • Disability:

    • The panelists suggested asking about whether or not the child has ever had an IEP.

    • Collect data on the type of disability from the school not the parent.

  • Siblings as role models:

    • For the sibling as role models construct, consider prefacing the questions with the following phrase, “Of the siblings mentioned previously…” or move this question into the family structure section.

  • Parent-school relationship:

    • The panelists suggested that the focus should be on parent advocacy and knowledge of the appropriate avenues for advocacy.

    • How does the parent learn to negotiate the school system?

    • Does the parent know how to be an effective advocate for his/her child? The question about whether the parent has ever requested a particular teacher speaks to this. Ask why or why not as a follow-up to this item.

    • Consider limiting the question about requesting a particular teacher or course to the 8th and 9th grades; also, consider adding a why or why not follow-up question.

  • Availability/exercise of school choice:

    • Consider trimming the school choice section.

    • At a bare minimum, want to know if parent chose the school or not.

    • Some thought reasons for choosing the school were less important.

    • Ask about school satisfaction in the 11th grade since in the fall of 9th grade parents may not have an opinion yet.

  • Parents’ attitudes about math and science:

    • Trim these items.

  • Religion and science:

    • Eliminate the questions about evolution, intelligent design, and global warming. They are off-putting.

  • Encouragement of STEM careers

    • The panel prefers the focus be on careers rather than skills.

    • There was some doubt about whether parents would discourage their student from pursuing a STEM career.

    • Refer to Steve Barley’s (Stanford) work on how people become technicians.

  • Homework:

    • Consider eliminating the question that asks parents to rate how difficult various subjects are for the student.

  • Discussions with 9th grader:

    • Trim items by eliminating redundancy.

  • Exposure to work:

    • Eliminate “Take your child to work day” item.

  • Financing postsecondary education:

    • Refer to NCES postsecondary studies for questions about financing postsecondary education.

    • Add “college savings account” as way to finance postsecondary education.

  • Locating for future follow-up:

    • Ask for parent cell phone numbers.

Teacher Questionnaire

Eric Banilower

Session Action Items/Additional Points to Consider:

  • Review existing NCES studies to see if some of these items are already covered to reduce the number of items.

  • Remove questions about school policy.

  • Look at how NSF conceptualizes the science field.

  • The trust construct is very long. The focus should be on autonomy and classroom structure. This also seems related to morale which can be measured in better ways.

  • How much control teachers have over classroom practices was considered important since it may be an indicator of school climate. Charter schools, career academies, etc., may allow more control.

  • It may be possible to cut the current state/district standards and accountability systems construct, since this information may be available from a policy document.

  • Teacher credentials:

    • Allow teachers to indicate all areas in which they received certifications.

    • Use the NSF subjects for major and minor fields of study for bachelor’s degree.

  • You may need to “unpack” the professional development item: “In the past 12 months, have you…”

  • No consensus was reached on how to select teachers. Consider the following options:

    • a census;

    • teachers of the sampled students;

    • census with linking to the students in the sample;

    • pick one class: select class with most 9th grade students in general such as algebra; if the teacher doesn’t teach algebra select geometry.

  • Classroom practices:

    • Look at work of Valerie Lee about linkages between achievement and teacher self-reported instructional practices.

    • Content versus how it is taught is an important distinction.

  • Consider adding items about:

    • the student’s ability to learn;

    • the teacher’s ability to teach non-English speakers;

    • the highest course in math taken by the science teachers;

    • school climate (the Schools and Staffing Survey may have items that would work well for this)

    • questions such as: “What percentage of your students do you believe will pursue science careers?” and “Do you think your students like science and math?”

    • textbooks used;

    • suspension;

    • absenteeism; and

    • substitutes with college degrees (a question about the qualifications of substitutes was also suggested for the school questionnaire).

School Counselor Questionnaire

Eric Banilower

  • Jeremy Finn believes there should not be a counselor survey. He doesn’t believe school counselors are the best source of information based on the goals presented by NCES.

  • Cliff Adelman suggested trying to see what HSLS:09 can get from administrative records.

  • A suggestion was made to see if some of the items could go on the school administrator survey or alternatively have a school survey with a counselor module.

  • Another panelist suggested surveying school counselors twice (i.e., in 9th and 11th grade).









High School Longitudinal Study of 2009 (HSLS:09)



Technical Review Panel Meeting

January 30-31, 2008



Meeting Summary








Submitted by:

RTI International

PO Box 12194

Research Triangle Park, NC 27709


Prepared February 22, 2008









High School Longitudinal Study of 2009 (HSLS: 09)
Technical Review Panel Meeting Summary

January 30–31, 2008

Meeting Attendees

Clifford Adelman

Sharon Anderson

Eric Banilower

Kathy Borman

Robert Bozick

Jack Buckley

Laura Burns

Daryl Chubin

Jeremy Finn

Kristin Flanagan

Mary Frase

Tate Gould

James Griffith

Debbie Herget

Rebecca Herman

Thomas Hoffer

Lisa Hudson

Tracy Hunt-White

Steven Ingels

Vinetta Jones

Steve Leinwand

Laura LoGerfo

Patricia Martin

Rochelle Martinez

Edith McArthur

Jeffrey Owings

Gary Phillips

Daniel Pratt

John Riccobono

Donald Rock

James Rosenbaum
(via teleconference)

Michael Ross

Russ Rumberger

Leslie Scott

Marilyn Seastrom

Sharon Senk

Timothy Urdan

Andrew White

John Wirt


January 30, 2008

Welcome From NCES

Laura LoGerfo

  • The purpose of the meeting is to review the revised HSLS:09 instruments discussed during the last meeting.

  • Items for discussion include the following:

    • identify HSLS:09 priorities;

    • how much do we want to ask the students, principals and parents;

    • which items provide the richest data;

    • what is missing from the existing instruments;

    • do the questions address what we want; and

    • which questions should be removed from the instruments.

  • RTI will send out the revised instruments to the panel.

Mathematics Test Design

Gary Phillips

Operational (Main Study) Assessment

  • The operational test features and design elements:

    • include two stages at each grade level and will vary in difficulty from low to high;

    • each student will have 40 minutes to complete 40 items;

    • include linking items between grades 9 and 11; and

    • include 84 items in grade 9 and 76 items in grade 11 (138 unique items with 22 of them linking items).

Field Test Assessment

  • The field test elements:

    • a pool of 266 items has been assembled (2 more than needed for field-testing);

    • fall 12th graders will be surrogates for the spring 11th graders of the main study;

    • four forms will be field tested at grades 9 and 12 (total = 8 forms);

    • 1100 students are needed at grades 9 and 12 to take the test (total = 2200 students);

    • race and sex for students will be collected;

    • each student will have 40 minutes for the 40 items on the given form;

    • items will be ordered according to difficulty; and

    • timing information will be saved for each item.

Math Test Update

Steve Leinwand

  • Since the first TRP meeting (11/28/07), the following general specifications have been revised:

    • moved from 30 to 40 items per student;

    • moved from field test pool of 172 items to 266 items;

    • moved from operational test pool of 94 items to 138 items; and

    • moved from 30 items in 30 minutes to 40 items in 40 minutes.

  • 264 items will be field tested to get 138 items for the operational test - 9th and 12th graders will be field tested in the fall of 2008; for the main study, testing for 9th graders will take place in fall 2009 and 11th graders in the spring of 2012.

  • A math advisory panel meeting was held on 12/14/07; the panel reviewed 240 items and rejected 12, made corrections and established consistency of wording, checked answer keys and distractors, and checked distractor rationales.

  • After the panel meeting, all math items were reviewed by an outside expert (John Dossey), 6 additional items were written and 32 additional NAEP items were selected.

  • The item pool includes 266 items (two more than needed, but these remain in the item pool):

    • 32 NAEP released items;

    • 234 “new items”;

      • 106 developed by John Dossey; and

      • 128 developed by AIR.

  • 4 distractors will be used rather than 5; this will save time on the test.

  • Instructions for the test have been drafted but they are still being reviewed.

  • Students will be allowed to skip questions. They will also be able to go back to questions.

  • A cognitive lab will be conducted in NC and DC to test the computerized delivery of the items and the instructions and to test the set-up on the computer.

  • If accommodations cannot be made, students will be documented as test ineligible.

  • Test exclusions will be described in detail for schools and schools will determine which students need accommodations or must be excused from testing.

  • The math test will focus on algebraic content and algebraic processes. There will be a balance in focus between skills and problem solving.

  • The approximate distribution of item complexity is as follows: 37% are low, 54% are moderate and 9% are high.

  • The actual difficulty (as contrasted with complexity) of the items will be determined in the field test.

  • The existing items show:

    • at the 9th grade level, the calculator is not helpful for 77 items and helpful but not essential for 10 items;

    • at the 9th-11th grade level, the calculator is not helpful for 102 items and helpful but not essential for 6 items;

    • at the 11th grade level the calculator is not helpful for 67 items and helpful but not essential for 4 items.

  • AIR recommends that all students, at both grades, have access to a scientific (but not a graphing) calculator during the test - either one that they bring or one that is provided on-line as part of the test.

Session Action Items/Additional Points to Consider

  • Don Rock recommended starting off with the easiest items.

  • Instructions about when to leave a question blank and when to guess need to be carefully written in order to maximize the number of items completed.

  • The following concerns were expressed about the use of a graphing calculator:

    • will the calculator impact the amount of time allotted for the test;

    • will the calculator create an expectation about how/if students will use the calculator; and

    • the test should include a direction along the lines of “a calculator may help you to answer these questions; feel free to use one if you wish.”

  • Language in the instructions needs to reflect that students are not required to use the calculator.

Science Assessment

Steve Leinwand

  • During the last TRP, the panel recommended that a science assessment not be included at the 9th grade level.

  • The 11th grade field test will take place in the spring of 2011 and the operational test will be in 2012.

  • The science assessment, if pursued for the 11th grade follow-up, may draw items from NAEP, NELS, and PISA.

  • Panelists are encouraged to offer suggestions on how to shape the content (e.g., scientific literacy, the nature of science, etc.) for the science assessment.

  • One question is whether science literacy questions should be posed in lieu of the 9th grade science assessment.

  • Additional conversation on the science assessment will resume at a later TRP meeting.

TRP comments on a possible science assessment at 11th grade

  • 9th grade: it is not clear what high school science or applied science is at this juncture.

  • How best to shape science content? Inquiry? Learning progressions? Utilize PISA items or strive for something new to capture scientific literacy?

  • In 11th grade, you would need to keep the math assessment at 40 minutes, because that is the timing for math in 9th grade, and you will not want to skimp on the student questionnaire. Then where does the time for a science assessment get carved out?

  • If there is a plan to assess in science (and NCLB will certainly soon require testing in science), measuring scientific literacy may be the right approach; one could consider following up with a postsecondary assessment.

  • Surveys give a window into scientific literacy from a different perspective… a back-door approach to science.

  • NSF has documented scientific literacy for decades… adult scientific literacy is not considered a predictor of/precursor to much besides perspective.

  • Science courses and grades, as will be captured in the HSLS high school transcript component after the first follow-up, are far more richly informative than a science test that amounts to a literacy test

  • The math test is foundational, but the science test is more doubtful in its utility and predictive ability.

Issues of Computer Delivery

Daniel Pratt and Debbie Herget

Bootable CD

  • RTI developed a school-based solution for administering the test at schools. They plan to use school computers (or will bring laptops if the school computers are unsuitable or unavailable).

  • Using school computers cuts down on the financial burden of paying for all of the equipment and reduces the burden of having the test administrators carry computer equipment to each school.

  • The session administrator will bring 5 backup laptops.

  • RTI recommends using a bootable CD to administer the test. The CD will load the operating system and internet browser into memory on school computers so that students can take the tests directly from the internet. The survey site will be hosted by NCES and data will be entered and stored on the secure site.

  • In order to use the bootable CD, the computer must be a PC or Mac and have a high-speed internet connection, a dynamic IP address, and a bootable CD-ROM drive.

  • RTI cited the following benefits for using a bootable CD:

    • eliminates concerns about viruses;

    • ensures consistency of operating systems;

    • data loss will be minimized;

    • privacy will be protected;

    • school equipment will not be compromised; and

    • students will be able to access the test and the survey on the computer.

Data Collection Logistics

  • RTI will work with the schools in advance to gain access to school computers.

  • NAEP computerization experiments are being carefully reviewed by NCES and RTI to anticipate and be prepared for potential problems associated with computerized assessment.

  • According to an NCES report (Internet Access in U.S. Public Schools and Classrooms: 1994-2005), nearly 100 percent of U.S. public schools had access to the Internet in fall 2005 and 94 percent of public school instructional rooms had Internet access.  The same report indicated that in 2005, 97 percent of public schools with Internet access used broadband connections to access the Internet. The most recent comparable NCES study in private schools and classrooms was in 1998 so no presumptions can be made with regard to private schools.

Session Action Items/Additional Points to Consider

  • There was concern about RTI relying on school computers and their availability. The following comments were made:

    • to reduce the burden on the test administrators, get a computer cart that can be used to carry all computer equipment to the schools;

    • compare costs between supplying all computers and adding personnel to carry the computers to the site and administer the sessions.

    • some schools may not have bootable CD drives but they may have non-bootable CD drives.  Might we be able to load the operating system and browser on a floppy disk to boot up the computer and then use the non-bootable CD after that?

    • screen display differences may be an issue.  It is important for the screen display not to compromise the assessment results. All test takers should have the same basic stimulus in presentation of the assessment.

    • similarly, if there are issues with bandwidth or connectivity, the assessments could be compromised. Such issues must be thoroughly investigated ahead of time; each school will have different issues.

    • while high speed Internet connectivity is available at most schools, will working computers with high speed Internet access be made available for assessments?

    • It may be worthwhile to explore alternatives to a web-based design such as the use of software installed temporarily on school computers with the data saved temporarily on an external device (e.g., a memory stick) which could then be transmitted at the end of the session.

Recommended Design Changes

Steven Ingels

  • Updates since the last TRP meeting:

Student Questionnaire

    • The new time allocation is 35 minutes: 30 minutes of substantive questions and 5 minutes for future locating questions.

    • The student portion will take 90 minutes (15 minutes for set-up and closure; 40 minutes for math assessments; and 35 minutes for the student questionnaire).

Parent Questionnaire

    • The parent questionnaire will be available in English and Spanish. A self-administered paper and pencil questionnaire option is proposed.

Math and Science Teacher Survey

    • Under consideration:

      • a census of 9th grade math and science teachers;

      • surveying department chairs/coordinators;

      • a student-driven linked-to-teacher design; and

      • a census of all math-science teachers in the high school.

Session Action Items/Additional Points to Consider

  • Consider surveying all math/science teachers and not just 9th grade math/science teachers.

  • The following areas of concern were mentioned:

    • Teacher turnover may impact survey results.

    • Both school climate and culture, and math and science departmental climate and culture, are of interest to measure.

    • Add a question for the parent about the child’s IEP.

    • Support was expressed for surveying math-science departmental chairs as sources of information about rules and practices for student placement and progression in the two subject areas, as informants on the school’s subject-specific culture and ethos, and as sources of information on standards and requirements shaping the delivery of math and science instruction in the school. It was also thought that department chairs could relieve burden from teachers by providing information about the math and science textbooks in use at 9th grade.

Student Questionnaire

Steven Ingels

  • Ingels reviewed the purpose and research questions for the student questionnaire. The research questions include:

    • How do students decide what courses to take in high school and what to pursue after high school? What factors affect their decision-making, particularly factors that are malleable to school or parent influence?

    • What factors lead students towards or away from STEM?

    • How do students’ attitudes and learning approaches (e.g., confidence, self-efficacy, motivation, engagement, and belonging) evolve during high school?

    • How do students prioritize and balance various commitments (e.g., family, friends, school, job) while in high school?

  • Jeremy Finn, Cliff Adelman, Russell Rumberger, Vinetta Jones and Daryl Chubin were asked to review the student questionnaire beforehand and provide feedback to the TRP. They were asked to pay specific attention to items that should be removed from the instrument and items that should be added to the instrument. The following insights and concerns were shared by these panelists:

Jeremy Finn’s feedback

    • Not enough emphasis on marginal students and students at risk

    • Models of decision-making processes from the November draft have been lost

    • Do students know what they have to do in the 9th and 10th grade to become science majors?

    • Figure out how connected the student feels to their school and classmates (e.g., I feel welcomed by my school’s personnel; my friends are at school; school is the most important thing I do).

    • Figure out if the student values the practical things (utility) that schools provide (e.g., I get something useful out of my classes; I plan to finish school; school is a waste of time).

    • Figure out how active a student is in the school (e.g., extracurricular activities; participating in class learning activities; participation in school events).

Cliff Adelman’s feedback

    • What do students know about science/math?

    • It is helpful to know at the beginning what the students can do on computers.

    • Define what college level science means for this study.

    • Reorganize the items chronologically for a better flow.

    • Ask a question to see what else students might be interested in outside of math and science (e.g., art, history).

    • Find out where/how students begin to form their images concerning occupations (e.g., from their parents, teachers, older siblings, television).

    • Find out which occupations present the most negative and positive images.

    • Find out who the students admire.

    • Ask in 11th grade what subject they expect their college major to be.

Russell Rumberger’s feedback

    • Want to know if coursetaking is related to a long-term plan, a means to an end.

    • The link between educational plans and occupational goals (aligned ambition) is important.

    • Determine the student’s perceived level of confidence in math.

    • Engagement should be linked to motivation, planning, and coursetaking, so that courses may be seen as a critical pathway.

    • There is no coherence in the order of the items.

    • Find out if the students know what they want to be and if they know the course path they need to follow to achieve their goals.

    • Early adolescents generally have only vague notions about science and math careers or even subject matter

    • Find out if they want to go to college.

Vinetta Jones’ feedback

    • Need to add items that capture the experiences of students who are underrepresented in STEM.

    • Students don’t know about pipelines. They are put into a track by the system based on race unless parents are proactive.

    • Add questions that ask about role models.

    • Add questions about involvement in after school math and science programs. Who encouraged them to participate?

    • Ask if students know what it takes to excel.

    • Ask the students to indicate how much time they think students who do well in school spend on homework.

    • Find out who encouraged them to go to college.

    • Ask students what they think their teachers and counselors expect them to be doing in 20 years.

    • Ask the students how they see themselves (e.g., as a leader, a good student, a bad student, or a smart student).

    • Ask students for reasons why they are not going to take advanced math/science.

Daryl Chubin’s feedback

    • Engagement is key and is a filter for other influences.

    • How much influence have parents, teachers, and others had?

    • Measure awareness of possibilities and interest.

    • Students do not think in terms of pipelines. Need to think about how to put these questions in their frame of reference.

    • Ask about their interests in high school.

    • Ask the students to define science.

    • Ask if their interests have been reinforced.

    • Measure intensity of interest; e.g., have they revisited a museum or applied learning from a museum to something else?

Session Action Items/Additional Points to Consider

  • Student background domain

    • Consider not asking about Asian subgroups if there will not be enough students in each subgroup to be analytically useful.

    • Remove the academic environment data element question: it is not relevant.

  • Previous experience domain

    • Consider removing the previous school year grades data element; check the student transcript instead.

    • The bilingualism data element question should read, “how often do you speak (preloaded language) with your parents? Your friends at school? Your friends in your neighborhood?”

    • Laura Burns will provide a clearer item on student bilingualism from NHES:2003.

    • The middle school activities question should read, “have you participated in the following activities in grades 8 and 9” or “between the start of G8 and now”?

    • The activities question should include out-of-school activities as they relate to engagement.

    • Reword the science activities question, the current wording may yield inaccurate results.

    • Update the question stem so that it reads, “watched science movies and…”

    • In the self-reported 8th grade math course data element, include an option for honors courses.

    • Add a computer technology item to the instrument.

  • Social context domain

    • Remove the “school climate” data element.

  • Interpersonal influences domain

    • Remove versions 2 and 3 of the discuss school and work with significant others data element. Keep only version 1.

  • Values domain

    • In the occupational values data element, remove the question stem. Instead ask “what do you want to be at age 30?” and “what do you have to do to get there?”.

      • Jeremy Finn will send a guide that helps to identify occupational values.

  • Motivation domain

    • Intrinsic motivation items focus more on experience. The current item is listed as a value. This item should not focus on its importance but should identify if the student likes or dislikes math/science.

    • Remove the extrinsic motivation data element question.

  • Identity domain

    • These items are not focused. Identity questions ask “Am I capable?” and “Do you see yourself as a math person?”

    • Additional questions should be added to identify ways the student believes their peers view them.

    • Consider asking “which do you value most”.

    • Remove the “future identity adult role model” data element.

  • Utility value domain

    • For the value in learning class material data element, add the option “The information is important for my career and everyday life”.

    • For the value in school data element, lump the multiple items into one.

  • Perceived opportunities and barriers domain

    • For the future barriers to math/science data element, add “check the two most important reasons” to the stem.

    • Remove the following sections: abstract attitudes toward educational opportunity data element and the concrete attitudes toward educational opportunity data element.

  • Costs domain

    • Remove the current time use data element section.

  • Expectancy domain

    • Improve the wording in the item concerning plans to take PSAT/SAT/ACT/AP/IB.

    • Ask “if there were no barriers, what is the highest level of education you expect to attain”.

    • Remove the question about plans right after high school. Instead ask what students are most likely to do after high school. Remove the option “go to college” and reorder all of the options.

    • After the intensity item, ask “How confident are you?”; the full line is needed to create context and thus clarity.

  • Remove the attributions and self concept domains.

  • Deterrents and negative experiences domain

    • The question should read, “Was there any class that you especially wanted to take this school year, but it was not offered in your curriculum or you were discouraged from taking it?” This question should be asked of 11th graders.

    • Remove the question about negative experiences.

  • Decisions domain

    • Remove the future courses and influence on future courses questions.

    • Remove the following decision engagement questions: when I am working on a math/science assessment; when I finish a math/science assessment; do you feel bored because you do not understand what’s going on; and do you feel bored because you know the answers.

    • Remove the time use intensity checklist. It is found elsewhere.

    • The panelists were asked to recommend elements that could be added to the special academic program participation question.

  • Math & science classroom environment domain

    • Remove the questions about liking the teacher and teacher approach to students.

    • Shorten the list of options for the teacher competency and effectiveness section.

    • All of the items in this domain can be condensed and combined. Identify which question stems should be paired and which are repetitive.

    • Use a 5-point Likert scale instead of a 7-point scale.

    • There is too little on peer effects.

    • More is needed on attendance patterns.

    • More is needed on how, where, and why students use computers.


January 31, 2008

Parent Questionnaire

Steven Ingels

  • Ingels reviewed the purpose of the parent questionnaire and the research questions for the parent questionnaire. The research questions include:

    • What social capital resources are available in the home environment to support children’s academic development and decision making?

    • What human capital resources are available in the home environment to support children’s academic development and decision making?

    • What financial capital resources are available in the home environment to support children’s academic development and decision making?

  • Three modes of administration will be available to parents: self-administration using a web interview, self-administration of a paper and pencil questionnaire, and Computer-Assisted Telephone Interview (CATI) using the web instrument.

  • Some concern was expressed that someone other than a parent will complete the web survey. It was noted that a password will be required to access the web survey and that no monetary incentive would be provided to parents.

  • Cognitive pretesting will be conducted on selected new items.

  • Given that the material presented could not all be covered in a 30-minute interview, the TRP was asked to recommend items that could be removed from the instrument.

  • Kathy Borman was asked to review the questionnaire in advance of the meeting and provide feedback to the TRP.

Kathy Borman’s feedback

    • She thought the instrument did a good job of addressing human, social and financial capital.

    • She recommended asking specifically about math and science academic classes outside of school not just academic classes in general.

    • She recommends asking about informal math and science activities such as after school programs and summer camps.

    • She is concerned that some terms and language used in the instrument would not be familiar to parents.

    • She thinks some questions related to postsecondary plans are premature and redundant. Students may not know about specific jobs they will apply for after high school. They may not even know what they are likely to be doing as their main activity after high school.

    • She did not understand what was meant by a number of the subitems in the “perceived obstacles to future career plans” question. She suggested condensing the list of subitems.

    • She noticed some overlap between the student and parent questionnaires.

Session Action Items/Additional Points to Consider

  • Insert a question for the student and parent questionnaire that asks if parents use math/science on the job.

  • Family structure domain

    • Panelists did not understand what was meant by “change in family situation.”

    • Panelists want information on divorce, whether there is a parent outside the home involved in the student’s life, and death of a parent.

    • It was suggested that change in family structure between the 9th grade and the 11th grade surveys can be measured by comparing household rosters and asking for reasons a parent is no longer in the household. This approach has been used in ECLS.

    • It was recommended that a question about the number of people in a household be added.

  • Demographic characteristics domain

    • Concern was expressed that undocumented immigrants may not want to answer questions about immigrant status.

  • Socioeconomic Status domain

    • Some panelists wanted to ask parents (and students) how much they use math and science in their job. Others were concerned that parents who use very basic math (e.g., cashiers) would say they use it a lot. These items are candidates for cognitive testing.

    • A panelist suggested asking parents who are college graduates which college granted their degree.

    • The value of the question about assets greater than $10,000 was questioned. It is used in postsecondary studies because it is one variable used to calculate expected family contribution for financial aid. Panelists agreed that $10,000 was too low.

  • Previous educational experiences domain

    • The questions about behavior problems need a time period as a frame of reference. The past year was suggested. Also, it would be more helpful to know how many times the school contacted the parent about a behavior problem rather than whether they did or did not.

    • There was some debate about the merit of the question about stopping out of high school. Some panelists thought it was more important to have an estimate of the number of days absent although other panelists indicated that parents may not know if their teenager is skipping school. Others thought the stopout question was more appropriate for the 11th grade questionnaire.

    • The question about academic classes outside of school should refer to science and math. A distinction should also be made between remedial and enrichment.

    • The question about tutoring should be expanded to include Saturday academies, learning centers, and after school programs. Need information on the subjects studied in these programs and whether they were remedial or for enrichment.

  • Current education/activities domain

    • Must ask whether the student has an Individualized Education Program (IEP), even if the student is in a Gifted and Talented program.

    • There was some debate about whether parents should be asked whether their 9th grader has a disability. Some thought the question was too subjective, but others thought the parent’s perception was important. Also, some said that students with disabilities may not have an IEP.

    • Some wanted to know what disability the parent believed the teenager had, while others just wanted to know whether it was a learning disability.

    • The wording of the question about exchanging knowledge with other parents needs to be simplified. It was suggested that HSLS ask parents how often they talk with other parents about classes, schools, and teachers.

    • There was some discussion about whether the question should be limited to discussions with parents of the student’s friends, in keeping with Coleman’s concept of social closure. But since those friends may attend schools other than the 9th grader’s, a more general question was suggested.

    • The question about conversations with other parents should not be limited to advice. Many parents may be willing to exchange helpful information but not comfortable advising other parents or acknowledging receipt of advice.

    • Throughout the domain, refer to the past year.

  • School choice

    • “Career academies” should be “Career and technical programs”

  • Parent-school relationship domain

    • Panelists recommended cutting the question about frequency of contact with school teachers and counselors because it will be too early in the school year to be meaningful. Also, the question does not capture why the parent is talking to the teacher.

    • Panelists recommended splitting the question about requesting a particular teacher or course into two questions.

    • Panelists thought it was more important to know whether the parent knows what math and science courses the student is taking in 9th grade than in the next school year. If this question is asked, ask for the course name (verbatim), not just a yes or no.

    • The panelists suggested eliminating the question about satisfaction with teachers because it will be too early in the school year for them to assess this.

  • Home environment domain

    • Consider adding a question about what subjects the student prefers.

      • Some panelists suggested adding a question about whether the parent encourages the student in some subjects more than others, but there was also concern about social desirability bias with such a question.

    • Some panelists suggested considering adding the question from NELS about decision-making to characterize parenting style. Others did not think this was a priority given the limited length of the interview.

    • Panelists suggested that the question about family rules have a balance of items related to school and socializing; others thought the focus should be on school.

    • One panelist suggested referring back to NELS for the questions about curfew.

    • One panelist recommended adding the NELS question about whether there is a place set aside for the student to do homework.

    • Panelists considered the question about STEM-related activities important. One panelist suggested broadening the scope of this question to activities with extended family members, but others thought that “family” would be interpreted as extended family so “family” suffices.

  • Educational environment at home domain

    • Remove option ‘g’ and use it only for the 11th grade questionnaire. Explain what option ‘g’ covers.

  • Parent-child relationship domain

    • Panelists recommended making the question about parent influence specific to school and career choices.

  • Education expectations domain

    • There was debate about whether educational aspirations should be measured as well as educational expectations. If a question about aspirations was posed to parents it was recommended that the same question be asked of students. Also, it was recommended that the phrase “We know that things don’t always turn out the way we would like” be replaced with “If there were no barriers.”

    • Panelists recommended replacing the questions about how many years of math and science they expected the 9th grader to take with the questions from the student questionnaire about expectations for taking advanced math and science courses in high school.

  • Occupational expectations Domain

    • The question about reasons for not continuing education after high school applies only to students who do not anticipate continuing their education. Add a “none of these” option to that question.

    • Panelists critiqued the items in the perceived obstacles to career question. They did not know what “lack of ability” meant: lack of academic ability or lack of opportunity? The military should not be listed as an interference with career plans because for many students it is a chosen career path. Others thought that some of the items were useful.

Teacher Questionnaire

Steven Ingels

  • Ingels reviewed the purpose of the teacher questionnaire and its research questions. The research questions include:

    • What do mathematics and science teachers do in the classroom that engages and encourages students to pursue STEM pathways, or alternatively, disengages and discourages students from choosing STEM pathways?

    • How do mathematics and science teachers view the quality and supply of the school’s resources and support available?

  • Thomas Hoffer and Sharon Senk were asked to review the questionnaire and provide feedback to the TRP. The TRP was asked to pay specific attention to items that should be removed from the instrument.

Thomas Hoffer’s feedback

    • Consider a focus on professional background.

    • The information on textbooks should be removed.

    • There is too much detail on college coursework; look at major/minor specialty.

Sharon Senk’s feedback

    • Clearly identify which teachers are being asked and for what purpose.

    • There are three ways of referring to teachers: “this class”, “your classroom”, and “your school”. The usage should be consistent throughout.

    • There is an inconsistency between attitudes, beliefs, and expectations; it needs to be well thought out in relation to mapping.

    • Delete questions 22-30 (professional development), 44 (textbook usage), and 37, 45, and 48.

    • Ask if teachers feel prepared to teach math.

    • The section on certificates can be complicated.

    • Include more questions about teacher expectations.

    • Include more questions about math quality.

    • More content questions (i.e., how much emphasis do you place on skills vs. problem solving).

    • Include the agree/disagree item: “All students should take algebra.”

Session Action Items/Additional Points to Consider

  • Teacher education domain

    • Insert a question that asks “do you have a degree from a college of education?”

    • Insert a question that asks “do you have a degree in arts and sciences?”

    • Remove items k-bb on the match construct.

  • Teacher certification domain

    • The first question should ask if the teacher is certified in math or science.

    • Remove items 9-13.

    • Item number 14 should really be item number 3.

  • Teacher preparedness domain

    • Consider asking the department chair the extent to which they use each of the options listed in item 15.

    • Consider asking teachers how prepared they feel to teach the course content.

  • Professional development domain

    • Put these questions in the context of math and science.

    • Consider asking these questions to the administrator.

  • Remove the STEM encouragement as a student construct

  • Teacher attitudes/beliefs domain

    • Add an option ‘f’: “If a student has never done well in math, they never will”.

    • Don’t ask for percentages. Figure out a better way to ask the question.

    • Ask the teachers to estimate how many students will graduate from a 4-year university or a 2-year community college. (“Guess” seems like the wrong word and might start respondents down the wrong path.)

    • Ask the teachers to estimate how many students will major in STEM-related fields.

  • Instructional practices domain

    • Remove the remediation construct. Consider adding it to the department chair questionnaire.

    • Spell out all acronyms.

    • Find out how teachers encourage those students who have displayed talent in STEM areas.

    • The limit on instruction construct overlaps with items 45 and 48.

  • Use of textbooks domain

    • Consider asking what percentage of the textbook the teacher covers.

    • Textbook questions are meaningful only in the context of teachers’ uniquely different classes; they are burdensome to ask of teachers, and as a point-in-time measure they are not readily connectable to achievement gain.

    • Instead of teachers, ask department chair to identify the textbooks used for math and science courses, if textbook information is to be obtained at all.

    • Remove item 40.

    • In item 44, change text to “in your classes”.

  • School climate domain

    • Remove item 45.

    • Expand the school/students construct into the beliefs and attitudes construct.

    • Remove item 49.

    • Option d in item 52 should be removed.

    • Remove items 52 and 53 and move them to the department chair questionnaire.

    • Include a question about collegiality.

    • Include a question about common planning time.

    • Review the literature on trust and schools.

    • Consider referring to “school leadership” instead of “principal”.

School Administrator Questionnaire

Steven Ingels

  • Ingels reviewed the purpose of the school administrator questionnaire and its research questions. The research questions include:

    • What are the school-level correlates of high achieving schools, particularly in math and science?

    • What is the math and science focus of schools?

    • Is the math and science focus of schools associated with a student's subsequent decisions to pursue careers in math and science?

    • What programs and policies do schools offer to assist students at risk of school failure, students transitioning from middle school to high school, and students struggling in math and science?

  • The existing instrument is 95 minutes. It needs to be condensed to 30 minutes.

Session Action Items/Additional Points to Consider

  • Remove the school size and grade span construct; that information may be found in the CCD, which in the future will be more timely than in the past.

  • Ask the department chair (if surveyed) about student/teacher ratios.

  • Include a question that asks about the length of the school day and class period.

  • Include an item on teacher absenteeism.

  • Teacher staff characteristics domain

    • In the staffing construct, remove the item “Do you find that it is easier to hire qualified math teachers and science teachers if they enter alternative certification programs?”

    • Insert the word “district” at the top of page 4.

    • In the qualifications construct, remove the “successfully completed postsecondary period” option from the “what are the requirements for employment as a full-time math teacher in your school” item.

    • In the qualifications construct, remove the “successfully completed postsecondary period” option from the “what are the requirements for employment as a full-time science teacher in your school” item.

    • Remove the first two items under the retention/turnover construct. In the remaining items, change “this year” to “last year”.

    • In the last question in the retention construct, insert the option ‘left teaching’ or ‘retired’.

  • School policies, practices, and programs domain

    • Remove the flexibility of course assignment practices construct; the counselor is asked that question.

    • For the last question in the accountability construct, remove the phrase “when a student fails a competency test”.

    • Remove the first item in the extracurricular activities offered construct. Add the following options to the second item: “career exploration and internship programs” and “tutoring opportunities”.

    • For the dropout prevention program, include a question that asks how many students are transferred out into alternative programs.

    • Include a question that asks how schools support struggling students.

    • Include a question that asks how schools support students who excel.

    • In the next-to-last item under the transition construct, remove “full or part time” from the item.

    • Remove the last item in the transition construct.

    • Remove the parent and community outreach construct.

  • Technology domain

    • Remove the technology resources/availability construct

  • School governance domain

    • Remove the mission statement construct.

    • Remove the autonomy construct.

    • Remove the evaluation of performance construct.

    • For the crime and safety construct, the question should read, “How would you describe the crime level in the neighborhood in which the school resides?”

    • The principal perceptions/beliefs construct can be used if additional time is available at the end of the questionnaire; otherwise remove it.

Counselor Questionnaire

Steven Ingels

  • Ingels reviewed the purpose of the counselor questionnaire and its research questions. The research questions include:

    • How do students get placed into and out of classes?

    • What counseling resources are available to the students within school?

    • What are the tracking procedures and policies and graduation requirements?

    • What college preparation programs are in place at the school?

  • Patricia Martin and James Rosenbaum were asked to review the questionnaire beforehand and provide feedback to the TRP. The TRP was asked to pay specific attention to items that should be removed from the instrument.

Pat Martin’s feedback

    • Include more questions about beliefs and behaviors, in particular about math and science.

    • Ask how long they have been a counselor.

    • Ask if they have any teaching experience.

    • Ask if they have a math or science background.

    • Include questions about academic plans.

    • Ask if the academic plan is used in preparing course schedules.

    • Check for placement and tracking procedures.

      • Tracking begins before a student gets to high school. Counselors take information from former teachers such as eighth grade instructors.

      • Every school is different; find out about the formal and informal process.

    • Keep in mind counselors will know very little about the 9th graders at the time of the survey. Head counselors will know even less than the regular counselors.

    • Find out how students are assigned to the counselor.

    • Ask parents, students, and teachers about the perception of students being “counseled out”.

    • Ask counselors to describe how students are placed in classes.

James Rosenbaum’s feedback

    • Some questions can be answered beforehand without asking the counselors.

    • Define the purpose of the survey. Is the survey’s purpose to support information received from the other questionnaires, or to identify barriers to or supports for success in the STEM pipeline?

    • There are not enough questions about beliefs and behaviors.

    • What is the allotted time for each question?

    • Include questions about decisions to take technical education courses.





High School Longitudinal Study of 2009 (HSLS:09)



Third Technical Review Panel Meeting

January 28-29, 2009



Meeting Summary


Prepared March 3, 2009


High School Longitudinal Study of 2009 (HSLS:09)
Third Technical Review Panel Meeting Summary

January 28-29, 2009


Meeting Attendees

Clifford Adelman

Eric Banilower

Kathy Borman

Robert Bozick

Laura Burns

Stephanie Cronen

Kristin Denton-Flanagan

Jill Dever

Jim Fey

Mary Frase


Eric Grodsky

Debbie Herget

Thomas Hoffer

Lisa Hudson

Tracy Hunt-White

Steven Ingels

Ying Jin

Vinetta Jones


Stuart Kerachsky

Steve Leinwand

Laura LoGerfo

Shelly Martinez

Jeffrey Owings

Gary Phillips

Mike Planty

Dan Pratt

Don Rock

Russ Rumberger

Marilyn Seastrom


Sharon Senk

Leslie Scott

Marsha Silverberg

Larry Suter

Timothy Urdan

Iris Weiss

Andy White

John Wirt




January 28, 2009


Welcome From NCES

Laura LoGerfo

  • The purpose of this meeting is to review the results of the field test and to discuss possible changes to the questionnaires.

  • The field test is complete, the results are in, and the mathematics assessment results are exciting. The computer assessment went well.

  • RTI is working with NSF to merge state data (e.g., administrative records) with HSLS data to create a richer data set.

  • School recruitment is difficult, so RTI is trying to think creatively about how to increase schools’ positive response rates. Any suggestions are welcome.



Overview

Steven Ingels

  • HSLS:09 field test tested instruments, forms, and procedures, including: 1) items for the mathematics assessment; 2) the questionnaire content for the main study; 3) new approaches to data capture, in particular, computer-based instrumentation; 4) school recruitment and data collection methods; and 5) overall study design.

  • Before the field test, cognitive interviews evaluated the new questionnaire items. Extensive pilot testing was used to evaluate the technical feasibility of the computerized assessment administration. HSLS has significant technological and design innovations, which extend the methods and substance of the previous high school longitudinal studies into new areas.

  • New approaches, compared to those employed in the predecessor high school cohort studies, to understanding the transition from high school to work and higher education include: 1) a fall-of-9th-grade starting point; 2) greater emphasis on STEM; and 3) emphasis on choice behaviors and their timing. Radically new approaches to collecting data include computerized forms of the test and surveys for students and availability of all five HSLS questionnaires in electronic form, either as a web survey or a computer-assisted telephone interview.

      • The Field Test Report will be available in first draft form in March, final draft in July 2009.

      • This TRP meeting will focus on revision of instruments. School sampling has been completed. School recruitment is ongoing and will continue through November 2009. In-school data collection will occur September through December 2009. Out-of-school data collection will occur between September 2009 and February 2010. Field testing the follow-up will occur in spring 2011, and main study follow-up will occur in spring 2012. Transcripts will be collected in 2013-14.

      • The idea of a math-science departmental census of teachers, or longitudinal treatment of the teacher survey, was dropped. Teacher design will be based on linkage to HSLS students, though teachers will not supply ratings of students, given the early autumn starting point for the study.

      • With NSF sponsorship, a state sample augmentation will be undertaken in 10 states. The purpose is to use two NCES-sponsored programs (HSLS:09 and the Statewide Longitudinal Data System grant program) together. In these augmentation states, state administrative data will be merged with HSLS:09 student records to create state-specific datasets. Viable state-representative samples comprise 40 participating public schools per state.

      • Since the money has already been allocated, there is no possibility of doing more than 10 augmented states.





Comments and Questions:

    • Which are the augmentation states? How will the data merge proceed? This is part of RTI’s contract to work with state representatives. The augmentation states are CA, TX, FL, GA, MI, NC, OH, PA, TN, and WA. Care was taken to a) let the affected constituencies know, because of consent and privacy concerns, and b) ensure that the needed data can be obtained (i.e., test and 8th grade information). It is a minimal sample, approximately 25-35 9th-grade students per school from 40 public schools in each of the 10 states. Augmentation was not needed in CA and TX, but the other 8 states were augmented, in some cases tripling the number of schools. Power analysis shows that 40 is a good number, given budget parameters and constraints. But does that account for attrition? Yes, RTI looked at multiple waves of data collection. Of concern: confidentiality, and whether states will let NCES/RTI have the data.

    • Panelists asked, what kind of data and at what levels? Dan Pratt replied, the intent is student level and to backfill test information that is available (for example, state standardized test results).

    • Panelist query—will you include state augmentations in national data, and if so, is that a good idea? Dan Pratt replied that it actually will strengthen the national data. This was done with NELS state augmentation schools for selected strata, so there is precedent. Weights will be calculated to reflect such inclusions within the national files.

    • When will students be selected? Which day in the fall? Debbie Herget responded that RTI is working with schools to get the “best possible list” (e.g., 10 or 20 days after the start of school). Since data collection occurs in a finite period of time, RTI is trying to keep the window tight while allowing some tolerance for schools to take part. October 1 is the federal/CCD date for student lists, but HSLS needs to start collecting in September; it would be nice to align with the federal date, but that may not be possible. An alternative would be to start linking with state data, working with states to identify potential linkages.

    • Will we be collecting SES within race in surveys? Yes.

    • MIS conference in February—Debbie Herget and Laura LoGerfo will be meeting with state contacts on data management to discuss more details and operational concerns related to the state administrative data.

    • How are you going to track mobile kids? Dan Pratt—intent is to keep all of the sample, such as in ELS, i.e., early graduates and dropouts included in first and second follow-ups. Steven Ingels noted that, though surveyed, NCES had to give up on follow-up testing for transfer students in ELS (their assessment scores were statistically imputed). This model will likely be applied to HSLS:09 too, since many will move to new schools.


Operations: Field Test Data Collection and Recruitment Strategy

Debbie Herget and Dan Pratt

      • Student Data Collection

    • 41 sampled schools and 11 supplemental (test-only) schools participated. This fell short of the hoped-for 55 schools, though the supplemental schools ensured that the needed number of test observations was obtained. Timing was a challenge since recruiting started in March.

    • Computerized assessment/questionnaire proved successful.

    • 35 sampled schools used the live CD; the remaining 6 used study-provided laptops in small groups.

    • 1035 9th grade students (81%) completed the questionnaire (1026 completed both the questionnaire and the test, while 9 took the questionnaire only). In addition, at supplemental schools, 381 9th grade students were administered the test only. Some 1407 9th grade students completed the assessment in total. Also, 1344 12th grade students completed the assessment—946 from the primary sample and 398 from the special supplemental sample.
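The completion counts in the bullet above are internally consistent; a quick arithmetic check (the variable names here are illustrative, not from the study):

```python
# Cross-check of the field-test completion counts reported in the minutes.
both = 1026        # 9th graders who completed both questionnaire and assessment
quex_only = 9      # 9th graders who completed the questionnaire only
test_only = 381    # supplemental-school 9th graders, assessment only

quex_total = both + quex_only   # total 9th grade questionnaire completions
test_total = both + test_only   # total 9th grade assessment completions
g12_total = 946 + 398           # 12th grade assessments, primary + supplemental

print(quex_total, test_total, g12_total)  # prints: 1035 1407 1344
```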

      • List Collection: Schools struggled with parent and teacher list collection, which delayed contacting parents and staff. The recommendation for the main study is to collect parent and teacher information with initial student list. Another problem was that 8th grade administrative records collection added a huge burden to schools, delayed parent and teacher lists, and returned incomplete data with generic or unclear course titles. The recommendation is to eliminate this collection from high schools directly, and to collect pre-high school administrative records from augmentation states and through transcripts for all schools in 2013.

      • Parent and Staff Data Collection: Parent and staff data collection ended December 19th. Collecting parent and teacher information with student lists will expedite parent and staff contacts, which should increase response rates. Recommend to contact staff directly and not through school coordinator to improve staff response.

      • Recruiting: Main study target is 600 public schools; 100 Catholic schools; 100 other private high schools; plus another 144 public schools for state representation in 10 states. The 10 augmentation states are: CA, TX, FL, GA, MI, NC, OH, PA, TN and WA. As of January 28, 478 districts (or dioceses) have granted approval for schools to be contacted, and 237 schools have agreed to participate. 97 districts/dioceses (with 128 associated schools) and 59 schools have initially declined participation.
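The recruiting targets above can be summed for an overall picture; a small sketch of that arithmetic (the 944 total and the progress percentage are computed here, not stated in the minutes):

```python
# Main-study school recruitment targets listed in the minutes.
targets = {
    "public (national)": 600,
    "Catholic": 100,
    "other private": 100,
    "public (state augmentation)": 144,
}
total_target = sum(targets.values())  # overall school target
agreed = 237                          # schools agreed as of January 28

print(total_target)                               # 944
print(f"{agreed / total_target:.0%} of target")   # 25% of target
```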

      • Recruiting Package for School: 90 minute sessions; school coordinator honorarium ($100-150); considering an IT coordinator honorarium; no monetary incentive for schools or students, though students will receive an “educational goodie bag;” survey administrator conducts session and reduces school burden as much as possible; considering offering school-level test results to schools; sample overlap avoidance with NAEP:10 and PISA:09; and working with districts and schools almost a year in advance gives them time to schedule HSLS on their calendar.

      • Recruiting Challenges: Difficult objections to address include 1) schools are busy due to over-testing, number of high stakes tests, NCLB/AYP and too many other studies, grants and/or initiatives; and 2) budget and staffing cuts (some schools/districts have seen 100% turnover).

Comments and Questions

  • Does the web-based survey work on Macs? Some, but RTI is still working on the technology (e.g., the computer needs an Intel chip to work properly).

  • Did students have ID and login? Yes.

  • Live CD use: interviewers inserted disk on PCs, rebooted, and the CD takes over the computer system. Some schools will have to use laptops locally and then the data will be transferred securely.

  • Problem of collecting parent and teacher list data—schools said it was easier to just give RTI all data for school as opposed to pulling out the specific information needed. One confidentiality concern is giving information out before obtaining parental consent.

  • Will 8th-grade information be available on high school transcripts? This may pose a problem. We know from NELS and ELS that this information can be obtained through high school transcripts, though only 7% of ELS:2002 transcripts included 8th-grade information. More frequent use of electronic records should facilitate this collection. In addition, 8th-grade algebra coursetaking will be collected on the student questionnaire, and missing data will be imputed as was done in NELS:88.

  • It was asked what percentage of parents are not native English speakers. Dan Pratt stated that the parent questionnaires are translated into Spanish, while consent forms and instructions have been translated into additional languages as well. Only a small percentage (6–7%) did not participate due to language barriers. Debbie Herget stated that RTI is working on ideas for obtaining better participation rates. For example, parent contact information will be collected as part of the initial student list, facilitating an earlier start to the parent data collection.

  • The school coordinator role did not work as well as hoped for distributing materials to staff and prompting them, so midway through the field test RTI started contacting staff directly. This method proved effective and will be used in the main study. RTI also expects that parent response rates will increase in the main study and that parent data will be linked to student data.

  • Some parents and staff had problems with pop-up blockers and passwords.

  • Russ Rumberger, a panelist from the University of California, Santa Barbara, suggested going to monthly county meetings of school districts in California to try to increase participation rates as superintendents talk to each other about topics of mutual interest at these meetings.

  • Some oversampling of Catholic/private schools, but will be taken care of by weighting.

  • In order to verify the self-reporting of 9th-grade students, ask teachers. Dan Pratt noted that, while teacher data are linked to students, the emphasis is not on asking teachers to report on individual students, in order to keep burden down. It was suggested that teachers be asked: “Is it assumed that students taking this course have taken algebra?” Dan Pratt responded that student-level data will link to teacher data, including classroom.


Mathematics Assessment: Psychometric Analyses of 2008 Field Test

Gary W. Phillips & Ying Jin

  • HSLS:09 2008 Math Field Test Results

  • Psychometric Model for Items

  • Estimation of Item Parameters

  • Psychometric Model for Examinees

  • Estimation of Examinee Parameters

  • Statistical Flags Used to Identify Items for Additional Content Review in the 2008 Field Test of the HSLS:09

  • Number of Flagged Items

  • Summary of Classical Test Theory Statistics

  • Summary of Item Response Theory Statistics

  • Descriptive Statistics for Theta

  • Distribution for Theta in Grade 9 and Grade 12

  • Distribution for Theta in Grades 9 and 12 Superimposed on Distribution of Item Difficulty

  • General Psychometric Considerations for Full Scale Assessment

  • Psychometric Characteristics of Stage-1 and Stage-2 for Grade 9 and Grade 11

  • Assigning Students to the 2nd Stage for Grade 9 and 11

Comments and Questions

  • Theta distribution: there should not be too little overlap between the theta distribution and item difficulty. There was some discussion of this distribution both at the high end (needing discrimination at the top) and at the low end (to be able to show gain and avoid a floor effect).

  • Another issue broached was how the 12 router questions are being chosen and whether 12 is the optimal number.

  • Discussion of router and high/medium/low questions on tests. Gary Phillips says there are two decisions to be made: 1) Which math items should go on router and which should be placed on each level (high/medium/low) of the second-stage of the assessment; and 2) How will cut points (based on patterns of response on the router) be determined for assignment of different levels of the second-stage test?
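The second decision above, assigning each student to a second-stage form based on router performance, can be sketched in a few lines. This is an illustrative sketch only: the number-correct scoring and the cut points (`low_cut`, `high_cut`) are hypothetical stand-ins for the IRT-based cut points that would actually be derived from the field-test analyses.

```python
def assign_second_stage(router_responses, low_cut=4, high_cut=9):
    """Assign a student to a second-stage test form from router item scores.

    router_responses: list of 0/1 item scores on the ~12 router items.
    low_cut / high_cut: hypothetical number-correct cut points; the
    actual cuts would come from the psychometric (IRT) field-test results.
    """
    number_correct = sum(router_responses)
    if number_correct < low_cut:
        return "low"
    elif number_correct < high_cut:
        return "medium"
    else:
        return "high"

# Example: 7 of 12 router items correct routes to the medium form.
print(assign_second_stage([1, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 0]))  # prints "medium"
```

In practice the cut points would be chosen so that each second-stage form's item difficulties overlap well with the theta range of the students routed to it, per the overlap discussion above.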

HSLS:09 Math Test Update

Steve Leinwand

  • Results from the HSLS:09 Math Content Field Test

    • Background review

    • Who took the field test?

    • What did the test takers do?

    • How did the items work?

    • What does the main study pool look like and what is needed for the main study?

Comments and Questions

  • A question was asked about how instructions are provided to students. Directions ask students to “do your best to answer all,” though there is no advice on guessing. Students click to skip questions but can go back to answer.

  • Question asked whether the sub-scales had a good distribution, and the answer was yes.



Student Questionnaire

Steven Ingels

  • Ingels reviewed the purpose and research questions for the student questionnaire.

  • Three tasks for the group break-outs: identify indispensable items first; identify lower-priority items that could be taken out; and identify “keepers” that need to be fixed owing to flaws.

  • Average completion times were good on the parent and teacher questionnaires in the field test (approximately 30 minutes), while the counselor questionnaire came in well under 30 minutes and the administrator questionnaire was somewhat over. The situation with the student questionnaire was different, however, in that (a) unlike the other questionnaires, it is a strictly timed administration; and (b) by design, specifically through rotation of sections, more items were tested in the field test than can be administered in the main study. The 95th percentile respondent would be projected to take 60 minutes to complete, so there is a strong need to cut at least 25 minutes from the student instrument.


Results of Break-out sessions on student questionnaires / Recommendations for Main Study


  • Locating information—suggestion was to ask only for an email and primary telephone number as opposed to three separate contact numbers.

  • Item 31 regarding household. Suggestions: reduce the household roster; reduce the stem; perhaps verify through the parent questionnaire (but what if you don’t get that information—impute?). One concern is that you won’t obtain family size—could it be obtained from the parent questionnaire?

  • Item 35 on language. Concern over fluency versus frequency. The question doesn’t really get at whether students/parents are fluent in English. Could you ask parents? In addition, there is the extra burden of asking what type of language is spoken.

  • Item 40. One group suggested looking at frequencies in field test results to reduce sets. Another group questioned the use of “church groups” as opposed to “religious.”

  • Item 41. One group recommended deleting the entire item. Others suggested making response options “yes/no” to save time, but then it would be an untested item. Another group was concerned that the items only referred to science and not math. Another suggestion was to combine the “never” and “rarely” response options. Watching TV shows was suggested to be too receptive as opposed to proactive. Finally, one suggestion was to ask how much time per week a student spent on these activities, à la NELS, which could serve as a more accurate time variable.

  • Item 42. Pre-Algebra missing from options.

  • Item 43. Grade inflation possible but would be nice to have this type of data now as opposed to having to wait until HS transcripts are submitted.

  • Item 46. One group suggested deleting outright.

  • Item 49. One group suggested deleting outright; another suggested deleting option B since statements A and B were highly correlated. Delete statement C to improve reliability.

  • Item 50. Groups A and B suggested deleting entire item. Another option is to delete statements C and D.

  • Item 51. One group believed that this item is redundant. Otherwise, delete C to improve reliability.

  • Item 52 and 53. One group suggested that these two items were strong candidates for removal, while another argued for them to be kept.

  • Item 54. Why is Pre-Algebra not included? Keep item and fix titles.

  • Item 55. Delete or collapse categories.

  • Item 56. Delete or collapse, though one group argued to keep it.

  • Item 57. One group suggested collapsing sub-items to two (one positive, one negative), though another group suggested leaving alone.

  • Item 58. Drop.

  • Item 59. Change “The information” in sub-item A to “What we learn.” Possibly delete sub-items D and E, though one group wanted to keep it.

  • Item 60. One group suggested dropping, another keeping.

  • Item 61. Drop.

  • Item 63. Cut 1/3 of sub-items (same for science item that follows in questionnaire). Change “boys and girls” to “males and females.”

  • Item 65. Delete B and C (or follow decision from Item 49).

  • Item 66. Delete C and D (or follow decision from Item 50).

  • Remaining Science items. Follow lead from math item decisions.

  • Item 80. One group suggested dropping while another group deemed it okay.

  • Item 81. Add period to sub-item B. Change “class” to “grade” in C. Delete D? Overall, rework question.

  • Item 82. Change sub-item B to “a means of taking notes.” Can you incorporate skipping class into this matrix?

  • Item 83. Check correlation before elimination.

  • Item 89. Change “closest friend” to “close friends”—it was a nice concept in 1988 and 1992, but the assumption may have changed in this newer communication environment with the changing nature of friendship. There was some argument surrounding this question, but no decision was made.

  • Item 90. Rephrase “too much time” to “a lot of time”? Does “that I enjoy” need to remain in sub-item A?

  • Item 91. Change “boys and girls” to “males and females.” One group suggested dropping entire question, though others suggested changing “reading and writing” to academic disciplines (i.e., English and Social Science).

  • Item 93. Changing the item to refer to waking hours was offered as a possibility, though some of these categories are not mutually exclusive and so cannot add up to 100%.

  • Item 95. One group suggested dropping. Another suggested changing “The information” to “What.”

  • Item 96. Options are to drop entirely or to change to “how many years of math before you finish high school?”

  • Item 99. Drop or collapse.

  • Item 100. One group suggested dropping.

  • Item 101. One group suggested dropping.

  • Item 106. One group suggested dropping, but what about aspirations, etc.? “Were you discouraged from taking courses?” would be another option.

  • Items 109 and 110. Some suggested collapsing these into a “post-high school plan” item.

  • Item 114. Change to Associate’s and Bachelor’s degrees. Also split out “attend” vs. “complete.”

  • Item 119. Some were concerned with the lack of science and math items. But then others claimed you would have to re-test new questions.




January 29, 2009

Parent, Administrator, Teacher, and Counselor Questionnaires

  • Field Test results and instructions for breakout groups

  • TRP recommendations:


Recommendations for Parent Questionnaire:


  • Discussion of where household information might be better gleaned: from parents or from students. RTI expects a coverage rate in the mid-80s for parents. Marilyn Seastrom is not in favor of deleting the item from the student survey. The questions in the parent survey may not really be getting at household composition and may not satisfactorily address the issue of stability (though this is somewhat addressed in Items 7 and 8). One suggestion was to translate Student Questionnaire Item 31 to the perspective of the adult.

  • Item 7. Change.

  • Item 8. Consider changing to “mark all that apply” with an adjusted stem.

  • Item 16. Consider adding additional Hispanic categories (e.g., Puerto Rican), as in ELS; if changes are made in this questionnaire, make them consistent with the student questionnaire.

  • Would it be beneficial to ask country or region of origin if not born in the United States?

  • Changes made to Language section of Student questionnaire should be repeated in Parent questionnaire.

  • Subtlety of usage/frequency versus fluency. Can you really disentangle them without increasing burden? Examine a little more. Options to look at include the NHES Parental Involvement Survey, the Adult Literacy Survey, and the Department of Education’s shorthand list of questions that try to get at ESL need (i.e., “how well do you…”, though these may be too subjective). Item 32 comes closer than Item 31 to getting at whether language is a barrier to parental involvement.

  • Item 35. Change to Associate’s and Bachelor’s, and include started and completed sub-items. Make these changes consistent among all questionnaires.

  • Item 36. Discussion regarding amplifying this item to include bachelor’s levels of education, and of adding a question (perhaps to replace this one): “Do any of your completed degrees focus or major in math, natural/life sciences, or engineering?” Possibly limit this to bachelor’s degrees or above to reduce burden. The American Community Survey has a good example.

  • Item 37. Make consistent Associate and BA/BS changes.

  • One group suggested asking a question about which school parent attended to try to get at relationship between parents and student decision-making, access and choice.

  • Add: item asking parents about how much they think public college costs (like NHES); tie to financial question.

  • Add a focus question after Item 38.

  • Item 39. One group suggested adding an occupation/industry section. This might be helpful in the long run even if it cannot be coded now.

  • One group suggested asking two questions about college preparation (i.e., a) what are the parents’ sources of information, and b) how do they plan on paying for it?). Discussion revolved around whether to ask in the 9th- or 12th-grade surveys; some argued that asking now might be helpful as it is indicative of planning (like the NHES module).

  • Item 44/45. Ask for “current or most recent job”. Also ask about industry (NELS model).

  • Item 54. Change to NELS wording: “starting with grade 1” (as opposed to “when first entered school”).

  • The item order doesn’t seem right from Item 56 to end.

  • Item 58. Drop, since parents might not differentiate between pre-algebra and regular algebra.

  • Item 60. Reading and writing are problematic, as they are not disciplines.

  • Item 63. Verify federal disability categories to see if they have been updated. Add two questions: 1) has your child ever had an IEP, and 2) does your child currently have an IEP?

  • Item 66. Some believe this is a very crude indicator.

  • Item 68. Might be useful to see if parents have already had a child at this school.

  • Item 70. One group suggested attempting to delve further into influence levels, perhaps by asking “how much influence do you think?” only if the parent answers “sometimes” or “often.” Try to measure the probability attached to expectations. Change “grows up” to “finishes school/education.”

  • Item 71. Consider changing to a 10 point scale.

  • Item 79. Change as in Student questionnaire. Change “girls and boys” to “females and males,” and “reading and writing” to “English and social studies.”

  • Item 80. A lot of discussion revolved around this item. If measuring aspirations, will parents really mark that they want their child to start but not complete? One option would be to change the stem to include “If there were no barriers, including financial, etc.,” as in the NLS:72 survey. There was discussion of including certificates as an option, as they are often STEM-related and increasingly common (though this item hasn’t been tested yet). One suggestion was to add drop-downs of what is important to take, attached only to some of the options, to measure which courses parents think are necessary for their aspirations. Change the past verb tense (e.g., “graduated” to “graduate”).

  • Item 81. Perhaps change to a more probabilistic measure (e.g., first, “do you think your child will…,” then “how likely is your child to…”).

  • Item 82. Change to Associate’s and Bachelor’s.

  • Ask parents if they were notified of or asked to sign an education plan for their children.

  • Item 83. May want to ask if parent knows the process or the courses that the student will need.

  • Item 84. Change from “think the following subjects are” to “think each of the following are.” Add “or career” after educational. One group discussed substituting course names instead (e.g., algebra, calculus, science, etc.). If this item is intended to investigate how informed parents are, then add an item after #80 asking them what is important for their children to take to reach their goals. Another suggestion was to include an unsure/don’t know category.


Recommendations for School Administrator Questionnaire:


  • Item 3. One group commented on the number of non-responses and suggested thinking of ways to collapse sub-items or create online branching. Another suggestion was to preload the question. A third suggestion was to delete “public school of choice” because it is not an institutional distinction (that is, how does it differ from charter or magnet schools?). Perhaps this option needs a little more clarification.

  • Items 13 and 14. Move to Counselor questionnaire.

  • Item 15. Suggestion to add an additional item in this vein investigating programs that encourage involvement in math and science. Also, should this item address general topics instead of only math and science (to reduce burden)?

  • Items 15, 16, and 18. Align.

  • Item 18. Need for some coordination with Counselor questionnaire.

  • Item 29. First ask total, and then ask only for math and science; collect full and part time for each. Be careful with wording.

  • Items 32 and 33. One suggestion was to combine for both math and science teachers.

  • Items 34–37. One suggestion was to eliminate “full-time.” Also, move adjacent to Item 29.

  • Items 38 and 39. Delete.

  • Items 40, 41, and 43. Move to the Counselor questionnaire. Also, it was suggested to eliminate “Offered to 9th graders.” Another suggestion was to collapse options but provide a list; the list should be aligned with the student questionnaire list. Add trigonometry.

  • Item 43. Change wording from “graduation” to “regular diploma as of this year.” Add Trigonometry as a sub-item. One suggestion was to split into two measurements: offered versus required.

  • Items 44 and 45. Align to Teacher questionnaire.

  • Item 46. Some still don’t like the use of the word “abilities.”

  • Item 47. Possibly delete.

  • Items 46 and 47. Stems are difficult to understand and suggest rethinking their inclusion.

  • Possibly add item investigating whether school is in school reform status.

  • Items 51 and 52. Delete or trim.

  • Item 52. Rework, possibly by making parallel to SSOCS items. Consider adding student absenteeism and staff absenteeism.

  • Item 65. Drop or change (some schools might have more than one administrator working on specific tasks).


Recommendations for Science Teacher Questionnaire (same changes apply to Math Teacher Questionnaire):


  • Item 17. It is not clear whether teachers would remember specific numbers, so suggest changing to “have you taken…” and adding a roster of degrees, colleges, and majors for teachers. Suggest looking at the NSF/Iris question for guidance.

  • Item 18. Try to find out if they took more than just general science courses.

  • Item 19. Add “science” before “teaching certificate.”

  • Item 22. Suggest using codebook wording.

  • Item 28. Cut approximately half of sub-items based on correlations.

  • Item 32. Don’t make it sum to 100%; let analysts re-standardize.

  • Item 34. Eliminate passive voice in stem. Also delete some sub-items based on correlations.

  • Item 35. Delete.

  • Item 37. Clean up based on correlations. Capitalize Language Learners in sub-item G.

  • Item 38. Change to “male and female” and make consistent with similar questions in other questionnaires (i.e., reading/writing vs. math/science).

  • Item 41. Break into two questions, reduce prompt time.


Recommendations for Counselor Questionnaire:


  • Item 4. Add “only one counselor in school” and “small learning community or school within school.”

  • Item 5. Add “typically” to the question stem, “What percentage of students typically meet…”

  • Item 6. Drop or rework. If rework, add “most” in front of students in stem. Add “Participation in career preparation;” “Participation in career and technical education preparation.” One group recommended changing this from “yes/no” to “how often” or “what proportion of students”.

  • Item 7. Add “Dean” as option because of number of write-ins. Delete “Other”/close question.

  • Item 8. Change “plan” to “any of the following.” Discussion ensued over difference between this item and college preparation plans. One suggestion is to change to small set of items (e.g., school preparation plans for graduation, preparation plans for community college, preparation plans for 4 year college, none are required). Link to parent questionnaire, ask if parents are required to 1) be notified of plan and 2) sign off on plan (to measure policy versus practice).

  • Item 9. Delete.

  • Item 10 and 11. Delete.

  • Item 11. Add option to Item 11 saying “None.” In Item 11.a change to “information to groups of 8th grade students.” Suggest splitting out students and parents in 11.a. Change 11.b from “Assisting individual 8th grade students” to “Meeting 8th grade students one-on-one”. Delete 11.d.

  • Item 12. Present only to schools with more than one counselor and move it forward in questionnaire.

  • Items 13 and 14. Work on closing questions. Add programs that try to get underrepresented students into math and science (e.g., MESAS, etc.). Add financial aid, skills assessments, and career skills from write-ins.

  • Item 15. Split into “at this school” and “offsite” OR change wording to “available to your students” OR delete item entirely.

  • Item 16. Delete “activities” from stem. Add career academy as option? Consider changing “activities” to “job skills” in the stem. Add voc-tech as an option.

  • Item 17. Delete “activities” from stem. Change order of sub-items based on frequencies.

  • Item 19. Add Math enrichment experiences such as Math Olympiad and math teams. One comment concerned being wary of excluding options that were open to all students, not just high achievers, and suggested broadening beyond math and science and beyond high achieving students.

  • Item 21. Fix wording and close question.

  • Item 22. Add “for credit” after “take science…courses.” Change “not offered by” to “outside” your school. Change “on-line courses” to include “and/or distance learning.”

  • Item 23. Move to follow Item 11 (section on transitions). Also, add an option for schools that do not have a choice, and add a standardized test option (state, district). One group felt this didn’t provide enough information about how placement happens and that a scale should be used to measure how important each type of influence is.

  • Item 25. Delete.

  • Move course offering items to this section from administrator questionnaire. Consider asking what percentage of last year’s graduates have taken which courses, limited to high end.

  • Item 27. Delete other option/close question.

  • Item 28. Add screener item. Change stem to include exit exam. Change awkward wording in stem.

  • Item 29. Move up to follow student planning section.

  • Item 33. Revisit wording: ask for the total regardless of grade level and then for the high school grade levels. Change 33.b to say “for any high school grade.”



