ED Response to OMB Comments


High School Longitudinal Study of 2009 (HSLS:09)


OMB: 1850-0852

  1. Is NCES still planning to provide incentives to the few schools in the district that raised issues during the field test?


The school district was contacted several months ago to inform it of the change in school- and student-level incentives for the main study. RTI explained that the field test experiment did not show better response rates with a school-level incentive, and for that reason a school-level incentive cannot be provided for the main study. The district board discussed the change at a meeting on May 11, which coincided with our annual renewal for the study. The board responded with a recommendation that the field test school-level incentive be retained. No consequence was mentioned if the incentive is not offered, but our sense, after speaking with someone in the research review department, is that the board expects schools to be reluctant to participate without it. On the other hand, our field test experience demonstrated that schools in this district were reluctant to participate even with the monetary incentive provided in the field test; indeed, just one of three schools participated. RTI requested a personalized approach to recruitment in which a member of the project management team would meet in person with the principals of sampled schools in this district to present the merits of the study and solicit cooperation. The district board will revisit its participation in HSLS:09 and our responses to its questions on June 9, and a letter with the district’s decision is anticipated approximately two weeks later. Schools in this district have not yet been contacted while RTI awaits the district’s response.


  2. In all letters, please clarify that “30 students” is “per school”. Also, please include a standard confidentiality assurance in each. If you are unsure what we are looking for, please consult Marilyn Seastrom.


Changes were made to the letters and brochures with input from Marilyn Seastrom. The confidentiality assurance in these documents currently reads as follows:


All responses will be protected from disclosure; no parents, students, school officials, teachers or staff will see your answers and no individually identifying data will be reported. Data collected are used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Public Law 107-279, Section 183).


Appendix A, with the aforementioned changes and the clarification of the number of students per school, is included with this submission.


  3. Please clarify what length of clearance is required to complete the full-scale study.


We request clearance through December 2010, which will provide sufficient time to complete main study data collection and the base-year reports.


  4. Are the adaptive assessments and questionnaires compliant with Section 508 of the Americans with Disabilities Act?


The assessment and questionnaires have been designed to comply with Section 508 of the Americans with Disabilities Act. Screen readers may be used to complete the questionnaires. Our assessment is developed with the ability to use a screen reader if necessary, though assessment items with graphs or visual equations are not compatible with a screen reader. Provisions are made to allow school-provided assistance in circumstances where such an accommodation is needed for student participation. In addition, a computer-assisted telephone interview (CATI) option is available for all parent and staff respondents who require assistance.


  5. As we requested for ECLS-K:11, please use the correct racial categories when discussing oversampling plans. We are not asking NCES to change its plans or to produce separate estimates of “Asians” and “Native Hawaiians and Other Pacific Islanders” if sample sizes do not support it. However, we would like to see the correct categories used in descriptive language.


All references to Asian/Pacific Islanders have been corrected to reflect only Asians, per the sample design. At the school sampling level, the information on the frame does not allow for separate estimates in the sampling design: only counts of “Asian/Pacific Islander” students are available for the HSLS:09 sampling design from NCES’s Common Core of Data (CCD) and Private School Universe Survey (PSS). At the respondent level, study participants are asked to identify their racial category according to the OMB-approved guidelines, which distinguish “Asians” from “Native Hawaiians or Other Pacific Islanders.” This is true both in the student list collection, in which school staff are asked to list “Asians” and “Native Hawaiians or Other Pacific Islanders” separately, and on each of the questionnaires.


  6. In SS A1 where state augmentation is explained, please clarify that the NSF funding was “to sample additional cases” in order to produce state-level estimates for 10 states. Please also clarify that the administrative records component is a separate feature (ideally with a different name from the additional sampling, as we would like to be able to refer to them separately).


NSF provided funding to perform two functions: (1) to sample additional schools to produce state-representative samples and (2) to collect state administrative records from each of 10 states. From the Memorandum of Understanding (MOU):


The state representative samples that NCES proposes NSF sponsor not only will support comparative analyses between States and the Nation but also will combine data from State administrative databases (e.g., students’ test scores, math and science courses, student records prior to entering 9th grade) with rich, longitudinal survey data for sampled students in those States.


This supplement will provide the government with an opportunity to demonstrate how States can merge administrative data collected by the States with rich survey data on secondary and postsecondary math and science collected as part of a national longitudinal study.


However, technically, NSF is funding the addition of approximately 150 schools to the sample to produce state representative samples for 10 states, as also seen in the MOU:


NSF has agreed to fund this supplement for $1,000,000 per year for the next five years. These funds will support the addition of 150 schools to the HSLS:09 sample. These 150 schools will be dispersed across several States.


The addition of the schools to produce state-representative samples in 10 states may be labeled the State Supplement to HSLS:09, and the merging of administrative record data for cases in these states may be termed the State Data Record Augmentation (SDRA). The supplement and the merge are separate processes being pursued through different pathways. The State Supplement can exist independently of the State Data Record Augmentation; in fact, the State Supplement is already complete in terms of both sampling a sufficient number of schools in each supplemented state and recruiting these schools to participate. The State Data Record Augmentation requires the trickier, more time-consuming process of developing Memoranda of Understanding with individual states and, if and where successful, merging state record data to the HSLS:09 cases after data collection in schools has ceased.


  7. In SS A2 (page 8), the last paragraph before “a.” appears to be out of date. Please update or clarify what would be submitted later. Similarly, page 10 just before “Justification.”


The paragraph referenced on page 8 does not apply to the full-scale submission and should not have been included in the document. The sentence on page 10 referenced the draft field test questionnaire in error; it should refer instead to the main study questionnaire submitted with the supporting statement. An edited version of SS A2 is attached to this submission.


  8. Please provide any additional detail possible on the finding that “several school districts and schools indicated that providing school-specific results would be a compelling option to secure their participation.” For example, how frequently did this arise? What types of results did they seek?


Due to the nature of the field test, we were unable to provide school-specific results. There were 8 districts and 5 schools whose participation hinged on the receipt of school-specific results; these districts and schools participated but explicitly requested a copy of the First Look report, which will be available in late 2010. RTI has secured the participation of 790 schools for the main study to date. At least 57 schools explicitly suggested that school-specific results would satisfy the need for a tangible benefit of participation in HSLS:09. RTI’s recruitment team has informed these schools that we are requesting OMB approval to provide this information.


The plan for providing these results to school principals is as follows: a minimum of 20 or 25 student participants—reflecting an unweighted student participation rate of at least 85 percent[3]—would be required in order for the school to receive a report. The exact minimum criterion of 20 or 25 students will be approved by Marilyn Seastrom, Chief Statistician at NCES, based on the results of a simulation study conducted with the NELS:88 or ELS:2002 data. The participation rate requirement would motivate schools to encourage student participation, while requiring 20 or more students would avoid disclosure risk. The results would be presented weighted and with confidence intervals for the estimates (e.g., box-and-whisker plots). The report would indicate whether the average of the school’s students taking the math assessment is below, at, or above the national average, along with a set of caveats for interpreting the results.
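As an illustrative sketch only (not project code), the proposed report-eligibility rule could be expressed as follows; the function name is hypothetical, and the default minimum of 20 is an assumption pending the simulation study described above:

```python
def eligible_for_report(eligible_sampled: int, participants: int,
                        min_participants: int = 20) -> bool:
    """Hypothetical check of the proposed rule: a school receives a
    school-specific report only if the number of participating students
    meets the minimum (20 or 25, to be settled by the simulation study)
    AND the unweighted participation rate is at least 85 percent."""
    if eligible_sampled <= 0:
        return False
    rate = participants / eligible_sampled
    return participants >= min_participants and rate >= 0.85
```

Under this rule, a school with 21 of 24 eligible students participating (87.5 percent) would qualify, while a school with 20 of 25 (80 percent) would not.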


In addition to supplying an overall national estimate against which the school could compare itself, we would also like to be able to compare the school-level estimate to several subnational norms (again, based on three levels: below, at, or above the norm) reflecting the four regions of the country, the three school sectors, and the four school locales. Specifically, schools could compare their school-level results to the norm for similar schools – for example, their school versus other schools in the West, their school versus other public schools, or their school versus other suburban schools.


It is appealing to provide school-level estimates that would be representative of all 9th graders in the given school, which would require a minimum number of students per school to participate. Thus, the report should help both in convincing schools to participate in the study and in attaining a strong student response rate. The appeal of returning test scores to schools is three-fold: it is simple, corresponds to the most probable area of the school’s interest, and carries with it almost no confidentiality risk.


  9. Who will decide on the exact statistic(s) to provide to schools among the options laid out? Has Marilyn Seastrom signed off on this proposal?


NCES (Laura LoGerfo, Jeff Owings, and Marilyn Seastrom) will decide what statistic(s) to provide to schools, in keeping with the plan indicated in the response to the previous question. The proposed model for providing school-specific reports described in the previous question’s answer is consistent with reports to be provided to schools for PISA and has been approved by Marilyn Seastrom.


  10. If all textbook information has been dropped from the teacher survey, thereby reducing burden on all teachers, why is the initial incentive level required?


The time burden for teacher participation in the main study is consistent with the field test time allocation for a teacher reporting on one subject or textbook. Teachers will spend about 30 minutes, on average, on the main study teacher survey. In the field test, teachers also spent 30 minutes, on average, if reporting on a single subject or textbook. Teachers are facing a high burden in the coming school year with budget cuts across the country in addition to current pressures related to high-stakes test accountability and other responsibilities. The HSLS:09 teacher survey is an added burden for teachers that requires completion on their own time. Given the voluntary nature of the study and the added demands on the teachers, we feel that $25 is a modest and reasonable amount to ensure a satisfactory response rate and to compensate teachers for their time and effort on the 30-minute survey. 


  11. Isn’t some of the School Coordinator function being shifted to the IT Coordinator? If so, please consider reducing the amount of the former’s incentive.


In the majority of field test schools, an IT person at the school spent considerable time and effort to ensure that the Sojourn CD (the means to administer the survey and assessment via computer) was operational in the school and that the computers were set up in time for the students’ arrival on test day. This proved to be a critical role in the field test: ensuring the compatibility of the equipment with the Sojourn bootable CD prior to test day was crucial to the success of the in-school sessions. For the main study, each school’s IT coordinator is asked to test the Sojourn CD in the school, report on the success or failure of that test, work with RTI’s programmers to determine whether a solution is possible if Sojourn fails the initial test, reconfigure the school’s computers to allow booting from the CD-ROM drive when applicable, and assist with setting up the computers on test day to ensure the session begins on time.


The school coordinator executes many tasks in the school that completely differ from what the IT coordinator does. These tasks include helping to facilitate and arrange interactions between RTI and the IT coordinator, assembling student lists, compiling parent contacting information and teacher/course linkages to sampled students, notifying school personnel of HSLS:09, distributing and tracking parental consent forms, gathering students to participate, following up with paperwork, coordinating a location for the conduct of HSLS:09, and prompting staff for the completion of outstanding questionnaires.


In these ways, the roles of the IT coordinator and of the school coordinator are distinctly separate. The IT coordinator’s role was uncompensated in the field test, but due to the heavy burden, and the time and effort invested to ensure the success of HSLS:09, we request that the IT coordinator receive a small honorarium in recognition of their important contributions. This IT coordinator honorarium should be in addition to the honorarium for the school coordinator, as their obligations to HSLS:09 do not overlap but are fundamental to the successful implementation of HSLS:09.


  12. In SS A10, please provide the affiliation of Neil Russell (i.e., we know who he is, but this is a public document).


Neil Russell’s title, Institute of Education Sciences Disclosure Review Board Chair, has been added to the master document.


  13. In SS A12, please clarify whether all sample size increases are due to the “state augmentation.”


The sample size increase is due solely to the “state augmentation” portion of the design. The original design required 800 schools: 600 public schools, 100 Catholic schools, and 100 other private schools. The state augmentation required an additional 144 participating schools, for a total of 944. The total sample of 9th-grade students given in Table 2 is thus calculated as 944 schools × an average of 25 students per school = 23,600 students. This also increases the sample of school administrators and school counselors from 800 to 944 each, and the parent sample from 21,800 to 23,600.
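The arithmetic above can be recomputed in a few lines (illustrative only; the variable names are shorthand for the categories named in the text):

```python
# Recompute the school and student sample sizes described above.
base_schools = 600 + 100 + 100          # public + Catholic + other private = 800
augmented_schools = base_schools + 144  # state augmentation adds 144 schools
students_per_school = 25                # average students sampled per school
total_students = augmented_schools * students_per_school

print(augmented_schools)  # 944
print(total_students)     # 23600
```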


  14. When will the field test report be shared with OMB? It would be our preference not to clear HSLS without reviewing it.


RTI is revising the Field Test Report, the first draft of which Laura LoGerfo and Jeff Owings reviewed two weeks ago. NCES will provide the Field Test Report to OMB after it has been reviewed and revised. The most relevant information for HSLS:09 full-scale field operations has been shared with OMB.


  15. In Appendix H, please clarify the language that says “States will be asked to provide as much information as they have for all of the 9th graders in selected schools.” We understand from later text that NCES does not mean this literally, and of course, that would be rather privacy insensitive. The same edit is needed in Appendix J.


States will be asked to provide data on seven variables: (1) course titles; (2) course grades; (3) entry/exit codes; (4) retention (yes/no); (5) test scores; (6) attendance records; and (7) a data dictionary explaining how the State defines the data elements it holds. Despite our efforts at simplicity, this is a complex process that will vary considerably by state. States will be asked to provide as much of the requested information as they have available for all of the 9th graders in selected schools. This means that the State knows which schools, not which students, are participating in HSLS:09, thereby maintaining student anonymity. To facilitate the linkage between the student and the State data, each State will be asked to identify one or more variables in common at the state and school levels, such as a state-level ID number, to be included on the student list collected from the school.



16. Appendix H indicates that “…each State will be asked to identify one or more variables in common at the state and school levels… to be included on the student list collected from the school.”


  a. What is the status of negotiations with the 10 states?


a) Tennessee – The HSLS:09 Project Officer sent them a letter that outlined what we need and under what conditions. Tennessee’s officials met May 14th and approved proceeding with the merge. Their lawyers need resolutions to a few questions that the Tennessee representative, Irma Jones, is sending to Laura LoGerfo the week of June 8th.

b) Pennsylvania – The HSLS:09 Project Officer sent their CSSO a letter asking for permission to continue talking with their data staff. The HSLS:09 Project Officer has not received a stop order that would prevent her from talking with the Pennsylvania data staff. A draft MOU is currently in progress to send to Pennsylvania as a means to continue the conversation.

c) Texas – Texas is a new SLDS grantee, and one of the SLDS grant program officers, Tate Gould, is building a relationship with the contact there before asking to link HSLS:09 data.

d) North Carolina – The HSLS:09 Project Officer owes the representative at MIS a phone call to elicit further details.

e) Georgia – The HSLS:09 Project Officer sent a letter to their CSSO similar to the letter sent to the Pennsylvania CSSO. Currently, a draft MOU is in progress to send to Georgia as a means to continue the conversation.

f) Michigan – The MOU is in the initial stages and will be sent to Michigan once a draft MOU is completed to elicit Michigan’s comments.

g) Ohio – The MOU is in the initial stages and will be sent to Ohio once a draft MOU is completed to elicit Ohio’s comments.

h) California – The HSLS:09 Project Officer met with the California data representative last month and will be sending the draft MOU as the next stage in the conversation.

i) Washington State – The application is essentially a letter outlining a data sharing agreement, purposes for the data, privacy/confidentiality, etc. Completing this application is underway. Lisa Ireland is the Washington State data representative.

j) Florida – The research application that Florida provided does not permit the distribution of data in any form. As such, NCES is working with Florida to develop a more tailored approach that meets both institutions’ goals.


  b. Will these variables be identified in time to obtain them during the regular data collection from schools, or might a second “call” be required? If the latter, what can NCES do to reduce the risk of having to collect the student rosters a second time with an additional variable?


Schools will be asked to provide an ID for each student that is common to both the school and the state, to facilitate the matching process. To the extent possible, a single contact will suffice for the list collection. In situations in which the school does not provide an ID used by both the school and the state, a second contact may be required to accurately match the state-provided records to the school-provided data.
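As a rough sketch of the intended matching step (the field names and IDs below are hypothetical, not the actual HSLS:09 file layout), records sharing a common state ID can be linked directly, while unmatched cases are the ones that would trigger a second contact:

```python
# School-provided roster and state-provided records, keyed by a shared state ID.
school_roster = [
    {"state_id": "TN001", "hsls_case": "C-1"},
    {"state_id": "TN002", "hsls_case": "C-2"},
]
state_records = [
    {"state_id": "TN001", "course": "Algebra I", "grade": "B"},
    {"state_id": "TN003", "course": "Math 8", "grade": "A"},  # not a sampled student
]

# Index the state records by the shared ID, then join the roster against it.
by_id = {rec["state_id"]: rec for rec in state_records}
merged = [{**row, **by_id[row["state_id"]]}
          for row in school_roster if row["state_id"] in by_id]
unmatched = [row for row in school_roster if row["state_id"] not in by_id]
# `unmatched` cases (here C-2) are those that might require a second contact.
```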


  c. Please clarify in the last paragraph of the appendix that the “merge” will not be conducted by the State, but by NCES’s contractor.


Parent consent forms provided to schools in each of the 10 states will be customized to inform parents of the specific information that will be provided by the state on behalf of their teenager. Student data collection in the schools is scheduled to end in mid-December 2009. In early January 2010, NCES will share agreed-upon information about participating schools with the States to commence the state-level administrative records collection. States will be asked to upload the requested data to the NCES secure server by the end of February 2010. After quality control checks have been performed on the data, NCES’s contractor, RTI International, will merge the HSLS:09 and State data for the augmentation states. After delivery of a master HSLS data file, State data will be returned or destroyed (per agreement with the particular state).


  17. We are quite concerned about the apparent abandonment of plans to obtain 8th grade transcript information. What quick tests or follow-up activities could OMB approve for NCES to better understand sources for this poor information and whether state data is really better or different?


The field test experience demonstrated that the request for 8th grade transcript information from the students’ 9th grade schools was a massive effort for the schools and yielded data of little value and utility. Thirty-three of the 41 participating schools provided 8th grade records, but only 18 of those passed quality control checks. There was little consistency among the courses and grades provided, or in what each course actually covered in terms of content and rigor. A review of the lists showed that most schools had their own idiosyncrasies contributing to the inconsistency of the data overall. A few examples:

    • Some schools provided meaningful course names such as Pre-Algebra or Algebra I, while most provided courses called “math”, “math 8”, “science”, or “general”. The course titles had no context and did not describe the content covered.

    • The provision of grades was inconsistent. Some schools provided letter grades, some with and some without pluses and minuses. Other schools provided numeric grades. Two schools reported letter grades for some courses and numeric grades for others within the same school.

    • Three schools provided grades but no course names.

To present meaningful data, both courses and grades need to be coded, matched, and compared.


We also found that access to students’ 8th grade records was variable by state, and sometimes by school within state. For example, Illinois schools had to search each student’s 8th grade file individually. Schools in Texas did not have a consistent way of gathering the requested 8th grade information. In one school in New York, a school contact spent hours digging through boxes of hardcopy data to locate the 8th grade records. Our SA keyed the data from the hardcopy records. At least three other schools declined to provide 8th grade records because all they had was hard-copy records in boxes in a storage area or basement.


  18. What is the source of the language questions proposed for the parent questionnaire?


The source of the language questions is the NELS:88 Base Year Parent interview questions 22A-C, 27 and 28. 


  19. What is the source of the disability questions proposed for the parent questionnaire? We are concerned that use of these questions is measuring “condition” (in the vernacular) rather than “functioning (in each of three domains – physical, cognition and psychological),” with the former often being misunderstood to be a measure of the prevalence of disability. There has been a great deal of work in this area across the Federal statistical system and internationally in the past few years. Please explore the feasibility of using a measure of “functioning” separately from “conditions.” Please consider the American Community Survey disability question as well as the questions on the National Survey of Children’s Health at NCHS (contact for latter is Stephen Blumberg or Patricia Pastor).


Based on a conversation with OMB on June 30, it was agreed that the word “disability” will be replaced with the more appropriate word “condition” in the parent survey. On July 2, Laura LoGerfo spoke with Stephen Blumberg and Patricia Pastor at the National Center for Health Statistics to determine which condition items from the National Survey of Children’s Health might be most appropriate and educationally relevant to include in the HSLS:09 parent survey. The NCHS survey contains many questions that are highly relevant to a health context but perhaps less relevant to an educational context and to the HSLS:09 target age group. Thus, guidance from the NCHS experts was sought.


The NCHS experts indicated that the federal government is shifting from a focus on conditions to an emphasis on functioning, and they suggested explicit changes to the wording of the question stem and item responses to meet these shifting priorities. The following items are what LoGerfo, Blumberg, and Pastor discussed as best meeting the twin needs of consistency across agencies and educational relevance. The first set of items should replace the NHES disability items that currently are suggested for the HSLS:09 main study. The second set of items is an addition to represent students’ internalizing and externalizing behaviors that may not warrant official diagnoses or special education services but may affect students’ home and school experiences. The listing of HSLS Base Year Parent Interview Modifications is provided below, followed by the questions that were added.


Questions added after March 31st OMB submission, Appendix D

BPNICKNM – Collects student’s nickname to set more conversational tone in interview

BPSTDEG1 – Collects information that was previously collected in BPEDUP1

BPSTRDEG – Collects information that was previously collected in BPEDUP2

BPSPECED – Added from the National Center for Health Statistics, which was contacted by the HSLS:09 Project Officer at OMB’s request

BPADDMED – Added from the National Center for Health Statistics, which was contacted by the HSLS:09 Project Officer at OMB’s request

BPDIFF – Added from the National Center for Health Statistics, which was contacted by the HSLS:09 Project Officer at OMB’s request


Questions deleted after March 31st OMB submission, Appendix D

BP_ADD10 – Replaced by condition questions added from the National Center for Health Statistics, which was contacted by the HSLS:09 Project Officer at OMB’s request

BPIEP – Replaced by condition questions added from the National Center for Health Statistics, which was contacted by the HSLS:09 Project Officer at OMB’s request


Full text of the questions added after March 31st OMB submission, Appendix D


Has a doctor, health care provider, teacher, or school official ever told you that your 9th grader has any of the following conditions? (Yes/No for each)

    • Specific learning disability

    • Any developmental delay that affects (his/her) ability to learn

    • Autism, Asperger's Disorder, pervasive developmental disorder, or other autism spectrum disorder

    • Hearing problems or vision problems that cannot be corrected with glasses or contact lenses

    • Bone, joint, or muscle problems

    • Intellectual disability or mental retardation

    • Attention Deficit Disorder or Attention Deficit Hyperactivity Disorder, that is, ADD or ADHD


Does [student] currently receive Special Educational Services? Children receiving these services often have an Individualized Education Plan.

1=Yes

2=No

3=Don’t know


Is [student] currently taking medication for ADD or ADHD?

1=Yes

2=No



Compared to other 9th graders, would you say [he/she] experiences a lot, a little, or no difficulty…

    • Learning, understanding, or paying attention?

    • Speaking, communicating, or being understood?

    • With feeling anxious or depressed?

    • With behavior problems, such as acting out, fighting, bullying, or arguing?

    • Making and keeping friends?



  20. Is the definition of “STEM” that HSLS:09 plans to employ consistent with that used by SRS/NSF in its surveys (e.g., includes the social sciences)? If uncertain, please check with Tom Weko or Mary Frase, who have been working to harmonize NCES post-secondary surveys with SRS’s surveys.


The definition of STEM in postsecondary disciplines—as employed in conjunction with the Classification of Instructional Programs (CIP) and the Standard Occupational Classification (SOC) system—will be used. The CIP will eventually be used in the creation of derived variables, and in descriptive reporting and analysis based on those derived variables, from secondary and postsecondary transcripts. SOC coding will be applied to parent responses about their occupations and to school staff responses about their college majors. Similarly, students will report what job or occupation they expect or plan to have at age 30, and these responses will be coded according to SOC guidelines as well.


HSLS will follow the STEM definition proposed by the Associate Commissioner of the NCES Postsecondary Studies Division, Tom Weko, which uses all of the same series as the Office of Postsecondary Education with the exception of Series 16, which includes foreign languages. It departs somewhat from the definitions used in the NSF surveys in that it does not include Family and Consumer Sciences, Architecture, the Social Sciences, or Public Policy and Administration. It closely resembles the STEM Designated Programs defined by ICE, except that it does not include Agriculture, Architecture, or the Multi-Disciplinary Sciences.


  21. Did NCES consider asking students directly whether they have an IEP? We would like to be able to learn whether students with IEPs have different expectations than students without IEPs.


NCES did consider asking students directly whether they had an IEP, and decided instead to collect this information from the school as part of the student list collection and from the parent in the parent questionnaire. Official IEP information will be collected from the school, and parents were deemed the best informants about students’ IEPs. In addition, HSLS will collect data from parents of students who were excluded from the study due to a physical or cognitive disability or because of an English Language Learner designation (fewer than 3 years of English instruction and deemed incapable of completing the assessment and questionnaire). For these students, IEP data will still be collected from the parent; had the information been collected only from the student, no IEP information would be available for them.

[3] While it is true that 20 participants out of 25 sampled is indeed an unweighted 80 percent, under the original specifications we were sampling a sufficient number of students to obtain, on average, 25 student participants per school. Based on the ELS simulation analyses, we hope to determine whether a participant count as low as 20 is sufficient for stable survey estimates, i.e., a sample of at least 24 with an 85% response rate. Stated more precisely, at least 24 eligible students would be selected (ineligibles are eliminated from the response rate calculation) and at least 20 would respond, for an 85 percent response rate.

File type: application/msword
Author: Laura F. LoGerfo
Last modified by: Administrator
File modified: 2009-07-15
File created: 2009-07-15
