High School Longitudinal Study of 2009 (HSLS:09) Panel Maintenance 2018 & 2021



Supporting Statement

Part B






OMB# 1850-0852 v.29









National Center for Education Statistics

U.S. Department of Education





June 2021



Table of Contents


B. Collection of Information Employing Statistical Methods
B.1 Target Universe and Sampling Frames
B.2 Statistical Procedures for Collecting Information
B.3 Methods for Maximizing Response Rates
B.4 Tests of Procedures and Methods
B.5 Reviewing Statisticians and Individuals Responsible for Study Design



B. Collection of Information Employing Statistical Methods

Panel maintenance activities were approved through December 2018 (OMB # 1850-0852 v.17-28). This request updates the contact materials and adjusts plans to conduct panel maintenance with both students and their parents.

This section describes the target universe for the High School Longitudinal Study of 2009 (HSLS:09) and the sampling and statistical methodologies used for the second follow-up main study. This section also introduces the statisticians and other technical staff responsible for design and administration of the study.

B.1 Target Universe and Sampling Frames

B.1.a Student Universe

The base-year target population for HSLS:09 consisted of 9th grade students in the fall of 2009 in public and private schools that include 9th and 11th grades. The target population for the second follow-up was this same 9th grade cohort in 2016. The target population for the postsecondary transcripts and student financial aid records collections was the subset of the 9th grade cohort sample members who had postsecondary education enrollment as of 2016.

B.1.b Institution Universe

The high school universe consisted of public and private schools in the United States with both a 9th and an 11th grade as of the fall of 2009.

B.2 Statistical Procedures for Collecting Information

The HSLS:09 panel maintenance collection includes information from students and their parents.

B.2.a Student Sample

The students sampled for the 2009 base-year data collection were recruited to participate in the second follow-up main study's 2016 data collection, excluding deceased sample members, those who elected to withdraw from HSLS:09, and those who did not respond in either the base year or the first follow-up. The main study sample included 25,184 non-deceased sample members. Of these, 23,316 were fielded; the remaining cases were identified as final refusals or as sample members who did not respond in either the base year or the first follow-up (i.e., double nonrespondents) and were classified as study nonrespondents.

B.3 Methods for Maximizing Response Rates

B.3.a Locating

The response rate for the HSLS:09 data collection is a function of success in two basic activities: locating the sample members and gaining their cooperation. Many factors will affect the ability to successfully locate and survey sample members for HSLS:09. Among them are the availability, completeness, and accuracy of the locating data collected in the prior interviews. The locator database includes critical tracing information for nearly all sample members and their parents, including address information for their previous residences, telephone numbers, and e-mail addresses. This database allows interviewers and tracers to have ready access to all of the contact information available and to new leads developed through locating efforts. To achieve the desired locating and response rates, a multistage locating approach that capitalizes on available data for the HSLS:09 sample will be employed. The proposed locating approach includes the following activities:

  1. Panel maintenance to keep sample members' contact information up to date.1

  2. Advance tracing, including batch database searches and contact information updates.

  3. Prompting sample members through mail and e-mail contacts to maintain regular contact and encourage them to complete the survey.

  4. Other locating activities as needed, drawing on additional tracing resources (e.g., matches to Department of Education financial aid data sources) that are not part of the previous stages.

The methods chosen to locate sample members are based on the successful approaches employed in earlier rounds of this study as well as experience gained from other recent NCES longitudinal cohort studies. The tracing approach is designed to locate the maximum number of sample members at the least expense. The most cost-effective steps will be taken first to minimize the number of cases requiring more costly intensive tracing efforts. Panel maintenance contact materials are provided in Appendix D.

B.4 Tests of Procedures and Methods

The design of the HSLS:09 main study, in particular its use of responsive design, expanded on data collection experiments designed for a series of NCES studies, including the HSLS:09 second follow-up field test and the 2013 Update. In a responsive design, nonresponding sample members predicted to be most likely to contribute to nonresponse bias in estimates of interest are identified at multiple points during data collection and are targeted with changes in protocol intended to increase their participation and reduce nonresponse bias. The design of the HSLS:09 second follow-up main study built on what was learned in HSLS:09 and other NCES studies, most recently BPS:12/14. The results of the main study responsive designs were provided to OMB in the change requests for the second follow-up between May and December 2016 (OMB # 1850-0852 v.19-24).

Evaluation of the responsive design approach.

Three elements were evaluated in the HSLS:09 second follow-up responsive design study: (1) whether sample cases that would contribute to sample representativeness could be identified at the beginning of the third and subsequent data collection phases, (2) whether the interventions used during each phase of the data collection design were effective in increasing participation, and (3) whether increasing response rates among the targeted cases would improve sample representativeness. These three aspects of the responsive design and its implementation for the HSLS:09 second follow-up were examined as follows:

  1. Evaluate the bias likelihood model used to identify targeted cases. To assess whether the bias likelihood model successfully identifies nonresponding cases that are underrepresented on key survey variables, estimates within the categories of each model variable for respondents were compared with those of nonrespondents at each phase. This comparison highlighted the model variables that exhibited bias at each phase and the relative size of the imbalance that remained to be reduced through the intervention.

  2. Evaluate the effectiveness of each intervention in increasing survey participation. The second key component of this responsive design was the effectiveness of the targeted treatments in increasing participation. Experiments conducted with the calibration samples allowed us to assess the efficacy of the various treatments.

  3. Evaluate the ability to increase sample representativeness by identifying cases for targeted treatment. We measured sample representativeness by comparing estimates on key variables for respondents and nonrespondents at each phase of data collection and at the end of data collection (an illustrative form of this comparison is sketched after this list). We then assessed whether sample representativeness improved over the course of data collection through the use of the targeted interventions for cases identified with the bias likelihood model.
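
As a minimal illustration of the respondent-nonrespondent comparison described in item 3 (a standard textbook decomposition, not a formula taken from the HSLS:09 documentation, and ignoring survey weights), the potential nonresponse bias in a respondent mean for a key variable y can be written as

\[
\mathrm{bias}(\bar{y}_r) \;=\; \bar{y}_r - \bar{y} \;=\; \frac{n_{nr}}{n}\left(\bar{y}_r - \bar{y}_{nr}\right),
\]

where \(\bar{y}_r\) and \(\bar{y}_{nr}\) are the means for respondents and nonrespondents, \(n_r\) and \(n_{nr}\) are the corresponding counts, and \(n = n_r + n_{nr}\). Under this decomposition, converting targeted nonrespondents into respondents can reduce both the relative size of the remaining nonrespondent group and the respondent-nonrespondent difference, and therefore the bias term.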

The evaluation of the HSLS:09 second follow-up responsive design approach is provided in Appendix C.

B.5 Reviewing Statisticians and Individuals Responsible for Study Design

The following statisticians at NCES are responsible for the statistical aspects of the study: Dr. Elise Christopher, Dr. Sean Simone, Dr. Chris Chapman, Dr. Marilyn Seastrom, Dr. Tracy Hunt-White, Dr. David Richards, and Mr. Ted Socha. The following RTI staff work on the statistical aspects of the study design: Mr. Daniel Pratt and Dr. David Wilson. The following RTI staff led other HSLS:09 activities: Ms. Colleen Spagnardi, Ms. Debbie Herget, Ms. Laura Fritch, Mr. Saju Joshua, Mr. Jim Rogers, Mr. Ethan Ritchie, and Ms. Jacquie Goeking.




1 2018 panel maintenance activities for the main study were approved as part of OMB #1850-0852 v.17-21 and have been completed. Approval to conduct the subsequent round of panel maintenance in 2021 is being requested as part of this submission.
