High School Longitudinal Study of 2009 (HSLS:09) Panel Maintenance 2018 & 2021
Supporting Statement
Part B
OMB# 1850-0852 v.28
National Center for Education Statistics
U.S. Department of Education
June 2018
Table of Contents
B. Collection of Information Employing Statistical Methods
B.1 Target Universe and Sampling Frames
B.2 Statistical Procedures for Collecting Information
B.3 Methods for Maximizing Response Rates
B.4 Tests of Procedures and Methods
B.5 Reviewing Statisticians and Individuals Responsible for Study Design
B. Collection of Information Employing Statistical Methods

Panel maintenance activities were approved through December 2018 (OMB# 1850-0852 v.17-27). This request is to extend these activities through November 2021.
This section describes the target universe for the High School Longitudinal Study of 2009 (HSLS:09) and the sampling and statistical methodologies used for the second follow-up main study. This section also introduces the statisticians and other technical staff responsible for design and administration of the study.
B.1 Target Universe and Sampling Frames

The base-year target population for HSLS:09 consisted of students enrolled in the 9th grade in the fall of 2009 in public and private schools that include both 9th and 11th grades. The target population for the second follow-up was this same 9th-grade cohort in 2016. The target population for the postsecondary transcript and student financial aid records collections was the subset of 9th-grade cohort sample members who had enrolled in postsecondary education as of 2016.
The high school universe consisted of public and private schools in the United States with a 9th and 11th grade as of the fall of 2009.
B.2 Statistical Procedures for Collecting Information

The HSLS:09 panel maintenance collection includes information from students and their parents.
The second follow-up main study's 2016 data collection recruited the same students who were sampled for the 2009 base-year data collection, excluding deceased sample members, those who elected to withdraw from HSLS:09, and those who responded in neither the base year nor the first follow-up. The main study sample included 25,184 non-deceased sample members. Of these, 23,316 were fielded; the remaining cases were identified as final refusals or as sample members who responded in neither the base year nor the first follow-up (i.e., double nonrespondents) and were classified as study nonrespondents.
B.3 Methods for Maximizing Response Rates

The response rate for the HSLS:09 data collection is a function of success in two basic activities: locating the sample members and gaining their cooperation. Many factors affect the ability to locate and survey sample members for HSLS:09, among them the availability, completeness, and accuracy of the locating data collected in prior interviews. The locator database includes critical tracing information for nearly all sample members and their parents, including addresses of previous residences, telephone numbers, and e-mail addresses. This database gives interviewers and tracers ready access to all available contact information and to new leads developed through locating efforts. To achieve the desired locating and response rates, a multistage locating approach that capitalizes on available data for the HSLS:09 sample will be employed. The proposed locating approach includes the following activities:
Panel maintenance to keep contact information for sample members up to date.1
Advance tracing includes batch database searches, contact information updates, and intensive tracing that will be conducted prior to the start of data collection.
Prompting sample members through mail and e-mail contacts will maintain regular contact with them and encourage them to complete the survey.
Telephone locating and interviewing includes calling all available telephone numbers and following up on leads provided by parents and other contacts. Interviewers will take full advantage of the contacting information available for parents and other contacts (for this cohort, parent contact information is often more reliable than sample member contact information).
Pre-intensive batch tracing consists of the Premium Phone searches that will be conducted between the telephone locating and interviewing stage and the intensive tracing stage.
Intensive tracing consists of tracers checking all available telephone numbers and conducting credit bureau database searches once current telephone numbers have been exhausted.
Other locating activities will take place as needed and include additional tracing resources (e.g., matches to Department of Education financial aid data sources) that are not part of the previous stages.
The methods chosen to locate sample members are based on the successful approaches employed in earlier rounds of this study as well as experience gained from other recent NCES longitudinal cohort studies. The tracing approach is designed to locate the maximum number of sample members with the least expense. The most cost-effective steps will be taken first so as to minimize the number of cases requiring more costly intensive tracing efforts. Panel maintenance contacting materials are provided in Appendix D.
B.4 Tests of Procedures and Methods

The design of the HSLS:09 main study expanded on data collection experiments designed for a series of NCES studies, including the HSLS:09 second follow-up field test and the 2013 Update. In particular, the main study used a responsive design: an approach in which nonresponding sample members predicted to be most likely to contribute to nonresponse bias in different estimates of interest are identified at multiple points during data collection and targeted with changes in protocol to increase their participation and reduce nonresponse bias. The design of the second follow-up main study built upon what was learned in HSLS:09 and other NCES studies, most recently BPS:12/14. The results of the main study responsive designs were provided to OMB in the change requests for the second follow-up between May and December 2016 (OMB# 1850-0852 v.19-24).
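To make the case-targeting idea concrete, the sketch below illustrates one common way such targeting can be implemented: a response-propensity model fit with logistic regression, with the lowest-propensity nonrespondents flagged for a protocol change in the next phase. This is a simplified illustration only; the variable names, simulated data, targeting threshold, and model specification are assumptions for the example and do not reproduce the actual HSLS:09 bias likelihood models.

# Illustrative sketch of propensity-based case targeting for a responsive design.
# Data, variable names, and the 25 percent threshold are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
n = 1000

# Hypothetical frame and prior-round variables available for every sample member,
# plus response status partway through the current data collection (simulated here).
cases = pd.DataFrame({
    "prior_round_respondent": rng.integers(0, 2, n),
    "private_school": rng.integers(0, 2, n),
    "ever_dropout": rng.integers(0, 2, n),
    "responded": rng.integers(0, 2, n),
})

predictors = ["prior_round_respondent", "private_school", "ever_dropout"]

# Fit a logistic model of response on variables known for respondents and nonrespondents alike.
model = LogisticRegression().fit(cases[predictors], cases["responded"])
cases["p_respond"] = model.predict_proba(cases[predictors])[:, 1]

# Flag the quartile of current nonrespondents with the lowest predicted propensity
# as candidates for a targeted protocol change in the next phase.
nonrespondents = cases[cases["responded"] == 0]
cutoff = nonrespondents["p_respond"].quantile(0.25)
targeted = nonrespondents[nonrespondents["p_respond"] <= cutoff]
print(f"{len(targeted)} of {len(nonrespondents)} nonrespondents flagged for targeted intervention")

In the study itself, as described above, cases were selected for their likely contribution to nonresponse bias in estimates of interest rather than on response propensity alone.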
Evaluation of responsive design approach.
Three questions were evaluated in the HSLS:09 second follow-up responsive design study: (1) whether sample cases that would contribute to sample representativeness could be identified at the beginning of the third and subsequent data collection phases, (2) whether the interventions used during each phase of the data collection design were effective in increasing participation, and (3) whether increasing response rates among the targeted cases would improve sample representativeness. These three aspects of the responsive design and its implementation for the HSLS:09 second follow-up were examined as follows:
Evaluate the bias likelihood model used to identify targeted cases. To assess whether the bias likelihood model successfully identified nonresponding cases that were underrepresented on key survey variables, estimates within the categories of each model variable for respondents were compared with those for nonrespondents at each phase. This comparison highlighted the model variables that exhibited bias at each phase and the relative size of the imbalance that remained to be reduced through the intervention.
Evaluate the effectiveness of each intervention in increasing survey participation. The second key component of this responsive design was the effectiveness of the targeted treatments in increasing participation. Experiments conducted with the calibration samples allowed us to assess the efficacy of the various treatments.
Evaluate the ability to increase sample representativeness by identifying cases for targeted treatment. We measured sample representativeness by comparing estimates on key variables for respondents and nonrespondents at each phase of data collection and at the end of data collection. We then assessed whether sample representativeness improved over the course of data collection through the use of targeted interventions for cases identified with the bias likelihood model.
The evaluation of the HSLS:09 second follow-up responsive design approach is provided in Appendix C.
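As a rough illustration of the representativeness comparisons described in the evaluation steps above, the sketch below compares the distribution of a frame variable for respondents and nonrespondents at a given phase and reports the per-category imbalance; categories with large absolute differences indicate where imbalance remains to be reduced through targeted interventions. The data, variable names, and the single illustrative variable are assumptions for the example, not the HSLS:09 evaluation files or code.

# Simplified sketch of the respondent/nonrespondent comparison used to gauge
# sample representativeness. Data and variable names are hypothetical.
import pandas as pd

# Hypothetical phase-level case file: one row per fielded sample member.
cases = pd.DataFrame({
    "school_sector": ["public", "public", "public", "public", "public",
                      "private", "private", "private"],
    "responded_phase3": [1, 1, 1, 0, 0, 1, 0, 0],
})

def category_imbalance(df, frame_var, resp_flag):
    """Compare category shares of a frame variable for respondents vs. nonrespondents."""
    shares = (
        df.groupby(resp_flag)[frame_var]
        .value_counts(normalize=True)
        .unstack(resp_flag)
        .rename(columns={0: "nonrespondents", 1: "respondents"})
    )
    shares["abs_difference"] = (shares["respondents"] - shares["nonrespondents"]).abs()
    return shares

# Large absolute differences flag categories on which the responding sample is unbalanced.
print(category_imbalance(cases, "school_sector", "responded_phase3"))

In the actual evaluation, such comparisons would be made across all model variables and repeated at each phase and at the end of data collection, as described above.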
B.5 Reviewing Statisticians and Individuals Responsible for Study Design

The following statisticians at NCES are responsible for the statistical aspects of the study: Dr. Elise Christopher, Dr. Sean Simone, Dr. Chris Chapman, Dr. Marilyn Seastrom, Dr. Tracy Hunt-White, Dr. David Richards, and Mr. Ted Socha. The following RTI staff work on the statistical aspects of the study design: Mr. Daniel Pratt, Ms. Melissa Cominole, Dr. David Wilson, Dr. Steven Ingels, Dr. Emilia Peytcheva, Dr. Andy Peytchev, and Dr. Jeff Rosen. The following RTI staff led other HSLS:09 activities: Ms. Laura Fritch, Mr. Saju Joshua, Ms. Tiffany Mattox, Dr. Alexandria Radford, Mr. Jim Rogers, and Ms. Jamie Wescott.
1 2018 panel maintenance activities for the main study were approved as part of OMB# 1850-0852 v.17-21 and are underway. Approval to conduct the subsequent round of panel maintenance in 2021 is being requested as part of this submission.