HSLS 2009 2nd Follow-up MS 2nd Incentive Boost Change Memo


High School Longitudinal Study of 2009 (HSLS:09) Second Follow-up Main Study and 2018 Panel Maintenance


OMB: 1850-0852


Memorandum

United States Department of Education

Institute of Education Sciences

National Center for Education Statistics


DATE: July 21, 2016


TO: Robert Sivinski, OMB


THROUGH: Kashka Kubzdela, OMB Liaison, NCES


FROM: Elise Christopher, HSLS:09 Project Officer, NCES


SUBJECT: High School Longitudinal Study of 2009 (HSLS:09) Second Follow-up Main Study Second Incentive Boost Change Request (OMB# 1850-0852 v.22) - Calibration Sample Results: Phase 4 and revised institution contacting materials


The High School Longitudinal Study of 2009 (HSLS:09) Second Follow-up Main Study and 2018 Panel Maintenance request was approved by OMB in December 2015 (OMB# 1850-0852 v.17), with updates in March, May, and June 2016 (OMB# 1850-0852 v.18-21). This submission provides results from the fourth phase of the calibration sample experiment, requests approval of the recommended second incentive boost amounts for phase 4 of the main sample data collection, and requests approval to revise institution contacting materials that will be used in the upcoming student records and transcript collections. No change to the estimated response burden or the total cost to the federal government is associated with this request.

  1. Phase 4 incentive boost amount

A change request was submitted and approved in June to establish the incentive amount for the first of two “incentive boosts” for the main sample. This memorandum provides data collection results from the fourth phase of the calibration sample (the second incentive boost experiment) and requests approval of the recommended incentive boost plans for the main sample.

For reference, an excerpt from Part B of the second follow-up main study submission that describes main study responsive design plans is presented in Attachment 1 below. In the HSLS:09 second follow-up main study, there are three subgroups of special interest.

  1. Subgroup A (high school late/alternative/non-completers) is the subset of sample members who, as of the 2013 Update, had not completed high school, were still enrolled in high school, received an alternative credential, completed high school late, or experienced a dropout episode with unknown completion status.

  2. Subgroup B (ultra-cooperative respondents) includes sample members who participated in the base year, first follow-up, and 2013 Update without an incentive offer. These cases were also early web respondents in the 2013 Update and, by definition, are high school completers.

  3. Subgroup C (high school completers and unknown high school completion status) includes cases that, as of the 2013 Update, were known to be on-time or early regular diploma completers (and not identified as ultra-cooperative) and cases with unknown high school completion status who were not previously identified as ever having a dropout episode.

To determine optimal incentive amounts, a calibration subsample has been selected from each of the aforementioned subgroups to begin data collection ahead of the main sample. The experimental subsamples are fielded about six weeks prior to the main sample to allow time to analyze the results and consult with OMB to determine the baseline incentive amounts to be implemented for each subgroup in the main sample.

Calibration sample results from phase 4 (boost 2). Phase 4 of the calibration study introduced a second incentive boost that was offered to a subset of pending nonrespondents in addition to the baseline amount, and first boost when applicable, offered in the prior phases. The purpose of this memo is to present the results from phase 4 for the calibration sample and to recommend an incentive boost amount for each subgroup to be implemented in the main sample.

  • Subgroup A (High School Late/Alternative/Non-Completers). Among all remaining nonrespondents, cases were randomized to a boost 2 incentive amount of $10 or $20.

  • Subgroup B (Ultra-Cooperative Respondents). Among the remaining nonrespondents, cases that were identified for targeting by the bias-likelihood model were randomized to a boost 2 incentive amount of $10 or $20. Non-targeted cases were assigned a boost 2 incentive of $0.

  • Subgroup C (All Other High School Completers and Unknown Cases). Among the remaining nonrespondents, cases that were identified for targeting by the bias-likelihood model were randomized to a boost 2 incentive amount of $10 or $20. Non-targeted cases were assigned a boost 2 incentive of $0.

Subgroup A (High School Late/Alternative/Non-Completers). Exhibit 1 displays response rates during phase 4 by incentive boost level. One-way chi-square tests were used to perform pairwise contrasts between the boost amounts offered to all remaining nonrespondents in the Subgroup A calibration sample. No significant difference was detected between the response rates of sample members who were offered the $10 (5.3 percent) and $20 (5.8 percent) boost 2 incentive (χ2 (1, N = 308) = .04, p = .85). Our recommendation, therefore, is that a boost of $10 be offered to all cases in the Subgroup A main sample.


Exhibit 1. Subgroup A response rates in phase 4, by boost 2 incentive amount

Boost 2   Sample members (n)   Boost 2 response: yes   Boost 2 response: no   Boost 2 response rate (%)
$10       152                  8                       144                    5.3
$20       156                  9                       147                    5.8
Total     308                  17                      291                    5.5

NOTE: Excludes partially completed cases.

SOURCE: U.S. Department of Education, National Center for Education Statistics. High School Longitudinal Study of 2009 (HSLS:09) Second Follow-up Main Study.


Subgroup B (Ultra-Cooperative Respondents). Exhibit 2 displays Subgroup B response rates during phase 4 by incentive boost level for those targeted by the bias-likelihood model for intervention. Note that most of the ultra-cooperative sample members had responded in prior phases, leaving only 38 remaining cases for phase 4. Of these 38 cases, 13 were targeted for an incentive intervention in phase 4. Given the small number of targeted cases within Subgroup B, statistical analysis of the boost 2 incentive was not conducted. However, we recommend that the $10 boost 2 incentive be offered to Subgroup B main sample cases that are targeted by the bias-likelihood model for intervention.

Separate from the proposed $10 boost 2 incentive for targeted cases, we recommend that a $10 first-time incentive be offered to main sample Subgroup B cases that were neither targeted for the boost 1 incentive nor for the boost 2 incentive. Given that these cases heretofore have received no financial incentives across baseline and boost stages, we recommend this intervention as an effort to induce participation among these particular Subgroup B nonrespondents, such that all pending nonrespondent cases will now receive some monetary incentive offer.


Exhibit 2. Subgroup B response rates in phase 4, by boost 2 incentive amount

Boost 2   Sample members (n)   Boost 2 response: yes   Boost 2 response: no   Boost 2 response rate (%)
$10       7                    1                       6                      14.3
$20       6                    1                       5                      16.7
Total     13                   2                       11                     15.4

NOTE: Excludes partially completed cases.
SOURCE: U.S. Department of Education, National Center for Education Statistics. High School Longitudinal Study of 2009 (HSLS:09) Second Follow-up Main Study.


Subgroup C (All Other High School Completers and Unknown Cases). Exhibit 3 displays Subgroup C response rates during phase 4 by incentive level, among the 573 cases selected for an incentive boost based on the bias-likelihood model. (Another 420 Subgroup C cases were not targeted for a phase 4 incentive boost.) No significant difference was detected between the phase 4 response rates of sample members offered the $10 (6.0 percent) and $20 (6.6 percent) boost 2 (χ2 (1, N = 573) = 0.10, p = .76). As such, we recommend that a boost 2 of $10 be offered to all cases in the Subgroup C main sample that are targeted for intervention by the bias-likelihood model.


Exhibit 3. Subgroup C response rates in phase 4, by boost 2 incentive amount

Boost 2   Sample members (n)   Boost 2 response: yes   Boost 2 response: no   Boost 2 response rate (%)
$10       285                  17                      268                    6.0
$20       288                  19                      269                    6.6
Total     573                  36                      537                    6.3

NOTE: Excludes partially completed cases.
SOURCE: U.S. Department of Education, National Center for Education Statistics. High School Longitudinal Study of 2009 (HSLS:09) Second Follow-up Main Study.
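The reported contrasts for Subgroups A and C can be reproduced from the exhibit counts with a Pearson chi-square test (df = 1, no continuity correction) on the 2×2 boost-amount-by-response table. The sketch below is an independent re-computation using only the Python standard library; the memo does not specify the analysis software actually used.

```python
import math

def pearson_chi2_2x2(yes_a, no_a, yes_b, no_b):
    """Pearson chi-square statistic (df = 1, no continuity correction)
    and p-value for a 2x2 table of boost amount by response outcome."""
    n = yes_a + no_a + yes_b + no_b
    rows = (yes_a + no_a, yes_b + no_b)
    cols = (yes_a + yes_b, no_a + no_b)
    observed = (yes_a, no_a, yes_b, no_b)
    expected = (rows[0] * cols[0] / n, rows[0] * cols[1] / n,
                rows[1] * cols[0] / n, rows[1] * cols[1] / n)
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p = math.erfc(math.sqrt(chi2 / 2))  # chi-square survival function for df = 1
    return chi2, p

# Subgroup A (Exhibit 1): $10 boost, 8 of 152 responded; $20 boost, 9 of 156
print(pearson_chi2_2x2(8, 144, 9, 147))    # chi2 ~ 0.04, p ~ .85, as reported
# Subgroup C (Exhibit 3): $10 boost, 17 of 285 responded; $20 boost, 19 of 288
print(pearson_chi2_2x2(17, 268, 19, 269))  # chi2 ~ 0.10, p ~ .76, as reported
```

Either contrast confirms that the $10 and $20 boost 2 response rates are statistically indistinguishable at conventional levels, which is the basis for recommending the $10 amount.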


  2. Revisions to institution contacting materials

This submission requests approval for changes to the transcript data request and student records request contacting materials that were approved in March 2016 (Appendixes F and G). The changes are a result of the decision to conduct HSLS:09 transcript and student records collections concurrently with the BPS:12 student records collection and BPS:12 pilot test transcript collection. The list of revised contacting materials is presented in Attachment 2 in this document, and the revised Appendixes F and G document is also included with this request. Please note that materials that are used by both studies are labeled with “Joint Study Collection;” these materials are also being submitted in an appendix of the BPS:12/17 OMB package.

Because several NCES studies with postsecondary collections (e.g., HSLS:09, BPS, NPSAS) will be requesting transcripts and student records moving forward, NCES intends to streamline requests to postsecondary institutional records staff by minimizing the number of contacts each institution receives, thereby decreasing burden. The goal is to combine requests across as many studies with ongoing transcript and/or student records collections as possible. In 2017, the HSLS:09 second follow-up and BPS:12/17 will both have ongoing transcript and student records collections, so NCES has combined the communication materials for the two studies. Relevant contacts (e.g., Institutional Records [IR] staff) will receive one letter informing them of an upcoming request for transcripts and student records, rather than two. Following this initial contact, as HSLS:09 and BPS collect and code transcripts, additional postsecondary institutions attended by sample members may be identified, increasing the set of institutions to contact. In such cases, NCES will combine as many of the newly discovered cases as possible into as few requests as possible. For example, if a sample member did not respond to the second follow-up but the postsecondary institution they attended was known as of 2013, the 2013 institution’s transcript may reveal attendance at another institution between 2013 and 2016, which would then be added to the contact set. Rather than immediately sending the new institution’s IR staff a transcript request for a single student, the institutional contacting staff will hold the request until additional new cases have been identified, and will place a combined request later in the collection window covering multiple students at once. Because adding individual students to one request is simpler than initiating multiple new requests, this approach reduces burden on institutional records staff.
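The hold-and-batch rule described above amounts to grouping pending transcript requests by institution before any letters go out. A minimal illustrative sketch (the institution and student identifiers are invented for the example; this is not the project's actual contacting system):

```python
from collections import defaultdict

# Hypothetical queue of newly discovered transcript needs, accumulated over the
# collection window rather than mailed one at a time
pending = [
    ("Institution X", "student_001"),
    ("Institution Y", "student_002"),
    ("Institution X", "student_003"),
]

# Group by institution so each IR office receives a single combined request
batched = defaultdict(list)
for institution, student in pending:
    batched[institution].append(student)

for institution, students in sorted(batched.items()):
    print(f"{institution}: one request covering {len(students)} student(s)")
```

Three pending cases collapse into two institutional contacts; each additional student at an already-queued institution adds no new contact, which is the burden reduction the memo describes.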

NCES thanks OMB for considering these changes. Phase 4 of data collection for the main sample is scheduled to begin on August 1, 2016.

Attachment 1 – Description of main study responsive design plans (excerpt from Supporting Statement Part B approved on 12/15/2015)


B.4.d Main Study Plans

NCES and RTI are working closely together to design a data collection approach that draws on evaluations of prior interventions used to improve sample representativeness, that is, to ensure that the responding sample is as similar as possible to the total sample. In previous rounds of HSLS:09 and in other NCES studies (such as BPS:12/14, B&B:08/12, and ELS:2002 third follow-up), responsive designs have been used to improve sample representativeness in key survey variables. The proposed main study data collection plan has been designed to maximize data quality through a responsive design approach in which variance between the responding sample and the overall sample is estimated at several points during data collection. An advantage of the proposed responsive design is that it allows us to determine, during data collection, how representative the responding sample is of the total sample, so that we can focus efforts and resources on bringing in the cases that are most needed to achieve balance in the responding sample.

Plans for the HSLS:09 second follow-up main study are based upon 1) results of incentive experiments and responsive design modeling simulations from the HSLS:09 second follow-up field test, 2) results from related longitudinal studies, and 3) prior experience with the HSLS:09 cohort. This section describes plans for responsive design in the main study data collection. In particular, there are three subgroups of interest that will be handled differently. This section describes the phases of data collection and how and when interventions will be implemented and evaluated. Finally, we discuss the development of the response likelihood and bias likelihood models that will be used to identify cases for targeted treatments.

Sample subgroup classification. In the HSLS:09 second follow-up main study, there will be three subgroups of special interest.

  1. Subgroup 1 (high school late/alternative/non-completers) will be the subset of sample members who, as of the 2013 Update, had not completed high school, were still enrolled in high school, received an alternative credential, completed high school late, or experienced a dropout episode with unknown completion status.

  2. Subgroup 2 (ultra-cooperative respondents) includes sample members who participated in the base year, first follow-up, and 2013 Update without an incentive offer. These cases were also early web respondents in the 2013 Update and, by definition, are high school completers.

  3. Subgroup 3 (high school completers and unknown high school completion status) will include cases that, as of the 2013 Update, were known to be on-time or early regular diploma completers (and not identified as ultra-cooperative) and cases with unknown high school completion status who were not previously identified as ever having a dropout episode.

Calibration subsamples. To determine optimal incentive amounts, a calibration subsample will be selected from each of the aforementioned subgroups to begin data collection ahead of the main sample. A similar approach was used successfully in BPS:12/14, where approximately 10 percent of that sample (3,700 cases) was selected and fielded seven weeks prior to the rest of the BPS:12/14 sample. The experimental subsample was treated in advance of the remaining cases, and after analyzing the results for the experimental sample and consultation with OMB, the successful treatment was implemented with the remaining sample. In the HSLS:09 second follow-up main study, a similar approach is proposed with the HSLS:09 calibration subsamples fielded six weeks prior to the rest of the HSLS:09 sample. Exhibit B-7 shows the estimated size of each subgroup, the percentage of cases to be selected for the calibration subsample, and the estimated number of cases in the calibration sample.

Exhibit B-7. Calibration Sample Sizes, by Subgroup

Subgroup 1 (High School Late/Alternative/Non-Completers): non-completers, late completers, still enrolled, and alternative-credential holders as of the 2013 Update, as well as ever-dropouts with no completion status. Main sample: 2,545; calibration sample: 509 (20%).

Subgroup 2 (Ultra-Cooperative Respondents): high school completers who participated in the base year and the first follow-up, and completed the 2013 Update in the early web period, with no incentive. Main sample: 1,027; calibration sample: 154 (15%).

Subgroup 3 (All Other High School Completers and Unknown Cases): HS diploma completed early/on-time, or unknown completion status with no known dropout episode. Main sample: 19,747; calibration sample: 1,975 (10%).
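The calibration sample sizes in Exhibit B-7 follow directly from applying each subgroup's sampling rate to its main sample count (rounded to the nearest case); a quick arithmetic check:

```python
# Exhibit B-7 check: calibration sample size = main sample size x sampling rate
subgroups = {
    "High School Late/Alternative/Non-Completers": (2545, 0.20),
    "Ultra-Cooperative Respondents": (1027, 0.15),
    "All Other High School Completers and Unknown Cases": (19747, 0.10),
}
calibration = {name: round(n * rate) for name, (n, rate) in subgroups.items()}
print(calibration)  # expected sizes: 509, 154, and 1975
```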



Data collection phases, treatments, and evaluations. For the second follow-up main study, the data collection plan includes a phased responsive design strategy specifically aimed at improving sample representativeness in the final survey participants. Exhibit B-8 presents the schedule for the planned phases of data collection for both the calibration samples and the main samples. Exhibit B-9 summarizes the baseline and boost incentives to be tested for each subgroup. The phases will proceed as follows:

Baseline incentive (phase 1). During this beginning phase of data collection, the survey will be open exclusively for self-administered interviews via the web. Web response will remain open throughout the entire data collection. As described above, the calibration samples will allow for testing of incentive amounts on a subset of cases, and the results will inform the implementation plan for the main samples. Prior to the start of the main sample data collection for phase 1, calibration sample response rates will be evaluated. An ANOVA-based model will be used to perform pairwise contrasts between the different incentive amounts offered to the treatment and control groups in each phase. NCES and OMB will meet to review the results of the calibration experiment and determine the optimal incentive amount for each of the subgroups.

  • Subgroup 1 (high school late/alternative/non-completers) will be offered 3 different baseline incentive amounts ($30, $40, or $50). The optimal amount (to be determined in consultation with OMB) will be offered to all cases in the subgroup 1 main sample.

  • Subgroup 2 (ultra-cooperative respondents) will not be offered a baseline incentive. The subgroup 2 calibration sample response rate will be evaluated against early response rates for other cohorts (such as BPS:12/14 and ELS:2002 third follow-up) to estimate a “successful” response benchmark for HSLS:09. If it is determined that the subgroup 2 calibration sample response rate is not successful, we will discuss with OMB the possibility of offering a baseline incentive (amount to be determined in consultation with OMB) to the subgroup 2 main sample.

  • Subgroup 3 (high school completers and unknowns) will be offered 6 different incentive amounts, ranging from $15 to $40 ($15, $20, $25, $30, $35, or $40). The $15 starting point for this baseline incentive calibration experiment is based on the results of the HSLS:09 second follow-up field test experiment. The optimal amount (to be determined in consultation with OMB) will be offered to all cases in the subgroup 3 main sample.





Exhibit B-8. Data Collection Schedule and Phases.


Outbound CATI prompting (phase 2). After phase 1 data collection, which is self-administered via the web (except when sample members call in to the help desk), phase 2 will initiate another mode of data collection: telephone interviewers will begin making outbound calls to prompt for self-administration or to conduct telephone interviews. No additional incentives will be offered during phase 2.

  • Subgroup 1 will begin outbound CATI earlier than the other subgroups, to allow additional time for telephone interviewers to work these high priority cases.

Incentive boosts (phases 3 and 4). Phases 3 and 4 introduce the use of responsive design with the bias likelihood model. Targeted cases will be offered an incentive boost in addition to the baseline incentive offer. The calibration samples will allow for testing of incentive boost amounts on a subset of the remaining nonrespondents in phases 3 and 4, and the results will inform the incentive boost implementation plan for the main samples. Prior to the start of the main sample data collection for phases 3 and 4, calibration sample response rates will be evaluated. An ANOVA-based model will be used to perform pairwise contrasts between the different incentive boost amounts offered to the treatment and control groups in each phase. NCES and OMB will meet to review the results of the calibration experiment and determine the optimal incentive boost amount for each of the subgroups.

  • Subgroup 1 (high school late/alternative/non-completers) will be offered an incentive boost of either $15 or $25, on top of the baseline incentive offered in phase 1. The optimal amount, to be determined in consultation with OMB based on the calibration sample results, will be offered to all remaining nonrespondents in subgroup 1.

  • The subset of subgroup 2 (ultra-cooperative respondents) cases that are targeted for intervention, based on bias likelihood modeling, will be offered an incentive boost of either $10 or $20, and the optimal amount (to be determined in consultation with OMB) will be offered only to targeted cases among the remaining subgroup 2 nonrespondents.

  • The subset of subgroup 3 (high school completers and unknowns) cases that are targeted for intervention, based on bias likelihood modeling, will be offered an incentive boost of either $10 or $20, and the optimal amount (to be determined in consultation with OMB) will be offered only to targeted cases among the remaining subgroup 3 nonrespondents.


Exhibit B-9. Main study baseline and incentive boost experiments


High School Late/Alternative/Non-Completers
  Base incentive (all calibration sample cases): $30 / $40 / $50; cumulative $30 to $50; cases 170 / 170 / 169
  Boost 1 (all remaining calibration sample nonrespondents): $15 / $25; cumulative $45 to $75; cases 158 / 158
  Boost 2 (all remaining calibration sample nonrespondents): $10 / $20; cumulative $55 to $95; cases 102 / 102

Ultra-Cooperative Respondents
  Base incentive (all calibration sample cases): $0; cumulative $0; cases 154
  Boost 1 (for targeted cases only; combined with subsample 3): $10 / $20; cumulative $10 to $20 targeted, $0 otherwise; very few, if any, cases expected to be selected
  Boost 2 (for targeted cases only; combined with subsample 3): $10 / $20; cumulative $10 to $40 targeted, $0 to $20 otherwise; very few, if any, cases expected to be selected

High School Completers and Unknowns
  Base incentive (all calibration sample cases): $15 / $20 / $25 / $30 / $35 / $40; cumulative $15 to $40; cases 330 / 329 / 329 / 329 / 329 / 329
  Boost 1 (for targeted cases: 1/2 of nonrespondents): $10 / $20; cumulative $25 to $60 targeted, $15 to $40 otherwise; cases 250 / 250
  Boost 2 (for targeted cases: 1/2 of nonrespondents): $10 / $20; cumulative $25 to $80 targeted, $15 to $60 otherwise; cases 175 / 175
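The cumulative-incentive columns in Exhibit B-9 are interval sums of the per-phase offers; for boosts that apply only to targeted cases, the minimum contribution is $0 because targeting is re-run each phase. A quick check of a few cells:

```python
def cum_range(*phases):
    """Sum (low, high) incentive ranges across phases. A boost offered only to
    targeted cases contributes low = 0 when it is not guaranteed for a case."""
    return sum(lo for lo, _ in phases), sum(hi for _, hi in phases)

# Subgroup 1: every remaining nonrespondent receives each boost
print(cum_range((30, 50), (15, 25)))            # boost 1 cumulative: (45, 75)
print(cum_range((30, 50), (15, 25), (10, 20)))  # boost 2 cumulative: (55, 95)
# Subgroup 3, targeted at boost 2: boost 1 was not guaranteed, so its low is $0
print(cum_range((15, 40), (0, 20), (10, 20)))   # (25, 80) targeted
print(cum_range((15, 40), (0, 20), (0, 0)))     # (15, 60) if not targeted at boost 2
```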



Additional treatments for targeted cases. In addition to the monetary interventions described above, the HSLS:09 second follow-up main study design includes non-monetary treatments to be used with targeted cases.

Field interviewing (phase 5). Field interviewing will be conducted for all targeted nonrespondents at the same time; there will be no time lag between the calibration and main samples. Cases identified for targeted treatment (all high school late/alternative/non-completers, and sample members with high bias likelihood scores) will be considered for field interviewing. The decision to conduct field interviewing for a case may also be determined by other factors, such as the location of a case and its proximity to other likely field cases. Nontargeted cases may potentially be included in field interviewing if it is cost effective to do so. Conversely, given the expense of field interviewing, cases with a very low response likelihood may not be pursued.

Extended data collection (phase 6). Cases identified for targeted treatment (all high school late/alternative/non-completers, and sample members with high bias likelihood scores) will be part of an extended data collection period. During this period (the last month of data collection), only targeted cases will be actively prompted to participate. Data collection will remain open for all other cases if they choose to participate, but effort to pursue those cases will be suspended.

Model development. A critical element of any responsive design is the method used to identify cases that will receive special treatment. As described above, the primary goal of this approach is to improve sample representativeness. The bias likelihood model will help determine which cases are most needed to balance the responding sample, and the response likelihood model will help determine which cases may not be optimal for pursuing with targeted interventions so that project resources can be most effectively allocated. In this section, we describe our modeling approach and the variables to be considered for use as predictor variables for both the bias likelihood and the response likelihood models. Variables will be drawn from data obtained in prior waves of data collection with this cohort (base-year, first follow-up, and 2013 Update survey data; high school transcripts; school characteristics; sampling frame information; and paradata). The models for the HSLS:09 second follow-up main study have been developed and will be refined from models for previous rounds of HSLS:09, ELS:2002, and other NCES studies, including BPS:12/14.

Response Likelihood Model. The response likelihood model will be run only once, before data collection begins. Using data obtained in prior waves that are correlated with response outcome (primarily paradata variables), we will fit a model predicting response outcome in the 2013 Update. We will then use the coefficients associated with the significant predictors to estimate the likelihood of response in the second follow-up main study, and each sample member will be assigned a likelihood score prior to the start of data collection. Exhibit B-10 lists the universe of predictor variables that will be considered for the response likelihood model.

During data collection, the response likelihood scores will be used as a “filter” to assist in determining intervention resource allocation. For example, cases that have a very high likelihood of participation may not be offered an incentive boost, since they are likely to participate without it. The response likelihood score can also be used to exclude cases with very low likelihood from the field interviewing intervention. We will also consider using the response likelihood score to adjust the classification of cases in the subgroups. For example, cases with very high response likelihood scores could potentially be treated as “ultra-cooperative” cases. The primary objective of the response likelihood model is to provide information that will inform decisions about inclusion or exclusion of targeted cases for interventions, thereby controlling costs.

Bias Likelihood Model. The bias likelihood model will be used to identify cases that are most unlike the set of sample members that have responded. As was done in the responsive design approach for the 2013 Update, the bias likelihood model will use only key survey and frame variables as predictors to identify nonrespondents most likely to reduce bias in key survey variables if converted to respondents. To calculate bias likelihood, we will run a logistic regression with the second follow-up response outcome as the dependent variable. The bias likelihood model will be run at the beginning of phases 3, 4, 5, and 6 for the calibration samples, and at the beginning of phases 3, 4, 5, and 6 for the rest of the cases. (Modeling will be done on the combined sample [calibration cases and rest of cases] prior to phases 5 and 6.) We will then use the coefficients associated with the significant predictors to assign a bias likelihood score to each case. Because the set of respondents and nonrespondents is dynamic, the bias likelihood score for an individual case may change across the phases. The universe of candidate predictor variables has been selected due to their analytic importance for the study and is presented in Exhibit B-11.
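Mechanically, both models reduce to scoring each case with fitted logistic coefficients and comparing the score to a cutoff. The sketch below is purely illustrative: the predictor names, coefficient values, and 0.5 cutoff are invented for the example, while the actual models use the candidate variables in Exhibits B-10 and B-11 with coefficients estimated from prior-wave data, and the real threshold also weighs project resources.

```python
import math

# Hypothetical fitted coefficients -- NOT the actual HSLS:09 model estimates
coefs = {"prior_round_respondent": 1.2, "hs_completer": 0.5}
intercept = -0.8

def logistic_score(case):
    """Predicted response probability from a fitted logistic regression."""
    z = intercept + sum(coefs[k] * v for k, v in case.items())
    return 1 / (1 + math.exp(-z))

cases = [
    {"prior_round_respondent": 1, "hs_completer": 1},
    {"prior_round_respondent": 0, "hs_completer": 0},
]
scores = [logistic_score(c) for c in cases]
# Illustrative targeting rule: flag pending nonrespondents whose predicted
# probability falls below the (assumed) cutoff
targeted = [s < 0.5 for s in scores]
print([round(s, 3) for s in scores], targeted)
```

Because scores are recomputed as the respondent pool changes, the same case can move in or out of the targeted set across phases, consistent with the dynamic scoring described above.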


Exhibit B-10. Candidate Variables for the Main Study Response Likelihood Model

Base year
  • Response outcome
  • Response mode
  • Early phase response status

First follow-up
  • Response outcome
  • Response mode
  • Early phase response status

Panel Maintenance and Address Updates
  • Panel maintenance response status
  • Address update response status

2013 Update Survey
  • 2013 Update response by student (not parent)
  • Early phase response status
  • Response mode
  • Incentive amount (to control for the effect of incentives on response outcomes)
  • Ever called in to the help desk
  • Ever refused (sample member)
  • Ever refused (other contact)
  • Ever agreed to complete web interview
  • Dual language speaker
  • HS completion status indicator
  • Gender
  • Count of email addresses
  • Count of phone numbers
  • Count of addresses



Exhibit B-11. Candidate Variables for the Main Study Bias Likelihood Model

Sampling frame
  • Race
  • Gender
  • School type
  • Metropolitan area
  • Geographic region

Base year
  • Student’s educational expectations
  • Parent’s education expectations
  • Taking a Fall 09 math course
  • Taking a Fall 09 science course
  • Base year math assessment score

First follow-up
  • When Algebra 1 was taken
  • Grade in Algebra 1
  • Student’s educational expectations
  • Parent’s education expectations
  • Grade in 2011-12
  • Location
  • Dual language indicator
  • Socioeconomic status indicator
  • Repeated a grade?
  • F1 math assessment score
  • Attended a job fair?
  • Toured a college?
  • Taken a college class?
  • Completed an internship?
  • Performed work in job related to career goals?
  • Searched internet/college guides for college options?
  • Talked to HS counselor about after high school options?
  • Talked with college admission counselor?
  • Taken a college entrance exam prep course?
  • Taking math classes in spring 2012?

2013 Update Survey and High School Transcript Collection
  • Sample member has high school credential
  • Date of high school credential
  • School characteristics of last-attended high school
  • Dual-enrollment status/information
  • Taking postsecondary classes as of Nov 1, 2013
  • Sector of postsecondary institution as of Nov 1, 2013
  • Apprenticing as of Nov 1, 2013
  • Working for pay as of Nov 1, 2013
  • Serving in the military as of Nov 1, 2013
  • Starting family/taking care of children as of Nov 1, 2013
  • Attending high school or homeschool as of Nov 1, 2013
  • In a course to prepare for GED as of Nov 1, 2013
  • Number of postsecondary institutions applied to
  • Completed a FAFSA for teenager's education
  • Did not complete FAFSA because did not want to go into debt
  • Did not complete FAFSA because can afford college without financial aid
  • Did not complete FAFSA because thought ineligible or unqualified
  • Did not complete FAFSA because did not know how
  • Did not complete FAFSA because forms were too time-consuming/too much work
  • Did not complete FAFSA because did not know could
  • Did not complete FAFSA because teen does not plan to continue education
  • Currently working for pay
  • Number of high schools attended
  • Attended CTE center (flag)
  • English language learner status
  • GPA: overall
  • GPA: English
  • GPA: mathematics
  • GPA: science
  • Total credits earned
  • Credits earned in academic courses
  • Ever had a dropout episode


The goal of the bias likelihood model is not to accurately predict response, but to classify sample members’ current response rates along the dimensions represented by the predictor variables. As such, statistical significance should not be a determining factor in which variables are included in the model; rather, the criterion should be the variable’s importance for HSLS:09. The threshold for identifying cases for targeted treatment will be based on an assessment of the bias likelihood score, the response likelihood score, and available project resources.

Evaluation of responsive design approach. There are three elements to be evaluated in the proposed responsive design approach: (1) that sample cases that contribute to sample representativeness can be identified at the beginning of the third and subsequent data collection phases, (2) that interventions used during each phase of the data collection design are effective in increasing participation, and (3) that increasing response rates among the targeted cases will improve sample representativeness. We intend to examine these three aspects of the responsive design and its implementation for the HSLS:09 second follow-up as follows:

  1. Evaluate the bias likelihood model used to identify targeted cases. To assess whether the bias likelihood model successfully identifies nonresponding cases that are underrepresented on key survey variables, we will compare estimates within the categories of each model variable for respondents and nonrespondents at each phase. This comparison will highlight the model variables that exhibit bias at each phase and the relative size of the imbalance that remains to be reduced through the intervention.

  2. Evaluate the effectiveness of each intervention in increasing survey participation. The second key component of this responsive design is the effectiveness of the targeted treatments in increasing participation. Experiments conducted with the calibration samples will allow us to assess the efficacy of the various treatments.

  3. Evaluate the ability to increase sample representativeness by identifying cases for targeted treatment. We will measure sample representativeness by comparing estimates on key variables for respondents and nonrespondents at each phase of data collection and at the end of data collection. We will then be able to assess whether sample representativeness improves over the course of data collection through the use of targeted interventions for cases identified with the bias likelihood model.
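The representativeness comparison described in steps 1 and 3 can be illustrated with a minimal sketch. This is not the study’s actual evaluation code; the data, function names, and the simple share-gap measure are all illustrative assumptions. It compares the distribution of a key categorical model variable among current respondents with its distribution in the full sample, so that underrepresented categories stand out:

```python
# Illustrative sketch only -- hypothetical data and functions, not the
# actual HSLS:09 evaluation code.
from collections import Counter

def category_shares(values):
    """Return each category's share of the given values."""
    counts = Counter(values)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

def representativeness_gap(sample, respondent_flags):
    """For each category, respondent share minus full-sample share.

    Large negative gaps mark categories underrepresented among current
    respondents -- candidates for a targeted intervention; large positive
    gaps mark overrepresented categories.
    """
    full = category_shares(sample)
    resp = category_shares(
        v for v, responded in zip(sample, respondent_flags) if responded
    )
    return {cat: resp.get(cat, 0.0) - share for cat, share in full.items()}

# Hypothetical key variable (e.g., English language learner status)
# for eight sample members, with current response status.
sample = ["yes", "no", "no", "no", "yes", "no", "yes", "no"]
responded = [False, True, True, True, False, True, False, True]
gaps = representativeness_gap(sample, responded)
# No "yes" case has responded yet, so gaps["yes"] is -0.375 and
# gaps["no"] is +0.375.
```

Repeating this comparison at the end of each phase shows whether the targeted treatments shrink the gaps, which is the substance of the evaluation above.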



References

Rosen, J. A., Murphy, J. J., Peytchev, A., Holder, T. E., Dever, J. A., Herget, D. R., & Pratt, D. J. (2014). Prioritizing low-propensity sample members in a survey: Implications for nonresponse bias. Survey Practice, 7(1), 1–8.

Pratt, D. J. (Invited Speaker). (2014, March). What is adaptive design in practice? Approaches, experiences, and perspectives. Presented at FedCASIC 2014 Workshop Plenary Panel Session, Washington, DC.

Pratt, D. J. (2013, December). Modeling, prioritization, and phased interventions to reduce potential nonresponse bias. Presented at Workshop on Advances in Adaptive and Responsive Survey Design, Heerlen, Netherlands.


Attachment 2 – Changes to transcript and student records contacting materials


As mentioned in the memo, this submission requests approval for changes to the transcript data request and student records request contacting materials. The table below lists, for each document, its new page number within the revised Appendixes F and G, the document name, when and to whom it will be sent, and, where applicable, a summary of the revisions (shown in blue font in the appendixes).


Exhibit 1: Changes to contacting materials since the 3/7/2016 OMB approval

| New Page # | Item Name | When/Where Sent | Change: Added (A) / Revised (R) | Revision |
| --- | --- | --- | --- | --- |
| F-3 | Postsecondary Data Portal Packet Contents | Sent prior to data collection to the IR director or Chief Administrator (CA) to introduce the PDP. | A | |
| F-6 | Transcript and Student Records Collection Letter to IR Director or Chief Administrator – Joint Study Collection | Sent to the IR director or CA as a cover letter with request packets. | R | Moved from G-3. Added BPS:12 and clarified text. |
| F-8 | Transcript Collection Request Letter from RTI – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | R | Added BPS:12 and clarified text. |
| F-10 | Transcript Collection Request Letter from NCES – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | R | Added BPS:12 and clarified text. |
| F-11 | Transcript Request Letter from Endorsing Agency – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | R | Added BPS:12 and clarified text. |
| F-12 | Transcript Request Letter – Additional Students – Joint Study Collection | Included in the subsequent transcript request packet for the respondent Registrar or designee when additional students are requested. | A | |
| F-13 | Transcript Request Letter – Additional Students – To Non-Respondents to First Request – Joint Study Collection | Included in the subsequent transcript request packet for the non-respondent Registrar or designee when additional students are requested. | A | |
| F-14 | Sample List of Endorsing Associations and Organizations – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | | No change |
| F-15 | Instructions for Providing Transcript Data – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | R | Clarified example instructions. |
| F-17 | Family Educational Rights and Privacy Act Fact Sheet – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | | No change |
| F-24 | Brochure Text – HSLS:09 PETS Only | Included in the transcript request packet for the Registrar or designee. | | No change |
| F-27 | Disclosure Notice – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | R | Added BPS:12. |
| F-28 | Student Transcript Fax Test Page – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | R | Revised “IPEDS ID” to “Study ID.” |
| F-29 | Student Transcript Fax Transmittal Sheet – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | R | Revised “IPEDS ID” to “Study ID.” |
| F-30 | Catalog Transmittal Sheet – Joint Study Collection | Included in the transcript request packet for the Registrar or designee. | R | Revised “IPEDS ID” to “Study ID.” |
| F-31 | Sample Reminder Email – Joint Study Collection | Sent to the Primary Coordinator or designee to prompt for data. | R | Added BPS:12 and clarified text. |
| F-32 | Postsecondary Data Portal Flyer Text | Given to institution staff at conferences. | | No change |
| F-33 | Website Text | Available on the PDP. | R | Added BPS:12 content and a “Confidentiality and Security” section (see page F-36). |
| F-45 | Data Elements | Posted on the website as reference material. | | No change |
| G-3 | Student Records Request Letter – Joint Study Collection | Included in the student records request packet for the Financial Aid Director or designee. | R | Added BPS:12 and clarified text. |
| G-5 | HSLS:09 Student Records Collection Frequently Asked Questions (FAQs) | Included in the student records request packet for the Financial Aid Director or designee. | | No change |
| G-6 | Disclosure Notice – Joint Study Collection | Included in the student records request packet for the Financial Aid Director or designee. | R | Added BPS:12. |
| G-7 | Student Records Request Letter to Financial Aid Personnel – Additional Students – Joint Study Collection | Included in the subsequent student records request packet for the Financial Aid Director or designee. | A | |
| G-9 | Student Records Sample Reminder Email 1 – Joint Study Collection | Sent to the Primary Coordinator or designee to prompt for data. | A | |
| G-10 | Student Records Sample Reminder Email 2 – Joint Study Collection | Sent to the Primary Coordinator or designee to prompt for data. | A | |
| G-11 | Example Text for Quick Guide to Providing Record Data – Joint Study Collection | Included in the student records request packet for the Financial Aid Director or designee. | R | Clarified example instructions. |
| G-14 | Text for Student Records Collection Brochure | Included in the student records request packet for the Financial Aid Director or designee. | R | Reformatted into a brochure; added content. |
| G-16 | Postsecondary Data Portal Flyer Text | Given to institution staff at conferences. | | No change |
| G-17 | Website Text | Available on the PDP. | | No change |






1 Note that in Attachment 1, the subgroups are labeled as 1, 2, and 3 rather than A, B, and C.
