Memorandum United States Department of Education
Institute of Education Sciences
National Center for Education Statistics
DATE: March 19, 2018
TO: Robert Sivinski, OMB
THROUGH: Kashka Kubzdela, OMB Liaison, NCES
FROM: Tracy Hunt-White, Team Lead, Postsecondary Longitudinal and Sample Surveys, NCES
Ted Socha, B&B:16/17 Project Officer, NCES
SUBJECT: 2016-17 Baccalaureate and Beyond Longitudinal Study (B&B:16/17) Main Study Nonresponse Change Request (OMB# 1850-0926 v.7)
The 2016/17 Baccalaureate and Beyond Longitudinal Study (B&B:16/17) is conducted by the National Center for Education Statistics (NCES), within the U.S. Department of Education (ED). B&B is designed to follow a cohort of students who completed the requirements for their bachelor’s degree during the same academic year. Data from B&B are used to help researchers and policymakers better understand the experiences of bachelor’s degree recipients in the years following their degree completion. The request to conduct the B&B:16/17 Main Study was approved in May 2017, with the latest change request approved in November 2017 (OMB# 1850-0926 v. 3-6). Response rates under the originally planned data collection are below expectations. Therefore, to improve response rates and the quality of the resulting data, this request is to extend the B&B:16/17 Main Study data collection period, add contacting materials, add an incentive boost and a flash incentive for two nonrespondent subgroups, and re-introduce the eligibility screener near the end of data collection.
This request does not introduce changes to the estimated respondent burden or the costs to the federal government. Revisions were made to Supporting Statements Parts A and B, Appendix E, and Appendix F of the B&B:16/17 Main Study clearance documents (OMB# 1850-0926 v. 3-6). In this memo, Attachment 1a shows the revisions made to Section A.9 of Part A; Attachment 1b shows the revisions to Section B.4.b of Part B; Attachment 1c shows the updated contacting calendar in Appendix E, which also reflects the inclusion of new contacting materials; and Attachment 1d shows the revisions made to Appendix F.
We request to extend the end of B&B:16/17 Main Study data collection from March 23, 2018 to July 21, 2018 due to lower than expected response rates (previous B&B cohorts had much higher response rates). We now plan to end student interviewing on June 30, 2018, but allow an additional three weeks of eligibility screening that would end on July 21, 2018. This additional eligibility screening would allow NCES to identify sample members who are ineligible for B&B:16/17 and remove them from the eligible pool, which improves both the response rate (ineligible cases no longer count against it) and the accuracy of the resulting data.
As of March 20, 2018, the overall unweighted response rate was 64.0 percent. With just 3 days remaining in data collection, we expect to reach an unweighted response rate of about 64.2 percent by the current planned end date of data collection, assuming data collection progress maintains the current rate. These response rates are significantly below the targeted response rate of 82 percent.
It is helpful to look at the current response rates of the B&B-eligible sample based on their NPSAS:16 interview response status. As defined in Part B of the approved OMB package (OMB# 1850-0926 v. 6), the data collection groups are:
Group 0: NPSAS:16 non-study members (non-fielded)
Group 1: Non-located NPSAS:16 interview nonrespondents
Group 2: Located NPSAS:16 interview nonrespondents
Group 3: NPSAS:16 late interview respondents
Group 4: NPSAS:16 early interview respondents (responded in the first three weeks of the NPSAS:16 interview data collection effort)
After approximately 31 weeks of B&B:16/17 data collection with NPSAS:16 respondents (Groups 3 and 4), and 24 weeks with NPSAS:16 nonrespondents and NPSAS:16 abbreviated respondents (Groups 1 and 2), the unweighted response rate is 62.2 percent among the eligible sample, with 16,938 sample members having completed full (n=15,327), partial (n=431), or abbreviated (n=1,180) interviews. These response rates are significantly below the expected response rate of 82 percent1 and increase the potential for nonresponse bias.
Based on administrative frame data, nonresponse bias analyses were conducted for a set of variables (see Attachment 2 in this memo for details). These analyses show that while the magnitude of both the average and the median absolute relative nonresponse bias is similar to that in the B&B:08/09 full-scale study, the number of B&B:16/17 variables with significant bias is higher. We anticipate that the data collection extension will improve the current level of nonresponse bias.
The extension of the end of data collection from the originally planned March 23, 2018 to July 21, 2018 will allow an additional three months of interview production, ending June 30, 2018, followed by a three-week period of eligibility screening with no survey data collection. Time series regression modeling suggests that this data collection extension can increase the response rate by about 8.4 percentage points, to an overall unweighted response rate of about 72.3 percent.
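To make the projection logic concrete, the sketch below fits a simple linear trend to hypothetical weekly cumulative response rates and extrapolates it through the extension period. It is an illustration only, using invented numbers and a simpler model than the time series regression cited above.

```python
# Minimal sketch of projecting the unweighted response rate under the extension.
# The weekly rates below are hypothetical placeholders, not B&B:16/17 data, and a
# simple linear trend stands in for the production time series regression model.
import numpy as np

weeks = np.arange(24, 32)                         # recent weeks of data collection (hypothetical)
cum_rate = np.array([58.1, 59.2, 60.1, 60.9,      # cumulative unweighted response rate (%)
                     61.6, 62.2, 62.8, 63.3])     # hypothetical values

slope, intercept = np.polyfit(weeks, cum_rate, 1)  # fit a linear trend to the recent weeks

extension_weeks = np.arange(32, 46)                # roughly 14 additional weeks through June 30
projected = np.clip(intercept + slope * extension_weeks, 0, 100)

print(f"Estimated weekly gain: {slope:.2f} percentage points")
print(f"Projected rate at end of extension: {projected[-1]:.1f}%")
```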
During this extension, we will continue to monitor response rates and nonresponse bias.
Table 6 in Section A.16 of Part A (page 17) has been updated to reflect changes to the B&B:16/17 main study schedule (also included below). In Table 6, a new activity was also added to reflect the reintroduction of the eligibility screener to all remaining nonrespondents. The expanded data collection protocol is also reflected in Table 8 in Section B.4.b of Part B (also shown in Attachment 1b of this memo). The green text reflects the revisions made in Table 6 below and in Table 8 in Attachment 1b.
Table 6. Operational schedule for B&B:16/17
B&B:16/17 activity | Start date | End date
Main study
Select student sample | Nov. 15, 2016 | April 27, 2017
Eligibility screener with address update | July 24, 2017 | August 28, 2017
Self-administered web-based data collection | July 24, 2017 | June 30, 2018
Conduct telephone interviews of students | July 24, 2017 | June 30, 2018
Re-introduce eligibility screener to all remaining nonrespondents | July 1, 2018 | July 21, 2018
Process data, construct data files | July 24, 2017 | December 21, 2018
Prepare/update reports | Jan. 8, 2018 | November 1, 2019
Panel maintenance for B&B:16/20 field test (anticipated) | Oct. 1, 2018 | 
To reflect the extension of data collection, the following three items were revised in the data collection instrument (Appendix F; also, see Attachment 1d in this memo):
INFOPAGE - Condensed study information
BB17GLINTRO - Locating introduction form
END - End survey screen
The approved interventions in our current data collection protocols have reached a saturation point and now yield only marginal gains in the response rate. Therefore, we request approval to add contacting materials, add an incentive boost and a flash incentive, and re-introduce the eligibility screener near the end of data collection. These activities are described in more detail below.
Contacting Materials. Additional communications will be sent to all interview nonrespondents. Given that the current reminders no longer significantly increase response rates among sample members in Groups 3 and 4, we suggest further interventions that have also been shown to reduce the potential for nonresponse bias (e.g., B&B:08/18 FT – OMB# 1850-0729 v. 13; see also Dillman et al. 2014). These include:
Adjusting the e-mail subject lines by appealing to sample members’ civic duty, e.g., ‘Make your Voice Heard: Your education in <<Major>> and work experience is unique,’ ‘Start a Civic Tradition: Help by taking the B&B Survey’;
Switching the sender email, e.g., emphasizing NCES sponsorship by sending e-mails from the NCES project officer;
Adjusting the type of mail packaging, i.e., mailing via FedEx or USPS Priority Mail; and
Emphasizing study legitimacy by including prior study results, e.g., the newly available NPSAS:16 First Look results and results of prior rounds of B&B.
In combination, these efforts should increase contact rates and improve cooperation by increasing the perceived relevance, legitimacy, and importance of the study. In particular, switching the sender during data collection also increases the chance that the survey invitation reaches the sample member rather than a spam filter.
See Attachment 1c of this memo for the revised contact schedule. Appendix E of the approved OMB package has been updated to include these new contacting materials.
Incentive Boost and Flash Incentive. Incentive boosts are a common element used in nonresponse conversion (e.g., Groves and Heeringa 2006; Singer and Ye 2013). These boosts are especially common in large federal surveys during their nonresponse follow up phase (e.g., The National Survey of Family Growth) and have been implemented successfully in other NCES postsecondary education surveys (e.g., HSLS:09 F2; BPS:12/17).
The idea of flash incentives is similar to that of early bird incentives which have been shown to lead to faster responses and increased participation rates within the early incentive period (e.g., Coppersmith et al. 2016; LeClere et al. 2012) and can provide efficiencies by reducing both data collection costs and time. For nonresponse conversion during the B&B:16/17 data collection extension period, a $10 incentive boost increase and $5 flash incentive will be added to the baseline incentive as follows:
All remaining nonrespondents in data collection Groups 3 and 4 (NPSAS:16 respondents) will be offered the $10 incentive boost combined with the $5 flash incentive if they complete the survey within the first three weeks from the start of the new incentive protocol; and
After three weeks have passed, the flash incentive offer will expire, and all remaining nonrespondents in data collection Groups 3 and 4 (NPSAS:16 respondents) sample members will only be eligible for the $10 incentive boost throughout the remainder of the extended data collection.
Sample members in Groups 1 and 2 (NPSAS:16 nonrespondents) have not been in data collection as long as Groups 3 and 4 and are still responding well to contact attempts. Therefore, we do not propose changing their incentive amount.
The updated distribution of incentive amounts for the B&B:16/17 main study is shown in Table 2 of Section A.9 of Part A (page 11), and is also copied in Attachment 1a of this memo (the text in green represents additions to the table). Note that student interview data collection will end on June 30, 2018, after which the eligibility screener will be reintroduced and remain open until July 21, 2018.
Reintroducing the Eligibility Screener. Because the current ineligibility rates are low compared to the B&B:16/17 field test data collection and to what was observed when creating the B&B:08/09 full-scale sampling frame, upon completion of data collection on June 30, 2018, we will reintroduce the eligibility screener to all remaining nonrespondents for a duration of three weeks. The purpose of the screener is to help identify ineligible cases among those who have not completed the B&B:16/17 questionnaire by the end of data collection. Any cases deemed ineligible during B&B:16/17 will be excluded from fielded cases in the next follow-up with this cohort (B&B:16/20).
In contrast to the expected total of 3,808 ineligible cases estimated from the field test results, as of March 20, 2018, only 1,609 cases have been deemed ineligible. More specifically, during the B&B:16/17 field test, 3.6 percent of base-year respondents and 11.3 percent of base-year nonrespondents were ineligible, for an overall ineligibility rate of 6.6 percent. In B&B:08/09, 6.0 percent of base-year respondents and 27.0 percent of base-year nonrespondents were confirmed ineligible via transcripts, for an overall ineligibility rate of 7.1 percent. Currently, just 5.6 percent of the full-scale sample have been determined ineligible, with only 4.9 percent of base-year nonrespondents and 5.8 percent of base-year respondents deemed ineligible.
When the eligibility screener was sent to the data collection groups containing NPSAS:16 nonrespondents and NPSAS:16 abbreviated respondents (Groups 0, 1, and 2) in July and August 2017, cases had not yet been through tracing and locating activities, which resulted in a very low screener response and locate rate (i.e., 4.9 percent). The table below shows the B&B:16/17 overall locate rate among the data collection group and the locate rate among the remaining nonrespondents. With the additional locating data, we propose asking all remaining nonrespondents to participate in the 5-minute eligibility screener (for a $10 incentive) in order to improve the accuracy of eligibility status among survey nonrespondents.
B&B:16/17 locate rates as of March 20, 2018
Data Collection Group | Locate rate – remaining B&B:16/17 nonrespondents | Overall locate rate among sample members
Non-located NPSAS:16 interview nonrespondents (Group 1) | 44.8% | 63.2%
Located NPSAS:16 interview nonrespondents and abbreviated respondents (Group 2) | 75.3% | 90.0%
Late NPSAS:16 interview respondents (Group 3) | 75.5% | 93.5%
Early NPSAS:16 interview respondents (Group 4) | 73.7% | 95.9%
Sections A.9 and A.16 of Part A and Section B.4.b of Part B were revised to reflect the reintroduction of the eligibility screener. In this memo, see Attachment 1a for the revised Section A.9; Table 6 on page 2 for the revised B&B:16/17 operational schedule (reflecting edits in Section A.16 of Part A); and Attachment 1b for the revisions made to Section B.4.b of Part B.
Coppersmith, J., Vogel, L.K., Bruursema, T., and Feeney, K. 2016. Effects of Incentive Amount and Type on Web Survey Response Rates. Survey Practice.
Dillman, D.A., Smyth, J.D., and Christian, L.M. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition. Hoboken, NJ: John Wiley & Sons.
Groves, R.M., and Heeringa, S.G. 2006. Responsive Design for Household Surveys: Tools for Actively Controlling Survey Errors and Costs. Journal of the Royal Statistical Society, Series A (Statistics in Society), 169(3), 439-457.
LeClere, F., Plumme, S., Vanicek, J., Amaya, A., and Carris, K. 2012. Household Early Bird Incentives: Leveraging Family Influence to Improve Household Response Rates. American Statistical Association Joint Statistical Meetings, Section on Survey Research.
Singer, E., and Ye, C. 2013. The Use and Effects of Incentives in Surveys. Annals of the American Academy of Political and Social Science, 645(1), 112-141.
Attachment 1a – Supporting Statement Part A Revisions
9. Provision of Payments or Gifts to Respondents (New text added in Green)
The use of incentives in B&B provides significant advantages to the government in terms of increased overall student response rates, timely data collection, and reduction of bias-inducing nonresponse. In addition, the use of incentives may also result in decreased data collection costs. Therefore, NPSAS:16 study members selected for participation in the B&B:16/17 main study will be offered monetary incentives at three points in the data collection. First, prior to the start of main data collection, NPSAS:16 nonrespondents will be offered a $10 incentive to complete the eligibility screener and update contact information. As described in sections 1.a and 2.c of this document, the screener will evaluate their eligibility for the B&B:16 cohort, confirming that they completed all requirements for the bachelor’s degree by June 30, 2016 and received the degree by June 30, 2017.
Second, for completing the B&B:16/17 interview, sample members will be offered one of three incentive amounts:
Sample members who completed the full base-year interview – the majority of the sample – will be offered $30 (the baseline incentive offered in B&B studies);
Those who completed only the abbreviated interview in NPSAS:16 and those who were located but not interviewed during NPSAS:16 will be offered $50; and
Those who were neither located nor interviewed during NPSAS:16 will be offered $55.
To encourage participation, sample members who completed the full NPSAS:16 interview, but did so only after repeated contacts, can increase their incentive offer by $5 for completing the B&B:16/17 interview within the first 3 weeks of data collection (an “early bird” incentive for $35 total). Similarly, those who were located during NPSAS:16, but not interviewed, and those who completed only the abbreviated interview will also be able to add $5 to their incentive for an early response ($55 total). Incentives will be provided through the sample member’s choice of either check or PayPal (about 38% of B&B:16/17 field test respondents chose PayPal – an online money transfer service). The incentive amounts being offered during B&B:16/17 are consistent with those approved in previous B&B and NPSAS collections. During the B&B:08/12 main study, base-year nonrespondents were offered a base incentive of $55. NPSAS:16 offered $30 for completion of the full interview.
Data collection for NPSAS:16 nonrespondents (Groups 1 and 2) will begin on September 18, 2017, and for NPSAS:16 respondents (Groups 3 and 4) on July 31, 2017. After March 23, 2018, in order to encourage participation among B&B:16/17 nonrespondents in Groups 3 and 4, they will be offered an additional $10 incentive, for a total of $40 to complete the abbreviated interview. They will also be offered an additional “flash” incentive of $5 if they complete the survey within the first three weeks of the offer.
Third, after the end of student interview data collection, the eligibility screener will be reintroduced to all remaining nonrespondents to identify ineligible cases. As in the early stage of B&B:16/17 data collection, respondents to the post-collection eligibility screener will receive a $10 incentive.
Table 2 shows the distribution of the incentive amounts possible throughout the entire B&B:16/17 main study data collection. More information regarding the use of incentives is provided in the Supporting Statement Part B of this submission.
Table 2. Distribution of incentive amounts for the B&B:16/17 main study data collection
Type of B&B:16/17 response | Full NPSAS:16 interview respondents: Responded late (Group 3) | Full NPSAS:16 interview respondents: Responded early (Group 4) | Abbreviated NPSAS:16 interview respondents | NPSAS:16 interview nonrespondents: Located (Group 2) | NPSAS:16 interview nonrespondents: Not located (Group 1)
July 31 - Mar. 23 – Prior to Extension
Address update with eligibility screener | --- | --- | --- | $10 | $10
Full survey | $30 | $30 | $50 | $50 | $55
Plus early bird incentive | +$5 | --- | +$5 | +$5 | ---
Total potential incentive for full interview | $35 | $30 | $55 | $55 | $55
Abbreviated survey | $30 | $30 | $50 | $50 | $55
Mar. 23 - June 30 – During Extension
Abbreviated survey plus incentive boost | +$10 | +$10 | n/a | n/a | n/a
Plus flash incentive (offer expires after 3 weeks) | +$5 | +$5 | n/a | n/a | n/a
Total potential incentive for abbreviated interview | $45 | $45 | $50 (no boost) | $50 (no boost) | $55 (no boost)
July 1 - July 21
Address update with eligibility screener for B&B:16/17 nonrespondents | $10 | $10 | $10 | $10 | $10
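For reference, the rules in Table 2 can be summarized as a small lookup. The sketch below is illustrative only: the group labels, function name, and the assumed start date of the flash offer are ours, while the dollar amounts and date windows mirror Table 2.

```python
# Illustrative sketch of the Table 2 incentive rules for the B&B:16/17 main study.
# Function and variable names are ours; amounts and windows mirror Table 2 above.
from datetime import date

BASELINE = {          # full-survey baseline incentive by data collection group
    "group3_late_full": 30, "group4_early_full": 30,
    "group2_abbrev": 50, "group2_located_nonresp": 50, "group1_not_located": 55,
}
EARLY_BIRD_ELIGIBLE = {"group3_late_full", "group2_abbrev", "group2_located_nonresp"}
BOOST_ELIGIBLE = {"group3_late_full", "group4_early_full"}   # NPSAS:16 respondents only

EXTENSION_START = date(2018, 3, 23)
FLASH_END = date(2018, 4, 13)          # assumed: three weeks after the boost/flash offer begins
COLLECTION_END = date(2018, 6, 30)

def offered_incentive(group: str, completed_on: date, early_bird: bool = False) -> int:
    """Total survey incentive offered, per the Table 2 rules sketched here."""
    amount = BASELINE[group]
    if early_bird and group in EARLY_BIRD_ELIGIBLE:
        amount += 5                                    # $5 early bird add-on
    if EXTENSION_START <= completed_on <= COLLECTION_END and group in BOOST_ELIGIBLE:
        amount += 10                                   # $10 nonresponse-conversion boost
        if completed_on <= FLASH_END:
            amount += 5                                # $5 flash incentive, first 3 weeks
    return amount

# Example: a Group 3 nonrespondent completing on April 2, 2018 -> $30 + $10 + $5 = $45
print(offered_incentive("group3_late_full", date(2018, 4, 2)))
```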
Prior to the start of data collection, B&B:16/17 sample members will be matched to a federal database maintained by the U.S. Department of the Treasury’s Office of Foreign Assets Control (OFAC). OFAC administers and enforces economic and trade sanctions based on U.S. foreign policy and national security goals. As part of its enforcement efforts, OFAC publishes a list of individuals and companies called the "Specially Designated Nationals List" or "SDN" list. Their assets are blocked and U.S. entities are prohibited from conducting trade or financial transactions with those on the list (https://www.treasury.gov/resource-center/sanctions/Pages/default.aspx). In order to determine whether there are any B&B:16/17 sample members to whom NCES cannot offer an incentive, the sample members will be matched to the SDN list using the Jaro-Winkler and Soundex algorithms recommended by OFAC. To avoid over-matching, B&B staff will review the flagged cases based on full name, date of birth, and address. The small number of individuals whom NCES cannot confirm as non-matches to the SDN list will receive a survey request without an incentive offer.
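To illustrate the kind of matching involved, the sketch below implements the standard Jaro-Winkler similarity and a simplified American Soundex and flags candidate matches for manual review. It is not the OFAC-recommended implementation or the actual B&B matching code; the names, threshold, and structure are invented for the example.

```python
# Illustrative sketch of fuzzy name screening against a watch list using
# Jaro-Winkler similarity and Soundex codes. Names, threshold, and structure
# are invented for this example; they are not the actual OFAC/B&B procedure.

def jaro(s1: str, s2: str) -> float:
    """Jaro similarity between two strings (0.0 to 1.0)."""
    if s1 == s2:
        return 1.0
    len1, len2 = len(s1), len(s2)
    if not len1 or not len2:
        return 0.0
    window = max(len1, len2) // 2 - 1
    m1, m2 = [False] * len1, [False] * len2
    matches = 0
    for i, ch in enumerate(s1):
        lo, hi = max(0, i - window), min(i + window + 1, len2)
        for j in range(lo, hi):
            if not m2[j] and s2[j] == ch:
                m1[i] = m2[j] = True
                matches += 1
                break
    if matches == 0:
        return 0.0
    k = transpositions = 0
    for i in range(len1):
        if m1[i]:
            while not m2[k]:
                k += 1
            if s1[i] != s2[k]:
                transpositions += 1
            k += 1
    transpositions //= 2
    return (matches / len1 + matches / len2 + (matches - transpositions) / matches) / 3

def jaro_winkler(s1: str, s2: str, scaling: float = 0.1) -> float:
    """Jaro-Winkler similarity: boosts the Jaro score for a shared prefix (up to 4 chars)."""
    j = jaro(s1, s2)
    prefix = 0
    for a, b in zip(s1, s2):
        if a != b or prefix == 4:
            break
        prefix += 1
    return j + prefix * scaling * (1 - j)

def soundex(name: str) -> str:
    """Simplified American Soundex code (first letter plus three digits)."""
    name = "".join(ch for ch in name.upper() if ch.isalpha())
    if not name:
        return "0000"
    codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
             **dict.fromkeys("DT", "3"), "L": "4", **dict.fromkeys("MN", "5"), "R": "6"}
    out, prev = name[0], codes.get(name[0], "")
    for ch in name[1:]:
        digit = codes.get(ch, "")
        if digit and digit != prev:
            out += digit
        if ch not in "HW":           # H and W do not break runs of the same code
            prev = digit
    return (out + "000")[:4]

def flag_for_review(sample_names, watch_list, threshold=0.92):
    """Return (sample name, list name) pairs that warrant manual review."""
    flags = []
    for s in sample_names:
        for w in watch_list:
            if jaro_winkler(s.lower(), w.lower()) >= threshold or soundex(s) == soundex(w):
                flags.append((s, w))
    return flags

# Invented example names; real screening compares full name, date of birth, and address.
print(flag_for_review(["Jon Smyth"], ["John Smith"]))
```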
Attachment 1b - Supporting Statement Part B Revisions
b. B&B:16/17 Main Study Data Collection Design (New text added in Green; Deleted text is in Red)
Confidentiality Pledge Wording and Placement Experiment. In July 2015, the Homeland Security Act of 2002 was amended to require the Department of Homeland Security (DHS) to monitor federal agency information systems, including survey data transmissions. As a result, the confidentiality pledge cited to sample members in recruitment materials and at the start of each data collection instrument must be updated to reflect the change. Throughout 2016, cognitive testing was conducted to determine wording for the pledge that communicates the purposes of the legislation. Testing yielded three wording versions (shown below). We will now conduct an experiment in B&B:16/17 to determine the effect of the three different versions of the confidentiality pledge on sample members’ willingness to both initially choose to participate in (participation rate) and ultimately continue through and complete (response rate) each of the screener and the survey. In addition, the experiment will determine whether the placement of the confidentiality pledge (A) on the first screen encountered (Login Screen Group), together with the Paperwork Reduction Act (PRA) statement and the study authorization citation, or (B) on a second screen with the pledge wording presented by itself (Separate Pledge Screen), has an effect on participation and/or response rates (see Appendix F for the experimental screens). The three confidentiality pledge wording versions are:
Control Group:
All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).
Experimental Group 1 (“Homeland Security”):
All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573). By law, anyone who willfully discloses any identifiable information about you or your school is subject to a jail term of up to 5 years, a fine of up to $250,000, or both. Electronic transmission of your information will be monitored for viruses, malware, and other threats by Homeland Security in accordance with the Cybersecurity Enhancement Act of 2015.
Text for Experimental Group 2 (“Federal Staff”):
All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573). By law, anyone who willfully discloses any identifiable information about you or your school is subject to a jail term of up to 5 years, a fine of up to $250,000, or both. Electronic transmission of your information will be monitored for viruses, malware, and other threats by Federal employees and contractors in accordance with the Cybersecurity Enhancement Act of 2015.
Sample members will be assigned to groups at random, prior to the start of data collection, with the large majority in the Control-Login Screen Group (N=18,779). The rest will be distributed equally across the remaining five groups (N=1,733 in each group): Control-Separate Pledge Screen; Homeland Security-Login Screen; Homeland Security-Separate Pledge Screen; Federal Staff-Login Screen; and Federal Staff-Separate Pledge Screen.
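A minimal sketch of how such a random assignment could be carried out is shown below. The group sizes come from the paragraph above; the IDs, seed, and function are illustrative and are not the study's actual sampling system.

```python
# Illustrative sketch of randomly assigning sample members to the six
# confidentiality pledge conditions; group sizes follow the text above,
# but IDs, seed, and structure are invented for this example.
import random

def assign_pledge_groups(sample_ids, seed=20170731):
    rng = random.Random(seed)
    ids = list(sample_ids)
    rng.shuffle(ids)                       # randomize order before slicing into groups
    sizes = {
        "Control-Login Screen": 18779,
        "Control-Separate Pledge Screen": 1733,
        "Homeland Security-Login Screen": 1733,
        "Homeland Security-Separate Pledge Screen": 1733,
        "Federal Staff-Login Screen": 1733,
        "Federal Staff-Separate Pledge Screen": 1733,
    }
    assignments, start = {}, 0
    for group, n in sizes.items():
        for member in ids[start:start + n]:
            assignments[member] = group
        start += n
    return assignments

# Example with synthetic IDs whose count equals the sum of the group sizes above.
members = [f"ID{i:05d}" for i in range(18779 + 1733 * 5)]
groups = assign_pledge_groups(members)
print(groups["ID00000"], len(groups))
```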
Two outcomes are of particular interest. The first, participation rate, will evaluate the willingness of sample members to start the B&B:16/17 interview after having been shown their respective Confidentiality Pledge. Sample members start the interview either by entering a Study ID and password and clicking the Login button on the study website, or by clicking the next button on the Survey Start page reached through links included in individualized emails. The second outcome of interest, response rate, will evaluate the willingness of sample members to continue through to the end of the interview after having been shown the pledge2. Participation and response rates will be compared for pledge wording, placement, and the interaction of pledge wording and placement.
In addition to the Confidentiality Pledge experiment, other features of the B&B:16/17 main study design, many of which have been retained from the field test, are discussed below and summarized in table 8.
Eligibility Screener with Address Update. During the B&B:16/17 FT, 22% of NPSAS:16 FT nonrespondents who participated in B&B:16/17 were determined ineligible by the B&B interview. For the B&B:16/17 main study, at the start of data collection, B&B:16/17 base year nonrespondents, both study members (N=4,905) and non-study members (N=1,352), and base year abbreviated respondents (N=2,005) will be sent an initial letter and email inviting them to complete an address update and eligibility screener either online or by telephone. This first step should result in more efficient locating and earlier identification of ineligible sample members. Those who complete the screener will receive a $10 postpaid incentive paid by their choice of check or via PayPal. Requests to complete the screener will be mailed and emailed to sample members at the start of data collection, and a reminder will be sent about two weeks after the initial invitation. Data collection will continue for about six weeks.
At the start of data collection, eligibility determination with the screener will continue for about six weeks. If, at the end of data collection on June 30, 2018, ineligibility rates prove low compared to the B&B:16/17 field test and to what was observed when creating the B&B:08/09 full-scale sampling frame, the eligibility screener will be reintroduced on July 1, 2018, to all remaining nonrespondents for a period of three weeks as a final attempt to identify cases that are not eligible for B&B:16/17. A $10 incentive will accompany the screener. Any cases deemed ineligible during B&B:16/17 will be excluded from fielded cases in the next follow-up with this cohort (B&B:16/20).
B&B:16/17 Main Study Data Collection Group Assignments and Protocols. Given the relative success of the aggressive, default, and relaxed protocols observed in the field test, a similar approach will be used in the main study, with minor changes in the groupings of sample members as shown in Table 8.
NPSAS:16 interview nonrespondents: All NPSAS:16 nonrespondents will receive the aggressive protocol. In addition, in order to administer appropriate interventions throughout data collection, nonrespondents will be further divided into those who were (Group 2; N=3,686) and were not (Group 1; N=1,219) located in NPSAS:16.
NPSAS:16 abbreviated interview respondents: Also receiving the aggressive protocol will be respondents who completed either version of the abbreviated interview in NPSAS:16 (Group 2; N=3,224).
Late NPSAS:16 interview respondents: NPSAS:16 interview respondents who completed their base year interview later in data collection, that is, after the first 3 weeks, will receive the default protocol (Group 3; N=7,333).
Early NPSAS:16 interview respondents: NPSAS:16 interview respondents who completed their base year interview within the first 3 weeks will receive the relaxed protocol (Group 4; N=11,982).
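As a compact summary of the group and protocol assignments above, the illustrative sketch below maps a sample member's NPSAS:16 outcome to a B&B:16/17 data collection group and protocol (see Table 8 for the full protocol detail); the status labels and function name are ours.

```python
# Compact, illustrative summary of the group/protocol assignments described above.
# Status labels and the function name are ours; groups and protocols are from the text.

PROTOCOL_BY_GROUP = {1: "aggressive", 2: "aggressive", 3: "default", 4: "relaxed"}

def assign_group(npsas16_status: str, located: bool, responded_early: bool) -> int:
    """Map a sample member's NPSAS:16 outcome to a B&B:16/17 data collection group."""
    if npsas16_status == "nonrespondent":
        return 2 if located else 1
    if npsas16_status == "abbreviated":        # abbreviated NPSAS:16 respondents join Group 2
        return 2
    # full NPSAS:16 respondents: early responders (first 3 weeks) vs. late responders
    return 4 if responded_early else 3

group = assign_group("full", located=True, responded_early=False)
print(group, PROTOCOL_BY_GROUP[group])        # -> 3 default
```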
Table 8. B&B:16/17 main study data collection protocols by data collection phase
Data collection phase | Non-located NPSAS:16 interview nonrespondents (Group 1) | Located NPSAS:16 interview nonrespondents and abbreviated respondents (Group 2) | Late NPSAS:16 interview respondents (Group 3) | Early NPSAS:16 interview respondents (Group 4)
Protocol | Aggressive | Aggressive | Default | Relaxed
Eligibility screener & address update – First 6 weeks, prior to main data collection1 | $10 postpaid incentive | $10 postpaid incentive | --- | ---
Early Completion Phase – First 4 weeks of main data collection1 | Data collection announcement letter and email | Data collection announcement letter and email offering additional $5 "Early Bird" incentive | Data collection announcement letter and email offering additional $5 "Early Bird" incentive | Data collection announcement letter and email thanking sample members for prior participation
 | CATI starts 2 weeks after mail outs and continues through all phases | CATI starts 2 weeks after mail outs and continues through all phases | Mode tailoring in NPSAS:16 completion mode | Mode tailoring in NPSAS:16 completion mode
Production Phase I – Next <3 months | Postcard reminders | Postcard reminders | "Light" CATI outbound begins | "Light" CATI outbound begins 2 weeks after start of phase
Production Phase II – Next 3 months | Abbreviated interview offered; postcard reminders | Abbreviated interview offered; postcard reminders | Postcard reminders | Postcard reminders
 | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts
 | Abbreviated interview offer for potential refusals | Abbreviated interview offer for potential refusals | Abbreviated interview offer for potential refusals | Abbreviated interview offer for potential refusals
Nonresponse Conversion Phase – Final months | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts
 | Postcard reminders | Postcard reminders | Postcard reminders | Postcard reminders
 | Abbreviated interview offered (offered to Group 3 & 4 pending cases on February 19) | Abbreviated interview offered (offered to Group 3 & 4 pending cases on February 19) | Abbreviated interview offered (offered to Group 3 & 4 pending cases on February 19) | Abbreviated interview offered (offered to Group 3 & 4 pending cases on February 19)
Total postpaid incentive – for respondents before extension | $55 (+$10 screener/address) | $50 + $5 early bird (+$10 screener/address) | $30 + $5 early bird | $30
Data Collection Extension | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts | Continued CATI interviewing/locating efforts
 | Abbreviated interview offered to all cases | Abbreviated interview offered to all cases | Abbreviated interview offered to all cases | Abbreviated interview offered to all cases
 | Incentive offer remains the same | Incentive offer remains the same | Additional $10 incentive boost + 3-week $5 "flash" incentive | Additional $10 incentive boost + 3-week $5 "flash" incentive
Total postpaid incentive – for respondents during extension | $55 (+$10 screener/address) | $50 (+$10 screener/address) | $30 + $10 boost + $5 flash | $30 + $10 boost + $5 flash
Post data collection – eligibility screener for B&B:16/17 nonrespondents after June 30, 2018 | $10 postpaid incentive (with no earlier incentive received for B&B:16/17) | $10 postpaid incentive (with no earlier incentive received for B&B:16/17) | $10 postpaid incentive (with no earlier incentive received for B&B:16/17) | $10 postpaid incentive (with no earlier incentive received for B&B:16/17)
Note: In addition to the contacts shown in the table, all groups will receive regular email and, with permission, text message reminders.
1 Main data collection begins after the 6-week screener period for base year nonrespondents and immediately upon OMB clearance for base year respondents. Main data collection will end at the same time for both groups; therefore, the duration of Production Phase I will be adjusted for base year nonrespondents to ensure Production Phase II and the Nonresponse Conversion Phase have sufficient time.
Early Bird Incentive. Early bird incentives have been shown to lead to faster responses and increased participation rates within the early bird incentive period (e.g., LeClere et al. 2012; Coppersmith et al. 2016), and can provide efficiencies by reducing both data collection costs and time. Given this, rather than continuing to offer a $10 prepaid incentive as part of the aggressive protocol, sample members in Groups 2 and 3 will be offered the opportunity to increase their total incentive by $5 for responding within the first three weeks of the start of data collection, to $55 for located base-year nonrespondents and to $35 for base-year late respondents. Because Group 1 sample members may not be located until well after the B&B:16/17 early bird phase ends, they will not receive the same early bird incentive, but will instead be offered an incentive that is increased by $5 throughout the entire data collection period. Sample members in Group 4 will not receive the early bird incentive because, as shown in the B&B:16/17 FT, they generally needed the least prompting to participate.
Other data collection incentives. As described in the last section, instead of offering either a prepaid or an early bird incentive to Group 1, sample members will instead be offered an overall incentive amount of $55 upon survey completion. This amount matches that offered to all base-year nonrespondents in the B&B:08/09 FT, which achieved an overall response rate of 44.0% for the equivalent group. In addition to the $5 early bird incentive, sample members in Group 2 will receive $50 upon survey completion (a total of $55 for an early response). This amount matches the amount offered to equivalent groups in previous data collections with the B&B:08 cohort. For Group 3, we recommend maintaining the same $30 incentive level that was used in the B&B:16/17 FT and in the NPSAS:16 main study. In addition to the $30, Group 3 sample members will receive an additional $5 early bird incentive, for a total of $35, if they complete the survey within the first three weeks of data collection. Although Group 4 sample members were prompted least during the B&B:16/17 FT and still achieved a 75.1% response rate, that rate is considerably lower than what was observed during the B&B:08/09 field test (80.9% among all base-year interview respondents). The primary difference between the two collections was the incentive amount offered ($20 for B&B:16/17 compared to $30 for B&B:08/09), so, in order to maximize the possible response rate with Group 4, its sample members will be offered $30. It is worth noting that these same sample members received $30 for completing the NPSAS:16 interview one year earlier.
Data collection for NPSAS:16 nonrespondents (Groups 1 and 2) will begin on September 18, 2017, and for NPSAS:16 respondents (Groups 3 and 4) on July 31, 2017. After March 23, 2018, for B&B:16/17 data collection nonresponse conversion in Groups 3 and 4, a $10 incentive boost increase and $5 flash incentive will be added to the baseline incentive as follows:
All remaining nonrespondents in data collection Groups 3 and 4 (NPSAS:16 respondents) will be offered the $10 incentive boost combined with the $5 flash incentive if they complete the survey within three weeks of the incentive boost plus flash offer; and
After those three weeks, the flash incentive offer will expire, and all remaining nonrespondents in data collection Groups 3 and 4 (NPSAS:16 respondents) will be eligible only for the $10 incentive boost to the baseline incentive throughout the remainder of the extended data collection, until June 30, 2018.
Mode tailoring. Leverage-saliency theory and social exchange theory suggest that offering a person the mode they prefer, e.g., by telephone or the Web, increases their likelihood of participating (Groves et al. 2000; Dillman et al. 2014). This theory is supported by empirical evidence that offering people their preferred mode choice speeds up their response and is associated with higher participation rates (e.g., Olson et al. 2012). With the NPSAS:16 interview completion mode as a proxy for mode preference, during the B&B:16/17 main study early completion phase, Groups 3 and 4 will be approached in the NPSAS:16 preferred mode. Specifically, while all NPSAS:16 interview respondents will receive identical data collection announcement letters and emails, members of Groups 3 and 4 who completed the NPSAS:16 interview by telephone (N=355) will be approached by telephone from the start of data collection. Likewise, those who completed the NPSAS:16 main study interview online will not be contacted by telephone before a preassigned outbound data collection date.
Light CATI outbound calling. Anecdotally, the introduction of light, or less intense, CATI interviewing in the B&B:16/17 FT seemed to increase production phase response rates among Group 3 sample members (35%) compared to Group 4 in the same phase (24%). Light CATI involves a minimal number of phone calls, used mainly to prompt web response, while regular CATI efforts include more frequent phone attempts, with the goal of locating sample members and encouraging their participation. Although one should use caution when interpreting these results – group assignment was not random – the findings are consistent with the literature, which has shown that web surveys tend to have lower response rates than interviewer-administered surveys (e.g., Lozar Manfreda et al. 2008). Attempting to interview sample members by telephone also increases the likelihood of initiating locating efforts sooner when they cannot be located. B&B:16/17 FT results showed higher locate rates in Group 3 (93.7%), which had light CATI, compared to Group 4 (77.8%; χ2 = 63.2, p < 0.001), which did not. For the B&B:16/17 main study data collection, light CATI will be used with both Groups 3 and 4 once CATI begins in Production Phase I, the first half of the 6-month production phase.
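For readers who want to see the form of the locate-rate comparison, the sketch below runs a chi-square test of independence on a 2x2 table of located versus not-located cases by group. The counts are hypothetical placeholders, so the result will not reproduce the reported chi-square of 63.2.

```python
# Illustrative sketch of the kind of chi-square test behind the locate-rate comparison.
# The counts below are hypothetical placeholders, not the B&B:16/17 field test data.
from scipy.stats import chi2_contingency

located_g3, not_located_g3 = 937, 63      # hypothetical: ~93.7% located in Group 3
located_g4, not_located_g4 = 778, 222     # hypothetical: ~77.8% located in Group 4

table = [[located_g3, not_located_g3],
         [located_g4, not_located_g4]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, df = {dof}, p = {p_value:.4g}")
```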
Abbreviated Interviews. Obtaining responses from all sample members in a data collection is important to assessing and improving sample representativeness (e.g., Kreuter et al. 2010). During the B&B:16/17 FT data collection, sample members in Group 1 who were offered the abbreviated interview during the production phase responded at higher rates (22.7%) than those in Group 2 who were not offered the abbreviated interview at the same time (12.1%; t(2,097) = 3.67, p < 0.001). An abbreviated interview option will be offered to all sample members in the B&B:16/17 main study data collection. For pending cases in Groups 1 and 2, it will be offered on January 8, 2018, during Production Phase II (the latter half of the production phase of data collection). For pending cases in Groups 3 and 4, the abbreviated interview offer will be made on February 19, 2018. Further, for sample members who express concerns about the length of the interview, the abbreviated interview offer will be made ahead of schedule, beginning in November 2017, based on their refusal to complete the full survey.
B&B:16/17 Confidentiality Pledge Experiment Research Questions. As described above, there are two outcomes of interest with the pledge experiment. The participation rate outcome will measure the willingness of sample members to enter or start the B&B:16/17 interview either by entering a Study ID and password and clicking the Login button or by clicking the next button on the Survey Start page reached through a direct link from emails. The response rate outcome will measure sample members’ willingness to continue to the end of the interview. Given the design of the pledge experiment, both in terms of wording (control, Homeland Security, federal staff) and placement of the pledge text (on the log in/direct link screen or a separate second screen), we will test the following:
Research question 1.1: Is there a difference in participation rates across the three pledge wording options?
H0: There is no observed difference in likelihood of participation between:
1.1a. The Control and Homeland Security wording options
1.1b. The Control and Federal Staff wording options
1.1c. The Homeland Security and Federal Staff wording options
Research question 1.2: Is there a difference in response rates, given participation, across the three pledge wording options?
H0: There is no observed difference in likelihood of response, given participation, between:
1.2a. The Control and Homeland Security wording options
1.2b. The Control and Federal Staff wording options
1.2c. The Homeland Security and Federal Staff wording options
Research question 2.1: Is there a difference in participation rates across the two text placement options, on the login screen or on a separate screen immediately following login?
H0: There is no observed difference in likelihood of participation between the Login Page and Separate Page options.
Research question 2.2: Is there a difference in response rates, given participation, across the two text placement options, on the login screen or on a separate screen immediately following login?
H0: There is no observed difference in likelihood of response, given participation, between the Login Page and Separate Page options.
Research question 3.1: Is there a difference in participation rates across the three pledge text/two placement combinations?
H0: There is no difference in the likelihood of participation, among the pledge text/placement combinations.
Research question 3.2: Is there a difference in response rates across the three pledge text/two placement combinations?
H0: There is no difference in response rates among the pledge text/placement combinations.
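One standard way hypotheses of this form could be tested is a two-proportion z-test, sketched below with hypothetical counts; this is an illustration of the comparison, not necessarily the study's analysis plan.

```python
# Illustrative two-proportion z-test for research question 1.1a (Control vs. Homeland
# Security participation rates). Counts are hypothetical placeholders.
from math import sqrt, erfc

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))        # two-sided p-value from the normal distribution
    return z, p_value

# Hypothetical outcome counts: participants out of fielded cases in each wording group.
z, p = two_proportion_ztest(x1=13500, n1=20512, x2=2230, n2=3466)
print(f"z = {z:.2f}, p = {p:.3f}")
```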
The differences between the control and treatment group(s) necessary to detect statistically significant differences are shown in table 9.
Table 9. Two-group detectable differences for the pledge text/pledge placement experiment
Hypothesis | Group 1 definition | Group 1 N | Group 2 definition | Group 2 N | Detectable difference (95% confidence)
1.a | Control text | 20,512 | Homeland Security text | 3,466 | 2.3%
1.b | Control text | 20,512 | Federal Staff text | 3,466 | 2.3%
1.c | Homeland Security text | 3,466 | Federal Staff text | 3,466 | 3.0%
2 | Login screen | 22,245 | Separate Page screen | 5,199 | 1.9%
3 (interactions) | Control/Login | 18,779 | Any other pledge/placement option | 1,733 | 3.1%
3 (interactions) | All other pledge/placement options (other than Control/Login) | 1,733 | All other pledge/placement options (other than Control/Login) | 1,733 | 4.3%
Attachment 1c – Contacts Calendar
Revised contacts are highlighted in Turquoise and additional contacts are highlighted in Green.
Contact Name | Page in Appendix E | Data Collection Phase | Type | Approximate Date - Groups 1 & 2 | Approximate Date - Groups 3 & 4 | Reason for Change
Initial Contact/Screener Invitation Letter | E-15, E-16 | Pre-data collection | Letter | 7/31/2017 | N/A | No change
Initial Contact/Screener Invitation Email | E-44 | Pre-data collection | Email | 7/31/2017 | N/A | No change
Initial Contact/Screener Reminder Email | E-45 | Pre-data collection | Email | 8/14/2017 | N/A | No change
Data collection Announcement Letter | E-17 | Early Completion | Letter | 9/25/2017 | 7/31/2017 | No change
Data collection announcement email | E-46 | Early Completion | Email | 9/18/2017 | 7/31/2017 | No change
Reminder text 1 | E-90 | Early Completion | Text | 12/15/2017 | 9/25/2017 | No change
Reminder email 1 (only those with no text) | E-47 | Early Completion | Email | 9/25/2017 | 8/7/2017 | No change
Reminder postcard 1 | E-19 – E-20 | Early Completion | Postcard | 10/9/2017 | 8/14/2017 | No change
Reminder email 2 | E-48 | Production I | Email | 10/11/2017 | 8/23/2017 | No change
Reminder email 3 | E-49 | Production I | Email | 10/24/2017 | 9/5/2017 | No change
Reminder text 2 | E-90 | Production I | Text | 1/15/2018 | 10/31/2017 | No change
Reminder email 4 | E-50 | Production I | Email | 11/1/2017 | 9/13/2017 | No change
Reminder postcard 2 | E-21 – E-22 | Production I | Postcard | 11/20/2017 | 9/25/2017 | No change
Reminder text 3 | E-90 | Production I | Text | 2/14/2018 | 12/1/2017 | No change
Reminder email 5 | E-51 | Production I | Email | 12/2/2017 | 10/14/2017 | No change
Reminder email 6 | E-52 | Production I | Email | 12/16/2017 | 10/25/2017 | No change
NCES Email #1 | E-68 | Production I | Email | 1/10/2018 | 11/9/2017 | No change
Reminder postcard 3 | E-23 – E-24 | Production I | Postcard | 1/17/2018 | 11/15/2017 | No change
Reminder 7/Abbreviated Announcement Email | E-53 | Production II | Email | 1/8/2018 (or upon refusal of full interview) | 2/19/2018 (or upon refusal of full interview) | No change
Reminder text 4 | E-90 | Production II | Text | 3/15/2018 | 1/15/2018 | No change
Reminder email 8 (only those with no text) | E-54 | Production II | Email | 1/24/2018 | 12/3/2017 | No change
Holiday Postcard (Postcard 6) | E-29 | Production I | Postcard | 12/15/2017 | 12/15/2017 | No change
Reminder postcard 4 | E-25 – E-26 | Production II | Postcard | 2/8/2018 | 1/25/2018 | No change
Holiday Email | E-55 | Production I | Email | 12/20/2017 | 12/20/2017 | No change
Reminder email 9 | E-56 | Production II | Email | 2/17/2018 | 12/27/2017 | No change
Reminder email 10 | E-57 | Production II | Email | 2/24/2018 | 1/3/2018 | No change
First Look Email | E-58 | Production II | Email | 1/13/2018 | 1/13/2018 | No change
Reminder Email 11 | E-59 | Production II | Email |  | 1/24/2018 | No change
Reminder Postcard 7 | E-30 | Production II | Postcard | 3/6/2018 | 3/6/2018 | No change
Reminder text 5 | E-90 | Nonresponse Conversion | Text | 6/26/2018 | 6/26/2018 | Changed timing because of extension
NCES Email #2 | E-69 | Production II | Email | 3/9/2018 | 3/9/2018 | No change
Reminder Postcard 8 | E-29 | Production II | Postcard |  | 3/13/2018 | No change
Reminder text 6 | E-90 | Nonresponse Conversion | Text | 6/29/2018 | 6/29/2018 | Changed timing because of extension
Reminder email 12 | E-60 | Production II | Email |  | 2/16/2018 | No change
Reminder email 13 | E-61 | Production II | Email | 3/2/2018 | 3/2/2018 | No change
Reminder postcard 5 | E-26 – E-27 | Nonresponse Conversion | Postcard |  | 2/27/2018 | No change
Reminder email 14 | E-62 | Nonresponse Conversion | Email | 3/13/2018 | 3/13/2018 | No change
Reminder text 7 | E-90 | Data Collection Extension | Text | 3/27/2018 | 3/27/2018 | Added because of extension
Reminder email 15 | E-63 | Nonresponse Conversion | Email | 6/15/2018 | 6/15/2018 | Changed timing because of extension
Reminder email 16 | E-64 | Nonresponse Conversion | Email | 6/25/2018 | 6/25/2018 | Changed timing because of extension
Reminder email 17 | E-65 | Nonresponse Conversion | Email | 6/27/2018 | 6/27/2018 | Changed timing because of extension
Reminder email 18 | E-66 | Nonresponse Conversion | Email | 6/28/2018 | 6/28/2018 | Changed timing because of extension
Reminder email 19 | E-67 | Nonresponse Conversion | Email | 6/29/2018 | 6/29/2018 | Changed timing because of extension
Reminder email 20 | E-70 | Data Collection Extension | Email | 3/21/2018 | 3/21/2018 | Added because of extension
Reminder email 21 | E-71 | Data Collection Extension | Email | 3/23/2018 | 3/23/2018 | Added because of extension
Reminder email 22 | E-72 | Data Collection Extension | Email | 3/29/2018 | 3/29/2018 | Added because of extension
Reminder Postcard 9 | E-36 | Data Collection Extension | Postcard | 3/30/2018 | 3/30/2018 | Added because of extension
Reminder email 23 | E-73 | Data Collection Extension | Email | 4/4/2018 | 4/4/2018 | Added because of extension
Reminder Postcard 10 | E-37 | Data Collection Extension | Postcard | 4/6/2018 | 4/6/2018 | Added because of extension
Reminder email 24 | E-74 | Data Collection Extension | Email | 4/9/2018 | 4/9/2018 | Added because of extension
Reminder Letter 1 | E-32 – E-33 | Data Collection Extension | Letter | 4/13/2018 | 4/16/2018 | Added because of extension
Reminder email 25 | E-75 | Data Collection Extension | Email |  | 4/18/2018 | Added because of extension
Reminder Postcard 11 | E-38 | Data Collection Extension | Postcard | 4/20/2018 | 4/23/2018 | Added because of extension
Reminder email 26 | E-76 | Data Collection Extension | Email | 4/23/2018 | 4/23/2018 | Added because of extension
Reminder Postcard 12 | E-39 | Data Collection Extension | Postcard | 4/27/2018 | 4/30/2018 | Added because of extension
Reminder email 27 | E-77 | Data Collection Extension | Email | 4/28/2018 | 4/28/2018 | Added because of extension
Reminder text 8 | E-90 | Data Collection Extension | Text | 4/29/2018 | 4/29/2018 | Added because of extension
Reminder email 28 | E-78 | Data Collection Extension | Email | 5/3/2018 | 5/3/2018 | Added because of extension
Reminder email 29 | E-79 | Data Collection Extension | Email | 5/8/2018 | 5/8/2018 | Added because of extension
Reminder Letter 2 | E-34 – E-35 | Data Collection Extension | Letter | 5/11/2018 | 5/11/2018 | Added because of extension
Reminder email 30 | E-80 | Data Collection Extension | Email | 5/14/2018 | 5/14/2018 | Added because of extension
Reminder text 9 | E-91 | Data Collection Extension | Text | 5/20/2018 | 5/20/2018 | Added because of extension
Reminder email 31 | E-81 | Data Collection Extension | Email | 5/23/2018 | 5/23/2018 | Added because of extension
Reminder email 32 | E-82 | Data Collection Extension | Email | 5/25/2018 | 5/25/2018 | Added because of extension
Reminder Postcard 13 | E-40 | Data Collection Extension | Postcard | 5/28/2018 | 5/28/2018 | Added because of extension
Reminder email 33 | E-83 | Data Collection Extension | Email | 5/29/2018 | 5/29/2018 | Added because of extension
Reminder text 10 | E-91 | Data Collection Extension | Text | 6/4/2018 | 6/4/2018 | Added because of extension
Reminder Postcard 14 | E-41 | Data Collection Extension | Postcard | 6/11/2018 | 6/8/2018 | Added because of extension
Reminder email 34 | E-84 | Data Collection Extension | Email | 6/8/2018 | 6/8/2018 | Added because of extension
Reminder email 35 | E-85 | Data Collection Extension | Email | 6/12/2018 | 6/12/2018 | Added because of extension
Reminder Flyer | E-43 | Data Collection Extension | Flyer | 6/18/2018 | 6/18/2018 | Added because of extension
Screener Invitation Email (post-data collection) | E-86 | Post-data collection | Email | 7/1/2018 | 7/1/2018 | Added because of extension
Screener Postcard 15 | E-42 | Post-data collection | Postcard | 7/2/2018 | 7/2/2018 | Added because of extension
Screener Reminder Email (post-data collection) | E-87 | Post-data collection | Email | 7/16/2018 | 7/16/2018 | Added because of extension
NCES Email #3 | E-88 | Data Collection Extension | Email | As needed | As needed | Added because of extension
NCES Email #4 | E-89 | Data Collection Extension | Email | As needed | As needed | Added because of extension
Attachment 1d – Appendix F Revisions
New text added in Green.
[If reintroduced screener] Recently, we sent you information about a study we’re conducting for the U.S. Department of Education about the issues facing college graduates one year after earning their bachelor's degree. In order to determine whether you are eligible for this study, we ask you to answer a few questions about your bachelor’s degree and update your contact information. These questions take about 10 minutes [{IF NO PAY GROUP} no words {else}, and as a token of our appreciation, you will receive $[SCREENER INCENTIVE AMOUNT] upon completion]. You may decline to answer any question or stop at any time. If you have any questions about this study, you may contact the study's director, Jennifer Wine, at 877-225-8470. (To learn more about your rights as a participant, click here.) Do you want to begin the survey screener now?
[If initial screener and NPSAS study member]: Soon we will send you information about a study we’re conducting for the U.S. Department of Education about the issues facing college graduates one year after earning their bachelor's degree. In order to determine whether you are eligible to participate in this study, we ask you to answer a few questions about your bachelor’s degree and update your contact information. These questions take about 10 minutes[{IF NO PAY GROUP} no words {else}, and as a token of our appreciation, you will receive $[SCREENER INCENTIVE AMOUNT] upon completion]. You may decline to answer any question or stop at any time. If you have any questions about this study, you may contact the study's director, Jennifer Wine, at 877-225-8470. (To learn more about your rights as a participant, click here.) Do you want to begin the survey screener now?
[If initial screener and NPSAS non-study member]: Soon we will be conducting a study for the U.S. Department of Education about the issues facing college graduates one year after earning their bachelor's degree. In order to determine whether you are eligible for this study, we ask you to answer a few questions about your bachelor’s degree. These questions take about 10 minutes[{IF NO PAY GROUP} no words {else}, and as a token of our appreciation, you will receive $[SCREENER INCENTIVE AMOUNT] upon completion]. You may decline to answer any question or stop at any time. If you have any questions about this study, you may contact the study's director, Jennifer Wine, at 877-225-8470. (To learn more about your rights as a participant, click here.) Do you want to begin the survey screener now?
[else] Recently, we sent you information about a study we’re conducting for the U.S. Department of Education about the issues facing college graduates one year after earning their bachelor's degree. The survey takes about [{if ABBREV = 1} 10 {else} 30 minutes] [{IF NO PAY GROUP} no words {else}, and as a token of our appreciation, you will receive $[INCENTIVE AMOUNT] for participating]. [If EARLY BIRD and NO PAY RESTRICTIONS] If you complete the survey by [EARLY_COMP_DATE], you will receive an additional $[EARLYBIRD_INC].] You may decline to answer any question or stop the survey at any time. If you have any questions about this study, you may contact the study's director, Jennifer Wine, at 877-225-8470. (To learn more about your rights as a participant, click here.) To review the letter that we mailed, click here (PDF letter). To review the study brochure, click here (PDF brochure). Do you want to begin the survey now?
1=Yes, I agree to participate now
2=Not now, but I want to participate at a later time
3=No, I do not want to participate at all
Help Text:
• You are one of approximately 29,000 recent college graduates who will be taking part in this study.
• In addition to your survey responses, we collect financial aid information, student records and related information from your school and sources such as student loan databases and admissions testing agencies.
• Your participation is voluntary and will not affect any aid or other benefits that you may receive.
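The conditional fills in the screen text above can be thought of as a simple template. The sketch below assembles the reintroduced-screener variant from illustrative flags; the flag and variable names are ours, not the instrument's.

```python
# Illustrative sketch of assembling the reintroduced-screener intro text above from
# its conditional fills. Flag and variable names are ours, not the instrument's.

def infopage_text(reintroduced_screener: bool, no_pay_group: bool, screener_incentive: int) -> str:
    incentive_clause = (
        "" if no_pay_group
        else f", and as a token of our appreciation, you will receive ${screener_incentive} upon completion"
    )
    opener = (
        "Recently, we sent you information" if reintroduced_screener
        else "Soon we will send you information"
    )
    return (
        f"{opener} about a study we're conducting for the U.S. Department of Education "
        "about the issues facing college graduates one year after earning their bachelor's degree. "
        "In order to determine whether you are eligible for this study, we ask you to answer a few "
        "questions about your bachelor's degree and update your contact information. "
        f"These questions take about 10 minutes{incentive_clause}. "
        "You may decline to answer any question or stop at any time."
    )

print(infopage_text(reintroduced_screener=True, no_pay_group=False, screener_incentive=10))
```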
[If initial screener] Data collection for B&B begins in a few weeks. Please help us update our records so we can contact you then.
[else if full survey or reintroduction of screener] In about three years, we would like to be able to get in touch with you again to see what you’re doing and what has changed in your life. To find you then, we need to collect some contact information.
Help Text:
All contact information you provide will be kept in secure and protected data files, and will be separate from the responses you've already provided in this survey.
Please click the "Next" button to continue.
[If reintroduced screener] On behalf of the U.S. Department of Education, thank you for your time and cooperation.
[If initial screener] On behalf of the U.S. Department of Education, thank you for completing the B&B survey screener. [{IF ELIGIBLE AND A NPSAS STUDY MEMBER} We will be contacting you soon about the B&B survey.]
[else if END_FLAG=1] Thank you.
[else] On behalf of the U.S. Department of Education, thank you for your time and cooperation. We greatly appreciate your participation in this study.
Help Text:
If you have any questions, please contact our Help Desk at 877-287-3782.
Attachment 2 – Nonresponse Bias Analysis
To estimate nonresponse bias, data on survey characteristics for nonrespondents and respondents are required. Since survey characteristics are not known for nonrespondents, a set of proxy variables is used instead. We examine the following proxy variables, available from the NPSAS:16 enrollment list, the IPEDS header file, and the CPS and NSLDS matches:
institution control (categorical);
institution enrollment from IPEDS file (categorical);
Pell Grant receipt (yes/no);
Pell Grant amount (categorical);
Direct Loan receipt (yes/no);
Direct Loan amount (categorical);
Parent Loan for Undergraduate Students (PLUS) amount (categorical);
federal aid receipt (yes/no);
institutional aid receipt (yes/no);
state aid receipt (yes/no);
any aid receipt (yes/no);
Sex (categorical);
Age Group (categorical); and
Successful match to CPS (yes/no).
The bias (B) in an estimated mean based on respondents, $\bar{y}_R$, is the difference between the expected value of this mean and the target parameter, $\pi$, i.e., the mean that would be estimated if a complete census of the target population were conducted and everyone responded. This bias can be expressed as follows:

$$B(\bar{y}_R) = E(\bar{y}_R) - \pi$$

The estimated mean based on nonrespondents, $\bar{y}_{NR}$, can be computed if data for the particular variable are available for most of the nonrespondents. The true target parameter, $\pi$, can be estimated for these variables as follows:

$$\hat{\pi} = (1 - \eta)\,\bar{y}_R + \eta\,\bar{y}_{NR}$$

where $\eta$ is the weighted unit (or item) nonresponse rate. For the variables that are from the frame, rather than from the sample, $\hat{\pi}$ can be estimated without sampling error. The bias can then be estimated as follows:

$$\hat{B}(\bar{y}_R) = \bar{y}_R - \hat{\pi}$$

or equivalently

$$\hat{B}(\bar{y}_R) = \eta\,(\bar{y}_R - \bar{y}_{NR}).$$
This formula shows that the estimate of the nonresponse bias is the difference between the mean for respondents and nonrespondents multiplied by the weighted nonresponse rate. The variance of the bias will be computed using Taylor Series estimation in RTI’s software package SUDAAN® (RTI International [RTI], 2012).
The relative bias estimate is defined as the ratio of the estimated bias to the sample mean based only on respondent cases, using the base weight, as follows:

$$\widehat{\mathrm{RelBias}}(\bar{y}_R) = \frac{\hat{B}(\bar{y}_R)}{\bar{y}_R}$$

This definition of relative bias provides a measure of the magnitude of the bias relative to the respondent weighted mean.
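As a worked illustration of these formulas, the sketch below computes the estimated bias and relative bias for a few hypothetical proxy variables and summarizes the average and median absolute relative bias; the values are invented and are not B&B:16/17 estimates.

```python
# Illustrative computation of the nonresponse bias quantities defined above.
# Values are hypothetical placeholders, not B&B:16/17 estimates.
from statistics import mean, median

def estimated_bias(mean_resp: float, mean_nonresp: float, nonresp_rate: float) -> float:
    """B-hat(ybar_R) = eta * (ybar_R - ybar_NR), per the formula above."""
    return nonresp_rate * (mean_resp - mean_nonresp)

def relative_bias(mean_resp: float, mean_nonresp: float, nonresp_rate: float) -> float:
    """Estimated bias divided by the respondent mean."""
    return estimated_bias(mean_resp, mean_nonresp, nonresp_rate) / mean_resp

# Hypothetical proxy variables: (respondent mean, nonrespondent mean), with eta = 0.38.
proxies = {
    "pell_receipt": (0.42, 0.47),
    "federal_aid":  (0.68, 0.71),
    "cps_match":    (0.90, 0.84),
}
eta = 0.38
rel = {k: relative_bias(r, nr, eta) for k, (r, nr) in proxies.items()}
abs_rel_pct = [abs(v) * 100 for v in rel.values()]
print(f"average |relative bias| = {mean(abs_rel_pct):.2f}%, median = {median(abs_rel_pct):.2f}%")
```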
A preliminary unit nonresponse bias analysis based on this approach shows that, overall, 50 percent of the proxy variables exhibit significant bias. This nonresponse bias is largely driven by the private nonprofit sector, where 62.9 percent of the proxy variables show significant bias. However, it is important to note that the magnitude of the average and median absolute relative bias is similar to the average and median absolute relative nonresponse bias estimates observed in B&B:08/09, where the response rate was 87 percent. Table 3 presents a summary of the nonresponse bias estimates by institution control and B&B implementation year.
Table 3. Summary of nonresponse bias estimates, by institution control: 2017 and 2009
Institution Control | Percentage of proxy variables with significant bias | Average absolute relative bias | Median absolute relative bias
2017 - preliminary
Total | 50.00 | 5.20 | 3.64
Public | 41.67 | 5.20 | 4.08
Private nonprofit | 62.86 | 6.83 | 6.36
Private for-profit | 16.67 | 4.69 | 3.13
2009
Total | 27.50 | 3.90 | 3.14
Public | 32.43 | 4.56 | 3.95
Private nonprofit | 38.89 | 6.00 | 4.60
Private for-profit | 8.11 | 10.73 | 6.79
1 In order to achieve the expected 82 percent response rate among the eligible sample, the response rate among fielded cases must be at least 87 percent (Page 3, OMB# 1850-0926 v.3). The B&B:16/17 full-scale sample includes 1,352 NPSAS:16 non-study members who were not fielded in data collection and have automatically been designated as nonrespondents.
2 For sample members determined eligible for the B&B:16 cohort, both the full and abbreviated interviews end after locating information is collected. Ineligible sample members will be considered “responding” if they continue through the end of the eligibility section, reaching interview item BB17ABYE [which requests contacting information should they need to be recontacted after eligibility review.]