Memorandum

United States Department of Education
Institute of Education Sciences
National Center for Education Statistics
DATE: November 18, 2020
TO: Robert Sivinski, OMB
THROUGH: Carrie Clarady, OMB Liaison, IES
FROM: Tracy Hunt-White, Team Lead, Postsecondary Longitudinal and Sample Surveys, NCES
SUBJECT: 2019–20 National Postsecondary Student Aid Study (NPSAS:20) End of Data Collection Change Request (OMB# 1850-0666 v.32)
The 2019–20 National Postsecondary Student Aid Study (NPSAS:20) is a nationally representative cross-sectional study of how students and their families finance education beyond high school in a given academic year. NPSAS is conducted by the National Center for Education Statistics (NCES); it was first implemented during the 1986–87 academic year and has been fielded every 3 to 4 years since. This request pertains to the 11th cycle in the NPSAS series, conducted during the 2019–20 academic year. NPSAS:20 is both nationally and state-representative and will serve as the base-year data collection for the 2020 cohort of the Beginning Postsecondary Students Longitudinal Study (BPS:20), a study of first-time beginning postsecondary students that will survey them three years (BPS:20/22) and six years (BPS:20/25) after they begin their postsecondary education. NPSAS:20 will consist of a nationally representative sample of undergraduate and graduate students and a nationally representative sample of first-time beginning students (FTBs). Subsets of questions in the NPSAS:20 student interview will focus on describing aspects of the experience of beginning students in their first year of postsecondary education, including student debt and education experiences.
The request to conduct all activities related to NPSAS:20 was approved by OMB in December 2019 (OMB# 1850-0666 v.25). Those activities include materials and procedures related to the NPSAS:20 student data collection, consisting of abstraction of student data from institutions and a student survey; panel maintenance activities for a NPSAS:20 follow-up field test (for BPS:20/22); and carried-over respondent burden, procedures, and materials related to the NPSAS:20 institution sampling, enrollment list collection, and matching to administrative data files. The NPSAS:20 enrollment list collection from institutions takes place from October 2019 to October 2020, the student records collection from March 2020 through February 2021, and the student survey data collection from March 2020 through January 2021.
NPSAS:20 has been affected in numerous ways by the unprecedented worldwide coronavirus pandemic. One of the most visible effects for the study is on response rates. Institutions experienced delays in providing the enrollment lists needed for sampling, forcing an extension of the overall data collection schedule. Students’ lives were upended in the spring of 2020 as postsecondary institutions made a sudden shift to virtual enrollment in the middle of the spring 2020 semester. In addition to their classes being affected, many students were either required to move out of campus-owned housing or encouraged to move out of off-campus housing and return to their permanent residences. As a result, response rates for student-level data collection are lagging behind expectations for this point in the data collection period. As described in this memo, we would like to target three salient groups with increased contacting and an additional $10 incentive to help increase participation.
This request is to modify our contacting strategy and change the incentive offer for three groups of nonrespondents, to encourage their participation in NPSAS:20: potential first-time beginning students (FTBs), undergraduate students from private for-profit institutions, and undergraduates who did not file a Free Application for Federal Student Aid (called FAFSA non-filers). This submission also requests a change to the student sample size from 150,000 to around 170,000, to bring the survey yield closer to the initial goal of 99,750 respondents. This request does not change the costs to the federal government; the estimated changes to respondent burden, reflecting the response rates currently being observed, are reported in the revised burden table below. The following revisions were made to Part A and to Appendix E, the Student Data Collection Materials.
Background and Purpose for Proposed Changes
The NPSAS student interview sample is selected from enrollment lists submitted by the sampled institutions. Because of differences in institution calendars, institutions submit their enrollment list on a rolling basis. The NPSAS:20 list collection period began in January 2020 and continued through September 2020. Before the coronavirus pandemic, we had anticipated ending enrollment list collection in July 2020. However, we had to extend enrollment list collection due to the difficulty of recruiting institutions. While public institutions tend to submit their institution lists relatively early in the list collection period, continuous enrollment institutions, including many of the private for-profit institutions, typically cannot provide enrollment lists until later in the list collection period to ensure that they are capturing students in their spring terms. In addition, institutions with limited resources have also been difficult to recruit in recent years and often submit lists as late as possible.
As lists are received, RTI samples students from the lists and creates a wave (or batch) of students that can begin student data collection activities. The composition and size of each wave are unique – with varying counts of students from each sector, level, and/or control (public, private for-profit, private nonprofit). However, the end of data collection remains the same, so students from the institutions that submit enrollment lists later in the collection period have less time in data collection and potentially lower interview response rates than other students. For NPSAS:20, waves were generally created every 2 or 3 weeks, beginning in March 2020 and ending in November 2020. NPSAS:20 will include 14 waves (i.e., waves 0 through 13) of student interview cases.
The NPSAS:20 team continuously monitors response rates for each wave and for several key student types in data collection. We use these data, along with data from past cross-sectional and longitudinal studies, to estimate final response rates. As of October 28, 2020, after several months in NPSAS:20 data collection, the current unweighted response rate is 47.9 percent among the eligible sample released to the field (n=134,268), with 64,344 completed interviews, consisting of full (n=59,440), partial (n=3,024), and abbreviated interviews (n=1,880). Approximately 36,000 sample members are still to be fielded for an interview with about two months remaining in data collection. The present response rates are tracking below the expected response rate¹ for this point in data collection and increase the potential for nonresponse bias and reduced precision. The discussion of the key analysis groups (below) describes how an increased number of interview respondents will help reduce nonresponse bias and increase precision.
We estimate the daily cumulative response rate for each wave as a function of various indicators, including data collection interventions (e.g., reminders, prioritizing cases for telephone interviews and prompting, offering the abbreviated interview) and sample composition for each wave (e.g., institution control). The overall projection is the weighted average of the projections for each wave at the current planned end date of data collection on January 31, 2021. For details, see Table 1 in Attachment 1 below, which shows actual and projected response rates by wave. The projected unweighted response rate for the cases already released to the field, in waves 0–11, is about 60 percent. Since NPSAS releases new sample on a rolling basis, this overall response rate of 60 percent will decrease as additional cases are released to the field (i.e., waves 12 and 13), particularly because these cases will have a relatively short time in data collection. Given the time remaining, we expect to achieve an unweighted final response rate of about 57 percent once all cases (waves 0 through 13) are released to the field, if data collection continues as currently designed.
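To illustrate the weighting step, the sketch below combines per-wave projections into an overall projection. It is a minimal sketch with hypothetical figures for two waves; the actual per-wave eligible counts and projections appear in Table 1 of Attachment 1.

```python
# Minimal sketch: the overall projection is a weighted average of per-wave
# projected response rates, weighted by each wave's eligible sample size.
# The two waves below are hypothetical, for illustration only.
waves = [
    {"eligible": 12_900, "projected_rate": 0.733},  # e.g., an early wave
    {"eligible": 8_069, "projected_rate": 0.350},   # e.g., a late wave
]

total_eligible = sum(w["eligible"] for w in waves)
overall = sum(w["eligible"] * w["projected_rate"] for w in waves) / total_eligible
print(f"Projected overall response rate: {overall:.1%}")
```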
To further encourage participation and reduce the potential for nonresponse bias, we propose modifying our contacting strategy and offering an increased incentive for key analysis groups. The three groups are:
Potential First-time Beginning (FTB) Postsecondary Students: FTBs are a critical subgroup of the NPSAS:20 sample and are oversampled accordingly. FTBs in the 2019–20 NPSAS academic year form the basis for the two follow-ups that comprise the Beginning Postsecondary Students Longitudinal Study (BPS). Currently, there are plans to follow up with FTBs in their third (2021–22 academic year) and sixth (2024–25 academic year) years after initial enrollment in postsecondary education. FTBs have historically been a difficult student group to survey. In NPSAS:12, for example, the response rate for FTBs was 60 percent, compared to the overall response rate of 69 percent. Consequently, to secure an adequate number of potential respondents for minimally biased and more precise estimates of the BPS:20 cohort, we need to begin with an adequate sample in the NPSAS base year. Estimating response rates of FTBs in the first and second follow-ups using previous BPS data, we estimated needing 30,000 potential FTBs in the NPSAS base year. Given the shorter than anticipated data collection period for students who attend institutions that submitted lists later than anticipated, an increased incentive offer can help us maximize the participation of potential FTBs.
Undergraduate student sample from private for-profit institutions: As stated above, another result of the pandemic has been the slowed and delayed institution response to requests for the enrollment lists that we use for sampling. Historically, private for-profit institutions have been less likely to submit lists, and NPSAS:20 is no different. Further complicating matters, private for-profit institutions are generally sampled later in the process given their continuous enrollment status. As a result of these two issues, the number of students sampled from private for-profit institutions has been low, and the students who will be sampled near the end of the sampling period will not have adequate time in data collection for sufficient follow-up to maximize response for this group. Because private for-profit institutions constitute an important sector in higher education, NPSAS needs adequate numbers of students from these institutions to represent this unique sector with minimal nonresponse bias and more accurate estimates. The increased incentive offer can help us encourage the participation of students who attended private for-profit institutions.
Undergraduate students who have not filed a FAFSA (called FAFSA non-filers): NPSAS combines administrative and student interview data to create composite variables and impute all missing data to provide a complete data product to users. Approximately 67 percent of student interview respondents file the Free Application for Federal Student Aid (FAFSA), which provides us with high-quality data on important topics that we do not get otherwise (e.g., student earnings, family income for dependent students). Although we can impute these data, it is difficult to do so because it is not a random group who does not file the FAFSA: students from both very wealthy and very poor families tend not to file, though for different reasons. As a result, the interview is a vital source of information for students who do not fill out the FAFSA. In fact, for this reason, our abbreviated interview is largely composed of questions that replicate data obtained from the FAFSA. Obtaining interview data for these students helps ensure that the NPSAS data are minimally biased and more precise.
With the end of data collection approaching, we propose changing design features that might “tip the balance” toward a decision to participate among potential FTBs, students at private for-profit institutions, and those who have not filed a FAFSA. The current data collection plan allows for offering an abbreviated interview as a form of nonresponse conversion. However, preliminary analyses suggest that offering an abbreviated instrument is not the design feature that will most attract these groups. Furthermore, the current incentive plan more generally is also not attracting these subgroups at the desired rates, which is why we recommend a targeted incentive increase.
Support for the additional incentive amounts for these three groups is found in the literature. Leverage-saliency theory (Groves, Singer, and Corning, 2000) suggests that different design features will attract different sample members, so a “one size fits all” incentive amount may not be a good solution to nonresponse. Indeed, differential incentives are not a new idea and have been shown to be successful in bringing in groups of focal importance who were otherwise underrepresented (e.g., Groves, Singer, and Corning, 2000; Groves and Heeringa, 2006; Peytcheva, Kirchner, and Cooney, 2018). In line with such findings, we recommend boosting the $30 baseline incentive by $10, to $40 overall, for cases belonging to any of the three groups of interest that are either newly launched in the last waves of data collection or that have already received the original $30 offer to complete the full survey but have not completed it yet. This strategy is expected to have a longer-term effect: studies on longitudinal data collections show that baseline incentives usually set the retention rate for the survey; thus, larger incentives are often recommended in the base year, without creating an expectation for subsequent waves (Lengacher et al., 1995; Singer, Van Hoewyk, and Maher, 1998; Baker et al., 2010). This is of special importance for subgroups who tend to participate at lower rates or require more effort. When faced with lower than anticipated response rates in the Baccalaureate and Beyond Longitudinal Study (B&B:16/17), main data collection was extended and an additional $10 incentive was offered to prior-round nonrespondents. Eleven percent of all completed interviews were completed during that extension phase (Wine et al., 2019).
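Read operationally, the boost rule is a simple eligibility test applied to active nonrespondent cases. The sketch below is a hypothetical rendering of that rule; the field names are illustrative and do not come from the actual NPSAS:20 case management system.

```python
from dataclasses import dataclass

BASE_INCENTIVE = 30  # original offer, in dollars
BOOST = 10           # targeted increase described above

@dataclass
class Case:
    # Hypothetical flags for an active (non-completed) survey case.
    is_potential_ftb: bool
    is_for_profit_undergrad: bool
    is_fafsa_nonfiler: bool

def incentive_offer(case: Case) -> int:
    """Dollar amount to offer an active nonrespondent, per the rule above:
    cases in any of the three target groups get the $10 boost, whether they
    are newly released or already received (but did not act on) the $30
    offer; all other cases keep the baseline amount."""
    in_target_group = (case.is_potential_ftb
                       or case.is_for_profit_undergrad
                       or case.is_fafsa_nonfiler)
    return BASE_INCENTIVE + BOOST if in_target_group else BASE_INCENTIVE
```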
In addition to providing an incentive boost, we can boost the student sample size at the institutions for which students have yet to be selected, to compensate for lower than expected response rates at institutions where students have already been sampled. Sampling a total of approximately 170,000 students will help bring the number of survey respondents closer to the desired yield of around 99,750. Table 3 from Part A shows the updated sample size; burden does not increase because the number of respondents does not increase.
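As a rough check on these figures, at the projected unweighted response rate of about 57 percent, a sample of approximately 170,000 students would yield on the order of 0.57 × 170,000 ≈ 96,900 survey respondents, compared with roughly 85,500 from the original sample of 150,000, substantially closing the gap to the 99,750 goal.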
Nonmonetary Strategies for Increasing Participation
We consistently employ nonmonetary strategies to encourage participation among targeted groups and increase overall response rates. Over the last few months of data collection, we will implement additional strategies aimed at increasing participation levels primarily among nonrespondents, prioritizing potential FTBs, students from private for-profit institutions, and FAFSA non-filers. We outline these strategies by communication type below. However, we do not believe these strategies alone are sufficient to increase response for these three groups, given the short amount of time left in the data collection period and the fact that, as described above, other strategies for increasing response do not appear to be as effective with these groups.
Tracing / Computer-Assisted Telephone Interview (CATI)
For NPSAS:20, RTI uses a multistage locating approach that starts with batch database searches, including National Change of Address (NCOA), Department of Education databases, Single Best Phone, and Premium Phone. Tracers in RTI’s Tracing Operations Center provide updated contact information for cases where we are having trouble contacting the sample member. Information collected from tracing allows us to prioritize phone numbers for outbound calls based on how recent the provided information is. We plan to rerun batch searches and utilize the Tracing Operations Center staff to identify new telephone information that may be available for outbound calling.
For outbound dialing, our telephone interviewing system identifies all NPSAS outbound calls as coming from a Washington, DC area code (202) with the name “U.S. Dept of Educ Study.” This information helps the sample member distinguish our calls from other unsolicited calls. To keep our calls from being flagged as spam, we will continue to rotate the phone number to a different 202 number every two weeks. Periodically rotating the numbers is a call center best practice that helps our calls be received as intended, without a spam warning attached. In addition, we prioritize cases in special dialing queues to provide targeted effort on any cases that require more attention. For example, we move cases from specific sampling waves to the priority dialing queues several weeks prior to setting those cases to the abbreviated survey, to maximize our response rate for full surveys. We will continue to give FTBs and other targeted cases additional priority over the remaining weeks of data collection.
During the final months of data collection, we will also update CATI protocols, including changing the voicemail message periodically to include language that notes data collection end dates and, as appropriate, updating calling parameters to shorten or extend the duration between calls – for example (see the configuration sketch after this list):
Changing answering machine callback delays to 5 days instead of 7;
Changing email password contacts to 7 days instead of 14;
Changing contacts for sample members who have indicated they would complete the survey on the web on their own to 7 days instead of 10;
Leaving voicemails on every other answering machine event, instead of every 3rd answering machine event; and
Placing additional calling effort on those cases that have been in data collection for the least amount of time and decreasing or eliminating effort on cases that have been in data collection the longest.
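Taken together, these adjustments amount to a small configuration change in the calling system. The sketch below is a hypothetical rendering of that configuration; the parameter names are illustrative, with values mirroring the list above.

```python
# Hypothetical end-of-collection CATI calling parameters; names are
# illustrative only, values mirror the adjustments listed above.
end_of_collection_params = {
    "answering_machine_callback_days": 5,     # was 7
    "email_password_contact_days": 7,         # was 14
    "web_self_completion_followup_days": 7,   # was 10
    "voicemail_every_nth_machine_event": 2,   # was every 3rd event
}
```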
Emails
Our email efforts will continue to be a focus in the final months of data collection. We will increase the frequency of our email communications from the current rate of approximately once every 10 days to once per week. We will also send ‘targeted’ email communications to nonrespondents, including potential FTBs, students from private for-profit institutions, and FAFSA non-filers. These communications will include language that is more appealing to those groups, such as the current email reminders from Appendix E that refer directly to sample members’ NPSAS schools (to target private for-profit schools) and updated merge fields from Appendix E that are more appropriate for FTB and non-FAFSA students.
We also expect to send emails directly from the NPSAS Project Officer, from an @ed.gov email address, as a more ‘personal’ plea to complete the survey. Sample members are accustomed to getting email communications from [email protected], and some may even be blocking emails from [email protected], so an email from a trusted authority (a U.S. Department of Education employee) may legitimize the study request and spur participation. For example, we can send Email Reminder 18 approximately 10 weeks prior to data collection end, then send an additional email from the Project Officer the last week of data collection.
We monitor all responses that we receive from email communications for wrong email addresses, bouncebacks (bad email addresses), and requests for communications in a different language to help ensure we are reaching the correct sample members (and in the language they prefer).
Mailings
Our reminder mailings utilize different formats with an NCES return address to increase survey legitimacy. These formats include:
letters mailed in 9x12 and #10 windowed envelopes, and
5.5x8.5 bifold and 3.75x7.75 trifold postcards.
Like emails, we expect to increase the frequency of mailings during the final months of data collection. We can increase the mailing frequency from the current rate of approximately one mailing every 3 weeks to two mailings per month.
As with emails, we plan to send ‘targeted’ communications, such as letters that refer directly to sample members’ NPSAS schools to target private for-profit institutions (e.g., Letters 3–5 from Appendix E).
SMS Text Messages
Like mailings and emails, we expect to increase the frequency of our text communications from the current rate of approximately every 15 days to every 7–10 days. Additionally, we will send targeted text messages (text reminders 2, 12, and 19 in Appendix E) that refer to the sample members’ NPSAS schools (private for-profit institutions), in addition to updated merge fields from Appendix E that may appeal directly to potential FTBs and non-FAFSA sample members.
As with email, we monitor all responses that we receive from text messages for wrong numbers, bouncebacks (bad/untextable phone numbers), and requests for communications in a different language, and update these accordingly.
NPSAS:20 has been affected in numerous ways by the unprecedented worldwide coronavirus pandemic. One of the most visible effects for the study is on response rates. Students’ lives were upended in the spring of 2020 as postsecondary institutions made a sudden shift to virtual enrollment in the middle of the spring 2020 semester. In addition to their classes being affected, many students were either required to move out of campus-owned housing or encouraged to move out of off-campus housing and return to their permanent residences. As a result, response rates for student-level data collection are lagging behind expectations for this point in the data collection period. To try to correct for this trend, we propose targeting three salient groups (potential first-time beginning students (FTBs), undergraduate students from private for-profit institutions, and undergraduates who did not file a Free Application for Federal Student Aid (called FAFSA non-filers)), with increased contacting and an additional $10 incentive to help increase participation.
Table 3. Average estimated burden to institution and student respondents for the NPSAS:20 data collection
| Data collection activity | Sample | Expected eligible | Expected response rate (percent) | Expected number of respondents | Expected number of responses | Average time burden per response (mins) | Total burden (hours) |
|---|---|---|---|---|---|---|---|
| Institutional collection | | | | | | | |
| Eligibility-screening calls | 853 | 845 | 100 | 845⁴ | 845 | 5 | 71 |
| Institution registration page | 3,106 | 3,075 | 85 | 2,614 | 2,614 | 10 | 436 |
| Institutional enrollment lists | 3,106 | 3,075 | 85 | 2,614⁴ | 2,614 | 300 | 13,070 |
| Institutional collection subtotal¹ | | | | 2,614 | 6,073 | | 13,577 |
| Student collection | | | | | | | |
| Student record collection² | 2,614 | 2,614 | 93 | 2,431⁴ | 2,431 | 1,800 | 72,930 |
| Student survey: | | | | | | | |
| Full survey | | 161,500 | 47 | 75,910 | 75,910 | 30 | 37,955 |
| Abbreviated survey | | 161,500 | 7 | 11,300 | 11,300 | 10 | 1,883 |
| NRFU survey | | 161,500 | 2 | 3,200 | 3,200 | 3 | 160 |
| BPS:20/22 field test panel maintenance | 3,400 | --- | 15 | 510⁴ | 510 | 3 | 26 |
| Student collection subtotal¹ | | | | 90,410 | 93,351 | | 112,954 |
| Total | | | | 93,024³ | 99,424 | | 126,531 |
1 Gray font depicts activities for which burden is being carried over but not requested in this submission as it was approved in the NPSAS 2020 institution data collection package (OMB# 1850-0666 v.23-24). The subtotals for the student collection represent all burden newly requested in this submission.
2 The sample for student record collection is the number of institutions that provide enrollment lists for student sampling.
3 This total count represents the unduplicated sum of all estimated student survey respondents plus the number of estimated responding institutions.
4 These expected numbers of respondents are not included in the subtotal and total count because these respondents are accounted for in adjacent cells above.
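The burden-hours column in Table 3 is simply expected responses multiplied by average minutes per response and divided by 60. Below is a minimal sketch of that arithmetic using the Table 3 figures (activity labels follow the reconstructed table above); totals agree with the table to within one hour of rounding.

```python
# Burden hours = expected responses x average minutes per response / 60,
# rounded to whole hours; figures are taken from Table 3 above.
rows = [
    ("Eligibility-screening calls",                845,     5),
    ("Institution registration page",            2_614,    10),
    ("Institutional enrollment lists",           2_614,   300),
    ("Student record collection",                2_431, 1_800),
    ("Full survey",                             75_910,    30),
    ("Abbreviated survey",                      11_300,    10),
    ("NRFU survey",                              3_200,     3),
    ("BPS:20/22 field test panel maintenance",     510,     3),
]

for activity, responses, minutes in rows:
    hours = round(responses * minutes / 60)
    print(f"{activity}: {hours:,} hours")
# Summing the student collection rows reproduces the 112,954-hour subtotal.
```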
Revisions were made to Appendix E to address the additional $10 incentive offer to specific nonrespondents to encourage their participation, including:
potential FTBs,
students from private for-profit institutions, and
FAFSA non-filers.
Below is a summary of the changes (See Attachment 2 below for detailed changes).
Added – 1 new sample member communication (letter, email, and SMS text) that specifically addresses the additional $10 incentive offer. The mail communication appears on page E-36, the email communication on page E-87, and the SMS text communication on page E-139.
Added – 1 new sample member communication (letter, email, and SMS text) as a holiday communication, like the other holiday communications included for the Fourth of July and Thanksgiving. The letter communication appears on page E-37, the email communication on page E-95, and the SMS text communication on page E-139.
Revised – The text of several communications was updated to include language announcing the $10 incentive boost for targeted nonrespondents. The revised text for these communications appears in:
Mail communications on pages E-22, E-26, E-28, E-32, E-36 (new), E-37 (new), and E-43.
Email communications on pages E-62 through E-73, E-75, E-76, E-78 through E-81, E-84, E-87 (new), E-92, and E-94 through E-95.
SMS text communications on pages E-133 through E-137, and E-139 (new).
Merge fields:
Letter merge fields on page E-149
Email merge fields on page E-151
Postcard merge fields on page E-154
Text merge fields on page E-155
References:
Baker R, Blumberg S, Brick M, et al. AAPOR report on online panels. Public Opinion Quarterly. 2010;74(4):711-781.
Groves RM, Singer E, Corning AD. Leverage-salience theory of survey participation: Description and an illustration. Public Opinion Quarterly. 2000;64:299-308.
Groves RM, Heeringa SG. Responsive design for household surveys: tools for actively controlling survey errors and costs. Journal of the Royal Statistical Society: Series A (Statistics in Society). 2006;169(3):439-457. doi:10.1111/j.1467-985X.2006.00423.x.
Lengacher JE, Sullivan CM, Couper MP, et al. Once reluctant, always reluctant? Effects of differential incentives on later survey participation in a longitudinal study. In: Proceedings of the Section on Survey Research Methods, American Statistical Association. Alexandria, VA: ASA; 1995:1029-1034. (Paper presented at the AAPOR Conference, Fort Lauderdale, May.)
Peytcheva E, Kirchner A, Cooney J. Experimental comparison of two data collection protocols for previous wave nonrespondents. Paper presented at: Methodology of Longitudinal Surveys II conference; 2018; Essex, U.K.
Singer E, Van Hoewyk J, Maher M. Does the payment of incentives create expectation effects? Public Opinion Quarterly. 1998;62:152-164.
Wine J, Tate N, Thomsen E, Cooney J, Haynes H. 2016/17 Baccalaureate and Beyond Longitudinal Study (B&B:16/17) Data File Documentation (NCES 2020-441). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics; 2019. Retrieved November 10, 2020, from https://nces.ed.gov/pubsearch.
Attachment 1

This attachment provides actual and projected response rates by wave, under the assumption that data collection continues as currently designed (that is, without the requested incentive boost but with the sample increase). All projections for waves 0 to 9 are based on time series regression modeling with the daily cumulative response rates as the dependent variable (with Newey-West standard errors). Models assume a third-order autocorrelation (lag = 3). In all projection models, based on all available data as of 10/28/2020, the actual response rates and the projections at the current date never deviate by more than 0.5 percentage points per wave through wave 7. Given the shorter time in data collection, the projections for the later waves are slightly more variable.
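For illustration, the sketch below fits a regression of this general form to a synthetic daily cumulative response-rate series for one wave and extrapolates it to the end of data collection. It is a minimal sketch using the statsmodels library: the synthetic data and the simple trend specification are assumptions, since the production models also include intervention and sample-composition indicators.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic daily cumulative response rates (percent) for one wave;
# a placeholder growth curve standing in for the observed series.
days = np.arange(1, 90)
rates = 75 * (1 - np.exp(-days / 60))

# Illustrative trend terms; the production models also include data
# collection intervention indicators and wave sample composition.
X = sm.add_constant(np.column_stack([days, np.log(days)]))

# Newey-West (HAC) standard errors with maxlags = 3, matching the
# third-order autocorrelation (lag = 3) assumption described above.
fit = sm.OLS(rates, X).fit(cov_type="HAC", cov_kwds={"maxlags": 3})

# Extrapolate the cumulative response rate to the planned end of data
# collection (e.g., day 150 in the field for this hypothetical wave).
end_day = 150.0
x_end = np.array([[1.0, end_day, np.log(end_day)]])
print(fit.predict(x_end))
```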
This type of projection modeling was not applied to waves that have either not been in data collection long enough to yield a stable prediction (waves 10 and 11) or have not yet been released to the field (waves 12 and 13). For those waves, we combine the actual or projected response rates from waves that are similar in sample composition (waves 6 to 9) at comparable points in time. We use these ensemble projections to derive an overall projection across all waves currently released to the field, as well as one that includes the waves not yet released.
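A minimal sketch of this ensemble step is below, under the simplifying assumption that the donor waves (6 to 9) are averaged with equal weight at a comparable day in the field; the placeholder curves and the equal weighting are illustrative assumptions.

```python
import numpy as np

# Placeholder projected daily cumulative response-rate curves (percent),
# indexed by day in the field, for the donor waves 6-9.
donor_curves = {
    6: np.linspace(0, 59, 185),
    7: np.linspace(0, 54, 164),
    8: np.linspace(0, 48, 139),
    9: np.linspace(0, 47, 129),
}

def ensemble_rate(day_in_field: int) -> float:
    """Average the donor waves' rates at a comparable day in the field,
    using only donors whose curves extend that far."""
    rates = [curve[day_in_field] for curve in donor_curves.values()
             if day_in_field < len(curve)]
    return sum(rates) / len(rates)

# A late-released wave (e.g., wave 13) with about 60 days left before the
# end of data collection is projected at the donors' average rate at the
# same point in their own collection periods.
print(round(ensemble_rate(59), 1))
```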
These derived projections suggest that we will reach a maximum unweighted response rate (including final partials) of about 60 percent for cases already released into the field by the end date of data collection. These estimates differ considerably by priority status. For example, for the non-priority cases we expect a response rate of about 70 percent compared to only 52 percent for the priority cases at the end of data collection in the waves already released into the field. Once we include the cases not yet released into the field (waves 12 and 13), we expect a maximum overall unweighted response rate (including surveys that are finalized as partial completes) of about 57 percent at the end of data collection.
Table 1 displays the current preliminary response rates by wave, overall and by priority status, and their corresponding projections on October 28, 2020, both for the original end of data collection (January 2, 2021) and for the planned end of data collection (January 31, 2021), for all waves already released to the field. The estimates shown in Table 1 treat final partials as respondents.
Table 1. Data Collection Actual and Projected Response Rates by Waves*
| Data Collection Wave | Date Started Data Collection | # of Days in Data Collection, as of 10/28/2020 | Eligible Sample (10/28/20) | Actual Response Rate (10/28/20) | Projection Until 10/28/2020 | Projection Until 1/2/2021 | Projection Until 1/31/2021 |
|---|---|---|---|---|---|---|---|
| Overall (Wave 0 to 11) | | | 134,268 | 47.9% | 47.5% | 57.6% | 59.9% |
| Non-Priority Cases | | | 58,886 | 57.9% | 58.1% | 67.4% | 69.8% |
| Priority Cases | | | 74,501 | 40.1% | 40.0% | 49.6% | 52.0% |
| Wave 0 (Calibration) | March 2, 2020 | 240 | 6,033 | 70.9% | 70.9% | 72.9% | 73.6% |
| Non-Priority Cases | | | 3,389 | 76.8% | 76.7% | 78.5% | 79.1% |
| Priority Cases | | | 2,644 | 63.4% | 63.3% | 65.4% | 66.2% |
| Wave 1 | April 22, 2020 | 189 | 12,900 | 70.7% | 70.6% | 72.6% | 73.3% |
| Non-Priority Cases | | | 7,347 | 75.5% | 75.3% | 77.3% | 78.1% |
| Priority Cases | | | 5,553 | 64.5% | 64.4% | 66.4% | 67.2% |
| Wave 2 | June 3, 2020 | 147 | 16,252 | 65.1% | 64.9% | 69.2% | 70.7% |
| Non-Priority Cases | | | 8,627 | 71.8% | 71.6% | 75.9% | 77.4% |
| Priority Cases | | | 7,625 | 57.5% | 57.4% | 61.7% | 63.2% |
| Wave 3 | June 12, 2020 | 138 | 8,160 | 65.1% | 64.9% | 70.2% | 72.0% |
| Non-Priority Cases | | | 4,274 | 71.9% | 71.8% | 76.8% | 78.5% |
| Priority Cases | | | 3,886 | 57.5% | 57.4% | 62.9% | 64.7% |
| Wave 4 | July 6, 2020 | 114 | 14,008 | 55.9% | 55.8% | 64.2% | 62.9% |
| Non-Priority Cases | | | 5,786 | 65.0% | 64.8% | 70.1% | 71.8% |
| Priority Cases | | | 8,222 | 49.4% | 49.5% | 54.8% | 56.6% |
| Wave 5 | July 17, 2020 | 103 | 9,216 | 55.7% | 55.6% | 67.7% | 63.6% |
| Non-Priority Cases | | | 4,223 | 63.7% | 63.5% | 70.1% | 72.2% |
| Priority Cases | | | 4,993 | 48.8% | 48.9% | 54.8% | 56.6% |
| Wave 6 | July 31, 2020 | 89 | 10,322 | 51.3% | 51.0% | 57.2% | 59.1% |
| Non-Priority Cases | | | 3,967 | 58.4% | 58.5% | 65.0% | 67.0% |
| Priority Cases | | | 6,355 | 46.8% | 46.3% | 52.3% | 54.1% |
| Wave 7 | August 21, 2020 | 68 | 13,267 | 43.0% | 42.7% | 51.8% | 54.4% |
| Non-Priority Cases | | | 4,771 | 51.7% | 51.8% | 61.9% | 64.9% |
| Priority Cases | | | 8,496 | 38.1% | 37.7% | 46.1% | 48.4% |
| Wave 8 | September 15, 2020 | 43 | 12,897 | 34.8% | 33.3% | 45.4% | 48.4% |
| Non-Priority Cases | | | 5,400 | 41.3% | 39.0% | 53.4% | 56.8% |
| Priority Cases | | | 7,497 | 30.2% | 28.7% | 39.4% | 42.0% |
| Wave 9 | September 25, 2020 | 33 | 11,764 | 28.4% | 28.4% | 43.6% | 47.1% |
| Non-Priority Cases | | | 4,596 | 37.8% | 37.8% | 56.9% | 61.3% |
| Priority Cases | | | 7,168 | 22.3% | 22.4% | 35.1% | 38.0% |
| Wave 10 | October 9, 2020 | 19 | 11,380 | 24.3% | NA* | NA* | NA* |
| Non-Priority Cases | | | 4,468 | 29.0% | NA* | NA* | NA* |
| Priority Cases | | | 6,912 | 21.2% | NA* | NA* | NA* |
| Wave 11 | October 23, 2020 | 5 | 8,069 | 6.1% | NA* | NA* | NA* |
| Non-Priority Cases | | | 2,038 | 8.6% | NA* | NA* | NA* |
| Priority Cases | | | 6,031 | 5.3% | NA* | NA* | NA* |
| Projection overall (Wave 0 to 13) | | | 170,000 | | | 53.6% | 56.7% |
Notes: Denominator in all analyses excludes ineligibles. Final partials are counted as respondents.
* Projections displayed only include those based on the time series modeling approach for waves 0 to 9.
Attachment 2

#1 Incentive Boost Letter* (page E-36)
«date»
Study ID: «caseid»
«addr1»
«addr2»
«city», «state» «zip» «zip4»
Dear «fname»,
[INSERT MERGE FIELD FROM TABLE – TARGETED STUDENT GROUP, PAGE E-149] In fact, your participation is so crucial, you have been selected to receive an additional $10 for completing your NPSAS survey—that’s a total of $«inc_amount»! Complete your «time»-minute survey today and receive your additional $10.
To complete your survey today, go to the NPSAS website and log on using your study ID and password below:
https://surveys.nces.ed.gov/npsas/
Study ID: «caseid»
Password: «password»
Note: Your password is case sensitive; you will need to enter it exactly as it appears here.
Or use the camera on your phone to scan the QR code below to take you to the «survey»/«website»:
«QRCODE»
If you have questions, need help completing your survey online, or prefer to complete the survey over the telephone, simply call the NPSAS Help Desk at 877-677-2766 or e-mail us at [email protected].
Thank you, in advance, for your participation.
Para solicitar materiales de contacto en español en el futuro, por favor llame al 877-677-2766 o envíe un e-mail a [email protected].
Tracy Hunt-White, Ph.D. | Project Officer, NPSAS | National Center for Education Statistics | [email protected] | 202-245-6507
Jennifer Wine, Ph.D. | Project Director, NPSAS | RTI International | [email protected] | 877-225-8470
«panelinfo»/«controlID»
#2 Holiday Mailing (page E-37)
OUTSIDE CARD TEXT: Warmest Holiday Greetings of the Season!
INSIDE CARD TEXT: From all of us on the NPSAS survey team, we wish you happy holidays.
BACK OF CARD TEXT: The 2019–20 National Postsecondary Student Aid Study (NPSAS:20) is a national study of approximately 150,000 students enrolled in postsecondary education. The National Center for Education Statistics (NCES) in the U.S. Department of Education’s Institute of Education Sciences has contracted with RTI International to collect data for NPSAS on its behalf. OMB Control Number: 1850-0666
Help Desk: 877-677-2766
Holiday Mailing Insert
«fname»,
Don’t forget to participate in NPSAS – take <<time>> minutes and [IF INCENTIVE ELIGIBLE: <<receive <<inc_amount>> when you >>] complete the survey! We rely on students like you to make NPSAS a success.
Go to https://surveys.nces.ed.gov/npsas/
Enter your Study ID: <<caseid>>
Enter your Password: <<password>>
OR
Scan this QR code: <<QRCODE>>
If you have questions, problems completing your survey online, or prefer to complete the survey over the telephone, simply call the NPSAS Help Desk at 877-677-2766.
We appreciate your help.
OMB Control Number: 1850-0666
Learn more about our confidentiality procedures at https://surveys.nces.ed.gov/npsas/confidentiality.aspx
#3 Incentive Boost E-mail (page E-87)
SUBJECT: We Just Increased the Incentive for Your Participation in NPSAS
Hi, «fname»,
[INSERT MERGE FIELD FROM TABLE- TARGETED STUDENT GROUP, PAGE E-151] In fact, your participation is so crucial, you have been selected to receive an additional $10 for completing your NPSAS survey—that’s a total of «inc_amount» payable by «PayPal or »check! Complete your «time»-minute survey today and receive your additional $10.
Or, you can visit the NPSAS website and log in: https://surveys.nces.ed.gov/npsas/
Study ID: «caseID»
Password: «password»
If you have questions or prefer to participate by telephone, please call 877-677-2766.
Thanks in advance for your participation.
[INSERT MERGE FIELD FROM TABLE – SOURCE AND SIGNATORY, PAGE E-149]
OMB Control Number: 1850-0666
Learn more about our confidentiality procedures at https://surveys.nces.ed.gov/npsas/confidentiality.aspx
«emailID»
Haga clic aquí para solicitar materiales de contacto en español.
#4 Holiday E-mail (page E-95)
SUBJECT LINE (holiday theme): «fname», wishing you a wonderful holiday season!
Dear «fname»,
Happy Holidays from the NPSAS study team!
We’ve been trying to contact you regarding your participation in the National Postsecondary Student Aid Study. [INSERT MERGE FIELD FROM TABLE – TARGETED STUDENT GROUP, PAGE E-151]
I hope that you will take time out of your busy schedule to complete your survey today [IF INCENTIVE ELIGIBLE: «and receive a little extra money just in time for the holidays»].
Here are the important details:
[IF INCENTIVE ELIGIBLE AND NO BOOST: «You’ll receive $«inc_amount» when you complete the survey, payable by check« or PayPal».»] // [IF INCENTIVE ELIGIBLE AND BOOST: «Because your participation is so important, you have been selected to receive an additional $10 for completing your NPSAS survey, for a total of $«inc_amount».»] // [IF NOT INCENTIVE ELIGIBLE: «You were selected to represent many students at «NPSASschool» and the study won't be a success without you!»]
It will take about «time» minutes.
Click the link below or login at https://surveys.nces.ed.gov/npsas/
Study ID: «caseID»
Password: «password»
Alternatively, you can complete the survey over the phone: 877-677-2766.
If you have questions or problems completing your survey, simply contact the NPSAS Help Desk at 877-677-2766 or [email protected].
Thank you for helping make NPSAS a success.
[INSERT MERGE FIELD FROM TABLE – SOURCE AND SIGNATORY, PAGE E-149]
OMB Control Number: 1850-0666
Learn more about our confidentiality procedures at https://surveys.nces.ed.gov/npsas/confidentiality.aspx
«emailID»
Por favor responde a este correo electrónico para solicitar materiales en español.
#5 Incentive Boost Announcement Text (page E-139)
US DEPT OF EDUC: <<fname>>, [INSERT MERGE FIELD FROM TABLE – TARGETED STUDENT GROUP, PAGE E-155], so we are increasing the incentive for your <<time>>-minute survey to <<inc_amount>>! Click here: [bitly link]. Reply STOP to opt out of future text messages. Responde “Español” para solicitar este mensaje en español.
#6 Holiday Text (page E-139)
US DEPT OF EDUC: Happy Holidays from the NPSAS team, <<fname>>! We hope you can find time to complete your <<shortened >><<time>>-minute NPSAS survey[IF INCENTIVE ELIGIBLE: and <<now>><<still>> receive <<inc_amount>>]. Click here to start: [bitly link]. Reply STOP to opt out of future text messages. Responde “Español” para solicitar este mensaje en español.
1 We expected to achieve at least a 70 percent response rate among the eligible sample overall (Part A, page 14, OMB# 1850-0666 v.25).