DATE: December 8, 2021
TO: Robert Sivinski, OMB
THROUGH: Carrie Clarady, NCES
FROM: Tracy Hunt-White and David Richards, NCES
Re: 2020/22 Beginning Postsecondary Students (BPS:20/22) Full-Scale (OMB # 1850-0631 v.19)
The 2020/22 Beginning Postsecondary Students Longitudinal Study (BPS:20/22), conducted by the National Center for Education Statistics (NCES), is designed to follow a cohort of students who enroll in postsecondary education for the first time during the same academic year, irrespective of the date of high school completion. The request to conduct the BPS:20/22 field test was last approved in December 2020 (OMB# 1850-0631 v.18), and we are now requesting to conduct the full-scale data collection. The BPS:20/22 full-scale materials and procedures included in this submission are based on those approved for the field test, with a number of clarifications and revisions based on field test experiences, experiment results, and Technical Review Panel feedback. This memo summarizes the revisions we made to the original approved package to reflect the finalized full-scale plans.
For each revised section of the package, we note in this memo the substantive differences and provide the rationale for the changes. Note that additions/revisions are marked in a green font and deletions in red.
Supporting Statement Part A
Changes in Part A include the removal of reference to field test plans and the inclusion of updated plans for the full-scale study. Also, editorial revisions were made to improve clarity.
Preface
This section has been revised to remove references to field test plans and includes editorial revisions to improve clarity.
Section 1. Collection Circumstances
This section was revised to remove references to the field test sampling and includes new text about the full-scale BPS:20/22 cohort. Below is the new text on the full-scale cohort in this section:
BPS:20/22 will be a nationally representative sample of approximately 37,000 students who were first-time beginning students (FTBs) during the 2019-20 academic year. The BPS:20/22 field test included approximately 3,700 students who first began in the 2018-19 academic year. These students are asked to complete a survey, and administrative data are also collected for them. Administrative data matching will be conducted with sources including the National Student Loan Data System (NSLDS), which contains federal loan and grant files; the Central Processing System (CPS), which houses and processes data contained in Free Application for Federal Student Aid (FAFSA) forms; the National Student Clearinghouse (NSC), which provides enrollment and degree verification; vendors of national undergraduate, graduate, and professional student admission tests; and possibly other administrative data sources such as the Veterans Benefits Administration (VBA). These data will be obtained through file matching and downloading. In addition, this request includes conducting panel maintenance activities for the BPS:20/25 field test sample. BPS:20/25 is anticipated but has not yet been authorized.
Section 2. Information Purpose
This section has been revised to include reference to Spanish reference materials and to address survey representativeness. Minor editorial revisions were made to improve clarity.
Section 3. Collection Techniques
Minor editorial revisions were made to this section to improve clarity.
Section 6. Consequences Not Conducted/Conducted Less
The following language referencing the oversample of certificate seekers was removed:
The new oversample of certificate seekers that will be available for the full-scale study, combined with the longitudinal nature of BPS, allows for analysis of how the value of these credentials shifts in response to market forces.
Section 8. Outside Agency Consultation/Federal Register Notice
Introduction language referencing what is included in this section was added for clarity. Minor editorial revisions were made in this section to improve clarity.
Here is the new introduction text in this section:
Included in this section is a copy and details identifying the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d).
Section 9. Respondent Gift/Payment
This section has been revised to reflect the final incentive plans for full-scale data collection and includes a description of planned processes for matching to a federal database maintained by the U.S. Department of the Treasury’s Office of Foreign Assets Control (OFAC) that will be used to identify sample members who cannot be offered an incentive.
The full-scale incentive plan is pasted below for reference:
The monetary incentives for this respondent class are valued at $30-$67 and include the following: BPS:20/22 full-scale data collection will involve two distinct data collection groups (i.e., default and aggressive groups) and four main data collection phases (i.e., early completion, production phases 1 and 2, and nonresponse conversion). This general setup builds upon the designs implemented in other longitudinal studies, where it has contributed to maximizing response rates and minimizing the potential for nonresponse bias (e.g., BPS:12/14, BPS:12/17, B&B:16/17, B&B:08/18, and the BPS:20/22 field test).
In BPS:20/22 we plan to implement differential treatments based on prior-round response status, an approach that was successfully implemented in the B&B:16/17 field test, where the response rate among NPSAS:16 field test nonrespondents who received the aggressive protocol was about 12 percentage points higher than among those who received the default protocol (37% versus 25%; t(2,097) = 3.52, p < .001).
For the BPS:20/22 full-scale design, we will distinguish the following data collection groups and design protocols:
Default Protocol: NPSAS:20 survey respondents.
Aggressive Protocol: NPSAS:20 survey nonrespondents, including sample members who failed to respond to NPSAS:20 and those who are NPSAS:20 administrative-only sample members (who were never invited to complete the NPSAS:20 student survey), who are potential academic year 2019-20 FTBs based on administrative data.
The baseline incentive for the default protocol will be $30 along with the possibility of a $10 boost incentive applied later in data collection.
The baseline incentive for the aggressive protocol will be $45, paid in addition to a $2 prepaid incentive and a $20 boost incentive applied later in data collection. The maximum possible total incentive is $40 for the default data collection protocol and $67 for the aggressive data collection protocol.
Aggressive protocol prepaid incentives:
$2 prepaid incentive - cash prepaid incentives have been shown to significantly increase response rates in both interviewer-administered as well as self-administered surveys and hence reduce the potential for nonresponse bias (e.g., Church 1993; Cantor et al. 2008; Goeritz 2006; Medway and Tourangeau 2015; Messer and Dillman 2011; Parsons and Manierre 2014; Singer 2002). During the early completion phase in the B&B:16/17 field test, prepaid incentives ($10 via check or PayPal) in combination with telephone prompting also significantly increased response rates by 4.4 percentage points in the aggressive protocol implemented for prior round nonrespondents. Further, in a B&B:16/20 calibration experiment, the overall response rate for sample members who were given a prepaid incentive in cash was not statistically significantly different from those who were prepaid using PayPal, indicating that both prepayment methods can achieve similar response rates (Kirchner et al. 2021). Given these positive findings combined with general recommendations in the literature (e.g., Singer and Ye 2013; DeBell et al. 2019), BPS:20/22 will send all prior round nonrespondents in the aggressive protocol a $2 prepaid "visible" cash incentive and notify them of this prepaid incentive in the data collection announcement letter. Where necessary due to low address quality, a $2 prepaid PayPal incentive announced on a separate index card will be sent in lieu of a cash incentive.
$10 and $20 post-paid boost incentive for nonresponse conversion - incentive boosts are successful nonresponse conversion strategies, increasing response rates across various modes of data collection (e.g., Singer and Ye, 2013; Dykema et al., 2015; Stevenson et al., 2016; Lynn, 2017), especially among sample members who have implicitly or explicitly refused to complete the survey (e.g., Groves and Heeringa 2006). Incentive boosts are especially common in large federal surveys during their nonresponse follow-up phase (e.g., the Centers for Disease Control and Prevention's National Survey of Family Growth) and have been implemented successfully in other postsecondary education surveys (e.g., HSLS:09 Second Follow-up; BPS:12/17; NPSAS:20). For BPS:20/22, a $20 post-paid incentive boost is planned for all remaining nonrespondents in the aggressive protocol, and $10 for all remaining nonrespondents in the default protocol. This incentive boost will be offered during Production Phase 2 for both protocols. The boost incentive will be offered in addition to the baseline incentive, so the total incentive for completing the survey can include the combined totals of the baseline and boost amounts.
Prior to the start of data collection, BPS:20/22 sample members will be matched to a federal database maintained by the U.S. Department of the Treasury's Office of Foreign Assets Control (OFAC). OFAC administers and enforces economic and trade sanctions based on U.S. foreign policy and national security goals. As part of its enforcement efforts, OFAC publishes a list of individuals and companies called the “Specially Designated Nationals List” or SDN. Their assets are blocked and U.S. entities are prohibited from conducting trade or financial transactions with those on the list (https://www.treasury.gov/resource-center/sanctions/Pages/default.aspx). To determine if there are any BPS:20/22 sample members to whom NCES cannot offer an incentive, the sample members will be matched to the SDN using the Jaro-Winkler and Soundex algorithms recommended by OFAC. To avoid over-matching, BPS:20/22 staff will review the cases based on full name, date of birth, and address. The small number of individuals who cannot be confirmed as not matching the SDN list will receive a survey request without an incentive offer.
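As a rough illustration of the phonetic half of this matching step, the sketch below implements a simplified Soundex encoding in Python. The names are hypothetical, and treating H, W, and Y like vowels is a simplification of the full algorithm; the actual SDN screening also applies Jaro-Winkler string similarity before the clerical review described above.

```python
def soundex(name: str) -> str:
    """Simplified Soundex: first letter plus up to three digits
    encoding the remaining consonants (vowels, H, W, Y dropped)."""
    mapping = {c: d for d, letters in {
        "1": "BFPV", "2": "CGJKQSXZ", "3": "DT",
        "4": "L", "5": "MN", "6": "R"}.items() for c in letters}
    letters = [c for c in name.upper() if c.isalpha()]
    if not letters:
        return ""
    digits = [mapping.get(c, "") for c in letters]  # vowels map to ""
    out, prev = [], digits[0]
    for d in digits[1:]:
        if d and d != prev:  # collapse runs of the same code
            out.append(d)
        prev = d
    return (letters[0] + "".join(out) + "000")[:4]

# Names that sound alike share a code, so spelling variants still
# surface as candidate matches for review.
print(soundex("Robert"), soundex("Rupert"))  # R163 R163
```

Because phonetic codes over-match by design, candidates flagged this way would still be narrowed by a string-similarity score and the full name, date of birth, and address review.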
The justification for the aforementioned incentives is as follows: The use of incentives for completion of the student survey can provide significant advantages to the government in terms of increased response rates and higher quality data with minimal nonresponse bias. In addition, the use of incentives may also result in decreased data collection costs due to improved efficiency.
Section 10 - Assurance of Confidentiality
Revisions to the informed consent language that is part of the student survey are included in this section. Minor editorial revisions were made to this section to improve clarity.
Here is the new informed consent language:
Recently, we sent you material about the U.S. Department of Education's Beginning Postsecondary Students Longitudinal Study (BPS). As part of the study, the survey is being conducted to better understand the education and employment experiences of students who began their postsecondary education during the 2019-2020 academic year.
[{If the respondent is not on the OFAC no-pay list and is a base-year survey nonrespondent} You should have already received a $2 prepaid incentive. If you have not received the $2, please contact our Help Desk toll-free at 1-800-247-6056 for assistance.] [{If the respondent is on the OFAC no-pay list} The survey takes about [{if full survey} 30 minutes {if abbreviated survey} 15 minutes] to complete. {else} The survey takes about [{if full survey} 30 minutes {if abbreviated survey} 15 minutes] and, as a token of our appreciation, you will receive [{if respondent is not on the OFAC no-pay list and is a base-year nonrespondent} an additional] $[INC_AMOUNT] for participating.]
In addition to your survey responses, we collect other enrollment-related data from your institution and sources such as student loan databases and admissions testing agencies. Your responses, combined with any student record information, may be used for statistical purposes and will not be disclosed, or used, in personally identifiable forms for any other purpose, except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).
Sometimes there are opportunities for researchers to use data from previous studies or to share data with each other if they are conducting similar research. For these reasons, we may use or share your deidentified data with other researchers. If we do so, we will not contact you to ask for your additional informed consent.
Your participation is voluntary and will not affect any aid or other benefits that you may receive. You may decline to answer any question or stop the survey at any time. The risks of participating in this study are small and relate to data security. We have the following precautions in place: your responses are stored within an enhanced security network, only authorized project staff have access to data, and all staff have participated in privacy training, signed confidentiality agreements, and undergone rigorous background checks. All personally identifiable information will be kept in secure and protected data files, separate from the responses you provide in this survey.
If you are located in the European Union (EU) or the United Kingdom (UK), you have rights under the EU’s General Data Protection Regulation and the UK Data Protection Act. By providing consent, you agree and understand that your personal information will be transferred to a data center located in the United States.
If you wish to exercise any of your data subjects’ rights or have additional questions regarding the use and protection of your information, or have any questions about the study, you should contact the study director, Michael Bryan, at 800-844-8959. For questions about your rights as a participant, please contact RTI’s Office of Research Protection toll-free at 866-214-2043.
To review the letter that we mailed, click here (PDF letter).
To review the study brochure, click here (PDF brochure).
Do you want to begin the survey now?
Help text:
You are one of approximately 37,000 students who will be taking part in this study.
The risk of participating in this study is small and relates to data security. However, we have put strict security procedures in place to protect your information. Procedures include:
Responses are collected and stored on RTI's network which complies with all applicable security and privacy regulations including strong encryption during internet transmission (Secure Sockets Layer (SSL) protocol).
All data entry modules are password protected and require the user to log in before accessing confidential data.
Project staff are subject to large fines and/or imprisonment if they knowingly publish or communicate any individually identifiable information.
Section 11 - Sensitive Questions
This section has been updated to include reference to information collected about student basic needs, mental health during the coronavirus pandemic, and housing security. While these items were included in the field test, we added information under this section acknowledging the sensitivity of the items.
The following language was added:
Homelessness is one indicator of housing security, which is a multidimensional construct. To capture more nuanced information on student housing security, BPS:20/22 will collect additional indicators of housing security such as housing affordability, stability, and safety. The addition of these indicators will allow researchers to understand the impact that housing security can have on student persistence and attainment and other outcomes of interest. The addition of the food security and housing security measures will allow us to use nationally representative data to better understand whether the basic needs of postsecondary students are being met.
BPS:20/22 will also field items related to student experiences during the coronavirus pandemic. The pandemic caused disruptions to students' academic, social, and personal experiences. Given strong feedback from Technical Review Panel members, student mental health during this period is a key construct of interest. Data will be collected on how the coronavirus pandemic impacted students' levels of stress or anxiety, difficulty concentrating, loneliness or isolation, and feeling down, depressed, or hopeless. Several studies conducted by federal agencies, including the Census Bureau, have used similar measures to collect information on experiences during the coronavirus pandemic.
A.12. Estimates of Response Burden
This section has been revised to remove burden associated with data collection activities that were approved in the field test package. The full-scale burden estimates have been updated based on final plans for full-scale data collection and panel maintenance.
The full-scale burden estimate is pasted below for reference.
Table 2. Average estimated maximum burden to student respondents for the BPS:20/22 full-scale data collection and panel maintenance
| Data collection activity | Sample | Expected eligible | Expected response rate (percent) | Expected number of respondents | Expected number of responses | Average time burden per response (mins) | Total time burden (hours) |
| BPS:20/22 full-scale |  |  |  |  |  |  |  |
| Panel maintenance (address updates)1 | 37,330 | NA | 15 | 5,6003 | 5,600 | 3 | 280 |
| Student survey | 37,330 | 34,950 | 72 | 25,030 | 25,030 | 30 | 12,515 |
| NPSAS:20 Respondents | 26,470 | 26,470 | 82 | 21,710 | 21,710 | 30 | 10,855 |
| NPSAS:20 Nonrespondents | 5,510 | 4,300 | 22 | 950 | 950 | 30 | 475 |
| NPSAS:20 Administrative-only | 5,350 | 4,170 | 57 | 2,380 | 2,380 | 30 | 1,190 |
| BPS:20/25 field test |  |  |  |  |  |  |  |
| Panel maintenance (address updates)2 | 3,510 | NA | 15 | 530 | 530 | 3 | 27 |
| Total |  |  |  | 25,560 | 25,610 |  | 12,544 |
1 Greyed out rows represent tasks for which burden is not currently being requested. In this case, project burden for the administration of full-scale panel maintenance was approved in the BPS:20/22 field test package, OMB # 1850-0631 v.18. However, because in v.18 we estimated 5,550 responses and 278 burden hours associated with this activity and have since revised that estimate slightly upward, a contribution of 50 responses and 2 burden hours from this activity line has been added to the total in this request.
2 BPS:20/25 is anticipated but not yet exercised.
3 The expected number of respondents (5,600) is not included in the total count because they are accounted for in the adjacent student survey respondent cells. These cases are included in the total expected number of responses.
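The burden figures above tie out arithmetically. As an illustrative check (all numbers quoted from Table 2 and its footnotes), total burden hours are responses times minutes per response divided by 60:

```python
# Burden hours = responses x minutes per response / 60 (figures from Table 2).
assert 25_030 * 30 / 60 == 12_515    # student survey
assert 5_600 * 3 / 60 == 280         # full-scale panel maintenance (revised)
assert 530 * 3 / 60 == 26.5          # BPS:20/25 panel maintenance, reported as 27

# Footnote 1: only the increase over the approved v.18 figures is requested.
added_responses = 5_600 - 5_550      # 50 additional responses
added_hours = 280 - 278              # 2 additional burden hours

assert 25_030 + 530 == 25_560                     # total respondents
assert 25_030 + 530 + added_responses == 25_610   # total responses
assert 12_515 + 27 + added_hours == 12_544        # total burden hours
```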
A.14. Annualized Cost to Federal Government
Table 3 from the field test package was moved to section A.15, and here in section A.14 it is replaced with an in-text cost breakdown of the full-scale study.
The full-scale cost is pasted below for reference.
The total cost to the federal government for all activities included in this package is $4,680,807. This includes the total cost for the full-scale study, $4,667,607, and the total cost for the field test panel maintenance, $13,200.
Table 3. Costs to NCES for the 2020/22 Beginning Postsecondary Students Full-scale Study
| BPS:20/22 study implementation | Costs to NCES |
| NCES Salaries and expenses | $378,813 |
| BPS:20/22 full-scale student survey |  |
| Contract costs | $4,288,794 |
| Instrumentation and materials | $349,553 |
| Data collection | $2,492,288 |
| Systems and data processing | $979,085 |
| Planning and documentation | $467,868 |
| BPS:20/25 field test panel maintenance (address updates)1 | $13,200 |
| Total | $4,680,807 |
1 BPS:20/25 is anticipated but not yet exercised.
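The cost figures are internally consistent: the contract cost lines sum to the contract total, and adding NCES salaries and the field test panel maintenance yields the package total stated in the text. A quick illustrative check (all values quoted from the memo):

```python
# Consistency check on the Table 3 cost figures.
contract_detail = 349_553 + 2_492_288 + 979_085 + 467_868
assert contract_detail == 4_288_794        # matches the contract costs line

full_scale_total = 378_813 + 4_288_794     # NCES salaries + contract costs
assert full_scale_total == 4_667_607       # full-scale study total

package_total = full_scale_total + 13_200  # + BPS:20/25 panel maintenance
assert package_total == 4_680_807          # total cost to the federal government
```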
A.15. Reasons for Program Changes/Adjustments.
Table 3 from section A.14 of the field test package was moved to A.15 of the full-scale package, with revisions to reflect full-scale adjustments and the costs associated with these adjustments.
The full-scale cost table is pasted below for reference.
The total estimated burden time is 12,544 hours.
A.16. Tabulation and Publication.
Table 4 was updated to reflect dates of full-scale operation. This section also removes reference to documentation regarding DataLab’s QuickStats.
The full-scale operation schedule table is pasted below for reference.
Table 4. Operational schedule for BPS:20/22 Full-scale Study
| BPS:20/22 activity | Start date | End date |
| Full-scale study |  |  |
| Select student sample | Sep. 27, 2021 | Nov. 24, 2021 |
| Panel maintenance | Oct. 19, 2021 | Feb. 28, 2022 |
| Self-administered web-based data collection | Mar. 1, 2022 | Nov. 11, 2022 |
| Conduct telephone surveys of students | Mar. 1, 2022 | Nov. 11, 2022 |
| Process data, construct data files | Mar. 1, 2022 | Dec. 2, 2022 |
| Prepare/update data collection reports | Jan. 11, 2022 | Dec. 15, 2022 |
| BPS:20/25 activity1 |  |  |
| Field test |  |  |
| Panel maintenance (address updates) | Oct. 19, 2022 | Feb. 28, 2023 |
1 BPS:20/25 is anticipated but not yet exercised.
Supporting Statement Part B
Changes in Part B include removing references to field test sample design and tests of procedures and methods and adding a summary of field test results. New content is added pertaining to the full-scale response universe, sample design, and procedures and methods.
Preface
The preface was updated to specify the clearance request is for full-scale clearance.
B.1. Respondent Universe.
This section was updated to remove references to the field test respondent universe and sample and to specify the clearance request is for the full-scale sample of students who were identified as confirmed or potential 2019-20 academic year first-time beginner students based on survey, institution, or other administrative data.
B.2. Collection Procedures.
The title of this section was revised from “Statistical Methodology” to “Collection Procedures.”
Text pertaining to the field test sample was removed and replaced with information on the sample design of the BPS:20/22 full-scale study, including information on the specific number of cases that will be fielded. Sampling for the NPSAS:20 study, from which the BPS:20/22 sample is drawn, is described, including the two-stage procedures of first selecting institutions into the sample and then sampling students from these institutions. Also described is the process for identifying first time beginner students. Table 1 was changed from field test sampling quotas to instead show details on the BPS:20/22 full-scale sampling frame. Table 2 was changed from a table of expected outcomes from field test groups to show the BPS:20/22 full-scale sample counts by control and level of institution. A third table, Table 3, was added to display BPS:20/22 sample counts by state. A fourth table, Table 4, was added to show expected full-scale completion counts and rates.
Below are the updated Tables 1 and 2 and new Tables 3 and 4 for reference.
Table 1. BPS:20/22 full-scale frame by NPSAS:20 sample and response status
Table 2. BPS:20/22 sample by control and level of institution
|  | Confirmed and potential FTB administrative student respondents from NPSAS:20 Administrative Sample |||
| Institution characteristics | Total | NPSAS:20 Survey Respondents | NPSAS:20 Survey Nonrespondents | NPSAS:20 Admin-Only Respondents |
| Total | 37,330 | 26,470 | 5,510 | 5,350 |
| Institution type |  |  |  |  |
| Public |  |  |  |  |
| Less-than-2-year | 400 | 270 | 120 | <5 |
| 2-year | 12,370 | 9,000 | 2,210 | 1,160 |
| 4-year non-doctorate-granting primarily sub-baccalaureate | 2,610 | 2,010 | 550 | 50 |
| 4-year non-doctorate-granting primarily baccalaureate | 2,290 | 1,850 | 260 | 180 |
| 4-year doctorate-granting | 7,840 | 4,750 | 790 | 2,300 |
| Private nonprofit |  |  |  |  |
| Less-than-4-year | 290 | 190 | 100 | <5 |
| 4-year non-doctorate-granting | 2,940 | 2,000 | 220 | 720 |
| 4-year doctorate-granting | 3,600 | 2,400 | 330 | 880 |
| Private for-profit |  |  |  |  |
| Less-than-2-year | 980 | 780 | 150 | 50 |
| 2-year | 1,600 | 1,280 | 300 | 10 |
| 4-year | 2,430 | 1,940 | 480 | <5 |
NOTE: Detail may not sum to totals because of rounding. Potential FTBs are individuals who did not complete a NPSAS survey but who appeared to be FTBs in enrollment or NPSAS:20 administrative data. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2020/22 Beginning Postsecondary Students Longitudinal Study (BPS:20/22) Full-scale.
Table 3. BPS:20/22 sample by state
|  | Confirmed and potential FTB administrative student respondents from NPSAS:20 Administrative Sample |||
| State | Total | NPSAS Survey Respondents | NPSAS Survey Nonrespondents | NPSAS Admin-Only Respondents |
| Total | 37,330 | 26,470 | 5,510 | 5,350 |
| California | 2,750 | 2,230 | 500 | 20 |
| Florida | 1,970 | 1,580 | 350 | 40 |
| Georgia1 | 1,410 | 860 | 390 | 170 |
| North Carolina | 1,370 | 970 | 230 | 180 |
| New York | 2,030 | 1,660 | 280 | 90 |
| Pennsylvania | 1,340 | 940 | 170 | 230 |
| Texas | 2,340 | 1,810 | 410 | 120 |
| All other states | 24,120 | 16,410 | 3,190 | 4,510 |
1 Georgia includes an oversample of an additional 250 NPSAS survey nonrespondents and 80 NPSAS admin-only respondents. NOTE: Detail may not sum to totals because of rounding. Potential FTBs are individuals who did not complete a NPSAS survey but who appeared to be FTBs in enrollment or NPSAS:20 administrative data. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2020/22 Beginning Postsecondary Students Longitudinal Study (BPS:20/22) Full-scale.
Table 4. BPS:20/22 expected completes by NPSAS:20 data collection outcome
| NPSAS:20 Outcome | Sample Size | Eligibility Rate | Expected Response Rate | Expected Completes |
| Overall | 37,330 | 0.94 | 0.72 | 25,030 |
| NPSAS Survey Respondents | 26,470 | 1.00 | 0.82 | 21,710 |
| NPSAS Survey Nonrespondents | 5,510 | 0.78 | 0.22 | 950 |
| NPSAS Admin-Only Respondents | 5,350 | 0.78 | 0.57 | 2,380 |
NOTE: Detail may not sum to totals because of rounding.
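Each row of Table 4 is approximately sample size times eligibility rate times expected response rate, rounded to the nearest ten; the overall figure sums the unrounded row products before rounding, which is why the rounded rows do not add exactly to the total. A sketch using the published figures:

```python
# Expected completes = sample x eligibility x response rate, rounded to
# the nearest ten (rates and row labels quoted from Table 4).
rows = {
    "NPSAS Survey Respondents":     (26_470, 1.00, 0.82),
    "NPSAS Survey Nonrespondents":  ( 5_510, 0.78, 0.22),
    "NPSAS Admin-Only Respondents": ( 5_350, 0.78, 0.57),
}

def to_ten(x: float) -> int:
    """Round to the nearest ten, as the table does."""
    return int(round(x / 10)) * 10

completes = {k: n * e * r for k, (n, e, r) in rows.items()}
assert [to_ten(v) for v in completes.values()] == [21_710, 950, 2_380]

# Summing the unrounded products, then rounding, reproduces the overall row.
assert to_ten(sum(completes.values())) == 25_030
```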
B.3. Methods to Maximize Response.
Changes to this section were minor, focused on changing “field test” to “full-scale,” updating reference years, and editorial revisions to improve clarity.
B.3.a. Tracing of Sample Members.
Tracing activities have not changed from field test to full-scale. Changes to this section are updates to years, the OMB number, and minor edits for clarity.
B.3.b. Training for Data Collection Staff.
This section was revised to note that telephone data collection staff may work remotely (as a result of the coronavirus pandemic) and that bilingual interviewers will be hired to assist in administration of the Spanish survey.
B.3.c. Case Management System.
In this section “field test” was updated to “full-scale” and minor edits were made to improve clarity.
B.3.d. Survey Instrument Design.
In this section minor edits were made to improve clarity.
B.3.e. Refusal Aversion and Conversion.
In this section minor edits were made to improve clarity and to specify that, in the full-scale study, refusal conversion efforts will not be made with individuals who become verbally aggressive or who threaten to take legal or other action.
B.4. Tests of Procedure.
This section was updated to provide a summary of results from the field test experiment and to describe the data collection design for the full-scale study. The descriptions of the field test experiment plans and the hypothesis tests were removed and replaced with a summary of the experiment results. Results from field test experiments are detailed in Appendix D of the full-scale package.
The entire text of section B.4. can be found below in Attachment 1 starting on page 13.
B.4.a. Summary of BPS:20/22 Field Test Data Collection Design and Results.
This section provides a summary of field test experiment results, along with recommendations for the full-scale design. It also includes a reference to Appendix D, which presents the field test experiment results in more detail.
B.4.b. BPS:20/22 Full-scale Data Collection Design.
This section outlines the plans for the full-scale data collection, describing the two different sample member groups. Table 5 presents the set of interventions for each group.
Below are the descriptions of default and aggressive data collection protocols, for reference.
NPSAS:20 survey respondents (default group): Sample members who responded to NPSAS:20 and self-identified that they began their postsecondary education between July 1, 2019 and April 30, 2020 will receive a default data collection protocol (n = 26,470).
NPSAS:20 survey nonrespondents and administrative-only cases (aggressive group): NPSAS:20 administrative student respondents who are potential 2019-20 academic year FTBs will receive an aggressive data collection protocol. This group includes NPSAS:20 survey nonrespondents (n = 5,510) and the NPSAS:20 administrative-only sample (cases who were never invited to complete the NPSAS:20 survey; n = 5,350) who are potential 2019-20 academic year FTBs based on administrative data. The goal of this treatment is to convert reluctant sample members (i.e., NPSAS:20 survey nonrespondents) and sample members who have never been contacted (i.e., administrative-only cases) to participate in the study as early in data collection as possible.
Table 5. 2020/22 Beginning Postsecondary Students Full-Scale data collection protocols, by data collection phase and group assignment
|  | Data Collection Group Assignments ||
|  | Default Protocol | Aggressive Protocol |
| Sample |  |  |
| Data Collection Protocols |||
| Prior to data collection |  |  |
| Early completion phase |  |  |
| Production phase 1 |  |  |
| Production phase 2 |  |  |
| Nonresponse Conversion Phase |  |  |
| Total incentives |  |  |
B.5. Statistical Consultation Contacts.
The title of this section was revised from “Reviewing Statisticians and Individuals Responsible for Designing and Conducting the Study” to “Statistical Consultation Contacts.” This section was updated to remove an RTI International team member no longer working on the project.
Part B - References
The references were revised to reflect the literature used for the full-scale package.
Appendix A - Membership of the Technical Review Panel
The TRP members list presented in Appendix A was updated to reflect changes in membership and members’ contact information between TRP meeting 1 and TRP meeting 2.
Appendix B - Confidentiality for Administrative Record Matching
Changes in Appendix B include minor editorial revisions for clarity including moving text previously in a footnote from subsection Secure Data Transfers into the text. Additionally, the Processing Administrative Data section removes reference to DataLab’s QuickStats and TrendStats, and a new sentence is added at the end of this section:
Data will also be used for locating sample members, for panel maintenance and communications encouraging survey participation.
Appendix C - Student Contacting Materials
Appendix C was Student Contacting Materials in the field test package, and now contains the Student Contacting Materials for the full-scale study. Changes in Appendix C reflect updates made for full-scale implementation. Communication materials have been included to accompany interventions new to the full-scale protocols, including the prepaid incentive, incentive boost, and an abbreviated interview. Additionally, the full-scale study will be offered in both English and Spanish, so new contacting materials in Spanish are added. Attachment 2a of this memo describes the types of changes systematically made in the field test versions of contacting materials to adapt them to full-scale, and Attachment 2b describes changes made to and additions of specific contacting materials (beginning on page 19 of this document).
Appendix D - Results of the BPS:20/22 Field Test Experiments
Appendix D, which was the Cognitive Testing Report in the field test package, is now replaced with the Results of the BPS:20/22 Field Test Experiments, a detailed description of the results of the experiments conducted in the field test.
Appendix E - Survey Instrument
Changes in Appendix E include revisions to the full-scale instrument based on field test survey results and feedback from the latest BPS:20/22 Technical Review Panel (TRP) meeting held in July 2021. Revisions to the survey are intended to reflect current research goals, reduce respondent burden, and improve data quality. Table 1 in Appendix E (pp. E3 to E10) provides a summary of the content of the BPS:20/22 full-scale student survey. A change field column in this table, along with font color coding, indicates whether items remained the same (black) compared to the BPS:20/22 field test survey, were revised (purple), were dropped (red), or were added (green). In response to recommendations made at the latest BPS:20/22 TRP, a few items have been added to the BPS:20/22 full-scale survey, including items collecting additional information about the impacts of the coronavirus pandemic, new items that will improve data quality for the abbreviated survey respondents, and a new language preference form to determine whether the survey will be administered in English or Spanish.
Additionally, following the English survey instrument, Appendix E includes the new Spanish abbreviated survey instrument.
This section describes the tests of procedures and methods undertaken for this data collection. The BPS:20/22 field test included two sets of experiments: data collection experiments focused on survey participation, to reduce nonresponse error and the potential for nonresponse bias, and questionnaire design experiments focused on minimizing measurement error, to improve data quality. The full-scale data collection design described below will implement the tested approaches, revised based on the BPS:20/22 field test results described in Section 4.a.
a. Summary of BPS:20/22 Field Test Data Collection Design and Results
The BPS:20/22
field test contained two data collection experiments and two
questionnaire design experiments. The results of these field test
experiments are summarized below. For detailed results of the
BPS:20/22 field test experiments, see Appendix D.
The data
collection experiments explored the effectiveness of 1) offering an
extra incentive for early survey completion, and 2) sending survey
reminders via text messages. Results from these data collection
experiments provide insight in preparation for the full-scale study
regarding the effectiveness of these interventions across three data
quality indicators: survey response (operationalized using response
rates), sample representativeness (assessed across age, sex,
ethnicity, race, and institutional control), and data collection
efficiency (operationalized as the number of days between the
start of the experiment and survey completion).
The
“early bird” incentive experiment investigated the
effectiveness of giving respondents an additional $5 incentive if
they completed the survey within the first three weeks of data
collection (experimental group) versus no additional incentive
(control group). Response rates at the end of data collection did not
differ across the early bird group (63.9 percent) and the control
group (63.4 percent; X2 = 0.08, p = .78). Both the early bird and
control groups had similar representativeness across age, sex,
ethnicity, race, and institutional control. At the end of data
collection, respondents in the early bird group took significantly
fewer days (28.1 days) than respondents in the control group (30.9
days) to complete the survey (t(2,231.7) = 2.09, p < 0.05).
However, this difference is small (2.8 days) and not large enough to allow for any significant cost savings in the data collection process
(e.g., via fewer reminder calls, texts, or mailings). Therefore, the
use of an early bird incentive in the BPS:20/22 full-scale data
collection is not recommended.
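The group-level response rate comparisons reported here are chi-square tests of independence on a 2×2 table of experimental group by response status. A minimal sketch of that computation follows; the cell counts are hypothetical illustrations, not the actual BPS:20/22 field test counts.

```python
# Chi-square test of independence for a 2x2 table (group x response status),
# the test behind comparisons such as "63.9 vs. 63.4 percent; X2 = 0.08".
# NOTE: the cell counts below are hypothetical, not actual BPS:20/22 counts.

def chi2_2x2(a, b, c, d):
    """Chi-square statistic (1 df) for the table [[a, b], [c, d]].

    a, b: respondents / nonrespondents in group 1
    c, d: respondents / nonrespondents in group 2
    """
    n = a + b + c + d
    margins = (a + b) * (c + d) * (a + c) * (b + d)
    if margins == 0:
        raise ValueError("degenerate table: a row or column total is zero")
    return n * (a * d - b * c) ** 2 / margins

# Two hypothetical groups of 1,000 sample members with 639 vs. 634 completes:
print(round(chi2_2x2(639, 361, 634, 366), 3))  # → 0.054, well below the 3.84 critical value
```

With a statistic this far below the 0.05 critical value for one degree of freedom (3.84), the two response rates would not be judged significantly different, which mirrors the null result reported for the early bird experiment.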
The reminder mode
experiment compared the effectiveness of using text message reminders
(experimental group) versus telephone call reminders (control group).
Response rates at the end of data collection for the text message
group (29.6 percent) and the telephone group (31.5 percent) did not
significantly differ (X2 = 0.83, p = 0.36). In the telephone reminder
group, the percentage of White respondents (74.1 percent) significantly differed from the percentage of White nonrespondents
(64.5 percent; X2 = 7.40, p < 0.01), indicating a potential source
of nonresponse bias. For the text message reminder group, there was
not a significant difference between the percentage of White
respondents (65.3 percent) and nonrespondents (63.3 percent) (X2 =
0.28, p = 0.60), indicating better sample representativeness. The
text message and telephone groups had similar representativeness
across the remaining respondent characteristics: age, sex, ethnicity,
and institutional control.
Finally, the number of days it
took for respondents in the text message reminder group to complete
the survey (75.5 days) was not significantly different from the
telephone reminder group (77.0 days; t(542.8) = 0.62, p = 0.27). As
text message reminders achieved response rates, representativeness, and efficiency comparable to those of the more expensive telephone
reminders, the use of text reminders (coupled with telephone
reminders as described in Section 4.b) is recommended as a part of
the BPS:20/22 full-scale data collection.
The
questionnaire design experiments explored the effectiveness of 1)
different methods for collecting enrollment data, and 2) using a
predictive search database on a question collecting address
information. In addition, information about the impacts of the
coronavirus pandemic was collected by randomly assigning respondents
one of two separate topical modules to maximize the number of
questions fielded without increasing burden. Results from these
questionnaire design experiments provide insight in preparation
for the full-scale study regarding the effectiveness of these methods
across three data quality indicators: missingness (operationalized as
item- and question-level nonresponse rate), administrative data
concordance (operationalized as agreement rates between self-reported
enrollment and administrative records; month-level enrollment
intensity experiment only), and timing burden (operationalized as the
mean completion time at the question level).
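The timing and agreement comparisons in this section report unequal-variance t-tests, which is why the degrees of freedom are fractional (e.g., t(1312.8)). A minimal sketch of Welch's t statistic and the Welch–Satterthwaite degrees of freedom, using hypothetical completion times rather than actual field test data:

```python
import math

# Welch's unequal-variance t-test, the form of t-test consistent with the
# fractional degrees of freedom reported in this section (e.g., t(1312.8)).
# The df comes from the Welch-Satterthwaite approximation. Sample values
# below are hypothetical per-question completion times in seconds.

def welch_t(x, y):
    """Return (t statistic, Welch-Satterthwaite degrees of freedom)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variance of x
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)  # sample variance of y
    sex, sey = vx / nx, vy / ny                    # squared standard errors
    t = (mx - my) / math.sqrt(sex + sey)
    df = (sex + sey) ** 2 / (sex ** 2 / (nx - 1) + sey ** 2 / (ny - 1))
    return t, df

treatment = [16.5, 17.0, 17.4, 17.9, 16.8]  # hypothetical seconds per question
control = [9.8, 10.2, 10.5, 11.0, 10.9]
t, df = welch_t(treatment, control)
print(t > 0)  # True: the treatment group takes longer on average
```

Unlike the pooled-variance t-test, Welch's version does not assume the two groups have equal variances, which is the usual choice when comparing timings across differently sized experimental groups.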
The
month-level enrollment intensity experiment compared two methods for
collecting enrollment information in the 2020-21 academic year: a
single forced-choice grid question that displayed all enrollment
intensities (i.e., full-time, part-time, mixed, and no enrollment) on
one form (control group) and separate yes/no radio gates for
full-time and part-time enrollment (experimental group). There were
no statistically significant differences across the control and
treatment conditions on rates of missingness (0 percent and 0.03 percent missing, respectively; t(636) = 1.42, p = 0.1575) or agreement rates with administrative enrollment data (70.0 percent and 70.6 percent agreement, respectively; t(1311.1) = 0.24, p = 0.8119). On average, the treatment group took significantly
longer to complete the enrollment question (17.2 seconds) than the
control group (10.5 seconds; t(1312.8) = 15.47, p < .0001), though
this difference is expected given the additional screen respondents
must navigate in the experimental group. As the experimental question
did not represent a clear improvement over the original forced-choice
grid, the use of the original question is recommended for the BPS:20/22 full-scale data collection.
The predictive search address
database experiment explored the utility of suggesting
USPS-standardized addresses to respondents as they entered their
address into the survey. This analysis compares address entry for the
same set of respondents across the BPS:20/22 field test (using the
database-assisted predictive search method) and the NPSAS:20
full-scale survey (using traditional, manual address entry). Overall,
98 percent of respondents provided a complete permanent address using
the manual method in NPSAS:20, compared to 85 percent using the
predictive search method in BPS:20/22 field test (t(2023.5) = 13.88,
p < .0001). However, it should be noted that addresses obtained
using the predictive search method were error-free (0 percent of FTB
check addresses were undeliverable), while 1.2 percent of addresses
obtained using the manual method were undeliverable. Also, additional
improvements to the survey instrument (e.g., soft check validations
for incomplete addresses) may further reduce rates of missingness for
the predictive search method. Finally, on average, respondents took
longer to provide their address using manual entry (29.5 seconds)
compared to the predictive search system (27.1 seconds; t(3055.7) =
4.09, p < .0001). Given the higher quality data resulting from the
predictive search method, the potential to improve the predictive
search method via instrument adjustments, and the significant
reduction in completion time compared to manual entry, the
continuation of the predictive search method is proposed for the BPS:20/22 full-scale study.
Given the impact of the coronavirus pandemic
on higher education, researchers have expressed interest in using
BPS:20/22 data to examine these impacts on postsecondary students.
BPS:20/22 field test respondents were randomly assigned into two
groups that received one of two modules. Each module measured similar
constructs; however, module one consisted of survey questions from
NPSAS:20 that measured student academic, social, and personal
experiences related to the coronavirus pandemic, and module two
collected a new set of constructs, including changes in enrollment
and borrowing, changes in academic engagement, and access to support
resources, that may be of analytic value to researchers and
policymakers. Across both modules, the average item nonresponse rate
was 2 percent. Module one had an average nonresponse rate of 3
percent, significantly higher than the 0.6 percent nonresponse rate
of module two (t(836.72) = 5.16, p < .0001). Regardless of module
assignment, the coronavirus pandemic questions took respondents an
average of 2.7 minutes to complete. The BPS:20/22 full-scale survey
instrument will administer a subset of the questions from both field
test coronavirus pandemic modules, based upon field test performance
and TRP feedback. The coronavirus pandemic module for the full-scale
maintains the burden goal of three minutes.
b. BPS:20/22 Full-scale Data Collection Design
The
data collection design proposed for the BPS:20/22 full-scale study
builds on the designs implemented in past BPS studies, as well as the
National Postsecondary Student Aid Study (NPSAS) and the
Baccalaureate and Beyond (B&B) studies. Additionally, results
from the BPS:20/22 field test Data Collection Experiments (Appendix
D) inform recommendations for the BPS:20/22 full-scale data
collection design.
A primary goal of the full-scale
design is to minimize the potential for nonresponse bias that could
be introduced into BPS:20/22, especially bias that could be due to
lower response rates among NPSAS:20 nonrespondents. Another important
goal is to reduce the amount of time and cost of data collection
efforts.
To accomplish these goals, the plan is to
achieve at least a 70 percent response rate. Doing so will minimize
potential nonresponse bias, optimize statistical power, and enable
sub-group analyses. The sample will be divided into two groups and
differential data collection treatments will be implemented based on
prior round response status. A similar approach was successfully
implemented in the BPS:20/22 field test, and the latest B&B
studies where more reluctant sample members received a more
aggressive protocol (for an experimental comparison see B&B:16/17
field test https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2020441).
For the BPS:20/22 full-scale design, the following sample
groupings will be used:
• NPSAS:20 survey respondents:
Sample members who responded to NPSAS:20 and self-identified that
they began their postsecondary education between July 1, 2019 and
April 30, 2020 will receive a default data collection protocol (n =
26,470).
• NPSAS:20 survey nonrespondents and
administrative-only cases: NPSAS:20 administrative student
respondents who are potential 2019-20 academic year FTBs will receive
an aggressive data collection protocol. This group includes NPSAS:20
survey nonrespondents (n = 5,510) and NPSAS:20 administrative-only
sample (cases who were never invited to complete the NPSAS:20 survey;
n = 5,350) who are potential 2019-20 academic year FTBs based on
administrative data.
The goal of this treatment is to convert reluctant sample members (i.e., NPSAS:20 survey nonrespondents) and sample members who have never been contacted (i.e., administrative-only cases) to participate in the study as early in data collection as possible.
Table
5 below presents the type and timing of interventions to be applied
in data collection by groups and protocol. The details of these
interventions are described below.
Table 5. 2020/22 Beginning Postsecondary Students Full-Scale data collection protocols, by data collection phase and group assignment
[Table 5 presents, for the Default Protocol and Aggressive Protocol sample groups, the data collection protocols applied prior to data collection and during the early completion phase, production phase 1, production phase 2, and the nonresponse conversion phase, along with total incentives per group; individual cell contents are not reproduced here.]
The
duration of each phase of data collection will be determined based on
phase capacity—the time at which a subgroup's estimates remain
stable regardless of additional data collection efforts. For example,
during the early completion phase, key metrics are continually
monitored, and when they stabilize over a period of time, cases are
then transferred to the next phase. Phase capacity will be determined
based on a series of individual indicators within each data collection protocol. For example, response rates and other level-of-effort indicators will be assessed over time, accounting for covariates such as institution control.
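The phase-capacity rule described above, transferring cases once key estimates stabilize over a period of time, can be sketched as a trailing-window stability check. The window length and tolerance below are illustrative parameters, not values from the BPS:20/22 monitoring plan.

```python
# Sketch of a phase-capacity check: a monitored estimate (e.g., a subgroup's
# cumulative response rate computed daily) is treated as stable once it moves
# by less than `tol` on each of the last `window` days. The window length and
# tolerance are illustrative, not values from the BPS:20/22 monitoring plan.

def phase_capacity_reached(daily_estimates, window=7, tol=0.001):
    if len(daily_estimates) < window + 1:
        return False  # not enough history to judge stability
    recent = daily_estimates[-(window + 1):]
    return all(abs(b - a) < tol for a, b in zip(recent, recent[1:]))

# A response rate that climbs quickly, then flattens near 62 percent:
rates = [0.10, 0.25, 0.40, 0.50, 0.56, 0.60, 0.615,
         0.619, 0.6195, 0.6198, 0.6199, 0.62, 0.62, 0.62, 0.62]
print(phase_capacity_reached(rates))  # → True: cases can move to the next phase
```

In practice such a check would be run per subgroup and combined with the other level-of-effort indicators described above before cases are transferred to the next phase.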
Incentives.
The baseline incentive for the default protocol will be $30 with a
$10 incentive boost in Production Phase 2, leading to a maximum
possible total incentive of $40. The baseline incentive for the
aggressive protocol will be $45. An experiment conducted in BPS:12/14
showed that a $45 baseline incentive yielded the highest response
rates (Hill et al. 2016). However, this experiment was underpowered
to detect differences from $30 in the lower propensity response
groups (Wilson et al. 2015). Nonetheless, implementing a higher
baseline incentive is recommended given the $30 baseline incentive
and the $10 incentive boost from NPSAS:20 were not enough to encourage
prior year nonrespondents to participate. Further, the $40 BPS:20/22
field test incentive yielded a response rate of only 25.3 percent
among these “aggressive protocol” sample members. The
baseline incentive will be paid in addition to a possible $2 prepaid
incentive (see prepaid incentive section below), and a $20 incentive
boost (see nonresponse conversion incentive section below). The
maximum possible total incentive is $67 in the aggressive data
collection protocol. Results from the BPS:20/22 field test showed
that offering sample members an early bird incentive did not
significantly improve response rates or representativeness by the end
of data collection, nor did it practically improve data collection
efficiency (Appendix D). Therefore, early bird incentives will not be
used in the full-scale study for either the default or aggressive
protocols.
Beyond the baseline incentives, both data
collection protocols employ similar interventions, although the
timing and intensity of these interventions differ across groups.
Interventions occur sooner in the aggressive protocol and are more
intense.
Prenotification.
The first mailing that individuals in the default and aggressive data
collection protocols will receive is a greeting card. This mailing aims to increase the perceived legitimacy of the upcoming survey
request (e.g., Groves et al. 1992) in both data collection groups and
announce the incentive amounts. Greeting cards, in particular, have
been shown to significantly increase response rates in longitudinal
studies (Griggs et al. 2019) and this method will be used as a
precursor to the invitation letter. The greeting card will be mailed
a few weeks in advance of data collection.
$2
prepaid incentive. Cash prepaid
incentives have been shown to significantly increase response rates
in both interviewer-administered and self-administered surveys. These
prepaid incentives increase the perceived legitimacy of the survey
request and therefore reduce the potential for nonresponse bias
(e.g., Church 1993; Cantor et al. 2008; Goeritz 2006; Medway and
Tourangeau 2015; Messer and Dillman 2011; Parsons and Manierre 2014;
Singer 2002). During the early completion phase in the B&B:16/17
field test, prepaid incentives ($10 via check or PayPal) in
combination with telephone prompting also significantly increased
response rates by 4.4 percentage points in the aggressive protocol
group. Given these positive findings combined with general
recommendations in the literature (e.g., Singer and Ye 2013; DeBell
et al. 2019), a small $2 cash prepaid ‘visible’
incentive, or, where necessary due to low address quality, a $2
prepaid PayPal incentive announced on a separate index card will be
sent to all cases in the aggressive protocol for BPS:20/22 full-scale
(see results from the B&B:16/20 calibration experiment –
Kirchner et al. 2021). Sample members will be notified of this
prepaid incentive in the data collection prenotification, and it will
be included in the data collection announcement letter.
Mode
tailoring. The leverage-saliency
theory suggests that respondents have different hooks that drive
their likelihood of survey participation (Groves et al. 2000); thus,
offering a person the survey mode (e.g., web, mail, telephone) that
they prefer may increase their likelihood of responding. This is
further supported by empirical evidence that shows offering people
their preferred mode speeds up their response and is associated with
higher participation rates (e.g., Olson et al. 2012). Using the
NPSAS:20 survey completion mode as a proxy for mode preference, the
BPS:20/22 full-scale early completion phase will approach sample
members in the default protocol with their mode of completion for
NPSAS:20. Specifically, while all sample members in the default
protocol will receive identical data collection announcement letters
and e-mails, those who completed the NPSAS:20 survey by telephone
(4.3 percent) will be approached by telephone from the start of data
collection. Likewise, those who completed the NPSAS:20 main study
survey online will not be contacted by telephone before a preassigned
outbound telephone data collection date.
(Light)
outbound CATI calling and text messaging.
The results from the BPS:20/22 field test showed that there were no
statistically significant differences in the response rates for the
text message reminder and the telephone only group at the end of the
experimental period (Appendix D). As a result, both data collection
groups will receive early text message reminders combined with
prioritized telephone calls. Telephone calls will be prioritized to
individuals for whom no cell phone number exists, those who opt out
of the text message reminders, and those sample members who will be
prioritized based on other criteria (e.g., from lower performing
sectors). Text messages from sample members will be answered with an
automated text response, with the possibility of two-way text
messaging (i.e., interviewers respond to text message questions sent
by sample members) in some cases.
Sample members in the
default group who qualify for telephone calls will receive a light
CATI protocol. Light CATI involves a minimal number of phone calls,
used mainly to prompt web response (as opposed to regular CATI
efforts that involve more frequent phone efforts, with the goal to
locate sample members and encourage their participation). In the
B&B:16/17 field test, introduction of light CATI interviewing
appeared to increase production phase response rates in the default
protocol. Although one should use caution when interpreting these
results – group assignment in B&B:16/17 field test was not
random but instead compared NPSAS:16 “early” and “late”
respondents– the findings are consistent with the literature
which has shown that web surveys tend to have lower response rates
compared to interviewer-administered surveys (e.g., Lozar Manfreda et
al. 2008). Attempting to survey sample members by telephone also
increases the likelihood of initiating locating efforts sooner.
B&B:16/17 field test results showed higher locate rates in the
default protocol (93.7 percent), which had light CATI, compared to a
more relaxed protocol without light CATI (77.8 percent; p <
0.001). For the BPS:20/22 full-scale data collection, light CATI will
be used in the default protocol once CATI begins in Production Phase
1. Additionally, all cases in the aggressive protocol will receive
earlier and more intense telephone prompting than eligible cases in
the default group.
Incentive
boosts. Researchers have commonly
used incentive boosts as a nonresponse conversion strategy for sample
members who have implicitly or explicitly refused to complete the
survey (e.g., Groves and Heeringa 2006; Singer and Ye 2013). These
boosts are especially common in large federal surveys during their
nonresponse follow-up phase (e.g., the Centers for Disease Control and
Prevention's National Survey of Family Growth) and have been
implemented successfully in other postsecondary education surveys
(e.g., HSLS:09 second follow-up; BPS:12/17; NPSAS:20). In NPSAS:20, a
$10 incentive boost increased the overall response rate by about 3.2
percentage points above the projected response rate. Therefore, a $10
incentive boost increase to the BPS:20/22 baseline incentive is
planned during Production Phase 2 for all remaining nonrespondents in
the default data collection protocol, before the abbreviated survey
is offered in the nonresponse conversion phase. Remaining
nonrespondents in the aggressive data collection protocol will be
offered a $20 incentive boost increase to the baseline incentive
before the abbreviated survey (both offered in Production Phase 2).
This is because the $10 incentive boost in NPSAS:20 did not show any
effect on this group. If necessary, incentive boosts may be targeted
only at certain groups of nonrespondents to achieve response goals
(e.g., targeting nonrespondents from certain states to ensure
representativeness, targeting aggressive group nonrespondents to
reduce the potential for nonresponse bias).
Abbreviated
survey. Obtaining responses from all
sample members is an important assumption of the inferential
paradigm. The leverage-saliency theory (Groves et al. 2000) and the
social exchange theory (Dillman et al. 2014) suggest that the
participation decision of an individual is driven by different survey
design factors or perceived cost of participating. As such, reducing
the perceived burden of participating by reducing the survey length
may motivate sample members to participate.
During the
B&B:16/17 field test, prior round nonrespondents were randomly
assigned to one of two groups: 1) prior round nonrespondents who were
offered the abbreviated survey during the production phase (i.e.,
before the nonresponse conversion phase), and 2) prior round
nonrespondents who were offered the abbreviated survey during the
nonresponse conversion phase (i.e., after the production phase). At
the end of the production phase, prior round nonrespondents who
received the abbreviated survey had a higher overall response rate
(22.7 percent) than those who were not offered the abbreviated survey during
that phase (12.1 percent; t(2,097) = 3.67, p < 0.001). Further, at
the end of data collection, prior round nonrespondents who were
offered the abbreviated survey during the earlier production phase
had a significantly higher response rate (37 percent) than prior
round nonrespondents who were not offered the abbreviated survey
until the nonresponse conversion phase (25 percent; t(2,097) = 3.52, p = .001). These results indicate that offering an abbreviated survey
to prior round nonrespondents during the production phase (i.e.,
earlier in data collection) significantly increases response rates.
The B&B:08/12 and B&B:08/18 full-scale studies also
demonstrated the benefit of an abbreviated survey. Offering the
abbreviated survey to prior round nonrespondents increased overall
response rates of that group by 18.2 (B&B:08/12) and 8.8
(B&B:08/18) percentage points (Cominole et al. 2015). In
NPSAS:20, 14.4 percent of those offered the abbreviated survey
completed it. Therefore, an abbreviated survey option will be offered
to all sample members in the BPS:20/22 full-scale study. For the
aggressive protocol, the abbreviated survey will be offered during
Production Phase 2, which is the latter half of the production phase
of data collection. For the default protocol, the abbreviated survey
will be offered as the last step in nonresponse conversion.
Other
interventions. While all BPS studies
are conducted by NCES, the data collection contractor, RTI
International, has typically used a study-specific e-mail address ending in “@rti.org” to contact and support sample members.
Changing the e-mail sender to the NCES project officer or the RTI
project director may increase the perceived importance of the survey
and help personalize the contact materials, thereby potentially
increasing relevance. Switching the sender during data collection
also increases the chance that the survey invitation is delivered to
the sample member rather than to a spam filter.
Detailed field results for the experiments described above can be found in Appendix D.
| Change | Page in FT package | Page in FS package | Type of change | Reason for change | Notes |
| --- | --- | --- | --- | --- | --- |
| Updated phone numbers and email addresses | Throughout | Multiple pages | Contact information updated | Phone numbers and email addresses are known and available. | |
| Updated fills associated with FS incentive plan | Throughout | Multiple pages | Updated text and fills to be consistent with FS incentive plan | Revised incentive offerings for FS study include $2 prepaid incentive, $30/$45 baseline incentive, and $10/$20 boost incentive. | Removed early bird incentive offer details and prepaid $10 incentives offered during the field test. Added details for $30/$45 baseline incentive, $2 prepaid incentive for FS aggressive protocol cases, and the $10/$20 boost offered later in DC. |
| Added sentence to English contacts explaining how to request Spanish contact materials | Throughout | Multiple pages | Added Spanish sentence to English materials. | Sentence added to alert sample members that Spanish materials are available upon request. | Spanish materials were not available during the field test. “Por favor responde a este correo electrónico para solicitar materiales en español. Para solicitar materiales de contacto en español en el futuro, por favor llama al 800-247-6056 o envia un correo electronico a [email protected]. Responde “Español” para solicitar este mensaje en español.” |
| Removed any reference to the field test reinterview | Throughout | Multiple pages | Removed any reference to the reinterview and all reinterview materials. | No reinterview collection during the FS | |
| Added time fill associated with survey length | Throughout | Multiple pages | Merge fill added | An abbreviated interview will be available in the full-scale. Adding appropriate time fill merges allows the contacts to be used with the abbreviated interview offer, if applicable. | |
| Added fills to emphasize that sample members' unique experiences during coronavirus are important to BPS researchers | Throughout | Multiple pages | Merge fill added | To emphasize that coronavirus impacts are unique and important to this study. | |
| Added fill to clarify we want to hear from both "current and former" students | Throughout | Multiple pages | Merge fill added | Assuring all sample members we would like their participation, regardless of current enrollment status. | “Your personal circumstances are unique, and you cannot be replaced in this study.” |
| Added fills to better tailor text to the full-scale sample groups (NPSAS:20 Resp, NPSAS:20 NR, NPSAS:20 Admin-only) | Throughout | Multiple pages | Merge fill added | Tailor language based on prior NPSAS:20 participation status. | |
| Added fill to insert NCES Letterhead design in emails | Many emails | Multiple pages | Merge fill added | The addition of NCES letterhead design in selected emails could increase legitimacy of the request. | |
| Revised Click Here button on all emails | Many emails | Multiple pages | | Per standards, the BPS logo will no longer be used, so a new click-here button will be developed. | |
Contact Material |
Page in FT package |
Page in FS package |
Type of change |
Reason for change |
Notes |
Tentative Schedule for Sample Member Contacts |
C-3 |
C-2 |
Updated list of contacts and tentative dates for full-scale data collection |
|
|
Draft Brochure Text |
C-8 |
C-7 |
Updated text for FS implementation, Editorial to improve readability and clarity, minor revisions to FAQs |
|
"Why am I being asked to participate?" FAQ revised to include more detail. |
Draft Website Text |
C-10 |
C-9 |
Updated text for FS implementation, Editorial to improve readability and clarity, clarified fields on Login Help page, added text in Previous Results, added "unique experiences related to the coronavirus pandemic" to the list of topics included. |
|
NCES authorization added to login screen, moved cell phone and email collection to the top of the Update Contact Info form, Previous Results section revised to include more clarity on selected findings. |
Draft Consent Text |
C-15 |
C-14 |
Updated text for FS implementation, Editorial to improve readability and clarity based on interviewer feedback, Added text for GDPR, better organized information on NOLET screen |
RTI required addition of GDPR language. Reorganized information on NOLET screen to make it more readable for interviewers. |
Removed Web Consent text since included in the survey appendix |
Greeting Card/Initial Contact |
C-20 |
C-105 |
This contact was initially deleted from file - added back with full-scale edits and included on page C-105 |
Initially deleted from file - added back with full-scale edits and included on page C-105 |
Updated to account for different sample types, incentive offers, etc. |
Data Collection Announcement Letter |
C-21 |
C-16 |
Updated text for FS implementation, Editorial to improve readability and clarity |
|
Most sample members in the aggressive protocol will receive $2 cash, only those that receive PayPal will get the index card. |
Reminder Letter 1 |
C-23 |
C-18 |
Added merge fills for Spanish language speakers and boost offer (if applicable). |
Added a merge fill to appeal to Spanish language speakers, and, if applicable at the time of the mailing, a merge fill explaining the $10/$20 boost offer assures the sample member is informed of current incentive offer. . |
Boost offer ($10/$20) merge fill is added to many other contacts for the same reason. |
Reminder Letter 2 |
C-25 |
C-20 |
Updated text for FS implementation, Editorial to improve readability and clarity. |
|
|
Incentive Change Letter ($10 Boost) |
C-34 |
N/A |
Removed letter from FS contacts. |
Letter not needed because we are not offering prepaid boosts. |
|
Re-interview letter |
C-36 |
N/A |
Removed letter from FS contacts. |
Letter not needed because we are not conducting reinterviews during FS. |
|
Reminder E-mail 1 |
C-39 |
C-31 |
Updated text for FS implementation, Editorial to improve readability and clarity, added text to emphasize importance of knowing how pandemic experiences affected postsecondary education. |
|
|
Reminder E-mail 5 |
C-43 |
C-35 |
Updated text for FS implementation, Editorial to improve readability and clarity, added text to emphasize importance of knowing how pandemic experiences affected postsecondary education and incentive boost offer, if applicable. |
|
|
Reminder E-mail 7 |
C-45 |
C-37 |
Revised subject line |
Updated subject line to improve consistency with content of email (FAQs). |
"«fname», What Questions Do You Have About the BPS Study?"
Reminder E-mail 9 |
C-47 |
C-39 |
Revised subject line |
Minor edits to the subject line to improve readability. |
"Your Participation in the BPS Study Matters" |
Reminder E-mail 10 |
C-48 |
C-40 |
Revised subject line |
Minor edits to the subject line to improve readability. |
Important Education Research - Participate in the BPS Study |
Reminder E-mail 11 |
C-49 |
C-41 |
Revised subject line |
Minor edits to the subject line to improve readability. Added clock emoji. |
<<Urgent: >>The BPS Survey is Coming to a Close <<Clock emoji>>
Reminder E-mail 12 |
C-50 |
C-42 |
Revised subject line, Coronavirus pandemic reference. |
Minor edits to the subject line to improve readability. Added coronavirus merge fill. |
The BPS Survey Ends Tomorrow–Help Us Learn About Your Experiences |
Reminder E-mail 13 |
C-51 |
C-43 |
Revised subject line and text |
Minor edits to the subject line and text to improve readability. |
<<Fname>>, Your Last Chance to Participate… The BPS Survey Ends Today! |
As Needed Reminder E-mail 1 |
C-53 |
C-45 |
Revised subject line |
Minor edits to the subject line to improve readability. |
«fname», Help Inform Education Policy: Participate in the BPS Study Today |
As Needed Reminder E-mail 6 |
C-58 |
C-50 |
Revised text |
Minor edits to text to clarify we need responses from all types of students. |
|
Initial Contact E-mail (If time allows) |
C-63 |
C-109 |
This contact was initially deleted from the file; it has been added back with full-scale edits and is included on page C-109.
Initially deleted from the file; added back with full-scale edits and included on page C-109.
Updated to account for different sample types, incentive offers, etc. |
Incentive Change E-mail (Boost) — (Incentive Eligible) |
C-64 |
N/A |
Deleted email |
We are not offering a prepaid boost incentive, so the email is no longer needed.
|
Partial Complete E-mail |
C-66 |
C-53 |
Revised subject line |
Minor edits to the subject line to improve readability. |
Important Reminder: «fname», Finish Completing Your BPS Survey! |
Paired Contact E-mail |
C-67 |
C-54 |
Alternate subject line added. Merge sentence added to remind of NPSAS participation, if applicable. |
|
<<fname>>, thank you for speaking with us about the BPS survey. |
Reviewed Pending Ineligible E-mail (Common Eligibility Issues) |
C-71 |
N/A |
Deleted email |
Not needed for FS study |
|
Reviewed Pending Ineligible E-mail (Branch Campus Issue) |
C-72 |
N/A |
Deleted email |
Not needed for FS study |
|
No Contact E-mail |
C-77 |
C-61 |
Minor edit to subject line. Merge sentence added to remind of NPSAS participation, if applicable.
|
|
Data Collection Extension E-mail |
C-81 |
C-65 |
Minor edit to subject line and text of email. |
Clarifies that we have extended the deadline for participation in BPS.
|
Data Collection Staff Prompt E-mail |
C-82 |
C-66 |
Merge sentence added to remind of NPSAS participation, if applicable. |
Merge sentence added to remind of NPSAS participation, if applicable. |
|
Unclaimed Prepaid PayPal E-mail |
C-88 |
C-72 |
Added text asking the recipient to alert us "if this email has reached you in error."
Encourages a response if the email was sent to the wrong person.
|
BPS:20/25 FT Panel Maintenance E-mail |
C-92 |
C-76 |
Updated email and website for BPS:20/25 FT Panel Maintenance |
Updated email for BPS:20/25 FT and made minor updates to the website.
Added an additional minor update needed for the website.
Reminder Postcard 3 |
C-94 |
C-80 |
Updated text of postcard
Clarify that scanning the QR code provides access to the survey. |
|
As Needed Postcard 1 (Incentive Eligible) |
C-94 |
C-80 |
Added merge fields so the postcard can be sent to incentive-ineligible cases.
Updated to make it applicable to incentive-ineligible cases.
|
BPS:20/25 FT Panel Maintenance Postcard |
C-101 |
C-87 |
Updated postcard for BPS:20/25 FT Panel Maintenance |
Updated postcard for BPS:20/25 FT |
|
End of Early Bird Text Reminder |
C-105 |
N/A |
Deleted contact |
No early bird incentive offered |
|
Prepaid Text Reminder |
C-105 |
C-91 |
Updated text to be applicable for $2 prepaid incentive |
Removed reference to prepaid boost incentive. |
|
Re-Interview Texts |
C-106 |
N/A |
Deleted contact |
Deleted Reinterview Initial Text, Reminder 1, Reminder 2 |
|
Data Collection Extension Text Reminder |
C-106 |
C-92 |
Minor wording update. |
Clarifies that we have extended the deadline for participation in BPS.
|
Interesting Facts—Trend Data |
C-109 |
C-94 |
Rephrased fact 11 for clarity. |
|
On average, first-time beginning college students who entered postsecondary education in 2011–12 attended 1.5 postsecondary institutions by June of 2014. |
Letter Merge Fields Interesting Facts |
C-110 |
C-95 |
Rephrased Fact 11 for clarity, added Selected findings from the 2019–20 National Postsecondary Student Aid Study (NPSAS:20): First Look (COVID-19) |
Added Selected findings from the 2019–20 National Postsecondary Student Aid Study (NPSAS:20): First Look (COVID-19) |
|
Reinterview type |
C-111 |
N/A |
Deleted merge field |
Not needed for FS study |
|
E-mail Merge Fields, Interesting Facts |
C-113 |
C-98 |
Rephrased fact 11 for clarity. |
|
On average, first-time beginning college students who entered postsecondary education in 2011–12 attended 1.5 postsecondary institutions by June of 2014. |
Postcard Merge Fields, Interesting Facts |
C-116 |
C-101 |
Rephrased fact 11 for clarity. |
|
On average, first-time beginning college students who entered postsecondary education in 2011–12 attended 1.5 postsecondary institutions by June of 2014. |
Incentive Text Outside |
C-116 |
C-103 |
Updated merge field
Rephrased to apply to a promised boost incentive (instead of early bird incentive). |
|
Incentive Group |
C-117 |
C-104 |
Updated merge field |
Rephrased to apply to a promised boost incentive (instead of early bird incentive and $10 prepaid incentive). |
|
New letters |
N/A |
C-105 |
New contacts developed for full-scale study |
|
Note that the Greeting Card—Initial Contact was updated from the field test version on page C-20.
New E-Mails |
N/A |
C-109 |
New contacts developed for full-scale study |
|
Note that the Initial Contact E-mail was updated from the field test version on page C-63.
New Postcards |
N/A |
C-128 |
New contacts developed for full-scale study |
|
|
New Text Message Reminders |
N/A |
C-129 |
New contacts developed for full-scale study |
|
|
Communication Materials Translated into Spanish |
N/A |
C-130 |
New contacts developed for full-scale study |
|
Added Spanish website text. |