2023-24 National Postsecondary Student Aid Study (NPSAS:24) Full-Scale Study - Student Data Collection and Student Records

Change Memo

OMB: 1850-0666


Memorandum

United States Department of Education

Institute of Education Sciences

National Center for Education Statistics


DATE: July 18, 2024


TO: Beverly Pratt, OMB


THROUGH: Carrie Clarady, OMB Liaison, IES


FROM: Tracy Hunt-White, NCES


SUBJECT: 2023–24 National Postsecondary Student Aid Study (NPSAS:24) Shorter versus Longer Emails, Data Collection Pause, and Incentive Boost for Key Sample Groups Change Request (OMB# 1850-0666 v.38)

The 2023–24 National Postsecondary Student Aid Study (NPSAS:24) is a nationally representative cross-sectional study of how students and their families finance education beyond high school in a given academic year. NPSAS is conducted by the National Center for Education Statistics (NCES); it was first implemented during the 1986–87 academic year and has been fielded every 3 to 4 years since. This request pertains to the 12th cycle in the NPSAS series, conducted during the 2023–24 academic year. NPSAS:24 will consist of a nationally representative sample of undergraduate and graduate students and a nationally representative sample of bachelor’s degree completers. Subsets of questions in the NPSAS:24 student survey focus on describing aspects of the experience of bachelor’s degree completers in their last year of postsecondary education.


The request to conduct all activities related to NPSAS:24, including the materials and procedures related to the NPSAS:24 student data collection (consisting of abstraction of student data from institutions and a student survey), was approved by OMB in December 2023, with updates approved in January 2024 (OMB# 1850-0666 v.36 and v.37, respectively), and carried over the respondent burden, procedures, and materials related to the NPSAS:24 institution sampling, enrollment list collection, and matching to administrative data files as approved by OMB in September 2023 (OMB# 1850-0666 v.35). The NPSAS:24 enrollment list collection from institutions takes place from October 2023 to October 2024; the student records and student survey data collections take place from February 2024 through November 2024.


This request is to (1) experimentally test a modification of a subset of prompting emails; (2) define the schedule for planned “no contact” periods; and (3) share the results of logistic regression modeling used to determine whether a $10 incentive boost offered to historically underrepresented student subgroups will increase their likelihood of participation in the survey. We have also added language to Supporting Statement Part A to inform OMB of NCES’s plans for NPSAS with respect to the newly revised SPD 15 standards for race and ethnicity data. This request does not introduce significant changes to the estimated respondent burden or the costs to the federal government.



Appendix J: Modifications to Student Follow-up Emails


A set of follow-up emails was provided in appendix J, Student Data Collection Materials, of the NPSAS:24 student data collection forms clearance package (OMB# 1850-0666 v.37). While these emails are effective in increasing the likelihood of participation, particularly on or after the date sent, a recent experiment conducted as part of a National Science Foundation survey found that shorter (140-word) follow-up emails to nonrespondents were more likely to elicit a response than longer (212-word) emails.1 After the first 4 weeks of data collection, response rates were statistically significantly higher for those who had received the shorter email after the first reminder (18 percent; p < 0.05) than for those who had received the longer email (16 percent). According to the authors, this 2 percentage-point difference was maintained through the end of data collection, although at the p < 0.075 level.


Given this finding, we have created a set of shortened NPSAS:24 nonrespondent reminder emails, removing all but the most essential text from emails 1, 2, 3, 4, and 8, as shown below and in the revised appendix J attached. Nonrespondents within the last one or two data collection waves (depending on timing of approval) will be randomly split into two groups, with one receiving the original, longer follow-up emails (the Control group) and the other receiving the new, shorter emails (the Experimental group). Participation rates will be compared immediately before the next follow-up email is scheduled to be sent for the wave.



The proposed experimental design will allow us to test the null hypothesis that there is no statistically significant difference in participation rates between the Control and Experimental groups (i.e., no effect of the shorter email).
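To make the planned comparison concrete, the sketch below shows one way the random assignment and the test of this null hypothesis could be carried out. The pool size and counts of completes are hypothetical, and the variable names are illustrative; this is a sketch of the analytic approach, not the study’s production code.

```python
# Illustrative sketch of the email experiment analysis (hypothetical
# counts; not NPSAS:24 production code). Wave nonrespondents are randomly
# split into Control (longer emails) and Experimental (shorter emails),
# and participation rates are compared with a two-proportion z-test.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(seed=2024)

# Hypothetical pool of 10,000 wave nonrespondents, split at random
sample_ids = rng.permutation(10_000)
control_ids, experimental_ids = np.array_split(sample_ids, 2)

# Hypothetical completes observed just before the next scheduled email
completes = np.array([800, 905])                # Control, Experimental
group_sizes = np.array([control_ids.size, experimental_ids.size])

z_stat, p_value = proportions_ztest(completes, group_sizes)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")   # reject H0 if p < 0.05
```

With these hypothetical counts (16.0 versus 18.1 percent), the test mirrors the comparison reported in the NSF experiment cited above.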


Part B: Expected Scheduling and Duration of Sample Member “Time Out”


To encourage survey participation and decrease the potential for nonresponse bias, as discussed in the original package submission, we plan to offer nonresponding sample members in Data Collection Waves 1–3 either (1) a $10 promised incentive boost, if determined to be high priority (see Incentive Boost section below), or (2) an abbreviated survey, if not eligible for the boost. Sample members in these earliest waves, however, have received multiple contacts (e.g., emails, texts) over the course of the data collection period. Repetition of this nature can cause “wearout” (Pechmann & Stewart, 1988), which reduces sensitivity and attention to communications.2 Over time, sample members may become desensitized to the content of study communication materials and may be unlikely to notice the offer of a boost incentive or abbreviated survey. Consequently, we plan to pause reminders for nonresponding sample members in Waves 1–3 for a period of four weeks (the weeks of 7/8 through 7/29), as noted below in table 1. After this pause, we will re-engage these sample members with either the incentive boost or the abbreviated survey. We expect that the pause in data collection will increase the salience of the boost/abbreviated survey offer when it arrives, thereby increasing the propensity to respond to the new request (Groves, Singer, and Corning, 2000).3


NPSAS:24 data collection comprises a total of 9 sample waves, which correspond to time in data collection, as shown below in table 1. We will offer 4-week breaks in communications according to the wave of data collection; the break for Waves 1–3 (the weeks of 7/8 through 7/29) is marked with asterisks in the table. In addition, all waves will receive a break of almost 2 weeks ahead of the United States presidential election on November 5, 2024.



Table 1. Weeks in NPSAS:24 data collection, by data collection wave

| Week | Date beginning¹ | Wave 1 | Wave 2 | Wave 3 | Wave 4 | Wave 5 | Wave 6 | Wave 7 | Wave 8 | Wave 9 |
|------|-----------------|--------|--------|--------|--------|--------|--------|--------|--------|--------|
| 1  | 26-Feb | 1   |     |     |     |     |     |    |    |    |
| 2  | 4-Mar  | 2   |     |     |     |     |     |    |    |    |
| 3  | 11-Mar | 3   |     |     |     |     |     |    |    |    |
| 4  | 18-Mar | 4   |     |     |     |     |     |    |    |    |
| 5  | 25-Mar | 5   | 1   |     |     |     |     |    |    |    |
| 6  | 1-Apr  | 6   | 2   |     |     |     |     |    |    |    |
| 7  | 8-Apr  | 7   | 3   |     |     |     |     |    |    |    |
| 8  | 15-Apr | 8   | 4   | 1   |     |     |     |    |    |    |
| 9  | 22-Apr | 9   | 5   | 2   |     |     |     |    |    |    |
| 10 | 29-Apr | 10  | 6   | 3   |     |     |     |    |    |    |
| 11 | 6-May  | 11  | 7   | 4   | 1   |     |     |    |    |    |
| 12 | 13-May | 12  | 8   | 5   | 2   |     |     |    |    |    |
| 13 | 20-May | 13  | 9   | 6   | 3   |     |     |    |    |    |
| 14 | 27-May | 14  | 10  | 7   | 4   | 1   |     |    |    |    |
| 15 | 3-Jun  | 15  | 11  | 8   | 5   | 2   |     |    |    |    |
| 16 | 10-Jun | 16  | 12  | 9   | 6   | 3   |     |    |    |    |
| 17 | 17-Jun | 17  | 13  | 10  | 7   | 4   | 1   |    |    |    |
| 18 | 24-Jun | 18  | 14  | 11  | 8   | 5   | 2   |    |    |    |
| 19 | 1-Jul  | 19  | 15  | 12  | 9   | 6   | 3   |    |    |    |
| 20 | 8-Jul  | 20* | 16* | 13* | 10  | 7   | 4   | 1  |    |    |
| 21 | 15-Jul | 21* | 17* | 14* | 11  | 8   | 5   | 2  |    |    |
| 22 | 22-Jul | 22* | 18* | 15* | 12  | 9   | 6   | 3  |    |    |
| 23 | 29-Jul | 23* | 19* | 16* | 13  | 10  | 7   | 4  | 1  |    |
| 24 | 5-Aug  | 24  | 20  | 17  | 14  | 11  | 8   | 5  | 2  |    |
| 25 | 12-Aug | 25  | 21  | 18  | 15  | 12  | 9   | 6  | 3  |    |
| 26 | 19-Aug | 26  | 22  | 19  | 16  | 13  | 10  | 7  | 4  | 1  |
| 27 | 26-Aug | 27  | 23  | 20  | 17  | 14  | 11  | 8  | 5  | 2  |
| 28 | 2-Sep  | 28  | 24  | 21  | 18  | 15  | 12  | 9  | 6  | 3  |
| 29 | 9-Sep  | 29  | 25  | 22  | 19  | 16  | 13  | 10 | 7  | 4  |
| 30 | 16-Sep | 30  | 26  | 23  | 20  | 17  | 14  | 11 | 8  | 5  |
| 31 | 23-Sep | 31  | 27  | 24  | 21  | 18  | 15  | 12 | 9  | 6  |
| 32 | 30-Sep | 32  | 28  | 25  | 22  | 19  | 16  | 13 | 10 | 7  |
| 33 | 7-Oct  | 33  | 29  | 26  | 23  | 20  | 17  | 14 | 11 | 8  |
| 34 | 14-Oct | 34  | 30  | 27  | 24  | 21  | 18  | 15 | 12 | 9  |
| 35 | 21-Oct | 35  | 31  | 28  | 25  | 22  | 19  | 16 | 13 | 10 |
| 36 | 28-Oct | 36  | 32  | 29  | 26  | 23  | 20  | 17 | 14 | 11 |
| 37 | 4-Nov  | 37  | 33  | 30  | 27  | 24  | 21  | 18 | 15 | 12 |
| 38 | 11-Nov | 38  | 34  | 31  | 28  | 25  | 22  | 19 | 16 | 13 |
| 39 | 18-Nov | 39  | 35  | 32  | 29  | 26  | 23  | 20 | 17 | 14 |
| 40 | 25-Nov | Expected end of data collection | | | | | | | | |

¹ All dates shown are in 2024.
* Four-week pause in reminder communications for Waves 1–3 (weeks of 7/8 through 7/29); see text.
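The week-in-collection values in table 1 follow a fixed offset pattern that can be read off the table itself: Wave 1 enters data collection in calendar week 1, and Waves 2 through 9 enter in calendar week 5 and every 3 weeks thereafter. As a minimal sketch (derived from the table; not the study’s actual scheduling code), the pattern can be reproduced as:

```python
# Reproduces the week-in-collection values in table 1 (a sketch derived
# from the table itself; not NPSAS:24 scheduling code).
# Wave 1 starts in calendar week 1; Waves 2-9 start in weeks 5, 8, 11, ...
START_WEEK = {1: 1, **{w: 5 + 3 * (w - 2) for w in range(2, 10)}}

def week_in_collection(calendar_week, wave):
    """Return the wave's week of data collection, or None if not yet begun."""
    offset = calendar_week - START_WEEK[wave] + 1
    return offset if offset >= 1 else None

# Example: in calendar week 20 (week of 8-Jul), Wave 3 is in its 13th
# week of data collection, matching the table.
assert week_in_collection(20, 3) == 13
```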


Part B: Boost Incentives for Key Sample Member Groups


To identify subgroups for the incentive boost, we estimated a binary logistic regression model predicting the probability of an NPSAS:24 survey response for respondents, partial respondents, and nonrespondents in Waves 1–5 of data collection. This model included nine sample member characteristics as substantive predictors: gender (coded as male, female, other), age (25 or younger, 26 to 39, 40 and older), race (White, races other than White), ethnicity (Hispanic/Latino, not Hispanic/Latino), veteran status (veteran, not a veteran), control of sample member’s institution (public, private non-profit, private for-profit), level of sample member’s institution (less-than-2-year; 2-year but less-than-4-year; 4-year or higher non-doctorate-granting; 4-year or higher doctorate-granting), whether the sample member’s course of study is STEM (STEM, not STEM), and undergraduate status (undergraduate, not an undergraduate). These variables were obtained from enrollment lists. Where enrollment list data were missing, we replaced missing values with sample members’ substantive answers to the NPSAS:24 survey, where available.


The model also included three variables controlling for design features of the survey: the sample member’s data collection wave, whether the sample member was assigned for computer-assisted telephone interviewing (CATI) calling, and the time of day that reminder emails were sent to the sample member. The overall model fit was good: the pseudo R-squared for the final model was 0.4.


We then used this model to estimate predicted probabilities of NPSAS:24 survey response for each category of each of our nine sample member characteristics, holding all other variables at their means. Table 2 below displays these predicted probabilities, along with response rates for each subgroup as of July 1, 2024.
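As a sketch of the modeling approach, the propensity model and subgroup predictions could be implemented as shown below. The file name, column names, and category labels are hypothetical and illustrative; this is not the study’s production code, and the subgroup prediction shown uses averaged (“recycled”) predictions as a common stand-in for prediction at the means when predictors are categorical.

```python
# Illustrative sketch of the response-propensity model (hypothetical file
# and column names; not NPSAS:24 production code). A binary logit predicts
# survey response from the nine sample member characteristics plus the
# three design controls described above.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sample_members.csv")  # hypothetical analysis file

formula = (
    "responded ~ C(gender) + C(age_group) + C(race) + C(hispanic)"
    " + C(veteran) + C(inst_control) + C(inst_level) + C(stem)"
    " + C(undergrad) + C(wave) + C(cati_assigned) + C(email_time_of_day)"
)
result = smf.logit(formula, data=df).fit()
print(result.prsquared)  # McFadden pseudo R-squared (about 0.4 in the memo)

# Predicted response probability for one subgroup: set the category of
# interest for every case and average the predictions over the sample.
grid = df.assign(inst_level="less_than_2yr")
print(result.predict(grid).mean())
```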


We identified three subgroups with lower response rates and/or propensities: students at private for-profit institutions (low response rate of 38.4 percent), students at less-than-2-year institutions (low response propensity of 0.58 and low response rate of 34.1 percent), and students at 2-year but less-than-4-year institutions (low response propensity of 0.66 and low response rate of 37.5 percent). These three groups have historically responded at lower rates across NPSAS surveys and may benefit from a design change in line with leverage-salience theory, which stipulates that a “one-size-fits-all” incentive amount is not a good solution to nonresponse error (Groves, Singer, and Corning, 2000).4


Differential incentives have proven successful in bringing in groups of focal importance that would otherwise be underrepresented (e.g., Groves, Singer, and Corning, 2000; Groves and Heeringa, 2006; Peytcheva, Kirchner, and Cooney, 2018).5,6 Such a strategy was successfully employed in NPSAS:20, when an additional $10 was offered to nonrespondents in three key analysis groups during the last 8 waves of data collection, resulting in an average response rate increase of 17.53 percent across waves relative to the projected response rate under the original design. We therefore recommend offering a $10 promised incentive boost for cases belonging to any of the three groups identified above in the earliest waves, to encourage participation and reduce the potential for nonresponse bias. This would result in an incentive boost for approximately 4,500 nonresponding sample members from Waves 1–3. For the remaining sample member characteristics, response rates and propensities were generally similar across subgroups; we will continue to monitor them during data collection and will re-evaluate the remaining waves in September 2024.


Table 2. Response Rates and Mean Predicted Propensities for Selected Subgroups

| Sample member characteristic | Predicted probability of survey response | Standard error | Response rate as of 7/1/2024 |
|------------------------------|-------------------------------------------|----------------|------------------------------|
| Gender                       |      |      |       |
| Male                         | 0.67 | 0.01 | 43.5% |
| Female                       | 0.70 | 0.01 | 49.6% |
| Other                        | 0.78 | 0.02 | 62.1% |
| Age                          |      |      |       |
| 25 or younger                | 0.70 | 0.01 | 44.1% |
| 26-39                        | 0.66 | 0.01 | 47.5% |
| 40 or older                  | 0.71 | 0.01 | 47.5% |
| Race                         |      |      |       |
| Races other than White       | 0.72 | 0.01 | 52.3% |
| White                        | 0.68 | 0.01 | 50.6% |
| Ethnicity                    |      |      |       |
| Not Hispanic                 | 0.67 | 0.01 | 48.0% |
| Hispanic                     | 0.81 | 0.01 | 43.7% |
| Veteran Status               |      |      |       |
| Not a Veteran                | 0.69 | 0.01 | 45.4% |
| Veteran                      | 0.67 | 0.01 | 44.4% |
| Control of Institution       |      |      |       |
| Public Institution           | 0.68 | 0.01 | 44.1% |
| Private non-profit Institution | 0.72 | 0.01 | 49.8% |
| Private for-profit Institution | 0.72 | 0.01 | 38.4% |
| Institution Level            |      |      |       |
| Less-than-2-year             | 0.58 | 0.03 | 34.1% |
| 2-year but less-than-4-year  | 0.66 | 0.01 | 37.5% |
| 4-year or higher non-doctorate-granting | 0.70 | 0.01 | 47.6% |
| 4-year or higher doctorate-granting     | 0.71 | 0.01 | 51.0% |
| STEM Status                  |      |      |       |
| Not in a STEM Program        | 0.69 | 0.01 | 44.7% |
| In a STEM Program            | 0.70 | 0.01 | 49.8% |
| Undergraduate Status         |      |      |       |
| Not an Undergraduate         | 0.73 | 0.01 | 52.4% |
| Undergraduate                | 0.68 | 0.01 | 43.0% |





Part A: Initial Plans for NPSAS and the Revised SPD 15 Standards


OMB recently adopted new standards for the collection of race and ethnicity data in Federal data collections, and agencies have been instructed that all new packages must include information about their compliance, or lack thereof, with the new standards. Toward that end, we have added the following language to section A.7 of Supporting Statement Part A.


In March 2024, the Office of Management and Budget (OMB) announced revisions to Statistical Policy Directive No. 15: Standards for Maintaining, Collecting, and Presenting Federal Data on Race and Ethnicity (SPD 15) and published the revised SPD 15 standard in the Federal Register (89 FR 22182). The NPSAS:24 data collection described in this package is not compliant with the new standard and continues to use race and ethnicity categories as described in the 1997 SPD 15 standards.


NPSAS collects race and ethnicity data in multiple ways: from individual student respondents in the Student Data Collection, from postsecondary institutions as part of the Enrollment List Data Collection, and from occasional NPSAS Administrative Collections that rely solely on administrative data sources. NPSAS:24 started Enrollment List Data Collection in October 2023 and began Student Data Collection in February 2024. Because data collection for NPSAS:24 is already well underway, NCES has chosen not to update the student race and ethnicity items at this time.


The next NPSAS study is currently expected to take place in the 2026–27 school year and will be an NPSAS Administrative Collection. All data collected in Administrative Collections are reported through third-party data collection, and it is not yet clear whether postsecondary institutions will be prepared to submit third-party race and ethnicity data in accordance with the revised SPD 15 standards at that time. NCES and the Department of Education are currently working with a number of stakeholders to establish timelines for compliance for all school systems and postsecondary institutions across the country. The details of these timelines will be included in the ED Action Plan on Race and Ethnicity when it is submitted to OMB on or before September 28, 2025.

1 Bowman, M., Bryant, A., Griffiths, R., Hare, A., Huey, L., McCall, J., Scanlon, J., and Wakar, B. (2023). Assessment of Stakeholder Experiences with NSF’s Merit Review Process: Findings from the 2021 Merit Review Survey. Alexandria, VA: National Science Foundation.

2 Pechmann, C., and Stewart, D.W. (1988). Advertising Repetition: A Critical Review of Wearin and Wearout. Current Issues and Research in Advertising, 11(1-2), 285-329.

3 Groves, R.M., Singer, E., and Corning, A.D. (2000). Leverage-Salience Theory of Survey Participation: Description and an Illustration. Public Opinion Quarterly, 64(3), 299-308.

4 Ibid.

5 Groves, R.M., and Heeringa, S.G. (2006). Responsive Design for Household Surveys: Tools for Actively Controlling Survey Errors and Costs. Journal of the Royal Statistical Society: Series A (Statistics in Society), 169(3), 439-457. doi: 10.1111/j.1467-985X.2006.00423.x.

6 Peytcheva, E., Kirchner, A., and Cooney, J. (2018). Experimental Comparison of Two Data Collection Protocols for Previous Wave Nonrespondents. Paper presented at the Methodology of Longitudinal Surveys II conference, Essex, U.K.

