2008/18 BACCALAUREATE AND BEYOND (B&B:08/18) FULL-SCALE

OMB # 1850-0729 v. 13



Supporting Statement Part B







Submitted by

National Center for Education Statistics

U.S. Department of Education












February 2018




  1. Collection of Information Employing Statistical Methods

This submission requests clearance for the 2008/18 Baccalaureate and Beyond Longitudinal Study (B&B:08/18) full-scale data collection instrument and methods. B&B:08/18 is the third and final follow-up of sample members from the 2007-08 National Postsecondary Student Aid Study (NPSAS:08) who were baccalaureate recipients during the 2007-08 academic year. For details on the NPSAS:08 sampling design and the final full-scale study designs of the B&B follow-up studies, see the Supporting Statements Part B of the NPSAS:08 full-scale (OMB# 1850-0666), the B&B:08/09 full-scale (OMB# 1850-0729, v. 2), and the B&B:08/12 full-scale (OMB# 1850-0729, v. 7-10). Specific plans for the B&B:08/18 full-scale study are provided below.

    1. Respondent Universe – B&B:08/18 Target Population

The target population for the B&B:08/18 full-scale study includes all eligible NPSAS:08 sample members who completed requirements for the bachelor’s degree at NPSAS-eligible institutions during the 2007-08 academic year (that is, between July 1, 2007, and June 30, 2008) and were awarded their baccalaureate degree by the institution no later than June 30, 2009. Each student in the B&B sample has a known and well-defined probability of selection. Through the institution awarding the degree, each completer has exactly one linkage to the B&B sampling frame.

B&B:08/18, the 10-year follow-up to its base-year study, NPSAS:08, will be conducted six years after the most recent round of data collection with this cohort, B&B:08/12. In B&B:08/12, the unweighted response rate was 85 percent of the eligible sample. In the only other 10-year B&B cohort follow-up, B&B:93/03, about 86 percent of the eligible sample responded. Longitudinal studies often experience attrition between waves, especially across long gaps during which some sample members become unlocatable. In addition, survey response rates have continued to decline over the past six years, as evidenced by response rates in other recent NCES postsecondary studies: BPS:12/17 achieved a 64 percent response rate among eligible sample members, and B&B:16/17, which is a month away from the end of its data collection, has a 61 percent response rate to date and is projected not to exceed 70 percent. The B&B:08/18 field test response rate was about 72 percent, although a field test does not use a random sample, and field test data collection is abbreviated in time and effort relative to full-scale data collection. Based on prior response status and the planned data collection protocols, we expect a 75 percent response rate.

    2. Statistical Methodology – B&B:08/18 Sample Design

The sample design for the B&B:08/18 full-scale study will mimic the design used for the field test sample. Table 1 shows the distribution of the B&B:08/18 sample by response status in all prior rounds, and Table 2 shows the distribution of the B&B:08/18 sample by NPSAS:08 study member status. The B&B:08/18 full-scale sample will include all B&B:08/12 eligible sample members. However, only those who were NPSAS:08 study members will be fielded; NPSAS:08 non-study members (n=74) will be treated as B&B:08/18 nonrespondents and will not be pursued for data collection. NPSAS:08 non-study members are sample members who lacked enough information from the NPSAS interview and administrative collections to qualify as NPSAS respondents.

Table 1. Distribution of the B&B:08/18 sample by response status for NPSAS:08, B&B:08/09, and B&B:08/12

NPSAS:08      | B&B:08/09     | B&B:08/12     | Count
Total         |               |               | 17,114
Respondent    | Respondent    | Respondent    | 13,490
Respondent    | Respondent    | Nonrespondent | 1,547
Respondent    | Nonrespondent | Respondent    | 1,069
Respondent    | Nonrespondent | Nonrespondent | 955
Nonrespondent | Respondent    | Respondent    | 39
Nonrespondent | Respondent    | Nonrespondent | 13
Nonrespondent | Nonrespondent | Respondent    | 5
Nonrespondent | Nonrespondent | Nonrespondent | 17

NOTE: NPSAS:08 study members are sample members who met the requirements to be an analysis case in NPSAS:08 either through the interview or administrative data collections.

Table 2. Distribution of the B&B:08/18 sample by NPSAS:08 study member status


                           | Count
Total                      | 17,114
NPSAS:08 study member      | 17,040
NPSAS:08 non-study member1 | 74

1 NPSAS:08 non-study members will not be fielded in the B&B:08/18 data collection and will be considered B&B:08/18 nonrespondents.

    3. Methods for Maximizing Response Rates

Response rates in the B&B:08/18 full-scale data collection will be maximized to the extent that B&B:08/18 staff succeed in identifying and locating the sample members, contacting them, and gaining their cooperation. The following sections outline methods for maximizing response to the B&B:08/18 survey.

      a. Survey Instrument Design

Development of the full-scale interview began with the preparation of the field test instrument. After field test data collection, cognitive interviews were conducted in fall 2017 to capture feedback on targeted sections of the field test survey. A second technical review panel (TRP) meeting was held in December 2017 to review field test and cognitive testing results and to discuss improvements to survey items for the full-scale implementation.

The B&B:08/18 full-scale survey will have three versions: a full, an abbreviated, and a mini survey. The abbreviated survey will collect information on key topics of interest for B&B, including additional education, current employment status, K-12 teaching status, and marital status. In addition to these key items, an updated employment history since the last follow-up will be collected. The mini survey will collect the same set of key items as the abbreviated survey but will not collect detailed information about employment history. A facsimile of the survey instruments that will be used in the B&B:08/18 full-scale data collection is provided in appendix F.

The B&B:08/18 interview will be available to sample members through a web-based instrument that can be used for self-administration on mobile and non-mobile devices and for computer-assisted telephone interviews (CATI). Once the mini survey becomes available, sample members will also have the option of completing it via a paper-and-pencil (PAPI) questionnaire. Completed paper surveys will be returned in an addressed, postage-paid envelope enclosed in the package sent to sample members, and survey responses will be scanned and keyed by project staff. Response data from PAPI surveys will be stored in the same manner as web and CATI survey data.

Other methods will be implemented to increase response rates and decrease the potential for nonresponse bias, such as simplifying the login process. For example, quick response (QR) codes will be included in recruitment mailings to provide automated access to the survey. Using the camera application available on most smartphones and tablets, sample members can scan the QR code and access the study website without having to type a URL. When communicating with sample members through e-mail and text messages, abbreviated links will enable automated access to the survey.
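
To illustrate how such QR codes might be produced for the mailings, the following is a minimal sketch using the third-party Python qrcode package; the survey URL and output filename are hypothetical placeholders, not the study’s actual addresses.

```python
# Minimal sketch of producing a QR code for a recruitment mailing,
# using the third-party "qrcode" package (pip install qrcode[pil]).
# The survey URL and output filename are hypothetical placeholders.
import qrcode

def make_mailing_qr(survey_url: str, outfile: str) -> None:
    """Encode the study-website URL as a printable QR image."""
    img = qrcode.make(survey_url)  # returns a PIL-backed image
    img.save(outfile)

make_mailing_qr("https://example.gov/bb-survey", "bb_mailing_qr.png")
```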

      b. Tracing of Sample Members

To yield the maximum number of locates at the least expense, an integrated tracing approach was developed, and the effectiveness of these procedures for the full-scale effort was evaluated during the field test. The tracing plan for the full-scale study includes the elements listed below; a schematic sketch of the staged escalation follows the list.

  • Tracing activities prior to the start of data collection will include batch database searches and advance intensive tracing. Some cases will require additional advance tracing before mailings can be sent or the cases can be worked in CATI. For cases whose mailing address, phone number, or other contact information is invalid or unavailable, B&B staff plan to conduct advance tracing prior to the lead letter mailing and data collection. As lead information is found, additional searches will be conducted through interactive databases to expand on the leads found.

  • The telephone locating and interviewing stage includes calling all available telephone numbers and following up on leads provided by parents and other contacts.

  • The pre-intensive batch tracing stage consists of the LexisNexis SSN and Premium Phone batch searches that will be conducted between the telephone locating and interviewing stage and the intensive tracing stage.

  • The intensive tracing stage consists of tracers conducting database searches after all current telephone numbers have been exhausted. In B&B:08/09, about 77 percent of sample members requiring intensive tracing were located, and about 46 percent of those located responded to the interview. In B&B:08/12, about 89 percent of sample members requiring intensive tracing were located, and about 39 percent of those located responded to the interview. Intensive interactive tracing differs from batch tracing in that a tracer can assess each case individually to determine which resources are most appropriate and in what order they should be used. It is also much more detailed, owing to the personal review of information. During interactive tracing, tracers use all previously obtained contact information to make tracing decisions about each case. These intensive interactive searches are completed using RTI’s integrated case management system, which provides organization and efficiency in the intensive tracing process. Sources that may be used, as appropriate, include credit database searches (such as Experian), various public websites, and other integrated database services.

  • Other locating activities will take place as needed, including e-mail searches in which B&B:08/18 staff will match SSN and phone number to obtain e-mail addresses.
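
The staged escalation above can be summarized schematically. The sketch below is illustrative only: the stage names, case fields, and stand-in locate functions are assumptions, not the contractor’s actual tracing system.

```python
# Schematic sketch of the staged tracing escalation described above.
# Stage names, case fields, and the stand-in locate functions are
# illustrative assumptions, not the contractor's actual system.
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class TracingCase:
    case_id: str
    located: bool = False
    stages_tried: List[str] = field(default_factory=list)

def run_tracing(cases: List[TracingCase],
                stages: List[Tuple[str, Callable[[TracingCase], bool]]]):
    """Apply each stage, in order, only to cases still unlocated."""
    for stage_name, locate in stages:
        for case in cases:
            if not case.located:
                case.stages_tried.append(stage_name)
                case.located = locate(case)
    return cases

# Hypothetical stage functions standing in for the batch database
# searches, CATI lead follow-up, pre-intensive batch searches, and
# intensive interactive tracing described in the list above.
stages = [
    ("advance_batch_search", lambda c: False),
    ("cati_lead_followup", lambda c: False),
    ("pre_intensive_batch", lambda c: False),
    ("intensive_interactive", lambda c: True),
]
cases = run_tracing([TracingCase("001"), TracingCase("002")], stages)
```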

      c. Contacts with Sample Members

Various communication methods will be used to maintain continuing contact with sample members as needed, prior to and throughout data collection. Letters, postcards, e-mail messages, and text messages (for those who have given prior permission to be texted) will be used to communicate with sample members regarding their study participation. Appendix E presents the communication materials to be used in the B&B:08/18 full-scale study and the tentative1 schedule for contacting sample members.

      d. Training for Data Collection Staff

Telephone data collection will be conducted at the contractor’s Research Operations Center (ROC), where B&B staff will include Performance Team Leaders (PTLs) and Data Collection Interviewers (DCIs). Training programs for these staff members are critical to maximizing response rates and collecting accurate and reliable data.

PTLs, who are responsible for all supervisory tasks, will attend project-specific training in addition to the interviewer training. They will receive an overview of the study, background and objectives, and the survey instrument through a question-by-question review. PTLs will also receive training in the following areas: providing direct supervision of DCIs; handling refusals; monitoring interviews and maintaining records of monitoring results; problem resolution; case review; specific project procedures and protocols; reviewing CATI reports; and monitoring data collection progress.

Training for DCIs is designed to help staff become proficient in using the CATI case management system and the survey instrument, as well as to learn project procedures and requirements. Particular attention will be paid to quality control initiatives, including refusal avoidance and methods to ensure that quality data are collected. DCIs will receive project-specific training on telephone interviewing and answering questions from web participants regarding the study or related to specific items within the interview. After training, all ROC staff must meet certification requirements by successfully completing a certification interview. This evaluation consists of a full-length interview with project staff observing and evaluating interviewers, as well as an oral evaluation of interviewers’ knowledge of the study’s Frequently Asked Questions.

      e. Case Management System

Interviews will be conducted using a single web-based survey instrument for both web (including mobile devices) and CATI data collection. Telephone data collection activities will be accomplished through a CATI case management system (CMS) equipped with numerous capabilities, including: online access to locating information and histories of locating efforts for each case; a sample management module for tracking case progress and status; and an automated scheduling module that delivers cases to interviewers. The automated scheduling module incorporates the following features:

  • Automatic delivery of appointment and call-back cases at specified times. This reduces the need for tracking appointments and helps ensure the interviewer is punctual. The scheduler automatically accounts for the appropriate time zone when scheduling calls.

  • Sorting of non-appointment cases according to parameters and priorities set by project staff. Cases may be prioritized for CATI within certain sub-samples or geographic areas, or cases may be sorted to establish priorities between cases of differing status. B&B:08/18 staff may also use patterns of calling outcomes to prioritize calls (e.g., cases with more than a certain number of unsuccessful attempts during a given time of day may be passed over for a brief time). As another example, mode tailoring is a strategy in which sample members who participated via phone on prior rounds will be contacted by phone earlier than other sample members.

  • Restriction on allowable interviewers. Groups of cases (or individual cases) may be designated for delivery to specific interviewers or groups of interviewers. This feature is most commonly used to route refusal cases, locating problems, or foreign-language cases to specific interviewers with specialized skills.

  • Complete records of calls and tracking of all previous outcomes. The scheduler tracks all outcomes for each case, labeling each with type, date, and time. These are easily accessed by the interviewer upon entering the individual case, along with interviewer notes.

  • Flagging of problem cases for supervisor action or supervisor review. For example, refusal cases may be routed to supervisors for decisions about whether and when a refusal letter should be mailed, or whether another interviewer should be assigned.

  • Complete reporting capabilities. The CMS produces default reports on the aggregate status of cases and provides custom report generation capabilities.

The integration of these capabilities reduces the number of discrete stages required in data collection and data preparation activities and enables immediate error reconciliation, resulting in better data quality and reduced costs. The scheduler provides efficient case assignment and delivery, reducing supervisory and clerical time, improving interviewer and supervisor performance by automatically monitoring appointments and call-backs, and reducing variation in implementing survey priorities and objectives.
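
As an illustration of two of the scheduler behaviors described above, time-zone-aware appointment delivery and priority sorting of non-appointment cases, the following Python sketch makes several assumptions: the case fields, the priority rule, and the attempt cutoff are hypothetical, not the actual CMS logic.

```python
# Illustrative sketch of two scheduler behaviors described above:
# time-zone-aware appointment delivery and priority sorting of
# non-appointment cases. Case fields, the priority rule, and the
# attempt cutoff are assumptions, not the actual CMS logic.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional
from zoneinfo import ZoneInfo

@dataclass
class CatiCase:
    case_id: str
    time_zone: str                   # IANA zone, e.g., "America/Chicago"
    appointment: Optional[datetime]  # appointment as naive local time
    unsuccessful_attempts: int = 0
    prior_cati_completer: bool = False  # mode-tailoring flag

def is_due(case: CatiCase, now_utc: datetime) -> bool:
    """True once local time in the case's zone reaches the appointment."""
    if case.appointment is None:
        return False
    tz = ZoneInfo(case.time_zone)
    return now_utc.astimezone(tz) >= case.appointment.replace(tzinfo=tz)

def delivery_order(cases: List[CatiCase], now_utc: datetime,
                   max_attempts: int = 6) -> List[CatiCase]:
    """Due appointments first; remaining cases sorted by priority."""
    due = [c for c in cases if is_due(c, now_utc)]
    rest = [c for c in cases if not is_due(c, now_utc)
            and c.unsuccessful_attempts < max_attempts]
    # Mode tailoring: prior CATI completers first, fewest attempts next.
    rest.sort(key=lambda c: (not c.prior_cati_completer,
                             c.unsuccessful_attempts))
    return due + rest

# Example: an aware UTC clock drives the delivery decision.
now = datetime.now(timezone.utc)
```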

      f. Refusal Aversion and Conversion

Recognizing and avoiding refusals is important to maximizing the response rate. B&B:08/18 staff will emphasize this and other topics related to obtaining cooperation during interviewer training. PTLs will monitor interviewers closely during the early days of outbound calling and provide retraining as necessary. In addition, supervisors will review daily interviewer production reports produced by the CATI system to identify and retrain any data collectors producing unacceptable numbers of refusals or other problems.

Refusal conversion efforts will be delayed for at least one week to give the respondent time after the initial refusal. Attempts at refusal conversion will not be made with individuals who become verbally aggressive or who threaten to take legal or other action. Refusal conversion efforts will not be conducted to a degree that would constitute harassment. B&B:08/18 staff will respect a sample member’s right to decide not to participate and will not infringe upon this right by carrying conversion efforts beyond the bounds of propriety. Sample members who explicitly refuse participation in the survey will be offered the mini survey option, described above.

    4. Tests of Procedures and Methods

The B&B:08/18 field test included two sets of experiments: the first focused on interventions designed to increase survey participation and reduce nonresponse error, while the second focused on minimizing measurement error in selected survey items. The full-scale data collection design described below will implement the tested approaches, revised based on the B&B:08/18 field test results described in section 4.a. These approaches are also informed by previous studies, including B&B:08/12 (B&B:08/12 Data File Documentation, https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2015141) and the B&B:16/17 field test (see B&B:16/17 Appendix C, OMB# 1850-0926 v.3).

      a. Summary of B&B:08/18 Field Test Data Collection Design and Results

The B&B:08/18 field test contained three data collection experiments and one questionnaire design experiment. The three data collection experiments examined different aspects of tailoring contact: tailoring of contact materials, emphasis of NCES as the source and signatory of e-mails (referred to as the sponsorship experiment), and offering a mini survey with an additional survey mode (i.e., offering the mini survey with or without a PAPI option). The B&B:08/18 field test experiment results provide insight, in preparation for the full-scale study, into the effectiveness of these interventions in terms of survey response, representativeness, and data collection efficiency. Survey response was investigated using response rates and, conditional on survey participation, résumé upload rates. Based on administrative frame data, B&B:08/18 staff conducted nonresponse bias analyses for age, institutional sector of the NPSAS institution, region of the United States in which the NPSAS institution is located, and total enrollment counts. Efficiency is operationalized as the number of days between the start of the experiment and survey completion, excluding respondents who completed a partial interview or responded on paper. The analysis uses one-sided t-tests to assess whether survey response or efficiency increases significantly in the experimental groups and two-sided t-tests to assess nonresponse bias.
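
A minimal sketch of the relative nonresponse bias summary described above follows. It assumes the conventional percent-of-mean definition used in NCES nonresponse bias analyses (respondent mean compared with the full-sample mean on a frame variable known for all cases); the variable names and example values are hypothetical.

```python
# Minimal sketch of the absolute relative nonresponse bias summary
# described above: for a frame variable known for every sample member,
# compare the respondent mean with the full-sample mean. Variable
# names and example values are hypothetical.
import statistics

def abs_relative_bias(frame_values, responded):
    """|respondent mean - full-sample mean| / |full-sample mean| * 100."""
    overall = statistics.fmean(frame_values)
    resp_mean = statistics.fmean(
        v for v, r in zip(frame_values, responded) if r)
    return abs(resp_mean - overall) / abs(overall) * 100

# Example: total enrollment on the frame and a response indicator.
enrollment = [12000, 3400, 25000, 800, 15000]
responded = [True, False, True, True, False]
print(round(abs_relative_bias(enrollment, responded), 2))
```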

The questionnaire design experiment investigated alternatives to the commonly used “check all that apply” format in item batteries. The results of the field test experiments are summarized below. For detailed results of the B&B:08/18 field test experiments, see Appendix C.

The tailoring of contact materials experiment investigated whether referencing the sample members’ degree major improved overall data quality by increasing the personal relevance of participating. The results in Table 3 show no statistically significant differences in response or efficiency. Overall response rates in the two conditions were similar (tailoring: 71.7%; standardized: 72.0%; t(1,098) = -0.11, p = .55). However, the response rate among B&B:08/12 field test nonrespondents who received the tailored materials was 42.4 percent, compared with 36.0 percent among B&B:08/12 field test nonrespondents in the standardized condition (t(179) = 0.88, p = .19). While not significantly different, these results suggest that tailoring may be important among prior round nonrespondents. Tailoring also reduced the magnitude of nonresponse bias and the number of significantly biased indicators.

Table 3. Tailoring results by experimental condition


                                                    | Standardized materials | Tailored materials | t-statistic | p-value | N
Response                                            |                        |                    |             |         |
  Survey completion (in percent)                    | 72.04                  | 71.73              | -0.11       | 0.5540  | 1,100
  Résumé submission (in percent)                    | 38.16                  | 35.82              | -0.67       | 0.7489  | 791
Representativeness                                  |                        |                    |             |         |
  Average absolute relative nonresponse bias        | 8.38                   | 9.45               | na          | na      | 1,100
  Median absolute relative nonresponse bias         | 5.53                   | 5.53               |             |         |
  Maximum absolute relative nonresponse bias        | 44.47                  | 39.40              |             |         |
  Percentage of significantly biased indicators     | 14.29                  | 0.00               |             |         |
Efficiency (in days since start of the experiment)  | 30.16                  | 28.65              | -0.70       | 0.2419  | 755

The results of the sponsorship experiment, that is, sending reminder e-mails from an “@rti.org” or an “@ed.gov” e-mail address, are very similar to those from the tailoring experiment with respect to response, representativeness, and efficiency (see Table 4). Both groups achieved identical response rates at the end of data collection (54.8% in both conditions, t(1,330) = 0.02, p = .49).

Table 4. Sponsorship results by experimental condition


                                                    | RTI   | NCES  | t-statistic | p-value | N
Response                                            |       |       |             |         |
  Survey completion (in percent)                    | 54.78 | 54.83 | 0.02        | 0.4916  | 1,332
  Résumé submission (in percent)                    | 30.51 | 33.06 | 0.74        | 0.2309  | 730
Representativeness                                  |       |       |             |         |
  Average absolute relative nonresponse bias        | 10.96 | 10.05 | na          | na      | 1,332
  Median absolute relative nonresponse bias         | 8.90  | 7.37  |             |         |
  Maximum absolute relative nonresponse bias        | 54.47 | 44.50 |             |         |
  Percentage of significantly biased indicators     | 10.53 | 4.76  |             |         |
Efficiency (in days since start of the experiment)  | 32.67 | 30.78 | 0.84        | 0.1999  | 682

The results of the mini survey experiment, which compared completing the mini survey in the original survey modes (mini-standard) with a condition that additionally allowed completion by paper and pencil (mini-PAPI), are shown in Table 5: there was no statistical difference in response rates between the mini-PAPI and mini-standard conditions. Conditional on participation in the mini survey, respondents in the mini-PAPI condition uploaded their résumés at a significantly lower rate (18.3%) than those in the mini-standard condition (35.1%; t(196) = -2.73, p < .01). This lower rate is driven by the fact that none of the respondents who completed the survey by mail uploaded their résumés when asked to do so in a thank-you mailing, even though they received the same $10 incentive offer for résumé submission as all other respondents. There is no difference in representativeness across conditions. However, respondents in the mini-PAPI condition who completed the survey on the web2 did so approximately three days later (23.0 days after the start of the experiment) than respondents in the mini-standard condition (19.9 days after the start of the experiment); this difference is statistically significant (t(166) = 1.66, p < .05).

Table 5. Mini survey results by experimental condition


                                                    | Mini-standard | Mini-PAPI | t-statistic | p-value | N
Response                                            |               |           |             |         |
  Survey completion (in percent)                    | 23.44         | 26.07     | 0.86        | 0.1953  | 800
  Résumé submission (in percent)                    | 35.11         | 18.27     | -2.73       | 0.0035  | 198
Representativeness                                  |               |           |             |         |
  Average absolute relative nonresponse bias        | 15.46         | 14.64     | na          | na      | 806
  Median absolute relative nonresponse bias         | 10.95         | 9.06      |             |         |
  Maximum absolute relative nonresponse bias        | 78.50         | 44.50     |             |         |
  Percentage of significantly biased indicators     | 0.00          | 0.00      |             |         |
Efficiency (in days since start of the experiment)  | 22.99         | 19.89     | 1.66        | 0.0496  | 168

NOTE: Six mini survey respondents were excluded from the response and efficiency analyses since they completed the survey before they were informed about the mini survey and were hence not part of the experiment. The sample for the nonresponse bias analysis includes all respondents to the mini survey.

While none of the data collection experiments produced a statistically significant increase in response rates or résumé submission rates3, the field test results are suggestive of positive effects. Given that the interventions tested in the field test show no indication of negative effects4, that technical review panel members and the literature support these adjustments5, and that they are low-cost and easy to implement, the recommendations for the full-scale study are to:

  • tailor the contact materials and reference the sample members’ degree major(s),

  • use NCES as the primary signatory and sender of the electronic communication materials, and

  • use a sequential approach, offering the mini survey followed by the mini-PAPI option.

The questionnaire design experiment compared alternative formats for a set of questions in which respondents are asked to select items that apply to them: 1) the traditional “check all that apply” format, in which respondents check a box for each item that applies; 2) a forced-choice format that presents respondents with explicit “yes/no” options for each item in the list; and 3) a forced-choice format that presents respondents with explicit “no/yes” options for each item in the list. The results suggest higher endorsement rates and longer completion times in the forced-choice formats, indicating more cognitive processing. The forced-choice format is therefore planned for the full-scale instrument.

      b. B&B:08/18 Full-Scale Data Collection Design

The data collection design proposed for the B&B:08/18 full-scale study builds upon the designs implemented in B&B:08/12, the B&B:08/18 field test, and other related studies (e.g., B&B:16/17). As part of B&B:08/12, a responsive design using a Mahalanobis distance model was employed to identify the cases most likely to introduce nonresponse bias if they did not respond (see the B&B:08/12 Data File Documentation). Nonresponse bias analyses of the B&B:08/12 experimental data demonstrated minimal nonresponse bias in the set of variables examined. Given this relatively homogeneous sample, there was no significant reduction in the average nonresponse bias across variables when cases more likely to induce bias were targeted to receive special attention (e.g., an incentive boost, a prepaid incentive, an abbreviated interview). Similar outcomes were observed in the B&B:16/17 field test, where analyses revealed that, after 20 days of data collection, the mean Mahalanobis distance value decreased only marginally, suggesting that adding more sample members to the respondent pool did not necessarily contribute to a more representative sample (discussed in detail in the B&B:16/17 Appendix C, OMB# 1850-0926 v.3). Like the B&B:08 cohort, the B&B:16 cohort has a relatively low potential for nonresponse bias because of its homogeneity in the variables used to examine nonresponse bias.
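
For illustration, the following sketch shows the kind of Mahalanobis distance screening such a responsive design relies on: each open case’s distance from the current respondent centroid on frame covariates flags the cases most unlike existing respondents. The covariates, the targeting quantile, and the random example data are assumptions, not the actual B&B:08/12 model.

```python
# Sketch of the Mahalanobis distance screening such a responsive
# design relies on: each open case's distance from the respondent
# centroid on frame covariates flags cases most unlike current
# respondents. Covariates, targeting rule, and data are assumptions.
import numpy as np

def distance_from_respondents(X_resp: np.ndarray,
                              X_open: np.ndarray) -> np.ndarray:
    """Mahalanobis distance of each open case from the respondent mean."""
    mu = X_resp.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X_resp, rowvar=False))
    diff = X_open - mu
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

rng = np.random.default_rng(0)
respondents = rng.normal(size=(500, 3))  # e.g., age, sector, enrollment
open_cases = rng.normal(size=(50, 3))
d = distance_from_respondents(respondents, open_cases)
# Target the most atypical quarter of open cases for special attention.
targeted = open_cases[d > np.quantile(d, 0.75)]
```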

However, this homogeneity assumption may no longer hold six years after the last data collection, especially for variables that cannot be measured for survey nonrespondents (like those related to employment). A primary goal of the full-scale design is to minimize any potential nonresponse bias that could be introduced into B&B:08/18, especially bias that could be due to lower response among prior-round nonrespondents. Another important goal is to reduce the time and cost of data collection efforts.

To accomplish these goals, the plan is to achieve at least a 75 percent response rate. Doing so will minimize potential nonresponse bias, optimize statistical power, and enable subgroup analyses. B&B:08/18 staff plan to divide the sample into two groups and implement differential treatments based on prior round response status. A similar approach was successfully implemented in the B&B:16/17 field test, where prior round nonrespondents received either an aggressive or a default protocol. The response rate among prior nonrespondents who received the aggressive protocol was about 12 percentage points higher than among those who received the default protocol (aggressive: 37%; default: 25%; t(2,097) = 3.52, p < .001). For the B&B:08/18 full-scale design, the following groupings will be used:

  • B&B:08/09 and B&B:08/12 interview respondent: All sample members who responded to both B&B:08/09 and B&B:08/12, the double respondents, will receive the default data collection protocol (N=13,490).

  • B&B:08/09 and/or B&B:08/12 interview nonrespondent: Sample members who failed to respond to either of the prior two follow-up studies (B&B:08/09 and B&B:08/12), the ever nonrespondents, will receive the aggressive data collection protocol (N=3,550).

Table 6 presents the type and timing of interventions to be used in the full-scale study for both the default and aggressive protocol groups. To begin, both groups will be offered a $2 prepaid incentive. The baseline incentive for completing the survey will be $30 for sample members in the default protocol group and $50 for those in the aggressive protocol group. Beyond the baseline incentive, both data collection protocols employ the same interventions, although their timing differs across groups: protocol switches in the aggressive protocol occur sooner, relative to making additional efforts under the same protocol (e.g., Peytchev et al. 2009). Each intervention is described in more detail below.

Table 6. B&B:08/18 full-scale data collection protocols by data collection phase

Intervention                             | Default (Double Respondents), N=13,490                                   | Aggressive (Ever Nonrespondent), N=3,550
Announcement with $2 prepaid incentive   | Week 1; baseline incentive $30                                           | Week 1; baseline incentive $50
Start CATI contacting                    | Week 2 for prior CATI completers, Week 10 for others                     | Week 2 for prior CATI completers, Week 6 for others
Infographic mailing                      | Week 14                                                                  | Week 12
2-week flash incentive – additional $5   | Week 26                                                                  | Week 15
Offer abbreviated survey                 | Week 28                                                                  | Week 19
Incentive boost – additional $10         | Week 30                                                                  | Week 23
Offer mini survey                        | Week 33                                                                  | Week 28
Offer mini survey (PAPI)                 | If data collection is extended                                           | Week 33
Change sponsorship                       | Will be assessed periodically and implemented if the projected response rate is below 75% (both groups)
Potential extension of data collection   | Will be assessed periodically and implemented if the projected response rate is below 75% (both groups)
Maximum incentive                        | $30 + $2 (prepaid) + $5 (flash) + $10 (boost) + $5 (résumé upload) = $52 | $50 + $2 (prepaid) + $5 (flash) + $10 (boost) + $5 (résumé upload) = $72

NOTE: Data collection is scheduled for July 12, 2018 through March 25, 2019, approximately 37 weeks.
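
The protocol-by-week schedule in Table 6 lends itself to a simple lookup structure for monitoring which interventions are due in a given week. The sketch below encodes the week numbers from the table; the data structure and function are illustrative assumptions, not project code.

```python
# The Table 6 schedule encoded as a lookup keyed by protocol group.
# Week numbers come from the table; the structure itself is an
# illustrative assumption, not project code.
SCHEDULE = {
    "default": {      # double respondents, $30 baseline incentive
        "announcement_prepaid": 1,
        "cati_outbound_others": 10,  # week 2 for prior CATI completers
        "infographic": 14,
        "flash_incentive": 26,
        "abbreviated_survey": 28,
        "incentive_boost": 30,
        "mini_survey": 33,
        "mini_papi": None,           # only if data collection is extended
    },
    "aggressive": {   # ever nonrespondents, $50 baseline incentive
        "announcement_prepaid": 1,
        "cati_outbound_others": 6,
        "infographic": 12,
        "flash_incentive": 15,
        "abbreviated_survey": 19,
        "incentive_boost": 23,
        "mini_survey": 28,
        "mini_papi": 33,
    },
}

def interventions_starting(group: str, week: int) -> list:
    """Interventions scheduled to begin in the given week for a group."""
    return [name for name, wk in SCHEDULE[group].items() if wk == week]

print(interventions_starting("aggressive", 19))  # ['abbreviated_survey']
```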

Prepaid incentive. Prepaid, or unconditional, cash incentives have been shown to significantly increase response rates in both interviewer-administered and self-administered surveys, and hence to reduce the potential for nonresponse bias (e.g., Church 1993; Cantor et al. 2008; Goeritz 2006; Medway and Tourangeau 2015; Messer and Dillman 2011; Parsons and Manierre 2014; Singer 2002). Medway and Tourangeau (2015) show that prepaid cash incentives not only significantly increase response rates in telephone surveys but also decrease refusal rates and increase contact rates.

During the early completion phase of the B&B:16/17 field test, prepaid incentives ($10 via check or PayPal), in combination with telephone prompting, significantly increased response rates by 4.4 percentage points in the aggressive protocol. Given these positive findings and general recommendations in the literature (Singer and Ye 2013), B&B:08/18 staff will offer a small prepaid incentive of $2 to each sample member in the B&B:08/18 full-scale study. This amount has been shown to increase response rates at more efficient field costs than higher or lower prepaid incentives (e.g., Beebe et al. 2005; Millar and Dillman 2011; Tourangeau et al. 2013, p. 48).

For all sample members with good contact information (e.g., respondents to the panel maintenance, successful tracing), a $2 bill will be included in the data collection announcement.6 All sample members for whom no good contact information exists at the start of data collection will receive a $2 prepaid incentive via PayPal to their best known e-mail address (40% of all respondents in the B&B:08/18 field test, and 47% in the B&B:16/17 full-scale, chose to receive their incentive via PayPal). PayPal was successfully used for prepaid incentives in BPS:12/17 and B&B:16/17. Once B&B:08/18 staff obtain good contact information for a sample member, a $2 cash incentive will be mailed if the sample member has not yet retrieved the $2 PayPal offer and completed the survey (similar to the B&B:08/12 full-scale responsive design experiment). All data collection announcements related to interventions will be designed to stand out (for example, they will be mailed using Priority Mail or FedEx, as suggested by DeBell et al. 2017; Dillman et al. 2014).

Baseline incentive. Sample members in the default group will receive a $30 baseline incentive and sample members in the aggressive group will be offered a $50 baseline incentive. This matches the amount offered to equivalent groups in previous data collections with the B&B:08 cohort as well as the B&B:16 cohort.

Mode tailoring and CATI outbound calling. Leverage-saliency theory and social exchange theory suggest that offering a person their preferred mode, e.g., telephone or web, increases their likelihood of participating (Dillman et al. 2014; Groves et al. 2000). This is supported by empirical evidence that offering people their preferred mode speeds up their response and is associated with higher participation rates (e.g., Olson et al. 2012). Outbound CATI calling will begin earlier (at the end of week two) for sample members who completed the prior round by phone. Those who completed the prior round interview online will not be contacted by telephone before the preassigned outbound data collection date (default protocol, week 10; aggressive protocol, week 6). Attempting to interview sample members in the aggressive group by telephone sooner also increases the likelihood of initiating locating efforts sooner. For example, B&B:16/17 field test results showed higher locate rates for a data collection protocol with outbound CATI calling (93.7%) than for one without (77.8%; χ2 = 63.2, p < .001).

Infographic. Groves et al. (1992) argue that informational letters gain higher cooperation rates than providing no such materials. Ongoing panel surveys often provide general feedback on study results to respondents (e.g., Blom et al. 2015; Scherpenzeel and Toepoel 2014). While results in the U.S. are mixed, several studies suggest a positive effect of personalized feedback on response rates (e.g., Baelter et al. 2011; Marcus et al. 2007). To communicate the importance, relevance, and legitimacy of the B&B:08/18 full-scale study, B&B:08/18 staff plan to send an infographic as part of a reminder contact by mail and by e-mail with a link to the infographic on the study website (default group, week 14; aggressive group, week 12). This infographic will contain a one-page visual representation of results from prior B&B:08 studies and stress the importance of taking part in the study. See appendix E for the infographic text and other communication materials to be used in the full-scale study.

Flash incentive. Early bird incentives have been shown to lead to faster responses and increased participation rates within the early bird incentive period (e.g., Coppersmith et al. 2016; LeClere et al. 2012), and can provide efficiencies by reducing both data collection costs and time. Given these positive effects, the B&B:08/18 full-scale study will apply a variation on the early bird idea to speed up survey completion: a flash incentive. All sample members will be offered an additional $5 if they complete the survey within the next two weeks, a ‘flash’ request rather than an early bird incentive. The default group will receive this request in week 26 and the aggressive group in week 15. This timing ensures that the flash intervention occurs toward the end of the full survey collection phase, just before the offer to complete the abbreviated survey is mailed, and that most sample members have been located by then.

Abbreviated survey. Obtaining responses from all sample members is important for sample representativeness (e.g., Kreuter et al. 2010). Several postsecondary studies conducted by NCES have offered reluctant sample members an abbreviated interview to decrease response burden, increase response rates, and decrease nonresponse bias. For example, during the B&B:16/17 field test, sample members who were prior round nonrespondents and who were offered the abbreviated interview during the production phase responded at higher rates (22.7%) than those who were not offered the abbreviated interview at the same time (12.1%; t(2,097) = 3.67, p < .001). The B&B:08/12 full-scale study showed similar results: offering the abbreviated interview to prior round nonrespondents increased the overall response rate of that group by 18.2 percentage points (B&B:08/12 full-scale DFD, p. 33). An abbreviated interview option will be offered to all sample members in the B&B:08/18 full-scale study, in week 28 for the default group and week 19 for the aggressive group. The abbreviated survey, shown in Appendix F, contains a set of critical items and collects information about employment history.

Incentive boost. Incentive boosts for sample members who refused, or a subsample thereof, are a common element of nonresponse conversion (e.g., Groves and Heeringa 2006; Singer and Ye 2013). These boosts are especially common in large federal surveys during their nonresponse follow-up phase (e.g., the National Survey of Family Growth) and have been implemented successfully in other postsecondary education surveys (e.g., HSLS:09 F2; BPS:12/17). For nonresponse conversion, a $10 boost to the B&B:08/18 baseline incentive is planned for all remaining nonrespondents. In the aggressive protocol group, the boost will be offered beginning in week 23 of data collection, approximately four weeks after the abbreviated survey offer and about five weeks before the mini survey offer. In the default group, it will be offered beginning in week 30, allowing two weeks after the abbreviated survey offer and three weeks before implementation of the mini survey.

Mini survey. The shorter the stated length of a survey, the lower the perceived burden for the respondent. Motivated by this, B&B:08/18 staff will offer an extremely abbreviated questionnaire, a mini survey, to all nonrespondents late in the data collection period (default group, week 33; aggressive group, week 28). The mini survey, presented in Appendix F, contains only the most critical survey items. Again, obtaining information on these nonrespondents is crucial to assessing and increasing sample representativeness (e.g., Kreuter et al. 2010). Because the mini survey is even shorter, and thus less burdensome, than the traditional abbreviated interview, it is expected to further increase response rates.

Mini-PAPI. In addition to offering shorter interviews, offering multiple survey completion modes may improve response rates and representativeness (e.g., Dillman et al. 2014; Shettle and Mooney 1999). Switching from web to mail, for example, has been shown to increase response rates by 5 to 19 percentage points and to increase representativeness by bringing in different types of respondents (e.g., Messer and Dillman 2011; Millar and Dillman 2011).7 In a non-experimental design, Biemer et al. (2016) show that a mode switch from web to mail improved response rates by up to 20 percentage points. The B&B:08/18 field test showed no significant difference in response rates between those offered the mini-PAPI option and those who received the mini-standard option, though about 15 percent of mini survey completions were submitted via PAPI, indicating that some portion of the B&B sample is receptive to responding on paper.

Résumé collection. As was done in the B&B:08/18 field test, all survey respondents will be asked to upload their résumé at the end of the interview. From B&B:16/17, B&B:08/18 staff learned that some respondents upload their résumés even when no additional incentive is provided (at the time of writing, the submission rate among B&B:16/17 full-scale respondents is approximately 11%). For this reason, and to preserve project resources for other primary data collection interventions, B&B:08/18 staff propose decreasing the $10 offered for a résumé upload during the B&B:08/18 field test to $5 in the full-scale study, to thank survey respondents for the additional time and effort required by the upload process.

Additional interventions. B&B:08/18 staff will continuously monitor data collection measures to achieve the targeted response rate of 75%. For example, time series modeling may be used to project the response rate at the end of data collection, overall and for key analytic subgroups; a sketch of such a projection follows the list below. In combination with a comparison of the weekly performance of the B&B:08/18 full-scale sample to that of the B&B:08/12 full-scale, NCES will determine whether and when either of the interventions below should be implemented:

  • Change sponsorship. While B&B is conducted by NCES, within the U.S. Department of Education (ED), RTI has typically used a study-specific e-mail address (“@rti.org”) and telephone number to contact and support sample members. For the B&B:08/18 field test, B&B:08/18 staff investigated the effect of sending reminder e-mails from an “@ed.gov” e-mail address. Compared to sending e-mails from “@rti.org,” the field test showed that sending reminders from NCES had positive effects on sample representativeness and data collection efficiency. For the full-scale study, B&B:08/18 staff will e-mail the contact materials from “@ed.gov.” If, however, the models suggest that the targeted response rates for the B&B:08/18 full-scale will not be met, staff will send e-mail reminders from the NCES project officer, the RTI project officer, or the standard e-mail address (i.e., [email protected]). Changing the sender of the e-mail to the project officers may increase the perceived importance of the survey and help personalize the contact materials, thereby potentially increasing relevance. Switching the sender during data collection also increases the chance that the survey invitation is delivered to the sample member rather than to a spam filter.

  • Extension of data collection. Should it become clear from our monitoring activities that the targeted response rate will not be met by the planned end date of data collection, data collection will be extended for about four weeks. During this extended data collection phase, the mini-PAPI survey will be offered to all remaining nonrespondents in the default protocol, and the mini-PAPI survey offer will be continued for the aggressive protocol group.
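
As a hedged illustration of the response rate projection mentioned above, the sketch below fits a saturating-growth curve to cumulative weekly response rates and extrapolates to the scheduled end of data collection (week 37). The exponential functional form and the example weekly figures are assumptions, not study data.

```python
# Hedged sketch of a response rate projection: fit a saturating-growth
# curve to cumulative weekly response and extrapolate to the scheduled
# end of data collection (week 37). The exponential form and the
# example weekly figures are assumptions, not study data.
import numpy as np
from scipy.optimize import curve_fit

def saturating(week, ceiling, rate):
    """Cumulative response rate approaching `ceiling` at `rate`."""
    return ceiling * (1.0 - np.exp(-rate * week))

weeks = np.arange(1, 13)                          # weeks observed so far
observed = np.array([8, 15, 22, 28, 33, 37, 41,   # cumulative response, %
                     44, 47, 49, 51, 53], dtype=float)

(ceiling, rate), _ = curve_fit(saturating, weeks, observed, p0=[75.0, 0.1])
projected = saturating(37, ceiling, rate)         # week 37 = scheduled end
print(f"Projected final response rate: {projected:.1f}%")
# A projection below the 75% target would trigger consideration of the
# interventions above (sponsorship change, extended data collection).
```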

    5. Reviewing Statisticians and Individuals Responsible for Designing and Conducting the Study

The study is being conducted by NCES/ED. The following statisticians at NCES are responsible for the statistical aspects of the study: Mr. Ted Socha, Dr. Tracy Hunt-White, Dr. David Richards, Dr. Sean Simone, Dr. Elise Christopher, and Dr. Gail Mulligan. NCES’ prime contractor for B&B:08/18 is RTI International (RTI). The following staff members at RTI are working on the statistical aspects of the study design: Dr. Jennifer Wine, Dr. Melissa Cominole, Ms. Jennifer Cooney, Mr. Jeff Franklin, Dr. Antje Kirchner, Dr. T. Austin Lacy, Dr. Emilia Peytcheva, Mr. Peter Siegel, and Ms. Ashley Wilson.

Subcontractors include Coffey Consulting; HR Directions; Shugoll Research; and Strategic Communications, Inc. Consultants are Dr. Sandy Baum, Ms. Alisa Cunningham, and Dr. Stephen Porter. Principal professional RTI staff, not listed above, who are assigned to the study include Ms. Donna Anderson and Ms. Chris Rasmussen.


  References

Avdeyeva, O.A., and Matland, R.E. 2013. An Experimental Test of Mail Surveys as a Tool for Social Inquiry in Russia. International Journal of Public Opinion Research, 25(2), 173–194.

Baelter, O., Fondell, E., and Baelter, K. 2011. Feedback in Web-Based Questionnaires as Incentive to Increase Compliance in Studies on Lifestyle Factors. Public Health Nutrition, 15, 982-988.

Beebe, T.J., Davern, M.E., McAlpine, D.D., Call, K.T., and Rockwood, T.H. 2005. Increasing Response Rates in a Survey of Medicaid Enrollees: The Effect of a Prepaid Monetary Incentive and Mixed Modes (Mail and Telephone). Medical Care, 43, 411-420.

Biemer, P., Murphy, J., Zimmer, S., Berry, C., Deng, G., and Lewis, K. 2016. A Test of Web/PAPI Protocols and Incentives for the Residential Energy Consumption Survey. Paper presented at the 2016 Annual Conference of the American Association for Public Opinion Research, Austin, TX.

Blom, A., Gathman, C., and Krieger, U. 2015. Setting up an Online Panel Representative of the General Population. The German Internet Panel. Field Methods, 27, 391-408.

Cantor, D., O’Hare, B.C., and O’Connor, K.S. 2008. The Use of Monetary Incentives to Reduce Nonresponse in Random Digit Dial Telephone Surveys. In Lepkowski, J.M., Tucker, N.C., Brick, J.M., de Leeuw, E., Japec, L., Lavrakas, P.J., Link, M.W., and Sangster, R.L. (eds.), Advances in Telephone Survey Methodology. New York: Wiley.

Church, A.H. 1993. Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis. Public Opinion Quarterly, 57(1), 62-79.

Coppersmith, J., Vogel, L.K., Bruursema, T., and Feeney, K. 2016. Effects of Incentive Amount and Type on Web Survey Response Rates. Survey Practice.

DeBell, M., Maisel, N., Edwards, B., Amsbary, M., and Meldener, V. 2017. Improving General Population Survey Response Rates with Visible Money. Paper presented at the 2017 Annual Conference of the American Association for Public Opinion Research, New Orleans, LA.

Dillman, D.A., Smyth, J.D., and Christian, L.M. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method 4th Edition. Hoboken, NJ: John Wiley & Sons.

Edwards, M.L., Dillman, D.A., and Smyth, J.D. 2014. An Experimental Test of the Effects of Survey Sponsorship on Internet and Mail Survey Response. Public Opinion Quarterly, 78(3), 734-750.

Goeritz, A.S. 2006. Incentives in web studies: Methodological issues and review. International Journal of Internet Science, 1, 58-70.

Groves, R.M., Cialdini, R., and Couper, M. 1992. Understanding the Decision to Participate in a Survey. Public Opinion Quarterly, 56(4), 475-495.

Groves, R.M., and Heeringa, S.G. 2006. Responsive Design for Household Surveys: Tools for Actively Controlling Survey Errors and Costs. Journal of the Royal Statistical Society: Series A (Statistics in Society), 169(3), 439-457.

Groves, R.M., Singer, E. and Corning, A. 2000. Leverage-Saliency Theory of Survey Participation. Description and Illustration. Public Opinion Quarterly, 64, 299-308.

Groves, R.M., Presser, S., Tourangeau, R., West, B.T., Couper, M.P., Singer, E., and Toppe, C. 2012. Support for the Survey Sponsor and Nonresponse Bias. Public Opinion Quarterly, 76, 512-24.

Kreuter, F., Olson, K., Wagner, J., Yan, T., Ezzati-Rice, T.M., Casas-Cordero, C., Lemay, M., Peytchev, A., Groves, R.M., and Raghunathan, T.E. 2010. Using Proxy Measures and Other Correlates of Survey Outcomes to Adjust for Nonresponse: Examples from Multiple Surveys. Journal of the Royal Statistical Society: Series A, 173(2), 389-407.

Lynn, P. 2016. Targeted Appeals for Participation in Letters to Panel Survey Members. Public Opinion Quarterly, 80(3), 771-782.

LeClere, F., Plummer, S., Vanicek, J., Amaya, A., and Carris, K. 2012. Household Early Bird Incentives: Leveraging Family Influence to Improve Household Response Rates. American Statistical Association Joint Statistical Meetings, Section on Survey Research.

Marcus, B., Bosnjak, M., Lindner, S., Pilischenko, S., and Schuetz, A. 2007. Compensating for Low Topic Interest and Long Surveys: A Field Experiment on Nonresponse in Web Surveys. Social Science Computer Review, 25, 372-383.

Medway, R.L. and Tourangeau, R. 2015. Response Quality in Telephone Surveys. Do Prepaid Incentives Make a Difference? Public Opinion Quarterly, 79(2), 524-543.

Messer, B.L., and Dillman, D.A. 2011. Surveying the General Public Over the Internet Using Address-Based Sampling and Mail Contact Procedures. Public Opinion Quarterly, 75(3), 429–457.

Millar, M.M., and Dillman, D.A. 2011. Improving Response to Web and Mixed-Mode Surveys. Public Opinion Quarterly, 75(2), 249–269.

Olson, K., Smyth, J. D., and Wood, H. 2012. Does Giving People Their Preferred Survey Mode Actually Increase Survey Participation? An Experimental Examination. Public Opinion Quarterly, 76, 611–635.

Parsons, L., and Manierre, M.J. 2014. Investigating the Relationship among Prepaid Token Incentives, Response Rates, and Nonresponse Bias in a Web Survey. Field Methods, 26(2), 191-204.

Peytchev, A., Baxter, R., and Carley-Baxter, L.R. 2009. Not all Survey Effort is Equal. Reduction of Nonresponse Bias and Nonresponse Error. Public Opinion Quarterly, 73(4), 785-806.

Scherpenzeel, A., and Toepoel, V. 2014. Informing Panel Members About Study Results: Effects of Traditional and Innovative Forms of Feedback on Participation. In Callegaro, M., Baker, R., Bethlehem, J., Goeritz, A.S., Krosnick, J.A., and Lavrakas, P.J. (eds.), Online Panel Research: A Data Quality Perspective. New York: Wiley.

Shettle, C. and Mooney, G. 1999. Monetary Incentives in US Government Surveys. Journal of Official Statistics, 15(2), 231-250.

Singer, E. 2002. The Use of Incentives to Reduce Nonresponse in Household Surveys. In Groves, R.M., Dillman, D. A., Eltinge, J.L., Little, R.J.A. (eds.), Survey Nonresponse. New York: Wiley.

Singer, E., and Ye, C. 2013. The Use and Effects of Incentives in Surveys. Annals of the American Academy of Political and Social Science, 645(1), 112-141.

Tourangeau, R., Conrad, F.G., and Couper, M. 2013. The Science of Web Surveys. Oxford, NY: Oxford University Press.

Tourangeau, R., Groves, R.M., and Redline, C.D. 2010. Sensitive Topics and Reluctant Respondents: Demonstrating a Link Between Nonresponse Bias and Measurement Error. Public Opinion Quarterly, 74(3), 413-432.

1 Dates are shown to reflect the general schedule, but are subject to change as needed.

2 Excluding the respondents in the mini-PAPI version who complete the survey via mail (n=30).

3 With the exception of the lower rate of résumé submission among mini-PAPI respondents.

4 With the exception of the lower rate of résumé submission among mini-PAPI respondents.

5 For tailoring, see Lynn 2016 and Tourangeau et al. 2010. For sponsorship, see Avdeyeva and Matland 2013; Edwards et al. 2014; and Groves et al. 2012. For mini-PAPI, see: Biemer et al. 2016; Galesic and Bosnjak 2009; and Messer and Dillman 2011.

6 A $2 bill is also considered unique and should therefore pique the sample member’s interest more than other cash incentives.

7 It is worth noting that these studies lack a true control condition; i.e., no web condition was followed up in web only. The mode switch was from web to mail, or vice versa; hence, the number of contact attempts and the mode switch are potentially confounded. B&B:08/18 staff will extrapolate what the response rates would have been had the mini survey not been offered and compare the pure mini survey effect to that simulation. This allows testing the mini survey effect separately from the mode switch effect.
