
2008/18 Baccalaureate and Beyond (B&B:08/18) Full-Scale

Changes Memo

OMB: 1850-0729


DATE: February 28, 2018

TO: Robert Sivinski, OMB

THROUGH: Kashka Kubzdela, NCES

FROM: Tracy Hunt-White and Ted Socha, NCES

Re: 2008/18 Baccalaureate and Beyond (B&B:08/18) Full-Scale (OMB # 1850-0729 v. 13)



The 2008/18 Baccalaureate and Beyond Longitudinal Study (B&B:08/18), conducted by the National Center for Education Statistics (NCES), is designed to follow a cohort of students who completed the requirements for their bachelor’s degree during the same academic year. The request to conduct the B&B:08/18 field test was last approved in May 2017 (OMB# 1850-0729 v. 11-12), and we are now requesting approval to conduct the full-scale data collection. The B&B:08/18 full-scale materials and procedures provided in this submission are based on those approved for the field test, with a number of clarifications and revisions based on our experiences and experiment results from the field test, winter 2017 cognitive testing results, and Technical Review Panel feedback. This memo summarizes the revisions made to the originally approved package to reflect the finalized full-scale plans.

For each revised section of the package, we note in this memo the substantive differences and provide the rationale for the changes. Note that additions/revisions are marked in a green font and deletions in red.

Supporting Statement Part A

Changes in Part A include the removal of reference to field test plans and the inclusion of updated plans for the full-scale study. Also, editorial revisions were made to improve clarity.

A.1.a. Purpose of this Submission (pages A-1 and A-2)

This section has been revised to clarify the goals of the study design and to provide a summary of the full-scale data collection design and administrative file matching plans.

Here is the new text on full-scale data collection plans for this section:

B&B:08/18 full-scale data collection plans have been developed based on lessons learned from the B&B:08/18 field test and from earlier waves of the B&B:08 cohort, as well as from other related NCES studies, such as NPSAS and BPS. The primary goals of the full-scale design are to minimize nonresponse bias and to reduce the time and cost of data collection. Further, to optimize statistical power and to enable subgroup analysis, a response rate of at least 75 percent is targeted.

The differences between the study designs approved for the B&B:08/18 field test and those planned for the B&B:08/18 full-scale are minimal and build upon information gained in earlier collections. Most of the data collection strategies to be employed in B&B:08/18 were approved by OMB for prior B&B or related studies1 and include:

  • a prepaid incentive of $2 paid by cash or PayPal;

  • a promised baseline incentive of $30 for prior round respondents and $50 for prior round nonrespondents;

  • a “flash” incentive of an additional $5 for completing the survey within a specified 2-week period;

  • an abbreviated interview of about 15 minutes; and

  • a mini survey of about 5 minutes with a paper-and-pencil version (PAPI).

Résumés were collected from B&B:08 sample members for the first time during the field test for a $10 incentive. Sample members in the full-scale study will also be asked to upload their résumé to a secure NCES server for a $5 incentive. The full-scale incentive offer for résumés has been reduced from the field test amount in order to preserve project resources for primary data collection activities. The full-scale data collection design also includes options to be implemented if needed to achieve the targeted response rate of 75 percent, including an incentive boost and an extension to the data collection period.

A.2.c. Study Design for B&B:08/18 (page A-6)

This section has been revised to reflect the full-scale data collection design and administrative file matching plans, including plans to match with the Veterans Benefits Administration for information on veterans’ federal education benefit amounts. Descriptions of the field test design (e.g., incentive plans, experiments, résumé and LinkedIn requests) have been removed.

A.3. Use of Information Technology (page A-6)

This section has been revised to include information about survey type and completion mode results from the field test, and to provide additional detail about the utility of résumé data.

A.9. Provision of Payments or Gifts to Respondents (pages A-8 and A-9)

This section has been revised to reflect the final incentive plans for full-scale data collection and includes a description of planned processes for matching to a federal database maintained by the U.S. Department of the Treasury’s Office of Foreign Assets Control (OFAC) that will be used to identify sample members who cannot be offered an incentive.

The full-scale incentive plan is pasted below for reference:

  1. Provision of Payments or Gifts to Respondents

The use of incentives for completion of the interview can provide significant advantages to the government in terms of increased response rates and higher quality data with minimal nonresponse bias. In addition, the use of incentives may also result in decreased data collection costs due to improved efficiency. Therefore, all eligible cases in the B&B:08/18 full-scale will be offered a monetary incentive for completing the B&B interview. Table 2 below describes the various incentives included in the data collection plan. More information regarding the timing and distribution of the incentives by respondent group is provided in the Supporting Statement Part B of this submission.

Table 2. Incentive types and amounts included in the B&B:08/18 data collection plan

| Incentive type | Description | Amount | Payment method |
|---|---|---|---|
| Prepaid incentive | Included with data collection announcement | $2 | Cash or PayPal |
| Baseline incentive | Paid upon survey completion | $30 (double respondents); $50 (prior nonrespondents) | Check or PayPal |
| Flash incentive | Added to baseline incentive if survey is completed within 2 weeks of notification; paid upon survey completion | $5 | Check or PayPal |
| Incentive boost | Added to baseline incentive; paid upon survey completion | $10 | Check or PayPal |
| Résumé incentive | Paid upon upload of résumé | $5 | Check or PayPal |

With the exception of the $2 prepaid incentive (which will be paid in cash to sample members with a good mailing address, and by PayPal, an online money transfer service, to those without a good mailing address but with a good e-mail address), all other incentives for the B&B:08/18 full-scale will be paid by the sample member’s choice of check or PayPal. For reference, about 39 percent of respondents to the B&B:08/18 field test opted to receive promised incentives via PayPal.

Prior to the start of data collection, B&B:08/18 sample members will be matched to a federal database maintained by the U.S. Department of the Treasury’s Office of Foreign Assets Control (OFAC). OFAC administers and enforces economic and trade sanctions based on U.S. foreign policy and national security goals. As part of its enforcement efforts, OFAC publishes a list of individuals and companies called the “Specially Designated Nationals List,” or “SDN List.” The assets of those listed are blocked, and U.S. entities are prohibited from conducting trade or financial transactions with them (https://www.treasury.gov/resource-center/sanctions/Pages/default.aspx). To determine whether there are any B&B:08/18 sample members to whom NCES cannot offer an incentive, the sample members will be matched to the SDN List using the Jaro-Winkler and Soundex algorithms recommended by OFAC. To avoid over-matching, B&B staff will review the cases based on full name, date of birth, and address. The small number of individuals whom NCES anticipates it will be unable to confirm as non-matches to the SDN List will receive a survey request without an incentive offer.
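The package names Jaro-Winkler and Soundex as the OFAC-recommended matching algorithms but does not describe them. As a rough illustration of the phonetic half of that pairing, here is a minimal American Soundex sketch in Python; the actual matching pipeline, score thresholds, and review procedures are not specified in this package, and the function name is illustrative only.

```python
def soundex(name: str) -> str:
    """Minimal American Soundex: first letter plus three digits.

    Sketch for illustration only -- the production SDN match would also
    apply Jaro-Winkler string distance and manual review of candidates.
    """
    name = name.upper()
    mapping = {}
    for letters, digit in [("BFPV", "1"), ("CGJKQSXZ", "2"), ("DT", "3"),
                           ("L", "4"), ("MN", "5"), ("R", "6")]:
        for ch in letters:
            mapping[ch] = digit
    first = name[0]
    prev = mapping.get(first, "")   # the first letter's code counts as "previous"
    digits = []
    for ch in name[1:]:
        if ch in "HW":              # H and W are transparent: the prior code carries over
            continue
        code = mapping.get(ch, "")
        if code and code != prev:   # skip adjacent repeats of the same code
            digits.append(code)
        prev = code                 # vowels (empty code) reset the repeat check
    return (first + "".join(digits))[:4].ljust(4, "0")

# Phonetically similar names share a code, which is what makes the
# over-matching review step described above necessary.
assert soundex("Robert") == soundex("Rupert") == "R163"
```

Because codes like these deliberately collapse spelling variation, every candidate match still requires the human review on full name, date of birth, and address described above.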

A.10. Assurance of Confidentiality (pages A-9 to A-11)

This section was updated to include the security measures that will be used in the full-scale data collection to protect the web survey from unauthorized access. The original reference to LinkedIn was removed because LinkedIn will not be used in the full-scale study, and a reference to VBA was added. In addition, we removed a bullet that provided an explanation for why a waiver of consent was not needed for administrative data linking under the Family Educational Rights and Privacy Act (FERPA) (34 CFR Part 99), because we felt the bullet was unnecessary and the prior bullets make more important points regarding this waiver.

The following bullet was removed:

The potential knowledge from the study is important enough to justify the waiver. These linked data for B&B:08/18 will provide invaluable data to researchers and education policy makers about the federal financial aid that students have received related to their persistence in and graduation from postsecondary education. Rather than relying on students for information about financial aid, NCES is obtaining it from the NSLDS, which is the Department’s system of recording federally aided student loans taken out and grants received by students. Students generally tend not to be a very reliable source of information about the amounts or timing of grants and loans they have received. The administrative record data are accurate and much easier to obtain than collecting the same data by administering a questionnaire.

A.11. Sensitive Items (pages A-11 and A-12)

This section has been updated to include reference to the sexual orientation and gender identity survey item set. While these items were included in the field test, we added information under this section acknowledging the sensitivity of the items.

This statement was added:

The addition of gender identity and sexual orientation questions allows for additional demographic covariates in analyses of education and employment outcomes. Several procedures have been implemented (see section A.10) to provide assurances to sample members about the voluntary nature of their participation in the study as well as the confidentiality provisions for survey responses.

A.12. Estimates of Response Burden (page A-12)

This section has been revised to remove burden associated with data collection activities that were approved in the field test package. The full-scale burden estimates have been updated based on final data collection plans regarding full, abbreviated, and mini interviews as well as résumé collection.

The full-scale burden estimate is pasted below for reference.

Table 3. Maximum estimated burden to respondents in B&B:08/18

| Data collection activity | Sample | Expected response rate (percent) | Expected number of respondents | Expected number of responses | Average time burden per response (minutes) | Total time burden (hours) |
|---|---|---|---|---|---|---|
| Address update | 17,040 | 15 | 2,556* | 2,556 | 3 | 128 |
| Full interview, no résumé | 17,040 | 49 | 8,350 | 8,350 | 35 | 4,871 |
| Full interview, with résumé | 17,040 | 16 | 2,726 | 2,726 | 40 | 1,818 |
| Abbreviated interview, no résumé | 17,040 | 5 | 852 | 852 | 15 | 213 |
| Abbreviated interview, with résumé | 17,040 | 2 | 341 | 341 | 20 | 114 |
| Mini interview, no résumé | 17,040 | 2 | 341 | 341 | 5 | 29 |
| Mini interview, with résumé | 17,040 | 1 | 170 | 170 | 10 | 29 |
| Full-scale total | 17,040 | 75 | 12,780 | 15,336 | -- | 7,202 |

* Duplicative counts of individuals, not included in respondent totals.
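The row and total figures above are internally consistent if each activity’s hours are computed as respondents × minutes ÷ 60, rounded up to the whole hour. The short Python sketch below reproduces them; it is a verification aid only, and the rounding-up rule is our inference rather than something stated in the package.

```python
import math

# (activity, expected respondents, minutes per response) from Table 3
rows = [
    ("Address update",                     2556,  3),
    ("Full interview, no résumé",          8350, 35),
    ("Full interview, with résumé",        2726, 40),
    ("Abbreviated interview, no résumé",    852, 15),
    ("Abbreviated interview, with résumé",  341, 20),
    ("Mini interview, no résumé",           341,  5),
    ("Mini interview, with résumé",         170, 10),
]

# Hours per activity, rounding up to the whole hour (our assumption).
hours = {name: math.ceil(n * minutes / 60) for name, n, minutes in rows}

total_hours = sum(hours.values())                 # 7,202 hours, as in the table
respondents = sum(n for name, n, _ in rows
                  if name != "Address update")    # 12,780; address updates are
                                                  # duplicative and excluded
responses = sum(n for _, n, _ in rows)            # 15,336 total responses
```

Note that the 12,780 respondent total excludes the address-update row, matching the table’s footnote about duplicative counts.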

Supporting Statement Part B

Changes in Part B include removing references to the field test sample design and tests of procedures and methods, and updating the information pertaining to the full-scale respondent universe, sample design, and procedures and methods. In addition, field test results are summarized in this part.

B.1. Respondent Universe - B&B:08/18 Target Population (page B-1)

This section was updated to remove references to the field test, and to provide more information on the target population. Information has also been added pertaining to response rates for previous B&B follow-ups and previous cohorts, as well as response rates for recent postsecondary longitudinal studies.


The following paragraph was added:

B&B:08/18, the 10-year follow-up to its base-year study, NPSAS:08, will be conducted six years after the latest round of data collection with this cohort, B&B:08/12. In B&B:08/12, the unweighted response rate was 85 percent of the eligible sample. In the only other 10-year B&B cohort follow-up, B&B:93/03, about 86 percent of the eligible sample responded. Longitudinal studies often experience some attrition between waves, especially over lengthy intervals, when some sample members become unlocatable. Additionally, survey response rates have continued to decline over the past six years, as evidenced by response rates in other recent NCES postsecondary studies (BPS:12/17 achieved a 64 percent response rate among eligible sample members, and B&B:16/17, a month away from the end of its data collection, has a 61 percent response rate to date and is projected not to exceed 70 percent). The B&B:08/18 field test response rate was about 72 percent, though, being a field test, it did not use a random sample, and field test data collection is abbreviated in time and effort relative to full-scale data collection. Based on prior response status and planned data collection protocols, we expect a 75 percent response rate.

B.2. Statistical Methodology – B&B:08/18 Sample Design (pages B-1 and B-2)

All text pertaining to the field test sample design was removed and replaced with information on the sample design of the full-scale study, including information on the specific number of cases that will not be fielded. In addition, Table 1 was updated to remove the field test sample and to update sample size counts to reflect the entire eligible full-scale sample, whereas the field test package displayed counts of the eligible sample to be fielded. Table 2 was added to show the distribution of B&B:08/18 sample members by whether or not they were considered NPSAS:08 study members.


Below are the updated Tables 1 and 2:

Table 1. Distribution of the B&B:08/18 sample by response status for NPSAS:08, B&B:08/09, and B&B:08/12

| NPSAS:08 | B&B:08/09 | B&B:08/12 | Count |
|---|---|---|---|
| Total | | | 17,114 |
| Respondent | Respondent | Respondent | 13,490 |
| Respondent | Respondent | Nonrespondent | 1,547 |
| Respondent | Nonrespondent | Respondent | 1,069 |
| Respondent | Nonrespondent | Nonrespondent | 955 |
| Nonrespondent | Respondent | Respondent | 39 |
| Nonrespondent | Respondent | Nonrespondent | 13 |
| Nonrespondent | Nonrespondent | Respondent | 5 |
| Nonrespondent | Nonrespondent | Nonrespondent | 17 |


Table 2. Distribution of the B&B:08/18 sample by NPSAS:08 study member status

| | Count |
|---|---|
| Total | 17,114 |
| NPSAS:08 study member | 17,040 |
| NPSAS:08 non-study member1 | 74 |

1 NPSAS:08 non-study members will not be fielded in the B&B:08/18 data collection and will be considered B&B:08/18 nonrespondents.

B.3.a. Survey Instrument Design (page B-2)

This section was moved up from section 4.4 in the field test package to be included with the discussion of the study design. New text was included about specific full-scale instrument revisions to maximize response rates; the three versions of the survey (full, abbreviated, and mini); and new features introduced to improve access to the survey (a bitly link and a quick response (QR) code).

The new text for this section is as follows:

Development of the full-scale interview began with the preparation of the field test instrument. After field test data collection, cognitive interviews were conducted in fall 2017 to capture feedback on targeted sections of the field test survey. A second technical review panel (TRP) meeting was held December 2017 to review field test results and cognitive testing results, and discuss improvements to survey items for the full-scale implementation.

The B&B:08/18 full-scale survey will have three versions: the full, an abbreviated, and a mini survey. The abbreviated survey will collect information on key topics of interest for B&B, including additional education, current employment status, K-12 teaching status, and marital status. In addition to these key items, an updated employment history since the last follow-up will be collected. The mini survey will collect the same set of key items as the abbreviated, but will not collect detailed information about employment history. A facsimile of the survey instruments that will be used in the B&B:08/18 full-scale data collection is provided in appendix F.

The B&B:08/18 interview will be available to sample members through a web-based instrument that can be used for self-administration on mobile and non-mobile devices, and Computer-Assisted Telephone Interviews (CATI). When the mini survey becomes available, respondents will have the option of completing the survey via a paper-and-pencil (PAPI) questionnaire. Completed paper surveys from mini survey respondents will be returned using an addressed, postage paid envelope enclosed in the package sent to sample members, and survey responses will be scanned and keyed by project staff. The response data from PAPI surveys will be stored in the same manner as web and CATI survey data.

Other methods will be implemented to increase response rates and decrease the potential for nonresponse bias, such as simplifying the login process. As an example, quick response (QR) codes will be included in recruitment mailings to provide automated access to the survey. Through the camera application available on most smartphones and tablets, sample members can scan the QR code and access the study website without having to type in a URL. When communicating with sample members through e-mail and text messages, abbreviated links will enable automated access to the survey.

B.3.b. Tracing of Sample Members (pages B-2 and B-3)

This section was revised to improve clarity. A bullet specific to contacting materials was moved to a separate section and corrected to indicate that contacts will not be made with parents of sample members.


This text was removed:

Data collection mailings and emails will be used to maintain persistent contact with sample members as needed, prior to and throughout data collection. Initial contact letters will be sent to parents and sample members prior to the start of data collection once OMB clearance is received. The letter will remind sample members of their inclusion in the study and request that they provide any contact information updates. Following the initial contact letters, an announcement letter will be sent to all sample members by the first day of data collection to announce the start of data collection. The data collection announcement will include a toll-free number, the study website address, a Study ID and password, and will request that sample members complete the web survey. Two days after the data collection announcement mailing, an email message mirroring the letter will also be sent to sample members.

The modified paragraph is now under a separate section, B.3.c. Contacts with Sample Members on page B-3.

Various communication methods will be used to maintain continuing contact with sample members as needed, prior to and throughout data collection. Letters, postcards, e-mail messages, and text messages (for those from whom prior permission to text was received) will be used to communicate with sample members regarding their study participation. Appendix E presents the communication materials to be used in the B&B:08/18 full-scale study, and the tentative2 schedule for contacting sample members.

B.3.e. Case Management System (pages B-3 and B-4)

In this section, under “Sorting of non-appointment cases according to parameters and priorities set by project staff,” the paragraph was updated to include information about planned mode tailoring:

Sorting of non-appointment cases according to parameters and priorities set by project staff. Cases may be prioritized for CATI within certain sub-samples or geographic areas, or cases may be sorted to establish priorities between cases of differing status. B&B:08/18 staff may also use patterns of calling outcomes to prioritize calls (e.g., cases with more than a certain number of unsuccessful attempts during a given time of day may be passed over for a brief time). As another example, mode tailoring is a strategy in which sample members who participated via phone on prior rounds will be contacted by phone earlier than other sample members.

The following paragraph pertaining to the mini paper survey was updated and moved to page B-3. This information was previously on page B-6 of the field test OMB package.

Completed paper surveys will be returned to survey staff using an addressed, postage paid envelope enclosed in the package sent to sample members. Because use of the 10-item paper survey is experimental and offered to a small percentage of the field test sample, we are expecting that only a small number of completed paper surveys will be returned. Consequently, survey data will be keyed by telephone interviewers already working on the B&B:08/18 field test. Paper surveys can be scanned by call center staff if the mode is offered during the full-scale data collection; a description of those processes will be provided in the full-scale package.

The revised paragraph is now on page B-2:

Completed paper surveys from mini survey respondents will be returned using an addressed, postage paid envelope enclosed in the package sent to sample members, and survey responses will be scanned and keyed by project staff. The response data from PAPI surveys will be stored in the same manner as web and CATI survey data.

B.4. Tests of Procedures and Methods (pages B-4 to B-11)

This section was updated to provide a summary of results from the field test experiment and to describe the data collection design for the full-scale study. The descriptions of the field test experiment plans and the hypothesis tests were removed, and a summary of the experiment results was provided.

The entire text of section B.4. can be found below in Attachment 1 starting on page 9.

B.4.a. Summary of B&B:08/18 Field Test Data Collection Design and Results (pages B-5 and B-6)

This section provides a summary of field test experiment results, with a summary of recommendations for the full-scale design (shown below). This section also includes a reference to Appendix C which presents a more detailed description of field test experiment results.

B.4.b. B&B:08/18 Full-scale Data Collection Design (pages B-7 to B-11)

This section outlines the plans for the full-scale data collection, describing the two different sample member groups. Table 6 presents the set of interventions and the timeline in which they will be implemented. Here is a description of the default and aggressive data collection protocols found on page B-7.

B&B:08/09 and B&B:08/12 interview respondent: All sample members who responded to both B&B:08/09 and B&B:08/12, the double respondents, will receive the default data collection protocol (N=13,490).

B&B:08/09 and/or B&B:08/12 interview nonrespondent: Sample members who failed to respond to either of the prior two follow-up studies (B&B:08/09 and B&B:08/12), the ever nonrespondents, will receive the aggressive data collection protocol (N=3,550).

Table 6. B&B:08/18 full-scale data collection protocols by data collection phase

| Intervention | Default (Double Respondents) N=13,490 | Aggressive (Ever Nonrespondent) N=3,550 |
|---|---|---|
| Announcement with $2 prepaid incentive | Week 1; baseline incentive $30 | Week 1; baseline incentive $50 |
| Start CATI contacting | Week 2 for prior CATI completers, Week 10 for others | Week 2 for prior CATI completers, Week 6 for others |
| Infographic mailing | Week 14 | Week 12 |
| 2-week flash incentive (additional $5) | Week 26 | Week 15 |
| Offer abbreviated survey | Week 28 | Week 19 |
| Incentive boost (additional $10) | Week 30 | Week 23 |
| Offer mini survey | Week 33 | Week 28 |
| Offer mini survey (PAPI) | If data collection is extended | Week 33 |
| Change sponsorship | Based on data collection; will assess periodically and implement if projected response rate < 75% | Based on data collection; will assess periodically and implement if projected response rate < 75% |
| Potential extension of data collection | Based on data collection; will assess periodically and implement if projected response rate < 75% | Based on data collection; will assess periodically and implement if projected response rate < 75% |
| Maximum incentive | $30 + $2 (prepaid) + $5 (flash) + $10 (boost) + $5 (résumé upload) = $52 | $50 + $2 (prepaid) + $5 (flash) + $10 (boost) + $5 (résumé upload) = $72 |

NOTE: Data collection is scheduled for July 12, 2018 through March 25, 2019, approximately 37 weeks.

B.5. Reviewing Statisticians and Individuals Responsible for Designing and Conducting the Study (page B-11)

This section was updated to remove subcontractors not working on the full-scale study.

Part B - References (page B-11)

The references were revised to reflect the literature used for the full-scale package.

Appendix A - Membership of the Technical Review Panel

The TRP members list presented in Appendix A was updated to reflect changes in membership and members’ contact information between TRP meeting 1 and TRP meeting 2.

Appendix B - Confidentiality for Administrative Record Matching

Changes in Appendix B include editorial revisions to reflect full-scale implementation, updates to file transfer procedures, and a newly added data source.

B.1. Develop Linkages with Administrative Data Sources (page B-2)

File merges will be performed by the data collection contractor with the CPS data containing federal student aid application information. The merge with CPS can occur at any time for any number of cases, provided that the case has an apparently valid SSN associated with it. A file will be sent to CPS and in return a large data file containing all matched students who applied for federal aid will be received. The data collection contractor has programs and procedures in place to prepare and submit files according to rigorous CPS standards, and to receive and process data obtained from CPS.

Secure Data Transfers. In order to transfer data between the B&B:08/18 contractor and owners of the desired datasets, data files will undergo validated encryption and be transferred through a secure electronic data transfer system (https://transfer.ies.ed.gov/) using Secure Sockets Layer (SSL) technology. The entities sending and receiving the B&B:08/18 datasets must be registered users of the NCES Members Site with privileges set to allow them to use the secure data transfer service3. This service will be used for the file matching procedures described below, except in instances when the vendor already has a secure data transfer system in place.

A description of plans to match the B&B:08/18 full-scale sample to obtain data from the Veterans Benefits Administration (VBA) was included:

To identify veterans and their federal veterans’ education benefits, a file merge using the secure data transfer system (https://transfer.ies.ed.gov/) will be performed with the VBA data. The resulting data will contain reliable military service records of all applicable sample members, so that variables pertaining to military service can be derived. The data obtained from VBA will also contain detailed information on veterans’ federal education benefit amounts and the enrollment information associated with those benefits.

Appendix C - Results of the B&B:08/18 Field Test Experiments

Appendix C presents a detailed description of the results of experiments conducted in the field test. This is a completely new appendix.

Appendix D - Cognitive Testing Report

The field test package included the report summarizing the first round of cognitive testing. The full-scale package includes the report summarizing the second round of cognitive testing, conducted between September and December 2017.

Appendix E - Sample Member Communication Materials

Changes in Appendix E reflect updates made for full-scale implementation. Communication materials have been included to accompany interventions new to the full-scale protocols, including the prepaid incentive, Infographic, flash incentive, incentive boost, and abbreviated interview. Additionally, we do not plan to implement the “résumé-only” phase in the full-scale, so those contacting materials have been removed. The number of reminders has increased due to the longer data collection period in the full-scale study. We have also included “if needed” materials in case there is an interruption to data collection or an extension to the data collection period. Attachment 2a of this memo describes the types of changes systematically made to the field test versions of contacting materials to adapt them to the full-scale study, and Attachment 2b describes changes made to, and additions of, specific contacting materials (beginning on page 16 of this document).

Appendix F - Survey Facsimile for the B&B:08/18 Full-Scale Study, including the Abbreviated Survey and Mini Survey Items:

Changes in Appendix F include revisions to the full-scale instrument based on field test survey results, cognitive interviews conducted in fall 2017, and feedback from the latest B&B:08/18 Technical Review Panel (TRP) meeting held in December 2017. Revisions to the survey are intended to reflect current research goals, reduce respondent burden, and improve data quality. Table 1 in Appendix F (pp. F-3 to F-10) provides a summary of the content of the B&B:08/18 full-scale student survey. A change field column in this table, together with font color coding, indicates whether items remained the same as in the B&B:08/18 field test survey (black), were revised (purple), dropped (red), or added (green). In response to recommendations made at the latest B&B:08/18 TRP meeting, a few items have been added to the B&B:08/18 full-scale survey, including: items previously tested and included in other federal studies4, items from the B&B:08/12 full-scale survey not included in the B&B:08/18 field test, and new items that will improve survey efficiency or data quality.





Attachment 1 – B.4. Tests of Procedure and Methods (copy of the revised section)

  1. Tests of Procedures and Methods

The B&B:08/18 field test included two sets of experiments: the first set of experiments focused on interventions designed to increase survey participation and reduce nonresponse error, while the second set of experiments focused on minimizing measurement error in selected survey items. The full-scale data collection design described below will implement the tested approaches, revised based on the B&B:08/18 field test results described in Section 4.a. These approaches are also informed by previous studies, including B&B:08/12 (B&B:08/12 Data File Documentation https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2015141) and the B&B:16/17 field test (see B&B:16/17 Appendix C, OMB# 1850-0926 v.3).

      1. Summary of B&B:08/18 Field Test Data Collection Design and Results

The B&B:08/18 field test contained three data collection experiments and one questionnaire design experiment. The first three experiments examined three different aspects of tailoring contact: tailoring of contact materials, emphasis of NCES as the source and signatory of e-mails (referred to as the sponsorship experiment), and offering a mini survey with an additional survey mode (i.e., offering the mini survey with/without a PAPI option). The B&B:08/18 field test experiment results provide insight, in preparation for the full-scale study, into the effectiveness of these interventions in terms of survey response rates, representativeness, and data collection efficiency. Survey response was investigated using response rates and résumé upload rates conditional on survey participation. Based on administrative frame data, B&B:08/18 staff conducted nonresponse bias analyses for age, institutional sector of the NPSAS institution, region of the United States in which the NPSAS institution is located, and total enrollment counts. Efficiency is operationalized as the number of days between the start of the experiment and survey completion, excluding respondents who completed a partial interview or who responded via paper. The analysis uses one-sided t-tests to assess whether survey response or efficiency increased significantly in the experimental groups, and two-sided t-tests to assess nonresponse bias.

The questionnaire design experiment investigated alternatives to the commonly used “check all that apply” format in item batteries. The results of field test experiments are summarized below. For detailed results of the B&B:08/18 field test experiments, see Appendix C.

The tailoring of contact materials experiment investigated whether referencing the sample member's degree major improved overall data quality by increasing the personal relevance of participating. The results in Table 3 show no statistically significant differences in response or efficiency. Overall response rates in the two conditions were similar (tailoring: 71.7%; standardized: 72.0%; t(1,098) = -0.11, p = .55). However, the response rate among B&B:08/12 field test nonrespondents who received the tailored materials was 42.4 percent, compared with 36.0 percent among B&B:08/12 field test nonrespondents in the standardized condition (t(179) = 0.88, p = .19). While not significantly different, these results suggest that tailoring may be important among prior round nonrespondents. Tailoring also reduced the magnitude of nonresponse bias as well as the number of significantly biased indicators.

Table 3. Tailoring results by experimental condition

|                                                    | Standardized materials | Tailored materials | t-statistic | p-value | N     |
|----------------------------------------------------|------------------------|--------------------|-------------|---------|-------|
| Response                                           |                        |                    |             |         |       |
| Survey Completion (in percent)                     | 72.04%                 | 71.73%             | -0.11       | 0.5540  | 1,100 |
| Résumé Submission (in percent)                     | 38.16%                 | 35.82%             | -0.67       | 0.7489  | 791   |
| Representativeness                                 |                        |                    |             |         |       |
| Average absolute relative nonresponse bias         | 8.38                   | 9.45               | na          | na      | 1,100 |
| Median absolute relative nonresponse bias          | 5.53                   | 5.53               |             |         |       |
| Maximum absolute relative nonresponse bias         | 44.47                  | 39.40              |             |         |       |
| Percentage of significantly biased indicators      | 14.29%                 | 0.00%              |             |         |       |
| Efficiency (in days since start of the experiment) | 30.16                  | 28.65              | -0.70       | 0.2419  | 755   |
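
The "relative nonresponse bias" rows in Tables 3 through 5 follow the standard formulation (respondent mean versus full-sample mean on a frame variable, in percent). A minimal sketch, using hypothetical enrollment figures rather than field test data:

```python
def relative_bias_pct(full_sample, respondents):
    """Relative nonresponse bias of the respondent mean, in percent:
    100 * (mean(respondents) - mean(full_sample)) / mean(full_sample)."""
    full_mean = sum(full_sample) / len(full_sample)
    resp_mean = sum(respondents) / len(respondents)
    return 100.0 * (resp_mean - full_mean) / full_mean

# Hypothetical frame variable (e.g., enrollment counts) for the full sample
# and for the subset that responded.
full = [100, 200, 300, 400]
resp = [100, 200, 300]     # the largest institution did not respond
bias = relative_bias_pct(full, resp)
print(round(abs(bias), 2))
```

The tables report the absolute value of this quantity, averaged (or taking the median/maximum) across the frame variables examined.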

The results of the sponsorship experiment, that is, sending reminder e-mails from an "@rti.org" or an "@ed.gov" e-mail address, are very similar to those from the tailoring experiment with respect to response, representativeness, and efficiency (see Table 4). Both groups achieved identical response rates at the end of data collection (54.8% in both conditions; t(1,330) = 0.02, p = .49).

Table 4. Sponsorship results by experimental condition

|                                                    | RTI    | NCES   | t-statistic | p-value | N     |
|----------------------------------------------------|--------|--------|-------------|---------|-------|
| Response                                           |        |        |             |         |       |
| Survey Completion (in percent)                     | 54.78% | 54.83% | 0.02        | 0.4916  | 1,332 |
| Résumé Submission (in percent)                     | 30.51% | 33.06% | 0.74        | 0.2309  | 730   |
| Representativeness                                 |        |        |             |         |       |
| Average absolute relative nonresponse bias         | 10.96  | 10.05  | na          | na      | 1,332 |
| Median absolute relative nonresponse bias          | 8.90   | 7.37   |             |         |       |
| Maximum absolute relative nonresponse bias         | 54.47  | 44.50  |             |         |       |
| Percentage of significantly biased indicators      | 10.53% | 4.76%  |             |         |       |
| Efficiency (in days since start of the experiment) | 32.67  | 30.78  | 0.84        | 0.1999  | 682   |

The results for the mini survey experiment, in which sample members completed the mini survey in the original survey modes (mini-standard) or in a condition that additionally allowed completion by paper and pencil (mini-PAPI), show no statistically significant difference in response rates between the mini-PAPI and mini-standard conditions (Table 5). Conditional on participation in the mini survey, respondents in the mini-PAPI condition uploaded their résumés at a significantly lower rate (18.3%) than those in the mini-standard condition (35.1%; t(196) = -2.73, p < .01). This lower rate is driven by the fact that none of the respondents who completed the survey by mail uploaded their résumés when asked to do so in a thank-you mailing, even though they received the same $10 incentive offer for résumé submission as all other respondents. There is no difference in representativeness across conditions. However, respondents in the mini-PAPI condition who completed the survey on the web5 did so approximately three days later (23.0 days after the start of the experiment) than respondents in the mini-standard condition (19.9 days after the start of the experiment), a statistically significant difference (t(166) = 1.66, p < .05).

Table 5. Mini survey results by experimental condition

|                                                    | Mini-standard | Mini-PAPI | t-statistic | p-value | N   |
|----------------------------------------------------|---------------|-----------|-------------|---------|-----|
| Response                                           |               |           |             |         |     |
| Survey Completion (in percent)                     | 23.44%        | 26.07%    | 0.86        | 0.1953  | 800 |
| Résumé Submission (in percent)                     | 35.11%        | 18.27%    | -2.73       | 0.0035  | 198 |
| Representativeness                                 |               |           |             |         |     |
| Average absolute relative nonresponse bias         | 15.46         | 14.64     | na          | na      | 806 |
| Median absolute relative nonresponse bias          | 10.95         | 9.06      |             |         |     |
| Maximum absolute relative nonresponse bias         | 78.50         | 44.50     |             |         |     |
| Percentage of significantly biased indicators      | 0.00%         | 0.00%     |             |         |     |
| Efficiency (in days since start of the experiment) | 22.99         | 19.89     | 1.66        | 0.0496  | 168 |

NOTE: Six mini survey respondents were excluded from the response and efficiency analyses since they completed the survey before they were informed about the mini survey and were hence not part of the experiment. The sample for the nonresponse bias analysis includes all respondents to the mini survey.

While there was no statistically significant increase in response rates or résumé submission rates6 in any of the data collection experiments, the field test results are suggestive of positive effects. Given that the interventions tested in the field test show no indication of negative effects7, that the technical review panel members and the literature support these adjustments8, and that they are low-cost and easy to implement, the recommendations for the full-scale study are to:

  • tailor the contact materials and reference the sample members’ degree major(s),

  • use NCES as the primary signatory and sender of the electronic communication materials, and

  • use a sequential approach such as offering the mini survey followed by the mini-PAPI.

The questionnaire design experiment compared alternative formats for a set of questions in which respondents are asked to select the items that apply to them: 1) the traditional "check all that apply" format, in which respondents check a box for each item that applies; 2) a forced-choice format that presents respondents with explicit "yes/no" options for each item in the list; and 3) a forced-choice format that presents respondents with explicit "no/yes" options for each item in the list. The results suggest higher endorsement rates and longer completion times in the forced-choice formats, indicating more cognitive processing. Therefore, the forced-choice format is planned for the full-scale instrument.

      2. B&B:08/18 Full-scale Data Collection Design

The data collection design proposed for the B&B:08/18 full-scale study builds upon the designs implemented in B&B:08/12, the B&B:08/18 field test, and other related studies (e.g., B&B:16/17). As part of B&B:08/12, a responsive design using a Mahalanobis distance model was employed to identify the cases most likely to introduce nonresponse bias if they did not respond (see the B&B:08/12 Data File Documentation). Nonresponse bias analyses of the B&B:08/12 experimental data demonstrated minimal presence of nonresponse bias in the set of variables examined. Given this relatively homogeneous sample, there was no significant reduction in the average nonresponse bias across variables when cases more likely to induce bias were targeted to receive special attention (e.g., an incentive boost, a prepaid incentive, an abbreviated interview). Similar outcomes were observed in the B&B:16/17 field test, where analyses revealed that, after 20 days of data collection, the mean Mahalanobis distance value decreased only marginally, suggesting that adding more sample members to the respondent pool did not necessarily contribute to a more representative sample (discussed in detail in B&B:16/17 Appendix C, OMB# 1850-0926 v.3). Like the B&B:08 cohort, the B&B:16 cohort also has a relatively low potential for nonresponse bias due to its homogeneity in the variables used to examine nonresponse bias.
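
A responsive design of this kind scores each nonrespondent by its Mahalanobis distance from the respondent pool on frame variables, flagging distant cases for targeted follow-up. The two-variable sketch below is purely illustrative (hypothetical variables and values, hand-rolled 2x2 matrix inversion), not the B&B:08/12 model:

```python
def mahalanobis_2d(x, center, cov):
    """Mahalanobis distance of point x from `center` under 2x2 covariance `cov`."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))  # 2x2 matrix inverse
    dx = (x[0] - center[0], x[1] - center[1])
    # quadratic form: dx^T * inv(cov) * dx
    q = (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
         + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    return q ** 0.5

# Hypothetical frame variables (age, enrollment in thousands) for respondents.
center = (24.0, 10.0)
cov = ((4.0, 0.0), (0.0, 9.0))   # uncorrelated, variances 4 and 9

# A nonrespondent far from the respondent mean gets a larger distance,
# flagging it for special attention (incentive boost, abbreviated interview).
near = mahalanobis_2d((25.0, 11.0), center, cov)
far = mahalanobis_2d((30.0, 19.0), center, cov)
print(near < far)
```

Tracking the mean distance of remaining nonrespondents over time is what the B&B:16/17 analysis above refers to: if that mean barely moves, adding respondents is not making the pool more representative.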

However, this homogeneity assumption may no longer hold six years after the last data collection, especially for variables that cannot be measured for survey nonrespondents (like those related to employment). A primary goal of the full-scale design is to minimize any potential nonresponse bias that could be introduced into B&B:08/18, especially bias that could be due to lower response among prior-round nonrespondents. Another important goal is to reduce the amount of time and costs of data collection efforts.

To accomplish these goals, the plan is to achieve at least a 75 percent response rate. Doing so will minimize potential nonresponse bias, optimize statistical power, and enable subgroup analyses. B&B:08/18 staff plan to divide the sample into two groups and implement differential treatments based on prior round response status. A similar approach was successfully implemented in the B&B:16/17 field test, where prior round nonrespondents received either an aggressive or a default protocol. The response rate among prior nonrespondents who received the aggressive protocol was about 12 percentage points higher than among those who received the default protocol (aggressive: 37%; default: 25%; t(2,097) = 3.52, p < .001). For the B&B:08/18 full-scale design, the following groupings will be used:

  • B&B:08/09 and B&B:08/12 interview respondent: All sample members who responded to both B&B:08/09 and B&B:08/12, the double respondents, will receive the default data collection protocol (N=13,490).

  • B&B:08/09 and/or B&B:08/12 interview nonrespondent: Sample members who failed to respond to one or both of the prior two follow-up studies (B&B:08/09 and B&B:08/12), the ever nonrespondents, will receive the aggressive data collection protocol (N=3,550).
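
The grouping rule above reduces to a simple assignment function; a minimal sketch (the function and argument names are hypothetical, not project code):

```python
def assign_protocol(responded_0809: bool, responded_0812: bool) -> str:
    """Double respondents (responded to both B&B:08/09 and B&B:08/12)
    get the default protocol; ever nonrespondents get the aggressive one."""
    if responded_0809 and responded_0812:
        return "default"
    return "aggressive"

print(assign_protocol(True, True))    # double respondent -> "default"
print(assign_protocol(True, False))   # ever nonrespondent -> "aggressive"
```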

Table 6 presents the type and timing of interventions to be used in the full-scale study for both the default and aggressive protocol groups. To begin, both groups will be offered a $2 prepaid incentive. The baseline incentive for completing the survey will be $30 for sample members in the default protocol group and $50 for those in the aggressive protocol group. Beyond the baseline incentive, both data collection protocols employ the same interventions, although the timing of these interventions differs across groups: protocol switches in the aggressive protocol occur sooner, rather than making additional efforts under the same protocol (e.g., Peytchev et al. 2009). Each intervention is described in more detail below.

Table 6. B&B:08/18 full-scale data collection protocols by data collection phase

| Intervention                            | Default (Double Respondents) N=13,490                               | Aggressive (Ever Nonrespondent) N=3,550                             |
|-----------------------------------------|---------------------------------------------------------------------|---------------------------------------------------------------------|
| Announcement with $2 prepaid incentive  | Week 1; baseline incentive $30                                      | Week 1; baseline incentive $50                                      |
| Start CATI contacting                   | Week 2 for prior CATI completers, Week 10 for others                | Week 2 for prior CATI completers, Week 6 for others                 |
| Infographic mailing                     | Week 14                                                             | Week 12                                                             |
| 2-week flash incentive – additional $5  | Week 26                                                             | Week 15                                                             |
| Offer abbreviated survey                | Week 28                                                             | Week 19                                                             |
| Incentive boost – additional $10        | Week 30                                                             | Week 23                                                             |
| Offer mini survey                       | Week 33                                                             | Week 28                                                             |
| Offer mini survey (PAPI)                | If data collection is extended                                      | Week 33                                                             |
| Change sponsorship                      | Based on data collection; will assess periodically and implement if projected response rate < 75% | Based on data collection; will assess periodically and implement if projected response rate < 75% |
| Potential extension of data collection  | Based on data collection; will assess periodically and implement if projected response rate < 75% | Based on data collection; will assess periodically and implement if projected response rate < 75% |
| Maximum incentive                       | $30 + $2 (prepaid) + $5 (flash) + $10 (boost) + $5 (résumé upload) = $52 | $50 + $2 (prepaid) + $5 (flash) + $10 (boost) + $5 (résumé upload) = $72 |

NOTE: Data collection is scheduled for July 12, 2018 through March 25, 2019, approximately 37 weeks.
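
The maximum incentive totals in Table 6 are simple sums of the baseline and the add-on offers; as a check:

```python
# Add-on offers common to both protocols (from Table 6).
components = {"prepaid": 2, "flash": 5, "boost": 10, "resume_upload": 5}

def max_incentive(baseline: int) -> int:
    """Maximum possible incentive: baseline plus every add-on offer."""
    return baseline + sum(components.values())

print(max_incentive(30))  # default protocol ($30 baseline) -> 52
print(max_incentive(50))  # aggressive protocol ($50 baseline) -> 72
```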

Prepaid incentive. Cash prepaid or unconditional incentives have been shown to significantly increase response rates in both interviewer-administered as well as self-administered surveys and hence reduce the potential for nonresponse bias (e.g., Church 1993; Cantor et al. 2008; Goeritz 2006; Medway and Tourangeau 2015; Messer and Dillman 2011; Parsons and Manierre 2014; Singer 2002). Medway and Tourangeau (2015) show that prepaid cash incentives not only significantly increase response rates in telephone surveys, but decrease refusal rates and increase contact rates.

During the early completion phase of the B&B:16/17 field test, prepaid incentives ($10 via check or PayPal) in combination with telephone prompting also significantly increased response rates, by 4.4 percentage points in the aggressive protocol. Given these positive findings, combined with general recommendations in the literature (Singer and Ye 2013), B&B:08/18 staff will offer a small prepaid incentive of $2 to each sample member in the B&B:08/18 full-scale study. This amount has been shown to increase response rates at lower field cost than higher or lower prepaid amounts (e.g., Beebe et al. 2005; Millar and Dillman 2011; Tourangeau et al. 2013, p. 48).

For all sample members with good contact information (e.g., respondents to the panel maintenance, successful tracing), a $2 bill will be included in the data collection announcement.9 All sample members for whom no good contact information exists at the start of data collection will receive a $2 prepaid incentive via PayPal to their best known e-mail address (40% of all respondents in the B&B:08/18 field test cohort chose to receive their incentive via PayPal, as did 47% of the B&B:16/17 full-scale cohort). PayPal was successfully used for prepaid incentives in BPS:12/17 and B&B:16/17. Once B&B:08/18 staff obtain good contact information for a sample member, a $2 cash incentive will be mailed out if the sample member has not yet retrieved the $2 PayPal offer and completed the survey (similar to the B&B:08/12 full-scale responsive design experiment). All data collection announcements related to interventions will be designed to stand out (for example, they will be mailed using Priority or FedEx mail, as suggested by DeBell et al. 2017; Dillman et al. 2014).

Baseline incentive. Sample members in the default group will receive a $30 baseline incentive and sample members in the aggressive group will be offered a $50 baseline incentive. This matches the amount offered to equivalent groups in previous data collections with the B&B:08 cohort as well as the B&B:16 cohort.

Mode tailoring and CATI outbound calling. Leverage-saliency theory and social exchange theory suggest that offering a person the mode they prefer, e.g., by telephone or the web, increases their likelihood of participating (Dillman et al. 2014; Groves et al. 2000). This theory is supported by empirical evidence that offering people their preferred mode choice speeds up their response and is associated with higher participation rates (e.g., Olson et al. 2012). The outbound CATI calling will begin earlier (at the end of week two) for sample members who completed the prior round on the phone. Likewise, those sample members who completed the prior round interview online will not be contacted by telephone before the preassigned outbound data collection date (default protocol week 10; aggressive protocol week 6). Attempting to interview sample members in the aggressive group by telephone sooner also increases the likelihood of initiating locating efforts sooner. For example, B&B:16/17 field test results showed higher locate rates for a data collection protocol which had outbound CATI calling (93.7%), compared to that of a data collection protocol which did not (77.8%; χ2 = 63.2, p < .001).

Infographic. Groves et al. (1992) argue that providing informational materials gains higher cooperation rates than not providing them. Ongoing panel surveys often provide general feedback on study results to respondents (e.g., Blom et al. 2015; Scherpenzeel and Toepoel 2014). While the results in the U.S. are mixed, several studies do suggest a positive effect of personalized feedback on response rates (e.g., Baelter et al. 2011; Marcus et al. 2007). To communicate the importance, relevance, and legitimacy of the B&B:08/18 full-scale study, B&B:08/18 staff plan to send an infographic as part of a reminder contact via mail and e-mail, with a link to the infographic on the website (default group: week 14; aggressive group: week 12). This infographic will contain a one-page visual representation of results from prior B&B:08 studies and stress the importance of taking part in the study. See Appendix E for the infographic text and other communication materials to be used in the full-scale study.

Flash incentive. Early bird incentives have been shown to lead to faster responses and increased participation rates within the early bird incentive period (e.g., Coppersmith et al. 2016; LeClere et al. 2012), and can provide efficiencies by reducing both data collection costs and time. Given these positive effects, the B&B:08/18 full-scale study will apply a variation on the early bird idea to speed up survey completion: a flash incentive. Specifically, all sample members will be offered an additional $5 incentive if they complete the survey within the next two weeks, a 'flash' request rather than a true early bird incentive. The default group will receive this request in week 26 and the aggressive group in week 15. This timing ensures that the flash intervention occurs toward the end of the full survey collection phase, just before the offer to complete the abbreviated survey is mailed out, and that most sample members have been located by then.

Abbreviated survey. Obtaining responses from all sample members is important for sample representativeness (e.g., Kreuter et al. 2010). Several postsecondary studies conducted by NCES have offered reluctant sample members an abbreviated interview to decrease response burden, increase response rates, and decrease nonresponse bias. For example, during the B&B:16/17 field test, sample members who were prior round nonrespondents and who were offered the abbreviated interview during the production phase responded at higher rates (22.7%) than those who were not offered the abbreviated interview at the same time (12.1%; t(2,097) = 3.67, p < .001). The B&B:08/12 full-scale showed similar results in that offering the abbreviated interview to prior round nonrespondents increased overall response rates of that group by 18.2 percentage points (B&B:08/12 full-scale DFD p. 33). An abbreviated interview option will be offered to all sample members in the B&B:08/18 full-scale. For the default group, it will be offered in week 28, and for the aggressive group in week 19. The abbreviated survey, shown in Appendix F, contains a set of critical items and collects information about employment history.

Incentive boost. Incentive boosts for those sample members who refused, or a subsample thereof, are a common element used in nonresponse conversion (e.g., Groves and Heeringa 2006; Singer and Ye 2013). These boosts are especially common in large federal surveys during their nonresponse follow-up phase (e.g., the National Survey of Family Growth) and have been implemented successfully in other postsecondary education surveys (e.g., HSLS:09 F2; BPS:12/17). For nonresponse conversion, a $10 boost to the B&B:08/18 baseline incentive is planned for all remaining nonrespondents. For all remaining nonrespondents in the aggressive protocol group, the incentive boost will be offered beginning in week 23 of data collection, approximately four weeks after the abbreviated survey offer and about five weeks before the mini survey offer. The incentive boost will be offered to all remaining nonrespondents in the default group beginning in week 30, allowing two weeks after the abbreviated survey offer and three weeks before implementation of the mini survey.

Mini survey. The shorter the stated length of a survey, the lower the perceived burden for the respondent. Motivated by this, B&B:08/18 staff will offer an extremely abbreviated questionnaire, a mini survey, to all nonrespondents late in the data collection period (default group: week 33; aggressive group: week 28). The mini survey, presented in Appendix F, contains only the most critical survey items. Again, obtaining information from these nonrespondents is crucial to assess and increase sample representativeness (e.g., Kreuter et al. 2010). Because the mini survey is even shorter, and therefore less burdensome, than the traditional abbreviated interview, its offer is expected to further increase response rates.

Mini-PAPI. In addition to offering shorter interviews, offering multiple survey completion modes may improve response rates and representativeness (e.g., Dillman et al. 2014; Shettle and Mooney 2009). Switching from web to mail, for example, has been shown to increase response rates by 5 to 19 percentage points and to increase representativeness by bringing in different types of respondents (e.g., Messer and Dillman 2011; Millar and Dillman 2011).10 In a non-experimental design, Biemer et al. (2016) show that a mode switch from web to mail improved response rates by up to 20 percentage points. The B&B:08/18 field test results showed no significant difference in response rates between those offered the mini-PAPI option and those offered the mini-standard option, though about 15 percent of mini survey completions were submitted via PAPI, indicating that some portion of the B&B sample is receptive to responding via PAPI.

Résumé collection. As in the B&B:08/18 field test, all survey respondents will be asked to upload their résumé at the end of the interview. From B&B:16/17, B&B:08/18 staff learned that some respondents upload their résumés even when no additional incentive is provided (at the time of writing, the submission rate among respondents in the B&B:16/17 full-scale study is approximately 11%). For this reason, and to preserve project resources for other primary data collection interventions, B&B:08/18 staff propose to decrease the incentive for a résumé upload from the $10 offered during the B&B:08/18 field test to $5 in the full-scale study, as a thank-you for the additional time and effort the upload process requires.

Additional interventions. B&B:08/18 staff will continuously monitor data collection measures to achieve the targeted response rate of 75%. For example, time series modeling may be used to project the response rate at the end of data collection, overall and for key analytic subgroups. In combination with a comparison of the weekly performance of the B&B:08/18 full-scale sample to that of the B&B:08/12 full-scale, NCES will determine whether and when any of the interventions below should be implemented:

  • Change sponsorship. While B&B is conducted by NCES, part of the U.S. Department of Education (ED), RTI has typically used a study-specific e-mail address ("@rti.org") and telephone number to contact and support sample members. For the B&B:08/18 field test, B&B:08/18 staff investigated the effect of sending reminder e-mails from an "@ed.gov" e-mail address. Compared to sending e-mails from "@rti.org", the B&B:08/18 field test showed that sending reminders from NCES had positive effects on sample representativeness and data collection efficiency. For the full-scale study, B&B:08/18 staff will therefore e-mail the contact materials from "@ed.gov." If, however, the models suggest that the targeted response rates for the B&B:08/18 full-scale study will not be met, B&B:08/18 staff will send e-mail reminders either from the NCES project officer, the RTI project officer, or the standard e-mail address (i.e., [email protected]). Changing the sender to the project officers may increase the perceived importance of the survey and help personalize the contact materials, thereby potentially increasing relevance. Switching the sender during data collection also increases the chance that the survey invitation is delivered to the sample member rather than to a spam filter.

  • Extension of data collection. Should it become clear from our monitoring activities that the targeted response rate will not be met by the planned end date of data collection, data collection will be extended for about four weeks. During this extended data collection phase, the mini-PAPI survey will be offered to all remaining nonrespondents in the default protocol, and the mini-PAPI survey offer will be continued for the aggressive protocol group.
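
One simple form the response rate projection behind these decision rules could take, purely as an illustration (a least-squares trend line over weekly cumulative response rates; the data and the linear model are hypothetical, not the project's actual time series model):

```python
def project_final_rate(weekly_rates, total_weeks):
    """Fit a least-squares line through (week, cumulative response rate)
    points and extrapolate to the planned end of data collection."""
    n = len(weekly_rates)
    xs = range(1, n + 1)
    x_mean = sum(xs) / n
    y_mean = sum(weekly_rates) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, weekly_rates))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * total_weeks

# Hypothetical cumulative response rates for the first five weeks,
# projected ten weeks out.
observed = [0.10, 0.18, 0.25, 0.31, 0.36]
projected = project_final_rate(observed, total_weeks=10)

# If the projection falls short of the 75% target, interventions such as
# the sponsorship change or an extension would be considered.
needs_intervention = projected < 0.75
print(needs_intervention)
```

A production model would account for the flattening of response curves over time (a linear fit overstates late gains), which is why the memo describes periodic reassessment rather than a one-shot projection.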





Attachment 2a – Systemic changes to adapt to full-scale contacting materials

| Change | Page in FT package | Page in FS package | Type of change | Reason for change | Notes |
|---|---|---|---|---|---|
| QR code | throughout | Multiple pages | New from FT | To improve ease of access to survey from hard copy mailing | QR code added to all hard copy mailings to facilitate access to the survey |
| Bitly link | throughout | Multiple pages | New from FT | To improve ease of access to survey from text message | Bitly link added to all text messages to facilitate access to the survey |
| Click here button | throughout | Multiple pages | New from FT | To improve ease of access to survey from e-mail | Click here button replaces the click here link, facilitating access to the survey |
| Revised source and signatory table | C-32 | E-33 | Revised from FT | Flexibility for including different signatures | FT included signature option for NCES and RTI; FS adds data collection task leader and data collection team |
| Abbreviated and Incentive Eligibility Table | throughout | E-48 | New from FT | Table accounts for various incentive amounts and survey types | Includes new text options based on survey type and incentive eligibility |
| Incentive Eligibility Table | throughout | E-20 | New from FT | Table accounts for various incentive amounts and survey types | |
| Prior Response Table | throughout | E-16 | New from FT | Editorial to personalize communication | |
| Resume Table | C-51 | E-62 | Revised from FT | FS excludes resume only option | |



Attachment 2b – Changes to and additions of specific full-scale contacting materials

| Contact Material | Page in FT package | Page in FS package | Type of change | Reason for change | Notes |
|---|---|---|---|---|---|
| Draft Brochure Text | C-4 | E-8 | Updated text for FS implementation; editorial to improve readability and clarity; reference to more recent data | | "What have we learned from previous rounds" FAQ revised to include more recent results |
| Draft Website Text | C-6 | E-10 | Updated text for FS implementation; editorial to improve readability and clarity; new from FT; reference to more recent data | | NCES authorization added to login screen; moved cell phone and email collection to the top of the Update Contact Info form; Previous Results section revised to include more recent results |
| Initial Contact Letter | C-10 | E-13 | Updated text for FS implementation; editorial to personalize communication; editorial to improve readability and clarity | | «panelinfo»/«controlid» is used for fulfillment tracking purposes; signed by NCES Associate Commissioner |
| Data Collection Announcement Letter | C-12 | E-15 | Updated text for FS implementation; editorial to personalize communication; editorial to improve readability and clarity; new from FT | Additional intervention (prepaid incentive) to improve response rates | Includes prior response table, QR code; signed by NCES Project Officer and RTI Project Director |
| Infographics Mailing | N/A | E-17 | New from FT | Additional non-monetary intervention to improve sample member engagement and participation | QR code |
| Flash Announcement Mailing | N/A | E-19 | New from FT | Additional intervention to improve response rates | QR code; signed by NCES Project Officer and RTI Project Director; incentive eligibility table |
| Abbreviated Interview Invitation Letter | N/A | E-21 | New from FT | Additional intervention to improve response rates | QR code; signed by NCES Project Officer and RTI Project Director; incentive/eligibility fills |
| Incentive Boost Letter | N/A | E-23 | New from FT | Additional intervention to improve response rates | QR code; signed by NCES Project Officer and RTI Project Director; applies only to incentive-eligible cases |
| Mini Survey Letter | C-14 | E-25 | Updated text for FS implementation; editorial to personalize communication; editorial to improve readability and clarity | To improve ease of access to survey from hard copy mailing | QR code; signed by NCES Project Officer and RTI Project Director; incentive/eligibility fills; removed hard copy experiment table from FT draft |
| Hard Copy (PAPI) Mini Survey Letter | N/A | E-27 | New from FT | | QR code; signed by NCES Project Officer and RTI Project Director; incentive/eligibility fills |
| Hard Copy (PAPI) Mini Survey Letter-Replacement Questionnaire | C-16 | E-29 | Updated text for FS implementation; editorial to improve readability and clarity | To improve ease of access to survey from hard copy mailing | QR code; signed by NCES Project Officer and RTI Project Director; incentive/eligibility fills; removed hard copy experiment table from FT draft |
| Initial Contact E-mail | C-33 | E-31 | Updated text for FS implementation; editorial to personalize communication; editorial to improve readability and clarity | | Uses click here link; signed by NCES Project Officer; incentive/eligibility fills; directs sample members to update contact information |
| Data Collection Announcement E-mail | C-34 | E-32 | Updated text for FS implementation; editorial to personalize communication; editorial to improve readability and clarity | | Prior response table; signed by NCES Project Officer and RTI Project Director; incentive/eligibility fills; click here button |
| Flash Announcement E-mail | N/A | E-33 | New from FT | Additional prompt to improve response rates | Incentive/eligibility fills; click here button; source and signatory table |
| Abbreviated Announcement E-mail | N/A | E-34 | New from FT | Additional prompt to improve response rates | |
| Incentive Boost E-mail | N/A | E-35 | New from FT | Additional prompt to improve response rates | |
| Mini Survey E-mail | N/A | E-36 | New from FT | Additional prompt to improve response rates | |
| Hard Copy (PAPI) Mini Survey Announcement E-mail | N/A | E-37 | New from FT | Additional prompt to improve response rates | |
| Reminder E-mail 1 | C-35 | E-38 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table |
| Reminder E-mail 2 | C-36 | E-39 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table |
| Reminder E-mail 3 | C-36 | E-40 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table |
| Reminder E-mail 4 | C-37 | E-41 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table |
| Reminder E-mail 5 | C-37 | E-42 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table; includes B&B banner |
| Reminder E-mail 6 | C-38 | E-43 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table |
| Reminder E-mail 7 | C-39 | E-44 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table; prior response table |
| Reminder E-mail 8 | C-40 | E-45 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table |
| Reminder E-mail 9 | C-41 | E-46 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive/eligibility fills; click here button; source and signatory table |
| Reminder E-mail 10 | C-42 | E-47 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Incentive eligibility table; click here button; source and signatory table |
| Reminder E-mail 11 | C-43 | E-48 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages | Click here button; source and signatory table; abbreviated and incentive eligibility table |
| Reminder E-mail 12 | C-44 | E-49 | Updated text for FS implementation; editorial to improve readability and clarity; editorial to personalize communication | | |

Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages

Click here button, source and signatory table, abbreviated and incentive eligibility table, includes percent of participants in the subject line

Reminder E-mail 13

C-45

E-50

Updated text for FS implementation, Editorial to improve readability and clarity, Editorial to personalize communication

Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages

Click here button, source and signatory table, abbreviated and incentive eligibility table

Reminder E-mail 14

C-46

E-51

Updated text for FS implementation, Editorial to improve readability and clarity, Editorial to personalize communication

Reminder emails have been reordered and revised based on B&B:16/17 and BPS:12/17 approved OMB packages

Incentive/eligibility fills, click here button, source and signatory table, Holiday table, alternate subject lines, number of months fill for non-holiday theme

Reminder E-mail 15 -- 24

N/A

E-52 – E-61

New from FT

Additional prompt to improve response rates

Incentive/eligibility fills, click here button, source and signatory table, New Year’s themed e-mail

Thank You Letter- sent with check

C-50

E-62

Updated text for FS implementation, Editorial to improve readability and clarity

Prompt for incentive receipt and provide additional details for resume upload, to improve ease of access to resume upload from hard copy mailing

QR code, source and signatory table, resume table, incentive/eligibility fills

Thank You E-mail - PayPal Incentive/No Incentive

C-52

E-64

Updated text for FS implementation, Editorial to improve readability and clarity

Prompt for incentive receipt and provide additional details for resume upload

PayPal fill, source and signatory table, resume table

Thank You E-mail - Check Incentive

C-52

E-65

Updated text for FS implementation, Editorial to improve readability and clarity

Prompt for incentive receipt and provide additional details for resume upload

Resume table, source signatory table

Resume Reminder E-mail 1

C-53

E-66

Updated text for FS implementation, Editorial to improve readability and clarity

Provide additional details for resume upload

Click here link, source and signature, incentive/eligibility fills

Text Message Reminders

C-47

E-67

Updated text for FS implementation, Editorial to improve readability and clarity

To improve ease of access to survey from text message

Bitly links, incentive/eligibility fill, alternate text for bitly/non-bitly

Reminder Postcard 1

C-18

E-70

Updated text for FS implementation, Editorial to personalize communication

Reminder postcards have been reordered with only slight revisions

QR code, incentive/eligibility fills

Reminder Postcard 2

C-20

E-72

Updated text for FS implementation, Editorial to personalize communication

Reminder postcards have been reordered with only slight revisions

QR code, incentive/eligibility fills

Reminder Postcard 3

C-22

E-74

Updated text for FS implementation, Editorial to personalize communication

Reminder postcards have been reordered with only slight revisions

QR code, incentive/eligibility fills

Reminder Postcard 4

C-24

E-76

Updated text for FS implementation, Editorial to personalize communication

Reminder postcards have been reordered with only slight revisions

Mini survey reminder, QR code, incentive/eligibility fills

Reminder Postcard 5

C-29

E-78

Updated text for FS implementation, Editorial to personalize communication, Editorial to improve readability and clarity,

Reminder postcards have been reordered with only slight revisions

This postcard was Reminder Postcard 6 in FT, QR code, incentive/eligibility fills

Hard Copy (PAPI) Incentive Postcard

C-24

E-80

Updated text for FS implementation, Editorial to improve readability and clarity

Reminder postcards have been reordered with only slight revisions

Additional instruction for mailing

Mini Survey Postcard

N/A

E-82

New from FT

Additional prompt to improve response rates

 

Holiday Mailing

N/A

E-84

New from FT

Additional prompt to improve response rates

 

Holiday Mailing Insert

N/A

E-86

New from FT

Additional prompt to improve response rates

 

Thanksgiving Themed E-mail

N/A

E-88

New from FT

Contingency contact

 

Partial Complete E-mail

N/A

E-89

New from FT

Additional prompt to improve response rates

 

Paired Contact E-mail

N/A

E-90

New from FT

Additional prompt to improve response rates

 

Interruption in Data Collection - Notice/Warning E-mail

N/A

E-91

New from FT

Contingency contact

 

Interruption in Data Collection - Resume E-mail

N/A

E-92

New from FT

Contingency contact

 

Pending Ineligible E-mail

N/A

E-93

New from FT

Additional prompt to improve response rates

 

Hard Refusal E-mail

N/A

E-94

New from FT

Additional prompt to improve response rates

 

No Good Address for Payment E-mail

N/A

E-95

New from FT

Additional prompt for incentive receipt

 

Canceled PayPal E-mail

N/A

E-96

New from FT

Additional prompt for incentive receipt

 

No Incentive Data E-mail

N/A

E-97

New from FT

Additional prompt for incentive receipt

 

No Contact E-mail

N/A

E-98

New from FT

Additional prompt to improve response rates

 

Expired Incentive E-mail

N/A

E-99

New from FT

Additional prompt for incentive receipt

 

Preferred to Complete Online E-mail

N/A

E-100

New from FT

Additional prompt to improve response rates

 

Data Collection Extension E-mail

N/A

E-101

New from FT

Contingency contact

 

Spare Reminder E-mail 1

N/A

E-102

New from FT

Contingency contact

 

Spare Reminder E-mail 2

N/A

E-103

New from FT

Contingency contact

 

Spare Reminder E-mail 3

N/A

E-104

New from FT

Contingency contact

 

Mini Survey Letter - Reminder

N/A

E-105

New from FT

Additional prompt to improve response rates

 

Data Collection Staff Prompt E-mail

N/A

E-107

New from FT

Additional prompt to improve response rates

 

B&B:08/18 Consent Text

N/A

E-108

New from FT

Includes FS web and CATI consent text

 



1 See the Supporting Statement Part B for the NPSAS:08 full-scale collection (OMB# 1850-0666), the B&B:08/09 full-scale collection (OMB# 1850-0729, v. 2), and the B&B:08/12 full-scale collection (OMB# 1850-0729, v. 7-10). Related studies include the B&B:16/17 full-scale collection (OMB# 1850-0926, v. 3) and the Beginning Postsecondary Students Longitudinal Study (BPS:12/17) full-scale collection (OMB# 1850-0631, v. 10).

2 Dates are shown to reflect the general schedule, but are subject to change as needed.

3 These privileges are set up and carefully controlled by the Institute of Education Sciences (IES) Chief Technology Officer (CTO). This service has been designed specifically for the secure transfer of electronic files containing personally identifiable information (PII), such as social security numbers (SSNs), names, and dates of birth of study sample members along with their survey IDs (different from the IDs used in the restricted-use data files), that is, data protected under the Privacy Act or otherwise posing a risk of disclosure. This service can be used for NCES-to-Contractor, Contractor-to-Subcontractor, Subcontractor-to-Contractor, and Contractor-to-Other-Agency data transfers. Matching of sample members to the external datasets will proceed as follows: (a) the B&B:08/18 contractor creates a password-protected, encrypted file containing sample members' PII; (b) the contractor uploads the electronic file to the file transfer system for pick-up by the external data source; and (c) the external data source matches the data against its database and provides the matched data on the file transfer system for secure download. The party uploading data into the file transfer system is responsible for deleting the file(s) after the successful transfer has been confirmed. Software has been developed to create the files for the merge and to read the data received. All matching processes are initiated by the B&B:08/18 data collection staff providing a file with one record per sample member to be merged.

4 Items from the following studies have been added to the B&B:08/18 full-scale survey: Current Population Survey (OMB# 1220-0100), and the 2012-13 Teacher Follow-up Survey (TFS) of the Schools and Staffing Survey (SASS) (OMB# 1850-0598).


5 Excluding the respondents in the mini-PAPI version who complete the survey via mail (n=30).

6 With the exception of the lower rate of résumé submission among mini-PAPI respondents.

7 With the exception of the lower rate of résumé submission among mini-PAPI respondents.

8 For tailoring, see Lynn 2016 and Tourangeau et al. 2010. For sponsorship, see Avdeyeva and Matland 2013; Edwards et al. 2014; and Groves et al. 2012. For mini-PAPI, see: Biemer et al. 2016; Galesic and Bosnjak 2009; and Messer and Dillman 2011.

9 A $2 bill is also considered unique and should therefore capture the sample member's interest more than other cash incentives would.

10 It is worth noting that these studies lack a true control condition; that is, no web condition was followed up with web only. The mode switch was from web to mail, or vice versa; hence, the number of contact attempts and the mode switch are potentially confounded. B&B:08/18 staff will extrapolate what the response rates would have been had the mini survey not been offered, and compare the pure mini survey effect to that simulation. This allows us to distinguish the mini survey effect from the mode switch effect.

