
2016/17 BACCALAUREATE AND BEYOND (B&B:16/17) MAIN STUDY




Supporting Statement Part B

OMB # 1850-0926 v.6







Submitted by

National Center for Education Statistics

U.S. Department of Education












April 2017

revised November 2017







  1. Collection of Information Employing Statistical Methods

This request is to conduct the 2016/17 Baccalaureate and Beyond Longitudinal Study (B&B:16/17) and B&B:16/20 field test panel maintenance activities. B&B:16/17 is the first follow-up of sample members from the 2015–16 National Postsecondary Student Aid Study (NPSAS:16) who were baccalaureate recipients during the 2015–16 academic year. For details on the NPSAS:16 sample and main study design, see the NPSAS:16 Full Scale (OMB# 1850-0666 v. 15-19) Supporting Statement Part B. B&B cohorts prior to B&B:16 are approved under OMB# 1850-0729, while the B&B:16 cohort is approved under OMB# 1850-0926.

    1.1 Respondent Universe – B&B:16/17 Target Population

The target population for the B&B:16/17 main study includes all eligible NPSAS:16 sample members who completed requirements for the bachelor's degree at NPSAS-eligible institutions during the 2015–16 academic year (that is, between July 1, 2015 and June 30, 2016) and were awarded the baccalaureate degree by the institution no later than June 30, 2017. Each student in the B&B sample has a known and well-defined probability of selection, and, through the institution awarding the degree, each completer has exactly one linkage to the B&B sampling frame.

    1.2 Statistical Methodology – B&B:16/17 Sample Design

Eligibility for the B&B:16 cohort will be based primarily on information obtained from the respondent's NPSAS interview. If the sample member lacks a NPSAS base-year interview, eligibility will be based on the enrollment list provided by the NPSAS institution at the time of NPSAS student sampling. Questions are included in both the B&B:16/17 screener and the main study interview to determine when degree requirements were met and when the degree was received, which will ultimately determine eligibility. The B&B sample consists of three groups of sample members based on their NPSAS response status: (1) NPSAS study members1 who responded to the NPSAS interview, (2) NPSAS study members who did not respond to the NPSAS interview, and (3) NPSAS non-study members, i.e., sample members lacking enough information from the NPSAS interview and administrative collections to qualify as NPSAS respondents. The latter two groups will begin eligibility screening at the same time as the start of main data collection; if deemed eligible by the screener, these study members will begin main data collection after a brief (1-2 week) suspension of activities. Given their lack of base-year data, non-study members will not be fielded irrespective of eligibility, but those found ineligible will be removed from the cohort.

Historically, NPSAS-based longitudinal studies such as B&B have included a small percentage of sample members who were non-study members in NPSAS. This sampling strategy has generated analysis issues in the follow-up studies when these sample members respond to follow-up interviews without having base-year data. Nonresponse bias analysis from the previous B&B cohort suggests that very little bias is associated with this group as a whole; we are therefore including in the sample all NPSAS:16 non-study members who are potentially eligible for B&B but not fielding those cases, that is, not moving them to data collection. NPSAS:16 non-study members (n=1,352) will be counted as B&B:16/17 nonrespondents unless determined cohort-ineligible during screening. Table 1 shows the distribution of the potentially eligible B&B frame.

Table 1. B&B:16/17 main study frame by NPSAS:16 study member status

NPSAS:16 Main Study Status     Count
Total                         33,701
Study member                  32,349
Non-study member               1,352

The B&B:16/17 main study fielded sample will consist only of individuals who are NPSAS study members (the first group in Table 1 above). These individuals did not have to respond to the NPSAS interview to be study members, as described above. Table 2 shows the distribution of the NPSAS study members who are potentially eligible for B&B by their NPSAS interview response status.

Table 2. B&B:16/17 main study fielded frame by NPSAS:16 interview response

NPSAS:16 interview response status                                             Count
Total study members                                                           32,349
Interview respondent: baccalaureate receipt confirmed in NPSAS:16 interview   22,539
Interview nonrespondent: listed as potential baccalaureate recipient           9,810
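
The grouping logic above reduces to a simple classification. The sketch below is a hypothetical summary with illustrative field names, not actual NPSAS:16 data elements, showing how a frame member's NPSAS:16 status determines its B&B:16/17 handling:

    # Hypothetical sketch of the fielding rules described above; the field
    # names are illustrative, not actual NPSAS:16 data elements.
    from dataclasses import dataclass

    @dataclass
    class FrameMember:
        study_member: bool          # met the NPSAS:16 study-member criteria
        interview_respondent: bool  # responded to the NPSAS:16 interview

    def fielding_group(m: FrameMember) -> str:
        """Return the B&B:16/17 handling of one NPSAS:16 sample member."""
        if not m.study_member:
            # Non-study members stay in the cohort but are never fielded; they
            # count as nonrespondents unless screening shows them ineligible.
            return "not fielded (nonrespondent unless screened ineligible)"
        if m.interview_respondent:
            return "fielded with certainty"
        # Interview nonrespondents are subsampled at 50 percent (see below).
        return "eligible for the 50 percent subsample"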

In the B&B:16/17 field test, 95% of sample members who confirmed B&B eligibility in the NPSAS interview also confirmed being eligible in the B&B interview. Therefore, we will include all NPSAS:16 main study interview respondents in the B&B:16/17 sample. In the B&B:16/17 field test, 24% of sample members who were NPSAS interview nonrespondents confirmed in the B&B interview that they were not eligible for B&B. To assist in identifying eligible cohort members, a concordance analysis was conducted using NPSAS:16 student records, the National Student Clearinghouse (NSC) data on degree completion, and the National Student Loan Data System (NSLDS) data on enrollment status and exit counseling for students with federal loans. The results indicated that these sources can be used to identify eligible students, but cannot identify ineligible students with certainty. We therefore will use these data for implicit stratification as described below.

To achieve full population coverage in the B&B sample, some portion of the NPSAS:16 interview nonrespondents who are potentially eligible for B&B will be selected. We will subsample half of the NPSAS:16 interview nonrespondents (n=9,810), resulting in 4,905 interview nonrespondents being included in the sample. A 50% subsampling rate will help alleviate the unequal weighting that resulted from the 10% selection used for prior B&B cohorts. With this new approach, the design weight associated with these sample members will be approximately doubled, which should not result in extreme weights. The subsample will be a sequential selection with probability proportional to the NPSAS:16 student base weight, to maximize eligibility and to allow different data collection protocols for individuals located2 or not located in NPSAS. The 9,810 NPSAS:16 interview nonrespondents will be explicitly stratified by their "located in NPSAS" flag and then implicitly stratified (sorted) based on the administrative sources available at the time of sampling. Additionally, within the administrative data strata, the NPSAS:16 interview nonrespondents will be implicitly stratified by institution sector to ensure sector representation in the sample. The subsample will be drawn with probabilities proportional to the NPSAS:16 sampling weight, and within sector the NPSAS:16 interview nonrespondents will be sorted by this weight. The sample sizes for the explicit strata will be determined proportional to the sum of the NPSAS:16 student base weights associated with each stratum.
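
As an illustration of this subsampling step, the sketch below implements a generic systematic PPS selection within explicit strata on a sorted (implicitly stratified) frame. It is a stand-in for the sequential PPS procedure described above rather than the actual selection program: the column names are assumptions, and the allocation shown uses stratum counts where the study allocates proportionally to summed base weights.

    # Generic systematic PPS subsampling sketch; column names are assumptions.
    import numpy as np
    import pandas as pd

    def pps_systematic(frame: pd.DataFrame, n: int,
                       size_col: str = "base_weight",
                       rng: np.random.Generator | None = None) -> pd.DataFrame:
        """Select n units with probability proportional to size_col."""
        if rng is None:
            rng = np.random.default_rng()
        sizes = frame[size_col].to_numpy(dtype=float)
        cum = np.cumsum(sizes)
        interval = cum[-1] / n                 # skip interval on the size scale
        points = rng.uniform(0, interval) + interval * np.arange(n)
        # Map each hit point to the unit whose cumulative size covers it;
        # assumes no unit's weight exceeds the interval (no certainty units).
        return frame.iloc[np.searchsorted(cum, points)]

    def subsample_nonrespondents(frame: pd.DataFrame, rate: float = 0.5) -> pd.DataFrame:
        picks = []
        # Explicit strata: the NPSAS:16 "located" flag. Implicit stratification
        # comes from sorting within strata before the systematic pass.
        for _, stratum in frame.groupby("located_in_npsas"):
            stratum = stratum.sort_values(["admin_source", "sector", "base_weight"])
            n = round(rate * len(stratum))     # count-proportional allocation
            picks.append(pps_systematic(stratum, n))
        return pd.concat(picks)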

Table 3 shows the distribution and expected sample sizes of NPSAS:16 interview nonrespondents by the NPSAS:16 located flag; this is a proportional allocation such that the overall subsampling rate is 50%. The total fielded sample size will be 27,444 and the total sample size will be 28,796. Table 4 shows the distribution of the B&B:16/17 main study sample.

Table 3. Distribution of B&B-eligible NPSAS:16 interview nonrespondents by NPSAS:16 located flag

NPSAS:16 located status     Count    B&B:16/17 expected sample size
Total                       9,810    4,905
Not located in NPSAS:16     2,438    1,219
Located in NPSAS:16         7,372    3,686

Table 4. Sample sizes for the B&B:16/17 main study

Sample group                           Count
Total                                 28,796
NPSAS:16 study member                 27,444
    NPSAS:16 interview respondent     22,539
    NPSAS:16 interview nonrespondent   4,905
NPSAS:16 non-study member1             1,352

1 NPSAS:16 non-study members will not be fielded in B&B:16/17 and will be counted as B&B:16/17 nonrespondents.


For the Confidentiality Pledge experiment, introduced in Part A and described below, sample members will be assigned at random to one of six groups prior to the start of all data collection activities. Five experimental groups will have sample sizes of 1,733 each (total N=8,665), while the control group will include all remaining fielded sample members (N=18,779). Although group assignment will be made prior to the start of data collection, the group sample sizes have been adjusted to compensate for the expected amount of attrition due to ineligibility. The expected eligibility and response rates for the fielded sample are shown in Table 5. The final sample sizes for the B&B:16/17 main study data collection and pledge experiment are provided in Table 6.

Table 5. Expected B&B:16/17 main study eligibility and response rates by base-year response status among fielded sample

                                            Expected eligible      Expected respondents
                                   Total    Number      Rate       Number      Rate
Total NPSAS:16 study members       27,444   23,602      86%        20,534      87%
NPSAS:16 interview respondent      22,539   20,285      90%        19,068      94%
NPSAS:16 interview nonrespondent    4,905    3,434      70%         1,545      45%

Table 6. Sample sizes for the B&B:16/17 main study data collection and pledge experiment

                                               Pledge wording and placement
                                      Control               "Homeland Security"      "Federal Staff and Contractors"
Data Collection Protocol              Login/     Separate   Login/       Separate    Login/       Separate         Total
                                      direct link  page     direct link    page      direct link    page
Total                                 18,779     1,733      1,733        1,733       1,733        1,733           27,444
Group 1 – Nonlocated nonrespondents      834        77         77           77          77           77            1,219
Group 2 – Located nonrespondents       4,755       431        431          431         431          431            6,910
Group 3 – Late respondents             5,008       465        465          465         465          465            7,333
Group 4 – Early respondents            8,182       760        760          760         760          760           11,982


    1.3 Methods for Maximizing Response Rates

Achieving high response rates in the B&B:16/17 main study data collection will depend on successfully identifying and locating sample members and being able to contact them and gain their cooperation. The following sections outline methods for maximizing response to the B&B:16/17 student survey.

      1.3.1 Tracing of Sample Members

To yield the maximum number of locates at the least expense, we designed an integrated tracing approach with the following elements.

  • Advance tracing activities, which will occur prior to the start of data collection, include initial batch database searches, such as against the National Change of Address database, for cases with sufficient contact information to be matched. To handle cases for which contact information is invalid or unavailable, B&B staff will conduct additional advance tracing through proprietary interactive databases to expand on leads found.

  • Hard copy mailings and emails will be used to maintain ongoing contact with sample members prior to and throughout data collection. At the start of data collection, initial contact letters will request that prior-round nonrespondents, including non-study members, update their contact information and complete the eligibility screener; a follow-up email will be sent approximately 2 weeks after the initial letter to remind them to respond. Also at the start of data collection, a letter will be sent to study members (both prior-round respondents and nonrespondents) announcing the start of data collection. The announcement will include a toll-free number, the study website address, and a Study ID and password, and will request that sample members complete the web survey. After the data collection announcement mailing, an email message mirroring the letter will be sent.

  • The telephone locating and interviewing stage includes calling all available telephone numbers and following up on leads provided by parents and other contacts.

  • The pre-intensive batch tracing stage consists of the LexisNexis SSN and Premium Phone batch searches that will be conducted between the telephone locating and interviewing stage and the intensive tracing stage.

  • Once all known telephone numbers are exhausted, a case will move into the intensive tracing stage during which tracers will conduct interactive database searches using all known contact information for a sample member. During the B&B:16/17 field test, about 91 percent of sample members who reached the intensive tracing stage were located, and about 17 percent of those located responded to the interview. With interactive tracing, a tracer assesses each case on an individual basis to determine which resources are most appropriate and the order in which each should be used. Sources that may be used, as appropriate, include credit database searches, such as Experian, various public websites, and other integrated database services.

  • Other locating activities will take place as needed, including a LexisNexis email search conducted for nonrespondents toward the end of data collection.

      1.3.2 Training for Data Collection Staff

Telephone data collection will be conducted at the contractor’s call center. B&B staff at the call center will include Performance Team Leaders (PTLs) and Data Collection Interviewers (DCIs). Training programs for these staff members are critical to maximizing response rates and collecting accurate and reliable data.

Performance Team Leaders, who are responsible for all supervisory tasks, will attend project-specific training for PTLs, in addition to the interviewer training. They will receive an overview of the study, background and objectives, and the data collection instrument through a question-by-question review. PTLs will also receive training in the following areas: providing direct supervision during data collection; handling refusals; monitoring interviews and maintaining records of monitoring results; problem resolution; case review; specific project procedures and protocols; reviewing reports generated from the ongoing Computer Assisted Telephone Interviewing (CATI); and monitoring data collection progress.

Training for DCIs is designed to help staff become familiar with and practice using the CATI case management system and the survey instrument, as well as to learn project procedures and requirements. Particular attention will be paid to quality control initiatives, including refusal avoidance and methods to ensure that quality data are collected. DCIs will receive project-specific training on telephone interviewing and answering questions from web participants regarding the study or related to specific items within the interview. At the conclusion of training, all B&B call center staff must meet certification requirements by successfully completing a certification interview. This evaluation consists of a full-length interview with project staff observing and evaluating interviewers, as well as an oral evaluation of interviewers’ knowledge of the study’s Frequently Asked Questions.

      1.3.3 Case Management System

Interviews will be conducted using a single web-based survey instrument for both web (including mobile devices) and CATI data collection. Data collection activities will be managed through a CATI case management system equipped with numerous capabilities, including: online access to locating information and histories of locating efforts for each case; a questionnaire administration module with full "front-end cleaning" capabilities (i.e., editing as information is obtained from respondents); a sample management module for tracking case progress and status; and an automated scheduling module that delivers cases to interviewers. The automated scheduling module incorporates the following features:

  • Automatic delivery of appointment and call-back cases at specified times. This reduces the need for tracking appointments and helps ensure the interviewer is punctual. The scheduler automatically calculates the delivery time of the case in reference to the appropriate time zone.

  • Sorting of non-appointment cases according to parameters and priorities set by project staff. For instance, priorities may be set to give first preference to cases within certain sub-samples or geographic areas; cases may be sorted to establish priorities between cases of differing status. Furthermore, the historic pattern of calling outcomes may be used to set priorities (e.g., cases with more than a certain number of unsuccessful attempts during a given time of day may be passed over until the next time period). These parameters ensure that cases are delivered to interviewers in a consistent manner according to specified project priorities.

  • Restriction on allowable interviewers. Groups of cases (or individual cases) may be designated for delivery to specific interviewers or groups of interviewers. This feature is most commonly used in filtering refusal cases, locating problems, or foreign language cases to specific interviewers with specialized skills.

  • Complete records of calls and tracking of all previous outcomes. The scheduler tracks all outcomes for each case, labeling each with type, date, and time. These are easily accessed by the interviewer upon entering the individual case, along with interviewer notes.

  • Flagging of problem cases for supervisor action or supervisor review. For example, refusal cases may be routed to supervisors for decisions about whether and when a refusal letter should be mailed, or whether another interviewer should be assigned.

  • Complete reporting capabilities. These include default reports on the aggregate status of cases and custom report generation capabilities.

The integration of these capabilities reduces the number of discrete stages required in data collection and data preparation activities and increases capabilities for immediate error reconciliation, which results in better data quality and reduced cost. Overall, the scheduler provides an efficient case assignment and delivery function by reducing supervisory and clerical time, improving execution on the part of interviewers and supervisors by automatically monitoring appointments and call-backs, and reducing variation in implementing survey priorities and objectives.
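
As a rough illustration of priority-based case delivery, the hypothetical sketch below pops the highest-priority case whose scheduled time has arrived for an allowed interviewer group; it is a sketch of the general technique, not the contractor's actual scheduler logic.

    # Hypothetical priority-queue sketch of automated case delivery.
    import heapq
    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass(order=True)
    class Case:
        priority: int                  # lower value = delivered sooner
        due: datetime                  # appointment or earliest-callback time
        case_id: str = field(compare=False, default="")
        allowed_group: str = field(compare=False, default="any")

    def next_case(queue: list[Case], interviewer_group: str,
                  now: datetime) -> Case | None:
        """Deliver the best case that is due and allowed for this interviewer."""
        deferred, picked = [], None
        while queue:
            case = heapq.heappop(queue)
            # Appointments are held until their time; refusal or language cases
            # can be restricted to specialist interviewer groups.
            if case.due <= now and case.allowed_group in ("any", interviewer_group):
                picked = case
                break
            deferred.append(case)
        for c in deferred:             # put skipped cases back in the queue
            heapq.heappush(queue, c)
        return picked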

      1.3.4 Survey Instrument Design

The B&B:16/17 interview employs a web-based instrument and deployment system, which has been in use since NPSAS:08. The system provides multimode functionality that can be used for self-administration, including on mobile devices, CATI, Computer-Assisted Personal Interview (CAPI), or data entry. In December 2016, the B&B:16/17 interview, provided in appendix F, was reviewed by the B&B:16/17 technical review panel (TRP), which made recommendations for revisions and updates.

In addition to the functional capabilities of the case management system and web instruments described above, our efforts to achieve the desired response rate will include using established procedures proven effective in other large-scale studies we have completed. These include:

  • Providing multiple response modes, including mobile-friendly self-administered and interviewer-administered options.

  • Offering incentives to encourage response (see incentive structure described in Section 4, Tests of Procedures and Methods).

  • Assigning experienced CATI interviewers who have proven their ability to contact and obtain cooperation from a high proportion of sample members.

  • Training the interviewers thoroughly on study objectives, study population characteristics, and approaches that will help gain cooperation from sample members.

  • Maintaining a high level of monitoring and direct supervision so that interviewers who are experiencing low cooperation rates are identified quickly and corrective action is taken.

  • Making every reasonable effort to obtain an interview at the initial contact, while allowing respondents flexibility in scheduling interview appointments.

  • Thoroughly reviewing all refusal cases and making special conversion efforts whenever feasible (see next section).

At the request of the Federal Interagency Working Group on Improving Measurement of Sexual Orientation and Gender Identity in Federal Surveys (SOGI), a change was made to the survey instrument beginning in November 2017. Text boxes requesting elaboration of responses were added to specific response options for items BB17FGENDER and BB17FLGBTQ (see appendix F). In addition, the following help text was added to explain the purpose of the text box:

“If you are presented with a Please describe textbox, please define how you would self-describe your gender if the selected response option is not an exact match with how you identify your [gender (in BB17FGENDER) / sexual orientation (in BB17FLGBTQ)]. Text responses will not be released. The data are being collected to better inform relevant response options in future surveys.”

No new items or categories were added to the survey. The request for elaboration is intended to assist the SOGI working group with additional information on how gender identity and sexual orientation questions should be asked. Responses from these additional text boxes will not be disseminated beyond the SOGI working group.

      1.3.5 Refusal Aversion and Conversion

Recognizing and avoiding refusals is important to maximizing the response rate. We will emphasize this and other topics related to obtaining cooperation during interviewer training. PTLs will monitor interviewers closely during the early days of outbound calling and provide retraining as necessary. In addition, supervisors will review daily interviewer production reports produced by the CATI system to identify and retrain any data collectors who are producing unacceptable numbers of refusals or other problems.

Refusal conversion efforts will be delayed for at least one week to give sample members time to reconsider after an initial refusal. Attempts at refusal conversion will not be made with individuals who become verbally aggressive or who threaten to take legal or other action. Refusal conversion efforts will not be conducted to a degree that would constitute harassment; we will respect a sample member's right to decide not to participate and will not infringe upon that right by carrying conversion efforts beyond the bounds of propriety.

In November 2017, as a form of refusal conversion, we will begin offering the abbreviated interview to any sample members (across all data collection groups) who refuse participation because of the interview’s length.

    1.4 Tests of Procedures and Methods

The sampling approach for the B&B:08/09 main study (see https://nces.ed.gov/pubs2014/2014041.pdf) included a subsample of approximately 10 percent of the NPSAS base-year interview nonrespondents among the potential B&B-eligible cases. For B&B:16/17, we wanted to explore the feasibility of increasing the subsampling rate in order to minimize sampling weight variation. Therefore, in the B&B:16/17 field test (FT), we tested whether it was possible to increase response rates and minimize nonresponse bias without increasing nonresponse variance. The B&B:16/17 FT sample comprised all NPSAS:16 FT base-year interview respondents and nonrespondents. We separated the sample into four groups targeted for data collection protocols of different intensities: two groups of base-year interview nonrespondents and two groups of base-year interview respondents. The B&B:16/17 main study data collection design described below continues this increase in the sample size and subsampling rate of nonrespondents, revised based on the B&B:16/17 FT results of applying different data collection protocols to the different base-year respondent types.

      1.4.1 Summary of B&B:16/17 FT Data Collection Design and Results

At the start of the B&B:16/17 FT, the sample was split into four groups based on response behavior during the NPSAS:16 FT interview and the intensity of data collection efforts to be applied during the B&B:16/17 FT. NPSAS:16 FT nonrespondents were randomly assigned to either an aggressive data collection protocol (Group 1) or a default/standard data collection protocol (Group 2). NPSAS:16 FT interview respondents were first separated according to when they responded to the base-year interview. Late respondents, who completed the interview after the first 3 weeks of data collection, were also assigned to the default protocol (Group 3), while early respondents, who completed the interview within the first 3 weeks, were assigned to a "relaxed" protocol involving only mailed and emailed communications (Group 4). All groups received an initial email and letter followed by reminder emails and postcards throughout data collection, but the use of a prepaid incentive, CATI prompting, and an abbreviated interview differed by protocol (aggressive, default, or relaxed). Table 7 summarizes the B&B:16/17 FT design.

Table 7. B&B:16/17 field test data collection protocols by data collection phase

Group 1: NPSAS:16 FT nonrespondents, aggressive protocol
Group 2: NPSAS:16 FT nonrespondents, default protocol
Group 3: NPSAS:16 FT late respondents, default protocol
Group 4: NPSAS:16 FT early respondents, relaxed protocol

Early Completion phase (4 weeks)
  Group 1: $10 prepaid incentive; initial letter and email; begin outbound calling after 2 weeks
  Group 2: No prepaid incentive; initial email and letter; email reminders
  Group 3: No prepaid incentive; initial email and letter noting past participation; email reminders
  Group 4: No prepaid incentive; initial email and letter noting past participation; email reminders

Production phase (10 weeks)
  Group 1: Frequent email reminders; postcard reminders; abbreviated interview
  Group 2: Begin outbound calling; frequent email reminders
  Group 3: Begin "light CATI"* outbound calling; frequent email reminders
  Group 4: No outbound calling; frequent email reminders

Nonresponse Conversion phase (4 weeks)
  Group 1: Frequent email reminders; postcard reminders
  Group 2: Frequent email reminders; postcard reminders; abbreviated interview
  Group 3: Frequent email reminders; postcard reminders; abbreviated interview
  Group 4: Frequent email reminders; postcard reminders

Incentive amount
  Group 1: $10 prepaid incentive + $20 paid upon interview completion
  Group 2: $30 incentive paid upon interview completion
  Group 3: $30 incentive paid upon interview completion
  Group 4: $20 incentive paid upon interview completion

* Outbound calling is considered "light CATI" when a minimal number of phone calls is placed to sample members, intended mainly to prompt web response rather than to obtain a telephone interview. During the B&B:16/17 FT, these individuals received approximately half as many calls as under the default CATI protocols.

Comparison of response rates by protocol group showed that Group 1, interview nonrespondents who experienced the aggressive protocol, had a significantly higher response rate (37%) than Group 2, interview nonrespondents given the default protocol (25%; t(2,097) = 3.52, p <.001).

To assess whether the prepaid incentive and the addition of telephone as a survey mode increased response, response rates at the end of the first phase of data collection were compared for Group 1, which received both a prepaid incentive and attempts to interview by telephone, and Group 2, which received only a promised incentive and no telephone interviewing. Compared to Group 2 (4.4%), the response rate in Group 1 was almost twice as high (8.4%) at the end of the early completion phase (t(2,097) = 2.29, p < .05). About 26% of Group 1 respondents participated by telephone in the early completion phase. Overall, 21% of those in Group 1 offered the prepaid incentive accepted it and, of those, 34% completed the interview. However, because the prepaid incentive and the telephone mode option both occurred in phase 1 of the data collection, their separate effects cannot be evaluated.

Introduction of the abbreviated interview during the production phase resulted in a significantly higher response rate for Group 1 (22.7%) compared to Group 2 (12.1%; t(2,097) = 3.67, p < 0.001), but results are conditional on the outcomes in the early completion phase.3 Even though the abbreviated interview was also offered to Group 2 later in data collection, it did not have the same effect; again, there is a dependency in the nonresponse conversion phase based on what happened in the previous two phases.

The default protocol used with Group 3 NPSAS:16 FT respondents, who responded later in the base-year FT data collection, resulted in a 70% response rate to the B&B:16/17 FT survey. Group 4, early interview respondents, yielded a 75% response rate with the relaxed protocol.4 A naïve significance test between Groups 3 and 4 shows that the response rates are significantly different at an alpha level of 0.05 (t(2,097) = 2.08).
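
The group comparisons reported here take the standard two-proportion form. The sketch below computes a simple-random-sampling approximation of such a test; the published statistics likely account for the survey design, and the counts in the example are invented for illustration.

    # Two-proportion test sketch (SRS approximation); example counts invented.
    import math

    def two_prop_test(r1: int, n1: int, r2: int, n2: int) -> tuple[float, float]:
        """Return (test statistic, two-sided normal p-value)."""
        p1, p2 = r1 / n1, r2 / n2
        pooled = (r1 + r2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Invented counts in the spirit of the Group 1 vs. Group 2 comparison
    # (response rates near 37 percent and 25 percent):
    stat, p = two_prop_test(r1=388, n1=1050, r2=262, n2=1049)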

Based on administrative frame data, nonresponse bias analyses were conducted for sex, age, institutional sector of the NPSAS institution, geographic region of the bachelor's degree institution, and total enrollment counts. Very little nonresponse bias was observed. While the average absolute relative bias in mean statistics was larger for Groups 1 and 2 (the NPSAS:16 FT nonrespondents) than for Groups 3 and 4, only one of 23 indicators in Group 1 yielded a statistically significant difference, confirming the presence of little nonresponse bias. Because Groups 1 and 2 potentially brought in more reluctant respondents who might not have been as conscientious in completing the survey, we would have expected a decline in data quality. However, the number of undergraduate and postbaccalaureate postsecondary institutions, the number of employers, and whether respondents had any dependents did not differ when comparing Group 1 to Group 2, or when comparing Groups 1 and 2 to Groups 3 and 4.5 These results are reassuring in that increased data collection efforts in lower response propensity strata do not seem to decrease data quality.
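
The relative bias measure used in such analyses can be illustrated with a short sketch comparing respondent means to full-sample means on a frame variable; the column names are assumptions.

    # Sketch of absolute relative bias for one frame variable.
    import pandas as pd

    def abs_relative_bias(frame: pd.DataFrame, respondent_flag: str,
                          var: str) -> float:
        """|mean(respondents) - mean(full sample)| / |mean(full sample)|."""
        full_mean = frame[var].mean()
        resp_mean = frame.loc[frame[respondent_flag] == 1, var].mean()
        return abs(resp_mean - full_mean) / abs(full_mean)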

For detailed results of the B&B:16/17 FT experiments, see Appendix C.

      1.4.2 B&B:16/17 Main Study Data Collection Design

Confidentiality Pledge Wording and Placement Experiment. In July 2015, the Homeland Security Act of 2002 was amended to require the Department of Homeland Security (DHS) to monitor federal agency information systems, including survey data transmissions. As a result, the confidentiality pledge cited to sample members in recruitment materials and at the start of each data collection instrument must be updated to reflect the change. Throughout 2016, cognitive testing was conducted to determine wording for the pledge that communicates the purposes of the legislation. Testing yielded three wording versions (shown below). We will now conduct an experiment in B&B:16/17 to determine the effect of the three versions of the confidentiality pledge on sample members' willingness both to initially choose to participate in (participation rate) and to ultimately continue through and complete (response rate) the screener and the survey. In addition, the experiment will determine whether the placement of the confidentiality pledge, either (A) on the first screen encountered (Login Screen Group), together with the Paperwork Reduction Act (PRA) statement and the study authorization citation, or (B) on a second screen with the pledge wording presented by itself (Separate Pledge Screen), has an effect on participation and/or response rates (see appendix F for the experimental screens). The three confidentiality pledge wording versions are:

Control Group:

All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

Experimental Group 1 (“Homeland Security”):

All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573). By law, anyone who willfully discloses any identifiable information about you or your school is subject to a jail term of up to 5 years, a fine of up to $250,000, or both. Electronic transmission of your information will be monitored for viruses, malware, and other threats by Homeland Security in accordance with the Cybersecurity Enhancement Act of 2015.

Experimental Group 2 ("Federal Staff"):

All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573). By law, anyone who willfully discloses any identifiable information about you or your school is subject to a jail term of up to 5 years, a fine of up to $250,000, or both. Electronic transmission of your information will be monitored for viruses, malware, and other threats by Federal employees and contractors in accordance with the Cybersecurity Enhancement Act of 2015.

Sample members will be assigned to groups at random, prior to the start of data collection, with the large majority in the Control-Login Screen Group (N=18,779). The rest will be distributed equally across the remaining five groups (N=1,733 in each group): Control-Separate Pledge Screen; Homeland Security-Login Screen; Homeland Security-Separate Pledge Screen; Federal Staff-Login Screen; and Federal Staff-Separate Pledge Screen.
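
A minimal sketch of such a fixed-size random assignment, using the group sizes above, might look as follows; the seed and implementation details are assumptions, not the study's actual procedure.

    # Fixed-size random assignment sketch for the six pledge groups.
    import numpy as np

    GROUP_SIZES = {
        "Control-Login": 18779,
        "Control-Separate": 1733,
        "HomelandSecurity-Login": 1733,
        "HomelandSecurity-Separate": 1733,
        "FederalStaff-Login": 1733,
        "FederalStaff-Separate": 1733,
    }

    def assign_pledge_groups(ids: list[str], seed: int = 16170) -> dict[str, str]:
        """Randomly assign each sample ID to a group with the exact sizes above."""
        slots = [g for g, n in GROUP_SIZES.items() for _ in range(n)]
        assert len(ids) == len(slots) == 27444
        order = np.random.default_rng(seed).permutation(len(ids))
        # Shuffle the sample, then fill the group slots in order.
        return {ids[i]: slots[k] for k, i in enumerate(order)}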

Two outcomes are of particular interest. The first, participation rate, will evaluate the willingness of sample members to start the B&B:16/17 interview after having been shown their respective Confidentiality Pledge. Sample members start the interview either by entering a Study ID and password and clicking the Login button on the study website, or by clicking the next button on the Survey Start page reached through links included in individualized emails. The second outcome of interest, response rate, will evaluate the willingness of sample members to continue through to the end of the interview after having been shown the pledge6. Participation and response rates will be compared for pledge wording, placement, and the interaction of pledge wording and placement.

In addition to the Confidentiality Pledge experiment, other features of the B&B:16/17 main study design, many of which have been retained from the field test, are discussed below and summarized in table 8.

Eligibility Screener with Address Update. During the B&B:16/17 FT, 22% of NPSAS:16 FT nonrespondents who participated in B&B:16/17 were determined ineligible by the B&B interview. For the B&B:16/17 main study, at the start of data collection, NPSAS:16 base-year interview nonrespondents, both study members (N=4,905) and non-study members (N=1,352), and base-year abbreviated interview respondents (N=2,005) will be sent an initial letter and email inviting them to complete an address update and eligibility screener either online or by telephone. This first step should result in more efficient locating and earlier identification of ineligible sample members. Those who complete the screener will receive a $10 postpaid incentive paid, at their choice, by check or via PayPal. Requests to complete the screener will be mailed and emailed to sample members at the start of data collection, and a reminder will be sent about two weeks after the initial invitation. Screener data collection will continue for about six weeks.

B&B:16/17 Main Study Data Collection Group Assignments and Protocols. Given the relative success of the aggressive, default, and relaxed protocols observed in the field test, a similar approach will be used in the main study, with minor changes in the groupings of sample members as shown in Table 8.

  • NPSAS:16 interview nonrespondents: All NPSAS:16 nonrespondents will receive the aggressive protocol. In addition, in order to administer appropriate interventions throughout data collection, nonrespondents will be further divided into those who were (Group 2; N=3,686) and were not (Group 1; N=1,219) located in NPSAS:16.

  • NPSAS:16 abbreviated interview respondents: Respondents who completed either version of the abbreviated interview in NPSAS:16 will also receive the aggressive protocol (Group 2; N=3,224).

  • Late NPSAS:16 interview respondents: NPSAS:16 interview respondents who completed their base year interview later in data collection, that is, after the first 3 weeks, will receive the default protocol (Group 3; N=7,333).

  • Early NPSAS:16 interview respondents: NPSAS:16 interview respondents who completed their base year interview within the first 3 weeks will receive the relaxed protocol (Group 4; N=11,982).

Table 8. B&B:16/17 main study data collection protocols by data collection phase

Group 1 (Aggressive protocol): Non-located NPSAS:16 interview nonrespondents
Group 2 (Aggressive protocol): Located NPSAS:16 interview nonrespondents and abbreviated respondents
Group 3 (Default protocol): Late NPSAS:16 interview respondents
Group 4 (Relaxed protocol): Early NPSAS:16 interview respondents

Eligibility screener & address update – First 6 weeks, prior to main data collection1
  Groups 1 and 2: $10 postpaid incentive
  Groups 3 and 4: Not applicable

Early Completion Phase – First 4 weeks of main data collection1
  Group 1: Data collection announcement letter and email; CATI starts 2 weeks after mail outs and continues through all phases
  Group 2: Data collection announcement letter and email offering additional $5 "Early Bird" incentive; CATI starts 2 weeks after mail outs and continues through all phases
  Group 3: Data collection announcement letter and email offering additional $5 "Early Bird" incentive; mode tailoring to the NPSAS:16 completion mode
  Group 4: Data collection announcement letter and email thanking sample members for prior participation; mode tailoring to the NPSAS:16 completion mode

Production Phase I – Next <3 months
  Groups 1 and 2: Postcard reminders
  Group 3: "Light" CATI outbound calling begins
  Group 4: "Light" CATI outbound calling begins 2 weeks after start of phase

Production Phase II – Next 3 months
  Groups 1 and 2: Abbreviated interview offered; postcard reminders; continued CATI interviewing/locating efforts
  Groups 3 and 4: Postcard reminders; abbreviated interview offered to potential refusals

Nonresponse Conversion Phase – Final month
  Groups 1 and 2: Continued CATI interviewing/locating efforts; postcard reminders
  Groups 3 and 4: Postcard reminders; abbreviated interview offered to pending cases on February 19

Total postpaid incentive
  Group 1: $55 (+$10 screener)
  Group 2: $50 + $5 (+$10 screener)
  Group 3: $30 + $5
  Group 4: $30

Note: In addition to the contacts shown, all groups will receive regular email and, with permission, text message reminders.

1 Main data collection begins after the 6-week screener period for base-year nonrespondents and immediately upon OMB clearance for base-year respondents. Main data collection will end at the same time for both groups; therefore, the duration of Production Phase I will be adjusted for base-year nonrespondents to ensure that Production Phase II and the Nonresponse Conversion Phase have sufficient time.


Early Bird Incentive. Early bird incentives have been shown to lead to faster response and increased participation rates within the early bird incentive period (e.g., LeClere et al. 2012; Coppersmith et al. 2016), and can provide efficiencies by reducing both data collection costs and time. Given this, rather than continuing to offer a $10 prepaid incentive as part of the aggressive protocol, sample members in Groups 2 and 3 will be offered the opportunity to increase their total incentive by $5 for responding within the first three weeks of data collection, to $55 for located base-year nonrespondents and to $35 for base-year late respondents. Because Group 1 sample members may not be located until well after the B&B:16/17 early bird phase ends, they will not receive the same early bird incentive but will instead be offered an incentive increased by $5 throughout the entire data collection period. Sample members in Group 4 will not receive the early bird incentive because, as shown in the B&B:16/17 FT, they generally needed the least prompting to participate.

Other data collection incentives. As described above, instead of a prepaid or early bird incentive, Group 1 sample members will be offered an overall incentive of $55 upon survey completion. This amount matches that offered to all base-year nonrespondents in the B&B:08/09 FT, which achieved an overall response rate of 44.0% for the equivalent group. In addition to the $5 early bird incentive, sample members in Group 2 will receive $50 upon survey completion (a total of $55 for an early response). This amount matches what was offered to equivalent groups in previous data collections with the B&B:08 cohort. For Group 3, we recommend maintaining the same $30 incentive level used in the B&B:16/17 FT and in the NPSAS:16 main study. In addition to the $30, Group 3 sample members will receive the additional $5 early bird incentive, for a total of $35, if they complete the survey within the first three weeks of data collection. Although Group 4 sample members were prompted least during the B&B:16/17 FT and still achieved a 75.1% response rate, that rate is considerably lower than the one observed during the B&B:08/09 field test (80.9% among all base-year interview respondents). The primary difference between the two collections was the incentive amount offered ($20 for B&B:16/17 compared to $30 for B&B:08/09), so, in order to maximize the possible response rate for Group 4, its sample members will be offered $30. Notably, these same sample members received $30 for completing the NPSAS:16 interview one year earlier.

Mode tailoring. Leverage-saliency theory and social exchange theory suggest that offering a person the mode they prefer, e.g., by telephone or the web, increases their likelihood of participating (Groves et al. 2000; Dillman et al. 2014). These theories are supported by empirical evidence that offering people their preferred mode speeds up their response and is associated with higher participation rates (e.g., Olson et al. 2012). With the NPSAS:16 interview completion mode as a proxy for mode preference, during the B&B:16/17 main study early completion phase, Groups 3 and 4 will be approached in their NPSAS:16 mode. Specifically, while all NPSAS:16 interview respondents will receive identical data collection announcement letters and emails, members of Groups 3 and 4 who completed the NPSAS:16 interview by telephone (N=355) will be approached by telephone from the start of data collection. Likewise, those who completed the NPSAS:16 main study interview online will not be contacted by telephone before a preassigned outbound data collection date.

Light CATI outbound calling. Anecdotally, the introduction of light, or less intense, CATI interviewing in the B&B:16/17 FT seemed to increase production phase response rates among Group 3 sample members (35%) compared to Group 4 in the same phase (24%). Light CATI involves a minimal number of phone calls, used mainly to prompt web response, while regular CATI efforts include more frequent phone contact aimed at locating sample members and encouraging their participation. Although these results should be interpreted with caution, because group assignment was not random, the findings are consistent with the literature, which has shown that web surveys tend to have lower response rates than interviewer-administered surveys (e.g., Lozar Manfreda et al. 2008). Attempting to interview sample members by telephone also increases the likelihood of initiating locating efforts sooner when they cannot be located. B&B:16/17 FT results showed higher locate rates in Group 3 (93.7%), which had light CATI, than in Group 4 (77.8%; χ2 = 63.2, p < 0.001), which did not. For the B&B:16/17 main study data collection, light CATI will be used with both Groups 3 and 4 once CATI begins in Production Phase I, the first half of the 6-month production phase.

Abbreviated Interviews. Obtaining responses from all sample members in a data collection is important for assessing and improving sample representativeness (e.g., Kreuter et al. 2010). During the B&B:16/17 FT data collection, sample members in Group 1, who were offered the abbreviated interview during the production phase, responded at higher rates (22.7%) than those in Group 2, who were not offered the abbreviated interview at the same time (12.1%; t(2,097) = 3.67, p < 0.001). An abbreviated interview option will be offered to all sample members in the B&B:16/17 main study data collection. For pending cases in Groups 1 and 2, it will be offered on January 8, 2018, during Production Phase II (the latter half of the production phase of data collection). For pending cases in Groups 3 and 4, the abbreviated interview offer will be made on February 19, 2018. Further, for sample members who express concerns about the length of the interview, the abbreviated interview offer will be made ahead of schedule, beginning in November 2017, based on their refusal to complete the full survey.

B&B:16/17 Confidentiality Pledge Experiment Research Questions. As described above, there are two outcomes of interest in the pledge experiment. The participation rate outcome will measure the willingness of sample members to start the B&B:16/17 interview, either by entering a Study ID and password and clicking the Login button or by clicking the next button on the Survey Start page reached through a direct link from emails. The response rate outcome will measure sample members' willingness to continue to the end of the interview. Given the design of the pledge experiment, both in terms of wording (Control, Homeland Security, Federal Staff) and placement of the pledge text (on the login/direct link screen or on a separate second screen), we will test the following:

Research question 1.1: Is there a difference in participation rates across the three pledge wording options?

H0: There is no observed difference in likelihood of participation between:

1.1a. The Control and Homeland Security wording options

1.1b. The Control and Federal Staff wording options

1.1c. The Homeland Security and Federal Staff wording options

Research question 1.2: Is there a difference in response rates, given participation, across the three pledge wording options?

H0: There is no observed difference in likelihood of response, given participation, between:

1.2a. The Control and Homeland Security wording options

1.2b. The Control and Federal Staff wording options

1.2c. The Homeland Security and Federal Staff wording options

Research question 2.1: Is there a difference in participation rates across the two text placement options, on the login screen or on a separate screen immediately following login?

H0: There is no observed difference in likelihood of participation between the Login Page and Separate Page options.

Research question 2.2: Is there a difference in response rates, given participation, across the two text placement options, on the login screen or on a separate screen immediately following login?

H0: There is no observed difference in likelihood of response, given participation, between the Login Page and Separate Page options.

Research question 3.1: Is there a difference in participation rates across the six pledge text/placement combinations?

H0: There is no difference in the likelihood of participation, among the pledge text/placement combinations.

Research question 3.2: Is there a difference in response rates across the six pledge text/placement combinations?

H0: There is no difference in response rates among the pledge text/placement combinations.

The minimum differences between the control and treatment groups that would be detectable as statistically significant are shown in table 9.

Table 9. Two-group detectable differences for the pledge text/pledge placement experiment

Hypothesis        Group 1 definition                    N        Group 2 definition                    N        Detectable difference (95% confidence)
1.a               Control text                          20,512   Homeland Security text                3,466    2.3%
1.b               Control text                          20,512   Federal Staff text                    3,466    2.3%
1.c               Homeland Security text                 3,466   Federal Staff text                    3,466    3.0%
2                 Login screen                          22,245   Separate Page screen                  5,199    1.9%
3 (interactions)  Control/Login                         18,779   Any other pledge/placement option     1,733    3.1%
                  Any pledge/placement option
                  other than Control/Login               1,733   Any other pledge/placement option
                                                                 other than Control/Login              1,733    4.3%
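
The detectable differences in table 9 are consistent with a standard two-proportion power calculation. The sketch below reproduces them approximately under two assumptions of ours that the table does not state: a two-sided alpha of .05 at 80 percent power and a baseline rate near 70 percent.

    # Minimum detectable difference sketch; power and baseline rate assumed.
    import math

    def detectable_difference(n1: int, n2: int, p: float = 0.70,
                              z_alpha: float = 1.96, z_beta: float = 0.84) -> float:
        se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        return (z_alpha + z_beta) * se

    for n1, n2 in [(20512, 3466), (3466, 3466), (22245, 5199),
                   (18779, 1733), (1733, 1733)]:
        print(f"{n1:>6} vs {n2:>6}: {detectable_difference(n1, n2):.2%}")
    # Prints roughly 2.36%, 3.08%, 1.98%, 3.22%, 4.36%, within about a tenth
    # of a percentage point of the published 2.3%, 3.0%, 1.9%, 3.1%, 4.3%.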

    1.5 Reviewing Statisticians and Individuals Responsible for Designing and Conducting the Study

The study is being conducted by NCES. The following statisticians at NCES are responsible for the statistical aspects of the study: Mr. Ted Socha, Dr. Tracy Hunt-White, Dr. David Richards, Dr. Sean Simone, Dr. Elise Christopher, Dr. Cleo Redline, and Dr. Gail Mulligan. NCES's prime contractor for B&B:16/17 is RTI International (RTI). The following staff members at RTI are working on the statistical aspects of the study design: Dr. Jennifer Wine, Ms. Jennifer Cooney, Ms. Nicole Tate, Dr. Antje Kirchner, Dr. Erin Dunlop Velez, Dr. T. Austin Lacy, Dr. Emilia Peytcheva, and Mr. Peter Siegel.

Subcontractors include Coffey Consulting; Hermes; HR Directions; Research Support Services; Shugoll Research; and Strategic Communications, Inc. Consultants are Dr. Sandy Baum, Ms. Alisa Cunningham, and Dr. Stephen Porter. Principal professional RTI staff, not listed above, who are assigned to the study include Ms. Donna Anderson, Ms. Gayathri Bhat, Ms. Eva Ebert, Ms. Erin Thomsen, and Ms. Ashley Wilson.

  2. References

Coppersmith, J., Vogel, L.K., Bruursema, T., and Feeney, K. 2016. Effects of Incentive Amount and Type on Web Survey Response Rates. Survey Practice.

Dillman, D.A., Smyth, J.D., and Christian, L.M. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th ed.). Hoboken, NJ: John Wiley & Sons.

Groves, R.M., Singer, E., and Corning, A. 2000. Leverage-Saliency Theory of Survey Participation: Description and Illustration. Public Opinion Quarterly, 64, 299-308.

Kreuter, F., Olson, K., Wagner, J., Yan, T., Ezzati-Rice, T.M., Casas-Cordero, C., Lemay, M., Peytchev, A., Groves, R.M., and Raghunathan, T.E. 2010. Using Proxy Measures and Other Correlates of Survey Outcomes to Adjust for Nonresponse: Examples from Multiple Surveys. Journal of the Royal Statistical Society: Series A, 173(2), 389-407.

LeClere, F., Plummer, S., Vanicek, J., Amaya, A., and Carris, K. 2012. Household Early Bird Incentives: Leveraging Family Influence to Improve Household Response Rates. American Statistical Association Joint Statistical Meetings, Section on Survey Research Methods.

Lozar Manfreda, K., Bosnjak, M., Berzelak, J., Haas, I., and Vehovar, V. 2008. Web Surveys versus Other Survey Modes: A Meta-Analysis Comparing Response Rates. International Journal of Market Research, 50(1), 79-104.

Olson, K., Smyth, J.D., and Wood, H. 2012. Does Giving People Their Preferred Survey Mode Actually Increase Survey Participation? An Experimental Examination. Public Opinion Quarterly, 76, 611-635.

1 A NPSAS study member is defined as any sample member who is determined to be eligible for the study and, minimally, has valid data from any source for student type (undergraduate or graduate), date of birth or age, gender, and at least 8 of 15 variables described in the NPSAS:16 Full Scale (OMB# 1850-0666 v. 15-19) Supporting Statement Part B.

2 A sample member is considered located if he or she meets any of the following criteria: final ineligible; completes the survey or verifies his or her identity via the survey; is assigned any final status via telephone efforts (e.g., unavailable for duration of study, incarcerated, deceased, etc.); or is a pending or final refusal. Additionally, if an answering machine confirming the sample member’s name is reached through telephone efforts, or a household member of the sample member confirms the contacting information, then the sample member is considered located. When intensive tracing efforts are able to confirm contacting information for a sample member, then the case is also considered located.

3 By the production phase, Groups 1 and 2 were no longer randomly assigned and, therefore, these results should be interpreted with caution.

4 Because Groups 3 and 4 were not randomly assigned, comparisons should be made with caution.

5 Upon comparing response distributions across groups, no evidence of differential nonresponse bias in each group was observed and hence no correction for differential selectivity was made.

6 For sample members determined eligible for the B&B:16 cohort, both the full and abbreviated interviews end after locating information is collected. Ineligible sample members will be considered “responding” if they continue through the end of the eligibility section, reaching interview item BB17ABYE [which requests contacting information should they need to be recontacted after eligibility review.]

