B&B 08 Supporting Pkg Part B


Baccalaureate and Beyond Longitudinal Study, Third Followup (B&B:93/2003)

OMB: 1850-0729


B. Collection of Information Employing Statistical Methods


This submission requests clearance for the implementation of the first follow-up of NPSAS:08 sample members who were baccalaureate recipients during the 2006–07 (field test) and 2007–08 (full-scale) academic years: the Baccalaureate and Beyond Longitudinal Study 2008/2009 (B&B:08/09).

1. Respondent Universe

The respondent universe for the full-scale B&B study consists of all persons who completed requirements for the bachelor’s degree during the 2007–08 academic year. For the field test, the respondent universe is the same except that the survey year is 2008 and the reference year is the 2006–07 academic year.

2. Statistical Methodology

a. Field Test Design

The B&B follow-up field test will be implemented to fully test all procedures, methods, and systems planned for the full-scale B&B follow-up in a realistic operational environment before they are used in the full-scale study. The field test is designed to test and validate data collection and monitoring procedures that will obtain the most accurate data in the least amount of time. Specific field test evaluation goals include the following:

  • determining how to identify actual baccalaureate recipients, i.e., B&B eligible sample members, from various data sources, including the initial institution listing; student-level institutional records; student report of degree receipt in the NPSAS base year interview; the B&B:08/09 field test interviews; transcripts; and data obtained from extant data sources when available, such as the CPS and NSC

  • assessing the quality, completeness, and effectiveness of various types of locating data obtained during the base year

  • determining the “best” address (student permanent, student local, NPSAS institution, or parent) for use in establishing geographic clusters for use in field CAPI follow-up

  • evaluating the utility of pre-CATI submission of sample members to Telematch, NCOA, and CPS as a mechanism for obtaining updated locating information

  • identifying problematic data elements in the B&B survey instrument

  • determining best approaches for collecting transcripts

Additionally, we will evaluate the time required to complete the interview, and sections of the interview, in order to identify possible instrument modifications that will save time in the full-scale interview. We will also conduct a reliability reinterview to evaluate the temporal consistency of selected interview items.

The field test sample for B&B:08/09 will consist of all interview respondents from the NPSAS:08 field test who completed requirements for their bachelor’s degree at any time between July 1, 2006, and June 30, 2007. Additionally, we plan to include all potentially eligible interview nonrespondents in the field test sample (e.g., those who were listed as potential baccalaureate recipients by their NPSAS institution but did not complete a student interview to confirm their status).

As part of the field test data collection, we will assess the value in determining cohort eligibility of each independent data source, including the initial institution listing; report of degree receipt in the NPSAS:08 and B&B:08/09 interviews; transcripts; and data obtained from extant data sources when available, such as the CPS and the National Student Clearinghouse Tracker database (NSC). The evaluation will occur in steps, beginning prior to the first follow-up interview. Those potential sample members with no likelihood of eligibility (e.g., those identified as potential bachelor’s recipients on the initial institutional classification but for whom other data sources provide sufficient contradictory evidence) will be coded as ineligible prior to the start of interviewing. The remainder will be interviewed and transcripts collected for further evaluation following interviewing. A final assessment of eligibility will determine the bachelor’s cohort for continuation into the second follow-up.
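The stepwise screen described above can be sketched as a simple decision rule. The source labels and thresholds below are illustrative assumptions only; the actual decision rules will be set after the field test evaluation of the data sources.

```python
# Hypothetical sketch of the pre-interview eligibility screen. Source names
# and decision rules are invented for illustration; they are not the study's
# actual algorithm.

def screen_eligibility(evidence):
    """Classify a potential sample member before interviewing.

    `evidence` maps a data source (e.g., 'institution_list',
    'npsas_interview', 'transcript', 'cps', 'nsc') to True (supports
    bachelor's receipt), False (contradicts it), or None (no information).
    """
    # The initial institution listing is treated as weak evidence; it can be
    # overridden by contradictory sources.
    strong = [s for s, v in evidence.items()
              if v is True and s != "institution_list"]
    contradictions = [s for s, v in evidence.items() if v is False]

    # No strong confirmation and sufficient contradictory evidence:
    # code as ineligible before interviewing begins.
    if not strong and len(contradictions) >= 2:
        return "ineligible"
    # Degree receipt confirmed in an interview or on a transcript: eligible.
    if "npsas_interview" in strong or "transcript" in strong:
        return "eligible"
    # Otherwise retain for interviewing and post-interview transcript review.
    return "pending"
```

For example, a case listed by the institution but contradicted by both CPS and NSC would be coded ineligible before data collection starts, while a case with only the institution listing would remain in the pending pool for interviewing.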

The NPSAS:08 field test yielded 1,220 interview respondents who were confirmed to be bachelor’s recipients. The base-year sample also included 599 interview nonrespondents who were either identified as potential bachelor’s recipients according to the initial classification by the NPSAS sample institution at the time of student sampling (prior to base-year data collection) or were classified as such in the student institutional records obtained through CADE. Therefore, the total B&B:08/09 field test sample size will be 1,819. See table 7 for the distribution of the B&B sample by NPSAS:08 response status and B&B eligibility.

Because the data sources for determining cohort eligibility will be evaluated during the field test, the eligibility rate for the field test sample is anticipated to be about 80 percent, which should yield an eligible sample size of about 1,455. The response rate is expected to be about 80 percent among the eligible sample members, which will yield about 1,164 responding bachelor’s recipients.

b. Full-scale Design

The sample for the first follow-up with the B&B:08 cohort will contain all students who completed requirements for the bachelor’s degree as confirmed by the NPSAS:08 full-scale student interviews (expected to number approximately 23,100). In addition, to have full population coverage of the B&B sample, a subsample of 500 of the approximately 5,000 anticipated NPSAS:08 interview nonrespondents who are either listed by the NPSAS sample institution as bachelor’s degree candidates or confirmed in CADE to be degree candidates will be selected. The results from the field test examination of data sources will determine the decision rules and data sources to be used to determine eligibility for the full-scale study, as well as the allocation for the subsample of the different types of nonrespondents.

The expected eligibility rate of the sample members for the full-scale study is about 90 percent, which will give an eligible sample size of about 21,200. We also expect a response rate of about 90 percent among the eligible sample members, which will yield about 19,100 responding baccalaureate recipients.
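The yield arithmetic in the field test and full-scale paragraphs can be checked with a short calculation. The rates and counts are the document's planning figures; note that the text rounds the full-scale products more coarsely (to "about 21,200" and "about 19,100").

```python
# Expected-yield arithmetic using the document's planning assumptions.

def expected_yield(sample_size, eligibility_rate, response_rate):
    """Return (expected eligible sample, expected respondents)."""
    eligible = sample_size * eligibility_rate
    respondents = eligible * response_rate
    return round(eligible), round(respondents)

# Field test: 1,819 sampled, ~80 percent eligible, ~80 percent responding.
print(expected_yield(1819, 0.80, 0.80))          # → (1455, 1164)

# Full scale: 23,100 confirmed recipients plus a subsample of 500
# nonrespondents, ~90 percent eligible, ~90 percent responding.
print(expected_yield(23100 + 500, 0.90, 0.90))   # → (21240, 19116)
```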

Table 7. Distribution of the B&B:08/09 field test sample by NPSAS:08 field test response status and B&B eligibility

NPSAS:08 field test   NPSAS:08 field test       B&B eligibility                                 Count
study status          interview status

Total                                                                                           1,819
Study respondent      Interview respondent      Baccalaureate receipt confirmed in interview    1,220
Study respondent      Interview nonrespondent   Baccalaureate receipt confirmed in CADE           406
Study respondent      Interview nonrespondent   Listed as potential baccalaureate recipient       159
Study nonrespondent   Interview nonrespondent   Baccalaureate receipt confirmed in CADE             8
Study nonrespondent   Interview nonrespondent   Listed as potential baccalaureate recipient        26

3. Methods for Maximizing Response Rates

Response rates in the B&B:08/09 field test and full-scale data collections are a function of success in two basic activities: identifying and locating the sample members involved, then contacting them and gaining their cooperation. Two classes of respondents are involved: institutions for the transcript component and students who earned a bachelor’s degree from those institutions.

a. Institution Contacting

The success of the B&B:08/09 transcript collection is closely tied to the active participation of selected institutions. The consent and cooperation of an institution’s coordinator is essential and helps to encourage the timely completion of the transcript collection. If the B&B coordinators have been involved with the NPSAS institution collection, they will be familiar with NPSAS and B&B and recognize the study’s importance to postsecondary education. The field test transcript collection will help to clarify the extent to which the coordinators will be the same between the two studies. Initial contact between the project team and institutional coordinators provides an opportunity to have a senior staff member emphasize the importance of the study and to address any concerns about participation.

Proven Procedures. B&B:08/09 procedures for working with institutions will be developed from those used successfully in NPSAS:08 and NELS:88/2000. B&B will use a transcript control system (TCS) similar to the system used for the ELS:2002 transcript collection to maintain relevant information about the NPSAS institution attended by each B&B cohort member. IPEDS contact information for each institution will be loaded into the TCS and used for all mailings, then confirmed with a verification call to each institution to collect the name of the registrar (or other appropriate contact), address information, telephone and facsimile numbers, and e-mail addresses. This verification call will help to ensure that transcript request materials are properly routed, reviewed, and processed.

The descriptive materials sent to institutions will be clear, concise, and informative about the purpose of the study and the nature of subsequent requests. The package of materials sent to the coordinators, provided in appendix D, will contain:

  • a letter from RTI providing an introduction to the B&B:08/09 study,

  • an introductory letter from NCES on U.S. Department of Education letterhead,

  • a letter of endorsement from the American Association of Collegiate Registrars and Admissions Officers (AACRAO),

  • a list of other endorsing agencies,

  • information on how to log on to the study’s secure website and access the list of students for whom transcripts are requested, as well as a form for requesting reimbursement of expenses incurred with the request (e.g., transcript processing fees), and

  • descriptions of and instructions for the various methods of providing transcripts.

Follow-up calls to confirm receipt of the packet and to answer any questions about the study will occur 2 days after the initial mailing. We also anticipate that telephone prompting will be required to obtain the desired number of transcripts. Despite the relatively routine nature of the transcript request, many institutions give relatively low priority to voluntary research requests, and telephone follow-up is necessary to ensure that the request is handled within the schedule constraints. In addition to telephone prompting, institutions will be contacted by e-mail prompts, letters, and postcard prompts.

Experienced staff from RTI’s Call Center Services (CCS) will carry out these contacts; each will be assigned a set of institutions that remains their responsibility throughout the process. This continuity allows RTI staff members to build upon relationships developed during the NPSAS study, maintain rapport with institution staff, and provide a reliable point of contact at RTI. Project staff members will be thoroughly trained in transcript collection and in the purposes and requirements of the study, which helps them establish credibility with institution staff.

Endorsements. In NPSAS studies, the specific endorsement of relevant associations was extremely useful in persuading institutions to cooperate. Endorsements from 14 professional associations have been secured for B&B:08/09. These associations are listed in appendix F.

Minimizing burden. Different options for collecting transcripts for sampled students are offered. The coordinator is invited to select the methodology of greatest convenience to the institution. The optional strategies for obtaining the data are discussed later in this section.

Another strategy RTI can investigate to increase the efficiency of the transcript data collection, encourage participation, and minimize burden on individual institutions is to solicit support at the system-wide level and from state agencies. Timely contact, together with enhanced verification procedures, is likely to reduce the number of remail requests and minimize delays caused by misrouted requests.

b. Transcript Collection Training

Institution Coordinator Training. The purpose of an effective plan for training institution coordinators is two-fold: to make certain that procedures are understood and followed, and to motivate the coordinators. The project relies on these procedures to assure transcript collections are accurate and complete. Because institution coordinators are a critical element in this process, communicating instructions about their transcript collection tasks clearly is essential.

Institution coordinators will be trained by call center staff according to the method of data collection they have selected for their institution (refer to section 3.c). All institution coordinators will be provided information on the purposes of B&B and on their tasks for the study, and assured of our commitment to maintaining the confidentiality of institution and student data.

c. Collection of Student Data from Transcripts

Transcript data will be requested for sampled students from the institutions from which they were sampled as part of NPSAS:08. Several methods will be used for obtaining the data: (1) institution staff uploading electronic transcripts for sampled students to the secure study website; (2) institution staff sending electronic transcripts for sampled students by secure File Transfer Protocol; (3) institution staff sending electronic transcripts as encrypted attachments via e-mail; (4) for institutions that already use this method, RTI requesting and collecting electronic transcripts via a dedicated server at the University of Texas at Austin; and, as a last resort, (5) institution staff transmitting transcripts via a secure fax housed in a locked room at RTI, after sending a confirmed test page. Each method is described below. A complete transcript from the NPSAS institution will be requested, as well as complete transcripts from any transfer schools the students attended, as applicable.

To track receipt of institution materials and student transcripts, we will add a Transcript Control System (TCS) to the IMS developed for B&B:08/09, similar to the TCS used successfully for NELS:88/2000 and ELS:2002, but updated and enhanced for the B&B postsecondary transcript collection effort. The TCS will track the status of each catalog and transcript request, from initial mailout of the requests through follow-up and final receipt.

Uploading electronic transcripts to the secure study website. Goals for B&B:08/09 include reducing the data collection burden on institutions (thereby reducing project costs), expediting data delivery, improving data quality, and ensuring data security. NPSAS:2000, 2004, and 2008 demonstrated the viability of a web-based approach for receiving student enrollment lists and student record abstraction (CADE) data files. We propose to use the same functionality for uploading data that was used in NPSAS:08.

Because the open Internet is not conducive to transmitting confidential data, any Internet-based data collection effort necessarily raises the question of security. We intend to incorporate current security technology into our web application to ensure strict adherence to NCES confidentiality guidelines. Our web server will include a Secure Sockets Layer (SSL) certificate and will be configured to force encrypted data transmission over the Internet. SSL technology is most commonly deployed and recognizable in electronic commerce applications that alert users when they are entering a secure server environment, thereby protecting credit card numbers and other private information. All of the data entry modules on the site are password protected, requiring the user to log in before accessing confidential data. The system automatically logs the user out after 20 minutes of inactivity; this safeguard prevents an unauthorized user from browsing through the site. Additionally, we will stay attuned to technological advances to help ensure the security of the B&B:08/09 data.
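The inactivity safeguard described above amounts to a simple timestamp check. In this sketch the 20-minute limit comes from the text, while the class and method names are invented for illustration.

```python
# Minimal sketch of a 20-minute inactivity logout. The limit is from the
# document; the Session class itself is a hypothetical illustration.
from datetime import datetime, timedelta

INACTIVITY_LIMIT = timedelta(minutes=20)

class Session:
    def __init__(self, now):
        self.last_activity = now

    def touch(self, now):
        """Record user activity, resetting the inactivity clock."""
        self.last_activity = now

    def is_expired(self, now):
        """True once more than 20 minutes have passed without activity."""
        return now - self.last_activity > INACTIVITY_LIMIT
```

Any page request would call `touch`, and the server would force a fresh login whenever `is_expired` returns true.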

Files uploaded to the secure website will be immediately moved to a secure project folder that is only accessible to specific staff members. Access to this project folder will be set so that only those who have authorized access will be able to see the included files. The folder will not even be visible to those without access. It is necessary for the files to be stored on the project share so that they can be backed up by ITS in case any problems occur that cause us to lose data. ITS will use their standard procedures for backing up data, so the backup files will exist for 3 months.

Institution staff sending electronic transcripts by secure File Transfer Protocol. FTPS (also called FTP-SSL) uses the FTP protocol on top of SSL or TLS. When using FTPS, the control session is always encrypted; the data session can optionally be encrypted if the file has not been pre-encrypted.

Files transmitted via FTPS will be copied to a secure project folder that is only accessible to specific staff members. As with uploaded files, access to this project folder will be set so that only those who have authorized access will be able to see the included files. The folder will not even be visible to those without access. After being copied, the files will be immediately deleted from the FTP server. It is necessary for the files to be stored on the project share so that they can be backed up by ITS in case any problems occur that cause us to lose data. ITS will use their standard procedures for backing up data, so the backup files will exist for 3 months.

Institution staff sending electronic transcripts as encrypted attachments via email. RTI will provide guidelines on encryption and creating “strong” passwords. Encrypted electronic files sent via e-mail to a secure e-mail folder will only be accessible to a few staff members on the project team. These files will then be copied to a project folder that is only accessible to these same staff members. Access to this project folder will be set so that only those who have authorized access will be able to see the included files. The folder will not even be visible to those without access. After being copied, the files will be deleted from the e-mail folder. The files will be stored on the network that is backed up regularly to avoid the need to recontact the institution to provide the data again should a loss occur. RTI’s information technology service (ITS) will use standard procedures for backing up data, so the backup files will exist for 3 months.

Institution staff transmitting transcripts via a secure fax. We expect that few institutions will ask to provide hardcopy transcripts. In such cases, we will encourage one of the secure electronic methods of transmission; if that is not possible, we will accept faxed transcripts. Although fax equipment and software facilitate rapid transmission of information, they also open the possibility that information could be misdirected or intercepted by individuals for whom access is not intended or authorized. To safeguard against this, as much as is practical, RTI protocol will allow transcripts to be faxed only to a fax machine housed in a locked room, and only if institutions cannot use one of the other options. To ensure the fax transmission is sent to the appropriate destination, we will require a test run with nonsensitive data prior to submitting the transcripts, to eliminate errors in transmission from misdialing. RTI will provide schools with a fax cover page that includes a confidentiality statement to use when transmitting individually identifiable information.

Paper transcripts will be kept in a locked file cabinet in RTI’s secure data receipt facility. Only B&B:08/09 transcript staff will have access to the file cabinet.

Collecting electronic transcripts via a dedicated server at the University of Texas at Austin. We will also request and collect transcripts electronically via a dedicated server at the University of Texas at Austin for institutions that currently use this method. Approximately 200 institutions currently send and receive academic transcripts in standardized electronic formats via this server, which supports Electronic Data Interchange (EDI) and XML formats. Nine (6 percent) of the field test institutions and approximately 70 (6 percent) of the likely full-scale institutions are fully registered with the server. In addition, 14 of the field test institutions and approximately 60 of the likely full-scale institutions are in the test phase, meaning that they are preparing and testing transmissions but not yet using the server to send data.

The dedicated server at the University of Texas at Austin supports the following methods of securely transmitting transcripts:

  • email as MIME attachment using PGP encryption

  • regular FTP using PGP encryption

  • Secure FTP (SFTP over SSH)

  • FTPS (FTP over SSL/TLS)

Files collected via the dedicated server at the University of Texas at Austin will be copied to a secure project folder that is only accessible to specific staff members. The same access restrictions and storage protocol will be followed for these files as described above for files uploaded to the study website.

We do not anticipate that active student consent for the release of transcripts will be required for B&B:08/09. For certain agency purposes, the Family Educational Rights and Privacy Act of 1974 (FERPA) permits institutions to release student data to the U.S. Department of Education and its authorized agents without consent. In compliance with FERPA, a notation will be made in the student record that the transcript has been collected for use in the B&B:08/09 longitudinal study.

Despite the relatively routine nature of the transcript request, it is anticipated that telephone prompting will be required to obtain the desired number of transcripts. We will also use e-mail prompts, letters, and postcard prompts, which have proven to be effective tools in gaining cooperation. E-mail, in particular, has proven to be a low-cost and effective means of reaching institution officials who cannot be reached by phone. Because each transcript request will be accompanied by a voucher for any expenses incurred in handling the request, it is unlikely that refusals will become a significant problem. However, in the event that an institution expresses resistance to the transcript request, seasoned institutional contactors and other project staff are trained to sensitively listen to institutional concerns, address any roadblocks to participation, and negotiate with institution staff to resolve them.

Another strategy RTI can investigate to increase the efficiency of the transcript data collection, encourage participation, and minimize burden on individual institutions is to solicit support at a system-wide level. If needed, state and system-wide contacts made as part of NPSAS:08 will be asked to encourage the participation of institutions under their administration. While we are planning for transcript collection from each individual institution, we will also explore the possibility of collecting transcripts from state agencies or at a system-wide level whenever such an approach is practical. Based on our NELS:88/2000 experience, we expect that less than 5 percent of institutions will have closed, but transcripts can often be collected even for institutions that are technically closed. RTI will confirm the status of any closed institution with state departments of education, offices of higher education, or other appropriate state licensing agencies.

Transcript Collection Quality Control. As part of our quality control procedures, we will emphasize to registrars the importance of collecting complete transcript information for all sampled students. Transcripts will be reviewed for completeness. Institutional Contactors will contact the institutions to prompt for missing data and to resolve any problems or inconsistencies.

Transcripts received in hardcopy form via secure fax will be subject to a quick review prior to recording their receipt. Receipt control clerks will check transcripts for completeness and review transmittal documents to ensure that transcripts have been returned for each of the specified sample members. The disposition code for transcripts received will be entered into the TCS. Course catalogs will also be reviewed and their disposition status updated in the system in cases where this information is necessary and not available through CollegeSource Online. Hardcopy transcripts and course catalogs will be sorted and stored in a secure facility at RTI, organized by institution.

The procedures for electronic transcripts will be similar to those for hardcopy documents—receipt control personnel, assisted by programming staff, will verify that the transcript was received for the given requested sample member, record the information in the receipt control system, and check to make sure that a readable, complete electronic transcript has been received.

The initial transcript check-in procedure is designed to efficiently receipt returned materials into the TCS as they are received each day. The presence of an electronic catalog (obtained from CollegeSource Online) will be confirmed during the verification process for each institution and noted in the TCS. The remaining catalogs will be requested from the institutions directly and will be receipted in the TCS as they are received. Transcripts and supplementary materials received from institutions (including course catalogs) will be inventoried, assigned unique identifiers based on the IPEDS ID, reviewed for problems, and receipted into the TCS.

Data processing staff will be responsible for (1) sorting transcripts into alphabetical order to facilitate accurate review and receipt; (2) assigning the correct ID number to each document returned and affixing a transcript ID label to each; (3) reviewing the materials to identify missing, incomplete, or indecipherable transcripts; and (4) assigning appropriate TCS problem codes to each of the missing and problem transcripts plus providing detailed notes about each problem to facilitate follow-up by Institutional Contactors and project staff. Project staff will use daily monitoring reports to review the transcript problems and to identify approaches to solving the problems.
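The receipt-and-problem tracking that the TCS performs for these steps might be sketched as follows. The status and problem codes here are invented for illustration and are not the actual TCS codes.

```python
# Illustrative sketch of TCS-style receipt control. Status values and
# problem codes are hypothetical, not the study's actual code set.

class TranscriptControl:
    def __init__(self):
        self.records = {}   # student_id -> {"status": ..., "problem": ...}

    def request(self, student_id):
        """Log the initial mailout of a transcript request."""
        self.records[student_id] = {"status": "requested", "problem": None}

    def receipt(self, student_id, problem=None):
        """Receipt a returned transcript, flagging any problem found
        during review (e.g., 'incomplete', 'indecipherable')."""
        rec = self.records[student_id]
        if problem:
            rec["status"] = "problem"
            rec["problem"] = problem
        else:
            rec["status"] = "received"

    def outstanding(self):
        """Cases still needing follow-up by institutional contactors."""
        return [sid for sid, r in self.records.items()
                if r["status"] in ("requested", "problem")]
```

A daily monitoring report would then simply be a dump of `outstanding()` with the associated problem notes.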

Web-based collection will allow timely quality control, as RTI central staff will be able to monitor data quality for participating schools closely and on a regular basis. When institutions call for technical or substantive support, we will be able to query the institution’s data and communicate much more effectively regarding any problems.

Transcript data, including paper transcripts, will be destroyed (shredded, in the case of paper) after the transcripts are keyed, coded, and quality checked, at a time to be negotiated with NCES.

Transcript Keying and Coding. Once student transcripts and course catalogs are received and missing information is collected, keying and coding of transcripts and courses taken will take place. The coding taxonomy will be modeled on those used in other postsecondary studies, specifically NELS:88/2000 and B&B:93/94. A careful review of these taxonomies (based on Adelman’s College Course Map) will be carried out, with refinements. NCES, technical review panel members, and other key personnel will provide guidance on refining and reviewing the taxonomy for transcript coding and on new courses and fields of study.

Keyer-Coders will have full access to all transcript-related documents including course catalogs or other course listings provided. All transcript-related documents will be thoroughly reviewed before data is abstracted from them.

Transcript Keying and Coding Quality Control. As part of our quality control procedures, we will require that all keyer-coders have earned a bachelor’s degree, ensuring that they have firsthand knowledge of college courses, credits, and grade point averages. Emphasis is placed on recruiting professionals with knowledge of transcripts and teaching curricula as our expert keyer-coders.

A supervision and quality control plan will also be implemented. At least one supervisor will be onsite at all times to manage the effort and simultaneously perform QC checks and problem resolution. Verifications of transcript data keying and coding at the student level will be performed. Any errors will be recorded and corrected as needed.

Once the transcripts for each institution are keyed and coded, transcript course coding at the institution level will be reviewed by expert coders to ensure that (1) coding taxonomies have been applied consistently and data elements of interest have been coded properly within schools; (2) program information has been coded consistently according to the program area and sequence-level indicators in course titles; (3) records of sample members who attended multiple institutions do not have duplicate entries for credits that transferred from one institution to another; and (4) additional information has been noted and coded properly.
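Check (3) above, flagging transfer credits that appear on more than one institution's transcript for the same student, could be sketched as follows. The record layout is an assumption made for the illustration.

```python
# Hedged sketch of the duplicate transfer-credit check. The
# (student, institution, course) tuple layout is an assumed record shape,
# not the study's actual data format.

def duplicate_transfer_credits(records):
    """records: iterable of (student_id, institution_id, course_id).

    Returns the set of (student_id, course_id) pairs credited at more
    than one institution, i.e., candidates for double-counted transfers.
    """
    seen = {}
    duplicates = set()
    for student, institution, course in records:
        key = (student, course)
        seen.setdefault(key, set()).add(institution)
        if len(seen[key]) > 1:
            duplicates.add(key)
    return duplicates
```

Expert coders would then review only the flagged pairs rather than every multi-institution record.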

d. Student Locating

One of the main issues for the B&B:08/09 data collection effort will be locating the members of the sample cohort. These individuals were last contacted during the NPSAS:08 field test, and NPSAS:08 nonrespondents may never have been contacted at all. Members of the cohort are highly mobile and, having completed their bachelor’s degrees, have likely moved on to another location. The high mobility rate of this population presents challenges to the B&B:08/09 tracing effort.

A successful tracing operation is dependent on a multitude of factors including the characteristics of the population to be located, the age of the locating information for the population, and the completeness and accuracy of that information. To maximize our location rate, sufficient resources will be devoted to tracing operations both in-house and in the field, giving careful consideration to identifying and implementing the most effective, yet cost efficient, tracing strategies for this population. Additionally, the locator database for the cohort includes critical tracing information for most of the sample members, including their previous residences and telephone numbers. Moreover, Social Security numbers will be available for virtually all of the sample members (99 percent), as well as other information useful for tracing.

To achieve the desired response rate required by NCES standards, we propose a multistage tracing approach that will capitalize on the availability of previous NPSAS:08 locating data and the continuing cooperation of sample members. This multistage approach will consist of several steps designed to yield the maximum number of locates with the least expense. During the field test, we will evaluate the effectiveness of these procedures for the full-scale survey effort. The steps of our multistage tracing plan include the following elements.

  1. Advance Tracing. We propose to employ an advance tracing operation prior to field test and full-scale data collection that will update the addresses of sample members. Included in this activity will be searches of the U.S. Department of Education’s Central Processing System (CPS) for information on financial aid recipients. We will also conduct computerized searches of other databases, including the National Change of Address (NCOA), Telematch, and ComServ's Death Information System. We will compare all sample member addresses obtained from the NPSAS:08 locator database against the NCOA and Telematch databases to identify sample members who have moved since the previous follow-up. Updated addresses and telephone numbers produced by these advance tracing activities will be entered into the B&B:08/09 locator database and made available to data collection personnel at the start of data collection.

  2. Advance Interactive Tracing. After the completion of advance tracing, cases without good locating information (primarily cases from the NPSAS:08 nonrespondent group) will be directed for additional interactive tracing. Specially trained tracing staff will perform intensive tracing to locate additional contact information for these cases. In many cases, this will involve an interactive credit bureau search and may involve other interactive databases of locator information.

  3. Parent Mailing. In mid-April 2008, we will mail a letter to the parents of sample members under the age of 25, informing them that their child’s participation will be requested. This letter will also include a study leaflet, address update form, and a business reply envelope.

  4. Initial Contact Mailing to Sample Members. Beginning in May 2008, we will mail a personalized letter (signed by the NCES Commissioner), study leaflet, address update form, and business reply envelope to all sample members. This letter will include the study's website address and toll-free telephone number, and will request that sample members update their postal and electronic mail addresses. Undeliverable mailings to sample members will be recorded, and the next best address will be used to resend the materials. Once all potential addresses for the sample member are exhausted, we will contact other information sources for the sample member (e.g., a parent, other relative, or a designated contact).

  5. Data Collection Announcement Mailing to Sample Members. Once we have the most current locator information for the sample members, we will mail a package to the sample cohort announcing the start of data collection. The package will include information about the study and will describe the various ways to complete the interview. The package will also include the website address for the project and the sample member's unique username and password for the site. The sample member will receive this package in one of two ways: in a full-size envelope delivered by regular mail or by USPS Priority Mail.

  6. Intensive In-house Tracing. The goal of intensive tracing is to obtain a telephone number at which the sample member can be reached so that field interviewing will not be required. Tracing procedures may include (1) Directory Assistance for telephone listings at various addresses, (2) criss-cross directories to identify (and contact) the neighbors of sample members, (3) calling persons with the same unusual surname in small towns or rural areas to see if they are related to or know the sample member, and (4) contacting the current or last known residential sources such as the neighbors, landlords, and current residents of the last known address. Other more intensive tracing activities could include (1) database checks for sample members, parents, and other contact persons, (2) credit database and insurance database searches, (3) drivers’ license searches through the appropriate state departments of motor vehicles, (4) calls to colleges, military establishments, and correctional facilities to follow up on leads generated from other sources, (5) calls to alumni offices and associations, and (6) calls to state trade and professional associations based on information about field of study in school and other leads.

  7. Field Tracing and Interviewing. One of the challenges presented by both the B&B:08/09 field test and the full-scale data collection efforts is the need for in-person tracing and interviewing nationwide. We will use a two-tiered tracing strategy for cases that could not be completed through the self-administered web interview, computer-assisted telephone interviewing (CATI), or computer-assisted personal interviewing (CAPI). Using the best available address for the nonresponding sample members, the cases will be clustered into geographic areas. Field interviewers will then be assigned to areas with high concentrations of sample members (e.g., a major metropolitan area) and will locate and interview the sample members residing in those clusters. Cases in areas without assigned field interviewers (e.g., cases not clustered with other cases) will receive additional intensive tracing. Cases for which additional telephone contact information is collected will be returned to data collection by telephone.

e.Student Data Collection: Self-Administered Web, CATI, and CAPI

Training Procedures. Training programs for those involved in survey data collection are critical quality control elements. Training for the help desk operators who answer questions about the self-administered web-based student interview, the CATI telephone interviewers, and the CAPI field interviewers will be conducted by a training team with extensive experience. We will establish thorough selection criteria for help desk operators, telephone interviewers, and field interviewers to ensure that only highly capable persons—those with exceptional computer, problem-solving, and communication skills—are selected to serve on the project and contribute to the quality of the B&B data.

Contractor staff with extensive experience in training interviewers will prepare the B&B Telephone Interviewer Manual, which will provide detailed coverage of the background and purpose of B&B, sample design, questionnaire, and procedures for the CATI interview. This manual will be used in training and as a reference during interviewing. (Interview-specific information will be available to interviewers in the Call Center in the form of question-by-question specifications providing explanations of the purpose of each question and any definitions or other details needed to aid the interviewers in obtaining accurate data.) Along with manual preparation, training staff will prepare training exercises, mock interviews (specially constructed to highlight the potential of definitional and response problems), and other training aids.

A comprehensive training guide will also be prepared for use by trainers to standardize training and to ensure that all topics are covered thoroughly. Among the topics to be covered at the telephone interviewer training will be:

  • the background, purposes, and design of the survey,

  • confidentiality concerns and procedures (interviewers will take an oath and sign an affidavit agreeing to uphold the procedures),

  • NCES Security Clearance Procedures,

  • importance of locating/contacting sample members and procedures for using the locating and tracing module,

  • special practice with online coding systems used to standardize sample member responses to certain items (e.g., institution names, occupation, and for students enrolled in additional education, major or field of study),

  • review, discussion, and practice of techniques for explaining the study, answering questions asked by sample members, explaining the respondent’s role, and obtaining cooperation,

  • extensive practice in applying tracing and locating procedures,

  • demonstration interviews by the trainers,

  • round-robin (interactive mock interviews for each section of each questionnaire, followed by review of the question-by-question specifications for each section),

  • completion of classroom exercises,

  • practice interviews with trainees using the web/CATI instrument to interview each other while being observed by trainers, followed by discussion of the practice results, and

  • explanation of quality control procedures, administrative procedures, and performance standards.

In addition to topics covered in training, staff will be provided with additional practice exercises to help prepare them for production. Call center supervisors will be given project-specific training in advance of interviewer training and will assist in monitoring interviewer performance during the training.

Student Interviews (web/CATI/CAPI). Student interviews will be conducted using a single web-based survey instrument for self-administered, CATI and CAPI data collection. The data collection activities will be accomplished through the Case Management System (CMS), which is equipped with the following capabilities:

  • on-line access to locating information and histories of locating efforts for each case;

  • state-of-the-art questionnaire administration module with full “front-end cleaning” capabilities (i.e., editing as information is obtained from respondents);

  • sample management module for tracking case progress and status; and

  • automated scheduling module which delivers cases to interviewers and incorporates the following features:

  • Automatic delivery of appointment and call-back cases at specified times. This reduces the need for tracking appointments and helps ensure the interviewer is punctual. The scheduler automatically calculates the delivery time of the case in reference to the appropriate time zone.

  • Sorting of non-appointment cases according to parameters and priorities set by project staff. For instance, priorities may be set to give first preference to cases within certain subsamples or geographic areas; cases may be sorted to establish priorities between cases of differing status. Furthermore, the historic pattern of calling outcomes may be used to set priorities (e.g., cases with more than a certain number of unsuccessful attempts during a given time of day may be passed over until the next time period). These parameters ensure that cases are delivered to interviewers in a consistent manner according to specified project priorities.

  • Restriction on allowable interviewers. Groups of cases (or individual cases) may be designated for delivery to specific interviewers or groups of interviewers. This feature is most commonly used in filtering refusal cases, locating problems, or foreign language cases to specific interviewers with specialized skills.

  • Complete records of calls and tracking of all previous outcomes. The scheduler tracks all outcomes for each case, labeling each with type, date, and time. These are easily accessed by the interviewer upon entering the individual case, along with interviewer notes, thereby eliminating the need for a paper record of calls of any kind.

  • Flagging of problem cases for supervisor action or supervisor review. For example, refusal cases may be routed to supervisors for decisions about whether and when a refusal letter should be mailed, or whether another interviewer should be assigned.

  • Complete reporting capabilities. These include default reports on the aggregate status of cases and custom report generation capabilities.

The integration of these capabilities reduces the number of discrete stages required in data collection and data preparation activities and increases capabilities for immediate error reconciliation, which results in better data quality and reduced cost. Overall, the scheduler provides a highly efficient case assignment and delivery function by reducing supervisory and clerical time, improving execution on the part of interviewers and supervisors by automatically monitoring appointments and callbacks, and reducing variation in implementing survey priorities and objectives.
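The scheduler's case-sorting behavior described above can be sketched as a simple priority queue. The statuses, attempt thresholds, and priority values below are hypothetical illustrations, not the CMS's actual rules:

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical sketch of scheduler-style case delivery: appointments are
# delivered first, heavily attempted cases are deferred to a later time
# period, and remaining cases are ordered by status.

@dataclass(order=True)
class QueuedCase:
    priority: int                       # lower value = delivered sooner
    case_id: str = field(compare=False)

def priority_for(status: str, attempts_this_period: int, has_appointment: bool) -> int:
    """Assign a delivery priority mirroring the rules described in the text."""
    if has_appointment:
        return 0                        # appointment cases come first
    if attempts_this_period >= 3:
        return 9                        # pass over until the next time period
    return {"new": 1, "callback": 2, "refusal": 5}.get(status, 4)

queue = []
heapq.heappush(queue, QueuedCase(priority_for("new", 0, False), "C001"))
heapq.heappush(queue, QueuedCase(priority_for("callback", 4, False), "C002"))
heapq.heappush(queue, QueuedCase(priority_for("refusal", 0, True), "C003"))

order = [heapq.heappop(queue).case_id for _ in range(len(queue))]
# The appointment case is delivered first; the heavily attempted case last.
```

In a production scheduler these priorities would also incorporate time zones and project-specified subsample preferences; this sketch shows only the ordering mechanism.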

In addition to the management aspect of data collection, the survey instrument is another component designed to maximize efficiency and yield high-quality data. Below are some of the basic questionnaire administration features of the web-based instrument:

  • Based on responses to previous questions, the respondent or interviewer is automatically routed to the next appropriate question, according to predesignated skip patterns.

  • The web-based interview automatically inserts “text substitutions” or “text fills” where alternate wording is appropriate depending on the characteristics of the respondent or his/her responses to previous questions.

  • The web-based interview can incorporate or preload data about the individual respondent from outside sources (e.g., previous interviews, sample frame files, etc.). Such data are often used to drive skip patterns or define text substitutions. In some cases, the information is presented to the respondent for verification or to reconcile inconsistencies.

  • Numerous question-specific probes may be incorporated to explore unusual responses for reconciliation with the respondent, to probe “don’t know” responses as a way of reducing item non-response, or to clarify inconsistencies across questions.

  • Coding of multi-level variables. The web-based instrument uses an assisted coding mechanism to code text strings provided by respondents. Drawing from a database of potential codes, the assisted coder derives a list of options from which the interviewer or respondent can choose an appropriate code (or codes if it is a multi-level variable with general, specific, and/or detail components) corresponding to the text string.

  • Iterations. When identical sets of questions will be repeated for an unidentified number of entities, such as children, jobs, schools, and so on, the system allows respondents to cycle through these questions as often as is needed.

In addition to the functional capabilities of the CMS and web instrument described above, our efforts to achieve the desired response rate will include using established procedures proven effective in other large-scale studies we have completed. These include:

  • Providing multiple response modes, including self-administered and interviewer-administered options.

  • Offering incentives to encourage response (see incentive structure described below).

  • Prompting calls initiated early in the data collection to remind sample members about the study and the importance of their participation.

  • Assigning experienced CATI/CAPI data collectors who have proven their ability to contact and obtain cooperation from a high proportion of sample members.

  • Training the interviewers thoroughly on study objectives, study population characteristics, and approaches that will help gain cooperation from sample members.

  • Providing the interviewing staff with a comprehensive set of questions and answers that will provide encouraging responses to questions that sample members may ask.

  • Maintaining a high level of monitoring and direct supervision so that interviewers who are experiencing low cooperation rates are identified quickly and corrective action is taken.

  • Making every reasonable effort to obtain an interview at the initial contact, but allowing respondent flexibility in scheduling appointments to be interviewed.

  • Providing hesitant respondents with a toll-free number to use to telephone RTI and discuss the study with the project director or other senior project staff.

  • Thoroughly reviewing all refusal cases and making special conversion efforts whenever feasible (see next section).

When all leads have been exhausted in the outbound calling phase (production interviewing), the case will be transferred to the field staff for in-person contact attempts. For the purpose of the field test, however, only a subset of geographically clustered cases will be assigned to the field. This approach first identifies clusters according to the known zip codes of the sample members. Cases that fall within a 50-mile radius of the center of a cluster will be assigned for CAPI field interviews.
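The 50-mile clustering rule can be sketched as follows, assuming centroid coordinates are available for each case's zip code. The helper function, field names, and coordinates are illustrative, not part of the actual system:

```python
import math

# Sketch of assigning nonrespondent cases to a field cluster when they fall
# within a 50-mile radius of the cluster center, per the rule in the text.

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles via the haversine formula."""
    r = 3958.8  # Earth's mean radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def assign_to_cluster(case_coords, center, radius_miles=50.0):
    """Split case IDs into those within the radius (CAPI field interviewer)
    and those outside it (routed back to intensive telephone tracing)."""
    in_cluster, outside = [], []
    for case_id, (lat, lon) in case_coords.items():
        if miles_between(lat, lon, *center) <= radius_miles:
            in_cluster.append(case_id)
        else:
            outside.append(case_id)
    return in_cluster, outside

# Example: a hypothetical cluster centered on downtown Chicago
center = (41.88, -87.63)
cases = {"C100": (41.85, -87.65),   # within the metro area
         "C101": (42.96, -85.67)}   # Grand Rapids, well over 50 miles away
inside, outside = assign_to_cluster(cases, center)
```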

Refusal Aversion and Conversion. Recognizing and avoiding refusals is important to maximize the response rate. We will emphasize this and other topics related to obtaining cooperation during data collector training. Supervisors will monitor interviewers intensely during the early days of data collection and provide retraining as necessary. In addition, the supervisors will review daily interviewer production reports produced by the CATI system to identify and retrain any data collectors who are experiencing unacceptable numbers of refusals or other problems.

After encountering a refusal, the data collector enters comments into the CMS record. These comments include all pertinent data regarding the refusal situation, including any unusual circumstances and any reasons given by the sample member for refusing. Supervisors will review these comments to determine what action to take with each refusal. No refusal or partial interview will be coded as final without supervisory review and approval. In completing the review, the supervisor will consider all available information about the case and will initiate appropriate action.

If a follow-up is clearly inappropriate (e.g., there are extenuating circumstances, such as illness or the sample member firmly requested that no further contact be made), the case will be coded as final and will not be recontacted. If the case appears to be a “soft” refusal, follow-up will be assigned to an interviewer other than the one who received the initial refusal. The case will be assigned to a member of a special refusal conversion team made up of interviewers who have proven especially adept at converting refusals.

Refusal conversion efforts will be delayed for at least 1 week to give the respondent some time after the initial refusal. Attempts at refusal conversion will not be made with individuals who become verbally aggressive or who threaten to take legal or other action. Refusal conversion efforts will not be conducted to a degree that would constitute harassment. We will respect a sample member's right to decide not to participate and will not infringe upon this right by carrying conversion efforts beyond the bounds of propriety.

Incentives to Convert Refusals and Difficult or Unable-to-Locate Respondents. As described in the justification section (section A), we have proposed to offer incentive payments to nonresponding members of the sample population. All respondents during the early response period will be paid a $35 incentive, although how that incentive is paid (promised, or partially prepaid and promised) will be manipulated during the field test. During production interviewing, half of the respondents will receive a promised $20 incentive (with a comparison of response rates against those offered no incentive). Any respondents who were base-year nonrespondents will be paid an additional $20 incentive to compensate for the additional burden of providing background information otherwise collected during the base-year interview.

In addition, we will offer incentives to interview nonrespondents, of whom there will probably be three groups: (1) persons refusing to participate during early response or production interviewing, (2) persons who have proven difficult to interview (i.e., those who repeatedly break appointments with an interviewer), and (3) those who cannot be located or contacted by telephone. Our approach to maximizing the response of these persons—and thereby limiting potential nonresponse bias—involves an incentive payment to reimburse the respondent for time and expenses. A $35 incentive will be offered for nonresponse conversion, with a $20 differential paid to base-year nonrespondents. Additional detail about planned field test experiments is provided in section B.4.

Additional Quality Control. In addition to the quality control features inherent in the web-based interview (described in section 3), we will use data collector monitoring as a major quality control measure. Supervisory staff from RTI's Call Center Services (CCS) will monitor the performance of the B&B data collectors throughout the data collection period to ensure they are following all data collection procedures and meeting all interviewing standards. In addition, members of the project management staff will monitor a substantial number of interviews. In all cases, students will be informed that the interview may be monitored by supervisory staff.

“Silent” monitoring equipment is used so that neither the data collector nor respondent is aware when an interview is being monitored. This equipment will allow the monitor to listen to the interview and simultaneously see the data entry on a computer screen. The monitoring system allows ready access to any of the work stations in use at any time. The monitoring equipment also enables any of the project managers and client staff at RTI or NCES to dial in and monitor interviews from any location. In the past, we have used this capability to allow the analysts to monitor interviews in progress; as a result, they have been able to provide valuable feedback on specific substantive issues and have gained exposure to qualitative information that has helped their interpretation of the quantitative analyses.

Our standard practice is to monitor 10 percent of the interviewing done by each data collector to ensure that all procedures are implemented as intended and that the procedures are effective, and to observe the utility of the questionnaire items. Any observations that might be useful in subsequent evaluation will be recorded and all such observations will be forwarded to project management staff. Staff monitors will be required to have extensive training and experience in telephone interviewing as well as supervisory experience.

f.Obtaining Consent to Participate from Sample Members

Appropriate language for obtaining informed consent for participation in the B&B longitudinal study is included in the initial screens of the interview for web, telephone, and in-person administration. The wording of the informed consent statement is shown in exhibit 1. It has been reviewed and approved by RTI's Institutional Review Board in its Office of Research Protection.

Exhibit 1. Informed consent script for the B&B interview

Screen 1


Recently, we sent you materials about the Baccalaureate and Beyond Longitudinal Study that we are conducting for the U.S. Department of Education. This survey is being conducted to better understand the education and employment experiences of students who earned their bachelor’s degree during the [2006-07] school year. The material explains that your participation in the B&B study is critical to its success.

As a token of our appreciation, you will receive a $[Y_EARLYINC_DOLLAR] check if you complete the B&B survey by [Y_WEB_INCENT_EXP_DATE].

Have you had a chance to read the materials? (y/n)


Screen 2 – if material was read by sample member:


Good. The purpose of the B&B study is to collect information about students' experiences during college and after earning the bachelor’s degree, how they paid for their education, their employment afterwards, and to update demographic information. On average, it takes about 25 minutes.
Your responses, combined with student record information such as financial aid data, may be used only for statistical purposes and may not be disclosed, or used, in personally identifiable form for any other purpose, unless otherwise compelled by law. Your participation is voluntary and will not affect any aid or other benefits that you may receive. The risk of participation in this study relates to data security and is minimal, given the strict confidentiality and security procedures in place. You may decline to answer any question or stop the interview at any time.

If you have any questions about the study, please contact the study's director, Dr. Jennifer Wine, toll free at 1-877-225-8470. For questions about your rights as a study participant, please contact RTI's Office of Research Protection toll free at 1-866-214-2043.

To review the letter that we mailed, Click Here.
To review the study brochure, Click Here.

May we begin the interview now? (y/n)

Screen 3 – if material not yet read by sample member:

The B&B study is being conducted for the U.S. Department of Education by RTI International. The purpose of B&B is to collect information about students' education during and after earning the bachelor’s degree, how they paid for that education, their employment afterwards, and to update demographic information. We’d like you to participate.

The interview takes, on average, about 25 minutes. You are one of approximately 1,800 students who will be taking part in this study.

Your responses, combined with student record information such as financial aid data, may be used only for statistical purposes and may not be disclosed, or used, in personally identifiable form for any other purpose, unless otherwise compelled by law. Your participation is voluntary and will not affect any aid or other benefits that you may receive. The risk of participation in this study relates to data security and is minimal, given the strict confidentiality and security procedures in place. You may decline to answer any question or stop the interview at any time.

If you have questions about the study, you may contact Dr. Jennifer Wine, toll free at 1-877-225-8470. For questions about your rights as a study participant, please contact RTI's Office of Research Protection toll free at 1-866-214-2043.

At your request we can remail the material to you, or you may review the materials by clicking on the button below. To request that the study materials be mailed to you, please call the B&B Help Desk toll free at 1-877-262-4440.

To review the letter that we mailed, Click Here.
To review the project brochure, Click Here.

May we begin the interview now? (y/n)

g.Plan for Reducing Nonresponse Bias

The overall nonresponse bias analysis plan for the full-scale B&B:08/09 study is to use data that are available for both responding and nonresponding sample members to estimate bias in estimates due to unit nonresponse. There will be considerable information on sample members available from the base year study (NPSAS:08) to facilitate the analysis (e.g., NPSAS response status, age, attendance status, geographic region, number of telephone numbers obtained, and aid status). Any variables found to have significant bias due to nonresponse will be included in the weight adjustment model so that the bias may be reduced for analyses based on the final statistical analysis weights.

Additionally, we will consider adding as a predictor in the model an indicator of incentive received or time period when interview was completed, i.e., early response, production response, or nonresponse conversion. Another possible analysis would compare sample members who responded at different points in time, i.e., early responders, production responders, and nonresponse conversions, to determine if bias was introduced by completing interviews at different points in time with different incentives. After completing weight adjustments, the reduction or removal of significant unit nonresponse bias will be validated by comparing the distribution of key variables, known for most respondents and nonrespondents using the final weights after weight adjustments, with the full sample distribution before nonresponse adjustment.
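The weight adjustment described above can be illustrated with a minimal weighting-class sketch: within cells defined by frame variables (e.g., NPSAS response status crossed with region), respondent base weights are inflated to carry the weight of nonrespondents in the same cell. The sample records and cell definitions below are invented for illustration:

```python
from collections import defaultdict

# Minimal sketch of a weighting-class nonresponse adjustment. Within each
# cell, the adjustment factor is (total base weight) / (respondent base
# weight), so respondents carry the weight of nonrespondents in their cell.

def adjust_weights(sample):
    """sample: list of dicts with 'cell', 'base_weight', and 'responded'.
    Returns {record index: adjusted weight} for respondents only."""
    total = defaultdict(float)
    resp_total = defaultdict(float)
    for case in sample:
        total[case["cell"]] += case["base_weight"]
        if case["responded"]:
            resp_total[case["cell"]] += case["base_weight"]

    adjusted = {}
    for i, case in enumerate(sample):
        if case["responded"]:
            factor = total[case["cell"]] / resp_total[case["cell"]]
            adjusted[i] = case["base_weight"] * factor
    return adjusted

sample = [
    {"cell": "A", "base_weight": 10.0, "responded": True},
    {"cell": "A", "base_weight": 10.0, "responded": False},
    {"cell": "B", "base_weight": 5.0,  "responded": True},
]
weights = adjust_weights(sample)
# The respondent in cell A now carries the nonrespondent's weight as well.
```

In practice the cells (or a response propensity model) would be built from the predictors named in the text, and significantly biased variables would be retained in the model; this sketch shows only the adjustment mechanics.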

Unfortunately, the direct effect of incentives on bias cannot be measured accurately during the full-scale data collection because, as designed, all sample members will be offered at least one incentive. The incentives are intended specifically to increase the overall response rate, which—if effective—will reduce overall nonresponse bias. Any significant bias remaining after weight adjustments may indicate that either the weighting or the incentives did not decrease or eliminate nonresponse bias as expected. Therefore, in an effort to measure bias due to nonresponse, we will conduct comparisons of respondent/nonrespondent characteristics during the field test, realizing that the sample sizes will be quite limiting.

4.Tests of Procedures and Methods

The following sections briefly discuss four areas of data collection believed to affect overall study participation, which will be evaluated during the B&B:08/09 field test data collection. These areas are (1) visibility of mailout materials, (2) notification by cell phones/text messaging, (3) prepaid incentives, and (4) nonresponse conversion incentives. This section also introduces plans for experiments in these areas during the B&B:08/09 field test.

a.Visibility of Mailout Materials

Much research on survey response has focused on how the type of outgoing mail affects response rates. In particular, the method of mail delivery has been found to be an important factor. For instance, Abreu and Winters (1999) found that Priority Mail was effective in increasing response rates among nonresponding cases. Similarly, Moore and An (2001) found the use of Priority Mail in a prenotification mailing and a reminder mailing to be most effective in their mail questionnaire survey. Additionally, Fox et al. revealed that first-class mail yielded higher response rates than bulk mail (Fox, Crask, and Jonghoon, 1988).

The reason is obvious: content is ineffective if the envelope is ignored or assumed to be junk mail. Couper et al. found that in one-half of all households, mail is sorted by only one person, and that 60 percent of people discard mail without opening it (Couper, Mathiowetz, and Singer, 2005). It is therefore imperative that researchers maximize the chances of their mailings being read. Increasing the look of legitimacy can go a long way toward ensuring that the mail is opened by the intended recipient, thereby increasing the likelihood of survey response.

While research regarding the impact that the size of mailout materials has on response rates is not extensive, there is some evidence to suggest that packaging size is important. Dillman, Mahon-Haft, and Parson (2004) conducted cognitive interviews on the use of larger packages in comparison to traditional-size packages and found that larger packages may improve response rates: respondents appeared more likely to notice and open the larger packages, which in turn seemed to motivate them to read the correspondence contained within.

We believe that using larger, more visible envelopes will signal the importance of the information contained in the package, increasing the likelihood that the materials will be read and, in turn, the likelihood of survey participation. We propose to test the impact of the visibility of mailout materials on participation rates.

Prior to the start of data collection, the field test sample will be randomly assigned to two groups: one group will receive the initial study materials in a large envelope via regular mail, and the other group will receive the same materials in a large envelope via Priority Mail. The initial mailing will contain important information about the study, as well as the information described in section B.3.a. Results will be measured by comparing the participation rates of the two groups at the end of the early response period to determine whether the method of delivery affects participation.

Ho: There will be no difference in early participation rates between those who receive the initial mailing in a large envelope via Priority Mail and those who receive it in a large envelope via regular mail.
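A comparison of this kind would typically be evaluated with a two-proportion z-test on the two groups' participation rates. The counts below are invented purely for illustration; the field test's actual sample sizes and rates will differ:

```python
import math

# Illustrative two-proportion z-test for H0: p1 == p2, using the pooled
# proportion under the null hypothesis. Counts here are hypothetical.

def two_prop_z(success1, n1, success2, n2):
    """z statistic comparing two independent sample proportions."""
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# e.g., 42% vs. 36% early participation in two groups of 500
z = two_prop_z(210, 500, 180, 500)
# |z| > 1.96 would reject H0 at the 0.05 level (two-sided)
```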

b.Use of Cell Phone Calling and Texting

The use of cell phone calling and text messaging is a relatively new means of contacting sample members, and little research has been conducted on the effects text messaging has on participation rates. Research by Brick et al. suggests that prenotifying sample members by text message yields response rates nearly equal to those of control groups (Brick, Brick, Dipko, Presser, Tucker, and Yuan, 2007). According to Lambries et al., households using primarily cell phones required more contact attempts than households using both landline and cell phones or landline only: on average, 1.1 more attempts than landline-only households and 0.8 more attempts than households with both (Lambries, Link, and Oldendick, 2006).

However, text messaging does have some advantages as the first means of contacting sample members. It may help identify working numbers and, in turn, increase the efficiency of the calling process (Steeh, Buskirk, and Callegaro, 2007). Steeh et al. conclude that text messages can be advantageous as the first means of contact for two reasons: outcome rates are substantially improved, and information about the working status of the number is obtained.

Further research in this field is needed in order to better understand the effects cell phone calling and text messaging have on participation rates.

Ho: The use of text messages and instant messaging as additional means of contacting sample members will have no effect on early participation rates.

c.Use of Prepaid Incentives

There is considerable evidence that prepaid incentives increase survey response more than promised incentives alone. While not without operational challenges, the B&B:08/09 field test provides an opportunity to test the impact of no prepayment with the promise of a $35 check upon interview completion, versus a $5 check or $5 cash prepayment, each paired with the promise of a $30 check upon interview completion.

Ho: Prepaid incentives, $5 cash or $5 check, will have no effect on the proportion of respondents who complete the web-based, self-administered interview during the first four weeks of data collection.

d.Incentives during Production Interviewing

Prior results (BPS:04/06 field test) suggested that paying an incentive during the production interviewing phase of data collection does increase the likelihood that sample members will participate, although the effect was not a strong one. Consequently, the experiment is proposed again for the B&B:08/09 field test. At the end of the early response period, during which sample members will be paid $35 for a completed interview, interviewers will begin contacting the remaining sample members to attempt a telephone interview. These sample members will be randomly assigned to a $0 or a $20 incentive group. At the end of data collection, we will compare the interview completion rates of the two groups to determine whether offering the $20 incentive yields a worthwhile gain in participation.

Ho: Production interviewing incentives of $20 will have no effect on the participation rates.
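The end-of-collection comparison between the $0 and $20 groups can be sketched with a standard two-sample z-test for a difference in proportions. The counts below are hypothetical illustrations, not study results; the group size of 592 follows the planned allocation.

```python
import math

def two_proportion_z(completes_a, n_a, completes_b, n_b):
    """Two-sample z-test for a difference in participation rates.
    Returns the z statistic; |z| > 1.96 indicates a difference
    significant at the 95 percent confidence level."""
    p_a = completes_a / n_a
    p_b = completes_b / n_b
    # Pooled rate under the null hypothesis of no difference
    p_pool = (completes_a + completes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: 148/592 completes in the $0 group vs. 183/592 in the $20 group
z = two_proportion_z(148, 592, 183, 592)
```

With these illustrative counts the statistic exceeds 1.96, so the null hypothesis of equal participation rates would be rejected at the 95 percent confidence level.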

e.Experimental Testing

A summary of the four main experiments proposed for the B&B:08/09 field test is provided below. We also provide detail about the field test sample and its allocation to each of the experimental cells, and discuss the assumptions made in developing the design.

Null Hypotheses

One-way comparisons

  1. There will be no difference in participation rates during the early response period for those who receive the study materials and survey invitation via Priority Mail and those who receive the study materials via regular mail but in a large envelope.

  2. There will be no difference in participation rates during the early response period for those who are notified about the start of data collection using text messages or instant messages during the early response period and those who are not.

  3. There will be no difference in participation rates during the early response period between those who are offered only a promised $35 incentive upon interview completion and those who are offered a $5 prepayment (check or cash) plus a promised $30 incentive upon interview completion.

Two-way comparisons

  1. There will be no difference in participation rates during the early response period for those who receive the study materials and survey invitation via Priority Mail and who are notified about the start of data collection using text messages or instant messages when compared with all others.

  2. There will be no difference in participation rates during the early response period for those who receive the study materials and survey invitation via Priority Mail and who are offered a $5 check or $5 cash and a promise of a $30 incentive upon interview completion when compared with all others.

  3. There will be no difference in participation rates during the early response period for those who are notified about the start of data collection using text messages or instant messages and who are offered a $5 check or $5 cash and a promise of a $30 incentive upon interview completion when compared with all others.

Three-way comparison

  1. There will be no difference in participation rates during the early response period for those who receive the study materials and survey invitation via Priority Mail, who are notified about the start of data collection using text messages or instant messages and who are offered a $5 check or $5 cash and a promise of a $30 incentive upon interview completion when compared with all others.

Production interviewing period

  1. There will be no difference in participation rates achieved during the production interviewing phase when respondents are paid a $20 incentive compared to participation rates when respondents are paid no incentive.

Detectable Differences

As part of the planning process for developing the field test experiment design, the participation rate differences between the control and treatment groups necessary to detect statistically significant differences will be estimated. That is, how large a difference must be observed to conclude that the participation rates of the two groups differ. Table 8 shows the expected sample sizes and the statistically detectable difference for each of the eight hypotheses. Several assumptions were made regarding participation rates and sample sizes. In general, the closer the participation rate is to 50 percent (from either direction), the larger the detectable difference. Likewise, the smaller the sample size, the larger the detectable difference.
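As a rough check, detectable differences of this kind can be approximated with the normal approximation for comparing two proportions. This is a simplified sketch assuming a two-sided test at the 95 percent confidence level; it ignores refinements (such as power targets or continuity corrections) that the study's own calculations may include, so its results are close to, but not necessarily identical with, the tabulated values.

```python
import math

def detectable_difference(p, n1, n2, z=1.96):
    """Approximate percentage-point difference in participation rates
    detectable with 95 percent confidence, given a baseline rate p and
    group sizes n1 and n2 (normal approximation, two-sided test)."""
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return 100 * z * se

# Example: baseline rate of 35 percent with 910 sample members per group
d = detectable_difference(0.35, 910, 910)  # roughly 4.4 percentage points
```

Note how the detectable difference grows as the group sizes shrink, consistent with the discussion above.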

Assumptions:

  1. The sample will be equally distributed across experimental cells.

  2. All ineligible cases will be included in the analysis because ineligibility will be determined after the interview begins.

  3. All 1,819 sample members will be included in the mailout and prepaid incentive experiments; sample members for whom a cell phone number was obtained during NPSAS or the January address update mailing to students and parents will be included in the text messaging experiment.

  4. Cell phone numbers will be known for two-thirds of the sample members.

  5. The participation rate for the control group for hypotheses 1 through 7 will be 35 percent.1

  6. The participation rate for the control group for hypothesis 8 will be 25 percent.




Table 8. Detectable differences for field test experiment hypotheses

Hypothesis | Control group definition | Control sample size | Treatment group definition | Treatment sample size | Detectable difference with 95 percent confidence
1 | Regular mail, large envelope | 910 | Priority Mail, large envelope | 910 | 4.5
2 | No text messaging | 606 | Text messaging | 606 | 5.5
3 | Promise of $35 | 910 | $5 check or cash and $30 promise | 910 | 4.5
4 | All others | 909 | Priority Mail, large envelope and text messaging | 303 | 6.4
5 | All others | 1,365 | Priority Mail, large envelope and $5 check or cash and $30 promise | 454 | 5.2
6 | All others | 909 | Text messaging and $5 check or cash and $30 promise | 303 | 6.4
7 | All others | 1,060 | Priority Mail, large envelope, text messaging, and $5 check or cash and $30 promise | 152 | 8.4
8 | No production interviewing incentive | 592 | $20 production interviewing incentive | 592 | 5.1


5.Reviewing Statisticians and Individuals Responsible for Designing and Conducting the Study

Names of individuals consulted on statistical aspects of study design along with their affiliation and telephone numbers are provided below.

Name | Affiliation | Telephone
Dr. Lutz Berkner | MPR | (510) 849-4942
Dr. Susan Choy | MPR | (510) 849-4942
Dr. E. Gareth Hoachlander | MPR | (510) 849-4942
Dr. Ellen Bradburn | MPR | (510) 849-4942
Dr. Robin Henke | MPR | (510) 849-4942
Dr. John Riccobono | RTI | (919) 541-7006
Dr. Jennifer Wine | RTI | (919) 541-6870
Dr. James Chromy | RTI | (919) 541-7019
Dr. Karol Krotki | RTI | (202) 728-2485
Mr. Peter Siegel | RTI | (919) 541-6348

In addition to these statisticians and survey design experts, the following statisticians at NCES have also reviewed and approved the statistical aspects of the study: Dr. Dennis Carroll, Dr. James Griffith, Dr. Tracy Hunt-White, Dr. Paula Knepper, Ms. Kristin Perry, and Dr. Tom Weko.

6.Other Contractors’ Staff Responsible for Conducting the Study

The study is being conducted by the Postsecondary Longitudinal Studies Branch of the National Center for Education Statistics (NCES), U.S. Department of Education. NCES's prime contractor is RTI International (RTI). RTI is being assisted through subcontracted activities by MPR Associates. Principal professional staff of the contractors assigned to the study, not listed above, are provided below:

Name | Affiliation | Telephone
Ms. Vicky Dingler | MPR | (510) 849-4942
Ms. Emily Forrest-Cataldi | MPR | (510) 849-4942
Ms. Stephanie Nevill | MPR | (510) 849-4942
Mr. Jeff Franklin | RTI | (919) 541-2614
Ms. Gayathri Bhat | RTI | (919) 541-8013
Ms. Melissa Cominole | RTI | (919) 990-8456
Ms. Kristin Dudley | RTI | (919) 541-6855
Ms. Tiffany Mattox | RTI | (919) 485-7791
Mr. Brian Kuhr | RTI | (312) 456-5263




1 35 percent is used here as a baseline because it is consistent with participation rates obtained during the early response period from past studies.

Supporting Statement Request for OMB Review (SF83i) 44
