
Baccalaureate and Beyond Longitudinal Study, Third Followup (B&B:09)

Responses to OMB

OMB: 1850-0729


Appendix I
Response to OMB Questions
Dated January 25, 2008



Memorandum

United States Department of Education

Institute of Education Sciences

National Center for Education Statistics



TO: Edie McArthur
Assistant to the Commissioner, NCES

DATE: February 1, 2008


FROM: Kristin Perry, Postsecondary Studies Division, NCES

Jennifer Wine, RTI International

SUBJECT: Responses to OMB Questions, dated January 25, 2008 (pertaining to ED-CO-05-0011, Option 6, B&B:08/09)



1. What is the overall nonresponse bias analysis plan(s) for B&B?  There are several references to incentives potentially increasing response and thereby reducing potential bias.  But how will this be measured?  How will NCES be able to demonstrate that such steps have not increased bias?


The overall nonresponse bias analysis plan for the full-scale B&B:08/09 study is to use data that are available for both responding and nonresponding sample members to estimate the bias in survey estimates due to unit nonresponse. Considerable information on sample members will be available from the base-year study (NPSAS:08) to facilitate the analysis (e.g., NPSAS response status, age, attendance status, geographic region, number of telephone numbers obtained, and aid status). Any variable found to have significant bias due to nonresponse will be included in the weight adjustment model so that the bias may be reduced for analyses based on the final statistical analysis weights. Additionally, we will consider adding as a predictor in the model an indicator of the incentive received or of the time period in which the interview was completed (i.e., early response, production response, or nonresponse conversion). Another possible analysis would compare sample members who responded at different points in time, i.e., early responders, production responders, and nonresponse conversions, to determine whether completing interviews at different times with different incentives introduced bias. After the weight adjustments are complete, the reduction or removal of significant unit nonresponse bias will be validated by comparing the distributions of key variables that are known for most respondents and nonrespondents, computed for respondents using the final adjusted weights, with the corresponding full-sample distributions before nonresponse adjustment.
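
To make the comparison concrete, the sketch below is a minimal illustration of the before/after check described above, not part of the study plan; the column names ("responded", "base_weight", "adjusted_weight") and the use of pandas are assumptions. It estimates the bias in a frame variable as the difference between the weighted respondent mean and the full-sample mean, computed with the base weights and again with the nonresponse-adjusted weights.

    # Minimal sketch, not the study's actual code: estimate unit nonresponse
    # bias for a frame variable known for both respondents and nonrespondents.
    # Column names (responded, base_weight, adjusted_weight) are hypothetical.
    import pandas as pd

    def nonresponse_bias(sample: pd.DataFrame, var: str) -> dict:
        def wmean(df: pd.DataFrame, wt: str) -> float:
            # Weighted mean of the frame variable.
            return (df[var] * df[wt]).sum() / df[wt].sum()

        full_mean = wmean(sample, "base_weight")                   # full sample, base weights
        resp = sample[sample["responded"] == 1]
        bias_before = wmean(resp, "base_weight") - full_mean       # respondents, base weights
        bias_after = wmean(resp, "adjusted_weight") - full_mean    # respondents, adjusted weights
        return {
            "bias_before": bias_before,
            "relative_bias_before": bias_before / full_mean,
            "bias_after": bias_after,
            "relative_bias_after": bias_after / full_mean,
        }

Under this sketch, a variable with substantial relative bias before adjustment would be a candidate for the weight adjustment model, and the "after" figures correspond to the validation step described above.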

However, the direct effect of incentives on bias cannot be measured accurately during the full-scale data collection because, as designed, all sample members will be offered at least one incentive. The incentives are intended specifically to increase the overall response rate, which, if effective, will reduce overall nonresponse bias. Any significant bias remaining after weight adjustments may indicate that either the weighting or the incentives did not decrease or eliminate nonresponse bias as expected.

To further address potential nonresponse bias, it may be possible to compare respondent and nonrespondent characteristics during the field test, although the field test sample sizes will be quite limited.


2. Please provide the full report (or a link) from where the 2003 incentives experiments discussion was excerpted.

See pages 21-22 of the report:


Wine, J., Cominole, M., Carwile, S., Franklin, J., Carley-Baxter, L., and Wheeless, S. (2004). 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03) Field Test Methodology Report (NCES 2004-02). U.S. Department of Education. Washington, DC: National Center for Education Statistics.


Web link: http://www.nces.ed.gov/pubsearch/pubsinfo.asp?pubid=200402


3. Does NCES intend to study the number of “good” addresses from the various locating sources, as well as the unique contributions of each vendor (versus duplicative contributions)?  The submission currently refers only to the number of addresses received, some of which may be erroneous or duplicative of information being received from other sources. 


For batch processing, it is possible to determine which locating source provided the address at which the sample member was ultimately located and interviewed; this stage therefore lends itself to analysis. In tracing the B&B sample members, for example, RTI will first request address updates for the entire sample from three batch tracing databases: (1) the Central Processing System (CPS) database, (2) the U.S. Postal Service’s National Change of Address (NCOA) database, and (3) Telematch. These sources are noncompeting, since each provides unique locating data. That is, matches to CPS return the permanent addresses of students who have applied for financial aid. Those addresses, combined with any other addresses for the entire sample, will be sent to NCOA for updating. The updated addresses will then be sent to Telematch for new telephone numbers. All resulting data will be loaded into the case management system to be used for mailouts, prompting, and initial calls by interviewers. Because the source of each piece of information is retained in the database, RTI can analyze and report on the source of the information obtained, although multiple sources may provide duplicative information. When that occurs, there is usually a higher probability that the sample member is at that location.
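
As a simple illustration of how the source of each update can be retained for later analysis, the sketch below applies CPS, NCOA, and Telematch matches in sequence and keeps an audit trail of which source supplied each address or telephone number. The record layout and helper function are hypothetical stand-ins, not the actual case management system.

    # Minimal sketch, assuming a hypothetical record layout rather than the
    # actual case management system: apply batch-trace matches in order
    # (CPS, then NCOA, then Telematch) and retain the source of each update.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class CaseRecord:
        case_id: str
        address: Optional[str] = None
        phone: Optional[str] = None
        sources: List[str] = field(default_factory=list)  # audit trail of update sources

    def apply_update(record: CaseRecord, source: str,
                     address: Optional[str] = None,
                     phone: Optional[str] = None) -> None:
        """Record a batch-trace match and note which source supplied it."""
        if address:
            record.address = address
            record.sources.append(f"{source}:address")
        if phone:
            record.phone = phone
            record.sources.append(f"{source}:phone")

    # Example flow for one case: CPS permanent address, NCOA forwarding
    # update, then a Telematch phone lookup on the updated address.
    case = CaseRecord(case_id="0001", address="last known address")
    apply_update(case, "CPS", address="permanent address from aid application")
    apply_update(case, "NCOA", address="forwarding address on file")
    apply_update(case, "Telematch", phone="new telephone number")
    print(case.sources)  # ['CPS:address', 'NCOA:address', 'Telematch:phone']

Tallying the final entry in each case's audit trail at the end of data collection would show which source supplied the address at which the sample member was located, while the full trail would show duplicative contributions from multiple sources.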


Analyzing the quality of locating sources during intensive tracing is considerably more complicated, which makes it more difficult to evaluate the quality of any particular vendor’s information. Once telephone interviewers submit a case to RTI’s intensive tracing process, locating becomes a process of networking. As a result, it is difficult to define precisely what constitutes a “good” address. For example, tracers will start with the interview record, which contains the last known/best addresses. If the sample member is not found at that location, tracers will probe to determine whether he or she ever lived at the location, whether anyone there knows the sample member, and whether a new telephone number or address can be provided. If a new number is obtained, tracers will add it to the record and attempt to reach the sample member. They will continue to network in this way until they locate the sample member. No single vendor can be identified as providing the correct address, but each of several vendors may have provided the pieces that led to the sample member. Consequently, simply reporting hits and misses by vendor would understate each individual vendor’s value to the tracing process.


The management team for RTI’s tracing unit has over 30 years of tracing experience, RTI quality control supervisors have at least 3 years of tracing experience, and tracing staff receive 12 hours of classroom training and 8 hours of production tracing before they begin work on a project such as B&B. Over the years, RTI has made extensive use of locating sources such as Accurint, Axiom, Choicepoint, Equifax, Experian, FastData, LexisNexis, and TransUnion. These sources are monitored continually for quality and cost-effectiveness. When vendors are found to be deficient, they are phased out temporarily or permanently, depending on the reasons and circumstances.


Although there are some exceptions, most credit bureaus (e.g., Equifax, Experian) and proprietary databases (e.g., FastData, Accurint) gather most of their information from identical or very similar sources. What makes each of them unique and independently valuable is its regional coverage, logic structure, search criteria, and output format. RTI tracers will recommend the sources best suited to a particular study based on the characteristics of the sample to be located. For example, while FastData was recommended for locating parents participating in NCES’ Early Childhood Longitudinal Study (ECLS), Accurint was recommended for locating student sample members participating in B&B and the Beginning Postsecondary Students (BPS) Longitudinal Study.



