Note to Reviewer

2006-11-08 Note to Reviewer of 1220-0157 regarding in-kind incentive experiment.doc

National Longitudinal Survey of Youth 1997

OMB: 1220-0157


Note to Reviewer of 1220-0157


The National Longitudinal Survey of Youth 1997 (NLSY97) has experienced a secular decline in response rates, which is a cause of increasing concern. The response rate for the ninth round of the survey, which ended in the summer of 2006, was 82.2 percent. By comparison, the response rate for the older NLSY79 cohort was 83.2 percent in the nineteenth round of data collection that was completed in 2000. Arresting the decline in NLSY97 response rates is vital to making full use of these valuable data. Longitudinal data gain value as information is collected from the same respondents over time, giving researchers the ability to examine how early events are tied to later outcomes. Little is known about how to increase or even maintain waning response rates in longitudinal surveys, however.


In October 2006, OMB approved the BLS proposal to offer all NLSY97 Round 10 sample members a base incentive of $30, an increase from the $20 offered since Round 5. OMB also approved the BLS proposal to pay $10 extra per missed round to prior-round nonrespondents who resume their participation in Round 10, with a cap of $30 extra. At that time, BLS indicated to OMB its intent to conduct an experiment during Round 10 to assess the effectiveness of personalized in-kind gifts for respondents. This document describes prior NLSY97 incentive experiments, identifies the questions that remain, and details the proposed design of the Round 10 experiment.


Previous NLSY97 Incentive Experiments


Prior NLSY97 experiments with respondent incentives have provided mixed evidence that increasing incentive payments will raise response rates. Our experience suggests that modest increases in respondent incentives yield modest gains in response rates.


An experiment in Round 4 increased the respondent incentive to $15 for one third of the sample and to $20 for another third. The control group was offered $10, the same incentive offered in prior rounds. Sample members offered $20 were more likely to participate than those in the control group, but sample members offered $15 were less likely to participate than the control group.


In Round 7, Round 6 nonrespondents were randomly separated into a control group and a treatment group. The control group received $20 for completing the interview, and the treatment group received an additional $5 for each missed round, up to a maximum of $15. The treatment group not only had a higher response rate in Round 7; the impact persisted through Round 8. That is, response rates remained higher for the treatment group than for the control group in the round following the experiment.


During Round 9, BLS conducted an experiment in high-priced metropolitan areas: Washington, Baltimore, San Francisco, San Jose, Los Angeles, Denver, Boston, New Haven, and New York City. One third of respondents received a $10 gift card in addition to their regular $20 payment for completing the interview; the control group received only the $20 payment. The treatment group's response rate was 1.84 percentage points higher than the control group's, but the difference was not statistically significant.
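Significance for a gap of this kind is typically checked with a two-proportion z-test. The sketch below is illustrative only: the memo does not report the group sizes behind the 1.84-point difference, so the counts used here are hypothetical.

```python
from statistics import NormalDist

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                    # pooled response rate
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: 336/400 (84.0%) treatment vs. 657/800 (82.1%) control,
# roughly the 1.8-point gap reported for the Round 9 gift-card experiment.
z, p = two_prop_ztest(336, 400, 657, 800)
print(f"z = {z:.2f}, p = {p:.2f}")  # p well above 0.05: not significant at these sizes
```

With group sizes in the hundreds, a difference of under two percentage points falls comfortably inside sampling noise, which is consistent with the null result reported above.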


Remaining Questions


BLS proposes conducting an experiment in Rounds 10 and 11 to study two questions that will help us craft an effective long-term incentive policy:

  1. Can targeted in-kind incentives produce higher response rates than cash incentives?

  2. Will large increases in incentives have significantly greater impact than smaller increases in the current round and in subsequent rounds of data collection?


As we learned from the Response Rate Conference that BLS hosted in March 2004, sample members appreciate being treated as individuals. Consequently, targeting incentives to the individual circumstances of each sample member may increase the likelihood of response. One advantage that the NLSY97 has in this regard is the large amount of background information available for each sample member. From our data, we know whether a respondent is a college student, a married man juggling a young family and nascent career, a single young professional, or a struggling single parent. Many of our interviewers have been contacting the same sample members for many years and may know other personal details such as whether they have a pet or a hobby. Using this knowledge, BLS can test whether personalized in-kind gifts to our respondents will result in higher response rates than straight monetary incentives.


NORC field personnel feel that these personal gestures often are valued by sample members more than money because they are a gracious, less crass alternative to “buying” respondents’ time. They are acts of consideration rather than impersonal cash transactions. The field staff believes that such courtesy earns the goodwill of respondents for whom being treated respectfully, as a valued participant, matters more than an increase in cash payments. If so, targeted in-kind incentives may also have longer-term benefits than increased monetary incentives.


We have seen that small increases in incentive amounts can have marginally positive effects on response rates. However, we do not know how sample members, especially recalcitrant ones, will react to a significant boost in incentive payments. Two factors push us toward offering such incentives. First, fielding costs in Round 9 were actually lower than in the two previous rounds (14 percent below Round 8 costs), at least partly because we could not productively spend any more money on reluctant sample members. We attempt to work cases until we judge that one more contact would reduce the likelihood of future-round participation more than it would increase the likelihood of a current-round interview. At the end of Round 9, we felt additional contacts would be counterproductive. If we have more to offer respondents, we hope to reduce the number of contacts we need to make.


Second, we know that, if nonrespondents from a given round agree to participate in the next round, the chance that they will respond again in subsequent rounds is at least triple that of sample members who did not respond in the next round. From Round 4 to Round 8, the response rate of returning nonrespondents in the next round averaged 72.9 percent, compared to only 21.6 percent for repeat nonrespondents. If we can convert nonrespondents into respondents, there is reason to expect that we will be able to retain them in the future. We propose to investigate whether a significant boost in incentives would arrest the decline in response rates and lay the foundation for slower rates of attrition in the future.



Experimental Design for Rounds 10 and 11


For Round 10 of the NLSY97, which began in October 2006 and is expected to continue until May 2007, BLS requests OMB permission to conduct an experiment involving larger in-kind and cash incentives for the least cooperative sample members. BLS proposes a 2-year experiment that will divide sample members into three randomly assigned groups: a control group, a group that receives in-kind compensation chosen in part at the discretion of the interviewer, and a group that receives cash payments.


Randomization: Randomization will occur on approximately December 26, 2006, when interviews have been completed with 5,000 respondents who can be regarded as very cooperative. Randomization will occur across families so that siblings are treated equally. If a sample member in the experiment has a sibling who had already completed a Round 10 interview before the experiment began, the siblings will be placed into the same group for Round 11. The remaining sample (about 4,000 sample members, from which we expect about 2,500 completions) will be split into three equal-sized groups: the control group, the discretionary in-kind treatment group, and the cash payment group.
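A family-level randomization of this kind can be sketched as follows. This is an illustration of the general technique, not the actual NORC assignment procedure; the group labels and sample IDs are hypothetical.

```python
import random

GROUPS = ("control", "in_kind", "cash")

def assign_by_family(family_ids, seed=None):
    """Randomize at the family level so that siblings always share a group.

    Families are shuffled, then dealt round-robin into the three groups,
    which keeps the groups approximately equal in size.
    """
    rng = random.Random(seed)
    families = sorted(set(family_ids))          # one entry per family
    rng.shuffle(families)
    family_group = {fam: GROUPS[i % len(GROUPS)] for i, fam in enumerate(families)}
    return [family_group[fam] for fam in family_ids]

# Illustrative sample: two siblings from family 101 plus three singletons.
labels = assign_by_family([101, 101, 102, 103, 104], seed=7)
assert labels[0] == labels[1]  # siblings land in the same group by construction
```

Randomizing at the family rather than the individual level sacrifices a little statistical efficiency but avoids the contamination and fairness problems of paying siblings in the same household different amounts.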


Control group: The control group will continue to receive the current respondent incentive of $30 in cash. In addition, respondents who have missed previous rounds will continue to receive the increased payments previously approved. This regime will continue for Round 11 as well.


Discretionary in-kind treatment group: The first treatment group will receive the previously approved $30, and respondents who have missed previous rounds also will be eligible for the increased payments of $10 per missed round. Respondents in this group will be eligible to receive in-kind payments that average $20 in value, with a maximum value of $30. All respondents in this treatment group will receive some form of in-kind incentive. BLS proposes to give field managers and field interviewers discretion to determine the in-kind incentive that they judge will be most effective at securing cooperation. This regime will continue for Round 11 as well.


Examples of the in-kind incentive include the following:


  • Bring a pizza with a 2-liter bottle of a soft drink to help limit the family disruption during the evening meal.

  • When an interviewer meets a respondent during the respondent’s work lunch or break, the interviewer might purchase a latte from the coffee shop or buy the respondent’s lunch.

  • For respondents with young children, the interviewer might purchase coloring books, a G-rated video, or other distraction for the young children.

  • For respondents with pets, the interviewer might buy a bag of dog treats and a pet toy.

  • Interviewers also might offer gift cards to stores such as Best Buy, Wal-Mart, Target, and gas stations.


Cash payment treatment group: For the second treatment group, BLS proposes increasing the respondent payment to $50 in cash. As with the control group and the in-kind treatment group, respondents in the cash treatment group who have missed previous rounds will receive the payments of $10 per missed round. This regime will continue for Round 11 as well.


The purpose of this large increase in payments (a $30 increase in the respondent incentive from Round 9 to Round 10) is to examine whether such a large increase will have any impact on response rates. The timing of the experiment is ideal for detecting an impact: because respondents who have already completed the survey by that point cannot be affected by the increased payments, we have effectively stratified the sample by willingness to participate. A failure to detect an impact of the increased incentives among this less cooperative group would provide strong evidence that respondent incentives are not effective at increasing response rates.
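As a rough check on what effect size the remaining sample could detect, the standard normal-approximation formula for the minimum detectable difference between two proportions can be applied. The figures below are back-of-the-envelope assumptions derived from the projection of about 4,000 remaining sample members (roughly 1,333 per group) and about 2,500 completions (a 62.5 percent baseline response rate); they are not official power calculations.

```python
from math import sqrt
from statistics import NormalDist

def min_detectable_diff(n_per_group, base_rate, alpha=0.05, power=0.80):
    """Minimum detectable difference in response rates between two equal-sized
    groups (two-sided test, normal approximation)."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)   # z_{alpha/2} + z_{beta}
    return z * sqrt(2 * base_rate * (1 - base_rate) / n_per_group)

# Assumed: ~4,000 remaining members split three ways, ~62.5% expected response.
mde = min_detectable_diff(n_per_group=1333, base_rate=0.625)
print(f"detectable gap: about {100 * mde:.1f} percentage points")
# → detectable gap: about 5.3 percentage points
```

Under these assumptions, only a fairly large swing in response rates (on the order of five percentage points per pairwise comparison) would be reliably detectable, which is consistent with testing a large incentive increase rather than another modest one.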


Remaining respondents: In Round 11, respondents who had completed their survey prior to random assignment and who are not siblings of participants included in the experiment will continue to receive the $30 respondent payment authorized for Round 10.



This experimental design will allow the NLS program to test the effectiveness of discretionary in-kind incentives and large increases in cash incentives.


We want to see whether giving field managers and field interviewers the ability to customize incentives is more effective than simply offering higher respondent payments. Because NORC staff members interact directly with the sample members, our hope is that this discretion will prove more effective.


By offering a large increase in respondent payments, BLS hopes to determine whether large increments have large impacts on response rates. Our previous experiments involved relatively modest increments in the respondent incentive; with a large increase, we hope to see a large movement in response rates.


Running the experiment for two years also affords us the opportunity to look at the dynamic responses to the increased incentive. Thus, we can learn how sample members react in the second year. Do respondents come to expect such large increases in the incentive? Do any first-year improvements in response rates persist into the second year (as our previous results suggest)? Does the increase in the incentive encourage previously reluctant sample members to become more cooperative? Do they complete an interview earlier in the fielding period?


File Type: application/msword
File Title: Note to Reviewer of 1220-0157
Author: Amy Hobby
Last Modified By: Amy Hobby
File Modified: 2006-11-16
File Created: 2006-11-16
