

OMB No. 0910-xxxx

Exp. Date xx/xx/xxxx

Attachment 5: Incentives


Incentives for the Evaluation of the Public Education Campaign on Teen Tobacco (ExPECTT)



This section describes our plan for the promised incentive. First, we provide background on the Evaluation of the Public Education Campaign on Teen Tobacco (ExPECTT) and the benefit of using incentives with self-administered survey modes. Next, we briefly summarize the literature on the use of incentives, highlighting a number of federal surveys, including several conducted by the Census Bureau, that offer incentives to respondents. We then discuss prepaid and promised incentives, followed by the rationale for selecting the $20 promised incentive amount for the data collection.



Background


ExPECTT will rely on a combination of face-to-face and web-based interviews during data collection. Once an interviewer has established rapport with a household during an initial face-to-face interview, subsequent ExPECTT interviews may in some cases be conducted online. Regardless of the mode of administration, survey response rates are declining in the U.S. This challenge leads us to explore incentives to increase respondent participation.



Use of Incentives


The mechanisms that evoke higher participation when incentives are used are unclear. Two competing theories suggest that incentives may be construed either as a token of appreciation (social exchange theory) or as compensation for one's time and effort (economic exchange theory). Which mechanism is dominant may not make a difference in cross-sectional surveys, but it would likely affect cooperation in panel surveys, where the decision to participate in the first wave is, to a certain extent, a commitment to take part in following waves, and where the experience in the first wave is likely to be the most influential factor in future decisions to participate (Singer et al., 1998).


Longitudinal surveys often use incentives to build initial rapport with panel respondents, as participation in the baseline wave usually sets the retention rate for the life of the panel (Singer et al., 1998). For this reason, sizable incentives in the first wave of data collection are often recommended (Singer et al., 1998). For example, in an incentive experiment in Wave 1 of the 1996 Survey of Income and Program Participation (SIPP, U.S. Census Bureau), James (1997) found that a $20 prepaid incentive significantly lowered nonresponse rates in Waves 1-3 compared with both the $10 prepaid and the $0 conditions. Mack et al. (1998), examining cumulative response through Wave 6, found that a $20 incentive reduced household, person, and item (gross wages) nonresponse rates in the initial interview and that cumulative household nonresponse rates remained significantly lower at Wave 6 (24.8 percent in the $20 group vs. 27.6 percent in the $0 group and 26.7 percent in the $10 group), even though no further incentive payments were made.


In addition, there seems to be no evidence that incentives create an expectation of payment in subsequent waves of data collection. For example, research on the Health and Retirement Survey (HRS) suggests that respondents who are paid a refusal conversion incentive during one wave do not refuse at a higher rate than other converted refusers when reinterviewed during the next wave (Lengacher et al., 1995). Similarly, Singer et al. (1998) found that respondents in the Survey of Consumer Attitudes who had received a monetary incentive in the past were more likely to participate in a subsequent survey, despite receiving no further payments.


This research seeks to test two experimental conditions that represent different combinations of interviewer- and self-administered modes. The most efficient design would offer incentives only to respondents interviewed by web or inbound CATI, the modes that lack an interviewer's motivating presence. However, mixed-mode designs employ combinations of modes, and respondents in the same household are often interviewed in different modes. To treat respondents in the same household equally, and to provide comparisons across modes that are not confounded by the offer of an incentive, we need to offer incentives to everyone in the household, regardless of mode.


A common argument against the use of incentives is their cost. Yet incentives can reduce the cost per case by reducing the follow-up effort required with sample members who do not respond. Such evidence is provided by the incentive experiments conducted for the National Survey on Drug Use and Health (NSDUH, Substance Abuse and Mental Health Services Administration). Cost per interview in the $20 group was 5 percent lower than in the control (no incentive) group, and in the $40 group costs were 4 percent lower than in the control. The cost savings were gained by interviewers spending less time trying to obtain cooperation from respondents (Kennet et al., 2005) and were realized through reduced interviewer labor as well as reduced travel costs (mileage, tolls, parking, etc.). Similar results were obtained in an incentive experiment conducted for the National Survey of Family Growth (NSFG, National Center for Health Statistics) Cycle 5 Pretest, which examined $0, $20, and $40 incentive amounts. As in the NSDUH experiments, the additional incentive costs were more than offset by savings in interviewer labor and travel costs (Duffer et al., 1994).
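

To illustrate the scale of such offsets (the dollar figures here are hypothetical, and we assume the reported per-interview costs include the incentive payments): if the control group's cost per completed interview were $200, the 5 percent saving reported for the $20 group would imply roughly $190 per case; since $20 of that amount is the incentive itself, interviewer labor and travel would account for about $170, a savings of roughly $30 per case relative to the control.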


In addition to NSDUH and NSFG, many other federally sponsored surveys offer incentives to gain cooperation. For example, the National Health and Nutrition Examination Survey (NHANES, National Center for Health Statistics) offers respondents up to $125, depending on the number of survey sections and exams that are completed. The National Survey of Adoptive Parents of Children with Special Health Care Needs (Department of Health and Human Services) offers parents $25 for participation in a 35-minute telephone survey. In 2002, to improve response rates, reduce the number of contacts required to gain cooperation, and address respondent concerns about interview burden, the National Survey of Child and Adolescent Well-Being (NSCAW, Administration for Children and Families) doubled the incentive offered to respondents from $25 to $50. The Early Childhood Longitudinal Study-Birth Cohort (ECLS-B, U.S. Department of Education) offered parent participants $50 and a children's book for the first wave and $30 and a children's book for subsequent waves of data collection. Over rounds 1 through 10 of the National Longitudinal Survey of Youth 1997 (NLSY97, Bureau of Labor Statistics) cohort, incentives offered to respondents ranged from $10 to $50 in an attempt to minimize attrition across waves of data collection. The National Immunization Survey (NIS, National Center for Immunization and Respiratory Diseases) offers a combination of $5 prepaid and $10 promised incentives to encourage eligible nonrespondents to participate.


As noted earlier, the U.S. Census Bureau has also experimented with and begun offering incentives in several of its longitudinal panel surveys, including SIPP and the Survey of Program Dynamics (SPD). SIPP has conducted several multi-wave incentive studies, most recently with its 2008 panel, comparing results for $10, $20, and $40 incentive amounts with those for a $0 control group. These studies have examined response rate outcomes in various subgroups of interest (e.g., the poverty stratum), the use of targeted incentives for noninterview cases, and the impact of base-wave incentives on participation in later waves of data collection. Overall, the results suggest that $20 incentives increase response rates and also improve the conversion rate for noninterview cases. Incentives may have an additional impact on response rates for households in the poverty stratum and may significantly reduce item nonresponse rates (see Creighton et al., 2007; Clark and Mack, 2009). Similarly, SPD has conducted four incentive studies, testing $20, $40, $50, and $100 amounts in an effort to increase cooperation among poverty households and nonrespondents and to minimize attrition in subsequent waves of the study. Incentives were found to have a positive impact on both response and attrition rates; most recently, the fourth incentive study found that the average interview rate increased substantially with the use of incentives (Creighton et al., 2007).



Prepaid vs. Promised Incentives


Studies in the survey literature predominantly find prepaid incentives to be more effective than promised ones (e.g., Linsky, 1975, and Armstrong, 1975, for overviews; Church, 1993). However, in this research we will not have prior information on the composition of any sampled household because we are sampling addresses. Since we need to interview every eligible person in the household and offer the same incentive to all sample persons without prior knowledge of the number of household members, it would be challenging, if not impossible, to offer prepaid incentives in some conditions. For this reason, testing a promised incentive is recommended.


Various studies have demonstrated a significant effect of promised incentives compared with a no-incentive condition. For example, Cantor et al. (2003) found an almost 10 percent increase in response rate when promising $20 (vs. no incentive) in an RDD survey of caregivers of children ages 0-17. In a meta-analysis of 39 controlled experiments, Singer et al. (1998) found that the effect of prepaid incentives on response rates did not differ significantly from the effect of promised incentives. Consistent with these findings, other studies (e.g., Yu and Cooper, 1983) have also found that promised incentives significantly improve response rates. Promised incentives are fairly common at the refusal conversion stage, and a number of studies have reported gains in response rates from offering relatively large amounts of money ($25 or greater) at the end of the data collection period (e.g., Olson et al., 2004; Curtin et al., 2005).


The decision to use prepaid or promised incentives is often determined by the mode of data collection; for example, prepayments are usually difficult to accomplish in telephone interviews. Some research indicates that the difference between prepaid and promised incentives is less pronounced in certain modes; for example, Bosnjak and Tuten (2003) report that it is not a relevant issue in web surveys.



Incentive Amount


In theory, incentives aid participation in two ways. First, they can be conceived of as a "token" that elicits social exchange between the sample member and the survey organization and sponsor, with each side doing something good for the other party without regard to economic value. Small prepaid incentives can be viewed as merely invoking goodwill under social exchange theory. Some argue that what is achieved is not social exchange but rather the establishment of the legitimacy of the survey request; even so, it is not the amount that is central. Second, while larger incentive amounts can also be seen as invoking social exchange, they can directly motivate sample members to participate by providing a direct benefit to the respondent in exchange for the time and burden of answering the survey questions. These can therefore be promised, conditional on completion of the survey. It is possible that too small an amount for a promised incentive will not help to increase participation, as it can be seen as placing too little value on the respondent's time. Although the cognitive mechanisms through which incentives influence survey participation are not well understood, other fields have proposed theories and empirical evidence showing that small monetary incentives intended to provide extrinsic motivation can "backfire" and that the incentive needs to be larger for it to work. A critical risk in experiments is the dosage of the manipulation (the incentive amount, in this case), and being too conservative can jeopardize the success of the entire experiment. The additional concern here is that too small a promised incentive may fall below the threshold many respondents expect in exchange for an approximately 15-20 minute survey.


The choice of an incentive amount depends largely on the survey burden, including the survey length and any other tasks required of the respondent, the survey topic, and whether the incentive is promised or prepaid. Promised incentives tend to be larger than prepaid incentives; Strouse and Hall (1997) recommend that, to be successful, promised incentives be in the range of $15 to $35. As noted above, a number of federally funded surveys, including the NSDUH and the NSFG, currently provide incentives. For example, interviewers in the NSDUH currently offer $30 (for an interview that averages 60 minutes); interviewers in the NSFG offer $40 (for interviews that last about 60 minutes for males and 80 minutes for females). Incentives in the NHANES range from $20 to $100, depending on the survey and physical exam components in which respondents choose to participate.


In addition to the payment method (prepaid or promised), careful consideration has been given to the incentive amount to be tested in this research. Of particular importance is achieving response rates sufficient to analyze the effectiveness of the self-administered modes (inbound CATI and Web) during data collection. Based on the study design, the estimated respondent burden (7-8 minutes per Screener, plus 8-9 minutes for each completed Crime Incident Report), and the sampling methodology, which involves the selection of all age-eligible adults in each sampled household, we believe a $10 promised incentive is the optimal amount for this research. This strategy is based both on an examination of the survey literature and on results from several recent studies that analyzed the effectiveness of small prepaid and promised incentives in increasing participation in screening and topical surveys. The National Household Education Survey (NHES, U.S. Department of Education) tested $2 and $5 prepaid incentives for its mail screener, which was expected to take 2-8 minutes to complete depending on the survey version used. The study also tested the effectiveness of $5 and $15 prepaid incentives for respondents who screened eligible for the topical survey. The study found that the larger prepaid incentive amounts ($5 vs. $2, $15 vs. $5) achieved higher response rates. The NHES also offered a $5 promised incentive to a subset of respondents to encourage them to participate in the topical survey by telephone. Although higher response rates (6-8 percent) were achieved with the $5 promised incentive, none of the observed differences were statistically significant (Tubman and Williams, 2010).
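

For illustration only (the household composition and completion assumptions here are hypothetical): in a household with two age-eligible adults who each complete the Screener and one Crime Incident Report, total respondent burden would be roughly 2 x (7.5 + 8.5) = 32 minutes, and the promised incentive would be paid twice. Because the incentive cost scales with household size in this way, a modest per-person amount keeps total incentive costs manageable across sampled households.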


The National Survey of Early Care and Education (NSECE, Administration for Children and Families) also recently tested the effectiveness of small prepaid incentives in its field test, conducted earlier this year. Specifically, it tested $1 and $2 prepaid incentives with its mail screener and conducted a refusal conversion incentive experiment aimed at increasing household completion rates among respondents who screened eligible for the household or home-based provider surveys. For the latter, two experimental conditions were fielded, and households were randomly assigned to receive either a $5 prepaid incentive or a $5 prepaid incentive plus a $10 promised incentive upon completion of the interview. Based on the field test results (not yet released), which showed the $2 prepaid screener incentive outperforming the $1 incentive, a more aggressive incentive strategy is planned for the main study to increase response rates and reduce the effort required to contact and gain cooperation from households. Although the design of the main study is still being finalized, ACF staff have indicated that the study plans to offer a $2 prepaid incentive with the screener mailing and an additional $20 promised incentive for household and home-based provider survey respondents.







