Passback Q and A document


Research to support the National Crime Victimization Survey (NCVS)




BJS Passback Response to OMB

Generic Clearance, OMB Number 1121-0325.

February 9, 2012



  1. Please update the incentives discussion in the summary document and in the incentives appendix for the other federal surveys per the additional information provided below.  It seems that your analysis is based in part on dated information and in part on a less than complete understanding of the NHES and NSECE findings. 


NHES and NSECE Findings


Thank you for providing more current information about the incentive plans for the NHES and NSECE. The results of these studies yield important findings on prepaid incentives for federal agencies and the research community. We have taken this opportunity to review the OMB packages for these studies and have updated our descriptions of the NHES and NSECE studies in Attachment 1 (Incentives).


Our understanding of NHES is that the 2011 field test is designed to "further refine an optimal strategy for the use of incentives" in the topical component of the survey. In addition to the prepaid incentive offered with the initial mail screener, NHES is experimenting with the amount of the topical survey incentive ($5 and $15) based on the timing of the screener response. Similarly, the incentive strategy to be deployed for the NSECE is informed by the outcomes of previous incentive experiments implemented during its 2011 field test. Like NHES, the NSECE will offer a $2 incentive in the first mailing of the household screener, an amount that proved more successful than the $1 incentive tested in the NSECE field test. An additional prepaid incentive of $5 will be mailed with the eligibility notification letter that precedes the in-person interviews. Both NHES and NSECE are building upon previous incentive experiments and are at the stage of refining strategies and optimal incentive amounts for their methodologies.


How the Survey of Crime Victimization is Different


In developing the incentive strategy for the Survey of Crime Victimization (SCV), we considered the use of prepaid incentives, which have been repeatedly endorsed in the literature (Singer, 2002). However, there are important distinctions between the SCV and studies such as NHES and NSECE that utilize prepaid strategies. The SCV offers respondents multiple survey modes (CAPI, CATI, inbound CATI, and Web) and has substantially different eligibility criteria (all individuals versus one screened individual). During the development of the SCV, these factors encouraged us to adopt a promised, rather than prepaid, incentive approach. Additionally:


  1. The respondent selection process differs substantially between the SCV and NHES/NSECE. Prepaid incentives are generally sent to a household in the expectation that a member of the household will cooperate with the survey request. The person responding to the initial survey request can be any one person residing in the household. This is a key distinction for the SCV, which is designed to elicit survey responses from multiple unknown household members during the first contact: in this case, all adults age 18 and older.


  2. Instrumentation differs among the SCV and NHES/NSECE studies. For the SCV, the modified crime screener is not a separate instrument from the crime incident report, nor is the screener designed to identify a single respondent who will receive topical questions (e.g., the crime incident form) at a later date. The SCV incident report serves two purposes: (1) to provide the details needed to classify and unfound incidents, and (2) to identify the number of Crime Incident Reports to be completed during the interview.


  3. There are important mode differences among the studies. Unlike the NHES and NSECE, the SCV asks sample members to answer questions not only by telephone or in person but also by inbound CATI or via the Web. These self-administered modes are especially important during Wave 2, in which respondents are asked to complete their interviews by calling in to a CATI facility or navigating to a website.


This mixed-mode design originally included mail instruments in addition to the inbound CATI and Web modes. However, due to the complexity of the NCVS, a mail instrument will not be implemented in the SCV field test. Cognitive testing of the draft mail survey demonstrated that the terminology and cognitive tasks associated with the NCVS made adaptation to a mail instrument impossible without an overhaul of the questionnaire.


While the advantages of prepaid over promised incentives have been demonstrated consistently for mail surveys (see Church, 1993, for a meta-analysis), there is no research demonstrating significant advantages of prepaid over promised incentives in interviewer-administered surveys (see Singer et al., 1999, for a meta-analysis). The effectiveness of prepaid incentives on participation rates remains unknown when the survey request is separate from the actual interview, as is the case with the self-administered modes of inbound CATI and Web. By design, NHES and NSECE are able to include a prepaid incentive with the mailed survey request, while the SCV is not.



Justification for Incentive Strategy


Our proposed incentive structure reflects the absence of research examining the effect of a single prepaid household incentive on survey participation by all eligible members of the household.


When initially designing this study, we recommended testing a promised $10 incentive for three reasons. First, the SCV employs incentives in order to examine whether response rates comparable to those of the ongoing NCVS can be achieved in self-administered modes (inbound CATI and Web). While a lower promised amount (e.g., $5) might suffice, in the absence of previous tests using NCVS instruments or the sampled population, this test could be jeopardized by low response rates. Second, without prior knowledge of household composition, we are unable to determine the number of prepaid incentives that would be required in each household. Third, the $10 amount has not been tested as extensively as a prepaid $5 incentive in self-administered modes, specifically inbound CATI and Web. The hypothesis is that a promised incentive may work differently for Web and inbound CATI than for a telephone or mail survey.


The design of this research would therefore add to the literature on self-administered modes, specifically inbound CATI and Web, about which little is known. Additionally, we are guided by the work of Groves (2008), which argues that there is value in a permanent randomized incentive component in every survey, rather than reliance on methodological studies that are informative for a particular estimate, from a particular survey, at a particular time. Given the length of the SCV interview (estimated respondent burden for the study is 7-8 minutes per crime Screener, plus 8-9 minutes for each completed Crime Incident Report; a respondent reporting two incidents, for example, would face roughly 23-26 minutes in total) and other key design features (address-based sample, selection of all age-eligible adults in each household), we believe a $10 promised incentive is the optimal incentive strategy. Please see Attachment 1 for revisions to our more detailed discussion of our incentive strategy.



BJS has funded two projects examining the use of self-administered modes in the NCVS, and we are keenly aware of how these two projects will ultimately dovetail to inform a redesigned NCVS.

The purpose of these projects is to examine the viability of less expensive alternative modes of survey administration. If results from this mode research indicate that response rates for inbound CATI and Web are affected by incentives, then BJS may pursue additional testing to refine an optimal incentive amount. At this time, we are deeply concerned that the absence of a sufficiently motivating incentive, promised to all members of the household upon completion of the survey, will adversely affect response rates. BJS recently received approval to test Interactive Voice Response (IVR) technology utilizing a $10 promised incentive. We believe that the current study will add to our IVR efforts in assessing the viability of self-administered modes in a redesigned NCVS.


Clarification on Inclusion of NSFG and NSDUH in Previous Justification


Our discussion of incentives in Attachment 1 (Incentives) includes a brief summary of incentive amounts offered by other federal surveys. As OMB has pointed out, some of these studies, like the National Survey of Family Growth (NSFG), place a greater burden on respondents as a result of the sensitive nature of some survey items and the length and complexity of the interview itself.


Although shorter in length, the SCV also includes questions that may be deemed sensitive by some respondents. This is especially the case for victims of crimes such as intimate partner violence, rape or sexual assault, robbery, or aggravated assault. In developing our incentive strategy, we examined the incentive protocols of a number of federal surveys, including those that collect sensitive data, like the NSFG, to better understand the range of incentives being offered to respondents given their survey mode options and response burden.


In the updated Attachment 1 we have cited additional examples of federal surveys that offer incentives, most of which are conditional on the completion of the survey. This information was helpful as we considered the optimal incentive amount to test against a $0 incentive condition in the SCV. The $10 promised amount was chosen in light of prior research showing that the promised incentives with significant effects (compared to a no-incentive condition), offered at the completion of the survey, were at least $5, with most being $15 or more (Yu and Cooper, 1983; Strouse and Hall, 1997; Singer et al., 2000; Cantor et al., 2003).
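
For concreteness, the household-level random assignment implied by this two-condition test could look like the following minimal sketch (our illustration only; the function name, household IDs, and an even split between conditions are assumptions, not the SCV sample design):

    import random

    # Hypothetical illustration: assign each sampled household to the $10
    # promised-incentive condition or the $0 control condition. The even
    # split, the seed, and all names are assumptions made for illustration.
    def assign_incentive_conditions(household_ids, seed=42):
        rng = random.Random(seed)  # fixed seed so the assignment is reproducible
        return {hid: 10 if rng.random() < 0.5 else 0 for hid in household_ids}

    print(assign_incentive_conditions(["HH001", "HH002", "HH003", "HH004"]))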



  2. Please remove references to the incentive payment being “compensation for your time” in the cover to the survey and letters. Incentives are not payments. You can use “expression of appreciation” or similar.


We have revised the wording in the survey materials to refer to incentives as “tokens of appreciation” rather than “compensation for your time.” The revised materials are provided in Attachment 11 (Contacting Sampled Addresses and Gaining Cooperation) and changes are highlighted in yellow for ease of review. Below is a list of specific items that have been modified:


  • Exhibits 11-1 and 11-3: Consent forms for incentive treatment group

  • Exhibits 11-6 and 11-8: Lead letters for incentive treatment group

  • Exhibit 11-13: Return letter for incentive treatment group

  • Exhibit 11-15: Thank you letter for incentive treatment group

  • Exhibits 11-17, 11-19, and 11-21: Nonresponse letters for incentive treatment group

  • Exhibits 11-23, 11-25, and 11-27: Refusal letters for incentive treatment group

  • Exhibits 11-28 and 11-30: Thank you/reminder cards for incentive treatment group

  • Exhibit 11-33: Incentive receipt


  3. The race question should include the instruction “please check one or more” – you may NOT use “all that apply.” PLEASE check this before submitting to OMB in the future. We have to correct this way too often.


We apologize for the oversight in the wording of several race questions in the SCV instruments. We have revised the wording as follows:


  • Attachment 3 - CATI/CAPI Address Verification and Household Enumeration Questionnaire:

    • Section B, Question 2F – Changed the “all that apply” interviewer instruction to “Please select one or more.”

  • Attachment 4a - CATI/CAPI Screener:

    • Questions 6 and 8g – Changed “check one or more” instruction to “Please select one or more.”

  • Attachment 4b - CATI/CAPI Crime Incident Report:

    • Question 44b – Original wording was: “What race or races was the offender? You may select more than one. Was the offender...” Changed to “What race or races was the offender? Please select one or more. Was the offender…”

    • Question 56c – Original wording was: “What race or races were the offenders? Were they...” Changed to “What race or races were the offenders? Please select one or more. Were they…”

  • Attachment 5 - Web Survey Instrument:

    • Page 2, respondent race question – Changed the “all that apply” instruction to “Please select one or more.”

    • Question 34a – Original wording was: “What race or races was the offender? You may mark more than one. Was the offender...” Changed to: “What race or races was the offender? Please select one or more. Was the offender…”

    • Question 43b – Original wording was: “What race or races were the offenders? Were they...” Changed to “What race or races were the offenders? Please select one or more. Were they…”


These changes are highlighted in yellow in the revised instruments attached to this response.


  4. Please clarify if the household questionnaire and the individual questionnaires are administered during the same visit/call. And is the initial contact incentivized once or twice (if in experimental condition)?


The household respondent will receive the crime Screener and any required Crime Incident Reports during the same visit/call. These are programmed in both CATI and CAPI (and Web) to flow seamlessly from the Screener into the Crime Incident Reports when one or more crime incidents are reported by the respondent. In the event of a break-off, the interviewer will attempt to schedule an appointment to recontact the respondent and finish the interview.
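
As a minimal illustration of this flow (our own sketch; the actual instruments are programmed in CATI/CAPI/Web survey software, and every name below is hypothetical):

    # Hypothetical sketch of the interview flow described above; the real
    # instruments are implemented in survey software, not Python.
    def administer_screener(respondent):
        # Ask the crime Screener items and return any incidents reported.
        return respondent.get("reported_incidents", [])

    def administer_incident_report(incident):
        # Collect one Crime Incident Report for a single reported incident.
        return {"incident": incident, "details": None}  # placeholder details

    def administer_interview(respondent):
        # Flow directly from the Screener into one report per incident.
        return [administer_incident_report(i)
                for i in administer_screener(respondent)]

    # A respondent reporting two incidents completes two incident reports
    # during the same visit/call:
    reports = administer_interview({"reported_incidents": ["burglary", "assault"]})
    print(len(reports))  # 2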


Because the mode of initial contact will be the same for all members of a sampled household, interviewers will attempt to complete interviews with any other eligible adult household members during the same call/visit with the household respondent. However, follow-up appointments will be set if one or more individual respondents are not available during the interviewer’s visit/call.





References


Bosnjak, Michael, and Tracy Tuten. 2003. “Prepaid and Promised Incentives in Web Surveys: An Experiment.” Social Science Computer Review 21(2): 208-217.


Cantor, D., K. Wang, and N. Abi-Habib. 2003. “Comparing Promised and Prepaid Incentives for an Extended Interview on a Random Digit Dial Survey.” Paper presented at the Annual Conference of the American Association for Public Opinion Research, Nashville, TN.


Church, Allan H. 1993. “Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis.” Public Opinion Quarterly 57: 62-79.


Dillman, D. A. 2007. Mail and Internet Surveys: The Tailored Design Method, 2nd ed. New York: Wiley and Sons.


Groves, Robert. 2008. “The Future of Incentive Research.” Paper presented at the Council of Professional Associations on Federal Statistics panel on Survey Respondent Incentives, Washington, DC, March 2008.


Singer, Eleanor, John Van Hoewyk, Nancy Gebler, Trivellore Raghunathan, and Katherine McGonagle. 1999. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics 15: 217-230.


Singer, Eleanor. 2002. “The Use of Incentives to Reduce Nonresponse in Household Surveys.” In Survey Nonresponse, edited by Robert M. Groves et al., pp. 163-178. New York: Wiley.


Singer, Eleanor, John Van Hoewyk, and Mary Maher. 2000. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly 64: 171-188.


Strouse, R. C., and J. W. Hall. 1997. “Incentives in Population-Based Health Surveys.” Proceedings of the American Statistical Association, Survey Research Methods Section, pp. 952-957.


Yu, J., and H. Cooper. 1983. “A Quantitative Review of Research Design Effects on Response Rates to Questionnaires.” Journal of Marketing Research 20.




