Attachment J Literature review

Conservation Auction Behavior: Effects of Default Offers and Score Updating

OMB: 0536-0078
CRP Enrollment: Relevant Literature on Nudges and Experiments


Background Nudges: Defaults and Anchoring


Nudges are “any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives” (Thaler and Sunstein 2009, p. 6). In other words, nudges are low-cost interventions made at the time of a decision, such as default options, feedback, and anchors. Nudges can have large effects on behavior; in a meta-analysis of 100 experiments using nudges, two-thirds of the effects are statistically significant and the median effect size is 21% (Hummel and Maedche 2019). Nudges are an important tool for policymakers because they are relatively small interventions with the potential to adjust behavior without significantly changing decision makers’ incentives or choice options (Benartzi et al. 2017).


Defaults are choices or settings that require an individual to take a deliberate action to deviate from them. Status quo bias describes a decision-maker’s reluctance to change from the default. A striking example is the dramatic difference in organ donor registration between countries with opt-in versus opt-out systems (Johnson and Goldstein 2003). Defaults have been shown to have large behavioral impacts in a number of contexts, including increasing voluntary contributions (Messer et al. 2007; Messer, Suter and Yan 2013), reducing over-prescription of opioids (Chiu et al. 2018), and increasing tips for taxi rides (Haggag and Paci 2014).


Anchoring is a cognitive bias that describes the tendency to rely too heavily on the first piece of information available when making a decision. It is a well-studied and well-documented behavioral bias (Kahneman 2011). In the context of consumer goods, psychologists and behavioral economists have reported that anchoring can influence valuations (Ariely, Loewenstein and Prelec 2003; Maniadis, Tufano and List 2014). Results from these studies, and their interpretations, have generated disagreement about the stability of consumer preferences (Maniadis et al. 2014; Enke et al. 2020). There is further disagreement about whether anchoring influences inexperienced consumers’ valuations more than experienced consumers’ (Alevy, Landry and List 2015; Clark and Ward 2008; Löfgren et al. 2012), and there is some evidence that the impact of anchoring on market outcomes fades over time (Alevy et al. 2015). Anchoring has also been shown to change behavior across many contexts, including willingness to pay for environmental action (Li et al. 2019), bids in initial public offering auctions (Gao et al. 2018), and farmers’ bidding practices (Holst, Hermann and Musshoff 2015); see Furnham and Boo (2011) for a review.


Other nudge strategies include providing additional information and feedback. Feedback nudges have been shown to have persistent effects on water consumption (Chabé-Ferret et al. 2019), and information and social comparisons increase compliance with water protection rules (Peth and Mußhoff 2020).



Experimental Papers on Conservation


Review papers are available for the hundreds of experimental economics studies of auctions and auction-like “games” (in the economic sense of the term) (e.g., Dechenaux et al. 2015) and for the specific domain of conservation auctions (Schilizzi 2017). A subset of the conservation auction literature examines information provision during auctions (Messer et al. 2014).


In addition, the following review of conservation auctions is reproduced from the ERS white paper on payments (Attachment H to this ICR package).


In looking for experimental studies of conservation auctions, we identified thirty candidate studies. More than half of these did not include sufficient information on payments to participants, or estimates of relevant treatment effects, to be included here. Eleven studies provide sufficient information and estimate a treatment effect on either average rent or total cost. Some of these studies find that withholding information on ranking can reduce information rents (Cason and Gangadharan 2004; Banerjee et al. 2015), while other studies find that withholding ranking information can also reduce benefits (Conte and Griffin 2017). Several studies suggest that pay-as-bid (discriminatory) pricing can reduce costs relative to uniform pricing (Cason and Gangadharan 2004), but that this ordering can reverse if contract compliance decisions are taken into account (Kawasaki et al. 2012). Other key issues covered in these studies include how the dynamics of repeated auctions can improve net benefits even while increasing rents (Fooks et al. 2015), the prevalence of adverse selection in these auctions (Arnold et al. 2013), incentives for offer quality improvement (Banerjee and Conte 2018), the impact of excessively restrictive bid caps (Hellerstein et al. 2015), the impact of ranking by benefit-cost ratio (Iftekhar and Tisdell 2014; Fooks et al. 2015), multiple interacting auctions (Tisdell and Iftekhar 2013), and the role of communication and trust between participants and program administrators (Vogt et al. 2013).
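The pay-as-bid versus uniform-price comparison in the paragraph above can be made concrete with a minimal sketch (the function name, bid values, and selection rule here are ours, purely illustrative; the cited studies use richer designs):

```python
def total_payment(bids, n_winners, pricing):
    """Reverse (procurement) auction: accept the n_winners lowest bids.

    'discriminatory' (pay-as-bid): each winner is paid their own bid.
    'uniform': every winner is paid the lowest rejected bid.
    """
    ranked = sorted(bids)
    accepted = ranked[:n_winners]
    if pricing == "discriminatory":
        return sum(accepted)
    if pricing == "uniform":
        return ranked[n_winners] * n_winners  # price set by the first losing bid
    raise ValueError(f"unknown pricing rule: {pricing}")

# Hypothetical offers from five landowners for a conservation contract.
bids = [15, 25, 35, 45, 55]
pay_as_bid = total_payment(bids, 3, "discriminatory")  # 15 + 25 + 35 = 75
uniform = total_payment(bids, 3, "uniform")            # 45 * 3 = 135
```

Holding bids fixed, uniform pricing always pays weakly more than pay-as-bid; the experimental question is how bidding behavior itself shifts across formats, which is why the cost ranking can reverse once compliance decisions are modeled.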


In the studies identified above, participants were asked to make offers in multiple auctions; the total number of auctions ranged from a low of 8 to a high of 65. In about half of the studies, auctions included multiple rounds, which are opportunities to revise offers within an auction.1 Usually a participant is given a single item (e.g., a field) on which to make an offer, but in some cases participants were given multiple items. The combination of auctions, rounds, and multiple items means that over the course of a single session a participant could make a large number of offer decisions. The most involved experiments required 91 (Banerjee et al. 2015), 108 (Cason and Gangadharan 2004), and 130 decisions (Fooks et al. 2015).
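The decision counts above are simply multiplicative; a short sketch with hypothetical parameters (not taken from any one cited study) shows the arithmetic:

```python
def offer_decisions(n_auctions, rounds_per_auction, items_per_round):
    # One offer decision per (auction, round, item) combination.
    return n_auctions * rounds_per_auction * items_per_round

# E.g., a hypothetical session with 10 auctions, 3 revision rounds per
# auction, and offers on 2 fields per round:
n = offer_decisions(10, 3, 2)  # 60 decisions
```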


References


Alevy, J.E., C.E. Landry, and J.A. List. 2015. “Field experiments on the anchoring of economic valuations.” Economic Inquiry 53(3):1522–1538.

Ariely, D., G. Loewenstein, and D. Prelec. 2003. “‘Coherent Arbitrariness’: Stable Demand Curves Without Stable Preferences.” The Quarterly Journal of Economics 118(1):73–106. Available at: https://academic.oup.com/qje/article-lookup/doi/10.1162/00335530360535153 [Accessed August 13, 2020].

Arnold, M.A., J.M. Duke, and K.D. Messer. 2013. “Adverse selection in reverse auctions for ecosystem services.” Land Economics 89(3):387–412.

Banerjee, S., A.M. Kwasnica, and J.S. Shortle. 2015. “Information and Auction Performance: A Laboratory Study of Conservation Auctions for Spatially Contiguous Land Management.” Environmental and Resource Economics 61(3):409–431.

Bardsley, N., R. Cubitt, G. Loomes, P. Moffat, C. Starmer, and R. Sugden. 2010. Experimental Economics: Rethinking the Rules. Princeton University Press.

Banerjee, S., and M.N. Conte. 2018. “Information Access, Conservation Practice Choice, and Rent Seeking in Conservation Procurement Auctions: Evidence from a Laboratory Experiment.” American Journal of Agricultural Economics 100(5):1407–1426.

Benartzi, S., J. Beshears, K.L. Milkman, C.R. Sunstein, R.H. Thaler, M. Shankar, W. Tucker-Ray, W.J. Congdon, and S. Galing. 2017. “Should Governments Invest More in Nudging?” Psychological Science 28(8):1041–1055.

Cason, T.N., and L. Gangadharan. 2004. “Auction Design for Voluntary Conservation Programs.” American Journal of Agricultural Economics 86(5):1211–1217.

Chabé-Ferret, S., P. Le Coent, A. Reynaud, J. Subervie, and D. Lepercq. 2019. “Can we nudge farmers into saving water? Evidence from a randomised experiment.” European Review of Agricultural Economics 46(3):393–416.

Chiu, A.S., R.A. Jean, J.R. Hoag, M. Freedman-Weiss, J.M. Healy, and K.Y. Pei. 2018. “Association of Lowering Default Pill Counts in Electronic Medical Record Systems with Postoperative Opioid Prescribing.” JAMA Surgery 153(11):1012–1019.

Clark, J., and S. Ward. 2008. “Consumer Behavior in Online Auctions: An Examination of Partitioned Prices on eBay.” Journal of Marketing Theory and Practice 16(1):57–66.

Conte, M.N., and R.M. Griffin. 2017. “Quality Information and Procurement Auction Outcomes: Evidence from a Payment for Ecosystem Services Laboratory Experiment.” American Journal of Agricultural Economics 99(3):571–591.

Dechenaux, E., D. Kovenock, and R.M. Sheremeta. 2015. “A Survey of Experimental Research on Contests, All-Pay Auctions and Tournaments.” Experimental Economics 18(4):609–669.

Enke, B., U. Gneezy, B. Hall, D. Martin, V. Nelidov, T. Offerman, and J. Van De Ven. 2020. “Cognitive Biases: Mistakes or Missing Stakes?”

Fooks, J., K.D. Messer, and J. Duke. 2015. “Dynamic Entry, Reverse Auctions, and the Purchase of Environmental Services.” Land Economics 91(1):57–75.

Furnham, A., and H.C. Boo. 2011. “A literature review of the anchoring effect.” Journal of Socio-Economics 40(1):35–42. Available at: http://dx.doi.org/10.1016/j.socec.2010.10.008.

Gao, S., Q. Meng, J.Y. Chan, and K.C. Chan. 2018. “Cognitive reference points, institutional investors’ bid prices, and IPO pricing: Evidence from IPO auctions in China.” Journal of Financial Markets 38:124–140. Available at: http://dx.doi.org/10.1016/j.finmar.2017.09.002.

Haggag, K., and G. Paci. 2014. “Default tips.” American Economic Journal: Applied Economics 6(3):1–19.

Hellerstein, D., N.A. Higgins, and M. Roberts. 2015. “Options for Improving Conservation Programs: Insights from Auction Theory and Economic Experiments.” Amber Waves, February.

Holst, G.S., D. Hermann, and O. Musshoff. 2015. “Anchoring effects in an experimental auction - Are farmers anchored?” Journal of Economic Psychology 48:106–117. Available at: http://dx.doi.org/10.1016/j.joep.2015.03.008.

Hummel, D., and A. Maedche. 2019. “How effective is nudging? A quantitative review on the effect sizes and limits of empirical nudging studies.” Journal of Behavioral and Experimental Economics 80(March):47–58. Available at: https://doi.org/10.1016/j.socec.2019.03.005 [Accessed October 1, 2020].

Iftekhar, M.S., and J.G. Tisdell. 2014. “Wildlife Corridor Market Design: An Experimental Analysis of the Impact of Project Selection Criteria and Bidding Flexibility.” Ecological Economics 104:50–60.

Johnson, E.J., and D. Goldstein. 2003. “Do Defaults Save Lives?” Science 302(5649):1338–1339.

Kahneman, D. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

Kawasaki, K., T. Fujie, K. Koito, N. Inoue, and H. Sasaki. 2012. “Conservation Auctions and Compliance: Theory and Evidence from Laboratory Experiments.” Environmental and Resource Economics 52(2):157–179.

Löfgren, Å., P. Martinsson, M. Hennlock, and T. Sterner. 2012. “Are experienced people affected by a pre-set default option-Results from a field experiment.” Journal of Environmental Economics and Management 63(1):66–72.

Maniadis, Z., F. Tufano, and J.A. List. 2014. “One Swallow Doesn’t Make a Summer: New Evidence on Anchoring Effects.” American Economic Review 104(1):277–290.

Messer, K.D., J.M. Duke, and L. Lynch. 2014. “Applying Experiments to Land Economics: Public Information and Auction Efficiency in Ecosystem Service Markets.” In The Oxford Handbook of Land Economics. Oxford University Press, p. 481.

Messer, K.D., J.F. Suter, and J. Yan. 2013. “Context Effects in a Negatively Framed Social Dilemma Experiment.” Environmental and Resource Economics 55(3):387–405.

Messer, K.D., H. Zarghamee, H.M. Kaiser, and W.D. Schulze. 2007. “New hope for the voluntary contributions mechanism: The effects of context.” Journal of Public Economics 91(9):1783–1799.

Peth, D., and O. Mußhoff. 2020. “Comparing Compliance Behaviour of Students and Farmers. An Extra-laboratory Experiment in the Context of Agri-environmental Nudges in Germany.” Journal of Agricultural Economics 71(2):601–615.

Schilizzi, S.G. 2017. “An Overview of Laboratory Research on Conservation Auctions.” Land Use Policy 63:572–583.

Thaler, R.H., and C.R. Sunstein. 2009. Nudge: Improving Decisions About Health, Wealth, and Happiness. Penguin.

Tisdell, J.G., and M.S. Iftekhar. 2013. “Fisheries Quota Allocation: Laboratory Experiments on Simultaneous and Combinatorial Auctions.” Marine Policy 38:228–234.

Vogt, N., A.F. Reeson, and K. Bizer. 2013. “Communication, Competition and Social Gift Exchange in an Auction for Public Good Provision.” Ecological Economics 93:11–19.






1 The terminology in this literature is inconsistent: studies variously use the terms sessions, rounds, periods, trials, and eras. Typically the term “session” refers to a single experimental session, i.e., one group of participants completing a full run of the experiment. Here we use “auction” to refer to one full auction and “round” to refer to a revision opportunity within an auction, although several of the papers call a full auction a “period” or a “round.”

Author: Laura Paul
File Created: 2022-04-07