
Supporting Statement A

Conservation Auction Behavior: Effects of Default Offers and Score Updating

USDA/Economic Research Service

OMB Control Number: 0536-XNEW

1: Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

USDA's Conservation Reserve Program (CRP) enrolls environmentally sensitive cropland in long-term contracts. Enrolled landowners receive annual rental payments for establishing the approved conservation vegetative cover and not farming the land. Most land is enrolled through the CRP General Signup, a multi-unit, sealed-bid, pay-as-bid reverse auction. Offers are ranked on both quality and price. Offers have field-specific bid caps (maximum annual rental payments), which are the equivalent of reserve prices in forward auctions. Participants can increase the probability that their offer is accepted by agreeing to a higher-quality conservation cover practice or by offering a discount (i.e., accepting an annual payment below their bid cap). By encouraging better practices and lower payments, the General Signup auction design leverages competitive pressure for acceptance into the program to improve the program's cost effectiveness (Hellerstein 2017).
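To make the ranking mechanism concrete, the following minimal sketch illustrates quality-plus-price scoring in a pay-as-bid reverse auction. All point values, dollar amounts, and the scoring function itself are hypothetical stand-ins for exposition; this is not FSA's actual Environmental Benefits Index formula.

```python
# Illustrative sketch of the quality-plus-price ranking described above.
# All point values and offers are hypothetical; this is not FSA's actual
# Environmental Benefits Index (EBI) formula.

def score_offer(practice_points, bid_cap, offered_rent, max_cost_points=100):
    """Score an offer on quality (practice points) plus cost points.

    Cost points rise as the offered rent falls below the field's bid cap,
    so discounts improve an offer's ranking.
    """
    discount_share = (bid_cap - offered_rent) / bid_cap
    return practice_points + max_cost_points * discount_share

# Hypothetical offers: (practice quality points, bid cap $/acre, offered rent $/acre)
offers = {
    "A": (200, 100, 100),  # minimum practice, no discount
    "B": (200, 100, 85),   # minimum practice, 15 percent discount
    "C": (260, 100, 100),  # higher-quality cover practice, no discount
}

# Offers are ranked by score; because the auction is pay-as-bid, an accepted
# offer is paid its own offered rent, so discounts directly lower program cost.
for name, terms in sorted(offers.items(), key=lambda kv: -score_offer(*kv[1])):
    print(name, score_offer(*terms), f"paid ${terms[2]}/acre if accepted")
```

In this illustration, the higher-quality practice (offer C, 260 points) outranks the discounted offer (B, 215 points), which in turn outranks the undiscounted minimum-practice offer (A, 200 points).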

Data collection for this project is authorized by 7 U.S.C. 2204(a) (see Attachment A: 7 USC 2204a). Within that authority, USDA's Economic Research Service (ERS) conducts research on environmental issues and agricultural conservation programs, including studies on the design of CRP, other USDA conservation programs, and related agri-environmental programs such as payments for environmental services. ERS research in this area often involves statistical analysis of policy design elements and other factors that influence farmers' participation decisions within these voluntary programs (Wallander et al. 2013). When there is appropriate variation in these factors and incentives, such analysis can apply statistical and econometric modeling to observational data, such as program administrative data or existing USDA surveys. However, when that variation is absent from existing observational data, or when confounding variation precludes study of a particular research question, ERS relies on experimental economics methods (Higgins et al. 2017). Experimental economics research often takes the form of an economic lab experiment, so named because such studies have historically been conducted in university computer labs designed for this research. The key features of most economic lab experiments are: (1) participants are asked to make choices within hypothetical scenarios constructed to capture the key features of the actual choice environments being studied; and (2) treatments are variations in key aspects of those scenarios, randomly assigned to participants to avoid the influence of potentially confounding drivers of the choices.

For this study, USDA is applying insights from behavioral economics research about defaults (e.g., starting points) and information provision to the context of conservation auctions such as CRP. In many decision contexts, opt-out defaults have been shown to influence final decisions (Attachment J). When the decision environment is complex, final decisions can also be affected by the timing of the information that decision-makers receive. These behavioral aspects of choice are relevant to conservation auctions because key aspects of these auctions, such as the software used for enrollment in CRP and the interaction between program agency staff and potential enrollees, necessarily involve decisions about defaults and information provision.

Using a stylized version of the enrollment software to create a simulated (artefactual) CRP auction, this study will test the impacts of two behavioral interventions: (i) a high-scoring default starting offer; and (ii) live updates on the offer score at the point of offer selection. The outcome of interest is the average offer structure: the selected conservation cover, which influences offer quality, and the rental payment, which influences offer cost.

2: Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

This study will collect data on choices made in a lab-in-the-field experiment in which the study participants, farmers who have participated in a recent CRP signup, will be asked to make a set of hypothetical offers in a simulated CRP signup. The data collection is being conducted by the USDA Economic Research Service in collaboration with researchers from the University of Delaware's Center for Behavioral and Experimental Agri-environmental Research (CBEAR). This information will potentially be of use to USDA's Farm Service Agency (FSA) when conducting signups for CRP. If FSA continues its current mode of implementing signups, in which farmers interact directly with county field offices, then information on the expected impacts of defaults and score updating could inform decisions about how the field offices prepare alternative enrollment scenarios. If FSA moves toward farmers developing enrollment scenarios online themselves, such as with NRCS's Conservation Gateway or Farmers.gov, then the CRP enrollment software (TERRA) could be redesigned to incorporate defaults and score updating.

3: Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

The experimental treatments will be implemented in a manner similar to how they could be deployed in the actual General Signup software. The auction outcome, in terms of annual payment and cover practice selection, will also be structured to mimic the outcomes and incentives of the CRP auction. Participants will make offers in multiple rounds of an artefactual auction using software developed by the University of Delaware's Center for Behavioral and Experimental Agri-environmental Research (CBEAR). To reduce hypothetical bias, which can occur when experimental subjects do not internalize the incentives of their task, the study will provide payments to participants based on their success in getting offers accepted (Attachment H).

The experiment will be implemented using an online portal hosted by USDA. This will help recruitment and reduce respondent burden by avoiding the travel expense and time required for laboratory auction experiments. Many prior conservation auction experiments have relied on in-person participation in university research labs, which has high coordination costs and results in small participant pools limited by the size of the lab. The use of an online portal will allow for a large pool of participants and for a multi-week enrollment period, both features that closely mirror the CRP General Signup.

This study will use an artefactual field experiment ("lab in the field"), which involves a set of choices in a context that closely resembles real-world decisions and involves actual financial payouts to participants based on their decisions (Harrison and List, 2004). In comparison to laboratory experiments, artefactual experiments have better external validity for the purpose of informing the design of the CRP General Signup.

4: Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

The ERS literature review did not reveal any studies of defaults or score updating within the conservation auction laboratory literature (Attachment J). While there are hundreds of studies using experimental methods to test various aspects of auction designs (Dechenaux et al. 2015) and dozens of experimental studies focusing exclusively on conservation auctions (Schilizzi 2017), nearly all of these studies focus on "structural" design factors such as auction clearing rules, price setting (uniform price versus pay-as-bid), and mechanisms for communication between participants. In the behavioral economics literature, which commonly uses experimental methods but rarely focuses on auctions, evidence from many different domains shows that defaults often have significant anchoring effects (Furnham and Boo 2011). There is evidence that anchoring can influence bidding in IPO auctions (Gao et al. 2018). In one study that did examine the bidding behavior of farmers within a very generic auction design, evidence of anchoring effects was mixed (Holst et al. 2015). The research team for this proposed study did not find any studies that examined the role of defaults in conservation auctions. Based on this literature review, an important research question for the design of CRP and other conservation auctions is whether the findings of anchoring effects in other domains carry over to a multi-unit, reverse auction with ranking on both quality and cost.

5: If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.

The information collection would impact small agricultural businesses: farmers who have previously participated in a CRP General Signup. To minimize respondent burden, the study uses an online portal and a multi-week participation window, and limits participation to a practice round, three experimental auction rounds, and a short set of follow-on questions. In addition, the project team revised the treatment design to reduce the number of rounds from the original plan of nine auction rounds. Participation in the study is expected to take between 20 and 30 minutes per participant. The study will solicit participation from a simple random sample of 11,000 farms drawn from a population of recent CRP participants; this sample represents less than one percent of all farms. The study focuses exclusively on recent participants in CRP General Signups, since they are representative of the majority of likely future CRP General Signup participants. The study expects 1,100 participants, all small businesses, out of the recruitment pool, which represents 1.5 percent of participants in the two most recent CRP General Signups.

6: Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

If this study is not conducted, USDA would lack context-specific research when considering whether to spend program funds on redesigning the defaults used in the enrollment software or on training field agents to change the defaults they use when advising farmers on program enrollment.

7: Explain any special circumstances that would cause an information collection to be conducted in a manner inconsistent with the general information guidelines in 5 CFR 1320.5.

There are no special circumstances associated with this information collection. All responses will be one-time responses.

8: Provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.

The notice of intent to request this data collection was published on January 7, 2021 (86 FR 1084; Attachment B). ERS received two public comments that raised concerns about the CRP in general and suggested greater use of pretesting. The project team has provided responses, including a description of how the Nash equilibrium simulation provides study calibration in lieu of what would otherwise be an extensive pretesting process (Attachments C1 and C2).

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and record-keeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

In the early stages of this project, the research team consulted with field staff from the Farm Service Agency to learn how the actual CRP signup software is designed. The data collection for this instrument is explicitly designed to be a stylized, simplified version of that software.

The study does not collect personal information from participants. Participants will be asked to make choices about hypothetical fields (parcels) in a simulated auction. Since they do not need to recall any personal information, there was no need to check on the availability of data or extent of record keeping.

To assess basic comprehension of the study instructions and format, and to confirm the expected time required for participation, the research team conducted six pre-tests of the study with students from the University of Delaware. The details of that pilot are provided in Attachment N.

Since the value of experimental studies is heavily dependent on several aspects of experimental design, the Economic Research Service instituted a formal external review of the project during the development phase. The research team produced an experimental design and analysis plan (Attachment D), which was then reviewed by three external reviewers (Attachments E1, E2, and E3). A summary of the reviewer comments and the response of the research team is provided in Attachment F. The peer reviewers were Sarah Jacobson, Associate Professor of Economics, Williams College, (413) 597-4766; Zedekiah Higgs, Research Assistant, University of Maryland; and Andrew Card, Research Assistant, University of Maryland. The latter two reviewers can be reached through Loretta Lynch, Chair of the Agricultural and Resource Economics Department, Professor of Economics, University of Maryland, (301) 405-1264, who coordinated those two reviews.

Planned Outreach

The study will not be promoted beyond the recruitment materials. ERS and FSA will communicate the existence of the study to the county offices in case farmers have questions, and the recruitment materials will provide a phone number. Farmers will be recruited from a list of prior CRP General Signup participants using a priming postcard, a primary letter containing their log-in code, and up to two follow-up postcards. The recruitment materials are shown in Attachment G.

9: Explain any decision to provide any payment or gift to respondents.

Participants in this study will receive a final payment that is a combination of a participation payment and an incentive payment. ERS has used payments as part of the study design in prior work (e.g., OMB Control Number 0536-0076). Under the approval for that study, which has since expired, ERS completed an extensive review of the use of payments for economic experiments (Attachment H).

Participant payments can be used to increase response rates. For enumerated surveys, they are most effective at increasing response rates when baseline response rates are low (Singer et al. 1999). While this study is not enumerated, recent studies using lab-in-the-field experiments with farmers typically have low response rates (Weigel et al. 2021). This study uses a minimum payment ("show-up fee") of $10 for farmers. Approximately 20 percent of the total payments in this study will go toward the participation payment.

Providing incentive payments in economic experiments is an element of study design intended to ensure there is some form of pecuniary tradeoff serving as an analog to the real-world tradeoff that is the basis for the research questions. Incentive payments are commonly used in economic lab experiments to reduce the risk of hypothetical bias. If participants treat the study scenarios as purely hypothetical, with no consequences attached to their individual choices, then they may pursue objectives other than those the study asks of them. For example, participants in real-world conservation auctions face a tradeoff between their net returns if they win the auction and the probability that their offer is accepted. Empirical evidence using observational data from conservation auctions is consistent with this theory (Wallander et al. 2013). If a lab experiment of a simulated auction did not include an incentive payment, then some participants might attempt to "win" the auction by simply maximizing the probability of acceptance rather than maximizing expected net return. This would be expected to reduce the precision of any treatment effect estimates. ERS's review of payments for experimental studies found some evidence of less precise treatment effect estimates in studies where participants received lower incentive payments (Attachment H).

The incentive payment in this study depends upon the structure of the offer and whether the offer is accepted. For each hypothetical field with a particular set of assigned characteristics, a participant can increase the probability of acceptance, at the cost of simultaneously decreasing the payoff if accepted, by selecting a better (higher-scoring and more expensive) conservation practice or by offering a discount on the "rental" payment. The study used a numerical Nash equilibrium model of the auction to calibrate the expected offers given the auction design parameters (the points and costs for different practices, the points for lower rental rates, and the distribution of field characteristics) and to set the payment structure to induce a distribution of expected baseline choices (practices and per-acre payments in the absence of any treatment effects) that is roughly comparable to the existing set of offers in the CRP General Signup.
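The tradeoff that the incentive payments are designed to make salient can be sketched numerically. The example below assumes a hypothetical logistic acceptance-probability curve and made-up point and dollar values; the study's actual calibration solved a numerical Nash equilibrium model of the full auction (Attachment D).

```python
import math

# Hypothetical acceptance-probability curve and dollar values: stand-ins for
# the study's calibrated Nash equilibrium model, not its actual parameters.

def acceptance_prob(score, cutoff=230.0, scale=15.0):
    """Assumed smooth approximation of P(offer accepted | offer score)."""
    return 1.0 / (1.0 + math.exp(-(score - cutoff) / scale))

def expected_payoff(practice_points, practice_cost, bid_cap, offered_rent):
    """Expected incentive payment: P(accept) x (rent received - practice cost)."""
    discount_points = 100.0 * (bid_cap - offered_rent) / bid_cap
    p = acceptance_prob(practice_points + discount_points)
    return p * (offered_rent - practice_cost)

# Offering a discount raises the probability of acceptance but lowers the
# payoff if accepted, so the expected payoff peaks at an interior discount.
for rent in (100, 90, 80, 70):
    print(rent, round(expected_payoff(220, 30, 100, rent), 2))
```

In this sketch, the expected payoff peaks at an interior discount: a modest discount raises the probability of acceptance enough to offset the lower payment, while a deep discount does not.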

For farmers, the expected average payment for participation in this study is $42 (see table below). An estimated 55 percent of the farmer participants would receive only the minimum participation payment ($10) if the distribution of offers is close to the prediction of the Nash equilibrium model. For the estimated 45 percent of participants who will have an accepted offer, the payments will range from $70 to $103 (including the $10 participation payment) at the Nash equilibrium offers. A maximum payment of almost $143 (including the participation payment) is theoretically possible given the study parameters, although an offer with this payment would be highly uncompetitive and would only be accepted if the large majority of participants selected the maximum payment levels and the minimum practices. Participants will be given the option to receive the final payment through PayPal or as a Walmart or Amazon electronic gift card. Providing rapid and flexible payment options makes the financial incentives salient, which is important for the payments to serve their role of reducing hypothetical bias.

Population | Average Payment | Minimum Payment | Maximum Possible Payment | Nash Equilibrium Payment Range on Accepted Offers
Farmers    | $42             | $10             | $142.50                  | $70 to $103



The payment level for farmers is commensurate with the expectation that farmers have a high opportunity cost of time and so will need larger incentives to make the tradeoff between net return and probability of acceptance salient.

The overall payments are considerably smaller than the average CRP contract payments. However, since the payments are based on net gains from acceptance in the experimental auction, a more detailed comparison is informative with respect to the external validity of this study. The average annual rental payment for offers in the most recent CRP General Signup was about $3,600 (the median annual rental payment was about $2,100). When deciding whether and how to participate in CRP (or any other conservation auction), farmers are thought to evaluate this annual rental payment, net of the costs of establishing a conservation cover, against the opportunity costs. These opportunity costs are typically thought to be well approximated by the returns from renting the same land for crop production. However, the actual opportunity cost of enrollment varies across CRP participants and is generally unobservable by agency staff or researchers studying CRP. For the present comparison, if we assume that the average net return to enrollment, after accounting for cover practice costs, is about 10 percent above the opportunity costs, then CRP participants are making a net gain of about $360 per year on the average contract. At that level of return, this study would be providing a payment on accepted offers that is about 19 to 29 percent of the annual gain from actual CRP participation under the assumptions described here.
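Under the stated 10 percent net-return assumption, the 19 to 29 percent figure follows from a short calculation:

```python
# Worked version of the comparison above, using the figures given in the text.
avg_annual_rent = 3600        # average annual rental payment, most recent General Signup
assumed_net_margin = 0.10     # assumed net return over opportunity costs
annual_net_gain = avg_annual_rent * assumed_net_margin            # $360 per year

low, high = 70, 103           # Nash equilibrium payment range on accepted offers
print(round(low / annual_net_gain, 2), round(high / annual_net_gain, 2))   # 0.19 0.29
```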

10: Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Respondent data will be protected under the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA). Participants will consent to the study at the start of the experiment (Attachment I). They will be allowed to withdraw from the study at any time. Participants will provide an email address to receive their electronic payment, but this information will be deleted once payments are processed. The consent screen includes the Public Burden Statement and the CIPSEA statement.

11: Provide additional justification for any questions of a sensitive nature.

There are no questions of a sensitive nature. Participants are asked to make two choices, conservation cover practice and annual rental payment, for hypothetical fields (land parcels) with pre-assigned traits, in order to enroll the fields in a hypothetical conservation program through a simulated auction. They will repeat these two choices in four rounds of the auction, including one practice round. After completing those four rounds, all participants will be asked a series of questions to assess their comprehension of the auction and the requested tasks. In addition, participants will be asked a series of questions to evaluate how the simulated auction compares to the actual CRP, which will provide statistics related to the external validity of the study design.

12: Provide estimates of the hour burden of the collection of information. The statement should indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I. Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.

The total estimated hour burden of this collection is 1,466.7 hours. This includes 641.7 hours for study participants (responses), assuming 1,100 participants, 30 minutes of study participation per participant, and 5 minutes of recruitment burden per participant. For non-respondents, the recruitment burden is 825.0 hours, based on an expected 10 percent response rate (9,900 non-respondents at 5 minutes each).
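These figures follow directly from the stated assumptions. The short calculation below reproduces the burden hours shown in the table that follows; it is a worked check, not part of the estimation methodology.

```python
# Worked check of the burden-hour estimates (figures from the text above).
sample_size = 11_000
response_rate = 0.10
respondents = round(sample_size * response_rate)        # 1,100 responses
nonrespondents = sample_size - respondents              # 9,900 non-responses

recruit_min, participate_min = 5, 30                    # minutes per person
respondent_hours = respondents * (recruit_min + participate_min) / 60   # 641.7
nonrespondent_hours = nonrespondents * recruit_min / 60                 # 825.0
print(round(respondent_hours, 1), round(nonrespondent_hours, 1),
      round(respondent_hours + nonrespondent_hours, 1))   # 641.7 825.0 1466.7
```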





 

Burden Hours: 10% Response Rate

Activity      | Sample Size | Responses (Count x Min/Response = Hours) | Non-Responses (Count x Min/Response = Hours) | Total Burden Hours
Recruitment   | 11,000      | 1,100 x 5 min = 91.7                     | 9,900 x 5 min = 825.0                        | 916.7
Participation |             | 1,100 x 30 min = 550.0                   |                                              | 550.0
Total         |             |                                          |                                              | 1,466.7



Respondent cost per hour for the farmer population was derived using the U.S. Bureau of Labor Statistics Occupational Employment and Wages, May 2020, occupation 11-9013, Farmers, Ranchers, and Other Agricultural Managers. The mean hourly wage for this occupation is $41.35. Fringe benefits for all private industry workers add 29.9 percent,1 or $12.36, resulting in a total of $53.71 per hour. The estimated respondent cost is $78,776 ($53.71 x 1,466.7 hours), including recruitment burden for both responses and non-responses.
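As a worked check of the arithmetic only, using the wage and fringe figures above and the total burden hours from Question 12:

```python
# Arithmetic behind the hourly respondent cost and total respondent cost above.
mean_hourly_wage = 41.35      # BLS OEWS, May 2020, occupation 11-9013
fringe_rate = 0.299           # fringe benefits, all private industry workers
hourly_cost = round(mean_hourly_wage * (1 + fringe_rate), 2)   # $53.71
total_burden_hours = 1466.7   # total from the burden table in Question 12
print(hourly_cost, round(hourly_cost * total_burden_hours))    # 53.71 78776
```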

13: Provide an estimate of the total annual cost burden to respondents or record-keepers resulting from the collection of information.

There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.

14. Provide estimates of annualized cost to the Federal government; provide a description of the method used to estimate cost which should include quantification of hours, operational expenses, and any other expense that would not have been incurred without this collection of information.

The total cost to the Federal Government for this study is approximately $269,376. This includes a $200,000 cooperative agreement with the Center for Behavioral and Experimental Agri-environmental Research (CBEAR) at the University of Delaware. Approximately one-third of this cost is for personnel who are designing and implementing the study, and the remainder is for participant payments. In addition, there is a cost to USDA of approximately $69,376 (800 hours of research staff time for design, implementation, and analysis of the data collection at $86.72 per hour based on 2021 General Schedule, Grade 14, Step 5, Washington D.C. locality with 29.9 percent in fringe benefit costs.)

15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I (reasons for changes in burden).

This is a new data collection package, so all changes are due to program changes.

16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

The study was initiated with an initial cooperative agreement in late fiscal year 2017. Based on the initial experimental design and power analysis conducted with the cooperators under that agreement, and based on feedback from a consultation with OMB, additional funding was provided in an amendment to the cooperative agreement in late fiscal year 2018 in order to increase the sample size for the farmer population and to add the student population. During 2019, the project team developed the full experimental design and analysis plan (Attachment D) and the agency conducted an external peer review of that plan (Attachments E1, E2, and E3). During 2020, the project team developed and refined the software package that will be used to conduct the experiment (Attachment I). In early 2021, the agency began the PRA approval process (Attachment B).

The expected schedule for data collection is:

1. April to June, 2022: Conduct the study with the farmer population in one to three recruitment waves, with the exact timing coordinated with FSA to avoid conflict with the actual CRP General Signup period.

2. July to August, 2022: Complete the estimates of the treatment effects for each population. Since the study involves an experimental design with pre-determined statistical tests, the analysis can begin immediately upon the completion of data collection.

3. Fall, 2022: Finalize the manuscript for the study and submit for peer review. Expected publication date is Fall, 2022 to Spring, 2023.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

There is no request for approval of non-display of the expiration date.

18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act Submissions” of OMB Form 83-I.

There are no exceptions to the certification statement.



List of Attachments

A: Legal Authority

B: Federal Register Notice

C1 – C2: Comments and Responses to Federal Register Notice

D: Experimental Design and Analysis Plan (EDAP)

E1 – E3: EDAP External Reviewer Comments

F: Response to EDAP Reviewers

G: Recruitment Materials

H: Payments Whitepaper

I: Survey Instrument

J: Literature Review

M: Instructional Slides

N: Pre-testing report

O: Summary of NASS ICR Review and ERS Response

Form 83-I



References

Dechenaux, E., Kovenock, D., & Sheremeta, R. M. (2015). A survey of experimental research on contests, all-pay auctions and tournaments. Experimental Economics, 18(4), 609-669.

Furnham, A., & Boo, H. C. (2011). A literature review of the anchoring effect. The journal of socio-economics, 40(1), 35-42.

Gao, S., Meng, Q., Chan, J. Y., & Chan, K. C. (2018). Cognitive reference points, institutional investors' bid prices, and IPO pricing: Evidence from IPO auctions in China. Journal of Financial Markets, 38, 124-140.

Harrison, G. W., & List, J. A. (2004). Field experiments. Journal of Economic literature, 42(4), 1009-1055.

Hellerstein, D. M. (2017). The US Conservation Reserve Program: The evolution of an enrollment mechanism. Land Use Policy, 63, 601-610.

Higgins, N., Hellerstein, D., Wallander, S., & Lynch, L. (2017). Economic experiments for policy analysis and program design: a guide for agricultural decisionmakers (No. 1477-2017-3947).

Holst, G. S., Hermann, D., & Musshoff, O. (2015). Anchoring effects in an experimental auction–Are farmers anchored?. Journal of Economic Psychology, 48, 106-117.

Schilizzi, S. G. (2017). An overview of laboratory research on conservation auctions. Land Use Policy, 63, 572-583.

Singer, E., & Gebler, N. R. T., Van Hoewyk, J., Katherine McGonagle, 1999. The effect of incentives in interviewer-mediated surveys. Journal of Official Statistics, 15, 217-230.

Wallander, S., Aillery, M., Hellerstein, D., & Hand, M. (2013). The role of conservation programs in drought risk adaptation. Economic Research Report No. 148, USDA Economic Research Service.

Weigel, C., Paul, L. A., Ferraro, P. J., & Messer, K. D. (2021). Challenges in recruiting US farmers for policy‐relevant economic field experiments. Applied Economic Perspectives and Policy, 43(2), 556-572.

1 U.S. Bureau of Labor Statistics. “Employer Costs for Employee Compensation.” News release. March 19, 2020. https://www.bls.gov/news.release/ecec.htm.
