Supporting Statement for:
Willingness To Pay Survey for Salmon Recovery in the Willamette Watershed
September 1, 2014
1. Identification of the Information Collection.
1(a) Title of the Information Collection
Willingness To Pay Survey for Salmon Recovery in the Willamette Watershed (New), EPA #2489.01, OMB #2080-NEW
1(b) Short Characterization/Abstract
The USEPA Office of Research and Development (ORD) is investigating public values for options for salmon recovery in the Willamette watershed in western Oregon. These values will be estimated via a willingness-to-pay mail survey instrument. Two anadromous fish species in the Willamette watershed are federally listed as threatened: spring Chinook and winter steelhead. The survey focuses on two attributes of recovery for these fish: recovery status and time to recovery. The levels of recovery status are "Threatened," "Basic Recovery," and "High Recovery," with the two recovery levels based on possibilities described in the recovery plan (Oregon Dept of Fish and Wildlife and National Marine Fisheries Service, 2013). The levels of time to recovery are 15 years, 25 years, and 50 years, chosen to bracket the 25-year planning horizon mentioned in the recovery plan (Oregon Dept of Fish and Wildlife and National Marine Fisheries Service, 2013). A choice experiment framework is used with statistically designed tradeoff questions, in which recovery options are posed as increases in a yearly household tax. Each choice question allows a zero-cost "opt out" option. The choice experiment is designed to allow independent isolation of the value of recovery and of time to recovery. A few additional questions are also included to further understand the motivations for respondent choices, their river-related recreation behavior, and their attitudes toward wild-origin versus hatchery-origin fish. Several pages of background introduce the issue to survey respondents. Limited sociodemographic questions are included to gauge how well the sample respondents represent the target population. The survey will be fielded to Oregon residents.
2(a) Need/Authority for the Collection
Current ORD research revolves around the theme of sustainability (USEPA, 2013a). An overarching goal cited on the EPA website for sustainability research is:
“EPA Sustainable communities research is providing decision tools and data for communities to make strategic decisions for a prosperous and environmentally sustainable future, and providing the foundation to better understand the balance between the three pillars of sustainability- environment, society and economy” (USEPA, 2013b).
As part of including public input in finding the "balance" of sustainability, this survey will estimate public values for salmon recovery in the Willamette watershed. Salmon recovery is a factor in numerous watershed management actions, and salmon are a key driver behind the listing of water temperature as a Total Maximum Daily Load (TMDL) contaminant for the Willamette River. The Willamette watershed is a subject of continuing research by ORD and other partners (USEPA, 2012a, 2012b; Willamette Water 2100, 2013). The survey will gather public value information on salmon recovery scenarios to complement partnering natural science research.
2(b) Practical Utility/Users of the Data
A continuing problem for communities dealing with natural resource management problems has been the issue of how to integrate natural resource valuation - both use and non-use values - into a feasible decision making process. One of the primary reasons for conducting economic valuation studies should be to improve the way in which communities frame choices regarding the allocation of scarce resources and to clarify the trade-offs between alternative outcomes. This problem is particularly relevant to salmon conservation efforts in the Pacific Northwest. Despite the deep cultural importance of salmon to the citizens of the Pacific Northwest, there is a remarkable lack of valid empirical economic studies quantifying this importance to the general public who live in the region. This is conspicuously true for the Willamette Basin, home to more than half of the state of Oregon's human population and to the few remaining spring-run Chinook salmon runs in the state. There are many competing uses for Oregon's waters and decision-makers are often faced with trade-offs on how to allocate resources to accommodate these uses. Many of these uses conflict with salmon conservation and to date there is not adequate information to quantify societal values for salmon preservation. Robust estimates of the public value for salmon can be useful in numerous policy contexts and can support numerous government agencies and community organizations in Oregon to factor the value of salmon preservation benefits into their strategic policy and financial decisions. The ecosystem services literature speaks to the importance of valuing these services, yet there is often a lack of valid economic studies evaluating the ecological change under scrutiny. This often leads to inappropriate or questionable usage of previous economic studies to develop an economic estimate based on benefits transfer methods. 
The history and importance of salmon in the Pacific Northwest underscores the need for more primary economic studies that characterize household preferences for salmon populations that move beyond traditional studies examining recreational demand or commercial demand and their respective use values.
The proposed study will make several contributions to the literature. First, this study will illustrate the importance of characterizing ecological changes in a format that is comprehensible to both respondents and ecologists such that the commodity being valued is at once technically accurate and understandable. This is important from an economic perspective in that it provides a valid basis for interpreting the results as welfare relevant ecological outcomes. It is also important from an ecological perspective in that it provides a mechanism so that ecological studies and modeling efforts can be integrated with ecosystem service valuation in a manner that is consistent and scientifically valid. The importance of making progress in this subject is described in the EPA Scientific Advisory Board report titled "Valuing the Protection of Ecological Systems and Services" (USEPA, 2009).
The proposed study will also examine how public preferences for salmon recovery vary across geography. Our stratified sample will allow us to examine how willingness-to-pay varies across households residing within the Willamette watershed in comparison to households living outside the watershed in other parts of Oregon. This is an important issue to consider when aggregating estimates of nonuse values in studies that are not able to detect this variation. We will also extend the literature on external validity of stated preference studies by explicitly including a test of 'consequentiality' that explores whether or not respondents believe that the results of the study will be considered by policy makers. Controlling for consequentiality may alleviate negative hypothetical bias (Vossler & Watson, 2013).
The main interest of this study is the research contributions identified above. Other technical questions must be addressed separately if generating representative estimates of WTP is desired.
3(a) Non duplication
While there is much research and management activity in the Willamette watershed pertaining to salmon fisheries, there is very little research quantifying how the public values these resources. The survey will isolate specific aspects of the fishery and estimate the economic benefits flowing to the general public that are associated with changes in the fishery. This information on the gain or loss to the general public could be used to assess the allocation of scarce financial resources for restoration and/or mitigation projects designed to protect and restore anadromous fish populations in Oregon. Furthermore, discussions on Willamette salmon tend to focus on commercial or recreational angling, to the exclusion of the general public. This survey will investigate general public values for changes in this public resource. Qualitative research based on focus groups pretesting the survey instrument indicates the potential for significant public value for fishery recovery, regardless of respondent fishing behavior.
There has been one prior general public stated preference survey featuring threatened Willamette watershed salmon. Wallmo and Lew (2012) conducted a choice experiment survey to investigate public values for recovery of a variety of marine organisms, including Willamette Spring Chinook, with a nationwide sample. The present study differs from that prior study in the following respects. This study uses detailed information and recovery levels from the recent recovery plan, and furthermore considers two levels of recovery. In addition to recovery levels, preferences for time to recovery will also be investigated. Finally, this study will seek to compare recovery values across different components of Oregon's population: subpopulations both within and outside of the watershed (see Supporting Statement Part B for more information). The nationwide sample from Wallmo and Lew is not extensive enough to allow isolating preferences of Oregonians, nor subpopulations of Oregonians (personal communication, Lew and Wallmo, 2012).
EPA has not identified any other studies that would consider Oregonians’ public values for the different levels of recovery and time to recovery posed for Willamette watershed Spring Chinook and steelhead. The options were carefully pretested to balance background information with cognitive effort. The language, graphics, and question formats in the survey were carefully pretested and iteratively refined through seven focus groups. These focus groups were all in Oregon, occurring both within and outside of the Willamette watershed, and with both urban and rural participants.
3(b) Public Notice Required Prior to ICR submission to OMB
The first Federal Register Notice was published on July 1, 2013 under 78 FR 39282. It received one comment. This comment, and EPA’s response, are attached in the document titled “Public Comments.”
3(c) Consultations
The principal investigators for this effort are Michael Papenfus and Matthew Weber, both affiliated with USEPA, ORD, Western Ecology Division, Corvallis, Oregon. M. Weber has past direct experience with willingness to pay survey research, with a study estimating public values for management changes for the river and riparian area of the Rio Grande in Albuquerque, New Mexico (Weber and Stewart, 2009). Several stated preference surveys previously approved by OMB were consulted in designing this survey. These studies include an EPA study on fish and aquatic habitat impacts from cooling water intake structures (OMB # 2020-0283), a NOAA coral reef valuation study (OMB #0648-0585), an EPA survey on Chesapeake Bay water quality (OMB #2020-0283), and an EPA survey on Santa Cruz River water quality and quantity (OMB #2080-0080). The survey instrument booklet format and several question formats for the current study were adapted from these surveys. M. Weber participated in a workshop for stated preference survey practitioners working on federal government projects, convened by NOAA and Stratus Consulting in June of 2012 (NOAA and Stratus Consulting, 2012). That workshop was a helpful forum for comparing notes on willingness to pay survey design, with an emphasis on strategies for presenting ecological goods in a way meaningful to the lay public. Several completed or working draft willingness to pay survey instruments were presented for group discussion, including a NOAA study on Elwha River restoration, with an attribute that includes salmon abundance changes (OMB No. 0648-0638).
For descriptions of salmon abundance changes and management actions that could contribute to recovery, this survey relies on the recovery plan (Oregon Dept of Fish and Wildlife and National Marine Fisheries Service, 2011).
Additional consultations with experts within ODFW, NOAA, EPA Region 10, and several external experts in stated preference survey design were used to assist in the design of the survey. Although numerous consultations were made, these should not be interpreted as any entity endorsing the survey for management purposes. Again, the main purposes of this study are research oriented and are discussed in the previous section.
One comment was received during the first public notice period. This comment and our response are attached in the appendix titled Public Comments.
3(d) Effects of Less Frequent Collection
Without this collection there will remain a significant gap in knowledge about the nonmarket benefits associated with projects related to salmon restoration and recovery. Moreover, the benefits of protecting a vital public asset (salmon populations) will remain unmeasured, and thus remain difficult to objectively factor into important resource management decisions. Many projects related to land use options, water quality attainment, and similar issues are ultimately connected to enhancing salmon recovery. Sustainable fish populations are among the environmental endpoints that matter most to people living in the Pacific Northwest. Without this collection effort, the economic benefits of recovery efforts will remain unquantified.
3(e) General Guidelines
The survey will not violate any of the general guidelines described in 5 CFR 1320.5 or in EPA's ICR handbook.
3(f) Confidentiality
All responses to the survey will be kept confidential to the extent permitted by law. The surveys will be processed, including data entry, by the principal investigators; nobody else will have a record of who has responded or the answers of any given respondent. A list of the addresses of the members of the sample who have responded versus those who have not will be maintained in order to more efficiently mail reminders and replacement surveys. This will be a single file, accessible to and updated only by the principal investigator. To protect confidentiality in survey results, each respondent will be identified by a numeric code in that file rather than their name or address. The survey questions do not ask for any personally identifiable information and personally identifiable information will not be entered in the results even if volunteered by the respondent, for example in the comments section. In the cover letter, respondents will be informed that their responses will be kept confidential. After the data collection is complete, the respondent status file will be deleted, and only the numeric code assigned to each respondent will remain. After data entry is complete, the surveys themselves will be destroyed.
The USEPA office location (the Western Ecology Division of USEPA) and the USEPA electronic file system used by the principal investigators are highly secure. A keycard possessed only by USEPA employees and contractors is necessary to enter the building. The principal investigators are in a separate keyed office space within the secure building. The computer system where the personal names and addresses associated with respondent numeric codes will be stored during data entry is a secure server requiring the principal investigator's personal login username and password. At the conclusion of data entry, the file linking personal names and addresses to respondent codes will be destroyed (along with the hard copy survey responses themselves), and only respondent codes will remain.
3(g) Sensitive Questions
There are no questions of a sensitive or personal nature in the survey.
4(a) Respondents/SIC Codes
The target respondents for this survey are Oregon household representatives 18 years or older. A sample of household representatives sufficient to address the goals of the study will be contacted by mail, following the multiple-contact protocols in Dillman (2000) and Dillman et al. (2009). A response rate of 30% will be targeted for each subpopulation of Oregon to be studied. Multiple mail contacts will be used to increase sample response rates, and include a pre-notice to all recipients, a main survey mailing, a reminder postcard, and two follow-up mailings.
4(b) Information Requested
(i) Data items, including record keeping requirements
The current draft survey is included as Appendix SurveyBooklet. The survey is divided into four main parts. The first part is background for the choice questions. The second part is the choice questions themselves. The third part contains questions designed to understand why respondents answered the choice questions as they did; these include attitudinal questions as well as recreational preference questions. The fourth part is designed to assess whether major sociodemographic categories of the received sample are representative of the population sampled. There are no record keeping requirements asked of respondents.
(ii) Respondent Activities
The following respondent activities are envisioned. Participants will read the cover letter and survey, respond to the survey questions, and return the survey using a provided postage paid envelope. Focus group and cognitive interview participants typically took no longer than 30 minutes to complete the survey, so 30 minutes per response is the estimated burden for the average respondent.
5(a) Agency Activities
Development of the survey questionnaire through focus group and cognitive interview pretesting has occurred under the separate ICR# 2090-0028. Pretest techniques follow standard approaches in the qualitative methods literature (Morgan and Krueger, 1998; Rubin and Rubin, 2005), as well as guidance in the economics literature for the specific purposes of pretesting a willingness to pay survey (Klojgaard et al., 2012; Johnston et al., 1995; Kaplowitz et al., 2001; Hoehn et al., 2003).
Under this ICR, agency activities will include:
Develop and finalize the choice experiment design
Obtain a representative sample mailing list for Oregon households sufficient for the goals of the study (see Supporting Statement Part B)
Printing of questionnaires
Mailing of prenotices
Mailing of cover letters and questionnaires
Reminder mailings
Follow-up mailings and replacement questionnaires to non-respondents as needed
Data entry and quality assurance of data file
Analysis of survey results, including characterization of nonresponse and potential degree of nonresponse bias
Modeling choice experiment results
Reporting survey results
5(b) Collection Methodology and Management
The proposed survey is a choice experiment questionnaire delivered and returned by mail. Standard multi-contact mail survey methods will be used to increase response rate (Dillman 2000, Dillman et al., 2009). We have set a target response rate of 30% for each subpopulation of Oregon to be studied.
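As an illustrative check of the mailing scale implied by the 30% target, the number of surveys that would need to be mailed can be computed from the per-stratum response targets stated in Section 6(a). The computed mailing counts below are a sketch only, not the study's actual print order:

```python
import math
from fractions import Fraction

# Surveys that would need to be mailed to hit the stated response targets
# at a 30% response rate. Targets come from this document; the mailing
# counts are illustrative arithmetic only. Fractions keep the division exact.
response_rate = Fraction(3, 10)  # 30%
targets = {"within_basin": 730, "outside_basin": 270}

mailings = {s: math.ceil(n / response_rate) for s, n in targets.items()}
total_mailed = sum(mailings.values())
```

At this rate roughly 3,334 surveys would need to be mailed in total, broadly consistent with the 3,500 advance-letter pieces budgeted in Section 6(e).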
Data quality will be monitored by checking returned survey responses for internal consistency, and by assessing any comments made on the survey or returned with the survey that signal strategic responses or respondent confusion. Coded survey data will not include any identifying information of the respondents. Returned survey data will be coded, and data passing quality checks will be used as the dataset for discrete choice modeling.
5(c) Small Entity Flexibility
This survey will be administered to individuals, not businesses. Thus, no small entities will be affected by this information collection.
5(d) Collection Schedule
A breakdown of the expected collection schedule is as follows:
Week 1: Printing surveys
Week 2: First contact mailing for pilot survey, notifying that a survey will be mailed in 1-2 weeks
Week 3 and 4: Pilot survey mailing
Week 5 and 6: Pilot survey reminder postcards mailing
Week 7 through 9: Data entry of pilot survey results. Revising estimation of the beta vector (coefficients on utility variables, see part B of the supporting statement). Adjusting the choice experiment and cost levels for the main survey mailing based on the beta vector estimated from the pilot survey
Week 15: First contact mailing for main survey mailing, notifying that a survey will be mailed in 1-2 weeks
Week 16 and 17: Main survey mailing
Week 18 and 19: Main survey reminder postcards mailing
Week 20 through 23: Main survey additional reminders and replacement surveys as necessary to reach target response rate
Week 24 to 25: Data entry
The schedule above is staged such that if response rates are higher or lower than expected, the appropriate number of replacement surveys will be printed and mailed to most efficiently use funds.
6(a) Estimating Respondent Burden
For a typical respondent, a conservative estimate of the time to review and respond to the survey questions is 30 minutes. The target will be 730 responses from the subpopulation within the Willamette basin and 270 responses from the subpopulation outside the basin, proportionate to each stratum's share of Oregon's population. Assuming the target of 1,000 responses to the survey, the burden is 500 hours. This would be a one-time expenditure of respondents' time.
6(b) Estimating Respondent Costs
(i) Estimating Labor Costs
The Bureau of Labor Statistics reports a median hourly wage for Oregon across all occupations. The most recent figure is a median hourly wage of $17.24, for May 2013 (Bureau of Labor Statistics). At 30 minutes per response, the cost per participant is $8.62; assuming 1,000 participants fill out the survey, the total estimated respondent labor cost is $8,620.
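The burden and cost figures in Sections 6(a) and 6(b) follow directly from this arithmetic; as a sketch using only the values stated above:

```python
# Respondent burden and labor cost, reproducing the figures stated above.
responses = 1000             # target completed surveys
hours_per_response = 0.5     # 30 minutes per respondent
median_wage = 17.24          # BLS median hourly wage for Oregon, May 2013

total_hours = responses * hours_per_response               # 500 hours
cost_per_respondent = hours_per_response * median_wage     # $8.62
total_labor_cost = round(responses * cost_per_respondent)  # $8,620
```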
(ii) Estimating Capital and Operations and Maintenance Costs
There are no anticipated capital, operations or maintenance costs associated with this collection.
(iii) Capital/Start-up Operating and Maintenance (O&M) Costs
There are no anticipated capital, operations or maintenance costs associated with this collection.
(iv) Annualizing Capital Costs
There are no anticipated capital, operations or maintenance costs associated with this collection.
6(c) Estimating Agency Burden and Cost
The various aspects of the survey mailing are assumed to be done by the principal investigators, with an associated hourly wage rate of $32.50. Preparing survey mailings, tracking nonrespondents, sending new mailings as needed, and data entry are anticipated to amount to 8 weeks total or 320 hours of work. Agency labor cost would be 320 hours times $32.50 per hour or $10,400.
6(d) Estimating the Respondent Universe and Total Burden and Costs
Assuming 1000 participants throughout Oregon fill out the survey, the total labor cost is estimated at $8,620.
6(e) Bottom Line Burden Hours and Cost Tables
Item | Quantity | Cost
--- | --- | ---
Public Burden | |
Time burden: 0.5 hours per respondent | 1,000 persons | 500 hours; $8,620 labor
Agency Burden | |
Time burden | Entire project | 320 hours; $10,400 labor
Mailing and printing costs | |
Mailing list | Mailing lists for Oregon, including sufficient coverage of subpopulations | $500
Advance letter paper, envelopes, printing, and postage | 3,500 pieces | $1,925
Color survey paper and printing | 5,400 pieces (includes both initial and secondary mailing) | $9,425
Printing return envelopes (10.5" x 7.5") | 5,400 pieces (includes estimated replacements) | $1,300
Outgoing envelopes (11.5" x 8.75") | 5,400 pieces (includes estimated replacements) | $975
Outgoing survey postage with prepaid return postage | 5,400 pieces | $7,995
Reminder and final contact postcard paper, printing, and postage | 5,400 pieces | $2,860
Subtotal mailing | | $24,980
Total Agency Cost | | $35,380
Total Public Cost | | $8,620
The estimated respondent burden for this study is 500 hours, with a labor value of $8,620. The estimated agency burden is 320 hours, and the total estimated agency cost, including labor, mailing, and printing, is $35,380.
6(f) Reasons for Change in Burden
This is a new survey.
6(g) Burden Statement
The annual public reporting and recordkeeping burden for this collection of information is estimated to average 0.5 hours per response. Burden means the total time, effort, or financial resources expended by persons to generate, maintain, retain, or disclose or provide information to or for a Federal agency. This includes the time needed to review instructions; develop, acquire, install, and utilize technology and systems for the purposes of collecting, validating, and verifying information, processing and maintaining information, and disclosing and providing information; adjust the existing ways to comply with any previously applicable instructions and requirements; train personnel to be able to respond to a collection of information; search data sources; complete and review the collection of information; and transmit or otherwise disclose the information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB control numbers for EPA's regulations are listed in 40 CFR part 9 and 48 CFR chapter 15.
To comment on the Agency's need for this information, the accuracy of the provided burden estimates, and any suggested methods for minimizing respondent burden, including the use of automated collection techniques, EPA has established a public docket for this ICR under Docket ID Number EPA-HQ-ORD-2013-0448, which is available for online viewing at www.regulations.gov, or in person viewing at the Office of Research & Development (ORD) Docket in the EPA Docket Center (EPA/DC), EPA West, Room 3334, 1301 Constitution Avenue, NW, Washington, D.C. The EPA Docket Center Public Reading Room is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal holidays. The telephone number for the Reading Room is (202) 566-1744, and the telephone number for the ORD Docket is (202) 566-1752. An electronic version of the public docket is available at www.regulations.gov. This site can be used to submit or view public comments, access the index listing of the contents of the public docket, and to access those documents in the public docket that are available electronically. When in the system, select “search,” then key in the Docket ID Number identified above. Also, you can send comments to the Office of Information and Regulatory Affairs, Office of Management and Budget, 725 17th Street, NW, Washington, D.C. 20503, Attention: Desk Officer for EPA. Please include the EPA Docket ID Number EPA-HQ-ORD-2013-0448 and OMB Control Number 2080-NEW in any correspondence.
Part B of Supporting Statement
(a) Survey Objectives
The survey is being proposed by the EPA Office of Research and Development, and is not associated with any regulatory ruling of EPA. The main interests of this study are the research questions identified in Supporting Statement A. Thus, decisions were made in the study design from the perspective of making research contributions rather than conducting a definitive benefits analysis for a specific management decision.
The objectives of the survey are:
To estimate public non-use values for changing the status of threatened Willamette watershed Spring Chinook and steelhead to a status of recovery.
To estimate public values for the time to recovery (if recovery is chosen) for Willamette watershed Spring Chinook and steelhead.
To compare estimated public values for households within and outside of the Willamette watershed.
Direct use values for changes to anadromous fish populations can be estimated using a variety of methods. However, nonuse values can only be assessed using stated preference survey methods. Because non-use values may be substantial, failure to recognize and measure these public values may lead to socially undesirable outcomes resulting from public policy decisions that do not account for these societal values (Freeman, 2003).
(b) Key Variables
The survey asks respondents whether they would choose a permanent increase in annual household taxes in exchange for changes in the scope and timing of Willamette basin Spring Chinook salmon and steelhead recovery. The choice experiment framework allows respondents to evaluate pairs of multi-attribute policies associated with Spring Chinook and steelhead recovery in the Willamette basin. Respondents are asked to choose the program that they would prefer, or to choose the status quo (i.e., choose neither of the policy options). The survey follows well-established methodology and design (Adamowicz et al., 1998; Louviere et al., 2000; Bateman et al., 2002; ChoiceMetrics, 2012). The key variables are:
Status of Willamette Basin Wild Salmon and Steelhead: This variable describes the recovery status and annual abundance of wild fish returning to the Willamette basin. The survey considers three possible levels of this variable. The future baseline (the status quo option) is termed 'No Intervention / Threatened'. Under this option there is no change in recovery status or fish abundance from current conditions, and the risk of Spring Chinook salmon going extinct in the next 100 years remains between 60 and 100 percent. Two different increases from the baseline are posed, termed 'Basic recovery' and 'High recovery'. 'Basic recovery' represents an increase in wild fish returning to the Willamette basin from 20,000 (current status) to 40,000 fish; under this option, the risk of going extinct in the next 100 years is 'no longer significant'. 'High recovery' represents an increase from 20,000 (current status) to 70,000 fish; under this option as well, the risk of going extinct in the next 100 years is 'no longer significant'.
Time to recovery: The survey asks respondents to consider the 'time to recovery' when making their choices. Respondents are informed prior to the choice questions that time to recovery can be shortened by applying the management options designed to recover fish populations more intensively, but only at a higher cost. The 'time to recovery' attribute has three levels (15 years, 25 years, and 50 years), and any of these levels can be paired with either of the two recovery level options.
Tax increase per year: Each choice option for recovery carries an associated cost, represented as a permanent annual increase in household taxes. The cost is $0 for the status quo option and takes one of six levels, ranging from $40 to $300, for the recovery options. These cost levels are not tied to actual cost estimates for the changes, but rather are designed to bracket values; the design goal is to set cost levels such that some people agree to them and some do not. Revisions to the proposed cost levels may occur after analysis of pilot survey results. The analysis of the discrete choice data will use the choices reported by respondents together with the associated level of tax increase, the degree of improvement in salmon and steelhead status, the timing of the change, and sociodemographic variables to estimate values for changes in those attributes.
(c) Statistical Approach
A randomly drawn sample from across the state of Oregon will be used to estimate public values associated with changes in Spring Chinook salmon and steelhead populations within the Willamette basin. The sample will include households living within the Willamette basin and outside of the basin. The paper survey will be administered through the mail. The data will be modeled using standard discrete choice models that are well developed in the research literature (Ben-Akiva and Lerman, 1985; Train, 2009; Kanninen, 2007).
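As an illustration of the modeling step, the sketch below shows a minimal multinomial logit formulation of a single choice task. The attribute coding and the beta values are hypothetical placeholders, not estimates from this study:

```python
import math

# Hypothetical linear utility coefficients (the "beta vector" referenced in
# the collection schedule); actual values will be estimated from survey data.
beta = {"basic": 0.9, "high": 1.4, "years": -0.02, "tax": -0.008}

def utility(alt):
    """Linear-in-attributes utility: V = sum over k of beta_k * x_k."""
    return sum(beta[k] * alt[k] for k in beta)

def choice_probs(choice_set):
    """Multinomial logit probabilities: P(i) = exp(V_i) / sum_j exp(V_j)."""
    exp_v = [math.exp(utility(a)) for a in choice_set]
    total = sum(exp_v)
    return [e / total for e in exp_v]

# One illustrative choice task: status quo plus two recovery programs.
status_quo = {"basic": 0, "high": 0, "years": 0, "tax": 0}
option_a = {"basic": 1, "high": 0, "years": 25, "tax": 80}   # Basic recovery, 25 yrs
option_b = {"basic": 0, "high": 1, "years": 15, "tax": 220}  # High recovery, 15 yrs

probs = choice_probs([status_quo, option_a, option_b])

# Marginal willingness to pay for an attribute is -beta_attr / beta_tax;
# e.g., annual WTP for High recovery relative to the baseline (illustrative).
wtp_high = -beta["high"] / beta["tax"]
```

In practice the beta vector would be estimated by maximum likelihood from the returned choice data (e.g., with a conditional logit routine), and WTP ratios computed from the fitted coefficients.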
The work in developing the survey was conducted by EPA ORD with the assistance of contractor support (The Henne Group, 116 New Montgomery Street, Suite 812, San Francisco, CA 94105) under the separate ICR# 2090-0028. The contractor recruited and remunerated focus group participants, as well as individual survey pretest participants, and provided transcripts of meetings. The work involved in preparing the survey for mailing, the mailing itself, data management and analysis, and report write-up, will be conducted by EPA ORD.
(d) Feasibility
The overall design of a discrete choice experiment requires a strategic process for developing, testing, and optimizing the survey instrument (Kanninen, 2007; Klojgaard, 2012). The survey has been extensively pretested, as described below, to balance the background information provided against the cognitive burden placed on respondents.
(a) Target Population And Coverage
The target population is individuals 18 years or older residing in the state of Oregon. To understand variation in preferences for recovery of salmon and steelhead in the Willamette basin, the target population will be split into two geographically defined strata distinguished by residence within or outside the Willamette basin. Respondent coverage will be such that each household within each stratum has an equal probability of being selected to receive a survey.
(b) Sample Design
(i) Sampling Frame
The sampling frame for this survey will be the United States Postal Service Computerized Delivery Sequence File (DSF). The universe of sampling units is defined as the set of all individuals over the age of 18 living at a residential address in the state of Oregon. Each household within each of the geographically-defined stratum will have an equal probability of being selected to receive a survey.
(ii) Sample Size
The discrete choice literature covers several methods for computing the minimum sample size for estimating discrete choice models (Rose and Bliemer, 2013). A general rule of thumb, widely reported in the literature for estimating sample size requirements for stated choice experiments focused primarily on the estimation of main effects, is:

n ≥ 500c / (t · a)

where c is the largest number of levels for any of the attributes, a is the number of alternatives, and t is the number of choice tasks. This equation can be adapted to allow for two-way interaction effects by assuming that c represents the largest product of levels of any two attributes (Orme, 2010; Rose and Bliemer, 2013). This formula was also recently utilized by NOAA (OMB # 0648-0585).
A sample size for each Oregon subpopulation will be based on a main-effects model. Each survey will contain three choice questions, each with two options (not counting the "status quo" option), and the maximum number of levels for any single attribute is six. Thus the minimum sample size n is 500 for each of the two subpopulations to be sampled. This value is larger than the minimum of 200 suggested when the intent is to compare subgroups (Orme, p. 65). Bateman et al. (2002, p. 110) recommend a sample size of 500 to 1,000 (for each subgroup) for close-ended contingent valuation questions, but also note that a smaller sample can be used if more information is collected per respondent (as with replications in choice experiments). With a target response rate of 30%, approximately 3,500 Oregon households total will receive a survey, distributed across the two target subpopulations.
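The rule-of-thumb calculation above can be checked directly. The following is a minimal sketch in Python; the function and parameter names are chosen for illustration and are not part of the cited literature:

```python
import math

def orme_min_sample_size(max_levels, n_alternatives, n_tasks):
    """Rule of thumb for main-effects stated choice designs:
    n >= 500 * c / (t * a)."""
    return 500 * max_levels / (n_tasks * n_alternatives)

# Survey parameters described above: 3 choice tasks, 2 non-status-quo
# options, and a maximum of 6 levels for any single attribute.
n = orme_min_sample_size(max_levels=6, n_alternatives=2, n_tasks=3)
print(math.ceil(n))  # -> 500
```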
(iii) Stratification Variables
Households within the Willamette basin and households located outside the basin will be treated as different subpopulations. Based on 2013 population estimates from Portland State University's Population Research Center, approximately 72 percent of Oregon's 3,919,020 residents live within the Willamette basin.
(iv) Sampling Method
The sample mailing list (DSF) for each Oregon subpopulation will be purchased from a mail survey support company. The company will be given instructions to prepare the sample such that each household in each subpopulation has an equal chance of being chosen (a simple random sample approach).
(v) Multi-Stage Sampling
Not applicable
(c) Precision Requirements
(i) Precision Targets
Louviere et al. (2000) provide a formula, based on elementary statistical theory, for the minimum sample size N needed to predict a choice proportion with target accuracy and confidence, assuming a large sampling frame. Let p be the true choice proportion and a the level of allowable deviation, specified as a percentage by which the sample proportion is allowed to deviate from the true population proportion. For a simple random sampling strategy, and assuming the choice occasions from each respondent are independent, the minimum sample size is:

N ≥ [(1 − p) / (r · p · a²)] · [Φ⁻¹((1 + α) / 2)]²

where α is the confidence level, Φ⁻¹ is the inverse cumulative distribution function of the standard normal evaluated at (1 + α)/2, and r is the number of choice occasions per respondent. This study will use a target of plus or minus 10% for predicting the population proportions to be estimated, with a probability of 0.95. Assuming a population proportion of 0.4 and 3 choice occasions per respondent, the minimum sample size is 192 respondents. This is less than the sample sizes planned for each subpopulation under either a simple random sampling scheme or stratified random sampling based on proportionate allocation.
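The minimum sample size cited above can be reproduced with Python's standard library; the function name is illustrative, and `statistics.NormalDist` supplies the inverse normal CDF:

```python
from statistics import NormalDist

def louviere_min_n(p, accuracy, confidence, choice_occasions):
    """Minimum respondents for estimating a choice proportion p within
    +/- `accuracy` (as a fraction of p) at the given confidence level,
    with r independent choice occasions per respondent."""
    z = NormalDist().inv_cdf((1 + confidence) / 2)
    return (1 - p) / (choice_occasions * p * accuracy ** 2) * z ** 2

# Study parameters above: p = 0.4, 10% accuracy, 0.95 confidence, 3 tasks.
n = louviere_min_n(p=0.4, accuracy=0.10, confidence=0.95, choice_occasions=3)
print(round(n))  # -> 192
```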
Alternatively, one can compute the margin of error for a sample proportion as:

ME = z · √(p̂(1 − p̂) / n)

where p̂ is a simple sample proportion estimate, n is the sample size, and z is the appropriate z-value for the desired level of confidence. Based on proposed sample sizes of 730 for the within-basin stratum and 270 for the non-basin stratum, the margin of error is 0.036 for the within-basin sample and 0.059 for the non-basin sample, computed with a sample proportion of 0.5 and a 0.95 confidence level.
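The two margins of error reported above follow directly from this formula; a short sketch (function name illustrative):

```python
from math import sqrt
from statistics import NormalDist

def margin_of_error(p_hat, n, confidence=0.95):
    """Margin of error for a sample proportion: z * sqrt(p(1-p)/n)."""
    z = NormalDist().inv_cdf((1 + confidence) / 2)
    return z * sqrt(p_hat * (1 - p_hat) / n)

# Proposed stratum sample sizes, with p-hat = 0.5 and 95% confidence.
print(margin_of_error(0.5, 730))  # within-basin stratum, approx. 0.036
print(margin_of_error(0.5, 270))  # non-basin stratum, approx. 0.060
```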
To compute the precision of a proportionate stratified sample for the state of Oregon, the 'pooled' estimate of the sample proportion, p̂ = Σ_h (N_h / N) p̂_h, weights the sample proportions from each stratum according to the stratum population sizes N_h, such that the variance of this estimate is:

Var(p̂) = Σ_h (N_h / N)² · p_h (1 − p_h) / n_h

where n = Σ_h n_h is the sum of the sample sizes for the strata. The following table reports the standard deviation for sample sizes chosen in accord with proportionate sampling, where the sample sizes are chosen in proportion to the population sizes so that n_h = n · (N_h / N).

The 2013 population estimate for the state of Oregon is 3,919,020. Approximately 73 percent of the total population resides within the Willamette basin. With a total expected sample size of 1,000, the following table reports the precision of proportionate stratified random sampling for different values of an estimator of a population proportion, p̂; an example would be the proportion of the population that supports a given statement. Without information on how sample variances differ between strata, we will use the proportional allocation rule to determine sample sizes for each stratum. If the pilot test reveals that sample variances for the willingness to pay estimates differ substantially, we may readjust the final sample allocation to reflect that information.
| Stratum | Expected sample size | Population proportion p_h | Standard deviation of pooled p̂ |
| --- | --- | --- | --- |
| Example 1 | | | |
| Within basin | 730 | 0.5 | 0.0158 |
| Outside basin | 270 | 0.5 | |
| Example 2 | | | |
| Within basin | 730 | 0.8 | 0.0126 |
| Outside basin | 270 | 0.2 | |
Under simple random sampling of 1,000 respondents and a population proportion of 0.5, the standard deviation is 0.0158. In Example 2, the stratified random sampling strategy reduces the standard deviation relative to this simple random sampling benchmark. In general, these reductions are greatest when the population proportions, p_h, differ more across strata.
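The pooled standard deviations in the table, and the simple random sampling benchmark, can be verified with a short sketch; the stratum weights and sample sizes are taken from the examples above, and the function name is illustrative:

```python
from math import sqrt

def stratified_sd(strata):
    """Std. dev. of the pooled proportion under stratified sampling:
    Var = sum_h W_h^2 * p_h * (1 - p_h) / n_h, with population weights W_h."""
    var = sum(w ** 2 * p * (1 - p) / n for w, n, p in strata)
    return sqrt(var)

# (population weight W_h, sample size n_h, stratum proportion p_h);
# the 73/27 population split follows the 2013 estimates cited above.
example_1 = [(0.73, 730, 0.5), (0.27, 270, 0.5)]
example_2 = [(0.73, 730, 0.8), (0.27, 270, 0.2)]
print(round(stratified_sd(example_1), 4))  # -> 0.0158
print(round(stratified_sd(example_2), 4))  # -> 0.0126

# Simple random sampling benchmark with n = 1,000 and p = 0.5:
print(round(sqrt(0.5 * 0.5 / 1000), 4))   # -> 0.0158
```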
(ii) Nonsampling error
Several non-sampling errors may be encountered in stated preference surveys. Protest responses are a type of non-sampling error that occurs when individuals reject the survey scenario for reasons such as format or question design, even though they may value the resource being considered (Mitchell and Carson, 1989). To identify protest responses among survey participants, we have included survey debriefing questions (see Questions 13 -- 15), a technique commonly used to detect protest respondents.
With a target response rate of approximately 30%, there will also be a large percentage of nonrespondents. If the preferences of nonrespondents differ markedly from those of respondents, a second type of non-sampling error, nonresponse bias, can affect the results. To promote high participation in the survey, we will follow Dillman's (2009) mail survey approach. Our approach to analyzing nonresponse will include several techniques. First, we will examine the survey respondents' demographic characteristics and compare them to state of Oregon population statistics based on 2010 American Community Survey data. We will examine correlations between demographic variables and answers to attitudinal questions. Post-stratification sampling weights will be derived to address any sociodemographic differences found between our survey sample and the full state of Oregon population. These post-stratification weights help to balance differences in observable characteristics between respondents and nonrespondents. Once these demographic differences are accounted for, there remains the possibility of sample selection bias, which would occur if there are systematic and unobservable differences between the types of people who choose to respond to the survey and those who do not. Of particular concern is the scenario in which those who choose not to participate in the survey have less interest in the topic and would subsequently be less willing to pay for the recovery option improvements offered in the choice questions. Under this scenario, the willingness-to-pay estimate, when aggregated across all households in the target population, may be biased upwards and would reflect an upper bound on aggregated willingness to pay. A lower bound on aggregated willingness to pay can be estimated by assuming that all nonrespondents' willingness to pay equals zero.
We will also examine differences between early and late responders to the survey across sociodemographic variables, answers to attitudinal questions, and willingness-to-pay estimates.
A nonresponse follow-up survey would allow one to probe differences between respondents and nonrespondents beyond sociodemographic characteristics. However, without obtaining willingness-to-pay estimates from nonrespondents, there is no well-established procedure for correcting this type of selection bias in the class of discrete choice models used to estimate choice experiments such as the one proposed in this study; see Cameron and DeShazo (2010) for a good discussion of this issue. Given these difficulties, and because this research is not designed to support cost-benefit analysis for a specific regulatory or management decision, we have no plans to incur the expense of a nonresponse survey at this time. We will implement standard procedures for conducting a nonresponse analysis using observable demographic variables and will report bounds on the willingness-to-pay estimate as discussed above.
(d) Questionnaire Design
The current draft survey has been uploaded to the Federal Register docket. Below is a description of the sections and questions.
Several categories of text and questions are included in the survey. The reasons for including each of the categories are discussed below.
The title on the cover page reads "Oregon Salmon and Steelhead Recovery - What do YOU think should be done?" and the cover photo shows a salmon jumping up a waterfall. Page 1 describes the rationale for the survey and briefly describes what the survey is about. Short descriptions of the fish species considered, the geographic extent of the recovery plan, and current population numbers of salmon and steelhead are provided and compared to estimates of fish returning to the rest of the Columbia basin. Page 2 provides additional background information on Willamette basin salmon and steelhead and lists some of the primary factors that led to their addition to the Endangered Species list. A map of the Willamette basin is shown beneath the background information. Page 3 provides detailed background information on the wild origin salmon and steelhead that are the focus of the recovery plan. Additional information describes the definition of the current 'threatened' status, other fisheries in the Willamette basin, and other wild salmon and steelhead populations in Oregon. Page 4 describes hatchery origin salmon and steelhead in the Willamette basin. These descriptions are provided so that respondents clearly understand the differences between hatchery origin fish and the populations of wild origin fish that are the target of the recovery plan. Page 5 informs respondents that they will be asked to make choices about wild salmon and steelhead recovery options.
Questions 1 -- 5: These questions evaluate respondents' understanding of salmon and steelhead conditions in the Willamette basin prior to reading the survey and ask whether they or anybody within their household regularly purchases an Oregon state fishing license.
Pages 6, 7, and 8 provide information on the subsequent choice questions including detailed descriptions of the different attributes included in each recovery option. A description of the payment mechanism is also provided.
In each of the three choice situations, the no-cost, 'reference condition' option is labeled 'No Intervention'. The other two alternatives in each choice situation are given generic labels 'Option A' and 'Option B'.
The first attribute of each option is the 'wild salmon and steelhead status'. This attribute has two levels, labeled 'Basic Recovery' and 'High Recovery'. The projected numbers of wild fish returning annually to the Willamette basin under these options are included. The fixed 'reference condition' status is labeled 'No Recovery'. The current number of fish returning annually under this reference condition is also included.
The second attribute for each option is 'time to recovery'. This attribute describes the number of years the recovery plan will take to implement before the chosen recovery status is achieved. Under recovery options, this attribute has three levels (15 years, 25 years, and 50 years); under the 'No Intervention' option there is no time to recovery. Page 8 also presents a graph showing fish population abundances as a function of recovery time.
Question 6: This question on page 9 evaluates respondents' attitudes towards public issues besides the Willamette recovery plan for salmon and steelhead. It serves as a reminder of the many public issues to which public resources can be allocated. The order of the rows in which items are presented will be randomized in different versions of the survey so that responses are not biased in favor of the first item presented.
Instructions for completing the choice questions are outlined on page 10. These instructions are followed by a short 'cheap talk' reminder to ensure that respondents carefully consider their budget constraints and to discourage them from overstating their willingness to pay (Cummings and Taylor, 1999; List, 2001). An example question is shown on page 11, followed by several final instructions for answering the subsequent choice questions.
Questions 7 through 12: These questions comprise the choice experiment portion of the survey, where respondents choose between a status quo option describing the current situation and two mutually exclusive alternatives that present different levels for the attributes of interest. Each choice question is immediately followed by a question asking respondents to assess their level of certainty on a scale from 0 to 10. Following standard techniques of choice experiments, the choice situations presented to respondents will be a fraction of the theoretically possible combinations of attributes, selected to yield the most informative preference information. There will be different survey versions, with three questions per survey (also known as "replications"). These replications serve as an efficient method of allowing a sufficient number of tradeoffs for model estimation. These practices save expense and also reduce the sample size and associated public burden.
Question 13: This question asks respondents to agree or disagree with a series of statements probing for motivations regarding their answers to the choice questions. The answers will aid in identifying 'protest bids' from respondents who choose not to pay for philosophical reasons rather than because the price is too high.
Question 14: This question asks respondents to agree or disagree with another series of statements probing for motivations underlying their answers to the choice questions.
Question 15: This question asks respondents to report whether or not they view their responses to the choice question as potentially influencing policy decisions on salmon recovery actions. This test of 'consequentiality' will support the external validity of the survey results.
Recreation activity section
Questions 16 through 19: These questions ask respondents to provide information about their recreational activities on lakes, rivers, and streams in the Willamette basin in the past twelve months.
Questions 20 and 21: These questions ask respondents to report their views on hatchery-origin and wild-origin salmon and steelhead.
Questions 22 -- 32: These questions ask respondents to report basic socioeconomic information. They are worded to be consistent with current American Community Survey questions. The information from this section will be used to examine sample selection and non-response bias and to measure the representativeness of our sample relative to Oregon statewide demographics.
Design of the discrete choice experiment
The primary goal of a discrete choice experiment is to estimate the weights or preferences that respondents place on each of the attributes that comprise the choice alternatives. Respondents are expected to evaluate the alternatives presented in each choice task and choose the alternative that gives them the highest relative utility. The experimental design underlying the choice experiment can have an impact on the statistical power of the experiment and the analyst's ability to statistically estimate the econometric models underlying the economic analysis. How well the experimental design does in terms of allowing for robust estimation of the model parameters depends in part on which options are used in the choice experiment and how the alternatives are grouped into choice situations.
It is rarely possible to present each respondent with all possible choice situations, given the large number of possible combinations of attributes with multiple possible levels. All possible choice profile tradeoffs could be presented to respondents, but this would be an inefficient way to gauge preferences. Instead, "fractional factorial" designs are the standard approach (Louviere et al., 2000). Efficient fractional factorial designs have been empirically shown to lead to smaller standard errors and require smaller sample sizes than orthogonal designs (Rose and Bliemer, 2013). For this study, we will use the D-error criterion to optimize the efficiency of the experimental design. As discussed in Bliemer and Rose (2010), efficient designs based on a standard multinomial logit model are often sufficient for estimating a variety of discrete choice model specifications. All choice tasks will be created in Ngene 1.1.1 (ChoiceMetrics, 2012), a statistical software package developed specifically for constructing discrete choice experimental designs, or SAS (Kuhfeld, 2010). We intend to include 36 profiles blocked into 12 groups. With three choice situations presented to each respondent, there will be 12 unique versions of the survey that differ in the levels of the attributes presented in the choice situations. See attachment DesignProfiles for the set of design profiles that will be used to generate the different versions of the survey. Louviere et al. (2000), citing Bunch and Batsell (1989), recommend at least 6 respondents per block to satisfy large sample statistical properties; our expectation is an average of 22.5 responses per block for the out-of-basin stratum and 60.8 responses per block for the within-basin stratum. This allows a safety margin, given that there will be some variation in the number of returned surveys for each block even though an equal number for each block will be mailed out.
Nonetheless, during follow-up mailings to achieve the target overall response rate, we will also ensure that enough responses are received for each of the 12 survey versions.
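The per-block response expectations cited above follow directly from the target completed-sample sizes and the 12 survey versions; a minimal check in Python (names are illustrative):

```python
# Target completed surveys per stratum (from the sampling plan above)
# divided across the 12 survey versions (blocks).
BLOCKS = 12
targets = {"within basin": 730, "outside basin": 270}

for stratum, completes in targets.items():
    per_block = completes / BLOCKS
    # Bunch and Batsell (1989) guideline: at least 6 responses per block.
    assert per_block >= 6
    print(stratum, round(per_block, 1))  # -> 60.8 and 22.5
```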
The design was manually inspected to adjust for dominating or potentially confusing alternatives within each question and block. The choice experiment design is subject to update based on pilot test results.
Pretests
The survey content and format have undergone extensive pretesting. Focus groups and cognitive interviews to test and refine the survey instrument were conducted from April 2013 through November 2013. Eight focus groups were held throughout Oregon under ICR # 2090-0028: six within the Willamette basin and two outside the basin. Focus group participants were recruited from the general public by a market research contractor using standard market research methods, including paying participants an incentive fee as compensation for the opportunity cost of their time. In addition to these focus group sessions, phone-based cognitive interviews were conducted with 22 individuals.
Following standard guidance in the stated preference literature (Arrow et al., 1993; Desvousges and Smith, 1988; Johnston, 1995; Mitchell and Carson, 1989), the information collected from the focus groups and cognitive interviews was used to refine the survey instrument to ensure that each question was easy to understand and elicited information from respondents consistent with the question's intent. The focus groups and cognitive interviews were critical in the survey design process. This step was also used to define and describe the attributes clearly and concisely, ensuring that respondents understood them and that the attributes included in the discrete choice experiment are properly identified.
This qualitative step of the design process also provided assurance that respondents clearly understood the content of the survey, helping to support the validity of the survey results. It also allowed feedback to be collected on the presentation of the factual content used to inform respondents about the choice situations.
Individuals in these pretests completed draft versions of the survey instrument and provided comments and feedback about the survey format, content, their interpretations of the questions, and other issues relevant to stated preference modeling and survey instrument design.
The most important revisions made to the initial survey that stemmed directly from feedback received during the qualitative phase of the survey instrument development include the following:
Using the number of hatchery versus wild origin salmon as an attribute of the recovery plan option was confusing to participants and invited many different forms of speculation regarding the viability of and primary differences across recovery options. This attribute was removed from all of the stated choice questions.
Most respondents considered options that were scheduled to be implemented over a 100 year period to be too far into the future to take seriously. The maximum 'time to recovery' option included in the current survey version is 50 years. To support the scientific basis underlying the range of recovery times presented in the choice questions, we include the following statement from an EPA anadromous fish biologist:
Recovery of salmonid ecosystems following substantial human impacts is multi-faceted and multi-scalar. The time scale of recovery, and the specific processes that must be restored/rehabilitated will be specific to each individual case, but within landscapes such as the Willamette River basin subject to 100+yrs of human land uses ranging from logging to grazing and agriculture to water withdrawals, recovery of salmon ecosystems generally includes several key components. These include components of habitat, including: hydrologic regime (timing and amount of water in streams and rivers), thermal regime (water temperature), and sediment and large wood dynamics that influence availability and suitability of spawning and rearing habitat. All of these may be strongly influenced by river regulation imposed by dams. Also important are components of mortality due to human harvest (commercial and recreational fishing) and predation by marine mammals, birds, and other fish.
Some of these components of the salmon ecosystem can be recovered within relatively short time frames. For example, harvest due to fishing can be rapidly curtailed. Flow management from reservoir releases can be modified over short time frames to more closely mimic historical flows. Rehabilitation of stream habitats via structural enhancements can be accomplished within a few years. Salmon population responses to these actions can occur relatively quickly, within one or two generations of salmon (less than one decade) if survival or productivity is enhanced. But such effects are usually very localized, and other aspects of ecosystem recovery will take much longer. For example, in forested landscapes that have been heavily logged and where large wood has been removed from streams, recovery of large wood and sediment dynamics will take decades to centuries. Streamside trees and forests can grow to influence shade (and water temperatures) within 20-30yrs, but to mature and fall into streams to create needed habitat structure, streamside forests will require hundreds of years. Similarly, sediment pulses from landslides and other disturbances move through watersheds over decades to centuries, such that the effects of road development and logging may persist for decades after forestry activities have ceased.
Thus, the selection of response periods of 15-50 years covers ecologically-meaningful time scales of watershed recovery while falling within the planning horizon of human perspective.
Close attention in these pretests was given to testing for the presence of potential bias in survey responses due to poorly designed choice experiment questions. Examples of these biases include strategic bias, framing effects, embedding bias and scenario rejection or protest responses. Some questions and instructions were revised and appended in an effort to minimize these sources of bias.
The language used to describe the attributes, their levels, and the follow-up questions went through an editorial process that minimized academic and specialized language, refining the wording so that the questions and their instructions were understandable to participants while retaining enough specificity to remain true to the intended meaning. This process was also applied to survey questions that are not part of the choice experiment.
Question design and the overall structure of the survey also benefitted greatly from numerous interactions, conversations, and informal reviews from colleagues at the EPA, Oregon State University, Oregon Department of Fish & Wildlife, NOAA, Dr. Trudy Cameron and her graduate students at the University of Oregon, Dr. George Van Houtven of RTI International, participants at the Oregon Resource and Environmental Economics Workshop (Spring 2014), and David Chapman of Stratus Consulting.
The current draft survey instrument has been uploaded to the Federal Register docket.
Pilot Test
We plan to implement the survey in two stages. First, a pilot survey will be mailed to a subset of the Oregon subpopulation samples. We will mail the survey to 500 households in order to recover between 150 and 250 completed surveys for the pilot test. This will not represent an additional burden, but will be a designated fraction of the total mailing. Responses and preliminary findings from this pilot study will be used to evaluate response rates and the quality of the survey data. Analysis of the pilot data will also help to further refine the experimental design by providing empirical data to inform the prior parameter values for the empirical choice model. Using realistic model parameters will help to improve the D-efficiency of the design. In addition to improving the experimental design, analysis of the pilot test results will:
Provide a comparison of actual and expected response rates and assess whether socioeconomic characteristics of the pilot test respondents differ from average socioeconomic characteristics for the state of Oregon. These data can be used to assess potential problems associated with non-response bias;
Reveal the proportion of respondents choosing the status quo option. If too few respondents are choosing the status quo option, the cost levels may need to be adjusted upwards;
Identify unusual or systematic patterns in respondent choices.
(a) Collection Methods
The survey will be administered as a mail survey. Respondents will be asked to mail the completed survey to the EPA office in Corvallis, Oregon. Administration of the mail survey will follow multiple contact procedures described in Dillman (2009).
(b) Survey Response And Follow-up
The preliminary estimate of the survey response rate is 30%. This is consistent with many response rates for mail surveys reported in the literature (Mansfield, 2012; Johnston et al., 2012). To improve response rates, multiple contact procedures will be followed. All households will receive a reminder postcard approximately one week after the initial questionnaire mailing. Approximately three weeks after the reminder postcard, non-responding households will receive a second copy of the questionnaire with a revised cover letter. The following week, a letter reminding them to complete the survey will be sent.
(a) Data Preparation
All survey responses will be entered into an electronic database after they are returned. All data entry will be conducted by the Principal Investigators. After all the responses have been entered into the database, the contents will be converted into open, machine-readable, data formats suitable for use with most statistical analysis software.
(b) Analysis and Econometric Specification
Stated choices will be analyzed using discrete choice methods derived under the assumption that respondents will choose the alternative, or set of characteristics which they most prefer. These random utility models are the workhorse of modern stated preference research in economics and can be adapted to analyze human choices and behavior under many different forms of behavioral assumptions (Train, 2009; Marschak 1960).
Under the random utility model, the decision maker, labeled , faces a choice among alternatives. The utility that decision maker obtains from alternative is This utility is known by the decision makers, but not to the analyst. The decision maker then chooses the alternative that provides the greatest utility. More formally, decision maker chooses alternative if Because the researcher only observes some of the attributes of the alternative as faced by the decision makers and some of the decision maker's socio-demographic characteristics, utility is decomposed as:
where the function is an indirect (or representative) utility that depends on attributes of the alternatives as faced by the decision maker, , and some attributes of the decision maker, labeled . Factors that affect utility but are not included in are captured in the term labeled , which represents the difference between true utility , and the portion of utility observed by the researcher, The researcher does not know and thus treats these terms as random (Train, 2009).
Under this general framework, the probability that decision maker n chooses alternative i is

P_ni = Prob(U_ni > U_nj for all j ≠ i) = Prob(ε_nj − ε_ni < V_ni − V_nj for all j ≠ i).

This probability is a cumulative distribution function, and under different specifications for the density of the unobserved portion of utility, ε_nj, different discrete choice models can easily be specified.
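To illustrate how a distributional assumption on the unobserved portion of utility yields a specific choice model, the following sketch (with illustrative, assumed representative utility values, not survey estimates) checks by Monte Carlo simulation that iid type I extreme value (Gumbel) errors produce the familiar logit choice probabilities (Train, 2009):

```python
import numpy as np

rng = np.random.default_rng(0)

# Representative utilities for three hypothetical alternatives
# (values are illustrative assumptions, not survey estimates).
V = np.array([0.5, 1.0, 0.0])

# Closed-form logit probabilities implied by iid type I extreme value errors.
p_logit = np.exp(V) / np.exp(V).sum()

# Monte Carlo check: add iid Gumbel draws and record the
# utility-maximizing choice in each simulated decision.
draws = 200_000
eps = rng.gumbel(size=(draws, V.size))
choices = np.argmax(V + eps, axis=1)
p_sim = np.bincount(choices, minlength=V.size) / draws

print(np.round(p_logit, 3))
print(np.round(p_sim, 3))  # simulated shares match the logit formula
```

Other error distributions (e.g., normal, giving probit) lead to different choice probability formulas in the same way.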
For this application, we plan to specify representative utility as a linear function of the attribute levels:

V_nj = β′x_nj − α c_nj,

where β is a vector of the marginal utilities for improvements in the recovery attributes, x_nj are the levels of the recovery attributes specified in alternative j, α is the marginal utility of money, and c_nj is the cost of recovery option j. Defining willingness to pay for the recovery option attributes as WTP = β/α, a measure of utility in money metric terms is obtained by rescaling the utility through division by α. Depending on the particular discrete choice model that is chosen, procedures for estimating these model parameters vary from simpler conditional logit models to more complex models, such as a mixed logit, that require simulation methods (Train, 2009; Revelt and Train, 1998; Train and Weeks, 2005).
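As a minimal numeric sketch of this rescaling, using assumed (not estimated) coefficient values:

```python
import numpy as np

# Illustrative, assumed marginal utilities for two recovery attributes
# (recovery level, years to recovery) and for money -- not survey results.
beta = np.array([0.8, -0.02])
alpha = 0.01  # marginal utility of money, per dollar

# Willingness to pay per unit change in each attribute: WTP = beta / alpha.
wtp = beta / alpha
print(wtp)  # dollars per unit change in each attribute
```

Dividing the attribute coefficients by the cost coefficient converts utility units into dollars, which is what makes the estimates policy-relevant.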
Based on prior focus groups, expert review, and the attributes of the recovery options under consideration, we will include two attributes of the recovery options, in addition to a cost attribute characterizing unavoidable household costs to be incurred under the recovery options. These attributes and their levels are summarized in Table 1 below, along with the descriptions of those attributes used in the status quo option.
Table 1. Attribute levels used in the Willamette basin salmon and steelhead recovery plan choice experiment

| Attribute | Status quo option | Recovery alternative levels |
| --- | --- | --- |
| Wild salmon and steelhead population numbers returning per year | 20,000 wild fish | High Recovery: 70,000 wild fish; Basic Recovery: 40,000 wild fish |
| Time to recovery | No recovery time | 15, 25, 50 years |
| Cost (USD per household per year) | 0 | 40, 75, 100, 120, 200, 250, 300 |
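For reference, the candidate alternative profiles implied by these attribute levels can be enumerated as the full factorial of the levels; the statistical choice experiment design then selects a subset of these profiles for the actual choice questions. A small sketch:

```python
from itertools import product

# Attribute levels for the recovery alternatives (the status quo "opt out"
# is a fixed 20,000-fish, zero-cost option appended to every choice set).
recovery = ["Basic Recovery (40,000 wild fish)",
            "High Recovery (70,000 wild fish)"]
years = [15, 25, 50]
cost_usd = [40, 75, 100, 120, 200, 250, 300]

# Full factorial of candidate profiles: 2 x 3 x 7 = 42.
profiles = list(product(recovery, years, cost_usd))
print(len(profiles))  # 42
```

The design software cited elsewhere in this statement (e.g., Ngene) works from such a candidate set to produce a statistically efficient fraction.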
We expect to fit a number of model specifications to the data. For example, under the assumption that the ε_nj are distributed iid type I extreme value for all n and j, a conditional logit model can easily be estimated. The basic logit model, however, assumes a certain pattern of substitution across alternatives. To allow for more general and realistic substitution patterns, more flexible models that relax the Independence of Irrelevant Alternatives property and allow for preference heterogeneity, such as a mixed logit model, will also be fit to the data.
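To sketch how the conditional logit estimation would proceed, the following example simulates stated choices under assumed coefficients and recovers them by maximum likelihood. The attribute values and coefficients are hypothetical, and a production analysis would use specialized discrete choice software:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulate stated choices: 500 choice observations, 3 alternatives each,
# 2 attributes. The "true" coefficients are assumed for the simulation.
n_obs, n_alt, n_attr = 500, 3, 2
beta_true = np.array([1.0, -0.5])
X = rng.normal(size=(n_obs, n_alt, n_attr))
util = X @ beta_true + rng.gumbel(size=(n_obs, n_alt))
y = util.argmax(axis=1)  # each respondent picks the max-utility alternative

def neg_loglik(beta):
    """Negative conditional logit log-likelihood."""
    v = X @ beta                          # representative utilities
    v -= v.max(axis=1, keepdims=True)     # for numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(n_obs), y].sum()

fit = minimize(neg_loglik, x0=np.zeros(n_attr), method="BFGS")
print(np.round(fit.x, 2))  # estimates should be close to beta_true
```

A mixed logit would replace the fixed β with a distribution over respondents and approximate the resulting integral by simulation, which is why those models require simulation-based estimators (Train, 2009).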
(c) Reporting Results
The results will be written up and submitted to a peer-reviewed environmental journal. The survey data will be released only after it has been thoroughly vetted to ensure all potentially identifying information has been removed.
References

Arrow, K., R. Solow, E. Leamer, P. Portney, R. Radner, and H. Schuman. (1993). Report of the NOAA panel on contingent valuation. Federal Register 58(10): 4602-14.
Bateman, I.J., R.T. Carson, B. Day, M. Hanemann, N. Hanley, T. Hett, M. Jones-Lee, G. Loomes, S. Mourato, E. Ozdemiroglu, D.W. Pearce, R. Sugden, and J. Swanson. (2002). Economic Valuation with Stated Preference Techniques: A Manual. Northampton, MA: Edward Elgar.
Ben-Akiva, M., and S. R. Lerman. (1985). Discrete choice analysis. MIT Press, Cambridge, Massachusetts.
Blamey, R. K., J. W. Bennett, J. J. Louviere, M. D. Morrison, and J. Rolfe. (2000). A test of policy labels in environmental choice modeling studies. Ecological Economics 32:269–286.
Bliemer, M. and J. Rose. (2010). Construction of experimental designs for mixed logit models allowing for correlation across choice observations. Transportation Research Part B: Methodological 44:720-734.
Bunch, D.S. and R.R. Batsell (1989). A Monte Carlo comparison of estimators for the multinomial logit model. Journal of Marketing Research 26:56-68.
Bureau of Labor Statistics. (2012). http://www.bls.gov/oes/current/oes_or.htm#00-0000 . Retrieved May, 2013.
Cameron, T.A. and J.R. DeShazo. (2010). Supplementary materials to accompany demand for health risk reductions. University of Oregon, Department of Economics.
Choice Metrics, (2012). Ngene 1.1.1: User Manual & Reference Guide.
Cummings, R.G., and L.O. Taylor. (1999). Unbiased value estimates for environmental goods: A cheap talk design for the contingent valuation method. American Economic Review 89(3):649-665.
Desvousges, W.H. and V. Kerry Smith, (1988). Focus groups and risk communication: The 'science' of listening to data. Risk Analysis 8(4):479-484.
Desvousges, W.H., Naughton, M.C., and G.R. Parsons. (1992). Benefit transfer: conceptual problems in estimating water quality benefits using existing studies. Water Resources Research 28 (3), 675–683.
Dillman, D.A. (2000). Mail and Internet Surveys: The Tailored Design Method. Second Edition. John Wiley & Sons, Inc., N.Y., N.Y.
Dillman, D.A., J.D. Smyth, and L.M. Christian. (2009). Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Third Edition. John Wiley & Sons, Inc., Hoboken, N.J.
Freeman, A.M. III. (2003). The Measurement of Environmental and Resource Values: Theory and Methods. Washington, DC: Resources for the Future.
Hoehn, J.P., F. Lupi, and M.D. Kaplowitz. (2003). Untying a Lancastrian bundle: valuing ecosystems and ecosystem services for wetland mitigation. Journal of Environmental Management 68(3): 263-272.
Holmes, T. P., and W. L. Adamowicz. (2003). Attribute-based methods. Pages 171–220 in P. A. Champ, K. J. Boyle, and T. C. Brown, editors. A primer on nonmarket valuation. Chap . 6. Kluwer Academic Publishers, The Netherlands.
Johnston, R.J., T.F. Weaver, L.A. Smith, and S.K. Swallow. (1995). Contingent Valuation Focus Groups: Insights from Ethnographic Interview Techniques. Agricultural and Resource Economics Review: 56-68.
Johnston, R.J., E.T. Schultz, K. Segerson, E.Y. Besedin, and M. Ramachandran. (2012). Enhancing the Content Validity of Stated Preference Valuation: The Structure and Function of Ecological Indicators. Land Economics. 88(1): 102-120.
Kanninen, B.J. (2007). Valuing Environmental Amenities Using Stated Choice Studies: A Common Sense Approach To Theory and Practice. Springer. The Netherlands.
Kaplowitz, M.D. and J.P. Hoehn. (2001). Do focus groups and individual interviews reveal the same information for natural resource valuation? Ecological Economics 36(2): 237-247.
Kløjgaard, M.E., M. Bech, and R. Søgaard. (2012). Designing a stated choice experiment: the value of a qualitative process. Journal of Choice Modelling 5(2): 1-18.
Kuhfeld, W.F. (2010). Marketing Research Methods in SAS. SAS 9.2 Edition, MR-2010. Available for download at: http://support.sas.com/techsup/technote/mr2010.pdf.
List, J.A. (2001). Do explicit warnings eliminate the hypothetical bias in elicitation procedures? Evidence from field auctions for sportscards. American Economic Review 91(5):1498-1507.
Louviere, J.J., D.A. Hensher, and J.D. Swait. (2000). Stated Choice Methods: Analysis and Application. Cambridge University Press. 402 p.
Mansfield, C., G. Van Houtven, A. Hendershott, P. Chen, J. Porter, V. Nourani, and V. Kilambi. (2012). Klamath River Basin Restoration Nonuse Value Survey, Final Report. Prepared for the U.S. Bureau of Reclamation. Sacramento, CA. http://klamathrestoration.gov/sites/klamathrestoration.gov/files/DDDDD.Printable.Klamath%20Nonuse%20Survey%20Final%20Report%202012%5B1%5D.pdf
Marschak, J. (1960). Binary choice constraints on random utility indications. in K. Arrow, ed., Stanford Symposium on Mathematical Methods in the Social Sciences. Stanford University Press. Stanford, CA pp.312-329
Mitchell, R.C. and R.T. Carson. (1989). Using Surveys to Value Public Goods: The Contingent Valuation Method. Resources for the Future. Washington D.C.
Morrison, M. (2000). Aggregation biases in stated preference studies. Australian Economic Papers 39:215–230.
Morgan, D.L., and R.A. Krueger. (1998). Focus Group Kit (6 volumes). Sage Publications, Thousand Oaks, CA.
NOAA Office of Habitat Conservation and Office of Response and Restoration, and Stratus Consulting. (2012). Ecosystem Valuation Workshop (binder prepared for workshop participants). Dates: June 6-7, 2012. Location: Asheville, N.C.
Oregon Department of Fish and Wildlife and National Marine Fisheries Service. (2011). Upper Willamette River Conservation and Recovery Plan for Chinook Salmon and Steelhead. http://www.dfw.state.or.us/fish/CRP/upper_willamette_river_plan.asp. Retrieved May 2013.
Orme, B. (2010). Getting Started with Conjoint Analysis; Strategies for Product Design and Pricing Research. 2nd Edition. Madison, WI. Research Publishers LLC.
Revelt, D. and K. Train. (1998). Mixed logit with repeated choices. Review of Economics and Statistics. 80:647-657.
Rose, J. and M. Bliemer. (2013). Sample Size Requirements for Stated Choice Experiments. Transportation 40:1021-1041.
Rubin, H.J., Rubin, I.S, (2005). Qualitative Interviewing. 2nd Edition. Sage Publications. Thousand Oaks, CA.
Train, K. (2009). Discrete Choice Methods with Simulation. Cambridge University Press.
Train, K. and Weeks, M. (2005). Discrete choice models in preference space and willingness-to-pay space, in Scarpa, R. and Alberini, A. (eds), Applications of Simulation Methods in Environmental and Resource Economics, Chapter 1, pp. 1-16. Springer, Dordrecht, The Netherlands.
USEPA. (2012a). Sustainable and Healthy Communities: Strategic Research Action Plan 2012-2016. http://www.research.epa.gov/research/docs/shc-strap.pdf. Retrieved May, 2013.
USEPA. (2012b). Safe and Sustainable Water Resources.
http://www.epa.gov/research/docs/sswr-strap.pdf. Retrieved May, 2013.
USEPA. (2013a). Research Programs: Science for a Sustainable Future. http://www.epa.gov/ord/research-programs.htm. Retrieved April, 2013.
USEPA. (2013b). Sustainability. http://www.epa.gov/sustainability/. Retrieved April, 2013.
Wallmo, K. and D. Lew. (2012). Public Willingness to Pay for Recovering and Downlisting Threatened and Endangered Marine Species. Conservation Biology. 26(5):830-839.
Willamette Water 2100, (2013). http://www.water.oregonstate.edu/ww2100/. Retrieved May 2013.
Weber, M. and S. Stewart. (2009). Public Valuation of River Restoration Options on the Middle Rio Grande. Restoration Ecology 17(6):762-771.
1 Examples of the pre-notice letter, cover letters for the first and second mailings, and a reminder postcard are included as attachments in the docket.