
Supporting Statement for

Paperwork Reduction Act Information Collection Submission

OMB Control Number 1090-0010


Klamath Nonuse Valuation Survey


Terms of Clearance: None

A. Justification


A1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The U.S. Department of the Interior (DOI) is requesting approval to modify an existing information collection for the Klamath Nonuse Valuation Survey. The 60-day notice was published in the Federal Register on June 9, 2009 (74 FR 27340). DOI requests approval to close out the focus groups approved under this information collection request and to add fielding of the survey instrument.

The Klamath River Basin provides essential habitat for several fish species, including Chinook salmon, coho salmon, steelhead trout, Pacific lamprey, and the Lost River and shortnose suckers. Some of these species are important components of ocean and/or in-river harvest (Chinook salmon and steelhead trout), while others are rarely harvested because of fishery regulations, limited availability, and/or listed status under the Endangered Species Act (ESA). In addition to its importance as fish habitat, the Klamath River and its tributaries also provide water to agriculture through the Bureau of Reclamation’s Klamath Irrigation Project. Oversubscription of Klamath water has thwarted recovery of depressed fish stocks and led to economic hardship for farming and fishing communities, prompting federal disaster relief for farmers in 2001 and for fishermen in 2006.

In February 2010, the U.S. government; the states of Oregon and California; the chairmen of the Klamath, Yurok, and Karuk Tribes; and the utility company PacifiCorp formally announced the final Klamath Basin Restoration Agreement (KBRA) and Klamath Hydroelectric Settlement Agreement (KHSA). These agreements define a set of activities, including the removal of four dams on the Klamath River by 2020; the dam removals are designed to restore fisheries and provide water supply certainty in the Basin. The KHSA calls for the Secretary of the Interior to determine whether dam removal will advance restoration of the salmonid fisheries of the Klamath Basin and is in the public interest.

The Secretary, acting through the Bureau of Reclamation, has authority to undertake these studies under both the Secure Water Act of 2009 (March 30, 2009, 123 Stat. 991) and the Bureau of Reclamation’s general planning authority under the Act of June 17, 1902 (32 Stat. 388). These studies will be conducted in accordance with all applicable legal requirements.

The Secretary of the Interior authorized development of the Klamath Project on May 15, 1905, under provisions of the Reclamation Act of 1902 (32 Stat. 388). The KHSA and the KBRA will affect Klamath Project operations.

Absent the agreements, the Department would be participating in a regulatory process administered by the Federal Energy Regulatory Commission (FERC).

Under the KHSA, the Secretary of the Interior is to determine by March 31, 2012, whether the potential removal of these dams will advance restoration of the salmonid fisheries of the Klamath Basin and is in the public interest, which includes but is not limited to consideration of potential impacts on affected local communities and Tribes. The determination will be based on a number of factors, including an economic analysis.

To fulfill the Secretary’s responsibilities, one important category of benefits that needs to be addressed is “nonuse value.” Nonuse values accrue to members of the public who value Klamath Basin improvements regardless of whether they ever consume Klamath fish or visit the Klamath Basin. Nonuse value is a component of the total value an individual places on the environmental change. To measure these benefits, DOI has contracted with RTI International in Research Triangle Park, NC, to design and implement a stated-preference (SP) valuation survey of the U.S. public. The survey, which will measure the total value, including nonuse value, that individuals place on the environmental change, will be the only component of the larger economic analysis that assesses the benefits that the public as a whole (as federal taxpayers) holds for dam removal and implementation of the KBRA, which will be funded in part by federal money.

The focus groups approved under the original information collection request are completed, and DOI requests permission to drop them from the information collection request.


A2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. [Be specific. If this collection is a form or a questionnaire, every question needs to be justified.]


DOI has contracted with RTI International in Research Triangle Park, NC, to conduct the survey. Under the original information collection request, focus groups were conducted to test the survey instrument. The testing is complete, and the survey is ready to be administered.

The final survey will provide information for the economic analysis of the KBRA and the Klamath Hydroelectric Settlement Agreement. The economic analysis provides one piece of information that the Secretary of the Interior will use to evaluate the plans.


There are three primary survey instruments:

  1. The main survey instrument, which will be administered by mail and web.

  2. The follow-up telephone survey. As part of the nonresponse follow-up, we will attempt to ask nonrespondents for whom we have telephone numbers three questions that can be used to identify nonresponse bias.

  3. The nonresponse survey: a greatly shortened version of the main survey for the nonresponse follow-up.

The survey instruments posted with this submission contain all the documents associated with the survey, including the prenotification postcard and cover letters. Several versions of the main survey instrument were created as part of the conjoint experimental design and to test specific hypotheses about the impact of the survey instrument on willingness to pay (WTP). The versions differ in:


  1. The order of human uses listed on page 5 of the survey (see description below). The alternate order is listed at the end of the survey instrument.

  2. Conjoint experimental design. There will be 16 blocks of two conjoint choice questions. Most respondents will receive one randomly assigned block of two choice questions.

  3. A one-question version of the survey. In the pilot test, 50% of the sample will receive a version of the survey that has only one conjoint question. There will be 16 versions of the one-question survey, created by deleting the second conjoint question (and revising the wording of the pages leading up to the conjoint section). If the proportion of the sample that selects Plan A is significantly different in the one-question version of the survey, 20% of the final sample will also receive the one-question version.

  4. Cost in Question 43. The costs will be randomly assigned. The device cost will be $25/$50/$110/$175, and the reduced electricity cost will be $1/$2/$10/$15. (There are 16 combinations of device cost and reduced electricity cost; each combination will be assigned to one conjoint block so as not to add to the number of survey versions.)

In total, there will be 64 versions of the main instrument: 2 orderings of human uses × 16 conjoint blocks = 32 two-question versions, plus 32 one-question versions.
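To make the version counting concrete, the following minimal sketch (illustrative only; it assumes, consistent with the totals above, that the 32 one-question versions are the 16 conjoint blocks crossed with the two orderings of human uses) enumerates the versions:

    # Enumerate the survey versions implied by the design described above.
    orders = ["original", "alternate"]   # 2 orderings of the human-uses page
    blocks = range(1, 17)                # 16 conjoint blocks

    two_question = [(o, b) for o in orders for b in blocks]  # 2 x 16 = 32 versions
    one_question = [(o, b) for o in orders for b in blocks]  # 32 one-question versions

    print(len(two_question) + len(one_question))  # 64 versions in total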


Below we discuss the justification for each question in the main survey instrument.


Section 1 (pp. 1–3) Introducing the Survey Issues

The first three pages of the survey introduce the issues to be addressed. They begin by placing the Klamath Basin issues within a national context, informing respondents that the Klamath is one of many river basins across the country facing problems and requiring solutions, and that the federal government is involved in making decisions about restoring the basin. The section then describes how the survey is organized, what types of information will be provided, and what types of questions will be asked.

To encourage and motivate response, the survey informs respondents that their opinions will help government decision makers choose the best options for restoring the Klamath. It also describes some of the unique features of the Klamath.


Section 2 (p. 4) Introduction to the Klamath River Basin

Because many respondents are likely to know very little about the basin, this page provides basic geographic, demographic, and wildlife resource information about the Klamath. For reference, the full-page map showing the outline of the basin and its main water features of interest for the survey (Upper Klamath Lake and the Klamath River) will also be inserted in the survey booklet next to this page.


Question 1 asks whether the respondent had any prior knowledge about the Klamath, and Question 2 asks whether they have ever visited the basin. These questions provide basic indicators of familiarity with the Klamath, which can be used in the analysis to test whether and how prior knowledge affects preferences for restoring the basin.


Section 3 (p. 5) Human Uses of Klamath River Basin Water

This section introduces respondents to the many uses of Klamath Basin water resources. It describes the main ecosystem services (although it does not use this technical term) provided by the basin, and it introduces the parties who benefit from these services. It also introduces the idea that water is a scarce resource and that some of its uses are in direct competition with each other. Competition over water resources is a main source of environmental pressures within the basin.


Question 3 asks respondents about their uses of rivers in their area. This question provides data that can be used to test whether individuals’ own interactions with river resources in their area affect their preferences for restoring the Klamath’s river resources. It also provides respondents with an opportunity to pause from acquiring information about the Klamath.


Section 4 (pp. 6–9) Fish Resources under Threat in the Klamath

These pages introduce two main indicators of river ecosystem health in the Klamath Basin: the status of salmonid populations and risks to threatened (coho salmon) and endangered (Lost River and shortnose suckers) fish species. The section describes the main causes of degradation in these indicators: dams, water withdrawals for irrigation, overfishing, and water pollution. The map on page 7 is especially important for describing how dams are blocking migratory fish from accessing a large portion of their native habitat in the basin.

Introducing and describing these indicators is critical at this stage of the survey because they are also the main attributes that will be used in the SP choice tasks to represent improvements in river ecosystem health.


Questions 4, 5, and 6 offer respondents an opportunity to reflect on these two key indicators and consider the importance (i.e., level of concern) that they place on them. The responses can also serve as informal cross-checks on the preference weights for these indicators that are estimated as part of the data analysis.


Section 5 (pp. 10–14) Conflicts and Agreements over Water Management in the Klamath

These pages describe the recent history of conflict and more recent agreements over the management of Klamath Basin resources. The discussion emphasizes that, although it is true that there has been substantial and well-publicized disagreement in the past, most parties involved have now agreed on a key set of principles for resolving these conflicts: dam removal, water sharing, and fish restoration. All of the plans for moving forward with restoration of the basin contain these main elements.


Page 12 describes how any plans to implement the agreements would be paid for by a combination of money from PacifiCorp and PacifiCorp ratepayers, the states of Oregon and California, and the federal government.


Page 13 describes in general terms the main positive and negative impacts that would result from implementing these agreements. It emphasizes that there are important trade-offs for the government (and thus the respondent) to consider in moving forward.


Questions 7 and 8 ask respondents to describe their familiarity with the conflicts and agreements. These questions provide data that can be used to test whether individuals’ knowledge affects their preferences for restoring the Klamath’s river resources, and they provide respondents with another opportunity to pause from acquiring new information.


Question 9 asks respondents how much they agree that Oregon and California residents should pay more than residents of other states. The purpose of this question is mainly to remind respondents that all U.S. residents would pay, but Oregon and California residents would pay more. This feature is likely to have a large impact on the perceived fairness of any restoration plan for the Klamath.


Question 10 asks respondents if they receive power from PacifiCorp. This information will help identify individuals who are likely to end up paying the most for Klamath restoration plans, which may be an important factor affecting stated preferences for these plans.


Question 11 asks respondents how much they agree with federal government involvement in the Klamath Basin restoration. This question provides data that can be used to test whether individuals’ attitudes about the role of the federal government affect their preferences for restoring the Klamath’s river resources. It also provides respondents with another opportunity to pause from acquiring information about the Klamath.


Question 12 presents respondents with a series of seven general statements about the role of humans in protecting natural resources. It asks respondents to indicate their level of agreement with each statement. The purpose of this question is to gauge respondents’ overall support for environmental protection and to give respondents an opportunity to reflect on and provide feedback on these general issues, before responding to the choices about very specific plans in the Klamath.


Section 6 (pp. 15–18) Choice Task 1

These pages introduce the first choice task and describe the main features and outcomes of two options: NO ACTION and ACTION PLAN A. Text is included to encourage the respondent to think carefully about each choice and to answer in the same way as they would if it were an actual vote between the two options presented. Pages 16 and 17 allow the reader to compare the two options side by side in three main dimensions:

  • changes in wild salmonid populations

  • changes in extinction risks for suckers and coho

  • added costs to the household

The first two dimensions are described in words and illustrated with color graphics.


Page 18 describes the voting scenario and reminds respondents to consider their households’ budget and spending trade-offs before responding.


Question 13 asks whether the respondent has ever had the opportunity to vote on a similar program. The purpose of this question is to gauge how plausible the choice context is for respondents and also to give respondents a break from acquiring information.

Question 14 asks the respondent to choose (vote for) the option they prefer. The choice data from this question will be used to analyze preferences and estimate willingness to pay (WTP) for Klamath River ecosystem restoration and its associated attributes.

Question 15 asks for respondents’ level of certainty in their response to Question 14. Measuring certainty in this way will allow us to test for its effect on stated preferences and to investigate whether placing more weight on responses involving higher levels of certainty affects WTP estimates.

Section 7 (pp. 19–22) Choice Task 2

These pages introduce a second choice task between NO ACTION and a new alternative: ACTION PLAN B. For this choice, the respondent is asked to assume that the previously described ACTION PLAN A is not an option. Pages 20 and 21 provide detailed descriptions of NO ACTION and ACTION PLAN B outcomes, using the same three dimensions and format that were used on pages 16 and 17 to describe the NO ACTION and ACTION PLAN A options. Page 22 again describes the voting scenario and reminds respondents to consider their households’ budget and spending trade-offs before responding.


Question 16 asks the respondent to choose (vote for) the option they prefer. The choice data from this question will be used to analyze preferences and estimate WTP for Klamath River ecosystem restoration and its associated attributes.


Question 17 asks for respondents’ level of certainty in their response to Question 16. Measuring certainty in this way will allow us to test for its effect on stated preferences and to investigate whether placing more weight on responses involving higher levels of certainty affects WTP estimates.



Section 8 (pp. 23–24) Stated Choice Debriefing Questions

Question 18 presents a series of 10 statements describing possible reactions to the choice questions and asks respondents to indicate their level of agreement. The answers to these questions will be used to determine whether and how the attitudes reflected in these statements affected respondents’ stated choices and trade-offs.

Questions 19 and 20 use the same format as Question 18 but specifically ask about the choice of the NO ACTION and action plans, respectively. They each present two statements and ask for respondents’ level of agreement. The answers to these questions will also be used to determine whether and how the attitudes reflected in these statements affected respondents’ stated choices and trade-offs.

Question 21 asks respondents about their perceived likelihood that the results of the survey will be used by the government to assist in decision making. The purpose of this question is to gauge how relevant respondents believe their answers are for policy, which may affect their stated preferences in the choice tasks.



Section 9 (p. 25) Recreational Use of the Klamath

This section asks about the respondents’ past use of the Klamath River Basin (Questions 22, 23, and 24). The purpose of these questions is to develop indicators of recreational use, which will be used in the analysis to determine how recreational use affects respondents’ stated preferences and total WTP (including both use and nonuse values) for Klamath Basin restoration.



Section 10 (pp. 26–29) About You and Your Household

This section primarily collects information about respondents’ and their households’ characteristics. These data will be used as explanatory variables in the SP analysis and to compare the survey samples’ characteristics with those of the populations they represent (i.e., to test for nonresponse bias).

Questions 25 to 34 ask for standard sociodemographic information (gender, age, marital status, household size, household income, education, homeownership, employment status, and ethnicity). Question 35 specifically asks about membership in one of the Klamath Basin tribes, and Question 36 asks about employment in specific industries that are most likely to be affected by Klamath Basin restoration activities.

Questions 37 to 39 ask about specific economic conditions and expectations for the respondent and their household. The purpose of these questions is to investigate how the current economic recession may be affecting stated preferences and WTP for Klamath Basin restoration.

Questions 40 and 41 ask about the respondent’s electric bill and willingness to buy a device that would lower their electric bill over the next 10 years. The purpose of these questions is to obtain a rough estimate of the respondent’s time preferences for use in models that apply a discount rate to convert the 20-year payment stream to a single amount. The questions will also provide information on how preferences for a good with use value (a device to lower electric bills) vary across the sample strata.
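To illustrate the conversion these questions support, here is a minimal sketch (an illustration only, assuming end-of-year payments and a constant annual discount rate; in practice the rate would be inferred from responses rather than assumed):

    # Discounted sum of a constant annual payment stream.
    def present_value(annual_payment, years, rate):
        return sum(annual_payment / (1 + rate) ** t for t in range(1, years + 1))

    # Example: a $48-per-year cost for 20 years at an assumed 5% discount rate
    print(round(present_value(48, 20, 0.05), 2))  # approximately 598.19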

Question 42 asks whether the respondent is the person in the household with the most recent birthday. To randomize the selection of a respondent from each household, the instructions included in the survey’s cover letter request that the person with the most recent birthday respond to the survey. This question verifies whether this randomization took place; to promote the highest possible response rate, however, it also asks the respondent to complete and return the survey even if these instructions were not followed.

One-question version of the survey: The number of SP questions each respondent answers has been shown to affect responses in some surveys. The effects seem to vary by survey, and it is difficult to identify whether the differences in survey responses reflect strategic bias, learning, or some other effect.

Because the survey will be administered primarily by mail, it is possible that survey respondents will look at both conjoint questions before answering. Instead of considering the two questions as separate and unrelated, respondents might let the cost of one plan relative to the benefits influence their response to the other plan, perhaps selecting the plan that was “the best deal”. Alternatively, looking at both questions might help the respondent think more carefully and provide more accurate answers.

In theory, a single dichotomous choice question will be incentive compatible. However, asking a single question of each respondent greatly increases sample size requirements. We will test whether the number of conjoint questions presented to a respondent affects their responses by comparing responses to the one-question survey with responses to the two-question survey. In the pilot test, 50% of the sample will receive only one conjoint question. The one-question version will be created by dropping the second question in each of the 16 blocks of conjoint questions used in the main survey. If there is a significant difference in the percentage of the sample selecting Plan A in the two versions of the survey, then 20% of the final sample will also be mailed the one-question version of the survey.
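A minimal sketch of that comparison (the counts below are illustrative placeholders, not pilot results; the comparison is a standard two-proportion z-test, here via statsmodels):

    import numpy as np
    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical pilot counts of respondents selecting Plan A.
    plan_a = np.array([62, 48])    # one-question version, two-question version
    totals = np.array([210, 205])  # completed surveys per version

    stat, pval = proportions_ztest(plan_a, totals)
    if pval < 0.05:
        print("Proportions differ: field the one-question version to 20% of the final sample.")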



Experimental Design for SP Choice Questions

The main version of the survey instrument includes two stated choice conjoint questions. Each choice question presents one version of the action plan and the no-action plan (or opt-out). The action plans vary across four attributes: the increase in salmon and steelhead population, the risk of extinction for coho salmon, the risk of extinction for the two suckers, and the cost per household.

In an SP question, the levels of the attributes do not need to match existing levels or existing projections. In fact, SP methods are used in marketing to estimate demand for new product features that do not currently exist. In the health literature, researchers are often interested in preferences for outcomes that are not currently feasible with given treatments. In the environmental economics literature, researchers are often interested in valuing the outcome but not the plan used to achieve it. Past surveys have presented plausible but hypothetical plans to achieve the outcomes of interest and asked survey respondents to value changes in outcome levels.

For this survey, DOI needs estimates of the total value that individuals place on the outcomes associated with the KBRA and dam removal. The agreements provide a real process and payment vehicle for the SP questions. To provide policy-relevant estimates, the levels for the attributes need to encompass the range of likely outcomes. Based on expert judgment and existing literature, the Environmental Impact Statement (EIS) will provide quantitative or qualitative estimates of the impact on fish populations. The impacts laid out in the EIS will be used to create outcome levels or ranges of outcome levels for which we will calculate WTP. Currently, the expert panels for different fish are still meeting. The current attribute ranges for the SP questions are based on current thinking about possible outcomes. If the fisheries experts revise their estimates, the attribute levels will be adjusted to reflect the latest information at the time data collection begins.

The three levels for the increase in salmon and steelhead population are based on information from the scientific literature on current and historic runs and the professional judgment of fisheries biologists involved in the project. The estimates in the literature for historic and current population levels vary depending on the methods and data sources. The text in the survey describes a range of current and historic population levels based on the more conservative articles. The levels for the population increase attribute are a 30% increase, a 100% increase, and a 150% increase.

For the coho salmon and the suckers, their threatened and endangered status was converted into a scale depicting risk of extinction based on an article by Patrick and Damon-Randall (2008). The levels for the coho salmon attribute are high, moderate, and low. The status of the suckers is more precarious, and the attribute levels will be very high, high, and moderate.

Finally, the levels for the cost attribute are based on reactions from focus groups and one-on-one interviews. The cost levels used in the pilot test were $12, $24, $48, and $90 per year for 20 years. Based on the results from the pilot test (discussed further below), the levels were adjusted to drop the $24 level and to add a $168 level.

As described in more detail in Section A16, the survey will be mailed out to 10% of the sample at the beginning. After two mailings of the survey instrument, we will use the responses we have received to conduct preliminary analysis on the data. If more than 80% of the sample selects the no-action plan, then the cost of the plans will be adjusted downward. A simple conditional logit will be estimated with the data to examine whether the other attribute levels are significant. If the levels are not significant, then the levels will be adjusted up or down to create greater differences between the plans.
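As a sketch of that significance check (illustrative only; with one action plan versus no action per question, the conditional logit reduces to a binary logit on attribute differences, and the variable names below are hypothetical):

    import numpy as np
    from scipy.optimize import minimize

    # X_action, X_noaction: one row per choice task with the attribute levels
    # (population increase, coho risk, sucker risk, cost); y = 1 if the action
    # plan was chosen, 0 if NO ACTION was chosen.
    def neg_loglik(beta, X_action, X_noaction, y):
        dv = (X_action - X_noaction) @ beta            # utility difference
        return -np.sum(y * dv - np.log1p(np.exp(dv)))  # binary logit log-likelihood

    # fit = minimize(neg_loglik, np.zeros(4), args=(X_action, X_noaction, y))
    # Attribute significance follows from the estimated standard errors, and
    # mean WTP for attribute j is -beta[j] / beta[cost].

The same ratio of coefficients underlies the WTP estimates described above for Questions 14 and 16.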


Experimental Design

The experimental design for the SP conjoint questions was created using Sawtooth Software (2009). Each respondent will answer two conjoint questions, each consisting of one action plan and the no-action plan. To encourage respondents to think about the trade-offs between the plans and no action, the experimental design was created as if the respondents were answering one question comparing two action plans. The design includes a restriction that there are no dominated pairs (pairs in which one plan is better on all attributes). If respondents compare the plans, they will see that the plans involve trade-offs. For the actual SP questions, the plans will be split into two questions comparing each plan to the no-action plan individually. This design provides additional information about the respondent’s preferences, because we may learn how the respondent ranks the two action plans (for example, if they select Plan A but select the no-action plan instead of Plan B, we know that they prefer A to B). This additional information can be incorporated into the analysis; however, the resulting models are more complicated to estimate.
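A sketch of the no-dominated-pairs restriction (illustrative only; the attribute names are hypothetical, and extinction risks are assumed to be coded numerically with lower values meaning lower risk):

    # plan = {"salmon": % population increase, "coho": risk code,
    #         "suckers": risk code, "cost": $/year}; lower risk and cost are better.
    def dominates(a, b):
        at_least_as_good = (a["salmon"] >= b["salmon"] and a["coho"] <= b["coho"]
                            and a["suckers"] <= b["suckers"] and a["cost"] <= b["cost"])
        return at_least_as_good and a != b  # weakly better everywhere, strictly somewhere

    # A candidate pair of action plans is admissible only if neither dominates:
    # admissible = not dominates(plan_a, plan_b) and not dominates(plan_b, plan_a)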


A3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


Data will be collected using both a mail questionnaire and a Web instrument. There are at least four important benefits to this combined approach. First, it can increase response rates over and beyond what can be achieved through mail alone. Individuals’ likelihood of responding differs across modes, often reflecting their personal mode preferences. In addition, offering a Web mode can bring in people who would not have cooperated with the mail survey request, which can decrease nonresponse bias; those who complete only the mail survey may be systematically different from the rest of the population.

Second, electronic data collection has been demonstrated to lead to improvements in the quality of collected data (Baker, Bradburn, and Johnson, 1995), through capabilities such as automated skip logic and prompting for missing data.

Third, and related to the previous benefit, electronic collection is likely to reduce respondent burden. Automated skip logic relieves the respondent of navigating skip patterns manually. Some studies have also shown a reduction in the time needed to complete the survey when moving from a paper-based to a computer-based instrument (e.g., Baker, Bradburn, and Johnson, 1995).

Fourth, cost savings are possible through the use of Web survey instruments, which have minimal variable costs (cost per additional interview) and can avoid more costly follow-up of the same sample cases.

Maximizing response rate is very important to the validity of the survey. In addition, because of the subject matter, we believe that using self-administered survey instruments is very important. However, we recognize that Web and mail surveys are different modes and different types of respondents may favor one mode over another. In Section B3, we discuss our approach to identifying possible mode effects.


A4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


To our knowledge, no studies to date have used SP methods to estimate total household values (including nonuse values) for Klamath River Dam removal or other restoration measures; however, a limited number of studies have used these methods to investigate values for related programs in other parts of the United States. Although a number of other economic valuation studies have addressed dam removal activities in the United States, most of them have applied revealed preference (RP) methods and focused on use-related values (Robbins and Lewis, 2008; Provencher, Sarakinos, and Meyer, 2008; Lewis, Bohlen, and Wilson, 2008; Loomis, 1999).

Table A1 identifies and summarizes key features of nine existing studies that have estimated total values for U.S. river ecosystem restoration using SP methods. Similar to the current study, the majority of these have assessed total values for western rivers, with only one study done in the East (Adams, 2004). The closest study geographically to the Klamath River study is the one by Douglas and Taylor (1999). They estimated total values for restoration activities in the Trinity River, a southern tributary to the Klamath River that will not be affected by the current restoration program. Despite its proximity to the Klamath, the results of the Trinity River study are difficult to interpret or to transfer directly to the current program.

Only two of these studies have specifically dealt with dam removal; the Elwha Dam removal project (Loomis, 1996) is the most similar to the Klamath River plans. However, all but one of the studies included fish recovery as a key response to the restoration program being evaluated. Five of the studies specifically describe increases in salmon and other anadromous fish populations, and four of these use specific numbers of additional fish to describe the impacts of the program.

All nine of the studies used contingent valuation methods (CVMs) rather than conjoint methods to elicit WTP; however, a few of them included split-sample designs to measure scope effects associated with alternative programs. The most common form of payment vehicle was an increase in taxes, followed by an increase in utility (power or water) bills.

Table A1. Previous Valuation Studies of Dam Removal or Related Restoration Efforts

Loomis, 1996
  River ecosystem studied: Elwha River Basin, Olympic Peninsula, WA
  Main restoration program elements: Dam removal (2)
  Main program impacts: Increases in four species of salmon and steelhead
  Fish population metrics: Increase of pink salmon and other fish species
  Fish population metric range: 200,000 pink salmon with a total increase of 300,000 fish
  SP valuation method: CVM
  SP question format: Dichotomous choice
  Payment vehicle: Taxes
  Survey mode: Mail
  Sample frame: Clallam County, WA; rest of WA; and rest of United States
  Sample size: 2,500 total (Clallam County: 600; rest of WA: 900; rest of U.S.: 1,000)
  Completed surveys: Clallam County: 77%; rest of WA: 68%; rest of U.S.: 55%
  Survey year: 1994
  Main value estimate: Average annual household WTP for the dam removal program

Welsh et al., 1995
  River ecosystem studied: Colorado River (including parts of the Grand Canyon) below the Glen Canyon Dam, AZ
  Main restoration program elements: Three alternative flow release regimes from the dam
  Main program impacts: Number and size of river beaches; archaeological and American Indian traditional sites; native fish; trout; electric power rates; farm incomes
  Fish population metrics: Qualitative: “improvement,” change in “danger of extinction”
  Fish population metric range: NA
  SP valuation method: CVM
  SP question format: Dichotomous choice
  Payment vehicle: Taxes, utility bills
  Survey mode: Mail (telephone follow-up)
  Sample frame: Power service (marketing) area (WY, UT, CO, NM, AZ, NV) and rest of United States
  Sample size: 5,950 total (marketing area: 3,400; rest of United States: 2,550)
  Completed surveys: 3,151 total (marketing area: 1,728; rest of United States: 1,423)
  Survey year: 1994–1995
  Main value estimate: Average annual household WTP for the dam water release alternative

Bell, Huppert, and Johnson, 2003
  River ecosystem studied: Five Pacific Northwest estuaries in WA and OR
  Main restoration program elements: Coho enhancement program
  Main program impacts: Coho salmon recovery
  Fish population metrics: WA survey: allowable catch of coho salmon; OR survey: delisting or allowable catch of coho salmon
  Fish population metric range: 80,000–160,000
  SP valuation method: CVM
  SP question format: Dichotomous choice
  Payment vehicle: Taxes
  Survey mode: Mail
  Sample frame: Coastal WA and OR
  Sample size: 5,000 (1,000 per estuary)
  Completed surveys: 2,006
  Survey year: 2000
  Main value estimate: Average household WTP over 5 years by income level and estuary

Douglas and Taylor, 1999
  River ecosystem studied: Trinity River, CA
  Main restoration program elements: Increase Trinity River flows
  Main program impacts: Increase anadromous fish population and improved boating recreation
  Fish population metrics: Number of spawning adult anadromous fish
  Fish population metric range: 9,000–105,000
  SP valuation method: CVM
  SP question format: Open-ended/bid cards
  Payment vehicle: Utility bill
  Survey mode: On-site, mail, and telephone
  Sample frame: Trinity users and households in WA, OR, CA, and NV
  Sample size: 5,000 total (on-site users: 200; user mail-out: 2,044; CA households: 2,054; out-of-state households: 663)
  Completed surveys: 2,347 total (on-site users: 41; user mail-out: 1,149; CA households: 982; out-of-state households: 175)
  Survey year: 1993–1994
  Main value estimate: Average annual WTP by users or households

Hanemann, Loomis, and Kanninen, 1991
  River ecosystem studied: San Joaquin Valley, CA
  Main restoration program elements: Five programs: two for wetland habitat, two for water contamination, and one for river flows
  Main program impacts: Wetlands program: maintain or increase wetland habitat; river flow program: increase river flows and fish populations; contamination program: maintain or reduce exposure of wildlife to contamination
  Fish population metrics: Salmon improvement
  Fish population metric range: Not mentioned
  SP valuation method: CVM
  SP question format: Double-bounded dichotomous choice
  Payment vehicle: Taxes
  Survey mode: Mail and telephone
  Sample frame: San Joaquin Valley, CA, OR, WA, and NV households
  Sample size: 1,960
  Completed surveys: 1,004 total (San Joaquin Valley: 227; rest of CA: 576; out-of-state: 201)
  Survey year: 1989
  Main value estimate: Average annual CA household WTP for each program

Loomis et al., 2000
  River ecosystem studied: South Platte River, CO
  Main restoration program elements: Conservation easement, riparian buffers, reduced flow diversion
  Main program impacts: Ecosystem services: wastewater dilution, natural water purification, erosion control, habitat for wildlife
  Fish population metrics: Improve habitat for six native fish so they are not in danger of extinction
  Fish population metric range: NA
  SP valuation method: CVM
  SP question format: Dichotomous choice
  Payment vehicle: Water bill
  Survey mode: In person
  Sample frame: Towns near the river, CO
  Sample size: 462
  Completed surveys: 96
  Survey year: 1998
  Main value estimate: Average monthly household WTP for river restoration

Sanders, Walsh, and Loomis, 1990
  River ecosystem studied: 11 rivers in Colorado
  Main restoration program elements: Protection of rivers under the Wild and Scenic Rivers Act
  Main program impacts: Recreation and ecosystem preservation
  Fish population metrics: NA
  Fish population metric range: NA
  SP valuation method: CVM
  SP question format: Open-ended
  Payment vehicle: NA
  Survey mode: Mail and telephone
  Sample frame: CO
  Sample size: ~420
  Completed surveys: 214
  Survey year: 1983
  Main value estimate: Average annual household WTP for increments of river protection by use and preservation values

Adams, 2004
  River ecosystem studied: Huron River, MI
  Main restoration program elements: Dam removal or keeping dam in current condition
  Main program impacts: Improved river recreation and fish vs. continued pond recreation and fish
  Fish population metrics: Reduction of lake fish population with increase in river fish population
  Fish population metric range: NA
  SP valuation method: CVM
  SP question format: Dichotomous choice
  Payment vehicle: Taxes
  Survey mode: Mail
  Sample frame: Ann Arbor, MI
  Sample size: 2,000
  Completed surveys: 766
  Survey year: 2003
  Main value estimate: Average annual household WTP for dam removal or dam maintenance

Olsen, Richards, and Scott, 1991
  River ecosystem studied: Columbia River Basin in WA, OR, ID, and MT
  Main restoration program elements: Dam flow and dam passage changes
  Main program impacts: Doubling the salmon and steelhead runs by 2000
  Fish population metrics: Quantity of fish in salmon and steelhead runs
  Fish population metric range: Double the amount (an increase of 5 million fish)
  SP valuation method: CVM
  SP question format: Open-ended
  Payment vehicle: Power bill
  Survey mode: Telephone
  Sample frame: WA, OR, ID, and western MT
  Sample size: 4,028
  Completed surveys: Nonusers: 695; users: 482
  Survey year: 1989
  Main value estimate: Average monthly household WTP for a guaranteed doubling of the salmon and steelhead runs



These studies vary widely in the extent of the market surveyed and the WTP estimates applied. Four studies estimate values only for those in the immediate area of the river or watershed. Four other studies use a tiered approach to assess different WTP estimates for households in the immediate area versus those in the rest of the state, nearby states, or the rest of the country.


A5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The survey will be sent only to households and will not impact small businesses or other small entities.

A6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


This is not a periodic data collection. If the Agency did not conduct the survey, the information basis for the Secretary of the Interior’s determination would be incomplete. The survey is the only component of the economic analysis that captures the total value of the change (including nonuse values). It is also the only component of the analysis that will include data on the values and opinions of people living outside the Klamath Basin area, who are federal taxpayers and who will be supporting the Klamath activities through their tax dollars.


A7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


All information collection and recordkeeping activities in this submission are consistent with the guidelines in 5 CFR 1320.6.


A8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


In accordance with the Paperwork Reduction Act, DOI published a 60-day notice requesting comments regarding this information collection request (74 FR 27340; June 9, 2009). The Agency received no comments. On August 30, 2010, the Agency published a 30-day notice requesting comments and received one general comment on the overall design of the project, but no comments on the survey instrument. The pilot test was approved on December 14, 2010. The Agency received additional comments on the survey instrument when the supporting statements and survey instruments were posted following approval. All of the comments focused on the background material and description of the no action and action alternatives. No comments were received on the questions themselves. Further revisions were made based on these comments and changes suggested by the team of federal biological scientists working on the project. Four additional one-on-one interviews were conducted to test for reactions to the changes in the survey wording during the 30-day comment period.


In addition to input from RTI International staff and its project team of consultants, including Dr. V. Kerry Smith of Arizona State University and Dr. John Duffield of the University of Montana, DOI has received feedback from an external expert review panel—Dr. Trudy Cameron (University of Oregon), Dr. Kevin Boyle (Virginia Tech), and Dr. Wiktor Adamowicz (University of Alberta)—regarding the design of the survey instrument and the plans for data collection and data analysis. The experts first provided feedback on a draft plan for designing the survey instrument, collecting the data, and analyzing the data. The plan was revised based on their comments. The experts then provided a second round of feedback on the revised plan for collecting and analyzing the data and on the draft survey instrument.

The survey plan and an outline of the survey instrument were also presented twice to the Klamath stakeholder group. The stakeholder group includes representatives from all the groups that signed the Klamath Hydroelectric Settlement Agreement and the KBRA, representatives from the parties that declined to sign the agreements (Siskiyou County and the Hoopa Tribe), and members of the public.

As part of the instrument design process, we held six focus groups in different parts of the country: Medford, OR; Klamath Falls, OR; Eureka, CA; Kansas City, KS; Raleigh, NC; and Phoenix, AZ. In addition, the instrument was pretested through 10 one-on-one interviews—4 in Oregon, 3 in California, and 3 in other parts of the country. Based on feedback from the expert reviewers, focus groups, and one-on-one interviews, the survey was revised in the following ways:

  • Shortened text and simplified language

  • Revised maps to increase geographic scope and add landmarks, and added explanations of the maps in the text

  • Revised descriptions of the Klamath Basin, fish in the basin, and the agreements to present information people said they needed, delete unnecessary information, and address the benefits and costs of the agreements more fully

  • Revised descriptions of the action and no-action plans and the payment vehicle

  • Reduced the number of SP conjoint questions

  • Where potential sources of bias in responses were noted, either addressed them in the text of the survey or added a debriefing question to identify respondents who may be protesting and to identify reasons for SP responses unrelated to economic trade-offs


Based on comments received following approval of the pilot test survey instrument, the following types of changes were made to the instrument:

  • Some information was changed to be consistent with the draft Layperson’s Guide to the Klamath River being prepared as part of the overall project and Agency information to the media.

  • Changes were made to the descriptions of some of the uses of the Klamath Basin, reasons for declining fish populations, threats to the endangered species, the history behind the agreements, the main features of the agreements, the impacts of the agreements, and how the agreement would be paid for.

  • In the graph, the baseline population of Chinook salmon and steelhead trout through 2060 was revised from a projected 30% decline to the current population, with the statement “Scientists expect that wild populations of these fish will remain at low levels in the future.” The change was made based on input from the biological scientists working on the project.


The pretest was completed in June 2011, and the data from the pretest were analyzed to assess whether the survey instrument functioned as expected. A summary of the pretest results is contained in Attachment A. Overall, the data from the pretest suggest that the survey instrument works well. We made three changes to the instrument that should improve the data collected in the final survey; otherwise, we do not see evidence that changes are needed.

The changes, which are described in Attachment A in more detail, are:

  1. We changed the levels of the cost attribute to $12, $48, $90, and $168, instead of $12, $24, $48, and $90.

  2. After the conjoint questions in Q19 (long version of the survey), we added “I would not vote for the action plans even if there were no added cost to my household” and dropped “I voted for NO ACTION because I believe my taxes are already too high.”

  3. We added instructions to page 3 on how to correctly fill in the boxes next to the response choices so that the scanner can read the surveys more accurately.



A9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


A small monetary incentive of $2 (two $1 bills) will be sent to sample members to help gain their participation; incentives have been demonstrated consistently across studies to increase cooperation (for reviews, see Heberlein and Baumgartner, 1978, and Singer et al., 1999). The money will be included in the initial survey mailing to respondents. This amount is provided as a token of appreciation intended to build a social exchange between the organizations making the survey request and the individual (Dillman, 1978; Dillman, 2000). Furthermore, incentives have been shown to reduce nonresponse bias by increasing cooperation, particularly among those who are not interested or involved in the survey topic (Groves, Singer, and Corning, 2000; Groves, Presser, and Dipko, 2004; Groves et al., 2006). Thus, the use of incentives is instrumental to increasing response rates and reducing nonresponse bias.

An amount that represents a token of appreciation can help curb nonresponse, but with falling response rates in household surveys (e.g., Curtin, Presser, and Singer, 2000; Stussman, Dahlhamer, and Simile, 2005), higher incentives are sometimes needed. Therefore, to gain a better understanding of potential nonresponse bias, a nonresponse follow-up phase will be implemented that increases the incentive amount to $20 for 20% of the nonrespondents after the third mailing (a reminder letter). The nonresponse follow-up will include a letter sent by Federal Express or Priority Mail to 20% of the nonrespondents offering an incentive of $20 to return a short survey consisting of 6 questions taken from the main survey (which will be included in the mailing). A few days later, we will call the nonrespondents with phone numbers to reiterate the offer and answer questions. By providing a significantly higher incentive and a drastically shorter survey instrument, we hope to entice some percentage of the nonrespondents to return the survey. The characteristics of these “high incentive” responders will be compared to those of the “low incentive” responders to help evaluate nonresponse bias, as sketched below. Although the literature suggests that including an upfront incentive generates a higher response rate than the promise of an incentive, our budget does not allow us to mail all the nonrespondents $20 up front.
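A minimal sketch of that comparison for a single characteristic (the data below are simulated placeholders, not survey results; in practice each demographic characteristic would be compared in turn):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    age_low = rng.normal(48, 15, 400)   # simulated $2-incentive responders
    age_high = rng.normal(52, 15, 170)  # simulated $20-incentive responders

    # Welch's t-test: a small p-value suggests the two responder groups differ,
    # which would be evidence of nonresponse bias on this characteristic.
    t, p = stats.ttest_ind(age_low, age_high, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.3f}")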


A10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


At the beginning of the survey, the following statement is included:


Your participation in this survey is voluntary. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific individual. We will not provide information that identifies you to anyone outside the study team, except as required by law. Your responses will be stored separately from your name and address, and when analysis of the questionnaire is completed, all name and address files will be destroyed.


A Federal agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number.



In addition, the following statement is included in the survey before we ask the demographic questions: “Finally, we would like to ask you a few questions about you and your household. Responses to these questions will be used only for statistical purposes and to compare respondents to this survey with the U.S. population as a whole. The reports prepared for this study will summarize findings across the sample and will not associate responses with an individual. Your answers will not be saved or stored in a way that can be associated with your name or address.”

A11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


Participants will not be asked any questions that are personal or sensitive in nature.


A12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.



Table A2 summarizes the burden estimates for the final data collection. The surveys will be mailed out in two batches according to the schedule in Table A4. The initial mailing will be to 10% of the sample, which we will use as a pretest of the mailing procedures and the conjoint attribute levels. The first 10% of the sample will receive the first and second mailings (the second as needed for nonrespondents to the first mailing). After some preliminary analysis of the responses returned from the first 10% of the sample, the remaining 90% of the sample (the main survey sample) will be released. As needed, respondents will receive the prenotification postcard, two mailings of the survey, a reminder postcard between the first and second survey instrument mailings with the address and password for the web version of the survey, and a reminder letter to respondents who have not returned their surveys several weeks after the second mailing of the survey instrument. Finally, we will select 20% of the nonrespondents from the main survey sample for a nonresponse follow-up that includes a Federal Express letter and a shorter version of the survey, a higher incentive offer, and a follow-up telephone call.


For the estimated response rates, we assumed the lower bound for each response rate to provide a conservative estimate. The actual response rate is likely to be higher. From an original sample of 13,000 addresses, we conservatively expect that 80% (10,400) of the addresses will be valid.


The initial mailing to 10% of the sample (the pretest) has been completed (see Attachment A for details).


10% Sample (Pretest) first mailing: The initial 10% of sample households will receive a prenotification postcard followed by the first mailing of the survey instrument with a cover letter. The postcard will be mailed to 1,300 households, of which we expect 80% of the addresses to be valid. The first mailing will be sent to the 1,040 households with valid mailing addresses. DOI estimates that everyone will spend 0.008 hours (approximately 30 seconds) looking at the postcard. DOI estimates that 20% of the households in the Klamath area and 15% of households from outside the Klamath area will complete the survey after the first mailing. This yields a total of 177 respondents, and DOI estimates that respondents will spend 0.50 hours (30 minutes) looking at the letter and completing the survey. DOI estimates that there will be 863 nonrespondents after the first mailing and that nonrespondents will spend, on average, 0.05 hours (3 minutes) looking at the letter and survey.


10% Sample reminder postcard and second mailing: A reminder postcard will be sent that includes the address for the web version of the survey along with the respondent’s password (the postcard will fold over so the password will not be visible). Then the survey instrument and a second cover letter will be mailed to nonrespondents. DOI estimates that after the second mailing, 10% of the households that did not respond to the first mailing will complete the survey. This yields a total of 86 respondents, and DOI estimates that respondents will spend 0.50 hours (30 minutes) completing the survey and reading the letter and postcard. DOI estimates that there will be 777 nonrespondents and that nonrespondents will spend, on average, 0.05 hours (3 minutes) looking at the letter, postcard, and survey.
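The counts above can be re-derived mechanically; a brief sketch of the 10% sample arithmetic (small differences from Table A2 reflect rounding in the individual rows):

    # Re-derive the 10% (pretest) sample counts and burden hours stated above.
    postcards = 1_300
    valid = int(postcards * 0.80)     # 1,040 valid addresses
    resp_1 = 177                      # first-mailing respondents (from the strata rates)
    nonresp_1 = valid - resp_1        # 863
    resp_2 = round(nonresp_1 * 0.10)  # 86 second-mailing respondents
    nonresp_2 = nonresp_1 - resp_2    # 777

    hours = (valid * 0.008            # postcard: about 30 seconds each
             + resp_1 * 0.5 + nonresp_1 * 0.05
             + resp_2 * 0.5 + nonresp_2 * 0.05)
    print(nonresp_2, round(hours, 1))  # 777 nonrespondents, ~221.8 hours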


90% Sample (Main survey sample) first mailing: The remaining 90% of sample households will receive a prenotification postcard followed by the first mailing of the survey instrument with a cover letter. The postcard will be mailed to 11,700 households, of which we expect 80% of the addresses to be valid. The first mailing will be sent to the 9,360 households with valid mailing addresses. DOI estimates that everyone will spend 0.008 hours (approximately 30 seconds) looking at the postcard. DOI estimates that 20% of the households in the Klamath area and 15% of households from outside the Klamath area will complete the survey after the first mailing. This yields a total of 1,591 respondents, and DOI estimates that respondents will spend 0.50 hours (30 minutes) looking at the letter and completing the survey. DOI estimates that there will be 7,769 nonrespondents after the first mailing and that nonrespondents will spend, on average, 0.05 hours (3 minutes) looking at the letter and survey.


90% Sample reminder postcard and second mailing: A reminder postcard will be sent that includes the address for the web version of the survey along with the respondent’s password (the postcard will fold over so that the password is not visible). The survey instrument and a second cover letter will then be mailed to nonrespondents. DOI estimates that after the second mailing, 10% of the households that did not respond to the first mailing will complete the survey. This yields a total of 777 respondents, and DOI estimates that respondents will spend 0.50 hours (30 minutes) completing the survey and reading the letter and postcard. DOI estimates that there will be 6,992 nonrespondents and that nonrespondents will spend, on average, 0.05 hours (3 minutes) looking at the postcard, letter, and survey.


90% Sample third mailing: After the second mailing, the 6,992 nonrespondent households will be sent a reminder letter and will spend 0.008 hours (approximately 30 seconds) looking at the letter. The letter will provide the web address of the survey along with a toll-free number and email address the respondent can use to request another copy of the survey. DOI expects that 5% of nonrespondents will complete the survey. DOI estimates that these 350 respondents will spend 0.50 hours (30 minutes) on the survey.


Fourth mailing (Nonresponse follow-up): After the third mailing, 20% of the nonrespondents (1,328 households) will be sent, by Federal Express or Priority Mail, a letter and a much shorter version of the survey offering a $20 incentive to return the shorter survey. DOI assumes that telephone numbers will be available for 65% of these nonrespondents. For households with telephone numbers, the letter and survey will be followed by a phone call from a live operator who will either talk to the household or leave a message reiterating the higher incentive and offering to mail another copy of the survey if the household needs one. DOI expects that 20% of these nonrespondents will complete the survey after the phone call reminder. DOI estimates that 173 respondents will spend an average of 0.17 hours (10 minutes) on the shorter survey, letter, and phone call. DOI estimates that 691 nonrespondents will spend 0.08 hours (5 minutes) on the survey, letter, and phone call.


For the 35% of households without telephone numbers, DOI expects that 10% of nonrespondents will complete the survey after receiving the Federal Express letter. DOI estimates that 46 respondents will spend 0.08 hours (5 minutes) on the shorter survey and letter. DOI estimates that 418 nonrespondents will spend 0.05 hours (3 minutes) on the survey and letter.


Total Cost: DOI estimates the value of respondents’ time at $27.42 per hour in potential salary and benefits. Respondents will spend an annual total of 2,559 hours, at a cost of $70,163.38.3
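The arithmetic behind these totals can be reproduced directly from the per-wave response counts and per-response hours shown in Table A2 below. The following fragment is an illustrative Python sketch for checking the totals, not part of the official calculation; the row values are the rounded figures from Table A2, and the $27.42 loaded hourly rate is from footnote 3.

    # Illustrative check of the Table A2 totals using the rounded table values.
    # Each tuple is (total annual responses, average burden hours per response).
    rows = [
        (1_040, 0.008), (863, 0.05), (177, 0.5),      # 10% sample, postcard + first mailing
        (777, 0.05), (86, 0.5),                       # 10% sample, second mailing
        (9_360, 0.008), (7_769, 0.05), (1_591, 0.5),  # 90% sample, postcard + first mailing
        (6_992, 0.05), (777, 0.5),                    # 90% sample, second mailing
        (6_992, 0.008), (350, 0.5),                   # 90% sample, third mailing reminder
        (691, 0.08), (173, 0.17),                     # follow-up, households with phone numbers
        (418, 0.05), (46, 0.08),                      # follow-up, households without phone numbers
    ]
    hours = sum(n * h for n, h in rows)  # ~2,559 hours
    cost = hours * 27.42                 # ~$70,167; the $70,163.38 above was computed
                                         # from unrounded counts, hence the small gap
    print(f"{hours:,.0f} hours, ${cost:,.2f}")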

Table A2. Burden Calculations

Type of Respondent | No. of Respondents | No. of Responses per Respondent | Total Annual Responses | Average Burden Hours per Response | Total Annual Hour Burden
10% Sample postcard mailing | 1,300 | | | |
Sample of valid addresses | 1,040 | 1 | 1,040 | 0.008 | 8.32
10% Sample first mailing nonrespondents | 863 | 1 | 863 | 0.05 | 43.16
10% Sample first mailing respondents | 177 | 1 | 177 | 0.5 | 88.40
10% Sample reminder postcard and second mailing nonrespondents | 777 | 1 | 777 | 0.05 | 38.84
10% Sample reminder postcard and second mailing respondents | 86 | 1 | 86 | 0.5 | 43.16
90% Sample postcard mailing | 11,700 | | | |
Sample of valid addresses | 9,360 | 1 | 9,360 | 0.008 | 74.88
90% Sample first mailing nonrespondents | 7,769 | 1 | 7,769 | 0.05 | 388.44
90% Sample first mailing respondents | 1,591 | 1 | 1,591 | 0.5 | 795.60
90% Sample reminder postcard and second mailing nonrespondents | 6,992 | 1 | 6,992 | 0.05 | 349.60
90% Sample reminder postcard and second mailing respondents | 777 | 1 | 777 | 0.5 | 388.44
90% Sample third mailing reminder letter: nonrespondents | 6,992 | 1 | 6,992 | 0.008 | 55.94
90% Sample third mailing reminder letter: respondents | 350 | 1 | 350 | 0.5 | 174.80
Nonresponse follow-up mailing | 1,328 | 1 | 1,328 | 0 | 0.00
Nonresponse bias sample mailing nonrespondents (households with telephone numbers) | 691 | 1 | 691 | 0.08 | 55.26
Nonresponse bias sample mailing respondents (households with telephone numbers) | 173 | 1 | 173 | 0.17 | 29.36
Nonresponse bias sample mailing nonrespondents (households without telephone numbers) | 418 | 1 | 418 | 0.05 | 20.92
Nonresponse bias sample mailing respondents (households without telephone numbers) | 46 | 1 | 46 | 0.08 | 3.72
Total (a) | | | 39,430 | | 2,559


(a) The total includes multiple contacts with some households. The total number of unique individuals contacted will be 10,400 (total valid addresses).

Note: Numbers may not sum because of rounding.



A13. Provide an estimate of the total annual [non-hour] cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information [including filing fees paid]. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


The cost burden on respondents and record-keepers, other than hour burden, is zero.


A14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.


The estimated annualized cost to the federal government is $583,785 (Table A3). The cost was calculated based on quantification of hours and operational expenses.


Table A3. Annualized Cost to the Federal Government

Expense category | Cost
Labor plus overhead | $282,561
Supplies, postage, copying, incentives, telephone | $301,174
TOTAL | $583,785



A15. Reasons for change in burden


The initial approval was for four focus groups with small sample sizes. The focus groups were used to hone the questions in the final survey. The survey requests a larger burden because it will be sent nationwide and requires a larger number of responses from the public. Without a large population to draw from, the survey may not accurately gauge the country’s response to the Klamath Basin dam removal efforts.


A16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


After data collection is complete, RTI will prepare a final report presenting the results from the survey and the WTP estimates. Section B.2 describes the analytical techniques that will be used to summarize the results and estimate the WTP values.


The time schedule for the project is presented in Table A4. The first mailings went to 10% of the sample (the pretest, now complete); the mailings for the remaining 90% of the sample will be sent after the pretest analysis is complete.


Table A4. Time Schedule

Date | First 10% of Sample (Pretest) | Last 90% of Sample (Main survey sample) | Analysis
completed | Lead post card mailing | |
completed | 1st survey mailing | |
completed | Reminder postcard with website address and password | |
completed | 2nd survey mailing | |
completed | | | Begin analysis of 10% sample
7/11/11 | | Lead post card mailing | Complete analysis of 10% sample
7/20/11 | | 1st survey mailing |
7/29/11 | | Reminder postcard with website address and password |
8/8/11 | | 2nd survey mailing |
8/15/11 | | 3rd mailing | Begin analysis of final data
9/16/11 | | Main data collection complete |
9/16/11 | | 4th mailing, Federal Express (20% NR) | Begin nonresponse follow-up study
9/23/11 | | Reminder phone call to all households with a number, 2 attempts then message |
10/12/11 | | Nonresponse study data collection complete |
10/14/11 | | | Draft report on main study
11/7/11 | | | Draft report on nonresponse study
11/18/11 | | | Complete final report
12/15/11 | | | Final data delivery to DOI

*NR=nonrespondents



A17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


Not Applicable.


A18. Explain each exception to the certification statement.


There are no exceptions to the certification statement.

Attachment A. Pretest Report


Summary of Klamath Pilot Test Results

DOI submitted an Information Collection Request (ICR) to the Office of Management and Budget (OMB) to conduct a pretest of the Klamath Nonuse Valuation Survey. Following approval of the ICR in April 2011, the pretest was conducted in May and June of 2011. The primary goal of the pretest was to assess whether the survey instrument and data collection process worked as expected. This memo summarizes the results from the pretest.

Overall, the data from the pretest suggest that the survey instrument works well. We would like to make two changes that we believe would improve the data collected in the final survey, but otherwise, we do not see evidence that changes are needed.

The two proposed changes follow, with more detailed discussion of the need for the changes below.

  1. We propose to change the levels of the cost attribute to $12, $48, $90, and $168, instead of $12, $24, $48, $90.

  2. After the conjoint questions in Q19 (long version of the survey), we propose to add “I would not vote for the action plans even if there were no added cost to my household” and drop “I voted for NO ACTION because I believe my taxes are already too high.”

We will also add instructions on page 3 directing respondents to fill in the boxes next to the response choices completely.

  1. Response rates

    1. The response rate is somewhat higher than expected, with all three geographic strata responding in similar proportions.

The pretest followed the data collection plan described in the ICR and supporting statements. The households in the sample were mailed a prenotification postcard informing them that their household had been selected to be part of the survey. Following the postcard, households received a packet containing a cover letter on DOI letterhead introducing the survey, a copy of the survey instrument, a $2 incentive, and a postage-paid return envelope. A reminder postcard with information about the web version of the survey and the respondent’s username and password was sent a few weeks later. Finally, a second packet was sent that included a letter asking the respondent to complete the survey, the information about the web version of the survey, and a second copy of the survey instrument. Table 1 shows the mailing schedule for the documents.



Table 1. Pretest Survey Mailing Schedule

Mailing | Date Mailed
Prenotification postcard mailing | April 20, 2011
First mailing of survey instrument | May 13–17, 2011
Reminder postcard including Web address | May 26, 2011
Second mailing of survey instrument | June 13, 2011


A total of 1,200 household addresses were selected for the pretest sample, divided evenly across three strata: (1) the 12-county area adjacent to the Klamath River, (2) the rest of Oregon and California, and (3) the rest of the United States. Table 2 shows the responses as of June 19, 2011. As described in Supporting Statement A submitted with the ICR, we expected a total of 263 responses based on the following assumptions: response rates of 20% of the households in the Klamath area and 15% of households from outside the Klamath area for the first mailing, and an additional 10% from the reminder postcard and second mailing. As of June 19, 2011, we had received 320 completed surveys, for a combined response rate of 28%, after subtracting undeliverable surveys.
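The response-rate arithmetic used throughout this attachment nets undeliverable addresses out of the denominator. A minimal illustrative check (Python; not part of the official tabulation):

    # Pretest response rate, net of undeliverable addresses.
    mailed, undeliverable, completed = 1_200, 51, 320
    response_rate = completed / (mailed - undeliverable)
    print(f"{response_rate:.0%}")  # 28%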

Table 2. Responses as of June 19, 2011


| Number of Surveys
Paper surveys returned | 314
Paper surveys returned blank | 7
Web surveys | 6
Undeliverable | 51


Data from the first 276 surveys returned have been tabulated and analyzed to assess the results from the pretest. Tables 3 to 5 provide information on responses by sampling stratum, responses by survey length, and undeliverable surveys by sampling stratum. Each stratum supplies roughly one-third of the responses, although the response rates are slightly higher outside the Klamath area (Table 3). The long version of the survey has a somewhat higher response rate than the short version (Table 4). The number of undeliverable surveys returned is similar across the three strata (Table 5).


Table 3. Responses by Sampling Area


Sampling Area | Number of Responses | Percent of Responses
12-county Klamath area | 83 | 30%
Rest of Oregon and California | 94 | 34%
Rest of the U.S. | 99 | 36%
Total | 276 |



Table 4. Response by Survey Length


| Number of Responses | Percent
Long version | 147 | 53%
Short version | 129 | 47%
Total | 276 |


Table 5. Undeliverable Surveys by Sampling Area


Sampling Area | Number Undeliverable | Percent
12-county Klamath area | 15 | 33%
Rest of Oregon and California | 14 | 31%
Rest of the U.S. | 16 | 36%
Total | 45 |



  2. Was the survey instrument understandable to the public and to people outside the Klamath River Basin?

    1. The results from the pretest suggest that most respondents understood the questions, followed the instructions, and had adequate information to answer the stated-preference conjoint questions.

As part of the survey, respondents were asked their level of agreement with a series of statements related to the choices they made in the conjoint questions. Two statements dealt directly with comprehension; responses are presented in Table 6. Looking first at the statement “The descriptions of the plans were hard to understand,” only 14% of the Klamath area respondents agreed with the statement, and 10% or fewer of the respondents from outside the Klamath area did so. For the statement “The survey provided me with enough information to make a choice between the options shown,” a similar share of respondents disagreed with the statement (10% in the Klamath area and the rest of the United States, and 11% in the rest of California and Oregon).

Table 6. Responses to Comprehension Questions


The descriptions of the plans were hard to understand.

| Strongly Agree | Agree | Neither Agree nor Disagree | Disagree | Strongly Disagree
12-county Klamath Area | 4% | 10% | 25% | 38% | 24%
Rest of Oregon and California | 0% | 9% | 23% | 49% | 19%
Rest of the U.S. | 2% | 8% | 27% | 45% | 18%
Total | 2% | 9% | 25% | 44% | 20%

The survey provided me with enough information to make a choice between the options shown.

| Strongly Agree | Agree | Neither Agree nor Disagree | Disagree | Strongly Disagree
12-county Klamath Area | 20% | 47% | 22% | 9% | 2%
Rest of Oregon and California | 13% | 58% | 18% | 11% | 0%
Rest of the U.S. | 16% | 55% | 19% | 7% | 3%
Total | 16% | 53% | 20% | 9% | 2%



We also looked at the written comments provided at the end of the survey for evidence that the survey was hard to understand or biased. A total of 77 of the 276 respondents wrote additional comments at the end of the survey (33 comments from Klamath area respondents, 22 from the rest of Oregon and California, and 21 from the rest of the United States). As expected, there are comments on both sides of the issue, as well as comments unrelated to the topic of the survey. Table 7 presents comments related to the overall clarity of the survey and potential biases. The comments represent anecdotal information on how the survey was received. Overall, there were very few comments charging bias, and a number of comments that the survey was interesting and well written. A number of respondents, especially in the Klamath area, expressed thanks for the opportunity to complete a survey on the topic.

Table 7. Handwritten Comments at the End of the Survey.

Comments are grouped by the respondent’s geographic area; quotes are verbatim.

Klamath Area:
“We would like say thank you for this opportunity. The klamath river is the life blood of our area. It is everything to my wife's family.”
“You didn't address the main problem, the shasta river scott river, & salmon river-history has said the shasta was the main spawning river for salmon. I still think we should be able to do both-thanks for the survey.”
“Thanks for opportunity to provide input.”
“Your questions are slanted”
“I think you should consider using a similar survey for the san joaquin river restoration program in california.”

Rest of Oregon and California:
“I've read a lot of form letters and surveys and i was impressed with how plainly worded and clear this one was. It also made me curious to find out more about this issue.”
“I am glad to see a survey such as this being sent to gather public opinion, unfortunately, most people don't have a good biology background to grasp what is happening to our rivers and wetlands. Very sad!”
“This survey does not provide me the most important information-will water supply be adequate after dam removal. That is my top concern. Without that info, I am not able to choose either plan a/b or no action.”
“This survey is completely one-sided to support the out of control environmentalists & their allies in the federal government. There was absolutely no consideration of the plight of the farmers that have no water to farm their land…”
“This was an excellent survey. I wish ballots and/or info about voting was as clear and well written.”

Rest of the United States:
“I found the survey very informative.”


  3. Did the levels for the conjoint questions work?

    1. Overall, roughly two-thirds of the sample voted in favor of the action plans, but as expected this percentage was lower when the cost of the plan (bid amount) was higher.

Table 8 presents the percent of respondents voting for the action plans and the no action plan by geographic strata. Overall, without accounting for differences in attribute levels across the plans, 63% of the respondents selected a plan and 37% selected no action (last column of Table 8).

Table 9 breaks down the percent voting for a plan by the cost of the plan for the full sample and for the three geographic strata. Pooling the three geographic strata, the percent voting for a plan remains steady until the $90 cost level. By geographic stratum, the percent selecting a plan in the rest of the United States drops earlier, at the $48 cost level (note that the number of respondents in each cell is small, so we do not want to place too much weight on the results by stratum).

    2. Based on the responses to the conjoint questions, we propose to change the levels of the cost attribute to $12, $48, $90, and $168, instead of $12, $24, $48, $90.

As shown in Table 9, the current highest bid amount ($90) represents roughly the median willingness to pay (WTP) for the total sample (i.e., 50% vote for the plan). Ideally, we would like the range of cost levels to include the WTP of the majority of respondents, not just those with WTP at or below the median. The percent voting for the plan should decline as the cost increases, and we would like to select a top cost level at which roughly 30% or fewer vote for the plan.4

Given these results, we propose adding a higher cost level that would be closer to the right-hand tail of the distribution. A cost of $168 per year ($14 per month) would be substantially higher and should result in a lower percent selecting the plan. However, adding another level to the cost attribute (for a total of five levels) complicates the experimental design and increases the sample size needed to obtain the same level of precision in the estimates. Therefore, we propose dropping the $24 cost level. The percent selecting a plan changes little between the $12 and $24 levels for all three geographic strata, so dropping this dollar amount should not cause problems in the analysis.
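As a rough plausibility check on the proposed top bid, one can fit a simple two-parameter logit to the pooled acceptance shares in Table 9 and ask where acceptance would fall to about 30%. The sketch below is a back-of-the-envelope illustration only; it uses aggregate shares, not the respondent-level conjoint model that will be estimated for the final analysis (Section B.2).

    # Crude bid-design check: least-squares fit of logit(p) = a + b*cost
    # to the pooled shares in Table 9 (total sample row).
    import math

    bids = [12, 24, 48, 90]               # current cost levels ($/year)
    share_yes = [0.66, 0.69, 0.67, 0.49]  # percent voting for a plan, total sample

    logits = [math.log(p / (1 - p)) for p in share_yes]
    n = len(bids)
    xbar, ybar = sum(bids) / n, sum(logits) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(bids, logits)) \
        / sum((x - xbar) ** 2 for x in bids)
    a = ybar - b * xbar

    median_wtp = -a / b                        # cost where P(yes) = 0.5
    bid_30pct = (math.log(0.3 / 0.7) - a) / b  # cost where P(yes) = 0.30
    print(f"median ~ ${median_wtp:.0f}, 30% acceptance ~ ${bid_30pct:.0f}")

Under this crude fit, the implied median is near the current $90 top bid and acceptance falls to roughly 30% in the $170–$185 range, consistent with adding a $168 level while dropping $24.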

Table 8. Responses to Conjoint Questions by Strata

| 12-County Klamath Area | Rest of Oregon and California | Rest of the U.S. | Total
Voted for no action | 50% | 33% | 29% | 37%
Voted for plan | 50% | 67% | 71% | 63%



Table 9. Vote by Cost of Plan


| $12 | $24 | $48 | $90
Voted for plan, Total Sample | 66% | 69% | 67% | 49%
Voted for plan, Klamath Area | 51% | 50% | 64% | 32%
Voted for plan, Rest of Oregon and California | 73% | 73% | 72% | 51%
Voted for plan, Rest of United States | 76% | 78% | 63% | 61%


    3. The lower rate of pro-plan voting by respondents in the Klamath area reflects different attitudes and perceptions about the effectiveness and desirability of Klamath Basin restoration activities.

The finding that respondents living closest to the restoration area have a lower average propensity to vote for the plans (and hence a lower WTP) runs somewhat counter to the findings from other similar studies. For example, Schaafsma (2008) identifies 18 contingent valuation and choice experiment studies applied to environmental programs in the United States or Europe that have found statistically significant “distance-decay” effects, where WTP is negatively related to a respondent’s distance from the program area. For this project, the most directly relevant and comparable study is the Loomis (1996) analysis of the Elwha Dam removal program. That study used results from a nationwide mail CVM survey to estimate average household WTP for increases in native salmon populations resulting from the program. It found that distance (from the respondent’s residence to the Elwha River) had a small, but negative and statistically significant effect on WTP. For example, Loomis estimated that average household WTP by Washington residents was roughly 15% higher than for residents in the rest of the United States ($78 compared to $68 in 1995 dollars).

One of the most important issues in a conjoint survey like the Klamath nonuse survey is to ensure, to the extent possible, that individuals responding to the survey are presented with the same information. In short, the goal is for individuals to value a good that is presented consistently across everyone who receives the survey. However, the fact that Klamath Basin residents may have a lower WTP than residents outside the Basin does not imply that they are valuing a different good; rather, their stated values may reflect a different pre-survey information set, shaped by their proximity to the resource and the contentious history behind the development of the Klamath Basin agreements. The attitudinal and debriefing questions in the survey were designed to control for how these factors, which could be expected to vary across the three strata, influence WTP.

Our pretest findings suggest that there are important differences in the attitudes and perceptions of individuals living near the Klamath Basin compared to those living farther away. The results in Table 10 highlight these differences. In particular, respondents in the Klamath area stratum are significantly more likely to believe that (1) the plans would hurt the local economy, (2) the plans would not work as described in the survey, and (3) removing Klamath dams is a bad idea. Despite being presented with the same information in the survey, Klamath area residents tend to exhibit much more skepticism about the effectiveness and desirability of the plans.

We find that these differences account at least in part for the lower average WTP by Klamath area residents. For example, Table 11 compares rates of pro-plan voting across strata, controlling for differences in perceptions about whether the plans would work as described. Comparing across only the respondents who agree that the plans would work as described, Klamath area residents actually have the highest propensity to vote for the plan.

In our analysis of the final survey data, we will continue to control for these differences in attitudes and perceptions and to investigate their role in explaining differences in WTP. We will also examine differences in other factors, in particular socioeconomic conditions, to determine their role.
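To illustrate how these conditional comparisons are produced, the fragment below sketches the Table 11 cross-tabulation in Python. The records and field layout are hypothetical stand-ins for the pretest response file, not the actual data structure.

    # Sketch of the Table 11 cross-tab: pro-plan voting by stratum and by
    # agreement with q18f. Each record is (stratum, agrees_with_q18f, voted_for_plan).
    from collections import defaultdict

    records = [
        ("Klamath", True, False),        # hypothetical example rows; the real
        ("Rest of OR/CA", False, True),  # pretest file has 276 records
    ]
    cells = defaultdict(lambda: [0, 0])  # (stratum, agrees) -> [plan votes, total]
    for stratum, agrees, voted_plan in records:
        cells[(stratum, agrees)][0] += int(voted_plan)
        cells[(stratum, agrees)][1] += 1
    for (stratum, agrees), (yes, total) in sorted(cells.items()):
        label = "agree" if agrees else "do not agree"
        print(f"{stratum}, {label}: {yes / total:.1%} of {total}")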


Table 10. Percentage of Respondents Who Agree or Strongly Agree with Statement by Strata


| 12-County Klamath Area (N=83) | Rest of Oregon and California (N=94) | Rest of the U.S. (N=99) | Total (N=276)
q18a "My choices would be different if the economy in my area were better" | 24.1% | 24.5% | 22.2% | 23.6%
q18b "It is important to restore the KRB, no matter how much it costs" | 22.9% | 33.0% | 35.4% | 30.8%
q18c "I do not think I should have to contribute to the restoration of the KRB" | 38.6% | 17.0% | 35.4% | 30.1%
q18d "I am concerned that the plans would hurt the economy in the KRB" | 37.4% | 23.4% | 18.2% | 25.7%
q18e "The descriptions of the plans were hard to understand" | 13.3% | 8.5% | 10.1% | 10.5%
q18f "I do not believe that the plans will actually increase the number of fish as described" | 41.0% | 10.6% | 12.1% | 20.3%
q18g "Removing the dams from KR is a bad idea" | 44.6% | 19.2% | 15.2% | 25.4%
q18h "Some of the plans cost too much compared to what they would deliver" | 45.8% | 25.5% | 24.2% | 31.2%
q18i "The changes offered by the plans happen too far in the future for me to care" | 19.3% | 10.6% | 15.2% | 14.9%
q18j "The survey provided me with enough info to make a choice b/w the options shown" | 65.1% | 69.2% | 70.7% | 68.5%



Table 11. Percentage of Respondents Choosing Action Plan A (over No Action) by Strata and by Belief that Plan Would Work as Described


| 12-County Klamath Area | Rest of Oregon and California | Rest of the U.S. | Total
Respondents who agree with "I do not believe that the plans will actually increase the number of fish as described" | 11.8% of 34 | 20.0% of 10 | 8.3% of 12 | 12.5% of 56
Respondents who do NOT agree with "I do not believe that the plans will actually increase the number of fish as described" | 79.6% of 49 | 77.4% of 84 | 78.2% of 87 | 78.2% of 220
All respondents | 51.8% of 83 | 71.3% of 94 | 69.7% of 99 | 64.9% of 276



  4. Was there a difference between the long version of the survey (two conjoint questions) and the short version of the survey (one conjoint question)?

    1. The percent selecting Plan A in the long and short versions of the survey is the same, suggesting that the presence of the second conjoint question in the long version did not affect the responses to the first question (see Table 12).

Table 12. Responses to Conjoint Questions


| Long Version (N=142) | Short Version (N=123)
Voted for Plan A | 68% | 67%




  5. Additional information on votes for no action.

    1. We propose to add the statement “I would not vote for the action plans even if there were no added cost to my household” to question 19 and drop “I voted for NO ACTION because I believe my taxes are already too high.”

After the conjoint questions, question 19 (in the long version of the survey) reads:

  19. If you voted for NO ACTION in either of the two choices, please rate how much you agree or disagree with each of the following statements. If not, skip to Q20.

(Response scale: 1 = Strongly Agree, 2 = Agree, 3 = Neither Agree nor Disagree, 4 = Disagree, 5 = Strongly Disagree)

I voted for NO ACTION because I am against any more taxes or government spending. | 1  2  3  4  5
I voted for NO ACTION because I believe my taxes are already too high. | 1  2  3  4  5



Question 19 was included for sensitivity analysis. Such debriefing questions are standard practice for stated-preference surveys. These and other similar questions about the respondents’ choices were included in this survey to examine the impact of opinions about government spending and taxes on responses. Comparing responses to the two statements, the correlation coefficient is 0.87. Because the responses are highly correlated, we propose replacing the second statement with “I would not vote for the action plans even if there were no added cost to my household.” This statement would provide information about respondents who may not have a WTP greater than zero, and we believe it would provide more information for sensitivity analysis.
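The redundancy check behind this proposal is a simple Pearson correlation between the paired 1–5 responses to the two statements. A minimal sketch (Python 3.10+; the response vectors shown are illustrative stand-ins, not the pretest data, which gave r = 0.87):

    # Correlation between the two NO ACTION debriefing statements.
    from statistics import correlation  # Pearson's r

    q19_spending = [1, 2, 2, 4, 5, 1, 3, 2]  # "against any more taxes or spending"
    q19_taxes    = [1, 2, 3, 4, 5, 1, 3, 2]  # "my taxes are already too high"
    print(round(correlation(q19_spending, q19_taxes), 2))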



References

Banzhaf, H. S., D. Burtraw, D. Evans, and A. Krupnick. 2006. “Valuation of Natural Resource Improvements in the Adirondacks.” Land Economics 82(3):445-464. doi:10.3368/le.82.3.445.

Loomis, J.B. 1996. “How Large is the Extent of the Market for Public Goods: Evidence from a Nationwide Contingent Valuation Survey.” Applied Economics 28(7):779-782.

Schaafsma, M. 2008. Spatial Effects in Environmental Valuation Using Stated Preference Techniques. FEEM Doctoral Paper Series.

1 The U.S. Government and PacifiCorp are parties only to the KHSA. The U.S. Government becomes a party to the KBRA upon enactment of authorizing legislation.

2 A similar and potentially relevant SP study conducted outside the United States is Johansson and Kriström (2009), which includes a contingent valuation (CV) analysis of changes in water flow from a hydroelectric dam in Sweden. Another is Morrison and Bennett (2004), which uses SP methods to estimate and compare values for river restoration projects in five catchments in New South Wales, Australia.

3 Based on an average hourly wage of $19.41. See BLS, Employer Costs for Employee Compensation – December 2009, March 10, 2010 (http://www.bls.gov/news.release/ecec.nr0.htm).


4 For example, in a similar stated-preference study of a fish restoration program in the Adirondacks, Banzhaf et al. (2006) included bids that targeted the median and the 30th and 70th percentiles of the WTP distribution. Similar to our study, roughly 70% voted for the plan at $25 and 50% voted for the plan at $90. In their study, roughly 30% voted for the plan at $250.



