
SUPPORTING STATEMENT

PILOT TEST OF THE ELWHA RIVER DAM REMOVAL AND FLOODPLAIN RESTORATION ECOSYSTEM SERVICE VALUATION PROJECT SURVEY

OMB CONTROL NO. 0648-xxxx



A. JUSTIFICATION

1. Explain the circumstances that make the collection of information necessary.

Background

The National Oceanic and Atmospheric Administration (NOAA) is requesting approval for a new information collection effort for the purpose of conducting a pilot test of a draft survey instrument developed for the Elwha River Dam Removal and Floodplain Restoration Ecosystem Service Valuation Project (the “Project”). Information gained from this pilot test will be used to modify the draft survey instrument prior to administration of a final survey for the Project.

NOAA has received funding from the Estuary Restoration Act of 2000 (Title I, P.L. 106-457) to expand research on ecosystem services valuation. Part of NOAA’s role under this Act is to develop metrics to determine the economic value and impact of restoration. The Project will be NOAA’s first effort to develop these metrics. It is designed as a research project to evaluate the valuation of ecosystem services provided by restoration actions.

The Elwha River Ecosystem and Fisheries Restoration Act of 1992 (i.e., the “Elwha Act,” P.L. 102-495) authorized the Secretary of the Interior to acquire and remove two hydroelectric dams on the Elwha River (the Elwha and Glines Canyon dams) and implement restoration actions to restore the Elwha River and its native anadromous fisheries. The proposed Project will not be used to make agency decisions or to inform policies affecting the dam removal, restoration activities, or groups impacted by the dam removal and restoration on the Elwha River, as these activities are already underway. Rather, it is designed to capitalize on the planned dam removal and restoration efforts to allow NOAA to better understand the public’s comprehension about ecosystem service measures and the value the public places on these types of ecosystem services associated with the river habitat restoration. Additionally, we expect this research to provide helpful insights regarding public values for ecosystem services that may help inform other restoration projects.

While previous work has been done to estimate the value of removing dams on the Elwha River, we are not aware of any research on the value of restoring ecosystem services after the dams have been removed. We have designed this study to improve NOAA’s understanding of how the public values ecosystem services and their restoration more generally.

The Elwha River dam removal and restoration actions present a unique opportunity for NOAA to undertake research on ecosystem service measures and evaluate economic value associated with restoration activities. Because of the extensive planning and review process for the dam removal, significant baseline ecological data are available to allow a comparison of ecological values before and after the floodplain restoration and dam removal, and to investigate potential tradeoffs between ecological and human use values. The ability to link results of the Project to precise measures of ecosystem changes could be applied to future restoration sites, enabling NOAA to evaluate a broader range of ecosystem services provided by future restoration actions. The removal of these dams, scheduled to be completed by the end of 2013, will be the largest dam removal project in U.S. history. This dam removal, along with restoration actions planned for the floodplain and drained reservoir basins, will impact people in the surrounding region in numerous ways. Impacted groups include recreators who engage in river activities such as fishing and rafting, reservoir users, and members of the Lower Elwha Klallam Tribe (LEKT), for whom the river has cultural, environmental, and economic significance. These impacted groups are likely to place value on the natural resources of the Elwha River.

NOAA has contracted with Stratus Consulting in Boulder, Colorado, to undertake the Project and develop the total value survey by conducting qualitative research and, now, a pilot test of the developed survey. NOAA team members and the Stratus Team (collectively, “the Team”) anticipate conducting two waves of pretesting in Washington and Oregon. The proposed Project is designed to administer a stated-choice survey that measures the total value (i.e., combined use and non-use values) of alternative levels of salmon and of forest and wildlife restoration actions that provide ecosystem services, and to address an important gap in research on ecosystem service improvements associated with habitat restoration and protection.

Request

This information collection request (ICR) is to conduct a pilot study of the draft survey instrument developed through focus groups and interviews conducted under OMB Control No. 0648-0638. The proposed pilot study would be administered in two waves. In the first wave, we propose to use Knowledge Networks (KN), a GfK company, to administer the survey online to its existing KnowledgePanel® in Washington and Oregon. The goal is to achieve 1,050 completed surveys. We propose to use the KnowledgePanel® in the first wave to evaluate alternative presentation formats of some of the survey information to respondents. The information gained from the testing in KnowledgePanel® will be used to select and administer one of the approaches evaluated in Wave 1 during Wave 2.

Wave 1 of the pilot study will inform Wave 2 in the following ways:

  1. Bid Design – We will review the data from Wave 1 to assess whether the bid amounts are sufficient to separate differences in willingness to pay (WTP) between salmon and forest restoration and to encompass a reasonable range of respondents’ WTP.

  2. Choice question format – We will determine whether the format of the choice questions is clear and whether one format dominates the other. The most efficient question format is anticipated to be used in Wave 2.

  3. Scenario acceptance – We will determine whether respondents had difficulty in understanding the description of the scenario or the questions being asked.

We will assess this information by analyzing respondents’ votes and their responses to follow-up questions, and by reviewing any open-ended comments respondents include on their surveys. This assessment of Wave 1 would then inform any changes made to Wave 2.

During survey development, the Team evaluated alternative formats of the choice tasks (see Attachments B and C). The alternatives varied in the amount of information presented in the choice task tables and in whether respondents must select a single alternative that predetermines the levels of salmon and reservoir restoration or may choose the levels of salmon and reservoir restoration independently. Reservoir restoration is the mechanism for providing increased forest and wildlife services. During survey development, the Team found that respondents preferred being able to choose the level of salmon restoration independent of the level of reservoir restoration. This approach would be novel compared to previous stated-choice surveys. Therefore, we propose testing the alternative approaches using the KnowledgePanel® to evaluate the preferred approach prior to Wave 2 of the pretest effort. Because NOAA ultimately plans to administer the final survey instrument by mail, the second wave would be administered by mail with a goal of achieving 250 completed surveys. During this wave, we would administer the preferred choice format and focus on collecting information to refine bid values for the main survey.
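As an illustration of the bid-design check described above, the sketch below tabulates the share of Wave 1 respondents selecting a paid alternative at each bid amount. The file and column names (wave1_pilot_responses.csv, bid_amount, chose_program) are hypothetical placeholders, not the actual KnowledgePanel deliverable.

import pandas as pd

# Hypothetical file and column names for the Wave 1 data delivery.
wave1 = pd.read_csv("wave1_pilot_responses.csv")

# Share of respondents choosing a paid alternative at each bid amount.
# A bid range that brackets respondents' WTP should show acceptance falling
# from well above zero at the lowest bids to a small share at the highest bids.
acceptance = (
    wave1.groupby("bid_amount")["chose_program"]
    .mean()
    .rename("share_accepting")
    .reset_index()
    .sort_values("bid_amount")
)
print(acceptance)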



2. Explain how, by whom, how frequently, and for what purpose the information will be used. If the information collected will be disseminated to the public or used to support information that will be disseminated to the public, then explain how the collection complies with applicable NOAA Information Quality Guidelines.

The information collected will be used to improve NOAA’s understanding of the importance of ecosystem service restoration to the public and will not be used to inform policies regarding dam removal or river restoration on the Elwha River.

This information collection will form the basis for pretesting an effective total valuation survey. The information collected during this pretest will be used by the Team to finalize a proposed approach to conduct the main survey. An additional ICR will be developed to undertake the main survey.

At the regional level, we anticipate a diversity of views about the dam removal and habitat restoration. The Team has already investigated the heterogeneity of views and values through a qualitative investigation using a series of focus groups and one-on-one interviews (see Attachment A for a summary of qualitative research findings). Based on this research and input from external peer review, internal peer review, and scientific fact checking, the Team has designed the draft survey instrument included in this ICR (see Attachment B) and is prepared to pretest it with a small sample of people from Washington and Oregon.

The purpose of the first wave of the pretest is to test two ways of presenting the choice question to respondents (see Attachments B and C). Once KN sends the pretest data for the first wave of data collection, the Team will analyze it using simple summary statistics, develop a presentation on the results, and make any changes, if necessary, to the draft survey instrument or experimental design before implementing the second wave of the pretest.

The purpose of the second wave of the pretest is to make sure that there are no issues with administering the survey via mail, which is the proposed mode of data collection for the final survey.

Overview of the draft survey instrument

NOAA has undertaken extensive qualitative research, scientific fact checking, and peer review in developing the draft survey and research plan. Through this process, NOAA anticipates that the information collected will comply with NOAA Information Quality Guidelines. Below we describe the efforts undertaken to develop, check, and preliminarily test the data collection survey.

The survey instrument included in this ICR has been designed for a mail-mode administration, which is the proposed mode of collection for the second wave of the pretest and the final survey. This version differs from the executable version for the first wave of the pretest in format only. An online address to review the Web Interface version will be provided to the Office of Management and Budget (OMB) as soon as all the screens have been programmed correctly.

Throughout the development and presentation of materials in the survey, the Team has strived to present information in a balanced, neutral manner. Discussions of the details of this balance are provided in the individual sections below.

As the information is presented, it is divided into sections by questions designed to encourage review and consideration of survey information and to provide us with feedback on respondents’ preferences based on the information they have seen up to that point.

Summaries of the major sections of the main survey follow.

Part I: Introduction to the study

The first section provides the background and purpose of the survey, which is to ask people what they think should be done, if anything, to improve the environment around the Elwha River in western Washington. The respondents then learn more about the Elwha River and the surrounding area. The Team included a map as an insert in this survey to provide more context and to familiarize respondents with the area.

The text then emphasizes that the Elwha Dam has been removed and the Glines Canyon Dam will be removed by 2013. The Team tested this section frequently to make sure respondents understood that dam removal was a given.

In this section, we ask respondents about whether they have heard of the Elwha River or about the removal of the dams, and whether they have visited the Elwha River or Olympic National Park. These questions will help the Team understand respondents’ knowledge of and familiarity with the Elwha River.

Part II: The Elwha River ecosystem

This section introduces the concept of ecosystem, which the Team found during testing was a concept most people understood. It discusses two components of the ecosystem: (1) salmon and (2) forests and wildlife.

Respondents then learn about a less familiar concept: keystone species. During testing, the accompanying diagram helped respondents understand what a keystone species is and how salmon served as a keystone species for the Elwha River ecosystem before the dams were built.

This section also introduces the role that forests played in the Elwha River ecosystem before the dams were built. The graphic on this page shows respondents the variety of plants and animals found in the forests along the Elwha River and how people use the forests. The section ends with a summary of how the dams have affected the Elwha River ecosystem; for example, they prevented salmon from moving upstream to spawn, and the lakes behind them covered the forests.

Part III: Restoration alternatives

In this section, respondents learn three things about salmon restoration and forest and wildlife recovery: (1) the effects of the dams on each of them, (2) how much could be restored once the dams are gone, and (3) steps that could be taken to speed up the restoration process. For salmon restoration, the baseline condition is presented as a percentage of historical levels of fish returning to the Elwha River each year. Our main source for information about historical, current, and future salmon numbers is NOAA’s Fish Restoration Plan (NOAA, 2008). We included an estimate of historical numbers of returning fish for two reasons. First, this number provides general context for the valuation exercise so that respondents have a sense of the scale of changes they might expect to see. Second, it helps make the basic point that full restoration to historical levels is impossible. However, few records exist for use in estimating numbers of spawning salmon before 1913, when the Elwha Dam was completed. As a result, scientists have struggled to estimate historic numbers. NOAA (2008, pp. 83–85) summarizes the available studies. We are also working with Dr. George Pess of NOAA’s Northwest Fisheries Science Center, one of the lead biologists working on Elwha River anadromous fish restoration, to understand a reasonable range of historical levels to use based on available studies.

The improvement scenarios used in the choice questions drew heavily on the expertise of Dr. Pess. Our “Salmon Alternative 2: Limited Action” (150,000 fish) is in keeping with NOAA’s Fish Restoration Plan (NOAA, 2008, pp. 86–94). “Salmon Alternative 3: Extensive Action” (180,000 fish) reflects the estimates of some of the more optimistic biologists. Both numbers were vetted by Dr. Pess, who believes that the less optimistic alternative is more likely. These values were based on estimates of spawner escapement rather than those of total production, which are much larger. We used spawner escapement figures because we found that people in our focus groups and cognitive interviews could better understand restoration when described in terms of the average number of adult fish returning each year to spawn.

During testing, respondents understood that there are limits to how many fish could return to the Elwha River because of habitat changes below the dam and current fishing pressures. During the qualitative research phase, the Team learned that people care strongly about restoring the environment to a condition that more closely resembles what it looked like before the dams were built. For forests and wildlife, the baseline condition is presented as a percentage of full recovery.

For both salmon restoration and forest and wildlife recovery, there are three alternatives respondents can consider: doing nothing more after dam removal (“No further actions”), doing some limited actions, or doing some extensive actions. Respondents see a graph and a table showing how the salmon and forests and wildlife would recover under each of the three alternatives. For salmon restoration, respondents see percentages of salmon returning each year, as well as the actual numbers of salmon returning each year. During the qualitative research phase, the Team learned that some people like graphs, some people like tables, and some people like both. The Team also learned that people like to see both the percentages and the cardinal numbers, which is why both are included in the salmon restoration table. For the forest and wildlife restoration section, respondents only see percentages of recovery in the graph and table, which the Team found was acceptable to respondents.

Part IV: Benefits and negative impacts

This section reminds respondents about the tradeoffs between taking additional steps to restore the Elwha River ecosystem and letting it recover naturally after dam removal. Although people may benefit from personally using the river for recreation or just knowing the river is restored, it could have a negative impact on trout populations and would come at a cost to the respondent.

Part V: Payment mechanism

In this section, respondents learn how the additional restoration actions would be paid for, i.e., by adding a surcharge to the electricity bills sent to the general public in Washington and Oregon in 2013.

Part VI: Choice question

This final section asks respondents to choose the alternative or option they think is best. For the first wave of the pretest, we are running an experiment using two versions of the choice question.

In the choice question at the end of Attachment B, respondents can choose the level of restoration they want for salmon independently from what level of restoration they want for forests and wildlife. The Team found during the qualitative research phase that respondents wanted to be able to choose these levels separately, rather than be bound by a preselected bundle of alternatives.

The choice question in Attachment C presents respondents with a more traditional choice format. In this format, respondents can always choose doing nothing for both salmon and forests and wildlife. The other two options, however, could be any number of combinations of doing nothing, doing limited actions, or doing extensive actions (see experimental design below).

Part VII: Debriefing questions

At the end of the survey, we ask several debriefing questions to test whether respondents thought their opinion was consequential (i.e., to see whether they thought they would actually have to pay the surcharge in 2013 and whether they thought public officials would take their opinion into consideration).

Experimental design

This section describes the experimental design for the Elwha River Dam Removal and Floodplain Restoration Ecosystem Service Valuation Pilot Survey (“survey”). The developed design will be pretested using a subset of the overall design. Adjustments to the final design for the main survey will be based upon the results of the pretest. This section describes the method and layout of the experimental design that will be used for the pretest survey. We expect to revise the attribute levels presented here based on results from the pretest.

The survey includes six attributes: (1) number of years until the maximum salmon restoration level is achieved, (2) the maximum percentage of salmon restoration achievable, (3) the number of years until the maximum forest restoration level is achieved, (4) the maximum percentage of forest and wildlife restoration achievable, (5) the costs of the salmon restoration alternatives, and (6) the costs of the forest restoration alternatives.

For each attribute, three alternatives are offered: (1) do nothing more, (2) do limited actions, and (3) do extensive actions. Each option entails a different time path following a logistic curve for recovery of the attribute. The time path is defined by the maximum level the attribute can reach and how quickly it reaches that level under the alternative. For salmon and forest and wildlife time paths, extensive actions provide the fastest recovery; limited actions provide a slower recovery than extensive but a faster one than doing nothing more; and doing nothing more is the slowest way of the three. For forests and wildlife, each alternative leads to the same maximum recovery level (100%). For salmon, extensive actions lead to the greatest percentage of historical levels; limited actions lead to an intermediate percentage of historical levels; and doing nothing more leads to the lowest percentage of historical levels.

The time paths for each attribute are based on the Elwha River fisheries restoration plan, the revegetation plan, and conversations with restoration botanists, wildlife biologists, and fisheries biologists.
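As an illustration only, the sketch below generates a logistic recovery path from the two quantities that define each time path (the maximum level and the years until that maximum is approximately reached), using the salmon figures that appear later in Table 2. The starting level and the “within 95% of the maximum” convention are assumptions made for the sketch, not parameters taken from the restoration plans.

import numpy as np

def logistic_path(max_level, years_to_max, start_frac=0.05, horizon=200):
    # Fraction restored in years 0..horizon, rising along a logistic curve
    # from start_frac*max_level toward max_level and reaching roughly 95%
    # of the maximum by years_to_max.
    t = np.arange(horizon + 1)
    a = np.log((1 - start_frac) / start_frac)   # fixes the starting level
    k = (np.log(19) + a) / years_to_max         # fixes the speed of recovery
    t0 = a / k                                  # inflection year
    return max_level / (1.0 + np.exp(-k * (t - t0)))

# Example with the salmon figures from Table 2: "Do nothing more" tops out at
# 40% of historical levels after roughly 100 years; "Extensive" at up to 60%
# after roughly 20 years.
no_action = logistic_path(0.40, 100)
extensive = logistic_path(0.60, 20)
print(no_action[[0, 20, 50, 100]].round(2))
print(extensive[[0, 10, 20]].round(2))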

There are nine possible combinations of the three alternatives for the non-cost attributes (see Table 1).

Table 1. Possible combinations of alternatives

Combination | Salmon restoration | Forest and wildlife restoration
1           | Do nothing more    | Do nothing more
2           | Do nothing more    | Limited
3           | Do nothing more    | Extensive
4           | Limited            | Do nothing more
5           | Limited            | Limited
6           | Limited            | Extensive
7           | Extensive          | Do nothing more
8           | Extensive          | Limited
9           | Extensive          | Extensive



Under one presentation alternative, we split the choice question into two parts so that respondents can find their most preferred combination without facing a choice question with nine alternatives. Respondents first pick their most preferred of the three alternatives for salmon; then they pick their most preferred of the three alternatives for forests and wildlife. These alternatives are assigned to survey versions in a way that precludes a respondent from receiving the same scenario back to back. To underscore to respondents that their payment is the sum of the two alternatives they choose, we ask them to add the costs and write down the total cost of their selections.

To our knowledge, this approach is novel in the stated preference literature. While participants in focus groups understood the task and preferred the flexibility it offered, we would like to use a larger sample to test for statistical differences between responses to the “independent” approach and a “traditional” choice experiment approach. Both approaches will present three alternatives, with “Do nothing more” as an option for each choice question. We will divide the sample into two parts, with one half of the respondents receiving the independent choice question and the other half receiving three traditional choice questions in each survey. The traditional choice questions will present a subset of three of the nine combinations of alternatives in each version, with “Do nothing” for both attributes as one of the three alternatives.

Table 2 summarizes the attributes and levels we will use for the pretest survey; we will use one level for the non-cost attributes. Cost will include two levels for each alternative. Cost is always greatest for the extensive alternative, less for the limited alternative, and zero for the “Do nothing more” alternative.

As is common in a study using iterative bid design development, we are using this pilot study to develop bid amounts for the full field study. The costs (bid amounts) were determined using feedback from focus group participants and restrictions imposed by the policies we evaluated. During focus groups, we tested different maximum costs to identify an amount that most respondents who otherwise expressed support for the program were not willing to pay. After establishing the maximum cost, we selected bid amounts that satisfied the following criteria: 1) in any given pair, salmon restoration always costs more than forest restoration, 2) the extensive program always costs more than the limited program, and 3) the costs were spaced roughly evenly between zero and the maximum. Based on our experience with bid design, we expect that the range and spacing of bid amounts will provide sufficient variability to estimate willingness to pay. Additionally, the experimental design has been developed to ensure that the bid amounts are set against a range of levels for the other attributes to provide the variability that will allow us to estimate willingness to pay as precisely as possible. This set of levels will allow us to estimate coefficients for cost and each of the six time paths.
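As an illustration of how candidate bid sets can be screened against ordering criteria of this kind, the sketch below compares the cost levels that appear in Table 2. Pairing the salmon and forest levels by action level and allowing ties are simplifying assumptions of this sketch, not the Team’s screening rule.

from itertools import product

# Candidate bid amounts from Table 2 (dollars per year).
salmon_bids = {"limited": [40, 60], "extensive": [75, 150]}
forest_bids = {"limited": [20, 40], "extensive": [45, 75]}

# Criterion: the extensive program always costs more than the limited program.
print(min(salmon_bids["extensive"]) > max(salmon_bids["limited"]))  # True
print(min(forest_bids["extensive"]) > max(forest_bids["limited"]))  # True

# Compare salmon and forest costs for each pairing at the same action level
# (pairing by level and allowing ties is an assumption of this sketch).
for level in ("limited", "extensive"):
    for s_cost, f_cost in product(salmon_bids[level], forest_bids[level]):
        print(f"{level}: salmon ${s_cost} vs. forest ${f_cost} "
              f"-> salmon >= forest: {s_cost >= f_cost}")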



Table 2. Program attributes and associated levels

Attribute                    | Maximum percentage reached | Years until maximum percentage reached | Cost ($/year)
% of salmon restored         |                            |                                        |
  Do nothing more            | 40%                        | 100                                    | 0
  Limited                    | 50%                        | 30                                     | [40, 60]
  Extensive                  | [50%, 60%]                 | 20                                     | [75, 150]
% of mature forest restored  |                            |                                        |
  Do nothing more            | 100%                       | 200                                    | 0
  Limited                    | 100%                       | 125                                    | [20, 40]
  Extensive                  | 100%                       | 90                                     | [45, 75]



There are eight possible choice sets for salmon that contain all of the different combinations of levels being considered, and four possible choice sets for forests and wildlife that contain all combinations of levels. These can be combined into 12 orthogonal, main-effects survey versions, as listed in Table 3. This is the experimental design for the “independent” choice question format. Survey versions are designed to provide sufficient variation in main-effects alternatives while eliminating the opportunity for respondents to receive the same scenario twice.
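The counts above follow from the levels in Table 2 that vary across versions: the salmon alternatives differ in the extensive maximum (50% or 60%), the limited cost ($40 or $60), and the extensive cost ($75 or $150), while the forest alternatives differ only in the limited cost ($20 or $40) and the extensive cost ($45 or $75). A short enumeration sketch follows; treating these as fully crossed factors is our reading of Table 2, not a description of the Team’s design algorithm.

from itertools import product

# Salmon choice sets: extensive maximum (50% or 60%), limited cost ($40 or $60),
# extensive cost ($75 or $150) -> 2 x 2 x 2 = 8.
salmon_sets = [
    {"extensive_max_pct": m, "limited_cost": lc, "extensive_cost": ec}
    for m, lc, ec in product([50, 60], [40, 60], [75, 150])
]

# Forest choice sets: limited cost ($20 or $40), extensive cost ($45 or $75) -> 4.
forest_sets = [
    {"limited_cost": lc, "extensive_cost": ec}
    for lc, ec in product([20, 40], [45, 75])
]

print(len(salmon_sets), len(forest_sets))  # 8 4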

Table 3. Survey versions for independent choice format

Version | Alternative     | Salmon max % | Salmon yr max | Salmon cost ($/yr) | Forest max % | Forest yr max | Forest cost ($/yr)
1       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 40                 | 100%         | 125           | 20
        | Extensive       | 60%          | 20            | 75                 | 100%         | 90            | 45
2       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 40                 | 100%         | 125           | 40
        | Extensive       | 60%          | 20            | 75                 | 100%         | 90            | 75
3       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 40                 | 100%         | 125           | 20
        | Extensive       | 60%          | 20            | 150                | 100%         | 90            | 75
4       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 40                 | 100%         | 125           | 40
        | Extensive       | 60%          | 20            | 150                | 100%         | 90            | 45
5       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 60                 | 100%         | 125           | 20
        | Extensive       | 60%          | 20            | 75                 | 100%         | 90            | 75
6       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 60                 | 100%         | 125           | 40
        | Extensive       | 60%          | 20            | 75                 | 100%         | 90            | 45
7       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 60                 | 100%         | 125           | 20
        | Extensive       | 60%          | 20            | 150                | 100%         | 90            | 45
8       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 60                 | 100%         | 125           | 40
        | Extensive       | 60%          | 20            | 150                | 100%         | 90            | 75
9       | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 40                 | 100%         | 125           | 20
        | Extensive       | 50%          | 20            | 75                 | 100%         | 90            | 45
10      | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 40                 | 100%         | 125           | 20
        | Extensive       | 50%          | 20            | 150                | 100%         | 90            | 75
11      | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 60                 | 100%         | 125           | 40
        | Extensive       | 50%          | 20            | 75                 | 100%         | 90            | 45
12      | Do nothing more | 40%          | 100           | 0                  | 100%         | 200           | 0
        | Limited         | 50%          | 30            | 60                 | 100%         | 125           | 40
        | Extensive       | 50%          | 20            | 150                | 100%         | 90            | 75



The versions that use the bundled choice question format will select from a subset of the 72 combinations of salmon and forest and wildlife programs to produce a main-effects orthogonal design. There will be 12 versions using the traditional choice question format, listed in Table 4. As in the independent choice format, respondents are precluded from receiving certain scenarios back to back in the traditional choice format.

Table 4. Survey versions for traditional choice questions with three questions per version

Version | Choice set | Salmon max % | Salmon yr max | Forest max % | Forest yr max | Cost ($/yr)
1       | 1          | 40%          | 100           | 100%         | 200           | 0
        | 1          | 50%          | 30            | 100%         | 90            | 85
        | 1          | 60%          | 20            | 100%         | 125           | 115
        | 2          | 40%          | 100           | 100%         | 200           | 0
        | 2          | 40%          | 100           | 100%         | 125           | 20
        | 2          | 50%          | 30            | 100%         | 125           | 60
        | 3          | 40%          | 100           | 100%         | 200           | 0
        | 3          | 60%          | 20            | 100%         | 200           | 100
        | 3          | 60%          | 20            | 100%         | 90            | 175
2       | 4          | 40%          | 100           | 100%         | 200           | 0
        | 4          | 60%          | 20            | 100%         | 125           | 120
        | 4          | 50%          | 30            | 100%         | 90            | 85
        | 5          | 40%          | 100           | 100%         | 200           | 0
        | 5          | 40%          | 100           | 100%         | 90            | 75
        | 5          | 60%          | 20            | 100%         | 90            | 175
        | 6          | 40%          | 100           | 100%         | 200           | 0
        | 6          | 50%          | 30            | 100%         | 200           | 40
        | 6          | 50%          | 30            | 100%         | 125           | 80
3       | 7          | 40%          | 100           | 100%         | 200           | 0
        | 7          | 50%          | 30            | 100%         | 90            | 135
        | 7          | 60%          | 20            | 100%         | 125           | 95
        | 8          | 40%          | 100           | 100%         | 200           | 0
        | 8          | 40%          | 100           | 100%         | 90            | 75
        | 8          | 60%          | 20            | 100%         | 90            | 150
        | 9          | 40%          | 100           | 100%         | 200           | 0
        | 9          | 50%          | 30            | 100%         | 200           | 60
        | 9          | 50%          | 30            | 100%         | 125           | 80
4       | 10         | 40%          | 100           | 100%         | 200           | 0
        | 10         | 40%          | 100           | 100%         | 125           | 40
        | 10         | 50%          | 30            | 100%         | 125           | 100
        | 11         | 40%          | 100           | 100%         | 200           | 0
        | 11         | 60%          | 20            | 100%         | 200           | 75
        | 11         | 60%          | 20            | 100%         | 90            | 120
        | 12         | 40%          | 100           | 100%         | 200           | 0
        | 12         | 60%          | 20            | 100%         | 125           | 140
        | 12         | 50%          | 30            | 100%         | 90            | 105



We will compare the two choice question formats using four criteria: statistical efficiency, consequentiality, burden hours, and item non-response. First, we will compare the standard errors on the cost coefficient. The approach with the smallest standard errors is the most statistically efficient, allowing us to estimate the most precise willingness-to-pay (WTP) values. Second, we will compare whether respondents perceive that the government would be more likely to act based on results from the survey and whether they believed they would have to pay the amount they chose. Greater consequentiality would mean a more realistic scenario and more accurate WTP estimates. Third, we will compare how long it takes respondents to complete the choice question section to see whether the independent approach could significantly reduce the public’s burden hours associated with stated preference surveys. Finally, we will compare item non-response rates between approaches to see if the independent approach could significantly reduce the occurrence of skipped choice questions.
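For the burden-hours and item non-response criteria, a minimal sketch of the planned comparisons is shown below. The file and column names (wave1_pilot_responses.csv, format, minutes_on_choice, skipped_choice) are hypothetical placeholders rather than the actual variable names in the KN data delivery; the standard-error and consequentiality comparisons would come from the choice-model estimation and the debriefing items rather than from this snippet.

import pandas as pd
from scipy.stats import ttest_ind
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical file and column names for the Wave 1 data delivery.
pilot = pd.read_csv("wave1_pilot_responses.csv")
indep = pilot[pilot["format"] == "independent"]
trad = pilot[pilot["format"] == "traditional"]

# Burden-hours criterion: time spent on the choice question section.
t_stat, t_p = ttest_ind(indep["minutes_on_choice"].dropna(),
                        trad["minutes_on_choice"].dropna(),
                        equal_var=False)

# Item non-response criterion: share of respondents who skipped the choice question.
counts = [indep["skipped_choice"].sum(), trad["skipped_choice"].sum()]
nobs = [len(indep), len(trad)]
z_stat, z_p = proportions_ztest(counts, nobs)

print(f"completion time: t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"item non-response: z = {z_stat:.2f}, p = {z_p:.3f}")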

Use of stated choice questions

Stated choice methods have been identified as a useful tool to better understand individuals’ preferences and values for environmental amenities that are not traded in markets. While the Elwha River is currently used by some in the region, its potential restoration would contribute to the broader public good. No markets are available to study the value of restoring this ecosystem. Stated choice methods also allow for the evaluation of a full range of restoration alternatives, including doing nothing more once the dams are removed.

Stated choice methods are well established in the literature on environmental economics (Kanninen, 2007). This approach evolved from conjoint analysis, a method used extensively in marketing and transportation research (Louviere et al., 2000).1 Conjoint analysis requires respondents to rank or rate multiple alternatives in which each one is characterized by multiple characteristics (e.g., Johnson et al., 1995; Roe et al., 1996; Holmes and Adamowicz, 2003). Choice questions require respondents to choose their best alternative (a partial ranking) from multiple alternative goods (i.e., a choice set), where the alternatives within a choice set are differentiated by their characteristics.

There are many desirable aspects of stated choice questions, not the least of which is the nature of the choice being made. Choosing the most preferred alternative from a set of alternatives is a task people perform routinely in everyday decisions. Morikawa et al. (1990) note that responses to choice questions often contain useful information on tradeoffs among characteristics. Quoting from Mathews et al. (1997), stated choice “models provide valuable information for restoration decisions by identifying the characteristics that matter to anglers and the relative importance of different characteristics that might be included in a fishing restoration program.” Johnson et al. (1995) note that “The process of evaluating a series of pair wise comparisons of attribute profiles encourages respondents to explore their preferences for various attribute combinations.” Choice questions encourage respondents to concentrate on the tradeoffs between characteristics, rather than to take a position for or against an initiative or policy. Adamowicz et al. (1998a) note that the repeated nature of choice questions makes it difficult to behave strategically.

Choice questions allow for the construction of goods characterized by levels that currently do not exist. This feature is particularly useful in marketing studies whose purpose is to estimate preferences for proposed goods, where various characteristics can be manipulated in arriving at final product designs.2 For example, Beggs et al. (1981) assess the potential demand for electric cars. Similarly, researchers estimating the value of environmental goods are often valuing a good or condition that does not currently exist, e.g., a restored ecosystem.

Choice questions, rankings, and ratings are increasingly used to estimate the value of environmental goods. For example, Magat et al. (1988) and Viscusi et al. (1991) estimate the value of reducing health risks; Adamowicz et al. (1994, 1998b, 2004), Breffle et al. (2005), and Morey et al. (1999a) estimate recreational site choice models for moose hunting, fishing, and mountain biking; Breffle and Rowe (2002) estimate the value of broad ecosystem attributes (e.g., water quality, wetlands habitat); Adamowicz et al. (1998a) estimate the value of enhancing the population of a threatened species; Layton and Brown (1998) estimate the value of mitigating forest loss resulting from global climate change; and Morey et al. (1999b) estimate WTP for monument preservation in Washington, DC. In each of these studies, a price (e.g., a tax or a measure of travel costs) is included as one of the characteristics of each alternative so that preferences for the other characteristics can be measured in terms of dollars. Other examples of choice questions to value environmental commodities include Swait et al. (1998), who compare prevention versus compensation programs for oil spills, and Mathews et al. (1997) and Ruby et al. (1998), who ask anglers to choose between two saltwater fishing sites as a function of their characteristics.
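As a point of reference, and only as a sketch of the standard random-utility framework rather than a description of the Team’s estimation model: if respondent i’s utility from alternative j is linear in cost and the other characteristics,

U_{ij} = \beta_c \, \mathrm{cost}_{ij} + \sum_k \beta_k x_{ijk} + \varepsilon_{ij},

then the marginal willingness to pay for characteristic k is the coefficient ratio \mathrm{WTP}_k = -\beta_k / \beta_c, which is why including a price characteristic allows preferences for the other characteristics to be expressed in dollars.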

Alternatively, a number of environmental studies have used ratings, in which survey respondents rate the degree to which they prefer one alternative to another. For example, Opaluch et al. (1993) and Kline and Wichelns (1996) develop a utility index for the characteristics associated with potential noxious facility sites and farmland preservation, respectively. Johnson and Desvousges (1997) estimate WTP for various electricity generation scenarios using a rating scale in which respondents indicate their strength of preference for one of two alternatives within each choice set. Other environmental examples include Rae (1983), Lareau and Rae (1998), Krupnick and Cropper (1992), Gan and Luzar (1993), and Mackenzie (1993). Adamowicz et al. (1998b) provide an overview of choice and ranking experiments applied to environmental valuation, and argue that choice questions better predict actual choices than do rating questions because choice questions mimic the real choices individuals are continuously required to make, whereas individuals rank and rate much less often.3

Choice and rating questions characterize the alternatives in terms of a small number of characteristics. For example, Opaluch et al. (1993) characterize noxious facilities in terms of seven characteristics; Adamowicz et al. (1998b) use six characteristics to describe recreational hunting sites; Johnson and Desvousges (1997) use nine characteristics to describe electricity generation scenarios; Mathews et al. (1997) use seven characteristics to describe fishing sites; Morey et al. (1999a) use six characteristics to describe mountain biking sites; and Morey et al. (1999b) use two characteristics to characterize monument preservation programs.

How information disseminated to the public complies with NOAA Information Quality Guidelines



Utility

The overall study goals were refined through the qualitative research phase of this project and through meetings with key stakeholder groups, including federal and state resource managers and the Team. These initial meetings allowed us to identify key information needs. At critical points throughout the study, we updated the key stakeholders on the status of the study. This ensures that all information developed from this project will be transparent to all members of the public. Any information that is ultimately disseminated to the public will provide detailed analysis of the value associated with improving ecosystem services, which is a key issue in many environmental policy decisions. During these conversations, stakeholders expressed a desire for better information on the benefits provided by ecosystem service improvements through habitat restoration. The information developed during the Pilot Project will provide some of this information.

Objectivity

The survey instrument will contain scientific facts/information and potential scenarios that will be presented to respondents. The information will allow them to make tradeoffs and state preferences for different ecosystem services and ecological outputs (e.g., changes in fish biomass). These ecological outputs as presented were vetted for their validity by subject-matter experts such as fish biologists. The goal is to present balanced and factual information to the respondents. We also conducted internal peer reviews on all work products. External reviewers also had an opportunity to comment on factual details presented in the survey and work products during the qualitative research process. Peer review will ensure that the information collected is accurate, reliable, and unbiased and that the information reported to the public is accurate, clear, complete, and unbiased.

Integrity

During both waves of the pretest, participants will be reminded that their participation is voluntary, that their responses will be protected, and that no material identifying them will be provided to anyone.

NOAA will retain control over the information and safeguard it from improper access, modification, and destruction, consistent with NOAA standards for confidentiality, privacy, and electronic information. See response to Question 10 of this Supporting Statement for more information on confidentiality and privacy. The information collection is designed to yield data that meet all applicable information quality guidelines. Prior to dissemination, the information will be subjected to quality control measures and a pre-dissemination review pursuant to Section 515 of Public Law 106-554.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological techniques or other forms of information technology (IT).

For the first wave of the pretest, we propose the use of automated, electronic data collection by using KN’s KnowledgePanel® administered over the Internet. The KnowledgePanel® is an online non-volunteer access panel.4 Recruited households without Internet access are provided with a netbook computer and free Internet service to allow them to participate. All Web-enabled panel surveys are self-administered, which allows respondents to complete the surveys at their convenience and own pace, in the comfort and privacy of their homes. The electronic survey system supports the inclusion of video, audio, and 3-D graphics in the questionnaire if so desired. The electronic data collection can track how long respondents spend on each screen.

The data capture survey system, owned by KN, was designed to meet the specific requirements of Web-based surveys. The system supports all types of questions commonly used in complex, computer-based interviewing systems. It uses advanced scripting techniques for customization of individual questions to meet the needs of researchers proposing innovative designs. The data capture platform supports the complexity and type of questions proposed in our study, including multimedia graphics and voice-over presentation. The system also supports the importation of auxiliary data, such as demographic information collected as part of the screening. These data can be used to inform question logic, question wording, etc.

The second wave of the survey will be administered by mail, the planned method for the final survey.

4. Describe efforts to identify duplication.

Based on discussions with a variety of stakeholders (academic, governmental, and Tribal representatives) involved in the dam removal and restoration effort, we have found no existing data collection activities that have specifically addressed the information needs of this study. While research has been done to value dam removal on the Elwha River, it did not address the value of restoration activities once the dams have been removed.

5. If the collection of information involves small businesses or other small entities, describe the methods used to minimize burden.

This pretest will target individuals rather than small businesses or small entities.

6. Describe the consequences to the Federal program or policy activities if the collection is not conducted or is conducted less frequently.

Without this collection, NOAA will be unable to develop the tools necessary to conduct this research.

7. Explain any special circumstances that require the collection to be conducted in a manner inconsistent with OMB guidelines.

For the pretest, we anticipate that the response rate will be lower than OMB guidelines suggest. Because the goals of the pretest are methods development and bid design with limited sample size, we do not anticipate any non-response follow-up efforts. For the first wave of the pretest, we anticipate getting a 20% response rate. The low overall expected response rate is due to the multi-stage construction of the KN Panel. For the second wave of the pretest, we anticipate getting up to a 5% higher response rate, due to incentives (see Question 9).

8. Provide information on the PRA Federal Register Notice that solicited public comments on the information collection prior to this submission. Summarize the public comments received in response to that notice and describe the actions taken by the agency in response to those comments. Describe the efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

A Federal Register Notice requesting comments regarding this request was published on September 12, 2012 (77 FR 56189). One comment was received by the agency. The commenter, who resides in Illinois, asked about the decision to limit the survey to Washington and Oregon residents, given the presence of a national park in the proposed study area. The agency responded to the comment by indicating that similar studies typically limit geographic scope in some manner, the current limit on geographic scope is based on the expected familiarity with the study area by residents of Oregon and Washington, and, depending on the results of this work, future research may seek to evaluate a national sample. In addition, the agency received one request for additional information; the survey was provided to the person making that request.

Consultants outside the agency

NOAA and Stratus Consulting have compiled a team of experts to carry out this study. Key team members include Mr. David Chapman, Dr. Megan Lawson, Ms. Colleen Donovan, and Ms. Heather Hosterman of Stratus Consulting; Dr. Richard Bishop (Professor Emeritus, University of Wisconsin); Dr. James Boyd (economist with Resources for the Future); Dr. John Duffield (Professor, University of Montana); Dr. John Loomis (Professor, Colorado State University); Dr. Roger Tourangeau (statistician and sampling expert at Westat); and Dr. Barbara Kanninen (econometrics expert with BK Econometrics, LLP). We have also contracted with Dr. Richard Carson, Professor at the University of California, San Diego, to participate as a peer reviewer. These experts have extensive experience in all disciplines necessary to complete an effective study, including the fields of non-market valuation, econometrics, and survey research and design. They have frequently applied their expertise in the context of environmental issues, including the protection of threatened and endangered (T&E) species, the implementation of ecological restoration projects, water quality issues, water allocation issues, impacts to recreation, and impacts to Tribal resources. Members of this Team have worked extensively for federal, state, and local governments; American Indian Tribes in the Pacific Northwest and throughout the United States; non-profit groups; and research foundations.

Our team has substantial experience using non-market valuation methods to address environmental issues including valuation of ecosystem services, addressing dam removal, dam modification projects, and management of river flows to protect T&E species. Experience specifically related to dam modification projects includes:

Dr. Bishop conducted a study that valued improvements to environmental, cultural, and recreational resources of the Grand Canyon resulting from modifications to the operation of Glen Canyon Dam (Bishop et al., 1987; Welsh et al., 1997). The study involved two non-use surveys – one conducted throughout the United States and one conducted specifically with ratepayers whose electricity costs would increase due to changes in dam operations. The valuation scenarios included protection of Tribal, cultural, and spiritual resources. Secretary of the Interior Bruce Babbitt and Commissioner of Reclamation Eluid Martinez (Martinez and Babbitt, 1996) cited the non-use valuation study in justifying their decision to modify Glen Canyon Dam operations to achieve environmental and other goals.

Dr. Loomis conducted a study that valued the increase in salmon populations from the removal of the Elwha River dams (Loomis, 1996b). One conclusion of this study involved the extent to which the relevant market for the non-use values for dam removal and restored salmon runs included the national population (Loomis, 1996a).

9. Explain any decisions to provide payments or gifts to respondents, other than remuneration of contractors or grantees.

Two types of respondent incentives are provided: non-survey-specific and survey-specific incentives. Each is described below.

KN uses non-survey-specific incentives to maintain a high degree of panel loyalty and to prevent attrition from the panel. KN provides panel members with Internet connections and laptops (or Web-capable devices) if they do not already have them. For these households, the incentive is the hardware and Internet service. For households using their own personal computers and Internet service, KN enrolls the panelists in a points program that is analogous to a “frequent flyer” card in that respondents are credited with points in proportion to their regular participation in surveys. Panel members receive cash-equivalent checks approximately every four to six months in amounts reflecting their panel participation level, commonly $2 to $6 per month.

KN provides survey-specific incentives to panel members under one of two conditions: (1) the survey is expected to require more than 20 minutes to complete; or (2) there is an unusual request being made of the respondent, such as providing a specimen, viewing a specific television program, or completing a daily diary. In these circumstances, panelists are being asked to participate in ways that are more burdensome than initially described during panel recruitment. For example, for the NOAA Coral Reef Protection Survey, an incentive was provided because the survey was expected to require 20 or more minutes to complete, and maximizing survey participation was a key study goal. Respondents who participated in that survey were credited with 10,000 points, which equates to the $10 that was mailed to them at a later date.

For the first wave of the pretest, KN will provide their usual incentive to their panel members, because the survey is expected to require 20 or more minutes to complete. For the second wave of the pretest, we propose to provide respondents with a $2 incentive when they receive the survey questionnaire in the mail. They will get this incentive regardless of whether they complete the survey.

Inclusion of an incentive acts as a sign of goodwill on the part of study sponsors and encourages reciprocity by the respondent. Singer (2002) provides a review of the use of incentives in surveys. Her findings show that giving respondents a small monetary incentive increases response rates. KN has analyzed the predictors of survey completion rates of studies conducted using its Web-enabled panel. A multivariate analysis based on approximately 500 KN surveys attempted to predict the effect of respondent incentives on survey completion rates while controlling for length of field period, sample composition, use of video in the instrument, and other factors. The effect of respondents’ incentives is significant (p < 0.01) for both $5 and $10 cash-equivalent incentives. Use of a $5 incentive increased response by 4 percentage points, and a $10 incentive increased response by 6 percentage points. Internal KN research has demonstrated that incentives increase the survey completion rate by approximately 5 percentage points. The increase is larger for young adults and Hispanics.

10. Describe any assurance of confidentiality provided to respondents and the basis for assurance in statute, regulation, or agency policy.

No assurance of confidentiality based on statute or regulation will be provided to the respondents. As part of the PRA statement provided to both KN and mail wave participants, respondents will be told that their identity will be protected throughout the data acquisition and analysis process. The anonymity of the survey respondents will be protected by using an independent contractor to collect the information; by enacting procedures to prevent unauthorized access to respondent data; and by preventing the public disclosure of the responses of individual participants. The Team will not be provided respondent identification information.

KN privacy procedures

All KN panel members receive a copy of KN’s Privacy and Terms of Use Policy. This document includes a section called the “Panel Member Bill of Rights,” which summarizes the information protections for panelists and explains that respondents can decide whether to participate in the panel or answer survey questions. The “Panel Member Bill of Rights” is also available electronically at all times to panelists through the panel member Web site.

Below is a summary of the measures that will be taken to meet the needs for privacy and confidentiality from the point of data access and IT.

First, all employees of KN are required to sign a confidentiality agreement requiring them to keep confidential all personally identifiable information regarding panel members. KN warrants that all employees are bound to protect the privacy and confidentiality of all personal information provided by respondents, and very few employees actually have access to any confidential data. The only employees who have access to this information, which contains personal identification information about panel members, are those with a direct need to know. Therefore, the only persons with access are the following:

  1. Database and IT administrators with access to computer servers for the purpose of maintaining the computer systems at KN.

  2. Staff members in the Panel Relations department who have direct contact with panel members as part of the inbound and outbound call center operations. These staff members are responsible for troubleshooting any problems panelists might have with their equipment or software related to survey administration, incentive fulfillment, and panel management.

  3. Staff members of the Statistics department who have access to personally identifying information in order to draw samples for the various surveys we conduct at KN.

All personally identifying records are kept secured in a separate office in the IT section of the KN office in Palo Alto, CA, and all data transfers from personal computers used for survey administration to the main servers pass through a firewall. KN never provides any respondent personal identifiers to any client or agency without the explicit and informed consent of the sampled KnowledgePanel® members. Unless explicitly permitted as documented in a consent form, no personally identifying information will be provided to any parties outside KN in combination with the survey response data.

All electronic survey data records are stored in a secured database that does not contain personally identifying information. The staff members in the Panel Relations and Statistics departments, who have access to the personally identifying information, do not have access to the survey response data. The staff members with access to the survey response data, with the exception of the aforementioned database and IT administrators who must have access to maintain the computer systems, do not have access to the personally identifying information. The secured database contains field-specific permissions that restrict access to the data by type of user, as described above, thereby preventing unauthorized access.

The survey response data are identified only by an incremented ID number. The personally identifying information is stored in a separate database that is accessible only to persons with a need to know, as described above.

The survey data extraction system exports only anonymized survey data identified only by the Panel Member ID number. The data analysts with access to the survey data extraction system cannot join survey data to personally identifying data, as they do not have access to the personally identifying information. The Panel Relations and Statistics staff does not have access to the survey data extraction system, and therefore cannot join survey data to personally identifying data.

KN retains the survey response data in its secure database after the completion of a project for the purpose of operational research, such as studies of response rates, and for the security of our customers who might at a later time request additional analyses, statistical adjustments, or statistical surveys that would require re-surveying research subjects as part of validation or longitudinal surveys.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.

No questions of a sensitive nature will be asked of respondents.

12. Provide an estimate in hours of the burden of the collection of information.

Estimated number of participants: 1,625.

Estimated time per response: 30 minutes.

Estimated total annual burden hours: 812.5.
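This total follows directly from the per-response estimate: 1,625 responses × (30 minutes ÷ 60 minutes per hour) = 812.5 hours.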

13. Provide an estimate of the total annual cost burden to the respondents or record-keepers resulting from the collection (excluding the value of the burden hours in Question 12 above).

There will be no recordkeeping/reporting costs to the respondents.

14. Provide estimates of annualized cost to the Federal government.

The cost to the federal government for the two waves of the pretest will be approximately $190,000, which includes approximately $6,250 in paper and mailing costs for the second wave, $20,000 in government staff labor time, and $163,750 in contract costs.

15. Explain the reasons for any program changes or adjustments.

This is a new information collection request.

16. For collections whose results will be published, outline the plans for tabulation and publication.

It is not anticipated that the data collected through this ICR will be independently published or provided to the public. The information collected through this ICR will be summarized in the ICR request for the final survey administration and in the final project report. Stratus Consulting will provide NOAA with a report of the pretest findings, and all data files will be documented and submitted to NOAA. The results of the pretest will be tabulated using simple summary statistical analyses of the data (e.g., frequencies, means, medians, standard deviations, maximums, and minimums). The data will be used to estimate a model for each of the three levels of salmon and forest and wildlife restoration. This analysis will be used to evaluate respondents’ understanding of the attributes presented, the scenario description, and the choice question, and to determine whether the proposed cost levels need to be adjusted for the main survey.
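A minimal sketch of this tabulation, assuming the pretest responses are delivered as a flat file (the file and column names below are hypothetical placeholders):

import pandas as pd

# Hypothetical file and column names for the pretest data file.
pretest = pd.read_csv("elwha_pretest_responses.csv")

# Frequencies of the chosen restoration alternatives.
print(pretest["salmon_alternative"].value_counts(dropna=False))
print(pretest["forest_alternative"].value_counts(dropna=False))

# Summary statistics for numeric items (means, medians, standard deviations,
# maximums, and minimums).
print(pretest.select_dtypes(include="number")
      .agg(["mean", "median", "std", "max", "min"]))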

Results from tests comparing the two stated preference formats will also be reported. See Section B, Question 4 for information on the actual tests.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate.

NA.

18. Explain each exception to the certification statement.

NA.

1. Cattin and Wittink (1982) and Wittink and Cattin (1989) survey the commercial use of conjoint analysis, which is widespread. For survey articles and reviews of conjoint analysis, see Louviere (1988, 1992), Green and Srinivasan (1990), and Batsell and Louviere (1991). Transportation planners use choice questions to determine how commuters would respond to a new mode of transportation or a change in an existing mode. Hensher (1994) gives an overview of choice questions applied in transportation.

2. Louviere (1994) provides an overview of choice questions applied in marketing.

3. See, for example, Louviere and Woodward (1983), Louviere (1988), and Elrod et al. (1992).

